Sample records for producing high-consequence software

  1. Fragment Impact Toolkit (FIT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shevitz, Daniel Wolf; Key, Brian P.; Garcia, Daniel B.

    2017-09-05

    The Fragment Impact Toolkit (FIT) is a software package used for probabilistic consequence evaluation of fragmenting sources. The typical use case for FIT is to simulate an exploding shell and evaluate the consequence on nearby objects. FIT is written in the programming language Python and is designed as a collection of interacting software modules. Each module has a function that interacts with the other modules to produce desired results.
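
    The abstract describes probabilistic consequence evaluation of fragmenting sources in Python. As a rough, hedged illustration of that kind of calculation (not FIT's actual modules; the drag-free ballistic model and all distributions below are hypothetical), a Monte Carlo sketch might look like this:

```python
# Illustrative Monte Carlo sketch of probabilistic fragment-consequence
# evaluation. This is NOT the FIT code or its module structure; the ballistic
# model (no drag) and all input distributions below are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
G = 9.81  # m/s^2

def single_fragment_hit_probability(n_samples=100_000,
                                    target_center=(50.0, 0.0),
                                    target_radius=5.0):
    """Estimate the chance that one fragment lands inside a circular target area."""
    speed = rng.uniform(20.0, 200.0, n_samples)          # hypothetical launch speeds, m/s
    elevation = rng.uniform(0.0, np.pi / 2, n_samples)   # launch elevation angle, rad
    azimuth = rng.uniform(0.0, 2 * np.pi, n_samples)     # launch azimuth, rad

    ground_range = speed**2 * np.sin(2 * elevation) / G  # drag-free range on flat ground
    x = ground_range * np.cos(azimuth)
    y = ground_range * np.sin(azimuth)
    dx, dy = x - target_center[0], y - target_center[1]
    return float((dx**2 + dy**2 <= target_radius**2).mean())

p_one = single_fragment_hit_probability()
n_fragments = 200
p_any = 1.0 - (1.0 - p_one) ** n_fragments  # assumes independent fragment trajectories
print(f"per-fragment hit probability ~ {p_one:.4f}; any-hit probability ~ {p_any:.4f}")
```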

  2. Computer Science and Technology: Introduction to Software Packages

    DTIC Science & Technology

    1984-04-01

    ...consideration should be given to the acquisition of appropriate software packages to replace or upgrade existing services and to provide services not... Consequently, there are many companies that produce only software packages, and are committed to providing training, service, and support. These vendors...

  3. Big Science, Small-Budget Space Experiment Package Aka MISSE-5: A Hardware And Software Perspective

    NASA Technical Reports Server (NTRS)

    Krasowski, Michael; Greer, Lawrence; Flatico, Joseph; Jenkins, Phillip; Spina, Dan

    2007-01-01

    Conducting space experiments with small budgets is a fact of life for many design groups with low-visibility science programs. One major consequence is that specialized space grade electronic components are often too costly to incorporate into the design. Radiation mitigation now becomes more complex as a result of being restricted to the use of commercial off-the-shelf (COTS) parts. Unique hardware and software design techniques are required to succeed in producing a viable instrument suited for use in space. This paper highlights some of the design challenges and associated solutions encountered in the production of a highly capable, low cost space experiment package.

  4. A Hardware and Software Perspective of the Fifth Materials on the International Space Station Experiment (MISSE-5)

    NASA Technical Reports Server (NTRS)

    Krasowski, Michael; Greer, Lawrence; Flatico, Joseph; Jenkins, Phillip; Spina, Dan

    2005-01-01

    Conducting space experiments with small budgets is a fact of life for many design groups with low-visibility science programs. One major consequence is that specialized space grade electronic components are often too costly to incorporate into the design. Radiation mitigation now becomes more complex as a result of being restricted to the use of commercial off-the-shelf (COTS) parts. Unique hardware and software design techniques are required to succeed in producing a viable instrument suited for use in space. This paper highlights some of the design challenges and associated solutions encountered in the production of a highly capable, low cost space experiment package.

  5. Quality Market: Design and Field Study of Prediction Market for Software Quality Control

    ERIC Educational Resources Information Center

    Krishnamurthy, Janaki

    2010-01-01

    Given the increasing competition in the software industry and the critical consequences of software errors, it has become important for companies to achieve high levels of software quality. While cost reduction and timeliness of projects continue to be important measures, software companies are placing increasing attention on identifying the user…

  6. Separation in Logistic Regression: Causes, Consequences, and Control.

    PubMed

    Mansournia, Mohammad Ali; Geroldinger, Angelika; Greenland, Sander; Heinze, Georg

    2018-04-01

    Separation is encountered in regression models with a discrete outcome (such as logistic regression) where the covariates perfectly predict the outcome. It is most frequent under the same conditions that lead to small-sample and sparse-data bias, such as presence of a rare outcome, rare exposures, highly correlated covariates, or covariates with strong effects. In theory, separation will produce infinite estimates for some coefficients. In practice, however, separation may be unnoticed or mishandled because of software limits in recognizing and handling the problem and in notifying the user. We discuss causes of separation in logistic regression and describe how common software packages deal with it. We then describe methods that remove separation, focusing on the same penalized-likelihood techniques used to address more general sparse-data problems. These methods improve accuracy, avoid software problems, and allow interpretation as Bayesian analyses with weakly informative priors. We discuss likelihood penalties, including some that can be implemented easily with any software package, and their relative advantages and disadvantages. We provide an illustration of ideas and methods using data from a case-control study of contraceptive practices and urinary tract infection.
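
    To make the problem concrete: under complete separation the maximum-likelihood slope is infinite, while any likelihood penalty keeps it finite. The sketch below uses a simple L2 (ridge) penalty via scikit-learn as a stand-in for the penalized-likelihood methods the abstract discusses (it is not Firth's penalty or the paper's method), on a hypothetical eight-observation dataset:

```python
# Toy illustration of separation and of how a likelihood penalty keeps the
# estimate finite. The L2 (ridge) penalty used here is a stand-in for the
# penalized-likelihood methods discussed in the abstract, not the same method,
# and the eight-observation dataset is hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

x = np.array([-3.0, -2.0, -1.0, -0.5, 0.5, 1.0, 2.0, 3.0]).reshape(-1, 1)
y = (x.ravel() > 0).astype(int)   # x > 0 perfectly predicts y: complete separation

nearly_unpenalized = LogisticRegression(C=1e10, max_iter=10_000).fit(x, y)
penalized = LogisticRegression(C=1.0).fit(x, y)

# The first slope grows essentially without bound as C increases (the ML
# estimate is infinite); the penalized slope stays modest.
print("nearly unpenalized slope:", nearly_unpenalized.coef_.ravel())
print("penalized slope:         ", penalized.coef_.ravel())
```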

  7. Cognitive Consequences of Participation in a "Fifth Dimension" After-School Computer Club.

    ERIC Educational Resources Information Center

    Mayer, Richard E.; Quilici, Jill; Moreno, Roxana; Duran, Richard; Woodbridge, Scott; Simon, Rebecca; Sanchez, David; Lavezzo, Amy

    1997-01-01

    Children who attended the Fifth Dimension after-school computer club at least 10 times during the 1994-95 school year performed better on word problem comprehension tests than did non-participating children. Results support the hypothesis that experience in using computer software in the Fifth Dimension club produces measurable, resilient, and…

  8. The State of Software for Evolutionary Biology.

    PubMed

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code and, consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  9. Spatio-temporally resolved spectral measurements of laser-produced plasma and semiautomated spectral measurement-control and analysis software

    NASA Astrophysics Data System (ADS)

    Cao, S. Q.; Su, M. G.; Min, Q.; Sun, D. X.; O'Sullivan, G.; Dong, C. Z.

    2018-02-01

    A spatio-temporally resolved spectral measurement system of highly charged ions from laser-produced plasmas is presented. Corresponding semiautomated computer software for measurement control and spectral analysis has been written to achieve the best synchronicity possible among the instruments. This avoids the tedious comparative processes between experimental and theoretical results. To demonstrate the capabilities of this system, a series of spatio-temporally resolved experiments of laser-produced Al plasmas have been performed and applied to benchmark the software. The system is a useful tool for studying the spectral structures of highly charged ions and for evaluating the spatio-temporal evolution of laser-produced plasmas.

  10. The State of Software for Evolutionary Biology

    PubMed Central

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-01-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code and, consequently, also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfying, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525

  11. Design and Applications of Rapid Image Tile Producing Software Based on Mosaic Dataset

    NASA Astrophysics Data System (ADS)

    Zha, Z.; Huang, W.; Wang, C.; Tang, D.; Zhu, L.

    2018-04-01

    Map tile technology is widely used in web geographic information services, and efficient map tile production is a key technology for the rapid serving of images on the web. In this paper, software for rapidly producing image tiles from a mosaic dataset is designed, and the tile-production workflow is given. Key technologies such as cluster processing, map representation, tile checking, tile conversion, and in-memory compression are discussed. The software was implemented and tested on actual image data; the results show that it has a high degree of automation, effectively reduces the number of I/O operations, improves tile-production efficiency, and significantly reduces manual operations.
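
    For readers unfamiliar with what tile production computes, the standard Web-Mercator "slippy map" XYZ tiling arithmetic is sketched below. This is a generic illustration only, not the authors' mosaic-dataset software or its cluster-processing pipeline; the sample coordinates are arbitrary.

```python
# Generic web-map tiling arithmetic (the common "slippy map" XYZ scheme), shown
# only to illustrate what tile production computes; this is not the authors'
# mosaic-dataset software or its cluster-processing pipeline.
import math

def lonlat_to_tile(lon_deg, lat_deg, zoom):
    """Return the (x, y) indices of the Web-Mercator tile containing a WGS84 point."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

print(lonlat_to_tile(114.3, 30.6, 12))  # tile indices for a sample point at zoom level 12
```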

  12. Architecture for interoperable software in biology.

    PubMed

    Bare, James Christopher; Baliga, Nitin S

    2014-07-01

    Understanding biological complexity demands a combination of high-throughput data and interdisciplinary skills. One way to bring to bear the necessary combination of data types and expertise is by encapsulating domain knowledge in software and composing that software to create a customized data analysis environment. To this end, simple flexible strategies are needed for interconnecting heterogeneous software tools and enabling data exchange between them. Drawing on our own work and that of others, we present several strategies for interoperability and their consequences, in particular, a set of simple data structures--list, matrix, network, table and tuple--that have proven sufficient to achieve a high degree of interoperability. We provide a few guidelines for the development of future software that will function as part of an interoperable community of software tools for biological data analysis and visualization. © The Author 2012. Published by Oxford University Press.
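
    The abstract names list, matrix, network, table, and tuple as the data structures that proved sufficient for interoperability. The sketch below illustrates that idea with plain Python containers and a conversion between a "network" and a "matrix"; it is a minimal sketch of the principle, not the authors' actual framework, and the gene names are made up.

```python
# Illustration of the simple exchange structures named in the abstract
# (list, matrix, network, table, tuple) using plain Python containers.
# A sketch of the interoperability idea only, not the authors' framework.

genes = ["geneA", "geneB", "geneC"]                  # list
network = [("geneA", "geneB"), ("geneB", "geneC")]   # network as an edge list of tuples
table = [                                            # table as a list of row dicts
    {"gene": "geneA", "expression": 2.1},
    {"gene": "geneB", "expression": 0.4},
    {"gene": "geneC", "expression": 1.7},
]

def network_to_matrix(nodes, edges):
    """Convert an edge-list network into an adjacency matrix (list of lists)."""
    index = {n: i for i, n in enumerate(nodes)}
    m = [[0] * len(nodes) for _ in nodes]
    for a, b in edges:
        m[index[a]][index[b]] = m[index[b]][index[a]] = 1
    return m

matrix = network_to_matrix(genes, network)
print(matrix)  # tools that agree on these few structures can pass data between them
```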

  13. Secure software practices among Malaysian software practitioners: An exploratory study

    NASA Astrophysics Data System (ADS)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are gaining increasing importance among software practitioners and researchers due to the rise of computer crime in the software industry, and they have become one of the determinant factors for producing high-quality software. Even though their importance has been recognized, current practice in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was used for data collection, whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the software produced.
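
    The descriptive analysis mentioned (frequencies and cross tabulation) is the kind of thing shown in the short pandas sketch below. The survey rows and column names are hypothetical; this is not the authors' data or instrument.

```python
# Sketch of the kind of descriptive analysis mentioned in the abstract
# (frequencies and a cross tabulation) using pandas on hypothetical survey rows.
import pandas as pd

responses = pd.DataFrame({
    "company_size": ["small", "large", "large", "small", "medium", "large"],
    "applies_secure_practices": ["no", "yes", "yes", "no", "yes", "no"],
})

print(responses["applies_secure_practices"].value_counts())           # frequencies
print(pd.crosstab(responses["company_size"],                          # cross tabulation
                  responses["applies_secure_practices"], margins=True))
```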

  14. High-school software development project helps increasing students' awareness of geo-hydrological hazards and their risks

    NASA Astrophysics Data System (ADS)

    Marchesini, Ivan; Rossi, Mauro; Balducci, Vinicio; Salvati, Paola; Guzzetti, Fausto; Bianchini, Andrea; Grzeleswki, Emanuell; Canonico, Andrea; Coccia, Rita; Fiorucci, Gianni Mario; Gobbi, Francesca; Ciuchetti, Monica

    2015-04-01

    In Italy, inundation and landslides are widespread phenomena that impact the population and cause significant economic damage to private and public properties. The perception of the risk posed by these natural geo-hydrological hazards varies geographically and in time. The variation in the perception of the risks has negative consequences on risk management, and limits the adoption of effective risk reduction strategies. We maintain that targeted education can foster the understanding of geo-hydrological hazards, improving their perception and the awareness of the associated risk. Collaboration of a research center experienced in geo-hydrological hazards and risks (CNR IRPI, Perugia) and a high school (ITIS Alessandro Volta, Perugia) has resulted in the design and execution of a project aimed at improving the perception of geo-hydrological risks in high school students and teachers through software development. In the two-year project, students, high school teachers and research scientists have jointly developed software broadly related to landslide and flood hazards. User requirements and system specifications were decided to facilitate the distribution and use of the software among students and their peers. This allowed a wider distribution of the project results. We discuss two software prototypes developed by the high school students, including an application of augmented reality for improved dissemination of information on landslides and floods with human consequences in Italy, and a crowd science application to allow students (and others, including their families and friends) to collect information on landslide and flood occurrence exploiting modern mobile devices. This information can prove important, e.g., for the validation of landslide forecasting models.

  15. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    PubMed

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automatized, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC .
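
    PTXQC itself is an R package; purely as a language-agnostic illustration of its central idea (collating per-Raw-file metrics into 0-1 scores and an overview heatmap), a small Python sketch is given below. The metric names, values, and the naive min-max scoring are hypothetical and are not PTXQC's actual metrics or scoring functions.

```python
# Hypothetical illustration of collating per-Raw-file QC metrics into 0-1 scores
# and an overview heatmap, in the spirit of the PTXQC report (PTXQC itself is R).
import numpy as np
import matplotlib.pyplot as plt

raw_files = ["run_01", "run_02", "run_03"]
metrics = {"peptides_identified": [21000, 9000, 20500],   # hypothetical metrics,
           "mass_error_ppm_ok":   [0.98, 0.80, 0.97],     # higher raw value = better here
           "MBR_transfer_rate":   [0.35, 0.10, 0.33]}

names = list(metrics)
raw = np.array([metrics[m] for m in names], dtype=float).T  # shape: (files, metrics)
scores = (raw - raw.min(axis=0)) / np.ptp(raw, axis=0)      # naive min-max score per metric

fig, ax = plt.subplots()
im = ax.imshow(scores, cmap="RdYlGn", vmin=0, vmax=1)
ax.set_xticks(range(len(names)), names, rotation=45, ha="right")
ax.set_yticks(range(len(raw_files)), raw_files)
fig.colorbar(im, label="score (0 = poor, 1 = good)")
fig.tight_layout()
plt.show()
```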

  16. Unintended adverse consequences of a clinical decision support system: two cases.

    PubMed

    Stone, Erin G

    2018-05-01

    Many institutions have implemented clinical decision support systems (CDSSs). While CDSS research papers have focused on benefits of these systems, there is a smaller body of literature showing that CDSSs may also produce unintended adverse consequences (UACs). Detailed here are 2 cases of UACs resulting from a CDSS. Both of these cases were related to external systems that fed data into the CDSS. In the first case, lack of knowledge of data categorization in an external pharmacy system produced a UAC; in the second case, the change of a clinical laboratory instrument produced the UAC. CDSSs rely on data from many external systems. These systems are dynamic and may have changes in hardware, software, vendors, or processes. Such changes can affect the accuracy of CDSSs. These cases point to the need for the CDSS team to be familiar with these external systems. This team (manager and alert builders) should include members in specific clinical specialties with deep knowledge of these external systems.

  17. Future of Software Engineering Standards

    NASA Technical Reports Server (NTRS)

    Poon, Peter T.

    1997-01-01

    In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools, as well as consumer-oriented systems.

  18. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick, Jennifer M.

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  19. NASA software specification and evaluation system: Software verification/validation techniques

    NASA Technical Reports Server (NTRS)

    1977-01-01

    NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.

  20. The Implementation of Satellite Attitude Control System Software Using Object Oriented Design

    NASA Technical Reports Server (NTRS)

    Reid, W. Mark; Hansell, William; Phillips, Tom; Anderson, Mark O.; Drury, Derek

    1998-01-01

    NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions. The SMEX program has produced five satellites, three of which have been successfully launched. The remaining two spacecraft are scheduled for launch within the coming year. NASA has recently developed a prototype for the next generation Small Explorer spacecraft (SMEX-Lite). This paper describes the object-oriented design (OOD) of the SMEX-Lite Attitude Control System (ACS) software. The SMEX-Lite ACS is three-axis controlled and is capable of performing sub-arc-minute pointing. This paper first describes high level requirements governing the SMEX-Lite ACS software architecture. Next, the context in which the software resides is explained. The paper describes the principles of encapsulation, inheritance, and polymorphism with respect to the implementation of an ACS software system. This paper will also discuss the design of several ACS software components. Specifically, object-oriented designs are presented for sensor data processing, attitude determination, attitude control, and failure detection. Finally, this paper will address the establishment of the ACS Foundation Class (AFC) Library. The AFC is a large software repository, requiring a minimal amount of code modifications to produce ACS software for future projects.
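
    A minimal sketch of the OOD principles the abstract names (encapsulation, inheritance, polymorphism) applied to ACS-style components is shown below. The class names and placeholder readings are purely illustrative and are not taken from the SMEX-Lite AFC library, and the actual flight software would of course not be Python.

```python
# Generic sketch of encapsulation, inheritance, and polymorphism applied to
# ACS-style components. Class names are illustrative only, not the AFC library.
from abc import ABC, abstractmethod

class AttitudeSensor(ABC):
    """Base class: every sensor encapsulates its own raw-data processing."""
    @abstractmethod
    def read_attitude_measurement(self):
        ...

class SunSensor(AttitudeSensor):
    def read_attitude_measurement(self):
        return (0.01, -0.02, 0.00)      # placeholder body-frame measurement

class Magnetometer(AttitudeSensor):
    def read_attitude_measurement(self):
        return (0.03, 0.00, 0.01)       # placeholder body-frame measurement

class AttitudeDeterminer:
    """Works with any AttitudeSensor subtype (polymorphism)."""
    def __init__(self, sensors):
        self._sensors = sensors          # encapsulated; callers never touch raw I/O

    def estimate(self):
        readings = [s.read_attitude_measurement() for s in self._sensors]
        n = len(readings)
        return tuple(sum(axis) / n for axis in zip(*readings))  # naive average

estimator = AttitudeDeterminer([SunSensor(), Magnetometer()])
print(estimator.estimate())
```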

  1. A Guideline of Using Case Method in Software Engineering Courses

    ERIC Educational Resources Information Center

    Zainal, Dzulaiha Aryanee Putri; Razali, Rozilawati; Shukur, Zarina

    2014-01-01

    Software Engineering (SE) education has been reported to fall short in producing high quality software engineers. In seeking alternative solutions, Case Method (CM) is regarded as having potential to solve the issue. CM is a teaching and learning (T&L) method that has been found to be effective in Social Science education. In principle,…

  2. Evaluation of an Area-Based matching algorithm with advanced shape models

    NASA Astrophysics Data System (ADS)

    Re, C.; Roncella, R.; Forlani, G.; Cremonese, G.; Naletto, G.

    2014-04-01

    Nowadays, the scientific institutions involved in planetary mapping are working on new strategies to produce accurate high resolution DTMs from space images at planetary scale, usually dealing with extremely large data volumes. From a methodological point of view, despite the introduction of a series of new algorithms for image matching (e.g., Semi-Global Matching) that yield superior results (especially because they usually produce smooth and continuous surfaces) with lower processing times, the preference in this field still goes to well established area-based matching techniques. Many efforts are consequently directed at improving each phase of the photogrammetric process, from image pre-processing to DTM interpolation. In this context, the Dense Matcher software (DM) developed at the University of Parma has recently been optimized to cope with the very high resolution images provided by the most recent missions (LROC NAC and HiRISE), focusing mainly on improving the correlation phase and process automation. Important changes have been made to the correlation algorithm, still maintaining its high performance in terms of precision and accuracy, by implementing an advanced version of the Least Squares Matching (LSM) algorithm. In particular, an iterative algorithm has been developed to adapt the geometric transformation in image resampling using different shape functions as originally proposed by other authors in different applications.
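
    For orientation, the sketch below shows only the plain normalized cross-correlation step that area-based matchers typically start from; LSM then refines the match with a geometric transformation, which is not implemented here. This is a generic illustration on synthetic data, not the Dense Matcher/LSM code described in the abstract.

```python
# Sketch of plain area-based matching via normalized cross-correlation (NCC).
# Only the basic step that LSM-style matchers refine further; not the Dense
# Matcher / LSM implementation described in the abstract.
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized image patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def best_match(template, search):
    """Slide the template over the search window and return (row, col, score)."""
    th, tw = template.shape
    best = (0, 0, -1.0)
    for r in range(search.shape[0] - th + 1):
        for c in range(search.shape[1] - tw + 1):
            score = ncc(template, search[r:r + th, c:c + tw])
            if score > best[2]:
                best = (r, c, score)
    return best

rng = np.random.default_rng(0)
search = rng.random((60, 60))
template = search[20:31, 35:46].copy()   # an 11x11 patch cut from the search image
print(best_match(template, search))      # expected: (20, 35, ~1.0)
```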

  3. Generating Code Review Documentation for Auto-Generated Mission-Critical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2009-01-01

    Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements, and formally verifies that the autogenerated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.

  4. Predictive Model and Software for Inbreeding-Purging Analysis of Pedigreed Populations

    PubMed Central

    García-Dorado, Aurora; Wang, Jinliang; López-Cortegano, Eugenio

    2016-01-01

    The inbreeding depression of fitness traits can be a major threat to the survival of populations experiencing inbreeding. However, its accurate prediction requires taking into account the genetic purging induced by inbreeding, which can be achieved using a “purged inbreeding coefficient”. We have developed a method to compute purged inbreeding at the individual level in pedigreed populations with overlapping generations. Furthermore, we derive the inbreeding depression slope for individual logarithmic fitness, which is larger than that for the logarithm of the population fitness average. In addition, we provide a new software, PURGd, based on these theoretical results that allows analyzing pedigree data to detect purging, and to estimate the purging coefficient, which is the parameter necessary to predict the joint consequences of inbreeding and purging. The software also calculates the purged inbreeding coefficient for each individual, as well as standard and ancestral inbreeding. Analysis of simulation data show that this software produces reasonably accurate estimates for the inbreeding depression rate and for the purging coefficient that are useful for predictive purposes. PMID:27605515
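
    As background for the quantities PURGd works with, the sketch below computes standard (unpurged) Wright inbreeding coefficients from a pedigree using the tabular kinship method; the purged inbreeding coefficient described in the abstract additionally depends on a purging coefficient, which is not modeled here. The pedigree is hypothetical and must be listed parents-before-offspring.

```python
# Background sketch: standard (unpurged) inbreeding coefficients from a pedigree
# via the tabular kinship method. The purged coefficient computed by PURGd also
# requires a purging coefficient, which this sketch does not model.
pedigree = {            # individual: (sire, dam); None = unknown founder parent
    "A": (None, None),
    "B": (None, None),
    "C": ("A", "B"),
    "D": ("A", "B"),
    "E": ("C", "D"),    # full-sib mating -> expected F = 0.25
}

ids = list(pedigree)
kin = {}                                        # kinship (coancestry) coefficients

def k(x, y):
    if x is None or y is None:
        return 0.0
    return kin[(x, y)] if (x, y) in kin else kin[(y, x)]

for i, a in enumerate(ids):
    sa, da = pedigree[a]
    kin[(a, a)] = 0.5 * (1.0 + k(sa, da))       # self-kinship
    for b in ids[:i]:                           # b precedes a, so parental kinships exist
        kin[(a, b)] = 0.5 * (k(sa, b) + k(da, b))

for ind, (s, d) in pedigree.items():
    print(ind, "F =", round(k(s, d), 4))        # inbreeding = kinship of the parents
```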

  5. Topographic Structure from Motion

    NASA Astrophysics Data System (ADS)

    Fonstad, M. A.; Dietrich, J. T.; Courville, B. C.; Jensen, J.; Carbonneau, P.

    2011-12-01

    The production of high-resolution topographic datasets is of increasing concern and application throughout the geomorphic sciences, and river science is no exception. Consequently, a wide range of topographic measurement methods have evolved. Despite the range of available methods, the production of high resolution, high quality digital elevation models (DEMs) generally requires a significant investment in personnel time, hardware and/or software. However, image-based methods such as digital photogrammetry have steadily been decreasing in cost. Initially developed for the purpose of rapid, inexpensive and easy three dimensional surveys of buildings or small objects, the "structure from motion" photogrammetric approach (SfM) is a purely image based method which could deliver a step-change if transferred to river remote sensing; it requires very little training and is extremely inexpensive. Using the online SfM program Microsoft Photosynth, we have created high-resolution digital elevation models (DEMs) of rivers from ordinary photographs produced from a multi-step workflow that takes advantage of free and open source software. This process reconstructs real world scenes from SfM algorithms based on the derived positions of the photographs in three-dimensional space. One of the products of the SfM process is a three-dimensional point cloud of features present in the input photographs. This point cloud can be georeferenced from a small number of ground control points collected via GPS in the field. The georeferenced point cloud can then be used to create a variety of digital elevation model products. Among several study sites, we examine the applicability of SfM in the Pedernales River in Texas (USA), where several hundred images taken from a hand-held helikite are used to produce DEMs of the fluvial topographic environment. This test shows that SfM and low-altitude platforms can produce point clouds with point densities considerably better than airborne LiDAR, with horizontal and vertical precision in the centimeter range, and with very low capital and labor costs and low expertise levels. Advanced structure from motion software (such as Bundler and OpenSynther) is currently under development and should increase the density of topographic points, rivaling those of terrestrial laser scanning when using images shot from low altitude platforms such as helikites, poles, remote-controlled aircraft and rotorcraft, and low-flying manned aircraft. Clearly, the development of this set of inexpensive and low-required-expertise tools has the potential to fundamentally shift the production of digital fluvial topography from a capital-intensive enterprise of a low number of researchers to a low-cost exercise of many river researchers.
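
    The georeferencing step mentioned (aligning the arbitrary SfM model frame to surveyed GPS coordinates) is commonly done with a 3D similarity transform estimated from the ground control points. The sketch below uses the standard Umeyama/Procrustes solution with numpy on made-up coordinates; it is a generic illustration, not the authors' Photosynth workflow.

```python
# Generic sketch of georeferencing an SfM point cloud: estimate a 3D similarity
# transform (scale, rotation, translation) from a few ground control points with
# the Umeyama/Procrustes method. Illustration only; not the authors' workflow.
import numpy as np

def similarity_transform(src, dst):
    """Return (scale, R, t) such that dst ~= scale * R @ src + t (points as rows)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    s_c, d_c = src - mu_s, dst - mu_d
    cov = d_c.T @ s_c / len(src)
    U, S, Vt = np.linalg.svd(cov)
    sign = np.sign(np.linalg.det(U @ Vt))               # guard against reflections
    D = np.diag([1.0, 1.0, sign])
    R = U @ D @ Vt
    scale = np.trace(np.diag(S) @ D) / s_c.var(axis=0).sum()
    t = mu_d - scale * R @ mu_s
    return scale, R, t

# Hypothetical GCPs: model-space coordinates and their surveyed world coordinates
model = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 1.5]])
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
world = 3.0 * model @ true_R.T + np.array([500.0, 1200.0, 40.0])

scale, R, t = similarity_transform(model, world)
print(round(scale, 3), np.round(R, 3), np.round(t, 2), sep="\n")   # recovers 3.0, true_R, t
```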

  6. High community faecal carriage rates of CTX-M ESBL-producing Escherichia coli in a specific population group in Birmingham, UK.

    PubMed

    Wickramasinghe, Nimal H; Xu, Li; Eustace, Andrew; Shabir, Sahida; Saluja, Tranprit; Hawkey, Peter M

    2012-05-01

    To determine the proportion of E. coli carrying specific CTX-M extended-spectrum β-lactamase (ESBL) genotypes in a community population of East and North Birmingham. General practice and outpatient stool samples from 732 individuals submitted for examination for faecal pathogens in 2010 were screened for ESBL-producing E. coli using chromogenic agar. Multiplex PCR, denaturing HPLC, DNA sequencing and PFGE were used to determine the CTX-M genotype and clonal subtype. Isolates from people were assigned to 'Europe', 'Middle East/South Asia' (MESA) or 'uncategorized' groups using software to determine probable global origin based on the subject's full name. Prevalence of CTX-M carriage in the sample population was 11.3%. There was a statistically significant difference (P < 0.001) between carriage in the Europe group (8.1%) and the MESA group (22.8%). There was also a higher rate of carriage of CTX-M-15-producing E. coli (P < 0.001) in MESA subjects. The high community carriage rate and the significant difference in carriage between the Europe and MESA subjects may have important consequences for therapy. If the rising trend in carriage of bacteria producing ESBLs continues, guidelines for empirical therapy for patients presenting from the community may need to be modified. The findings also raise the concern that the pattern and routes of spread of CTX-M-15 may be replicated in the future by broader-spectrum β-lactamases, such as New Delhi metallo-β-lactamase ('NDM-1').

  7. [Three-dimensional 3D modeling: First applications in radioanatomy and interventional radiology under CT guidance].

    PubMed

    Aubry, S; Pousse, A; Sarliève, P; Laborie, L; Delabrousse, E; Kastler, B

    2006-11-01

    To model vertebrae in 3D to improve radioanatomic knowledge of the spine with the vascular and nerve environment and simulate CT-guided interventions. Vertebra acquisitions were made with multidetector CT. We developed segmentation software and specific viewer software using the Delphi programming environment. This segmentation software makes it possible to model 3D high-resolution segments of vertebrae and their environment from multidetector CT acquisitions. The specific viewer software then provides multiplanar reconstructions of the CT volume and the possibility to select different 3D objects of interest. This software package improves radiologists' radioanatomic knowledge through a new 3D anatomy presentation. Furthermore, the possibility of inserting virtual 3D objects in the volume can simulate CT-guided intervention. This is the first volumetric radioanatomic software package; in addition, it simulates CT-guided intervention and consequently has the potential to facilitate learning of interventions under CT guidance.

  8. System support software for the Space Ultrareliable Modular Computer (SUMC)

    NASA Technical Reports Server (NTRS)

    Hill, T. E.; Hintze, G. C.; Hodges, B. C.; Austin, F. A.; Buckles, B. P.; Curran, R. T.; Lackey, J. D.; Payne, R. E.

    1974-01-01

    The highly transportable programming system designed and implemented to support the development of software for the Space Ultrareliable Modular Computer (SUMC) is described. The SUMC system support software consists of program modules called processors. The initial set of processors consists of the supervisor, the general purpose assembler for SUMC instruction and microcode input, linkage editors, an instruction level simulator, a microcode grid print processor, and user oriented utility programs. A FORTRAN 4 compiler is undergoing development. The design facilitates the addition of new processors with a minimum effort and provides the user quasi host independence on the ground based operational software development computer. Additional capability is provided to accommodate variations in the SUMC architecture without consequent major modifications in the initial processors.

  9. Misalignment of disposable pulse oximeter probes results in false saturation readings that influence anesthetic management.

    PubMed

    Guan, Zhonghui; Baker, Keith; Sandberg, Warren S

    2009-11-01

    We report a small case series in which misaligned disposable pulse oximeter sensors gave falsely low saturation readings. In each instance, the sensor performed well during preinduction oxygen administration and the early part of the case, most notably by producing a plethysmographic trace rated as high quality by the oximeter software. The reported pulse oximeter oxygen saturation eventually decreased to concerning levels in each instance, but the anesthesiologists, relying on the reported high-quality signal, initially sought other causes for apparent hypoxia. They undertook maneuvers and diagnostic procedures later deemed unnecessary. When the malpositioned sensors were discovered and repositioned, the apparent hypoxia was quickly relieved in each case. We then undertook a survey of disposable oximeter sensors as patients entered the recovery room, and discovered malposition of more than 1 cm in approximately 20% of all sensors, without apparent consequence. We conclude that the technology is quite robust, but that the diagnosis of apparent hypoxia should include a quick check of oximeter position early on.

  10. Coordinating the Complexity of Tools, Tasks, and Users: On Theory-Based Approaches to Authoring Tool Usability

    ERIC Educational Resources Information Center

    Murray, Tom

    2016-01-01

    Intelligent Tutoring Systems authoring tools are highly complex educational software applications used to produce highly complex software applications (i.e. ITSs). How should our assumptions about the target users (authors) impact the design of authoring tools? In this article I first reflect on the factors leading to my original 1999 article on…

  11. Development of instructional, interactive, multimedia anatomy dissection software: a student-led initiative.

    PubMed

    Inwood, Matthew J; Ahmad, Jamil

    2005-11-01

    Although dissection provides an unparalleled means of teaching gross anatomy, it constitutes a significant logistical and financial investment for educational institutions. The increasing availability and waning cost of computer equipment has enabled many institutions to supplement their anatomy curriculum with Computer Aided Learning (CAL) software. At the Royal College of Surgeons in Ireland, two undergraduate medical students designed and produced instructional anatomy dissection software for use by first and second year medical students. The software consists of full-motion, narrated, QuickTime MPG movies presented in a Macromedia environment. Forty-four movies, between 1-11 min in duration, were produced. Each movie corresponds to a dissection class and precisely demonstrates the dissection and educational objectives for that class. The software is distributed to students free of charge and they are encouraged to install it on their Apple iBook computers. Results of a student evaluation indicated that the software was useful, easy to use, and improved the students' experience in the dissection classes. The evaluation also indicated that only a minority of students regularly used the software or had it installed on their laptop computers. Accordingly, effort should also be directed toward making the software more accessible and increasing students' comfort and familiarity with novel instructional media. The successful design and implementation of this software demonstrates that CAL software can be employed to augment, enhance and improve anatomy instruction. In addition, effective, high quality, instructional multimedia software can be tailored to an educational institution's requirements and produced by novice programmers at minimal cost. Copyright 2005 Wiley-Liss, Inc

  12. Ensuring critical event sequences in high consequence computer based systems as inspired by path expressions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kidd, M.E.C.

    1997-02-01

    The goal of our work is to provide a high level of confidence that critical software driven event sequences are maintained in the face of hardware failures, malevolent attacks and harsh or unstable operating environments. This will be accomplished by providing dynamic fault management measures directly to the software developer and to their varied development environments. The methodology employed here is inspired by previous work in path expressions. This paper discusses the perceived problems, a brief overview of path expressions, the proposed methods, and a discussion of the differences between the proposed methods and traditional path expression usage and implementation.
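
    A heavily simplified sketch of the core idea (a runtime monitor that only accepts events in a prescribed order, loosely in the spirit of path expressions) is given below. The event names and the single-path policy are hypothetical; the report's dynamic fault management measures are far richer than this.

```python
# Minimal sketch of a runtime monitor enforcing a critical event sequence,
# loosely in the spirit of path expressions. Event names and the single allowed
# path are hypothetical; real fault management would be far richer.
class SequenceMonitor:
    def __init__(self, required_order):
        self._order = list(required_order)
        self._next = 0

    def observe(self, event):
        """Raise if an event arrives out of the prescribed order."""
        if self._next >= len(self._order) or event != self._order[self._next]:
            raise RuntimeError(f"sequence violation: unexpected event {event!r}")
        self._next += 1

monitor = SequenceMonitor(["arm", "verify_arm", "enable", "fire"])
for event in ["arm", "verify_arm", "enable", "fire"]:
    monitor.observe(event)        # accepted: events follow the prescribed path
print("sequence completed")

bad = SequenceMonitor(["arm", "verify_arm", "enable", "fire"])
bad.observe("arm")
try:
    bad.observe("fire")           # skips required steps -> rejected
except RuntimeError as err:
    print(err)
```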

  13. Ethical education in software engineering: responsibility in the production of complex systems.

    PubMed

    Génova, Gonzalo; González, M Rosario; Fraga, Anabel

    2007-12-01

    Among the various contemporary schools of moral thinking, consequence-based ethics, as opposed to rule-based, seems to have a good acceptance among professionals such as software engineers. But naïve consequentialism is intellectually too weak to serve as a practical guide in the profession. Besides, the complexity of software systems makes it very hard to know in advance the consequences that will derive from professional activities in the production of software. Therefore, following the spirit of well-known codes of ethics such as the ACM/IEEE's, we advocate for a more solid position in the ethical education of software engineers, which we call 'moderate deontologism', that takes into account both rules and consequences to assess the goodness of actions, and at the same time pays an adequate consideration to the absolute values of human dignity. In order to educate responsible professionals, however, this position should be complemented with a pedagogical approach to virtue ethics.

  14. Control-structure-thermal interactions in analysis of lunar telescopes

    NASA Technical Reports Server (NTRS)

    Thompson, Roger C.

    1992-01-01

    The lunar telescope project was an excellent model for the CSTI study because a telescope is a very sensitive instrument, and thermal expansion or mechanical vibration of the mirror assemblies will rapidly degrade the resolution of the device. Consequently, the interactions are strongly coupled. The lunar surface experiences very large temperature variations that range from approximately -180 C to over 100 C. Although the optical assemblies of the telescopes will be well insulated, the temperature of the mirrors will inevitably fluctuate in a similar cycle, but of much smaller magnitude. In order to obtain images of high quality and clarity, allowable thermal deformations of any point on a mirror must be less than 1 micron. Initial estimates indicate that this corresponds to a temperature variation of much less than 1 deg through the thickness of the mirror. Therefore, a lunar telescope design will most probably include active thermal control, a means of controlling the shape of the mirrors, or a combination of both systems. Historically, the design of a complex vehicle was primarily a sequential process in which the basic structure was defined without concurrent detailed analyses or other subsystems. The basic configuration was then passed to the different teams responsible for each subsystem, and their task was to produce a workable solution without requiring major alterations to any principal components or subsystems. Consequently, the final design of the vehicle was not always the most efficient, owing to the fact that each subsystem design was partially constrained by the previous work. This procedure was necessary at the time because the analysis process was extremely time-consuming and had to be started over with each significant alteration of the vehicle. With recent advances in the power and capacity of small computers, and the parallel development of powerful software in structural, thermal, and control system analysis, it is now possible to produce very detailed analyses of intermediate designs in a much shorter period of time. The subsystems can thus be designed concurrently, and alterations in the overall design can be quickly adopted into each analysis; the design becomes an iterative process in which it is much easier to experiment with new ideas, configurations, and components. Concurrent engineering has the potential to produce efficient, highly capable designs because the effect of one subsystem on another can be assessed in much more detail at a very early point in the program. The research program consisted of several tasks: scale a prototype telescope assembly to a 1 m aperture, develop a model of the telescope assembly by using finite element (FEM) codes that are available on site, determine structural deflections of the mirror surfaces due to the temperature variations, develop a prototype control system to maintain the proper shape of the optical elements, and most important of all, demonstrate the concurrent engineering approach with this example. In addition, the software used for the finite element models and thermal analysis was relatively new within the Program Development Office and had yet to be applied to systems this large or complex; understanding the software and modifying it for use with this project was also required. The I-DEAS software by Structural Dynamics Research Corporation (SDRC) was used to build the finite element models, and TMG developed by Maya Heat Transfer Technologies, Ltd. (which runs as an I-DEAS module) was used for the thermal model calculations. All control system development was accomplished with MATRIXx by Integrated Systems, Inc.

  15. Prototyping with Data Dictionaries for Requirements Analysis.

    DTIC Science & Technology

    1985-03-01

    statistical packages and software for screen layout. These items work at a higher level than another category of prototyping tool, program generators... Program generators are software packages which, when given specifications, produce source listings, usually in a high order language such as COBOL...with users and this will not happen if he must stop to develop a detailed program. [Ref. 241] Hardware as well as software should be considered in...

  16. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  17. Kepler Planet Detection Metrics: Statistical Bootstrap Test

    NASA Technical Reports Server (NTRS)

    Jenkins, Jon M.; Burke, Christopher J.

    2016-01-01

    This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1-Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1-Q17), aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1-Q17), aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.
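
    The generic idea of a bootstrap false-alarm characterization (resampling noise-only statistics to build a null distribution for the maximum detection statistic) is sketched below. This is emphatically not the Kepler SOC algorithm; the Gaussian "out-of-transit" statistics, sample sizes, and quantile target are all illustrative only.

```python
# Generic sketch of bootstrapping the null distribution of a maximum detection
# statistic; NOT the Kepler SOC Statistical Bootstrap Test. All numbers are
# purely illustrative.
import numpy as np

rng = np.random.default_rng(1)
null_statistics = rng.normal(0.0, 1.0, 4000)   # stand-in for noise-only detection statistics

n_boot, n_per_search = 2000, 500
max_stats = np.empty(n_boot)
for i in range(n_boot):
    resample = rng.choice(null_statistics, size=n_per_search, replace=True)
    max_stats[i] = resample.max()              # strongest "detection" in one noise-only search

threshold_1pct = np.quantile(max_stats, 0.99)
print(f"max-statistic threshold for a 1% false-alarm rate: {threshold_1pct:.2f}")
```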

  18. A Trophic Model of a Sandy Barrier Lagoon at Chiku in Southwestern Taiwan

    NASA Astrophysics Data System (ADS)

    Lin, H.-J.; Shao, K.-T.; Kuo, S.-R.; Hsieh, H.-L.; Wong, S.-L.; Chen, I.-M.; Lo, W.-T.; Hung, J.-J.

    1999-05-01

    Using the ECOPATH 3.0 software system, a balanced trophic model of a sandy barrier lagoon with intensive fishery activities at Chiku in tropical Taiwan was constructed. The lagoon model comprised 13 compartments. Trophic levels of the compartments varied from 1·0 for primary producers and detritus to 3·6 for piscivorous fish. Hanging-cultured oysters accounted for 39% of the harvestable fishery biomass and were the most important fishery species. The most prominent group in terms of biomass and energy flow in the lagoon was herbivorous zooplankton. Manipulations of the biomass of herbivorous zooplankton would have a marked impact on most compartments. Both total system throughput and fishery yield per unit area were high when compared to other reported marine ecosystems. This appears mainly due to high planktonic primary production, which is probably promoted by enriched river discharges draining mangroves and aquaculture ponds. Consequently, more than half of the total system throughput originates from primary producers in the lagoon. Although half of the primary production was not immediately used by upper trophic levels and flowed into the detrital pool, most of the detritus was directly consumed, passed up the food web and was exported to the fishery. Thus only a small proportion of energy was recycled through detritus pathways. This mechanism produces short pathways with high trophic efficiencies at higher trophic levels. The high fishery yield in the lagoon is due to high primary production and short pathways. This is the first model of a tropical sandy barrier lagoon with intensive fishery activities and thus may serve as a basis for future comparisons and ecosystem management.

  19. Dynamic visualization techniques for high consequence software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.

  20. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  1. A methodology for producing reliable software, volume 1

    NASA Technical Reports Server (NTRS)

    Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.

    1976-01-01

    An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.

  2. On the release of cppxfel for processing X-ray free-electron laser images.

    PubMed

    Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K; Stuart, David Ian

    2016-06-01

    As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel, a software package primarily written in C++, which showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely to be also useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.

  3. On the release of cppxfel for processing X-ray free-electron laser images

    DOE PAGES

    Ginn, Helen Mary; Evans, Gwyndaf; Sauter, Nicholas K.; ...

    2016-05-11

    As serial femtosecond crystallography expands towards a variety of delivery methods, including chip-based methods, and smaller collected data sets, the requirement to optimize the data analysis to produce maximum structure quality is becoming increasingly pressing. Here cppxfel, a software package primarily written in C++, which showcases several data analysis techniques, is released. This software package presently indexes images using DIALS (diffraction integration for advanced light sources) and performs an initial orientation matrix refinement, followed by post-refinement of individual images against a reference data set. Cppxfel is released with the hope that the unique and useful elements of this package can be repurposed for existing software packages. However, as released, it produces high-quality crystal structures and is therefore likely to be also useful to experienced users of X-ray free-electron laser (XFEL) software who wish to maximize the information extracted from a limited number of XFEL images.

  4. Junior High Publications: Junior High School Staff Members Master Same Desktop Publishing as High School Counterparts.

    ERIC Educational Resources Information Center

    Pyle, Betty; Cangelosi, Sandy

    1988-01-01

    Argues that middle and junior high schools can produce professional looking student publications by using desktop publishing. Presents three newspaper pages designed with the Apple Macintosh, using "Pagemaker,""Cricket Draw," and "Microsoft Word" software. (MM)

  5. Design Aids for Real-Time Systems (DARTS)

    NASA Technical Reports Server (NTRS)

    Szulewski, P. A.

    1982-01-01

    Design-Aids for Real-Time Systems (DARTS) is a tool that assists in defining embedded computer systems through tree structured graphics, military standard documentation support, and various analyses including automated Software Science parameter counting and metrics calculation. These analyses provide both static and dynamic design quality feedback which can potentially aid in producing efficient, high quality software systems.

  6. Chemical variability along the value chains of turmeric (Curcuma longa): a comparison of nuclear magnetic resonance spectroscopy and high performance thin layer chromatography.

    PubMed

    Booker, Anthony; Frommenwiler, Debora; Johnston, Deborah; Umealajekwu, Chinenye; Reich, Eike; Heinrich, Michael

    2014-03-14

    Herbal medicine value chains have generally been overlooked compared with food commodities. Not surprisingly, revenue generation tends to be weighted towards the end of the chain and consequently the farmers and producers are the lowest paid beneficiaries. Value chains have an impact both on the livelihood of producers and on the composition and quality of products commonly sold locally and globally and consequently on the consumers. In order to understand the impact of value chains on the composition of products, we studied the production conditions for turmeric (Curcuma longa) and the metabolomic composition of products derived from it. We aimed at integrating these two components in order to gain a better understanding of the effect of different value chains on the livelihoods of some producers. This interdisciplinary project uses a mixed methods approach. Case studies were undertaken on two separate sites in India. Data was initially gathered on herbal medicine value chains by means of semi-structured interviews and non-participant observations. Samples were collected from locations in India, Europe and the USA and analysed using (1)H NMR spectroscopy coupled with multivariate analysis software and with high performance thin layer chromatography (HPTLC). We investigate medicinal plant value chains and interpret the impact different value chains have on some aspects of the livelihoods of producers in India and, for the first time, analytically assess the chemical variability and quality implications that different value chains may have on the products available to end users in Europe. There are benefits to farmers that belonged to an integrated chain and the resulting products were subject to a higher standard of processing and storage. By using analytical methods, including HPTLC and (1)H NMR spectroscopy, it has been possible to correlate some variations in product composition for selected producers and identify strengths and weaknesses of some types of value chains. The two analytical techniques provide different and complementary data and together they can be used to effectively differentiate between a wide variety of crude drug powders and herbal medicinal products. This project demonstrates that there is a need to study the links between producers and consumers of commodities produced in so-called 'provider countries' and that metabolomics offer a novel way of assessing the chemical variability along a value chain. This also has implications for understanding the impact this has on the livelihood of those along the value chain. Copyright © 2014 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.

  7. Formal Validation of Aerospace Software

    NASA Astrophysics Data System (ADS)

    Lesens, David; Moy, Yannick; Kanig, Johannes

    2013-08-01

    Any single error in critical software can have catastrophic consequences. Even though failures are usually not advertised, some software bugs have become famous, such as the error in the MIM-104 Patriot. For space systems, experience shows that software errors are a serious concern: more than half of all satellite failures from 2000 to 2003 involved software. To address this concern, this paper examines the use of formal verification for software developed in Ada.

  8. Would Boys and Girls Benefit from Gender-Specific Educational Software?

    ERIC Educational Resources Information Center

    Luik, Piret

    2011-01-01

    Most boys and girls interact differently with educational software and have different preferences for the design of educational software. The question is whether the usage of educational software has the same consequences for both genders. This paper investigates the characteristics of drill-and-practice programmes or drills that are efficient for…

  9. VennDIS: a JavaFX-based Venn and Euler diagram software to generate publication quality figures.

    PubMed

    Ignatchenko, Vladimir; Ignatchenko, Alexandr; Sinha, Ankit; Boutros, Paul C; Kislinger, Thomas

    2015-04-01

    Venn diagrams are graphical representations of the relationships among multiple sets of objects and are often used to illustrate similarities and differences among genomic and proteomic datasets. All currently existing tools for producing Venn diagrams evince one of two shortcomings: they either require expertise in specific statistical software packages (such as R) or lack the flexibility required to produce publication-quality figures. We describe a simple tool that addresses both shortcomings, Venn Diagram Interactive Software (VennDIS), a JavaFX-based solution for producing highly customizable, publication-quality Venn and Euler diagrams of up to five sets. The strengths of VennDIS are its simple graphical user interface and its large array of customization options, including the ability to modify attributes such as font, style and position of the labels, background color, size of the circle/ellipse, and outline color. It is platform independent and provides real-time visualization of figure modifications. The created figures can be saved as XML files for future modification or exported as high-resolution images for direct use in publications. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. High-Resolution C-Arm CT and Metal Artifact Reduction Software: A Novel Imaging Modality for Analyzing Aneurysms Treated with Stent-Assisted Coil Embolization.

    PubMed

    Yuki, I; Kambayashi, Y; Ikemura, A; Abe, Y; Kan, I; Mohamed, A; Dahmani, C; Suzuki, T; Ishibashi, T; Takao, H; Urashima, M; Murayama, Y

    2016-02-01

    Combination of high-resolution C-arm CT and novel metal artifact reduction software may contribute to the assessment of aneurysms treated with stent-assisted coil embolization. This study aimed to evaluate the efficacy of a novel Metal Artifact Reduction prototype software combined with the currently available high spatial-resolution C-arm CT prototype implementation by using an experimental aneurysm model treated with stent-assisted coil embolization. Eight experimental aneurysms were created in 6 swine. Coil embolization of each aneurysm was performed by using a stent-assisted technique. High-resolution C-arm CT with intra-arterial contrast injection was performed immediately after the treatment. The obtained images were processed with Metal Artifact Reduction. Five neurointerventional specialists reviewed the image quality before and after Metal Artifact Reduction. Observational and quantitative analyses (via image analysis software) were performed. Every aneurysm was successfully created and treated with stent-assisted coil embolization. Before Metal Artifact Reduction, coil loops protruding through the stent lumen were not visualized due to the prominent metal artifacts produced by the coils. These became visible after Metal Artifact Reduction processing. Contrast filling in the residual aneurysm was also visualized after Metal Artifact Reduction in every aneurysm. Both the observational (P < .0001) and quantitative (P < .001) analyses showed significant reduction of the metal artifacts after application of the Metal Artifact Reduction prototype software. The combination of high-resolution C-arm CT and Metal Artifact Reduction enables differentiation of the coil mass, stent, and contrast material on the same image by significantly reducing the metal artifacts produced by the platinum coils. This novel image technique may improve the assessment of aneurysms treated with stent-assisted coil embolization. © 2016 by American Journal of Neuroradiology.

  11. Automated identification of retained surgical items in radiological images

    NASA Astrophysics Data System (ADS)

    Agam, Gady; Gan, Lin; Moric, Mario; Gluncic, Vicko

    2015-03-01

    Retained surgical items (RSIs) in patients are a major operating room (OR) patient safety concern. An RSI is any surgical tool, sponge, needle or other item inadvertently left in a patient's body during the course of surgery. If left undetected, RSIs may lead to serious negative health consequences such as sepsis, internal bleeding, and even death. To help physicians efficiently and effectively detect RSIs, we are developing computer-aided detection (CADe) software for X-ray (XR) image analysis, utilizing large amounts of currently available image data to produce a clinically effective RSI detection system. Physician analysis of XRs for the purpose of RSI detection is a relatively lengthy process that may take up to 45 minutes to complete. It is also error prone due to the relatively low acuity of the human eye for RSIs in XR images. The system we are developing is based on computer vision and machine learning algorithms. We address the problem of low incidence by proposing synthesis algorithms. The CADe software we are developing may be integrated into a picture archiving and communication system (PACS), be implemented as a stand-alone software application, or be integrated into portable XR machine software through application programming interfaces. Preliminary experimental results on actual XR images demonstrate the effectiveness of the proposed approach.
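    As an illustration of the kind of patch-level machine-learning pipeline such a CADe system rests on (not the authors' actual software; the arrays below are random stand-ins for labelled X-ray patches), a minimal scikit-learn sketch follows.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import classification_report

      # Hypothetical data: 32x32 XR patches labelled 1 (RSI present) or 0 (absent).
      rng = np.random.default_rng(0)
      patches = rng.random((500, 32, 32))
      labels = rng.integers(0, 2, size=500)

      X = patches.reshape(len(patches), -1)          # flatten patches into feature vectors
      X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      clf.fit(X_train, y_train)                      # patch-level RSI classifier
      print(classification_report(y_test, clf.predict(X_test)))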

  12. Integration of LCoS-SLM and LabVIEW based software to simulate fundamental optics, wave optics, and Fourier optics

    NASA Astrophysics Data System (ADS)

    Lyu, Bo-Han; Wang, Chen; Tsai, Chun-Wei

    2017-08-01

    Jasper Display Corp. (JDC) offers a high-reflectivity, high-resolution Liquid Crystal on Silicon - Spatial Light Modulator (LCoS-SLM) which includes an associated controller ASIC and LabVIEW-based modulation software. Based on this LCoS-SLM, also called the Education Kit (EDK), we provide a training platform which offers a series of optical theory and experiments to university students. This EDK not only provides LabVIEW-based operation software to produce Computer Generated Holograms (CGH) that generate basic diffraction or holographic images, but also provides simulation software to verify the experimental results simultaneously. We believe that a robust LCoS-SLM, operation software, simulation software, training system, and training course can help students to study fundamental optics, wave optics, and Fourier optics more easily. Based on this fundamental knowledge, they can develop unique skills and create new innovations in optoelectronic applications in the future.

  13. gr-MRI: A software package for magnetic resonance imaging using software defined radios

    NASA Astrophysics Data System (ADS)

    Hasselwander, Christopher J.; Cao, Zhipeng; Grissom, William A.

    2016-09-01

    The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5 Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events was also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500 kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs.
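    For readers unfamiliar with GNU Radio, the sketch below shows the shape of a minimal flowgraph of the kind gr-MRI builds on: a signal source connected through a head block to a file sink. It is a generic illustration using the standard GNU Radio Python blocks, not gr-MRI's own pulse-sequence code.

      from gnuradio import gr, analog, blocks

      class ToneToFile(gr.top_block):
          """Generate a complex tone and record it to a file."""
          def __init__(self, samp_rate=1e6, tone_hz=100e3, n_samples=1000000):
              gr.top_block.__init__(self, "tone_to_file")
              src = analog.sig_source_c(samp_rate, analog.GR_COS_WAVE, tone_hz, 1.0)
              head = blocks.head(gr.sizeof_gr_complex, n_samples)   # stop after n samples
              sink = blocks.file_sink(gr.sizeof_gr_complex, "tone.dat")
              self.connect(src, head, sink)

      if __name__ == "__main__":
          ToneToFile().run()   # an MRI sequence would instead drive an SDR sink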

  14. Recoding low-level simulator data into a record of meaningful task performance: the integrated task modeling environment (ITME).

    PubMed

    King, Robert; Parker, Simon; Mouzakis, Kon; Fletcher, Winston; Fitzgerald, Patrick

    2007-11-01

    The Integrated Task Modeling Environment (ITME) is a user-friendly software tool that has been developed to automatically recode low-level data into an empirical record of meaningful task performance. The present research investigated and validated the performance of the ITME software package by conducting complex simulation missions and comparing the task analyses produced by ITME with task analyses produced by experienced video analysts. A very high interrater reliability (≥ .94) existed between experienced video analysts and the ITME for the task analyses produced for each mission. The mean session time:analysis time ratio was 1:24 using video analysis techniques and 1:5 using the ITME. It was concluded that the ITME produced task analyses that were as reliable as those produced by experienced video analysts, and significantly reduced the time cost associated with these analyses.

  15. Overview of the TriBITS Lifecycle Model: Lean/Agile Software Lifecycle Model for Research-based Computational Science and Engineering Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  16. The Legacy of Space Shuttle Flight Software

    NASA Technical Reports Server (NTRS)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.

  17. The NCC project: A quality management perspective

    NASA Technical Reports Server (NTRS)

    Lee, Raymond H.

    1993-01-01

    The Network Control Center (NCC) Project introduced the concept of total quality management (TQM) in mid-1990. The CSC project team established a program which focused on continuous process improvement in software development methodology and consistent deliveries of high quality software products for the NCC. The vision of the TQM program was to produce error free software. Specific goals were established to allow continuing assessment of the progress toward meeting the overall quality objectives. The total quality environment, now a part of the NCC Project culture, has become the foundation for continuous process improvement and has resulted in the consistent delivery of quality software products over the last three years.

  18. Tactical Approaches for Making a Successful Satellite Passive Microwave ESDR

    NASA Astrophysics Data System (ADS)

    Hardman, M.; Brodzik, M. J.; Gotberg, J.; Long, D. G.; Paget, A. C.

    2014-12-01

    Our NASA MEaSUREs project is producing a new, enhanced resolution gridded Earth System Data Record for the entire satellite passive microwave (SMMR, SSM/I-SSMIS and AMSR-E) time series. Our project goals are twofold: to produce a well-documented, consistently processed, high-quality historical record at higher spatial resolutions than have previously been available, and to transition the production software to the NSIDC DAAC for ongoing processing after our project completion. In support of these goals, our distributed team at BYU and NSIDC faces project coordination challenges to produce a high-quality data set that our user community will accept as a replacement for the currently available historical versions of these data. We work closely with our DAAC liaison on format specifications, data and metadata plans, and project progress. In order for the user community to understand and support our project, we have solicited a team of Early Adopters who are reviewing and evaluating a prototype version of the data. Early Adopter feedback will be critical input to our final data content and format decisions. For algorithm transparency and accountability, we have released an Algorithm Theoretical Basis Document (ATBD) and detailed supporting technical documentation, with rationale for all algorithm implementation decisions. For distributed team management, we are using collaborative tools for software revision control and issue tracking. For reliably transitioning a research-quality image reconstruction software system to production-quality software suitable for use at the DAAC, we have adopted continuous integration methods for running automated regression testing. Our presentation will summarize both advantages and challenges of each of these tactics in ensuring production of a successful ESDR and an enduring production software system.

  19. New generation of exploration tools: interactive modeling software and microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  20. PyNN: A Common Interface for Neuronal Network Simulators.

    PubMed

    Davison, Andrew P; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN.
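    The write-once, run-on-any-simulator idea can be seen in a minimal PyNN-style script such as the one below; class and argument names follow the modern PyNN API and may differ in detail from the 2008 release described here.

      import pyNN.nest as sim   # swap for pyNN.neuron, etc., to change the backend

      sim.setup(timestep=0.1)

      stim = sim.Population(100, sim.SpikeSourcePoisson(rate=20.0))
      cells = sim.Population(100, sim.IF_cond_exp())
      sim.Projection(stim, cells,
                     sim.FixedProbabilityConnector(p_connect=0.1),
                     sim.StaticSynapse(weight=0.01, delay=1.0))

      cells.record("spikes")
      sim.run(1000.0)                       # simulate 1000 ms
      segment = cells.get_data().segments[0]
      print(len(segment.spiketrains), "spike trains recorded")
      sim.end()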

  1. PyNN: A Common Interface for Neuronal Network Simulators

    PubMed Central

    Davison, Andrew P.; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN. PMID:19194529

  2. Cooperative GN&C development in a rapid prototyping environment. [flight software design for space vehicles

    NASA Technical Reports Server (NTRS)

    Bordano, Aldo; Uhde-Lacovara, JO; Devall, Ray; Partin, Charles; Sugano, Jeff; Doane, Kent; Compton, Jim

    1993-01-01

    The Navigation, Control and Aeronautics Division (NCAD) at NASA-JSC is exploring ways of producing Guidance, Navigation and Control (GN&C) flight software faster, better, and cheaper. To achieve these goals NCAD established two hardware/software facilities that take an avionics design project from initial inception through high fidelity real-time hardware-in-the-loop testing. Commercially available software products are used to develop the GN&C algorithms in block diagram form and then automatically generate source code from these diagrams. A high fidelity real-time hardware-in-the-loop laboratory provides users with the capability to analyze mass memory usage within the targeted flight computer, verify hardware interfaces, conduct system level verification, performance, acceptance testing, as well as mission verification using reconfigurable and mission unique data. To evaluate these concepts and tools, NCAD embarked on a project to build a real-time 6 DOF simulation of the Soyuz Assured Crew Return Vehicle flight software. To date, a productivity increase of 185 percent has been seen over traditional NASA methods for developing flight software.

  3. The Implementation of Satellite Control System Software Using Object Oriented Design

    NASA Technical Reports Server (NTRS)

    Anderson, Mark O.; Reid, Mark; Drury, Derek; Hansell, William; Phillips, Tom

    1998-01-01

    NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions that can be launched into low earth orbit by small expendable vehicles. The development schedule for each SMEX spacecraft was three years from start to launch. The SMEX program has produced five satellites; Solar Anomalous and Magnetospheric Particle Explorer (SAMPEX), Fast Auroral Snapshot Explorer (FAST), Submillimeter Wave Astronomy Satellite (SWAS), Transition Region and Coronal Explorer (TRACE) and Wide-Field Infrared Explorer (WIRE). SAMPEX and FAST are on-orbit, TRACE is scheduled to be launched in April of 1998, WIRE is scheduled to be launched in September of 1998, and SWAS is scheduled to be launched in January of 1999. In each of these missions, the Attitude Control System (ACS) software was written using a modular procedural design. Current program goals require complete spacecraft development within 18 months. This requirement has increased pressure to write reusable flight software. Object-Oriented Design (OOD) offers the constructs for developing an application that only needs modification for mission unique requirements. This paper describes the OOD that was used to develop the SMEX-Lite ACS software. The SMEX-Lite ACS is three-axis controlled, momentum stabilized, and is capable of performing sub-arc-minute pointing. The paper first describes the high level requirements which governed the architecture of the SMEX-Lite ACS software. Next, the context in which the software resides is explained. The paper describes the benefits of encapsulation, inheritance and polymorphism with respect to the implementation of an ACS software system. This paper will discuss the design of several software components that comprise the ACS software. Specifically, Object-Oriented designs are presented for sensor data processing, attitude control, attitude determination and failure detection. The paper addresses the benefits of the OOD versus a conventional procedural design. The final discussion in this paper will address the establishment of the ACS Foundation Class (AFC) Library. The AFC is a large software repository, requiring a minimal amount of code modifications to produce ACS software for future projects, saving production time and costs.
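    The benefit of programming to an abstract sensor interface, which the paper attributes to encapsulation, inheritance and polymorphism, can be sketched in a few lines; the classes below are hypothetical illustrations, not the AFC library, and use Python rather than the flight code's language.

      from abc import ABC, abstractmethod

      class Sensor(ABC):
          """Each sensor encapsulates its own raw-data handling."""
          @abstractmethod
          def read(self) -> dict: ...

      class SunSensor(Sensor):
          def read(self) -> dict:
              return {"sun_vector": (1.0, 0.0, 0.0)}      # placeholder measurement

      class Magnetometer(Sensor):
          def read(self) -> dict:
              return {"b_field": (0.0, 2.5e-5, 0.0)}      # placeholder measurement

      class AttitudeDeterminer:
          """Depends only on the Sensor interface, so new sensor types
          plug in without modifying this class (polymorphism)."""
          def __init__(self, sensors):
              self.sensors = sensors
          def estimate(self) -> dict:
              merged = {}
              for s in self.sensors:
                  merged.update(s.read())
              return merged

      print(AttitudeDeterminer([SunSensor(), Magnetometer()]).estimate())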

  4. 15 CFR 740.9 - Temporary imports, exports, and reexports (TMP).

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... the end of the beta test period as defined by the software producer or, if the software producer does... States; and exports and reexports of beta test software. (a) Temporary exports and reexports—(1) Scope. You may export and reexport commodities and software for temporary use abroad (including use in...

  5. 15 CFR 740.9 - Temporary imports, exports, reexports, and transfers (in-country) (TMP).

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... commodities and software may be placed in a bonded warehouse or a storage facility provided that the exporter... the end of the beta test period as defined by the software producer or, if the software producer does... software. (a) Temporary exports, reexports, and transfers (in-country). License Exception TMP authorizes...

  6. 15 CFR 740.9 - Temporary imports, exports, and reexports (TMP).

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the end of the beta test period as defined by the software producer or, if the software producer does... States; and exports and reexports of beta test software. (a) Temporary exports and reexports—(1) Scope. You may export and reexport commodities and software for temporary use abroad (including use in...

  7. Learning to Write Programs with Others: Collaborative Quadruple Programming

    ERIC Educational Resources Information Center

    Arora, Ritu; Goel, Sanjay

    2012-01-01

    Most software development is carried out by teams of software engineers working collaboratively to achieve the desired goal. Consequently software development education not only needs to develop a student's ability to write programs that can be easily comprehended by others and be able to comprehend programs written by others, but also the ability…

  8. Checklists for the Evaluation of Educational Software: Critical Review and Prospects.

    ERIC Educational Resources Information Center

    Tergan, Sigmar-Olaf

    1998-01-01

    Reviews strengths and weaknesses of check lists for the evaluation of computer software and outlines consequences for their practical application. Suggests an approach based on an instructional design model and a comprehensive framework to cope with problems of validity and predictive power of software evaluation. Discusses prospects of the…

  9. Secure it now or secure it later: the benefits of addressing cyber-security from the outset

    NASA Astrophysics Data System (ADS)

    Olama, Mohammed M.; Nutaro, James

    2013-05-01

    The majority of funding for research and development (R&D) in cyber-security is focused on the end of the software lifecycle where systems have been deployed or are nearing deployment. Recruiting of cyber-security personnel is similarly focused on end-of-life expertise. By emphasizing cyber-security at these late stages, security problems are found and corrected when it is most expensive to do so, thus increasing the cost of owning and operating complex software systems. Worse, expenditures on expensive security measures often mean less money for innovative developments. These unwanted increases in cost and potential slowing of innovation are unavoidable consequences of an approach to security that finds and remediates faults after software has been implemented. We argue that software security can be improved and the total cost of a software system can be substantially reduced by an appropriate allocation of resources to the early stages of a software project. By adopting a similar allocation of R&D funds to the early stages of the software lifecycle, we propose that the costs of cyber-security can be better controlled and, consequently, the positive effects of this R&D on industry will be much more pronounced.

  10. MapEdit: solution to continuous raster map creation

    NASA Astrophysics Data System (ADS)

    Rančić, Dejan; Djordjevi-Kajan, Slobodanka

    2003-03-01

    The paper describes MapEdit, MS Windows™ software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps which can be used as backgrounds in geographical information systems. The process of continuous raster map creation using the MapEdit "mosaicking" function is also described, as well as the georeferencing and rectification algorithms used in MapEdit. Our approach to georeferencing and rectification, using four control points and two linear transformations for each scanned map part together with a nearest neighbor resampling method, represents a low-cost, high-speed solution that produces continuous raster maps with satisfactory quality for many purposes (±1 pixel). A quality assessment of several continuous raster maps at different scales created using our software and methodology has been undertaken, and the results are presented in the paper. For the quality control of the produced raster maps we referred to three widely adopted standards: the US Standard for Digital Cartographic Data, the National Standard for Spatial Data Accuracy and the US National Map Accuracy Standard. The results obtained during the quality assessment process are given in the paper and show that our maps meet all three standards.
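    The general recipe of fitting a pixel-to-map transform from control points and then resampling by nearest neighbour can be sketched with NumPy as below; this is a generic illustration (a single affine fit over hypothetical control points), not MapEdit's exact two-transformation scheme.

      import numpy as np

      # Hypothetical control points: (col, row) in the scan -> (X, Y) map coordinates.
      pix = np.array([[10, 10], [990, 12], [15, 740], [985, 735]], dtype=float)
      geo = np.array([[500000, 4200000], [501000, 4200000],
                      [500000, 4199250], [501000, 4199250]], dtype=float)

      # Least-squares affine fit: geo = [col, row, 1] @ A, with A of shape (3, 2).
      A = np.linalg.lstsq(np.c_[pix, np.ones(len(pix))], geo, rcond=None)[0]

      def pix_to_geo(col, row):
          return np.array([col, row, 1.0]) @ A

      def resample_nearest(img, geo_to_pix, out_shape):
          """Nearest-neighbour resampling: each output cell copies the closest
          source pixel given an inverse mapping from output cell to scan pixel."""
          out = np.zeros(out_shape, dtype=img.dtype)
          for r in range(out_shape[0]):
              for c in range(out_shape[1]):
                  src_c, src_r = geo_to_pix(c, r)
                  src_c, src_r = int(round(src_c)), int(round(src_r))
                  if 0 <= src_r < img.shape[0] and 0 <= src_c < img.shape[1]:
                      out[r, c] = img[src_r, src_c]
          return out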

  11. iStethoscope: a demonstration of the use of mobile devices for auscultation.

    PubMed

    Bentley, Peter J

    2015-01-01

    iStethoscope Pro is the first piece of software (an "App") produced for iOS devices, which enabled users to exploit their smartphones, music players, or tablets as stethoscopes. The software exploits the built-in microphone (and supports externally added microphones) and performs real-time amplification and filtering to enable heart sounds to be heard with high fidelity. The software also enables the heart sounds to be recorded, analyzed using a spectrogram, and to be transmitted to others via e-mail. This chapter describes the motivation, functionality, and results from this work.
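    The amplification and band-pass filtering the App performs can be approximated offline in a few lines of SciPy; the band edges and sample rate below are illustrative values for heart sounds, not the App's actual filter design.

      import numpy as np
      from scipy.signal import butter, lfilter

      fs = 8000                                     # assumed microphone sample rate (Hz)
      t = np.arange(0, 2.0, 1.0 / fs)
      raw = np.random.randn(t.size)                 # stand-in for a raw microphone signal

      # Band-pass roughly covering the band where heart sounds carry most energy.
      b, a = butter(4, [20.0, 200.0], btype="bandpass", fs=fs)
      filtered = lfilter(b, a, raw)

      gain = 20.0                                   # simple amplification stage
      output = np.clip(gain * filtered, -1.0, 1.0)  # clip to the playback range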

  12. Consideration of Collision "Consequence" in Satellite Conjunction Assessment and Risk Analysis

    NASA Technical Reports Server (NTRS)

    Hejduk, M.; Laporte, F.; Moury, M.; Newman, L.; Shepperd, R.

    2017-01-01

    Classic risk management theory requires the assessment of both likelihood and consequence of deleterious events. Satellite conjunction risk assessment has produced a highly-developed theory for assessing collision likelihood but holds a completely static solution for collision consequence, treating all potential collisions as essentially equally worrisome. This may be true for the survival of the protected asset, but the amount of debris produced by the potential collision, and therefore the degree to which the orbital corridor may be compromised, can vary greatly among satellite conjunctions. This study leverages present work on satellite collision modeling to develop a method by which it can be estimated, to a particular confidence level, whether a particular collision is likely to produce a relatively large or relatively small amount of resultant debris and how this datum might alter conjunction remediation decisions. The more general question of orbital corridor protection is also addressed, and a preliminary framework presented by which both collision likelihood and consequence can be jointly considered in the risk assessment process.
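    The joint treatment the abstract argues for can be caricatured in a few lines: weight the collision probability by a normalized consequence term. The thresholds and debris counts below are invented for the illustration and are not the study's model.

      def conjunction_risk(p_collision, expected_debris, debris_scale=1000.0):
          """Toy joint risk score: likelihood times a normalized consequence
          term (expected number of trackable debris pieces)."""
          consequence = min(expected_debris / debris_scale, 1.0)
          return p_collision * consequence

      # Two hypothetical conjunctions with equal likelihood but different consequence:
      print(conjunction_risk(1e-4, expected_debris=50))     # low-energy, few fragments
      print(conjunction_risk(1e-4, expected_debris=5000))   # catastrophic fragmentation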

  13. gr-MRI: A software package for magnetic resonance imaging using software defined radios.

    PubMed

    Hasselwander, Christopher J; Cao, Zhipeng; Grissom, William A

    2016-09-01

    The goal of this work is to develop software that enables the rapid implementation of custom MRI spectrometers using commercially-available software defined radios (SDRs). The developed gr-MRI software package comprises a set of Python scripts, flowgraphs, and signal generation and recording blocks for GNU Radio, an open-source SDR software package that is widely used in communications research. gr-MRI implements basic event sequencing functionality, and tools for system calibrations, multi-radio synchronization, and MR signal processing and image reconstruction. It includes four pulse sequences: a single-pulse sequence to record free induction signals, a gradient-recalled echo imaging sequence, a spin echo imaging sequence, and an inversion recovery spin echo imaging sequence. The sequences were used to perform phantom imaging scans with a 0.5Tesla tabletop MRI scanner and two commercially-available SDRs. One SDR was used for RF excitation and reception, and the other for gradient pulse generation. The total SDR hardware cost was approximately $2000. The frequency of radio desynchronization events and the frequency with which the software recovered from those events was also measured, and the SDR's ability to generate frequency-swept RF waveforms was validated and compared to the scanner's commercial spectrometer. The spin echo images geometrically matched those acquired using the commercial spectrometer, with no unexpected distortions. Desynchronization events were more likely to occur at the very beginning of an imaging scan, but were nearly eliminated if the user invoked the sequence for a short period before beginning data recording. The SDR produced a 500kHz bandwidth frequency-swept pulse with high fidelity, while the commercial spectrometer produced a waveform with large frequency spike errors. In conclusion, the developed gr-MRI software can be used to develop high-fidelity, low-cost custom MRI spectrometers using commercially-available SDRs. Copyright © 2016. Published by Elsevier Inc.

  14. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems typically is very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking become more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is still rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance to the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the tool SCADE Suite (Esterel Technology), an integrated design environment that covers all the requirements for our methodology. The SCADE Suite is well established in avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  15. Pure climb creep mechanism drives flow in Earth’s lower mantle

    PubMed Central

    Boioli, Francesca; Carrez, Philippe; Cordier, Patrick; Devincre, Benoit; Gouriet, Karine; Hirel, Pierre; Kraych, Antoine; Ritterbex, Sebastian

    2017-01-01

    At high pressure prevailing in the lower mantle, lattice friction opposed to dislocation glide becomes very high, as reported in recent experimental and theoretical studies. We examine the consequences of this high resistance to plastic shear exhibited by ringwoodite and bridgmanite on creep mechanisms under mantle conditions. To evaluate the consequences of this effect, we model dislocation creep by dislocation dynamics. The calculation yields an original dominant creep behavior for lower mantle silicates where strain is produced by dislocation climb, which is very different from what can be activated under high stresses under laboratory conditions. This mechanism, named pure climb creep, is grain-size–insensitive and produces no crystal preferred orientation. In comparison to the previously considered diffusion creep mechanism, it is also a more efficient strain-producing mechanism for grain sizes larger than ca. 0.1 mm. The specificities of pure climb creep well match the seismic anisotropy observed in Earth's lower mantle. PMID:28345037

  16. Development of an accurate and high-throughput methodology for structural comprehension of chlorophylls derivatives. (I) Phytylated derivatives.

    PubMed

    Chen, Kewei; Ríos, José Julián; Pérez-Gálvez, Antonio; Roca, María

    2015-08-07

    Phytylated chlorophyll derivatives undergo specific oxidative reactions through the natural metabolism or during food processing or storage, and consequently pyro-, 13(2)-hydroxy-, 15(1)-hydroxy-lactone chlorophylls, and pheophytins (a and b) originate. New analytical procedures have been developed here to reproduce controlled oxidation reactions that specifically, and in reasonable amounts, produce those natural target standards. At the same time and under the same conditions, 16 natural chlorophyll derivatives have been analyzed by APCI-HPLC-hrMS(2), most of them for the first time. The combination of the high-resolution MS mode with powerful post-processing software has allowed the identification of new fragmentation patterns, characterizing specific product ions for some particular standards. In addition, new hypotheses and reaction mechanisms for the established MS(2)-based reactions have been proposed. As a general rule, the main product ions involve the phytyl and the propionic chains, but the introduction of oxygenated functional groups at the isocyclic ring produces new and specific product ions and at the same time inhibits some particular fragmentations. It is noteworthy that all b derivatives, except 15(1)-hydroxy-lactone compounds, undergo specific CO losses. We propose a new reaction mechanism based on the structural configuration of a and b chlorophyll derivatives that explains the exclusive CO fragmentation in all b series except for 15(1)-hydroxy-lactone b and all a series compounds. Copyright © 2015 Elsevier B.V. All rights reserved.

  17. Integrating existing software toolkits into VO system

    NASA Astrophysics Data System (ADS)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. A large number of high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for the VO developers. VO architecture greatly depends on Grid and Web services, consequently the general VO integration route is "Java Ready - Grid Ready - VO Ready". In the paper, we discuss the importance of VO integration for existing toolkits and discuss the possible solutions. We introduce two efforts in the field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert Grid services to VO services.

  18. Architecture of a framework for providing information services for public transport.

    PubMed

    García, Carmelo R; Pérez, Ricardo; Lorenzo, Alvaro; Quesada-Arencibia, Alexis; Alayón, Francisco; Padrón, Gabino

    2012-01-01

    This paper presents OnRoute, a framework for developing and running ubiquitous software that provides information services to passengers of public transportation, including payment systems and on-route guidance services. To achieve a high level of interoperability, accessibility and context awareness, OnRoute uses the ubiquitous computing paradigm. To guarantee the quality of the software produced, the reliable software principles used in critical contexts, such as automotive systems, are also considered by the framework. The main components of its architecture (run-time, system services, software components and development discipline) and how they are deployed in the transportation network (stations and vehicles) are described in this paper. Finally, to illustrate the use of OnRoute, the development of a guidance service for travellers is explained.

  19. A new ImageJ plug-in "ActogramJ" for chronobiological analyses.

    PubMed

    Schmid, Benjamin; Helfrich-Förster, Charlotte; Yoshii, Taishi

    2011-10-01

    While the rapid development of personal computers and high-throughput recording systems for circadian rhythms allow chronobiologists to produce huge amounts of data, the software to analyze them often lags behind. Here, we announce newly developed chronobiology software that is easy to use, compatible with many different systems, and freely available. Our system can perform the most frequently used analyses: actogram drawing, periodogram analysis, and waveform analysis. The software is distributed as a pure Java plug-in for ImageJ and so works on the 3 main operating systems: Linux, Macintosh, and Windows. We believe that this free software raises the speed of data analyses and makes studying chronobiology accessible to newcomers. © 2011 The Author(s)
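    The periodogram step can be reproduced outside ImageJ with SciPy; the sketch below uses a Lomb-Scargle periodogram on synthetic activity data (ActogramJ itself is a Java plug-in and also offers chi-square periodograms), so it illustrates the analysis rather than the plug-in's code.

      import numpy as np
      from scipy.signal import lombscargle

      # Hypothetical activity record: 10 days sampled every 6 min, ~24.5 h rhythm.
      t_hours = np.arange(0, 240, 0.1)
      activity = (1.0 + np.sin(2 * np.pi * t_hours / 24.5)
                  + 0.3 * np.random.randn(t_hours.size))

      periods = np.linspace(20.0, 28.0, 400)        # candidate periods (hours)
      ang_freqs = 2 * np.pi / periods               # lombscargle expects angular frequencies
      power = lombscargle(t_hours, activity - activity.mean(), ang_freqs)

      print("Estimated period: %.2f h" % periods[np.argmax(power)])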

  20. Apple's Macintosh.

    ERIC Educational Resources Information Center

    Miller, Michael J.

    1984-01-01

    Description of the Macintosh personal, educational, and business computer produced by Apple covers cost; physical characteristics including display devices, circuit boards, and built-in features; company-produced software; third-party produced software; memory and storage capacity; word-processing features; and graphics capabilities. (MBR)

  1. RMP Guidance for Offsite Consequence Analysis

    EPA Pesticide Factsheets

    Offsite consequence analysis (OCA) consists of a worst-case release scenario and alternative release scenarios. OCA is required from facilities with chemicals above threshold quantities. RMP*Comp software can be used to perform calculations described here.

  2. Combining High-Speed Cameras and Stop-Motion Animation Software to Support Students' Modeling of Human Body Movement

    ERIC Educational Resources Information Center

    Lee, Victor R.

    2015-01-01

    Biomechanics, and specifically the biomechanics associated with human movement, is a potentially rich backdrop against which educators can design innovative science teaching and learning activities. Moreover, the use of technologies associated with biomechanics research, such as high-speed cameras that can produce high-quality slow-motion video,…

  3. Needs challenge software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-07-01

    New hardware and software tools build on existing platforms and add performance and ease-of-use benefits as the struggle to find and produce hydrocarbons at the lowest cost becomes more and more competitive. Software tools now provide geoscientists and petroleum engineers with a better understanding of reservoirs, from the shape and makeup of formations to behavior projections as hydrocarbons are extracted. Petroleum software tools allow scientists to simulate oil flow, predict the life expectancy of a reservoir, and even help determine how to extend the life and economic viability of the reservoir. The requirement of the petroleum industry to find and extract petroleum more efficiently drives the solutions provided by software and service companies. To one extent or another, most of the petroleum software products available today have achieved an acceptable level of competency. Innovative, high-impact products from small, focussed companies often were bought out by larger companies with deeper pockets if their developers couldn't fund their expansion. Other products disappeared from the scene, because they were unable to evolve fast enough to compete. There are still enough small companies around producing excellent products to prevent the marketplace from feeling too narrow and lacking in choice. Oil companies requiring specific solutions to their problems have helped fund product development within the commercial sector. As the industry has matured, strategic alliances between vendors, both hardware and software, have provided market advantages, often combining strengths to enter new and undeveloped areas for technology. The pace of technological development has been fast and constant.

  4. Approaches in highly parameterized inversion—PEST++ Version 3, a Parameter ESTimation and uncertainty analysis software suite optimized for large environmental models

    USGS Publications Warehouse

    Welter, David E.; White, Jeremy T.; Hunt, Randall J.; Doherty, John E.

    2015-09-18

    The PEST++ Version 3 software suite can be compiled for Microsoft Windows® and Linux® operating systems; the source code is available in a Microsoft Visual Studio® 2013 solution; Linux Makefiles are also provided. PEST++ Version 3 continues to build a foundation for an open-source framework capable of producing robust and efficient parameter estimation tools for large environmental models.
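    PEST++ itself is driven by control files rather than a Python API; purely as an illustration of the underlying task (nonlinear least-squares calibration of model parameters against observations), a SciPy sketch with a hypothetical two-parameter decay model is shown below.

      import numpy as np
      from scipy.optimize import least_squares

      def model(params, t):
          k, c0 = params
          return c0 * np.exp(-k * t)                 # hypothetical "environmental model"

      t_obs = np.linspace(0, 10, 25)
      obs = model((0.35, 4.0), t_obs) + 0.05 * np.random.randn(t_obs.size)

      def residuals(params):
          return model(params, t_obs) - obs

      fit = least_squares(residuals, x0=[0.1, 1.0], method="lm")   # Levenberg-Marquardt
      print("Estimated parameters:", fit.x)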

  5. Choices and Consequences.

    ERIC Educational Resources Information Center

    Thorp, Carmany

    1995-01-01

    Describes student use of Hyperstudio computer software to create history adventure games. History came alive while students learned efficient writing skills; learned to understand and manipulate cause, effect, choice, and consequence; and learned to incorporate succinct locational, climatic, and historical detail. (ET)

  6. GPS Software Packages Deliver Positioning Solutions

    NASA Technical Reports Server (NTRS)

    2010-01-01

    "To determine a spacecraft s position, the Jet Propulsion Laboratory (JPL) developed an innovative software program called the GPS (global positioning system)-Inferred Positioning System and Orbit Analysis Simulation Software, abbreviated as GIPSY-OASIS, and also developed Real-Time GIPSY (RTG) for certain time-critical applications. First featured in Spinoff 1999, JPL has released hundreds of licenses for GIPSY and RTG, including to Longmont, Colorado-based DigitalGlobe. Using the technology, DigitalGlobe produces satellite imagery with highly precise latitude and longitude coordinates and then supplies it for uses within defense and intelligence, civil agencies, mapping and analysis, environmental monitoring, oil and gas exploration, infrastructure management, Internet portals, and navigation technology."

  7. ORBS: A reduction software for SITELLE and SpiOMM data

    NASA Astrophysics Data System (ADS)

    Martin, Thomas

    2014-09-01

    ORBS merges, corrects, transforms and calibrates interferometric data cubes and produces a spectral cube of the observed region for analysis. It is fully automatic data reduction software for use with SITELLE (installed at the Canada-France-Hawaii Telescope) and SpIOMM (a prototype attached to the Observatoire du Mont Mégantic); these imaging Fourier transform spectrometers obtain a hyperspectral data cube which samples a 12 arc-minute field of view into 4 million visible spectra. ORBS is highly parallelized; its core classes (ORB) have been designed to be used in a suite of software for data analysis (ORCS and OACS), data simulation (ORUS) and data acquisition (IRIS).
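    At the heart of any imaging FTS reduction is a per-pixel Fourier transform from interferogram to spectrum; the toy NumPy sketch below shows only that step (a real pipeline such as ORBS also applies phase, wavelength and flux calibrations).

      import numpy as np

      # Hypothetical interferogram for one pixel: two emission lines plus noise.
      n_steps = 1024
      opd = np.arange(n_steps)                       # optical path difference samples
      interferogram = (np.cos(2 * np.pi * 0.12 * opd)
                       + 0.5 * np.cos(2 * np.pi * 0.21 * opd)
                       + 0.05 * np.random.randn(n_steps))

      spectrum = np.abs(np.fft.rfft(interferogram - interferogram.mean()))
      wavenumber_bins = np.fft.rfftfreq(n_steps)     # cycles per OPD step

      print("Strongest line at %.3f cycles/step" % wavenumber_bins[np.argmax(spectrum)])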

  8. Requirements UML Tool (RUT) Expanded for Extreme Programming (CI02)

    NASA Technical Reports Server (NTRS)

    McCoy, James R.

    2003-01-01

    A procedure for capturing and managing system requirements that incorporates XP user stories. Because costs associated with identifying problems in requirements increase dramatically over the lifecycle of a project, a method for identifying sources of software risk in user stories is urgently needed. This initiative aims to determine a set of guidelines for user stories that will result in high-quality requirements. To further this initiative, a tool is needed to analyze user stories that can assess the quality of individual user stories, detect sources of software risk, produce software metrics, and identify areas in user stories that can be improved.

  9. Do High Dynamic Range treatments improve the results of Structure from Motion approaches in Geomorphology?

    NASA Astrophysics Data System (ADS)

    Gómez-Gutiérrez, Álvaro; Juan de Sanjosé-Blasco, José; Schnabel, Susanne; de Matías-Bejarano, Javier; Pulido-Fernández, Manuel; Berenguer-Sempere, Fernando

    2015-04-01

    In this work, the hypothesis that 3D models obtained with Structure from Motion (SfM) approaches can be improved by using images pre-processed with High Dynamic Range (HDR) techniques is tested. Photographs of the Veleta Rock Glacier in Spain were captured with different exposure values (EV0, EV+1 and EV-1), two focal lengths (35 and 100 mm) and under different weather conditions for the years 2008, 2009, 2011, 2012 and 2014. HDR images were produced using the different EV steps within the Fusion F.1 software. Point clouds were generated using commercial and freely available SfM software: Agisoft Photoscan and 123D Catch. Models obtained using pre-processed and non-preprocessed images were compared in a 3D environment with a benchmark 3D model obtained by means of a Terrestrial Laser Scanner (TLS). A total of 40 point clouds were produced, georeferenced and compared. Results indicated that for the Agisoft Photoscan software, differences in accuracy between models obtained with pre-processed and non-preprocessed images were not statistically significant. However, in the case of the freely available software 123D Catch, models obtained using images pre-processed by HDR techniques presented a higher point density and were more accurate. This tendency was observed across the 5 studied years and under different capture conditions. More work should be done in the near future to corroborate whether the results of similar software packages can be improved by HDR techniques (e.g. ARC3D, Bundler and PMVS2, CMP SfM, Photosynth and VisualSFM).
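    The cloud-to-cloud comparison against the TLS benchmark reduces to nearest-neighbour distances between point sets; a minimal SciPy sketch is given below, with random arrays standing in for the real SfM and TLS clouds.

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(1)
      tls_cloud = rng.random((20000, 3)) * 100.0          # benchmark TLS cloud (x, y, z in m)
      sfm_cloud = tls_cloud[:15000] + rng.normal(0, 0.05, (15000, 3))  # noisy SfM model

      tree = cKDTree(tls_cloud)
      dist, _ = tree.query(sfm_cloud, k=1)                # nearest TLS point per SfM point

      print("Mean cloud-to-cloud distance: %.3f m" % dist.mean())
      print("RMS distance: %.3f m" % np.sqrt((dist ** 2).mean()))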

  10. Software Fault Tolerance: A Tutorial

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2000-01-01

    Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. Compounding the problems in building correct software is the difficulty in assessing the correctness of software for highly complex systems. After a brief overview of the software development processes, we note how hard-to-detect design faults are likely to be introduced during development and how software faults tend to be state-dependent and activated by particular input sequences. Although component reliability is an important quality measure for system level analysis, software reliability is hard to characterize and the use of post-verification reliability estimates remains a controversial issue. For some applications software safety is more important than reliability, and fault tolerance techniques used in those applications are aimed at preventing catastrophes. Single version software fault tolerance techniques discussed include system structuring and closure, atomic actions, inline fault detection, exception handling, and others. Multiversion techniques are based on the assumption that software built differently should fail differently and thus, if one of the redundant versions fails, it is expected that at least one of the other versions will provide an acceptable output. Recovery blocks, N-version programming, and other multiversion techniques are reviewed.
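    Of the single- and multi-version techniques listed, the recovery block is the easiest to sketch: run the primary version, apply an acceptance test, and fall back to alternate versions if the test fails. The simplified Python sketch below omits the checkpoint-and-restore of state that real recovery blocks perform before each alternate.

      def recovery_block(alternates, acceptance_test, *args):
          """Try each version in order; return the first result that passes
          the acceptance test, otherwise raise."""
          for version in alternates:
              try:
                  result = version(*args)
                  if acceptance_test(result):
                      return result
              except Exception:
                  pass             # a crashed alternate is treated like a rejected result
          raise RuntimeError("all alternates failed the acceptance test")

      # Example: primary square root vs. an independently coded Newton fallback.
      def primary_sqrt(x):
          return x ** 0.5

      def backup_sqrt(x):
          guess = x or 1.0
          for _ in range(50):
              guess = 0.5 * (guess + x / guess)
          return guess

      x = 2.0
      accept = lambda r: r >= 0 and abs(r * r - x) < 1e-6
      print(recovery_block([primary_sqrt, backup_sqrt], accept, x))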

  11. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    NASA Astrophysics Data System (ADS)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized super-computers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers using 'non-overlapping discretizations' have produced the DVS-Software which overcomes this limitation [2]. The DVS-software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90%, or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk, in this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MOD-FLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM)REFERENCES [1]. Herrera Ismael and George F. Pinder, Mathematical Modelling in Science and Engineering: An axiomatic approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz L.M. and Rosas-Medina A. "Non Overlapping Discretization Methods for Partial, Differential Equations". NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num 21852. (Open source) [3]. Herrera, I., & Contreras Iván "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity". Geofísica Internacional, 2015 (In press)

  12. Physiology and quality of fresh-cut produce in CA/MA storage

    USDA-ARS?s Scientific Manuscript database

    Fresh-cut fruits and vegetables have exposed injured tissues due to the mechanical processes of peeling, slicing and/or cutting. Such processing consequently renders the produce highly susceptible to physiological breakdown and microbial spoilage. Product deterioration is usually accompanied with ph...

  13. Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.

    PubMed

    Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko

    2017-11-01

To develop analysis software for cultured human corneal endothelial cells (HCECs). Software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with the software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinctly outlined after phosphate-buffered saline treatment and were recognized by the cell analysis software. The cell density value provided by the software was similar to that obtained using manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by the software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables objective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.
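
    The morphometric parameters named in the abstract (cell density, coefficient of variation of cell area, and polygonality) can be computed with a few lines once segmentation is done. The sketch below is a hypothetical illustration, not the published software; it assumes a per-cell list of areas in µm² and a per-cell neighbour count are already available.

    ```python
    import statistics

    def endothelial_metrics(cell_areas_um2, neighbour_counts):
        """Summary morphometrics from already-segmented cells (illustrative only)."""
        n = len(cell_areas_um2)
        mean_area = statistics.mean(cell_areas_um2)             # µm² per cell
        density = 1e6 / mean_area                               # cells per mm²
        cv = statistics.pstdev(cell_areas_um2) / mean_area      # coefficient of variation
        hex_fraction = sum(1 for k in neighbour_counts if k == 6) / n  # six-sided cells
        return {"cell_density_per_mm2": density,
                "coefficient_of_variation": cv,
                "polygonality_hex_fraction": hex_fraction}

    # Example with invented numbers
    print(endothelial_metrics([380, 410, 395, 450, 365], [6, 6, 5, 7, 6]))
    ```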

  14. Produce Live News Broadcasts Using Standard AV Equipment: A Success Story from the Le Center High School in Minnesota.

    ERIC Educational Resources Information Center

    Rostad, John

    1997-01-01

    Describes the production of news broadcasts on video by a high school class in Le Center, Minnesota. Topics include software for Apple computers, equipment used, student responsibilities, class curriculum, group work, communication among the production crew, administrative and staff support, and future improvements. (LRW)

  15. Portal for Families Overcoming Neurodevelopmental Disorders (PFOND): Implementation of a Software Framework for Facilitated Community Website Creation by Nontechnical Volunteers.

    PubMed

    Ye, Xin Cynthia; Ng, Isaiah; Seid-Karbasi, Puya; Imam, Tuhina; Lee, Cheryl E; Chen, Shirley Yu; Herman, Adam; Sharma, Balraj; Johal, Gurinder; Gu, Bobby; Wasserman, Wyeth W

    2013-08-06

    The Portal for Families Overcoming Neurodevelopmental Disorders (PFOND) provides a structured Internet interface for the sharing of information with individuals struggling with the consequences of rare developmental disorders. Large disease-impacted communities can support fundraising organizations that disseminate Web-based information through elegant websites run by professional staff. Such quality resources for families challenged by rare disorders are infrequently produced and, when available, are often dependent upon the continued efforts of a single individual. The project endeavors to create an intuitive Web-based software system that allows a volunteer with limited technical computer skills to produce a useful rare disease website in a short time period. Such a system should provide access to emerging news and research findings, facilitate community participation, present summary information about the disorder, and allow for transient management by volunteers who are likely to change periodically. The prototype portal was implemented using the WordPress software system with both existing and customized supplementary plug-in software modules. Gamification scoring features were implemented in a module, allowing editors to measure progress. The system was installed on a Linux-based computer server, accessible across the Internet through standard Web browsers. A prototype PFOND system was implemented and tested. The prototype system features a structured organization with distinct partitions for background information, recent publications, and community discussions. The software design allows volunteer editors to create a themed website, implement a limited set of topic pages, and connect the software to dynamic RSS feeds providing information about recent news or advances. The prototype was assessed by a fraction of the disease sites developed (8 out of 27), including Aarskog-Scott syndrome, Aniridia, Adams-Oliver syndrome, Cat Eye syndrome, Kabuki syndrome, Leigh syndrome, Peters anomaly, and Rothmund-Thomson syndrome. The editor progress score was used to measure performance for a portion of sites. The PFOND system provides a convenient and structured Internet resource for the facilitated creation of information resources for families confronted by rare disorders. The system empowers volunteers to participate in the creation of quality content, while allowing for the inevitable turnover of contributors over time. The next phase of PFOND development will focus on volunteer participation in system development and community engagement.

  16. Portal for Families Overcoming Neurodevelopmental Disorders (PFOND): Implementation of a Software Framework for Facilitated Community Website Creation by Nontechnical Volunteers

    PubMed Central

    Imam, Tuhina; Lee, Cheryl E; Chen, Shirley Yu; Herman, Adam; Sharma, Balraj; Johal, Gurinder; Gu, Bobby

    2013-01-01

    Background The Portal for Families Overcoming Neurodevelopmental Disorders (PFOND) provides a structured Internet interface for the sharing of information with individuals struggling with the consequences of rare developmental disorders. Large disease-impacted communities can support fundraising organizations that disseminate Web-based information through elegant websites run by professional staff. Such quality resources for families challenged by rare disorders are infrequently produced and, when available, are often dependent upon the continued efforts of a single individual. Objective The project endeavors to create an intuitive Web-based software system that allows a volunteer with limited technical computer skills to produce a useful rare disease website in a short time period. Such a system should provide access to emerging news and research findings, facilitate community participation, present summary information about the disorder, and allow for transient management by volunteers who are likely to change periodically. Methods The prototype portal was implemented using the WordPress software system with both existing and customized supplementary plug-in software modules. Gamification scoring features were implemented in a module, allowing editors to measure progress. The system was installed on a Linux-based computer server, accessible across the Internet through standard Web browsers. Results A prototype PFOND system was implemented and tested. The prototype system features a structured organization with distinct partitions for background information, recent publications, and community discussions. The software design allows volunteer editors to create a themed website, implement a limited set of topic pages, and connect the software to dynamic RSS feeds providing information about recent news or advances. The prototype was assessed by a fraction of the disease sites developed (8 out of 27), including Aarskog-Scott syndrome, Aniridia, Adams-Oliver syndrome, Cat Eye syndrome, Kabuki syndrome, Leigh syndrome, Peters anomaly, and Rothmund-Thomson syndrome. The editor progress score was used to measure performance for a portion of sites. Conclusions The PFOND system provides a convenient and structured Internet resource for the facilitated creation of information resources for families confronted by rare disorders. The system empowers volunteers to participate in the creation of quality content, while allowing for the inevitable turnover of contributors over time. The next phase of PFOND development will focus on volunteer participation in system development and community engagement. PMID:23920006

  17. Automatic programming for critical applications

    NASA Technical Reports Server (NTRS)

    Loganantharaj, Raj L.

    1988-01-01

The important phases of a software life cycle include verification and maintenance. Execution performance is usually an expected requirement in the software development process. Unfortunately, verification and maintenance are the most time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plant systems. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed along with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  18. The LHCb Starterkit

    NASA Astrophysics Data System (ADS)

    Puig, Albert; LHCb Starterkit Team

    2017-10-01

The vast majority of high-energy physicists use and produce software every day. Software skills are usually acquired “on the go” and dedicated training courses are rare. The LHCb Starterkit is a new training format for getting LHCb collaborators started in effectively using software to perform their research. The course focuses on teaching basic skills for research computing. Unlike traditional tutorials, we start with the basics, present all the material live, and maintain a high degree of interactivity, giving priority to understanding the tools as opposed to handing out recipes that work “as if by magic”. The LHCb Starterkit was started by two young members of the collaboration inspired by the principles of Software Carpentry, and the material is created in a collaborative fashion using the tools we teach. Three successful entry-level workshops, as well as an advanced one, have taken place since the start of the initiative in 2015, and were taught largely by PhD students to other PhD students.

  19. Effect of Unifocal versus Multifocal Lenses on Cervical Spine Posture in Patients with Presbyopia.

    PubMed

    Abbas, Rami L; Houri, Mohamad T; Rayyan, Mohammad M; Hamada, Hamada Ahmad; Saab, Ibtissam M

    2018-04-04

There are many environmental considerations which may or may not lead to the development of faulty cervical mechanics. The design of near-vision lenses could contribute to the development of such cervical dysfunction and consequently neck pain. Decision making regarding the proper type of lens prescription therefore seems important for presbyopic individuals. To investigate the effect of unifocal and multifocal lenses on cervical posture. Thirty subjects (18 females and 12 males) aged 40 to 64 years participated in the study. Each subject wore both unifocal and multifocal lenses, in random order, while reading. Lateral cervical spine X-ray films were then taken for each subject while wearing each lens. The X-ray films were analyzed with digital software (AutoCAD, 2D) to measure segmental angles of the cervical vertebrae (Occiput/C1, C1/C2, C2/C3, C3/C4, C4/C5, C5/C6, C6/C7, C3/C7, C0/C3, and occiput/C7). Significantly greater extension angles in the segments C0/C7, C1/C2, C5/C6, C6/C7, and C3/C7 (p<0.05) were observed during multifocal lens wear, in contrast with greater flexion angles between C3/C4 and C4/C5 (p<0.05) during unifocal lens wear. Multifocal lens spectacles produce increased extension of the cervical vertebral angles compared with unifocal lenses.

  20. Bias and design in software specifications

    NASA Technical Reports Server (NTRS)

    Straub, Pablo A.; Zelkowitz, Marvin V.

    1990-01-01

    Implementation bias in a specification is an arbitrary constraint in the solution space. Presented here is a model of bias in software specifications. Bias is defined in terms of the specification process and a classification of the attributes of the software product. Our definition of bias provides insight into both the origin and the consequences of bias. It also shows that bias is relative and essentially unavoidable. Finally, we describe current work on defining a measure of bias, formalizing our model, and relating bias to software defects.

  1. Architecture of a Framework for Providing Information Services for Public Transport

    PubMed Central

    García, Carmelo R.; Pérez, Ricardo; Lorenzo, Álvaro; Quesada-Arencibia, Alexis; Alayón, Francisco; Padrón, Gabino

    2012-01-01

    This paper presents OnRoute, a framework for developing and running ubiquitous software that provides information services to passengers of public transportation, including payment systems and on-route guidance services. To achieve a high level of interoperability, accessibility and context awareness, OnRoute uses the ubiquitous computing paradigm. To guarantee the quality of the software produced, the reliable software principles used in critical contexts, such as automotive systems, are also considered by the framework. The main components of its architecture (run-time, system services, software components and development discipline) and how they are deployed in the transportation network (stations and vehicles) are described in this paper. Finally, to illustrate the use of OnRoute, the development of a guidance service for travellers is explained. PMID:22778585

  2. CONRAD Software Architecture

    NASA Astrophysics Data System (ADS)

    Guzman, J. C.; Bennett, T.

    2008-08-01

The Convergent Radio Astronomy Demonstrator (CONRAD) is a collaboration between the computing teams of two SKA pathfinder instruments, MeerKAT (South Africa) and ASKAP (Australia). Our goal is to produce the required common software to operate, process and store the data from the two instruments. Both instruments are synthesis arrays composed of a large number of antennas (40 - 100) operating at centimeter wavelengths with wide-field capabilities. Key challenges are the processing of high volumes of data in real time as well as the remote mode of operations. Here we present the software architecture for CONRAD. Our design approach is to maximize the use of open solutions and third-party software widely deployed in commercial applications, such as SNMP and LDAP, and to utilize modern web-based technologies for the user interfaces, such as AJAX.

  3. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prowell, Stacy J; Symons, Christopher T

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  4. Infographic Strategies: Publications' Staffs, Assisted by Desktop Publishing, Tell Stories in Visuals Second to None.

    ERIC Educational Resources Information Center

    Jordan, Jim

    1988-01-01

Summarizes how infographics are produced and how they provide information graphically in high school publications. Offers suggestions concerning information gathering, graphic format, and software selection, and provides examples of computer/student designed infographics. (MM)

  5. Precise Documentation: The Key to Better Software

    NASA Astrophysics Data System (ADS)

    Parnas, David Lorge

The prime cause of the sorry “state of the art” in software development is our failure to produce good design documentation. Poor documentation is the cause of many errors and reduces efficiency in every phase of a software product's development and use. Most software developers believe that “documentation” refers to a collection of wordy, unstructured, introductory descriptions, thousands of pages that nobody wanted to write and nobody trusts. In contrast, engineers in more traditional disciplines think of precise blueprints, circuit diagrams, and mathematical specifications of component properties. Software developers do not know how to produce precise documents for software. Software developers also think that documentation is something written after the software has been developed. In other fields of engineering much of the documentation is written before and during the development. It represents forethought, not afterthought. Among the benefits of better documentation would be: easier reuse of old designs, better communication about requirements, more useful design reviews, easier integration of separately written modules, more effective code inspection, more effective testing, and more efficient corrections and improvements. This paper explains how to produce and use precise software documentation and illustrates the methods with several examples.

  6. Dem Generation from Close-Range Photogrammetry Using Extended Python Photogrammetry Toolbox

    NASA Astrophysics Data System (ADS)

    Belmonte, A. A.; Biong, M. M. P.; Macatulad, E. G.

    2017-10-01

Digital elevation models (DEMs) are widely used raster data for different applications concerning terrain, such as flood modelling, viewshed analysis, mining, land development, and engineering design projects, to name a few. DEMs can be obtained through various methods, including topographic survey, LiDAR or photogrammetry, and internet sources. Terrestrial close-range photogrammetry is one of the alternative methods to produce DEMs through the processing of images using photogrammetry software. Powerful photogrammetry software packages that can produce high-accuracy DEMs are already commercially available. However, this entails a corresponding cost. Although some of these packages offer free or demo trials, the trials limit the usable features and usage time. One alternative is the use of free and open-source software (FOSS), such as the Python Photogrammetry Toolbox (PPT), which provides an interface for performing photogrammetric processes implemented through Python scripts. For relatively small areas, such as in mining or construction excavation, a relatively inexpensive, fast and accurate method would be advantageous. In this study, PPT was used to generate 3D point cloud data from images of an open pit excavation. PPT was extended with an algorithm that converts the generated point cloud data into a usable DEM.
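
    The final step described above, converting a photogrammetric point cloud into a raster DEM, can be sketched as a simple gridding operation. This is an illustrative stand-in for the algorithm added to PPT, not its actual code: points are binned into raster cells and each cell takes the mean elevation of the points that fall in it.

    ```python
    import numpy as np

    def points_to_dem(xyz, cell_size):
        """Bin an (N, 3) point cloud into a raster DEM of mean elevations per cell."""
        x, y, z = xyz[:, 0], xyz[:, 1], xyz[:, 2]
        x0, y0 = x.min(), y.min()
        cols = np.floor((x - x0) / cell_size).astype(int)
        rows = np.floor((y - y0) / cell_size).astype(int)
        nrows, ncols = rows.max() + 1, cols.max() + 1

        dem_sum = np.zeros((nrows, ncols))
        dem_cnt = np.zeros((nrows, ncols))
        np.add.at(dem_sum, (rows, cols), z)      # accumulate elevations per cell
        np.add.at(dem_cnt, (rows, cols), 1.0)    # count points per cell
        with np.errstate(invalid="ignore", divide="ignore"):
            dem = dem_sum / dem_cnt              # empty cells become NaN (no data)
        return dem

    # Example with a few synthetic points (x, y, elevation)
    pts = np.array([[0.2, 0.1, 10.0], [0.8, 0.3, 11.0], [1.6, 0.2, 12.5], [0.3, 1.4, 9.5]])
    print(points_to_dem(pts, cell_size=1.0))
    ```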

  7. The MONDO project: A secondary neutron tracker detector for particle therapy

    NASA Astrophysics Data System (ADS)

    Valle, S. M.; Battistoni, G.; Patera, V.; Pinci, D.; Sarti, A.; Sciubba, A.; Spiriti, E.; Marafini, M.

    2017-02-01

During particle therapy treatments the irradiation of the patient produces, among different types of secondary radiation, an abundant flux of neutrons that can release a significant dose far away from the tumour region. A precise measurement of their flux, energy and angle distributions is needed in order to improve the Treatment Planning Systems software and to properly take into account the risk of late complications in the whole body. The technical challenges posed by a neutron detector aiming for high detection efficiency and good backtracking precision will be addressed within the MONDO project, whose main goal is to develop a tracking detector targeting fast and ultra-fast secondary neutrons. The neutron tracking principle is based on the reconstruction of two consecutive elastic scattering interactions of a neutron with a target material. By reconstructing the recoiling protons it is then possible to measure the energy and incoming direction of the neutron. Plastic scintillators will be used as scattering and detection media: the tracker is being developed as a matrix of square scintillating fibres with a 250 μm side. The light produced and collected in the fibres will be amplified using a triple GEM-based image intensifier and acquired using CMOS Single Photon Avalanche Diode arrays. With therapeutic beams, the principal goal of the detector will be the measurement of neutron production yields as a function of production angle and energy.
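
    The double elastic-scattering reconstruction described above can be made concrete with a simplified, non-relativistic sketch (an illustration, not MONDO code). Assuming equal neutron and proton masses, the direction from the first to the second vertex gives the scattered-neutron direction, the second recoil proton gives the scattered-neutron energy via T' = T_p2/cos²θ₂, and energy and momentum conservation at the first vertex then yield the incoming neutron energy and direction.

    ```python
    import numpy as np

    def reconstruct_neutron(v1, v2, p1_dir, T_p1, p2_dir, T_p2):
        """Non-relativistic double-scattering reconstruction (equal n/p masses assumed).

        v1, v2        : positions of the two elastic-scattering vertices
        p1_dir, T_p1  : direction and kinetic energy of the first recoil proton
        p2_dir, T_p2  : direction and kinetic energy of the second recoil proton
        Returns (incoming neutron kinetic energy, incoming direction unit vector).
        """
        unit = lambda v: np.asarray(v, float) / np.linalg.norm(np.asarray(v, float))
        n_scat = unit(np.asarray(v2, float) - np.asarray(v1, float))  # scattered-n direction
        p1, p2 = unit(p1_dir), unit(p2_dir)

        cos2 = np.dot(p2, n_scat)            # angle of proton 2 w.r.t. scattered neutron
        T_n_scat = T_p2 / cos2**2            # elastic n-p kinematics: T_p = T_n cos^2(theta)
        T_n_in = T_p1 + T_n_scat             # energy conservation at the first vertex

        # Momentum conservation (|p| proportional to sqrt(T) for equal masses)
        p_in = np.sqrt(T_p1) * p1 + np.sqrt(T_n_scat) * n_scat
        return T_n_in, unit(p_in)

    # Invented, self-consistent example: a 100 MeV neutron travelling along +z
    T, d = reconstruct_neutron(v1=[0, 0, 0], v2=[0, 7, 7],
                               p1_dir=[0, -1, 1], T_p1=50.0,
                               p2_dir=[1, 1, 1], T_p2=100.0 / 3.0)
    print(round(T, 1), np.round(d, 3))   # ~100.0 MeV, direction ~[0, 0, 1]
    ```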

  8. A Telemetry Browser Built with Java Components

    NASA Astrophysics Data System (ADS)

    Poupart, E.

In the context of CNES balloon scientific campaigns and telemetry survey work, a generic telemetry processing product, called TelemetryBrowser in the following, was developed by reusing COTS components, most of them Java components. Connections between those components rely on a software architecture based on parameter producers and parameter consumers: producers transmit parameter values to the consumers that have registered with them. All of these producers and consumers can be spread over the network thanks to CORBA, and over every kind of workstation thanks to Java. This gives a very powerful means of adapting to constraints such as network bandwidth or workstation processing power and memory. It is also very useful for displaying and correlating, at the same time, information coming from multiple and varied sources. An important point of this architecture is that the coupling between parameter producers and parameter consumers is reduced to a minimum and that transmission of information over the network is asynchronous. So, if a parameter consumer goes down or runs slowly, there is no consequence for the other consumers, because producers do not wait for a consumer to finish its data processing before sending data to other consumers. Another interesting point is that parameter producers, also called TelemetryServers in the following, are generated nearly automatically from a telemetry description using the Flavor component. Keywords: Java components, CORBA, distributed applications, OpenORB, software reuse, COTS, Internet, Flavor. Flavor (Formal Language for Audio-Visual Object Representation) is an object-oriented media representation language being developed at Columbia University; it is designed as an extension of Java and C++ and simplifies the development of applications that involve a significant media processing component (encoding, decoding, editing, manipulation, etc.) by providing bitstream representation semantics (flavor.sourceforge.net). OpenORB provides a Java implementation of the OMG CORBA 2.4.2 specification (openorb.sourceforge.net).
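
    The producer/consumer decoupling described above is essentially an observer pattern with asynchronous delivery. The sketch below is a much-simplified, hypothetical Python rendering of that idea (the real system used Java and CORBA): a parameter producer pushes values into one queue per registered consumer, so a slow or failed consumer never blocks the producer or its peers.

    ```python
    import queue
    import threading

    class ParameterProducer:
        """Pushes parameter values to every registered consumer via its own queue."""
        def __init__(self):
            self._queues = []

        def register(self, maxsize=100):
            q = queue.Queue(maxsize)
            self._queues.append(q)
            return q

        def publish(self, name, value):
            for q in self._queues:
                try:
                    q.put_nowait((name, value))   # never block on a slow consumer
                except queue.Full:
                    pass                          # drop for that consumer only

    def consumer(q, label):
        while True:
            name, value = q.get()
            if name is None:                      # sentinel ends the consumer
                break
            print(f"{label}: {name} = {value}")

    producer = ParameterProducer()
    q1 = producer.register()
    t = threading.Thread(target=consumer, args=(q1, "display"), daemon=True)
    t.start()
    producer.publish("altitude", 31200)
    producer.publish(None, None)
    t.join()
    ```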

  9. Current Practice in Software Development for Computational Neuroscience and How to Improve It

    PubMed Central

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research. PMID:24465191

  10. Current practice in software development for computational neuroscience and how to improve it.

    PubMed

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  11. Yearbook Production: Yearbook Staffs Can Now "Blame" Strengths, Weaknesses on Computer as They Take More Control of Their Publications.

    ERIC Educational Resources Information Center

    Hall, H. L.

    1988-01-01

    Reports on the advantages and disadvantages of desktop publishing, using the Apple Macintosh and "Pagemaker" software, to produce a high school yearbook. Asserts that while desktop publishing may be initially more time consuming for those unfamiliar with computers, desktop publishing gives high school journalism staffs more control over…

  12. A Software Development Approach for Computer Assisted Language Learning

    ERIC Educational Resources Information Center

    Cushion, Steve

    2005-01-01

    Over the last 5 years we have developed, produced, tested, and evaluated an authoring software package to produce web-based, interactive, audio-enhanced language-learning material. That authoring package has been used to produce language-learning material in French, Spanish, German, Arabic, and Tamil. We are currently working on increasing…

  13. Desktop Publishing for Counselors.

    ERIC Educational Resources Information Center

    Lucking, Robert; Mitchum, Nancy

    1990-01-01

    Discusses the fundamentals of desktop publishing for counselors, including hardware and software systems and peripherals. Notes by using desktop publishing, counselors can produce their own high-quality documents without the expense of commercial printers. Concludes computers present a way of streamlining the communications of a counseling…

  14. Enhancing GIS Capabilities for High Resolution Earth Science Grids

    NASA Astrophysics Data System (ADS)

    Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.

    2017-12-01

    Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.
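
    The chunked approach mentioned above, processing a large grid piecewise so that memory use stays bounded, can be illustrated generically. The sketch below is not OCGIS or ESMF code; it is a hedged example in which a large 2D field is reduced over a lat/lon bounding box one row-chunk at a time.

    ```python
    import numpy as np

    def chunked_mean_over_bbox(field, lats, lons, bbox, chunk_rows=256):
        """Mean of `field` inside a lat/lon bounding box, processed in row chunks
        so that only one chunk needs to be resident in memory at a time."""
        lat_min, lat_max, lon_min, lon_max = bbox
        total, count = 0.0, 0
        for start in range(0, field.shape[0], chunk_rows):
            stop = min(start + chunk_rows, field.shape[0])
            tile = field[start:stop]                  # in practice: read from disk
            tlat = lats[start:stop]
            mask = ((tlat[:, None] >= lat_min) & (tlat[:, None] <= lat_max) &
                    (lons[None, :] >= lon_min) & (lons[None, :] <= lon_max))
            total += tile[mask].sum()
            count += mask.sum()
        return total / count

    lats = np.linspace(-90, 90, 1000)
    lons = np.linspace(-180, 180, 2000)
    field = np.random.rand(1000, 2000)
    print(chunked_mean_over_bbox(field, lats, lons, bbox=(20, 50, -130, -60)))
    ```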

  15. Finding Helpful Software Reviews.

    ERIC Educational Resources Information Center

    Kruse, Ted, Comp.

    1987-01-01

    Provides a list of evaluation services currently producing critical reviews of educational software. Includes information about The Apple K-12 Curriculum Software Reference, The Educational Software Preview, The Educational Software Selector, MicroSIFT, and Only The Best: The Discriminating Guide for Preschool-Grade 12. (TW)

  16. Food drying process by power ultrasound.

    PubMed

    de la Fuente-Blanco, S; Riera-Franco de Sarabia, E; Acosta-Aparicio, V M; Blanco-Blanco, A; Gallego-Juárez, J A

    2006-12-22

Drying processes, which are of great significance in the food industry, are frequently based on the use of thermal energy. Nevertheless, such methods may produce structural changes in the products. Consequently, great emphasis is presently given to novel treatments in which quality is preserved. Such is the case for the application of high-power ultrasound, which represents an emerging and promising technology. During the last few years, we have been involved in the development of an ultrasonic dehydration process based on the application of ultrasonic vibration in direct contact with the product. This process has been the object of a detailed laboratory-stage study of the influence of the different parameters involved. This paper deals with the development and testing of a prototype system for the application and evaluation of the process at a pre-industrial stage. The prototype is based on a high-power rectangular plate transducer, working at a frequency of 20 kHz, with a power capacity of about 100 W. In order to study mechanical and thermal effects, the system is provided with a series of sensors which permit monitoring of the process parameters. Specific software has also been developed to facilitate data collection and analysis. The system has been tested with vegetable samples.

  17. Epistemic Questions and Answers for Software System Safety

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, Chris W.

    2010-01-01

    System safety is primarily concerned with epistemic questions, that is, questions concerning knowledge and the degree of confidence that can be placed in that knowledge. For systems with which human experience is long, such as roads, bridges, and mechanical devices, knowledge about what is required to make the systems safe is deep and detailed. High confidence can be placed in the validity of that knowledge. For other systems, however, with which human experience is comparatively short, such as those that rely in part or in whole on software, knowledge about what is required to ensure safety tends to be shallow and general. The confidence that can be placed in the validity of that knowledge is consequently low. In a previous paper, we enumerated a collection of foundational epistemic questions concerning software system safety. In this paper, we review and refine the questions, discuss some difficulties that attend to answering the questions today, and speculate on possible research to improve the situation.

  18. GiNA, an Efficient and High-Throughput Software for Horticultural Phenotyping

    PubMed Central

    Diaz-Garcia, Luis; Covarrubias-Pazaran, Giovanny; Schlautman, Brandon; Zalapa, Juan

    2016-01-01

Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed, but most of them are expensive, species-dependent, complex to use, and available only for major crops. To overcome such limitations, we present the open-source software GiNA, which is a simple and free tool for measuring horticultural traits such as shape- and color-related parameters of fruits, vegetables, and seeds. GiNA is multiplatform software available in both R and MATLAB® programming languages and uses conventional images from digital cameras with minimal requirements. It can process up to 11 different horticultural morphological traits such as length, width, two-dimensional area, volume, projected skin, surface area, and RGB color, among other parameters. Different validation tests produced highly consistent results under different lighting conditions and camera setups, making GiNA a very reliable platform for high-throughput phenotyping. In addition, five-fold cross-validation between manually generated and GiNA measurements for length and width in cranberry fruits yielded values of 0.97 and 0.92, respectively. Furthermore, the same strategy yielded prediction accuracies above 0.83 for color estimates produced from images of cranberries analyzed with GiNA compared to total anthocyanin content (TAcy) of the same fruits measured with the standard methodology of the industry. Our platform provides a scalable, easy-to-use and affordable tool for massive acquisition of phenotypic data of fruits, seeds, and vegetables. PMID:27529547
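
    A minimal version of the shape measurements listed above (length, width, and two-dimensional area of a fruit or seed) can be sketched from a binary segmentation mask alone; this is an illustration of the idea, not GiNA code. Length and width are taken here as the extents along the principal axes of the object's pixel coordinates, with a hypothetical mm-per-pixel scale.

    ```python
    import numpy as np

    def shape_metrics(mask, mm_per_pixel=1.0):
        """Length, width and area of a single object in a boolean mask (illustrative)."""
        rows, cols = np.nonzero(mask)
        coords = np.column_stack([rows, cols]).astype(float)
        area = coords.shape[0] * mm_per_pixel**2            # 2D area

        centred = coords - coords.mean(axis=0)
        # Principal axes of the pixel cloud give the object's orientation
        _, _, axes = np.linalg.svd(centred, full_matrices=False)
        proj = centred @ axes.T
        length = (proj[:, 0].max() - proj[:, 0].min()) * mm_per_pixel
        width = (proj[:, 1].max() - proj[:, 1].min()) * mm_per_pixel
        return {"length": length, "width": width, "area": area}

    # Tiny synthetic "fruit": an axis-aligned elliptical blob
    yy, xx = np.mgrid[0:60, 0:40]
    mask = ((yy - 30) / 25.0) ** 2 + ((xx - 20) / 12.0) ** 2 <= 1.0
    print(shape_metrics(mask, mm_per_pixel=0.1))
    ```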

  19. GiNA, an Efficient and High-Throughput Software for Horticultural Phenotyping.

    PubMed

    Diaz-Garcia, Luis; Covarrubias-Pazaran, Giovanny; Schlautman, Brandon; Zalapa, Juan

    2016-01-01

Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed, but most of them are expensive, species-dependent, complex to use, and available only for major crops. To overcome such limitations, we present the open-source software GiNA, which is a simple and free tool for measuring horticultural traits such as shape- and color-related parameters of fruits, vegetables, and seeds. GiNA is multiplatform software available in both R and MATLAB® programming languages and uses conventional images from digital cameras with minimal requirements. It can process up to 11 different horticultural morphological traits such as length, width, two-dimensional area, volume, projected skin, surface area, and RGB color, among other parameters. Different validation tests produced highly consistent results under different lighting conditions and camera setups, making GiNA a very reliable platform for high-throughput phenotyping. In addition, five-fold cross-validation between manually generated and GiNA measurements for length and width in cranberry fruits yielded values of 0.97 and 0.92, respectively. Furthermore, the same strategy yielded prediction accuracies above 0.83 for color estimates produced from images of cranberries analyzed with GiNA compared to total anthocyanin content (TAcy) of the same fruits measured with the standard methodology of the industry. Our platform provides a scalable, easy-to-use and affordable tool for massive acquisition of phenotypic data of fruits, seeds, and vegetables.

  20. HEP Community White Paper on Software Trigger and Event Reconstruction: Executive Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albrecht, Johannes; et al.

Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.

  1. HEP Community White Paper on Software Trigger and Event Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albrecht, Johannes; et al.

Realizing the physics programs of the planned and upgraded high-energy physics (HEP) experiments over the next 10 years will require the HEP community to address a number of challenges in the area of software and computing. For this reason, the HEP software community has engaged in a planning process over the past two years, with the objective of identifying and prioritizing the research and development required to enable the next generation of HEP detectors to fulfill their full physics potential. The aim is to produce a Community White Paper which will describe the community strategy and a roadmap for software and computing research and development in HEP for the 2020s. The topics of event reconstruction and software triggers were considered by a joint working group and are summarized together in this document.

  2. Advantages and Disadvantages in Image Processing with Free Software in Radiology.

    PubMed

    Mujika, Katrin Muradas; Méndez, Juan Antonio Juanes; de Miguel, Andrés Framiñan

    2018-01-15

Currently, there are sophisticated applications that make it possible to visualize medical images and even to manipulate them. These software applications are of great interest, both from a teaching and a radiological perspective. In addition, some of these applications are known as free open-source software because they are free and their source code is freely available, and therefore they can be easily obtained even on personal computers. Two examples of free open-source software are OsiriX Lite® and 3D Slicer®. However, this last group of free applications has limitations in its use. For the radiological field, manipulating and post-processing images is increasingly important. Consequently, sophisticated computing tools that combine software and hardware to process medical images are needed. In radiology, graphic workstations allow their users to process, review, analyse, communicate and exchange multidimensional digital images acquired with different image-capturing radiological devices. These radiological devices are basically CT (Computerised Tomography), MRI (Magnetic Resonance Imaging), PET (Positron Emission Tomography), etc. Nevertheless, the programs included in these workstations have a high cost which always depends on the software provider and is always subject to its norms and requirements. With this study, we aim to present the advantages and disadvantages of these radiological image visualization systems in the advanced management of radiological studies. We compare the features of the VITREA2® and AW VolumeShare 5® radiology workstations with free open-source software applications such as OsiriX® and 3D Slicer®, with examples from specific studies.

  3. Consequences of Law and Rule Breaking (Law-Related Education Materials) 1982-83. Materials Produced by Teachers in Highlands County.

    ERIC Educational Resources Information Center

    Allen, Rodney F., Ed.

    Approximately 40 teacher-developed activities for legal education in some Florida elementary and junior high schools focus on the consequences of breaking rules and committing crimes and on victims of crime (individuals, community, society). Most of the lessons present a brief, one-page reading followed by questions to determine students'…

  4. Consequences of Law and Rule Breaking (Law Related Education Materials) 1982-83. Materials Produced by Teachers in Hardee County.

    ERIC Educational Resources Information Center

    Allen, Rodney F., Ed.

    Approximately 60 teacher-developed activities for legal education in some Florida elementary and junior high schools focus on the consequences of breaking rules and committing crimes and on victims of crime (individuals, community, society). Most of the lessons present a brief, one-page reading followed by questions to determine students'…

  5. Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

O'Kula, K.R.; Paik, I.K.; Chung, D.Y.

    1996-12-31

Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.

  6. University Approaches to Software Copyright and Licensure Policies.

    ERIC Educational Resources Information Center

    Hawkins, Brian L.

    Issues of copyright policy and software licensure at Drexel University that were developed during the introduction of a new microcomputing program are discussed. Channels for software distribution include: individual purchase of externally-produced software, distribution of internally-developed software, institutional licensure, and "read…

  7. General Mode Scanning Probe Microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somnath, Suhas; Jesse, Stephen

A critical part of SPM measurements is the information transfer from the probe-sample junction to the measurement system. Current information transfer methods heavily compress the information-rich data stream by averaging the data over a time interval, or via heterodyne detection approaches such as lock-in amplifiers and phase-locked loops. As a consequence, highly valuable information at sub-microsecond time scales or information from frequencies outside the measurement band is lost. We have developed a fundamentally new approach called General Mode (G-mode), where we can capture the complete information stream from the detectors in the microscope. The availability of the complete information allows the microscope operator to analyze the data via information-theory analysis or comprehensive physical models. Furthermore, the complete data stream enables advanced data-driven filtering algorithms, multi-resolution imaging, ultrafast spectroscopic imaging, spatial mapping of multidimensional variability in material properties, etc. Though we applied this approach to scanning probe microscopy, the general philosophy of G-mode can be applied to many other modes of microscopy. G-mode data is captured by completely custom software written in LabVIEW and Matlab. The software generates the waveforms to electrically, thermally, or mechanically excite the SPM probe. It handles real-time communications with the microscope software for operations such as moving the SPM probe position and also controls other instrumentation hardware. The software also controls multiple variants of high-speed data acquisition cards to excite the SPM probe with the excitation waveform and simultaneously measure multiple channels of information from the microscope detectors at sampling rates of 1-100 MHz. The software also saves the raw data to the computer and allows the microscope operator to visualize processed or filtered data during the experiment. The software performs all these features while offering a user-friendly interface.

  8. Systems Studies of DDT Transport

    ERIC Educational Resources Information Center

    Harrison, H. L.; And Others

    1970-01-01

    Major consequences of present and additional environmental quantities of DDT pesticide are predictable by mathematical models of transport, accumulation and concentration mechanisms in the Wisconsin regional ecosystem. High solubility and stability produce increased DDT concentrations at high organism trophic levels within world biosphere…

  9. An Approach for Integrating the Prioritization of Functional and Nonfunctional Requirements

    PubMed Central

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches. PMID:24982987
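
    For readers unfamiliar with AHP, the baseline method used in the evaluation above, the core computation can be sketched briefly; this is a generic illustration, not the authors' implementation. Given a pairwise-comparison matrix over requirements, the priority weights are commonly taken as the normalised principal eigenvector.

    ```python
    import numpy as np

    def ahp_priorities(pairwise):
        """Priority weights from an AHP pairwise-comparison matrix
        (principal eigenvector, normalised to sum to 1)."""
        A = np.asarray(pairwise, dtype=float)
        eigvals, eigvecs = np.linalg.eig(A)
        principal = eigvecs[:, np.argmax(eigvals.real)].real
        weights = np.abs(principal)
        return weights / weights.sum()

    # Three hypothetical requirements: R1 judged 3x as important as R2, 5x as R3, etc.
    pairwise = [[1,   3,   5],
                [1/3, 1,   2],
                [1/5, 1/2, 1]]
    print(ahp_priorities(pairwise))   # roughly [0.65, 0.23, 0.12]
    ```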

  10. An approach for integrating the prioritization of functional and nonfunctional requirements.

    PubMed

    Dabbagh, Mohammad; Lee, Sai Peck

    2014-01-01

    Due to the budgetary deadlines and time to market constraints, it is essential to prioritize software requirements. The outcome of requirements prioritization is an ordering of requirements which need to be considered first during the software development process. To achieve a high quality software system, both functional and nonfunctional requirements must be taken into consideration during the prioritization process. Although several requirements prioritization methods have been proposed so far, no particular method or approach is presented to consider both functional and nonfunctional requirements during the prioritization stage. In this paper, we propose an approach which aims to integrate the process of prioritizing functional and nonfunctional requirements. The outcome of applying the proposed approach produces two separate prioritized lists of functional and non-functional requirements. The effectiveness of the proposed approach has been evaluated through an empirical experiment aimed at comparing the approach with the two state-of-the-art-based approaches, analytic hierarchy process (AHP) and hybrid assessment method (HAM). Results show that our proposed approach outperforms AHP and HAM in terms of actual time-consumption while preserving the quality of the results obtained by our proposed approach at a high level of agreement in comparison with the results produced by the other two approaches.

  11. Business Graphics

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Genigraphics Corporation's Masterpiece 8770 FilmRecorder is an advanced high resolution system designed to improve and expand a company's in-house graphics production. GRAFTIME/software package was designed to allow office personnel with minimal training to produce professional level graphics for business communications and presentations. Products are no longer being manufactured.

  12. Efficient QR sequential least square algorithm for high frequency GNSS precise point positioning seismic application

    NASA Astrophysics Data System (ADS)

    Barbu, Alina L.; Laurent-Varin, Julien; Perosanz, Felix; Mercier, Flavien; Marty, Jean-Charles

    2018-01-01

    The implementation into the GINS CNES geodetic software of a more efficient filter was needed to satisfy the users who wanted to compute high-rate GNSS PPP solutions. We selected the SRI approach and a QR factorization technique including an innovative algorithm which optimizes the matrix reduction step. A full description of this algorithm is given for future users. The new capacities of the software have been tested using a set of 1 Hz data from the Japanese GEONET network including the Mw 9.0 2011 Tohoku earthquake. Station coordinates solution agreed at a sub-decimeter level with previous publications as well as with solutions we computed with the National Resource Canada software. An additional benefit from the implementation of the SRI filter is the capability to estimate high-rate tropospheric parameters too. As the CPU time to estimate a 1 Hz kinematic solution from 1 h of data is now less than 1 min we could produced series of coordinates for the full 1300 stations of the Japanese network. The corresponding movie shows the impressive co-seismic deformation as well as the wave propagation along the island. The processing was straightforward using a cluster of PCs which illustrates the new potentiality of the GINS software for massive network high rate PPP processing.
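
    The square-root information (SRI) update at the heart of the filter described above can be illustrated in a few lines. This is a textbook-style sketch of sequential least squares via QR factorization, not the GINS implementation: each batch of measurements is whitened, stacked under the current square-root information matrix, and reduced back to triangular form by a QR factorization.

    ```python
    import numpy as np

    def sri_update(R, z, H, y, sigma):
        """One SRI measurement update: prior encoded as R x ~ z, new measurements
        H x ~ y with standard deviation sigma.  Returns the updated (R, z)."""
        Hw, yw = H / sigma, y / sigma                   # whiten the new measurements
        stacked = np.vstack([np.column_stack([R, z]),
                             np.column_stack([Hw, yw])])
        Q, T = np.linalg.qr(stacked)                    # orthogonal reduction
        n = R.shape[0]
        return T[:n, :n], T[:n, n]

    def sri_solution(R, z):
        return np.linalg.solve(R, z)                    # R is triangular

    # Tiny example: estimate x = [position, velocity] from two measurement epochs
    R = np.eye(2) * 1e-3                                # very weak prior
    z = np.zeros(2)
    R, z = sri_update(R, z, H=np.array([[1.0, 0.0]]), y=np.array([10.0]), sigma=0.5)
    R, z = sri_update(R, z, H=np.array([[1.0, 1.0]]), y=np.array([12.1]), sigma=0.5)
    print(sri_solution(R, z))                           # approximately [10.0, 2.1]
    ```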

  13. Software Model Checking of ARINC-653 Flight Code with MCP

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud

    2010-01-01

The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequently arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though there are implicit benefits for partial order reduction possible as a consequence of the API's strict interprocess communication policy.

  14. Uncertainty and Sensitivity of Direct Economic Flood Damages: the FloodRisk Free and Open-Source Software

    NASA Astrophysics Data System (ADS)

    Albano, R.; Sole, A.; Mancusi, L.; Cantisani, A.; Perrone, A.

    2017-12-01

The considerable increase of flood damages in the past decades has shifted attention in Europe from protection against floods to managing flood risks. In this context, the assessment of expected damages represents crucial information within the overall flood risk management process. The present paper proposes an open-source software package, called FloodRisk, that is able to operatively support stakeholders in decision-making processes with a what-if approach by carrying out rapid assessments of flood consequences, in terms of direct economic damage and loss of human lives. The evaluation of damage scenarios, through the use of the GIS software proposed here, is essential for cost-benefit or multi-criteria analysis of risk mitigation alternatives. However, considering that quantitative assessment of flood damage scenarios is characterized by intrinsic uncertainty, a scheme has been developed to identify and quantify the role of the input parameters in the total uncertainty of flood loss model application in urban areas with mild terrain and complex topography. Through the concept of parallel models, the contribution of different modules and input parameters to the total uncertainty is quantified. The results of the present case study exhibit high epistemic uncertainty in the damage estimation module and, in particular, in the type and form of the damage functions used, which have been adapted and transferred from different geographic and socio-economic contexts because there are no depth-damage functions specifically developed for Italy. Considering that uncertainty and sensitivity depend considerably on local characteristics, the epistemic uncertainty associated with the risk estimate is reduced by introducing additional information into the risk analysis. In light of the obtained results, the need to produce and disseminate (open) data to develop micro-scale vulnerability curves is evident. Moreover, there is an urgent need to push forward research into methods and models for the assimilation of uncertainties in decision-making processes.
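
    The direct-damage step that dominates the uncertainty discussed above is conceptually simple. The sketch below is a generic illustration (not FloodRisk code): each exposed asset's damage is its value multiplied by a damage fraction interpolated from a depth-damage curve, and the curve itself, invented here, is exactly the uncertain ingredient the abstract highlights.

    ```python
    import numpy as np

    # Hypothetical depth-damage curve: flood depth (m) -> fraction of asset value lost
    curve_depth = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
    curve_fraction = np.array([0.0, 0.15, 0.35, 0.65, 0.85])

    def direct_damage(depths_m, asset_values):
        """Direct economic damage per asset: value times interpolated damage fraction."""
        fractions = np.interp(depths_m, curve_depth, curve_fraction)
        return fractions * asset_values

    depths = np.array([0.3, 1.2, 2.5])            # simulated flood depths at three buildings
    values = np.array([120e3, 250e3, 90e3])       # exposed values (EUR)
    print(direct_damage(depths, values).sum())    # total direct damage for the scenario
    ```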

  15. Next Generation Models for Storage and Representation of Microbial Biological Annotation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quest, Daniel J; Land, Miriam L; Brettin, Thomas S

    2010-01-01

Background Traditional genome annotation systems were developed in a very different computing era, one where the World Wide Web was just emerging. Consequently, these systems are built as centralized black boxes focused on generating high quality annotation submissions to GenBank/EMBL supported by expert manual curation. The exponential growth of sequence data drives a growing need for increasingly higher quality and automatically generated annotation. Typical annotation pipelines utilize traditional database technologies, clustered computing resources, Perl, C, and UNIX file systems to process raw sequence data, identify genes, and predict and categorize gene function. These technologies tightly couple the annotation software system to hardware and third party software (e.g. relational database systems and schemas). This makes annotation systems hard to reproduce, inflexible to modification over time, difficult to assess, difficult to partition across multiple geographic sites, and difficult to understand for those who are not domain experts. These systems are not readily open to scrutiny and therefore not scientifically tractable. The advent of Semantic Web standards such as Resource Description Framework (RDF) and OWL Web Ontology Language (OWL) enables us to construct systems that address these challenges in a new comprehensive way. Results Here, we develop a framework for linking traditional data to OWL-based ontologies in genome annotation. We show how data standards can decouple hardware and third party software tools from annotation pipelines, thereby making annotation pipelines easier to reproduce and assess. An illustrative example shows how TURTLE (Terse RDF Triple Language) can be used as a human readable, but also semantically-aware, equivalent to GenBank/EMBL files. Conclusions The power of this approach lies in its ability to assemble annotation data from multiple databases across multiple locations into a representation that is understandable to researchers. In this way, all researchers, experimental and computational, will more easily understand the informatics processes constructing genome annotation and ultimately be able to help improve the systems that produce them.
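
    The idea of exposing annotation as human-readable, semantically aware Turtle can be shown with a tiny sketch using the rdflib Python library; the namespace URI, gene identifier, and property names below are invented for illustration.

    ```python
    from rdflib import Graph, Literal, Namespace, RDF

    EX = Namespace("http://example.org/annotation/")   # hypothetical annotation namespace

    g = Graph()
    g.bind("ex", EX)
    gene = EX["gene/thrA"]                             # invented gene identifier
    g.add((gene, RDF.type, EX.Gene))
    g.add((gene, EX.locusTag, Literal("b0002")))
    g.add((gene, EX.product, Literal("aspartokinase / homoserine dehydrogenase")))
    g.add((gene, EX.evidence, EX["method/hmm-search"]))

    # Serialize the graph as Turtle: readable by people, parseable by machines
    print(g.serialize(format="turtle"))
    ```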

  16. Integrated Software Health Management for Aircraft GN and C

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mengshoel, Ole

    2011-01-01

    Modern aircraft rely heavily on dependable operation of many safety-critical software components. Despite careful design, verification and validation (V&V), on-board software can fail with disastrous consequences if it encounters problematic software/hardware interaction or must operate in an unexpected environment. We are using a Bayesian approach to monitor the software and its behavior during operation and provide up-to-date information about the health of the software and its components. The powerful reasoning mechanism provided by our model-based Bayesian approach makes reliable diagnosis of the root causes possible and minimizes the number of false alarms. Compilation of the Bayesian model into compact arithmetic circuits makes SWHM feasible even on platforms with limited CPU power. We show initial results of SWHM on a small simulator of an embedded aircraft software system, where software and sensor faults can be injected.

  17. TRIIG - Time-lapse reproduction of images through interactive graphics. [digital processing of quality hard copy

    NASA Technical Reports Server (NTRS)

    Buckner, J. D.; Council, H. W.; Edwards, T. R.

    1974-01-01

    Description of the hardware and software implementing the system of time-lapse reproduction of images through interactive graphics (TRIIG). The system produces a quality hard copy of processed images in a fast and inexpensive manner. This capability allows for optimal development of processing software through the rapid viewing of many image frames in an interactive mode. Three critical optical devices are used to reproduce an image: an Optronics photo reader/writer, the Adage Graphics Terminal, and Polaroid Type 57 high speed film. Typical sources of digitized images are observation satellites, such as ERTS or Mariner, computer coupled electron microscopes for high-magnification studies, or computer coupled X-ray devices for medical research.

  18. Integrated software system for low level waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worku, G.

    1995-12-31

    In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications.

  19. Automated Source-Code-Based Testing of Object-Oriented Software

    NASA Astrophysics Data System (ADS)

    Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten

    2014-08-01

    With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges for testing and specifically automated testing arise. In this paper we discuss some of these challenges, consequences, and solutions based on an experiment in automated source-code-based testing for C++.

  20. Virtual rough samples to test 3D nanometer-scale scanning electron microscopy stereo photogrammetry.

    PubMed

    Villarrubia, J S; Tondare, V N; Vladár, A E

    2016-01-01

    The combination of scanning electron microscopy for high spatial resolution, images from multiple angles to provide 3D information, and commercially available stereo photogrammetry software for 3D reconstruction offers promise for nanometer-scale dimensional metrology in 3D. A method is described to test 3D photogrammetry software by the use of virtual samples: mathematical samples from which simulated images are made for use as inputs to the software under test. The virtual sample is constructed by wrapping a rough skin with any desired power spectral density around a smooth near-trapezoidal line with rounded top corners. Reconstruction is performed with images simulated from different angular viewpoints. The software's reconstructed 3D model is then compared to the known geometry of the virtual sample. Three commercial photogrammetry software packages were tested. Two of them produced results for line height and width that were within approximately 1 nm of the correct values. All of the packages exhibited some difficulty in reconstructing details of the surface roughness.
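
    A minimal Python sketch of the general technique of synthesizing a rough profile with a prescribed power spectral density, by shaping white noise in the frequency domain, is given below; the power-law PSD, sample count, and spacing are assumptions, not the parameters used in the paper.

        # Sketch: synthesize a 1D rough "skin" whose roughness follows a chosen
        # power spectral density (PSD), here a simple power-law PSD.
        import numpy as np

        n, dx = 1024, 1.0                 # samples and spacing (nm); assumed values
        rng = np.random.default_rng(0)
        freqs = np.fft.rfftfreq(n, d=dx)
        psd = np.zeros_like(freqs)
        psd[1:] = freqs[1:] ** -2.0       # power-law PSD; skip the DC term

        # Shape white Gaussian noise in the frequency domain by sqrt(PSD)
        spectrum = np.sqrt(psd) * (rng.normal(size=freqs.size)
                                   + 1j * rng.normal(size=freqs.size))
        profile = np.fft.irfft(spectrum, n=n)
        profile -= profile.mean()         # zero-mean roughness to wrap around the line
        print(profile.std())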

  1. High beta plasma operation in a toroidal plasma producing device

    DOEpatents

    Clarke, John F.

    1978-01-01

    A high beta plasma is produced in a plasma producing device of toroidal configuration by ohmic heating and auxiliary heating. The plasma pressure is continuously monitored and used in a control system to program the current in the poloidal field windings. Throughout the heating process, magnetic flux is conserved inside the plasma and the distortion of the flux surfaces drives a current in the plasma. As a consequence, the total current increases and the poloidal field windings are driven with an equal and opposing increasing current. The spatial distribution of the current in the poloidal field windings is determined by the plasma pressure. Plasma equilibrium is maintained thereby, and high temperature, high beta operation results.

  2. AIS Modeling and a Satellite for AIS Observations in the High North + Draft New ITU-R Report Improved Satellite Detection of AIS

    DTIC Science & Technology

    2008-09-01

    (Report documentation page boilerplate; recoverable fragments only: "... to eliminate propagation delay"; "3 min. reporting interval to lower message rate"; "Vessel consequences: no extra hardware, transponder software.")

  3. Integrated computer-aided design using minicomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.

    1980-01-01

    Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), a highly interactive software system, has been implemented on minicomputers at the NASA Langley Research Center. CAD/CAM software integrates many formerly fragmented programs and procedures into one cohesive system; it also includes finite element modeling and analysis, and has been interfaced via a computer network to a relational data base management system and offline plotting devices on mainframe computers. The CAD/CAM software system requires interactive graphics terminals operating at a minimum of 4800 bits/sec transfer rate to a computer. The system is portable and introduces 'interactive graphics', which permits the creation and modification of models interactively. The CAD/CAM system has already produced designs for a large area space platform, a national transonic facility fan blade, and a laminar flow control wind tunnel model. Besides the design/drafting and finite element analysis capabilities, CAD/CAM provides options to produce automatic program tooling code to drive a numerically controlled (N/C) machine. Reductions in time for design, engineering, drawing, finite element modeling, and N/C machining will benefit productivity through reduced costs, fewer errors, and a wider range of configurations.

  4. Preliminary design of the redundant software experiment

    NASA Technical Reports Server (NTRS)

    Campbell, Roy; Deimel, Lionel; Eckhardt, Dave, Jr.; Kelly, John; Knight, John; Lauterbach, Linda; Lee, Larry; Mcallister, Dave; Mchugh, John

    1985-01-01

    The goal of the present experiment is to characterize the fault distributions of highly reliable software replicates, constructed using techniques and environments which are similar to those used in contemporary industrial software facilities. The fault distributions and their effect on the reliability of fault tolerant configurations of the software will be determined through extensive life testing of the replicates against carefully constructed randomly generated test data. Each detected error will be carefully analyzed to provide insight into its nature and cause. A direct objective is to develop techniques for reducing the intensity of coincident errors, thus increasing the reliability gain which can be achieved with fault tolerance. Data on the reliability gains realized and the cost of the fault tolerant configurations can be used to design a companion experiment to determine the cost effectiveness of the fault tolerant strategy. Finally, the data and analysis produced by this experiment will be valuable to the software engineering community as a whole because it will provide useful insight into the nature and cause of hard to find, subtle faults which escape standard software engineering validation techniques and thus persist far into the software life cycle.

  5. A Discussion of Using a Reconfigurable Processor to Implement the Discrete Fourier Transform

    NASA Technical Reports Server (NTRS)

    White, Michael J.

    2004-01-01

    This paper presents the design and implementation of the Discrete Fourier Transform (DFT) algorithm on a reconfigurable processor system. While highly applicable to many engineering problems, the DFT is an extremely computationally intensive algorithm. Consequently, the eventual goal of this work is to enhance the execution of a floating-point precision DFT algorithm by offloading the algorithm from the computing system. This computing system, within the context of this research, is a typical high performance desktop computer with an array of field programmable gate arrays (FPGAs). FPGAs are hardware devices that are configured by software to execute an algorithm. If it is desired to change the algorithm, the software is changed to reflect the modification and then downloaded to the FPGA, which is then itself modified. This paper will discuss the methodology for developing the DFT algorithm to be implemented on the FPGA. We will discuss the algorithm, the FPGA code effort, and the results to date.
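
    For reference, a naive software DFT is sketched below to illustrate the O(N^2) computation that motivates offloading to the FPGA; this is a generic NumPy sketch, not the authors' implementation.

        # Reference (software) DFT: the O(N^2) computation that would be offloaded
        # to hardware. Sketch only, not the authors' FPGA code.
        import numpy as np

        def dft(x):
            """Naive O(N^2) discrete Fourier transform of a 1D sequence."""
            x = np.asarray(x, dtype=complex)
            n = np.arange(x.size)
            k = n.reshape(-1, 1)
            twiddle = np.exp(-2j * np.pi * k * n / x.size)
            return twiddle @ x

        signal = np.sin(2 * np.pi * 5 * np.arange(64) / 64)
        assert np.allclose(dft(signal), np.fft.fft(signal))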

  6. Generation of arbitrarily shaped picosecond optical pulses using an integrated electrooptic waveguide modulator.

    PubMed

    Haner, M; Warren, W S

    1987-09-01

    We have produced complex, software-adjustable laser pulse shapes with ~10-ps resolution and pulse energies up to 100 microJ for spectroscopic applications. The key devices are a high damage threshold electrooptic directional coupler and a GaAs circuit for synthesizing arbitrarily shaped microwave pulses.

  7. The application of a computer data acquisition system to a new high temperature tribometer

    NASA Technical Reports Server (NTRS)

    Bonham, Charles D.; Dellacorte, Christopher

    1991-01-01

    The two data acquisition computer programs are described which were developed for a high temperature friction and wear test apparatus, a tribometer. The raw data produced by the tribometer and the methods used to sample that data are explained. In addition, the instrumentation and computer hardware and software are presented. Also shown is how computer data acquisition was applied to increase convenience and productivity on a high temperature tribometer.

  8. The application of a computer data acquisition system for a new high temperature tribometer

    NASA Technical Reports Server (NTRS)

    Bonham, Charles D.; Dellacorte, Christopher

    1990-01-01

    The two data acquisition computer programs are described which were developed for a high temperature friction and wear test apparatus, a tribometer. The raw data produced by the tribometer and the methods used to sample that data are explained. In addition, the instrumentation and computer hardware and software are presented. Also shown is how computer data acquisition was applied to increase convenience and productivity on a high temperature tribometer.

  9. Architectural Heritage Documentation by Using Low Cost Uav with Fisheye Lens: Otag-I Humayun in Istanbul as a Case Study

    NASA Astrophysics Data System (ADS)

    Yastikli, N.; Özerdem, Ö. Z.

    2017-11-01

    The digital documentation of architectural heritage is important for monitoring, preserving, and managing heritage, as well as for 3D BIM modelling and time-space VR (virtual reality) applications. Unmanned aerial vehicles (UAVs) have been widely used in these applications thanks to rapid developments in technology which enable high resolution images with resolutions at the millimetre level. Moreover, it has become possible to produce highly accurate 3D point clouds with structure from motion (SfM) and multi-view stereo (MVS), and to obtain a surface reconstruction of a realistic 3D architectural heritage model by using high-overlap images and 3D modelling software such as ContextCapture, Pix4Dmapper, and Photoscan. This study aims at the digital documentation of Otag-i Humayun (the Ottoman Empire Sultan's summer palace), located in Davutpaşa, Istanbul, Turkey, using a low cost UAV. Data collection was carried out with the low cost 3DR Solo UAV carrying a GoPro Hero 4 with a fisheye lens. The data processing was accomplished using the commercial Pix4D software. Dense point clouds, a true orthophoto, and a 3D solid model of Otag-i Humayun were produced as results. A quality check of the produced point clouds was performed. The results obtained from Otag-i Humayun in Istanbul proved that a low cost UAV with a fisheye lens can be successfully used for architectural heritage documentation.

  10. The Widening Gulf between Genomics Data Generation and Consumption: A Practical Guide to Big Data Transfer Technology.

    PubMed

    Feltus, Frank A; Breen, Joseph R; Deng, Juan; Izard, Ryan S; Konger, Christopher A; Ligon, Walter B; Preuss, Don; Wang, Kuang-Ching

    2015-01-01

    In the last decade, high-throughput DNA sequencing has become a disruptive technology and pushed the life sciences into a distributed ecosystem of sequence data producers and consumers. Given the power of genomics and declining sequencing costs, biology is an emerging "Big Data" discipline that will soon enter the exabyte data range when all subdisciplines are combined. These datasets must be transferred across commercial and research networks in creative ways since sending data without thought can have serious consequences on data processing time frames. Thus, it is imperative that biologists, bioinformaticians, and information technology engineers recalibrate data processing paradigms to fit this emerging reality. This review attempts to provide a snapshot of Big Data transfer across networks, which is often overlooked by many biologists. Specifically, we discuss four key areas: 1) data transfer networks, protocols, and applications; 2) data transfer security including encryption, access, firewalls, and the Science DMZ; 3) data flow control with software-defined networking; and 4) data storage, staging, archiving and access. A primary intention of this article is to orient the biologist in key aspects of the data transfer process in order to frame their genomics-oriented needs to enterprise IT professionals.

  11. Teachers' Initial Orchestration of Students' Dynamic Geometry Software Use: Consequences for Students' Opportunities to Learn Mathematics

    ERIC Educational Resources Information Center

    Erfjord, Ingvald

    2011-01-01

    This paper reports from a case study with teachers at two schools in Norway participating in developmental projects aiming for inquiry communities in mathematics teaching and learning. In the reported case study, the teachers participated in one of the developmental projects focusing on implementation and use of computer software in mathematics…

  12. PrimerSuite: A High-Throughput Web-Based Primer Design Program for Multiplex Bisulfite PCR.

    PubMed

    Lu, Jennifer; Johnston, Andrew; Berichon, Philippe; Ru, Ke-Lin; Korbie, Darren; Trau, Matt

    2017-01-24

    The analysis of DNA methylation at CpG dinucleotides has become a major research focus due to its regulatory role in numerous biological processes, but the need for assays which amplify bisulfite-converted DNA represents a major bottleneck due to the unique design constraints imposed on bisulfite-PCR primers. Moreover, a review of the literature indicated no available software solutions which accommodated high-throughput primer design, support for multiplex amplification assays, and primer-dimer prediction. In response, the tri-modular software package PrimerSuite was developed to support bisulfite multiplex PCR applications. This software was constructed to (i) design bisulfite primers against multiple regions simultaneously (PrimerSuite), (ii) screen for primer-primer dimerizing artefacts (PrimerDimer), and (iii) support multiplex PCR assays (PrimerPlex). Moreover, a major focus in the development of this software package was the emphasis on extensive empirical validation, and over 1300 unique primer pairs have been successfully designed and screened, with over 94% of them producing amplicons of the expected size, and an average mapping efficiency of 93% when screened using bisulfite multiplex resequencing. The potential use of the software in other bisulfite-based applications such as methylation-specific PCR is under consideration for future updates. This resource is freely available for use at the PrimerSuite website (www.primer-suite.com).
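
    As an illustration of one common primer-dimer heuristic, 3'-end complementarity between two primers, a small Python sketch follows; the window size, match threshold, and example sequences are assumptions and do not reproduce PrimerDimer's actual scoring.

        # Sketch of a common primer-dimer heuristic: check whether the 3' tails of
        # two primers are complementary in the antiparallel orientation. The
        # threshold and scoring are illustrative, not PrimerDimer's algorithm.
        COMP = {"A": "T", "T": "A", "G": "C", "C": "G"}

        def three_prime_dimer(primer_a, primer_b, window=5, min_matches=4):
            """Return True if the last `window` bases of the two primers can pair."""
            tail_a = primer_a[-window:]
            tail_b = primer_b[-window:][::-1]           # antiparallel alignment
            matches = sum(COMP[a] == b for a, b in zip(tail_a, tail_b))
            return matches >= min_matches

        print(three_prime_dimer("ACGTTGGCAGGTACC", "TTAGCAGCAGGTAC"))  # True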

  13. Use of Continuous Integration Tools for Application Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vergara Larrea, Veronica G; Joubert, Wayne; Fuson, Christopher B

    High performance computing systems are becoming increasingly complex, both in node architecture and in the multiple layers of software stack required to compile and run applications. As a consequence, the likelihood is increasing for application performance regressions to occur as a result of routine upgrades of system software components which interact in complex ways. The purpose of this study is to evaluate the effectiveness of continuous integration tools for application performance monitoring on HPC systems. In addition, this paper also describes a prototype system for application performance monitoring based on Jenkins, a Java-based continuous integration tool. The monitoring system described leverages several features in Jenkins to track application performance results over time. Preliminary results and lessons learned from monitoring applications on Cray systems at the Oak Ridge Leadership Computing Facility are presented.
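
    The following Python sketch illustrates the kind of check such a monitoring job might run after each benchmark: compare the newest runtime with a rolling baseline and fail the build on a significant slowdown. The tolerance, history file, and function are hypothetical, not the Jenkins configuration described in the paper.

        # Sketch of a performance-regression check a CI job might run after a
        # benchmark: compare the newest runtime against a rolling baseline and
        # exit non-zero (failing the build) on a significant slowdown.
        import json, sys

        TOLERANCE = 1.10  # flag runs more than 10% slower than the baseline (assumed)

        def check(history_file, new_runtime_s):
            with open(history_file) as f:
                runtimes = json.load(f)              # list of past runtimes (seconds)
            baseline = sum(runtimes[-5:]) / len(runtimes[-5:])
            if new_runtime_s > TOLERANCE * baseline:
                print(f"Regression: {new_runtime_s:.1f}s vs baseline {baseline:.1f}s")
                sys.exit(1)
            print("Performance within tolerance")

        # check("app_runtimes.json", 123.4)  # hypothetical usage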

  14. Epigenomics of Development in Populus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strauss, Steve; Freitag, Michael; Mockler, Todd

    2013-01-10

    We conducted research to determine the role of epigenetic modifications during tree development using poplar (Populus trichocarpa), a model woody feedstock species. Using methylated DNA immunoprecipitation (MeDIP) or chromatin immunoprecipitation (ChIP), followed by high-throughput sequencing, we analyzed DNA and histone methylation patterns in the P. trichocarpa genome in relation to four biological processes: bud dormancy and release, mature organ maintenance, in vitro organogenesis, and methylation suppression. Our project is now completed. We have 1) produced 22 transgenic events for a gene involved in DNA methylation suppression and studied its phenotypic consequences; 2) completed sequencing of methylated DNA from eleven target tissues in wildtype P. trichocarpa; 3) updated our customized poplar genome browser using the open-source software tools (2.13) and (V2.2) of the P. trichocarpa genome; 4) produced summary data for genome methylation in P. trichocarpa, including distribution of methylation across chromosomes and in and around genes; 5) employed bioinformatic and statistical methods to analyze differences in methylation patterns among tissue types; 6) used bisulfite sequencing of selected target genes to confirm bioinformatics and sequencing results and gain a higher-resolution view of methylation at selected genes; and 7) compared methylation patterns to expression using available microarray data. Our main findings of biological significance are the identification of extensive regions of the genome that display developmental variation in DNA methylation; highly distinctive gene-associated methylation profiles in reproductive tissues, particularly male catkins; a strong whole genome/all tissue inverse association of methylation at gene bodies and promoters with gene expression; a lack of evidence that tissue specificity of gene expression is associated with gene methylation; and evidence that genome methylation is a significant impediment to tissue dedifferentiation and redifferentiation in vitro.

  15. Avoid the four perils of CRM.

    PubMed

    Rigby, Darrell K; Reichheld, Frederick F; Schefter, Phil

    2002-02-01

    Customer relationship management is one of the hottest management tools today. But more than half of all CRM initiatives fail to produce the anticipated results. Why? And what can companies do to reverse that negative trend? The authors--three senior Bain consultants--have spent the past ten years analyzing customer-loyalty initiatives, both successful and unsuccessful, at more than 200 companies in a wide range of industries. They've found that CRM backfires in part because executives don't understand what they are implementing, let alone how much it will cost or how long it will take. The authors' research unveiled four common pitfalls that managers stumble into when trying to implement CRM. Each pitfall is a consequence of a single flawed assumption--that CRM is software that will automatically manage customer relationships. It isn't. Rather, CRM is the creation of customer strategies and processes to build customer loyalty, which are then supported by the technology. This article looks at best practices in CRM at several companies, including the New York Times Company, Square D, GE Capital, Grand Expeditions, and BMC Software. It provides an intellectual framework for any company that wants to start a CRM program or turn around a failing one.

  16. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon D.

    1991-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. All materials have been grouped into eight general subject areas for easy reference: The Software Engineering Laboratory; The Software Engineering Laboratory: Software Development Documents; Software Tools; Software Models; Software Measurement; Technology Evaluations; Ada Technology; and Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.

  17. Tarjetas v.1.2015.7.23

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burchard, Ross L.; Pierson, Kathleen P.; Trumbo, Derek

    Tarjetas is used to generate requirements from source documents. These source documents are in a hierarchical XML format produced from PDF documents processed through the “Reframe” software package. The software includes the ability to create Topics and associate text Snippets with those topics. Requirements are then generated, and text Snippets with their associated Topics are referenced to the requirement. The software maintains traceability from the requirement ultimately to the source document that produced the snippet.

  18. Rapid prototyping in orthopaedic surgery: a user's guide.

    PubMed

    Frame, Mark; Huntley, James S

    2012-01-01

    Rapid prototyping (RP) is applicable to orthopaedic problems involving three dimensions, particularly fractures, deformities, and reconstruction. In the past, RP has been hampered by cost and difficulties accessing the appropriate expertise. Here we outline the history of rapid prototyping and furthermore a process using open-source software to produce a high fidelity physical model from CT data. This greatly mitigates the expense associated with the technique, allowing surgeons to produce precise models for preoperative planning and procedure rehearsal. We describe the method with an illustrative case.

  19. Culture shock: Improving software quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Jong, K.; Trauth, S.L.

    1988-01-01

    The concept of software quality can represent a significant shock to an individual who has been developing software for many years and who believes he or she has been doing a high quality job. The very idea that software includes lines of code and associated documentation is foreign and difficult to grasp, at best. Implementation of a software quality program hinges on the concept that software is a product whose quality needs improving. When this idea is introduced into a technical community that is largely "self-taught" and has been producing "good" software for some time, a fundamental understanding of the concepts associated with software is often weak. Software developers can react as if to say, "What are you talking about? What do you mean I'm not doing a good job? I haven't gotten any complaints about my code yet!" Coupling such surprise and resentment with the shock that software really is a product and software quality concepts do exist can fuel the volatility of these emotions. In this paper, we demonstrate that the concept of software quality can indeed pose a culture shock to developers. We also show that a "typical" quality assurance approach, that of imposing a standard and providing inspectors and auditors to assure its adherence, contributes to this shock and detracts from the very goal the approach should achieve. We offer an alternative, adopted through experience, to implement a software quality program: cooperative assistance. We show how cooperation, education, consultation and friendly assistance can overcome this culture shock. 3 refs.

  20. A comprehensive evaluation of popular proteomics software workflows for label-free proteome quantification and imputation.

    PubMed

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2017-05-31

    Label-free mass spectrometry (MS) has developed into an important tool applied in various fields of biological and life sciences. Several software packages exist to process the raw MS data into quantified protein abundances, including open source and commercial solutions. Each package includes a set of unique algorithms for different tasks of the MS data processing workflow. While many of these algorithms have been compared separately, a thorough and systematic evaluation of their overall performance is missing. Moreover, systematic information is lacking about the amount of missing values produced by the different proteomics software and the capabilities of different data imputation methods to account for them. In this study, we evaluated the performance of five popular quantitative label-free proteomics software workflows using four different spike-in data sets. Our extensive testing included the number of proteins quantified and the number of missing values produced by each workflow, the accuracy of detecting differential expression and logarithmic fold change, and the effect of different imputation and filtering methods on the differential expression results. We found that the Progenesis software performed consistently well in the differential expression analysis and produced few missing values. The missing values produced by the other software decreased their performance, but this difference could be mitigated using proper data filtering or imputation methods. Among the imputation methods, we found that the local least squares (lls) regression imputation consistently increased the performance of the software in the differential expression analysis, and a combination of both data filtering and local least squares imputation increased performance the most in the tested data sets. © The Author 2017. Published by Oxford University Press.
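
    A greatly simplified Python sketch of local least squares (lls)-style imputation is given below for orientation; it is illustrative only and is not the implementation evaluated in the study.

        # Simplified sketch of lls-style imputation: each protein with missing
        # values is regressed, on its observed samples, against the k complete
        # proteins most correlated with it, and the fit predicts the missing samples.
        import numpy as np

        def lls_impute(X, k=5):
            X = X.astype(float).copy()               # proteins x samples, NaN = missing
            complete = ~np.isnan(X).any(axis=1)
            donors = X[complete]
            for i in np.where(~complete)[0]:
                obs = ~np.isnan(X[i])
                if donors.shape[0] < k or obs.sum() < k:
                    continue                          # too little information; skip row
                corr = np.abs([np.corrcoef(X[i, obs], d[obs])[0, 1] for d in donors])
                nearest = donors[np.argsort(-corr)[:k]]
                coef, *_ = np.linalg.lstsq(nearest[:, obs].T, X[i, obs], rcond=None)
                X[i, ~obs] = nearest[:, ~obs].T @ coef
            return X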

  1. Collected software engineering papers, volume 9

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1990 through October 1991. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the ninth such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. For the convenience of this presentation, the eight papers contained here are grouped into three major categories: (1) software models studies; (2) software measurement studies; and (3) Ada technology studies. The first category presents studies on reuse models, including a software reuse model applied to maintenance and a model for an organization to support software reuse. The second category includes experimental research methods and software measurement techniques. The third category presents object-oriented approaches using Ada and object-oriented features proposed for Ada. The SEL is actively working to understand and improve the software development process at GSFC.

  2. Introduction: Cybersecurity and Software Assurance Minitrack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burns, Luanne; George, Richard; Linger, Richard C

    Modern society is dependent on software systems of remarkable scope and complexity. Yet methods for assuring their security and functionality have not kept pace. The result is persistent compromises and failures despite best efforts. Cybersecurity methods must work together for situational awareness, attack prevention and detection, threat attribution, minimization of consequences, and attack recovery. Because defective software cannot be secure, assurance technologies must play a central role in cybersecurity approaches. There is increasing recognition of the need for rigorous methods for cybersecurity and software assurance. The goal of this minitrack is to develop science foundations, technologies, and practices that can improve the security and dependability of complex systems.

  3. EOS MLS Level 2 Data Processing Software Version 3

    NASA Technical Reports Server (NTRS)

    Livesey, Nathaniel J.; VanSnyder, Livesey W.; Read, William G.; Schwartz, Michael J.; Lambert, Alyn; Santee, Michelle L.; Nguyen, Honghanh T.; Froidevaux, Lucien; Wang, Shuhui; Manney, Gloria L.

    2011-01-01

    This software accepts the EOS MLS calibrated measurements of microwave radiance products and operational meteorological data, and produces a set of estimates of atmospheric temperature and composition. This version has been designed to be as flexible as possible. The software is controlled by a Level 2 Configuration File that controls all aspects of the software: defining the contents of state and measurement vectors, defining the configurations of the various forward models available, reading appropriate a priori spectroscopic and calibration data, performing retrievals, post-processing results, computing diagnostics, and outputting results in appropriate files. In production mode, the software operates in a parallel form, with one instance of the program acting as a master, coordinating the work of multiple slave instances on a cluster of computers, each computing the results for individual chunks of data. In addition to performing conventional retrieval calculations and producing geophysical products, the Level 2 Configuration File can instruct the software to produce files of simulated radiances based on a state vector formed from a set of geophysical product files taken as input. Combining both the retrieval and simulation tasks in a single piece of software makes it far easier to ensure that identical forward model algorithms and parameters are used in both tasks. This also dramatically reduces the complexity of the code maintenance effort.
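
    The master/worker arrangement described above can be sketched generically in Python with multiprocessing, as below; the chunking and the retrieval function are placeholders, not the MLS Level 2 code.

        # Generic sketch of the master/worker ("master/slave") pattern: one
        # coordinating process farms out chunks of data to worker processes.
        from multiprocessing import Pool

        def retrieve_chunk(chunk_id):
            """Placeholder for processing one chunk of along-track measurements."""
            return chunk_id, f"products for chunk {chunk_id}"

        if __name__ == "__main__":
            chunks = range(8)                      # one work unit per data chunk
            with Pool(processes=4) as pool:        # the master coordinates the workers
                for chunk_id, result in pool.imap_unordered(retrieve_chunk, chunks):
                    print(chunk_id, result)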

  4. The theory and implementation of a high quality pulse width modulated waveform synthesiser applicable to voltage FED inverters

    NASA Astrophysics Data System (ADS)

    Lower, Kim Nigel

    1985-03-01

    Modulation processes associated with the digital implementation of pulse width modulation (PWM) switching strategies were examined. A software package based on a portable turnkey structure is presented. Waveform synthesizer implementation techniques are reviewed. A three phase PWM waveform synthesizer for voltage fed inverters was realized. It is based on a constant carrier frequency of 18 kHz and a regular sample, single edge, asynchronous PWM switching scheme. With high carrier frequencies, it is possible to utilize simple switching strategies and as a consequence, many advantages are highlighted, emphasizing the importance to industrial and office markets.
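
    A minimal Python sketch of regular (uniform) sampling for single-edge PWM follows: the sinusoidal reference is sampled once per 18 kHz carrier period and each sample fixes that period's duty cycle. The reference frequency and modulation index are assumptions.

        # Sketch of regular-sampled, single-edge PWM duty-cycle generation.
        # Carrier frequency is taken from the abstract; the reference frequency
        # and modulation index are illustrative assumptions.
        import numpy as np

        f_carrier = 18_000            # Hz, carrier frequency
        f_ref = 50                    # Hz, assumed fundamental of the reference
        m = 0.8                       # assumed modulation index

        t_sample = np.arange(0, 1 / f_ref, 1 / f_carrier)   # one fundamental cycle
        reference = m * np.sin(2 * np.pi * f_ref * t_sample)
        duty = 0.5 * (1.0 + reference)                       # map [-1, 1] -> [0, 1]
        on_times_us = duty / f_carrier * 1e6                 # per-period on-time in microseconds

        print(len(duty), on_times_us[:4].round(2))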

  5. Combination of structured illumination and single molecule localization microscopy in one setup

    NASA Astrophysics Data System (ADS)

    Rossberger, Sabrina; Best, Gerrit; Baddeley, David; Heintzmann, Rainer; Birk, Udo; Dithmar, Stefan; Cremer, Christoph

    2013-09-01

    Understanding the positional and structural aspects of biological nanostructures simultaneously is as much a challenge as a desideratum. In recent years, highly accurate (20 nm) positional information of optically isolated targets down to the nanometer range has been obtained using single molecule localization microscopy (SMLM), while highly resolved (100 nm) spatial information has been achieved using structured illumination microscopy (SIM). In this paper, we present a high-resolution fluorescence microscope setup which combines the advantages of SMLM with SIM in order to provide high-precision localization and structural information in a single setup. Furthermore, the combination of the wide-field SIM image with the SMLM data allows us to identify artifacts produced during the visualization process of SMLM data, and potentially also during the reconstruction process of SIM images. We describe the SMLM-SIM combo and software, and apply the instrument in a first proof-of-principle to the same region of H3K293 cells to achieve SIM images with high structural resolution (in the 100 nm range) in overlay with the highly accurate position information of localized single fluorophores. Thus, with its robust control software, efficient switching between the SMLM and SIM mode, fully automated and user-friendly acquisition and evaluation software, the SMLM-SIM combo is superior over existing solutions.

  6. Effects of byproducts amended lead contaminated urban soils on carrot yield and lead uptake

    USDA-ARS?s Scientific Manuscript database

    Lead (Pb) has been used to produce a large number of materials and manufactured products. In areas with a history of lead paint use, high vehicular traffic and/or areas close to urban and industrial centers, atmospheric lead deposition may be very high. Consequently, a high deposition of lead in u...

  7. preAssemble: a tool for automatic sequencer trace data processing.

    PubMed

    Adzhubei, Alexei A; Laerdahl, Jon K; Vlasova, Anna V

    2006-01-17

    Trace or chromatogram files (raw data) are produced by automatic nucleic acid sequencing equipment or sequencers. Each file contains information which can be interpreted by specialised software to reveal the sequence (base calling). This is done by the sequencer proprietary software or publicly available programs. Depending on the size of a sequencing project, the number of trace files can vary from just a few to thousands of files. Sequencing quality assessment on various criteria is important at the stage preceding clustering and contig assembly. Two major publicly available packages, Phred and Staden, are used by preAssemble to perform sequence quality processing. The preAssemble pre-assembly sequence processing pipeline has been developed for small to large scale automatic processing of DNA sequencer chromatogram (trace) data. The Staden Package Pregap4 module and the base-calling program Phred are utilized in the pipeline, which produces detailed and self-explanatory output that can be displayed with a web browser. preAssemble can be used successfully with very little previous experience; however, options for parameter tuning are provided for advanced users. preAssemble runs under UNIX and Linux operating systems. It is available for downloading and will run as stand-alone software. It can also be accessed on the Norwegian Salmon Genome Project web site, where preAssemble jobs can be run on the project server. preAssemble is a tool for performing quality assessment of sequences generated by automatic sequencing equipment. preAssemble is flexible, since both interactive jobs on the preAssemble server and the stand-alone downloadable version are available. Virtually no previous experience is necessary to run a default preAssemble job; on the other hand, options for parameter tuning are provided. Consequently, preAssemble can be used as efficiently for just a few trace files as for large-scale sequence processing.
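
    For orientation, the following Python sketch shows the kind of base-call quality step such a pipeline automates: the Phred quality score Q relates to the error probability P by Q = -10 log10(P), and low-quality read tails are trimmed. The threshold and helper names are assumptions, not preAssemble defaults.

        # Sketch of Phred-style quality assessment: convert quality scores to
        # error probabilities and trim the low-quality tail of a read.
        def error_probability(q):
            return 10 ** (-q / 10)

        def trim_by_quality(bases, quals, min_q=20):
            """Keep the longest prefix in which every base meets the threshold."""
            for i, q in enumerate(quals):
                if q < min_q:
                    return bases[:i], quals[:i]
            return bases, quals

        seq, quals = trim_by_quality("ACGTAGGC", [35, 34, 30, 28, 25, 18, 12, 9])
        print(seq, [round(error_probability(q), 4) for q in quals])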

  8. Virtual Reality Robotic Programming Software in the Technology Classroom

    ERIC Educational Resources Information Center

    Geissler, Jason; Knott, Patrick J.; Vazquez, Matthew R.; Wright, John R., Jr.

    2004-01-01

    Robots make a wonderful context for teaching students about many concepts important to technological literacy. They can provide an authentic context and produce high levels of motivation. According to Standards for Technological Literacy: Content for the Study of Technology (STL) (ITEA, 2000, 2002), there are six core concepts that should be…

  9. Risk Based Inspection Methodology and Software Applied to Atmospheric Storage Tanks

    NASA Astrophysics Data System (ADS)

    Topalis, P.; Korneliussen, G.; Hermanrud, J.; Steo, Y.

    2012-05-01

    A new risk-based inspection (RBI) methodology and software is presented in this paper. The objective of this work is to allow management of the inspections of atmospheric storage tanks in the most efficient way, while, at the same time, accident risks are minimized. The software has been built on the new risk framework architecture, a generic platform facilitating efficient and integrated development of software applications using risk models. The framework includes a library of risk models and the user interface is automatically produced on the basis of editable schemas. This risk-framework-based RBI tool has been applied in the context of RBI for above-ground atmospheric storage tanks (AST) but it has been designed with the objective of being generic enough to allow extension to the process plants in general. This RBI methodology is an evolution of an approach and mathematical models developed for Det Norske Veritas (DNV) and the American Petroleum Institute (API). The methodology assesses damage mechanism potential, degradation rates, probability of failure (PoF), consequence of failure (CoF) in terms of environmental damage and financial loss, risk and inspection intervals and techniques. The scope includes assessment of the tank floor for soil-side external corrosion and product-side internal corrosion and the tank shell courses for atmospheric corrosion and internal thinning. It also includes preliminary assessment for brittle fracture and cracking. The data are structured according to an asset hierarchy including Plant, Production Unit, Process Unit, Tag, Part and Inspection levels and the data are inherited / defaulted seamlessly from a higher hierarchy level to a lower level. The user interface includes synchronized hierarchy tree browsing, dynamic editor and grid-view editing and active reports with drill-in capability.
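
    The core RBI arithmetic, risk as the product of probability of failure (PoF) and consequence of failure (CoF) used to rank items for inspection, can be sketched as follows; the tags, values, and ranking rule are illustrative, not the DNV/API models.

        # Sketch of the core RBI arithmetic: risk = PoF x CoF, used to rank
        # equipment parts for inspection. Values are illustrative only.
        tanks = {
            # tag: (PoF per year, CoF in currency units)
            "TK-101 floor": (1e-3, 5_000_000),
            "TK-101 shell": (2e-4, 1_500_000),
            "TK-102 floor": (5e-4, 4_000_000),
        }

        risk = {tag: pof * cof for tag, (pof, cof) in tanks.items()}
        for tag, r in sorted(risk.items(), key=lambda kv: -kv[1]):
            print(f"{tag}: expected loss {r:,.0f} per year")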

  10. User's Manual for the National Water-Quality Assessment Program Invertebrate Data Analysis System (IDAS) Software: Version 3

    USGS Publications Warehouse

    Cuffney, Thomas F.

    2003-01-01

    The Invertebrate Data Analysis System (IDAS) software provides an accurate, consistent, and efficient mechanism for analyzing invertebrate data collected as part of the National Water-Quality Assessment Program and stored in the Biological Transactional Database (Bio-TDB). The IDAS software is a stand-alone program for personal computers that run Microsoft (MS) Windows. It allows users to read data downloaded from Bio-TDB and stored either as MS Excel or MS Access files. The program consists of five modules. The Edit Data module allows the user to subset, combine, delete, and summarize community data. The Data Preparation module allows the user to select the type(s) of sample(s) to process, calculate densities, delete taxa based on laboratory processing notes, combine lifestages or keep them separate, select a lowest taxonomic level for analysis, delete rare taxa, and resolve taxonomic ambiguities. The Calculate Community Metrics module allows the user to calculate over 130 community metrics, including metrics based on organism tolerances and functional feeding groups. The Calculate Diversities and Similarities module allows the user to calculate nine diversity and eight similarity indices. The Data Export module allows the user to export data to other software packages and produce tables of community data that can be imported into spreadsheet and word-processing programs. Though the IDAS program was developed to process invertebrate data downloaded from USGS databases, it will work with other data sets that are converted to the USGS (Bio-TDB) format. Consequently, the data manipulation, analysis, and export procedures provided by the IDAS program can be used by anyone involved in using benthic macroinvertebrates in applied or basic research.
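
    Two indices of the kind IDAS reports, Shannon diversity and Jaccard similarity, are sketched below using standard textbook formulas; this is not code from the IDAS program.

        # Sketch of two standard community indices: Shannon diversity for one
        # sample and Jaccard similarity between two samples.
        import math

        def shannon_diversity(counts):
            """H' = -sum(p_i * ln p_i) over taxa with non-zero abundance."""
            total = sum(counts)
            return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

        def jaccard_similarity(taxa_a, taxa_b):
            """Shared taxa divided by total distinct taxa in the two samples."""
            a, b = set(taxa_a), set(taxa_b)
            return len(a & b) / len(a | b)

        print(shannon_diversity([10, 5, 2, 1]))
        print(jaccard_similarity({"Baetis", "Chironomus"}, {"Baetis", "Hydropsyche"}))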

  11. Generation of multiple analog pulses with different duty cycles within VME control system for ICRH Aditya system

    NASA Astrophysics Data System (ADS)

    Joshi, Ramesh; Singh, Manoj; Jadav, H. M.; Misra, Kishor; Kulkarni, S. V.; ICRH-RF Group

    2010-02-01

    Ion Cyclotron Resonance Heating (ICRH) is a promising heating method for a fusion device due to its localized power deposition profile, direct ion heating at high density, and established technology for high RF power generation and transmission at low cost. Multiple analog pulses with different duty cycles, slaved to a master digital pulse, are to be generated within the data acquisition and control system for the steady state RF ICRH system (RF ICRH DAC) used to operate the RF generator in Aditya: the first analog pulse produces pre-ionization and the second analog pulse produces heating. The existing control system software is based on single digital pulse operation of the RF source; it is planned to integrate multiple analog pulses with different duty cycles under the master digital pulse into the RF ICRH DAC for operating the RF generator in the Aditya tokamak. The task of the RF ICRH DAC is to control and acquire data from all ICRH system operations, including all control loops, with post-analysis of the data in a Java-based tool, for pre-ionization startup as well as heating experiments using multiple RF pulses of different powers and durations. The experiment is based on the idea of using a single RF generator to energize the antenna inside the tokamak to radiate power twice, where the first analog pulse produces pre-ionization and the second analog pulse produces heating. The whole system is based on standard client-server technology using the TCP/IP protocol. The DAC software runs on the Linux operating system for highly reliable, secure, and stable operation in a fail-safe manner. The client is built with a Tcl/Tk-like toolkit for the user interface in a C/C++-like environment, reliable programming languages widely used for stand-alone system operation, with the server running in a VxWorks-like real-time operating system environment. The paper focuses on the data acquisition and monitoring system software of the Aditya RF ICRH system, with analog pulses in slave mode and the digital pulse in master mode for control, acquisition, monitoring, and interlocking.

  12. Combining High-Speed Cameras and Stop-Motion Animation Software to Support Students' Modeling of Human Body Movement

    NASA Astrophysics Data System (ADS)

    Lee, Victor R.

    2015-04-01

    Biomechanics, and specifically the biomechanics associated with human movement, is a potentially rich backdrop against which educators can design innovative science teaching and learning activities. Moreover, technologies associated with biomechanics research, such as high-speed cameras that can produce high-quality slow-motion video, can be deployed in such a way as to support students' participation in practices of scientific modeling. As participants in a classroom design experiment, fifteen fifth-grade students worked with high-speed cameras and stop-motion animation software (SAM Animation) over several days to produce dynamic models of motion and body movement. The designed series of learning activities involved iterative cycles of animation creation and critique and use of various depictive materials. Subsequent analysis of flipbooks of human jumping movements created by the students at the beginning and end of the unit revealed a significant improvement in the epistemic fidelity of students' representations. Excerpts from classroom observations highlight the role that the teacher plays in supporting students' thoughtful reflection on and attention to slow-motion video. In total, this design and research intervention demonstrates that the combination of technologies, activities, and teacher support can lead to improvements in some of the foundations associated with students' modeling.

  13. Software for Aerospace Education. A Bibliography (Second Edition).

    ERIC Educational Resources Information Center

    Vogt, Gregory L.; And Others

    The software described in this bibliography represents programs made available to the National Aeronautics and Space Administration (NASA) Educational Technology Branch by software producers and vendors. More than 200 computer software programs and 12 laser videodisk programs are reviewed in terms of title, copyright, subject, application, type,…

  14. Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects

    ERIC Educational Resources Information Center

    Buffardi, Kevin John

    2014-01-01

    Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…

  15. Students Soaring High with Software Spinoff

    NASA Technical Reports Server (NTRS)

    2004-01-01

    An educational software product designed by the Educational Technology Team at Ames Research Center is bringing actual aeronautical work performed by NASA engineers to the public in an interactive format for the very first time, in order to introduce future generations of engineers to the fundamentals of flight. The "Exploring Aeronautics" multimedia CD-ROM was created for use by teachers of students in grades 5 through 8. The software offers an introduction to aeronautics and covers the fundamentals of flight, including how airplanes take off, fly, and land. It contains a historical timeline and a glossary of aeronautical terms, examines different types of aircraft, and familiarizes its audience with the tools used by researchers to test aircraft designs, like wind tunnels and computational fluid dynamics. "Exploring Aeronautics was done in cartoon animation to make it appealing to kids," notes Andrew Doser, an Ames graphic artist who helped to produce the CD-ROM, along with a team of multimedia programmers, artists, and educators, in conjunction with numerous Ames scientists. In addition to lively animation, the software features QuickTime movies and highly intuitive tools to promote usage of NASA's scientific methods in the world of aeronautics.

  16. Producing genome structure populations with the dynamic and automated PGS software.

    PubMed

    Hua, Nan; Tjong, Harianto; Shin, Hanjun; Gong, Ke; Zhou, Xianghong Jasmine; Alber, Frank

    2018-05-01

    Chromosome conformation capture technologies such as Hi-C are widely used to investigate the spatial organization of genomes. Because genome structures can vary considerably between individual cells of a population, interpreting ensemble-averaged Hi-C data can be challenging, in particular for long-range and interchromosomal interactions. We pioneered a probabilistic approach for the generation of a population of distinct diploid 3D genome structures consistent with all the chromatin-chromatin interaction probabilities from Hi-C experiments. Each structure in the population is a physical model of the genome in 3D. Analysis of these models yields new insights into the causes and the functional properties of the genome's organization in space and time. We provide a user-friendly software package, called PGS, which runs on local machines (for practice runs) and high-performance computing platforms. PGS takes a genome-wide Hi-C contact frequency matrix, along with information about genome segmentation, and produces an ensemble of 3D genome structures entirely consistent with the input. The software automatically generates an analysis report, and provides tools to extract and analyze the 3D coordinates of specific domains. Basic Linux command-line knowledge is sufficient for using this software. A typical running time of the pipeline is ∼3 d with 300 cores on a computer cluster to generate a population of 1,000 diploid genome structures at topological-associated domain (TAD)-level resolution.

  17. Removing a barrier to computer-based outbreak and disease surveillance--the RODS Open Source Project.

    PubMed

    Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J

    2004-09-24

    Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.

  18. Developing high-quality educational software.

    PubMed

    Johnson, Lynn A; Schleyer, Titus K L

    2003-11-01

    The development of effective educational software requires a systematic process executed by a skilled development team. This article describes the core skills required of the development team members for the six phases of successful educational software development. During analysis, the foundation of product development is laid including defining the audience and program goals, determining hardware and software constraints, identifying content resources, and developing management tools. The design phase creates the specifications that describe the user interface, the sequence of events, and the details of the content to be displayed. During development, the pieces of the educational program are assembled. Graphics and other media are created, video and audio scripts written and recorded, the program code created, and support documentation produced. Extensive testing by the development team (alpha testing) and with students (beta testing) is conducted. Carefully planned implementation is most likely to result in a flawless delivery of the educational software and maintenance ensures up-to-date content and software. Due to the importance of the sixth phase, evaluation, we have written a companion article on it that follows this one. The development of a CD-ROM product is described including the development team, a detailed description of the development phases, and the lessons learned from the project.

  19. The Software Ontology (SWO): a resource for reproducibility in biomedical data analysis, curation and digital preservation.

    PubMed

    Malone, James; Brown, Andy; Lister, Allyson L; Ison, Jon; Hull, Duncan; Parkinson, Helen; Stevens, Robert

    2014-01-01

    Biomedical ontologists to date have concentrated on ontological descriptions of biomedical entities such as gene products and their attributes, phenotypes and so on. Recently, effort has diversified to descriptions of the laboratory investigations by which these entities were produced. However, much biological insight is gained from the analysis of the data produced from these investigations, and there is a lack of adequate descriptions of the wide range of software that are central to bioinformatics. We need to describe how data are analyzed for discovery, audit trails, provenance and reproducibility. The Software Ontology (SWO) is a description of software used to store, manage and analyze data. Input to the SWO has come from beyond the life sciences, but its main focus is the life sciences. We used agile techniques to gather input for the SWO and keep engagement with our users. The result is an ontology that meets the needs of a broad range of users by describing software, its information processing tasks, data inputs and outputs, data formats versions and so on. Recently, the SWO has incorporated EDAM, a vocabulary for describing data and related concepts in bioinformatics. The SWO is currently being used to describe software used in multiple biomedical applications. The SWO is another element of the biomedical ontology landscape that is necessary for the description of biomedical entities and how they were discovered. An ontology of software used to analyze data produced by investigations in the life sciences can be made in such a way that it covers the important features requested and prioritized by its users. The SWO thus fits into the landscape of biomedical ontologies and is produced using techniques designed to keep it in line with user's needs. The Software Ontology is available under an Apache 2.0 license at http://theswo.sourceforge.net/; the Software Ontology blog can be read at http://softwareontology.wordpress.com.

  20. The Software Ontology (SWO): a resource for reproducibility in biomedical data analysis, curation and digital preservation

    PubMed Central

    2014-01-01

    Motivation Biomedical ontologists to date have concentrated on ontological descriptions of biomedical entities such as gene products and their attributes, phenotypes and so on. Recently, effort has diversified to descriptions of the laboratory investigations by which these entities were produced. However, much biological insight is gained from the analysis of the data produced from these investigations, and there is a lack of adequate descriptions of the wide range of software that is central to bioinformatics. We need to describe how data are analyzed for discovery, audit trails, provenance and reproducibility. Results The Software Ontology (SWO) is a description of software used to store, manage and analyze data. Input to the SWO has come from beyond the life sciences, but its main focus is the life sciences. We used agile techniques to gather input for the SWO and maintain engagement with our users. The result is an ontology that meets the needs of a broad range of users by describing software, its information processing tasks, data inputs and outputs, data formats, versions and so on. Recently, the SWO has incorporated EDAM, a vocabulary for describing data and related concepts in bioinformatics. The SWO is currently being used to describe software used in multiple biomedical applications. Conclusion The SWO is another element of the biomedical ontology landscape that is necessary for the description of biomedical entities and how they were discovered. An ontology of software used to analyze data produced by investigations in the life sciences can be made in such a way that it covers the important features requested and prioritized by its users. The SWO thus fits into the landscape of biomedical ontologies and is produced using techniques designed to keep it in line with users' needs. Availability The Software Ontology is available under an Apache 2.0 license at http://theswo.sourceforge.net/; the Software Ontology blog can be read at http://softwareontology.wordpress.com. PMID:25068035

  1. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand.

    PubMed

    Chung, Beom Sun; Chung, Min Suk; Shin, Byeong Seok; Kwon, Koojoo

    2018-02-19

    The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. © 2018 The Korean Academy of Medical Sciences.

  2. Three Software Tools for Viewing Sectional Planes, Volume Models, and Surface Models of a Cadaver Hand

    PubMed Central

    2018-01-01

    Background The hand anatomy, including the complicated hand muscles, can be grasped by using computer-assisted learning tools with high quality two-dimensional images and three-dimensional models. The purpose of this study was to present up-to-date software tools that promote learning of stereoscopic morphology of the hand. Methods On the basis of horizontal sectioned images and outlined images of a male cadaver, vertical planes, volume models, and surface models were elaborated. Software to browse pairs of the sectioned and outlined images in orthogonal planes and software to peel and rotate the volume models, as well as a portable document format (PDF) file to select and rotate the surface models, were produced. Results All of the software tools were downloadable free of charge and usable off-line. The three types of tools for viewing multiple aspects of the hand could be adequately employed according to individual needs. Conclusion These new tools involving the realistic images of a cadaver and the diverse functions are expected to improve comprehensive knowledge of the hand shape. PMID:29441756

  3. NASA's TReK Project: A Case Study in Using the Spiral Model of Software Development

    NASA Technical Reports Server (NTRS)

    Hendrix, T. Dean; Schneider, Michelle P.

    1998-01-01

    Software development projects face numerous challenges that threaten their successful completion. Whether it is not enough money, too little time, or a case of "requirements creep" that has turned into a full sprint, projects must meet these challenges or face possible disastrous consequences. A robust, yet flexible process model can provide a mechanism through which software development teams can meet these challenges head on and win. This article describes how the spiral model has been successfully tailored to a specific project and relates some notable results to date.

  4. Standardized development of computer software. Part 2: Standards

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1978-01-01

    This monograph contains standards for software development and engineering. The book sets forth rules for design, specification, coding, testing, documentation, and quality assurance audits of software; it also contains detailed outlines for the documentation to be produced.

  5. Software for minimalistic data management in large camera trap studies

    PubMed Central

    Krishnappa, Yathin S.; Turner, Wendy C.

    2014-01-01

    The use of camera traps is now widespread and their importance in wildlife studies well understood. Camera trap studies can produce millions of photographs and there is a need for software to help manage photographs efficiently. In this paper, we describe a software system that was built to successfully manage a large behavioral camera trap study that produced more than a million photographs. We describe the software architecture and the design decisions that shaped the evolution of the program over the study’s three year period. The software system has the ability to automatically extract metadata from images, and add customized metadata to the images in a standardized format. The software system can be installed as a standalone application on popular operating systems. It is minimalistic, scalable and extendable so that it can be used by small teams or individual researchers for a broad variety of camera trap studies. PMID:25110471
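
    The record above describes automatic extraction of image metadata. As a minimal illustration of that task (an assumption for illustration only, using Python and the Pillow library rather than the software described in the record), EXIF tags such as the capture timestamp can be read as follows:

      # Minimal sketch of EXIF metadata extraction, assuming the Pillow library;
      # illustrative only, not the camera trap software described in the record.
      from PIL import Image, ExifTags

      def read_exif(path):
          """Return a {tag_name: value} dict of the EXIF tags stored in an image."""
          with Image.open(path) as img:
              exif = img.getexif()
              return {ExifTags.TAGS.get(tag_id, tag_id): value
                      for tag_id, value in exif.items()}

      # Example: pull the capture timestamp used to order camera-trap photographs.
      # print(read_exif("IMG_0001.JPG").get("DateTime"))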

  6. Using CyberShake Workflows to Manage Big Seismic Hazard Data on Large-Scale Open-Science HPC Resources

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2015-12-01

    The CyberShake computational platform, developed by the Southern California Earthquake Center (SCEC), is an integrated collection of scientific software and middleware that performs 3D physics-based probabilistic seismic hazard analysis (PSHA) for Southern California. CyberShake integrates large-scale and high-throughput research codes to produce probabilistic seismic hazard curves for individual locations of interest and hazard maps for an entire region. A recent CyberShake calculation produced about 500,000 two-component seismograms for each of 336 locations, resulting in over 300 million synthetic seismograms in a Los Angeles-area probabilistic seismic hazard model. CyberShake calculations require a series of scientific software programs. Early computational stages produce data used as inputs by later stages, so we describe CyberShake calculations using a workflow definition language. Scientific workflow tools automate and manage the input and output data and enable remote job execution on large-scale HPC systems. To satisfy the requests of broad impact users of CyberShake data, such as seismologists, utility companies, and building code engineers, we successfully completed CyberShake Study 15.4 in April and May 2015, calculating a 1 Hz urban seismic hazard map for Los Angeles. We distributed the calculation between the NSF Track 1 system NCSA Blue Waters, the DOE Leadership-class system OLCF Titan, and USC's Center for High Performance Computing. This study ran for over 5 weeks, burning about 1.1 million node-hours and producing over half a petabyte of data. The CyberShake Study 15.4 results doubled the maximum simulated seismic frequency from 0.5 Hz to 1.0 Hz as compared to previous studies, representing a factor of 16 increase in computational complexity. We will describe how our workflow tools supported splitting the calculation across multiple systems. We will explain how we modified CyberShake software components, including GPU implementations and migrating from file-based communication to MPI messaging, to greatly reduce the I/O demands and node-hour requirements of CyberShake. We will also present performance metrics from CyberShake Study 15.4, and discuss challenges that producers of Big Data on open-science HPC resources face moving forward.
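
    The abstract above notes that early computational stages produce data consumed by later stages and that workflow tools manage the resulting dependencies. The following is a minimal, hypothetical Python sketch of that idea, expressing stages as a dependency graph and running them in topological order; the stage names are placeholders and this is not the actual CyberShake or Pegasus workflow definition:

      # Toy directed-acyclic-graph runner illustrating "outputs of early stages
      # feed later stages"; stage names are hypothetical placeholders.
      from graphlib import TopologicalSorter

      def run(stage):
          print(f"running {stage}")          # stand-in for submitting a real HPC job

      workflow = {
          "velocity_mesh":        set(),
          "strain_green_tensor":  {"velocity_mesh"},
          "seismogram_synthesis": {"strain_green_tensor"},
          "hazard_curve":         {"seismogram_synthesis"},
      }

      for stage in TopologicalSorter(workflow).static_order():
          run(stage)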

  7. Animation of multi-flexible body systems and its use in control system design

    NASA Technical Reports Server (NTRS)

    Juengst, Carl; Stahlberg, Ron

    1993-01-01

    Animation can greatly assist the structural dynamicist and control system analyst with better understanding of how multi-flexible body systems behave. For multi-flexible body systems, the structural characteristics (mode frequencies, mode shapes, and damping) change, sometimes dramatically with large angles of rotation between bodies. With computer animation, the analyst can visualize these changes and how the system responds to active control forces and torques. A characterization of the type of system we wish to animate is presented. The lack of clear understanding of the above effects was a key element leading to the development of a multi-flexible body animation software package. The resulting animation software is described in some detail here, followed by its application to the control system analyst. Other applications of this software can be determined on an individual need basis. A number of software products are currently available that make the high-speed rendering of rigid body mechanical system simulation possible. However, such options are not available for use in rendering flexible body mechanical system simulations. The desire for a high-speed flexible body visualization tool led to the development of the Flexible Or Rigid Mechanical System (FORMS) software. This software was developed at the Center for Simulation and Design Optimization of Mechanical Systems at the University of Iowa. FORMS provides interactive high-speed rendering of flexible and/or rigid body mechanical system simulations, and combines geometry and motion information to produce animated output. FORMS is designed to be both portable and flexible, and supports a number of different user interfaces and graphical display devices. Additional features have been added to FORMS that allow special visualization results related to the nature of the flexible body geometric representations.

  8. JPSS Science Data Services for the Direct Readout Community

    NASA Technical Reports Server (NTRS)

    Chander, Gyanesh; Lutz, Bob

    2014-01-01

    The Suomi National Polar-orbiting Partnership (S-NPP) and Joint Polar Satellite System (JPSS) High Rate Data (HRD) link provides Direct Broadcast data to users in real-time, utilizing their own remote field terminals. The Field Terminal Support (FTS) provides the resources needed to support the Direct Readout communities by providing software, documentation, and periodic updates to enable them to produce data products from S-NPP and JPSS. The FTS distribution server will also provide the necessary ancillary and auxiliary data needed for processing the broadcasts, as well as making orbital data available to assist in locating the satellites of interest. In addition, the FTS provides development support for the algorithm and software through GSFC Direct Readout Laboratory (DRL) International Polar Orbiter Processing Package (IPOPP) and University of Wisconsin (UWISC) Community Satellite Processing Package (CSPP), to enable users to integrate the algorithms into their remote terminals. The support the JPSS Program provides to the institutions developing and maintaining these two software packages will demonstrate the ability to produce ready-to-use products from the HRD link and provide risk reduction effort at a minimal cost. This paper discusses the key functions and system architecture of FTS.

  9. Evidential evaluation of DNA profiles using a discrete statistical model implemented in the DNA LiRa software.

    PubMed

    Puch-Solis, Roberto; Clayton, Tim

    2014-07-01

    The high sensitivity of the technology for producing profiles means that it has become routine to produce profiles from relatively small quantities of DNA. The profiles obtained from low template DNA (LTDNA) are affected by several phenomena which must be taken into consideration when interpreting and evaluating this evidence. Furthermore, many of the same phenomena affect profiles from higher amounts of DNA (e.g. where complex mixtures have been revealed). In this article we present a statistical model, which forms the basis of the software DNA LiRa, and which is able to calculate likelihood ratios where one to four donors are postulated and for any number of replicates. The model can take into account dropin and allelic dropout for different contributors, template degradation and uncertain allele designations. In this statistical model unknown parameters are treated following the Empirical Bayesian paradigm. The performance of LiRa is tested using examples and the outputs are compared with those generated using two other statistical software packages, likeLTD and LRmix. The concept of ban efficiency is introduced as a measure for assessing model sensitivity. Copyright © 2014. Published by Elsevier Ireland Ltd.

  10. Minor Distortions with Major Consequences: Correcting Distortions in Imaging Spectrographs

    PubMed Central

    Esmonde-White, Francis W. L.; Esmonde-White, Karen A.; Morris, Michael D.

    2010-01-01

    Projective transformation is a mathematical correction (implemented in software) used in the remote imaging field to produce distortion-free images. We present the application of projective transformation to correct minor alignment and astigmatism distortions that are inherent in dispersive spectrographs. Patterned white-light images and neon emission spectra were used to produce registration points for the transformation. Raman transects collected on microscopy and fiber-optic systems were corrected using established methods and compared with the same transects corrected using the projective transformation. Even minor distortions have a significant effect on reproducibility and apparent fluorescence background complexity. Simulated Raman spectra were used to optimize the projective transformation algorithm. We demonstrate that the projective transformation reduced the apparent fluorescent background complexity and improved reproducibility of measured parameters of Raman spectra. Distortion correction using a projective transformation provides a major advantage in reducing the background fluorescence complexity even in instrumentation where slit-image distortions and camera rotation were minimized using manual or mechanical means. We expect these advantages should be readily applicable to other spectroscopic modalities using dispersive imaging spectrographs. PMID:21211158
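
    The correction described above estimates a projective transformation from registration points and applies it to the spectrograph images. A generic sketch of the standard direct linear transform (DLT) estimate from point correspondences is given below; it assumes NumPy and is illustrative only, not the authors' implementation:

      # Minimal DLT sketch: estimate a 3x3 projective transform H from >= 4
      # point correspondences, then map coordinates through it.
      import numpy as np

      def fit_projective(src, dst):
          """src, dst: (N, 2) arrays of matching registration points, N >= 4."""
          rows = []
          for (x, y), (u, v) in zip(src, dst):
              rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
              rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
          _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
          H = vt[-1].reshape(3, 3)
          return H / H[2, 2]

      def warp_points(H, pts):
          """Apply H to (N, 2) pixel coordinates (e.g., a detector grid)."""
          homog = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
          return homog[:, :2] / homog[:, 2:3]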

  11. Producing Hydrogen by Plasma Pyrolysis of Methane

    NASA Technical Reports Server (NTRS)

    Atwater, James; Akse, James; Wheeler, Richard

    2010-01-01

    Plasma pyrolysis of methane has been investigated for utility as a process for producing hydrogen. This process was conceived as a means of recovering hydrogen from methane produced as a byproduct of operation of a life-support system aboard a spacecraft. On Earth, this process, when fully developed, could be a means of producing hydrogen (for use as a fuel) from methane in natural gas. The most closely related prior competing process - catalytic pyrolysis of methane - has several disadvantages: a) The reactor used in the process is highly susceptible to fouling and deactivation of the catalyst by carbon deposits, necessitating frequent regeneration or replacement of the catalyst. b) The reactor is highly susceptible to plugging by deposition of carbon within fixed beds, with consequent channeling of flow, high pressure drops, and severe limitations on mass transfer, all contributing to reductions in reactor efficiency. c) Reaction rates are intrinsically low. d) The energy demand of the process is high.

  12. Copyright and the Assurance of Quality Courseware.

    ERIC Educational Resources Information Center

    Helm, Virginia M.

    Issues related to the illegal copying or piracy of educational software in the schools and its potential effect on quality software availability are discussed. Copyright violation is examined as a reason some software producers may be abandoning the school software market. An explanation of what the copyright allows and prohibits in terms of…

  13. Annotated bibliography of Software Engineering Laboratory (SEL) literature

    NASA Technical Reports Server (NTRS)

    Card, D.

    1982-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is presented. More than 75 publications are summarized. An index of these publications by subject is also included. These publications cover many areas of software engineering and range from research reports to software documentation.

  14. A Process for Evaluating Student Records Management Software. ERIC/AE Digest.

    ERIC Educational Resources Information Center

    Vecchioli, Lisa

    This digest provides practical advice on evaluating software for managing student records. An evaluation of record-keeping software should start with a process to identify all of the individual needs the software product must meet in order to be considered for purchase. The first step toward establishing an administrative computing system is…

  15. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Buhler, Melanie; Valett, Jon

    1989-01-01

    An annotated bibliography is presented of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. The bibliography was updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials were grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.

  16. Promoting Engineering Education among High School and Middle School Students

    ERIC Educational Resources Information Center

    Goonatilake, Rohitha; Bachnak, Rafic A.

    2012-01-01

    Recent decline of students pursuing engineering degree programs is a great concern for many higher education authorities including Federal and State governments. Existing programs in high schools have not yet produced the desired results. Consequently, a number of initiatives to remedy this situation have been proposed and implemented. One such…

  17. Microsoft Producer: A Software Tool for Creating Multimedia PowerPoint[R] Presentations

    ERIC Educational Resources Information Center

    Leffingwell, Thad R.; Thomas, David G.; Elliott, William H.

    2007-01-01

    Microsoft[R] Producer[R] is a powerful yet user-friendly PowerPoint companion tool for creating on-demand multimedia presentations. Instructors can easily distribute these presentations via compact disc or streaming media over the Internet. We describe the features of the software, system requirements, and other required hardware. We also describe…

  18. Implications of Aggregated DoD Information Systems for Information Assurance Certification and Accreditation

    DTIC Science & Technology

    2010-01-01

    offshoring, or producing major software components overseas (Defense Science Board, 2009). These trends raise concerns about the level of trust that... (table-of-contents fragments: Software Complexity; Increasing Software Vulnerabilities and Malware Population; Limitations of...)

  19. New and Promising: Software Worth a Look. A MicroSIFT Survey of Educational Software Preview Center Coordinators. Volume II, No. 1.

    ERIC Educational Resources Information Center

    Podany, Zita

    This guide lists 21 software packages considered to be worthy of further consideration by other reviewing agencies and schools by a group of 12 computer coordinators from educational software preview centers and evaluation agencies. These software products have been selected as not being likely to appear in the reviews produced by major software…

  20. Physician use of updated anti-virus software in a tertiary Nigerian hospital.

    PubMed

    Laabes, E P; Nyango, D D; Ayedima, M M; Ladep, N G

    2010-01-01

    While physicians are becoming increasingly dependent on computers and the internet, highly lethal malware continues to be loaded into cyberspace. We sought to assess the proportion of physicians with updated anti-virus software in Jos University Teaching Hospital, Nigeria, and to determine perceived barriers to getting updates. We used a pre-tested semi-structured self-administered questionnaire to conduct a cross-sectional survey among 118 physicians. The mean age (+/- SD) of subjects was 34 (+/- 4) years, with 94 male and 24 female physicians. Forty-two (36.5%) of 115 physicians with anti-virus software used an updated program (95% CI: 27, 45). The top-three antivirus software were: McAfee 40 (33.9%), AVG 37 (31.4%) and Norton 17 (14.4%). Common infections were: Trojan horse 22 (29.7%), Brontok worm 8 (10.8%), and Ravmonlog.exe 5 (6.8%). Internet browsing with a firewall was an independent determinant for use of updated anti-virus software [OR 4.3, 95% CI: 1.86, 10.02; P < 0.001]. Busy schedule 40 (33.9%) and lack of a credit card 39 (33.1%) were perceived barriers to updating antivirus software. The use of regularly updated anti-virus software is sub-optimal among physicians, implying vulnerability to computer viruses. Physicians should be careful with flash drives and should avoid being victims of the raging arms race between malware producers and anti-virus software developers.

  1. The Use of Modeling Approach for Teaching Exponential Functions

    NASA Astrophysics Data System (ADS)

    Nunes, L. F.; Prates, D. B.; da Silva, J. M.

    2017-12-01

    This work discusses the teaching and learning of mathematical content related to exponential functions in a group of freshman students enrolled in the first semester of the Science and Technology Bachelor's (STB) program of the Federal University of Jequitinhonha and Mucuri Valleys (UFVJM). The modelling approach, a contextualization tool widely cited in the literature, was used as an educational tool to contextualize the teaching-learning process of exponential functions for these students. To this end, simple models were built with the GeoGebra software, and Didactic Engineering was used as the research methodology to evaluate the investigation and its results qualitatively. From this detailed research, several interesting aspects of the teaching and learning process were observed, discussed and described.

  2. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    Abstract New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  3. Development of the Software for 30 inch Telescope Control System at KHAO

    NASA Astrophysics Data System (ADS)

    Mun, B.-S.; Kim, S.-J.; Jang, M.; Min, S.-W.; Seol, K.-H.; Moon, K.-S.

    2006-12-01

    Although the 30-inch optical telescope at Kyung Hee Astronomy Observatory has produced a series of scientific achievements since its first light in 1992, numerous difficulties in operating the telescope have hindered the precise observations needed for further research. The currently used PC-TCS (Personal Computer based Telescope Control System) software, based on the ISA bus, is outdated; it lacks a user-friendly interface and cannot be scaled. In addition, accumulated errors generated by discrepancies between the input and output signals of the motion controller required a new control system. We therefore improved the telescope control system by updating the software and modifying mechanical parts. We applied a new BLDC (brushless DC) servo motor system to the mechanical parts of the telescope and developed control software using Visual Basic 6.0. As a result, we achieved high accuracy in controlling the telescope through a user-friendly GUI (Graphical User Interface).

  4. Generalized Support Software: Domain Analysis and Implementation

    NASA Technical Reports Server (NTRS)

    Stark, Mike; Seidewitz, Ed

    1995-01-01

    For the past five years, the Flight Dynamics Division (FDD) at NASA's Goddard Space Flight Center has been carrying out a detailed domain analysis effort and is now beginning to implement Generalized Support Software (GSS) based on this analysis. GSS is part of the larger Flight Dynamics Distributed System (FDDS), and is designed to run under the FDDS User Interface / Executive (UIX). The FDD is transitioning from a mainframe-based environment to systems running on engineering workstations. The GSS will be a library of highly reusable components that may be configured within the standard FDDS architecture to quickly produce low-cost satellite ground support systems. The estimate for the first release is that this library will contain approximately 200,000 lines of code. The main driver for developing generalized software is development cost and schedule improvement. The goal is to ultimately have at least 80 percent of all software required for a spacecraft mission (within the domain supported by the GSS) configured from the generalized components.

  5. Identification of triacylglycerol using automated annotation of high resolution multistage mass spectral trees.

    PubMed

    Wang, Xiupin; Peng, Qingzhi; Li, Peiwu; Zhang, Qi; Ding, Xiaoxia; Zhang, Wen; Zhang, Liangxiao

    2016-10-12

    High complexity of identification for non-target triacylglycerols (TAGs) is a major challenge in lipidomics analysis. To identify non-target TAGs, a powerful tool named accurate MS(n) spectrometry generating so-called ion trees is used. In this paper, we presented a technique for efficient structural elucidation of TAGs on MS(n) spectral trees produced by LTQ Orbitrap MS(n), which was implemented as an open source software package, or TIT. The TIT software was used to support automatic annotation of non-target TAGs on MS(n) ion trees from a self-built fragment ion database. This database includes 19108 simulate TAG molecules from a random combination of fatty acids and corresponding 500582 self-built multistage fragment ions (MS ≤ 3). Our software can identify TAGs using a "stage-by-stage elimination" strategy. By utilizing the MS(1) accurate mass and referenced RKMD, the TIT software can discriminate unique elemental composition candidates. The regiospecific isomers of fatty acyl chains will be distinguished using MS(2) and MS(3) fragment spectra. We applied the algorithm to the selection of 45 TAG standards and demonstrated that the molecular ions could be 100% correctly assigned. Therefore, the TIT software could be applied to TAG identification in complex biological samples such as mouse plasma extracts. Copyright © 2016 Elsevier B.V. All rights reserved.
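
    The abstract describes a "stage-by-stage elimination" whose first stage narrows candidate elemental compositions by MS(1) accurate mass. The sketch below illustrates only that first stage as a ppm-tolerance filter; the candidate names, masses and tolerance are hypothetical and this is not the TIT implementation:

      # Illustrative first elimination stage: keep only candidate TAG compositions
      # whose exact mass lies within a ppm window of the measured MS1 precursor.
      def filter_by_accurate_mass(measured_mz, candidates, tol_ppm=5.0):
          """candidates: iterable of (name, exact_mz) pairs; returns survivors."""
          kept = []
          for name, exact_mz in candidates:
              ppm_error = abs(measured_mz - exact_mz) / exact_mz * 1e6
              if ppm_error <= tol_ppm:
                  kept.append((name, ppm_error))
          return kept

      # Example with made-up candidate masses for a single precursor ion:
      candidates = [("TAG 52:2", 876.8014), ("TAG 52:3", 874.7858), ("TAG 54:6", 896.7701)]
      print(filter_by_accurate_mass(876.800, candidates))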

  6. An analysis of integrated science and language arts themes in software at the elementary school level

    NASA Astrophysics Data System (ADS)

    Libidinsky, Lisa Jill

    2002-09-01

    There are many demands on the elementary classroom teacher today, such that teachers often do not have the time and resources to instruct in a meaningful manner that would produce effective, real instruction. Subjects are often disjointed and not significant. When teachers instruct using an integrated approach, students learn more efficiently as they see connections in the subjects. Science and language arts, when combined to produce an integrated approach, show positive associations that can enable students to learn real-life connections. In addition, with the onset of technology and the increased usage of technological programs in the schools, teachers can use technology to support an integrated curriculum. When teachers use a combined instructional focus of science, language arts, and technology to produce lessons, students are able to gain knowledge of concepts and skills necessary for appropriate academic growth and development. Given that there are many software programs available to teachers for classroom use, it is imperative that quality software is used for instruction. Using criteria based upon an intensive literature review of integrated instruction in the areas of science and language arts, this study examines science and language arts software programs to determine whether there are science and language arts integrated themes in the software analyzed. Also, this study examines whether more science and language arts integrated themes are present in science or language arts software programs. Overall, this study finds a significant difference between language arts software and science software when looking at integrated themes. This study shows that science software shows integrated themes with language arts more often than does language arts software with science. The findings in this study can serve as a reference point for educators when selecting software that is meaningful and effective in the elementary classroom. Based on this study, it is apparent that there is a need to evaluate software for appropriate use in the classroom in order to promote effective education.

  7. Performance Evaluation of 3d Modeling Software for Uav Photogrammetry

    NASA Astrophysics Data System (ADS)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAVs with freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed. Consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modeling software are black box algorithms. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motive, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.

  8. Peak Doctor v 1.0.0 Labview Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garner, Scott

    2014-05-29

    PeakDoctor software works interactively with its user to analyze raw gamma-ray spectroscopic data. The goal of the software is to produce a list of energies and areas of all of the peaks in the spectrum, as accurately as possible. It starts by performing an energy calibration, creating a function that describes how energy can be related to channel number. Next, the software determines which channels in the raw histogram are in the Compton continuum and which channels are parts of a peak. Then the software fits the Compton continuum with cubic polynomials. The last step is to fit all of the peaks with Gaussian functions, thus producing the list.
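
    The abstract lists the processing steps: energy calibration, continuum identification, cubic-polynomial continuum fitting, and Gaussian peak fitting. The sketch below illustrates the final step for a single peak region (a cubic continuum plus one Gaussian, yielding centroid and area); it assumes NumPy/SciPy and is not the PeakDoctor code:

      # Sketch of fitting one peak region: cubic continuum plus a Gaussian,
      # yielding the peak centroid and area. Not the PeakDoctor implementation.
      import numpy as np
      from scipy.optimize import curve_fit

      def peak_model(ch, a0, a1, a2, a3, amp, mu, sigma):
          continuum = a0 + a1 * ch + a2 * ch**2 + a3 * ch**3
          return continuum + amp * np.exp(-0.5 * ((ch - mu) / sigma) ** 2)

      def fit_peak(channels, counts, mu_guess):
          channels = np.asarray(channels, dtype=float)
          counts = np.asarray(counts, dtype=float)
          p0 = [counts.min(), 0, 0, 0, counts.max() - counts.min(), mu_guess, 2.0]
          popt, _ = curve_fit(peak_model, channels, counts, p0=p0)
          amp, mu, sigma = popt[4:]
          area = amp * abs(sigma) * np.sqrt(2 * np.pi)   # counts under the Gaussian
          return mu, area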

  9. Data archiving and network system of Bisei Spaceguard center

    NASA Astrophysics Data System (ADS)

    Terazono, J.-Y.; Asami, A.; Asher, D.; Hashimoto, N.; Nakano, S.; Nishiyama, K.; Oshima, Y.; Umehara, H.; Urata, T.; Yoshikawa, M.; Isobe, S.

    2002-09-01

    Bisei Spaceguard Center, Japan's first facility for observations of space debris and Near-Earth Objects (NEOs), will produce large amounts of data. In this paper, we describe details of the data transfer and processing system we are now developing. Also we present a software system devoted to the discovery of asteroids mainly by high school students.

  10. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to the industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  11. Software/hardware optimization for attenuation-based microtomography using SR at PETRA III (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Beckmann, Felix

    2016-10-01

    The Helmholtz-Zentrum Geesthacht, Germany, is operating the user experiments for microtomography at the beamlines P05 and P07 using synchrotron radiation produced in the storage ring PETRA III at DESY, Hamburg, Germany. In recent years, the software pipeline and sample-changing hardware for performing high-throughput experiments were developed. In this talk the current status of the beamlines will be given. Furthermore, the optimisation and automation of scanning techniques will be presented. These are required to scan samples which are larger than the field of view defined by the X-ray beam. The integration into an optimized reconstruction pipeline will be shown.

  12. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    PubMed

    Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S

    2013-01-01

    Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.
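
    The abstract stresses organizing results in "database table format" for flexible lipidome navigation. A minimal pandas sketch of that long-format organization, with a pivot for cross-sample comparison, is shown below; the column names and values are hypothetical and do not reflect the actual ALEX schema:

      # Toy "database table format": one row per (sample, lipid species)
      # measurement, then a pivot for quick lipidome comparison.
      import pandas as pd

      rows = [
          {"sample": "WT_cerebellum", "lipid_class": "PC", "species": "PC 34:1", "intensity": 1.8e6},
          {"sample": "WT_cerebellum", "lipid_class": "PE", "species": "PE 38:4", "intensity": 9.1e5},
          {"sample": "KO_cerebellum", "lipid_class": "PC", "species": "PC 34:1", "intensity": 1.2e6},
          {"sample": "KO_cerebellum", "lipid_class": "PE", "species": "PE 38:4", "intensity": 1.5e6},
      ]
      table = pd.DataFrame(rows)

      # Pivot: species as rows, samples as columns, the kind of view used to
      # navigate and visualize a lipidome across conditions.
      print(table.pivot_table(index="species", columns="sample", values="intensity"))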

  13. The long-term impact of magnesium in seawater on foraminiferal mineralogy: Mechanism and consequences

    NASA Astrophysics Data System (ADS)

    Dijk, I.; Nooijer, L. J.; Hart, M. B.; Reichart, G.-J.

    2016-03-01

    Foraminifera are unicellular protists, primarily known for their calcium carbonate shells that provide an extensive fossil record. This record, ranging from Cambrian to present shows both major shifts and gradual changes in the relative occurrence of taxa producing different polymorphs of carbonate. Here we present evidence for coupling between shifts in calcite- versus aragonite-producing species and periods with, respectively, low and high seawater Mg/Ca throughout the Phanerozoic. During periods when seawater Mg/Ca is <2 mol/mol, low-Mg calcite-producing species dominate the foraminiferal community. Vice versa, high-Mg calcite- and aragonite-producing species are more abundant during periods with relatively high seawater Mg/Ca. This alteration in dominance of the phase precipitated is due to selective recovery of groups producing the favorable polymorph after shifts from calcite to aragonite seas. In addition, relatively high extinction rates of species producing the mineral phase not favored by the seawater Mg/Ca of that time may be responsible for this alteration. These results imply that the current high seawater Mg/Ca will, in the long term, favor prevalence of high-Mg and aragonite-producing foraminifera over calcite-producing taxa, possibly shifting the balance toward a community in which calcite production is less dominant.

  14. Climate Change, Human Rights, and Social Justice.

    PubMed

    Levy, Barry S; Patz, Jonathan A

    2015-01-01

    The environmental and health consequences of climate change, which disproportionately affect low-income countries and poor people in high-income countries, profoundly affect human rights and social justice. Environmental consequences include increased temperature, excess precipitation in some areas and droughts in others, extreme weather events, and increased sea level. These consequences adversely affect agricultural production, access to safe water, and worker productivity, and, by inundating land or making land uninhabitable and uncultivatable, will force many people to become environmental refugees. Adverse health effects caused by climate change include heat-related disorders, vector-borne diseases, foodborne and waterborne diseases, respiratory and allergic disorders, malnutrition, collective violence, and mental health problems. These environmental and health consequences threaten civil and political rights and economic, social, and cultural rights, including rights to life, access to safe food and water, health, security, shelter, and culture. On a national or local level, those people who are most vulnerable to the adverse environmental and health consequences of climate change include poor people, members of minority groups, women, children, older people, people with chronic diseases and disabilities, those residing in areas with a high prevalence of climate-related diseases, and workers exposed to extreme heat or increased weather variability. On a global level, there is much inequity, with low-income countries, which produce the least greenhouse gases (GHGs), being more adversely affected by climate change than high-income countries, which produce substantially higher amounts of GHGs yet are less immediately affected. In addition, low-income countries have far less capability to adapt to climate change than high-income countries. Adaptation and mitigation measures to address climate change needed to protect human society must also be planned to protect human rights, promote social justice, and avoid creating new problems or exacerbating existing problems for vulnerable populations. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Evidence of absence (v2.0) software user guide

    USGS Publications Warehouse

    Dalthorp, Daniel; Huso, Manuela; Dail, David

    2017-07-06

    Evidence of Absence software (EoA) is a user-friendly software application for estimating bird and bat fatalities at wind farms and for designing search protocols. The software is particularly useful in addressing whether the number of fatalities is below a given threshold and what search parameters are needed to give assurance that thresholds were not exceeded. The software also includes tools (1) for estimating carcass persistence distributions and searcher efficiency parameters from field trials, (2) for projecting future mortality based on past monitoring data, and (3) for exploring the potential consequences of various choices in the design of long-term incidental take permits for protected species. The software was designed specifically for cases where tolerance for mortality is low and carcass counts are small or even 0, but the tools also may be used for mortality estimates when carcass counts are large.
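
    The record addresses whether fatalities exceeded a threshold when carcass counts are small. As a heavily simplified illustration of that reasoning (not the EoA model itself), the sketch below computes a posterior for total mortality from an observed carcass count and an assumed overall detection probability:

      # Simplified illustration only: binomial detection with a flat prior on
      # total mortality M, then P(M > threshold | X carcasses found).
      import numpy as np
      from scipy.stats import binom

      def mortality_posterior(found, detection_prob, m_max=500):
          m = np.arange(found, m_max + 1)                 # possible true mortalities
          likelihood = binom.pmf(found, m, detection_prob)
          posterior = likelihood / likelihood.sum()
          return m, posterior

      m, post = mortality_posterior(found=2, detection_prob=0.3)
      print("P(M > 10 | data) ~", round(post[m > 10].sum(), 3))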

  16. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Groves, Paula; Valett, Jon

    1990-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory-software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. Subject and author indexes further classify these documents by specific topic and individual author.

  17. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon

    1993-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory: software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. This document contains an index of these publications classified by individual author.

  18. Combating unethical publications with plagiarism detection services

    PubMed Central

    Garner, H.R.

    2010-01-01

    About 3,000 new citations that are highly similar to citations in previously published manuscripts appear each year in the biomedical literature (Medline) alone. This underscores the importance of giving editors and reviewers a detection system to identify highly similar text in submitted manuscripts so that they can then review them for novelty. New software-based services, both commercial and free, provide this capability. The availability of such tools both provides a way to intercept suspect manuscripts and serves as a deterrent. Unfortunately, the capabilities of these services vary considerably, mainly as a consequence of the availability and completeness of the literature bases to which new queries are compared. Most of the commercial software has been designed for detection of plagiarism in high school and college papers; however, there is at least one fee-based service (CrossRef) and one free service (etblast.org) which are designed to target the needs of the biomedical publication industry. Information on these various services, examples of the type of operability and output, and things that need to be considered by publishers, editors and reviewers before selecting and using these services is provided. PMID:21194644
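
    The services described above flag submitted manuscripts containing text highly similar to prior publications. One simple idea underlying such comparisons is shingle-based similarity; the sketch below is illustrative only and is not how eTBLAST, CrossRef or the commercial services actually work:

      # Toy text-similarity check: Jaccard similarity over word 5-gram "shingles".
      # Real services use far richer indexing and scoring methods.
      def shingles(text, k=5):
          words = text.lower().split()
          return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

      def jaccard(a, b):
          sa, sb = shingles(a), shingles(b)
          return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

      # A submission scoring near 1.0 against an indexed abstract would be
      # flagged for editorial review of its novelty.
      print(jaccard("the quick brown fox jumps over the lazy dog",
                    "the quick brown fox leaps over the lazy dog"))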

  19. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  20. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  1. Combating unethical publications with plagiarism detection services.

    PubMed

    Garner, H R

    2011-01-01

    About 3,000 new citations that are highly similar to citations in previously published manuscripts appear each year in the biomedical literature (Medline) alone. This underscores the importance of giving editors and reviewers a detection system to identify highly similar text in submitted manuscripts so that they can then review them for novelty. New software-based services, both commercial and free, provide this capability. The availability of such tools both provides a way to intercept suspect manuscripts and serves as a deterrent. Unfortunately, the capabilities of these services vary considerably, mainly as a consequence of the availability and completeness of the literature bases to which new queries are compared. Most of the commercial software has been designed for detection of plagiarism in high school and college papers; however, there is at least 1 fee-based service (CrossRef) and 1 free service (etblast.org), which are designed to target the needs of the biomedical publication industry. Information on these various services, examples of the type of operability and output, and things that need to be considered by publishers, editors, and reviewers before selecting and using these services is provided. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. Differential phase acoustic microscope for micro-NDE

    NASA Technical Reports Server (NTRS)

    Waters, David D.; Pusateri, T. L.; Huang, S. R.

    1992-01-01

    A differential phase scanning acoustic microscope (DP-SAM) was developed, fabricated, and tested in this project. This includes the acoustic lens and transducers, driving and receiving electronics, scanning stage, scanning software, and display software. This DP-SAM can produce mechanically raster-scanned acoustic microscopic images of differential phase, differential amplitude, or amplitude of the time gated returned echoes of the samples. The differential phase and differential amplitude images provide better image contrast over the conventional amplitude images. A specially designed miniature dual beam lens was used to form two foci to obtain the differential phase and amplitude information of the echoes. High image resolution (1 micron) was achieved by applying high frequency (around 1 GHz) acoustic signals to the samples and placing two foci close to each other (1 micron). Tone burst was used in this system to obtain a good estimation of the phase differences between echoes from the two adjacent foci. The system can also be used to extract the V(z) acoustic signature. Since two acoustic beams and four receiving modes are available, there are 12 possible combinations to produce an image or a V(z) scan. This provides a unique feature of this system that none of the existing acoustic microscopic systems can provide for the micro-nondestructive evaluation applications. The entire system, including the lens, electronics, and scanning control software, has made a competitive industrial product for nondestructive material inspection and evaluation and has attracted interest from existing acoustic microscope manufacturers.

  3. A system verification platform for high-density epiretinal prostheses.

    PubMed

    Chen, Kuanfu; Lo, Yi-Kai; Yang, Zhi; Weiland, James D; Humayun, Mark S; Liu, Wentai

    2013-06-01

    Retinal prostheses have restored light perception to people worldwide who have poor or no vision as a consequence of retinal degeneration. To advance the quality of visual stimulation for retinal implant recipients, a higher number of stimulation channels is expected in the next generation retinal prostheses, which poses a great challenge to system design and verification. This paper presents a system verification platform dedicated to the development of retinal prostheses. The system includes primary processing, dual-band power and data telemetry, a high-density stimulator array, and two methods for output verification. End-to-end system validation and individual functional block characterization can be achieved with this platform through visual inspection and software analysis. Custom-built software running on the computers also provides a good way for testing new features before they are realized by the ICs. Real-time visual feedbacks through the video displays make it easy to monitor and debug the system. The characterization of the wireless telemetry and the demonstration of the visual display are reported in this paper using a 256-channel retinal prosthetic IC as an example.

  4. Warpage analysis on thin shell part using response surface methodology (RSM)

    NASA Astrophysics Data System (ADS)

    Zulhasif, Z.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    Moulding parameters were optimised to reduce warpage defects using Autodesk Moldflow Insight (AMI) 2012 software. The product is injection moulded using Acrylonitrile-Butadiene-Styrene (ABS) material. The analysis varies four processing parameters: melt temperature, mould temperature, packing pressure and packing time. Design of Experiments (DOE) was integrated to obtain a polynomial model using Response Surface Methodology (RSM). The Glowworm Swarm Optimisation (GSO) method is then used to predict the best combination of parameters to minimise warpage and produce high-quality parts.
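
    The study above fits a polynomial response surface to DOE results and then searches it for the parameter combination minimising warpage. The sketch below fits a quadratic surface to hypothetical DOE data and minimises it with a generic SciPy optimizer standing in for GSO; all numbers, bounds and parameter choices are illustrative:

      # Fit a quadratic response surface warpage = f(melt T, packing pressure)
      # to hypothetical DOE runs, then minimise it. A generic optimizer stands
      # in for GSO; all values are illustrative.
      import numpy as np
      from scipy.optimize import minimize

      X = np.array([[220, 60], [220, 70], [220, 80],
                    [230, 60], [230, 70], [230, 80],
                    [240, 60], [240, 70], [240, 80]], dtype=float)   # DOE points
      y = np.array([0.42, 0.38, 0.35, 0.37, 0.31, 0.33, 0.38, 0.34, 0.36])  # warpage (mm)

      def design(p):
          t, pk = p
          return np.array([1.0, t, pk, t * pk, t ** 2, pk ** 2])

      coeffs, *_ = np.linalg.lstsq(np.array([design(p) for p in X]), y, rcond=None)

      def predicted_warpage(p):
          return design(p) @ coeffs

      best = minimize(predicted_warpage, x0=[230.0, 70.0], bounds=[(220, 240), (60, 80)])
      print("predicted optimum (melt temperature, packing pressure):", best.x)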

  5. Rapid Prototyping in Orthopaedic Surgery: A User's Guide

    PubMed Central

    Frame, Mark; Huntley, James S.

    2012-01-01

    Rapid prototyping (RP) is applicable to orthopaedic problems involving three dimensions, particularly fractures, deformities, and reconstruction. In the past, RP has been hampered by cost and difficulties accessing the appropriate expertise. Here we outline the history of rapid prototyping and furthermore a process using open-source software to produce a high fidelity physical model from CT data. This greatly mitigates the expense associated with the technique, allowing surgeons to produce precise models for preoperative planning and procedure rehearsal. We describe the method with an illustrative case. PMID:22666160

  6. Using a Geographic Information System to Improve Childhood Lead-Screening Efforts

    PubMed Central

    2013-01-01

    The Idaho Division of Public Health conducted a pilot study to produce a lead-exposure–risk map to help local and state agencies better target childhood lead-screening efforts. Priority lead-screening areas, at the block group level, were created by using county tax assessor data and geographic information system software. A series of maps were produced, indicating childhood lead-screening prevalence in areas in which there was high potential for exposure to lead. These maps could enable development of more systematically targeted and cost-effective childhood lead-screening efforts. PMID:23764346

  7. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
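
    The "reduced form" idea in E-CAT is to replace the CGE model with a single regression fitted to its simulation outputs. A minimal sketch of that step is below, with synthetic data standing in for the CGE runs; the explanatory variables, coefficients and linear functional form are invented for illustration, and E-CAT itself is implemented in Excel/VBA rather than Python.

        # Sketch of the reduced-form step: fit one regression to synthetic data
        # standing in for many CGE simulation runs, so that consequences can be
        # estimated quickly from threat characteristics. Variables, coefficients
        # and the linear functional form are illustrative assumptions only.
        import numpy as np

        rng = np.random.default_rng(1)
        n_runs = 500

        # Threat characteristics / background conditions (synthetic).
        duration_days = rng.uniform(1, 90, n_runs)
        share_of_region_gdp = rng.uniform(0.01, 0.4, n_runs)
        resilience_factor = rng.uniform(0.3, 0.9, n_runs)   # 1 = full resilience

        # Synthetic "CGE output": GDP loss in $billion, with noise.
        gdp_loss = (0.02 * duration_days * share_of_region_gdp * 100
                    * (1.0 - resilience_factor)) + rng.normal(0, 0.5, n_runs)

        # Single reduced-form regression: loss ~ duration + exposure + resilience.
        X = np.column_stack([np.ones(n_runs), duration_days,
                             share_of_region_gdp, resilience_factor])
        coef, *_ = np.linalg.lstsq(X, gdp_loss, rcond=None)

        # Rapid-turnaround estimate for a new event (hypothetical inputs).
        new_event = np.array([1.0, 30.0, 0.15, 0.6])
        print("estimated GDP loss ($bn):", float(new_event @ coef))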

  8. Library-Specific Microcomputer Software.

    ERIC Educational Resources Information Center

    Levert, Virginia M.

    1985-01-01

    Discusses number and type of microcomputer software programs useful to libraries and types of hardware on which they run, as identified by Nolan Information Management Services. Highlights include general application programs, applications designed to support library technical processes, producers of library software, and choosing among options.…

  9. 32 CFR 291.3 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., functions, decisions, or procedures of a DNA organization. Normally, computer software, including source.... (This does not include the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in...

  10. The Economics of Educational Software Portability.

    ERIC Educational Resources Information Center

    Oliveira, Joao Batista Araujo e

    1990-01-01

    Discusses economic issues that affect the portability of educational software. Topics discussed include economic reasons for portability, including cost effectiveness; the nature and behavior of educational computer software markets; the role of producers, buyers, and consumers; potential effects of government policies; computer piracy; and…

  11. Technology and Participation in Japanese Factories: The Consequences for Morale and Productivity.

    ERIC Educational Resources Information Center

    Hull, Frank; Azumi, Koya

    1988-01-01

    By fully using their human resources, Japanese factories mass produce goods of low cost and high quality. Participation in Japanese factories occurs in a more hierarchical framework than advocated in the Western model of worker democracy. (JOW)

  12. Single software platform used for high speed data transfer implementation in a 65k pixel camera working in single photon counting mode

    NASA Astrophysics Data System (ADS)

    Maj, P.; Kasiński, K.; Gryboś, P.; Szczygieł, R.; Kozioł, A.

    2015-12-01

    Integrated circuits designed for specific applications generally use non-standard communication methods. Hybrid pixel detector readout electronics produces a huge amount of data as a result of the number of frames per second. The data needs to be transmitted to a higher level system without limiting the ASIC's capabilities. Nowadays, the Camera Link interface is still one of the fastest communication methods, allowing transmission speeds up to 800 MB/s. In order to communicate between a higher level system and the ASIC with a dedicated protocol, an FPGA with dedicated code is required. The configuration data is received from the PC and written to the ASIC. At the same time, the same FPGA should be able to transmit the data from the ASIC to the PC at very high speed. The camera should be an embedded system enabling autonomous operation and self-monitoring. In the presented solution, at least three different hardware platforms are used: an FPGA, a microprocessor with a real-time operating system, and a PC with end-user software. We present the use of a single software platform for high speed data transfer from the 65k pixel camera to the personal computer.

  13. TGeoCad: an Interface between ROOT and CAD Systems

    NASA Astrophysics Data System (ADS)

    Luzzi, C.; Carminati, F.

    2014-06-01

    In the simulation of High Energy Physics experiments, very high precision in the description of the detector geometry is essential to achieve the required performance. The physicists in charge of Monte Carlo simulation of the detector need to collaborate efficiently with the engineers working on the mechanical design of the detector. Often, this collaboration is made difficult by the use of different and incompatible software. ROOT is an object-oriented C++ framework used by physicists for storing, analyzing and simulating data produced by high-energy physics experiments, while CAD (Computer-Aided Design) software is used for mechanical design in the engineering field. The necessity to improve the level of communication between physicists and engineers led to the implementation of an interface between the ROOT geometrical modeler used by the virtual Monte Carlo simulation software and CAD systems. In this paper we describe the design and implementation of the TGeoCad interface, which has been developed to enable the use of ROOT geometrical models in several CAD systems. To achieve this goal, the ROOT geometry description is converted into the STEP file format (ISO 10303), which can be imported and used by many CAD systems.

  14. Software for aerospace education: A bibliography, 2nd edition

    NASA Technical Reports Server (NTRS)

    Vogt, Gregory L.; Roth, Susan Kies; Phelps, Malcom V.

    1990-01-01

    This is the second aerospace education software bibliography to be published by the NASA Educational Technology Branch in Washington, DC. Unlike many software bibliographies, this bibliography does not evaluate and grade software according to its quality and value to the classroom, nor does it make any endorsements or warrant scientific accuracy. Rather, it describes software, its subject, approach, and technical details. This bibliography is intended as a convenience to educators. The specific software included represents replies to more than 300 queries to software producers for aerospace education programs.

  15. Extended cage adjustable speed electric motors and drive packages

    DOEpatents

    Hsu, John S.

    1999-01-01

    The rotor cage of a motor is extended, a second stator is coupled to this extended rotor cage, and the windings have the same number of poles. The motor torque and speed can be controlled by either injecting energy into or extracting energy out from the rotor cage. The motor produces less harmonics than existing doubly-fed motors. Consequently, a new type of low cost, high efficiency drive is produced.

  16. ICESat (GLAS) Science Processing Software Document Series. Volume 1; Science Software Management Plan; 3.0

    NASA Technical Reports Server (NTRS)

    Hancock, David W., III

    1999-01-01

    This document provides the Software Management Plan for the GLAS Standard Data Software (SDS) supporting the GLAS instrument of the EOS ICESat Spacecraft. The SDS encompasses the ICESat Science Investigator-led Processing System (I-SIPS) Software and the Instrument Support Terminal (IST) Software. For the I-SIPS Software, the SDS will produce Level 0, Level 1, and Level 2 data products as well as the associated product quality assessments and descriptive information. For the IST Software, the SDS will accommodate the GLAS instrument support areas of engineering status, command, performance assessment, and instrument health status.

  17. Increasing the Efficiency on Producing Radiology Reports for Breast Cancer Diagnosis by Means of Structured Reports. A Comparative Study.

    PubMed

    Segrelles, J Damian; Medina, Rosana; Blanquer, Ignacio; Martí-Bonmatí, Luis

    2017-05-18

    Radiology reports are commonly written as free text using voice recognition devices. Structured reports (SR) have high potential, but they are usually considered more difficult to fill in, so their adoption in clinical practice leads to lower efficiency. However, some studies have demonstrated that in some cases producing SRs may require less time than plain-text ones. This work focuses on the definition and demonstration of a methodology to evaluate the productivity of software tools for producing radiology reports. A set of SRs for breast cancer diagnosis based on BI-RADS has been developed using this method, and an analysis of their efficiency with respect to free-text reports has been performed. The proposed methodology compares the Elapsed Time (ET) on a set of radiological reports. Free-text reports are produced with the speech recognition devices used in clinical practice; structured reports are generated using a web application built with the TRENCADIS framework. A team of six radiologists with three different levels of experience in breast cancer diagnosis was recruited. These radiologists performed the evaluation, each one introducing 50 reports for mammography, 50 for ultrasound scan and 50 for MRI using both approaches. The Relative Efficiency (REF) was also computed for each report by dividing the ETs of the two methods. We applied the T-Student (T-S) test to compare the ETs and the ANOVA test to compare the REFs; both tests were computed using the SPSS software. The study produced three DICOM-SR templates for breast cancer diagnosis on mammography, ultrasound and MRI, using RadLex terms based on BI-RADS 5th edition. The T-S test on radiologists with a high or intermediate profile showed that the difference between the ETs was statistically significant only for mammography and ultrasound. The ANOVA test performed by grouping the REF by modality indicated that there were no significant differences between mammograms and ultrasound scans, but that both differed significantly from MRI. The ANOVA test of the REF for each modality indicated that there were significant differences only in mammography (ANOVA p = 0.024) and ultrasound (ANOVA p = 0.008). The ANOVA test for each radiologist profile indicated that there were significant differences for the high profile (ANOVA p = 0.028) and the medium profile (ANOVA p = 0.045). In this work, we have defined and demonstrated a methodology to evaluate the productivity of software tools for producing radiology reports in breast cancer. We found that adopting structured reporting for mammography and ultrasound studies in breast cancer diagnosis improves the performance of report production.
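
    The efficiency comparison in the study reduces to per-report elapsed times, their ratio (REF), a comparison of the ETs and an ANOVA on the REFs. A compact sketch of those calculations with SciPy is given below, using invented timings; the original analysis was run in SPSS, and the independent-samples t-test and grouping shown here are simplifications.

        # Sketch of the reported statistics on invented data: elapsed times (ET)
        # for free-text vs. structured reports, Relative Efficiency (REF) as the
        # ratio of the two, a t-test on the ETs (independent-samples here for
        # simplicity) and a one-way ANOVA on REFs by modality. The numbers are
        # synthetic; the study itself used SPSS.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        n = 50  # reports per modality per method

        def simulate(mean_free, mean_sr):
            et_free = rng.normal(mean_free, 15, n)   # seconds, free-text report
            et_sr = rng.normal(mean_sr, 15, n)       # seconds, structured report
            return et_free, et_sr

        modalities = {"mammography": simulate(180, 150),
                      "ultrasound": simulate(170, 145),
                      "mri": simulate(260, 255)}

        refs = {}
        for name, (et_free, et_sr) in modalities.items():
            t_stat, p_val = stats.ttest_ind(et_free, et_sr)
            refs[name] = et_free / et_sr            # REF per report
            print(f"{name}: mean ET free={et_free.mean():.0f}s, "
                  f"SR={et_sr.mean():.0f}s, t-test p={p_val:.3f}")

        f_stat, p_anova = stats.f_oneway(*refs.values())
        print(f"ANOVA on REF across modalities: p={p_anova:.3f}")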

  18. Elementary Keyboarding Software Product Reports.

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    This report provides detailed product descriptions of 45 software programs designed to teach or improve the keyboarding skills of elementary school students that were identified by the MicroSIFT (Microcomputer Information and Software for Teachers) staff. The descriptions include program titles, producer names, costs, grade levels, hardware,…

  19. ARC Software and Models

    Science.gov Websites

    ARC produces software code and methodologies that are transferred to TARDEC and industry partners. Associated publications include papers in the ASME Dynamic Systems and Control Conference, 2013 (DOI:10.1115/DSCC2013-3935) and in IEEE Transactions on Control Systems Technology (DOI:10.1109/TCST.2012.2217143).

  20. Technology Assessment Software Package: Final Report.

    ERIC Educational Resources Information Center

    Hutinger, Patricia L.

    This final report describes the Technology Assessment Software Package (TASP) Project, which produced developmentally appropriate technology assessment software for children from 18 months through 8 years of age who have moderate to severe disabilities that interfere with their interaction with people, objects, tasks, and events in their…

  1. Integrated testing and verification system for research flight software

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  2. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Kistler, David; Bristow, John; Smith, Don

    1994-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  3. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios, and in particular interoperability, are severely limited. We describe a distributed and collaborative software analysis platform that allows for seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible through our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  4. Massive stereo-based DTM production for Mars on cloud computers

    NASA Astrophysics Data System (ADS)

    Tao, Y.; Muller, J.-P.; Sidiropoulos, P.; Xiong, Si-Ting; Putri, A. R. D.; Walter, S. H. G.; Veitch-Michaelis, J.; Yershov, V.

    2018-05-01

    Digital Terrain Model (DTM) creation is essential to improving our understanding of the formation processes of the Martian surface. Although there have been previous demonstrations of open-source or commercial planetary 3D reconstruction software, planetary scientists are still struggling to create good quality DTMs that meet their science needs, especially when there is a requirement to produce a large number of high quality DTMs using "free" software. In this paper, we describe a new open source system to overcome many of these obstacles by demonstrating results in the context of issues found from experience with several planetary DTM pipelines. We introduce a new fully automated multi-resolution DTM processing chain for NASA Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) and High Resolution Imaging Science Experiment (HiRISE) stereo processing, called the Co-registration Ames Stereo Pipeline (ASP) Gotcha Optimised (CASP-GO), based on the open source NASA ASP. CASP-GO employs tie-point based multi-resolution image co-registration, and Gotcha sub-pixel refinement and densification. The CASP-GO pipeline is used to produce planet-wide CTX and HiRISE DTMs that guarantee global geo-referencing compliance with respect to High Resolution Stereo Colour imaging (HRSC), and thence to the Mars Orbiter Laser Altimeter (MOLA), providing refined stereo matching completeness and accuracy. All software and good quality products introduced in this paper are being made open-source to the planetary science community through collaboration with NASA Ames, the United States Geological Survey (USGS) and the Jet Propulsion Laboratory (JPL) Advanced Multi-Mission Operations System (AMMOS) Planetary Data System (PDS) Pipeline Service (APPS-PDS4), as well as browseable and visualisable through the iMars web-based Geographic Information System (webGIS) system.

  5. Guayule (Parthenium argentatum)pyrolysis and analysis by PY-GC/MS

    USDA-ARS?s Scientific Manuscript database

    Economic and sustainable biofuel production requires high process efficiency. The choice of biomass and the conversion technology employed to produce renewable fuels determines the product yields, fuel quality and consequently the process efficiency. Guayule, a perennial shrub native to the southwes...

  6. An intelligent healthcare management system: a new approach in work-order prioritization for medical equipment maintenance requests.

    PubMed

    Hamdi, Naser; Oweis, Rami; Abu Zraiq, Hamzeh; Abu Sammour, Denis

    2012-04-01

    The effective maintenance management of medical technology influences the quality of care delivered and the profitability of healthcare facilities. Medical equipment maintenance in Jordan lacks an objective prioritization system; consequently, the system is not sensitive to the impact of equipment downtime on patient morbidity and mortality. The current work presents a novel software system (EQUIMEDCOMP) that is designed to achieve valuable improvements in the maintenance management of medical technology. This work-order prioritization model sorts medical maintenance requests by calculating a priority index for each request. Model performance was assessed by utilizing maintenance requests from several Jordanian hospitals. The system proved highly efficient in minimizing equipment downtime based on healthcare delivery capacity, and, consequently, patient outcome. Additionally, a preventive maintenance optimization module and an equipment quality control system are incorporated. The system is, therefore, expected to improve the reliability of medical equipment and significantly improve safety and cost-efficiency.
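
    The core of such a prioritization model is a priority index computed per work order from weighted factors and used to sort the queue. The sketch below shows one way this could look; the factors, weights and scoring scale are assumptions for illustration and are not the EQUIMEDCOMP formula.

        # Sketch: compute a priority index for maintenance work orders from
        # weighted factors and sort the queue. Factors, weights and the 1-5
        # scoring scale are illustrative assumptions, not the EQUIMEDCOMP model.
        from dataclasses import dataclass

        WEIGHTS = {"clinical_risk": 0.5, "utilisation": 0.3, "downtime_days": 0.2}

        @dataclass
        class WorkOrder:
            equipment: str
            clinical_risk: int     # 1 (low) .. 5 (life support)
            utilisation: int       # 1 (rarely used) .. 5 (continuous use)
            downtime_days: int     # days the device has already been down

            def priority_index(self) -> float:
                downtime_score = min(self.downtime_days, 5)  # cap at 5 for the scale
                return (WEIGHTS["clinical_risk"] * self.clinical_risk
                        + WEIGHTS["utilisation"] * self.utilisation
                        + WEIGHTS["downtime_days"] * downtime_score)

        requests = [
            WorkOrder("infusion pump", clinical_risk=4, utilisation=5, downtime_days=1),
            WorkOrder("ECG recorder", clinical_risk=3, utilisation=2, downtime_days=4),
            WorkOrder("ventilator", clinical_risk=5, utilisation=5, downtime_days=2),
        ]

        for wo in sorted(requests, key=lambda w: w.priority_index(), reverse=True):
            print(f"{wo.equipment:15s} priority index = {wo.priority_index():.2f}")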

  7. Viewing the functional consequences of traumatic brain injury by using brain SPECT.

    PubMed

    Pavel, D; Jobe, T; Devore-Best, S; Davis, G; Epstein, P; Sinha, S; Kohn, R; Craita, I; Liu, P; Chang, Y

    2006-03-01

    High-resolution brain SPECT is increasingly benefiting from improved image processing software and multiple complementary display capabilities. This enables detailed functional mapping of the disturbances in relative perfusion occurring after TBI. The patient population consisted of 26 cases (ages 8-61 years) between 3 months and 6 years after traumatic brain injury. A very strong case can be made for the routine use of brain SPECT in TBI. Indeed it can provide a detailed evaluation of multiple functional consequences after TBI and is thus capable of supplementing the clinical evaluation and tailoring the therapeutic strategies needed. In so doing it also provides significant additional information beyond that available from MRI/CT. The critical factor for brain SPECT's clinical relevance is a carefully designed technical protocol, including displays which should enable a comprehensive description of the patterns found, in a user-friendly mode.

  8. Maximizing Use of Extension Beef Cattle Benchmarks Data Derived from Cow Herd Appraisal Performance Software

    ERIC Educational Resources Information Center

    Ramsay, Jennifer M.; Hanna, Lauren L. Hulsman; Ringwall, Kris A.

    2016-01-01

    One goal of Extension is to provide practical information that makes a difference to producers. Cow Herd Appraisal Performance Software (CHAPS) has provided beef producers with production benchmarks for 30 years, creating a large historical data set. Many such large data sets contain useful information but are underutilized. Our goal was to create…

  9. TLM-Tracker: software for cell segmentation, tracking and lineage analysis in time-lapse microscopy movies.

    PubMed

    Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter

    2012-09-01

    Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena such as culture heterogeneity. In this context, computational image processing for the analysis of single cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, TLM-Tracker, allows flexible and user-friendly segmentation, tracking and lineage analysis of microbial cells in time-lapse movies. The software package, including manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.

  10. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software.

    PubMed

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians.

  11. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software

    PubMed Central

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians. PMID:25996054

  12. The Widening Gulf between Genomics Data Generation and Consumption: A Practical Guide to Big Data Transfer Technology

    PubMed Central

    Feltus, Frank A.; Breen, Joseph R.; Deng, Juan; Izard, Ryan S.; Konger, Christopher A.; Ligon, Walter B.; Preuss, Don; Wang, Kuang-Ching

    2015-01-01

    In the last decade, high-throughput DNA sequencing has become a disruptive technology and pushed the life sciences into a distributed ecosystem of sequence data producers and consumers. Given the power of genomics and declining sequencing costs, biology is an emerging “Big Data” discipline that will soon enter the exabyte data range when all subdisciplines are combined. These datasets must be transferred across commercial and research networks in creative ways since sending data without thought can have serious consequences on data processing time frames. Thus, it is imperative that biologists, bioinformaticians, and information technology engineers recalibrate data processing paradigms to fit this emerging reality. This review attempts to provide a snapshot of Big Data transfer across networks, which is often overlooked by many biologists. Specifically, we discuss four key areas: 1) data transfer networks, protocols, and applications; 2) data transfer security including encryption, access, firewalls, and the Science DMZ; 3) data flow control with software-defined networking; and 4) data storage, staging, archiving and access. A primary intention of this article is to orient the biologist in key aspects of the data transfer process in order to frame their genomics-oriented needs to enterprise IT professionals. PMID:26568680

  13. Salt Composition Derived from Veazey Composition by Thermodynamic Modeling and Predicted Composition of Drum Contents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weisbrod, Kirk Ryan; Veirs, Douglas Kirk; Funk, David John

    This report describes the derivation of the salt composition from the Veazey salt stream analysis. It also provides an estimate of the proportions of the kitty litter, nitrate salt and neutralizer that was contained in drum 68660. While the actinide content of waste streams was judiciously followed in the 1980s in TA-55, no record of the salt composition could be found. Consequently, a salt waste stream produced from 1992 to 1994 and reported by Gerry Veazey provided the basis for this study. While chemical analysis of the waste stream was highly variable, an average analysis provided input to the Stream Analyzer software to calculate a composition for a concentrated solid nitrate salt and liquid waste stream. The calculation predicted the gas / condensed phase compositions as well as solid salt / saturated liquid compositions. The derived composition provides an estimate of the nitrate feedstream to WIPP for which kinetic measurements can be made. The ratio of salt to Swheat in drum 68660 contents was estimated through an overall mass balance on the parent and sibling drums. The RTR video provided independent confirmation concerning the volume of the mixture. The solid salt layer contains the majority of the salt at a ratio with Swheat that potentially could become exothermic.

  14. An Integrated Framework Advancing Membrane Protein Modeling and Design

    PubMed Central

    Weitzner, Brian D.; Duran, Amanda M.; Tilley, Drew C.; Elazar, Assaf; Gray, Jeffrey J.

    2015-01-01

    Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design. PMID:26325167

  15. File format for normalizing radiological concentration exposure rate and dose rate data for the effects of radioactive decay and weathering processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraus, Terrence D.

    2017-04-01

    This report specifies the electronic file format agreed upon for the normalized radiological data produced by the software tool developed under this TI project. The NA-84 Technology Integration (TI) Program project (SNL17-CM-635, Normalizing Radiological Data for Analysis and Integration into Models) investigators held a teleconference on December 7, 2017 to discuss the tasks to be completed under the TI program project. During this teleconference, the TI project investigators determined that the comma-separated values (CSV) file format is the most suitable format for the normalized radiological data that will be output from the normalizing tool developed under this TI project. The CSV file format was selected because it provides the requisite flexibility to manage different types of radiological data (i.e., activity concentration, exposure rate, dose rate) from different sources [e.g., Radiological Assessment and Monitoring System (RAMS), Aerial Measuring System (AMS), Monitoring and Sampling]. The CSV file format is also suitable because the normalized data can then be ingested by other software [e.g., RAMS, Visual Sampling Plan (VSP)] used by the NA-84's Consequence Management Program.
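
    Since the deliverable is simply normalized radiological data in CSV, the interchange can be illustrated with the Python standard library alone. The column names and example values below are invented for illustration; the report quoted above does not specify the exact schema.

        # Sketch: write and read normalized radiological data as CSV using only
        # the standard library. The column names and example values are invented;
        # the report itself does not fix a schema in the text quoted above.
        import csv

        rows = [
            {"measurement_id": "AMS-0001", "source": "AMS",
             "quantity": "exposure_rate", "value": 0.82, "units": "mR/h",
             "decay_corrected_to": "2017-04-01T00:00:00Z"},
            {"measurement_id": "RAMS-0042", "source": "RAMS",
             "quantity": "activity_concentration", "value": 1.3e3, "units": "Bq/m2",
             "decay_corrected_to": "2017-04-01T00:00:00Z"},
        ]

        with open("normalized_rad_data.csv", "w", newline="") as fh:
            writer = csv.DictWriter(fh, fieldnames=rows[0].keys())
            writer.writeheader()
            writer.writerows(rows)

        with open("normalized_rad_data.csv", newline="") as fh:
            for record in csv.DictReader(fh):
                print(record["measurement_id"], record["value"], record["units"])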

  16. Cerebroprotective effect of combined treatment with pyrazidol and bemitil in craniocerebral trauma.

    PubMed

    Zarubina, I V; Kuritsyna, N A; Shabanov, P D

    2004-07-01

    Monotherapy of the consequences of craniocerebral trauma with pyrazidol (1 mg/kg) produced an anxiolytic effect in animals highly resistant to hypoxia and an activating effect in animals with low resistance. Treatment with bemitil in a dose of 25 mg/kg produced a cerebroprotective effect and normalized individual behavioral characteristics, parameters of energy metabolism, and the state of the antioxidant system in the brain of highly and low resistant rats. The effect of bemitil was most pronounced in highly resistant animals. During combined treatment, pyrazidol and bemitil had an additive effect in animals of both groups. They normalized behavioral reactions and prevented the development of metabolic disturbances in the brain.

  17. Department-Generated Microcomputer Software.

    ERIC Educational Resources Information Center

    Mantei, Erwin J.

    1986-01-01

    Explains how self-produced software can be used to perform rapid number analysis or number-crunching duties in geology classes. Reviews programs in mineralogy and petrology and identifies areas in geology where computers can be used effectively. Discusses the advantages and benefits of integrating department-generated software into a geology…

  18. A streamlined Python framework for AT-TPC data analysis

    NASA Astrophysics Data System (ADS)

    Taylor, J. Z.; Bradt, J.; Bazin, D.; Kuchera, M. P.

    2017-09-01

    User-friendly data analysis software has been developed for the Active-Target Time Projection Chamber (AT-TPC) experiment at the National Superconducting Cyclotron Laboratory at Michigan State University. The AT-TPC, commissioned in 2014, is a gas-filled detector that acts as both the detector and target for high-efficiency detection of low-intensity, exotic nuclear reactions. The pytpc framework is a Python package for analyzing AT-TPC data. The package was developed for the analysis of 46Ar(p, p) data. The existing software was used to analyze data produced by the 40Ar(p, p) experiment that ran in August, 2015. Usage of the package was documented in an analysis manual both to improve analysis steps and aid in the work of future AT-TPC users. Software features and analysis methods in the pytpc framework will be presented along with the 40Ar results.

  19. Software engineering laboratory series: Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon

    1992-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  20. The Human Physiome: how standards, software and innovative service infrastructures are providing the building blocks to make it achievable

    PubMed Central

    2016-01-01

    Reconstructing and understanding the Human Physiome virtually is a complex mathematical problem, and a highly demanding computational challenge. Mathematical models spanning from the molecular level through to whole populations of individuals must be integrated, then personalized. This requires interoperability with multiple disparate and geographically separated data sources, and myriad computational software tools. Extracting and producing knowledge from such sources, even when the databases and software are readily available, is a challenging task. Despite the difficulties, researchers must frequently perform these tasks so that available knowledge can be continually integrated into the common framework required to realize the Human Physiome. Software and infrastructures that support the communities that generate these, together with their underlying standards to format, describe and interlink the corresponding data and computer models, are pivotal to the Human Physiome being realized. They provide the foundations for integrating, exchanging and re-using data and models efficiently, and correctly, while also supporting the dissemination of growing knowledge in these forms. In this paper, we explore the standards, software tooling, repositories and infrastructures that support this work, and detail what makes them vital to realizing the Human Physiome. PMID:27051515

  1. The Human Physiome: how standards, software and innovative service infrastructures are providing the building blocks to make it achievable.

    PubMed

    Nickerson, David; Atalag, Koray; de Bono, Bernard; Geiger, Jörg; Goble, Carole; Hollmann, Susanne; Lonien, Joachim; Müller, Wolfgang; Regierer, Babette; Stanford, Natalie J; Golebiewski, Martin; Hunter, Peter

    2016-04-06

    Reconstructing and understanding the Human Physiome virtually is a complex mathematical problem, and a highly demanding computational challenge. Mathematical models spanning from the molecular level through to whole populations of individuals must be integrated, then personalized. This requires interoperability with multiple disparate and geographically separated data sources, and myriad computational software tools. Extracting and producing knowledge from such sources, even when the databases and software are readily available, is a challenging task. Despite the difficulties, researchers must frequently perform these tasks so that available knowledge can be continually integrated into the common framework required to realize the Human Physiome. Software and infrastructures that support the communities that generate these, together with their underlying standards to format, describe and interlink the corresponding data and computer models, are pivotal to the Human Physiome being realized. They provide the foundations for integrating, exchanging and re-using data and models efficiently, and correctly, while also supporting the dissemination of growing knowledge in these forms. In this paper, we explore the standards, software tooling, repositories and infrastructures that support this work, and detail what makes them vital to realizing the Human Physiome.

  2. Modernization of gas-turbine engines with high-frequency induction motors

    NASA Astrophysics Data System (ADS)

    Abramovich, B. N.; Sychev, Yu A.; Kuznetsov, P. A.

    2018-03-01

    The main tendencies in the growth of electric energy consumption in general industry and in the mining industry were analyzed in the paper, and the key role of the electric drive in this process was identified. A review was made of the advantages and disadvantages of the unregulated mechanical gearboxes commonly used in domestically produced gas-turbine engines; on this basis, several schemes are proposed for modernizing gas-turbine engines with PWM-driven high-frequency induction motors. Induction motors with a double rotor winding were examined. A simulation of high-frequency induction motors with double rotor windings was carried out in Matlab-Simulink software, based on equivalent circuit parameters. The characteristics obtained for the new motors were compared with those of serially produced analogues. After the simulation, the results were implemented in a real prototype.

  3. A practical tool for monitoring the performance of measuring systems in a laboratory network: report of an ACB Working Group.

    PubMed

    Ayling, Pete; Hill, Robert; Jassam, Nuthar; Kallner, Anders; Khatami, Zahra

    2017-11-01

    Background: A logical consequence of the introduction of robotics and high-capacity analysers has been consolidation into larger laboratory units. This requires new structures and quality systems to ensure that laboratories deliver consistent and comparable results. Methods: A spreadsheet program was designed to accommodate results from up to 12 different instruments/laboratories and present IQC data, i.e. Levey-Jennings and Youden plots and comprehensive numerical tables of the performance of each item. Input of data was made possible by a 'data loader' through which IQC data from the individual instruments could be transferred to the spreadsheet program online. Results: A set of real data from laboratories is used to populate the data loader and the networking software program. Examples are presented of the analysis of variance components and of the Levey-Jennings and Youden plots. Conclusions: This report presents a software package that allows the simultaneous management and detailed monitoring of the performance of up to 12 different instruments/laboratories in a fully interactive mode. The system allows a quality manager of networked laboratories to have a continuously updated overview of performance. This software package has been made available on the ACB website.
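
    The Levey-Jennings part of such a package is straightforward to reproduce: plot each instrument's IQC results in run order against the target mean with limits at one, two and three standard deviations. A small matplotlib sketch for two instruments is shown below; the target, SD and simulated results are assumed values, and the actual ACB tool is a spreadsheet program rather than Python.

        # Sketch: Levey-Jennings chart for IQC results from two instruments of a
        # network, plotted against a common target mean with +/-1, 2 and 3 SD
        # limits. Target, SD and the simulated results are assumed values.
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(3)
        target, sd = 5.0, 0.2          # assumed IQC target and standard deviation
        runs = np.arange(1, 31)        # 30 daily IQC runs

        results = {"instrument A": target + rng.normal(0, sd, runs.size),
                   "instrument B": target + 0.1 + rng.normal(0, sd, runs.size)}

        fig, ax = plt.subplots(figsize=(8, 4))
        for label, values in results.items():
            ax.plot(runs, values, marker="o", label=label)

        ax.axhline(target, color="k")
        for k in (1, 2, 3):
            ax.axhline(target + k * sd, linestyle="--", color="grey")
            ax.axhline(target - k * sd, linestyle="--", color="grey")

        ax.set_xlabel("IQC run")
        ax.set_ylabel("measured value")
        ax.legend()
        plt.savefig("levey_jennings.png", dpi=150)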

  4. Multi-axis control based on movement control cards in NC systems

    NASA Astrophysics Data System (ADS)

    Jiang, Tingbiao; Wei, Yunquan

    2005-12-01

    Today most movement control cards need special control software on the host computer and are only suitable for fixed-axis control; consequently, the number of axes that can be controlled is limited. Advanced manufacturing technology is developing very rapidly, and that development brings forth new requirements for movement control in mechanisms and electronics. This paper introduces a fifth-generation movement control card, the PMAC 2A-PC/104, made by the Delta Tau Company in the USA. Based on an analysis of the PMAC 2A-PC/104, this paper first describes two relevant aspects: the hardware structure of movement control cards and the associated host-computer software. Then, two methods are presented for solving these problems. The first method is to set limit switches on the movement control cards; all of them can be used to control each moving axis. The second method is to program application software with an existing programming language (for example, VC++, Visual Basic, Delphi, and so forth); such a program is much easier for users to operate and extend. By using limit switches, users can choose different axes on the movement control cards, and by changing some of the parameters in the host-computer control software they can realize different control axes. Combining these two methods proves convenient for realizing multi-axis control in numerical control systems.

  5. FlowCal: A user-friendly, open source software tool for automatically converting flow cytometry data from arbitrary to calibrated units

    PubMed Central

    Castillo-Hair, Sebastian M.; Sexton, John T.; Landry, Brian P.; Olson, Evan J.; Igoshin, Oleg A.; Tabor, Jeffrey J.

    2017-01-01

    Flow cytometry is widely used to measure gene expression and other molecular biological processes with single cell resolution via fluorescent probes. Flow cytometers output data in arbitrary units (a.u.) that vary with the probe, instrument, and settings. Arbitrary units can be converted to the calibrated unit molecules of equivalent fluorophore (MEF) using commercially available calibration particles. However, there is no convenient, non-proprietary tool available to perform this calibration. Consequently, most researchers report data in a.u., limiting interpretation. Here, we report a software tool named FlowCal to overcome current limitations. FlowCal can be run using an intuitive Microsoft Excel interface, or customizable Python scripts. The software accepts Flow Cytometry Standard (FCS) files as inputs and is compatible with different calibration particles, fluorescent probes, and cell types. Additionally, FlowCal automatically gates data, calculates common statistics, and produces publication quality plots. We validate FlowCal by calibrating a.u. measurements of E. coli expressing superfolder GFP (sfGFP) collected at 10 different detector sensitivity (gain) settings to a single MEF value. Additionally, we reduce day-to-day variability in replicate E. coli sfGFP expression measurements due to instrument drift by 33%, and calibrate S. cerevisiae mVenus expression data to MEF units. Finally, we demonstrate a simple method for using FlowCal to calibrate fluorescence units across different cytometers. FlowCal should ease the quantitative analysis of flow cytometry data within and across laboratories and facilitate the adoption of standard fluorescence units in synthetic biology and beyond. PMID:27110723
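
    The calibration step FlowCal automates can be pictured as fitting a standard curve from beads of known MEF and then applying it to cell measurements. The sketch below illustrates that idea generically with a log-log linear fit; it deliberately does not use FlowCal's own API, and the bead values and channel numbers are invented.

        # Sketch of the a.u. -> MEF calibration idea: fit a standard curve from
        # calibration beads with known MEF values, then convert cell measurements.
        # This is a generic illustration, not FlowCal's API; bead MEF values and
        # the measured channel numbers are invented, and FlowCal's actual bead
        # model is more sophisticated than this simple log-log fit.
        import numpy as np

        # Manufacturer-assigned MEF values for the bead subpopulations (invented).
        bead_mef = np.array([792, 2079, 6588, 16471, 47497, 137049, 271647])

        # Median fluorescence of each bead subpopulation in arbitrary units (invented).
        bead_au = np.array([120, 310, 980, 2450, 7100, 20500, 40600])

        # Standard curve: log10(MEF) ~ m * log10(a.u.) + b
        m, b = np.polyfit(np.log10(bead_au), np.log10(bead_mef), deg=1)

        def au_to_mef(au):
            """Convert arbitrary-unit measurements to MEF via the standard curve."""
            return 10 ** (m * np.log10(au) + b)

        cells_au = np.array([250.0, 1800.0, 9500.0])   # example cell measurements
        print("calibrated MEF values:", np.round(au_to_mef(cells_au), 1))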

  6. Extended cage adjustable speed electric motors and drive packages

    DOEpatents

    Hsu, J.S.

    1999-03-23

    The rotor cage of a motor is extended, a second stator is coupled to this extended rotor cage, and the windings have the same number of poles. The motor torque and speed can be controlled by either injecting energy into or extracting energy out from the rotor cage. The motor produces less harmonics than existing doubly-fed motors. Consequently, a new type of low cost, high efficiency drive is produced. 12 figs.

  7. Integration of the Remote Agent for the NASA Deep Space One Autonomy Experiment

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Bernard, Douglas E.; Gamble, Edward B., Jr.; Kanefsky, Bob; Kurien, James; Muscettola, Nicola; Nayak, P. Pandurang; Rajan, Kanna; Lau, Sonie (Technical Monitor)

    1998-01-01

    This paper describes the integration of the Remote Agent (RA), a spacecraft autonomy system which is scheduled to control the Deep Space 1 spacecraft during a flight experiment in 1999. The RA is a reusable, model-based autonomy system that is quite different from software typically used to control an aerospace system. We describe the integration challenges we faced, how we addressed them, and the lessons learned. We focus on those aspects of integrating the RA that were either easier or more difficult than integrating a more traditional large software application because the RA is a model-based autonomous system. A number of characteristics of the RA made the integration process easier. One example is the model-based nature of the RA. Since the RA is model-based, most of its behavior is not hard-coded into procedural program code. Instead, engineers specify high level models of the spacecraft's components from which the Remote Agent automatically derives correct system-wide behavior on the fly. This high level, modular, and declarative software description allowed some interfaces between RA components and between the RA and the flight software to be automatically generated and tested for completeness against the Remote Agent's models. In addition, the Remote Agent's model-based diagnosis system automatically diagnoses when the RA models are not consistent with the behavior of the spacecraft. In flight, this feature is used to diagnose failures in the spacecraft hardware. During integration, it proved valuable in finding problems in the spacecraft simulator or flight software. In addition, when modifications are made to the spacecraft hardware or flight software, the RA models are easily changed because they only capture a description of the spacecraft; one does not have to maintain procedural code that implements the correct behavior for every expected situation. On the other hand, several features of the RA made it more difficult to integrate than typical flight software. For example, the definition of correct behavior is more difficult to specify for a system that is expected to reason about and flexibly react to its environment than for a traditional flight software system. Consequently, whenever a change is made to the RA it is more time-consuming to determine if the resulting behavior is correct. We conclude the paper with a discussion of future work on the Remote Agent as well as recommendations to ease integration of similar autonomy projects.

  8. Manager's handbook for software development, revision 1

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Methods and aids for the management of software development projects are presented. The recommendations are based on analyses and experiences of the Software Engineering Laboratory (SEL) with flight dynamics software development. The management aspects of the following subjects are described: organizing the project, producing a development plan, estimating costs, scheduling, staffing, preparing deliverable documents, using management tools, monitoring the project, conducting reviews, auditing, testing, and certifying.

  9. Adaptive Integration of Nonsmooth Dynamical Systems

    DTIC Science & Technology

    2017-10-11

    Describes an adaptively controlled time-stepping method used to interactively design running robots [1]. The motivation comes from attempts to use existing simulation software to test software running on physical robots: the libraries that produce visually impressive results have failed at simulating robotic manipulation. Postulate: It is easier to... [1] John Shepherd, Samuel Zapolsky, and Evan M. Drumwright, "Fast multi-body...

  10. Using Software: A Guide to the Ethical and Legal Use of Software for Members of the Academic Community.

    ERIC Educational Resources Information Center

    Interuniversity Communications Council (EDUCOM), Washington, DC.

    The purpose of this brochure, which was produced as a service to the academic community, is to provide a brief outline of what can and cannot be done legally with software, and to clarify the implications and restrictions of the U.S. Copyright Law. Relevant facts concerning copying software precede the EDUCOM statement of principle on intellectual…

  11. Multidisciplinary and Active/Collaborative Approaches in Teaching Requirements Engineering

    ERIC Educational Resources Information Center

    Rosca, Daniela

    2005-01-01

    The requirements engineering course is a core component of the curriculum for the Master's in Software Engineering programme, at Monmouth University (MU). It covers the process, methods and tools specific to this area, together with the corresponding software quality issues. The need to produce software engineers with strong teamwork and…

  12. Company's Unusual Plan to Package Commercial Software with Business Textbooks Produces a Measure of Success.

    ERIC Educational Resources Information Center

    Watkins, Beverly T.

    1992-01-01

    Course Technology Inc. has developed 10 products combining textbooks with commercial software for college accounting, business, computer science, and statistics courses. Five of the products use Lotus 1-2-3 spreadsheet software. The products have been positively received by teachers and students. (DB)

  13. 32 CFR 295.3 - Definition of OIG records.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., decisions, or procedures of the OIG. Normally, computer software, including source code, object code, and... the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in § 295.4(c). (3) Anything...

  14. 32 CFR 295.3 - Definition of OIG records.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., decisions, or procedures of the OIG. Normally, computer software, including source code, object code, and... the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in § 295.4(c). (3) Anything...

  15. 32 CFR 295.3 - Definition of OIG records.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., decisions, or procedures of the OIG. Normally, computer software, including source code, object code, and... the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in § 295.4(c). (3) Anything...

  16. 32 CFR 295.3 - Definition of OIG records.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., decisions, or procedures of the OIG. Normally, computer software, including source code, object code, and... the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in § 295.4(c). (3) Anything...

  17. 32 CFR 295.3 - Definition of OIG records.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., decisions, or procedures of the OIG. Normally, computer software, including source code, object code, and... the underlying data which is processed and produced by such software and which may in some instances be stored with the software.) Exceptions to this position are outlined in § 295.4(c). (3) Anything...

  18. Software Solutions for Better Administration.

    ERIC Educational Resources Information Center

    Kazanjian, Edward

    1997-01-01

    The CO/OP (founded in 1973 as the Massachusetts Association of School Business Officials Cooperative Corporation) has created and produced administrative software for schools. Describes two areas in which software can increase revenue and provide protection for personnel: (1) invoice/accounts receivable for the rental of school space; and (2) an…

  19. 78 FR 66865 - Acquisition Regulation: Patents, Data, and Copyrights

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-07

    ... obligations under current law. The proposed changes include policy revisions for computer software developed.... Computer Software DOE's existing Rights in Technical Data-Technology Transfer clause at 970.5227-2 provides mechanisms by which computer software first produced by a DOE contractor may be made available to the public...

  20. SEED Software Annotations.

    ERIC Educational Resources Information Center

    Bethke, Dee; And Others

    This document provides a composite index of the first five sets of software annotations produced by Project SEED. The software has been indexed by title, subject area, and grade level, and it covers sets of annotations distributed in September 1986, April 1987, September 1987, November 1987, and February 1988. The date column in the index…

  1. Science for the Home: New Products Tackle Such Weighty Subjects as Immunology, Chemistry.

    ERIC Educational Resources Information Center

    Mace, Scott

    1984-01-01

    Discusses trends in science software for home and educational use. Examples of software on various science topics are provided, including packages which revolve around such television shows as "Nova" and "Voyage of the Mimi" and those produced by the Human Engineering Software. (JN)

  2. Towards a New Paradigm of Software Development: an Ambassador Driven Process in Distributed Software Companies

    NASA Astrophysics Data System (ADS)

    Kumlander, Deniss

    The globalization of company operations and the competition between software vendors demand higher quality in delivered software at lower overall cost. At the same time, globalization introduces many problems into the software development process, since it produces distributed organizations that break the co-location assumption of modern software development methodologies. Here we propose a reformulation of the ambassador role that increases its productivity in order to bridge communication and workflow gaps by managing the entire communication process rather than concentrating purely on the communication result.

  3. The analysis of the statistical and historical information gathered during the development of the Shuttle Orbiter Primary Flight Software

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.; Marchbanks, M. P., Jr.; Quick, M. J.

    1982-01-01

    The results of an effort to thoroughly and objectively analyze the statistical and historical information gathered during the development of the Shuttle Orbiter Primary Flight Software are given. The particular areas of interest include the cost of the software, its reliability, its requirements, and how the requirements changed during development of the system. Data related to the current version of the software system produced some interesting results. Suggestions are made for saving additional data that will allow further investigation.

  4. Middleware Case Study: MeDICi

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wynne, Adam S.

    2011-05-05

    In many application domains in science and engineering, data produced by sensors, instruments and networks is naturally processed by software applications structured as a pipeline. Pipelines comprise a sequence of software components that progressively process discrete units of data to produce a desired outcome. For example, in a Web crawler that is extracting semantics from text on Web sites, the first stage in the pipeline might be to remove all HTML tags to leave only the raw text of the document. The second step may parse the raw text to break it down into its constituent grammatical parts, such as nouns, verbs and so on. Subsequent steps may look for names of people or places, interesting events or times so documents can be sequenced on a time line. Each of these steps can be written as a specialized program that works in isolation from the other steps in the pipeline. In many applications, simple linear software pipelines are sufficient. However, more complex applications require topologies that contain forks and joins, creating pipelines comprising branches where parallel execution is desirable. It is also increasingly common for pipelines to process very large files or high volume data streams which impose end-to-end performance constraints. Additionally, processes in a pipeline may have specific execution requirements and hence need to be distributed as services across a heterogeneous computing and data management infrastructure. From a software engineering perspective, these more complex pipelines become problematic to implement. While simple linear pipelines can be built using minimal infrastructure such as scripting languages, complex topologies and large, high volume data processing require suitable abstractions, run-time infrastructures and development tools to construct pipelines with the desired qualities-of-service and flexibility to evolve to handle new requirements. The above summarizes the reasons we created the MeDICi Integration Framework (MIF), which is designed for creating high-performance, scalable and modifiable software pipelines. MIF exploits a low friction, robust, open source middleware platform and extends it with component and service-based programmatic interfaces that make implementing complex pipelines simple. The MIF run-time automatically handles queues between pipeline elements in order to handle request bursts, and automatically executes multiple instances of pipeline elements to increase pipeline throughput. Distributed pipeline elements are supported using a range of configurable communications protocols, and the MIF interfaces provide efficient mechanisms for moving data directly between two distributed pipeline elements.
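
    The pipeline pattern described in this record can be illustrated with a minimal sketch. The stage names and the generator-based composition below are illustrative assumptions, not part of the MeDICi Integration Framework API; they only show how discrete units of data flow through a sequence of independent processing components.

      # Minimal illustration of the pipeline pattern described above.
      # Stage names (strip_tags, split_words, find_years) are hypothetical,
      # not MeDICi/MIF API calls.
      import re

      def strip_tags(docs):
          for doc in docs:
              yield re.sub(r"<[^>]+>", " ", doc)          # remove HTML tags

      def split_words(texts):
          for text in texts:
              yield re.findall(r"[A-Za-z0-9]+", text)     # crude tokenization

      def find_years(token_lists):
          for tokens in token_lists:
              yield [t for t in tokens if re.fullmatch(r"\d{4}", t)]  # 4-digit years

      def pipeline(docs):
          # Each stage lazily consumes the previous stage's output,
          # analogous to queued hand-offs between pipeline elements.
          return find_years(split_words(strip_tags(docs)))

      if __name__ == "__main__":
          docs = ["<p>Apollo 11 landed in 1969.</p>", "<h1>No dates here</h1>"]
          for result in pipeline(docs):
              print(result)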

  5. A Methodology for Forecasting Damage & Economic Consequences to Floods: Building on the National Flood Interoperability Experiment (NFIE)

    NASA Astrophysics Data System (ADS)

    Tootle, G. A.; Gutenson, J. L.; Zhu, L.; Ernest, A. N. S.; Oubeidillah, A.; Zhang, X.

    2015-12-01

    The National Flood Interoperability Experiment (NFIE), held June 3-July 17, 2015 at the National Water Center (NWC) in Tuscaloosa, Alabama, sought to demonstrate an increase in flood predictive capacity for the coterminous United States (CONUS). Accordingly, NFIE-derived technologies and workflows offer the ability to forecast flood damage and economic consequence estimates that coincide with the hydrologic and hydraulic estimations these physics-based models generate. A model providing an accurate prediction of damage and economic consequences is a valuable asset when allocating funding for disaster response, recovery, and relief. Damage prediction and economic consequence assessment also offer an adaptation planning mechanism for defending particularly valuable or vulnerable structures. The NFIE, held at the NWC on The University of Alabama (UA) campus, led to the development of this large-scale flow and inundation forecasting framework. Currently, the system can produce 15-hour lead-time forecasts for the entire CONUS, and it is anticipated to become operational at the NWC in May 2016. The processing of such a large-scale, fine-resolution model is accomplished in a parallel computing environment using large supercomputing clusters. Traditionally, flood damage and economic consequence assessment is calculated in a desktop computing environment with a combination of meteorology, hydrology, hydraulics, and damage assessment tools. In the United States, a range of flood damage and economic consequence assessment software packages is available to local, state, and federal emergency management agencies. Among the more commonly used and freely accessible models are the Hydrologic Engineering Center's Flood Damage Reduction Analysis (HEC-FDA), Flood Impact Assessment (HEC-FIA), and the Federal Emergency Management Agency's (FEMA's) United States Multi-Hazard model (Hazus-MH), all of which exist only in a desktop environment. With this, the authors submit an initial framework for estimating damage and economic consequences of floods using flow and inundation products from the NFIE framework. This adaptive system utilizes existing nationwide datasets describing the location and use of structures and can assimilate a range of data resolutions.
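
    As a rough illustration of the kind of damage calculation such a framework performs, the sketch below applies a depth-damage curve to forecast inundation depths at structure locations. The curve values and structure records are invented for illustration; they are not taken from HEC-FIA, HEC-FDA, or Hazus-MH, which use occupancy-specific curves and nationwide inventories.

      # Illustrative depth-damage calculation (hypothetical curve and structures).
      import numpy as np

      # Depth (m) vs. fraction of structure value damaged -- assumed example curve.
      curve_depth = np.array([0.0, 0.3, 1.0, 2.0, 3.0])
      curve_damage = np.array([0.0, 0.1, 0.4, 0.7, 0.9])

      structures = [
          {"id": "A", "value_usd": 250_000, "flood_depth_m": 0.8},
          {"id": "B", "value_usd": 400_000, "flood_depth_m": 2.5},
      ]

      total = 0.0
      for s in structures:
          frac = np.interp(s["flood_depth_m"], curve_depth, curve_damage)
          loss = frac * s["value_usd"]
          total += loss
          print(f"structure {s['id']}: estimated loss ${loss:,.0f}")
      print(f"total estimated loss ${total:,.0f}")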

  6. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer

    NASA Astrophysics Data System (ADS)

    Guarnieri, Vittorio; Francini, Franco

    1997-12-01

    The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on a transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of the design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGHs has been developed. The guidelines inspiring the work have been the following: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce portable code runnable on several hardware platforms. In this paper, calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.
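
    For context, a binary hologram of a single point source can be computed by interfering its spherical wave with a tilted plane reference wave and thresholding the result. The sketch below is a generic textbook-style calculation with wavelength, pitch and geometry chosen only for illustration; it does not reproduce the authors' ray-tracing code.

      # Binary CGH of a single on-axis point source (illustrative parameters).
      import numpy as np

      wavelength = 633e-9          # HeNe wavelength (assumed), in metres
      k = 2 * np.pi / wavelength
      z0 = 0.2                     # point-source distance behind hologram plane (m)
      pitch = 10e-6                # printer pixel pitch (assumed)
      n = 512                      # hologram size in pixels

      x = (np.arange(n) - n / 2) * pitch
      X, Y = np.meshgrid(x, x)

      # Spherical wave from the point source (paraxial phase approximation).
      object_phase = k * (X**2 + Y**2) / (2 * z0)
      # Tilted plane reference wave to shift the reconstruction off axis.
      reference_phase = k * np.sin(np.deg2rad(1.0)) * X

      interference = np.cos(object_phase - reference_phase)
      binary_hologram = (interference > 0).astype(np.uint8)  # 1 = transparent, 0 = opaque
      print(binary_hologram.shape, binary_hologram.mean())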

  7. Real-time UNIX in HEP data acquisition

    NASA Astrophysics Data System (ADS)

    Buono, S.; Gaponenko, I.; Jones, R.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P. Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Aguer, M.; Huet, M.

    1994-12-01

    Today's experimentation in high energy physics is characterized by an increasing need for sensitivity to rare phenomena and complex physics signatures, which require the use of huge and sophisticated detectors and, consequently, high-performance readout and data acquisition. Multi-level triggering, hierarchical data collection and an ever-increasing amount of processing power distributed throughout the data acquisition layers impose a number of requirements on the software environment, especially the need for a high level of standardization. Real-time UNIX seems, today, the best solution for the platform independence, operating system interface standards and real-time features necessary for data acquisition in HEP experiments. We present the results of the evaluation, in a realistic application environment, of a real-time UNIX operating system: the EP/LX real-time UNIX system.

  8. Computer sciences

    NASA Technical Reports Server (NTRS)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  9. Three-dimensional surface reconstruction for industrial computed tomography

    NASA Technical Reports Server (NTRS)

    Vannier, M. W.; Knapp, R. H.; Gayou, D. E.; Sammon, N. P.; Butterfield, R. L.; Larson, J. W.

    1985-01-01

    Modern high resolution medical computed tomography (CT) scanners can produce geometrically accurate sectional images of many types of industrial objects. Computer software has been developed to convert serial CT scans into a three-dimensional surface form, suitable for display on the scanner itself. This software, originally developed for imaging the skull, has been adapted for application to industrial CT scanning, where serial CT scans through an object of interest may be reconstructed to demonstrate spatial relationships in three dimensions that cannot be easily understood using the original slices. The methods of three-dimensional reconstruction and solid modeling are reviewed, and reconstruction in three dimensions from CT scans through familiar objects is demonstrated.
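
    A modern open-source equivalent of the slice-to-surface step uses the marching cubes algorithm. The sketch below, assuming a synthetic CT volume and a fixed intensity threshold, shows the idea; it is not the software described in this record.

      # Surface extraction from a stack of CT slices via marching cubes
      # (synthetic sphere volume used as stand-in data).
      import numpy as np
      from skimage import measure

      # Build a synthetic 64^3 volume containing a sphere of "dense" material.
      z, y, x = np.mgrid[-32:32, -32:32, -32:32]
      volume = (np.sqrt(x**2 + y**2 + z**2) < 20).astype(np.float32)

      # Extract an isosurface at the chosen threshold; spacing encodes slice thickness.
      verts, faces, normals, values = measure.marching_cubes(
          volume, level=0.5, spacing=(1.0, 0.5, 0.5)
      )
      print(f"{len(verts)} vertices, {len(faces)} triangles")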

  10. MIDAS: Software for the detection and analysis of lunar impact flashes

    NASA Astrophysics Data System (ADS)

    Madiedo, José M.; Ortiz, José L.; Morales, Nicolás; Cabrera-Caño, Jesús

    2015-06-01

    Since 2009 we have been running a project to identify flashes produced by the impact of meteoroids on the surface of the Moon. For this purpose we employ small telescopes and high-sensitivity CCD video cameras. To automatically identify these events, a software package called MIDAS was developed and tested. This package can also perform the photometric analysis of these flashes and estimate the value of the luminous efficiency. In addition, we have implemented in MIDAS a new method to establish the likely source of the meteoroids (a known meteoroid stream or the sporadic background). The main features of this computer program are analyzed here, and some examples of lunar impact events are presented.
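
    The core of impact-flash detection is finding short-lived brightness transients in a video stream. The frame-differencing sketch below is a generic illustration under assumed thresholds; MIDAS's actual detection and photometry algorithms are more elaborate.

      # Generic transient (flash) detection by frame differencing on grayscale video.
      import numpy as np

      def detect_flashes(frames, threshold=25, min_pixels=4):
          """Return indices of frames whose brightened area exceeds min_pixels."""
          events = []
          prev = frames[0].astype(np.int16)
          for i, frame in enumerate(frames[1:], start=1):
              cur = frame.astype(np.int16)
              brightened = (cur - prev) > threshold     # pixels that jumped in brightness
              if brightened.sum() >= min_pixels:
                  events.append(i)
              prev = cur
          return events

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          frames = rng.normal(10, 2, size=(100, 64, 64)).clip(0, 255)
          frames[42, 30:33, 30:33] += 80                # inject a synthetic flash
          print(detect_flashes(frames))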

  11. Dynamics of forest herbivory: quest for pattern and principle.

    Treesearch

    William J. Mattson; Pekka Niemila; Matti Rossi

    1996-01-01

    Herbivory on woody plants is highly variable in both space and time. This proceedings addresses one of its root causes, the highly intricate and dynamic relationships that exist between most herbivores and their host plants. It emphasizes that the consequences of herbivory both to the consumer and to the producer plant often balance on a razor's edge--depending on...

  12. Configurable software for satellite graphics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartzman, P D

    An important goal in interactive computer graphics is to provide users with both quick system responses for basic graphics functions and enough computing power for complex calculations. One solution is to have a distributed graphics system in which a minicomputer and a powerful large computer share the work. The most versatile type of distributed system is an intelligent satellite system in which the minicomputer is programmable by the application user and can do most of the work while the large remote machine is used for difficult computations. At New York University, the hardware was configured from available equipment. The level of system intelligence resulted almost completely from software development. Unlike previous work with intelligent satellites, the resulting system had system control centered in the satellite. It also had the ability to reconfigure software during real-time operation. The design of the system was done at a very high level using set theoretic language. The specification clearly illustrated processor boundaries and interfaces. The high-level specification also produced a compact, machine-independent virtual graphics data structure for picture representation. The software was written in a systems implementation language; thus, only one set of programs was needed for both machines. A user can program both machines in a single language. Tests of the system with an application program indicate that it has very high potential. A major result of this work is the demonstration that a gigantic investment in new hardware is not necessary for computing facilities interested in graphics.

  13. 3D Graphics For Interactive Surgical Simulation And Implant Design

    NASA Astrophysics Data System (ADS)

    Dev, P.; Fellingham, L. L.; Vassiliadis, A.; Woolson, S. T.; White, D. N.; Young, S. L.

    1984-10-01

    The combination of user-friendly, highly interactive software, 3D graphics, and the high-resolution detailed views of anatomy afforded by X-ray computer tomography and magnetic resonance imaging can provide surgeons with the ability to plan and practice complex surgeries. In addition to providing a realistic and manipulable 3D graphics display, this system can drive a milling machine in order to produce physical models of the anatomy or prosthetic devices and implants which have been designed using its interactive graphics editing facilities.

  14. [Software for illustrating a cost-quality balance carried out by clinical laboratory practice].

    PubMed

    Nishibori, Masahiro; Asayama, Hitoshi; Kimura, Satoshi; Takagi, Yasushi; Hagihara, Michio; Fujiwara, Mutsunori; Yoneyama, Akiko; Watanabe, Takashi

    2010-09-01

    We have no proper reference indicating the quality of clinical laboratory practice, one that clearly illustrates that better medical tests require greater expense. The Japanese Society of Laboratory Medicine, concerned about the recent difficult medical economy, issued a committee report proposing a guideline for evaluating good laboratory practice. According to the guideline, we developed software that illustrates the cost-quality balance achieved by clinical laboratory practice. We encountered a number of controversial problems, for example, how to measure and weight each quality-related factor, how to calculate the costs of a laboratory test, and how to consider the characteristics of a clinical laboratory. Consequently, we finished only prototype software within the given period and budget. In this paper, the software implementation of the guideline and the above-mentioned problems are summarized. To stimulate discussion of these issues, the working software will be made available on the Society's homepage for trial.

  15. Maintenance Manual for AUDIT. A System for Analyzing SESCOMP Software. Volume 4: Appendix D. Listings of the AUDIT Software for the IBM 360.

    DTIC Science & Technology

    1977-08-01

    The AUDIT documentation provides the maintenance programmer personnel with the information to effectively maintain and use the AUDIT software. The ...SESCOMPSPEC’s) and produces reports detailing the deviations from those standards. The AUDIT software also examines a program unit to detect and report...changes in word length on the output of computer programs. This report contains the listings of the AUDIT software for the IBM 360. (Author)

  16. Maintenance Manual for AUDIT. A System for Analyzing SESCOMP Software. Volume 3: Appendix C - Listings of the AUDIT Software for the UNIVAC 1108.

    DTIC Science & Technology

    1977-08-01

    The AUDIT documentation provides the maintenance programmer personnel with the information to effectively maintain and use the AUDIT software. The ...SESCOMPSPEC’s) and produces reports detailing the deviations from those standards. The AUDIT software also examines a program unit to detect and report...changes in word length on the output of computer programs. This report contains the listings of the AUDIT software for the UNIVAC 1108. (Author)

  17. Maintenance Manual for AUDIT. A System for Analyzing SESCOMP Software. Volume 2: Appendix B. Listings of the Audit Software for the CDC 6000.

    DTIC Science & Technology

    1977-08-01

    The AUDIT documentation provides the maintenance programmer personnel with the information to effectively maintain and use the AUDIT software. The ...SESCOMPSPEC’s) and produces reports detailing the deviations from those standards. The AUDIT software also examines a program unit to detect and report...changes in word length on the output of computer programs. This report contains the listings of the AUDIT software for the CDC 6000. (Author)

  18. Wake Turbulence Mitigation for Departures (WTMD) Prototype System - Software Design Document

    NASA Technical Reports Server (NTRS)

    Sturdy, James L.

    2008-01-01

    This document describes the software design of a prototype Wake Turbulence Mitigation for Departures (WTMD) system that was evaluated in shadow mode operation at the Saint Louis (KSTL) and Houston (KIAH) airports. This document describes the software that provides the system framework, communications, user displays, and hosts the Wind Forecasting Algorithm (WFA) software developed by the M.I.T. Lincoln Laboratory (MIT-LL). The WFA algorithms and software are described in a separate document produced by MIT-LL.

  19. Designing an Orthotic Insole by Using Kinect® XBOX Gaming Sensor Scanner and Computer Aided Engineering Software

    NASA Astrophysics Data System (ADS)

    Hafiz Burhan, Mohd; Nor, Nik Hisyamudin Muhd; Yarwindran, Mogan; Ibrahim, Mustaffa; Fahrul Hassan, Mohd; Azwir Azlan, Mohd; Turan, Faiz Mohd; Johan, Kartina

    2017-08-01

    Healthcare and medicine constitute one of the most expensive fields in the modern world. In order to fulfil medical requirements, this study aimed to design an orthotic insole by using the Kinect Xbox Gaming Sensor Scanner and CAE software. The Kinect® XBOX 360 gaming sensor is capable of producing 3D reconstructed geometry with maximum and minimum errors of 3.78% (2.78 mm) and 1.74% (0.46 mm), respectively. The orthotic insole design process was carried out using Autodesk Meshmixer 2.6 and Solidworks 2014 software. The designed orthotic insole was capable of reducing foot pressure, especially in the metatarsal area. Overall, the proposed method proved highly promising for insole design, offering low cost, reduced design time, and efficiency, since the Kinect® XBOX 360 device is inexpensive compared with other digital 3D scanners and the software needed to run it can be downloaded for free.

  20. Software for Partly Automated Recognition of Targets

    NASA Technical Reports Server (NTRS)

    Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark; Selinsky, T.

    2002-01-01

    The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.

  1. Extracting patterns of database and software usage from the bioinformatics literature

    PubMed Central

    Duck, Geraint; Nenadic, Goran; Brass, Andy; Robertson, David L.; Stevens, Robert

    2014-01-01

    Motivation: As a natural consequence of being a computer-based discipline, bioinformatics has a strong focus on database and software development, but the volume and variety of resources are growing at unprecedented rates. An audit of database and software usage patterns could help provide an overview of developments in bioinformatics and community common practice, and comparing the links between resources through time could demonstrate both the persistence of existing software and the emergence of new tools. Results: We study the connections between bioinformatics resources and construct networks of database and software usage patterns, based on resource co-occurrence, that correspond to snapshots of common practice in the bioinformatics community. We apply our approach to pairings of phylogenetics software reported in the literature and argue that these could provide a stepping stone into the identification of scientific best practice. Availability and implementation: The extracted resource data, the scripts used for network generation and the resulting networks are available at http://bionerds.sourceforge.net/networks/ Contact: robert.stevens@manchester.ac.uk PMID:25161253
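
    The resource co-occurrence networks described here can be reproduced in miniature: treat each paper as the set of resources it mentions and connect resources that appear together. The sketch below, using networkx and invented example data, shows the construction only; it is not the authors' text-mining pipeline.

      # Build a resource co-occurrence network from per-paper resource mentions.
      from itertools import combinations
      import networkx as nx

      papers = {
          "paper1": {"BLAST", "ClustalW", "PhyML"},
          "paper2": {"BLAST", "MUSCLE", "RAxML"},
          "paper3": {"MUSCLE", "RAxML"},
      }

      graph = nx.Graph()
      for resources in papers.values():
          for a, b in combinations(sorted(resources), 2):
              # Edge weight counts how many papers mention both resources.
              if graph.has_edge(a, b):
                  graph[a][b]["weight"] += 1
              else:
                  graph.add_edge(a, b, weight=1)

      for a, b, data in graph.edges(data=True):
          print(a, "--", b, "co-mentions:", data["weight"])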

  2. Effects of sodium fluoride on immune response in murine macrophages.

    PubMed

    De la Fuente, Beatriz; Vázquez, Marta; Rocha, René Antonio; Devesa, Vicenta; Vélez, Dinoraz

    2016-08-01

    Excessive fluoride intake may be harmful for health, producing dental and skeletal fluorosis, and effects upon neurobehavioral development. Studies in animals have revealed effects upon the gastrointestinal, renal and reproductive systems. Some of the disorders may be a consequence of immune system alterations. In this study, an in vitro evaluation is made of fluoride immunotoxicity using the RAW 264.7 murine macrophage line over a broad range of concentrations (2.5-75mg/L). The results show that the highest fluoride concentrations used (50-75mg/L) reduce the macrophage population in part as a consequence of the generation of reactive oxygen and/or nitrogen species and consequent redox imbalance, which in turn is accompanied by lipid peroxidation. A decrease in the expression of the antiinflammatory cytokine Il10 is observed from the lowest concentrations (5mg/L). High concentrations (50mg/L) in turn produce a significant increase in the proinflammatory cytokines Il6 and Mip2 from 4h of exposure. In addition, cell phagocytic capacity is seen to decrease at concentrations of ≥20mg/L. These data indicate that fluoride, at high concentrations, may affect macrophages and thus immune system function - particularly with regard to the inflammation autoregulatory processes, in which macrophages play a key role. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Real-time control system for adaptive resonator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flath, L; An, J; Brase, J

    2000-07-24

    Sustained operation of high average power solid-state lasers currently requires an adaptive resonator to produce the optimal beam quality. We describe the architecture of a real-time adaptive control system for correcting intra-cavity aberrations in a heat capacity laser. Image data collected from a wavefront sensor are processed and used to control phase with a high-spatial-resolution deformable mirror. Our controller takes advantage of recent developments in low-cost, high-performance processor technology. A desktop-based computational engine and object-oriented software architecture replaces the high-cost rack-mount embedded computers of previous systems.
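
    At its simplest, an adaptive-optics control loop of this kind measures the residual wavefront, multiplies it by a gain, and folds it back into the deformable-mirror command. The sketch below is a generic leaky-integrator loop with made-up dimensions and gains, not the laboratory's controller.

      # Generic integrator control loop for wavefront correction (illustrative only).
      import numpy as np

      n_actuators = 16
      gain = 0.3            # loop gain (assumed)
      leak = 0.99           # leaky integrator to keep commands bounded

      rng = np.random.default_rng(1)
      aberration = rng.normal(0, 1.0, n_actuators)   # static aberration to correct
      command = np.zeros(n_actuators)                # deformable-mirror command

      for step in range(50):
          residual = aberration - command            # what the wavefront sensor would see
          command = leak * command + gain * residual # integrator update
      print("final RMS residual:", np.sqrt(np.mean((aberration - command) ** 2)))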

  4. Comparison of 3D reconstruction of mandible for pre-operative planning using commercial and open-source software

    NASA Astrophysics Data System (ADS)

    Abdullah, Johari Yap; Omar, Marzuki; Pritam, Helmi Mohd Hadi; Husein, Adam; Rajion, Zainul Ahmad

    2016-12-01

    3D printing of the mandible is important for pre-operative planning and diagnostic purposes, as well as for education and training. Currently, the processing of CT data is routinely performed with commercial software, which increases the cost of operation and patient management for a small clinical setting. Usage of open-source software as an alternative to commercial software for 3D reconstruction of the mandible from CT data is scarce. The aim of this study is to compare two methods of 3D reconstruction of the mandible using the commercial Materialise Mimics software and the open-source Medical Imaging Interaction Toolkit (MITK) software. Head CT images with a slice thickness of 1 mm and a matrix of 512x512 pixels each were retrieved from the server located at the Radiology Department of Hospital Universiti Sains Malaysia. The CT data were analysed and 3D models of the mandible were reconstructed using both commercial Materialise Mimics and open-source MITK software. Both virtual 3D models were saved in STL format and exported to 3-matic and MeshLab software for morphometric and image analyses. The models were compared using the Wilcoxon signed-rank test and the Hausdorff distance. No significant differences were obtained between the 3D models of the mandible produced using Mimics and MITK software. The 3D model of the mandible produced using the open-source MITK software is comparable to that produced using the commercial Mimics software. Therefore, open-source software could be used in a clinical setting for pre-operative planning to minimise operational cost.
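
    The Hausdorff distance used in this comparison measures the largest mismatch between two vertex sets. A minimal sketch with SciPy, on made-up point clouds rather than the actual mandible models, is shown below.

      # Symmetric Hausdorff distance between two vertex sets (illustrative data).
      import numpy as np
      from scipy.spatial.distance import directed_hausdorff

      rng = np.random.default_rng(42)
      mesh_a = rng.normal(size=(500, 3))                            # stand-in for model A
      mesh_b = mesh_a + rng.normal(scale=0.01, size=mesh_a.shape)   # stand-in for model B

      d_ab = directed_hausdorff(mesh_a, mesh_b)[0]
      d_ba = directed_hausdorff(mesh_b, mesh_a)[0]
      print("symmetric Hausdorff distance:", max(d_ab, d_ba))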

  5. RMP*Comp

    EPA Pesticide Factsheets

    You can use this free software program to complete the Off-site Consequence Analyses (both worst case scenarios and alternative scenarios) required under the Risk Management Program rule, so that you don't have to do calculations by hand.

  6. Software Process Assessment (SPA)

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda H.; Sheppard, Sylvia B.; Butler, Scott A.

    1994-01-01

    NASA's environment mirrors the changes taking place in the nation at large, i.e. workers are being asked to do more work with fewer resources. For software developers at NASA's Goddard Space Flight Center (GSFC), the effects of this change are that we must continue to produce quality code that is maintainable and reusable, but we must learn to produce it more efficiently and less expensively. To accomplish this goal, the Data Systems Technology Division (DSTD) at GSFC is trying a variety of both proven and state-of-the-art techniques for software development (e.g., object-oriented design, prototyping, designing for reuse, etc.). In order to evaluate the effectiveness of these techniques, the Software Process Assessment (SPA) program was initiated. SPA was begun under the assumption that the effects of different software development processes, techniques, and tools, on the resulting product must be evaluated in an objective manner in order to assess any benefits that may have accrued. SPA involves the collection and analysis of software product and process data. These data include metrics such as effort, code changes, size, complexity, and code readability. This paper describes the SPA data collection and analysis methodology and presents examples of benefits realized thus far by DSTD's software developers and managers.

  7. SEED 2: a user-friendly platform for amplicon high-throughput sequencing data analyses.

    PubMed

    Vetrovský, Tomáš; Baldrian, Petr; Morais, Daniel; Berger, Bonnie

    2018-02-14

    Modern molecular methods have increased our ability to describe microbial communities. Along with the advances brought by new sequencing technologies, we now require intensive computational resources to make sense of the large numbers of sequences continuously produced. The software developed by the scientific community to address this demand, although very useful, requires experience with the command-line environment and extensive training, and has steep learning curves, limiting its use. We created SEED 2, a graphical user interface for handling high-throughput amplicon-sequencing data under Windows operating systems. SEED 2 is the only sequence visualizer that empowers users with tools to handle amplicon-sequencing data of microbial community markers. It is suitable for any marker gene sequences obtained through Illumina, IonTorrent or Sanger sequencing. SEED 2 allows the user to process raw sequencing data, identify specific taxa, produce OTU tables, create sequence alignments and construct phylogenetic trees. Standard dual-core laptops with 8 GB of RAM can handle ca. 8 million Illumina PE 300 bp sequences (ca. 4 GB of data). SEED 2 was implemented in Object Pascal and uses internal functions and external software for amplicon data processing. SEED 2 is freeware, available at http://www.biomed.cas.cz/mbu/lbwrf/seed/ as a self-contained file, including all the dependencies, and does not require installation. Supplementary data contain a comprehensive list of supported functions. daniel.morais@biomed.cas.cz. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.

  8. Computational science: shifting the focus from tools to models

    PubMed Central

    Hinsen, Konrad

    2014-01-01

    Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728

  9. Integrated Systems Health Management (ISHM) Toolkit

    NASA Technical Reports Server (NTRS)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  10. Managers Handbook for Software Development

    NASA Technical Reports Server (NTRS)

    Agresti, W.; Mcgarry, F.; Card, D.; Page, J.; Church, V.; Werking, R.

    1984-01-01

    Methods and aids for the management of software development projects are presented. The recommendations are based on analyses and experiences with flight dynamics software development. The management aspects of organizing the project, producing a development plan, estimating costs, scheduling, staffing, preparing deliverable documents, using management tools, monitoring the project, conducting reviews, auditing, testing, and certifying are described.

  11. An Inquiry into the Cost of Post Deployment Software Support (PDSS)

    DTIC Science & Technology

    1989-09-01

    The increasing cost of software maintenance is taking a larger share of the military budget each year... increments as needed (3:59). The second page of the Form 75 starts with a section stating how the hours, and consequently the funds, will be allocated to... length of time required; the timeline can be in hourly, weekly, monthly, or quarterly increments. Some milestones included are formal approval, test

  12. Collected software engineering papers, volume 12

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1993 through October 1994. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 12th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  13. Collected software engineering papers, volume 11

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1992 through November 1993. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 11th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  14. Searching for Physics Beyond the Standard Model: Strongly-Coupled Field Theories at the Intensity and Energy Frontiers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brower, Richard C.

    This proposal is to develop the software and algorithmic infrastructure needed for the numerical study of quantum chromodynamics (QCD), and of theories that have been proposed to describe physics beyond the Standard Model (BSM) of high energy physics, on current and future computers. This infrastructure will enable users (1) to improve the accuracy of QCD calculations to the point where they no longer limit what can be learned from high-precision experiments that seek to test the Standard Model, and (2) to determine the predictions of BSM theories in order to understand which of them are consistent with the data that will soon be available from the LHC. Work will include the extension and optimizations of community codes for the next generation of leadership class computers, the IBM Blue Gene/Q and the Cray XE/XK, and for the dedicated hardware funded for our field by the Department of Energy. Members of our collaboration at Brookhaven National Laboratory and Columbia University worked on the design of the Blue Gene/Q, and have begun to develop software for it. Under this grant we will build upon their experience to produce high-efficiency production codes for this machine. Cray XE/XK computers with many thousands of GPU accelerators will soon be available, and the dedicated commodity clusters we obtain with DOE funding include growing numbers of GPUs. We will work with our partners in NVIDIA's Emerging Technology group to scale our existing software to thousands of GPUs, and to produce highly efficient production codes for these machines. Work under this grant will also include the development of new algorithms for the effective use of heterogeneous computers, and their integration into our codes. It will include improvements of Krylov solvers and the development of new multigrid methods in collaboration with members of the FASTMath SciDAC Institute, using their HYPRE framework, as well as work on improved symplectic integrators.

  15. Leveraging OpenStudio's Application Programming Interfaces: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, N.; Ball, B.; Goldwasser, D.

    2013-11-01

    OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) where users are able to extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly and will describe how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.

  16. Reflection of a polarized light cone

    NASA Astrophysics Data System (ADS)

    Brody, Jed; Weiss, Daniel; Berland, Keith

    2013-01-01

    We introduce a visually appealing experimental demonstration of Fresnel reflection. In this simple optical experiment, a polarized light beam travels through a high numerical-aperture microscope objective, reflects off a glass slide, and travels back through the same objective lens. The return beam is sampled with a polarizing beam splitter and produces a surprising geometric pattern on an observation screen. Understanding the origin of this pattern requires careful attention to geometry and an understanding of the Fresnel coefficients for S and P polarized light. We demonstrate that in addition to a relatively simple experimental implementation, the shape of the observed pattern can be computed both analytically and by using optical modeling software. The experience of working through complex mathematical computations and demonstrating their agreement with a surprising experimental observation makes this a highly educational experiment for undergraduate optics or advanced-lab courses. It also provides a straightforward yet non-trivial system for teaching students how to use optical modeling software.

  17. Meta-tools for software development and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than are general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, for target KA tools used, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.

  18. Iterative non-sequential protein structural alignment.

    PubMed

    Salem, Saeed; Zaki, Mohammed J; Bystroff, Christopher

    2009-06-01

    Structural similarity between proteins gives us insights into their evolutionary relationships when there is low sequence similarity. In this paper, we present a novel approach called SNAP for non-sequential pair-wise structural alignment. Starting from an initial alignment, our approach iterates over a two-step process consisting of a superposition step and an alignment step, until convergence. We propose a novel greedy algorithm to construct both sequential and non-sequential alignments. The quality of SNAP alignments was assessed by comparing against the manually curated reference alignments in the challenging SISY and RIPC datasets. Moreover, when applied to a dataset of 4410 protein pairs selected from the CATH database, SNAP produced longer alignments with lower RMSD than several state-of-the-art alignment methods. Classification of folds using SNAP alignments was both highly sensitive and highly selective. The SNAP software along with the datasets are available online at http://www.cs.rpi.edu/~zaki/software/SNAP.
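
    The superposition step in such iterative schemes is typically solved with the Kabsch algorithm: given the currently aligned residue pairs, find the rotation minimizing RMSD, then re-derive the alignment and repeat. The sketch below shows only the superposition step, on synthetic coordinates; it is not the SNAP code.

      # Kabsch superposition of two paired coordinate sets (the "superposition step").
      import numpy as np

      def kabsch_rmsd(P, Q):
          """Optimally rotate P onto Q (both N x 3, already paired) and return RMSD."""
          Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)   # centre both sets
          H = Pc.T @ Qc
          U, S, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))            # avoid improper rotation
          D = np.diag([1.0, 1.0, d])
          R = Vt.T @ D @ U.T
          diff = (R @ Pc.T).T - Qc
          return np.sqrt((diff ** 2).sum() / len(P))

      rng = np.random.default_rng(0)
      Q = rng.normal(size=(50, 3))
      angle = np.deg2rad(30)
      rot = np.array([[np.cos(angle), -np.sin(angle), 0],
                      [np.sin(angle),  np.cos(angle), 0],
                      [0, 0, 1]])
      P = Q @ rot.T + np.array([5.0, -2.0, 1.0])            # rotated + translated copy
      print("RMSD after superposition:", kabsch_rmsd(P, Q)) # ~0 for a perfect pairing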

  19. Mapping RNA-seq Reads with STAR

    PubMed Central

    Dobin, Alexander; Gingeras, Thomas R.

    2015-01-01

    Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, signal visualization, and so forth. In this unit we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is Open Source software that can be run on Unix, Linux or Mac OS X systems. PMID:26334920
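
    A typical invocation follows the pattern sketched below, wrapped here in Python's subprocess for consistency with the other examples. Paths and sample names are placeholders; the options shown (--runThreadN, --genomeDir, --readFilesIn, --outFileNamePrefix, --outSAMtype) are standard STAR parameters, but the STAR manual should be consulted for the full set appropriate to a given protocol.

      # Minimal STAR alignment invocation (paths and file names are placeholders).
      import subprocess

      cmd = [
          "STAR",
          "--runThreadN", "8",                      # number of alignment threads
          "--genomeDir", "/path/to/star_index",     # pre-built genome index
          "--readFilesIn", "sample_R1.fastq", "sample_R2.fastq",
          "--outFileNamePrefix", "sample_",
          "--outSAMtype", "BAM", "SortedByCoordinate",
      ]
      subprocess.run(cmd, check=True)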

  20. Mapping RNA-seq Reads with STAR.

    PubMed

    Dobin, Alexander; Gingeras, Thomas R

    2015-09-03

    Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates, providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, and signal visualization. In this unit, we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is open source software that can be run on Unix, Linux, or Mac OS X systems. Copyright © 2015 John Wiley & Sons, Inc.

  1. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    1985-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is presented. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials are grouped into five general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) software tools; (3) models and measures; (4) technology evaluations; and (5) data collection. An index further classifies these documents by specific topic.

  2. [A modified intracellular labelling technique for high-resolution staining of neuron in 500 microm-thickness brain slice].

    PubMed

    Zhao, Ming-liang; Liu, Guo-long; Sui, Jian-feng; Ruan, Huai-zhen; Xiong, Ying

    2007-05-01

    To develop a simple but reliable intracellular labelling method for high-resolution visualization of the fine structure of single neurons in brain slices with a thickness of 500 microm. Biocytin was introduced into neurons in 500 microm-thick brain slices during blind whole-cell recording. After processing for histochemistry using the avidin-biotin-complex method, stained slices were mounted in glycerol on special glass slides. Labelled cells were digitally photomicrographed every 30 microm and reconstructed with Adobe Photoshop software. After histochemistry, limited background staining was produced. The resolution was high enough that fine structure, including branching and termination of individual axons and even the spines of neurons, could be identified in exquisite detail with an optical microscope. With the help of the software, neurons of interest could be reconstructed from a stack of photomicrographs. The modified method provides an easy and reliable approach to revealing the detailed morphological properties of single neurons in 500 microm-thick brain slices. Since it does not require special equipment, it is suitable for broad application.

  3. MediaTracker system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandoval, D. M.; Strittmatter, R. B.; Abeyta, J. D.

    2004-01-01

    The initial objectives of this effort were to provide a hardware and software platform that can address the requirements for the accountability of classified removable electronic media and vault access logging. The MediaTracker system software assists classified media custodians in managing vault access logging and media tracking to prevent the inadvertent violation of rules or policies for access to a restricted area and the movement and use of tracked items. The MediaTracker system includes the software tools to track and account for high consequence security assets and high value items. The overall benefits include: (1) real-time access to the disposition of all Classified Removable Electronic Media (CREM), (2) streamlined security procedures and requirements, (3) removal of ambiguity and managerial inconsistencies, (4) prevention of incidents that can and should be prevented, (5) alignment with the DOE's initiative to achieve improvements in security and facility operations through technology deployment, and (6) enhanced individual responsibility by providing a consistent method of dealing with daily responsibilities. In response to initiatives to enhance the control of classified removable electronic media (CREM), the MediaTracker software suite was developed, piloted and implemented at the Los Alamos National Laboratory beginning in July 2000. The MediaTracker software suite assists in the accountability and tracking of CREM and other high-value assets. One component of the MediaTracker software suite provides a Laboratory-approved media tracking system. Using commercial touch screen and bar code technology, the MediaTracker (MT) component of the MediaTracker software suite provides an efficient and effective means to meet current Laboratory requirements and provides new engineered controls to help assure compliance with those requirements. It also establishes a computer infrastructure at vault entrances for vault access logging, and can accommodate several methods of positive identification including smart cards and biometrics. Currently, we have three mechanisms that provide added security for accountability and tracking purposes. One mechanism consists of a portable, hand-held inventory scanner, which allows the custodian to physically track items that are not accessible within a particular area. The second mechanism is a radio frequency identification (RFID) system consisting of a monitoring portal, which tracks and logs in a database all activity of tagged items that pass through the portal. The third mechanism consists of electronic tagging of a flash memory device for automated inventory of CREM in storage. By modifying this USB device, the user is provided with added assurance, preventing the data from being obtained on any other computer.

  4. Software-engineering challenges of building and deploying reusable problem solvers.

    PubMed

    O'Connor, Martin J; Nyulas, Csongor; Tu, Samson; Buckeridge, David L; Okhmatovskaia, Anna; Musen, Mark A

    2009-11-01

    Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task-method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach.

  5. Software-engineering challenges of building and deploying reusable problem solvers

    PubMed Central

    O’CONNOR, MARTIN J.; NYULAS, CSONGOR; TU, SAMSON; BUCKERIDGE, DAVID L.; OKHMATOVSKAIA, ANNA; MUSEN, MARK A.

    2012-01-01

    Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task–method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach. PMID:23565031

  6. Proposal for hierarchical description of software systems

    NASA Technical Reports Server (NTRS)

    Thauboth, H.

    1973-01-01

    The programming of digital computers has developed into a new dimension full of difficulties, because the hardware of computers has become so powerful that more complex applications are entrusted to computers. The costs of software development, verification, and maintenance are outpacing those of the hardware, and the trend is toward a further increase in the sophistication of computer applications and consequently of the software. To obtain better visibility into software systems and to improve the structure of software systems for better tests, verification, and maintenance, a clear but rigorous description and documentation of software is needed. The purpose of the report is to extend the present methods in order to obtain documentation that better reflects the interplay between the various components and functions of a software system at different levels of detail without losing precision of expression. This is done by the use of block diagrams, sequence diagrams, and cross-reference charts. In the appendices, examples from an actual large software system, i.e. the Marshall System for Aerospace Systems Simulation (MARSYAS), are presented. The proposed documentation structure is compatible with automation of updating significant portions of the documentation for better software change control.

  7. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    PubMed

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop an automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
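    As a hedged illustration of the multiresolution idea mentioned above (not the published software), a two-dimensional discrete wavelet decomposition separates a phantom image into approximation and detail bands; small, bright objects such as microcalcifications tend to concentrate in the fine-scale detail coefficients, which is what makes scale-by-scale analysis useful.

      # Illustrative multiresolution decomposition with PyWavelets; not the authors' implementation.
      import numpy as np
      import pywt

      def detail_energy(image: np.ndarray, wavelet: str = "db2", levels: int = 3):
          """Return the energy in each detail level of a 2-D wavelet decomposition."""
          coeffs = pywt.wavedec2(image, wavelet, level=levels)
          # coeffs[0] is the coarse approximation; coeffs[1:] are (cH, cV, cD) detail tuples.
          return [sum(float(np.sum(band ** 2)) for band in detail) for detail in coeffs[1:]]

      phantom = np.random.rand(256, 256)   # stand-in for an ACR phantom image
      print(detail_energy(phantom))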

  8. Aviation Environmental Design Tool (AEDT): Version 2d: Installation Guide

    DOT National Transportation Integrated Search

    2017-09-01

    Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...

  9. Vienna Fortran - A Language Specification. Version 1.1

    DTIC Science & Technology

    1992-03-01

    other computer architectures is the fact that the memory is physically distributed among the processors; the time required to access a non-local...datum may be an order of magnitude higher than the time taken to access locally stored data. This has important consequences for program efficiency. In...machine in many aspects. It is tedious, time-consuming and error prone. It has led to particularly slow software development cycles and, in consequence

  10. Roll-Out and Turn-Off Display Software for Integrated Display System

    NASA Technical Reports Server (NTRS)

    Johnson, Edward J., Jr.; Hyer, Paul V.

    1999-01-01

    This report describes the software products, system architectures and operational procedures developed by Lockheed-Martin in support of the Roll-Out and Turn-Off (ROTO) sub-element of the Low Visibility Landing and Surface Operations (LVLASO) program at the NASA Langley Research Center. The ROTO portion of this program focuses on developing technologies that aid pilots in the task of managing the deceleration of an aircraft to a pre-selected exit taxiway. This report focuses on software that produces a system of redundant deceleration cues for a pilot during the landing roll-out, and presents these cues on a head up display (HUD). The software also produces symbology for aircraft operational phases involving cruise flight, approach, takeoff, and go-around. The algorithms and data sources used to compute the deceleration guidance and generate the displays are discussed. Examples of the display formats and symbology options are presented. Logic diagrams describing the design of the ROTO software module are also given.

  11. Software Metrics

    DTIC Science & Technology

    1988-12-01

    The software development scene is often characterized by schedule and cost estimates that are grossly inaccurate... Estimation models discussed include the SPQR model (Jones) and COPMO (Thebaut). T. Capers Jones has developed a software cost estimation model called the Software Productivity, Quality, and Reliability (SPQR) model; the basic approach is similar to that of Boehm's... Time (in seconds) is simply derived from E by dividing by the Stroud number, S: T = E/S. The value
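    The time relation recovered above is Halstead's: programming time T (in seconds) is the effort E divided by the Stroud number S, conventionally taken as 18 elementary mental discriminations per second. A small worked sketch, using standard Halstead definitions (the operator/operand counts below are arbitrary example values):

      # Worked sketch of the effort-to-time relation T = E / S (Halstead), as recovered above.
      import math

      def halstead_time(n1, n2, N1, N2, stroud=18):
          """n1/n2: distinct operators/operands; N1/N2: total operators/operands."""
          vocabulary = n1 + n2
          length = N1 + N2
          volume = length * math.log2(vocabulary)      # V = N * log2(n)
          difficulty = (n1 / 2) * (N2 / n2)            # D = (n1/2) * (N2/n2)
          effort = difficulty * volume                 # E = D * V
          return effort / stroud                       # T = E / S, in seconds

      print(halstead_time(n1=10, n2=15, N1=40, N2=60))   # roughly 516 seconds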

  12. Recommended approach to software development, revision 3

    NASA Technical Reports Server (NTRS)

    Landis, Linda; Waligora, Sharon; Mcgarry, Frank; Pajerski, Rose; Stark, Mike; Johnson, Kevin Orlin; Cover, Donna

    1992-01-01

    Guidelines for an organized, disciplined approach to software development that is based on studies conducted by the Software Engineering Laboratory (SEL) since 1976 are presented. It describes methods and practices for each phase of a software development life cycle that starts with requirements definition and ends with acceptance testing. For each defined life cycle phase, guidelines for the development process and its management, and for the products produced and their reviews are presented.

  13. CANCER RISKS ATTRIBUTABLE TO LOW DOSES OF IONIZING RADIATION - ASSESSING WHAT WE REALLY KNOW?

    EPA Science Inventory

    Cancer Risks Attributable to Low Doses of Ionizing Radiation - What Do We Really Know?

    Abstract
    High doses of ionizing radiation clearly produce deleterious consequences in humans including, but not exclusively, cancer induction. At very low radiation doses the situatio...

  14. Enzyme Activity Dynamics in Response to Climate Change: 2011 Drought-Heat Wave

    USDA-ARS?s Scientific Manuscript database

    Extreme weather events such as severe droughts and heat waves may have permanent consequences on soil quality and functioning in agroecosystems. The Southern High Plains (SHP) region of Texas, U.S., a large cotton producing area, experienced a historically extreme drought and heat wave during 2011,...

  15. Coloring the FITS Universe

    NASA Astrophysics Data System (ADS)

    Levay, Z. G.

    2004-12-01

    A new, freely-available accessory for Adobe's widely-used Photoshop image editing software makes it much more convenient to produce presentable images directly from FITS data. It merges a fully-functional FITS reader with an intuitive user interface and includes fully interactive flexibility in scaling data. Techniques for producing attractive images from astronomy data using the FITS plugin will be presented, including the assembly of full-color images. These techniques have been successfully applied to producing colorful images for public outreach with data from the Hubble Space Telescope and other major observatories. Now it is much less cumbersome for students or anyone not experienced with specialized astronomical analysis software, but reasonably familiar with digital photography, to produce useful and attractive images.
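    As an illustration of the same idea outside Photoshop (this is not the plugin described above), a three-band color composite can be assembled from FITS data with astropy and numpy, using an asinh stretch to bring out faint structure; the file names below are hypothetical placeholders.

      # Illustrative FITS-to-RGB composite; not the Photoshop plugin described above.
      import numpy as np
      from astropy.io import fits

      def stretch(band: np.ndarray) -> np.ndarray:
          """Asinh-stretch a band and rescale it to the 0..1 range."""
          band = np.nan_to_num(band.astype(float))
          band = np.arcsinh(band - band.min())
          return band / band.max() if band.max() > 0 else band

      # Hypothetical file names for the red, green, and blue channels.
      rgb = np.dstack([stretch(fits.getdata(f)) for f in ("red.fits", "green.fits", "blue.fits")])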

  16. SharP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venkata, Manjunath Gorentla; Aderholdt, William F

    The pre-exascale systems are expected to have a significant amount of hierarchical and heterogeneous on-node memory, and this trend of system architecture in extreme-scale systems is expected to continue into the exascale era. Along with hierarchical-heterogeneous memory, the system typically has a high-performing network and a compute accelerator. This system architecture is not only effective for running traditional High Performance Computing (HPC) applications (Big-Compute), but also for running data-intensive HPC applications and Big-Data applications. As a consequence, there is a growing desire to have a single system serve the needs of both Big-Compute and Big-Data applications. Though the system architecture supports the convergence of Big-Compute and Big-Data, the programming models and software layer have yet to evolve to support either hierarchical-heterogeneous memory systems or the convergence. SharP provides a programming abstraction to address this problem. The programming abstraction is implemented as a software library and runs on pre-exascale and exascale systems supporting current and emerging system architecture. Using distributed data-structures as a central concept, it provides (1) a simple, usable, and portable abstraction for hierarchical-heterogeneous memory and (2) a unified programming abstraction for Big-Compute and Big-Data applications.

  17. Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes.

    PubMed

    Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J; Wang, Liliang; Lin, Jianguo

    2016-12-13

    The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However these alloys, and in particular their high strength variants, exhibit limited formability at room temperature, and high temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions.

  18. cn.FARMS: a latent variable model to detect copy number variations in microarray data with a low false discovery rate.

    PubMed

    Clevert, Djork-Arné; Mitterecker, Andreas; Mayr, Andreas; Klambauer, Günter; Tuefferd, Marianne; De Bondt, An; Talloen, Willem; Göhlmann, Hinrich; Hochreiter, Sepp

    2011-07-01

    Cost-effective oligonucleotide genotyping arrays like the Affymetrix SNP 6.0 are still the predominant technique to measure DNA copy number variations (CNVs). However, CNV detection methods for microarrays overestimate both the number and the size of CNV regions and, consequently, suffer from a high false discovery rate (FDR). A high FDR means that many CNVs are wrongly detected and therefore not associated with a disease in a clinical study, though correction for multiple testing takes them into account and thereby decreases the study's discovery power. For controlling the FDR, we propose a probabilistic latent variable model, 'cn.FARMS', which is optimized by a Bayesian maximum a posteriori approach. cn.FARMS controls the FDR through the information gain of the posterior over the prior. The prior represents the null hypothesis of copy number 2 for all samples from which the posterior can only deviate by strong and consistent signals in the data. On HapMap data, cn.FARMS clearly outperformed the two most prevalent methods with respect to sensitivity and FDR. The software cn.FARMS is publicly available as a R package at http://www.bioinf.jku.at/software/cnfarms/cnfarms.html.

  19. Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes

    PubMed Central

    Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J.; Wang, Liliang; Lin, Jianguo

    2016-01-01

    The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However these alloys, and in particular their high strength variants, exhibit limited formability at room temperature, and high temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions. PMID:28060298

  20. Development of a multimedia CD-ROM on telemedicine and teleradiology

    NASA Astrophysics Data System (ADS)

    Schnur, Mark T.; Williamson, Morgan P.; Goeringer, Fred; Zimnik, Paul; Linn, Reid; Suitor, Charles T.; Rocca, Mitra A.; Strother, Thomas

    1996-04-01

    The Department of Defense Telemedicine Test Bed produced a CD-ROM including information on telemedicine, teleradiology and military medical advanced technology projects. The CD-ROM was produced using media from the Telemedicine Test Bed World Wide Web site and academic papers and presentations. Apple Media Tools software was used to produce the interactive program and the authoring was done on a high speed Apple Macintosh Power PC computer. The process took roughly 100 hours to author 50 Mb of data into 200 frames of interactive material. Future versions of the Telemedicine CD-ROM are in progress which will include much more material to take advantage of the 650 Mb available on a compact disk. This paper graphically depicts and explains the authoring process.

  1. Classroom Software for the Information Age. Technical Report No. 23.

    ERIC Educational Resources Information Center

    Sheingold, Karen; And Others

    Consumers and producers of educational software must make decisions about what kind of software to buy and create. Both groups must base these decisions on criteria that consider what it is important to learn in our technological era, what is workable, and what is currently practical and cost-effective. Five criteria that are central to these…

  2. Designing Online Software for Teaching the Concept of Variable That Facilitates Mental Interaction with the Material: Systemic Approach

    ERIC Educational Resources Information Center

    Koehler, Natalya A.; Thompson, Ann D.; Correia, Ana-Paula; Hagedorn, Linda Serra

    2015-01-01

    Our case study is a response to the need for research and reporting on specific strategies employed by software designers to produce effective multimedia instructional solutions. A systemic approach for identifying appropriate software features and conducting a formative evaluation that evaluates both the overall effectiveness of the multimedia…

  3. Automated Estimation Of Software-Development Costs

    NASA Technical Reports Server (NTRS)

    Roush, George B.; Reini, William

    1993-01-01

    COSTMODL is an automated software-development-estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product to be developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
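    For orientation, tools of this kind typically build on a COCOMO-style effort model; whether COSTMODL uses exactly these coefficients is not stated in the record, so the sketch below uses the classic basic-COCOMO "organic mode" values purely as an illustration of how effort, schedule, and staffing estimates relate.

      # Basic COCOMO sketch (illustrative coefficients; not necessarily COSTMODL's internal model).
      def basic_cocomo(kloc: float, a: float = 2.4, b: float = 1.05, c: float = 2.5, d: float = 0.38):
          effort = a * kloc ** b            # person-months
          schedule = c * effort ** d        # calendar months
          staff = effort / schedule         # average staffing level
          return effort, schedule, staff

      print(basic_cocomo(32.0))             # effort, schedule, and staffing for a 32 KLOC product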

  4. GSC configuration management plan

    NASA Technical Reports Server (NTRS)

    Withers, B. Edward

    1990-01-01

    The tools and methods used for the configuration management of the artifacts (including software and documentation) associated with the Guidance and Control Software (GCS) project are described. The GCS project is part of a software error studies research program. Three implementations of GCS are being produced in order to study the fundamental characteristics of the software failure process. The Code Management System (CMS) is used to track and retrieve versions of the documentation and software. Application of the CMS for this project is described and the numbering scheme is delineated for the versions of the project artifacts.

  5. Industry Software Trustworthiness Criterion Research Based on Business Trustworthiness

    NASA Astrophysics Data System (ADS)

    Zhang, Jin; Liu, Jun-fei; Jiao, Hai-xing; Shen, Yi; Liu, Shu-yuan

    To address the problem of industry software trustworthiness, an approach that constructs an industry software trustworthiness criterion around business concerns is proposed. Based on the triangle model of "trustworthy grade definition-trustworthy evidence model-trustworthy evaluating", the idea of business trustworthiness is reflected in each aspect of the trustworthy triangle model for a specific industry software system, the power producing management system (PPMS). Business trustworthiness is the center of the constructed industry trustworthy software criterion. By fusing international standards and industry rules, the constructed trustworthy criterion strengthens operability and reliability. A quantitative evaluation method makes the evaluation results intuitive and comparable.

  6. Development of a support software system for real-time HAL/S applications

    NASA Technical Reports Server (NTRS)

    Smith, R. S.

    1984-01-01

    Methodologies employed in defining and implementing a software support system for the HAL/S computer language for real-time operations on the Shuttle are detailed. Attention is also given to the management and validation techniques used during software development and software maintenance. Utilities developed to support the real-time operating conditions are described. With the support system being produced on Cyber computers and executable code then processed through Cyber or PDP machines, the support system has a production level status and can serve as a model for other software development projects.

  7. Bullying during adolescence in Brazil: an overview.

    PubMed

    Pigozi, Pamela Lamarca; Machado, Ana Lúcia

    2015-11-01

    Bullying has been the subject of worldwide study for over four decades and is widely reported by social media. Despite this, the issue is a relatively new area of research in Brazil. This study analyzes academic literature addressing bullying produced in Brazil focusing on aspects that characterize this issue as a subtype of violence: gender differences, factors associated with bullying, consequences, and possible intervention and prevention approaches. The guiding question of this study was: what have Brazilian researchers produced regarding bullying among adolescents? The results show that over half of the studies used quantitative approaches, principally cross-sectional methods and questionnaires, and focused on determining the prevalence of and factors associated with bullying. The findings showed a high prevalence of bullying among Brazilian adolescents, an association between risk behavior and bullying, serious consequences for the mental health of young people, lack of awareness and understanding among adolescents about bullying and its consequences, and a lack of strategies to manage this type of aggression. There is a need for intervention studies, prevention and restorative practices that involve the community and can be applied to everyday life at school.

  8. Quantum machine learning.

    PubMed

    Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth

    2017-09-13

    Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

  9. Quantum machine learning

    NASA Astrophysics Data System (ADS)

    Biamonte, Jacob; Wittek, Peter; Pancotti, Nicola; Rebentrost, Patrick; Wiebe, Nathan; Lloyd, Seth

    2017-09-01

    Fuelled by increasing computer power and algorithmic advances, machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is reasonable to postulate that quantum computers may outperform classical computers on machine learning tasks. The field of quantum machine learning explores how to devise and implement quantum software that could enable machine learning that is faster than that of classical computers. Recent work has produced quantum algorithms that could act as the building blocks of machine learning programs, but the hardware and software challenges are still considerable.

  10. Soil erosion predictions from a landscape evolution model - An assessment of a post-mining landform using spatial climate change analogues.

    PubMed

    Hancock, G R; Verdon-Kidd, D; Lowry, J B C

    2017-12-01

    Landscape Evolution Modelling (LEM) technologies provide a means by which it is possible to simulate the long-term geomorphic stability of a conceptual rehabilitated landform. However, simulations rarely consider the potential effects of anthropogenic climate change and consequently risk not accounting for the range of rainfall variability that might be expected in both the near and far future. One issue is that high resolution (both spatial and temporal) rainfall projections incorporating the potential effects of greenhouse forcing are required as input. However, projections of rainfall change are still highly uncertain for many regions, particularly at sub-annual/seasonal scales. This is the case for northern Australia, where a decrease or an increase in rainfall post 2030 is considered equally likely based on climate model simulations. The aim of this study is therefore to investigate a spatial analogue approach to develop point-scale hourly rainfall scenarios to be used as input to the CAESAR-Lisflood LEM to test the sensitivity of the geomorphic stability of a conceptual rehabilitated landform to potential changes in climate. Importantly, the scenarios incorporate the range of projected potential increase/decrease in rainfall for northern Australia and capture the expected envelope of erosion rates and erosion patterns (i.e. where erosion and deposition occurs) over a 100-year modelled period. We show that all rainfall scenarios produce sediment output and gullying greater than that of the surrounding natural system; however, a 'wetter' future climate produces the highest output. Importantly, incorporating analogue rainfall scenarios into LEM has the capacity to both improve landform design and enhance the modelling software. Further, the method can be easily transferred to other sites (both nationally and internationally) where rainfall variability is significant and climate change impacts are uncertain.

  11. Federal COBOL Compiler Testing Service Compiler Validation Request Information.

    DTIC Science & Technology

    1977-05-09

    background of the Federal COBOL Compiler Testing Service, which was set up by a memorandum of agreement between the National Bureau of Standards and the...Federal Standard, and the requirement of COBOL compiler validation in the procurement process. It also contains a list of all software products...produced by the Software Development Division in support of the FCCTS as well as the Validation Summary Reports produced as a result of discharging the

  12. Precise control and animation creation over the DMD for projection-based applications

    NASA Astrophysics Data System (ADS)

    Koudsi, Badia

    2014-03-01

    Digital micromirror devices (DMDs) are used in a variety of display and projection applications to produce high resolution images, both static and animated. A common obstacle to working with DMDs in research and development applications is the steep learning curve required to obtain proficiency in programming the boards that control the behavior of the DMDs. This can discourage developers who wish to use DMDs in new or novel research and development applications which might benefit from their light-control properties. A new software package called Light Animator has been developed that provides a user friendly and more intuitive interface for controlling the DMD. The software allows users to address the micromirror array by the drawing and animation of objects in a style similar to that of commercial drawing programs. Sequences and animation are controlled by dividing the sequence into frames which the user can draw individually or the software can fill in for the user. Examples and descriptions of the software operation are described and operational performance measures are provided. Potential applications include 3D volumetric displays, a 3D scanner when combining the DMD with a CCD camera, and most any 2D application for which DMDs are currently used. The software's capabilities allow scientists to develop applications more easily and effectively.

  13. AKM in Open Source Communities

    NASA Astrophysics Data System (ADS)

    Stamelos, Ioannis; Kakarontzas, George

    Previous chapters in this book have dealt with Architecture Knowledge Management in traditional Closed Source Software (CSS) projects. This chapter will attempt to examine the ways that knowledge is shared among participants in Free Libre Open Source Software (FLOSS) projects and how architectural knowledge is managed w.r.t. CSS. FLOSS projects are organized and developed in a fundamentally different way than CSS projects. FLOSS projects simply do not develop code as CSS projects do. As a consequence, their knowledge management mechanisms are also based on different concepts and tools.

  14. NASGRO(registered trademark): Fracture Mechanics and Fatigue Crack Growth Analysis Software

    NASA Technical Reports Server (NTRS)

    Forman, Royce; Shivakumar, V.; Mettu, Sambi; Beek, Joachim; Williams, Leonard; Yeh, Feng; McClung, Craig; Cardinal, Joe

    2004-01-01

    This viewgraph presentation describes NASGRO, which is a fracture mechanics and fatigue crack growth analysis software package that is used to reduce risk of fracture in Space Shuttles. The contents include: 1) Consequences of Fracture; 2) NASA Fracture Control Requirements; 3) NASGRO Reduces Risk; 4) NASGRO Use Inside NASA; 5) NASGRO Components: Crack Growth Module; 6) NASGRO Components:Material Property Module; 7) Typical NASGRO analysis: Crack growth or component life calculation; and 8) NASGRO Sample Application: Orbiter feedline flowliner crack analysis.

  15. In Law We Trust? Trusted Computing and Legal Responsibility for Internet Security

    NASA Astrophysics Data System (ADS)

    Danidou, Yianna; Schafer, Burkhard

    This paper analyses potential legal responses and consequences to the anticipated roll out of Trusted Computing (TC). It is argued that TC constitutes such a dramatic shift in power away from users to the software providers, that it is necessary for the legal system to respond. A possible response is to mirror the shift in power by a shift in legal responsibility, creating new legal liabilities and duties for software companies as the new guardians of internet security.

  16. Importance of Requirements Analysis & Traceability to Improve Software Quality and Reduce Cost and Risk

    NASA Technical Reports Server (NTRS)

    Kapoor, Manju M.; Mehta, Manju

    2010-01-01

    The goal of this paper is to emphasize the importance of developing complete and unambiguous requirements early in the project cycle (prior to Preliminary Design Phase). Having a complete set of requirements early in the project cycle allows sufficient time to generate a traceability matrix. Requirements traceability and analysis are the key elements in improving verification and validation process, and thus overall software quality. Traceability can be most beneficial when the system changes. If changes are made to high-level requirements it implies that low-level requirements need to be modified. Traceability ensures that requirements are appropriately and efficiently verified at various levels whereas analysis ensures that a rightly interpreted set of requirements is produced.
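    A traceability matrix of the kind described can be sketched as a simple mapping from high-level to low-level requirements, which makes untraced or unverified requirements easy to flag when the system changes (the requirement IDs below are hypothetical examples, not from the paper).

      # Minimal traceability check; requirement IDs are hypothetical examples.
      high_to_low = {
          "HLR-1": ["LLR-1.1", "LLR-1.2"],
          "HLR-2": ["LLR-2.1"],
          "HLR-3": [],                      # no derived low-level requirements yet
      }
      verified = {"LLR-1.1", "LLR-2.1"}

      untraced = [hlr for hlr, llrs in high_to_low.items() if not llrs]
      unverified = [llr for llrs in high_to_low.values() for llr in llrs if llr not in verified]
      print("untraced:", untraced, "unverified:", unverified)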

  17. Evaluation of vocal acoustic and efficiency analysis parameters in medical students and academic teachers with use of iris and diagnoscope specialist software.

    PubMed

    Zielińska-Bliźniewska, Hanna; Sułkowski, Wiesław J; Pietkiewicz, Piotr; Miłoński, Jarosław; Mazurek, Agnieszka; Olszewski, Jurek

    2012-06-01

    The aim of this study was to compare the parameters of vocal acoustic and vocal efficiency analyses in medical students and academic teachers with use of the IRIS and DiagnoScope Specialist software and to evaluate their usefulness in prevention and certification of occupational disease. The study group comprised 40 women, including students and employees of the Military Medical Faculty, Medical University of Łodź. After informed consent had been obtained from the participant women, the primary medical history was taken, videolaryngoscopic and stroboscopic examinations were performed and diagnostic vocal acoustic analysis was carried out with the use of the IRIS and DiagnoScope Specialist software. Based on the results of the performed measurements, the statistical analysis evidenced the compatibility between the two software programs, IRIS and DiagnoScope Specialist, with the only exception of the F4 formant. The mean values of vocal acoustic parameters in medical students and academic teachers, obtained by means of the IRIS software, can be used as standards for the female population, which have not yet been developed by the producer. When using the DiagnoScope Specialist software, some mean values were higher and some lower than the standards specified by the producer. The study evidenced the compatibility between the two measurement software programs, IRIS and DiagnoScope Specialist, except for the F4 formant. It should be noted that the latter has an advantage over the former, since the standard values of vocal acoustic parameters have been worked out by the producer. Moreover, they only slightly departed from the values obtained in our study and may be useful in diagnostics of occupational voice disorders.

  18. Analyzing Software Requirements Errors in Safety-Critical, Embedded Systems

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.

    1993-01-01

    This paper analyzes the root causes of safety-related software errors in safety-critical, embedded systems. The results show that software errors identified as potentially hazardous to the system tend to be produced by different error mechanisms than non- safety-related software errors. Safety-related software errors are shown to arise most commonly from (1) discrepancies between the documented requirements specifications and the requirements needed for correct functioning of the system and (2) misunderstandings of the software's interface with the rest of the system. The paper uses these results to identify methods by which requirements errors can be prevented. The goal is to reduce safety-related software errors and to enhance the safety of complex, embedded systems.

  19. Aviation Environmental Design Tool (AEDT): Version 2b: Installation Guide : [December 2015

    DOT National Transportation Integrated Search

    2015-12-01

    Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...

  20. Aviation Environmental Design Tool (AEDT): Version 2b: Installation Guide : [June 2016

    DOT National Transportation Integrated Search

    2016-06-01

    Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...

  1. Aviation Environmental Design Tool (AEDT): Version 2b: Installation Guide : [July 2015

    DOT National Transportation Integrated Search

    2015-07-01

    Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...

  2. Interactive multimedia preventive alcohol education: a technology application in higher education.

    PubMed

    Reis, J; Riley, W; Lokman, L; Baer, J

    2000-01-01

    This article summarizes the process of implementation and short-term impact on knowledge and attitudes of an interactive multimedia software program on preventive alcohol education for young adults. The three factors related to behavioral change addressed in the software are self-efficacy in maintaining personal control and safety while using alcohol, attitudes and related expectations regarding the physiological and behavioral consequences of alcohol consumption, and peer norms regarding alcohol consumption. Compared with an alternative alcohol education group and a no-alcohol-education group, students using the interactive computer lesson reported learning more about dose-response and ways to intervene with friends in peril. The article concludes with consideration of the import of this technology for informing students about the consequences of alcohol use, and the utility to higher education institutions of using this technology in an era when pressures increase for due diligence around student safety but with few additional institutional resources.

  3. Improving the Automated Detection and Analysis of Secure Coding Violations

    DTIC Science & Technology

    2014-06-01

    eliminating software vulnerabilities and other flaws. The CERT Division produces books and courses that foster a security mindset in developers, and...website also provides a virtual machine containing a complete build of the Rosecheckers project on Linux. The Rosecheckers project leverages the...Compass/ROSE project developed at Lawrence Livermore National Laboratory. This project provides a high-level API for accessing the abstract syntax tree

  4. Extreme C2 and Multi-Touch, Multi-User Collaborative User Interfaces

    DTIC Science & Technology

    2008-06-01

    Organization: Office of the Chief Engineer, Space and Naval Warfare Systems Center Charleston. Address: PO Box 190022, N. Charleston, SC 29419 843...collaborative development technique can increase the adaptability and quality of software, something of high value in the complex domain of enterprise...concept to C2 should be able to produce similar benefits for planning in military operations, particularly complex, multi-faceted operations. This

  5. A Disk-Based System for Producing and Distributing Science Products from MODIS

    NASA Technical Reports Server (NTRS)

    Masuoka, Edward; Wolfe, Robert; Sinno, Scott; Ye Gang; Teague, Michael

    2007-01-01

    Since beginning operations in 1999, the MODIS Adaptive Processing System (MODAPS) has evolved to take advantage of trends in information technology, such as the falling cost of computing cycles and disk storage and the availability of high quality open-source software (Linux, Apache and Perl), to achieve substantial gains in processing and distribution capacity and throughput while driving down the cost of system operations.

  6. Software For Clear-Air Doppler-Radar Display

    NASA Technical Reports Server (NTRS)

    Johnston, Bruce W.

    1990-01-01

    System of software developed to present plan-position-indicator scans of clear-air Doppler radar station on color graphical cathode-ray-tube display. Designed to incorporate latest accepted standards for equipment, computer programs, and meteorological data bases. Includes use of Ada programming language, of "Graphical-Kernel-System-like" graphics interface, and of Common Doppler Radar Exchange Format. Features include portability and maintainability. Use of Ada software packages produced number of software modules reused on other related projects.

  7. PolyPhred analysis software for mutation detection from fluorescence-based sequence data.

    PubMed

    Montgomery, Kate T; Iartchouck, Oleg; Li, Li; Loomis, Stephanie; Obourn, Vanessa; Kucherlapati, Raju

    2008-10-01

    The ability to search for genetic variants that may be related to human disease is one of the most exciting consequences of the availability of the sequence of the human genome. Large cohorts of individuals exhibiting certain phenotypes can be studied and candidate genes resequenced. However, the challenge of analyzing sequence data from many individuals with accuracy, speed, and economy is great. This unit describes one set of software tools: Phred, Phrap, PolyPhred, and Consed. Coverage includes the advantages and disadvantages of these analysis tools, details for obtaining and using the software, and the results one may expect. The software is being continually updated to permit further automation of mutation analysis. Currently, however, at least some manual review is required if one wishes to identify 100% of the variants in a sample set.

  8. Effect of nozzle orifice geometry on spray, combustion, and emission characteristics under diesel engine conditions.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Som, S.; Longman, D. E; Ramirez, A. I.

    2011-03-01

    Diesel engine performance and emissions are strongly coupled with fuel atomization and spray processes, which in turn are strongly influenced by injector flow dynamics. Modern engines employ micro-orifices with different orifice designs. It is critical to characterize the effects of various designs on engine performance and emissions. In this study, a recently developed primary breakup model (KH-ACT), which accounts for the effects of cavitation and turbulence generated inside the injector nozzle is incorporated into a CFD software CONVERGE for comprehensive engine simulations. The effects of orifice geometry on inner nozzle flow, spray, and combustion processes are examined by coupling the injector flow and spray simulations. Results indicate that conicity and hydrogrinding reduce cavitation and turbulence inside the nozzle orifice, which slows down primary breakup, increasing spray penetration, and reducing dispersion. Consequently, with conical and hydroground nozzles, the vaporization rate and fuel air mixing are reduced, and ignition occurs further downstream. The flame lift-off lengths are the highest and lowest for the hydroground and conical nozzles, respectively. This can be related to the rate of fuel injection, which is higher for the hydroground nozzle, leading to richer mixtures and lower flame base speeds. A modified flame index is employed to resolve the flame structure, which indicates a dual combustion mode. For the conical nozzle, the relative role of rich premixed combustion is enhanced and that of diffusion combustion reduced compared to the other two nozzles. In contrast, for the hydroground nozzle, the role of rich premixed combustion is reduced and that of non-premixed combustion is enhanced. Consequently, the amount of soot produced is the highest for the conical nozzle, while the amount of NOx produced is the highest for the hydroground nozzle, indicating the classical tradeoff between them.

  9. TOOTHPASTEV6.11.3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sankel, David J.; Clair, Aaron B. St.; Langsfield, Joshua D.

    2006-11-01

    Toothpaste is a graphical user interface and Computer Aided Drafting/Manufacturing (CAD/CAM) software package used to plan tool paths for Galil Motion Control hardware. The software is a tool for computer controlled dispensing of materials. The software may be used for solid freeform fabrication of components or the precision printing of inks. Mathematical calculations are used to produce a set of segments and arcs that when coupled together will fill space. The paths of the segments and arcs are then translated into a machine language that controls the motion of motors and translational stages to produce tool paths in three dimensions. As motion begins, material(s) are dispensed or printed along the three-dimensional pathway.
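    A hedged sketch of the space-filling idea (purely illustrative; not the Toothpaste package): the simplest version of segment-based fill planning covers a rectangular region with a zigzag raster of straight segments at a fixed bead spacing, which a motion controller can then execute as successive moves.

      # Illustrative zigzag fill-path generator; not the Toothpaste CAD/CAM package.
      def zigzag_fill(width: float, height: float, spacing: float):
          """Return ((x0, y0), (x1, y1)) segments covering a width x height region."""
          segments, y, left_to_right = [], 0.0, True
          while y <= height:
              x0, x1 = (0.0, width) if left_to_right else (width, 0.0)
              segments.append(((x0, y), (x1, y)))
              left_to_right = not left_to_right
              y += spacing
          return segments

      for seg in zigzag_fill(10.0, 5.0, 1.0):
          print(seg)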

  10. Parametric modelling design applied to weft knitted surfaces and its effects in their physical properties

    NASA Astrophysics Data System (ADS)

    Oliveira, N. P.; Maciel, L.; Catarino, A. P.; Rocha, A. M.

    2017-10-01

    This work proposes the creation of models of surfaces using a parametric computer modelling software to obtain three-dimensional structures in weft knitted fabrics produced on single needle system machines. Digital prototyping, another feature of digital modelling software, was also explored in three-dimensional drawings generated using the Rhinoceros software. With this approach, different 3D structures were developed and produced. Physical characterization tests were then performed on the resulting 3D weft knitted structures to assess their ability to promote comfort. From the obtained results, it is apparent that the developed structures have potential for application in different market segments, such as clothing and interior textiles.

  11. MUST - An integrated system of support tools for research flight software engineering. [Multipurpose User-oriented Software Technology

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.; Foudriat, E. C.; Will, R. W.

    1977-01-01

    The objectives of NASA's MUST (Multipurpose User-oriented Software Technology) program at Langley Research Center are to cut the cost of producing software which effectively utilizes digital systems for flight research. These objectives will be accomplished by providing an integrated system of support software tools for use throughout the research flight software development process. A description of the overall MUST program and its progress toward the release of a first MUST system will be presented. This release includes: a special interactive user interface, a library of subroutines, assemblers, a compiler, automatic documentation tools, and a test and simulation system.

  12. FTOOLS: A general package of software to manipulate FITS files

    NASA Astrophysics Data System (ADS)

    Blackburn, J. K.; Shaw, R. A.; Payne, H. E.; Hayes, J. J. E.; Heasarc

    1999-12-01

    FTOOLS, a highly modular collection of utilities for processing and analyzing data in the FITS (Flexible Image Transport System) format, has been developed in support of the HEASARC (High Energy Astrophysics Research Archive Center) at NASA's Goddard Space Flight Center. The FTOOLS package contains many utility programs which perform modular tasks on any FITS image or table, as well as higher-level analysis programs designed specifically for data from current and past high energy astrophysics missions. The utility programs for FITS tables are especially rich and powerful, and provide functions for presentation of file contents, extraction of specific rows or columns, appending or merging tables, binning values in a column or selecting subsets of rows based on a boolean expression. Individual FTOOLS programs can easily be chained together in scripts to achieve more complex operations such as the generation and displaying of spectra or light curves. FTOOLS development began in 1991 and has produced the main set of data analysis software for the current ASCA and RXTE space missions and for other archival sets of X-ray and gamma-ray data. The FTOOLS software package is supported on most UNIX platforms and on Windows machines. The user interface is controlled by standard parameter files that are very similar to those used by IRAF. The package is self-documenting through a stand-alone help task called fhelp. Software is written in ANSI C and FORTRAN to provide portability across most computer systems. The data format dependencies between hardware platforms are isolated through the FITSIO library package.
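    The boolean row-selection operation described above can be illustrated in Python with astropy (shown here only as an analogue; FTOOLS itself provides its own command-line tasks for these steps, and the file and column names below are hypothetical).

      # Analogous FITS-table row selection with astropy; FTOOLS performs this with its own tasks.
      from astropy.io import fits

      with fits.open("events.fits") as hdul:       # hypothetical event file
          table = hdul[1].data                     # first binary-table extension
          selected = table[table["ENERGY"] > 2.0]  # boolean row selection, as in FTOOLS filtering
          print(len(selected), "rows selected")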

  13. Phenological Parameters Estimation Tool

    NASA Technical Reports Server (NTRS)

    McKellip, Rodney D.; Ross, Kenton W.; Spruce, Joseph P.; Smoot, James C.; Ryan, Robert E.; Gasser, Gerald E.; Prados, Donald L.; Vaughan, Ronald D.

    2010-01-01

    The Phenological Parameters Estimation Tool (PPET) is a set of algorithms implemented in MATLAB that estimates key vegetative phenological parameters. For a given year, the PPET software package takes in temporally processed vegetation index data (3D spatio-temporal arrays) generated by the time series product tool (TSPT) and outputs spatial grids (2D arrays) of vegetation phenological parameters. As a precursor to PPET, the TSPT uses quality information for each pixel of each date to remove bad or suspect data, and then interpolates and digitally fills data voids in the time series to produce a continuous, smoothed vegetation index product. During processing, the TSPT displays NDVI (Normalized Difference Vegetation Index) time series plots and images from the temporally processed pixels. Both the TSPT and PPET currently use moderate resolution imaging spectroradiometer (MODIS) satellite multispectral data as a default, but each software package is modifiable and could be used with any high-temporal-rate remote sensing data collection system that is capable of producing vegetation indices. Raw MODIS data from the Aqua and Terra satellites is processed using the TSPT to generate a filtered time series data product. The PPET then uses the TSPT output to generate phenological parameters for desired locations. PPET output data tiles are mosaicked into a Conterminous United States (CONUS) data layer using ERDAS IMAGINE, or equivalent software package. Mosaics of the vegetation phenology data products are then reprojected to the desired map projection using ERDAS IMAGINE.
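    A minimal sketch of the gap-filling and smoothing step described for the TSPT (illustrative only; the actual product uses per-pixel quality flags and its own interpolation scheme in MATLAB, whereas this sketch is in Python):

      # Illustrative NDVI gap-filling and smoothing for one pixel; not the TSPT implementation.
      import numpy as np

      def fill_and_smooth(ndvi: np.ndarray, window: int = 3) -> np.ndarray:
          """Linearly interpolate NaN gaps in a 1-D NDVI series, then apply a moving average."""
          t = np.arange(ndvi.size)
          good = ~np.isnan(ndvi)
          filled = np.interp(t, t[good], ndvi[good])
          kernel = np.ones(window) / window
          return np.convolve(filled, kernel, mode="same")

      series = np.array([0.2, np.nan, 0.35, 0.5, np.nan, 0.6, 0.55])
      print(fill_and_smooth(series))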

  14. Strategic directions of computing at Fermilab

    NASA Astrophysics Data System (ADS)

    Wolbers, Stephen

    1998-05-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  15. Implementing Natural Resources Cadastral Plan in Pasargadae District of Iran by Using Quick Bird Images

    NASA Astrophysics Data System (ADS)

    Azhdari, G. H.; Deilami, K.; Firooznia, E.

    2015-12-01

    Natural resources are essential for the security and sustainable development of each country. Therefore, in order to achieve sustainable development, conservation, and optimum utilization of natural resources, executing a natural resources cadastral plan is necessary and essential. Land management in Iran is conducted by the government, so a comprehensive plan with a well-arranged program is needed for proper evaluation. In this research, the Pasargadae district was selected as a pilot. The Pasargadae region is located north-east of Shiraz in Fars province, at approximately 30° 15' 53" N and 53° 13' 29" E. To generate the cadastral maps, images from the QuickBird satellite with 50-60 centimeter resolution were first georeferenced using ground control points with accurate GPS coordinates. In addition to the satellite images, old paper maps at 1:10000 scale in a local coordinate system, produced by the agriculture ministry in 1963, were digitized with AutoCad software against 1:25000 scale maps from the army geographical organization. Paper maps at 1:50000 scale and Google Earth were also used to identify changes over time. All of the above maps were added to the QuickBird images as new layers using ArcMap software, and these maps were also used to determine the different land uses. Thus, using ArcMap, lands were divided into two groups: lands with official documents, owned by either natural or legal persons, and national lands under different uses such as forestry, range management, and desertification plans. Consequently, the generated cadastral maps allow a clearer distinction between private and national lands, and producing cadastral maps helps prevent the destruction and illegal possession of natural lands by individuals.

  16. SEER*Stat Software

    Cancer.gov

    If you have access to SEER Research Data, use SEER*Stat to analyze SEER and other cancer-related databases. View individual records and produce statistics including incidence, mortality, survival, prevalence, and multiple primary. Tutorials and related analytic software tools are available.

  17. Test Driven Development of Scientific Models

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.

    2014-01-01

    Test-Driven Development (TDD), a software development process that promises many advantages for developer productivity and software reliability, has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addicting after only a few days of exposure, and find it unthinkable to return to previous practices. After a brief overview of the TDD process and my experience in applying the methodology for development activities at Goddard, I will delve more deeply into some of the challenges that are posed by numerical and scientific software as well as tools and implementation approaches that should address those challenges.
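    The test-first rhythm described above can be made concrete with a minimal pytest example (the function and test below are hypothetical and generic, not a NASA-specific workflow): the test is written first and fails, then just enough code is added to make it pass before the cycle repeats.

      # Minimal TDD illustration with pytest; function and test names are hypothetical.
      # Step 1: write the failing test.
      def test_saturation_vapor_pressure_at_freezing():
          assert abs(saturation_vapor_pressure(0.0) - 6.11) < 0.1   # hPa, rough check

      # Step 2: write just enough code to make the test pass (Magnus-type approximation).
      import math

      def saturation_vapor_pressure(temp_c: float) -> float:
          return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))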

  18. Ten simple rules for making research software more robust

    PubMed Central

    2017-01-01

    Software produced for research, published and otherwise, suffers from a number of common problems that make it difficult or impossible to run outside the original institution or even off the primary developer’s computer. We present ten simple rules to make such software robust enough to be run by anyone, anywhere, and thereby delight your users and collaborators. PMID:28407023

  19. Theoretical Foundations of Software Technology.

    DTIC Science & Technology

    1983-02-14

    major research interests are software testing, artificial intelligence, pattern recognition, and computer graphics. Dr. Chandranekaran is currently...produce PASCAL language code for the problems. Because of its relationship to many issues in Artificial Intelligence, we also investigated problems of...analysis to concurrent-process software...are not "intelligent" enough to discover these by themselves...more complex control flow models. The PAF

  20. FastaValidator: an open-source Java library to parse and validate FASTA formatted sequences.

    PubMed

    Waldmann, Jost; Gerken, Jan; Hankeln, Wolfgang; Schweer, Timmy; Glöckner, Frank Oliver

    2014-06-14

    Advances in sequencing technologies challenge the efficient importing and validation of FASTA formatted sequence data which is still a prerequisite for most bioinformatic tools and pipelines. Comparative analysis of commonly used Bio*-frameworks (BioPerl, BioJava and Biopython) shows that their scalability and accuracy is hampered. FastaValidator represents a platform-independent, standardized, light-weight software library written in the Java programming language. It targets computer scientists and bioinformaticians writing software which needs to parse quickly and accurately large amounts of sequence data. For end-users FastaValidator includes an interactive out-of-the-box validation of FASTA formatted files, as well as a non-interactive mode designed for high-throughput validation in software pipelines. The accuracy and performance of the FastaValidator library qualifies it for large data sets such as those commonly produced by massive parallel (NGS) technologies. It offers scientists a fast, accurate and standardized method for parsing and validating FASTA formatted sequence data.
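    For orientation only (this is not the FastaValidator API, which the project documents itself), the core well-formedness check the library performs can be sketched in a few lines of Python: every record must start with a header line, and sequence lines must contain only characters from the expected alphabet.

      # Minimal FASTA well-formedness check; illustrative only, not the FastaValidator library.
      def validate_fasta(path: str, alphabet: str = "ACGTUN-") -> bool:
          saw_header, saw_sequence = False, False
          with open(path) as handle:
              for line in handle:
                  line = line.strip()
                  if not line:
                      continue
                  if line.startswith(">"):
                      saw_header, saw_sequence = True, False
                  elif saw_header and all(c.upper() in alphabet for c in line):
                      saw_sequence = True
                  else:
                      return False      # sequence before a header, or illegal characters
          return saw_header and saw_sequence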

  1. Implementation of metal-friendly EAM/FS-type semi-empirical potentials in HOOMD-blue: A GPU-accelerated molecular dynamics software

    NASA Astrophysics Data System (ADS)

    Yang, Lin; Zhang, Feng; Wang, Cai-Zhuang; Ho, Kai-Ming; Travesset, Alex

    2018-04-01

    We present an implementation of EAM and FS interatomic potentials, which are widely used in simulating metallic systems, in HOOMD-blue, a software designed to perform classical molecular dynamics simulations using GPU acceleration. We first discuss the details of our implementation and then report extensive benchmark tests. We demonstrate that single-precision floating point operations efficiently implemented on GPUs can produce sufficient accuracy when compared against double-precision codes, as shown in test simulations of the glass-transition temperature of Cu64.5Zr35.5 and the pair correlation function g(r) of liquid Ni3Al. Our code scales well with the size of the simulating system on NVIDIA Tesla M40 and P100 GPUs. Compared with another popular software LAMMPS running on 32 cores of AMD Opteron 6220 processors, the GPU/CPU performance ratio can reach as high as 4.6. The source code can be accessed through the HOOMD-blue web page for free by any interested user.
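    One of the validation quantities mentioned above, the pair correlation function g(r), can be computed independently of any MD engine; the brute-force sketch below, for a cubic periodic box, is purely illustrative and is not the benchmark code used in the paper.

      # Brute-force g(r) for a cubic periodic box; illustrative, not the paper's benchmark code.
      import numpy as np

      def pair_correlation(positions: np.ndarray, box: float, dr: float = 0.1):
          n = len(positions)
          r_max = box / 2.0
          bins = np.arange(0.0, r_max + dr, dr)
          hist = np.zeros(len(bins) - 1)
          for i in range(n - 1):
              d = positions[i + 1:] - positions[i]
              d -= box * np.round(d / box)              # minimum-image convention
              r = np.linalg.norm(d, axis=1)
              hist += np.histogram(r[r < r_max], bins=bins)[0]
          density = n / box ** 3
          shell_vol = 4.0 / 3.0 * np.pi * (bins[1:] ** 3 - bins[:-1] ** 3)
          ideal = density * shell_vol * n / 2.0         # expected pair counts for an ideal gas
          return 0.5 * (bins[1:] + bins[:-1]), hist / ideal

      rng = np.random.default_rng(0)
      r_centers, g_r = pair_correlation(rng.random((200, 3)) * 10.0, box=10.0)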

  2. An improved classification tree analysis of high cost modules based upon an axiomatic definition of complexity

    NASA Technical Reports Server (NTRS)

    Tian, Jianhui; Porter, Adam; Zelkowitz, Marvin V.

    1992-01-01

    Identification of high cost modules has been viewed as one mechanism to improve overall system reliability, since such modules tend to produce more than their share of problems. A decision tree model was used to identify such modules. In this current paper, a previously developed axiomatic model of program complexity is merged with the previously developed decision tree process for an improvement in the ability to identify such modules. This improvement was tested using data from the NASA Software Engineering Laboratory.
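    The decision-tree step can be illustrated with scikit-learn on synthetic module metrics (this is an analogue of the general approach only, not the SEL study's actual model, data, or complexity measure).

      # Illustrative decision tree over module metrics; data and cost threshold are synthetic.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      # Columns: lines of code, cyclomatic complexity, number of changes (synthetic data).
      X = rng.integers(1, 500, size=(200, 3)).astype(float)
      y = (X[:, 0] * 0.5 + X[:, 1] * 2.0 + X[:, 2] > 400).astype(int)   # 1 = "high cost"

      clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
      print(clf.predict([[300.0, 40.0, 10.0]]))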

  3. Software For Drawing Design Details Concurrently

    NASA Technical Reports Server (NTRS)

    Crosby, Dewey C., III

    1990-01-01

    Software system containing five computer-aided-design programs enables more than one designer to work on same part or assembly at same time. Reduces time necessary to produce design by implementing concept of parallel or concurrent detailing, in which all detail drawings documenting three-dimensional model of part or assembly produced simultaneously, rather than sequentially. Keeps various detail drawings consistent with each other and with overall design by distributing changes in each detail to all other affected details.

  4. Mapping of H.264 decoding on a multiprocessor architecture

    NASA Astrophysics Data System (ADS)

    van der Tol, Erik B.; Jaspers, Egbert G.; Gelderblom, Rob H.

    2003-05-01

    Due to the increasing significance of development costs in the competitive domain of high-volume consumer electronics, generic solutions are required to enable reuse of the design effort and to increase the potential market volume. As a result, Systems-on-Chip (SoCs) contain a growing number of fully programmable media processing devices, as opposed to application-specific systems, which offered the most attractive solutions due to a high performance density. The following motivates this trend. First, SoCs are increasingly dominated by their communication infrastructure and embedded memory, thereby making the cost of the functional units less significant. Moreover, the continuously growing design costs require generic solutions that can be applied over a broad product range. Hence, powerful programmable SoCs are becoming increasingly attractive. However, to enable power-efficient designs that also scale with advancing VLSI technology, parallelism should be fully exploited. Both task-level and instruction-level parallelism can be provided by means of, e.g., a VLIW multiprocessor architecture. To provide the above-mentioned scalability, we propose to partition the data over the processors, instead of traditional functional partitioning. An advantage of this approach is the inherent locality of data, which is extremely important for communication-efficient software implementations. Consequently, a software implementation is discussed, enabling e.g. SD-resolution H.264 decoding with a two-processor architecture, whereas High-Definition (HD) decoding can be achieved with an eight-processor system executing the same software. Experimental results show that data communication is reduced by up to 65%, directly improving the overall performance. Apart from a considerable improvement in memory bandwidth, this novel concept of partitioning offers a natural approach for optimally balancing the load of all processors, thereby further improving the overall speedup.
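
    The data-partitioning idea can be illustrated with a toy Python sketch in which each worker processes only its own horizontal slice of a frame, so the data it touches stays local; the per-slice "decoding" below is a placeholder operation, not an H.264 decoder.

```python
# Toy illustration of data partitioning across processors (here, OS processes).
from multiprocessing import Pool
import numpy as np

def process_slice(args):
    slice_index, rows = args
    # Stand-in for per-slice decoding work performed on locally held data.
    return slice_index, rows * 2

def decode_frame(frame, num_workers=2):
    slices = np.array_split(frame, num_workers, axis=0)   # partition the data, not the functions
    with Pool(num_workers) as pool:
        results = pool.map(process_slice, list(enumerate(slices)))
    results.sort(key=lambda item: item[0])                 # restore slice order
    return np.vstack([rows for _, rows in results])

if __name__ == "__main__":
    frame = np.arange(720 * 1280, dtype=np.int32).reshape(720, 1280)
    out = decode_frame(frame, num_workers=4)
    print(out.shape)
```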

  5. Aviation Environmental Design Tool (AEDT): Version 2c Service Pack 2: Installation Guide

    DOT National Transportation Integrated Search

    2017-03-01

    Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...

  6. ATLAS particle detector CSC ROD software design and implementation, and, Addition of K physics to chi-squared analysis of FDQM

    NASA Astrophysics Data System (ADS)

    Hawkins, Donovan Lee

    In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS360C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.

  7. Using Docker Compose for the Simple Deployment of an Integrated Drug Target Screening Platform.

    PubMed

    List, Markus

    2017-06-10

    Docker virtualization allows software tools to be executed in an isolated and controlled environment referred to as a container. In Docker containers, dependencies are provided exactly as intended by the developer and, consequently, they simplify the distribution of scientific software and foster reproducible research. The Docker paradigm is that each container encapsulates one particular software tool. However, to analyze complex biomedical data sets, it is often necessary to combine several software tools into elaborate workflows. To address this challenge, several Docker containers need to be instantiated and properly integrated, which complicates the software deployment process unnecessarily. Here, we demonstrate how an extension to Docker, Docker Compose, can be used to mitigate these problems by providing a unified setup routine that deploys several tools in an integrated fashion. We demonstrate the power of this approach using the example of a Docker Compose setup for a drug target screening platform consisting of five integrated web applications and shared infrastructure, deployable in just two lines of code.

  8. Genotyping-by-sequencing-based investigation of the genetic architecture responsible for a ~sevenfold increase in soybean seed stearic acid

    USDA-ARS?s Scientific Manuscript database

    Soybean oil is highly unsaturated and oxidatively unstable, rendering it non-ideal for most food applications. Until recently, the majority of soybean oil underwent partial chemical hydrogenation, a process which produces trans fats as an unavoidable consequence. Dietary intake of trans fat and most...

  9. Is Bigger Better? Customer Base Expansion through Word-of-Mouth Reputation

    ERIC Educational Resources Information Center

    Rob, Rafael; Fishman, Arthur

    2005-01-01

    A model of gradual reputation formation through a process of continuous investment in product quality is developed. We assume that the ability to produce high-quality products requires continuous investment and that as a consequence of informational frictions, such as search costs, information about firms' past performance diffuses only gradually…

  10. Simple solution to the medical instrumentation software problem

    NASA Astrophysics Data System (ADS)

    Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.

    1995-04-01

    Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to Good Manufacturing Practices (GMP). Good Manufacturing Practices, as specified by the FDA and ISO, require the definition of, and compliance with, a software process that ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate the development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides the very broad range of functionality, from embedded real-time to management information systems, required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.

  11. Choosing a software design method for real-time Ada applications: JSD process inversion as a means to tailor a design specification to the performance requirements and target machine

    NASA Technical Reports Server (NTRS)

    Withey, James V.

    1986-01-01

    The validity of real-time software is determined by its ability to execute on a computer within the time constraints of the physical system it is modeling. In many applications the time constraints are so critical that the details of process scheduling are elevated to the requirements analysis phase of the software development cycle. It is not uncommon to find specifications for a real-time cyclic executive program included or assumed in such requirements. It was found that preliminary designs structured around this implementation obscure the data flow of the real-world system that is modeled, and that it is consequently difficult and costly to maintain, update and reuse the resulting software. A cyclic executive is a software component that schedules and implicitly synchronizes the real-time software through periodic and repetitive subroutine calls. Therefore a design method is sought that allows the deferral of process scheduling to the later stages of design. The appropriate scheduling paradigm must be chosen given the performance constraints, the target environment and the software's lifecycle. The concept of process inversion is explored with respect to the cyclic executive.
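
    A minimal sketch of a cyclic executive, assuming illustrative frame rates and trivial tasks, shows why the scheduling is implicit in the call structure rather than visible in the data flow:

```python
# Minimal cyclic-executive sketch: tasks are plain subroutines called at fixed
# rates inside a repeating "major frame" of minor frames. Frame rates and tasks
# are placeholders for illustration only.
import time

MINOR_FRAME_S = 0.025            # 40 Hz minor frame (illustrative value)
MINOR_FRAMES_PER_MAJOR = 8       # 5 Hz major frame

def read_sensors():  print("read sensors")
def control_law():   print("control law")
def telemetry():     print("telemetry")

# SCHEDULE[i] lists the subroutines to run in minor frame i of each major frame.
SCHEDULE = {i: [read_sensors, control_law] for i in range(MINOR_FRAMES_PER_MAJOR)}
SCHEDULE[0].append(telemetry)    # low-rate task runs once per major frame

def cyclic_executive(major_frames=2):
    for _ in range(major_frames):
        for i in range(MINOR_FRAMES_PER_MAJOR):
            start = time.monotonic()
            for task in SCHEDULE[i]:
                task()
            sleep = MINOR_FRAME_S - (time.monotonic() - start)
            if sleep > 0:
                time.sleep(sleep)   # idle until the next minor frame boundary

if __name__ == "__main__":
    cyclic_executive()
```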

  12. Using multi-attribute decision-making approaches in the selection of a hospital management system.

    PubMed

    Arasteh, Mohammad Ali; Shamshirband, Shahaboddin; Yee, Por Lip

    2018-01-01

    Choosing the most appropriate organizational software is a persistent challenge for managers, especially IT directors. The term "enterprise software selection" refers to purchasing, building, or commissioning software that, first, best fits the needs of the organization and, second, has a suitable price and technical support. Specifying selection criteria and ranking them is the primary prerequisite for this task. This article provides a method to evaluate, rank, and compare available enterprise software in order to choose the most suitable package. The method consists of a three-stage process. First, it identifies and assesses the organizational requirements. Second, it selects the best sourcing option from among three possibilities: in-house production, buying off-the-shelf software, or commissioning custom software. Third, it evaluates, compares, and ranks the alternative software packages. The third stage applies different multi-attribute decision-making (MADM) methods and compares the resulting rankings. Based on the characteristics of the problem, several methods were tested, namely the Analytic Hierarchy Process (AHP), the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), Elimination and Choice Expressing Reality (ELECTRE), and a simple weighting method. Finally, we propose the most practical method for problems of this kind.
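
    As an illustration of one of the MADM methods mentioned, the sketch below implements a basic TOPSIS ranking in Python; the alternatives, criteria, and weights are made up for illustration and are not the study's actual data.

```python
# Compact TOPSIS sketch for ranking candidate software packages against weighted criteria.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] is True if higher is better for criterion j."""
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)            # vector-normalise each criterion column
    v = norm * np.asarray(weights, dtype=float)     # apply criterion weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                  # closeness to the ideal solution

# Three hypothetical packages scored on cost, functional fit, and vendor support.
scores = [[400, 7, 8],
          [250, 6, 6],
          [600, 9, 9]]
closeness = topsis(scores, weights=[0.5, 0.3, 0.2], benefit=[False, True, True])
print(closeness.argsort()[::-1])   # indices of packages, best first
```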

  13. Data acquisition architecture and online processing system for the HAWC gamma-ray observatory

    NASA Astrophysics Data System (ADS)

    Abeysekara, A. U.; Alfaro, R.; Alvarez, C.; Álvarez, J. D.; Arceo, R.; Arteaga-Velázquez, J. C.; Ayala Solares, H. A.; Barber, A. S.; Baughman, B. M.; Bautista-Elivar, N.; Becerra Gonzalez, J.; Belmont-Moreno, E.; BenZvi, S. Y.; Berley, D.; Bonilla Rosales, M.; Braun, J.; Caballero-Lopez, R. A.; Caballero-Mora, K. S.; Carramiñana, A.; Castillo, M.; Cotti, U.; Cotzomi, J.; de la Fuente, E.; De León, C.; DeYoung, T.; Diaz-Cruz, J.; Diaz Hernandez, R.; Díaz-Vélez, J. C.; Dingus, B. L.; DuVernois, M. A.; Ellsworth, R. W.; Fiorino, D. W.; Fraija, N.; Galindo, A.; Garfias, F.; González, M. M.; Goodman, J. A.; Grabski, V.; Gussert, M.; Hampel-Arias, Z.; Harding, J. P.; Hui, C. M.; Hüntemeyer, P.; Imran, A.; Iriarte, A.; Karn, P.; Kieda, D.; Kunde, G. J.; Lara, A.; Lauer, R. J.; Lee, W. H.; Lennarz, D.; León Vargas, H.; Linares, E. C.; Linnemann, J. T.; Longo Proper, M.; Luna-García, R.; Malone, K.; Marinelli, A.; Marinelli, S. S.; Martinez, O.; Martínez-Castro, J.; Martínez-Huerta, H.; Matthews, J. A. J.; McEnery, J.; Mendoza Torres, E.; Miranda-Romagnoli, P.; Moreno, E.; Mostafá, M.; Nellen, L.; Newbold, M.; Noriega-Papaqui, R.; Oceguera-Becerra, T.; Patricelli, B.; Pelayo, R.; Pérez-Pérez, E. G.; Pretz, J.; Rivière, C.; Rosa-González, D.; Ruiz-Velasco, E.; Ryan, J.; Salazar, H.; Salesa Greus, F.; Sanchez, F. E.; Sandoval, A.; Schneider, M.; Silich, S.; Sinnis, G.; Smith, A. J.; Sparks Woodle, K.; Springer, R. W.; Taboada, I.; Toale, P. A.; Tollefson, K.; Torres, I.; Ukwatta, T. N.; Villaseñor, L.; Weisgarber, T.; Westerhoff, S.; Wisher, I. G.; Wood, J.; Yapici, T.; Yodh, G. B.; Younk, P. W.; Zaborov, D.; Zepeda, A.; Zhou, H.

    2018-04-01

    The High Altitude Water Cherenkov observatory (HAWC) is an air shower array devised for TeV gamma-ray astronomy. HAWC is located at an altitude of 4100 m a.s.l. in Sierra Negra, Mexico. HAWC consists of 300 Water Cherenkov Detectors, each instrumented with 4 photomultiplier tubes (PMTs). HAWC re-uses the Front-End Boards from the Milagro experiment to receive the PMT signals. These boards are used in combination with Time to Digital Converters (TDCs) to record the time and the amount of light in each PMT hit (light flash). A set of VME TDC modules (128 channels each) is operated in a continuous (dead time free) mode. The TDCs are read out via the VME bus by Single-Board Computers (SBCs), which in turn are connected to a gigabit Ethernet network. The complete system produces ≈500 MB/s of raw data. A high-throughput data processing system has been designed and built to enable real-time data analysis. The system relies on off-the-shelf hardware components, an open-source software technology for data transfers (ZeroMQ) and a custom software framework for data analysis (AERIE). Multiple trigger and reconstruction algorithms can be combined and run on blocks of data in a parallel fashion, producing a set of output data streams which can be analyzed in real time with minimal latency (<5 s). This paper provides an overview of the hardware set-up and an in-depth description of the software design, covering both the TDC data acquisition system and the real-time data processing system. The performance of these systems is also discussed.
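
    The push/pull pattern that such a ZeroMQ-based pipeline typically uses can be sketched in a few lines of Python; the endpoint, message format, and "reconstruction" below are placeholders and do not represent the AERIE framework or the HAWC configuration.

```python
# Minimal ZeroMQ push/pull sketch: a producer fans data blocks out to a worker
# that "reconstructs" them. Run as a single script; one worker, toy payloads.
import threading, zmq

ENDPOINT = "tcp://127.0.0.1:5557"

def producer(n_blocks=10):
    ctx = zmq.Context.instance()
    push = ctx.socket(zmq.PUSH)
    push.bind(ENDPOINT)
    for i in range(n_blocks):
        push.send_json({"block": i, "hits": list(range(5))})  # stand-in for TDC data
    push.send_json({"block": None})                            # end-of-stream marker
    push.close()

def worker():
    ctx = zmq.Context.instance()
    pull = ctx.socket(zmq.PULL)
    pull.connect(ENDPOINT)
    while True:
        msg = pull.recv_json()
        if msg["block"] is None:
            break
        print("reconstructed block", msg["block"], "->", sum(msg["hits"]))
    pull.close()

if __name__ == "__main__":
    t = threading.Thread(target=worker)
    t.start()
    producer()
    t.join()
```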

  14. Agile Development Methods for Space Operations

    NASA Technical Reports Server (NTRS)

    Trimble, Jay; Webster, Chris

    2012-01-01

    Mainstream industry software development practice has moved from a traditional waterfall process to agile iterative development that allows fast response to customer inputs and produces higher-quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies Software, in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).

  15. Software management tools: Lessons learned from use

    NASA Technical Reports Server (NTRS)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

    Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.

  16. Comparison of 3D representations depicting micro folds: overlapping imagery vs. time-of-flight laser scanner

    NASA Astrophysics Data System (ADS)

    Vaiopoulos, Aristidis D.; Georgopoulos, Andreas; Lozios, Stylianos G.

    2012-10-01

    A relatively new field of interest, which continues to gain ground nowadays, is digital 3D modeling. However, the methodologies, the accuracy, and the time and effort required to produce a high-quality 3D model have changed drastically over the last few years. Whereas in the early days of digital 3D modeling, 3D models were only accessible to computer experts in animation, working many hours in expensive sophisticated software, today 3D modeling has become reasonably fast and convenient. On top of that, with online 3D modeling software such as 123D Catch, nearly everyone can produce 3D models with minimum effort and at no cost. The only requirement is panoramic overlapping images of the (stationary) objects the user wishes to model. This approach, however, has limitations in the accuracy of the model. An objective of the study is to examine these limitations by assessing the accuracy of this 3D modeling methodology against a Terrestrial Laser Scanner (TLS). Therefore, the scope of this study is to present and compare 3D models produced with two different methods: 1) the traditional TLS method with the instrument ScanStation 2 by Leica and 2) panoramic overlapping images obtained with a DSLR camera and processed with the free 123D Catch software. The main objective of the study is to evaluate advantages and disadvantages of the two 3D model producing methodologies. The area represented by the 3D models features multi-scale folding in a cipollino marble formation. The most interesting part, and the most challenging to capture accurately, is an outcrop which includes vertically oriented micro folds. These micro folds have dimensions of a few centimeters, while a relatively strong relief is evident between them (perhaps due to different material composition). The area of interest is located in Mt. Hymittos, Greece.

  17. Field Programmable Gate Array Failure Rate Estimation Guidelines for Launch Vehicle Fault Tree Models

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Britton, Paul; Hatfield, Glen Spencer; Novack, Steven D.

    2017-01-01

    The complex electronic and avionics systems of today's launch vehicles heavily utilize Field Programmable Gate Array (FPGA) integrated circuits (ICs) for their superb speed and reconfiguration capabilities. Consequently, FPGAs are prevalent ICs in communication protocols such as MIL-STD-1553B and in control signal commands such as solenoid valve actuations. This paper will identify reliability concerns and high-level guidelines to estimate FPGA total failure rates in a launch vehicle application. The paper will discuss hardware, hardware description language, and radiation-induced failures. The hardware contribution of the approach accounts for physical failures of the IC. The hardware description language portion will discuss the high-level FPGA programming languages and software/code reliability growth. The radiation portion will discuss FPGA susceptibility to space environment radiation.
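
    A simple roll-up of the three contributions into a total failure rate, under an assumed constant-failure-rate (exponential) model and with placeholder numbers, might look like the following sketch.

```python
# Illustrative combination of the three contributors the record identifies:
# physical hardware, HDL/design faults, and radiation-induced upsets.
# The numbers are placeholders, not flight data.
import math

FIT = 1e-9          # 1 FIT = 1 failure per 1e9 device-hours

def fpga_failure_probability(lambda_hw_fit, lambda_hdl_fit, lambda_rad_fit, mission_hours):
    lam = (lambda_hw_fit + lambda_hdl_fit + lambda_rad_fit) * FIT   # total failures/hour
    return 1.0 - math.exp(-lam * mission_hours)                     # exponential failure model

print(fpga_failure_probability(lambda_hw_fit=50,
                               lambda_hdl_fit=20,
                               lambda_rad_fit=100,
                               mission_hours=2.0))
```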

  18. The Apple III.

    ERIC Educational Resources Information Center

    Ditlea, Steve

    1982-01-01

    Describes and evaluates the features, performance, peripheral devices, available software, and capabilities of the Apple III microcomputer. The computer's operating system, its hardware, and the commercially produced software it accepts are discussed. Specific applications programs for financial planning, accounting, and word processing are…

  19. Use of software tools in the development of real time software systems

    NASA Technical Reports Server (NTRS)

    Garvey, R. C.

    1981-01-01

    The transformation of a preexisting software system into a larger and more versatile system with different mission requirements is discussed. The history of this transformation is used to illustrate the use of structured real-time programming techniques and tools to produce maintainable and somewhat transportable systems. The predecessor system is a single ground diagnostic system; its purpose is to exercise a computer-controlled hardware set prior to its deployment in its functional environment, as well as to test the equipment set by supplying certain well-known stimuli. The successor system (FTE) is required to perform certain testing and control functions while this hardware set is in its functional environment. Both systems must deal with heavy user input/output loads, and a new I/O requirement is included in the design of the FTF system. Human factors are enhanced by adding an improved console interface and a special function keyboard handler. The additional features require the inclusion of much new software into the original set from which FTF was developed. As a result, it is necessary to split the system into a dual-program configuration with high rates of interground communications. A generalized information routing mechanism is used to support this configuration.

  20. Software for Partly Automated Recognition of Targets

    NASA Technical Reports Server (NTRS)

    Opitz, David; Blundell, Stuart; Bain, William; Morris, Matthew; Carlson, Ian; Mangrich, Mark

    2003-01-01

    The Feature Analyst is a computer program for assisted (partially automated) recognition of targets in images. This program was developed to accelerate the processing of high-resolution satellite image data for incorporation into geographic information systems (GIS). This program creates an advanced user interface that embeds proprietary machine-learning algorithms in commercial image-processing and GIS software. A human analyst provides samples of target features from multiple sets of data, then the software develops a data-fusion model that automatically extracts the remaining features from selected sets of data. The program thus leverages the natural ability of humans to recognize objects in complex scenes, without requiring the user to explain the human visual recognition process by means of lengthy software. Two major subprograms are the reactive agent and the thinking agent. The reactive agent strives to quickly learn the user's tendencies while the user is selecting targets and to increase the user's productivity by immediately suggesting the next set of pixels that the user may wish to select. The thinking agent utilizes all available resources, taking as much time as needed, to produce the most accurate autonomous feature-extraction model possible.

  1. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Consideration in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step by step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  2. Simultaneous mapping of membrane voltage and calcium in zebrafish heart in vivo reveals chamber-specific developmental transitions in ionic currents

    PubMed Central

    Hou, Jennifer H.; Kralj, Joel M.; Douglass, Adam D.; Engert, Florian; Cohen, Adam E.

    2014-01-01

    The cardiac action potential (AP) and the consequent cytosolic Ca2+ transient are key indicators of cardiac function. Natural developmental processes, as well as many drugs and pathologies change the waveform, propagation, or variability (between cells or over time) of these parameters. Here we apply a genetically encoded dual-function calcium and voltage reporter (CaViar) to study the development of the zebrafish heart in vivo between 1.5 and 4 days post fertilization (dpf). We developed a high-sensitivity spinning disk confocal microscope and associated software for simultaneous three-dimensional optical mapping of voltage and calcium. We produced a transgenic zebrafish line expressing CaViar under control of the heart-specific cmlc2 promoter, and applied ion channel blockers at a series of developmental stages to map the maturation of the action potential in vivo. Early in development, the AP initiated via a calcium current through L-type calcium channels. Between 90 and 102 h post fertilization (hpf), the ventricular AP switched to a sodium-driven upswing, while the atrial AP remained calcium driven. In the adult zebrafish heart, a sodium current drives the AP in both the atrium and ventricle. Simultaneous voltage and calcium imaging with genetically encoded reporters provides a new approach for monitoring cardiac development, and the effects of drugs on cardiac function. PMID:25309445

  3. Simultaneous mapping of membrane voltage and calcium in zebrafish heart in vivo reveals chamber-specific developmental transitions in ionic currents.

    PubMed

    Hou, Jennifer H; Kralj, Joel M; Douglass, Adam D; Engert, Florian; Cohen, Adam E

    2014-01-01

    The cardiac action potential (AP) and the consequent cytosolic Ca(2+) transient are key indicators of cardiac function. Natural developmental processes, as well as many drugs and pathologies change the waveform, propagation, or variability (between cells or over time) of these parameters. Here we apply a genetically encoded dual-function calcium and voltage reporter (CaViar) to study the development of the zebrafish heart in vivo between 1.5 and 4 days post fertilization (dpf). We developed a high-sensitivity spinning disk confocal microscope and associated software for simultaneous three-dimensional optical mapping of voltage and calcium. We produced a transgenic zebrafish line expressing CaViar under control of the heart-specific cmlc2 promoter, and applied ion channel blockers at a series of developmental stages to map the maturation of the action potential in vivo. Early in development, the AP initiated via a calcium current through L-type calcium channels. Between 90 and 102 h post fertilization (hpf), the ventricular AP switched to a sodium-driven upswing, while the atrial AP remained calcium driven. In the adult zebrafish heart, a sodium current drives the AP in both the atrium and ventricle. Simultaneous voltage and calcium imaging with genetically encoded reporters provides a new approach for monitoring cardiac development, and the effects of drugs on cardiac function.

  4. DBCreate: A SUPCRT92-based program for producing EQ3/6, TOUGHREACT, and GWB thermodynamic databases at user-defined T and P

    NASA Astrophysics Data System (ADS)

    Kong, Xiang-Zhao; Tutolo, Benjamin M.; Saar, Martin O.

    2013-02-01

    SUPCRT92 is a widely used software package for calculating the standard thermodynamic properties of minerals, gases, aqueous species, and reactions. However, it is labor-intensive and error-prone to use it directly to produce databases for geochemical modeling programs such as EQ3/6, the Geochemist's Workbench, and TOUGHREACT. DBCreate is a SUPCRT92-based software program written in FORTRAN90/95 and was developed in order to produce the required databases for these programs in a rapid and convenient way. This paper describes the overall structure of the program and provides detailed usage instructions.

  5. Digamma diagnostics for the mixed-phase generation at NICA

    NASA Astrophysics Data System (ADS)

    Kukulin, V. I.; Platonova, M. N.

    2017-03-01

    A novel type of diagnostics for dense and/or hot nuclear matter produced in heavy-ion collisions at NICA and similar future colliders (FAIR, etc.) is suggested. The diagnostics is based on an assumption (confirmed in many experiments worldwide) about intensive generation of light scalar mesons (σ), whose subsequent decay produces γγ pairs with a mass and width dependent upon the density and temperature of the fireball produced in the collision process. Thus, measurements of the absolute yield, mass, and width of the γγ signal carry valuable information about the state of the fireball generated during the high-energy nuclear collision.

  6. Generalized Software Architecture Applied to the Continuous Lunar Water Separation Process and the Lunar Greenhouse Amplifier

    NASA Technical Reports Server (NTRS)

    Perusich, Stephen; Moos, Thomas; Muscatello, Anthony

    2011-01-01

    This innovation provides the user with autonomous on-screen monitoring, embedded computations, and tabulated output for two new processes. The software was originally written for the Continuous Lunar Water Separation Process (CLWSP), but was found to be general enough to be applicable to the Lunar Greenhouse Amplifier (LGA) as well, with minor alterations. The resultant program should have general applicability to many laboratory processes (see figure). The objective for these programs was to create a software application that would provide both autonomous monitoring and data storage, along with manual manipulation. The software also allows operators the ability to input experimental changes and comments in real time without modifying the code itself. Common process elements, such as thermocouples, pressure transducers, and relative humidity sensors, are easily incorporated into the program in various configurations, along with specialized devices such as photodiode sensors. The goal of the CLWSP research project is to design, build, and test a new method to continuously separate, capture, and quantify water from a gas stream. The application is any In-Situ Resource Utilization (ISRU) process that desires to extract or produce water from lunar or planetary regolith. The present work is aimed at circumventing current problems and ultimately producing a system capable of continuous operation at moderate temperatures that can be scaled over a large capacity range depending on the ISRU process. The goal of the LGA research project is to design, build, and test a new type of greenhouse that could be used on the moon or Mars. The LGA uses super greenhouse gases (SGGs) to absorb long-wavelength radiation, thus creating a highly efficient greenhouse at a future lunar or Mars outpost. Silica-based glass, although highly efficient at trapping heat, is heavy, fragile, and not suitable for space greenhouse applications. Plastics are much lighter and resilient, but are not efficient for absorbing longwavelength infrared radiation and therefore will lose more heat to the environment compared to glass. The LGA unit uses a transparent polymer antechamber that surrounds part of the greenhouse and encases the SGGs, thereby minimizing infrared losses through the plastic windows. With ambient temperatures at the lunar poles at 50 C, the LGA should provide a substantial enhancement to currently conceived lunar greenhouses. Positive results obtained from this project could lead to a future large-scale system capable of running autonomously on the Moon, Mars, and beyond. The software for both applications needs to run the entire units and all subprocesses; however, throughout testing, many variables and parameters need to be changed as more is learned about the system operation. The software provides the versatility to permit the software operation to change as the user requirements evolve.

  7. MicroSIFT Courseware Evaluation. [Set 13 (294-319), Set 14 (320-361), with Hardware (HRD) and Subject (SBJ) Indexes to Both Sets.

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    This document consists of 68 microcomputer software package evaluations prepared by MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Education Laboratory. There are 26 packages in set 13 and 42 in set 14. Each software review lists producer, time and place of evaluation, cost, ability level,…

  8. Coordination in Large Scale Software Development

    DTIC Science & Technology

    1990-01-01

    toward achieving common and explicitly recognized goals" (Blau and Scott, 1962) and "the integration or linking together of different parts of an...require a strong degree of integration of its components. Much software is built of thousands of modules that must mesh with each other perfectly for the...coordination between subgroups producing software modules could lead to failure in integrating the modules themselves. Informal communication. Both

  9. Enhancing the Breadth and Efficacy of Therapeutic Vaccines for Breast Cancer

    DTIC Science & Technology

    2014-10-01

    sequence data produced by the Slansky team following their single-cell emulsion RT-PCR technique; however, it can be packaged and shared for use...cell emulsion RT-PCR. Additional modifications were made to our epitope discovery workflow to increase efficacy of transcript and neoantigen candidate...the MiTCR [8] open source software package developed by MiLaboratory. MiTCR is a highly efficient and fast approach to CDR3 extraction, clonotype

  10. Preparing Colorful Astronomical Images and Illustrations

    NASA Astrophysics Data System (ADS)

    Levay, Z. G.; Frattare, L. M.

    2001-12-01

    We present techniques for using mainstream graphics software, specifically Adobe Photoshop and Illustrator, for producing composite color images and illustrations from astronomical data. These techniques have been used with numerous images from the Hubble Space Telescope to produce printed and web-based news, education and public presentation products as well as illustrations for technical publication. While Photoshop is not intended for quantitative analysis of full dynamic range data (as are IRAF or IDL, for example), we have had much success applying Photoshop's numerous, versatile tools to work with scaled images, masks, text and graphics in multiple semi-transparent layers and channels. These features, along with its user-oriented, visual interface, provide convenient tools to produce high-quality, full-color images and graphics for printed and on-line publication and presentation.

  11. A Discussion of the Software Quality Assurance Role

    NASA Technical Reports Server (NTRS)

    Kandt, Ronald Kirk

    2010-01-01

    The basic idea underlying this paper is that the conventional understanding of the role of a Software Quality Assurance (SQA) engineer is unduly limited. This is because few have asked who the customers of a SQA engineer are. Once you do this, you can better define what tasks a SQA engineer should perform, as well as identify the knowledge and skills that such a person should have. The consequence of doing this is that a SQA engineer can provide greater value to his or her customers. It is the position of this paper that a SQA engineer providing significant value to his or her customers must not only assume the role of an auditor, but also that of a software and systems engineer. This is because software engineers and their managers particularly value contributions that directly impact products and their development. These ideas are summarized as lessons learned, based on my experience at Jet Propulsion Laboratory (JPL).

  12. A professional and cost effective digital video editing and image storage system for the operating room.

    PubMed

    Scollato, A; Perrini, P; Benedetto, N; Di Lorenzo, N

    2007-06-01

    We propose an easy-to-construct digital video editing system ideal to produce video documentation and still images. A digital video editing system applicable to many video sources in the operating room is described in detail. The proposed system has proved easy to use and permits one to obtain videography quickly and easily. Mixing different streams of video input from all the devices in use in the operating room, the application of filters and effects produces a final, professional end-product. Recording on a DVD provides an inexpensive, portable and easy-to-use medium to store or re-edit or tape at a later time. From stored videography it is easy to extract high-quality, still images useful for teaching, presentations and publications. In conclusion digital videography and still photography can easily be recorded by the proposed system, producing high-quality video recording. The use of firewire ports provides good compatibility with next-generation hardware and software. The high standard of quality makes the proposed system one of the lowest priced products available today.

  13. Finite Element Simulations of Kaikoura, NZ Earthquake using DInSAR and High-Resolution DSMs

    NASA Astrophysics Data System (ADS)

    Barba, M.; Willis, M. J.; Tiampo, K. F.; Glasscoe, M. T.; Clark, M. K.; Zekkos, D.; Stahl, T. A.; Massey, C. I.

    2017-12-01

    Three-dimensional displacements from the Kaikoura, NZ, earthquake in November 2016 are imaged here using Differential Interferometric Synthetic Aperture Radar (DInSAR) and high-resolution Digital Surface Model (DSM) differencing and optical pixel tracking. Full-resolution co- and post-seismic interferograms of Sentinel-1A/B images are constructed using the JPL ISCE software. The OSU SETSM software is used to produce repeat 0.5 m posting DSMs from commercial satellite imagery, which are supplemented with UAV derived DSMs over the Kaikoura fault rupture on the eastern South Island, NZ. DInSAR provides long-wavelength motions while DSM differencing and optical pixel tracking provides both horizontal and vertical near fault motions, improving the modeling of shallow rupture dynamics. JPL GeoFEST software is used to perform finite element modeling of the fault segments and slip distributions and, in turn, the associated asperity distribution. The asperity profile is then used to simulate event rupture, the spatial distribution of stress drop, and the associated stress changes. Finite element modeling of slope stability is accomplished using the ultra high-resolution UAV derived DSMs to examine the evolution of post-earthquake topography, landslide dynamics and volumes. Results include new insights into shallow dynamics of fault slip and partitioning, estimates of stress change, and improved understanding of its relationship with the associated seismicity, deformation, and triggered cascading hazards.

  14. High Performance Molecular Visualization: In-Situ and Parallel Rendering with EGL.

    PubMed

    Stone, John E; Messmer, Peter; Sisneros, Robert; Schulten, Klaus

    2016-05-01

    Large scale molecular dynamics simulations produce terabytes of data that is impractical to transfer to remote facilities. It is therefore necessary to perform visualization tasks in-situ as the data are generated, or by running interactive remote visualization sessions and batch analyses co-located with direct access to high performance storage systems. A significant challenge for deploying visualization software within clouds, clusters, and supercomputers involves the operating system software required to initialize and manage graphics acceleration hardware. Recently, it has become possible for applications to use the Embedded-system Graphics Library (EGL) to eliminate the requirement for windowing system software on compute nodes, thereby eliminating a significant obstacle to broader use of high performance visualization applications. We outline the potential benefits of this approach in the context of visualization applications used in the cloud, on commodity clusters, and supercomputers. We discuss the implementation of EGL support in VMD, a widely used molecular visualization application, and we outline benefits of the approach for molecular visualization tasks on petascale computers, clouds, and remote visualization servers. We then provide a brief evaluation of the use of EGL in VMD, with tests using developmental graphics drivers on conventional workstations and on Amazon EC2 G2 GPU-accelerated cloud instance types. We expect that the techniques described here will be of broad benefit to many other visualization applications.

  15. High Performance Molecular Visualization: In-Situ and Parallel Rendering with EGL

    PubMed Central

    Stone, John E.; Messmer, Peter; Sisneros, Robert; Schulten, Klaus

    2016-01-01

    Large scale molecular dynamics simulations produce terabytes of data that is impractical to transfer to remote facilities. It is therefore necessary to perform visualization tasks in-situ as the data are generated, or by running interactive remote visualization sessions and batch analyses co-located with direct access to high performance storage systems. A significant challenge for deploying visualization software within clouds, clusters, and supercomputers involves the operating system software required to initialize and manage graphics acceleration hardware. Recently, it has become possible for applications to use the Embedded-system Graphics Library (EGL) to eliminate the requirement for windowing system software on compute nodes, thereby eliminating a significant obstacle to broader use of high performance visualization applications. We outline the potential benefits of this approach in the context of visualization applications used in the cloud, on commodity clusters, and supercomputers. We discuss the implementation of EGL support in VMD, a widely used molecular visualization application, and we outline benefits of the approach for molecular visualization tasks on petascale computers, clouds, and remote visualization servers. We then provide a brief evaluation of the use of EGL in VMD, with tests using developmental graphics drivers on conventional workstations and on Amazon EC2 G2 GPU-accelerated cloud instance types. We expect that the techniques described here will be of broad benefit to many other visualization applications. PMID:27747137

  16. Automated ultrasound edge-tracking software comparable to established semi-automated reference software for carotid intima-media thickness analysis.

    PubMed

    Shenouda, Ninette; Proudfoot, Nicole A; Currie, Katharine D; Timmons, Brian W; MacDonald, Maureen J

    2018-05-01

    Many commercial ultrasound systems are now including automated analysis packages for the determination of carotid intima-media thickness (cIMT); however, details regarding their algorithms and methodology are not published. Few studies have compared their accuracy and reliability with previously established automated software, and those that have were in asymptomatic adults. Therefore, this study compared cIMT measures from a fully automated ultrasound edge-tracking software (EchoPAC PC, Version 110.0.2; GE Medical Systems, Horten, Norway) to an established semi-automated reference software (Artery Measurement System (AMS) II, Version 1.141; Gothenburg, Sweden) in 30 healthy preschool children (ages 3-5 years) and 27 adults with coronary artery disease (CAD; ages 48-81 years). For both groups, Bland-Altman plots revealed good agreement with a negligible mean cIMT difference of -0·03 mm. Software differences were statistically, but not clinically, significant for preschool images (P = 0·001) and were not significant for CAD images (P = 0·09). Intra- and interoperator repeatability was high and comparable between software for preschool images (ICC, 0·90-0·96; CV, 1·3-2·5%), but slightly higher with the automated ultrasound than the semi-automated reference software for CAD images (ICC, 0·98-0·99; CV, 1·4-2·0% versus ICC, 0·84-0·89; CV, 5·6-6·8%). These findings suggest that the automated ultrasound software produces valid cIMT values in healthy preschool children and adults with CAD. Automated ultrasound software may be useful for ensuring consistency among multisite research initiatives or large cohort studies involving repeated cIMT measures, particularly in adults with documented CAD. © 2017 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
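
    The Bland-Altman statistics reported above (mean difference and limits of agreement) can be computed with a short sketch like the following; the paired values are fabricated examples, not study data.

```python
# Bland-Altman agreement sketch for paired cIMT measurements from two software packages.
import numpy as np

def bland_altman(a, b):
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)   # bias and 95% limits of agreement

automated      = [0.42, 0.45, 0.39, 0.44, 0.41]   # mm, hypothetical values
semi_automated = [0.44, 0.47, 0.42, 0.46, 0.45]   # mm, hypothetical values
bias, (lo, hi) = bland_altman(automated, semi_automated)
print(f"bias = {bias:.3f} mm, limits of agreement = ({lo:.3f}, {hi:.3f}) mm")
```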

  17. Quality control, analysis and secure sharing of Luminex® immunoassay data using the open source LabKey Server platform

    PubMed Central

    2013-01-01

    Background Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists’ capacity to use these immunoassays to evaluate human clinical trials. Results The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose–response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Conclusions Unlike other tools tailored for Luminex immunoassays, LabKey Server allows labs to customize their Luminex analyses using scripting while still presenting users with a single, graphical interface for processing and analyzing data. The LabKey Server system also stands out among Luminex tools for enabling smooth, secure transfer of data, quality control information, and analyses between collaborators. LabKey Server and its Luminex features are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. PMID:23631706
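
    As an illustration of the titration handling described (fitting dose-response curves to standards and interpolating unknown concentrations), the sketch below fits a four-parameter logistic curve with invented data; it is written in Python for illustration, whereas LabKey Server exposes these steps through script-defined calculation options, e.g. R transform scripts.

```python
# Four-parameter logistic (4PL) dose-response fit and back-interpolation sketch.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, lower, upper, ec50, hill):
    # Increasing 4PL: approaches `lower` at low dose and `upper` at high dose.
    return lower + (upper - lower) / (1.0 + (x / ec50) ** (-hill))

conc = np.array([0.1, 0.5, 1, 5, 10, 50, 100])          # standard concentrations (invented)
mfi = np.array([12, 40, 75, 300, 700, 1800, 2100])      # measured fluorescence (invented)

params, _ = curve_fit(four_pl, conc, mfi, p0=[10, 2200, 5, 1],
                      bounds=([0, 0, 1e-6, 0.1], [np.inf, np.inf, np.inf, 10]))

def interpolate(signal, lower, upper, ec50, hill):
    # Invert the 4PL to recover a concentration from a measured signal.
    return ec50 * ((upper - lower) / (signal - lower) - 1.0) ** (-1.0 / hill)

print("estimated concentration:", interpolate(500, *params))
```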

  18. Quality control, analysis and secure sharing of Luminex® immunoassay data using the open source LabKey Server platform.

    PubMed

    Eckels, Josh; Nathe, Cory; Nelson, Elizabeth K; Shoemaker, Sara G; Nostrand, Elizabeth Van; Yates, Nicole L; Ashley, Vicki C; Harris, Linda J; Bollenbeck, Mark; Fong, Youyi; Tomaras, Georgia D; Piehler, Britt

    2013-04-30

    Immunoassays that employ multiplexed bead arrays produce high information content per sample. Such assays are now frequently used to evaluate humoral responses in clinical trials. Integrated software is needed for the analysis, quality control, and secure sharing of the high volume of data produced by such multiplexed assays. Software that facilitates data exchange and provides flexibility to perform customized analyses (including multiple curve fits and visualizations of assay performance over time) could increase scientists' capacity to use these immunoassays to evaluate human clinical trials. The HIV Vaccine Trials Network and the Statistical Center for HIV/AIDS Research and Prevention collaborated with LabKey Software to enhance the open source LabKey Server platform to facilitate workflows for multiplexed bead assays. This system now supports the management, analysis, quality control, and secure sharing of data from multiplexed immunoassays that leverage Luminex xMAP® technology. These assays may be custom or kit-based. Newly added features enable labs to: (i) import run data from spreadsheets output by Bio-Plex Manager™ software; (ii) customize data processing, curve fits, and algorithms through scripts written in common languages, such as R; (iii) select script-defined calculation options through a graphical user interface; (iv) collect custom metadata for each titration, analyte, run and batch of runs; (v) calculate dose-response curves for titrations; (vi) interpolate unknown concentrations from curves for titrated standards; (vii) flag run data for exclusion from analysis; (viii) track quality control metrics across runs using Levey-Jennings plots; and (ix) automatically flag outliers based on expected values. Existing system features allow researchers to analyze, integrate, visualize, export and securely share their data, as well as to construct custom user interfaces and workflows. Unlike other tools tailored for Luminex immunoassays, LabKey Server allows labs to customize their Luminex analyses using scripting while still presenting users with a single, graphical interface for processing and analyzing data. The LabKey Server system also stands out among Luminex tools for enabling smooth, secure transfer of data, quality control information, and analyses between collaborators. LabKey Server and its Luminex features are freely available as open source software at http://www.labkey.com under the Apache 2.0 license.

  19. CisSERS: Customizable in silico sequence evaluation for restriction sites

    DOE PAGES

    Sharpe, Richard M.; Koepke, Tyson; Harper, Artemus; ...

    2016-04-12

    High-throughput sequencing continues to produce an immense volume of information that is processed and assembled into mature sequence data. Here, data analysis tools are urgently needed that leverage the embedded DNA sequence polymorphisms and consequent changes to restriction sites or sequence motifs in a high-throughput manner to enable biological experimentation. CisSERS was developed as a standalone open source tool to analyze sequence datasets and provide biologists with individual or comparative genome organization information in terms of presence and frequency of patterns or motifs such as restriction enzyme sites. Predicted agarose gel visualization of the custom analysis results was also integrated to enhance the usefulness of the software. CisSERS offers several novel functionalities, such as handling of large and multiple datasets in parallel, multiple restriction enzyme site detection and custom motif detection features, which are seamlessly integrated with real-time agarose gel visualization. Using a simple FASTA-formatted file as input, CisSERS utilizes the REBASE enzyme database. Results from CisSERS enable the user to make decisions for designing genotyping-by-sequencing experiments, reduced representation sequencing, 3'UTR sequencing, and cleaved amplified polymorphic sequence (CAPS) molecular markers for large sample sets. CisSERS is a Java-based graphical user interface built around a Perl backbone. Several of the applications of CisSERS, including CAPS molecular marker development, were successfully validated using wet-lab experimentation. Here, we present the tool CisSERS and results from in silico and corresponding wet-lab analyses demonstrating that CisSERS is a technology platform solution that facilitates efficient data utilization in genomics and genetics studies.
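
    A toy "virtual digest" conveys the core idea of locating restriction sites and predicting the fragment pattern a gel would show; the enzyme table and sequence below are examples, not CisSERS output (which draws on the full REBASE database).

```python
# Toy virtual restriction digest: find cut sites and report predicted fragment lengths.
ENZYMES = {"EcoRI": "GAATTC"}   # recognition sites, 5'->3' (example entry)

def cut_positions(sequence, site):
    seq, site = sequence.upper(), site.upper()
    hits, start = [], 0
    while True:
        idx = seq.find(site, start)
        if idx == -1:
            return hits
        hits.append(idx)
        start = idx + 1

def fragment_lengths(sequence, site):
    cuts = cut_positions(sequence, site)
    edges = [0] + cuts + [len(sequence)]
    return [b - a for a, b in zip(edges, edges[1:]) if b > a]

record = "ATGAATTCGGCTTAGAATTCCGGATCCAAGAATTCTTGA"
for name, site in ENZYMES.items():
    print(name, "sites at", cut_positions(record, site),
          "fragments", fragment_lengths(record, site))
```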

  20. CisSERS: Customizable in silico sequence evaluation for restriction sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharpe, Richard M.; Koepke, Tyson; Harper, Artemus

    High-throughput sequencing continues to produce an immense volume of information that is processed and assembled into mature sequence data. Here, data analysis tools are urgently needed that leverage the embedded DNA sequence polymorphisms and consequent changes to restriction sites or sequence motifs in a high-throughput manner to enable biological experimentation. CisSERS was developed as a standalone open source tool to analyze sequence datasets and provide biologists with individual or comparative genome organization information in terms of presence and frequency of patterns or motifs such as restriction enzyme sites. Predicted agarose gel visualization of the custom analysis results was also integrated to enhance the usefulness of the software. CisSERS offers several novel functionalities, such as handling of large and multiple datasets in parallel, multiple restriction enzyme site detection and custom motif detection features, which are seamlessly integrated with real-time agarose gel visualization. Using a simple FASTA-formatted file as input, CisSERS utilizes the REBASE enzyme database. Results from CisSERS enable the user to make decisions for designing genotyping-by-sequencing experiments, reduced representation sequencing, 3'UTR sequencing, and cleaved amplified polymorphic sequence (CAPS) molecular markers for large sample sets. CisSERS is a Java-based graphical user interface built around a Perl backbone. Several of the applications of CisSERS, including CAPS molecular marker development, were successfully validated using wet-lab experimentation. Here, we present the tool CisSERS and results from in silico and corresponding wet-lab analyses demonstrating that CisSERS is a technology platform solution that facilitates efficient data utilization in genomics and genetics studies.

  1. An Automated, High-Throughput Method for Interpreting the Tandem Mass Spectra of Glycosaminoglycans

    NASA Astrophysics Data System (ADS)

    Duan, Jiana; Amster, I. Jonathan

    2018-05-01

    The biological interactions between glycosaminoglycans (GAGs) and other biomolecules are heavily influenced by structural features of the glycan. The structure of GAGs can be assigned using tandem mass spectrometry (MS2), but analysis of these data, to date, requires manual interpretation, a slow process that presents a bottleneck to the broader deployment of this approach to solving biologically relevant problems. Automated interpretation remains a challenge, as GAG biosynthesis is not template-driven, and therefore, one cannot predict structures from genomic data, as is done with proteins. The lack of a structure database, a consequence of the non-template biosynthesis, requires a de novo approach to interpretation of the mass spectral data. We propose a model for rapid, high-throughput GAG analysis by using an approach in which candidate structures are scored for the likelihood that they would produce the features observed in the mass spectrum. To make this approach tractable, a genetic algorithm is used to greatly reduce the search space of isomeric structures that are considered. The time required for analysis is significantly reduced compared to an approach in which every possible isomer is considered and scored. The model is coded in a software package using the MATLAB environment. This approach was tested on tandem mass spectrometry data for long-chain, moderately sulfated chondroitin sulfate oligomers that were derived from the proteoglycan bikunin. The bikunin data was previously interpreted manually. Our approach examines glycosidic fragments to localize SO3 modifications to specific residues and yields the same structures reported in the literature, only much more quickly.
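
    A bare-bones genetic algorithm over candidate structures, with a toy scoring function standing in for the comparison against observed glycosidic fragments, might look like the following sketch; it is written in Python for illustration rather than the MATLAB environment the authors used.

```python
# Toy genetic algorithm: candidate sulfation patterns (bit strings) are scored
# against a pretend "observed" pattern, and selection/crossover/mutation steer
# the search without enumerating every isomer.
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]   # pretend "true" sulfation pattern

def fitness(candidate):
    # Score = how many observed features the candidate would explain.
    return sum(c == t for c, t in zip(candidate, TARGET))

def crossover(a, b):
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(candidate, rate=0.05):
    return [bit ^ 1 if random.random() < rate else bit for bit in candidate]

def genetic_search(pop_size=30, generations=40):
    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = genetic_search()
print(best, "score", fitness(best))
```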

  2. Hardware in-the-Loop Demonstration of Real-Time Orbit Determination in High Earth Orbits

    NASA Technical Reports Server (NTRS)

    Moreau, Michael; Naasz, Bo; Leitner, Jesse; Carpenter, J. Russell; Gaylor, Dave

    2005-01-01

    This paper presents results from a study conducted at Goddard Space Flight Center (GSFC) to assess the real-time orbit determination accuracy of GPS-based navigation in a number of different high Earth orbital regimes. Measurements collected from a GPS receiver (connected to a GPS radio frequency (RF) signal simulator) were processed in a navigation filter in real-time, and resulting errors in the estimated states were assessed. For the most challenging orbit simulated, a 12 hour Molniya orbit with an apogee of approximately 39,000 km, mean total position and velocity errors were approximately 7 meters and 3 mm/s respectively. The study also makes direct comparisons between the results from the above hardware in-the-loop tests and results obtained by processing GPS measurements generated from software simulations. Care was taken to use the same models and assumptions in the generation of both the real-time and software simulated measurements, in order that the real-time data could be used to help validate the assumptions and models used in the software simulations. The study makes use of the unique capabilities of the Formation Flying Test Bed at GSFC, which provides a capability to interface with different GPS receivers and to produce real-time, filtered orbit solutions even when less than four satellites are visible. The result is a powerful tool for assessing onboard navigation performance in a wide range of orbital regimes, and a test-bed for developing software and procedures for use in real spacecraft applications.

  3. Automatic AVHRR image navigation software

    NASA Technical Reports Server (NTRS)

    Baldwin, Dan; Emery, William

    1992-01-01

    This is the final report describing the work done on the project entitled Automatic AVHRR Image Navigation Software, funded through NASA-Washington, award NAGW-3224, Account 153-7529. At the onset of this project, we had developed image navigation software capable of producing geo-registered images from AVHRR data. The registrations were highly accurate but required a priori knowledge of the deviations in the alignment of the spacecraft's axes, commonly known as attitude. The three angles needed to describe the attitude are called roll, pitch, and yaw, and are the components of the deviations in the along-scan, along-track, and about-center directions. The inclusion of the attitude corrections in the navigation software results in highly accurate georegistrations; however, the computation of the angles is very tedious and involves human interpretation for several steps. The technique also requires easily identifiable ground features, which may not be available due to cloud cover or for ocean data. The current project was motivated by the need for a navigation system which was automatic and did not require human intervention or ground control points. The first step in creating such a system must be the ability to parameterize the spacecraft's attitude. The immediate goal of this project was to study the attitude fluctuations and determine if they displayed any systematic behavior which could be modeled or parameterized. We chose a period in 1991-1992 to study the attitude of the NOAA 11 spacecraft using data from the Tiros receiving station at the Colorado Center for Astrodynamic Research (CCAR) at the University of Colorado.

  4. CARGO: effective format-free compressed storage of genomic information

    PubMed Central

    Roguski, Łukasz; Ribeca, Paolo

    2016-01-01

    The recent super-exponential growth in the amount of sequencing data generated worldwide has put techniques for compressed storage into focus. Most available solutions, however, are strictly tied to specific bioinformatics formats, sometimes inheriting from them suboptimal design choices; this hinders flexible and effective data sharing. Here, we present CARGO (Compressed ARchiving for GenOmics), a high-level framework to automatically generate software systems optimized for the compressed storage of arbitrary types of large genomic data collections. Straightforward applications of our approach to FASTQ and SAM archives require a few lines of code, produce solutions that match and sometimes outperform specialized format-tailored compressors, and scale well to multi-TB datasets. All CARGO software components can be freely downloaded for academic and non-commercial use from http://bio-cargo.sourceforge.net. PMID:27131376

  5. Toward the S3DVAR data assimilation software for the Caspian Sea

    NASA Astrophysics Data System (ADS)

    Arcucci, Rossella; Celestino, Simone; Toumi, Ralf; Laccetti, Giuliano

    2017-07-01

    Data Assimilation (DA) is an uncertainty quantification technique used to incorporate observed data into a prediction model in order to improve numerical forecast results. The forecasting model used for producing oceanographic predictions for the Caspian Sea is the Regional Ocean Modeling System (ROMS). Here we describe the computational issues we are facing in a DA software package we are developing (named S3DVAR), which implements a Scalable Three-Dimensional Variational Data Assimilation model for assimilating sea surface temperature (SST) values collected over the Caspian Sea, with observations provided by the Group for High Resolution Sea Surface Temperature (GHRSST). We present the algorithmic strategies we employ and the numerical results on data collected during the two months that present the most significant variability in water temperature: August and March.
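
    A three-dimensional variational scheme of this kind minimizes the standard background-plus-observation cost function J(x) = (x - xb)' B^-1 (x - xb) + (y - Hx)' R^-1 (y - Hx). The sketch below, with synthetic matrices and a generic optimizer, illustrates that formulation; it is not the S3DVAR code itself.

    ```python
    # Minimal 3D-Var sketch: minimize the background + observation cost function.
    # Matrices and sizes are synthetic placeholders, not the S3DVAR implementation.
    import numpy as np
    from scipy.optimize import minimize

    n, m = 50, 20                      # state size, number of SST observations
    rng = np.random.default_rng(0)
    xb = rng.normal(size=n)            # background (ROMS forecast) state
    B = np.eye(n) * 0.5                # background error covariance
    R = np.eye(m) * 0.1                # observation error covariance
    H = rng.normal(size=(m, n)) / n    # observation operator (interpolation to obs points)
    y = H @ (xb + rng.normal(size=n) * 0.3)   # synthetic GHRSST-like observations

    Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)

    def cost(x):
        db = x - xb
        do = y - H @ x
        return db @ Binv @ db + do @ Rinv @ do

    def grad(x):
        return 2 * Binv @ (x - xb) - 2 * H.T @ Rinv @ (y - H @ x)

    res = minimize(cost, xb, jac=grad, method="L-BFGS-B")
    print("cost at background:", cost(xb), " cost at analysis:", cost(res.x))
    ```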

  6. MrEnt: an editor for publication-quality phylogenetic tree illustrations.

    PubMed

    Zuccon, Alessandro; Zuccon, Dario

    2014-09-01

    We developed MrEnt, a Windows-based, user-friendly software package that allows the production of complex, high-resolution, publication-quality phylogenetic trees in a few steps, directly from the analysis output. The program recognizes the standard Nexus tree format and the annotated tree files produced by BEAST and MrBayes. MrEnt combines in a single application a large suite of tree manipulation functions (e.g. handling of multiple trees, tree rotation, character mapping, node collapsing, compression of large clades, handling of time scales and error bars for chronograms) with drawing tools typical of standard graphic editors, including handling of graphic elements and images. The tree illustration can be printed or exported in several standard formats suitable for journal publication, PowerPoint presentation or Web publication. © 2014 John Wiley & Sons Ltd.

  7. Using Ready-To Drone Images in Forestry Activities: Case Study of ÇINARPINAR in Kahramanmaras, Turkey

    NASA Astrophysics Data System (ADS)

    Gülci, S.; Akgül, M.; Akay, A. E.; Taş, İ.

    2017-11-01

    This short paper aims to present the pros and cons of the current usage of ready-to-use drone images in the field of forestry, also considering flight planning and photogrammetric processes. The capabilities of the DJI Phantom 4, a low-cost drone produced by the DJI company, were evaluated through sample flights in the Cinarpinar Forest Enterprise Chief in Kahramanmaras, Turkey. In addition, the photogrammetric workflow for the obtained images and the automated flights are presented with respect to the capabilities of available software. The flight plans were created by using the Pix4DCapture software with an Android-based cell phone. The results indicated that high-resolution imagery obtained by drone can provide significant data for the assessment of forest resources, forest roads, and stream channels.

  8. Virtual Environment Computer Simulations to Support Human Factors Engineering and Operations Analysis for the RLV Program

    NASA Technical Reports Server (NTRS)

    Lunsford, Myrtis Leigh

    1998-01-01

    The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing Human Engineering and operations analysis for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for use by the ANVIL for the evaluation of the X34 Engine Changeout procedure. These simulations were developed with the software tool dVISE 4.0.0 produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with the simulations, other summer activities, and possible work for the future. We first begin with a brief description of virtual reality systems.

  9. SpcAudace: Spectroscopic processing and analysis package of Audela software

    NASA Astrophysics Data System (ADS)

    Mauclaire, Benjamin

    2017-11-01

    SpcAudace processes long-slit spectra with automated pipelines and performs astrophysical analysis of the resulting data. These powerful pipelines carry out all the required steps in one pass: standard preprocessing, masking of bad pixels, geometric corrections, registration, optimized spectrum extraction, wavelength calibration, and instrumental response computation and correction. Both high- and low-resolution long-slit spectra are managed for stellar and non-stellar targets. Many types of publication-quality figures can be easily produced: pdf and png plots or annotated time series plots. Astrophysical quantities can be derived from individual spectra or large numbers of spectra with advanced functions: from line profile characteristics to equivalent widths and periodograms. More than 300 documented functions are available and can be used in TCL scripts for automation. SpcAudace is based on the Audela open source software.
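
    As a small illustration of the kind of astrophysical quantity such pipelines derive, the sketch below computes an equivalent width from a continuum-normalized line profile. The wavelength grid and line shape are synthetic, and SpcAudace itself is implemented in TCL, not Python.

    ```python
    # Illustrative equivalent-width calculation on a continuum-normalized spectrum;
    # synthetic data only, not SpcAudace itself.
    import numpy as np

    def equivalent_width(wavelength, flux_norm):
        """EW = integral of (1 - F/Fc) d(lambda), for flux already normalized
        to the continuum (Fc = 1); trapezoidal integration."""
        depth = 1.0 - flux_norm
        dl = np.diff(wavelength)
        return np.sum(0.5 * (depth[:-1] + depth[1:]) * dl)

    # Synthetic Gaussian absorption line on a flat continuum
    wl = np.linspace(6550.0, 6570.0, 400)            # Angstroms
    line = 1.0 - 0.6 * np.exp(-0.5 * ((wl - 6562.8) / 1.5) ** 2)
    print(f"EW ~ {equivalent_width(wl, line):.2f} A")
    ```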

  10. Coalescent Inference Using Serially Sampled, High-Throughput Sequencing Data from Intrahost HIV Infection

    PubMed Central

    Dialdestoro, Kevin; Sibbesen, Jonas Andreas; Maretty, Lasse; Raghwani, Jayna; Gall, Astrid; Kellam, Paul; Pybus, Oliver G.; Hein, Jotun; Jenkins, Paul A.

    2016-01-01

    Human immunodeficiency virus (HIV) is a rapidly evolving pathogen that causes chronic infections, so genetic diversity within a single infection can be very high. High-throughput “deep” sequencing can now measure this diversity in unprecedented detail, particularly since it can be performed at different time points during an infection, and this offers a potentially powerful way to infer the evolutionary dynamics of the intrahost viral population. However, population genomic inference from HIV sequence data is challenging because of high rates of mutation and recombination, rapid demographic changes, and ongoing selective pressures. In this article we develop a new method for inference using HIV deep sequencing data, using an approach based on importance sampling of ancestral recombination graphs under a multilocus coalescent model. The approach further extends recent progress in the approximation of so-called conditional sampling distributions, a quantity of key interest when approximating coalescent likelihoods. The chief novelties of our method are that it is able to infer rates of recombination and mutation, as well as the effective population size, while handling sampling over different time points and missing data without extra computational difficulty. We apply our method to a data set of HIV-1, in which several hundred sequences were obtained from an infected individual at seven time points over 2 years. We find mutation rate and effective population size estimates to be comparable to those produced by the software BEAST. Additionally, our method is able to produce local recombination rate estimates. The software underlying our method, Coalescenator, is freely available. PMID:26857628

  11. Analysis on Operating Parameter Design to Steam Methane Reforming in Heat Application RDE

    NASA Astrophysics Data System (ADS)

    Dibyo, Sukmanto; Sunaryo, Geni Rina; Bakhri, Syaiful; Zuhair; Irianto, Ign. Djoko

    2018-02-01

    The high temperature reactor has been developed with various power capacities and can produce both electricity and heat for process applications. One such heat application is hydrogen production. Most hydrogen production occurs by steam reforming, which is operated at high temperature. This study aims to analyze the feasibility of the heat application design of the RDE reactor for steam methane reforming for hydrogen production, using the ChemCAD software. The outlet temperature of the cogeneration heat exchanger is analyzed for use as the feed of the steam reformer. Furthermore, the additional heater and the calculation of the amount of fuel used are described. Results show that at a low mass flow rate of feed, the heat exchanger can produce a temperature of up to 480°C. To achieve the steam methane reforming temperature of 850°C, an additional fired heater is required. With the fired heater, the amount of fuel required depends on the reformer feed temperature produced by the heat exchanger of the cogeneration system.

  12. Internet Usage In The Fresh Produce Supply Chainin China

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoxiao; Duan, Yanqing; Fu, Zetian; Liu, Xue

    Although effective implementation of Internet technologies has great potential for improving efficiency and reducing wastage within the fresh produce supply chain, the situation of Internet usage by SMEs (small and medium sized enterprises) in the fresh produce supply chain is still unclear in China. As the main players, SMEs haven't been given enough attention by either academics or governments. Therefore, this research attempts to address this issue by, first, investigating the current usage of the Internet and related software by Chinese SMEs in the fresh produce supply chain, and then by identifying enablers and barriers faced by SMEs, in order to draw the government's attention to them. As part of an EU-Asia IT&C funded project, a survey was carried out with SMEs in this industry from five major cities in China. The results reveal that in the relatively developed areas of China, SMEs in the fresh produce supply chain are rapidly adopting the Internet and software packages, but the level of adoption varies greatly and there is a significant lack of integration among supply chain partners. Chinese SMEs are keen to embrace emerging technologies and have acted to adopt new software and tools. Given that the cost of implementation is not a barrier, their concerns over legal protection and online security must be addressed for further development.

  13. Modeling and Grid Generation of Iced Airfoils

    NASA Technical Reports Server (NTRS)

    Vickerman, Mary B.; Baez, Marivell; Braun, Donald C.; Hackenberg, Anthony W.; Pennline, James A.; Schilling, Herbert W.

    2007-01-01

    SmaggIce Version 2.0 is a software toolkit for geometric modeling and grid generation for two-dimensional, single- and multi-element, clean and iced airfoils. A previous version of SmaggIce was described in Preparing and Analyzing Iced Airfoils, NASA Tech Briefs, Vol. 28, No. 8 (August 2004), page 32. To recapitulate: Ice shapes make it difficult to generate quality grids around airfoils, yet these grids are essential for predicting ice-induced complex flow. This software efficiently creates high-quality structured grids with tools that are uniquely tailored for various ice shapes. SmaggIce Version 2.0 significantly enhances the previous version primarily by adding the capability to generate grids for multi-element airfoils. This version of the software is an important step in streamlining the aeronautical analysis of iced airfoils using computational fluid dynamics (CFD) tools. The user may prepare the ice shape, define the flow domain, decompose it into blocks, generate grids, modify/divide/merge blocks, and control grid density and smoothness. All these steps may be performed efficiently even for the difficult glaze and rime ice shapes. Providing the means to generate highly controlled grids near rough ice, the software includes the creation of a wrap-around block (called the "viscous sublayer block"), which is a thin, C-type block around the wake line and iced airfoil. For multi-element airfoils, the software makes use of grids that wrap around and fill in the areas between the viscous sublayer blocks for all elements that make up the airfoil. A scripting feature records the history of interactive steps, which can be edited and replayed later to produce other grids. Using this version of SmaggIce, ice shape handling and grid generation can become a practical engineering process, rather than a laborious research effort.

  14. A planning language for activity scheduling

    NASA Technical Reports Server (NTRS)

    Zoch, David R.; Lavallee, David; Weinstein, Stuart; Tong, G. Michael

    1991-01-01

    Mission planning and scheduling of spacecraft operations are becoming more complex at NASA. Described here are a mission planning process; a robust, flexible planning language for spacecraft and payload operations; and a software scheduling system that generates schedules based on planning language inputs. The mission planning process often involves many people and organizations. Consequently, a planning language is needed to facilitate communication, to provide a standard interface, and to represent flexible requirements. The software scheduling system interprets the planning language and uses the resource, time duration, constraint, and alternative plan flexibilities to resolve scheduling conflicts.

  15. Bureaucracy, Safety and Software: a Potentially Lethal Cocktail

    NASA Astrophysics Data System (ADS)

    Hatton, Les

    This position paper identifies a potential problem with the evolution of software controlled safety critical systems. It observes that the rapid growth of bureaucracy in society quickly spills over into rules for behaviour. Whether the need for the rules comes first or there is simple anticipation of the need for a rule by a bureaucrat is unclear in many cases. Many such rules lead to draconian restrictions and often make the existing situation worse due to the presence of unintended consequences as will be shown with a number of examples.

  16. Steamer II: Steamer prototype component inventory and user interface commands. Technical report, 1988-1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickieson, J.L.; Thode, W.F.; Newbury, K.

    1988-12-01

    Over the last several years, Navy Personnel Research and Development has produced a prototype simulation of a 1200-psi steam plant. This simulation, called Steamer, is installed on an expensive Symbolics minicomputer at the Surface Warfare Officers School, Pacific Coronado, California. The fundamental research goal of the Steamer prototype system was to evaluate the potential of what was then new artificial intelligence (AI) hardware and software technology for supporting the construction of computer-based training systems using graphic representations of complex, dynamic systems. The area of propulsion engineering was chosen for a number of reasons. This document describes the Steamer prototype system components and user interface commands and establishes a starting point for designing, developing, and implementing Steamer II. Careful examination of the actual program code produced an inventory that describes the hardware, system software, application software, and documentation for the Steamer prototype system. Exercising all menu options systematically produced an inventory of all Steamer prototype user interface commands.

  17. GP Workbench Manual: Technical Manual, User's Guide, and Software Guide

    USGS Publications Warehouse

    Oden, Charles P.; Moulton, Craig W.

    2006-01-01

    GP Workbench is an open-source general-purpose geophysical data processing software package written primarily for ground penetrating radar (GPR) data. It also includes support for several USGS prototype electromagnetic instruments such as the VETEM and ALLTEM. The two main programs in the package are GP Workbench and GP Wave Utilities. GP Workbench has routines for filtering, gridding, and migrating GPR data, as well as an inversion routine for characterizing UXO (unexploded ordnance) using ALLTEM data. GP Workbench provides two-dimensional (section view) and three-dimensional (plan view or time slice view) processing for GPR data. GP Workbench can produce high-quality graphics for reports when Surfer 8 or higher (Golden Software) is installed. GP Wave Utilities provides a wide range of processing algorithms for single waveforms, such as filtering, correlation, deconvolution, and calculating GPR waveforms. GP Wave Utilities is used primarily for calibrating radar systems and processing individual traces. Both programs also contain research features related to the calibration of GPR systems and calculating subsurface waveforms. The software is written to run on the Windows operating systems. GP Workbench can import GPR data file formats used by major commercial instrument manufacturers including Sensors and Software, GSSI, and Mala. The GP Workbench native file format is SU (Seismic Unix), and consequently, files generated by GP Workbench can be read by Seismic Unix as well as many other data processing packages.

  18. Institute Born of Gratitude.

    ERIC Educational Resources Information Center

    McLellan, Vin

    1980-01-01

    The Wang Institute of Graduate Studies plans to offer a master's degree in software engineering. The development of an academic program to produce superior, technically qualified managers for the computer industry's software production is discussed. (Journal availability: Datamation, 666 Fifth Ave., New York, NY 10103.) (MLW)

  19. Collected Software Engineering Papers, Volume 10

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from Oct. 1991 - Nov. 1992. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document. For the convenience of this presentation, the 11 papers contained here are grouped into 5 major sections: (1) the Software Engineering Laboratory; (2) software tools studies; (3) software models studies; (4) software measurement studies; and (5) Ada technology studies.

  20. Surgical training programs in Pakistan.

    PubMed

    Talati, Jamsheer J; Syed, Nadir Ali

    2008-10-01

    This paper traces the history and describes the status of surgical training in Pakistan. A key revelation is that excellent surgeons are produced through systems which on formal review might appear to lack standards. Personal characteristics of residents modify outcomes in high-volume surgical training units, and the consequent variation in the quality of outputs is noted. Attention needs to be given to (i) developing new educational systems which are not prolonged, costly and cumbersome, and which produce an adequate number, type and spread of highly skilled and cognitively developed empathic surgeons for the country; (ii) improving the health systems which currently impede the development of surgeons; and (iii) finding novel ways of tackling rural-urban disparities in health delivery.

  1. Psychosocial risks, burnout and intention to quit following the introduction of new software at work.

    PubMed

    Knani, Mouna; Fournier, Pierre-Sébastien; Biron, Caroline

    2018-05-01

    Despite a rich literature on the association between psychosocial factors, the demand-control-support (DCS) model and burnout, there are few integrated frameworks encompassing the DCS model, burnout and intention to quit, particularly in a technological context. This manuscript examines the relationships between psychosocial risks as captured by the demand-control-support (DCS) model, burnout syndrome and intention to quit following the introduction of new software at work. Data were collected from agents and advisors working at a Canadian university and using new study management software. An online questionnaire was sent via the university's internal mail. In total, 112 people completed the online survey, for a response rate of 60.9%. The results of structural equation modeling show that psychological demands, decision latitude and social support are associated with burnout. It is also clear that burnout, in particular depersonalization and emotional exhaustion, is positively associated with intention to quit. The few studies that raise the negative consequences of technology on quality of life in the workplace, and particularly on health, have not succeeded in establishing a direct link between a deterioration in health and the use of technology. This is due to the fact that there are few epidemiological studies on the direct health consequences of the use of ICT.

  2. SketchBio: a scientist's 3D interface for molecular modeling and animation.

    PubMed

    Waldon, Shawn M; Thompson, Peter M; Hahn, Patrick J; Taylor, Russell M

    2014-10-30

    Because of the difficulties involved in learning and using 3D modeling and rendering software, many scientists hire programmers or animators to create models and animations. This both slows the discovery process and provides opportunities for miscommunication. Working with multiple collaborators, a tool was developed (based on a set of design goals) to enable them to directly construct models and animations. SketchBio is presented, a tool that incorporates state-of-the-art bimanual interaction and drop shadows to enable rapid construction of molecular structures and animations. It includes three novel features: crystal-by-example, pose-mode physics, and spring-based layout that accelerate operations common in the formation of molecular models. Design decisions and their consequences are presented, including cases where iterative design was required to produce effective approaches. The design decisions, novel features, and inclusion of state-of-the-art techniques enabled SketchBio to meet all of its design goals. These features and decisions can be incorporated into existing and new tools to improve their effectiveness.

  3. The Organization of Catholic Secondary Schools: A Preliminary Statement on Catholic School Governance.

    ERIC Educational Resources Information Center

    Maltby, Gregory P.

    This paper explores whether or not an effort to centralize the administration of several metropolitan Catholic high schools for the purpose of economy of resources would produce an unanticipated consequence that would severely offset any savings. The research suggests that Catholic parents perceive a difference among these schools in terms of…

  4. Soil enzyme activities during the 2011 Texas record drought/heat wave and implications to biogeochemical cycling and organic matter dynamics

    USDA-ARS?s Scientific Manuscript database

    Extreme weather events such as severe droughts and heat waves may have permanent consequences on soil quality and functioning in agroecosystems. The Southern High Plains (SHP) region of Texas, U.S., a large cotton producing area, experienced a historically extreme drought and heat wave during 2011,...

  5. Warfighting Concepts to Future Weapon System Designs (WARCON)

    DTIC Science & Technology

    2003-09-12

    Software design documents may give rise to litigation. Supporting deliverables may include a material list, cost information, and final engineering process maps; the design document may include the design of the system as derived from the engineering design, the software development effort, and the SRD. It is important to establish a standard, formal design document early in the development phase, as software engineers produce the vision of the design effort.

  6. Test Driven Development of Scientific Models

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.

    2012-01-01

    Test-Driven Development (TDD) is a software development process that promises many advantages for developer productivity and has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addicting after only a few days of exposure, and find it unthinkable to return to previous practices. Of course, scientific/technical software differs from other software categories in a number of important respects, but I nonetheless believe that TDD is quite applicable to the development of such software and has the potential to significantly improve programmer productivity and code quality within the scientific community. After a detailed introduction to TDD, I will present the experience within the Software Systems Support Office (SSSO) in applying the technique to various scientific applications. This discussion will emphasize the various direct and indirect benefits as well as some of the difficulties and limitations of the methodology. I will conclude with a brief description of pFUnit, a unit testing framework I co-developed to support test-driven development of parallel Fortran applications.
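
    The red-green-refactor cycle described above can be illustrated with Python's built-in unittest module standing in for a unit-testing framework (pFUnit plays this role for parallel Fortran). The physical formula, function names, and tolerances below are illustrative assumptions, not code from the SSSO applications.

    ```python
    # Illustration of the TDD cycle using Python's built-in unittest as a stand-in
    # for a unit-testing framework such as pFUnit.
    import unittest

    # Step 1 (red): write the test first, describing the behavior we want.
    class TestSaturationVaporPressure(unittest.TestCase):
        def test_increases_with_temperature(self):
            self.assertLess(saturation_vapor_pressure(273.15),
                            saturation_vapor_pressure(300.0))

        def test_reference_value_at_freezing(self):
            # ~611 Pa at 0 degC (Tetens-type approximation)
            self.assertAlmostEqual(saturation_vapor_pressure(273.15), 611.0, delta=10.0)

    # Step 2 (green): write the simplest code that makes the tests pass.
    def saturation_vapor_pressure(T_kelvin):
        """Tetens approximation, in Pa, for temperature in K (illustrative model)."""
        t_c = T_kelvin - 273.15
        return 610.78 * 10 ** (7.5 * t_c / (t_c + 237.3))

    # Step 3 (refactor): improve the implementation while keeping the tests green.
    if __name__ == "__main__":
        unittest.main()
    ```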

  7. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  8. Software Engineering Research/Developer Collaborations in 2005

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom

    2006-01-01

    In CY 2005, three collaborations between software engineering technology providers and NASA software development personnel deployed three software engineering technologies on NASA development projects (a different technology on each project). The main purposes were to benefit the projects, infuse the technologies if beneficial into NASA, and give feedback to the technology providers to improve the technologies. Each collaboration project produced a final report. Section 2 of this report summarizes each project, drawing from the final reports and communications with the software developers and technology providers. Section 3 indicates paths to further infusion of the technologies into NASA practice. Section 4 summarizes some technology transfer lessons learned. Also included is an acronym list.

  9. A Software Engineering Approach based on WebML and BPMN to the Mediation Scenario of the SWS Challenge

    NASA Astrophysics Data System (ADS)

    Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina

    Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).

  10. Software testing

    NASA Astrophysics Data System (ADS)

    Price-Whelan, Adrian M.

    2016-01-01

    Now more than ever, scientific results are dependent on sophisticated software and analysis. Why should we trust code written by others? How do you ensure your own code produces sensible results? How do you make sure it continues to do so as you update, modify, and add functionality? Software testing is an integral part of code validation and writing tests should be a requirement for any software project. I will talk about Python-based tools that make managing and running tests much easier and explore some statistics for projects hosted on GitHub that contain tests.
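
    As a concrete example of the kind of Python tooling referred to above, the sketch below uses pytest to parametrize a test over several expected values; the flux-conversion helper is a hypothetical function introduced only for illustration.

    ```python
    # A short pytest-style example; the function under test is a hypothetical
    # AB-magnitude-to-flux helper, not code from any particular project.
    import math
    import pytest

    def magnitude_to_flux(mag, zero_point_flux=3631.0):
        """Convert an AB magnitude to flux in Jy (illustrative helper)."""
        return zero_point_flux * 10 ** (-0.4 * mag)

    @pytest.mark.parametrize("mag, expected", [
        (0.0, 3631.0),
        (2.5, 363.1),
        (5.0, 36.31),
    ])
    def test_magnitude_to_flux(mag, expected):
        assert math.isclose(magnitude_to_flux(mag), expected, rel_tol=1e-6)

    def test_flux_is_positive():
        assert magnitude_to_flux(30.0) > 0.0
    ```

    Running `pytest` in the directory containing such a file discovers and executes every test automatically, which is the kind of low-overhead workflow the talk advocates.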

  11. CARDS: A blueprint and environment for domain-specific software reuse

    NASA Technical Reports Server (NTRS)

    Wallnau, Kurt C.; Solderitsch, Anne Costa; Smotherman, Catherine

    1992-01-01

    CARDS (Central Archive for Reusable Defense Software) exploits advances in domain analysis and domain modeling to identify, specify, develop, archive, retrieve, understand, and reuse domain-specific software components. An important element of CARDS is to provide visibility into the domain model artifacts produced by, and services provided by, commercial computer-aided software engineering (CASE) technology. The use of commercial CASE technology is important to provide rich, robust support for the varied roles involved in a reuse process. We refer to this kind of use of knowledge representation systems as supporting 'knowledge-based integration.'

  12. SVM classifier on chip for melanoma detection.

    PubMed

    Afifi, Shereen; GholamHosseini, Hamid; Sinha, Roopak

    2017-07-01

    Support Vector Machine (SVM) is a common classifier used for efficient classification with high accuracy. SVM shows high accuracy for classifying melanoma (skin cancer) clinical images within computer-aided diagnosis systems used by skin cancer specialists to detect melanoma early and save lives. We aim to develop a medical low-cost handheld device that runs a real-time embedded SVM-based diagnosis system for use in primary care for early detection of melanoma. In this paper, an optimized SVM classifier is implemented on a recent FPGA platform using the latest design methodology, to be embedded into the proposed device for realizing online efficient melanoma detection on a single system on chip/device. The hardware implementation results demonstrate a high classification accuracy of 97.9% and a significant acceleration factor of 26 over an equivalent software implementation on an embedded processor, with 34% resource utilization and 2 W power consumption. Consequently, the implemented system meets the crucial embedded-system constraints of high performance and low cost, resource utilization, and power consumption, while achieving high classification accuracy.
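
    For readers unfamiliar with the classifier itself, the sketch below trains a linear SVM on synthetic two-class feature vectors with scikit-learn. It stands in for the FPGA design conceptually only; the feature dimensions, class separation, and accuracy obtained are made-up illustrative values.

    ```python
    # Sketch of an SVM classifier of the kind accelerated on the FPGA;
    # scikit-learn on synthetic feature vectors stands in for the embedded design.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(42)
    # Hypothetical per-lesion feature vectors (e.g. color/texture descriptors)
    X_benign = rng.normal(loc=0.0, scale=1.0, size=(200, 16))
    X_malign = rng.normal(loc=1.2, scale=1.0, size=(200, 16))
    X = np.vstack([X_benign, X_malign])
    y = np.array([0] * 200 + [1] * 200)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = SVC(kernel="linear", C=1.0).fit(X_tr, y_tr)
    print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
    ```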

  13. Inflight Performance of Cassini Reaction Wheel Bearing Drag in 1997-2013

    NASA Technical Reports Server (NTRS)

    Lee, Allan Y.; Wang, Eric K.

    2013-01-01

    As the first spacecraft to achieve orbit at Saturn in 2004, Cassini has collected science data throughout its four-year prime mission (2004-08), and has since been approved for first and second extended missions through September 2017. Cassini is a three-axis stabilized spacecraft. It uses reaction wheels to achieve the high level of spacecraft pointing stability that is needed during imaging operations of several science instruments. The Cassini flight software makes in-flight estimates of reaction wheel bearing drag torque and makes them available to the mission operations team. These telemetry data are being trended for the purpose of monitoring the long-term health of the reaction wheel bearings. Anomalous drag torque signatures observed over the past 15 years are described in this paper. One of these anomalous drag conditions is bearing cage instability, which appeared (and disappeared) spontaneously and unpredictably. Cage instability is an uncontrolled vibratory motion of the bearing cage that can produce high-impact forces internal to the bearing that will cause intermittent and erratic torque transients. Characteristics of the observed cage instabilities and other drag torque "spikes" are described in this paper. In day-to-day operations, the reaction wheels' rates must be neither too high nor too low. To protect against operating the wheels in any undesirable conditions (such as prolonged low spin rate operations), a ground software tool named Reaction Wheel Bias Optimization Tool (RBOT) was developed for the management of the wheels. Disciplined and long-term use of this ground software has led to a significant reduction in the daily accumulation of the wheels' low spin rate dwell time. Flight experience on the use of this ground software tool as well as other lessons learned on the management of Cassini reaction wheels is given in this paper.

  14. NASA Tech Briefs, September 2005

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Topics include: Diamond-Coated Carbon Nanotubes for Efficient Field Emission; Improved Anode Coatings for Direct Methanol Fuel Cells; Advanced Ablative Insulators and Methods of Making Them; PETIs as High-Temperature Resin-Transfer-Molding Materials; Stable Polyimides for Terrestrial and Space Uses; Low-Density, Aerogel-Filled Thermal-Insulation Tiles; High-Performance Polymers Having Low Melt Viscosities; Nonflammable, Hydrophobic Aerogel Composites for Insulation; Front-Side Microstrip Line Feeding a Raised Antenna Patch; Medium-Frequency Pseudonoise Georadar; Facilitating Navigation Through Large Archives; Program for Weibull Analysis of Fatigue Data; Comprehensive Micromechanics-Analysis Code - Version 4.0; Component-Based Visualization System; Software for Engineering Simulations of a Spacecraft; LabVIEW Interface for PCI-SpaceWire Interface Card; Path Following with Slip Compensation for a Mars Rover; International Space Station Electric Power System Performance Code-SPACE; Software for Automation of Real-Time Agents, Version 2; Software for Optimizing Plans Involving Interdependent Goals; Computing Gravitational Fields of Finite-Sized Bodies; Custom Sky-Image Mosaics from NASA's Information Power Grid; ANTLR Tree Grammar Generator and Extensions; Generic Kalman Filter Software; Alignment Stage for a Cryogenic Dilatometer; Rugged Iris Mechanism; Treatments To Produce Stabilized Aluminum Mirrors for Cryogenic Uses; Making AlNx Tunnel Barriers Using a Low-Energy Nitrogen-Ion Beam; Making Wide-IF SIS Mixers with Suspended Metal-Beam Leads; Sol-Gel Glass Holographic Light-Shaping Diffusers; Automated Counting of Particles To Quantify Cleanliness; Phase Correction for GPS Antenna with Nonunique Phase Center; Compact Infrasonic Windscreen; Broadband External-Cavity Diode Laser; High-Efficiency Solar Cells Using Photonic-Bandgap Materials; Generating Solid Models from Topographical Data; Computationally Lightweight Air-Traffic-Control Simulation; Spool Valve for Switching Air Flows Between Two Beds; Partial Model of Insulator/Insulator Contact Charging; Asymmetric Electrostatic Radiation Shielding for Spacecraft; and Reusable Hybrid Propellant Modules for Outer-Space Transport.

  15. Injection molding lens metrology using software configurable optical test system

    NASA Astrophysics Data System (ADS)

    Zhan, Cheng; Cheng, Dewen; Wang, Shanshan; Wang, Yongtian

    2016-10-01

    Optical plastic lenses produced by injection molding machines possess numerous advantages: light weight, impact resistance, low cost, etc. The measuring methods in the optical shop are mainly interferometry and profilometry. However, these instruments are not only expensive but also difficult to align. The software configurable optical test system (SCOTS) is based on the geometry of fringe reflection and the phase measuring deflectometry (PMD) method, and can be used to measure large-diameter mirrors, aspheric surfaces, and freeform surfaces rapidly, robustly, and accurately. In addition to the conventional phase-shifting method, we propose another data collection method, referred to as dots matrix projection. We also use Zernike polynomials to correct the camera distortion. This polynomial-fitting distortion-mapping method is not only simple to operate but also highly precise. We simulate this test system measuring a concave surface using CODE V and MATLAB. The simulation results show that the dots matrix projection method has high accuracy and that SCOTS has important significance for on-line detection in the optical shop.

  16. Approaches to 3D printing teeth from X-ray microtomography.

    PubMed

    Cresswell-Boyes, A J; Barber, A H; Mills, D; Tatla, A; Davis, G R

    2018-06-28

    Artificial teeth have several advantages in preclinical training. The aim of this study is to three-dimensionally (3D) print accurate artificial teeth using scans from X-ray microtomography (XMT). Extracted and artificial teeth were imaged at 90 kV and 40 kV, respectively, to create detailed high contrast scans. The dataset was visualised to produce internal and external meshes subsequently exported to 3D modelling software for modification before finally sending to a slicing program for printing. After appropriate parameter setting, the printer deposited material in specific locations layer by layer, to create a 3D physical model. Scans were manipulated to ensure a clean model was imported into the slicing software, where layer height replicated the high spatial resolution that was observed in the XMT scans. The model was then printed in two different materials (polylactic acid and thermoplastic elastomer). A multimaterial print was created to show the different physical characteristics between enamel and dentine. © 2018 The Authors Journal of Microscopy © 2018 Royal Microscopical Society.

  17. Proposed US Contributions to LOFT

    NASA Technical Reports Server (NTRS)

    Wilson-Hodge, Colleen

    2013-01-01

    Proposed US enhancements include a tantalum X-ray collimator, an additional ground station, Large Observatory for X-Ray Timing (LOFT) instrument team participation, a US science support center and data archive, and science enabled by US hardware. Tantalum is a high-Z material with excellent stopping power; the collimator is fabricated using a combination of laser micromachining and chemical etching, a known technology capable of producing high-aspect-ratio holes and large open fractions, and it reduces the LOFT LAD background by a factor of 3. Telemetry formats for LOFT are based upon RXTE/EDS experience, and ground system software and strategies for the WFM are based upon the RXTE/ASM automated pipeline software, with MSFC engineering trade studies supporting the Ta collimator and burst alert triggers based upon Fermi/GBM and HETE-2. Science enhancements enabled by US hardware include: the tantalum collimator, which reduces background by a factor of 3, improves sensitivity to faint sources such as AGN, and eliminates contamination by bright/variable sources outside the LAD field of view; and the US ground station, which enables continuous telemetry of all events from the WFM and allows the LAD to observe very bright (>500 mCrab) sources with full event resolution.

  18. Spatiotemporal matrix image formation for programmable ultrasound scanners

    NASA Astrophysics Data System (ADS)

    Berthon, Beatrice; Morichau-Beauchant, Pierre; Porée, Jonathan; Garofalakis, Anikitos; Tavitian, Bertrand; Tanter, Mickael; Provost, Jean

    2018-02-01

    As programmable ultrasound scanners become more common in research laboratories, it is increasingly important to develop robust software-based image formation algorithms that can be obtained in a straightforward fashion for different types of probes and sequences with a small risk of error during implementation. In this work, we argue that as the computational power keeps increasing, it is becoming practical to directly implement an approximation to the matrix operator linking reflector point targets to the corresponding radiofrequency signals via thoroughly validated and widely available simulations software. Once such a spatiotemporal forward-problem matrix is constructed, standard and thus highly optimized inversion procedures can be leveraged to achieve very high quality images in real time. Specifically, we show that spatiotemporal matrix image formation produces images of similar or enhanced quality when compared against standard delay-and-sum approaches in phantoms and in vivo, and show that this approach can be used to form images even when using non-conventional probe designs for which adapted image formation algorithms are not readily available.
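
    The idea of building and inverting a spatiotemporal forward matrix can be illustrated at toy scale. In the sketch below, the array geometry, sampling rate, and plane-wave delay model are placeholder assumptions, and a generic least-squares solver stands in for the optimized inversion procedures mentioned above.

    ```python
    # Toy illustration of matrix image formation: build a forward matrix linking
    # point reflectors to delayed element signals, then invert it with a standard
    # least-squares solver. Geometry and sampling values are placeholders.
    import numpy as np

    c = 1540.0                      # speed of sound, m/s
    fs = 20e6                       # sampling frequency, Hz
    n_samples, n_elements = 512, 8
    elem_x = np.linspace(-5e-3, 5e-3, n_elements)        # element x-positions, m
    pix_x, pix_z = np.meshgrid(np.linspace(-4e-3, 4e-3, 20),
                               np.linspace(5e-3, 15e-3, 20))
    pixels = np.column_stack([pix_x.ravel(), pix_z.ravel()])

    # Forward matrix M: one column per pixel, one row per (element, sample) pair.
    M = np.zeros((n_elements * n_samples, pixels.shape[0]))
    for j, (px, pz) in enumerate(pixels):
        for e, ex in enumerate(elem_x):
            delay = (pz + np.hypot(px - ex, pz)) / c      # plane-wave tx + rx path
            s = int(round(delay * fs))
            if s < n_samples:
                M[e * n_samples + s, j] = 1.0

    # Synthetic data from two point targets, then image formation by least squares.
    x_true = np.zeros(pixels.shape[0])
    x_true[[50, 250]] = 1.0
    rf = M @ x_true + 0.01 * np.random.randn(M.shape[0])
    image, *_ = np.linalg.lstsq(M, rf, rcond=None)
    print("brightest pixels:", np.argsort(image)[-2:])
    ```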

  19. Digital 3D Microstructure Analysis of Concrete using X-Ray Micro Computed Tomography SkyScan 1173: A Preliminary Study

    NASA Astrophysics Data System (ADS)

    Latief, F. D. E.; Mohammad, I. H.; Rarasati, A. D.

    2017-11-01

    Digital imaging of a concrete sample using high-resolution tomographic imaging by means of X-ray micro computed tomography (μ-CT) has been conducted to assess the characteristics of the sample's structure. The standard procedure of image acquisition, reconstruction, and image processing using a particular scanning device, i.e., the Bruker SkyScan 1173 High Energy Micro-CT, is elaborated. Qualitative and quantitative analyses were briefly performed on the sample to convey some basic ideas of the capability of the system and the bundled software package. Calculations of total VOI volume, object volume, percent object volume, total VOI surface, object surface, object surface/volume ratio, object surface density, structure thickness, structure separation, and total porosity were conducted and analysed. This paper serves as a brief description of how the device can produce the preferred image quality as well as of the ability of the bundled software packages to help in performing qualitative and quantitative analysis.

  20. LEGOS: Object-based software components for mission-critical systems. Final report, June 1, 1995--December 31, 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-08-01

    An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors' inevitable adoption of emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, a standard Computer-Aided Software Engineering (CASE) tool notation, and a standard CASE tool repository has limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations, as well as to assemble new tools on demand from existing tools and architecture design repositories.

  1. Swine production.

    PubMed

    Plain, Ronald L; Lawrence, John D

    2003-07-01

    The US swine industry is large and growing. The quantity of pork desired by consumers of US pork is growing at the rate of 1.5%/y. New production systems and new technology have enabled production per sow to grow at a rate of 4% annually in recent years. Consequently, the number of sows in the United States is declining. Because productivity growth is outpacing demand growth, the deflated price of hogs and pork is declining. Hog production and prices continue to exhibit strong seasonal and cyclic patterns. Pork production is usually lowest in the summer and highest in the fall. Production and prices tend to follow 4-year patterns. The US swine industry continues to evolve toward fewer and larger producers who rely on contracts for both hog production and marketing. In 2000, over half of the hogs marketed were from approximately 156 firms marketing more than 50,000 head annually. These producers finished 60% of their production in contract facilities. Over 90% of their marketings were under contract or were owned by a packer. These producers expressed a high level of satisfaction with hog production. Both they and their contract growers were satisfied with production contracts. These large producers were satisfied with their marketing contracts and planned to continue them in the future. The hog industry has changed a great deal in the last decade. There is little reason to believe this rapid rate of change will not continue. This swine industry is highly competitive and profit driven. Profit margins are too small to allow producers the luxury of ignoring new technology and innovative production systems. Consequently, hog production will continue its rapid evolution from traditional agriculture to typical industry.

  2. Applications of Low Altitude Remote Sensing in Agriculture upon Farmers' Requests– A Case Study in Northeastern Ontario, Canada

    PubMed Central

    Zhang, Chunhua; Walters, Dan; Kovacs, John M.

    2014-01-01

    With the growth of the low altitude remote sensing (LARS) industry in recent years, their practical application in precision agriculture seems all the more possible. However, only a few scientists have reported using LARS to monitor crop conditions. Moreover, there have been concerns regarding the feasibility of such systems for producers given the issues related to the post-processing of images, technical expertise, and timely delivery of information. The purpose of this study is to showcase actual requests by farmers to monitor crop conditions in their fields using an unmanned aerial vehicle (UAV). Working in collaboration with farmers in northeastern Ontario, we use optical and near-infrared imagery to monitor fertilizer trials, conduct crop scouting and map field tile drainage. We demonstrate that LARS imagery has many practical applications. However, several obstacles remain, including the costs associated with both the LARS system and the image processing software, the extent of professional training required to operate the LARS and to process the imagery, and the influence from local weather conditions (e.g. clouds, wind) on image acquisition all need to be considered. Consequently, at present a feasible solution for producers might be the use of LARS service provided by private consultants or in collaboration with LARS scientific research teams. PMID:25386696

  3. Applications of low altitude remote sensing in agriculture upon farmers' requests--a case study in northeastern Ontario, Canada.

    PubMed

    Zhang, Chunhua; Walters, Dan; Kovacs, John M

    2014-01-01

    With the growth of the low altitude remote sensing (LARS) industry in recent years, their practical application in precision agriculture seems all the more possible. However, only a few scientists have reported using LARS to monitor crop conditions. Moreover, there have been concerns regarding the feasibility of such systems for producers given the issues related to the post-processing of images, technical expertise, and timely delivery of information. The purpose of this study is to showcase actual requests by farmers to monitor crop conditions in their fields using an unmanned aerial vehicle (UAV). Working in collaboration with farmers in northeastern Ontario, we use optical and near-infrared imagery to monitor fertilizer trials, conduct crop scouting and map field tile drainage. We demonstrate that LARS imagery has many practical applications. However, several obstacles remain, including the costs associated with both the LARS system and the image processing software, the extent of professional training required to operate the LARS and to process the imagery, and the influence from local weather conditions (e.g. clouds, wind) on image acquisition all need to be considered. Consequently, at present a feasible solution for producers might be the use of LARS service provided by private consultants or in collaboration with LARS scientific research teams.

  4. DMRfinder: efficiently identifying differentially methylated regions from MethylC-seq data.

    PubMed

    Gaspar, John M; Hart, Ronald P

    2017-11-29

    DNA methylation is an epigenetic modification that is studied at a single-base resolution with bisulfite treatment followed by high-throughput sequencing. After alignment of the sequence reads to a reference genome, methylation counts are analyzed to determine genomic regions that are differentially methylated between two or more biological conditions. Even though a variety of software packages is available for different aspects of the bioinformatics analysis, they often produce results that are biased or impose excessive computational requirements. DMRfinder is a novel computational pipeline that identifies differentially methylated regions efficiently. Following alignment, DMRfinder extracts methylation counts and performs a modified single-linkage clustering of methylation sites into genomic regions. It then compares methylation levels using beta-binomial hierarchical modeling and Wald tests. Among its innovative attributes are the analyses of novel methylation sites and methylation linkage, as well as the simultaneous statistical analysis of multiple sample groups. To demonstrate its efficiency, DMRfinder is benchmarked against other computational approaches using a large published dataset. Contrasting two replicates of the same sample yielded minimal genomic regions with DMRfinder, whereas two alternative software packages reported a substantial number of false positives. Further analyses of biological samples revealed fundamental differences between DMRfinder and another software package, despite the fact that they utilize the same underlying statistical basis. For each step, DMRfinder completed the analysis in a fraction of the time required by other software. Among the computational approaches for identifying differentially methylated regions from high-throughput bisulfite sequencing datasets, DMRfinder is the first that integrates all the post-alignment steps in a single package. Compared to other software, DMRfinder is extremely efficient and unbiased in this process. DMRfinder is free and open-source software, available on GitHub (github.com/jsh58/DMRfinder); it is written in Python and R, and is supported on Linux.
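
    The distance-based grouping of methylation sites into candidate regions can be sketched in a few lines. The gap and minimum-site thresholds below are placeholders rather than DMRfinder's defaults, and the subsequent statistical testing step is omitted.

    ```python
    # Illustrative single-linkage-style grouping of methylation sites into candidate
    # regions by genomic distance; thresholds are placeholders, not DMRfinder's defaults.
    def cluster_sites(positions, max_gap=100, min_sites=3):
        """Group sorted CpG positions: a new region starts whenever the gap to the
        previous site exceeds max_gap; keep regions with at least min_sites sites."""
        regions, current = [], [positions[0]]
        for pos in positions[1:]:
            if pos - current[-1] <= max_gap:
                current.append(pos)
            else:
                if len(current) >= min_sites:
                    regions.append((current[0], current[-1], len(current)))
                current = [pos]
        if len(current) >= min_sites:
            regions.append((current[0], current[-1], len(current)))
        return regions

    cpg_positions = sorted([1000, 1040, 1090, 1500, 2000, 2030, 2055, 2090, 5000])
    print(cluster_sites(cpg_positions))   # [(1000, 1090, 3), (2000, 2090, 4)]
    ```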

  5. The use of mathematics and electric circuit simulator software in the learning process of wireless power transfer for electrical engineering students

    NASA Astrophysics Data System (ADS)

    Habibi, Muhammad Afnan; Fall, Cheikh; Setiawan, Eko; Hodaka, Ichijo; Wijono, Hasanah, Rini Nur

    2017-09-01

    Wireless Power Transfer (WPT) is a technique to deliver electrical power from a source to a load without using wires or conductors. The physics of WPT is well known and is basically taught in high school. However, it is only very recently that WPT has become useful in practical situations, where it must transfer electric power with significant efficiency. This means that WPT does not demand much new knowledge from university students, yet it may attract students because it is a cutting-edge technique. On the other hand, the phenomena of WPT are invisible and sometimes difficult to imagine. The objective of this paper is to demonstrate the use of mathematics and an electric circuit simulator, using MATHEMATICA software and LT-SPICE software, in designing a WPT system application. It leads to the conclusion that students as well as designers can benefit from the proposed method. By giving numerical values to circuit parameters, students obtain the power output and efficiency of the WPT system. Setting the resonance frequency of the designed WPT system appropriately leads it to produce high output power and better efficiency.
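
    The kind of calculation students perform can be sketched without a circuit simulator by using the commonly quoted closed-form expression for the maximum efficiency of a two-coil resonant link. The component values below are illustrative assumptions, not those analyzed in the paper.

    ```python
    # Back-of-the-envelope WPT link efficiency at resonance using the commonly
    # quoted figure-of-merit expression; component values are illustrative only.
    import math

    def resonant_frequency(L, C):
        """f0 = 1 / (2*pi*sqrt(L*C))"""
        return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

    def quality_factor(L, R, f0):
        """Q = 2*pi*f0*L / R for a series-resonant coil."""
        return 2.0 * math.pi * f0 * L / R

    def max_link_efficiency(k, Q1, Q2):
        """Maximum achievable efficiency of a two-coil resonant link:
        eta = (k^2 Q1 Q2) / (1 + sqrt(1 + k^2 Q1 Q2))^2."""
        fom = k * k * Q1 * Q2
        return fom / (1.0 + math.sqrt(1.0 + fom)) ** 2

    L1 = L2 = 24e-6          # coil inductance, H (assumed)
    C1 = C2 = 100e-9         # resonance capacitance, F (assumed)
    R1 = R2 = 0.5            # coil series resistance, ohm (assumed)
    k = 0.2                  # coupling coefficient (assumed)

    f0 = resonant_frequency(L1, C1)
    Q1, Q2 = quality_factor(L1, R1, f0), quality_factor(L2, R2, f0)
    print(f"f0 = {f0 / 1e3:.1f} kHz, Q = {Q1:.0f}, "
          f"eta_max = {max_link_efficiency(k, Q1, Q2):.2%}")
    ```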

  6. Comparison of performance of object-based image analysis techniques available in open source software (Spring and Orfeo Toolbox/Monteverdi) considering very high spatial resolution data

    NASA Astrophysics Data System (ADS)

    Teodoro, Ana C.; Araujo, Ricardo

    2016-01-01

    The use of unmanned aerial vehicles (UAVs) for remote sensing applications is becoming more frequent. However, this type of data can pose software problems because of the huge volumes involved. Object-based image analysis (OBIA) has proven to be superior to pixel-based analysis for very high-resolution images. The main objective of this work was to explore the potential of the OBIA methods available in two different open source software applications, Spring and OTB/Monteverdi, in order to generate an urban land cover map. An orthomosaic derived from UAVs was considered, 10 different regions of interest were selected, and two different approaches were followed. The first (Spring) uses the region growing segmentation algorithm followed by the Bhattacharya classifier. The second (OTB/Monteverdi) uses the mean shift segmentation algorithm followed by the support vector machine (SVM) classifier. Two strategies were followed: four classes were considered using Spring and seven classes were considered for OTB/Monteverdi. The SVM classifier produced slightly better results and required a shorter processing time. However, the poor spectral resolution of the data (only RGB bands) is an important factor that limits the performance of the classifiers applied.
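
    The segment-then-classify workflow described above can be sketched in a few lines. The sketch uses scikit-image's SLIC superpixels purely as a convenient stand-in for the mean-shift segmenter in OTB/Monteverdi, and per-segment mean colour as the feature vector; the input orthomosaic, the training segment indices, and their class labels are assumed to be available.

    # Illustrative OBIA sketch: segment an RGB orthomosaic, build per-segment
    # features, and classify segments with an SVM (not the OTB/Monteverdi code).

    import numpy as np
    from skimage.segmentation import slic
    from sklearn.svm import SVC

    def segment_features(rgb):
        """Segment an RGB array and return (label image, per-segment mean-colour features)."""
        labels = slic(rgb, n_segments=500, compactness=10.0)
        feats = np.array([rgb[labels == s].mean(axis=0) for s in np.unique(labels)])
        return labels, feats

    def classify_segments(feats, train_idx, train_classes):
        """Train an SVM on a few labelled segments and predict classes for all segments."""
        clf = SVC(kernel="rbf", C=10.0, gamma="scale")
        clf.fit(feats[train_idx], train_classes)
        return clf.predict(feats)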

  7. Software Architecture to Support the Evolution of the ISRU RESOLVE Engineering Breadboard Unit 2 (EBU2)

    NASA Technical Reports Server (NTRS)

    Moss, Thomas; Nurge, Mark; Perusich, Stephen

    2011-01-01

    The In-Situ Resource Utilization (ISRU) Regolith & Environmental Science and Oxygen & Lunar Volatiles Extraction (RESOLVE) software provides operation of the physical plant from a remote location with a high-level interface that can access and control the data from external software applications of other subsystems. This software allows autonomous control over the entire system with manual computer control of individual system/process components. It gives non-programmer operators the capability to easily modify the high-level autonomous sequencing while the software is in operation, as well as the ability to modify the low-level, file-based sequences prior to the system operation. Local automated control in a distributed system is also enabled where component control is maintained during the loss of network connectivity with the remote workstation. This innovation also minimizes network traffic. The software architecture commands and controls the latest generation of RESOLVE processes used to obtain, process, and quantify lunar regolith. The system is grouped into six sub-processes: Drill, Crush, Reactor, Lunar Water Resource Demonstration (LWRD), Regolith Volatiles Characterization (RVC) (see example), and Regolith Oxygen Extraction (ROE). Some processes are independent, some are dependent on other processes, and some are independent but run concurrently with other processes. The first goal is to analyze the volatiles emanating from lunar regolith, such as water, carbon monoxide, carbon dioxide, ammonia, hydrogen, and others. This is done by heating the soil and analyzing and capturing the volatilized product. The second goal is to produce water by reducing the soil at high temperatures with hydrogen. This is done by raising the reactor temperature in the range of 800 to 900 C, causing the reaction to progress by adding hydrogen, and then capturing the water product in a desiccant bed. The software needs to run the entire unit and all sub-processes; however, throughout testing, many variables and parameters need to be changed as more is learned about the system operation. The Master Events Controller (MEC) is run on a standard laptop PC using Windows XP. This PC runs in parallel to another laptop that monitors the GC, and a third PC that monitors the drilling/ crushing operation. These three PCs interface to the process through a CompactRIO, OPC Servers, and modems.
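
    The file-based sequencing idea described above, with manual commands injected between autonomous steps, can be illustrated with a toy sketch. The command names, file format, and queue mechanism below are invented for illustration and bear no relation to the actual RESOLVE Master Events Controller code.

    # Toy sketch of a file-based sequencer with a manual-override queue.
    # Each sequence-file line is "command argument"; blank lines and '#' comments
    # are skipped, and queued manual commands run before each autonomous step.

    COMMANDS = {
        "reactor.heat":   lambda arg: print(f"heating reactor to {arg} C"),
        "reactor.add_h2": lambda arg: print(f"adding hydrogen at {arg} slpm"),
        "lwrd.capture":   lambda arg: print("routing product gas to desiccant bed"),
    }

    def run_sequence(path, manual_queue=None):
        """Execute 'command argument' lines; drain manual commands between steps."""
        manual_queue = manual_queue or []
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue
                while manual_queue:                 # operator override between steps
                    cmd, arg = manual_queue.pop(0)
                    COMMANDS[cmd](arg)
                cmd, _, arg = line.partition(" ")
                COMMANDS[cmd](arg)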

  8. PAIR: A Cooperative Effort to Meet Informational Needs

    PubMed Central

    Closurdo, Janette S.; Pehkonen, Charles A.

    1973-01-01

    St. Joseph Mercy Hospital organized a cooperative association of area institutions (the Pontiac Area Instructional Resources group: PAIR) in order to (1) promote a forum in which to exchange ideas and information on software used for learning materials and hardware for using such materials, (2) provide a resource library system to lend such learning materials, and (3) cooperatively produce such learning materials for use in member institutions. In less than one year of cooperation, a union list of serials and a union list of software for the area have been produced. A forum has been created in which ideas and information can be shared, and a sound/slide program has been produced. PMID:4122093

  9. Impact of detector simulation in particle physics collider experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elvira, V. Daniel

    Through the last three decades, precise simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the accuracy of the physics results and publication turnaround, from data-taking to submission. It also presents the economic impact and cost of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data, taxing heavily the performance of simulation and reconstruction software for increasingly complex detectors. Consequently, it becomes urgent to find solutions to speed up simulation software in order to cope with the increased demand in a time of flat budgets. The study ends with a short discussion on the potential solutions that are being explored, by leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering of HEP code for concurrency and parallel computing.

  10. Impact of detector simulation in particle physics collider experiments

    DOE PAGES

    Elvira, V. Daniel

    2017-06-01

    Through the last three decades, precise simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the accuracy of the physics results and publication turnaround, from data-taking to submission. It also presents the economic impact and cost of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data, taxing heavily the performance of simulation and reconstruction software for increasingly complex detectors. Consequently, it becomes urgent to find solutions to speed up simulation software in order to cope with the increased demand in a time of flat budgets. The study ends with a short discussion on the potential solutions that are being explored, by leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering of HEP code for concurrency and parallel computing.

  11. Implementation errors in the GingerALE Software: Description and recommendations.

    PubMed

    Eickhoff, Simon B; Laird, Angela R; Fox, P Mickle; Lancaster, Jack L; Fox, Peter T

    2017-01-01

    Neuroscience imaging is a burgeoning, highly sophisticated field, the growth of which has been fostered by grant-funded, freely distributed software libraries that perform voxel-wise analyses in anatomically standardized three-dimensional space on multi-subject, whole-brain, primary datasets. Despite the ongoing advances made using these non-commercial computational tools, the replicability of individual studies is an acknowledged limitation. Coordinate-based meta-analysis offers a practical solution to this limitation and, consequently, plays an important role in filtering and consolidating the enormous corpus of functional and structural neuroimaging results reported in the peer-reviewed literature. In both primary data and meta-analytic neuroimaging analyses, correction for multiple comparisons is a complex but critical step for ensuring statistical rigor. Reports of errors in multiple-comparison corrections in primary-data analyses have recently appeared. Here, we report two such errors in GingerALE, a widely used, US National Institutes of Health (NIH)-funded, freely distributed software package for coordinate-based meta-analysis. These errors have given rise to published reports with more liberal statistical inferences than were specified by the authors. The intent of this technical report is threefold. First, we inform authors who used GingerALE of these errors so that they can take appropriate actions including re-analyses and corrective publications. Second, we seek to exemplify and promote an open approach to error management. Third, we discuss the implications of these and similar errors in a scientific environment dependent on third-party software. Hum Brain Mapp 38:7-11, 2017. © 2016 Wiley Periodicals, Inc.

  12. Impact of detector simulation in particle physics collider experiments

    NASA Astrophysics Data System (ADS)

    Daniel Elvira, V.

    2017-06-01

    Through the last three decades, accurate simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics (HEP) experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the precision of the physics results and publication turnaround, from data-taking to submission. It also presents estimates of the cost and economic impact of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data with increasingly complex detectors, taxing heavily the performance of simulation and reconstruction software. Consequently, exploring solutions to speed up simulation and reconstruction software to satisfy the growing demand of computing resources in a time of flat budgets is a matter that deserves immediate attention. The article ends with a short discussion on the potential solutions that are being considered, based on leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering HEP code for concurrency and parallel computing.

  13. Software Safety Risk in Legacy Safety-Critical Computer Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Baggs, Rhoda

    2007-01-01

    Safety-critical computer systems must be engineered to meet system and software safety requirements. For legacy safety-critical computer systems, software safety requirements may not have been formally specified during development. When process-oriented software safety requirements are levied on a legacy system after the fact, where software development artifacts don't exist or are incomplete, the question becomes 'how can this be done?' The risks associated with only meeting certain software safety requirements in a legacy safety-critical computer system must be addressed should such systems be selected as candidates for reuse. This paper proposes a method for formally producing a software safety risk assessment that provides software safety measurements for legacy systems, which may or may not have the suite of software engineering documentation now normally required. It relies upon the NASA Software Safety Standard, risk assessment methods based upon the Taxonomy-Based Questionnaire, and the application of reverse engineering CASE tools to produce original design documents for legacy systems.

  14. A study of software standards used in the avionics industry

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1994-01-01

    Within the past decade, software has become an increasingly common element in computing systems. In particular, the role of software used in the aerospace industry, especially in life- or safety-critical applications, is rapidly expanding. This intensifies the need to use effective techniques for achieving and verifying the reliability of avionics software. Although certain software development processes and techniques are mandated by government regulating agencies, no one methodology has been shown to consistently produce reliable software. The knowledge base for designing reliable software simply has not reached the maturity of its hardware counterpart. In an effort to increase our understanding of software, the Langley Research Center conducted a series of experiments over 15 years with the goal of understanding why and how software fails. As part of this program, the effectiveness of current industry standards for the development of avionics is being investigated. This study involves the generation of a controlled environment to conduct scientific experiments on software processes.

  15. The TI-99/4A Software.

    ERIC Educational Resources Information Center

    Wrege, Rachael; And Others

    1982-01-01

    Describes the software modules produced by Texas Instruments for use with the TI-99/4A home computer. Among the modules described are: Personal Real Estate, Programing Aids, Home Financial Decisions, Music Maker, Weight Control and Nutrition, Early Learning Fun, and Tax/Investment Record Keeping. (JL)

  16. Abstracts Produced Using Computer Assistance.

    ERIC Educational Resources Information Center

    Craven, Timothy C.

    2000-01-01

    Describes an experiment that evaluated features of TEXNET abstracting software, compared the use of keywords and phrases that were automatically extracted, tested hypotheses about relations between abstractors' backgrounds and their reactions to abstracting assistance software, and obtained ideas for further features to be developed in TEXNET.…

  17. Detection of faults and software reliability analysis

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1987-01-01

    Multi-version or N-version programming is proposed as a method of providing fault tolerance in software. The approach requires the separate, independent preparation of multiple versions of a piece of software for some application. These versions are executed in parallel in the application environment; each receives identical inputs and each produces its version of the required outputs. The outputs are collected by a voter and, in principle, they should all be the same. In practice there may be some disagreement. If this occurs, the results of the majority are taken to be the correct output, and that is the output used by the system. A total of 27 programs were produced. Each of these programs was then subjected to one million randomly-generated test cases. The experiment yielded a number of programs containing faults that are useful for general studies of software reliability as well as studies of N-version programming. Fault tolerance through data diversity and analytic models of comparison testing are discussed.
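
    The voting step described above (collect the outputs of N independently developed versions and accept the majority value) can be sketched in a few lines. This is an illustrative plurality voter with a strict-majority quorum, not the voter used in the experiment.

    # Minimal N-version majority voter: return the value reported by a majority
    # of versions, or None when the versions disagree beyond the quorum.

    from collections import Counter

    def vote(outputs, quorum=None):
        """outputs: hashable results from N versions; quorum defaults to a strict majority."""
        if quorum is None:
            quorum = len(outputs) // 2 + 1
        value, count = Counter(outputs).most_common(1)[0]
        return value if count >= quorum else None

    print(vote([42, 42, 41]))   # 42   (2-out-of-3 agreement)
    print(vote([1, 2, 3]))      # None (no majority)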

  18. Pedagogical issues for effective teaching of biosignal processing and analysis.

    PubMed

    Sandham, William A; Hamilton, David J

    2010-01-01

    Biosignal processing and analysis is generally perceived by many students as a challenging topic to understand and one in which it is difficult to become adept with the necessary analytical skills. This is a direct consequence of the high mathematical content involved and the many abstract features of the topic. The MATLAB and Mathcad software packages offer an excellent algorithm development environment for teaching biosignal processing and analysis modules, and can also be used effectively in many biosignal, and indeed bioengineering, research areas. In this paper, traditional introductory and advanced biosignal processing (and analysis) syllabi are reviewed, and the use of MATLAB and Mathcad for teaching and research is illustrated with a number of examples.
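
    A typical introductory exercise of the kind the paper advocates is shown below in Python/SciPy rather than MATLAB or Mathcad: band-pass filtering a noisy synthetic "biosignal" to recover the underlying rhythm. The frequencies and filter settings are arbitrary classroom values, not taken from the paper.

    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 250.0                                   # sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    signal = np.sin(2 * np.pi * 1.2 * t)         # 1.2 Hz "heartbeat-like" component
    noisy = signal + 0.5 * np.random.randn(t.size) + 0.3 * np.sin(2 * np.pi * 50 * t)

    b, a = butter(4, [0.5, 5.0], btype="bandpass", fs=fs)   # 0.5-5 Hz pass band
    clean = filtfilt(b, a, noisy)                           # zero-phase filtering
    print("residual RMS error:", np.sqrt(np.mean((clean - signal) ** 2)))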

  19. 3D Cultural Heritage Documentation: A Comparison Between Different Photogrammetric Software and Their Products

    NASA Astrophysics Data System (ADS)

    Gagliolo, S.; Ausonio, E.; Federici, B.; Ferrando, I.; Passoni, D.; Sguerso, D.

    2018-05-01

    The conservation of Cultural Heritage depends on the availability of means and resources and, consequently, on the possibility of making effective operations of data acquisition. In fact, on the one hand the creation of data repositories allows the description of the present state of the art, in order to preserve the testimonial value and to permit fruition. On the other hand, data acquisition provides metrical knowledge, which is particularly useful for a direct restoration of the surveyed objects through the analysis of their 3D digital models. In the last decades, the continuous increase and improvement of 3D survey techniques and of tools for geometric and digital data management have represented a great support to the development of documentary activities. In particular, Photogrammetry is a survey technique highly appropriate for the creation of data repositories in the field of Cultural Heritage, thanks to its cheapness, flexibility, speed, and the opportunity to ensure the operators' safety in hazardous areas. In order to obtain complete documentation, the high precision of the on-site operations must be coupled with an effective post-processing phase. Hence, a comparison among some of the photogrammetric software packages currently available was performed by the authors, with particular attention to the workflow completeness and the quality of the final products.

  20. NASA Affordable Vehicle Avionics (AVA). Common Modular Avionics System for Nanolaunchers Offering Affordable Access to Space; [Space Technology: Game Changing Development

    NASA Technical Reports Server (NTRS)

    Aquilina, Rudy

    2017-01-01

    Small satellites are becoming ever more capable of performing valuable missions for both government and commercial customers. However, currently these satellites can be launched affordably only as secondary payloads. This makes it difficult for the small satellite mission to launch when needed, to the desired orbit, and with acceptable risk. What is needed is a class of low-cost launchers, so that launch costs to low-Earth orbit (LEO) are commensurate with payload costs. Several private and government-sponsored launch vehicle developers are working toward just that: the ability to affordably insert small payloads into LEO. But until now, the cost of the complex avionics has remained disproportionately high. AVA (Affordable Vehicle Avionics) solves this problem. Significant contributors to the cost of launching nanosatellites to orbit are the avionics and software systems that steer and control the launch vehicles, sequence stage separation, deploy payloads, and telemeter data. The high costs of these guidance, navigation and control (GNC) avionics systems are due in part to the current practice of developing unique, single-use hardware and software for each launch. High-performance, high-reliability inertial sensor components with heritage from legacy launchers also contribute to costs, but can low-cost commercial inertial sensors work just as well? NASA Ames Research Center has developed and tested a prototype low-cost avionics package for space launch vehicles that provides complete GNC functionality in a package smaller than a tissue box (100 millimeters by 120 millimeters by 69 millimeters; 4 inches by 4.7 inches by 2.7 inches), with a mass of less than 0.84 kilogram (2 pounds). AVA takes advantage of commercially available, low-cost, mass-produced, miniaturized sensors, filtering their noisier inertial data with real-time GPS (Global Positioning System) data. The goal of the AVA project is to produce and flight-verify a common suite of avionics and software that deliver affordable, capable GNC and telemetry avionics with application to multiple nanolaunch vehicles at 1 percent of the cost of current state-of-the-art avionics.
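
    The sensor-fusion idea mentioned above (correcting noisy low-cost inertial data with real-time GPS) can be illustrated with a toy one-dimensional complementary-filter-style blend. This is only an illustration of the concept; it is not the AVA GNC algorithm, and the blend weight and sample rate are hypothetical.

    # Toy 1-D fusion: dead-reckon position from noisy accelerometer samples and
    # nudge the estimate toward GPS fixes whenever one arrives.

    def fuse(accel_samples, gps_fixes, dt=0.01, alpha=0.3):
        """accel_samples: list of acceleration readings; gps_fixes: dict of step -> position;
        alpha: weight given to each GPS fix when blending."""
        pos, vel = 0.0, 0.0
        track = []
        for k, a in enumerate(accel_samples):
            vel += a * dt                       # integrate the (noisy) IMU
            pos += vel * dt
            if k in gps_fixes:                  # blend in a GPS fix when available
                pos = (1 - alpha) * pos + alpha * gps_fixes[k]
            track.append(pos)
        return track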

  1. Evolving software reengineering technology for the emerging innovative-competitive era

    NASA Technical Reports Server (NTRS)

    Hwang, Phillip Q.; Lock, Evan; Prywes, Noah

    1994-01-01

    This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Project Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher quality software than use of the tools separately. The reason for this is that producing or upgrading software requires keen understanding of extremely complex applications which is facilitated by the integrated tools. The radical savings in the time and cost associated with software, due to use of CASE tools that support combined Reuse of Software and Reengineering of Legacy Code, will add an important impetus to improving the automation of enterprises. This will be reflected in continuing operations, as well as in innovating new business processes. The proposed multi-tool software development is based on state of the art technology, which will be further advanced through the use of open systems for adding new tools and experience in their use.

  2. Cloud4Psi: cloud computing for 3D protein structure similarity searching.

    PubMed

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur

    2014-10-01

    Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT), are still time consuming. As a consequence, performing similarity searching against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed a cloud-based system that allows scaling of the similarity searching process vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. © The Author 2014. Published by Oxford University Press.
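
    The "scale out" idea described above, splitting the database of candidate structures into batches compared in parallel, can be sketched locally with a worker pool standing in for additional cloud instances. The align_score function below is a placeholder for a real structural alignment such as CE or FATCAT; none of the names come from the Cloud4Psi code.

    # Sketch of horizontally scaled similarity searching: map a placeholder
    # alignment score over the database in parallel and sort by score.

    from multiprocessing import Pool

    def align_score(pair):
        query, candidate = pair
        return candidate, abs(len(query) - len(candidate))   # placeholder "score"

    def search(query, database, workers=4):
        with Pool(workers) as pool:
            results = pool.map(align_score, [(query, c) for c in database])
        return sorted(results, key=lambda r: r[1])

    if __name__ == "__main__":
        print(search("QUERYSTRUCTURE", ["AAA", "BBBBBB", "CCCCCCCCCCCCCC"]))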

  3. Three-dimensional finite volume modelling of blood flow in simulated angular neck abdominal aortic aneurysm

    NASA Astrophysics Data System (ADS)

    Algabri, Y. A.; Rookkapan, S.; Chatpun, S.

    2017-09-01

    An abdominal aortic aneurysm (AAA) is considered a deadly cardiovascular disease that is defined as a focal dilation of an artery. The healthy aorta is between 15 and 24 mm in diameter, depending on gender, body weight, and age. When the diameter increases to 30 mm or more, rupture can occur if the aneurysm keeps growing or is left untreated. Moreover, the proximal angular neck of the aneurysm is categorized as a significant morphological feature with prime harmful effects on endovascular aneurysm repair (EVAR). Flow patterns in a pathological vessel can influence the vascular intervention. The aim of this study is to investigate blood flow behaviour in an angular neck abdominal aortic aneurysm with simulated geometry based on a patient's information using computational fluid dynamics (CFD). The 3D angular neck AAA models were designed using SolidWorks software. CFD tools were then used to simulate these 3D models of the angular neck AAA in ANSYS FLUENT software. Based on the results, we conclude that CFD techniques perform well in explaining and investigating the flow patterns of an angular neck abdominal aortic aneurysm.

  4. Cloud4Psi: cloud computing for 3D protein structure similarity searching

    PubMed Central

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur

    2014-01-01

    Summary: Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT), are still time consuming. As a consequence, performing similarity searching against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed a cloud-based system that allows scaling of the similarity searching process vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Availability and implementation: Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. Contact: dariusz.mrozek@polsl.pl PMID:24930141

  5. Petascale Computing for Ground-Based Solar Physics with the DKIST Data Center

    NASA Astrophysics Data System (ADS)

    Berukoff, Steven J.; Hays, Tony; Reardon, Kevin P.; Spiess, DJ; Watson, Fraser; Wiant, Scott

    2016-05-01

    When construction is complete in 2019, the Daniel K. Inouye Solar Telescope will be the most-capable large aperture, high-resolution, multi-instrument solar physics facility in the world. The telescope is designed as a four-meter off-axis Gregorian, with a rotating Coude laboratory designed to simultaneously house and support five first-light imaging and spectropolarimetric instruments. At current design, the facility and its instruments will generate data volumes of 3 PB per year, and produce 10^7-10^9 metadata elements. The DKIST Data Center is being designed to store, curate, and process this flood of information, while providing association of science data and metadata to its acquisition and processing provenance. The Data Center will produce quality-controlled calibrated data sets, and make them available freely and openly through modern search interfaces and APIs. Documented software and algorithms will also be made available through community repositories like Github for further collaboration and improvement. We discuss the current design and approach of the DKIST Data Center, describing the development cycle, early technology analysis and prototyping, and the roadmap ahead. We discuss our iterative development approach, the underappreciated challenges of calibrating ground-based solar data, the crucial integration of the Data Center within the larger Operations lifecycle, and how software and hardware support, intelligently deployed, will enable high-caliber solar physics research and community growth for the DKIST's 40-year lifespan.

  6. System Evaluation and Life-Cycle Cost Analysis of a Commercial-Scale High-Temperature Electrolysis Hydrogen Production Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwin A. Harvego; James E. O'Brien; Michael G. McKellar

    2012-11-01

    Results of a system evaluation and lifecycle cost analysis are presented for a commercial-scale high-temperature electrolysis (HTE) central hydrogen production plant. The plant design relies on grid electricity to power the electrolysis process and system components, and industrial natural gas to provide process heat. The HYSYS process analysis software was used to evaluate the reference central plant design capable of producing 50,000 kg/day of hydrogen. The HYSYS software performs mass and energy balances across all components to allow optimization of the design using a detailed process flow sheet and realistic operating conditions specified by the analyst. The lifecycle cost analysis was performed using the H2A analysis methodology developed by the Department of Energy (DOE) Hydrogen Program. This methodology utilizes Microsoft Excel spreadsheet analysis tools that require detailed plant performance information (obtained from HYSYS), along with financial and cost information to calculate lifecycle costs. The results of the lifecycle analyses indicate that for a 10% internal rate of return, a large central commercial-scale hydrogen production plant can produce 50,000 kg/day of hydrogen at an average cost of $2.68/kg. When the cost of carbon sequestration is taken into account, the average cost of hydrogen production increases by $0.40/kg to $3.08/kg.
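
    A levelized production cost of the kind quoted above can be illustrated with a highly simplified sketch: discount the annual costs and the annual hydrogen output at the target rate of return and take their ratio. This is not the H2A methodology, and every input number below is a hypothetical placeholder, not a value from the report.

    # Simplified levelized cost of hydrogen (illustrative placeholders only).

    def levelized_cost(capex, annual_opex, annual_kg, rate, years):
        """Ratio of discounted lifetime costs to discounted lifetime hydrogen output ($/kg)."""
        disc_cost = capex + sum(annual_opex / (1 + rate) ** t for t in range(1, years + 1))
        disc_kg = sum(annual_kg / (1 + rate) ** t for t in range(1, years + 1))
        return disc_cost / disc_kg

    annual_kg = 50_000 * 350          # 50,000 kg/day, ~350 operating days per year
    print(f"${levelized_cost(2.5e8, 2.5e7, annual_kg, 0.10, 20):.2f}/kg")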

  7. Final Report Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick

    The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis only on a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.

  8. Guide surgery osteotomy system (GSOS) a new device for treatment in orthognathic surgery.

    PubMed

    Salvato, Giuseppe; Chiavenna, Carlo; Meazzini, Maria Costanza

    2014-04-01

    This article proposes an innovative diagnostic and therapeutic protocol for performing dentoalveolar osteotomies in the office under local anaesthesia with piezoelectric surgery, using a surgical acrylic guide produced through software-based planning. The method was applied in the correction of crossbites, changes in the curve of Spee, incisal decompensations, and dental ankylosis. Performing a preoperative CT with a special splint, optical scanning of the models, and subsequent software-based planning enabled us to produce a rapid-prototyped model bearing the osteotomy design, on which the surgical guide was shaped; the use of this guide, together with piezoelectric surgery, allowed the operation to be performed under local anaesthesia with minimal invasiveness and high accuracy. Immediate dentoalveolar movements, with preservation of the roots of the teeth involved, allow rapid treatment of malocclusions that would be long and often difficult, if not impossible, to treat with orthodontics only. Dentoalveolar osteotomies, combined with osteodistraction concepts, allow the orthodontist to achieve the objectives required by the treatment plan with accuracy. GSOS is a new method which, utilizing 3D optical scanning images of models, software, and piezoelectric surgery, allows dentoalveolar movements that would be dangerous to the roots or the periodontal support if attempted with orthodontics only. It dramatically reduces total surgical-orthodontic treatment time, to the obvious satisfaction of patients. Copyright © 2013 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  9. Managing the Software Development Process

    NASA Technical Reports Server (NTRS)

    Lubelczky, Jeffrey T.; Parra, Amy

    1999-01-01

    The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve this goal. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We also present an overview of the benefits of a mature software engineering environment based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. Because the depth and breadth of software engineering exceeds the scope of this paper, various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  10. Towards understanding software: 15 years in the SEL

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Pajerski, Rose

    1990-01-01

    For 15 years, the Software Engineering Laboratory (SEL) at GSFC has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software, and software processes within a production software environment. The SEL comprises three major organizations: (1) the GSFC Flight Dynamics Division; (2) the University of Maryland Computer Science Department; and (3) the Computer Sciences Corporation Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents: all describing some aspect of the software engineering technology that has undergone analysis in the flight dynamics environment. The studies range from small controlled experiments (such as analyzing the effectiveness of code reading versus functional testing) to large, multiple-project studies (such as assessing the impacts of Ada on a production environment). The key findings that NASA feels have laid the foundation for ongoing and future software development and research activities are summarized.

  11. Collaboration Between NASA Centers of Excellence on Autonomous System Software Development

    NASA Technical Reports Server (NTRS)

    Goodrich, Charles H.; Larson, William E.; Delgado, H. (Technical Monitor)

    2001-01-01

    Software for space systems flight operations has its roots in the early days of the space program, when computer systems were incapable of supporting highly complex and flexible control logic. Control systems relied on fast data acquisition and supervisory control from a roomful of systems engineers on the ground. Even though computer hardware and software have become many orders of magnitude more capable, space systems have largely adhered to this original paradigm. In an effort to break this mold, Kennedy Space Center (KSC) has invested in the development of model-based diagnosis and control applications for ten years, gaining broad experience in both ground and spacecraft systems and software. KSC has now partnered with Ames Research Center (ARC), NASA's Center of Excellence in Information Technology, to create a new paradigm for the control of dynamic space systems. ARC has developed model-based diagnosis and intelligent planning software that enables spacecraft to handle most routine problems automatically and allocate resources in a flexible way to realize mission objectives. ARC demonstrated the utility of onboard diagnosis and planning with an experiment aboard Deep Space 1 in 1999. This paper highlights the software control system collaboration between KSC and ARC. KSC has developed a Mars In-situ Resource Utilization testbed based on the Reverse Water Gas Shift (RWGS) reaction. This plant, built in KSC's Applied Chemistry Laboratory, is capable of producing the large amounts of oxygen that would be needed to support a human Mars mission. KSC and ARC are cooperating to develop an autonomous, fault-tolerant control system for RWGS to meet the need for autonomy on deep space missions. The paper also describes how the new system software paradigm will be applied to vehicle health monitoring, tested on the new X vehicles, and integrated into future launch processing systems.

  12. How to Submit a Risk Management Plan (RMP) to EPA

    EPA Pesticide Factsheets

    RMP*eSubmit software is the only way to submit RMPs. After you have prepared your plan using RMP*eSubmit, you may also re-submit, correct, or withdraw an RMP. Another electronic tool, RMP*Comp, performs the required off-site consequence analysis.

  13. A Database for Propagation Models and Conversion to C++ Programming Language

    NASA Technical Reports Server (NTRS)

    Kantak, Anil V.; Angkasa, Krisjani; Rucker, James

    1996-01-01

    The telecommunications system design engineer generally needs a quantification of the effects of the propagation medium (a definition of the propagation channel) in order to design an optimal communications system. To obtain the definition of the channel, the systems engineer generally has a few choices. A search of the relevant publications, such as the IEEE Transactions, CCIR reports, the NASA propagation handbook, etc., may be conducted to find the desired channel values. This method may require excessive amounts of time and effort on the systems engineer's part, and there is a possibility that the search may not even yield the needed results. To help researchers and systems engineers, the conference participants of NASA Propagation Experimenters (NAPEX) XV (London, Ontario, Canada, June 28 and 29, 1991) recommended that software be produced containing propagation models and the necessary prediction methods for most propagation phenomena. Moreover, the software should be flexible enough for the user to make slight changes to the models without expending substantial effort in programming. In the past few years, software was produced to fit these requirements as well as could be done. The software was distributed to all NAPEX participants for evaluation and use; participant reactions, suggestions, etc., were gathered and used to improve subsequent releases of the software. The existing database program runs in the Microsoft Excel application software and works fine within the guidelines of that environment; however, recently there have been some questions about the robustness and survivability of the Excel software in the ever-changing (hopefully improving) world of software packages.

  14. Characteristics of low-and high-fat beef patties: effect of high hydrostatic pressure.

    PubMed

    Carballo, J; Fernandez, P; Carrascosa, A V; Solas, M T; Colmenero, F J

    1997-01-01

    The purpose of this study was to analyze the consequences of applying high pressures (100 and 300 MPa for 5 or 20 min) on characteristics such as water- and fat-binding properties, texture, color, microstructure, and microbiology of low-fat (9.2%) and high-fat (20.3%) beef patties. In nonpressurized patties, the low-fat product exhibited significantly poorer (P < 0.05) binding properties and higher (P < 0.05) Kramer shear force and Kramer energy than did high-fat patties. Although high pressure did not clearly influence the binding properties of low- and high-fat beef patties, it did produce a rise in the Kramer shear force and energy which were more pronounced at 300 MPa. High pressures altered patty color, the extent of alteration depending on fat content, pressure, and pressurizing time. Pressurizing high- and low-fat beef patties at 300 MPa not only produced a lethal effect (P < 0.05) on microorganisms, but caused sublethal damage as well.

  15. Field Programmable Gate Array Reliability Analysis Guidelines for Launch Vehicle Reliability Block Diagrams

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Britton, Paul; Hatfield, Glen Spencer; Novack, Steven D.

    2017-01-01

    Field Programmable Gate Array (FPGA) integrated circuits (ICs) are among the key electronic components in today's sophisticated launch and space vehicle complex avionic systems, largely due to their superb reprogrammable and reconfigurable capabilities combined with relatively low non-recurring engineering (NRE) costs and a short design cycle. Consequently, FPGAs are prevalent ICs in communication protocols and control signal commands. This paper will identify reliability concerns and high-level guidelines to estimate FPGA total failure rates in a launch vehicle application. The paper will discuss hardware, hardware description language, and radiation-induced failures. The hardware contribution of the approach accounts for physical failures of the IC. The hardware description language portion will discuss the high-level FPGA programming languages and software/code reliability growth. The radiation portion will discuss FPGA susceptibility to space environment radiation.

  16. [Stress management in large-scale establishments].

    PubMed

    Fukasawa, Kenji

    2002-07-01

    Due to a recent dramatic change in industrial structures in Japan, the role of large-scale enterprises is changing. Mass production used to be the major income source of companies, but nowadays it has shifted to high value-added products, including software development. As a consequence of highly competitive inter-corporate development, there are various sources of job stress that induce health problems in employees, especially those involved in development or management. Simply obeying the law or offering medical care is not enough to manage these problems. Occupational health staff need to act according to the type of disorder and provide care with support from the Supervisor and Personnel Division. For training, development, and consultation systems, occupational health staff must also work with the Personnel Division and Safety Division, and be approved by management supervisors.

  17. Would Consolidation of Army Software Engineering Organizations Help to Control Software Costs for Current and Future Systems

    DTIC Science & Technology

    2015-04-16

    Limitations include assumptions that the work identified in the software center's mission and functions manual (10-1; CECOM, 2011), as well as in public... that produced RDECOM. The focus was on the movement of positions based on the position job series, not on the work that was actually being performed.

  18. Recommended approach to software development

    NASA Technical Reports Server (NTRS)

    Mcgarry, F. E.; Page, J.; Eslinger, S.; Church, V.; Merwarth, P.

    1983-01-01

    A set of guidelines is presented for an organized, disciplined approach to software development, based on data collected and studied for 46 flight dynamics software development projects. Methods and practices for each phase of a software development life cycle that starts with requirements analysis and ends with acceptance testing are described; maintenance and operation are not addressed. For each defined life cycle phase, guidelines are presented for the development process and its management, and for the products produced and their reviews.

  19. MicroSIFT Courseware Evaluations [Set 15 (362-388) and Set 16 (389-441), with an Index Listing the Contents of Each Set (Sets 1-16) and a Cumulative Subject Index (Sets 1-16)].

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    This document consists of 80 microcomputer software package evaluations prepared by the MicroSIFT (Microcomputer Software and Information for Teachers) Clearinghouse at the Northwest Regional Education Laboratory. Set 15 consists of 27 packages; set 16 consists of 53 packages. Each software review lists producer, time and place of evaluation,…

  20. Construction of Ultradense Linkage Maps with Lep-MAP2: Stickleback F2 Recombinant Crosses as an Example

    PubMed Central

    Rastas, Pasi; Calboli, Federico C. F.; Guo, Baocheng; Shikano, Takahito; Merilä, Juha

    2016-01-01

    High-density linkage maps are important tools for genome biology and evolutionary genetics, quantifying the extent of recombination, linkage disequilibrium, and chromosomal rearrangements across chromosomes, sexes, and populations. They provide one of the best ways to validate and refine de novo genome assemblies, with the power to identify errors in assemblies increasing with marker density. However, assembly of high-density linkage maps is still challenging due to software limitations. We describe Lep-MAP2, software for ultradense genome-wide linkage map construction. Lep-MAP2 can handle various family structures and can account for achiasmatic meiosis to gain linkage map accuracy. Simulations show that Lep-MAP2 outperforms other available mapping software both in computational efficiency and accuracy. When applied to two large F2-generation recombinant crosses between two nine-spined stickleback (Pungitius pungitius) populations, it produced two high-density (∼6 markers/cM) linkage maps containing 18,691 and 20,054 single nucleotide polymorphisms. The two maps showed a high degree of synteny, but female maps were 1.5–2 times longer than male maps in all linkage groups, suggesting genome-wide recombination suppression in males. Comparison with the genome sequence of the three-spined stickleback (Gasterosteus aculeatus) revealed a high degree of interspecific synteny with a low frequency (<5%) of interchromosomal rearrangements. However, a fairly large (ca. 10 Mb) translocation from autosome to sex chromosome was detected in both maps. These results illustrate the utility and novel features of Lep-MAP2 in assembling high-density linkage maps, and their usefulness in revealing evolutionarily interesting properties of genomes, such as strong genome-wide sex bias in recombination rates. PMID:26668116
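
    The map lengths discussed above are expressed in centimorgans (cM). As general background (this is not Lep-MAP2 code), a recombination fraction r observed between two markers is commonly converted to a map distance with the Haldane or Kosambi mapping functions, sketched below.

    import math

    # Convert a recombination fraction r (0 <= r < 0.5) to map distance in cM.

    def haldane_cM(r):
        return -50.0 * math.log(1.0 - 2.0 * r)                      # no interference assumed

    def kosambi_cM(r):
        return 25.0 * math.log((1.0 + 2.0 * r) / (1.0 - 2.0 * r))   # allows for interference

    for r in (0.05, 0.10, 0.20):
        print(f"r={r:.2f}: Haldane {haldane_cM(r):.1f} cM, Kosambi {kosambi_cM(r):.1f} cM")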
