Sample records for high-consequence software

  1. Quality Market: Design and Field Study of Prediction Market for Software Quality Control

    ERIC Educational Resources Information Center

    Krishnamurthy, Janaki

    2010-01-01

    Given the increasing competition in the software industry and the critical consequences of software errors, it has become important for companies to achieve high levels of software quality. While cost reduction and timeliness of projects continue to be important measures, software companies are placing increasing attention on identifying the user…

  2. The State of Software for Evolutionary Biology.

    PubMed

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder) and Java (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  3. The State of Software for Evolutionary Biology

    PubMed Central

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-01-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder) and Java (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high-quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525

  4. Architecture for interoperable software in biology.

    PubMed

    Bare, James Christopher; Baliga, Nitin S

    2014-07-01

    Understanding biological complexity demands a combination of high-throughput data and interdisciplinary skills. One way to bring to bear the necessary combination of data types and expertise is by encapsulating domain knowledge in software and composing that software to create a customized data analysis environment. To this end, simple flexible strategies are needed for interconnecting heterogeneous software tools and enabling data exchange between them. Drawing on our own work and that of others, we present several strategies for interoperability and their consequences, in particular, a set of simple data structures (list, matrix, network, table, and tuple) that have proven sufficient to achieve a high degree of interoperability. We provide a few guidelines for the development of future software that will function as part of an interoperable community of software tools for biological data analysis and visualization. © The Author 2012. Published by Oxford University Press.

  5. High-school software development project helps increasing students' awareness of geo-hydrological hazards and their risks

    NASA Astrophysics Data System (ADS)

    Marchesini, Ivan; Rossi, Mauro; Balducci, Vinicio; Salvati, Paola; Guzzetti, Fausto; Bianchini, Andrea; Grzeleswki, Emanuell; Canonico, Andrea; Coccia, Rita; Fiorucci, Gianni Mario; Gobbi, Francesca; Ciuchetti, Monica

    2015-04-01

    In Italy, inundation and landslides are widespread phenomena that impact the population and cause significant economic damage to private and public properties. The perception of the risk posed by these natural geo-hydrological hazards varies geographically and in time. The variation in the perception of the risks has negative consequences on risk management, and limits the adoption of effective risk reduction strategies. We maintain that targeted education can foster the understanding of geo-hydrological hazards, improving their perception and the awareness of the associated risk. Collaboration of a research center experienced in geo-hydrological hazards and risks (CNR IRPI, Perugia) and a high school (ITIS Alessandro Volta, Perugia) has resulted in the design and execution of a project aimed at improving the perception of geo-hydrological risks in high school students and teachers through software development. In the two-year project, students, high school teachers and research scientists have jointly developed software broadly related to landslide and flood hazards. User requirements and system specifications were decided to facilitate the distribution and use of the software among students and their peers. This allowed a wider distribution of the project results. We discuss two prototype software applications developed by the high school students, including an augmented reality application for improved dissemination of information on landslides and floods with human consequences in Italy, and a crowd science application that allows students (and others, including their families and friends) to collect information on landslide and flood occurrence exploiting modern mobile devices. This information can prove important e.g., for the validation of landslide forecasting models.

  6. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick, Jennifer M.

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  7. [Three-dimensional (3D) modeling: First applications in radioanatomy and interventional radiology under CT guidance].

    PubMed

    Aubry, S; Pousse, A; Sarliève, P; Laborie, L; Delabrousse, E; Kastler, B

    2006-11-01

    To model vertebrae in 3D to improve radioanatomic knowledge of the spine, including its vascular and nerve environment, and to simulate CT-guided interventions. Vertebra acquisitions were made with multidetector CT. We developed segmentation software and specific viewer software using the Delphi programming environment. The segmentation software makes it possible to model high-resolution 3D segments of vertebrae and their environment from multidetector CT acquisitions. The specific viewer software then provides multiplanar reconstructions of the CT volume and the possibility to select different 3D objects of interest. This software package improves radiologists' radioanatomic knowledge through a new 3D anatomy presentation. Furthermore, the possibility of inserting virtual 3D objects into the volume makes it possible to simulate CT-guided interventions. The result is the first volumetric radioanatomic software of its kind; it simulates CT-guided interventions and consequently has the potential to facilitate learning of interventions under CT guidance.

  8. System support software for the Space Ultrareliable Modular Computer (SUMC)

    NASA Technical Reports Server (NTRS)

    Hill, T. E.; Hintze, G. C.; Hodges, B. C.; Austin, F. A.; Buckles, B. P.; Curran, R. T.; Lackey, J. D.; Payne, R. E.

    1974-01-01

    The highly transportable programming system designed and implemented to support the development of software for the Space Ultrareliable Modular Computer (SUMC) is described. The SUMC system support software consists of program modules called processors. The initial set of processors consists of the supervisor, the general purpose assembler for SUMC instruction and microcode input, linkage editors, an instruction level simulator, a microcode grid print processor, and user oriented utility programs. A FORTRAN 4 compiler is undergoing development. The design facilitates the addition of new processors with minimum effort and provides the user quasi host independence on the ground-based operational software development computer. Additional capability is provided to accommodate variations in the SUMC architecture without consequent major modifications in the initial processors.

  9. Ensuring critical event sequences in high consequence computer based systems as inspired by path expressions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kidd, M.E.C.

    1997-02-01

    The goal of our work is to provide a high level of confidence that critical software-driven event sequences are maintained in the face of hardware failures, malevolent attacks and harsh or unstable operating environments. This will be accomplished by providing dynamic fault management measures directly to the software developer and to their varied development environments. The methodology employed here is inspired by previous work in path expressions. This paper discusses the perceived problems, provides a brief overview of path expressions, and describes the proposed methods and the differences between them and traditional path expression usage and implementation.
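
    A minimal sketch of the idea (illustrative only; the report provides no code, and the event names below are invented): a path-expression-style monitor that admits only the declared ordering of critical events and raises before any out-of-order action can execute.

      # Illustrative sketch of enforcing a critical event sequence with a
      # path-expression-style monitor. The "arm -> authorize -> fire" path
      # and all names are hypothetical, not from the report.
      class SequenceViolation(Exception):
          pass

      class PathMonitor:
          """Admit only the event ordering declared by a path expression."""
          def __init__(self, path):
              self.path = list(path)    # required order of critical events
              self.position = 0

          def fire_event(self, event):
              if self.position >= len(self.path) or self.path[self.position] != event:
                  raise SequenceViolation(f"'{event}' illegal at step {self.position}")
              self.position += 1        # advance only on the expected event

      monitor = PathMonitor(["arm", "authorize", "fire"])
      for event in ["arm", "authorize", "fire"]:
          monitor.fire_event(event)     # in-order events pass
      # monitor.fire_event("fire")      # a repeated event would raise here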

  10. Ethical education in software engineering: responsibility in the production of complex systems.

    PubMed

    Génova, Gonzalo; González, M Rosario; Fraga, Anabel

    2007-12-01

    Among the various contemporary schools of moral thinking, consequence-based ethics, as opposed to rule-based, seems to have a good acceptance among professionals such as software engineers. But naïve consequentialism is intellectually too weak to serve as a practical guide in the profession. Besides, the complexity of software systems makes it very hard to know in advance the consequences that will derive from professional activities in the production of software. Therefore, following the spirit of well-known codes of ethics such as the ACM/IEEE's, we advocate a more solid position in the ethical education of software engineers, which we call 'moderate deontologism': one that takes into account both rules and consequences to assess the goodness of actions, and at the same time pays adequate consideration to the absolute values of human dignity. In order to educate responsible professionals, however, this position should be complemented with a pedagogical approach to virtue ethics.

  11. Dynamic visualization techniques for high consequence software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-02-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification. The prototype tool is described along with the requirements constraint language after a brief literature review is presented. Examples of how the tool can be used are also presented. In conclusion, the most significant advantage of this tool is to provide a first step in evaluating specification completeness, and to provide a more productive method for program comprehension and debugging. The expected payoff is increased software surety confidence, increased program comprehension, and reduced development and debugging time.

  12. Formal Validation of Aerospace Software

    NASA Astrophysics Data System (ADS)

    Lesens, David; Moy, Yannick; Kanig, Johannes

    2013-08-01

    Any single error in critical software can have catastrophic consequences. Even though failures are usually not advertised, some software bugs have become famous, such as the error in the MIM-104 Patriot. For space systems, experience shows that software errors are a serious concern: more than half of all satellite failures from 2000 to 2003 involved software. To address this concern, this paper examines the use of formal verification for software developed in Ada.

  13. Would Boys and Girls Benefit from Gender-Specific Educational Software?

    ERIC Educational Resources Information Center

    Luik, Piret

    2011-01-01

    Most boys and girls interact differently with educational software and have different preferences for the design of educational software. The question is whether the usage of educational software has the same consequences for both genders. This paper investigates the characteristics of drill-and-practice programmes or drills that are efficient for…

  14. Fragment Impact Toolkit (FIT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shevitz, Daniel Wolf; Key, Brian P.; Garcia, Daniel B.

    2017-09-05

    The Fragment Impact Toolkit (FIT) is a software package used for probabilistic consequence evaluation of fragmenting sources. The typical use case for FIT is to simulate an exploding shell and evaluate the consequence on nearby objects. FIT is written in the programming language Python and is designed as a collection of interacting software modules. Each module has a function that interacts with the other modules to produce desired results.

  15. Computer Science and Technology: Introduction to Software Packages

    DTIC Science & Technology

    1984-04-01

    …consideration should be given to the acquisition of appropriate software packages to replace or upgrade existing services and to provide services not… Consequently, there are many companies that produce only software packages, and are committed to providing training, service, and support. These vendors…

  16. Learning to Write Programs with Others: Collaborative Quadruple Programming

    ERIC Educational Resources Information Center

    Arora, Ritu; Goel, Sanjay

    2012-01-01

    Most software development is carried out by teams of software engineers working collaboratively to achieve the desired goal. Consequently, software development education not only needs to develop a student's ability to write programs that can be easily comprehended by others and to comprehend programs written by others, but also the ability…

  17. Checklists for the Evaluation of Educational Software: Critical Review and Prospects.

    ERIC Educational Resources Information Center

    Tergan, Sigmar-Olaf

    1998-01-01

    Reviews strengths and weaknesses of checklists for the evaluation of computer software and outlines consequences for their practical application. Suggests an approach based on an instructional design model and a comprehensive framework to cope with problems of validity and predictive power of software evaluation. Discusses prospects of the…

  18. Secure it now or secure it later: the benefits of addressing cyber-security from the outset

    NASA Astrophysics Data System (ADS)

    Olama, Mohammed M.; Nutaro, James

    2013-05-01

    The majority of funding for research and development (R&D) in cyber-security is focused on the end of the software lifecycle where systems have been deployed or are nearing deployment. Recruiting of cyber-security personnel is similarly focused on end-of-life expertise. By emphasizing cyber-security at these late stages, security problems are found and corrected when it is most expensive to do so, thus increasing the cost of owning and operating complex software systems. Worse, expenditures on expensive security measures often mean less money for innovative developments. These unwanted increases in cost and potential slowing of innovation are unavoidable consequences of an approach to security that finds and remediates faults after software has been implemented. We argue that software security can be improved and the total cost of a software system can be substantially reduced by an appropriate allocation of resources to the early stages of a software project. By adopting a similar allocation of R&D funds to the early stages of the software lifecycle, we propose that the costs of cyber-security can be better controlled and, consequently, the positive effects of this R&D on industry will be much more pronounced.

  19. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

    Today's software for aerospace systems is typically very complex. This is due to the increasing number of features as well as the high demands for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and consequently result in full coverage. Nevertheless, despite the obvious advantages, this technique is still rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
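
    The full-coverage claim is the key property of model checking: every reachable state is enumerated and checked. A minimal explicit-state sketch of that idea (illustrative Python, not the SCADE/ECSS tool chain; the mode-switch system is invented):

      # Breadth-first exploration of every reachable state of a toy
      # mode-switch system, checking a safety property in each state.
      from collections import deque

      def successors(state):
          mode, thruster_on = state
          if mode == "safe":
              return [("nominal", False)]   # leaving safe mode starts with thrust off
          return [("safe", False), ("nominal", True), ("nominal", False)]

      def holds(state):
          mode, thruster_on = state
          return not (mode == "safe" and thruster_on)   # no thrust in safe mode

      initial = ("safe", False)
      seen, frontier = {initial}, deque([initial])
      while frontier:
          state = frontier.popleft()
          assert holds(state), f"safety property violated in {state}"
          for nxt in successors(state):
              if nxt not in seen:
                  seen.add(nxt)
                  frontier.append(nxt)
      print(f"property verified in all {len(seen)} reachable states")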

  20. Big Science, Small-Budget Space Experiment Package Aka MISSE-5: A Hardware And Software Perspective

    NASA Technical Reports Server (NTRS)

    Krasowski, Michael; Greer, Lawrence; Flatico, Joseph; Jenkins, Phillip; Spina, Dan

    2007-01-01

    Conducting space experiments with small budgets is a fact of life for many design groups with low-visibility science programs. One major consequence is that specialized space grade electronic components are often too costly to incorporate into the design. Radiation mitigation now becomes more complex as a result of being restricted to the use of commercial off-the-shelf (COTS) parts. Unique hardware and software design techniques are required to succeed in producing a viable instrument suited for use in space. This paper highlights some of the design challenges and associated solutions encountered in the production of a highly capable, low cost space experiment package.

  21. A Hardware and Software Perspective of the Fifth Materials on the International Space Station Experiment (MISSE-5)

    NASA Technical Reports Server (NTRS)

    Krasowski, Michael; Greer, Lawrence; Flatico, Joseph; Jenkins, Phillip; Spina, Dan

    2005-01-01

    Conducting space experiments with small budgets is a fact of life for many design groups with low-visibility science programs. One major consequence is that specialized space grade electronic components are often too costly to incorporate into the design. Radiation mitigation now becomes more complex as a result of being restricted to the use of commercial off-the-shelf (COTS) parts. Unique hardware and software design techniques are required to succeed in producing a viable instrument suited for use in space. This paper highlights some of the design challenges and associated solutions encountered in the production of a highly capable, low cost space experiment package.

  22. Integrating existing software toolkits into VO system

    NASA Astrophysics Data System (ADS)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. A large number of high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for VO developers. VO architecture greatly depends on Grid and Web services; consequently, the general VO integration route is "Java Ready - Grid Ready - VO Ready". In the paper, we discuss the importance of VO integration for existing toolkits and possible solutions. We introduce two efforts in this field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert Grid services to VO services.

  23. RMP Guidance for Offsite Consequence Analysis

    EPA Pesticide Factsheets

    Offsite consequence analysis (OCA) consists of a worst-case release scenario and alternative release scenarios. OCA is required from facilities with chemicals above threshold quantities. RMP*Comp software can be used to perform calculations described here.

  24. Choices and Consequences.

    ERIC Educational Resources Information Center

    Thorp, Carmany

    1995-01-01

    Describes student use of HyperStudio computer software to create history adventure games. History came alive while students learned efficient writing skills; learned to understand and manipulate cause and effect, choice and consequence; and learned to incorporate succinct locational, climatic, and historical detail. (ET)

  25. Separation in Logistic Regression: Causes, Consequences, and Control.

    PubMed

    Mansournia, Mohammad Ali; Geroldinger, Angelika; Greenland, Sander; Heinze, Georg

    2018-04-01

    Separation is encountered in regression models with a discrete outcome (such as logistic regression) where the covariates perfectly predict the outcome. It is most frequent under the same conditions that lead to small-sample and sparse-data bias, such as presence of a rare outcome, rare exposures, highly correlated covariates, or covariates with strong effects. In theory, separation will produce infinite estimates for some coefficients. In practice, however, separation may be unnoticed or mishandled because of software limits in recognizing and handling the problem and in notifying the user. We discuss causes of separation in logistic regression and describe how common software packages deal with it. We then describe methods that remove separation, focusing on the same penalized-likelihood techniques used to address more general sparse-data problems. These methods improve accuracy, avoid software problems, and allow interpretation as Bayesian analyses with weakly informative priors. We discuss likelihood penalties, including some that can be implemented easily with any software package, and their relative advantages and disadvantages. We provide an illustration of ideas and methods using data from a case-control study of contraceptive practices and urinary tract infection.
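
    A small numeric illustration of the paper's central point (a sketch with made-up data, not the authors' code): on perfectly separated data the unpenalized maximum-likelihood slope grows without bound, while even a modest L2 (ridge-type) penalty yields a finite estimate.

      # Perfect separation: x < 0 always gives y = 0, x > 0 always gives y = 1.
      import numpy as np

      x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
      y = np.array([0, 0, 0, 1, 1, 1])

      def fit_slope(penalty, steps=20000, lr=0.1):
          beta = 0.0
          for _ in range(steps):
              p = 1.0 / (1.0 + np.exp(-beta * x))
              grad = np.sum((y - p) * x) - penalty * beta   # penalized score
              beta += lr * grad                             # gradient ascent
          return beta

      print("unpenalized:", fit_slope(0.0))   # keeps growing as steps increase
      print("penalized  :", fit_slope(1.0))   # converges to a finite slope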

  26. Automatic programming for critical applications

    NASA Technical Reports Server (NTRS)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Execution performance is usually an expected requirement in a software development process. Unfortunately, the verification and maintenance of programs are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear plant systems. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed along with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  27. Bias and design in software specifications

    NASA Technical Reports Server (NTRS)

    Straub, Pablo A.; Zelkowitz, Marvin V.

    1990-01-01

    Implementation bias in a specification is an arbitrary constraint in the solution space. Presented here is a model of bias in software specifications. Bias is defined in terms of the specification process and a classification of the attributes of the software product. Our definition of bias provides insight into both the origin and the consequences of bias. It also shows that bias is relative and essentially unavoidable. Finally, we describe current work on defining a measure of bias, formalizing our model, and relating bias to software defects.

  28. Epistemic Questions and Answers for Software System Safety

    NASA Technical Reports Server (NTRS)

    Holloway, C. M.; Johnson, Chris W.

    2010-01-01

    System safety is primarily concerned with epistemic questions, that is, questions concerning knowledge and the degree of confidence that can be placed in that knowledge. For systems with which human experience is long, such as roads, bridges, and mechanical devices, knowledge about what is required to make the systems safe is deep and detailed. High confidence can be placed in the validity of that knowledge. For other systems, however, with which human experience is comparatively short, such as those that rely in part or in whole on software, knowledge about what is required to ensure safety tends to be shallow and general. The confidence that can be placed in the validity of that knowledge is consequently low. In a previous paper, we enumerated a collection of foundational epistemic questions concerning software system safety. In this paper, we review and refine the questions, discuss some difficulties that attend to answering the questions today, and speculate on possible research to improve the situation.

  29. Advantages and Disadvantages in Image Processing with Free Software in Radiology.

    PubMed

    Mujika, Katrin Muradas; Méndez, Juan Antonio Juanes; de Miguel, Andrés Framiñan

    2018-01-15

    Currently, there are sophisticated applications that make it possible to visualize medical images and even to manipulate them. These software applications are of great interest, both from a teaching and a radiological perspective. In addition, some of these applications are known as free open source software because they are free and their source code is freely available, and therefore they can easily be run even on personal computers. Two examples of free open source software are OsiriX Lite® and 3D Slicer®. However, this last group of free applications has limitations in its use. For the radiological field, manipulating and post-processing images is increasingly important. Consequently, sophisticated computing tools that combine software and hardware to process medical images are needed. In radiology, graphic workstations allow their users to process, review, analyse, communicate and exchange multidimensional digital images acquired with different image-capturing radiological devices, basically CT (Computerised Tomography), MRI (Magnetic Resonance Imaging), PET (Positron Emission Tomography), etc. Nevertheless, the programs included in these workstations have a high cost, which depends on the software provider and is subject to its norms and requirements. With this study, we aim to present the advantages and disadvantages of these radiological image visualization systems in the advanced management of radiological studies. We compare the features of the VITREA2® and AW VolumeShare 5® radiology workstations with free open source software applications like OsiriX® and 3D Slicer®, with examples from specific studies.

  30. Evaluation of radiological dispersion/consequence codes supporting DOE nuclear facility SARs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Kula, K.R.; Paik, I.K.; Chung, D.Y.

    1996-12-31

    Since the early 1990s, the authorization basis documentation of many U.S. Department of Energy (DOE) nuclear facilities has been upgraded to comply with DOE orders and standards. In this process, many safety analyses have been revised. Unfortunately, there has been nonuniform application of software, and the most appropriate computer and engineering methodologies often are not applied. A DOE Accident Phenomenology and Consequence (APAC) Methodology Evaluation Program was originated at the request of DOE Defense Programs to evaluate the safety analysis methodologies used in nuclear facility authorization basis documentation and to define future cost-effective support and development initiatives. Six areas, including source term development (fire, spills, and explosion analysis), in-facility transport, and dispersion/consequence analysis (chemical and radiological), are contained in the APAC program. The evaluation process, codes considered, key results, and recommendations for future model and software development of the Radiological Dispersion/Consequence Working Group are summarized in this paper.

  31. General Mode Scanning Probe Microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Somnath, Suhas; Jesse, Stephen

    A critical part of SPM measurements is the information transfer from the probe-sample junction to the measurement system. Current information transfer methods heavily compress the information-rich data stream by averaging the data over a time interval, or via heterodyne detection approaches such as lock-in amplifiers and phase-locked loops. As a consequence, highly valuable information at the sub-microsecond time scales or information from frequencies outside the measurement band is lost. We have developed a fundamentally new approach called General Mode (G-mode), where we can capture the complete information stream from the detectors in the microscope. The availability of the complete information allows the microscope operator to analyze the data via information-theory analysis or comprehensive physical models. Furthermore, the complete data stream enables advanced data-driven filtering algorithms, multi-resolution imaging, ultrafast spectroscopic imaging, spatial mapping of multidimensional variability in material properties, etc. Though we applied this approach to scanning probe microscopy, the general philosophy of G-mode can be applied to many other modes of microscopy. G-mode data is captured by completely custom software written in LabVIEW and Matlab. The software generates the waveforms to electrically, thermally, or mechanically excite the SPM probe. It handles real-time communications with the microscope software for operations such as moving the SPM probe position and also controls other instrumentation hardware. The software also controls multiple variants of high-speed data acquisition cards to excite the SPM probe with the excitation waveform and simultaneously measure multiple channels of information from the microscope detectors at sampling rates of 1-100 MHz. The software also saves the raw data to the computer and allows the microscope operator to visualize processed or filtered data during the experiment. The software performs all these features while offering a user-friendly interface.

  32. Software Model Checking of ARINC-653 Flight Code with MCP

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud

    2010-01-01

    The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high-reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequently arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though there are implicit benefits for partial order reduction possible as a consequence of the API's strict interprocess communication policy.
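
    Time partitioning is the part of the standard that most constrains the model checker. A conceptual sketch of an ARINC-653-style major frame (illustrative Python; the real APEX API is a C interface, and the partition names and durations here are invented):

      # A fixed 100 ms major frame divided into partition windows; a partition
      # may only run inside its own window, which repeats every frame.
      from dataclasses import dataclass

      @dataclass
      class Window:
          partition: str
          start_ms: int       # offset within the major frame
          duration_ms: int

      MAJOR_FRAME_MS = 100
      schedule = [Window("flight_control", 0, 50),
                  Window("telemetry", 50, 30),
                  Window("maintenance", 80, 20)]

      def active_partition(t_ms):
          offset = t_ms % MAJOR_FRAME_MS
          for w in schedule:
              if w.start_ms <= offset < w.start_ms + w.duration_ms:
                  return w.partition
          raise RuntimeError("schedule must cover the whole major frame")

      for t in (0, 49, 50, 85, 130):
          print(f"t={t} ms -> {active_partition(t)}")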

  33. Integrated Software Health Management for Aircraft GN and C

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mengshoel, Ole

    2011-01-01

    Modern aircraft rely heavily on dependable operation of many safety-critical software components. Despite careful design, verification and validation (V&V), on-board software can fail with disastrous consequences if it encounters problematic software/hardware interaction or must operate in an unexpected environment. We are using a Bayesian approach to monitor the software and its behavior during operation and provide up-to-date information about the health of the software and its components. The powerful reasoning mechanism provided by our model-based Bayesian approach makes reliable diagnosis of the root causes possible and minimizes the number of false alarms. Compilation of the Bayesian model into compact arithmetic circuits makes software health management (SWHM) feasible even on platforms with limited CPU power. We show initial results of SWHM on a small simulator of an embedded aircraft software system, where software and sensor faults can be injected.
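
    A toy version of the Bayesian update at the heart of such monitoring (numbers and events invented, not the paper's model): observing an anomalous sensor reading raises the posterior probability of a software fault, while a plausible noise model keeps false alarms in check.

      # Two-hypothesis Bayes update: software fault vs. healthy system.
      prior_fault = 0.01            # P(software fault) before the observation
      p_anom_given_fault = 0.90     # P(anomalous reading | fault)
      p_anom_given_ok = 0.05        # P(anomalous reading | healthy), e.g. sensor noise

      evidence = (p_anom_given_fault * prior_fault
                  + p_anom_given_ok * (1.0 - prior_fault))
      posterior_fault = p_anom_given_fault * prior_fault / evidence
      print(f"P(fault | anomalous reading) = {posterior_fault:.3f}")   # about 0.154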

  34. Integrated software system for low level waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worku, G.

    1995-12-31

    In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications.

  35. Automated Source-Code-Based Testing of Object-Oriented Software

    NASA Astrophysics Data System (ADS)

    Gerlich, Ralf; Gerlich, Rainer; Dietrich, Carsten

    2014-08-01

    With the advent of languages such as C++ and Java in mission- and safety-critical space on-board software, new challenges for testing and specifically automated testing arise. In this paper we discuss some of these challenges, consequences and solutions based on an experiment in automated source-code-based testing for C++.

  36. AIS Modeling and a Satellite for AIS Observations in the High North + Draft New ITU-R Report Improved Satellite Detection of AIS

    DTIC Science & Technology

    2008-09-01

    …to eliminate propagation delay – 3 min. reporting interval to lower message rate • Vessel consequences: – No extra hardware – Transponder software

  37. A Discussion of Using a Reconfigurable Processor to Implement the Discrete Fourier Transform

    NASA Technical Reports Server (NTRS)

    White, Michael J.

    2004-01-01

    This paper presents the design and implementation of the Discrete Fourier Transform (DFT) algorithm on a reconfigurable processor system. While highly applicable to many engineering problems, the DFT is an extremely computationally intensive algorithm. Consequently, the eventual goal of this work is to enhance the execution of a floating-point precision DFT algorithm by offloading the algorithm from the computing system. This computing system, within the context of this research, is a typical high-performance desktop computer with an array of field programmable gate arrays (FPGAs). FPGAs are hardware devices that are configured by software to execute an algorithm. If it is desired to change the algorithm, the software is changed to reflect the modification and then downloaded to the FPGA, which is then itself modified. This paper discusses the methodology for developing the DFT algorithm to be implemented on the FPGA. We discuss the algorithm, the FPGA code effort, and the results to date.
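
    For reference, the computation being offloaded is the direct DFT, X[k] = sum_n x[n] e^(-2*pi*i*k*n/N), an O(N^2) multiply-accumulate workload that maps naturally onto FPGA fabric. A sketch in Python (illustrative; not the paper's FPGA code):

      import numpy as np

      def dft(x):
          """Direct O(N^2) DFT via the twiddle-factor matrix."""
          x = np.asarray(x, dtype=complex)
          N = len(x)
          n = np.arange(N)
          W = np.exp(-2j * np.pi * np.outer(n, n) / N)
          return W @ x

      x = np.random.rand(64)
      assert np.allclose(dft(x), np.fft.fft(x))   # matches the FFT result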

  38. Teachers' Initial Orchestration of Students' Dynamic Geometry Software Use: Consequences for Students' Opportunities to Learn Mathematics

    ERIC Educational Resources Information Center

    Erfjord, Ingvald

    2011-01-01

    This paper reports from a case study with teachers at two schools in Norway participating in developmental projects aiming for inquiry communities in mathematics teaching and learning. In the reported case study, the teachers participated in one of the developmental projects focusing on implementation and use of computer software in mathematics…

  39. Use of Continuous Integration Tools for Application Performance Monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vergara Larrea, Veronica G; Joubert, Wayne; Fuson, Christopher B

    High performance computing systems are becoming increasingly complex, both in node architecture and in the multiple layers of software stack required to compile and run applications. As a consequence, the likelihood is increasing for application performance regressions to occur as a result of routine upgrades of system software components which interact in complex ways. The purpose of this study is to evaluate the effectiveness of continuous integration tools for application performance monitoring on HPC systems. In addition, this paper also describes a prototype system for application performance monitoring based on Jenkins, a Java-based continuous integration tool. The monitoring system described leverages several features in Jenkins to track application performance results over time. Preliminary results and lessons learned from monitoring applications on Cray systems at the Oak Ridge Leadership Computing Facility are presented.
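
    A sketch of the kind of check such a Jenkins job can run after each benchmark (the file format, names, and 10% threshold are assumptions, not details from the paper); a non-zero exit code is what marks the build failed in Jenkins:

      import json, sys

      THRESHOLD = 1.10   # tolerate up to a 10% slowdown vs. the best baseline

      def check(history_file="benchmark_history.json"):
          with open(history_file) as f:
              runs = json.load(f)          # e.g. [{"runtime_s": 102.4}, ...]
          baseline = min(r["runtime_s"] for r in runs[:-1])   # needs a prior run
          current = runs[-1]["runtime_s"]
          if current > THRESHOLD * baseline:
              print(f"FAIL: {current:.1f}s vs baseline {baseline:.1f}s")
              sys.exit(1)                  # fails the Jenkins build
          print(f"OK: {current:.1f}s is within tolerance of {baseline:.1f}s")

      if __name__ == "__main__":
          check()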

  40. Introduction: Cybersecurity and Software Assurance Minitrack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burns, Luanne; George, Richard; Linger, Richard C

    Modern society is dependent on software systems of remarkable scope and complexity. Yet methods for assuring their security and functionality have not kept pace. The result is persistent compromises and failures despite best efforts. Cybersecurity methods must work together for situational awareness, attack prevention and detection, threat attribution, minimization of consequences, and attack recovery. Because defective software cannot be secure, assurance technologies must play a central role in cybersecurity approaches. There is increasing recognition of the need for rigorous methods for cybersecurity and software assurance. The goal of this minitrack is to develop science foundations, technologies, and practices that can improve the security and dependability of complex systems.

  41. The theory and implementation of a high quality pulse width modulated waveform synthesiser applicable to voltage-fed inverters

    NASA Astrophysics Data System (ADS)

    Lower, Kim Nigel

    1985-03-01

    Modulation processes associated with the digital implementation of pulse width modulation (PWM) switching strategies were examined. A software package based on a portable turnkey structure is presented. Waveform synthesizer implementation techniques are reviewed. A three-phase PWM waveform synthesizer for voltage-fed inverters was realized. It is based on a constant carrier frequency of 18 kHz and a regular-sampled, single-edge, asynchronous PWM switching scheme. With high carrier frequencies it is possible to utilize simple switching strategies, and as a consequence many advantages are highlighted, emphasizing their importance to industrial and office markets.
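
    A sketch of the regular-sampled, single-edge scheme described (illustrative Python; the 50 Hz reference and 0.8 modulation index are assumptions): the sine reference is sampled once per 18 kHz carrier period, and each sample fixes that period's pulse width.

      import numpy as np

      f_carrier = 18_000.0    # Hz, constant carrier frequency
      f_ref = 50.0            # Hz, synthesized fundamental (assumed)
      m = 0.8                 # modulation index (assumed)
      T = 1.0 / f_carrier

      def pulse_widths(n_periods):
          k = np.arange(n_periods)
          ref = m * np.sin(2 * np.pi * f_ref * k * T)   # one sample per carrier period
          duty = 0.5 * (1.0 + ref)                      # map [-1, 1] to [0, 1]
          return duty * T                               # single-edge on-times

      widths = pulse_widths(int(f_carrier / f_ref))     # one fundamental cycle
      print(f"{len(widths)} pulses, duty {widths.min()/T:.2f}..{widths.max()/T:.2f}")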

  42. NASA's TReK Project: A Case Study in Using the Spiral Model of Software Development

    NASA Technical Reports Server (NTRS)

    Hendrix, T. Dean; Schneider, Michelle P.

    1998-01-01

    Software development projects face numerous challenges that threaten their successful completion. Whether it is not enough money, too little time, or a case of "requirements creep" that has turned into a full sprint, projects must meet these challenges or face possible disastrous consequences. A robust, yet flexible process model can provide a mechanism through which software development teams can meet these challenges head on and win. This article describes how the spiral model has been successfully tailored to a specific project and relates some notable results to date.

  43. Performance Evaluation of 3D Modeling Software for UAV Photogrammetry

    NASA Astrophysics Data System (ADS)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAVs and freely available internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed; consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of such 3D modeling software are black boxes. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motivation, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.
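
    The standard accuracy measure in such evaluations is the RMSE of the model coordinates against surveyed check points. A sketch (coordinates below are made up for illustration):

      import numpy as np

      surveyed = np.array([[10.00, 20.00, 5.00],
                           [15.00, 22.00, 5.50],
                           [12.00, 25.00, 4.80]])
      modelled = np.array([[10.02, 19.97, 5.06],
                           [14.95, 22.04, 5.42],
                           [12.03, 25.01, 4.85]])

      residuals = modelled - surveyed
      rmse_xyz = np.sqrt(np.mean(residuals ** 2, axis=0))          # per-axis RMSE
      rmse_3d = np.sqrt(np.mean(np.sum(residuals ** 2, axis=1)))   # 3D point RMSE
      print("RMSE x/y/z:", rmse_xyz, "3D:", rmse_3d)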

  44. Evidence of absence (v2.0) software user guide

    USGS Publications Warehouse

    Dalthorp, Daniel; Huso, Manuela; Dail, David

    2017-07-06

    Evidence of Absence software (EoA) is a user-friendly software application for estimating bird and bat fatalities at wind farms and for designing search protocols. The software is particularly useful in addressing whether the number of fatalities is below a given threshold and what search parameters are needed to give assurance that thresholds were not exceeded. The software also includes tools (1) for estimating carcass persistence distributions and searcher efficiency parameters from field trials, (2) for projecting future mortality based on past monitoring data, and (3) for exploring the potential consequences of various choices in the design of long-term incidental take permits for protected species. The software was designed specifically for cases where tolerance for mortality is low and carcass counts are small or even 0, but the tools also may be used for mortality estimates when carcass counts are large.
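
    A toy version of the zero-count question EoA addresses (the binomial zero-observation formula is standard; the detection probability and flat prior are assumptions): if each carcass is found with overall probability g, finding none still bounds the plausible mortality M.

      import numpy as np

      g = 0.4                              # overall detection probability (assumed)
      M = np.arange(0, 21)                 # candidate true mortality counts
      likelihood = (1 - g) ** M            # P(observe 0 carcasses | M)
      posterior = likelihood / likelihood.sum()   # flat prior over the M grid

      m_star = M[np.cumsum(posterior) >= 0.95][0]
      print(f"0 carcasses found, g={g}: M <= {m_star} with 95% credibility")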

  45. Combating unethical publications with plagiarism detection services

    PubMed Central

    Garner, H.R.

    2010-01-01

    About 3,000 new citations that are highly similar to citations in previously published manuscripts appear each year in the biomedical literature (Medline) alone. This underscores the importance of giving editors and reviewers a detection system to identify highly similar text in submitted manuscripts so that they can then review them for novelty. New software-based services, both commercial and free, provide this capability. The availability of such tools provides both a way to intercept suspect manuscripts and a deterrent. Unfortunately, the capabilities of these services vary considerably, mainly as a consequence of the availability and completeness of the literature bases to which new queries are compared. Most of the commercial software has been designed for detection of plagiarism in high school and college papers; however, there is at least one fee-based service (CrossRef) and one free service (etblast.org) which are designed to target the needs of the biomedical publication industry. Information on these various services, examples of the type of operability and output, and things that need to be considered by publishers, editors and reviewers before selecting and using these services is provided. PMID:21194644

  46. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  47. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data

    PubMed Central

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-01-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users. PMID:20501601

  48. Combating unethical publications with plagiarism detection services.

    PubMed

    Garner, H R

    2011-01-01

    About 3,000 new citations that are highly similar to citations in previously published manuscripts appear each year in the biomedical literature (Medline) alone. This underscores the importance of giving editors and reviewers a detection system to identify highly similar text in submitted manuscripts so that they can then review them for novelty. New software-based services, both commercial and free, provide this capability. The availability of such tools provides both a way to intercept suspect manuscripts and a deterrent. Unfortunately, the capabilities of these services vary considerably, mainly as a consequence of the availability and completeness of the literature bases to which new queries are compared. Most of the commercial software has been designed for detection of plagiarism in high school and college papers; however, there is at least 1 fee-based service (CrossRef) and 1 free service (etblast.org), which are designed to target the needs of the biomedical publication industry. Information on these various services, examples of the type of operability and output, and things that need to be considered by publishers, editors, and reviewers before selecting and using these services is provided. Copyright © 2011 Elsevier Inc. All rights reserved.

  49. A system verification platform for high-density epiretinal prostheses.

    PubMed

    Chen, Kuanfu; Lo, Yi-Kai; Yang, Zhi; Weiland, James D; Humayun, Mark S; Liu, Wentai

    2013-06-01

    Retinal prostheses have restored light perception to people worldwide who have poor or no vision as a consequence of retinal degeneration. To advance the quality of visual stimulation for retinal implant recipients, a higher number of stimulation channels is expected in the next generation retinal prostheses, which poses a great challenge to system design and verification. This paper presents a system verification platform dedicated to the development of retinal prostheses. The system includes primary processing, dual-band power and data telemetry, a high-density stimulator array, and two methods for output verification. End-to-end system validation and individual functional block characterization can be achieved with this platform through visual inspection and software analysis. Custom-built software running on the computers also provides a good way for testing new features before they are realized by the ICs. Real-time visual feedbacks through the video displays make it easy to monitor and debug the system. The characterization of the wireless telemetry and the demonstration of the visual display are reported in this paper using a 256-channel retinal prosthetic IC as an example.

  50. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
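
    A sketch of the "reduced form" step (synthetic stand-in data, not E-CAT's actual CGE output): many simulated outcomes are regressed on threat characteristics and background conditions, giving one equation a user can evaluate instantly.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 500
      duration = rng.uniform(1, 30, n)      # threat characteristic (days)
      severity = rng.uniform(0, 1, n)       # threat characteristic (index)
      resilience = rng.uniform(0, 1, n)     # background condition (index)
      # stand-in for the CGE model's simulated GDP loss, with noise
      loss = 2.0 * duration + 40.0 * severity - 25.0 * resilience + rng.normal(0, 2, n)

      X = np.column_stack([np.ones(n), duration, severity, resilience])
      coef, *_ = np.linalg.lstsq(X, loss, rcond=None)    # the reduced-form equation
      print("loss ~ %.2f %+.2f*duration %+.2f*severity %+.2f*resilience"
            % tuple(coef))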

  51. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios, and in particular interoperability, are severely limited. We describe a distributed and collaborative software analysis platform that allows for seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible through our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  12. An intelligent healthcare management system: a new approach in work-order prioritization for medical equipment maintenance requests.

    PubMed

    Hamdi, Naser; Oweis, Rami; Abu Zraiq, Hamzeh; Abu Sammour, Denis

    2012-04-01

    The effective maintenance management of medical technology influences the quality of care delivered and the profitability of healthcare facilities. Medical equipment maintenance in Jordan lacks an objective prioritization system; consequently, the system is not sensitive to the impact of equipment downtime on patient morbidity and mortality. The current work presents a novel software system (EQUIMEDCOMP) that is designed to achieve valuable improvements in the maintenance management of medical technology. This work-order prioritization model sorts medical maintenance requests by calculating a priority index for each request. Model performance was assessed by utilizing maintenance requests from several Jordanian hospitals. The system proved highly efficient in minimizing equipment downtime based on healthcare delivery capacity, and, consequently, patient outcome. Additionally, a preventive maintenance optimization module and an equipment quality control system are incorporated. The system is, therefore, expected to improve the reliability of medical equipment and significantly improve safety and cost-efficiency.
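
    A work-order prioritization of this kind typically reduces to a weighted index over normalized factors. The factors and weights below are hypothetical stand-ins chosen for illustration; the abstract does not disclose EQUIMEDCOMP's actual model.

        # Toy weighted priority index; every factor is normalized to [0, 1].
        def priority_index(risk: float, utilization: float,
                           downtime_impact: float, age: float) -> float:
            weights = {"risk": 0.4, "impact": 0.3, "use": 0.2, "age": 0.1}
            return (weights["risk"] * risk + weights["impact"] * downtime_impact
                    + weights["use"] * utilization + weights["age"] * age)

        requests_ = [("ventilator", 1.0, 0.9, 1.0, 0.3),
                     ("patient scale", 0.1, 0.5, 0.2, 0.8)]
        queue = sorted(requests_, key=lambda r: priority_index(*r[1:]),
                       reverse=True)
        print([name for name, *_ in queue])   # ventilator is served first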

  13. Viewing the functional consequences of traumatic brain injury by using brain SPECT.

    PubMed

    Pavel, D; Jobe, T; Devore-Best, S; Davis, G; Epstein, P; Sinha, S; Kohn, R; Craita, I; Liu, P; Chang, Y

    2006-03-01

    High-resolution brain SPECT is increasingly benefiting from improved image processing software and multiple complementary display capabilities. This enables detailed functional mapping of the disturbances in relative perfusion occurring after TBI. The patient population consisted of 26 cases (ages 8-61 years) between 3 months and 6 years after traumatic brain injury. A very strong case can be made for the routine use of brain SPECT in TBI. Indeed it can provide a detailed evaluation of multiple functional consequences after TBI and is thus capable of supplementing the clinical evaluation and tailoring the therapeutic strategies needed. In so doing it also provides significant additional information beyond that available from MRI/CT. The critical factor for brain SPECT's clinical relevance is a carefully designed technical protocol, including displays which should enable a comprehensive description of the patterns found, in a user-friendly mode.

  14. TLM-Tracker: software for cell segmentation, tracking and lineage analysis in time-lapse microscopy movies.

    PubMed

    Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter

    2012-09-01

    Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena such as culture heterogeneity. In this context, computational image processing for the analysis of single-cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, TLM-Tracker, allows for flexible and user-friendly segmentation, tracking and lineage analysis of microbial cells in time-lapse movies. The software package, including manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.
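
    TLM-Tracker itself is Matlab code, but the segmentation step it performs can be sketched generically; the snippet below uses scikit-image as a stand-in and is not the tool's own algorithm.

        # Generic single-frame cell segmentation (illustrative stand-in).
        import numpy as np
        from skimage import filters, measure

        def segment_cells(frame: np.ndarray):
            """Threshold one microscopy frame and return labelled regions."""
            mask = frame > filters.threshold_otsu(frame)
            labels = measure.label(mask)
            return measure.regionprops(labels)

        # Tracking would then link regions across consecutive frames,
        # e.g. by nearest-centroid matching, to build cell lineages.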

  15. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software.

    PubMed

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians.

  16. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software

    PubMed Central

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians. PMID:25996054

  17. A practical tool for monitoring the performance of measuring systems in a laboratory network: report of an ACB Working Group.

    PubMed

    Ayling, Pete; Hill, Robert; Jassam, Nuthar; Kallner, Anders; Khatami, Zahra

    2017-11-01

    Background A logical consequence of the introduction of robotics and high-capacity analysers has been the consolidation of laboratories into larger units. This requires new structures and quality systems to ensure that laboratories deliver consistent and comparable results. Methods A spreadsheet program was designed to accommodate results from up to 12 different instruments/laboratories and present IQC data, i.e. Levey-Jennings and Youden plots and comprehensive numerical tables of the performance of each item. Input of data was made possible by a 'data loader' by which IQC data from the individual instruments could be transferred to the spreadsheet program online. Results A set of real data from laboratories is used to populate the data loader and the networking software program. Examples are presented from the analysis of variance components and the Levey-Jennings and Youden plots. Conclusions This report presents a software package that allows the simultaneous management and detailed monitoring of the performance of up to 12 different instruments/laboratories in a fully interactive mode. The system allows a quality manager of networked laboratories to have a continuously updated overview of performance. This software package has been made available on the ACB website.
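
    The control limits behind a Levey-Jennings plot are simple to compute; the sketch below shows the conventional mean +-2SD/+-3SD warning and action limits for one instrument's IQC series, with toy values rather than the report's dataset.

        # Levey-Jennings control limits for one instrument's IQC results.
        import numpy as np

        def lj_limits(values: np.ndarray) -> dict:
            mean, sd = values.mean(), values.std(ddof=1)
            return {"mean": mean,
                    "warning": (mean - 2 * sd, mean + 2 * sd),   # +-2SD
                    "action": (mean - 3 * sd, mean + 3 * sd)}    # +-3SD

        iqc = np.array([4.9, 5.1, 5.0, 5.2, 4.8, 5.3, 5.0])  # toy daily QC
        lim = lj_limits(iqc)
        lo, hi = lim["warning"]
        flagged = iqc[(iqc < lo) | (iqc > hi)]   # points breaching +-2SD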

  18. Multi-axis control based on movement control cards in NC systems

    NASA Astrophysics Data System (ADS)

    Jiang, Tingbiao; Wei, Yunquan

    2005-12-01

    Today most movement control cards need special control software on the host computer and are only suitable for fixed-axis control. Consequently, the number of axes that can be controlled is limited. Advanced manufacturing technology is developing at a very high speed, and that development brings forth new requirements for movement control in mechanisms and electronics. This paper introduces a fifth-generation movement control card, the PMAC 2A-PC/104, made by the Delta Tau Company in the USA. Based on an analysis of the PMAC 2A-PC/104, this paper first describes two aspects relevant to the hardware structure of movement control cards and the related host-computer software. Then, two methods are presented for solving these problems. The first method is to set limit switches on the movement control cards; all of them can be used to control each moving axis. The second method is to program application software in an existing programming language (for example, VC++, Visual Basic, Delphi, and so forth). Such a program is much easier for its users to operate and extend. By using a limit switch, users can choose different axes on the movement control cards. Also, users can change some of the parameters in the host-computer control software to realize different control axes. Combining these two methods proves to be convenient for realizing multi-axis control in numerical control systems.

  19. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    PubMed

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-04

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automated, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error are paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC.
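
    PTXQC is an R package, so the snippet below is only a language-neutral illustration of the scoring idea: each raw metric is mapped onto a [0, 1] scale so that heterogeneous metrics can share one overview heatmap. The example metric and thresholds are assumptions, not PTXQC's actual scoring functions.

        # Map a raw QC metric onto [0, 1] for an overview heatmap (toy).
        def score_metric(value: float, best: float, worst: float) -> float:
            """1.0 at the ideal value, 0.0 at or beyond the worst case."""
            span = worst - best      # assumes worst > best
            if span == 0:
                return 1.0
            return min(1.0, max(0.0, (worst - value) / span))

        # e.g. median mass error in ppm: 0 ppm ideal, >= 5 ppm scores 0
        print(score_metric(1.2, best=0.0, worst=5.0))   # ~0.76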

  20. Integration of the Remote Agent for the NASA Deep Space One Autonomy Experiment

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Bernard, Douglas E.; Gamble, Edward B., Jr.; Kanefsky, Bob; Kurien, James; Muscettola, Nicola; Nayak, P. Pandurang; Rajan, Kanna; Lau, Sonie (Technical Monitor)

    1998-01-01

    This paper describes the integration of the Remote Agent (RA), a spacecraft autonomy system which is scheduled to control the Deep Space 1 spacecraft during a flight experiment in 1999. The RA is a reusable, model-based autonomy system that is quite different from software typically used to control an aerospace system. We describe the integration challenges we faced, how we addressed them, and the lessons learned. We focus on those aspects of integrating the RA that were either easier or more difficult than integrating a more traditional large software application because the RA is a model-based autonomous system. A number of characteristics of the RA made the integration process easier. One example is the model-based nature of the RA. Since the RA is model-based, most of its behavior is not hard-coded into procedural program code. Instead, engineers specify high-level models of the spacecraft's components from which the Remote Agent automatically derives correct system-wide behavior on the fly. This high-level, modular, and declarative software description allowed some interfaces between RA components and between the RA and the flight software to be automatically generated and tested for completeness against the Remote Agent's models. In addition, the Remote Agent's model-based diagnosis system automatically diagnoses when the RA models are not consistent with the behavior of the spacecraft. In flight, this feature is used to diagnose failures in the spacecraft hardware. During integration, it proved valuable in finding problems in the spacecraft simulator or flight software. In addition, when modifications are made to the spacecraft hardware or flight software, the RA models are easily changed because they only capture a description of the spacecraft. One does not have to maintain procedural code that implements the correct behavior for every expected situation. On the other hand, several features of the RA made it more difficult to integrate than typical flight software. For example, the definition of correct behavior is more difficult to specify for a system that is expected to reason about and flexibly react to its environment than for a traditional flight software system. Consequently, whenever a change is made to the RA it is more time consuming to determine if the resulting behavior is correct. We conclude the paper with a discussion of future work on the Remote Agent as well as recommendations to ease integration of similar autonomy projects.

  1. Real-time UNIX in HEP data acquisition

    NASA Astrophysics Data System (ADS)

    Buono, S.; Gaponenko, I.; Jones, R.; Mapelli, L.; Mornacchi, G.; Prigent, D.; Sanchez-Corral, E.; Skiadelli, M.; Toppers, A.; Duval, P. Y.; Ferrato, D.; Le Van Suu, A.; Qian, Z.; Rondot, C.; Ambrosini, G.; Fumagalli, G.; Aguer, M.; Huet, M.

    1994-12-01

    Today's experimentation in high energy physics is characterized by an increasing need for sensitivity to rare phenomena and complex physics signatures, which require the use of huge and sophisticated detectors and consequently a high-performance readout and data acquisition. Multi-level triggering, hierarchical data collection and an ever-increasing amount of processing power, distributed throughout the data acquisition layers, impose a number of requirements on the software environment, especially the need for a high level of standardization. Real-time UNIX seems, today, the best solution for the platform independence, operating system interface standards and real-time features necessary for data acquisition in HEP experiments. We present the results of the evaluation, in a realistic application environment, of a real-time UNIX operating system: the EP/LX real-time UNIX system.

  2. [Software for illustrating a cost-quality balance carried out by clinical laboratory practice].

    PubMed

    Nishibori, Masahiro; Asayama, Hitoshi; Kimura, Satoshi; Takagi, Yasushi; Hagihara, Michio; Fujiwara, Mutsunori; Yoneyama, Akiko; Watanabe, Takashi

    2010-09-01

    We have no proper reference indicating the quality of clinical laboratory practice, one which would clearly illustrate that better medical tests require greater expense. The Japanese Society of Laboratory Medicine, concerned about the recent difficult medical economy, issued a committee report proposing a guideline for evaluating good laboratory practice. Following the guideline, we developed software that illustrates the cost-quality balance achieved by clinical laboratory practice. We encountered a number of controversial problems, for example, how to measure and weight each quality-related factor, how to calculate the costs of a laboratory test, and how to take the characteristics of a clinical laboratory into account. Consequently, we completed only prototype software within the given period and budget. In this paper, the software implementation of the guideline and the above-mentioned problems are summarized. To stimulate discussion of these issues, the working software will be posted on the Society's homepage for trial use.

  3. Extracting patterns of database and software usage from the bioinformatics literature

    PubMed Central

    Duck, Geraint; Nenadic, Goran; Brass, Andy; Robertson, David L.; Stevens, Robert

    2014-01-01

    Motivation: As a natural consequence of being a computer-based discipline, bioinformatics has a strong focus on database and software development, but the volume and variety of resources are growing at unprecedented rates. An audit of database and software usage patterns could help provide an overview of developments in bioinformatics and community common practice, and comparing the links between resources through time could demonstrate both the persistence of existing software and the emergence of new tools. Results: We study the connections between bioinformatics resources and construct networks of database and software usage patterns, based on resource co-occurrence, that correspond to snapshots of common practice in the bioinformatics community. We apply our approach to pairings of phylogenetics software reported in the literature and argue that these could provide a stepping stone into the identification of scientific best practice. Availability and implementation: The extracted resource data, the scripts used for network generation and the resulting networks are available at http://bionerds.sourceforge.net/networks/ Contact: robert.stevens@manchester.ac.uk PMID:25161253
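
    The co-occurrence networks described here can be reconstructed from per-paper resource mentions in a few lines; the sketch below uses networkx and toy data rather than the authors' extracted corpus.

        # Build a resource co-occurrence network from per-paper mentions.
        import itertools
        import networkx as nx

        papers = [{"BLAST", "ClustalW", "PhyML"},     # toy data
                  {"BLAST", "MAFFT", "RAxML"},
                  {"MAFFT", "RAxML"}]

        G = nx.Graph()
        for mentions in papers:
            for a, b in itertools.combinations(sorted(mentions), 2):
                w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
                G.add_edge(a, b, weight=w + 1)

        # Edge weight = number of papers in which two tools co-occur
        print(sorted(G.edges(data="weight")))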

  4. RMP*Comp

    EPA Pesticide Factsheets

    You can use this free software program to complete the Off-site Consequence Analyses (both worst case scenarios and alternative scenarios) required under the Risk Management Program rule, so that you don't have to do calculations by hand.

  5. Computational science: shifting the focus from tools to models

    PubMed Central

    Hinsen, Konrad

    2014-01-01

    Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728

  6. Integrated Systems Health Management (ISHM) Toolkit

    NASA Technical Reports Server (NTRS)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  7. An Inquiry into the Cost of Post Deployment Software Support (PDSS)

    DTIC Science & Technology

    1989-09-01

    The increasing cost of software maintenance is taking a larger share of the military budget each year... increments as needed (3:59). The second page of the Form 75 starts with a section stating how the hours, and consequently the funds, will be allocated to... length of time required, the timeline can be in hourly, weekly, monthly, or quarterly increments. Some milestones included are formal approval, test

  8. MediaTracker system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sandoval, D. M.; Strittmatter, R. B.; Abeyta, J. D.

    2004-01-01

    The initial objectives of this effort were to provide a hardware and software platform that can address the requirements for the accountability of classified removable electronic media and vault access logging. The MediaTracker system software assists classified media custodians in managing vault access logging and media tracking to prevent the inadvertent violation of rules or policies for access to a restricted area and the movement and use of tracked items. The MediaTracker system includes the software tools to track and account for high-consequence security assets and high-value items. The overall benefits include: (1) real-time access to the disposition of all Classified Removable Electronic Media (CREM), (2) streamlined security procedures and requirements, (3) removal of ambiguity and managerial inconsistencies, (4) prevention of incidents that can and should be prevented, (5) alignment with the DOE's initiative to achieve improvements in security and facility operations through technology deployment, and (6) enhanced individual responsibility by providing a consistent method of dealing with daily responsibilities. In response to initiatives to enhance the control of classified removable electronic media (CREM), the MediaTracker software suite was developed, piloted and implemented at the Los Alamos National Laboratory beginning in July 2000. The MediaTracker software suite assists in the accountability and tracking of CREM and other high-value assets. One component of the MediaTracker software suite provides a Laboratory-approved media tracking system. Using commercial touch screen and bar code technology, the MediaTracker (MT) component provides an efficient and effective means to meet current Laboratory requirements and provides newly engineered controls to help assure compliance with those requirements. It also establishes a computer infrastructure at vault entrances for vault access logging, and can accommodate several methods of positive identification, including smart cards and biometrics. Currently, we have three mechanisms that provide added security for accountability and tracking purposes. The first mechanism is a portable, hand-held inventory scanner, which allows the custodian to physically track items that are not accessible within a particular area. The second mechanism is a radio frequency identification (RFID) monitoring portal, which tracks and logs in a database all activity of tagged items that pass through the portals. The third mechanism is the electronic tagging of a flash memory device for automated inventory of CREM in storage. By modifying this USB device, the user is provided with added assurance, preventing the data from being obtained from any other computer.

  9. Proposal for hierarchical description of software systems

    NASA Technical Reports Server (NTRS)

    Thauboth, H.

    1973-01-01

    The programming of digital computers has developed into a new dimension full of difficulties, because the hardware of computers has become so powerful that more complex applications are entrusted to computers. The costs of software development, verification, and maintenance are outpacing those of the hardware, and the trend is toward a further increase in the sophistication of computer applications and consequently of software. To obtain better visibility into software systems and to improve the structure of software systems for better testing, verification, and maintenance, a clear but rigorous description and documentation of software is needed. The purpose of the report is to extend the present methods in order to obtain documentation that better reflects the interplay between the various components and functions of a software system at different levels of detail without losing precision of expression. This is done by the use of block diagrams, sequence diagrams, and cross-reference charts. In the appendices, examples from an actual large software system, i.e. the Marshall System for Aerospace Systems Simulation (MARSYAS), are presented. The proposed documentation structure is compatible with automating the updating of significant portions of the documentation for better software change control.

  10. Aviation Environmental Design Tool (AEDT): Version 2d: Installation Guide

    DOT National Transportation Integrated Search

    2017-09-01

    Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...

  11. Vienna Fortran - A Language Specification. Version 1.1

    DTIC Science & Technology

    1992-03-01

    other computer architectures is the fact that the memory is physically distributed among the processors; the time required to access a non-local... datum may be an order of magnitude higher than the time taken to access locally stored data. This has important consequences for program efficiency. In... machine in many aspects. It is tedious, time-consuming and error prone. It has led to particularly slow software development cycles and, in consequence

  12. SharP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venkata, Manjunath Gorentla; Aderholdt, William F

    The pre-exascale systems are expected to have a significant amount of hierarchical and heterogeneous on-node memory, and this trend in system architecture for extreme-scale systems is expected to continue into the exascale era. Along with hierarchical-heterogeneous memory, such a system typically has a high-performing network and a compute accelerator. This system architecture is not only effective for running traditional High Performance Computing (HPC) applications (Big-Compute), but also for running data-intensive HPC applications and Big-Data applications. As a consequence, there is a growing desire to have a single system serve the needs of both Big-Compute and Big-Data applications. Though the system architecture supports the convergence of Big-Compute and Big-Data, the programming models and software layer have yet to evolve to support either hierarchical-heterogeneous memory systems or the convergence. SharP is a programming abstraction that addresses this problem. It is implemented as a software library and runs on pre-exascale and exascale systems supporting current and emerging system architectures. Using distributed data structures as a central concept, it provides (1) a simple, usable, and portable abstraction for hierarchical-heterogeneous memory and (2) a unified programming abstraction for Big-Compute and Big-Data applications.

  13. Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes.

    PubMed

    Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J; Wang, Liliang; Lin, Jianguo

    2016-12-13

    The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However, these alloys, and in particular their high strength variants, exhibit limited formability at room temperature, and high temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner, thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions.

  14. cn.FARMS: a latent variable model to detect copy number variations in microarray data with a low false discovery rate.

    PubMed

    Clevert, Djork-Arné; Mitterecker, Andreas; Mayr, Andreas; Klambauer, Günter; Tuefferd, Marianne; De Bondt, An; Talloen, Willem; Göhlmann, Hinrich; Hochreiter, Sepp

    2011-07-01

    Cost-effective oligonucleotide genotyping arrays like the Affymetrix SNP 6.0 are still the predominant technique to measure DNA copy number variations (CNVs). However, CNV detection methods for microarrays overestimate both the number and the size of CNV regions and, consequently, suffer from a high false discovery rate (FDR). A high FDR means that many wrongly detected CNVs, which are not associated with a disease, must nevertheless be accounted for by the correction for multiple testing in a clinical study, thereby decreasing the study's discovery power. For controlling the FDR, we propose a probabilistic latent variable model, 'cn.FARMS', which is optimized by a Bayesian maximum a posteriori approach. cn.FARMS controls the FDR through the information gain of the posterior over the prior. The prior represents the null hypothesis of copy number 2 for all samples, from which the posterior can only deviate by strong and consistent signals in the data. On HapMap data, cn.FARMS clearly outperformed the two most prevalent methods with respect to sensitivity and FDR. The software cn.FARMS is publicly available as an R package at http://www.bioinf.jku.at/software/cnfarms/cnfarms.html.

  15. Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes

    PubMed Central

    Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J.; Wang, Liliang; Lin, Jianguo

    2016-01-01

    The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However, these alloys, and in particular their high strength variants, exhibit limited formability at room temperature, and high temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner, thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions. PMID:28060298

  16. AKM in Open Source Communities

    NASA Astrophysics Data System (ADS)

    Stamelos, Ioannis; Kakarontzas, George

    Previous chapters in this book have dealt with Architecture Knowledge Management in traditional Closed Source Software (CSS) projects. This chapter will attempt to examine the ways that knowledge is shared among participants in Free Libre Open Source Software (FLOSS) projects and how architectural knowledge is managed with respect to CSS. FLOSS projects are organized and developed in a fundamentally different way than CSS projects; they simply do not develop code as CSS projects do. As a consequence, their knowledge management mechanisms are also based on different concepts and tools.

  17. NASGRO(registered trademark): Fracture Mechanics and Fatigue Crack Growth Analysis Software

    NASA Technical Reports Server (NTRS)

    Forman, Royce; Shivakumar, V.; Mettu, Sambi; Beek, Joachim; Williams, Leonard; Yeh, Feng; McClung, Craig; Cardinal, Joe

    2004-01-01

    This viewgraph presentation describes NASGRO, which is a fracture mechanics and fatigue crack growth analysis software package that is used to reduce risk of fracture in Space Shuttles. The contents include: 1) Consequences of Fracture; 2) NASA Fracture Control Requirements; 3) NASGRO Reduces Risk; 4) NASGRO Use Inside NASA; 5) NASGRO Components: Crack Growth Module; 6) NASGRO Components:Material Property Module; 7) Typical NASGRO analysis: Crack growth or component life calculation; and 8) NASGRO Sample Application: Orbiter feedline flowliner crack analysis.

  18. In Law We Trust? Trusted Computing and Legal Responsibility for Internet Security

    NASA Astrophysics Data System (ADS)

    Danidou, Yianna; Schafer, Burkhard

    This paper analyses potential legal responses and consequences to the anticipated roll out of Trusted Computing (TC). It is argued that TC constitutes such a dramatic shift in power away from users to the software providers, that it is necessary for the legal system to respond. A possible response is to mirror the shift in power by a shift in legal responsibility, creating new legal liabilities and duties for software companies as the new guardians of internet security.

  19. Aviation Environmental Design Tool (AEDT): Version 2b: Installation Guide : [December 2015]

    DOT National Transportation Integrated Search

    2015-12-01

    Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...

  20. Aviation Environmental Design Tool (AEDT): Version 2b: Installation Guide : [June 2016]

    DOT National Transportation Integrated Search

    2016-06-01

    Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...

  1. Aviation Environmental Design Tool (AEDT): Version 2b: Installation Guide : [July 2015]

    DOT National Transportation Integrated Search

    2015-07-01

    Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...

  2. Interactive multimedia preventive alcohol education: a technology application in higher education.

    PubMed

    Reis, J; Riley, W; Lokman, L; Baer, J

    2000-01-01

    This article summarizes the process of implementation, and the short-term impact on knowledge and attitudes, of an interactive multimedia software program for preventive alcohol education for young adults. The three factors related to behavioral change addressed in the software are self-efficacy in maintaining personal control and safety while using alcohol, attitudes and related expectations regarding the physiological and behavioral consequences of alcohol consumption, and peer norms regarding alcohol consumption. Compared with students receiving alternative alcohol education or no alcohol education, students using the interactive computer lesson reported learning more about dose-response effects and ways to intervene with friends in peril. The article concludes with consideration of the value of this technology for informing students about the consequences of alcohol use, and its utility to higher education institutions in an era when pressure for due diligence around student safety is increasing but few additional institutional resources are available.

  3. PolyPhred analysis software for mutation detection from fluorescence-based sequence data.

    PubMed

    Montgomery, Kate T; Iartchouck, Oleg; Li, Li; Loomis, Stephanie; Obourn, Vanessa; Kucherlapati, Raju

    2008-10-01

    The ability to search for genetic variants that may be related to human disease is one of the most exciting consequences of the availability of the sequence of the human genome. Large cohorts of individuals exhibiting certain phenotypes can be studied and candidate genes resequenced. However, the challenge of analyzing sequence data from many individuals with accuracy, speed, and economy is great. This unit describes one set of software tools: Phred, Phrap, PolyPhred, and Consed. Coverage includes the advantages and disadvantages of these analysis tools, details for obtaining and using the software, and the results one may expect. The software is being continually updated to permit further automation of mutation analysis. Currently, however, at least some manual review is required if one wishes to identify 100% of the variants in a sample set.

  4. Mapping of H.264 decoding on a multiprocessor architecture

    NASA Astrophysics Data System (ADS)

    van der Tol, Erik B.; Jaspers, Egbert G.; Gelderblom, Rob H.

    2003-05-01

    Due to the increasing significance of development costs in the competitive domain of high-volume consumer electronics, generic solutions are required to enable reuse of the design effort and to increase the potential market volume. As a result, Systems-on-Chip (SoCs) contain a growing amount of fully programmable media processing devices, as opposed to application-specific systems, which offered the most attractive solutions due to a high performance density. The following motivates this trend. First, SoCs are increasingly dominated by their communication infrastructure and embedded memory, thereby making the cost of the functional units less significant. Moreover, the continuously growing design costs require generic solutions that can be applied over a broad product range. Hence, powerful programmable SoCs are becoming increasingly attractive. However, to enable power-efficient designs that are also scalable over advancing VLSI technology, parallelism should be fully exploited. Both task-level and instruction-level parallelism can be provided by means of, e.g., a VLIW multiprocessor architecture. To provide the above-mentioned scalability, we propose to partition the data over the processors, instead of traditional functional partitioning. An advantage of this approach is the inherent locality of data, which is extremely important for communication-efficient software implementations. Consequently, a software implementation is discussed, enabling, e.g., standard-definition (SD) H.264 decoding with a two-processor architecture, whereas High-Definition (HD) decoding can be achieved with an eight-processor system executing the same software. Experimental results show that data communication is reduced by up to 65%, directly improving the overall performance. Apart from the considerable improvement in memory bandwidth, this novel concept of partitioning offers a natural approach to optimally balancing the load of all processors, thereby further improving the overall speedup.
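
    The data-partitioning idea (as opposed to functional partitioning) can be illustrated independently of H.264: split a frame into row bands, give each processor its own band, and keep all intermediate data local. The toy filter below stands in for per-band decoding work; it is a sketch of the concept, not the paper's implementation.

        # Data partitioning across workers; toy per-band work, not a decoder.
        from concurrent.futures import ProcessPoolExecutor
        import numpy as np

        def process_band(band: np.ndarray) -> np.ndarray:
            """Stand-in for decoding one horizontal band; data stays local."""
            return band * 2

        def process_frame(frame: np.ndarray, workers: int = 2) -> np.ndarray:
            bands = np.array_split(frame, workers, axis=0)
            with ProcessPoolExecutor(max_workers=workers) as pool:
                return np.vstack(list(pool.map(process_band, bands)))

        if __name__ == "__main__":
            frame = np.zeros((720, 1280), dtype=np.uint8)
            out = process_frame(frame)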

  5. Aviation Environmental Design Tool (AEDT): Version 2c Service Pack 2: Installation Guide

    DOT National Transportation Integrated Search

    2017-03-01

    Aviation Environmental Design Tool (AEDT) is a software system that models aircraft performance in space and time to estimate fuel consumption, emissions, noise, and air quality consequences. AEDT facilitates environmental review activities required ...

  6. ATLAS particle detector CSC ROD software design and implementation, and, Addition of K physics to chi-squared analysis of FDQM

    NASA Astrophysics Data System (ADS)

    Hawkins, Donovan Lee

    In this thesis I present a software framework for use on the ATLAS muon CSC readout driver. This C++ framework uses plug-in Decoders incorporating hand-optimized assembly language routines to perform sparsification and data formatting. The software is designed with both flexibility and performance in mind, and runs on a custom 9U VME board using Texas Instruments TMS320C6203 digital signal processors. I describe the requirements of the software, the methods used in its design, and the results of testing the software with simulated data. I also present modifications to a chi-squared analysis of the Standard Model and Four Down Quark Model (FDQM) originally done by Dr. Dennis Silverman. The addition of four new experiments to the analysis has little effect on the Standard Model but provides important new restrictions on the FDQM. The method used to incorporate these new experiments is presented, and the consequences of their addition are reviewed.

  7. Using Docker Compose for the Simple Deployment of an Integrated Drug Target Screening Platform.

    PubMed

    List, Markus

    2017-06-10

    Docker virtualization allows software tools to be executed in an isolated and controlled environment referred to as a container. In Docker containers, dependencies are provided exactly as intended by the developer and, consequently, they simplify the distribution of scientific software and foster reproducible research. The Docker paradigm is that each container encapsulates one particular software tool. However, to analyze complex biomedical data sets, it is often necessary to combine several software tools into elaborate workflows. To address this challenge, several Docker containers need to be instantiated and properly integrated, which complicates the software deployment process unnecessarily. Here, we demonstrate how an extension to Docker, Docker Compose, can be used to mitigate these problems by providing a unified setup routine that deploys several tools in an integrated fashion. We demonstrate the power of this approach with the example of a Docker Compose setup for a drug target screening platform consisting of five integrated web applications and shared infrastructure, deployable in just two lines of code.
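
    The "two lines" are presumably shell commands along the lines of cloning the repository and running docker-compose up; the Python wrapper below is shown only to keep one language across these examples, and the repository URL is a placeholder, not the paper's.

        # Placeholder deployment wrapper; the repository URL is invented.
        import subprocess

        subprocess.run(["git", "clone",
                        "https://github.com/example/screening-platform"],
                       check=True)
        subprocess.run(["docker-compose", "up", "-d"],   # start all services
                       cwd="screening-platform", check=True)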

  8. Choosing a software design method for real-time Ada applications: JSD process inversion as a means to tailor a design specification to the performance requirements and target machine

    NASA Technical Reports Server (NTRS)

    Withey, James V.

    1986-01-01

    The validity of real-time software is determined by its ability to execute on a computer within the time constraints of the physical system it is modeling. In many applications the time constraints are so critical that the details of process scheduling are elevated to the requirements analysis phase of the software development cycle. It is not uncommon to find specifications for a real-time cyclic executive program included or assumed in such requirements. It was found that preliminary designs structured around this implementation obscure the data flow of the real-world system being modeled, and that the resulting software is consequently difficult and costly to maintain, update and reuse. A cyclic executive is a software component that schedules and implicitly synchronizes the real-time software through periodic and repetitive subroutine calls. Therefore a design method is sought that allows the deferral of process scheduling to the later stages of design. The appropriate scheduling paradigm must be chosen given the performance constraints, the target environment and the software's lifecycle. The concept of process inversion is explored with respect to the cyclic executive.
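
    For readers unfamiliar with the construct, a cyclic executive can be sketched in a few lines: a fixed minor-frame loop that calls each task's subroutine at its assigned rate, so scheduling and synchronization are implicit in the call order. The frame length and task set below are illustrative assumptions.

        # Toy cyclic executive: fixed frames, tasks called at fixed rates.
        import time

        def attitude_task():  pass     # stand-in 10 Hz task
        def telemetry_task(): pass     # stand-in 5 Hz task

        MINOR_FRAME = 0.1                                # seconds
        SCHEDULE = {0: [attitude_task, telemetry_task],  # even frames
                    1: [attitude_task]}                  # odd frames

        def run(n_frames: int) -> None:
            for i in range(n_frames):
                start = time.monotonic()
                for task in SCHEDULE[i % 2]:
                    task()              # run-to-completion, fixed order
                spare = MINOR_FRAME - (time.monotonic() - start)
                if spare > 0:
                    time.sleep(spare)   # idle until the next frame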

  9. Using multi-attribute decision-making approaches in the selection of a hospital management system.

    PubMed

    Arasteh, Mohammad Ali; Shamshirband, Shahaboddin; Yee, Por Lip

    2018-01-01

    Choosing the most appropriate organizational software is always a real challenge for managers, especially IT directors. The term "enterprise software selection" refers to purchasing, creating, or commissioning software that, first, is best adapted to the requirements of the organization and, second, has a suitable price and technical support. Specifying selection criteria and ranking them is the primary prerequisite for this action. This article provides a method to evaluate, rank, and compare the available enterprise software in order to choose the apt one. The method consists of a three-stage process. First, it identifies the organizational requirements and assesses them. Second, it selects the best acquisition approach from three possibilities: in-house production, buying software, or commissioning custom software for local use. Third, it evaluates, compares and ranks the alternative software. The third stage uses different methods of multi-attribute decision making (MADM) and compares the resulting rankings. Based on the different characteristics of the problem, several methods were tested, namely the Analytic Hierarchy Process (AHP), the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), Elimination and Choice Expressing Reality (ELECTRE), and a simple weighting method. Finally, we propose the most practical method for similar problems.
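
    Of the MADM methods listed, TOPSIS is compact enough to sketch directly; the decision matrix and weights below are toy values rather than the study's data, and all criteria are treated as benefits.

        # Compact TOPSIS over a toy decision matrix (rows = alternatives).
        import numpy as np

        scores = np.array([[7.0, 9.0, 5.0],     # cols: fit, support, value
                           [8.0, 6.0, 7.0],
                           [6.0, 7.0, 9.0]])
        weights = np.array([0.5, 0.3, 0.2])

        v = scores / np.linalg.norm(scores, axis=0) * weights  # normalize, weight
        ideal, anti = v.max(axis=0), v.min(axis=0)             # benefit criteria
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - anti, axis=1)
        closeness = d_neg / (d_pos + d_neg)
        print(closeness.argsort()[::-1])    # alternatives ranked best-first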

  10. Field Programmable Gate Array Failure Rate Estimation Guidelines for Launch Vehicle Fault Tree Models

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Britton, Paul; Hatfield, Glen Spencer; Novack, Steven D.

    2017-01-01

    Today's launch vehicles' complex electronic and avionics systems heavily utilize Field Programmable Gate Array (FPGA) integrated circuits (ICs) for their superb speed and reconfiguration capabilities. Consequently, FPGAs are prevalent ICs in communication protocols such as MIL-STD-1553B and in control signal commands such as solenoid valve actuations. This paper will identify reliability concerns and high-level guidelines to estimate FPGA total failure rates in a launch vehicle application. The paper will discuss hardware, hardware description language, and radiation-induced failures. The hardware contribution of the approach accounts for physical failures of the IC. The hardware description language portion will discuss the high-level FPGA programming languages and software/code reliability growth. The radiation portion will discuss FPGA susceptibility to space environment radiation.

  11. A Discussion of the Software Quality Assurance Role

    NASA Technical Reports Server (NTRS)

    Kandt, Ronald Kirk

    2010-01-01

    The basic idea underlying this paper is that the conventional understanding of the role of a Software Quality Assurance (SQA) engineer is unduly limited. This is because few have asked who the customers of a SQA engineer are. Once you do this, you can better define what tasks a SQA engineer should perform, as well as identify the knowledge and skills that such a person should have. The consequence of doing this is that a SQA engineer can provide greater value to his or her customers. It is the position of this paper that a SQA engineer providing significant value to his or her customers must not only assume the role of an auditor, but also that of a software and systems engineer. This is because software engineers and their managers particularly value contributions that directly impact products and their development. These ideas are summarized as lessons learned, based on my experience at Jet Propulsion Laboratory (JPL).

  12. A planning language for activity scheduling

    NASA Technical Reports Server (NTRS)

    Zoch, David R.; Lavallee, David; Weinstein, Stuart; Tong, G. Michael

    1991-01-01

    Mission planning and scheduling of spacecraft operations are becoming more complex at NASA. Described here are a mission planning process; a robust, flexible planning language for spacecraft and payload operations; and a software scheduling system that generates schedules based on planning language inputs. The mission planning process often involves many people and organizations. Consequently, a planning language is needed to facilitate communication, to provide a standard interface, and to represent flexible requirements. The software scheduling system interprets the planning language and uses the resource, time duration, constraint, and alternative plan flexibilities to resolve scheduling conflicts.

  13. Bureaucracy, Safety and Software: a Potentially Lethal Cocktail

    NASA Astrophysics Data System (ADS)

    Hatton, Les

    This position paper identifies a potential problem with the evolution of software controlled safety critical systems. It observes that the rapid growth of bureaucracy in society quickly spills over into rules for behaviour. Whether the need for the rules comes first or there is simple anticipation of the need for a rule by a bureaucrat is unclear in many cases. Many such rules lead to draconian restrictions and often make the existing situation worse due to the presence of unintended consequences as will be shown with a number of examples.

  14. Psychosocial risks, burnout and intention to quit following the introduction of new software at work.

    PubMed

    Knani, Mouna; Fournier, Pierre-Sébastien; Biron, Caroline

    2018-05-01

    Despite a rich literature on the association between psychosocial factors, the demand-control-support (DCS) model and burnout, there are few integrated frameworks encompassing the DCS model, burnout and intention to quit, particularly in a technological context. This manuscript examines the relationships between psychosocial risks, the demand-control-support (DCS) model, burnout syndrome and intention to quit following the introduction of new software at work. Data was collected from agents and advisors working at a Canadian university and using new study management software. An online questionnaire was sent via the university's internal mail. In total, 112 people completed the online survey, for a response rate of 60.9%. The results of structural equation modeling show that psychological demands, decision latitude and social support are associated with burnout. It is also clear that burnout, in particular depersonalization and emotional exhaustion, is positively associated with intention to quit. The few studies that address the negative consequences of technology on quality of life in the workplace, and particularly on health, have not succeeded in establishing a direct link between a deterioration of health and the use of technology. This is due to the fact that there are few epidemiological studies on the direct consequences of the use of ICT on health.

  15. SVM classifier on chip for melanoma detection.

    PubMed

    Afifi, Shereen; GholamHosseini, Hamid; Sinha, Roopak

    2017-07-01

    Support Vector Machine (SVM) is a common classifier used for efficient classification with high accuracy. SVM shows high accuracy for classifying melanoma (skin cancer) clinical images within computer-aided diagnosis systems used by skin cancer specialists to detect melanoma early and save lives. We aim to develop a medical low-cost handheld device that runs a real-time embedded SVM-based diagnosis system for use in primary care for early detection of melanoma. In this paper, an optimized SVM classifier is implemented on a recent FPGA platform using the latest design methodology, to be embedded into the proposed device for realizing online efficient melanoma detection on a single system on chip/device. The hardware implementation results demonstrate a high classification accuracy of 97.9% and a significant acceleration factor of 26 over an equivalent software implementation on an embedded processor, with 34% resource utilization and 2 W power consumption. Consequently, the implemented system meets crucial embedded-system constraints of high performance and low cost, resource utilization and power consumption, while achieving high classification accuracy.
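
    A software baseline for such a classifier is straightforward to reproduce; the sketch below trains an RBF-kernel SVM with scikit-learn on placeholder features, and is not the authors' FPGA design or dataset.

        # Software SVM baseline on synthetic stand-in data.
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=400, n_features=20, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(X_tr, y_tr)
        print(f"held-out accuracy: {clf.score(X_te, y_te):.3f}")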

  16. Impact of detector simulation in particle physics collider experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elvira, V. Daniel

    Through the last three decades, precise simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the accuracy of the physics results and publication turnaround, from data-taking to submission. It also presents the economic impact and cost of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data, taxing heavily the performance of simulation and reconstruction software for increasingly complex detectors. Consequently, it becomes urgent to find solutions to speed up simulation software in order to cope with the increased demand in a time of flat budgets. The study ends with a short discussion on the potential solutions that are being explored, by leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering of HEP code for concurrency and parallel computing.

  17. Impact of detector simulation in particle physics collider experiments

    DOE PAGES

    Elvira, V. Daniel

    2017-06-01

    Through the last three decades, precise simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the accuracy of the physics results and publication turnaround, from data-taking to submission. It also presents the economic impact and cost of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data, taxing heavily the performance of simulation and reconstruction software for increasingly complex detectors. Consequently, it becomes urgent to find solutions to speed up simulation software in order to cope with the increased demand in a time of flat budgets. The study ends with a short discussion on the potential solutions that are being explored, by leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering of HEP code for concurrency and parallel computing.

  18. Implementation errors in the GingerALE Software: Description and recommendations.

    PubMed

    Eickhoff, Simon B; Laird, Angela R; Fox, P Mickle; Lancaster, Jack L; Fox, Peter T

    2017-01-01

    Neuroscience imaging is a burgeoning, highly sophisticated field the growth of which has been fostered by grant-funded, freely distributed software libraries that perform voxel-wise analyses in anatomically standardized three-dimensional space on multi-subject, whole-brain, primary datasets. Despite the ongoing advances made using these non-commercial computational tools, the replicability of individual studies is an acknowledged limitation. Coordinate-based meta-analysis offers a practical solution to this limitation and, consequently, plays an important role in filtering and consolidating the enormous corpus of functional and structural neuroimaging results reported in the peer-reviewed literature. In both primary data and meta-analytic neuroimaging analyses, correction for multiple comparisons is a complex but critical step for ensuring statistical rigor. Reports of errors in multiple-comparison corrections in primary-data analyses have recently appeared. Here, we report two such errors in GingerALE, a widely used, US National Institutes of Health (NIH)-funded, freely distributed software package for coordinate-based meta-analysis. These errors have given rise to published reports with more liberal statistical inferences than were specified by the authors. The intent of this technical report is threefold. First, we inform authors who used GingerALE of these errors so that they can take appropriate actions including re-analyses and corrective publications. Second, we seek to exemplify and promote an open approach to error management. Third, we discuss the implications of these and similar errors in a scientific environment dependent on third-party software. Hum Brain Mapp 38:7-11, 2017. © 2016 Wiley Periodicals, Inc.
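
    To illustrate the kind of multiple-comparison correction at stake, the following is a minimal sketch of the classic Benjamini-Hochberg false discovery rate procedure in Python. It is not GingerALE's actual implementation; the voxel-wise p-values and the q threshold are illustrative assumptions.

        import numpy as np

        def benjamini_hochberg(pvals, q=0.05):
            """Return a boolean mask of p-values declared significant at FDR level q.

            Step-up procedure: sort p-values, find the largest k with
            p_(k) <= (k/m) * q, and reject hypotheses 1..k.
            """
            p = np.asarray(pvals)
            m = p.size
            order = np.argsort(p)
            thresholds = (np.arange(1, m + 1) / m) * q
            below = p[order] <= thresholds
            mask = np.zeros(m, dtype=bool)
            if below.any():
                k = np.max(np.nonzero(below)[0])  # largest rank passing the test
                mask[order[:k + 1]] = True
            return mask

        # Illustrative voxel-wise p-values (hypothetical data, not real neuroimaging results).
        rng = np.random.default_rng(0)
        pvals = np.concatenate([rng.uniform(0, 0.001, 50), rng.uniform(0, 1, 950)])
        print(benjamini_hochberg(pvals).sum(), "voxels survive FDR correction")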

  19. Impact of detector simulation in particle physics collider experiments

    NASA Astrophysics Data System (ADS)

    Daniel Elvira, V.

    2017-06-01

    Through the last three decades, accurate simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics (HEP) experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the precision of the physics results and publication turnaround, from data-taking to submission. It also presents estimates of the cost and economic impact of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data with increasingly complex detectors, taxing heavily the performance of simulation and reconstruction software. Consequently, exploring solutions to speed up simulation and reconstruction software to satisfy the growing demand of computing resources in a time of flat budgets is a matter that deserves immediate attention. The article ends with a short discussion on the potential solutions that are being considered, based on leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering HEP code for concurrency and parallel computing.

  20. Pedagogical issues for effective teaching of biosignal processing and analysis.

    PubMed

    Sandham, William A; Hamilton, David J

    2010-01-01

    Biosignal processing and analysis is generally perceived by many students to be a challenging topic to understand, and to become adept with the necessary analytical skills. This is a direct consequence of the high mathematical content involved, and the many abstract features of the topic. The MATLAB and Mathcad software packages offer an excellent algorithm development environment for teaching biosignal processing and analysis modules, and can also be used effectively in many biosignal, and indeed bioengineering, research areas. In this paper, traditional introductory and advanced biosignal processing (and analysis) syllabi are reviewed, and the use of MATLAB and Mathcad for teaching and research is illustrated with a number of examples.

  1. 3D Cultural Heritage Documentation: A Comparison Between Different Photogrammetric Software and Their Products

    NASA Astrophysics Data System (ADS)

    Gagliolo, S.; Ausonio, E.; Federici, B.; Ferrando, I.; Passoni, D.; Sguerso, D.

    2018-05-01

    The conservation of Cultural Heritage depends on the availability of means and resources and, consequently, on the possibility of carrying out effective operations of data acquisition. In fact, on the one hand the creation of data repositories allows the description of the present state of the art, in order to preserve the testimonial value and to permit fruition. On the other hand, data acquisition grants metrical knowledge, which is particularly useful for a direct restoration of the surveyed objects, through the analysis of their 3D digital models. In the last decades, the continuous increase and improvement of 3D survey techniques and of tools for geometric and digital data management have represented a great support to the development of documentary activities. In particular, Photogrammetry is a survey technique highly appropriate for the creation of data repositories in the field of Cultural Heritage, thanks to its advantages of cheapness, flexibility, speed, and the opportunity to ensure the operators' safety in hazardous areas too. In order to obtain complete documentation, the high precision of the on-site operations must be coupled with an effective post-processing phase. Hence, a comparison among some of the photogrammetric software packages currently available was performed by the authors, with particular attention to the completeness of the workflow and the quality of the final products.

  2. Cloud4Psi: cloud computing for 3D protein structure similarity searching.

    PubMed

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur

    2014-10-01

    Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT), are still time consuming. As a consequence, performing similarity searching against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed a cloud-based system that allows scaling of the similarity searching process vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. © The Author 2014. Published by Oxford University Press.
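
    As a rough illustration of what scaling a similarity search horizontally means, here is a minimal Python sketch that fans a batch of query-versus-target comparisons out over worker processes. The scoring function and structure identifiers are placeholders invented for the example, not Cloud4Psi's API.

        from multiprocessing import Pool

        def similarity_score(pair):
            """Placeholder for an expensive pairwise comparison such as CE or FATCAT."""
            query, target = pair
            return (target, sum(ord(c) for c in query + target) % 100)  # dummy score

        if __name__ == "__main__":
            query = "1HHO"  # hypothetical query structure identifier
            targets = [f"STRUCT_{i:04d}" for i in range(1000)]  # hypothetical repository
            with Pool(processes=8) as pool:  # "scaling out" over computational units
                scores = pool.map(similarity_score, [(query, t) for t in targets])
            best = max(scores, key=lambda s: s[1])
            print("best hit:", best)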

  3. Three-dimensional finite volume modelling of blood flow in simulated angular neck abdominal aortic aneurysm

    NASA Astrophysics Data System (ADS)

    Algabri, Y. A.; Rookkapan, S.; Chatpun, S.

    2017-09-01

    An abdominal aortic aneurysm (AAA) is a deadly cardiovascular disease defined as a focal dilation of a blood artery. The healthy aorta is between 15 and 24 mm in diameter, depending on gender, body weight, and age. When the diameter increases to 30 mm or more, rupture can occur if the aneurysm keeps growing or is left untreated. Moreover, the proximal angular neck of the aneurysm is categorized as a significant morphological feature with prime harmful effects on endovascular aneurysm repair (EVAR). The flow pattern in a pathological vessel can influence the vascular intervention. The aim of this study is to investigate blood flow behaviour in an angular neck abdominal aortic aneurysm with geometry simulated from patient information, using computational fluid dynamics (CFD). The 3D angular neck AAA models were designed using SolidWorks software, and CFD tools were then used to simulate these 3D models in ANSYS FLUENT software. Based on the results, we conclude that CFD techniques perform well in explaining and investigating the flow patterns of an angular neck abdominal aortic aneurysm.

  4. Cloud4Psi: cloud computing for 3D protein structure similarity searching

    PubMed Central

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur

    2014-01-01

    Summary: Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT), are still time consuming. As a consequence, performing similarity searching against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed a cloud-based system that allows scaling of the similarity searching process vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Availability and implementation: Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. Contact: dariusz.mrozek@polsl.pl PMID:24930141

  5. How to Submit a Risk Management Plan (RMP) to EPA

    EPA Pesticide Factsheets

    RMP*eSubmit software is the only way to submit RMPs. After you have prepared your plan using RMP*eSubmit, you may also re-submit, correct, or withdraw an RMP. Another electronic tool, RMP*Comp, performs the required off-site consequence analysis.

  6. Field Programmable Gate Array Reliability Analysis Guidelines for Launch Vehicle Reliability Block Diagrams

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Britton, Paul; Hatfield, Glen Spencer; Novack, Steven D.

    2017-01-01

    Field Programmable Gate Array (FPGA) integrated circuits (ICs) are among the key electronic components in today's sophisticated launch and space vehicle complex avionic systems, largely due to their superb reprogrammable and reconfigurable capabilities combined with relatively low non-recurring engineering (NRE) costs and a short design cycle. Consequently, FPGAs are prevalent ICs in communication protocols and control signal commands. This paper identifies reliability concerns and high-level guidelines to estimate FPGA total failure rates in a launch vehicle application. The paper discusses hardware, hardware description language, and radiation-induced failures. The hardware contribution of the approach accounts for physical failures of the IC. The hardware description language portion discusses the high-level FPGA programming languages and software/code reliability growth. The radiation portion discusses FPGA susceptibility to space environment radiation.
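
    As a hedged sketch of how such contributions might be rolled up, the snippet below adds hardware, HDL/software, and radiation failure-rate terms in a simple series model. The partition follows the paper's outline, but the rates are made-up placeholders, not published FPGA data.

        import math

        # Hypothetical per-contributor failure rates in failures per hour.
        LAMBDA_HARDWARE = 2.0e-7   # physical failures of the IC
        LAMBDA_HDL = 5.0e-8        # HDL/software design faults
        LAMBDA_RADIATION = 1.2e-7  # radiation-induced upsets in the flight environment

        def mission_reliability(mission_hours):
            """Series model: contributions add, R(t) = exp(-lambda_total * t)."""
            lambda_total = LAMBDA_HARDWARE + LAMBDA_HDL + LAMBDA_RADIATION
            return math.exp(-lambda_total * mission_hours)

        print(f"R(10 h ascent) = {mission_reliability(10):.8f}")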

  7. [Stress management in large-scale establishments].

    PubMed

    Fukasawa, Kenji

    2002-07-01

    Due to a recent dramatic change in industrial structures in Japan, the role of large-scale enterprises is changing. Mass production used to be the major income source of companies, but nowadays the focus has shifted to high value-added products, including software development. As a consequence of highly competitive inter-corporate development, there are various sources of job stress that induce health problems in employees, especially those concerned with development or management. Simply obeying the law or offering medical care is not enough to manage these problems. Occupational health staff need to act according to the disease type and provide care with support from the supervisor and the Personnel Division. For the training, development and consultation system, occupational health staff must work with the Personnel Division and Safety Division, and be supported by management supervisors.

  8. General guidelines for biomedical software development

    PubMed Central

    Silva, Luis Bastiao; Jimenez, Rafael C.; Blomberg, Niklas; Luis Oliveira, José

    2017-01-01

    Most bioinformatics tools available today were not written by professional software developers, but by people who wanted to solve their own problems, using computational solutions and spending the minimum time and effort possible, since these were just the means to an end. Consequently, a vast number of software applications are currently available, hindering the task of identifying the utility and quality of each. At the same time, this situation has hindered regular adoption of these tools in clinical practice. Typically, they are not sufficiently developed to be used by most clinical researchers and practitioners. To address these issues, it is necessary to re-think how biomedical applications are built and adopt new strategies that ensure quality, efficiency, robustness, correctness and reusability of software components. We also need to engage end-users during the development process to ensure that applications fit their needs. In this review, we present a set of guidelines to support biomedical software development, with an explanation of how they can be implemented and what kind of open-source tools can be used for each specific topic. PMID:28443186

  9. An efficient approach to the deployment of complex open source information systems

    PubMed Central

    Cong, Truong Van Chi; Groeneveld, Eildert

    2011-01-01

    Complex open source information systems are usually implemented as component-based software to inherit the available functionality of existing software packages developed by third parties. Consequently, the deployment of these systems not only requires the installation of the operating system, application framework and the configuration of services but also needs to resolve the dependencies among components. The problem becomes more challenging when the application must be installed and used on different platforms such as Linux and Windows. To address this, an efficient approach using virtualization technology is suggested and discussed in this paper. The approach has been applied in our project to deploy a web-based integrated information system in molecular genetics labs. It is a low-cost solution that benefits both software developers and end-users. PMID:22102770

  10. Antecedents and Consequences of Work Engagement Among Nurses

    PubMed Central

    Sohrabizadeh, Sanaz; Sayfouri, Nasrin

    2014-01-01

    Background: Engaged nurses have high levels of energy and are enthusiastic about their work, which impacts the quality of health care services. However, in the context of Iran, due to observed burnout, work engagement among nurses requires immediate exploration. Objectives: This investigation aimed to identify a suitable work engagement model for the nursing profession in hospitals according to the hypothesized model and to determine antecedents and consequences related to work engagement among nurses. Patients and Methods: In this cross-sectional study, a questionnaire was given to 279 randomly-selected nurses working in two general teaching hospitals of Shiraz University of Medical Sciences (Shiraz, Iran) to measure antecedents and consequences of work engagement using Saks's (2005) model. Structural Equation Modeling was used to examine the model fitness. Results: Two paths were added using LISREL software. The resulting model showed good fitness indices (χ2 = 23.62, AGFI = 0.93, CFI = 0.97, RMSEA = 0.07) and all the coefficients of the paths were significant (t ≥ 2, t ≤ -2). A significant correlation was found between work engagement and model variables. Conclusions: Paying adequate attention to the antecedents of work engagement can enhance the quality of performance among nurses. Additionally, rewards, organizational and supervisory supports, and job characteristics should be taken into consideration to establish work engagement among nurses. Further research is required to identify other probable antecedents and consequences of nursing work engagement, which might be related to specific cultural settings. PMID:25763212
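
    For readers unfamiliar with the fit indices quoted above, here is a minimal sketch of how RMSEA is conventionally computed from a model chi-square. The abstract reports only χ2 = 23.62, n = 279 and RMSEA = 0.07; the degrees of freedom below are an illustrative assumption (df = 10 happens to reproduce the reported RMSEA).

        import math

        def rmsea(chi2, df, n):
            """Root Mean Square Error of Approximation from a model chi-square.

            RMSEA = sqrt(max(chi2 - df, 0) / (df * (n - 1))); values below ~0.08
            are commonly read as acceptable fit.
            """
            return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

        # chi2 = 23.62 and n = 279 are from the study; df = 10 is a hypothetical value.
        print(f"RMSEA = {rmsea(23.62, 10, 279):.3f}")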

  11. CAD/CAM and scientific data management at Dassault

    NASA Technical Reports Server (NTRS)

    Bohn, P.

    1984-01-01

    The history of CAD/CAM and scientific data management at Dassault is presented. Emphasis is placed on the targets of the now commercially available software CATIA. The links with scientific computations such as aerodynamics and structural analysis are presented. Comments are made on the principles followed within the company. The consequences of the approximative nature of scientific data are examined. The main consequence of the new history function is its protection against copying or alteration. Future plans at Dassault for scientific data appear to run in the opposite direction to some general tendencies.

  12. Erratum to "Solar Sources and Geospace Consequences of Interplanetary Magnetic Clouds Observed During Solar Cycle 23-Paper 1" [J. Atmos. Sol.-Terr. Phys. 70(2-4) (2008) 245-253]

    NASA Technical Reports Server (NTRS)

    Gopalswamy, N.; Akiyama, S.; Yashiro, S.; Michalek, G.; Lepping, R. P.

    2009-01-01

    One of the figures (Fig. 4) in "Solar sources and geospace consequences of interplanetary magnetic clouds observed during solar cycle 23 -- Paper 1" by Gopalswamy et al. (2008, JASTP, Vol. 70, Issues 2-4, February 2008, pp. 245-253) is incorrect because of a software error in the routine that was used to make the plot. The source positions of various magnetic cloud (MC) types are therefore not plotted correctly.

  13. High/Scope Buyer's Guide to Children's Software. 11th Edition.

    ERIC Educational Resources Information Center

    Hohmann, Charles; And Others

    This 11th edition of the High/Scope Buyer's Guide to Children's Software was designed to help teachers, caregivers, and parents make good choices when purchasing software to enhance children's learning. The book consists of an introduction, a chapter on finding the best software, and software reviews for 48 different software products. The…

  14. Plagiarism in Academia

    ERIC Educational Resources Information Center

    Shahabuddin, Syed

    2009-01-01

    Plagiarism sometimes creates legal and ethical problems for students and faculty. It can have serious consequences. Fortunately, there are ways to stop plagiarism. There are many tools available to detect plagiarism, e.g. using software for detecting submitted articles. Also, there are many ways to punish a plagiarist, e.g. banning plagiarists…

  15. DEVELOPMENT OF A COMPUTER ASSISTED PERSONAL INTERVIEW SOFTWARE SYSTEM FOR COLLECTION OF TRIBAL FISH CONSUMPTION DATA

    EPA Science Inventory

    Abstract: Native Americans who consume seafood often have higher seafood consumption rates and consequently greater exposures to contaminants in seafood than the general U.S. population. Defensible and quantifiable tribal seafood consumption rates are needed for development of ...

  16. An open-source Java-based Toolbox for environmental model evaluation: The MOUSE Software Application

    USDA-ARS?s Scientific Manuscript database

    A consequence of environmental model complexity is that the task of understanding how environmental models work and identifying their sensitivities/uncertainties, etc. becomes progressively more difficult. Comprehensive numerical and visual evaluation tools have been developed such as the Monte Carl...

  17. Low-Cost, High-Throughput Sequencing of DNA Assemblies Using a Highly Multiplexed Nextera Process.

    PubMed

    Shapland, Elaine B; Holmes, Victor; Reeves, Christopher D; Sorokin, Elena; Durot, Maxime; Platt, Darren; Allen, Christopher; Dean, Jed; Serber, Zach; Newman, Jack; Chandran, Sunil

    2015-07-17

    In recent years, next-generation sequencing (NGS) technology has greatly reduced the cost of sequencing whole genomes, whereas the cost of sequence verification of plasmids via Sanger sequencing has remained high. Consequently, industrial-scale strain engineers either limit the number of designs or take short cuts in quality control. Here, we show that over 4000 plasmids can be completely sequenced in one Illumina MiSeq run for less than $3 each (15× coverage), which is a 20-fold reduction over using Sanger sequencing (2× coverage). We reduced the volume of the Nextera tagmentation reaction by 100-fold and developed an automated workflow to prepare thousands of samples for sequencing. We also developed software to track the samples and associated sequence data and to rapidly identify correctly assembled constructs having the fewest defects. As DNA synthesis and assembly become a centralized commodity, this NGS quality control (QC) process will be essential to groups operating high-throughput pipelines for DNA construction.
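
    To make the throughput arithmetic concrete, here is a small sketch under stated assumptions: the plasmid size, read length, reads per run and run cost below are illustrative round numbers chosen to reproduce the reported 15× coverage and roughly $3 per plasmid, not figures taken from the paper.

        # Hypothetical run parameters for a highly multiplexed MiSeq QC run.
        READS_PER_RUN = 4_000_000   # usable reads in one run (assumed)
        READ_LENGTH = 150           # bases per read (assumed)
        PLASMID_LENGTH = 10_000     # bases per construct (assumed)
        RUN_COST = 12_000           # USD per run, reagents + amortization (assumed)

        def multiplex_summary(n_samples):
            """Per-sample coverage and cost when n_samples share one run."""
            reads_each = READS_PER_RUN / n_samples
            coverage = reads_each * READ_LENGTH / PLASMID_LENGTH
            return coverage, RUN_COST / n_samples

        cov, cost = multiplex_summary(4000)
        print(f"{cov:.0f}x coverage at ${cost:.2f} per plasmid")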

  18. Blending an Android Development Course with Software Engineering Concepts

    ERIC Educational Resources Information Center

    Chatzigeorgiou, Alexander; Theodorou, Tryfon L.; Violettas, George E.; Xinogalos, Stelios

    2016-01-01

    The tremendous popularity of mobile computing and Android in particular has attracted millions of developers who see opportunities for building their own start-ups. As a consequence Computer Science students express an increasing interest into the related technology of Java development for Android applications. Android projects are complex by…

  19. An App for Every Psychological Problem: Vision for the Future

    ERIC Educational Resources Information Center

    Brown, Alicia M.

    2016-01-01

    Innovations in mobile technology are occurring at a rapid pace. Mobile phone software applications (apps) in particular have great potential within the field of mental health. Lack of organizational oversight and hesitancy from providers to utilize mobile technology has delayed technological advancement--consequently limiting the ability of the…

  20. Integration of Computational Chemistry into the Undergraduate Organic Chemistry Laboratory Curriculum

    ERIC Educational Resources Information Center

    Esselman, Brian J.; Hill, Nicholas J.

    2016-01-01

    Advances in software and hardware have promoted the use of computational chemistry in all branches of chemical research to probe important chemical concepts and to support experimentation. Consequently, it has become imperative that students in the modern undergraduate curriculum become adept at performing simple calculations using computational…

  1. Cellular Consequences of Telomere Shortening in Histologically Normal Breast Tissues

    DTIC Science & Technology

    2013-09-01

    using the open-source, Java-based image analysis software package ImageJ (http://rsb.info.nih.gov/ij/) and a custom-designed plugin ("Telometer") ... Tabulated data were stored in a MySQL (http://www.mysql.com) database and viewed through Microsoft Access (Microsoft Corp.).

  2. Cognitive Diagnostic Modeling Using R

    ERIC Educational Resources Information Center

    Ravand, Hamdollah

    2015-01-01

    Cognitive diagnostic models (CDM) have been around for more than a decade, but their application is far from widespread, mainly for two reasons: (1) CDMs are novel compared to traditional IRT models; consequently, many researchers lack familiarity with them and their properties, and (2) software programs for fitting CDMs have been expensive and not…

  3. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
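
    In the spirit of the benchmarks described, the following is a minimal sketch of code verification for a 1D diffusion problem: a finite-difference solution is compared against the closed-form analytical solution and an error norm is reported. The grid, diffusivity and tolerance are illustrative choices, not PFLOTRAN's actual QA settings.

        import numpy as np

        # Illustrative 1D diffusion benchmark: u_t = D u_xx on [0, 1], u(0)=u(1)=0,
        # u(x, 0) = sin(pi x)  =>  exact solution u(x, t) = exp(-D pi^2 t) sin(pi x).
        D, nx, t_end = 1.0, 101, 0.05
        x = np.linspace(0.0, 1.0, nx)
        dx = x[1] - x[0]
        dt = 0.4 * dx**2 / D          # explicit scheme, within the stability limit
        u = np.sin(np.pi * x)

        t = 0.0
        while t < t_end:
            u[1:-1] += D * dt / dx**2 * (u[2:] - 2 * u[1:-1] + u[:-2])
            t += dt

        exact = np.exp(-D * np.pi**2 * t) * np.sin(np.pi * x)
        l2_error = np.sqrt(np.mean((u - exact) ** 2))
        print(f"L2 error = {l2_error:.2e}")
        assert l2_error < 1e-3, "verification benchmark failed"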

  4. Space-time evolution of a growth fold (Betic Cordillera, Spain). Evidences from 3D geometrical modelling

    NASA Astrophysics Data System (ADS)

    Martin-Rojas, Ivan; Alfaro, Pedro; Estévez, Antonio

    2014-05-01

    We present a study that combines several software tools (iGIS©, ArcGIS©, Autocad©, etc.) and data (geological mapping, high resolution digital topographic data, high resolution aerial photographs, etc.) to create a detailed 3D geometric model of an active fault propagation growth fold. This 3D model clearly shows structural features of the analysed fold, as well as growth relationships and sedimentary patterns. The results obtained permit us to discuss the kinematics and structural evolution of the fold and the fault in time and space. The studied fault propagation fold is the Crevillente syncline. This fold represents the northern limit of the Bajo Segura Basin, an intermontane basin in the Eastern Betic Cordillera (SE Spain) developed from the upper Miocene on. 3D features of the Crevillente syncline, including its growth pattern, indicate that limb rotation and, consequently, fault activity were higher during the Messinian than during the Tortonian. From the Pliocene on, our data indicate that limb rotation and fault activity remain steady or probably decrease. This evolution in time of the Crevillente syncline is not the same all along the structure; in fact, the 3D geometric model indicates that the observed lateral heterogeneity is related to along-strike variation of fault displacement.

  5. Cyber Risk Management for Critical Infrastructure: A Risk Analysis Model and Three Case Studies.

    PubMed

    Paté-Cornell, M-Elisabeth; Kuypers, Marshall; Smith, Matthew; Keller, Philip

    2018-02-01

    Managing cyber security in an organization involves allocating the protection budget across a spectrum of possible options. This requires assessing the benefits and the costs of these options. The risk analyses presented here are statistical when relevant data are available, and system-based for high-consequence events that have not happened yet. This article presents, first, a general probabilistic risk analysis framework for cyber security in an organization to be specified. It then describes three examples of forward-looking analyses motivated by recent cyber attacks. The first one is the statistical analysis of an actual database, extended at the upper end of the loss distribution by a Bayesian analysis of possible, high-consequence attack scenarios that may happen in the future. The second is a systems analysis of cyber risks for a smart, connected electric grid, showing that there is an optimal level of connectivity. The third is an analysis of sequential decisions to upgrade the software of an existing cyber security system or to adopt a new one to stay ahead of adversaries trying to find their way in. The results are distributions of losses to cyber attacks, with and without some considered countermeasures in support of risk management decisions based both on past data and anticipated incidents. © 2017 Society for Risk Analysis.
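
    As a toy illustration of the "distribution of losses" output such analyses produce, here is a Monte Carlo sketch that mixes a frequent, low-loss regime fitted from data with a rare, high-consequence scenario assessed by judgment. All rates and loss parameters below are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(42)
        N_YEARS = 100_000  # simulated years

        # Hypothetical parameters: frequent attacks with lognormal losses,
        # plus a rare catastrophic scenario assessed by expert judgment.
        freq_rate, mu, sigma = 12.0, 9.0, 1.2     # ~12 incidents/yr, ~$8k median loss
        cat_prob, cat_loss = 0.01, 50_000_000     # 1% chance/yr of a $50M event

        annual = np.zeros(N_YEARS)
        n_incidents = rng.poisson(freq_rate, N_YEARS)
        for i, n in enumerate(n_incidents):
            annual[i] = rng.lognormal(mu, sigma, n).sum()
        annual += rng.binomial(1, cat_prob, N_YEARS) * cat_loss

        for q in (0.5, 0.95, 0.999):
            print(f"annual loss, {q:.1%} quantile: ${np.quantile(annual, q):,.0f}")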

  6. Software support in automation of medicinal product evaluations.

    PubMed

    Juric, Radmila; Shojanoori, Reza; Slevin, Lindi; Williams, Stephen

    2005-01-01

    Medicinal product evaluation is one of the most important tasks undertaken by government health departments and their regulatory authorities in every country in the world. Automation and adequate software support are critical for improving the efficiency and interoperation of regulatory systems across the world. In this paper we propose a software solution that supports the automation of (i) the submission of licensing applications, and (ii) the evaluation of submitted licensing applications according to regulatory authorities' procedures. The novelty of our solution is in allowing licensing applications to be submitted in any country in the world and evaluated according to any evaluation procedure (which can be chosen by either regulatory authorities or pharmaceutical companies). Consequently, submission and evaluation procedures become interoperable, and the associated data repositories/databases can be shared between various countries and regulatory authorities.

  7. Architecture Design and Experimental Platform Demonstration of Optical Network based on OpenFlow Protocol

    NASA Astrophysics Data System (ADS)

    Xing, Fangyuan; Wang, Honghuan; Yin, Hongxi; Li, Ming; Luo, Shenzi; Wu, Chenguang

    2016-02-01

    With the extensive application of cloud computing and data centres, as well as constantly emerging services, big data with burst characteristics have brought huge challenges to optical networks. Consequently, the software defined optical network (SDON), which combines optical networks with software defined networking (SDN), has attracted much attention. In this paper, an OpenFlow-enabled optical node to be employed in optical cross-connects (OXC) and reconfigurable optical add/drop multiplexers (ROADM) is proposed. An open source OpenFlow controller is extended with routing strategies. In addition, an experimental platform based on the OpenFlow protocol for software defined optical networks is designed. The feasibility and availability of the OpenFlow-enabled optical nodes and the extended OpenFlow controller are validated by connectivity tests, protection switching and load balancing experiments on this test platform.

  8. Oracle Database 10g: a platform for BLAST search and Regular Expression pattern matching in life sciences.

    PubMed

    Stephens, Susie M; Chen, Jake Y; Davidson, Marcel G; Thomas, Shiby; Trute, Barry M

    2005-01-01

    As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html.

  9. Storage system software solutions for high-end user needs

    NASA Technical Reports Server (NTRS)

    Hogan, Carole B.

    1992-01-01

    Today's high-end storage user is one that requires rapid access to a reliable terabyte-capacity storage system running in a distributed environment. This paper discusses conventional storage system software and concludes that this software, designed for other purposes, cannot meet high-end storage requirements. The paper also reviews the philosophy and design of evolving storage system software. It concludes that this new software, designed with high-end requirements in mind, provides the potential for solving not only the storage needs of today but those of the foreseeable future as well.

  10. The high-rate data challenge: computing for the CBM experiment

    NASA Astrophysics Data System (ADS)

    Friese, V.; CBM Collaboration

    2017-10-01

    The Compressed Baryonic Matter experiment (CBM) is a next-generation heavy-ion experiment to be operated at the FAIR facility, currently under construction in Darmstadt, Germany. A key feature of CBM is its very high interaction rate, exceeding that of contemporary nuclear collision experiments by several orders of magnitude. Such interaction rates forbid a conventional, hardware-triggered readout; instead, experiment data will be freely streaming from self-triggered front-end electronics. In order to reduce the huge raw data volume to a recordable rate, data will be selected exclusively on CPU, which necessitates partial event reconstruction in real time. Consequently, the traditional segregation of online and offline software vanishes; an integrated on- and offline data processing concept is called for. In this paper, we report on concepts and developments for computing for CBM as well as on the status of preparations for its first physics run.

  11. Development of imaging techniques to study the pathogenesis of biosafety level 2/3 infectious agents

    PubMed Central

    Rella, Courtney E.; Ruel, Nancy; Eugenin, Eliseo A.

    2015-01-01

    Despite significant advances in microbiology and molecular biology over the last decades, several infectious diseases remain global concerns, resulting in the death of millions of people worldwide each year. According to the Center for Disease Control (CDC) in 2012, there were 34 million people infected with HIV, 8.7 million new cases of tuberculosis, 500 million cases of hepatitis, and 50–100 million people infected with dengue. Several of these pathogens, despite high incidence, do not have reliable clinical detection methods. New or improved protocols have been generated to enhance detection and quantitation of several pathogens using high-end microscopy (light, confocal, and STORM microscopy) and imaging software. In the current manuscript, we discuss these approaches and the theories behind these methodologies. Thus, advances in imaging techniques will open new possibilities to discover therapeutic interventions to reduce or eliminate the devastating consequences of infectious diseases. PMID:24990818

  12. Power calculation for overall hypothesis testing with high-dimensional commensurate outcomes.

    PubMed

    Chi, Yueh-Yun; Gribbin, Matthew J; Johnson, Jacqueline L; Muller, Keith E

    2014-02-28

    The complexity of system biology means that any metabolic, genetic, or proteomic pathway typically includes so many components (e.g., molecules) that statistical methods specialized for overall testing of high-dimensional and commensurate outcomes are required. While many overall tests have been proposed, very few have power and sample size methods. We develop accurate power and sample size methods and software to facilitate study planning for high-dimensional pathway analysis. With an account of any complex correlation structure between high-dimensional outcomes, the new methods allow power calculation even when the sample size is less than the number of variables. We derive the exact (finite-sample) and approximate non-null distributions of the 'univariate' approach to repeated measures test statistic, as well as power-equivalent scenarios useful to generalize our numerical evaluations. Extensive simulations of group comparisons support the accuracy of the approximations even when the ratio of number of variables to sample size is large. We derive a minimum set of constants and parameters sufficient and practical for power calculation. Using the new methods and specifying the minimum set to determine power for a study of metabolic consequences of vitamin B6 deficiency helps illustrate the practical value of the new results. Free software implementing the power and sample size methods applies to a wide range of designs, including one group pre-intervention and post-intervention comparisons, multiple parallel group comparisons with one-way or factorial designs, and the adjustment and evaluation of covariate effects. Copyright © 2013 John Wiley & Sons, Ltd.
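
    As a crude counterpart to the analytic power results described, here is a simulation sketch that estimates power for a high-dimensional two-group comparison using a permutation test on the mean-difference norm. It works even when the number of variables exceeds the sample size, but it is a generic stand-in for the overall tests discussed, not the paper's UNIREP derivation or its software; all dimensions and effect sizes are invented.

        import numpy as np

        rng = np.random.default_rng(1)

        def overall_test_power(n=15, p=50, delta=0.3, n_sims=500, n_perm=200, alpha=0.05):
            """Monte Carlo power of a permutation test on ||mean(x) - mean(y)||."""
            rejections = 0
            for _ in range(n_sims):
                x = rng.normal(0.0, 1.0, (n, p))
                y = rng.normal(delta, 1.0, (n, p))  # group shifted by delta on every variable
                stat = np.linalg.norm(x.mean(0) - y.mean(0))
                pooled = np.vstack([x, y])
                perm_stats = []
                for _ in range(n_perm):
                    idx = rng.permutation(2 * n)
                    perm_stats.append(np.linalg.norm(
                        pooled[idx[:n]].mean(0) - pooled[idx[n:]].mean(0)))
                if stat > np.quantile(perm_stats, 1 - alpha):
                    rejections += 1
            return rejections / n_sims

        print(f"estimated power: {overall_test_power():.2f}")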

  13. Automated Transfer Vehicle (ATV) Critical Safety Software Overview

    NASA Astrophysics Data System (ADS)

    Berthelier, D.

    2002-01-01

    The European Automated Transfer Vehicle (ATV) is an unmanned transportation system designed to dock to the International Space Station (ISS) and to contribute to the logistic servicing of the ISS. Concisely, ATV control is realized by a nominal flight control function (using computers, software, sensors and actuators). In order to cover the extreme situations where this nominal chain cannot ensure a safe trajectory with respect to the ISS, a segregated proximity flight safety function is activated when unsafe free-drift trajectories can be encountered. This function relies notably on a segregated computer, the Monitoring and Safing Unit (MSU); in case of major ATV malfunction detection, ATV is then controlled by the MSU software. This software is therefore critical, because an MSU software failure could result in catastrophic consequences. This paper provides an overview both of this software's functions and of the software development and validation method, which is specific in view of its criticality. The first part of the paper describes briefly the proximity flight safety chain. The second part deals with the software functions. Indeed, the MSU software is in charge of monitoring the nominal computers and ATV corridors, using its own navigation algorithms, and, if an abnormal situation is detected, it is in charge of ATV control during the Collision Avoidance Manoeuvre (CAM), consisting of an attitude-controlled braking boost, followed by a post-CAM manoeuvre: a Sun-pointed ATV attitude control for up to 24 hours on a safe trajectory. Monitoring, navigation and control algorithm principles are presented. The third part of this paper describes the development and validation process: algorithm functional studies, ADA coding and unit validations; algorithm ADA code integration and validation on a specific non-real-time MATLAB/SIMULINK simulator; and a global software functional engineering phase with architectural design, unit testing, integration and validation on the target computer.

  14. A framework for assessing the adequacy and effectiveness of software development methodologies

    NASA Technical Reports Server (NTRS)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.

  15. The Use of Computer Software to Teach High Technology Skills to Vocational Students.

    ERIC Educational Resources Information Center

    Farmer, Edgar I.

    A study examined the type of computer software that is best suited to teach high technology skills to vocational students. During the study, 50 manufacturers of computer software and hardware were sent questionnaires designed to gather data concerning their recommendations in regard to: software to teach high technology skills to vocational…

  16. Indepth investigation of the system currently used by the Las Vegas Metropolitan Police Department to store and process crash data and all other interconnected systems.

    DOT National Transportation Integrated Search

    2016-12-31

    Some of the existing software and hardware used by law enforcement agencies to collect crash data are obsolete for several reasons, ranging from budget constraints to lack of coordination across various groups. The most significant consequence of usi...

  17. CrossTalk. The Journal of Defense Software Engineering. Volume 27, Number 2. March/April 2014

    DTIC Science & Technology

    2014-04-01

    Consequently, it is no wonder that we subconsciously and consciously expect that same level of consistency to hold in the cyber domains, though... ...including cyber defense, synthetic biology, advanced design tools, AI learning, and quantum compilers. He is currently a co-PI for a DARPA-funded...

  18. Computer software to estimate timber harvesting system production, cost, and revenue

    Treesearch

    Dr. John E. Baumgras; Dr. Chris B. LeDoux

    1992-01-01

    Large variations in timber harvesting cost and revenue can result from the differences between harvesting systems, the variable attributes of harvesting sites and timber stands, or changing product markets. Consequently, system and site specific estimates of production rates and costs are required to improve estimates of harvesting revenue. This paper describes...

  19. A Role-Playing Virtual World for Web-Based Application Courses

    ERIC Educational Resources Information Center

    Depradine, Colin

    2007-01-01

    With the rapid development of the information communication and technology (ICT) infrastructure in the Caribbean, there is an increasing demand for skilled software developers to meet the ICT needs of the region. Consequently, the web-based applications course offered at the University of the West Indies, has been redeveloped. One major part of…

  20. Can We Apply TAM in Computer-Based Classes?

    ERIC Educational Resources Information Center

    Williams, David; Williams, Denise

    2013-01-01

    While students may struggle in any classroom and consequently require help beyond the schedule meeting time and place of the class, computer-based courses pose the additional hurdle of requiring ready access to hardware and software that may be unavailable or inconvenient for students outside of the classroom and its scheduled meeting time. This…

  1. Cognitive Consequences of Participation in a "Fifth Dimension" After-School Computer Club.

    ERIC Educational Resources Information Center

    Mayer, Richard E.; Quilici, Jill; Moreno, Roxana; Duran, Richard; Woodbridge, Scott; Simon, Rebecca; Sanchez, David; Lavezzo, Amy

    1997-01-01

    Children who attended the Fifth Dimension after-school computer club at least 10 times during the 1994-95 school year performed better on word problem comprehension tests than did non-participating children. Results support the hypothesis that experience in using computer software in the Fifth Dimension club produces measurable, resilient, and…

  2. How to Teach Programming Indirectly--Using Spreadsheet Application

    ERIC Educational Resources Information Center

    Tahy, Zsuzsanna Szalayné

    2016-01-01

    It is a question in many countries whether ICT and application usage should be taught. There are some problems with IT literacy: users do not understand the concepts behind software, they cannot solve problems, and moreover, using applications causes them further problems. Consequently, using ICT seems to slow work down. Experts suggest learning…

  3. The Structure of the Library Market for Scientific Journals: The Case of Chemistry.

    ERIC Educational Resources Information Center

    Bensman, Stephen J.

    1996-01-01

    An analysis of price and scientific value of chemistry journals concluded that scientific value does not play a role in the pricing of scientific journals and that consequently little relationship exists between scientific value and the prices charged libraries for journals. Describes a software package, Serials Evaluator, being developed at…

  4. The Systems Thinking and Curriculum Innovation Project. Technical Report, Part 2.

    ERIC Educational Resources Information Center

    Mandinach, Ellen B.; Thorpe, Margaret E.

    This is the second of two reports on the first year activities and results of the Systems Thinking and Curriculum Innovation Project (STACI), a two-year project which is examining the cognitive demands and consequences of using the STELLA (Structural Thinking Experimental Learning Laboratory with Animation) software to teach systems thinking,…

  5. Visits, Hits, Caching and Counting on the World Wide Web: Old Wine in New Bottles?

    ERIC Educational Resources Information Center

    Berthon, Pierre; Pitt, Leyland; Prendergast, Gerard

    1997-01-01

    Although web browser caching speeds up retrieval, reduces network traffic, and decreases the load on servers and browser's computers, an unintended consequence for marketing research is that Web servers undercount hits. This article explores counting problems, caching, proxy servers, trawler software and presents a series of correction factors…

  6. AUTO-MUTE 2.0: A Portable Framework with Enhanced Capabilities for Predicting Protein Functional Consequences upon Mutation.

    PubMed

    Masso, Majid; Vaisman, Iosif I

    2014-01-01

    The AUTO-MUTE 2.0 stand-alone software package includes a collection of programs for predicting functional changes to proteins upon single residue substitutions, developed by combining structure-based features with trained statistical learning models. Three of the predictors evaluate changes to protein stability upon mutation, each complementing a distinct experimental approach. Two additional classifiers are available, one for predicting activity changes due to residue replacements and the other for determining the disease potential of mutations associated with nonsynonymous single nucleotide polymorphisms (nsSNPs) in human proteins. These five command-line driven tools, as well as all the supporting programs, complement those that run our AUTO-MUTE web-based server. Nevertheless, all the codes have been rewritten and substantially altered for the new portable software, and they incorporate several new features based on user feedback. Included among these upgrades is the ability to perform three highly requested tasks: to run "big data" batch jobs; to generate predictions using modified protein data bank (PDB) structures, and unpublished personal models prepared using standard PDB file formatting; and to utilize NMR structure files that contain multiple models.

  7. MITHRA 1.0: A full-wave simulation tool for free electron lasers

    NASA Astrophysics Data System (ADS)

    Fallahi, Arya; Yahaghi, Alireza; Kärtner, Franz X.

    2018-07-01

    Free Electron Lasers (FELs) are a solution for providing intense, coherent and bright radiation in the hard X-ray regime. Due to the low wall-plug efficiency of FEL facilities, it is crucial, and additionally very useful, to develop complete and accurate simulation tools for better optimization of the FEL interaction. The highly sophisticated dynamics involved in a FEL process were the main obstacle hindering the development of general simulation tools for this problem. We present a numerical algorithm based on finite difference time domain/particle in cell (FDTD/PIC) methods in a Lorentz-boosted coordinate system, which is able to perform a full-wave simulation of a FEL process. The developed software offers a suitable tool for the analysis of FEL interactions without any of the usual approximations. A coordinate transformation to the bunch rest frame makes the very different length scales of the bunch size, the optical wavelengths and the undulator period transform to values of the same order. Consequently, FDTD/PIC simulations in conjunction with efficient parallelization techniques make the full-wave simulation feasible using the available computational resources. Several examples of free electron lasers are analyzed using the developed software; the results are benchmarked against standard FEL codes and discussed in detail.
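
    To see why the bunch rest frame helps, here is a small sketch of the length-scale argument: in the lab frame the undulator period is centimetres while the radiated wavelength is ångströms; boosting to the bunch frame contracts the former by γ and dilates the latter by roughly γ, bringing the two scales together. The beam energy and undulator parameters below are illustrative numbers, not parameters from the MITHRA paper.

        import math

        # Illustrative hard-X-ray FEL parameters (not from the MITHRA paper).
        E_BEAM_GEV = 8.0   # electron beam energy
        LAMBDA_U = 0.03    # undulator period, 3 cm
        K = 1.5            # undulator deflection parameter

        gamma = E_BEAM_GEV * 1e3 / 0.511                      # Lorentz factor
        lam_rad = LAMBDA_U / (2 * gamma**2) * (1 + K**2 / 2)  # resonant wavelength

        # In the bunch rest frame the undulator period contracts while the
        # radiation wavelength dilates; the scales become comparable.
        lam_u_boosted = LAMBDA_U / gamma
        lam_rad_boosted = lam_rad * gamma  # rough scaling, order-of-magnitude only

        print(f"gamma = {gamma:.0f}")
        print(f"lab frame:   lambda_u = {LAMBDA_U:.2e} m, lambda_rad = {lam_rad:.2e} m")
        print(f"bunch frame: lambda_u = {lam_u_boosted:.2e} m, lambda_rad = {lam_rad_boosted:.2e} m")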

  8. Evaluating Software Assurance Knowledge and Competency of Acquisition Professionals

    DTIC Science & Technology

    2014-10-01

    of ISO 12207-2008, both internationally and in the United States [7]. That standard documents a comprehensive set of activities and supporting... As the threat of cyberattacks grows, organizations must ensure that their procurement agents acquire high quality, secure software. ISO 12207 and the Software Assurance Competency...

  9. TMS for Instantiating a Knowledge Base With Incomplete Data

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A computer program belonging to the class known among software experts as output truth-maintenance systems (output TMSs) has been devised as one of a number of software tools for reducing the size of the knowledge base that must be searched during execution of artificial-intelligence software of the rule-based inference-engine type when data are missing. This program determines whether the consequences of activation of two or more rules can be combined without causing a logical inconsistency. For example, in a case involving hypothetical scenarios that could lead to turning a given device on or off, the program determines whether a scenario involving a given combination of rules could lead to turning the device both on and off at the same time, in which case that combination of rules would not be included in the scenario.
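
    A minimal sketch of the kind of check described, in Python: the rule encoding and the on/off example follow the abstract's scenario, but the specific rules and variable names are invented for illustration, not taken from the actual TMS.

        from itertools import combinations

        # Each hypothetical rule maps to the consequences it asserts when activated.
        RULES = {
            "low_battery": {"device": "off"},
            "manual_override": {"device": "on"},
            "night_mode": {"lights": "off"},
        }

        def consistent(rule_names):
            """True if the combined consequences assign no variable two values."""
            assigned = {}
            for name in rule_names:
                for var, value in RULES[name].items():
                    if assigned.get(var, value) != value:
                        return False  # e.g., device both "on" and "off"
                    assigned[var] = value
            return True

        # Enumerate pairs of rules and keep only the combinable ones.
        for pair in combinations(RULES, 2):
            print(pair, "->", "consistent" if consistent(pair) else "inconsistent")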

  10. Hydrology for Engineers, Geologists, and Environmental Professionals

    NASA Astrophysics Data System (ADS)

    Ince, Simon

    For people who are involved in the applied aspects of hydrology, it is refreshing to find a textbook that begins with a meaningful disclaimer, albeit in fine print on the back side of the frontispiece:“The present book and the accompanying software have been written according to the latest techniques in scientific hydrology. However, hydrology is at best an inexact science. A good book and a good computer software by themselves do not guarantee accurate or even realistic predictions. Acceptable results in the applications of hydrologic methods to engineering and environmental problems depend to a greater extend (sic) on the skills, logical assumptions, and practical experience of the user, and on the quantity and quality of long-term hydrologic data available. Neither the author nor the publisher assumes any responsibility or any liability, explicitly or implicitly, on the results or the consequences of using the information contained in this book or its accompanying software.”

  11. SPLASSH: Open source software for camera-based high-speed, multispectral in-vivo optical image acquisition

    PubMed Central

    Sun, Ryan; Bouchard, Matthew B.; Hillman, Elizabeth M. C.

    2010-01-01

    Camera-based in-vivo optical imaging can provide detailed images of living tissue that reveal structure, function, and disease. High-speed, high resolution imaging can reveal dynamic events such as changes in blood flow and responses to stimulation. Despite these benefits, commercially available scientific cameras rarely include software that is suitable for in-vivo imaging applications, making this highly versatile form of optical imaging challenging and time-consuming to implement. To address this issue, we have developed a novel, open-source software package to control high-speed, multispectral optical imaging systems. The software integrates a number of modular functions through a custom graphical user interface (GUI) and provides extensive control over a wide range of inexpensive IEEE 1394 Firewire cameras. Multispectral illumination can be incorporated through the use of off-the-shelf light emitting diodes which the software synchronizes to image acquisition via a programmed microcontroller, allowing arbitrary high-speed illumination sequences. The complete software suite is available for free download. Here we describe the software’s framework and provide details to guide users with development of this and similar software. PMID:21258475
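
    As a rough illustration of the multispectral strategy described, here is a sketch of how frames from an interleaved LED illumination sequence might be demultiplexed into per-wavelength channels. The wavelength names and frame objects are placeholders, not SPLASSH's actual data structures or API.

        from itertools import cycle

        # Hypothetical multispectral illumination sequence: one LED per frame,
        # repeated cyclically and synchronized to the camera's frame trigger.
        LED_SEQUENCE = ["blue_470nm", "green_530nm", "red_630nm"]

        def demultiplex(frames):
            """Sort an interleaved frame stream into one list per wavelength."""
            channels = {led: [] for led in LED_SEQUENCE}
            for frame, led in zip(frames, cycle(LED_SEQUENCE)):
                channels[led].append(frame)
            return channels

        # Illustrative stand-in for nine acquired frames.
        frames = [f"frame_{i}" for i in range(9)]
        for led, chan in demultiplex(frames).items():
            print(led, chan)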

  12. Engineering and Software Engineering

    NASA Astrophysics Data System (ADS)

    Jackson, Michael

    The phrase ‘software engineering' has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on “the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering.” Yet after forty years of currency the phrase ‘software engineering' still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.

  13. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    NASA Astrophysics Data System (ADS)

    Mbaya, Timmy

    Embedded Aerospace Systems have to perform safety and mission critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real-time; any faults in software or hardware, or their interaction, could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated---and often contradictory---diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning are performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
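
    As a toy illustration of the probabilistic reasoning described above, the following sketch computes the posterior probability of a software fault from a single discrete sensor reading by direct application of Bayes' rule. All probabilities are invented, and the real ISWHM compiles a full Bayesian network into an arithmetic circuit rather than enumerating by hand.

        # A toy illustration (with invented probabilities) of the ISWHM idea:
        # infer P(fault | sensor reading) for a two-node Bayesian model.
        P_FAULT = 0.01                      # prior probability of a software fault
        P_ALARM_GIVEN_FAULT = 0.95          # sensor fires when a fault is present
        P_ALARM_GIVEN_OK = 0.05             # false-alarm rate

        def posterior_fault(alarm: bool) -> float:
            like_fault = P_ALARM_GIVEN_FAULT if alarm else 1 - P_ALARM_GIVEN_FAULT
            like_ok = P_ALARM_GIVEN_OK if alarm else 1 - P_ALARM_GIVEN_OK
            evidence = like_fault * P_FAULT + like_ok * (1 - P_FAULT)
            return like_fault * P_FAULT / evidence   # Bayes' rule

        print(posterior_fault(alarm=True))   # ~0.16: one alarm is suggestive
        print(posterior_fault(alarm=False))  # ~0.0005: silence is reassuring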

  14. The ALMA common software: dispatch from the trenches

    NASA Astrophysics Data System (ADS)

    Schwarz, J.; Sommer, H.; Jeram, B.; Sekoranja, M.; Chiozzi, G.; Grimstrup, A.; Caproni, A.; Paredes, C.; Allaert, E.; Harrington, S.; Turolla, S.; Cirami, R.

    2008-07-01

    The ALMA Common Software (ACS) provides both an application framework and CORBA-based middleware for the distributed software system of the Atacama Large Millimeter Array. Building upon open-source tools such as the JacORB, TAO and OmniORB ORBs, ACS supports the development of component-based software in any of three languages: Java, C++ and Python. Now in its seventh major release, ACS has matured, both in its feature set as well as in its reliability and performance. However, it is only recently that the ALMA observatory's hardware and application software have reached a level at which they can exploit and challenge the infrastructure that ACS provides. In particular, the availability of an Antenna Test Facility (ATF) at the site of the Very Large Array in New Mexico has enabled us to exercise and test the still evolving end-to-end ALMA software under realistic conditions. The major focus of ACS, consequently, has shifted from the development of new features to consideration of how best to use those that already exist. Configuration details which could be neglected for the purpose of running unit tests or skeletal end-to-end simulations have turned out to be sensitive levers for achieving satisfactory performance in a real-world environment. Surprising behavior in some open-source tools has required us to choose between patching code that we did not write or addressing its deficiencies by implementing workarounds in our own software. We will discuss these and other aspects of our recent experience at the ATF and in simulation.

  15. Software for Simulating a Complex Robot

    NASA Technical Reports Server (NTRS)

    Goza, S. Michael

    2003-01-01

    RoboSim (Robot Simulation) is a computer program that simulates the poses and motions of the Robonaut, a developmental anthropomorphic robot that has a complex system of joints with 43 degrees of freedom and multiple modes of operation and control. RoboSim performs a full kinematic simulation of all degrees of freedom. It also includes interface components that duplicate the functionality of the real Robonaut interface with control software and human operators. Basically, users see no difference between the real Robonaut and the simulation. Consequently, new control algorithms can be tested by computational simulation, without risk to the Robonaut hardware, and without using excessive Robonaut-hardware experimental time, which is always at a premium. Previously developed software incorporated into RoboSim includes Enigma (for graphical displays), OSCAR (for kinematical computations), and NDDS (for communication between the Robonaut and external software). In addition, RoboSim incorporates unique inverse-kinematical algorithms for chains of joints that have fewer than six degrees of freedom (e.g., finger joints). In comparison with the algorithms of OSCAR, these algorithms are more readily adaptable and provide better results when using equivalent sets of data.
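
    RoboSim's inverse-kinematics algorithms are not published in this abstract; as a generic sketch of the same class of problem (a chain whose task space has fewer constraints than the chain has joints), the following uses standard damped least squares on a three-joint planar finger with assumed link lengths. It is not RoboSim's method, only a common alternative.

        # Generic damped-least-squares IK for a 3-joint planar chain
        # (2-D task space). Link lengths are arbitrary assumptions.
        import numpy as np

        LINKS = np.array([0.04, 0.03, 0.02])  # e.g., finger segment lengths (m)

        def fk(theta):
            """Planar forward kinematics: fingertip (x, y) for joint angles."""
            angles = np.cumsum(theta)
            return np.array([np.sum(LINKS * np.cos(angles)),
                             np.sum(LINKS * np.sin(angles))])

        def ik(target, damping=1e-3, iters=100):
            theta = np.zeros(len(LINKS))
            for _ in range(iters):
                err = target - fk(theta)
                # Numerical Jacobian (2 x 3) by forward differences.
                J = np.empty((2, len(theta)))
                for j in range(len(theta)):
                    d = np.zeros(len(theta))
                    d[j] = 1e-6
                    J[:, j] = (fk(theta + d) - fk(theta)) / 1e-6
                # Damped least squares: dtheta = J^T (J J^T + lambda I)^-1 err
                theta += J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), err)
            return theta

        print(fk(ik(np.array([0.05, 0.03]))))  # should be close to the target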

  16. Predictive Model and Software for Inbreeding-Purging Analysis of Pedigreed Populations

    PubMed Central

    García-Dorado, Aurora; Wang, Jinliang; López-Cortegano, Eugenio

    2016-01-01

    The inbreeding depression of fitness traits can be a major threat to the survival of populations experiencing inbreeding. However, its accurate prediction requires taking into account the genetic purging induced by inbreeding, which can be achieved using a “purged inbreeding coefficient”. We have developed a method to compute purged inbreeding at the individual level in pedigreed populations with overlapping generations. Furthermore, we derive the inbreeding depression slope for individual logarithmic fitness, which is larger than that for the logarithm of the population fitness average. In addition, we provide a new software, PURGd, based on these theoretical results that allows analyzing pedigree data to detect purging, and to estimate the purging coefficient, which is the parameter necessary to predict the joint consequences of inbreeding and purging. The software also calculates the purged inbreeding coefficient for each individual, as well as standard and ancestral inbreeding. Analysis of simulation data show that this software produces reasonably accurate estimates for the inbreeding depression rate and for the purging coefficient that are useful for predictive purposes. PMID:27605515
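
    The purged and ancestral coefficients computed by PURGd are beyond a short sketch, but the standard inbreeding coefficient that underlies them can be illustrated with the classical recursive kinship computation on a toy pedigree. This is the generic textbook method, not PURGd's implementation.

        # Standard inbreeding coefficient from a pedigree via the classical
        # kinship recursion. Pedigree maps id -> (father, mother); None marks
        # founders; parents are assumed to have smaller ids than offspring.
        PED = {1: (None, None), 2: (None, None),
               3: (1, 2), 4: (1, 2),       # full sibs
               5: (3, 4)}                  # offspring of full-sib mating

        def kinship(a, b):
            if a is None or b is None:
                return 0.0
            if a == b:
                f, m = PED[a]
                return 0.5 * (1.0 + kinship(f, m))
            if a < b:                      # recurse through the younger individual
                a, b = b, a
            f, m = PED[a]
            return 0.5 * (kinship(f, b) + kinship(m, b))

        def inbreeding(i):
            f, m = PED[i]
            return kinship(f, m)

        print(inbreeding(5))  # 0.25 for full-sib mating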

  17. Self-Regulated Learning Substudy: Systems Thinking and Curriculum Innovation (STACI) Project.

    ERIC Educational Resources Information Center

    Mandinach, Ellen B.

    The Systems Thinking and Curriculum Innovation (STACI) Project is a multi-year research effort intended to examine the cognitive demands and consequences of learning from a systems thinking approach to instruction and from using simulation-modeling software. The purpose of the study is to test the potentials and effects of integrating the systems…

  18. Students' Expression of Affect in an Inner-City SimCalc Classroom

    ERIC Educational Resources Information Center

    Schorr, Roberta Y.; Goldin, Gerald A.

    2008-01-01

    This research focuses on some of the affordances provided by SimCalc software, suggesting that its use can have important consequences for students' mathematical affect and motivation. We describe an episode in an inner-city SimCalc environment illustrating our approach to the study of affect in the mathematics classroom. We infer students'…

  19. GeoGebra for Mathematical Statistics

    ERIC Educational Resources Information Center

    Hewson, Paul

    2009-01-01

    The GeoGebra software is attracting a lot of interest in the mathematical community; consequently, there is a wide range of experience and resources to help use this application. This article briefly outlines how GeoGebra will be of great value in statistical education. The release of GeoGebra is an excellent example of the power of free software…

  20. Copyright Law and Information Policy Planning: Public Rights of Use in the 1990s and Beyond.

    ERIC Educational Resources Information Center

    Crews, Kenneth D.

    1995-01-01

    Summarizes recent developments in copyright law, with a focus on their consequences for users in colleges, universities, and libraries. Highlights include the concept of fair use; library reproduction rights; recent court cases and legislation; and future copyright concerns, including fair use of computer software and electronic text, and license…

  1. Transforming High School Classrooms with Free/Open Source Software: "It's Time for an Open Source Software Revolution"

    ERIC Educational Resources Information Center

    Pfaffman, Jay

    2008-01-01

    Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…

  2. Oracle Database 10g: a platform for BLAST search and Regular Expression pattern matching in life sciences

    PubMed Central

    Stephens, Susie M.; Chen, Jake Y.; Davidson, Marcel G.; Thomas, Shiby; Trute, Barry M.

    2005-01-01

    As database management systems expand their array of analytical functionality, they become powerful research engines for biomedical data analysis and drug discovery. Databases can hold most of the data types commonly required in life sciences and consequently can be used as flexible platforms for the implementation of knowledgebases. Performing data analysis in the database simplifies data management by minimizing the movement of data from disks to memory, allowing pre-filtering and post-processing of datasets, and enabling data to remain in a secure, highly available environment. This article describes the Oracle Database 10g implementation of BLAST and Regular Expression Searches and provides case studies of their usage in bioinformatics. http://www.oracle.com/technology/software/index.html PMID:15608287
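
    The in-database regular-expression search described here can be illustrated client-side with Python's re module (the Oracle REGEXP functions themselves are SQL). The pattern below is the classic PROSITE-style N-glycosylation motif N-{P}-[ST]-{P}; the protein sequence is invented.

        # Client-side analogue of the regex motif search described above.
        import re

        MOTIF = re.compile(r"N[^P][ST][^P]")   # N, not-P, S or T, not-P
        sequence = "MKNLSAANQTPWWNASG"         # made-up protein sequence

        for match in MOTIF.finditer(sequence):
            print(match.start(), match.group())  # two hits: NLSA, NASG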

  3. Situational Awareness Geospatial Application (iSAGA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sher, Benjamin

    Situational Awareness Geospatial Application (iSAGA) is a geospatial situational awareness software tool that uses an algorithm to extract location data from nearly any internet-based or custom data source and display it geospatially; allows user-friendly spatial analysis using custom-developed tools; and searches complex Geographic Information System (GIS) databases and accesses high-resolution imagery. iSAGA has application for federal, state, and local emergency response, consequence management, law enforcement, emergency operations, and other decision makers as a tool to provide complete, visual situational awareness using data feeds and tools selected by the individual agency or organization. Feeds may be layered and custom tools developed to uniquely suit each subscribing agency or organization. iSAGA may similarly be applied to international agencies and organizations.

  4. VEDA: a web-based virtual environment for dynamic atomic force microscopy.

    PubMed

    Melcher, John; Hu, Shuiqing; Raman, Arvind

    2008-06-01

    We describe here the theory and applications of virtual environment dynamic atomic force microscopy (VEDA), a suite of state-of-the-art simulation tools deployed on nanoHUB (www.nanohub.org) for the accurate simulation of tip motion in dynamic atomic force microscopy (dAFM) over organic and inorganic samples. VEDA takes advantage of nanoHUB's cyberinfrastructure to run high-fidelity dAFM tip dynamics computations on local clusters and the TeraGrid. Consequently, these tools are freely accessible and the dAFM simulations are run using standard web-based browsers without requiring additional software. A wide range of issues in dAFM, from optimal probe choice, probe stability, tip-sample interaction forces, and power dissipation to material property extraction and scanning dynamics over heterogeneous samples, can be addressed.

  5. Invited Article: VEDA: A web-based virtual environment for dynamic atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Melcher, John; Hu, Shuiqing; Raman, Arvind

    2008-06-01

    We describe here the theory and applications of virtual environment dynamic atomic force microscopy (VEDA), a suite of state-of-the-art simulation tools deployed on nanoHUB (www.nanohub.org) for the accurate simulation of tip motion in dynamic atomic force microscopy (dAFM) over organic and inorganic samples. VEDA takes advantage of nanoHUB's cyberinfrastructure to run high-fidelity dAFM tip dynamics computations on local clusters and the TeraGrid. Consequently, these tools are freely accessible and the dAFM simulations are run using standard web-based browsers without requiring additional software. A wide range of issues in dAFM, from optimal probe choice, probe stability, tip-sample interaction forces, and power dissipation to material property extraction and scanning dynamics over heterogeneous samples, can be addressed.

  6. Intelligent Sensors Security

    PubMed Central

    Bialas, Andrzej

    2010-01-01

    The paper is focused on the security issues of sensors provided with processors and software and used for high-risk applications. Common IT-related threats may cause serious consequences for sensor system users. To improve their robustness, sensor systems should be developed in a restricted way that would provide them with assurance. One assurance creation methodology is Common Criteria (ISO/IEC 15408), used for IT products and systems. The paper begins with a primer on the Common Criteria, and then a general security model of the intelligent sensor as an IT product is discussed. The paper presents how the security problem of the intelligent sensor is defined and solved. The contribution of the paper is to provide Common Criteria (CC) related security design patterns and to improve the effectiveness of the sensor development process. PMID:22315571

  7. Computer systems and software engineering

    NASA Technical Reports Server (NTRS)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In January 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  8. Genepleio software for effective estimation of gene pleiotropy from protein sequences.

    PubMed

    Chen, Wenhai; Chen, Dandan; Zhao, Ming; Zou, Yangyun; Zeng, Yanwu; Gu, Xun

    2015-01-01

    Though pleiotropy, which refers to the phenomenon of a gene affecting multiple traits, has long played a central role in genetics, development, and evolution, estimating the number of pleiotropy components remains difficult. In this paper, we report a newly developed software package, Genepleio, to estimate the effective gene pleiotropy from phylogenetic analysis of protein sequences. Since this estimate can be interpreted as the minimum pleiotropy of a gene, it serves as a reference for many empirical pleiotropy measures. This work facilitates our understanding of how gene pleiotropy affects the pattern of the genotype-phenotype map and the consequences of organismal evolution.

  9. Recursiveness in learning processes: An analogy to help software development for ABA intervention for autistic kids

    NASA Astrophysics Data System (ADS)

    Presti, Giovambattista; Premarini, Claudio; Leuzzi, Martina; Di Blasi, Melina; Squatrito, Valeria

    2017-11-01

    The operant was conceptualized by Skinner as a class of behaviors that have a common effect on the environment and that, as a class, can be shown to vary lawfully in their relations to other environmental variables, namely antecedents and consequences. Skinner himself underlined the fact that "operant field is the very field purpose of behavior". The operant offers interesting basic and applied characteristics for conceptualizing complex behavior as a recursive process of learning. In this paper we discuss how the operant concept can be applied in the implementation of software aimed at increasing cognitive skills in autistic children, and we provide an example.

  10. Preparation of Power Distribution System for High Penetration of Renewable Energy Part I. Dynamic Voltage Restorer for Voltage Regulation Part II. Distribution Circuit Modeling and Validation

    NASA Astrophysics Data System (ADS)

    Khoshkbar Sadigh, Arash

    Part I: Dynamic Voltage Restorer. In present power grids, voltage sags are recognized as a serious threat and a frequently occurring power-quality problem, with costly consequences such as the tripping of sensitive loads and production losses. Consequently, the demand for high power quality and voltage stability has become a pressing issue. The dynamic voltage restorer (DVR), a custom power device, is one of the most effective and direct solutions for "restoring" the quality of voltage at its load-side terminals when the quality of voltage at its source-side terminals is disturbed. In the first part of this thesis, a DVR configuration that needs no bulky dc-link capacitor or energy storage is proposed. This reduces the size of the DVR and increases the reliability of the circuit. In addition, the proposed DVR topology is based on a high-frequency isolation transformer, which reduces the transformer size. The proposed DVR circuit, which is suitable for both low- and medium-voltage applications, is based on dc-ac converters connected in series to split the main dc link between the inputs of the dc-ac converters. This feature makes it possible to use modular dc-ac converters and to utilize low-voltage components in these converters whenever the DVR is required in medium-voltage applications. The proposed configuration is tested under different conditions of load power factor and grid voltage harmonics. It is shown that the proposed DVR can compensate voltage sags effectively and protect sensitive loads. Following the proposed DVR topology, a fundamental-voltage-amplitude detection method applicable to both single- and three-phase systems for DVR applications is proposed. The advantages of the proposed method include applicability to distorted power grids with no need for any low-pass filter, precise and reliable detection, and simple computation and implementation without a phase-locked loop or lookup table. The proposed method has been verified by simulation and experimental tests under various conditions covering all possible cases, such as different amounts of voltage sag depth (VSD), different amounts of point-on-wave (POW) at which the voltage sag occurs, harmonic distortion, line frequency variation, and phase jump (PJ). Furthermore, the ripple in the fundamental voltage amplitude calculated by the proposed method, and its error, are analyzed considering line frequency variation together with harmonic distortion. The best and worst detection times of the proposed method were measured as 1 ms and 8.8 ms, respectively. Finally, the proposed method has been compared with other voltage sag detection methods available in the literature. Part II: Power System Modeling for Renewable Energy Integration. As power distribution systems evolve into more complex networks, electrical engineers have to rely on software tools to perform circuit analysis. There are dozens of powerful software tools available on the market to perform power system studies. Although their main functions are similar, there are differences in features and formatting structures to suit specific applications. This creates challenges for transferring power system circuit model data (PSCMD) between different software packages and rebuilding the same circuit in the second software environment. The objective of this part of the thesis is to develop a Unified Platform (UP) to facilitate transferring PSCMD among different software packages and to relieve the challenges of the circuit model conversion process.
UP uses a commonly available spreadsheet file with a defined format for any source software to write data to and for any destination software to read data from, via a script-based application called the PSCMD transfer application. The main considerations in developing the UP are to minimize manual intervention and to import a one-line diagram into the destination software, or export it from the source software, with all details needed to allow load flow, short circuit, and other analyses. In this study, ETAP, OpenDSS, and GridLab-D are considered, and PSCMD transfer applications written in MATLAB have been developed for each of these to read the circuit model data provided in the UP spreadsheet. In order to test the developed PSCMD transfer applications, circuit model data of a test circuit and of a power distribution circuit from Southern California Edison (SCE), a utility company, both built in CYME, were exported into the spreadsheet file according to the UP format. Thereafter, circuit model data were imported successfully from the spreadsheet files into the above-mentioned software using the PSCMD transfer applications developed for each package. After the studied SCE circuit was transferred into OpenDSS using the proposed UP scheme and the developed application, it was studied to investigate the impacts of large-scale solar energy penetration. The main challenge of solar energy integration into the power grid is its intermittent nature (i.e., discontinuity of output power) due to cloud shading of photovoltaic panels, which depends on weather conditions. To conduct this study, the OpenDSS time-series simulation feature, which is required due to the intermittency of solar energy, is utilized. The impacts of intermittent solar energy penetration, especially at high-variability points, on voltage fluctuation and on the operation of capacitor banks and voltage regulators are presented. In addition, the necessity of interpolating and resampling unequally spaced time-series measurement data to convert them to equally spaced time-series data, as well as the effect of the resampling time interval on the amount of error, is discussed. Two applications were developed in MATLAB to perform the interpolation and resampling and to calculate the error for different resampling time intervals in order to determine a suitable resampling time interval. Furthermore, an approach based on cumulative distributions of line/cable lengths by type and of load power ratings is presented to prioritize the loads, lines, and cables at which meters should be installed to have the most effect on model validation.
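
    The interpolation/resampling step described above can be sketched with numpy's linear interpolation (the thesis used MATLAB). The timestamps and values below are made up, and the error measure shown is one simple possibility rather than the thesis's exact procedure.

        # Convert unequally spaced measurements to an equally spaced series.
        import numpy as np

        t_meas = np.array([0.0, 0.7, 1.1, 2.6, 4.0])       # irregular timestamps (s)
        v_meas = np.array([1.00, 1.02, 0.98, 1.01, 0.99])  # e.g., voltage (p.u.)

        dt = 0.5                                           # resampling interval (s)
        t_uniform = np.arange(t_meas[0], t_meas[-1] + dt / 2, dt)
        v_uniform = np.interp(t_uniform, t_meas, v_meas)   # linear interpolation

        # Crude error check: re-interpolate back to the original timestamps
        # and compare, to help pick a suitable resampling interval.
        err = np.max(np.abs(np.interp(t_meas, t_uniform, v_uniform) - v_meas))
        print(t_uniform, v_uniform, err)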

  11. A model-based approach for automated in vitro cell tracking and chemotaxis analyses.

    PubMed

    Debeir, Olivier; Camby, Isabelle; Kiss, Robert; Van Ham, Philippe; Decaestecker, Christine

    2004-07-01

    Chemotaxis may be studied in two main ways: 1) counting cells passing through an insert (e.g., using Boyden chambers), and 2) directly observing cell cultures (e.g., using Dunn chambers), both in response to stationary concentration gradients. This article promotes the use of Dunn chambers and in vitro cell-tracking, achieved by video microscopy coupled with automatic image analysis software, in order to extract quantitative and qualitative measurements characterizing the response of cells to a diffusible chemical agent. Previously, we set up a videomicroscopy system coupled with image analysis software that was able to compute cell trajectories from in vitro cell cultures. In the present study, we introduce new software that extends the application field of this system to chemotaxis studies. This software is based on an adapted version of the active contour methodology, enabling each cell to be efficiently tracked for hours and resulting in detailed descriptions of individual cell trajectories. The major advantages of this method come from an improved robustness with respect to variability in cell morphology between different cell lines and to dynamic changes in cell shape during migration. Moreover, the software requires only a small number of parameters, which do not require overly sensitive tuning. Finally, the running time of the software is very short, allowing a higher acquisition frequency and, consequently, improved descriptions of complex cell trajectories, i.e., trajectories including cell division and cell crossing. We validated this software on several artificial and real cell culture experiments in Dunn chambers, including comparisons with manual (human-controlled) analyses. We developed new software and data analysis tools for automated cell tracking which enable cell chemotaxis to be efficiently analyzed. Copyright 2004 Wiley-Liss, Inc.
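
    The authors' method is an adapted active contour, which is too involved to reproduce here; as a much simpler illustration of turning per-frame detections into trajectories, the following links cell centroids frame-to-frame by nearest neighbour (made-up coordinates; no handling of division or crossing).

        # Naive nearest-neighbour centroid linking across frames.
        import numpy as np

        # centroids[f] = array of (x, y) cell positions detected in frame f
        centroids = [np.array([[10.0, 10.0], [50.0, 40.0]]),
                     np.array([[11.5, 10.5], [49.0, 42.0]]),
                     np.array([[13.0, 11.0], [48.0, 44.5]])]

        tracks = [[tuple(p)] for p in centroids[0]]
        for frame in centroids[1:]:
            for track in tracks:
                last = np.array(track[-1])
                dists = np.linalg.norm(frame - last, axis=1)
                track.append(tuple(frame[np.argmin(dists)]))

        for i, track in enumerate(tracks):
            print(f"cell {i}: {track}")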

  12. ValWorkBench: an open source Java library for cluster validation, with applications to microarray data analysis.

    PubMed

    Giancarlo, R; Scaturro, D; Utro, F

    2015-02-01

    The prediction of the number of clusters in a dataset, in particular for microarrays, is a fundamental task in biological data analysis, usually performed via validation measures. Unfortunately, it has received very little attention, and in fact there is a growing need for software tools/libraries dedicated to it. Here we present ValWorkBench, a software library consisting of eleven well-known validation measures, together with novel heuristic approximations for some of them. The main objective of this paper is to provide the interested researcher with the full software documentation of an open source cluster validation platform having the main features of being easily extendible in a homogeneous way and of offering software components that can be readily re-used. Consequently, the focus of the presentation is on the architecture of the library, since it provides an essential map that can be used to access the full software documentation, which is available at the supplementary material website [1]. The mentioned main features of ValWorkBench are also discussed and exemplified, with emphasis on software abstraction design and re-usability. A comparison with existing cluster validation software libraries, mainly in terms of the mentioned features, is also offered. It suggests that ValWorkBench is a much needed contribution to the microarray software development/algorithm engineering community. For completeness, it is important to mention that previous accurate algorithmic experimental analyses of the relative merits of each of the implemented measures [19,23,25], carried out specifically on microarray data, give useful insights into the effectiveness of ValWorkBench for cluster validation to researchers in the microarray community interested in its use for the mentioned task. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
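
    ValWorkBench itself is a Java library; as a language-agnostic sketch of the underlying task — choosing the number of clusters with a validation measure — the following scans k using the silhouette index on synthetic data, with scikit-learn standing in for ValWorkBench's measures.

        # Pick the number of clusters by maximizing the silhouette index.
        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(loc, 0.3, size=(50, 2)) for loc in (0, 3, 6)])

        scores = {}
        for k in range(2, 7):
            labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
            scores[k] = silhouette_score(X, labels)

        print(max(scores, key=scores.get), scores)  # expect k = 3 to win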

  13. Autonomy Software: V&V Challenges and Characteristics

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Visser, Willem

    2006-01-01

    The successful operation of unmanned air vehicles requires software with a high degree of autonomy. Only if high-level functions can be carried out without human control and intervention can complex missions in a changing and potentially unknown environment be carried out successfully. Autonomy software is highly mission and safety critical: failures caused by flaws in the software can not only jeopardize the mission but could also endanger human life (e.g., the crash of a UAV in a densely populated area). Due to its large size, high complexity, and use of specialized algorithms (planner, constraint-solver, etc.), autonomy software poses specific challenges for its verification, validation, and certification. We have carried out a survey among researchers and scientists at NASA to study these issues. In this paper, we will present major results of this study, discussing the broad spectrum of notions and characteristics of autonomy software and its challenges for design and development. A main focus of this survey was to evaluate verification and validation (V&V) issues and challenges, compared to the development of "traditional" safety-critical software. We will discuss important issues in V&V of autonomous software and advanced V&V tools which can help to mitigate software risks. Results of this survey will help to identify and understand safety concerns in autonomy software and will lead to improved strategies for mitigation of these risks.

  14. Property-Based Software Engineering Measurement

    NASA Technical Reports Server (NTRS)

    Briand, Lionel; Morasca, Sandro; Basili, Victor R.

    1995-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysis, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. This framework defines several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalism and properties we introduce are convenient and intuitive. In addition, we have reviewed the literature on this subject and compared it with our work. This framework contributes constructively to a firmer theoretical ground of software measurement.
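
    As a paraphrased sketch of the flavor of such property definitions, using size as the example concept (here S denotes a system with a set of elements E, and m1, m2 are disjoint modules of S); see the paper for the authoritative statements:

        % Paraphrased sketch of size properties; not a verbatim quotation.
        \begin{align*}
        &\text{Nonnegativity:} && \mathrm{Size}(S) \ge 0 \\
        &\text{Null value:}    && E = \emptyset \implies \mathrm{Size}(S) = 0 \\
        &\text{Additivity:}    && S = m_1 \cup m_2,\ m_1 \cap m_2 = \emptyset
            \implies \mathrm{Size}(S) = \mathrm{Size}(m_1) + \mathrm{Size}(m_2)
        \end{align*}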

  15. Research in software allocation for advanced manned mission communications and tracking systems

    NASA Technical Reports Server (NTRS)

    Warnagiris, Tom; Wolff, Bill; Kusmanoff, Antone

    1990-01-01

    An assessment of the planned processing hardware and software/firmware for the Communications and Tracking System of Space Station Freedom (SSF) was performed. The intent of the assessment was to determine the optimum distribution of software/firmware in the processing hardware for maximum throughput with minimum required memory. As a product of the assessment process, an assessment methodology was to be developed that could be used for similar assessments of future manned spacecraft system designs. The assessment process was hampered by changing requirements for the Space Station. As a result, the initial objective of determining the optimum software/firmware allocation was not fulfilled, but several useful conclusions and recommendations resulted from the assessment. It was concluded that the assessment process would not be completely successful for a system with changing requirements. It was also concluded that memory and hardware requirements were being modified to fit one another as a consequence of the change process, and that although throughput could not be quantified, potential problem areas could be identified. Finally, the inherent flexibility of the system design was essential for the success of a system design with changing requirements. Recommendations resulting from the assessment included development of common software for some embedded controller functions, reduction of embedded processor requirements by hardwiring some Orbital Replacement Units (ORUs) to make better use of processor capabilities, and improvement in communications between software development personnel to enhance the integration process. Lastly, a critical observation was that the software integration tasks did not appear to be addressed in the design process to the degree necessary for successful satisfaction of the system requirements.

  16. Property-Based Software Engineering Measurement

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.

    1997-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.

  17. [Research progress of probe design software of oligonucleotide microarrays].

    PubMed

    Chen, Xi; Wu, Zaoquan; Liu, Zhengchun

    2014-02-01

    DNA microarray has become an essential medical genetic diagnostic tool owing to its high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are now available to perform this work. Each package targets different sequences and has its own advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity, and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review will help users choose appropriate probe design software. It will also reduce the costs of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and commercialization of high-performance probe design software.
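
    Of the three criteria, melting temperature is the easiest to illustrate in code. The sketch below uses the Wallace (2+4) rule, a rough estimate valid only for short oligonucleotides, standing in for the more elaborate nearest-neighbour models that probe design software typically uses.

        # Wallace rule: Tm ~= 2*(A+T) + 4*(G+C), for short oligos (< ~14 nt).
        def wallace_tm(oligo: str) -> float:
            oligo = oligo.upper()
            at = oligo.count("A") + oligo.count("T")
            gc = oligo.count("G") + oligo.count("C")
            return 2.0 * at + 4.0 * gc  # degrees Celsius

        print(wallace_tm("ACGTACGTACGT"))  # 36.0 C for this 12-mer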

  18. An Agile Constructionist Mentoring Methodology for Software Projects in the High School

    ERIC Educational Resources Information Center

    Meerbaum-Salant, Orni; Hazzan, Orit

    2010-01-01

    This article describes the construction process and evaluation of the Agile Constructionist Mentoring Methodology (ACMM), a mentoring method for guiding software development projects in the high school. The need for such a methodology has arisen due to the complexity of mentoring software project development in the high school. We introduce the…

  19. Development of a Software-Defined Radar

    DTIC Science & Technology

    2017-10-01

    waveform to the widest available (unoccupied) instantaneous bandwidth in real time. Consequently, the radar range resolution and target detection are...LabVIEW The matched filter range profile is calculated in real time using fast Fourier transform (FFT) operations to perform a cross-correlation...between the transmitted waveform and the received complex data. Figure 4 demonstrates the block logic used to achieve real-time range profile
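
    The FFT-based matched filtering described in the snippet can be sketched in a few lines of numpy (the report's implementation is in LabVIEW); the chirp, echo delay, and noise level below are arbitrary.

        # Matched filter as FFT-based cross-correlation of tx and rx.
        import numpy as np

        n = 256
        tx = np.exp(1j * np.pi * (np.arange(64) ** 2) / 64)    # toy chirp
        rx = np.zeros(n, dtype=complex)
        rx[100:100 + 64] = 0.5 * tx                            # echo at bin 100
        rx += 0.01 * (np.random.randn(n) + 1j * np.random.randn(n))

        TX = np.fft.fft(tx, n)                                 # zero-padded FFT
        RX = np.fft.fft(rx, n)
        range_profile = np.abs(np.fft.ifft(RX * np.conj(TX)))

        print(np.argmax(range_profile))                        # ~100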

  20. Iowa Virtual Literacy Protocol: A Pre-Experimental Design Using Kurzweil 3000 Text-to-Speech Software with Incarcerated Adult Learners

    ERIC Educational Resources Information Center

    McCulley, Yvette K.

    2012-01-01

    The problem: The increasingly competitive global economy demands literate, educated workers. Both men and women experience the effects of education on employment rates and income. Racial and ethnic minorities, English language learners, and especially those with prison records are most deeply affected by the economic consequences of dropping out…

  1. Obstacles to the Use of ICTs in Training and Consequences for the Development of "E"-Learning and "M"-Learning

    ERIC Educational Resources Information Center

    Marquet, Pascal

    2011-01-01

    Being able to access educational or training resources or programmes anytime and anywhere is one of the challenges that "e"-learning and now "m"-learning seek to address, with the aim of maintaining the highest possible level of professional skills. Although hardware and software solutions are now firmly established, instructional design issues…

  2. Phase 1 Development Report for the SESSA Toolkit

    DTIC Science & Technology

    2014-09-01

    data acquisition, data management, and data analysis. SESSA was designed to meet forensic crime scene needs as defined by the DoD’s Military Criminal...on the design, functional attributes, algorithm development, system architecture, and software programming include: Robert Knowlton, Brad Melton...Building Restoration Operations Optimization Model (BROOM). BROOM (Knowlton et al., 2012) was designed for consequence management activities (e.g

  3. Development of high-availability ATCA/PCIe data acquisition instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Correia, Miguel; Sousa, Jorge; Batista, Antonio J.N.

    2015-07-01

    Latest Fusion energy experiments envision a quasi-continuous operation regime. In consequence, the largest experimental devices, currently in development, specify high-availability (HA) requirements for the whole plant infrastructure. HA features enable the whole facility to perform seamlessly in the case of failure of any of its components, coping with the increasing duration of plasma discharges (steady-state) and assuring the safety of equipment, people, the environment, and investment. IPFN developed a control and data acquisition system aimed at fast control of advanced Fusion devices, which is thus required to provide such HA features. The system is based on in-house developed Advanced Telecommunication Computing Architecture (ATCA) instrumentation modules - IO blades and data switch blades - establishing a PCIe network on the ATCA shelf's back-plane. The data switch communicates with an external host computer through a PCIe data network. At the hardware management level, the system architecture takes advantage of ATCA native redundancy and hot swap specifications to implement fail-over substitution of IO or data switch blades. A redundant host scheme is also supported by the ATCA/PCIe platform. At the software level, PCIe provides implementation of hot plug services, which translate the hardware changes to the corresponding software/operating system devices. The paper presents how the ATCA and PCIe based system can be set up to perform with the desired degree of HA, thus being suitable for advanced Fusion control and data acquisition systems. (authors)

  4. Food Web Designer: a flexible tool to visualize interaction networks.

    PubMed

    Sint, Daniela; Traugott, Michael

    Species are embedded in complex networks of ecological interactions, and assessing these networks provides a powerful approach to understanding what the consequences of these interactions are for ecosystem functioning and services. This is essential for developing and evaluating strategies for the management and control of pests. Graphical representations of networks can help recognize patterns that might be overlooked otherwise. However, there is a lack of software that allows these complex interaction networks to be visualized. Food Web Designer is a stand-alone, highly flexible and user-friendly software tool to quantitatively visualize trophic and other types of bipartite and tripartite interaction networks. It is offered free of charge for use on Microsoft Windows platforms. Food Web Designer is easy to use, without the need to learn a specific syntax, due to its graphical user interface. Up to three (trophic) levels can be connected using links cascading from or pointing towards the taxa within each level to illustrate top-down and bottom-up connections. Link width/strength and abundance of taxa can be quantified, allowing fully quantitative networks to be generated. Network datasets can be imported, saved for later adjustment, and the interaction webs can be exported as pictures in different file formats. We show how Food Web Designer can be used to draw predator-prey and host-parasitoid food webs, demonstrating that this software is a simple and straightforward tool for graphically displaying interaction networks to assess pest control or any other type of interaction in both managed and natural ecosystems from an ecological network perspective.
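
    Food Web Designer is a Windows GUI tool; as a rough code illustration of the same idea — a quantitative tri-trophic web with link widths scaled by interaction strength — the following uses networkx and matplotlib with invented taxa and strengths.

        # Draw a small quantitative tri-trophic interaction web.
        import matplotlib.pyplot as plt
        import networkx as nx

        links = [("aphid", "ladybird", 5), ("aphid", "spider", 2),
                 ("ladybird", "bird", 1), ("spider", "bird", 3)]
        levels = {"aphid": 0, "ladybird": 1, "spider": 1, "bird": 2}

        G = nx.DiGraph()
        G.add_weighted_edges_from(links)

        # Manual layout: x spreads taxa within a level, y encodes trophic level.
        pos, seen = {}, {}
        for node in G.nodes:
            lvl = levels[node]
            pos[node] = (seen.get(lvl, 0), lvl)
            seen[lvl] = seen.get(lvl, 0) + 1

        widths = [d["weight"] for _, _, d in G.edges(data=True)]
        nx.draw(G, pos, with_labels=True, width=widths, node_color="lightgreen")
        plt.show()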

  5. Survey of Software Assurance Techniques for Highly Reliable Systems

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy

    2004-01-01

    This document provides a survey of software assurance techniques for highly reliable systems including a discussion of relevant safety standards for various industries in the United States and Europe, as well as examples of methods used during software development projects. It contains one section for each industry surveyed: Aerospace, Defense, Nuclear Power, Medical Devices and Transportation. Each section provides an overview of applicable standards and examples of a mission or software development project, software assurance techniques used and reliability achieved.

  6. Managing the maintenance and conservation of the built heritage: a web-GIS approach for the Richini courtyard in Milan

    NASA Astrophysics Data System (ADS)

    Toniolo, L.; Gulotta, D.; Bertoldi, M.; Bortolotto, S.

    2012-04-01

    The Richini courtyard is a masterpiece of northern Italian baroque and is part of the complex that currently hosts the "Università degli Studi" of Milan. Its four internal façades are based on a double arcade structure with granitic columns along a rectangular plan. The architectural elements are enriched by an outstanding sculpted decoration made of Angera stone (a typical Lombard dolostone) with bas-relief panels, high-relief figures, mouldings and voussoirs. The courtyard suffers the consequences of a troubled conservation history: the Second World War bombardment caused devastating damage to both the structure and the sculpted surfaces, so that an extensive restoration was carried out during the early fifties. Moreover, a further and massive conservative intervention was required during the nineties due to the increasing degradation rate of the Angera stone subjected to severely polluted environmental conditions. The overall durability of this last intervention, as well as the long-term compatibility of the restoration materials, has been evaluated almost twenty years later, in 2011. A thorough study of representative areas of the courtyard has been conducted by a multi-disciplinary research group. The aim of the study was the evaluation of the state of conservation of the ancient and restoration materials, as well as the identification of the decay phenomena. A highly accurate 3D laser scanner survey of the courtyard has been performed as well. The results of the diagnostic activity have been summarised in the present work. The wide range of different types of data (analytical and geometrical data, historical records, photographic documentation) has been managed using the latest release of a web-GIS software package specifically designed for application in built heritage conservation. A new data structure has been purposely designed to maximize efficiency in data entry, querying, and updating. The enhanced web-GIS software has become a user-friendly platform for collecting and referencing all results for the façades, based on high-resolution images of the sculpted surfaces. The data elaboration allowed the definition of a risk priority scheme for each type of architectural element. Guidelines for long-term monitoring and for sustainable planned maintenance of the courtyard have consequently been set up.

  7. Digital Geological Mapping for Earth Science Students

    NASA Astrophysics Data System (ADS)

    England, Richard; Smith, Sally; Tate, Nick; Jordan, Colm

    2010-05-01

    This SPLINT (SPatial Literacy IN Teaching)-supported project is developing pedagogies for introducing digital geological mapping to Earth Science students. Traditionally, students are taught to make geological maps on a paper basemap with a notebook to record their observations. Learning to use a tablet PC with GIS-based software for mapping and data recording requires emphasis on training staff and students in specific GIS and IT skills and beneficial adjustments to the way in which geological data are recorded in the field. A set of learning and teaching materials is under development to support this learning process. Following the release of the British Geological Survey's Sigma software, we have been developing generic methodologies for the introduction of digital geological mapping to students that already have experience of mapping by traditional means. The teaching materials introduce the software to the students through a series of structured exercises. The students learn the operation of the software in the laboratory by entering existing observations, preferably data that they have collected. Through this the students benefit from being able to reflect on their previous work, consider how it might be improved and plan new work. Following this they begin fieldwork in small groups using both methods simultaneously. They are able to practise what they have learnt in the classroom and review the differences, advantages and disadvantages of the two methods, while adding to the work that has already been completed. Once the field exercises are completed, students use the data that they have collected to produce high-quality map products and are introduced to the use of integrated digital databases, which they learn to search and extract information from. The relatively recent development of the technologies which underpin digital mapping also means that many academic staff require training before they are able to deliver the course materials. Consequently, a set of staff training materials is being developed in parallel to those for the students. The presentation will review the teaching exercises and student and staff responses to their introduction.

  8. Research Note: The consequences of different methods for handling missing network data in Stochastic Actor Based Models

    PubMed Central

    Hipp, John R.; Wang, Cheng; Butts, Carter T.; Jose, Rupa; Lakon, Cynthia M.

    2015-01-01

    Although stochastic actor based models (e.g., as implemented in the SIENA software program) are growing in popularity as a technique for estimating longitudinal network data, a relatively understudied issue is the consequence of missing network data for longitudinal analysis. We explore this issue in our research note by utilizing data from four schools in an existing dataset (the AddHealth dataset) over three time points, assessing the substantive consequences of using four different strategies for addressing missing network data. The results indicate that whereas some measures in such models are estimated relatively robustly regardless of the strategy chosen for addressing missing network data, some of the substantive conclusions will differ based on the missing data strategy chosen. These results have important implications for this burgeoning applied research area, implying that researchers should more carefully consider how they address missing data when estimating such models. PMID:25745276

  9. Research Note: The consequences of different methods for handling missing network data in Stochastic Actor Based Models.

    PubMed

    Hipp, John R; Wang, Cheng; Butts, Carter T; Jose, Rupa; Lakon, Cynthia M

    2015-05-01

    Although stochastic actor based models (e.g., as implemented in the SIENA software program) are growing in popularity as a technique for estimating longitudinal network data, a relatively understudied issue is the consequence of missing network data for longitudinal analysis. We explore this issue in our research note by utilizing data from four schools in an existing dataset (the AddHealth dataset) over three time points, assessing the substantive consequences of using four different strategies for addressing missing network data. The results indicate that whereas some measures in such models are estimated relatively robustly regardless of the strategy chosen for addressing missing network data, some of the substantive conclusions will differ based on the missing data strategy chosen. These results have important implications for this burgeoning applied research area, implying that researchers should more carefully consider how they address missing data when estimating such models.

  10. CATE 2016 Indonesia: Camera, Software, and User Interface

    NASA Astrophysics Data System (ADS)

    Kovac, S. A.; Jensen, L.; Hare, H. S.; Mitchell, A. M.; McKay, M. A.; Bosh, R.; Watson, Z.; Penn, M.

    2016-12-01

    The Citizen Continental-America Telescopic Eclipse (Citizen CATE) Experiment will use a fleet of 60 identical telescopes across the United States to image the inner solar corona during the 2017 total solar eclipse. As a proof of concept, five sites were hosted along the path of totality during the 2016 total solar eclipse in Indonesia. Tanjung Pandan, Belitung, Indonesia was the first site to experience totality. This site had the best seeing conditions and focus, resulting in the highest quality images, and proved that the equipment to be used is capable of recording high quality images of the solar corona. Because 60 sites will be funded, each setup needs to be cost-effective. This requires us to use an inexpensive camera, which consequently has a small dynamic range. To compensate for the corona's intensity drop-off by a factor of 1,000, images are taken at seven frames per second, at exposures of 0.4 ms, 1.3 ms, 4.0 ms, 13 ms, 40 ms, 130 ms, and 400 ms. Using MATLAB software, we are able to capture a high dynamic range with an Arduino that controls the 2448 x 2048 CMOS camera. A major component of this project is to train average citizens to use the software, meaning it needs to be as user-friendly as possible. The CATE team is currently working with MathWorks to create a graphical user interface (GUI) that will make data collection run smoothly. This interface will include tabs for alignment, focus, calibration data, drift data, GPS, totality, and a quick-look function. This work was made possible through the National Solar Observatory Research Experiences for Undergraduates (REU) Program, which is funded by the National Science Foundation (NSF). The NSO Training for 2017 Citizen CATE Experiment, funded by NASA (NASA NNX16AB92A), also provided support for this project. The National Solar Observatory is operated by the Association of Universities for Research in Astronomy, Inc. (AURA) under cooperative agreement with the NSF.
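
    One way to merge the seven bracketed exposures into a single high-dynamic-range image is to scale each frame by the inverse of its exposure time, discard saturated pixels, and average. The sketch below (synthetic data and numpy, not the team's MATLAB pipeline) demonstrates that idea.

        # Merge bracketed exposures into an HDR image by exposure-weighted
        # averaging of unsaturated pixels.
        import numpy as np

        exposures_ms = np.array([0.4, 1.3, 4.0, 13.0, 40.0, 130.0, 400.0])
        true_scene = np.logspace(0, 3, 64).reshape(8, 8)     # 1000:1 range

        frames = np.clip(true_scene[None] * exposures_ms[:, None, None],
                         0, 4095)                            # 12-bit saturation

        weights = (frames < 4095).astype(float)              # drop saturated px
        hdr = ((frames / exposures_ms[:, None, None]) * weights).sum(0)
        hdr /= weights.sum(0)

        print(np.allclose(hdr, true_scene))                  # True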

  11. Innovations in the Assay of Un-Segregated Multi-Isotopic Grade TRU Waste Boxes with SuperHENC and FRAM Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, A. P.; Barber, S.; Abdurrahman, N. M.

    2006-07-01

    The Super High Efficiency Neutron Coincidence Counter (SuperHENC) was originally developed by BIL Solutions Inc., Los Alamos National Laboratory (LANL) and Rocky Flats Environmental Technology Site (RFETS) for assay of transuranic (TRU) waste in Standard Waste Boxes (SWB) at Rocky Flats. This mobile system was a key component in the shipment of over 4,000 SWBs to the Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. The system was WIPP certified in 2001 and operated at the site for four years. The success of this system, a passive neutron coincidence counter combined with high resolution gamma spectroscopy, led to the order of two new units, delivered to Hanford in 2004. Several new challenges were faced at Hanford: for example, the original RFETS system was calibrated for segregated waste streams such that metals, plastics, wet combustibles and dry combustibles were separated by 'Item Description Codes' prior to assay. Furthermore, the RFETS mission of handling only weapons-grade plutonium enabled the original SuperHENC to benefit from the use of known Pu isotopics. Operations at Hanford, as with most other DOE sites, generate un-segregated waste streams with a wide diversity of Pu isotopics. Consequently, the new SuperHENCs are required to deal with new technical challenges. The neutron system's software and calibration methodology have been modified to encompass these new requirements. In addition, PC-FRAM software has been added to the gamma system, providing a robust isotopic measurement capability. Finally, a new software package has been developed that integrates the neutron and gamma data to provide final assay results and an analysis report. The new system's performance has been rigorously tested and validated against WIPP quality requirements. These modifications, together with the mobile platform, make the new SuperHENC far more versatile in handling diverse waste streams and allow for rapid redeployment around the DOE complex. (authors)

  12. WinHPC System Software | High-Performance Computing | NREL

    Science.gov Websites

    Learn about the software applications, tools, and toolchains available on the WinHPC system, including the Intel compilers development tool and toolchain suite.

  13. High Performance Computing Software Applications for Space Situational Awareness

    NASA Astrophysics Data System (ADS)

    Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.

    The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative deconvolution (PCID) image enhancement software tool. Specifically, we have demonstrated order-of-magnitude speed-ups in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.

  14. The current role of high-resolution mass spectrometry in food analysis.

    PubMed

    Kaufmann, Anton

    2012-05-01

    High-resolution mass spectrometry (HRMS), which is used for residue analysis in food, has gained wider acceptance in the last few years. This development is due to the availability of more rugged, sensitive, and selective instrumentation. The benefits provided by HRMS over classical unit-mass-resolution tandem mass spectrometry are considerable. These benefits include the collection of full-scan spectra, which provides greater insight into the composition of a sample. Consequently, the analyst has the freedom to measure compounds without previous compound-specific tuning, the possibility of retrospective data analysis, and the capability of performing structural elucidations of unknown or suspected compounds. HRMS strongly competes with classical tandem mass spectrometry in the field of quantitative multiresidue methods (e.g., pesticides and veterinary drugs). It is one of the most promising tools when moving towards nontargeted approaches. Certain hardware and software issues still have to be addressed by the instrument manufacturers for it to dislodge tandem mass spectrometry from its position as the standard trace analysis tool.

  15. Misalignment of disposable pulse oximeter probes results in false saturation readings that influence anesthetic management.

    PubMed

    Guan, Zhonghui; Baker, Keith; Sandberg, Warren S

    2009-11-01

    We report a small case series in which misaligned disposable pulse oximeter sensors gave falsely low saturation readings. In each instance, the sensor performed well during preinduction oxygen administration and the early part of the case, most notably by producing a plethysmographic trace rated as high quality by the oximeter software. The reported pulse oximeter oxygen saturation eventually decreased to concerning levels in each instance, but the anesthesiologists, relying on the reported high-quality signal, initially sought other causes for apparent hypoxia. They undertook maneuvers and diagnostic procedures later deemed unnecessary. When the malpositioned sensors were discovered and repositioned, the apparent hypoxia was quickly relieved in each case. We then undertook a survey of disposable oximeter sensors as patients entered the recovery room, and discovered malposition of more than 1 cm in approximately 20% of all sensors, without apparent consequence. We conclude that the technology is quite robust, but that the diagnosis of apparent hypoxia should include a quick check of oximeter position early on.

  16. Development of imaging techniques to study the pathogenesis of biosafety level 2/3 infectious agents.

    PubMed

    Rella, Courtney E; Ruel, Nancy; Eugenin, Eliseo A

    2014-12-01

    Despite significant advances in microbiology and molecular biology over the last decades, several infectious diseases remain global concerns, resulting in the death of millions of people worldwide each year. According to the Centers for Disease Control and Prevention (CDC), in 2012 there were 34 million people infected with HIV, 8.7 million new cases of tuberculosis, 500 million cases of hepatitis, and 50-100 million people infected with dengue. Several of these pathogens, despite high incidence, do not have reliable clinical detection methods. New or improved protocols have been generated to enhance detection and quantitation of several pathogens using high-end microscopy (light, confocal, and STORM microscopy) and imaging software. In the current manuscript, we discuss these approaches and the theories behind these methodologies. Thus, advances in imaging techniques will open new possibilities to discover therapeutic interventions to reduce or eliminate the devastating consequences of infectious diseases. © 2014 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.

  17. The AI Bus architecture for distributed knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain

    1991-01-01

    The AI Bus architecture is a layered, distributed, object-oriented framework developed to support the requirements of advanced technology programs for an order of magnitude improvement in software costs. The consequent need for highly autonomous computer systems, adaptable to new technology advances over a long lifespan, led to the design of an open architecture and toolbox for building large scale, robust, production quality systems. The AI Bus accommodates a mix of knowledge-based and conventional components, running in heterogeneous, distributed real-world and testbed environments. The concepts and design of the AI Bus architecture are described, along with its current implementation status as a Unix C++ library of reusable objects. Each high-level semiautonomous agent process consists of a number of knowledge sources together with interagent communication mechanisms based on shared blackboards and message-passing acquaintances. Standard interfaces and protocols are followed for combining and validating subsystems. Dynamic probes or demons provide an event-driven means for providing active objects with shared access to resources, and each other, while not violating their security.
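
    A toy sketch may help make the blackboard-and-demons pattern described above concrete. This is a generic Python illustration of the pattern (shared blackboard, event-driven demons), not the AI Bus C++ library itself.

```python
# Toy blackboard-plus-agents pattern: semiautonomous agents post partial
# results to a shared blackboard, and "demon" callbacks fire when entries
# they watch for appear. Names and the watch/post interface are illustrative.
class Blackboard:
    def __init__(self):
        self.entries = {}
        self.demons = []           # (predicate, callback) pairs

    def post(self, key, value):
        self.entries[key] = value
        for predicate, callback in self.demons:
            if predicate(key, value):
                callback(key, value)   # event-driven activation

    def watch(self, predicate, callback):
        self.demons.append((predicate, callback))

bb = Blackboard()
bb.watch(lambda k, v: k == "sensor/temp" and v > 100,
         lambda k, v: bb.post("alarm/overtemp", v))
bb.post("sensor/temp", 140)        # triggers the overtemp demon
print(bb.entries)                  # both the reading and the alarm are posted
```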

  18. Analytical Design of Evolvable Software for High-Assurance Computing

    DTIC Science & Technology

    2001-02-14

    Mathematical expression for the Total Sum of Squares, which measures the variability that results when all values are treated as a combined sample coming from... primarily interested in background on software design and high-assurance computing, research in software architecture generation or evaluation... respectively. Those readers solely interested in the validation of a software design approach should at the minimum read Chapter 6 followed by Chapter...
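
    For reference, the quantity named in the excerpt has a standard definition when all values are treated as one combined sample; the usual partition into between- and within-group terms is shown alongside it. This is textbook notation, not the report's own (which the snippet does not preserve):

```latex
\[
\mathrm{TSS} = \sum_{i=1}^{k}\sum_{j=1}^{n_i}\left(x_{ij}-\bar{x}\right)^2
= \underbrace{\sum_{i=1}^{k} n_i\left(\bar{x}_i-\bar{x}\right)^2}_{\text{between groups}}
+ \underbrace{\sum_{i=1}^{k}\sum_{j=1}^{n_i}\left(x_{ij}-\bar{x}_i\right)^2}_{\text{within groups}}
\]
```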

  19. Design of Control Software for a High-Speed Coherent Doppler Lidar System for CO2 Measurement

    NASA Technical Reports Server (NTRS)

    Vanvalkenburg, Randal L.; Beyon, Jeffrey Y.; Koch, Grady J.; Yu, Jirong; Singh, Upendra N.; Kavaya, Michael J.

    2010-01-01

    The design of the software for a 2-micron coherent high-speed Doppler lidar system for CO2 measurement at NASA Langley Research Center is discussed in this paper. The specific strategy and design topology to meet the requirements of the system are reviewed. In order to attain high-speed digitization of the different types of signals to be sampled on multiple channels, a carefully planned design of the control software is imperative. Samples of digitized data from each channel and their roles in data analysis post-processing are also presented. Several challenges of extremely fast, high-volume data acquisition are discussed. The software must check the validity of each lidar return as well as other monitoring-channel data in real time. For such high-speed data acquisition systems, the software is a key component that enables the entire scope of CO2 measurement studies using commercially available system components.
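
    The real-time validity check mentioned above can be sketched as a simple per-record screen. The thresholds, pre-trigger window, and synthetic return below are illustrative assumptions, not the Langley system's actual acceptance criteria.

```python
# Hedged sketch of a per-return validity screen: reject clipped records and
# records whose peak amplitude is too small relative to pre-trigger noise.
import numpy as np

FULL_SCALE = 32767   # assumed 16-bit digitizer full scale
MIN_SNR = 3.0        # assumed acceptance threshold

def record_is_valid(samples, pretrigger=100):
    """Screen one digitized lidar return (1-D integer array)."""
    if np.max(np.abs(samples)) >= FULL_SCALE:      # clipped -> reject
        return False
    noise = np.std(samples[:pretrigger])           # pre-trigger noise level
    signal = np.max(np.abs(samples[pretrigger:]))  # peak return amplitude
    return noise > 0 and signal / noise >= MIN_SNR

rng = np.random.default_rng(1)
baseline = rng.normal(0, 20, 100)
ret = 500 * np.exp(-np.arange(900) / 200.0) + rng.normal(0, 20, 900)
shot = np.concatenate([baseline, ret]).astype(np.int16)
print(record_is_valid(shot))   # True for this synthetic return
```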

  20. Keeping Things Interesting: A Reuse Case Study

    NASA Astrophysics Data System (ADS)

    Troisi, V.; Swick, R.; Seufert, E.

    2006-12-01

    Software reuse has several obvious advantages. By taking advantage of the experience and skill of colleagues, one not only saves time, money, and resources, but can also jump-start a project that might otherwise have floundered from the start, or not even have been possible. One of the least talked about advantages of software reuse is that it helps keep the work interesting for the developers. Reuse prevents developers from spending time and energy writing software solutions to problems that have already been solved, and frees them to concentrate on solving new problems, developing new components, and doing things that have never been done before. At the National Snow and Ice Data Center we are fortunate that our user community has some unique needs that aren't met by mainstream solutions. Consequently, we look for reuse opportunities wherever possible so we can focus on the tasks that add value for our user community. This poster offers a case study of one thread through a decade of reuse at NSIDC that has involved eight different development efforts to date.

  1. Hospital information system: reusability, designing, modelling, recommendations for implementing.

    PubMed

    Huet, B

    1998-01-01

    The aims of this paper are to specify some essential conditions for building reuse models for hospital information systems (HIS) and to present an application for hospital clinical laboratories. Reusability is a general trend in software; however, reuse can involve a greater or lesser part of design, classes, and programs. Consequently, a project involving reusability must be precisely defined. The introduction covers trends in software, the stakes of reuse models for HIS, and the special use case constituted by a HIS. The main three parts of this paper are: 1) designing a reuse model (which objects are common to several information systems?); 2) a reuse model for hospital clinical laboratories (a gen-spec object model is presented for all laboratories: biochemistry, bacteriology, parasitology, pharmacology, ...); 3) recommendations for generating plug-compatible software components (a reuse model can be implemented as a framework; concrete factors that increase reusability are presented). In conclusion, reusability is a subtle exercise whose project scope must be carefully defined in advance.

  2. Intermediate Levels of Autonomy within the SSM/PMAD Breadboard

    NASA Technical Reports Server (NTRS)

    Dugal-Whitehead, Norma R.; Walls, Bryan

    1995-01-01

    The Space Station Module Power Management and Distribution (SSM/PMAD) breadboard is a test bed for the development of advanced power system control and automation. Software control in the SSM/PMAD breadboard is through cooperating systems, called Autonomous Agents. Agents can be a mixture of algorithmic software and expert systems. The early SSM/PMAD system was envisioned as being completely autonomous. It soon became apparent, though, that there would always be a need for human intervention, at least as long as a human interacts with the system in any way. In a system designed only for autonomous operation, manual intervention meant taking full control of the whole system, and losing whatever expertise was in the system. Several methods for allowing humans to interact at an appropriate level of control were developed. This paper examines some of these intermediate modes of autonomy. The least intrusive mode is simple monitoring. The ability to modify future behavior by altering a schedule involves high-level interaction. Modification of operating activities comes next. The coarsest mode of control is individual, unplanned operation of individual power system components. Each of these levels is integrated into the SSM/PMAD breadboard, with support for the user (such as warnings of the consequences of control decisions) at every level.

  3. A Cs2LiYCl6:Ce-based advanced radiation monitoring device

    NASA Astrophysics Data System (ADS)

    Budden, B. S.; Stonehill, L. C.; Dallmann, N.; Baginski, M. J.; Best, D. J.; Smith, M. B.; Graham, S. A.; Dathy, C.; Frank, J. M.; McClish, M.

    2015-06-01

    Cs2LiYCl6:Ce3+ (CLYC) scintillator has gained recent interest because of its ability to perform simultaneous gamma spectroscopy and thermal neutron detection. Discrimination between the two incident particle types is possible because of the fundamentally different emission waveforms, a consequence of the interaction and subsequent scintillation mechanisms within the crystal. Due to this dual-mode detector capability, CLYC was selected for the development of an Advanced Radiation Monitoring Device (ARMD), a compact handheld instrument for radioisotope identification and localization. ARMD consists of four 1 in.-right cylindrical CLYC crystals, custom readout electronics including a suitable multi-window application specific integrated circuit (ASIC), battery pack, proprietary software, and an Android-based tablet for high-level analysis and display. We herein describe the motivation of the work and the engineering design of the unit, and we explain the software embedded in the core module and used for radioisotope analysis. We report an operational range of tens of keV to 8.5 MeV with approximately 5.3% gamma energy resolution at 662 keV, a thermal neutron detection efficiency of 10%, a battery lifetime of up to 10 h, and manageable count rates of 20 kHz; further, we describe in greater detail the time required to identify specific gamma source setups.

  4. Approximation of Engine Casing Temperature Constraints for Casing Mounted Electronics

    NASA Technical Reports Server (NTRS)

    Kratz, Jonathan L.; Culley, Dennis E.; Chapman, Jeffryes W.

    2017-01-01

    The performance of propulsion engine systems is sensitive to weight and volume considerations. This can severely constrain the configuration and complexity of the control system hardware. Distributed Engine Control technology is a response to these concerns by providing more flexibility in designing the control system, and by extension, more functionality leading to higher performing engine systems. Consequently, there can be a weight benefit to mounting modular electronic hardware on the engine core casing in a high temperature environment. This paper attempts to quantify the in-flight temperature constraints for engine casing mounted electronics. In addition, an attempt is made at studying heat soak-back effects. The Commercial Modular Aero Propulsion System Simulation 40k (C-MAPSS40k) software is leveraged with real flight data as the inputs to the simulation. A two-dimensional (2-D) heat transfer model is integrated with the engine simulation to approximate the temperature along the length of the engine casing. This modification to the existing C-MAPSS40k software will provide tools and methodologies to develop a better understanding of the requirements for the embedded electronics hardware in future engine systems. Results of the simulations are presented and their implications for temperature constraints on casing-mounted electronics are discussed.
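
    The 2-D casing heat-transfer idea can be sketched with a few lines of explicit finite differences. The grid, material properties, and boundary temperatures below are illustrative assumptions for a marching-in-time conduction solve, not the C-MAPSS40k-integrated model itself.

```python
# Explicit 2-D heat-conduction sketch on an axial/radial slice of a casing,
# driven by fixed hot-side and ambient-side boundary temperatures. Values
# are illustrative; axial end columns are simply held at the initial value.
import numpy as np

alpha = 1e-5   # assumed thermal diffusivity, m^2/s
dx = 0.01      # 1 cm grid spacing (radial and axial)
dt = 0.2       # s; respects the explicit stability limit dt <= dx^2/(4*alpha)

T = np.full((20, 100), 300.0)       # radial x axial nodes, K
for _ in range(1000):
    T[0, :] = 800.0                 # gas-path side boundary (engine-driven)
    T[-1, :] = 350.0                # nacelle/ambient side boundary
    Tn = T.copy()
    Tn[1:-1, 1:-1] = T[1:-1, 1:-1] + alpha * dt / dx**2 * (
        T[2:, 1:-1] + T[:-2, 1:-1] + T[1:-1, 2:] + T[1:-1, :-2]
        - 4.0 * T[1:-1, 1:-1])
    T = Tn

print(T[10, 50])   # mid-casing temperature after the transient
```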

  5. Approximation of Engine Casing Temperature Constraints for Casing Mounted Electronics

    NASA Technical Reports Server (NTRS)

    Kratz, Jonathan; Culley, Dennis; Chapman, Jeffryes

    2016-01-01

    The performance of propulsion engine systems is sensitive to weight and volume considerations. This can severely constrain the configuration and complexity of the control system hardware. Distributed Engine Control technology is a response to these concerns by providing more flexibility in designing the control system, and by extension, more functionality leading to higher performing engine systems. Consequently, there can be a weight benefit to mounting modular electronic hardware on the engine core casing in a high temperature environment. This paper attempts to quantify the in-flight temperature constraints for engine casing mounted electronics. In addition, an attempt is made at studying heat soak-back effects. The Commercial Modular Aero Propulsion System Simulation 40k (C-MAPSS40k) software is leveraged with real flight data as the inputs to the simulation. A two-dimensional (2-D) heat transfer model is integrated with the engine simulation to approximate the temperature along the length of the engine casing. This modification to the existing C-MAPSS40k software will provide tools and methodologies to develop a better understanding of the requirements for the embedded electronics hardware in future engine systems. Results of the simulations are presented and their implications for temperature constraints on casing-mounted electronics are discussed.

  6. Cloud computing approaches to accelerate drug discovery value chain.

    PubMed

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system-level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need in turn poses challenges to computer scientists to offer matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good-to-have' tool for researchers, providing significant flexibility and allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, the Discovery-Cloud would be best placed to manage drug discovery and clinical development data, generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  7. [Working conditions and common mental disorders among primary health care workers from Botucatu, São Paulo State].

    PubMed

    Braga, Ludmila Candida de; Carvalho, Lidia Raquel de; Binder, Maria Cecília Pereira

    2010-06-01

    Common mental disorders (CMD) are highly prevalent among general populations and workers, with important individual and social consequences. This cross-sectional, descriptive study explores the relationship between psychological job demands, degree of job control, job support, and the prevalence of CMD among primary health care workers of Botucatu - SP. The data collection was carried out using an anonymous self-administered questionnaire, with emphasis on items relating to the demand-control-support situation and the occurrence of CMD (Self Reporting Questionnaire, SRQ-20). The data were stored using Excel (Office XP 2003), and the statistical analyses were performed in the SAS system. It was found that 42.6% of primary health care workers presented CMD. The observed association - high prevalence of CMD with high-strain jobs (Karasek model) and low prevalence of CMD with low-strain jobs - indicates that, in the studied city, primary health care working conditions are contributing factors to workers' illness. The survey reveals the need for interventions aimed at caring for these workers, improving working conditions, and increasing social support at work.

  8. Lifecycle Prognostics Architecture for Selected High-Cost Active Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    N. Lybeck; B. Pham; M. Tawfik

    There is an extensive body of knowledge, and some commercial products are available, for calculating prognostics, remaining useful life, and damage index parameters. The application of these technologies within the nuclear power community is still in its infancy. Online monitoring and condition-based maintenance are seeing increasing acceptance and deployment, and these activities provide the technological bases for expanding to add predictive/prognostics capabilities. In looking to deploy prognostics, three key aspects of systems are presented and discussed: (1) component/system/structure selection, (2) prognostic algorithms, and (3) prognostics architectures. Criteria are presented for component selection: feasibility, failure probability, consequences of failure, and benefits of the prognostics and health management (PHM) system. The basis and methods commonly used for prognostics algorithms are reviewed and summarized. Criteria for evaluating PHM architectures are presented: open, modular architecture; platform independence; graphical user interface for system development and/or results viewing; web-enabled tools; scalability; and standards compatibility. Thirteen software products were identified and discussed in the context of being potentially useful for deployment in a PHM program applied to systems in a nuclear power plant (NPP). These products were evaluated by using information available from company websites, product brochures, fact sheets, scholarly publications, and direct communication with vendors. The thirteen products were classified into four groups of software: (1) research tools, (2) PHM system development tools, (3) deployable architectures, and (4) peripheral tools. Eight software tools fell into the deployable architectures category. Of those eight, only two employ all six modules of a full PHM system. Five systems did not offer prognostic estimates, and one system employed the full health monitoring suite but lacked operations and maintenance support. Each product is briefly described in Appendix A. Selection of the most appropriate software package for a particular application will depend on the chosen component, system, or structure. Ongoing research will determine the most appropriate choices for a successful demonstration of PHM systems in aging NPPs.

  9. The use of emulator-based simulators for on-board software maintenance

    NASA Astrophysics Data System (ADS)

    Irvine, M. M.; Dartnell, A.

    2002-07-01

    Traditionally, onboard software maintenance activities within the space sector are performed using hardware-based facilities. These facilities are developed around the use of hardware emulation or breadboards containing target processors. Some sort of environment is provided around the hardware to support the maintenance activities. However, these environments are not easy to use when setting up the required test scenarios, particularly when the onboard software executes in a dynamic I/O environment, e.g. attitude control software or data handling software. In addition, the hardware and/or environment may not support the test setup required during investigations into software anomalies (e.g. raising a spurious interrupt, failing memory, etc.), and the overall "visibility" of the executing software may be limited. The Software Maintenance Simulator (SOMSIM) is a tool that can support the traditional maintenance facilities. Some of the main benefits that SOMSIM can provide are: a low-cost, flexible extension to an existing product - an operational simulator containing a software processor emulator; a system-level, high-fidelity test bed in which the software "executes"; a high degree of control/configuration over the entire "system", including contingency conditions perhaps not possible with real hardware; and high visibility and control over execution of the emulated software. This paper describes the SOMSIM concept in more detail, and also describes the SOMSIM study being carried out for ESA/ESOC by VEGA IT GmbH.

  10. A study of malware detection on smart mobile devices

    NASA Astrophysics Data System (ADS)

    Yu, Wei; Zhang, Hanlin; Xu, Guobin

    2013-05-01

    The growing use of smart mobile devices for everyday applications has stimulated the spread of mobile malware, especially on popular mobile platforms. As a consequence, malware detection becomes ever more critical in sustaining the mobile market and providing a better user experience. In this paper, we review existing malware and detection schemes. Using real-world malware samples with known signatures, we evaluate four popular commercial anti-virus tools, and our data shows that these tools can achieve high detection accuracy. To deal with new malware with unknown signatures, we study anomaly-based detection using a decision tree algorithm. We evaluate the effectiveness of our detection scheme using malware and legitimate software samples. Our data shows that the detection scheme using a decision tree can achieve a detection rate of up to 90% and a false positive rate as low as 10%.
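
    The anomaly-based scheme in the abstract can be sketched with a standard decision tree classifier. The features below (permission count, network and SMS call rates) are synthetic placeholders, since the abstract does not specify the paper's feature set; scikit-learn stands in for whatever toolchain the authors used.

```python
# Sketch of decision-tree malware detection: train on feature vectors from
# known-malicious and legitimate apps, then classify unseen samples.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n = 400
# Hypothetical features: [num_permissions, net_calls/s, sms_calls/s]
benign = rng.normal([5, 1.0, 0.0], [2, 0.5, 0.1], (n, 3))
malware = rng.normal([12, 4.0, 2.0], [3, 1.5, 1.0], (n, 3))
X = np.vstack([benign, malware])
y = np.array([0] * n + [1] * n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=5).fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```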

  11. Multiphase and multiscale approaches for modelling the injection of textured moulds

    NASA Astrophysics Data System (ADS)

    Nakhoul, Rebecca; Laure, Patrice; Silva, Luisa; Vincent, Michel

    2016-10-01

    Micro-injection moulding is frequently used for the mass production of devices in micro-medical technologies, micro-optics and micro-mechanics. This work focuses mainly on offering numerical tools to model the injection of micro-textured moulds. Such tools can predict the different filling scenarios of the micro-details and consequently suggest optimal operating conditions (mould and melt temperatures, melt flow, stresses, etc.) for analysing the final part quality. To do so, a fully Eulerian approach is used to model the injection of textured moulds at both the macroscopic and microscopic scales, because standard industrial software cannot handle the filling of micro-details. Since heat transfer with the mould is very significant due to high cooling rates, the coupling between micro- and macro-simulations is essential to ensure a complete and accurate representation of textured mould injection.

  12. Reliability of Fault Tolerant Control Systems. Part 1

    NASA Technical Reports Server (NTRS)

    Wu, N. Eva

    2001-01-01

    This paper reports Part I of a two-part effort intended to delineate the relationship between reliability and fault-tolerant control in a quantitative manner. Reliability analysis of fault-tolerant control systems is performed using Markov models. Reliability properties peculiar to fault-tolerant control systems are emphasized. As a consequence, coverage of failures through redundancy management can be severely limited. It is shown that in the early life of a system composed of highly reliable subsystems, the reliability of the overall system is affine with respect to coverage, and inadequate coverage induces dominant single-point failures. The utility of some existing software tools for assessing the reliability of fault-tolerant control systems is also discussed. Coverage modeling is attempted in Part II in a way that captures its dependence on the control performance and on the diagnostic resolution.
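
    The affine-in-coverage behavior has a standard textbook illustration for a duplex system (two units with failure rate λ, a first failure successfully covered with probability c); this is a generic Markov-model result, not an equation quoted from the paper:

```latex
\[
R(t) = e^{-2\lambda t} + 2c\left(e^{-\lambda t}-e^{-2\lambda t}\right)
\approx 1 - 2(1-c)\,\lambda t \quad (\lambda t \ll 1),
\]
```

    so early-life reliability is affine in c, and imperfect coverage (c < 1) behaves like a dominant single-point failure with effective rate 2(1-c)λ.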

  13. Generating Code Review Documentation for Auto-Generated Mission-Critical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2009-01-01

    Model-based design and automated code generation are increasingly used at NASA to produce actual flight code, particularly in the Guidance, Navigation, and Control domain. However, since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently auto-generated code still needs to be fully tested and certified. We have thus developed AUTOCERT, a generator-independent plug-in that supports the certification of auto-generated code. AUTOCERT takes a set of mission safety requirements, and formally verifies that the autogenerated code satisfies these requirements. It generates a natural language report that explains why and how the code complies with the specified requirements. The report is hyper-linked to both the program and the verification conditions and thus provides a high-level structured argument containing tracing information for use in code reviews.

  14. Surface Temperature Data Analysis

    NASA Technical Reports Server (NTRS)

    Hansen, James; Ruedy, Reto

    2012-01-01

    Small global mean temperature changes may have significant to disastrous consequences for the Earth's climate if they persist for an extended period. Obtaining global means from local weather reports is hampered by the uneven spatial distribution of the reliably reporting weather stations. Methods had to be developed that minimize as far as possible the impact of that situation. This software is a method of combining temperature data from individual stations to obtain a global mean trend, overcoming or estimating the uncertainty introduced by the spatial and temporal gaps in the available data. Useful estimates were obtained by introducing a special grid that subdivides the Earth's surface into 8,000 equal-area boxes, using the existing data to create virtual stations at the center of each of these boxes, and combining temperature anomalies (after assessing the radius of high correlation) rather than temperatures.
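
    The equal-area-box averaging can be sketched compactly: stations are binned into boxes of equal area (equal steps in sin(latitude) crossed with equal longitude splits gives 40 x 200 = 8,000 boxes), each box becomes one virtual station, and the global mean is the mean over boxes. This Python toy keeps the gridding idea but skips the correlation-radius weighting described above, and the banding scheme is a simplification, not the software's actual grid.

```python
# Toy equal-area gridding: bands equal in sin(latitude) crossed with equal
# longitude splits give equal-area boxes, so a plain mean over occupied boxes
# is area-fair and dense station clusters count only once per box.
import numpy as np

def box_index(lat_deg, lon_deg, n_bands=40, boxes_per_band=200):
    """Map a station to an (approximately) equal-area box id."""
    band = int((np.sin(np.radians(lat_deg)) + 1) / 2 * n_bands)
    band = min(band, n_bands - 1)
    lon_box = int((lon_deg % 360) / 360 * boxes_per_band)
    return band * boxes_per_band + lon_box

def global_mean_anomaly(stations):
    """stations: list of (lat, lon, anomaly) tuples."""
    boxes = {}
    for lat, lon, anom in stations:
        boxes.setdefault(box_index(lat, lon), []).append(anom)
    virtual = [np.mean(v) for v in boxes.values()]   # one virtual station/box
    return float(np.mean(virtual))

stations = [(48.9, 2.3, 0.8), (49.0, 2.5, 1.0), (-75.0, 0.0, 0.1)]
print(global_mean_anomaly(stations))  # Paris pair counts once, Antarctica once
```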

  15. Using social media to facilitate knowledge transfer in complex engineering environments: a primer for educators

    NASA Astrophysics Data System (ADS)

    Murphy, Glen; Salomone, Sonia

    2013-03-01

    While highly cohesive groups are potentially advantageous they are also often correlated with the emergence of knowledge and information silos based around those same functional or occupational clusters. Consequently, an essential challenge for engineering organisations wishing to overcome informational silos is to implement mechanisms that facilitate, encourage and sustain interactions between otherwise disconnected groups. This paper acts as a primer for those seeking to gain an understanding of the design, functionality and utility of a suite of software tools generically termed social media technologies in the context of optimising the management of tacit engineering knowledge. Underpinned by knowledge management theory and using detailed case examples, this paper explores how social media technologies achieve such goals, allowing for the transfer of knowledge by tapping into the tacit and explicit knowledge of disparate groups in complex engineering environments.

  16. Use of computers in dysmorphology.

    PubMed Central

    Diliberti, J H

    1988-01-01

    As a consequence of the increasing power and decreasing cost of digital computers, dysmorphologists have begun to explore a wide variety of computerised applications in clinical genetics. Of considerable interest are developments in the areas of syndrome databases, expert systems, literature searches, image processing, and pattern recognition. Each of these areas is reviewed from the perspective of the underlying computer principles, existing applications, and the potential for future developments. Particular emphasis is placed on the analysis of the tasks performed by the dysmorphologist and the design of appropriate tools to facilitate these tasks. In this context the computer and associated software are considered paradigmatically as tools for the dysmorphologist and should be designed accordingly. Continuing improvements in the ability of computers to manipulate vast amounts of data rapidly make the development of increasingly powerful tools for the dysmorphologist highly probable. PMID:3050092

  17. Tethys: A Platform for Water Resources Modeling and Decision Support Apps

    NASA Astrophysics Data System (ADS)

    Swain, N. R.; Christensen, S. D.; Jones, N.; Nelson, E. J.

    2014-12-01

    Cloud-based applications or apps are a promising medium through which water resources models and data can be conveyed in a user-friendly environment—making them more accessible to decision-makers and stakeholders. In the context of this work, a water resources web app is a web application that exposes limited modeling functionality for a scenario exploration activity in a structured workflow (e.g., land use change runoff analysis, snowmelt runoff prediction, and flood potential analysis). The technical expertise required to develop water resources web apps can be a barrier to many potential developers of water resources apps. One challenge that developers face is providing spatial storage, analysis, and visualization for the spatial data that is inherent to water resources models. The software projects that provide this functionality are non-standard to web development and there are a large number of free and open source software (FOSS) projects to choose from. In addition, it is often required to synthesize several software projects to provide all of the needed functionality. Another challenge for the developer will be orchestrating the use of several software components. Consequently, the initial software development investment required to deploy an effective water resources cloud-based application can be substantial. The Tethys Platform has been developed to lower the technical barrier and minimize the initial development investment that prohibits many scientists and engineers from making use of the web app medium. Tethys synthesizes several software projects including PostGIS for spatial storage, 52°North WPS for spatial analysis, GeoServer for spatial publishing, Google Earth™, Google Maps™ and OpenLayers for spatial visualization, and Highcharts for plotting tabular data. The software selection came after a literature review of software projects being used to create existing earth sciences web apps. All of the software is linked via a Python-powered software development kit (SDK). Tethys developers use the SDK to build their apps and incorporate the needed functionality from the software suite. The presentation will include several apps that have been developed using Tethys to demonstrate its capabilities. Based upon work supported by the National Science Foundation under Grant No. 1135483.

  18. Functional description of the ISIS system

    NASA Technical Reports Server (NTRS)

    Berman, W. J.

    1979-01-01

    Development of software for avionic and aerospace applications (flight software) is influenced by a unique combination of factors which includes: (1) the length of the life cycle of each project; (2) the necessity for cooperation between the aerospace industry and NASA; (3) the need for flight software that is highly reliable; (4) the increasing complexity and size of flight software; and (5) the high quality of the programmers and the tightening of project budgets. The interactive software invocation system (ISIS) described here is designed to overcome the problems created by this combination of factors.

  19. The Legacy of Space Shuttle Flight Software

    NASA Technical Reports Server (NTRS)

    Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.

    2011-01-01

    The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.

  20. A Construct for Describing Software Development Risks

    DTIC Science & Technology

    1994-07-01

    consequences during any risk identification process. It is more important and expedient to capture the conditions since the basic statement of conse... chain of causal events. The contrast between the exploration of conditions rather than consequences in risk identification is similar to that of... extent that a general condition has a multiplicity of individual characteristics. In the CTC representation, these distinct risks can share common...

  1. Recommendations and settings of word prediction software by health-related professionals for patients with spinal cord injury: a prospective observational study.

    PubMed

    Pouplin, Samuel; Roche, Nicolas; Hugeron, Caroline; Vaugier, Isabelle; Bensmail, Djamel

    2016-02-01

    For people with cervical spinal cord injury (SCI), access to computers can be difficult, thus several devices have been developed to facilitate their use. However, text input speed remains very slow compared to users who do not have a disability, even with these devices. Several methods have been developed to increase text input speed, such as word prediction software (WPS). Health-related professionals (HRP) often recommend this type of software to people with cervical SCI. WPS can be customized using different settings, and the settings used will likely influence the effectiveness of the software on text input speed. However, there is currently a lack of literature regarding professional practices for the setting of WPS as well as the impact for users. The aim was to analyze word prediction software settings used by HRP for people with cervical SCI. Prospective observational study. Garches, France; health-related professionals who recommend word prediction software. A questionnaire was submitted to HRP who advise tetraplegic people regarding the use of communication devices. A total of 93 professionals responded to the survey. The most frequently recommended software was Skippy, a commercially available software package. HRP rated the possibility to customise the settings as highly important. Moreover, they rated some settings as more important than others (P<0.001). However, except for the number of words displayed, each setting was configured by less than 50% of HRP. The results showed that there was a difference between the perception of the importance of some settings and data in the literature regarding the optimization of settings. Moreover, although some parameters were considered very important, they were rarely specifically configured. Confidence in default settings and lack of information regarding optimal settings seem to be the main reasons for this discordance. This could also explain the disparate results of studies which evaluated the impact of WPS on text input speed in people with cervical SCI. Professionals tend to have confidence in default settings, despite the fact that they are not always appropriate for users. It thus seems essential to develop information networks and training to disseminate the results of studies and consequently, possibly, improve communication for people with cervical SCI who use such devices.

  2. Software engineering as an engineering discipline

    NASA Technical Reports Server (NTRS)

    Gibbs, Norman

    1988-01-01

    The goals of the Software Engineering Institute's Education Program are as follows: to increase the number of highly qualified software engineers--new software engineers and existing practitioners; and to be the leading center of expertise for software engineering education and training. A discussion of these goals is presented in vugraph form.

  3. Understanding Acceptance of Software Metrics--A Developer Perspective

    ERIC Educational Resources Information Center

    Umarji, Medha

    2009-01-01

    Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…

  4. Have More Fun Teaching Physics: Simulating, Stimulating Software.

    ERIC Educational Resources Information Center

    Jenkins, Doug

    1996-01-01

    High school physics offers opportunities to use problem solving and lab practices as well as cement skills in research, technical writing, and software applications. Describes and evaluates computer software enhancing the high school physics curriculum including spreadsheets for laboratory data, all-in-one simulators, projectile motion simulators,…

  5. A Stochastic Spiking Neural Network for Virtual Screening.

    PubMed

    Morro, A; Canals, V; Oliver, A; Alomar, M L; Galan-Prado, F; Ballester, P J; Rossello, J L

    2018-04-01

    Virtual screening (VS) has become a key computational tool in early drug design, and screening performance is of high relevance due to the large volume of data that must be processed to identify molecules with the sought activity-related pattern. At the same time, hardware implementations of spiking neural networks (SNNs) are emerging as a computing technique that can be applied to parallelize processes that normally present a high cost in terms of computing time and power. Consequently, SNNs represent an attractive alternative for performing time-consuming processing tasks such as VS. In this brief, we present a smart stochastic spiking neural architecture that implements the ultrafast shape recognition (USR) algorithm, achieving a two-order-of-magnitude speed improvement with respect to USR software implementations. The neural system is implemented in hardware using field-programmable gate arrays, allowing a highly parallelized USR implementation. The results show that, due to the high parallelization of the system, millions of compounds can be checked in reasonable times. From these results, we can state that the proposed architecture is a feasible methodology for efficiently enhancing time-consuming data-mining processes such as 3-D molecular similarity search.
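
    USR itself is a published algorithm and small enough to sketch: distances from four reference points are summarized by three moments each, and similarity is the inverse of one plus the mean absolute difference of the resulting 12 descriptors. Moment conventions vary slightly between implementations; the version below is one common formulation, not the paper's hardware encoding.

```python
# Ultrafast shape recognition (USR) descriptor: reference points are the
# centroid, the atom closest to it, the atom farthest from it, and the atom
# farthest from that one; each contributes three distance moments.
import numpy as np

def usr_descriptor(coords):
    """coords: (n_atoms, 3) array of atomic positions."""
    ctd = coords.mean(axis=0)                      # centroid
    d_ctd = np.linalg.norm(coords - ctd, axis=1)
    cst = coords[np.argmin(d_ctd)]                 # closest to centroid
    fct = coords[np.argmax(d_ctd)]                 # farthest from centroid
    d_fct = np.linalg.norm(coords - fct, axis=1)
    ftf = coords[np.argmax(d_fct)]                 # farthest from fct
    desc = []
    for ref in (ctd, cst, fct, ftf):
        d = np.linalg.norm(coords - ref, axis=1)
        mu = d.mean()
        var = ((d - mu) ** 2).mean()
        skew = ((d - mu) ** 3).mean()
        desc += [mu, var, np.cbrt(skew)]           # three moments per point
    return np.array(desc)                          # 12-number shape signature

def usr_similarity(a, b):
    return 1.0 / (1.0 + np.abs(a - b).mean())      # in (0, 1]

mol1 = np.random.default_rng(0).normal(size=(20, 3))
mol2 = mol1 + 0.05                                 # rigid translation
print(usr_similarity(usr_descriptor(mol1), usr_descriptor(mol2)))  # 1.0
```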

  6. Programming Language Software For Graphics Applications

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. A high-level programming language and interactive environment with access to graphical hardware and software is created by adding graphical commands and other constructs to the standardized, general-purpose programming language "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides an alternative to programming entire applications in C or FORTRAN, specifically easing the design and implementation of the complex control and data structures that typify applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  7. Using MATLAB Software on the Peregrine System | High-Performance Computing

    Science.gov Websites

    Learn how to use MATLAB software on the Peregrine system, including running MATLAB in batch mode and understanding the available MATLAB software versions and licenses.

  8. Experience with Ada on the F-18 High Alpha Research Vehicle Flight Test Program

    NASA Technical Reports Server (NTRS)

    Regenie, Victoria A.; Earls, Michael; Le, Jeanette; Thomson, Michael

    1992-01-01

    Considerable experience was acquired with Ada at the NASA Dryden Flight Research Facility during the on-going High Alpha Technology Program. In this program, an F-18 aircraft was highly modified by the addition of thrust-vectoring vanes to the airframe. In addition, substantial alteration was made in the original quadruplex flight control system. The result is the High Alpha Research Vehicle. An additional research flight control computer was incorporated in each of the four channels. Software for the research flight control computer was written in Ada. To date, six releases of this software have been flown. This paper provides a detailed description of the modifications to the research flight control system. Efficient ground-testing of the software was accomplished by using simulations that used Ada for portions of their software. These simulations are also described. Modifying and transferring the Ada flight software to the software simulation configuration has allowed evaluation of this language. This paper also discusses such significant issues in using Ada as portability, modifiability, and testability, as well as documentation requirements.

  9. Experience with Ada on the F-18 High Alpha Research Vehicle flight test program

    NASA Technical Reports Server (NTRS)

    Regenie, Victoria A.; Earls, Michael; Le, Jeanette; Thomson, Michael

    1994-01-01

    Considerable experience has been acquired with Ada at the NASA Dryden Flight Research Facility during the on-going High Alpha Technology Program. In this program, an F-18 aircraft has been highly modified by the addition of thrust-vectoring vanes to the airframe. In addition, substantial alteration was made in the original quadruplex flight control system. The result is the High Alpha Research Vehicle. An additional research flight control computer was incorporated in each of the four channels. Software for the research flight control computer was written in Ada. To date, six releases of this software have been flown. This paper provides a detailed description of the modifications to the research flight control system. Efficient ground-testing of the software was accomplished by using simulations that used Ada for portions of their software. These simulations are also described. Modifying and transferring the Ada flight software to the software simulation configuration has allowed evaluation of this language. This paper also discusses such significant issues in using Ada as portability, modifiability, and testability, as well as documentation requirements.

  10. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure-free operation of a computer program for a specified time and environment.
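
    The definition in the last sentence has a standard formal counterpart (textbook reliability theory, not specific to this paper): under a constant failure rate λ,

```latex
\[
R(t) = P(\text{no failure in } [0,t]) = e^{-\lambda t},
\qquad \mathrm{MTTF} = \int_0^\infty R(t)\,dt = \frac{1}{\lambda}.
\]
```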

  11. Partnerships for progress at the U.S. Geological Survey

    USGS Publications Warehouse

    ,

    2005-01-01

    This is about opportunity for the private sector. It is about combining the research capabilities of Government scientists with the commercial development potential of private companies. It is, consequently, about partnerships leading to products and services that enhance the quality of life and strengthen the American economy. An accompanying image shows a computer screen capture of Washington, DC, in the RevPG software used for map revision.

  12. Identifying Acquisition Patterns of Failure Using Systems Archetypes

    DTIC Science & Technology

    2008-04-02

    For any but the smallest programs, complete path coverage for defect detection is impractical. Adapted from Pressman, R.S., Software Engineering: A... "Firefighting" concept from "Past the Tipping Point"... [Causal-loop diagram residue: the "Fixes That Fail" systems archetype, linking a Fix to the Problem Symptom in a balancing (B) loop and to Unintended Consequences in a reinforcing (R) loop.]

  13. CubeSat mission design software tool for risk estimating relationships

    NASA Astrophysics Data System (ADS)

    Gamble, Katharine Brumbaugh; Lightsey, E. Glenn

    2014-09-01

    In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine which mission risk root causes historically present the highest risk for their mission. Historical data were collected from the CubeSat community and analyzed to provide a statistical background for characterizing these Risk Estimating Relationships (RERs). This paper develops and validates a mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best-fit relationship between root-cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks, which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks but, more importantly, the scientific representation of Risk Estimating Relationships.
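
    The RER idea parallels cost estimating relationships closely enough to sketch: regress a risk quantity against mission characteristics. The power-law form, the two predictors, and the synthetic data below are illustrative assumptions, not the paper's fitted model or its general-error-regression machinery.

```python
# Hedged RER sketch: fit likelihood = a * mass^b * dev^c by least squares in
# log space, recovering the exponents from synthetic "historical" missions.
import numpy as np

rng = np.random.default_rng(7)
mass_kg = rng.uniform(1, 12, 60)            # e.g., 1U-6U class CubeSats
dev_months = rng.uniform(6, 36, 60)
likelihood = 2.0 * mass_kg**0.3 * dev_months**-0.2 * rng.lognormal(0, 0.1, 60)

A = np.column_stack([np.ones(60), np.log(mass_kg), np.log(dev_months)])
coef, *_ = np.linalg.lstsq(A, np.log(likelihood), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"likelihood ~ {a:.2f} * mass^{b:.2f} * dev^{c:.2f}")
```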

  14. Mathematical Understanding of the Underprivileged Students through GeoGebra

    NASA Astrophysics Data System (ADS)

    Amam, A.; Fatimah, A. T.; Hartono, W.; Effendi, A.

    2017-09-01

    Mathematical understanding among high school students from poor families in the district of Ciamis is still low. After reviewing the literature and earlier research, the researchers are convinced that learning mathematics with GeoGebra can help students reach a better understanding. Our long-term goal for this research is to support the implementation of the new curriculum, namely ICT-based mathematics learning. Another goal is to give students from underprivileged families basic mastery of mathematics software. The specific objective of this study is to examine the mathematical understanding of students from underprivileged families after the implementation of learning with GeoGebra. We use a quantitative comparative research method to determine differences in the mathematical understanding of students from underprivileged families before and after mathematics learning with GeoGebra. The population of this study is senior high school students from underprivileged families in Baregbeg, Ciamis district, selected by purposive sampling. The instrument is a test of mathematical understanding. The results show that the mathematical understanding of students from underprivileged families is better after mathematics learning with GeoGebra than before. The novelty of this research is that students learn the trigonometry material through modules, aided by GeoGebra in learning activities, and this has an impact on improving students' mathematical understanding. Students also master the use of the GeoGebra software. Implementing these two things will be very useful for subsequent lessons.

  15. Measures of phylogenetic differentiation provide robust and complementary insights into microbial communities.

    PubMed

    Parks, Donovan H; Beiko, Robert G

    2013-01-01

    High-throughput sequencing techniques have made large-scale spatial and temporal surveys of microbial communities routine. Gaining insight into microbial diversity requires methods for effectively analyzing and visualizing these extensive data sets. Phylogenetic β-diversity measures address this challenge by allowing the relationship between large numbers of environmental samples to be explored using standard multivariate analysis techniques. Despite the success and widespread use of phylogenetic β-diversity measures, an extensive comparative analysis of these measures has not been performed. Here, we compare 39 measures of phylogenetic β diversity in order to establish the relative similarity of these measures along with key properties and performance characteristics. While many measures are highly correlated, those commonly used within microbial ecology were found to be distinct from those popular within classical ecology, and from the recently recommended Gower and Canberra measures. Many of the measures are surprisingly robust to different rootings of the gene tree, the choice of similarity threshold used to define operational taxonomic units, and the presence of outlying basal lineages. Measures differ considerably in their sensitivity to rare organisms, and the effectiveness of measures can vary substantially under alternative models of differentiation. Consequently, the depth of sequencing required to reveal underlying patterns of relationships between environmental samples depends on the selected measure. Our results demonstrate that using complementary measures of phylogenetic β diversity can further our understanding of how communities are phylogenetically differentiated. Open-source software implementing the phylogenetic β-diversity measures evaluated in this manuscript is available at http://kiwi.cs.dal.ca/Software/ExpressBetaDiversity.
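
    To make "phylogenetic beta diversity" concrete, here is a sketch of one widely used measure, unweighted UniFrac: the fraction of tree branch length leading exclusively to taxa observed in only one of two samples. Encoding the tree as (branch length, leaf set) pairs is a simplification for brevity; the paper's evaluation covers 39 such measures via the linked ExpressBetaDiversity software.

```python
# Unweighted UniFrac sketch: branches whose descendant taxa appear in only
# one of the two samples contribute "unique" branch length.
def unifrac(branches, sample_a, sample_b):
    """branches: list of (length, frozenset_of_leaf_names)."""
    unique = shared_total = 0.0
    for length, leaves in branches:
        in_a = bool(leaves & sample_a)
        in_b = bool(leaves & sample_b)
        if in_a or in_b:
            shared_total += length
            if in_a != in_b:              # branch leads to one sample only
                unique += length
    return unique / shared_total if shared_total else 0.0

# Tiny 4-leaf tree ((w,x),(y,z)) with unit branch lengths.
branches = [(1.0, frozenset({"w"})), (1.0, frozenset({"x"})),
            (1.0, frozenset({"y"})), (1.0, frozenset({"z"})),
            (1.0, frozenset({"w", "x"})), (1.0, frozenset({"y", "z"}))]
print(unifrac(branches, {"w", "x"}, {"y", "z"}))   # 1.0: fully distinct clades
print(unifrac(branches, {"w", "y"}, {"x", "z"}))   # ~0.67: internal branches shared
```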

  16. Development of an Expert System for Representing Procedural Knowledge

    NASA Technical Reports Server (NTRS)

    Georgeff, Michael P.; Lansky, Amy L.

    1985-01-01

    A high level of automation is of paramount importance in most space operations. It is critical for unmanned missions and greatly increases the effectiveness of manned missions. However, although many functions can be automated by using advanced engineering techniques, others require complex reasoning, sensing, and manipulatory capabilities that go beyond this technology. Automation of fault diagnosis and malfunction handling is a case in point. The military have long been interested in this problem, and have developed automatic test equipment to aid in the maintenance of complex military hardware. These systems are all based on conventional software and engineering techniques. However, the effectiveness of such test equipment is severely limited. The equipment is inflexible and unresponsive to the skill level of the technicians using it. The diagnostic procedures cannot be matched to the exigencies of the current situation nor can they cope with reconfiguration or modification of the items under test. The diagnosis cannot be guided by useful advice from technicians and, when a fault cannot be isolated, no explanation is given as to the cause of failure. Because these systems perform a prescribed sequence of tests, they cannot utilize knowledge of a particular situation to focus attention on more likely trouble spots. Consequently, real-time performance is highly unsatisfactory. Furthermore, the cost of developing test software is substantial and time to maturation is excessive. Significant advances in artificial intelligence (AI) have recently led to the development of powerful and flexible reasoning systems, known as expert or knowledge-based systems. We have devised a powerful and theoretically sound scheme for representing and reasoning about procedural knowledge.

  17. Factors That Affect Software Testability

    NASA Technical Reports Server (NTRS)

    Voas, Jeffrey M.

    1991-01-01

    Software faults that infrequently affect software's output are dangerous. When a software fault causes frequent software failures, testing is likely to reveal the fault before the software is released; when the fault remains undetected during testing, it can cause disaster after the software is installed. A technique for predicting whether a particular piece of software is likely to reveal faults within itself during testing is found in [Voas91b]. A piece of software that is likely to reveal faults within itself during testing is said to have high testability. A piece of software that is not likely to reveal faults within itself during testing is said to have low testability. It is preferable to design software with higher testability from the outset, i.e., to create software with as high a degree of testability as possible, to avoid the problems of undetected faults that are associated with low testability. Information loss is a phenomenon that occurs during program execution and that increases the likelihood that a fault will remain undetected. In this paper, I identify two broad classes of information loss, define them, and suggest ways of predicting the potential for information loss to occur, in order to decrease the likelihood that faults will remain undetected during testing.
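
    As a hypothetical illustration of information loss (the example and names are invented here, not drawn from the paper), consider a function whose internal computation is collapsed by a comparison, so that a fault in the intermediate value changes the observable output for only a single input:

    ```python
    # The comparison discards the exact intermediate score, masking the
    # off-by-one fault for almost every input -- i.e., low testability.

    def is_adult(age):
        score = age + 1      # BUG: should be simply `age`
        return score >= 18   # information loss: exact score is discarded

    # Only age == 17 exposes the fault (True instead of False):
    for age in (10, 17, 30):
        print(age, is_adult(age))
    ```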

  18. Software Tools for Development on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Learn about software tools for developing and managing software at the source code level on the Peregrine system. The "Cross-Platform Make" (CMake) package is from Kitware, and SCons is a modern software build tool based on Python.
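
    Since SCons build scripts are ordinary Python programs, a minimal build description might look like the following sketch (illustrative; not taken from the NREL documentation):

    ```python
    # SConstruct -- minimal SCons build script. `Environment` and `Program`
    # are provided by SCons when this file is run via the `scons` command.
    env = Environment(CC="gcc", CCFLAGS=["-O2", "-Wall"])

    # Declare an executable target; SCons derives the dependency graph
    # and rebuilds only what has changed.
    env.Program(target="hello", source=["hello.c"])
    ```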

  19. HSCT4.0 Application: Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Salas, A. O.; Walsh, J. L.; Mason, B. H.; Weston, R. P.; Townsend, J. C.; Samareh, J. A.; Green, L. L.

    2001-01-01

    The software requirements for the High Performance Computing and Communication Program High Speed Civil Transport application project, referred to as HSCT4.0, are described. The objective of the HSCT4.0 application project is to demonstrate the application of high-performance computing techniques to the problem of multidisciplinary design optimization of a supersonic transport configuration, using high-fidelity analysis simulations. Descriptions of the various functions (and the relationships among them) that make up the multidisciplinary application, as well as the constraints on the software design, are provided. This document serves to establish an agreement between the suppliers and the customer as to what the HSCT4.0 application should do and provides to the software developers the information necessary to design and implement the system.

  20. Educational interactive multimedia software: The impact of interactivity on learning

    NASA Astrophysics Data System (ADS)

    Reamon, Derek Trent

    This dissertation discusses the design, development, deployment and testing of two versions of educational interactive multimedia software. Both versions of the software are focused on teaching mechanical engineering undergraduates about the fundamentals of direct-current (DC) motor physics and selection. The two versions of the Motor Workshop software cover the same basic materials on motors, but differ in the level of interactivity between the students and the software. Here, the level of interactivity refers to the particular role of the computer in the interaction between the user and the software. In one version, the students navigate through information that is organized by topic, reading text and viewing embedded video clips; this is referred to as "low-level interactivity" software because the computer simply presents the content. In the other version, the students are given a task to accomplish---they must design a small motor-driven 'virtual' vehicle that competes against computer-generated opponents. The interaction is guided by the software, which offers advice from 'experts' and provides contextual information; we refer to this as "high-level interactivity" software because the computer actively participates in the interaction. The software was used in two sets of experiments, where students using the low-level interactivity software served as the 'control group,' and students using the highly interactive software were the 'treatment group.' Data, including pre- and post-performance tests, questionnaire responses, learning style characterizations, activity tracking logs and videotapes, were collected for analysis. Statistical and observational research methods were applied to the various data to test the hypothesis that the level of interactivity affects the learning situation, with higher levels of interactivity being more effective for learning. The results show that both the low-level and high-level interactive versions of the software were effective in promoting learning about the subject of motors. The focus of learning varied between users of the two versions, however. The low-level version was more effective for teaching concepts and terminology, while the high-level version seemed to be more effective for teaching engineering applications.

  1. Automated identification of retained surgical items in radiological images

    NASA Astrophysics Data System (ADS)

    Agam, Gady; Gan, Lin; Moric, Mario; Gluncic, Vicko

    2015-03-01

    Retained surgical items (RSIs) in patients are a major operating room (OR) patient safety concern. An RSI is any surgical tool, sponge, needle or other item inadvertently left in a patient's body during the course of surgery. If left undetected, RSIs may lead to serious negative health consequences such as sepsis, internal bleeding, and even death. To help physicians efficiently and effectively detect RSIs, we are developing computer-aided detection (CADe) software for X-ray (XR) image analysis, utilizing large amounts of currently available image data to produce a clinically effective RSI detection system. Physician analysis of XRs for the purpose of RSI detection is a relatively lengthy process that may take up to 45 minutes to complete. It is also error prone due to the relatively low acuity of the human eye for RSIs in XR images. The system we are developing is based on computer vision and machine learning algorithms. We address the problem of low incidence by proposing synthesis algorithms. The CADe software we are developing may be integrated into a picture archiving and communication system (PACS), be implemented as a stand-alone software application, or be integrated into portable XR machine software through application programming interfaces. Preliminary experimental results on actual XR images demonstrate the effectiveness of the proposed approach.
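
    The abstract does not detail the synthesis algorithms; the toy sketch below (all names and values are hypothetical) only illustrates the general idea of creating positive training examples for a rare-object detector by compositing a radio-opaque template into a background image:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def synthesize(xray, template):
        """Paste a bright template into the image at a random location."""
        h, w = template.shape
        y = int(rng.integers(0, xray.shape[0] - h))
        x = int(rng.integers(0, xray.shape[1] - w))
        out = xray.copy()
        patch = out[y:y + h, x:x + w]
        out[y:y + h, x:x + w] = np.maximum(patch, template)  # radio-opaque
        return out, (y, x)  # image plus ground-truth location label

    background = rng.normal(0.3, 0.05, size=(256, 256))  # toy "X-ray"
    needle = np.full((4, 40), 0.9)                        # crude needle shape
    image, label = synthesize(background, needle)
    ```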

  2. ProteoWizard: open source software for rapid proteomics tools development.

    PubMed

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library, written using modern C++ techniques and design principles, contains readers and writers of the mzML data format and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e., GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge at http://proteowizard.sourceforge.net. This website also provides code examples and documentation. It is our hope the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.
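
    Because mzML is an XML dialect, even the Python standard library can enumerate spectra for a quick sanity check, as in this sketch (illustrative only; the ProteoWizard library itself provides validated, full-featured mzML access in C++):

    ```python
    import xml.etree.ElementTree as ET

    NS = "{http://psi.hupo.org/ms/mzml}"  # mzML XML namespace

    def count_spectra(path):
        """Stream through an mzML file and count <spectrum> elements."""
        count = 0
        for _, elem in ET.iterparse(path):
            if elem.tag == NS + "spectrum":
                count += 1
                elem.clear()  # keep memory bounded on large files
        return count

    # count_spectra("example.mzML")  # hypothetical file name
    ```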

  3. Emerging High School Students' Problem Solving Trajectories Based on the Use of Dynamic Software

    ERIC Educational Resources Information Center

    Santos-Trigo, Manuel; Cristobal-Escalante, Cesar

    2008-01-01

    This study documents problem-solving approaches that high school students develop as a result of systematically using Cabri-Geometry software. Results show that the software becomes an important tool for students to construct dynamic representations of the problems that were used to identify and examine different mathematical relations.…

  4. Supporting Early Math--Rationales and Requirements for High Quality Software

    ERIC Educational Resources Information Center

    Haake, Magnus; Husain, Layla; Gulz, Agneta

    2015-01-01

    There is substantial evidence that preschoolers' performance in early math is highly correlated with math performance throughout school, as well as with academic skills in general. One way to help children attain early math skills is by using targeted educational software, and the paper discusses potential gains of using such software to support early math…

  5. Porting and refurbishment of the WSS TNG control software

    NASA Astrophysics Data System (ADS)

    Caproni, Alessandro; Zacchei, Andrea; Vuerli, Claudio; Pucillo, Mauro

    2004-09-01

    The Workstation Software System (WSS) is the high-level control software of the Italian Galileo Galilei Telescope, sited on La Palma in the Canary Islands, developed in the early 1990s for HP-UX workstations. WSS may be seen as a middle-layer software system that manages communication between the real-time systems (VME), different workstations, and high-level applications, providing a uniform distributed environment. The project to port the control software from the HP workstations to the Linux environment started at the end of 2001. It aimed to refurbish the control software by introducing some of the new software technologies and languages available for free in the Linux operating system. The project was realized by gradually substituting each HP workstation with a Linux PC, with the goal of avoiding major changes to the original software running under HP-UX. Three main phases characterized the project: creation of a simulated control room with several Linux PCs running WSS (to check all the functionality); insertion into the simulated control room of some HPs (to check the mixed environment); and substitution of the HP workstations in the real control room. From a software point of view, the project introduces some new technologies, like multi-threading, and the possibility to develop high-level WSS applications in almost any programming language that implements Berkeley sockets. A library to develop Java applications has also been created and tested.
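
    As the abstract notes, any language implementing Berkeley sockets can host a high-level WSS application; a minimal client in Python might look like the sketch below (the host, port, and command are hypothetical, since the actual WSS message protocol is not documented here):

    ```python
    import socket

    HOST, PORT = "wss-host.example", 5000  # hypothetical WSS endpoint

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        sock.sendall(b"STATUS telescope\n")  # hypothetical command
        reply = sock.recv(4096)
        print(reply.decode())
    ```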

  6. Lessons learned in managing crowdsourced data in the Alaskan Arctic.

    NASA Astrophysics Data System (ADS)

    Mastracci, Diana

    2017-04-01

    There is perhaps no place in which the consequences of global climate change can be felt more acutely than the Arctic. However, due to the lack of measurements at high latitudes, validation processes are often problematic. Citizen science projects, co-designed together with Native communities at the interface of traditional knowledge and scientific research, could play a major role in climate change adaptation strategies by advancing knowledge of the Arctic system, strengthening inter-generational bonds, and facilitating improved knowledge transfer. This presentation shares lessons learned from a pilot project in the Alaskan Arctic, in which innovative approaches were used to design climate change adaptation strategies that support young subsistence hunters in taking in-situ measurements while out on the sea ice. Both the socio-cultural and the hardware/software challenges presented here could provide useful guidance for future programs that aim to integrate citizens' data with scientific data in Arctic communities.

  7. Self-Configuration and Self-Optimization Process in Heterogeneous Wireless Networks

    PubMed Central

    Guardalben, Lucas; Villalba, Luis Javier García; Buiati, Fábio; Sobral, João Bosco Mangueira; Camponogara, Eduardo

    2011-01-01

    Self-organization in Wireless Mesh Networks (WMN) is an emergent research area, which is becoming important due to the increasing number of nodes in a network. Consequently, the manual configuration of nodes is either impossible or highly costly, so it is desirable for the nodes to be able to configure themselves. In this paper, we propose an alternative architecture for self-organization of WMN based on the Optimized Link State Routing (OLSR) and Ad hoc On-demand Distance Vector (AODV) routing protocols, as well as on software-agent technology. We argue that the proposed self-optimization and self-configuration modules increase network throughput, reduce transmission delay and network load, and decrease HELLO-message traffic as the network scales. By simulation analysis, we conclude that the self-optimization and self-configuration mechanisms can significantly improve the performance of the OLSR and AODV protocols in comparison to the baseline protocols analyzed. PMID:22346584

  8. Self-configuration and self-optimization process in heterogeneous wireless networks.

    PubMed

    Guardalben, Lucas; Villalba, Luis Javier García; Buiati, Fábio; Sobral, João Bosco Mangueira; Camponogara, Eduardo

    2011-01-01

    Self-organization in Wireless Mesh Networks (WMN) is an emergent research area, which is becoming important due to the increasing number of nodes in a network. Consequently, the manual configuration of nodes is either impossible or highly costly, so it is desirable for the nodes to be able to configure themselves. In this paper, we propose an alternative architecture for self-organization of WMN based on the Optimized Link State Routing (OLSR) and Ad hoc On-demand Distance Vector (AODV) routing protocols, as well as on software-agent technology. We argue that the proposed self-optimization and self-configuration modules increase network throughput, reduce transmission delay and network load, and decrease HELLO-message traffic as the network scales. By simulation analysis, we conclude that the self-optimization and self-configuration mechanisms can significantly improve the performance of the OLSR and AODV protocols in comparison to the baseline protocols analyzed.

  9. Optimizing Excited-State Electronic-Structure Codes for Intel Knights Landing: A Case Study on the BerkeleyGW Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deslippe, Jack; da Jornada, Felipe H.; Vigil-Fowler, Derek

    2016-10-06

    We profile and optimize calculations performed with the BerkeleyGW code on the Xeon-Phi architecture. BerkeleyGW depends both on hand-tuned critical kernels as well as on BLAS and FFT libraries. We describe the optimization process and performance improvements achieved. We discuss a layered parallelization strategy to take advantage of vector, thread and node-level parallelism. We discuss locality changes (including the consequence of the lack of L3 cache) and effective use of the on-package high-bandwidth memory. We show preliminary results on Knights-Landing including a roofline study of code performance before and after a number of optimizations. We find that the GW method is particularly well-suited for many-core architectures due to the ability to exploit a large amount of parallelism over plane-wave components, band-pairs, and frequencies.

  10. In silico Study of the Pharmacologic Properties and Cytotoxicity Pathways in Cancer Cells of Various Indolylquinone Analogues of Perezone.

    PubMed

    Escobedo-González, René; Vargas-Requena, Claudia Lucia; Moyers-Montoya, Edgar; Aceves-Hernández, Juan Manuel; Nicolás-Vázquez, María Inés; Miranda-Ruvalcaba, René

    2017-06-25

    Several indolylquinone analogues of perezone, a natural sesquiterpene quinone, were characterized in this work by theoretical methods. In addition, some physicochemical, toxicological and metabolic properties were predicted using bioinformatics software. The predicted physicochemical properties are in agreement with the solubility and cLogP values, the penetration across the cell membrane, and absorption values, as well as with a possible apoptosis-activated mechanism of cytotoxic action. The toxicological predictions suggest no mutagenic, tumorigenic or reproductive effects of the four target molecules. Complementarily, the results of a docking study show high scoring values and hydrogen bonding values in agreement with the cytotoxicity IC50 value ranking, i.e., indolylmenadione > indolylperezone > indolylplumbagine > indolylisoperezone. Consequently, it is possible to suggest an appropriate apoptotic pathway for each compound. Finally, potential metabolic pathways of the molecules were proposed.

  11. Knowledge-based control of an adaptive interface

    NASA Technical Reports Server (NTRS)

    Lachman, Roy

    1989-01-01

    The analysis, development strategy, and preliminary design for an intelligent, adaptive interface is reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited to a small set of commands; the program's complexity then can be increased incrementally. The rule base includes a model of the operator's behavior along with three types of instructions to the underlying application software. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.
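
    The antecedent/consequent loop described above is the classic forward-chaining pattern; a toy sketch of it (not the project's actual engine; the facts and rules are invented for illustration) looks like this:

    ```python
    # Forward-chaining toy engine: fire rules until no new facts appear.
    RULES = [
        ({"file_open", "buffer_dirty"}, "enable_save_command"),
        ({"enable_save_command", "user_idle"}, "prompt_autosave"),
    ]

    def infer(facts):
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for antecedents, consequent in RULES:
                if antecedents <= facts and consequent not in facts:
                    facts.add(consequent)  # consequent becomes a new fact
                    changed = True
        return facts

    print(infer({"file_open", "buffer_dirty", "user_idle"}))
    ```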

  12. An Evaluation of Departmental Radiation Oncology Incident Reports: Anticipating a National Reporting System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Terezakis, Stephanie A., E-mail: stereza1@jhmi.edu; Harris, Kendra M.; Ford, Eric

    Purpose: Systems to ensure patient safety are of critical importance. The electronic incident reporting systems (IRS) of 2 large academic radiation oncology departments were evaluated for events that may be suitable for submission to a national reporting system (NRS). Methods and Materials: All events recorded in the combined IRS were evaluated from 2007 through 2010. Incidents were graded for potential severity using the validated French Nuclear Safety Authority (ASN) 5-point scale. These incidents were categorized into 7 groups: (1) human error, (2) software error, (3) hardware error, (4) error in communication between 2 humans, (5) error at the human-software interface, (6) error at the software-hardware interface, and (7) error at the human-hardware interface. Results: Between the 2 systems, 4407 incidents were reported. Of these events, 1507 (34%) were considered to have the potential for clinical consequences. Of these 1507 events, 149 (10%) were rated as having a potential severity of ≥2. Of these 149 events, the committee determined that 79 (53%) would be submittable to a NRS, of which the majority were related to human error or to the human-software interface. Conclusions: A significant number of incidents were identified in this analysis. The majority of events in this study were related to human error and to the human-software interface, further supporting the need for a NRS to facilitate field-wide learning and system improvement.

  13. Design and Application of a Community Land Benchmarking System for Earth System Models

    NASA Astrophysics Data System (ADS)

    Mu, M.; Hoffman, F. M.; Lawrence, D. M.; Riley, W. J.; Keppel-Aleks, G.; Koven, C. D.; Kluzek, E. B.; Mao, J.; Randerson, J. T.

    2015-12-01

    Benchmarking has been widely used to assess the ability of climate models to capture the spatial and temporal variability of observations during the historical era. For the carbon cycle and terrestrial ecosystems, the design and development of an open-source community platform has been an important goal as part of the International Land Model Benchmarking (ILAMB) project. Here we developed a new benchmarking software system that enables the user to specify the models, benchmarks, and scoring metrics, so that results can be tailored to specific model intercomparison projects. Evaluation data sets included soil and aboveground carbon stocks, fluxes of energy, carbon and water, burned area, leaf area, and climate forcing and response variables. We used this system to evaluate simulations from the 5th Phase of the Coupled Model Intercomparison Project (CMIP5) with prognostic atmospheric carbon dioxide levels over the period from 1850 to 2005 (i.e., esmHistorical simulations archived on the Earth System Grid Federation). We found that the multi-model ensemble had a high bias in incoming solar radiation across Asia, likely as a consequence of incomplete representation of aerosol effects in this region, and in South America, primarily as a consequence of a low bias in mean annual precipitation. The reduced precipitation in South America had a larger influence on gross primary production than the high bias in incoming light, and as a consequence gross primary production had a low bias relative to the observations. Although model-to-model variations were large, the multi-model mean had a positive bias in atmospheric carbon dioxide that has been attributed in past work to weak ocean uptake of fossil emissions. In mid-latitudes of the northern hemisphere, most models overestimate latent heat fluxes in the early part of the growing season, and underestimate these fluxes in mid-summer and early fall, whereas sensible heat fluxes show the opposite trend.
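
    Benchmarking systems of this kind typically reduce a model-observation comparison to a bounded score; the sketch below shows one generic form of such a bias score (an illustrative assumption, not the actual ILAMB scoring definition):

    ```python
    import numpy as np

    def bias_score(model, obs):
        """Map relative bias to (0, 1]; 1 is perfect, -> 0 as bias grows."""
        relative = abs(np.mean(model) - np.mean(obs)) / abs(np.mean(obs))
        return float(np.exp(-relative))

    obs = np.array([2.0, 2.5, 3.0])    # toy observed GPP values
    model = np.array([1.6, 2.0, 2.4])  # toy model output with a low bias
    print(round(bias_score(model, obs), 3))
    ```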

  14. Software engineering processes for Class D missions

    NASA Astrophysics Data System (ADS)

    Killough, Ronnie; Rose, Debi

    2013-09-01

    Software engineering processes are often seen as anathemas; thoughts of CMMI key process areas and NPR 7150.2A compliance matrices can motivate a software developer to consider other career fields. However, with adequate definition, common-sense application, and an appropriate level of built-in flexibility, software engineering processes provide a critical framework in which to conduct a successful software development project. One problem is that current models seem to be built around an underlying assumption of "bigness," and assume that all elements of the process are applicable to all software projects regardless of size and tolerance for risk. This is best illustrated in NASA's NPR 7150.2A in which, aside from some special provisions for manned missions, the software processes are to be applied based solely on the criticality of the software to the mission, completely agnostic of the mission class itself. That is, the processes applicable to a Class A mission (high priority, very low risk tolerance, very high national significance) are precisely the same as those applicable to a Class D mission (low priority, high risk tolerance, low national significance). This paper will propose changes to NPR 7150.2A, taking mission class into consideration, and discuss how some of these changes are being piloted for a current Class D mission—the Cyclone Global Navigation Satellite System (CYGNSS).

  15. Space Shuttle Software Development and Certification

    NASA Technical Reports Server (NTRS)

    Orr, James K.; Henderson, Johnnie A

    2000-01-01

    Man-rated software, "software which is in control of systems and environments upon which human life is critically dependent," must be highly reliable. The Space Shuttle Primary Avionics Software System is an excellent example of such a software system. Lessons learned from more than 20 years of effort have identified basic elements that must be present to achieve this high degree of reliability. The elements include rigorous application of appropriate software development processes, use of trusted tools to support those processes, quantitative process management, and defect elimination and prevention. This presentation highlights methods used within the Space Shuttle project and raises questions that must be addressed to provide similar success in a cost-effective manner on future long-term projects, where key application development tools are COTS rather than internally developed custom application development tools.

  16. CoLiTec software - detection of the near-zero apparent motion

    NASA Astrophysics Data System (ADS)

    Khlamov, Sergii V.; Savanevych, Vadym E.; Briukhovetskyi, Olexandr B.; Pohorelov, Artem V.

    2017-06-01

    In this article we describe the CoLiTec software for fully automated frame processing. CoLiTec allows processing of the big data produced by observations, including data formed continuously during observation. The tasks it solves include frame brightness equalization, moving-object detection, astrometry, photometry, etc. Along with highly efficient big-data processing, the CoLiTec software also ensures high accuracy of data measurements. A comparative analysis of functional characteristics and positional accuracy was performed between the CoLiTec and Astrometrica software, showing the benefits of CoLiTec on wide-field and low-quality frames. The efficiency of the CoLiTec software has been proven by about 700,000 observations and over 1,500 preliminary discoveries.

  17. Computing in high-energy physics

    DOE PAGES

    Mount, Richard P.

    2016-05-31

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  18. Computing in high-energy physics

    NASA Astrophysics Data System (ADS)

    Mount, Richard P.

    2016-04-01

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Finally, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  19. Computing in high-energy physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mount, Richard P.

    I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.

  20. Using FDA reports to inform a classification for health information technology safety problems

    PubMed Central

    Ong, Mei-Sing; Runciman, William; Coiera, Enrico

    2011-01-01

    Objective To expand an emerging classification for problems with health information technology (HIT) using reports submitted to the US Food and Drug Administration Manufacturer and User Facility Device Experience (MAUDE) database. Design HIT events submitted to MAUDE were retrieved using a standardized search strategy. Using an emerging classification with 32 categories of HIT problems, a subset of relevant events were iteratively analyzed to identify new categories. Two coders then independently classified the remaining events into one or more categories. Free-text descriptions were analyzed to identify the consequences of events. Measurements Descriptive statistics by number of reported problems per category and by consequence; inter-rater reliability analysis using the κ statistic for the major categories and consequences. Results A search of 899,768 reports from January 2008 to July 2010 yielded 1100 reports about HIT. After removing duplicate and unrelated reports, 678 reports describing 436 events remained. The authors identified four new categories, describing problems with software functionality, system configuration, interfaces with devices, and network configuration, which expanded the classification to 36 categories of HIT problems. Examination of the 436 events revealed 712 problems; 96% were machine-related, and 4% were problems at the human–computer interface. Almost half (46%) of the events related to hazardous circumstances. Of the 46 events (11%) associated with patient harm, four deaths were linked to HIT problems (0.9% of 436 events). Conclusions Only 0.1% of the MAUDE reports searched were related to HIT. Nevertheless, Food and Drug Administration reports did prove to be a useful new source of information about the nature of software problems and their safety implications, with potential to inform strategies for safe design and implementation. PMID:21903979

  1. Debugging and Performance Analysis Software Tools for Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Learn about debugging and performance analysis software tools available to use with the Peregrine system, such as the Allinea tools.

  2. Component Models for Semantic Web Languages

    NASA Astrophysics Data System (ADS)

    Henriksson, Jakob; Aßmann, Uwe

    Intelligent applications and agents on the Semantic Web typically need to be specified with, or interact with specifications written in, many different kinds of formal languages. Such languages include ontology languages, data and metadata query languages, as well as transformation languages. As learnt from years of experience in development of complex software systems, languages need to support some form of component-based development. Components enable higher software quality, better understanding and reusability of already developed artifacts. Any component approach contains an underlying component model, a description detailing what valid components are and how components can interact. With the multitude of languages developed for the Semantic Web, what are their underlying component models? Do we need to develop one for each language, or is a more general and reusable approach achievable? We present a language-driven component model specification approach. This means that a component model can be (automatically) generated from a given base language (actually, its specification, e.g. its grammar). As a consequence, we can provide components for different languages and simplify the development of software artifacts used on the Semantic Web.

  3. Using digital photo technology to improve visualization of gastric lumen CT images

    NASA Astrophysics Data System (ADS)

    Pyrgioti, M.; Kyriakidis, A.; Chrysostomou, S.; Panaritis, V.

    2006-12-01

    In order to better evaluate gastric lumen CT images, a new method is applied to the images using image-processing software. During a 12-month period, 69 patients with various gastric symptoms and 20 volunteers who were normal as far as the upper gastrointestinal system is concerned underwent computed tomography of the upper gastrointestinal system. Just before the examination, the patients and the normal volunteers underwent preparation with 40 ml soda water and 10 ml gastrografin. All the CT images were digitized with an Olympus 3.2 Mpixel digital camera and further processed with image-processing software. The administration per os of gastrografin and soda water resulted in distension of the stomach and consequently better visualization of all the anatomic parts. By using image-processing software on a PC, all the pathological and normal images of the stomach could be assessed better diagnostically. We believe that digital photo technology improves diagnostic capacity not only for CT images but also for MRI and probably many other imaging methods.

  4. Technology collaboration by means of an open source government

    NASA Astrophysics Data System (ADS)

    Berardi, Steven M.

    2009-05-01

    The idea of open source software originally began in the early 1980s, but it never gained widespread support until recently, largely due to the explosive growth of the Internet. Only the Internet has made this kind of concept possible, bringing together millions of software developers from around the world to pool their knowledge. The tremendous success of open source software has prompted many corporations to adopt the culture of open source and thus share information they previously held secret. The government, and specifically the Department of Defense (DoD), could also benefit from adopting an open source culture. In acquiring satellite systems, the DoD often builds walls between program offices, but installing doors between programs can promote collaboration and information sharing. This paper addresses the challenges and consequences of adopting an open source culture to facilitate technology collaboration for DoD space acquisitions. DISCLAIMER: The views presented here are the views of the author, and do not represent the views of the United States Government, United States Air Force, or the Missile Defense Agency.

  5. Human performance interfaces in air traffic control.

    PubMed

    Chang, Yu-Hern; Yeh, Chung-Hsing

    2010-01-01

    This paper examines how human performance factors in air traffic control (ATC) affect each other through their mutual interactions. The paper extends the conceptual SHEL model of ergonomics to describe the ATC system as human performance interfaces in which the air traffic controllers interact with other human performance factors, including other controllers, software, hardware, environment, and organisation. New research hypotheses about the relationships between human performance interfaces of the system are developed and tested on data collected from air traffic controllers, using structural equation modelling. The results suggest that organisational influences play a more significant role than individual differences or peer influences in how the controllers interact with the software, hardware, and environment of the ATC system. There are mutual influences between the controller-software, controller-hardware, controller-environment, and controller-organisation interfaces of the ATC system, with the exception of the controller-controller interface. The findings of this study provide practical insights for managing human performance interfaces of the ATC system in the face of internal or external change, particularly in understanding possible consequences in relation to the interactions between human performance factors.

  6. Empirical Estimates of 0Day Vulnerabilities in Control Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miles A. McQueen; Wayne F. Boyer; Sean M. McBride

    2009-01-01

    We define a 0Day vulnerability to be any vulnerability, in deployed software, which has been discovered by at least one person but has not yet been publicly announced or patched. These 0Day vulnerabilities are of particular interest when assessing the risk to well managed control systems which have already effectively mitigated the publicly known vulnerabilities. In these well managed systems the risk contribution from 0Days will have proportionally increased. To aid understanding of how great a risk 0Days may pose to control systems, an estimate of how many are in existence is needed. Consequently, using the 0Day definition given above, we developed and applied a method for estimating how many 0Day vulnerabilities are in existence on any given day. The estimate is made by: empirically characterizing the distribution of the lifespans, measured in days, of 0Day vulnerabilities; determining the number of vulnerabilities publicly announced each day; and applying a novel method for estimating the number of 0Day vulnerabilities in existence on any given day using the number of vulnerabilities publicly announced each day and the previously derived distribution of 0Day lifespans. The method was first applied to a general set of software applications by analyzing the 0Day lifespans of 491 software vulnerabilities and using the daily rate of vulnerability announcements in the National Vulnerability Database. This led to a conservative estimate that in the worst year there were, on average, 2500 0Day software related vulnerabilities in existence on any given day. Using a smaller but intriguing set of 15 0Day software vulnerability lifespans representing the actual time from discovery to public disclosure, we then made a more aggressive estimate. In this case, we estimated that in the worst year there were, on average, 4500 0Day software vulnerabilities in existence on any given day. We then proceeded to identify the subset of software applications likely to be used in some control systems, analyzed the associated subset of vulnerabilities, and characterized their lifespans. Using the previously developed method of analysis, we very conservatively estimated 250 control system related 0Day vulnerabilities in existence on any given day. While reasonable, this first order estimate for control systems is probably far more conservative than those made for general software systems since the estimate did not include vulnerabilities unique to control system specific components. These control system specific vulnerabilities were unable to be included in the estimate for a variety of reasons with the most problematic being that the public announcement of unique control system vulnerabilities is very sparse. Consequently, with the intent to improve the above 0Day estimate for control systems, we first identified the additional, unique to control systems, vulnerability estimation constraints and then investigated new mechanisms which may be useful for estimating the number of unique 0Day software vulnerabilities found in control system components. We proceeded to identify a number of new mechanisms and approaches for estimating and incorporating control system specific vulnerabilities into an improved 0Day estimation method. These new mechanisms and approaches appear promising and will be more rigorously evaluated during the course of the next year.
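
    Under a simplifying stationarity assumption, the flavor of this estimate can be reproduced with Little's law, N = λW: the mean number of 0Days alive on a given day equals the daily announcement rate times the mean lifespan. The sketch below uses invented numbers purely for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy stand-ins: a sampled lifespan distribution (days) and a daily
    # announcement rate; neither value is from the actual study data.
    lifespans = rng.exponential(scale=300.0, size=491)
    announcements_per_day = 15.0

    estimate = announcements_per_day * lifespans.mean()  # Little's law
    print(f"~{estimate:.0f} 0Day vulnerabilities extant on an average day")
    ```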

  7. A high-rate PCI-based telemetry processor system

    NASA Astrophysics Data System (ADS)

    Turri, R.

    2002-07-01

    The high performance achieved by satellite on-board telemetry generation and transmission will consequently require ground facilities with greater processing capability at low cost, to allow wide adoption of such ground stations. The equipment normally used is based on complex, proprietary bus and computing architectures that prevent the systems from exploiting the continuous and rapid increase in computing power available on the market. PCI-bus systems now allow processing of high-rate data streams in a standard PC system. At the same time, the Windows NT operating system supports multitasking and symmetric multiprocessing, giving the capability to process high-data-rate signals. In addition, high-speed networking, 64-bit PCI-bus technologies, and increases in processor power and software allow the creation of a system based on COTS products (which in future may be easily and inexpensively upgraded). In the frame of the EUCLID RTP 9.8 project, a specific work element was dedicated to developing the architecture of a system able to acquire telemetry data at up to 600 Mbps. Laben S.p.A., a Finmeccanica company, entrusted with this work, designed a PCI-based telemetry system making communication possible between a satellite down-link and a wide area network at the required rate.

  8. Agile Acceptance Test–Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software

    PubMed Central

    Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-01-01

    Background Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test–driven development and automated regression testing promotes reliability. Test–driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a “safety net” for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and “living” design documentation. Rapid-cycle development or “agile” methods are being successfully applied to CDS development. The agile practice of automated test–driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as “executable requirements.” Objective We aimed to establish feasibility of acceptance test–driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Methods Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory’s expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. Results We used test–driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the “executable requirements” are shown prior to building the CDS alert, during build, and after successful build. Conclusions Automated acceptance test–driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test–driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. PMID:29653922
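
    The same decision-table idea can be mimicked outside FitNesse; the sketch below expresses one slice of such "executable requirements" as a table-driven unit test (the rule function and the table contents are hypothetical stand-ins, not the paper's actual CDS configuration):

    ```python
    import unittest

    # Each row: (clinical setting, stroke criteria met, advisory expected).
    REQUIREMENTS = [
        ("emergency_department", True,  True),
        ("emergency_department", False, False),
        ("outpatient_clinic",    True,  False),  # advisory scoped to the ED
    ]

    def advisory_fires(setting, criteria_met):
        """Hypothetical stand-in for the configured CDS rule under test."""
        return setting == "emergency_department" and criteria_met

    class TestSwallowingAdvisory(unittest.TestCase):
        def test_decision_table(self):
            for setting, criteria, expected in REQUIREMENTS:
                with self.subTest(setting=setting, criteria=criteria):
                    self.assertEqual(advisory_fires(setting, criteria),
                                     expected)

    if __name__ == "__main__":
        unittest.main()
    ```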

  9. Agile Acceptance Test-Driven Development of Clinical Decision Support Advisories: Feasibility of Using Open Source Software.

    PubMed

    Basit, Mujeeb A; Baldwin, Krystal L; Kannan, Vaishnavi; Flahaven, Emily L; Parks, Cassandra J; Ott, Jason M; Willett, Duwayne L

    2018-04-13

    Moving to electronic health records (EHRs) confers substantial benefits but risks unintended consequences. Modern EHRs consist of complex software code with extensive local configurability options, which can introduce defects. Defects in clinical decision support (CDS) tools are surprisingly common. Feasible approaches to prevent and detect defects in EHR configuration, including CDS tools, are needed. In complex software systems, use of test-driven development and automated regression testing promotes reliability. Test-driven development encourages modular, testable design and expanding regression test coverage. Automated regression test suites improve software quality, providing a "safety net" for future software modifications. Each automated acceptance test serves multiple purposes, as requirements (prior to build), acceptance testing (on completion of build), regression testing (once live), and "living" design documentation. Rapid-cycle development or "agile" methods are being successfully applied to CDS development. The agile practice of automated test-driven development is not widely adopted, perhaps because most EHR software code is vendor-developed. However, key CDS advisory configuration design decisions and rules stored in the EHR may prove amenable to automated testing as "executable requirements." We aimed to establish feasibility of acceptance test-driven development of clinical decision support advisories in a commonly used EHR, using an open source automated acceptance testing framework (FitNesse). Acceptance tests were initially constructed as spreadsheet tables to facilitate clinical review. Each table specified one aspect of the CDS advisory's expected behavior. Table contents were then imported into a test suite in FitNesse, which queried the EHR database to automate testing. Tests and corresponding CDS configuration were migrated together from the development environment to production, with tests becoming part of the production regression test suite. We used test-driven development to construct a new CDS tool advising Emergency Department nurses to perform a swallowing assessment prior to administering oral medication to a patient with suspected stroke. Test tables specified desired behavior for (1) applicable clinical settings, (2) triggering action, (3) rule logic, (4) user interface, and (5) system actions in response to user input. Automated test suite results for the "executable requirements" are shown prior to building the CDS alert, during build, and after successful build. Automated acceptance test-driven development and continuous regression testing of CDS configuration in a commercial EHR proves feasible with open source software. Automated test-driven development offers one potential contribution to achieving high-reliability EHR configuration. Vetting acceptance tests with clinicians elicits their input on crucial configuration details early during initial CDS design and iteratively during rapid-cycle optimization. ©Mujeeb A Basit, Krystal L Baldwin, Vaishnavi Kannan, Emily L Flahaven, Cassandra J Parks, Jason M Ott, Duwayne L Willett. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 13.04.2018.

  10. Evaluation of a New Software Version of the RTVue Optical Coherence Tomograph for Image Segmentation and Detection of Glaucoma in High Myopia.

    PubMed

    Holló, Gábor; Shu-Wei, Hsu; Naghizadeh, Farzaneh

    2016-06-01

    To compare the current (6.3) and a novel (6.12) software version of the RTVue-100 optical coherence tomograph (RTVue-OCT) for ganglion cell complex (GCC) and retinal nerve fiber layer thickness (RNFLT) image segmentation and detection of glaucoma in high myopia. RNFLT and GCC scans were acquired with software version 6.3 of the RTVue-OCT on 51 highly myopic eyes (spherical refractive error ≤-6.0 D) of 51 patients, and were analyzed with both software versions. Twenty-two eyes were nonglaucomatous, 13 were ocular hypertensive, and 16 eyes had glaucoma. No difference was seen for any RNFLT or average GCC parameter between the software versions (paired t test, P≥0.084). Global loss volume was significantly lower (more normal) with version 6.12 than with version 6.3 (Wilcoxon signed-rank test, P<0.001). The percentage agreement (κ) between the clinical (normal and ocular hypertensive vs. glaucoma) and the software-provided classifications (normal and borderline vs. outside normal limits) was 0.3219 and 0.4442 for average RNFLT, and 0.2926 and 0.4977 for average GCC, with versions 6.3 and 6.12, respectively (McNemar symmetry test, P≥0.289). No difference between the software versions was found in average RNFLT and GCC classification (McNemar symmetry test, P≥0.727) or in the number of eyes with at least 1 segmentation error (P≥0.109). Although GCC segmentation was improved with software version 6.12 compared with the current version in highly myopic eyes, this did not result in a significant change of the average RNFLT and GCC values, and did not significantly improve the software-provided classification for glaucoma.

  11. Dynamic Weather Routes Architecture Overview

    NASA Technical Reports Server (NTRS)

    Eslami, Hassan; Eshow, Michelle

    2014-01-01

    Dynamic Weather Routes Architecture Overview presents the high-level software architecture of DWR, based on the CTAS software framework and the Direct-To automation tool. The document also covers external and internal data flows, required datasets, changes to the Direct-To software for DWR, collection of software statistics, and the code structure.

  12. The Design of Software for Three-Phase Induction Motor Test System

    NASA Astrophysics Data System (ADS)

    Haixiang, Xu; Fengqi, Wu; Jiai, Xue

    2017-11-01

    The design and development of control-system software is important to three-phase induction motor test equipment, and it requires complete familiarity with the test process and the control procedures of the test equipment. In this paper, the software is developed in the VB language according to the national standard (GB/T1032-2005) for three-phase induction motor test methods. The control-system software, the data-analysis software, and the implementation of the motor test system are described individually; the system has the advantages of high automation and high accuracy.

  13. High-energy physics software parallelization using database techniques

    NASA Astrophysics Data System (ADS)

    Argante, E.; van der Stok, P. D. V.; Willers, I.

    1997-02-01

    A programming model for software parallelization, called CoCa, is introduced that copes with problems caused by typical features of high-energy physics software. Because CoCa is based on the database transaction paradigm, the complexity induced by the parallelization is for a large part transparent to the programmer, resulting in a higher level of abstraction than with native message-passing software. CoCa is implemented on a Meiko CS-2 and on a SUN SPARCcenter 2000 parallel computer. On the CS-2, the performance is comparable with the performance of native PVM and MPI.

  14. Software attribute visualization for high integrity software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  15. ACTS: from ATLAS software towards a common track reconstruction software

    NASA Astrophysics Data System (ADS)

    Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration

    2017-10-01

    Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.

  16. PyNN: A Common Interface for Neuronal Network Simulators.

    PubMed

    Davison, Andrew P; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN.
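
    A minimal script in the style of current PyNN releases illustrates the write-once, run-on-any-backend idea (parameter values here are arbitrary):

    ```python
    import pyNN.nest as sim  # switching backend = changing this import

    sim.setup(timestep=0.1)

    cells = sim.Population(100, sim.IF_cond_exp(tau_m=20.0))
    noise = sim.Population(100, sim.SpikeSourcePoisson(rate=20.0))

    sim.Projection(noise, cells, sim.OneToOneConnector(),
                   synapse_type=sim.StaticSynapse(weight=0.01))

    cells.record("spikes")
    sim.run(1000.0)              # simulate one second

    data = cells.get_data()      # simulator-agnostic result structure
    sim.end()
    ```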

  17. PyNN: A Common Interface for Neuronal Network Simulators

    PubMed Central

    Davison, Andrew P.; Brüderle, Daniel; Eppler, Jochen; Kremkow, Jens; Muller, Eilif; Pecevski, Dejan; Perrinet, Laurent; Yger, Pierre

    2008-01-01

    Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN. PMID:19194529

  18. Just Do It Yourself: Implementing 3D Printing in a Deployed Environment

    DTIC Science & Technology

    2017-04-01

    This 3D model data can be stored for future manufacturing or manipulated, using software, to improve the parts' design. 3D manufactured parts can be... developed and tested in a virtual environment, very quickly, and before manufacturing has commenced. Additionally, these 3D designs can be... capitalize on this innovative technology. Consequently, AM may offer the best hope for designing a reusable hypersonic weapon. Traditional manufacturing

  19. Chicago Sanitary and Ship Canal (CSSC) Marine Safety Risk Assessment

    DTIC Science & Technology

    2013-12-01

    calculation of the rate of loss events and the associated consequences. Further, the selected tool supports a clear understanding of the drivers of failures... spreadsheet software. The event tree has a series of events stated in a success mode, or simply as the occurrence of a phenomenological condition. The... of a “transit” (when applicable). As subsequent events occur, there is a branch point, one branch representing success and the
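
    The event-tree mechanics sketched in the snippet, success branches that multiply along a path until a first failure fixes the end state, fit in a few lines. Event names, probabilities and consequence values below are illustrative and are not taken from the CSSC assessment.

```python
# Illustrative event-tree evaluation: each event either succeeds (continue)
# or fails (branch to an end state); probabilities multiply along the path.
events = [("navigation aid available", 0.98),
          ("crew responds correctly", 0.95),
          ("tug arrests drift", 0.90)]
# Consequence of the end state reached when event i is the first failure;
# values are invented consequence units, not CSSC figures.
loss_if_first_failure = [0.1, 1.0, 10.0]

expected_loss, p_reach = 0.0, 1.0
for (name, p_success), loss in zip(events, loss_if_first_failure):
    expected_loss += p_reach * (1.0 - p_success) * loss   # branch: event fails here
    p_reach *= p_success                                   # branch: success, continue
# p_reach is now the probability of the benign all-success path (loss 0).

print(f"P(all events succeed)    = {p_reach:.4f}")
print(f"expected loss per transit = {expected_loss:.4f}")
```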

  20. Contradictory genetic make-up of Dutch harbour porpoises: Response to van der Plas-Duivesteijn et al.

    NASA Astrophysics Data System (ADS)

    Kopps, Anna M.; Palsbøll, Per J.

    2016-02-01

    The assessment of the status of endangered species or populations typically draws generously on the plethora of population genetic software available to detect population genetic structuring. However, despite the many available analytical approaches, population genetic inference methods [of neutral genetic variation] essentially capture three basic processes: migration, random genetic drift and mutation. Consequently, different analytical approaches capture the same basic processes and should yield consistent results.

  1. Level 1 Processing of MODIS Direct Broadcast Data From Terra

    NASA Technical Reports Server (NTRS)

    Lynnes, Christopher; Smith, Peter; Shotland, Larry; El-Ghazawi, Tarek; Zhu, Ming

    2000-01-01

    In February 2000, an effort was begun to adapt the Moderate Resolution Imaging Spectroradiometer (MODIS) Level 1 production software to process direct broadcast data. Three Level 1 algorithms have been adapted and packaged for release: Level 1A converts raw (Level 0) data into Hierarchical Data Format (HDF), unpacking packets into scans; Geolocation computes geographic information for the data points in the Level 1A; and Level 1B computes geolocated, calibrated radiances from the Level 1A and Geolocation products. One useful aspect of adapting the production software is the ability to incorporate enhancements contributed by the MODIS Science Team. We have therefore tried to limit changes to the software. However, in order to process the data immediately on receipt, we have taken advantage of a branch in the geolocation software that reads orbit and attitude information from the packets themselves, rather than the external ancillary files used in standard production. We have also verified that the algorithms can be run with smaller time increments (2.5 minutes) than the five-minute increments used in production. To make the code easier to build and run, we have simplified directories and build scripts. Also, dependencies on a commercial numerics library have been replaced by public domain software. A version of the adapted code has been released for Silicon Graphics machines running IRIX. Perhaps owing to its origin in production, the software is rather CPU-intensive. Consequently, a port to Linux is underway, followed by a version to run on PC clusters, with an eventual goal of running in near-real-time (i.e., processing a ten-minute pass in ten minutes).

  2. Next Generation Cloud-based Science Data Systems and Their Implications on Data and Software Stewardship, Preservation, and Provenance

    NASA Astrophysics Data System (ADS)

    Hua, H.; Manipon, G.; Starch, M.

    2017-12-01

    NASA's upcoming missions are expected to generate data volumes at least an order of magnitude larger than those of current missions. A significant increase in data processing, data rates, data volumes, and long-term data archive capabilities is needed. Consequently, new challenges are emerging that impact traditional data and software management approaches. At large scales, next generation science data systems are exploring the move onto cloud computing paradigms to support these increased needs. New implications, such as costs, data movement, collocation of data systems and archives, and moving processing closer to the data, may result in changes to the stewardship, preservation, and provenance of science data and software. With more science data systems being on-boarded onto cloud computing facilities, we can expect more Earth science data records to be both generated and kept in the cloud. But at large scales, the cost of processing and storing global data may impact architectural and system designs. Data systems will trade the cost of keeping data in the cloud against data life-cycle approaches that move "colder" data back to traditional on-premise facilities. How will this impact data citation and processing software stewardship? What are the impacts of cloud-based on-demand processing, and how does it affect reproducibility and provenance? Similarly, with more science processing software being moved onto cloud, virtual machine, and container based approaches, more opportunities arise for improved stewardship and preservation. But will the science community trust data reprocessed years or decades later? We will also explore emerging questions of the stewardship of the science data system software that generates the science data records, both during and after the life of a mission.

  3. Automated Transfer Vehicle Proximity Flight Safety Overview

    NASA Astrophysics Data System (ADS)

    Cornier, Dominique; Berthelier, David; Requiston, Helene; Zekri, Eric; Chase, Richard

    2005-12-01

    The European Automated Transfer Vehicle (ATV) is an unmanned transportation spacecraft designed to contribute to the logistic servicing of the ISS. The ATV will be launched by ARIANE 5 and, after phasing and rendezvous maneuvers, it autonomously docks to the International Space Station (ISS). The ATV control is nominally handled by the Guidance, Navigation and Control (GNC) function using computers, software, sensors and actuators. During rendezvous operations, in order to cover the extreme situations where the GNC function fails to ensure a safe trajectory with respect to the ISS, a segregated Proximity Flight Safety (PFS) function is activated: this function will initiate a collision avoidance maneuver which will place the ATV on a trajectory ensuring safety with respect to the ISS. The PFS function relies on segregated computers, the Monitoring and Safing Units (MSUs) running specific software, on four dedicated thrusters, on dedicated batteries and on specific interfaces with ATV gyrometers. Because the PFS function is the ultimate protection ensuring ISS safety in case of ATV malfunction, specific rules have been applied to its implementation, in particular for the development of the MSU software, which is critical since any failure of this software may result in catastrophic consequences. This paper provides an overview of the ATV Proximity Flight Safety function. After a short description of the overall ATV avionics architecture and its rationale, the second part of the paper presents more details on the PFS function, both in terms of hardware and software implementation. The third part of the paper is dedicated to the MSU software validation method, which is specific considering its criticality. The last part of the paper provides information on the different operations related to the use of the PFS function during an ATV flight.

  4. Exploring faculty perceptions towards electronic health records for nursing education.

    PubMed

    Kowitlawakul, Y; Chan, S W C; Wang, L; Wang, W

    2014-12-01

    The use of electronic health records in nursing education is rapidly increasing worldwide. The successful implementation of an electronic health records software program for nursing education relies on students as well as nursing faculty members. This study aimed to explore the experiences and perceptions of nursing faculty members using an electronic health records software program for nursing education, and to identify the factors influencing successful implementation of this technology. This exploratory qualitative study was conducted using in-depth individual interviews at a university in Singapore. Seven faculty members participated in the study. The data were gathered and analysed at the end of the semester in the 2012/2013 academic year. The participants' perceptions of the software program were organized into three main categories: innovation, transition and integration. The participants perceived this technology as innovative, with both values and challenges for the users. In addition, using the new software program was perceived as a transitional process. The integration of this technology required time from faculty members and students, as well as support from administrators. The software program had only been implemented for 2-3 months at the time of the interviews. Consequently, the participants might have lacked the necessary skills, competence and confidence to implement it successfully. In addition, the unequal exposure to the software program might have had an impact on participants' perceptions. The findings show that the integration of electronic health records into nursing education curricula is dependent on the faculty members' experiences with the new technology, as well as their perceptions of it. Hence, cultivating a positive attitude towards the use of new technologies is important. Electronic health records are significant applications of health information technology. Health informatics competency should be included as a required competency component in faculty professional development policy and programmes. © 2014 International Council of Nurses.

  5. Accounting for Uncertainties in Strengths of SiC MEMS Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel; Evans, Laura; Beheim, Glen; Trapp, Mark; Jadaan, Osama; Sharpe, William N., Jr.

    2007-01-01

    A methodology has been devised for accounting for uncertainties in the strengths of silicon carbide structural components of microelectromechanical systems (MEMS). The methodology enables prediction of the probabilistic strengths of complexly shaped MEMS parts using data from tests of simple specimens. This methodology is intended to serve as a part of a rational basis for designing SiC MEMS, supplementing methodologies that have been borrowed from the art of designing macroscopic brittle material structures. The need for this or a similar methodology arises as a consequence of the fundamental nature of MEMS and the brittle silicon-based materials of which they are typically fabricated. When tested to fracture, MEMS and structural components thereof show wide part-to-part scatter in strength. The methodology involves the use of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) software in conjunction with the ANSYS Probabilistic Design System (PDS) software to simulate or predict the strength responses of brittle material components while simultaneously accounting for the effects of variability of geometrical features on the strength responses. As such, the methodology involves the use of an extended version of the ANSYS/CARES/PDS software system described in Probabilistic Prediction of Lifetimes of Ceramic Parts (LEW-17682-1/4-1), Software Tech Briefs supplement to NASA Tech Briefs, Vol. 30, No. 9 (September 2006), page 10. The ANSYS PDS software enables the ANSYS finite-element-analysis program to account for uncertainty in the design and analysis process. The ANSYS PDS software accounts for uncertainty in material properties, dimensions, and loading by assigning probabilistic distributions to user-specified model parameters and performing simulations using various sampling techniques.
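
    A stripped-down Monte Carlo in the same spirit, combining a two-parameter Weibull strength model with a normally distributed part dimension, is sketched below. All numbers are illustrative; this is not the ANSYS/CARES/PDS implementation.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Two-parameter Weibull strength model for a brittle SiC part
m, sigma0 = 10.0, 400.0             # Weibull modulus and scale strength [MPa], illustrative
strength = sigma0 * rng.weibull(m, n)

# Geometric uncertainty: micromachined beam thickness varies part to part
t = rng.normal(2.0e-6, 0.05e-6, n)  # thickness [m]
w, L, F = 10e-6, 50e-6, 4e-5        # width [m], length [m], tip load [N], illustrative

# Peak bending stress of an end-loaded cantilever: sigma = 6 F L / (w t^2)
stress = 6 * F * L / (w * t**2) / 1e6   # [MPa]

# A part fails when its applied stress exceeds its sampled strength
print(f"predicted failure probability: {np.mean(stress > strength):.3f}")
```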

  6. Interpreting CMMI High Maturity for Small Organizations

    DTIC Science & Technology

    2008-09-01

    Stoddard, September 2008. Congreso Internacional en Ingeniería de Software y sus Aplicaciones (International Congress of Software Engineering and its Applications). Why This Workshop? CMMI Process Performance

  7. LV software support for supersonic flow analysis

    NASA Technical Reports Server (NTRS)

    Bell, W. A.; Lepicovsky, J.

    1992-01-01

    The software for configuring an LV counter processor system has been developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system has been developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.
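
    For the nonuniformly sampled data mentioned in the closing sentence, a standard spectral tool is the Lomb-Scargle periodogram. The sketch below applies SciPy's implementation to synthetic LV-like data; it is an illustration and not part of the original LV software package.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)

# LV counters yield velocity samples at random particle-arrival times
t = np.sort(rng.uniform(0.0, 1.0, 500))          # arrival times [s]
u = 10.0 + 0.5 * np.sin(2 * np.pi * 60.0 * t)    # velocity with a 60 Hz oscillation
u += rng.normal(0.0, 0.2, t.size)                # measurement noise

freqs = np.linspace(1.0, 200.0, 2000)            # trial frequencies [Hz]
pgram = lombscargle(t, u - u.mean(), 2 * np.pi * freqs, normalize=True)

print(f"dominant frequency: {freqs[np.argmax(pgram)]:.1f} Hz")
```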

  8. LV software support for supersonic flow analysis

    NASA Technical Reports Server (NTRS)

    Bell, William A.

    1992-01-01

    The software for configuring a Laser Velocimeter (LV) counter processor system was developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system was developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.

  9. Floodplain-mapping With Modern It-instruments

    NASA Astrophysics Data System (ADS)

    Bley, D.; Pasche, E.

    Of all natural hazards, floods occur globally most frequently, claim the most casualties and cause the biggest economic losses. Reasons are anthropogenic changes (river correction, land surface sealing, forest dieback ("Waldsterben"), climatic changes) combined with a high population density. Counteractions must be the resettlement of human beings away from flood-prone areas, flood controls and environmental monitoring, as well as renaturalization and the provision of retention basins and areas. The consequence, especially if we think of the recent flood events on the rivers Rhine, Odra and Danube, must be preventive and sustainable flood control. As a consequence the legislator demanded in the Water Management Act nation-wide floodplain-mapping, to preserve the necessary retention areas for high water flows and prevent misuses. In this context, water level calculations based on a one-dimensional steady-flow computer model are among the major tasks in hydraulic engineering practice. Bjoernsen Consulting Engineers developed, in cooperation with the Technical University of Hamburg-Harburg, the integrated software system WSPWIN. It is based upon state-of-the-art information technology and the latest developments in hydraulic research. WSPWIN consists of a pre-processing module, a calculation core, and GIS-based post-processing elements. As water level calculations require the recording and storage of large amounts of topographic and hydraulic data, it is helpful that WSPWIN includes an interactive graphical profile editor, which allows visual data checking and editing. The calculation program comprises water level calculations under steady uniform and steady non-uniform flow conditions using the formulas of Darcy-Weisbach and Gauckler-Manning-Strickler. Bridges, weirs, pipes as well as the effects of submerged vegetation are taken into account. Post-processing includes plotting facilities for cross-sectional and longitudinal profiles as well as map-oriented GIS-based data editing and result presentation. Import of digital elevation models and generation of profiles are possible. Furthermore, the intersection of the DEM with the calculated water level enables the creation of floodplain maps. WSPWIN is the official standard software for one-dimensional hydraulic modeling in six German Federal States, where it is used by all water-management agencies. Moreover, many private companies, universities and water associations employ WSPWIN as well. The program is presented showing the procedure and difficulties of floodplain-mapping and flood control on a Bavarian river.
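
    The steady uniform-flow case named in the abstract reduces to the Gauckler-Manning-Strickler formula v = k_st * R^(2/3) * S^(1/2). A worked example for a rectangular channel follows; the numbers are illustrative, and this is not WSPWIN code.

```python
import math

# Gauckler-Manning-Strickler for steady uniform flow in a rectangular channel
k_st = 30.0      # Strickler coefficient [m^(1/3)/s], natural river bed, illustrative
b, h = 20.0, 2.5 # channel width and flow depth [m]
S = 0.0005       # bed slope [-]

A = b * h                                     # flow area [m^2]
P = b + 2 * h                                 # wetted perimeter [m]
R = A / P                                     # hydraulic radius [m]
v = k_st * R ** (2.0 / 3.0) * math.sqrt(S)    # mean velocity [m/s]
Q = A * v                                     # discharge [m^3/s]

print(f"R = {R:.2f} m, v = {v:.2f} m/s, Q = {Q:.1f} m^3/s")
```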

  10. PROGame: A process framework for serious game development for motor rehabilitation therapy

    PubMed Central

    Jaume-i-Capó, Antoni; Moyà-Alcover, Biel

    2018-01-01

    Serious game development for rehabilitation therapy is becoming increasingly popular because of the motivational advantages that these types of applications provide. Consequently, the need for a common process framework for this category of software development has become increasingly evident. The goal is to guarantee that products are developed and validated by following a coherent and systematic method that leads to high-quality serious games. This paper introduces a new process framework for the development of serious games for motor rehabilitation therapy. We introduce the new model and demonstrate its application for the development of a serious game for the improvement of the balance and postural control of adults with cerebral palsy. The development of this application has been facilitated by two technological transfer contracts and is being exploited by two different organizations. According to clinical measurements, patients using the application improved from high fall risk to moderate fall risk. We believe that our development strategy can be useful not only for motor rehabilitation therapy, but also for the development of serious games in many other rehabilitation areas. PMID:29768472

  11. PROGame: A process framework for serious game development for motor rehabilitation therapy.

    PubMed

    Amengual Alcover, Esperança; Jaume-I-Capó, Antoni; Moyà-Alcover, Biel

    2018-01-01

    Serious game development for rehabilitation therapy is becoming increasingly popular because of the motivational advantages that these types of applications provide. Consequently, the need for a common process framework for this category of software development has become increasingly evident. The goal is to guarantee that products are developed and validated by following a coherent and systematic method that leads to high-quality serious games. This paper introduces a new process framework for the development of serious games for motor rehabilitation therapy. We introduce the new model and demonstrate its application for the development of a serious game for the improvement of the balance and postural control of adults with cerebral palsy. The development of this application has been facilitated by two technological transfer contracts and is being exploited by two different organizations. According to clinical measurements, patients using the application improved from high fall risk to moderate fall risk. We believe that our development strategy can be useful not only for motor rehabilitation therapy, but also for the development of serious games in many other rehabilitation areas.

  12. A Phenomenological Inquiry into the Perceptions of Software Professionals on the Asperger's Syndrome/High Functioning Autism Spectrum and the Success of Software Development Projects

    ERIC Educational Resources Information Center

    Kendall, Leslie R.

    2013-01-01

    Individuals who have Asperger's Syndrome/High-Functioning Autism, as a group, are chronically underemployed and underutilized. Many in this group have abilities that are well suited for various roles within the practice of software development. Multiple studies have shown that certain organizational and management changes in the software…

  13. The need for scientific software engineering in the pharmaceutical industry

    NASA Astrophysics Data System (ADS)

    Luty, Brock; Rose, Peter W.

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  14. The need for scientific software engineering in the pharmaceutical industry.

    PubMed

    Luty, Brock; Rose, Peter W

    2017-03-01

    Scientific software engineering is a distinct discipline from both computational chemistry project support and research informatics. A scientific software engineer not only has a deep understanding of the science of drug discovery but also the desire, skills and time to apply good software engineering practices. A good team of scientific software engineers can create a software foundation that is maintainable, validated and robust. If done correctly, this foundation enables the organization to investigate new and novel computational ideas with a very high level of efficiency.

  15. Overview of the TriBITS Lifecycle Model: Lean/Agile Software Lifecycle Model for Research-based Computational Science and Engineering Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  16. Kepler Planet Detection Metrics: Statistical Bootstrap Test

    NASA Technical Reports Server (NTRS)

    Jenkins, Jon M.; Burke, Christopher J.

    2016-01-01

    This document describes the data produced by the Statistical Bootstrap Test over the final three Threshold Crossing Event (TCE) deliveries to NExScI: SOC 9.1 (Q1-Q16) (Tenenbaum et al. 2014), SOC 9.2 (Q1-Q17), aka DR24 (Seader et al. 2015), and SOC 9.3 (Q1-Q17), aka DR25 (Twicken et al. 2016). The last few years have seen significant improvements in the SOC science data processing pipeline, leading to higher quality light curves and more sensitive transit searches. The statistical bootstrap analysis results presented here and the numerical results archived at NASA's Exoplanet Science Institute (NExScI) bear witness to these software improvements. This document attempts to introduce and describe the main features and differences between these three data sets as a consequence of the software changes.
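
    The general idea of a statistical bootstrap test, resampling null statistics to calibrate the false-alarm probability of a detection statistic, can be sketched briefly. This is a generic illustration with invented numbers, not the Kepler SOC pipeline algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)

# Null statistics, e.g. single-event statistics measured away from any transit
null_stats = rng.normal(0.0, 1.0, 5000)

observed = 3.5        # detection statistic of a candidate event, illustrative
n_events = 4          # transits combined into the multiple-event statistic
n_boot = 200_000

# Bootstrap: resample n_events null values with replacement and combine them
# the same way the detector combines transits (here: mean scaled by sqrt(n)).
draws = rng.choice(null_stats, size=(n_boot, n_events), replace=True)
combined = draws.mean(axis=1) * np.sqrt(n_events)

false_alarm_prob = np.mean(combined >= observed)
print(f"bootstrap false-alarm probability: {false_alarm_prob:.2e}")
```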

  17. Advanced communications technology satellite high burst rate link evaluation terminal experiment control and monitor software user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1992-01-01

    The Experiment Control and Monitor (EC&M) software was developed at NASA Lewis Research Center to support the Advanced Communications Technology Satellite (ACTS) High Burst Rate Link Evaluation Terminal (HBR-LET). The HBR-LET is an experimenter's terminal to communicate with the ACTS for various investigations by government agencies, universities, and industry. The EC&M software is one segment of the Control and Performance Monitoring (C&PM) software system of the HBR-LET. The EC&M software allows users to initialize, control, and monitor the instrumentation within the HBR-LET using a predefined sequence of commands. Besides instrument control, the C&PM software system is also responsible for computer communication between the HBR-LET and the ACTS NASA Ground Station and for uplink power control of the HBR-LET to demonstrate power augmentation during rain fade events. The EC&M Software User's Guide, Version 1.0 (NASA-CR-189160) outlines the commands required to install and operate the EC&M software. Input and output file descriptions, operator commands, and error recovery procedures are discussed in the document.

  18. Repercussions of imprisonment for conjugal violence: discourses of men

    PubMed Central

    de Sousa, Anderson Reis; Pereira, Álvaro; Paixão, Gilvânia Patrícia do Nascimento; Pereira, Nadirlene Gomes; Campos, Luana Moura; Couto, Telmara Menezes

    2016-01-01

    ABSTRACT Objective: to know the consequences that men experience related to incarceration for conjugal violence. Methods: a qualitative study of 20 men in jail and indicted in criminal processes related to conjugal violence in a court specialized in family and domestic violence against women. The interviews were classified based on the Collective Subject Discourse method, using NVivo software. Results: the collective discourse shows that the experience of preventive imprisonment starts a process of family dismantling, social stigma, financial hardship and psycho-emotional symptoms such as phobia, depression, hypertension, and headaches. Conclusion: given the physical, mental and social consequences of the conjugal violence-related imprisonment experience, it is urgent to look carefully into the somatization process as well as into prevention strategies regarding it. PMID:27982312

  19. A software communication tool for the tele-ICU.

    PubMed

    Pimintel, Denise M; Wei, Shang Heng; Odor, Alberto

    2013-01-01

    The Tele Intensive Care Unit (tele-ICU) supports a high volume, high acuity population of patients. There is a high volume of incoming and outgoing calls, especially during the evening and night hours, through the tele-ICU hubs. The tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while supporting and maintaining a standard to improve time to intervention. This study describes a software communication tool that will improve the time to intervention over the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances to mine information for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied Health Informatics program. This software tool will function in the tele-ICU environment and perform as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider.

  20. Automating Software Design Metrics.

    DTIC Science & Technology

    1984-02-01

    INTRODUCTION 1 ", ... 0..1 1.2 HISTORICAL PERSPECTIVE High quality software is of interest to both the software engineering com- munity and its users. As...contributions of many other software engineering efforts, most notably [MCC 77] and [Boe 83b], which have defined and refined a framework for quantifying...AUTOMATION OF DESIGN METRICS Software metrics can be useful within the context of an integrated soft- ware engineering environment. The purpose of this

  1. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain-independent methods to approaches implementable for specific applications or domains. Applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods, different approaches for higher-level software design are being developed. Though a given approach does not always fall exactly into any specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths and feasibility for future expansion toward automatic development of software systems.

  2. Control-structure-thermal interactions in analysis of lunar telescopes

    NASA Technical Reports Server (NTRS)

    Thompson, Roger C.

    1992-01-01

    The lunar telescope project was an excellent model for the CSTI study because a telescope is a very sensitive instrument, and thermal expansion or mechanical vibration of the mirror assemblies will rapidly degrade the resolution of the device. Consequently, the interactions are strongly coupled. The lunar surface experiences very large temperature variations that range from approximately -180 C to over 100 C. Although the optical assemblies of the telescopes will be well insulated, the temperature of the mirrors will inevitably fluctuate in a similar cycle, but of much smaller magnitude. In order to obtain images of high quality and clarity, allowable thermal deformations of any point on a mirror must be less than 1 micron. Initial estimates indicate that this corresponds to a temperature variation of much less than 1 deg through the thickness of the mirror. Therefore, a lunar telescope design will most probably include active thermal control, a means of controlling the shape of the mirrors, or a combination of both systems. Historically, the design of a complex vehicle was primarily a sequential process in which the basic structure was defined without concurrent detailed analyses or other subsystems. The basic configuration was then passed to the different teams responsible for each subsystem, and their task was to produce a workable solution without requiring major alterations to any principal components or subsystems. Consequently, the final design of the vehicle was not always the most efficient, owing to the fact that each subsystem design was partially constrained by the previous work. This procedure was necessary at the time because the analysis process was extremely time-consuming and had to be started over with each significant alteration of the vehicle. With recent advances in the power and capacity of small computers, and the parallel development of powerful software in structural, thermal, and control system analysis, it is now possible to produce very detailed analyses of intermediate designs in a much shorter period of time. The subsystems can thus be designed concurrently, and alterations in the overall design can be quickly adopted into each analysis; the design becomes an iterative process in which it is much easier to experiment with new ideas, configurations, and components. Concurrent engineering has the potential to produce efficient, highly capable designs because the effect of one subsystem on another can be assessed in much more detail at a very early point in the program. The research program consisted of several tasks: scale a prototype telescope assembly to a 1 m aperture, develop a model of the telescope assembly by using finite element (FEM) codes that are available on site, determine structural deflections of the mirror surfaces due to the temperature variations, develop a prototype control system to maintain the proper shape of the optical elements, and most important of all, demonstrate the concurrent engineering approach with this example. In addition, the software used for the finite element models and thermal analysis was relatively new within the Program Development Office and had yet to be applied to systems this large or complex; understanding the software and modifying it for use with this project was also required. The I-DEAS software by Structural Dynamics Research Corporation (SDRC) was used to build the finite element models, and TMG, developed by Maya Heat Transfer Technologies, Ltd. (which runs as an I-DEAS module), was used for the thermal model calculations. All control system development was accomplished with MATRIXx by Integrated Systems, Inc.
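
    The thermal tolerance quoted above can be sanity-checked from the linear expansion relation dL = alpha * L * dT. The material value below is a generic low-expansion figure chosen for illustration, not a value from the study.

```python
# Quick check of the tolerance quoted in the abstract: how large a uniform
# temperature change produces 1 micron of expansion in a 1 m mirror?
alpha = 2.5e-6      # thermal expansion coefficient [1/K]; generic value, illustrative
L = 1.0             # mirror dimension [m], matching the 1 m aperture
dL_max = 1.0e-6     # allowable deformation [m]

dT_max = dL_max / (alpha * L)
print(f"allowable temperature change: {dT_max:.2f} K")   # ~0.4 K, well under 1 deg
```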

  3. QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.

    PubMed

    Jung, Sang-Kyu; Aleman-Meza, Boanerges; Riepe, Celeste; Zhong, Weiwei

    2014-01-01

    Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
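
    The WormLifespan approach of counting moving worms from two time-lapse images can be illustrated with frame differencing and connected-component labeling. QuantWorm itself is Java; this Python analogue with synthetic frames is a sketch of the idea, not the published implementation.

```python
import numpy as np
from scipy import ndimage

def count_moving_blobs(frame1, frame2, diff_threshold=25, min_pixels=30):
    """Count changed regions between two grayscale time-lapse frames."""
    diff = np.abs(frame1.astype(float) - frame2.astype(float))
    mask = diff > diff_threshold                  # pixels that changed between frames
    labels, n = ndimage.label(mask)               # connected components
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return int(np.sum(sizes >= min_pixels))       # ignore specks below worm size

# Synthetic demo: a blank plate with one "worm" that moves between frames
f1 = np.zeros((200, 200), dtype=np.uint8)
f2 = f1.copy()
f1[50:58, 40:60] = 200     # worm in frame 1
f2[60:68, 55:75] = 200     # same worm, displaced, in frame 2

# A moving worm leaves a blob at both its old and new position; the real
# software pairs these, while this sketch simply counts changed regions.
print(count_moving_blobs(f1, f2))   # -> 2 blobs for one displaced worm
```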

  4. Framework Support For Knowledge-Based Software Development

    NASA Astrophysics Data System (ADS)

    Huseth, Steve

    1988-03-01

    The advent of personal engineering workstations has brought substantial information processing power to the individual programmer. Advanced tools and environment capabilities supporting the software lifecycle are just beginning to become generally available. However, many of these tools are addressing only part of the software development problem by focusing on rapid construction of self-contained programs by a small group of talented engineers. Additional capabilities are required to support the development of large programming systems where a high degree of coordination and communication is required among large numbers of software engineers, hardware engineers, and managers. A major player in realizing these capabilities is the framework supporting the software development environment. In this paper we discuss our research toward a Knowledge-Based Software Assistant (KBSA) framework. We propose the development of an advanced framework containing a distributed knowledge base that can support the data representation needs of tools, provide environmental support for the formalization and control of the software development process, and offer a highly interactive and consistent user interface.

  5. Advanced Communications Technology Satellite high burst rate link evaluation terminal experiment control and monitor software maintenance manual, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1992-01-01

    The Experiment Control and Monitor (EC&M) software was developed at NASA Lewis Research Center to support the Advanced Communications Technology Satellite (ACTS) High Burst Rate Link Evaluation Terminal (HBR-LET). The HBR-LET is an experimenter's terminal to communicate with the ACTS for various investigations by government agencies, universities, and industry. The EC&M software is one segment of the Control and Performance Monitoring (C&PM) software system of the HBR-LET. The EC&M software allows users to initialize, control, and monitor the instrumentation within the HBR-LET using a predefined sequence of commands. Besides instrument control, the C&PM software system is also responsible for computer communication between the HBR-LET and the ACTS NASA Ground Station and for uplink power control of the HBR-LET to demonstrate power augmentation during rain fade events. The EC&M Software User's Guide, Version 1.0 (NASA-CR-189160) outlines the commands required to install and operate the EC&M software. Input and output file descriptions, operator commands, and error recovery procedures are discussed in the document. The EC&M Software Maintenance Manual, Version 1.0 (NASA-CR-189161) is a programmer's guide that describes current implementation of the EC&M software from a technical perspective. An overview of the EC&M software, computer algorithms, format representation, and computer hardware configuration are included in the manual.

  6. PScan 1.0: flexible software framework for polygon based multiphoton microscopy

    NASA Astrophysics Data System (ADS)

    Li, Yongxiao; Lee, Woei Ming

    2016-12-01

    Multiphoton laser scanning microscopes exhibit highly localized nonlinear optical excitation and are powerful instruments for in-vivo deep tissue imaging. Customized multiphoton microscopy has significantly superior performance for in-vivo imaging because of precise control over the scanning and detection system. To date, there have been several flexible software platforms catered to custom-built microscopy systems, e.g. ScanImage, HelioScan and MicroManager, that perform at imaging speeds of 30-100 fps. In this paper, we describe a flexible software framework for high speed imaging systems capable of operating from 5 fps to 1600 fps. The software is based on the MATLAB image processing toolbox. It has the capability to communicate directly with a high-performance imaging card (Matrox Solios eA/XA), thus retaining high speed acquisition. The program is also designed to communicate with LabVIEW and Fiji for instrument control and image processing. PScan 1.0 can handle high imaging rates and contains sufficient flexibility for users to adapt it to their high speed imaging systems.

  7. Verification and Validation in a Rapid Software Development Process

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Easterbrook, Steve M.

    1997-01-01

    The high cost of software production is driving development organizations to adopt more automated design and analysis methods such as rapid prototyping, computer-aided software engineering (CASE) tools, and high-level code generators. Even developers of safety-critical software systems have adopted many of these new methods while striving to achieve high levels of quality and reliability. While these new methods may enhance productivity and quality in many cases, we examine some of the risks involved in the use of new methods in safety-critical contexts. We examine a case study involving the use of a CASE tool that automatically generates code from high-level system designs. We show that while high-level testing on the system structure is highly desirable, significant risks exist in the automatically generated code and in re-validating releases of the generated code after subsequent design changes. We identify these risks and suggest process improvements that retain the advantages of rapid, automated development methods within the quality and reliability contexts of safety-critical projects.

  8. Software-Related Recalls of Health Information Technology and Other Medical Devices: Implications for FDA Regulation of Digital Health.

    PubMed

    Ronquillo, Jay G; Zuckerman, Diana M

    2017-09-01

    Policy Points: Medical software has become an increasingly critical component of health care, yet the regulation of these devices is inconsistent and controversial. No studies of medical devices and software assess the impact on patient safety of the FDA's current regulatory safeguards and new legislative changes to those standards. Our analysis quantifies the impact of software problems in regulated medical devices and indicates that current regulations are necessary but not sufficient for ensuring patient safety by identifying and eliminating dangerous defects in software currently on the market. New legislative changes will further deregulate health IT, reducing safeguards that facilitate the reporting and timely recall of flawed medical software that could harm patients. Medical software has become an increasingly critical component of health care, yet the regulatory landscape for digital health is inconsistent and controversial. To understand which policies might best protect patients, we examined the impact of the US Food and Drug Administration's (FDA's) regulatory safeguards on software-related technologies in recent years and the implications for newly passed legislative changes in regulatory policy. Using FDA databases, we identified all medical devices that were recalled from 2011 through 2015 primarily because of software defects. We counted all software-related recalls for each FDA risk category and evaluated each high-risk and moderate-risk recall of electronic medical records to determine the manufacturer, device classification, submission type, number of units, and product details. A total of 627 software devices (1.4 million units) were subject to recalls, with 12 of these devices (190,596 units) subject to the highest-risk recalls. Eleven of the devices recalled as high risk had entered the market through the FDA review process that does not require evidence of safety or effectiveness, and one device was completely exempt from regulatory review. The largest high-risk recall categories were anesthesiology and general hospital, with one each in cardiovascular and neurology. Five electronic medical record systems (9,347 units) were recalled for software defects classified as posing a moderate risk to patient safety. Software problems in medical devices are not rare and have the potential to negatively influence medical care. Premarket regulation has not captured all the software issues that could harm patients, evidenced by the potentially large number of patients exposed to software products later subject to high-risk and moderate-risk recalls. Provisions of the 21st Century Cures Act that became law in late 2016 will reduce safeguards further. Absent stronger regulations and implementation to create robust risk assessment and adverse event reporting, physicians and their patients are likely to be at risk from medical errors caused by software-related problems in medical devices. © 2017 Milbank Memorial Fund.

  9. Multiphase flow calculation software

    DOEpatents

    Fincke, James R.

    2003-04-15

    Multiphase flow calculation software and computer-readable media carrying computer executable instructions for calculating liquid and gas phase mass flow rates of high void fraction multiphase flows. The multiphase flow calculation software employs various given, or experimentally determined, parameters in conjunction with a plurality of pressure differentials of a multiphase flow, preferably supplied by a differential pressure flowmeter or the like, to determine liquid and gas phase mass flow rates of the high void fraction multiphase flows. Embodiments of the multiphase flow calculation software are suitable for use in a variety of applications, including real-time management and control of an object system.
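
    As a much-simplified illustration of computing mass flow from a pressure differential, the classical single-phase venturi relation is shown below. The patented software extends such relations to liquid and gas phase splits in high void fraction flows; this sketch, with illustrative numbers, does not reproduce that method.

```python
import math

def venturi_mass_flow(dp, rho, d_throat, d_pipe, cd=0.98):
    """Single-phase mass flow [kg/s] from a venturi pressure differential.

    Classical textbook relation; the patented software builds on this kind
    of differential-pressure measurement for multiphase flows.
    """
    beta = d_throat / d_pipe
    a_throat = math.pi * d_throat ** 2 / 4.0
    return cd * a_throat * math.sqrt(2.0 * rho * dp / (1.0 - beta ** 4))

# Illustrative numbers: water through a 50 mm / 100 mm venturi at 20 kPa
print(f"{venturi_mass_flow(dp=20e3, rho=998.0, d_throat=0.05, d_pipe=0.10):.2f} kg/s")
```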

  10. Critical Thinking Skills of Students through Mathematics Learning with ASSURE Model Assisted by Software Autograph

    NASA Astrophysics Data System (ADS)

    Kristianti, Y.; Prabawanto, S.; Suhendra, S.

    2017-09-01

    This study examines the critical thinking ability of students who learn mathematics with the ASSURE model assisted by Autograph software. The design of the study was experimental, with pre-test and post-test control groups. The experimental group received mathematics instruction with the ASSURE model assisted by Autograph software, and the control group received mathematics instruction with a conventional model. The data were obtained from critical thinking skills tests. The research was conducted at the junior high school level; the population comprised students of a junior high school in Subang Regency in the 2016/2017 academic year, and the sample consisted of two grade VIII classes at that school. The data were analysed quantitatively: the normalized gains of the two sample groups were compared using a one-way ANOVA test. The results show that mathematics learning with the ASSURE model assisted by Autograph software can improve the critical thinking ability of junior high school students, and that it is significantly better at improving critical thinking skills than the conventional model.
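
    The analysis described, normalized gain compared between groups with a one-way ANOVA, takes only a few lines. The scores below are invented for illustration; only the procedure mirrors the study.

```python
import numpy as np
from scipy import stats

def normalized_gain(pre, post, max_score=100.0):
    """Hake's normalized gain: fraction of the possible improvement achieved."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    return (post - pre) / (max_score - pre)

# Illustrative critical-thinking test scores for the two classes
pre_exp,  post_exp  = [40, 55, 35, 60, 45, 50], [75, 85, 70, 88, 78, 80]
pre_ctrl, post_ctrl = [42, 50, 38, 58, 47, 52], [60, 68, 55, 72, 62, 66]

g_exp = normalized_gain(pre_exp, post_exp)
g_ctrl = normalized_gain(pre_ctrl, post_ctrl)

f_stat, p_value = stats.f_oneway(g_exp, g_ctrl)   # one-way ANOVA on the gains
print(f"mean gain: ASSURE+Autograph {g_exp.mean():.2f}, conventional {g_ctrl.mean():.2f}")
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```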

  11. Secure software practices among Malaysian software practitioners: An exploratory study

    NASA Astrophysics Data System (ADS)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are gaining importance among software practitioners and researchers due to the rise of computer crimes in the software industry, and have become one of the determinant factors for producing high quality software. Even though their importance is recognized, their adoption in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was utilized for data collection, whilst statistical methods such as frequency, mean, and cross tabulation were used for data analysis. Outcomes from this study reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the software they produce.

  12. A Reusable and Adaptable Software Architecture for Embedded Space Flight System: The Core Flight Software System (CFS)

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan

    2005-01-01

    The contents include the following: high availability; hardware in a harsh environment; flight processors that vary widely due to power and weight constraints; software that must be remotely modifiable and still operate while changes are being made; many custom, one-of-a-kind interfaces for one-of-a-kind missions; sustaining engineering; and a high price of failure, tens to hundreds of millions of dollars.

  13. Towards an Open, Distributed Software Architecture for UxS Operations

    NASA Technical Reports Server (NTRS)

    Cross, Charles D.; Motter, Mark A.; Neilan, James H.; Qualls, Garry D.; Rothhaar, Paul M.; Tran, Loc; Trujillo, Anna C.; Allen, B. Danette

    2015-01-01

    To address the growing need to evaluate, test, and certify an ever expanding ecosystem of UxS platforms in preparation of cultural integration, NASA Langley Research Center's Autonomy Incubator (AI) has taken on the challenge of developing a software framework in which UxS platforms developed by third parties can be integrated into a single system which provides evaluation and testing, mission planning and operation, and out-of-the-box autonomy and data fusion capabilities. This software framework, named AEON (Autonomous Entity Operations Network), has two main goals. The first goal is the development of a cross-platform, extensible, onboard software system that provides autonomy at the mission execution and course-planning level, a highly configurable data fusion framework sensitive to the platform's available sensor hardware, and plug-and-play compatibility with a wide array of computer systems, sensors, software, and controls hardware. The second goal is the development of a ground control system that acts as a test-bed for integration of the proposed heterogeneous fleet, and allows for complex mission planning, tracking, and debugging capabilities. The ground control system should also be highly extensible and allow plug-and-play interoperability with third party software systems. In order to achieve these goals, this paper proposes an open, distributed software architecture which utilizes at its core the Data Distribution Service (DDS) standards, established by the Object Management Group (OMG), for inter-process communication and data flow. The design decisions proposed herein leverage the advantages of existing robotics software architectures and the DDS standards to develop software that is scalable, high-performance, fault tolerant, modular, and readily interoperable with external platforms and software.

  14. [Use of Adobe Photoshop software in medical criminology].

    PubMed

    Nikitin, S A; Demidov, I V

    2000-01-01

    This paper describes a method for the comparative analysis of various objects in practical medical criminology and for the production of high-quality photographs with the use of Adobe Photoshop software. The software options needed for expert evaluations are enumerated.

  15. Improved methods and resources for paramecium genomics: transcription units, gene annotation and gene expression.

    PubMed

    Arnaiz, Olivier; Van Dijk, Erwin; Bétermier, Mireille; Lhuillier-Akakpo, Maoussi; de Vanssay, Augustin; Duharcourt, Sandra; Sallet, Erika; Gouzy, Jérôme; Sperling, Linda

    2017-06-26

    The 15 sibling species of the Paramecium aurelia cryptic species complex emerged after a whole genome duplication that occurred tens of millions of years ago. Given extensive knowledge of the genetics and epigenetics of Paramecium acquired over the last century, this species complex offers a uniquely powerful system to investigate the consequences of whole genome duplication in a unicellular eukaryote as well as the genetic and epigenetic mechanisms that drive speciation. High quality Paramecium gene models are important for research using this system. The major aim of the work reported here was to build an improved gene annotation pipeline for the Paramecium lineage. We generated oriented RNA-Seq transcriptome data across the sexual process of autogamy for the model species Paramecium tetraurelia. We determined, for the first time in a ciliate, candidate P. tetraurelia transcription start sites using an adapted Cap-Seq protocol. We developed TrUC, multi-threaded Perl software that, in conjunction with TopHat mapping of RNA-Seq data to a reference genome, predicts transcription units for the annotation pipeline. We used EuGene software to combine annotation evidence. The high quality gene structural annotations obtained for P. tetraurelia were used as evidence to improve published annotations for 3 other Paramecium species. The RNA-Seq data were also used for differential gene expression analysis, providing a gene expression atlas that is more sensitive than the previously established microarray resource. We have developed a gene annotation pipeline tailored for the compact genomes and tiny introns of Paramecium species. A novel component of this pipeline, TrUC, predicts transcription units using Cap-Seq and oriented RNA-Seq data. TrUC could prove useful beyond Paramecium, especially in the case of high gene density. Accurate predictions of 3' and 5' UTR will be particularly valuable for studies of gene expression (e.g. nucleosome positioning, identification of cis regulatory motifs). The P. tetraurelia improved transcriptome resource, gene annotations for P. tetraurelia, P. biaurelia, P. sexaurelia and P. caudatum, and the Paramecium-trained EuGene configuration are available through ParameciumDB (http://paramecium.i2bc.paris-saclay.fr). TrUC software is freely distributed under a GNU GPL v3 licence (https://github.com/oarnaiz/TrUC).
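
    The core of calling transcription units from oriented alignments is merging same-strand read intervals separated by small gaps. The sketch below illustrates that idea with hypothetical coordinates; it is not the actual TrUC algorithm.

```python
# Sketch of the interval-merging idea behind calling transcription units
# from oriented RNA-Seq alignments; not the actual TrUC algorithm.

def call_units(reads, max_gap=50):
    """Merge (start, end, strand) read alignments into transcription units.

    Same-strand reads closer than max_gap are joined, which suits compact
    genomes with tiny introns such as Paramecium.
    """
    units = []
    for strand in ("+", "-"):
        current = None
        for s, e in sorted((s, e) for s, e, st in reads if st == strand):
            if current and s - current[1] <= max_gap:
                current[1] = max(current[1], e)      # extend the open unit
            else:
                if current:
                    units.append((current[0], current[1], strand))
                current = [s, e]
        if current:
            units.append((current[0], current[1], strand))
    return units

reads = [(100, 250, "+"), (260, 400, "+"), (900, 1000, "+"), (300, 450, "-")]
print(call_units(reads))
# -> [(100, 400, '+'), (900, 1000, '+'), (300, 450, '-')]
```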

  16. Software Management Environment (SME) concepts and architecture, revision 1

    NASA Technical Reports Server (NTRS)

    Hendrick, Robert; Kistler, David; Valett, Jon

    1992-01-01

    This document presents the concepts and architecture of the Software Management Environment (SME), developed for the Software Engineering Branch of the Flight Dynamics Division (FDD) of GSFC. The SME provides an integrated set of experience-based management tools that can assist software development managers in managing and planning flight dynamics software development projects. This document provides a high-level description of the types of information required to implement such an automated management tool.

  17. NASA software specification and evaluation system: Software verification/validation techniques

    NASA Technical Reports Server (NTRS)

    1977-01-01

    NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.

  18. Low-cost mm-wave Doppler/FMCW transceivers for ground surveillance applications

    NASA Astrophysics Data System (ADS)

    Hansen, H. J.; Lindop, R. W.; Majstorovic, D.

    2005-12-01

    A 35 GHz Doppler CW/FMCW transceiver (equivalent radiated power ERP = 30 dBm) has been assembled and its operation is described. Both the instantaneous beat signals (relating to range in FMCW mode) and the Doppler signals (relating to targets moving at ~1.5 m/s) exhibit audio frequencies. Consequently, the radar processing is provided by a laptop PC using its built-in audio subsystem with appropriate MathWorks software. The implications of radar-on-chip developments are addressed.
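    A quick back-of-envelope check, using only values from the abstract, shows why the Doppler returns fall in the audio band and can be digitized by a laptop sound card (f_d = 2v/λ with λ = c/f):

    ```python
    # Why the Doppler signal of a slow target at 35 GHz is an audio frequency:
    # f_d = 2*v/lambda, with lambda = c/f_carrier. Values follow the abstract.
    c = 3.0e8                     # speed of light, m/s
    f_carrier = 35e9              # 35 GHz transceiver
    v = 1.5                       # target radial speed, m/s (walking pace)

    wavelength = c / f_carrier            # ~8.6 mm
    f_doppler = 2 * v / wavelength        # ~350 Hz -> easily sampled by a sound card
    print(f"lambda = {wavelength*1e3:.2f} mm, f_Doppler = {f_doppler:.0f} Hz")
    ```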

  19. Scaling Task Management in Space and Time: Reducing User Overhead in Ubiquitous-Computing Environments

    DTIC Science & Technology

    2005-03-28

    consequently users are torn between taking advantage of increasingly pervasive computing systems, and the price (in attention and skill) that they have to... advantage of the surrounding computing environments; and (c) that it is usable by non-experts. Second, from a software architect’s perspective, we...take full advantage of the computing systems accessible to them, much as they take advantage of the furniture in each physical space. In the example

  20. A study of discrete control signal fault conditions in the shuttle DPS

    NASA Technical Reports Server (NTRS)

    Reddi, S. S.; Retter, C. T.

    1976-01-01

    An analysis of the effects of discrete failures on the data processing subsystem is presented. A functional description of each discrete, together with a list of the software modules that use it, is included. A qualitative description of the consequences that may ensue from discrete failures is given, followed by a probabilistic reliability analysis of the data processing subsystem. Based on the investigation conducted, recommendations were made to improve the reliability of the subsystem.

  1. CrossTalk. The Journal of Defense Software Engineering. Volume 24, Number 5, Sep/Oct 2011

    DTIC Science & Technology

    2011-09-01

    Reduced security risks to data and information systems • Improved compliance • Reduction in the consequences of data breaches . In turn, these...applications do not generate the most useful data in the first place [1]. So many major data breaches reportedly occur without the knowledge of their...the need for such information. According to the Verizon Business 2010 Data Breach Investiga- tions Report [6], a large percentage of total breaches

  2. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    DOT National Transportation Integrated Search

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence high-probability (VLH) events like tra...

  3. Students' different understandings of class diagrams

    NASA Astrophysics Data System (ADS)

    Boustedt, Jonas

    2012-03-01

    The software industry needs well-trained software designers, and one important aspect of software design is the ability to model software designs visually and understand what visual models represent. However, previous research indicates that software design is a difficult task for many students. This article reports empirical findings from a phenomenographic investigation of how students understand class diagrams, Unified Modeling Language (UML) symbols, and relations to object-oriented (OO) concepts. The informants were 20 Computer Science students from four different universities in Sweden. The results show qualitatively different ways to understand and describe UML class diagrams and the "diamond symbols" representing aggregation and composition. The purpose of class diagrams was understood in varied ways, from describing them as documentation to a more advanced view related to communication. The descriptions of class diagrams varied from seeing them as a specification of classes to a more advanced view in which they were described as showing hierarchic structures of classes and relations. The diamond symbols were seen as "relations", while a more advanced way was seeing the white and the black diamonds as different symbols for aggregation and composition. As a consequence of the results, it is recommended that UML be adopted in courses. It is briefly indicated how the phenomenographic results, in combination with variation theory, can be used by teachers to enhance students' possibilities to reach an advanced understanding of phenomena related to UML class diagrams. Moreover, it is recommended that teachers put more effort into assessing skills in the proper usage of the basic symbols and models, and that students be provided with opportunities to practise collaborative design, e.g. using whiteboards.

  4. Advanced communications technology satellite high burst rate link evaluation terminal power control and rain fade software test plan, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1993-01-01

    The Power Control and Rain Fade Software was developed at the NASA Lewis Research Center to support the Advanced Communications Technology Satellite High Burst Rate Link Evaluation Terminal (ACTS HBR-LET). The HBR-LET is an experimenter's terminal used to communicate with the ACTS for various experiments by government, university, and industry agencies. The Power Control and Rain Fade Software is one segment of the Control and Performance Monitor (C&PM) Software system of the HBR-LET. The Power Control and Rain Fade Software automatically controls the LET uplink power to compensate for signal fades. Besides power augmentation, the C&PM Software system is also responsible for instrument control during HBR-LET experiments, control of the Intermediate Frequency Switch Matrix on board the ACTS to yield a desired path through the spacecraft payload, and data display. The Power Control and Rain Fade Software User's Guide, Version 1.0 outlines the commands and procedures to install and operate the Power Control and Rain Fade Software. The Power Control and Rain Fade Software Maintenance Manual, Version 1.0 is a programmer's guide to the Power Control and Rain Fade Software. This manual details the current implementation of the software from a technical perspective. Included are an overview of the Power Control and Rain Fade Software, computer algorithms, format representations, and the computer hardware configuration. The Power Control and Rain Fade Test Plan provides a step-by-step procedure to verify the operation of the software using a predetermined signal fade event. The Test Plan also provides a means to demonstrate the capability of the software.

  5. Advanced communications technology satellite high burst rate link evaluation terminal communication protocol software user's guide, version 1.0

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    1993-01-01

    The Communication Protocol Software was developed at the NASA Lewis Research Center to support the Advanced Communications Technology Satellite High Burst Rate Link Evaluation Terminal (ACTS HBR-LET). The HBR-LET is an experimenter's terminal used to communicate with the ACTS for various experiments by government, university, and industry agencies. The Communication Protocol Software is one segment of the Control and Performance Monitor (C&PM) Software system of the HBR-LET. The Communication Protocol Software allows users to control and configure the Intermediate Frequency Switch Matrix (IFSM) on board the ACTS to yield a desired path through the spacecraft payload. Besides IFSM control, the C&PM Software System is also responsible for instrument control during HBR-LET experiments, uplink power control of the HBR-LET to demonstrate power augmentation during signal fade events, and data display. The Communication Protocol Software User's Guide, Version 1.0 (NASA CR-189162) outlines the commands and procedures to install and operate the Communication Protocol Software. Configuration files used to control the IFSM, operator commands, and error recovery procedures are discussed. The Communication Protocol Software Maintenance Manual, Version 1.0 (NASA CR-189163, to be published) is a programmer's guide to the Communication Protocol Software. This manual details the current implementation of the software from a technical perspective. Included are an overview of the Communication Protocol Software, computer algorithms, format representations, and the computer hardware configuration. The Communication Protocol Software Test Plan (NASA CR-189164, to be published) provides a step-by-step procedure to verify the operation of the software. Included in the Test Plan are command transmission, telemetry reception, error detection, and error recovery procedures.

  6. Orthographic learning and the role of text-to-speech software in Dutch disabled readers.

    PubMed

    Staels, Eva; Van den Broeck, Wim

    2015-01-01

    In this study, we examined whether orthographic learning can be demonstrated in disabled readers learning to read in a transparent orthography (Dutch). In addition, we tested the effect of the use of text-to-speech software, a new form of direct instruction, on orthographic learning. Both research goals were investigated by replicating Share's self-teaching paradigm. A total of 65 disabled Dutch readers were asked to read eight stories containing embedded homophonic pseudoword targets (e.g., Blot/Blod), with or without the support of text-to-speech software. The amount of orthographic learning was assessed 3 or 7 days later by three measures of orthographic learning. First, the results supported the presence of orthographic learning during independent silent reading by demonstrating that target spellings were correctly identified more often, named more quickly, and spelled more accurately than their homophone foils. Our results support the hypothesis that all readers, even poor readers of transparent orthographies, are capable of developing word-specific knowledge. Second, a negative effect of text-to-speech software on orthographic learning was demonstrated in this study. This negative effect was interpreted as the consequence of passively listening to the auditory presentation of the text. We clarify how these results can be interpreted within current theoretical accounts of orthographic learning and briefly discuss implications for remedial interventions. © Hammill Institute on Disabilities 2013.

  7. Elements of strategic capability for software outsourcing enterprises based on the resource

    NASA Astrophysics Data System (ADS)

    Shi, Wengeng

    2011-10-01

    Software outsourcing enterprises are an emerging class of high-tech enterprise, and both their number and their rate of growth have been remarkable. Beyond the preferential policies granted to Chinese software outsourcing, a software outsourcing business must be able to upgrade its capabilities, a characteristic that software companies in general have not possessed. Viewed from the resource-based theory of the firm, we analyze the capabilities and resources of software outsourcing companies that are rare, valuable, and difficult to imitate, and we attempt to give an initial framework for theoretical analysis on this basis.

  8. Complexity reduction in the H.264/AVC using highly adaptive fast mode decision based on macroblock motion activity

    NASA Astrophysics Data System (ADS)

    Abdellah, Skoudarli; Mokhtar, Nibouche; Amina, Serir

    2015-11-01

    The H.264/AVC video coding standard is used in a wide range of applications, from video conferencing to high-definition television, owing to its high compression efficiency. This efficiency is mainly acquired from the newly allowed prediction schemes, including variable block modes. However, these schemes require high complexity to select the optimal mode. Consequently, complexity reduction in the H.264/AVC encoder has recently become a very challenging task in the video compression domain, especially when implementing the encoder in real-time applications. Fast mode decision algorithms play an important role in reducing the overall complexity of the encoder. In this paper, we propose an adaptive fast inter-mode algorithm based on motion activity, temporal stationarity, and spatial homogeneity. This algorithm predicts the motion activity of the current macroblock from its neighboring blocks and identifies temporally stationary regions and spatially homogeneous regions using adaptive threshold values based on video content features. Extensive experimental work has been done in the High profile, and the results show that the proposed source-coding algorithm effectively reduces the computational complexity by 53.18% on average compared with the reference software encoder, while maintaining the high coding efficiency of H.264/AVC, incurring only a 0.097 dB loss in total peak signal-to-noise ratio and a 0.228% increase in the total bit rate.
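    The sketch below illustrates the flavor of such a threshold-based fast inter-mode decision: motion activity predicted from neighboring macroblocks restricts which block modes are even evaluated. The thresholds and candidate mode sets are hypothetical, not the paper's tuned values.

    ```python
    # Illustrative threshold-based fast inter-mode decision: neighbour motion
    # activity steers which H.264 block modes are tried at all. Thresholds and
    # mode sets are hypothetical, for illustration only.
    def candidate_modes(neighbor_mvs, t_stationary=0.25, t_low=2.0):
        """neighbor_mvs: motion vectors (dx, dy) of the left/top/top-right MBs."""
        activity = sum((dx*dx + dy*dy) ** 0.5 for dx, dy in neighbor_mvs) / len(neighbor_mvs)
        if activity < t_stationary:          # temporally stationary region
            return ["SKIP", "16x16"]
        if activity < t_low:                 # homogeneous, slow motion
            return ["SKIP", "16x16", "16x8", "8x16"]
        return ["SKIP", "16x16", "16x8", "8x16", "8x8", "8x4", "4x8", "4x4"]

    print(candidate_modes([(0.1, 0.0), (0.0, 0.1), (0.2, 0.1)]))  # ['SKIP', '16x16']
    ```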

  9. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas that limit our ability to transfer high-assurance software engineering tools into practice.

  10. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
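    A minimal example of this kind of model, built with the pgmpy library: the network structure mirrors the paper's hypothesized drivers (team skill, process maturity, problem complexity) feeding product suitability, but every probability value below is invented for illustration.

    ```python
    # Small Bayesian network in the spirit of the paper's model, using pgmpy.
    # Structure follows the hypothesized factors; all CPD numbers are invented.
    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    model = BayesianNetwork([("Skill", "Suitability"),
                             ("Maturity", "Suitability"),
                             ("Complexity", "Suitability")])

    prior = lambda name: TabularCPD(name, 2, [[0.5], [0.5]])       # 0=low, 1=high
    cpd_suit = TabularCPD(
        "Suitability", 2,
        # P(Suitability | Skill, Maturity, Complexity); columns run over the
        # 8 parent combinations; row 0 = unsuitable, row 1 = suitable.
        [[0.9, 0.6, 0.8, 0.4, 0.7, 0.3, 0.5, 0.1],
         [0.1, 0.4, 0.2, 0.6, 0.3, 0.7, 0.5, 0.9]],
        evidence=["Skill", "Maturity", "Complexity"], evidence_card=[2, 2, 2])

    model.add_cpds(prior("Skill"), prior("Maturity"), prior("Complexity"), cpd_suit)
    assert model.check_model()

    # Posterior suitability given a skilled team facing a complex problem:
    posterior = VariableElimination(model).query(
        ["Suitability"], evidence={"Skill": 1, "Complexity": 1})
    print(posterior)
    ```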

  11. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    NASA Astrophysics Data System (ADS)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The prediction of their behavior is carried out by means of computational models whose basic building blocks are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs, it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively exploiting the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES: [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I. and Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software to Problems of Elasticity", Geofísica Internacional, 2015 (in press).

  12. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    NASA Astrophysics Data System (ADS)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  13. A software control system for the ACTS high-burst-rate link evaluation terminal

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Daugherty, Elaine S.

    1991-01-01

    Control and performance monitoring of NASA's High Burst Rate Link Evaluation Terminal (HBR-LET) is accomplished by using several software control modules. Different software modules are responsible for controlling remote radio frequency (RF) instrumentation, supporting communication between a host and a remote computer, controlling the output power of the Link Evaluation Terminal and data display. Remote commanding of microwave RF instrumentation and the LET digital ground terminal allows computer control of various experiments, including bit error rate measurements. Computer communication allows system operators to transmit and receive from the Advanced Communications Technology Satellite (ACTS). Finally, the output power control software dynamically controls the uplink output power of the terminal to compensate for signal loss due to rain fade. Included is a discussion of each software module and its applications.

  14. Fully automated measurement of field-dependent AMS using MFK1-FA Kappabridge equipped with 3D rotator

    NASA Astrophysics Data System (ADS)

    Chadima, Martin; Studynka, Jan

    2013-04-01

    Low-field magnetic susceptibility is field-independent by definition in paramagnetic and diamagnetic minerals, and it is also field-independent in pure magnetite. In pyrrhotite, hematite, and high-Ti titanomagnetite, on the other hand, it may be clearly field-dependent. Consequently, the field-dependent AMS enables the magnetic fabric of the latter group of minerals to be separated from the whole-rock AMS. The methods for determining the field-dependent AMS consist of separate measurements of each specimen in several fields within the Rayleigh Law range and subsequent processing in which the field-independent and field-dependent AMS components are calculated. The disadvantage of this technique is that each specimen must be measured several times, which is relatively laborious and time-consuming. Recently, a new 3D rotator was developed for the MFK1-FA Kappabridge; it rotates the specimen simultaneously about two axes with different velocities. The measurement is fully automated in such a way that, once the specimen is inserted into the rotator, no additional manipulation is required to measure the full AMS tensor. Consequently, the 3D rotator enables the AMS tensors to be measured at the pre-set field intensities without any operator interference. The whole procedure is controlled by the newly developed Safyr5 software; once the measurements are finished, the acquired data are immediately processed and can be visualized in a standard way.
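    Within the Rayleigh Law range, susceptibility varies approximately linearly with the applied field, k(H) ≈ k0 + k1·H, so measurements at several field intensities let a linear fit separate the field-independent component (k0) from the field-dependent one (k1·H). The sketch below does this per tensor element with invented numbers; it illustrates the principle, not the Safyr5 implementation.

    ```python
    # Separating field-independent and field-dependent susceptibility components:
    # within the Rayleigh Law range k(H) ~ k0 + k1*H, so a linear fit across the
    # measurement fields recovers k0 and k1 for each tensor element.
    import numpy as np

    H = np.array([100.0, 200.0, 400.0, 700.0])        # peak fields, A/m
    # six independent elements of the symmetric AMS tensor at each field
    # (units 1e-3 SI; numbers invented for illustration):
    k = np.array([[5.10, 4.90, 4.70, 0.10, 0.05, 0.02],
                  [5.22, 4.98, 4.74, 0.12, 0.05, 0.02],
                  [5.46, 5.14, 4.82, 0.16, 0.06, 0.03],
                  [5.82, 5.38, 4.94, 0.22, 0.07, 0.04]])

    k1, k0 = np.polyfit(H, k, 1)       # slope and intercept for each element
    print("field-independent tensor elements:", k0)
    print("field-dependent slope per A/m:   ", k1)
    ```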

  15. The Future of Statistical Software. Proceedings of a Forum--Panel on Guidelines for Statistical Software (Washington, D.C., February 22, 1991).

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    The Panel on Guidelines for Statistical Software was organized in 1990 to document, assess, and prioritize problem areas regarding quality and reliability of statistical software; present prototype guidelines in high priority areas; and make recommendations for further research and discussion. This document provides the following papers presented…

  16. Center for Efficient Exascale Discretizations Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolev, Tzanio; Dobrev, Veselin; Tomov, Vladimir

    The CEED Software suite is a collection of generally applicable software tools focusing on the following computational motifs: PDE discretizations on unstructured meshes, high-order finite element and spectral element methods, and unstructured adaptive mesh refinement. All of this software is being developed as part of CEED, a co-design Center for Efficient Exascale Discretizations, within DOE's Exascale Computing Project (ECP) program.

  17. Evaluation of the benefits of assistive reading software: perceptions of high school students with learning disabilities.

    PubMed

    Chiang, Hsin-Yu; Liu, Chien-Hsiou

    2011-01-01

    Using assistive reading software may be a cost-effective way to increase the opportunity for independent learning in students with learning disabilities. However, the effectiveness and perception of assistive reading software have seldom been explored in English-as-a-second-language students with learning disabilities. This research was designed to explore the perception and effect of using assistive reading software in high school students with dyslexia (one subtype of learning disability) to improve their English reading and other school performance. The Kurzweil 3000 software was used as the intervention tool in this study. Fifteen students with learning disabilities were recruited, and instruction in the use of the Kurzweil 3000 was given. After 2 weeks, when they had become familiar with the use of the Kurzweil 3000, interviews were used to determine the perception and potential benefits of using the software. The results suggested that the Kurzweil 3000 had an immediate impact on students' English word recognition. The students reported that the software made reading, writing, spelling, and pronouncing easier. They also comprehended more during their English class. Further study is needed to determine under which conditions certain hardware/software might be helpful for individuals with special learning needs.

  18. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations, which were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of the Willmott d-index (d_will = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
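    The two agreement statistics quoted are standard and straightforward to compute; the sketch below shows their definitions on invented observed/predicted pairs (relative error, and Willmott's index of agreement d = 1 − Σ(P−O)² / Σ(|P−Ō| + |O−Ō|)²):

    ```python
    # Computing the two quoted fit statistics: mean relative error and
    # Willmott's index of agreement d. Observed/predicted values are invented.
    import numpy as np

    obs = np.array([12.1, 15.4, 18.0, 21.3, 24.9])    # experimental values
    pred = np.array([11.8, 15.9, 17.6, 21.9, 24.2])   # model-predicted values

    re = np.mean(np.abs(pred - obs) / obs)            # mean relative error
    o_bar = obs.mean()
    d_will = 1 - np.sum((pred - obs) ** 2) / np.sum(
        (np.abs(pred - o_bar) + np.abs(obs - o_bar)) ** 2)

    print(f"RE = {re:.3f}, Willmott d = {d_will:.3f}")
    ```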

  19. Scalable geocomputation: evolving an environmental model building platform from single-core to supercomputers

    NASA Astrophysics Data System (ADS)

    Schmitz, Oliver; de Jong, Kor; Karssenberg, Derek

    2017-04-01

    There is an increasing demand to run environmental models at big scales: simulations over large areas at high resolution. The heterogeneity of available computing hardware, such as multi-core CPUs, GPUs, or supercomputers, potentially provides significant computing power to fulfil this demand. However, this requires detailed knowledge of the underlying hardware, of parallel algorithm design, and of the implementation thereof in an efficient system programming language. Domain scientists such as hydrologists or ecologists often lack this specific software engineering knowledge; their emphasis is (and should be) on the exploratory building and analysis of simulation models. As a result, models constructed by domain specialists mostly do not take full advantage of the available hardware. A promising solution is to separate the model building activity from software engineering by offering domain specialists a model building framework with pre-programmed building blocks that they combine to construct a model. The model building framework, consequently, needs built-in capabilities to make full use of the available hardware. Developing such a framework that provides understandable code for domain scientists while being runtime efficient poses several challenges for the framework's developers. For example, optimisations can be performed on individual operations or on the whole model, and tasks need to be generated for a well-balanced execution without explicitly knowing the complexity of the domain problem provided by the modeller. Ideally, a modelling framework supports the optimal use of available hardware whichever combination of model building blocks scientists use. We present our ongoing work on developing parallel algorithms for spatio-temporal modelling and demonstrate 1) PCRaster, an environmental software framework (http://www.pcraster.eu) providing spatio-temporal model building blocks, and 2) the parallelisation of about 50 of these building blocks using the new Fern library (https://github.com/geoneric/fern/), an independent generic raster processing library. Fern is a highly generic software library, and its algorithms can be configured according to the configuration of a modelling framework. With manageable programming effort (e.g. matching data types between the programming and domain languages) we created a binding between Fern and PCRaster. The resulting PCRaster Python multicore module can be used to execute existing PCRaster models without any changes to the model code. We show initial results on synthetic and geoscientific models indicating significant runtime improvements provided by parallel local and focal operations. We further outline challenges in improving the remaining algorithms, such as flow operations over digital elevation maps, and further potential improvements such as enhancing disk I/O.

  20. Topographic Structure from Motion

    NASA Astrophysics Data System (ADS)

    Fonstad, M. A.; Dietrich, J. T.; Courville, B. C.; Jensen, J.; Carbonneau, P.

    2011-12-01

    The production of high-resolution topographic datasets is of increasing concern and application throughout the geomorphic sciences, and river science is no exception. Consequently, a wide range of topographic measurement methods have evolved. Despite the range of available methods, the production of high resolution, high quality digital elevation models (DEMs) generally requires a significant investment in personnel time, hardware, and/or software. However, image-based methods such as digital photogrammetry have steadily been decreasing in cost. Initially developed for rapid, inexpensive, and easy three-dimensional surveys of buildings or small objects, the "structure from motion" (SfM) photogrammetric approach is a purely image-based method that could deliver a step-change if transferred to river remote sensing; it requires very little training and is extremely inexpensive. Using the online SfM program Microsoft Photosynth, we have created high-resolution digital elevation models of rivers from ordinary photographs produced by a multi-step workflow that takes advantage of free and open source software. This process reconstructs real-world scenes with SfM algorithms based on the derived positions of the photographs in three-dimensional space. One of the products of the SfM process is a three-dimensional point cloud of features present in the input photographs. This point cloud can be georeferenced from a small number of ground control points collected via GPS in the field. The georeferenced point cloud can then be used to create a variety of digital elevation model products. Among several study sites, we examine the applicability of SfM on the Pedernales River in Texas (USA), where several hundred images taken from a hand-held helikite are used to produce DEMs of the fluvial topographic environment. This test shows that SfM and low-altitude platforms can produce point clouds with point densities considerably better than airborne LiDAR, with horizontal and vertical precision in the centimeter range, and with very low capital and labor costs and low expertise levels. Advanced structure from motion software (such as Bundler and OpenSynther) is currently under development and should increase the density of topographic points to rival that of terrestrial laser scanning when using images shot from low-altitude platforms such as helikites, poles, remote-controlled aircraft and rotorcraft, and low-flying manned aircraft. Clearly, the development of this set of inexpensive and low-required-expertise tools has the potential to fundamentally shift the production of digital fluvial topography from a capital-intensive enterprise of a small number of researchers to a low-cost exercise for many river researchers.
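    Georeferencing an SfM point cloud from ground control points amounts to estimating a seven-parameter similarity transform (scale, rotation, translation) between model space and world coordinates. A compact least-squares solution is the Umeyama/Procrustes method, sketched below with invented control points; this illustrates the step in general, not the specific workflow used in the study.

    ```python
    # Estimating the similarity transform that georeferences an SfM point cloud
    # from ground control points (Umeyama 1991). Control points are invented.
    import numpy as np

    def similarity_transform(src, dst):
        """Return s, R, t minimizing ||s*(R @ src_i) + t - dst_i||^2."""
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        A, B = src - mu_s, dst - mu_d
        U, S, Vt = np.linalg.svd(B.T @ A / len(src))
        D = np.eye(3)
        if np.linalg.det(U) * np.linalg.det(Vt) < 0:   # guard against reflection
            D[2, 2] = -1.0
        R = U @ D @ Vt
        var_src = (A ** 2).sum() / len(src)
        s = np.trace(np.diag(S) @ D) / var_src
        t = mu_d - s * R @ mu_s
        return s, R, t

    rng = np.random.default_rng(0)
    src = rng.normal(size=(6, 3))                       # model-space GCPs
    R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
    dst = 4.2 * src @ R_true.T + np.array([500.0, 120.0, 30.0])  # world GCPs

    s, R, t = similarity_transform(src, dst)
    print(round(float(s), 3), np.round(t, 2))           # recovers 4.2 and the offset
    ```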

  1. A Design and Development of Multi-Purpose CCD Camera System with Thermoelectric Cooling: Software

    NASA Astrophysics Data System (ADS)

    Oh, S. H.; Kang, Y. W.; Byun, Y. I.

    2007-12-01

    We present a software which we developed for the multi-purpose CCD camera. This software can be used on the all 3 types of CCD - KAF-0401E (768×512), KAF-1602E (15367times;1024), KAF-3200E (2184×1472) made in KODAK Co.. For the efficient CCD camera control, the software is operated with two independent processes of the CCD control program and the temperature/shutter operation program. This software is designed to fully automatic operation as well as manually operation under LINUX system, and is controled by LINUX user signal procedure. We plan to use this software for all sky survey system and also night sky monitoring and sky observation. As our results, the read-out time of each CCD are about 15sec, 64sec, 134sec for KAF-0401E, KAF-1602E, KAF-3200E., because these time are limited by the data transmission speed of parallel port. For larger format CCD, the data transmission is required more high speed. we are considering this control software to one using USB port for high speed data transmission.

  2. Water Security Toolkit User Manual: Version 1.3

    EPA Pesticide Factsheets

    The Water Security Toolkit (WST) is a suite of tools that helps provide the information necessary to make good decisions resulting in the minimization of further human exposure to contaminants and the maximization of the effectiveness of intervention strategies. WST assists in the evaluation of multiple response actions in order to select the most beneficial consequence management strategy. It includes hydraulic and water quality modeling software and optimization methodologies to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to confirm contamination or cleanup, and (6) valves to close in order to isolate contaminated areas of the network.

  3. Efficiency of jet grout columns and sand-recycled material mixtures for mitigating liquefaction damage

    NASA Astrophysics Data System (ADS)

    Kerem Ertek, M.; Demir, Gökhan; Köktan, Utku

    2017-04-01

    Liquefaction is an important seismic phenomenon that has to be assessed, and it is consequently essential to take measures to reduce the related hazards. There are several ways to assess liquefaction potential analytically, and some constitutive models implemented in FEM software packages represent the cyclic behaviour of sand, making it possible to observe the shear strain or excess pore pressure ratio, which serve as measures for judging liquefaction occurrence. According to various studies in the literature, post-earthquake inspections show that measures in terms of grouting, piled rafts, and sand mixtures with different non-liquefiable materials reduce liquefaction-related damage. This paper aims to provide brief information about the effectiveness of jet-grout columns and recycled material-sand mixtures against liquefaction, with the help of numerical analyses performed with MIDAS GTS NX software with regard to the generation of shear strains. Key words: liquefaction, numerical analyses, jet-grout, sand mixtures

  4. beta-Aminoalcohols as Potential Reactivators of Aged Sarin-/Soman-Inhibited Acetylcholinesterase

    DTIC Science & Technology

    2017-02-08

    This approach includes high - quality quantum mechanical/molecular mechanical calcula- tions, providing reliable reactivation steps and energetics...I. V. Khavrutskii Department of Defense Biotechnology High Performance Computing Software Applications Institute Telemedicine and Advanced...b] Dr. A. Wallqvist Department of Defense Biotechnology High Performance Computing Software Applications Institute Telemedicine and Advanced

  5. OSIRIX: open source multimodality image navigation software

    NASA Astrophysics Data System (ADS)

    Rosset, Antoine; Pysher, Lance; Spadola, Luca; Ratib, Osman

    2005-04-01

    The goal of our project is to develop a completely new software platform that allows users to efficiently and conveniently navigate through large sets of multidimensional data without the need for high-end, expensive hardware or software. We also elected to build our system on new open source software libraries, allowing other institutions and developers to contribute to this project. OsiriX is a free and open-source imaging software package designed to manipulate and visualize large sets of medical images: http://homepage.mac.com/rossetantoine/osirix/

  6. Project based, Collaborative, Algorithmic Robotics for High School Students: Programming Self Driving Race Cars at MIT

    DTIC Science & Technology

    2017-02-19

    software systems: the students design and build robotics software towards real-world applications, without being distracted by hardware issues; (ii) it...high school students require the students to focus on building and integrating the hardware that make up the robot, at the expense of designing and...robotics programs focus on the mechanics; as a result, they do not have room for students to design and implement relatively complex software systems, as

  7. A Scheduling Algorithm for Computational Grids that Minimizes Centralized Processing in Genome Assembly of Next-Generation Sequencing Data

    PubMed Central

    Lima, Jakelyne; Cerdeira, Louise Teixeira; Bol, Erick; Schneider, Maria Paula Cruz; Silva, Artur; Azevedo, Vasco; Abelém, Antônio Jorge Gomes

    2012-01-01

    Improvements in genome sequencing techniques have resulted in the generation of huge volumes of data. As a consequence of this progress, the genome assembly stage demands even more computational power, since the incoming sequence files contain large amounts of data. To speed up the process, it is often necessary to distribute the workload among a group of machines. However, this requires hardware and software solutions specially configured for this purpose. Grid computing tries to simplify this process of aggregating resources, but does not always offer the best possible performance, due to the heterogeneity and decentralized management of its resources. Thus, it is necessary to develop software that takes these peculiarities into account. To achieve this purpose, we developed an algorithm that adapts the de novo assembly software ABySS to operate efficiently in grids. We ran ABySS with and without the algorithm we developed in the grid simulator SimGrid. Tests showed that our algorithm is viable, flexible, and scalable even in a heterogeneous environment, and that it improved genome assembly time in computational grids without changing the assembly quality.
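    For illustration of the underlying scheduling problem, the sketch below implements a classic "minimum completion time" greedy heuristic for heterogeneous nodes: each task goes to the node that would finish it earliest given the node's speed and current load. This is a generic textbook heuristic with invented numbers, not the ABySS-specific algorithm developed in the paper.

    ```python
    # Greedy minimum-completion-time scheduling on heterogeneous grid nodes.
    # Largest tasks are placed first (LPT ordering); each task is assigned to
    # the node that finishes it earliest given node speed and current load.
    def schedule(task_sizes, node_speeds):
        """task_sizes: work units per task; node_speeds: units/s per node."""
        free_at = [0.0] * len(node_speeds)        # time each node becomes free
        assignment = []
        for size in sorted(task_sizes, reverse=True):
            finish = [f + size / s for f, s in zip(free_at, node_speeds)]
            node = min(range(len(node_speeds)), key=lambda i: finish[i])
            free_at[node] = finish[node]
            assignment.append((size, node, free_at[node]))
        return assignment

    plan = schedule([40, 10, 25, 60, 5], node_speeds=[1.0, 2.5, 0.5])
    for size, node, t_done in plan:
        print(f"task({size:>2} units) -> node {node}, done at t={t_done:.1f}s")
    ```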

  8. An operations manual for the digital data system

    NASA Technical Reports Server (NTRS)

    Jones, Michael G.

    1988-01-01

    The Digital Data System (DDS) was designed to incorporate the analog-to-digital conversion process into the initial data acquisition stage and to store the data in a digital format. This conversion is done as part of the acquisition process. Consequently, the data are ready to be analyzed as soon as the test is completed. This capability permits the researcher to alter test parameters during the course of the experiment based on the information acquired in a prior portion of the test. The DDS is currently able to acquire up to 10 channels of data simultaneously. The purpose of this document is fourfold: (1) to describe the capabilities of the hardware in sufficient detail to allow the reader to determine whether the DDS is the optimum system for a particular experiment; (2) to present some of the more significant software developed to provide analyses within a short time of the completion of data acquisition; (3) to provide the reader with sample runs of major software routines to demonstrate their convenience and simple usage; and (4) to describe software that uses an FFT-box to provide a means of comparison against which the DDS can be checked.

  9. Implementation of an open adoption research data management system for clinical studies.

    PubMed

    Müller, Jan; Heiss, Kirsten Ingmar; Oberhoffer, Renate

    2017-07-06

    Research institutions need to manage multiple studies with individual data sets, processing rules, and different permissions. So far, there is no standard technology that provides an easy-to-use environment for creating databases and user interfaces for clinical trials or research studies. Therefore, various software solutions are being used, from custom software explicitly designed for a specific study, to cost-intensive commercial Clinical Trial Management Systems (CTMS), down to very basic approaches with self-designed Microsoft® databases. The technology applied to conduct these studies varies tremendously from study to study, making it difficult to evaluate data across studies (meta-analysis) and to keep a defined level of quality in database design, data processing, display, and export. Furthermore, the systems used to collect study data are often operated redundantly alongside systems used in patient care. As a consequence, data collection in studies is inefficient, and data quality may suffer from unsynchronized datasets, non-normalized database scenarios, and manually executed data transfers. With OpenCampus Research we implemented an open adoption software (OAS) solution on an open source basis, which provides a standard environment for state-of-the-art research database management at low cost.

  10. Reliability of infarct volumetry: Its relevance and the improvement by a software-assisted approach.

    PubMed

    Friedländer, Felix; Bohmann, Ferdinand; Brunkhorst, Max; Chae, Ju-Hee; Devraj, Kavi; Köhler, Yvette; Kraft, Peter; Kuhn, Hannah; Lucaciu, Alexandra; Luger, Sebastian; Pfeilschifter, Waltraud; Sadler, Rebecca; Liesz, Arthur; Scholtyschik, Karolina; Stolz, Leonie; Vutukuri, Rajkumar; Brunkhorst, Robert

    2017-08-01

    Despite the efficacy of neuroprotective approaches in animal models of stroke, their translation has so far failed from bench to bedside. One reason is presumed to be a low quality of preclinical study design, leading to bias and a low a priori power. In this study, we propose that the key read-out of experimental stroke studies, the volume of the ischemic damage as commonly measured by free-handed planimetry of TTC-stained brain sections, is subject to an unrecognized low inter-rater and test-retest reliability with strong implications for statistical power and bias. As an alternative approach, we suggest a simple, open-source, software-assisted method, taking advantage of automatic-thresholding techniques. The validity and the improvement of reliability by an automated method to tMCAO infarct volumetry are demonstrated. In addition, we show the probable consequences of increased reliability for precision, p-values, effect inflation, and power calculation, exemplified by a systematic analysis of experimental stroke studies published in the year 2015. Our study reveals an underappreciated quality problem in translational stroke research and suggests that software-assisted infarct volumetry might help to improve reproducibility and therefore the robustness of bench to bedside translation.
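    A minimal sketch of the software-assisted idea: automatic (Otsu) thresholding separates pale infarcted tissue from viable tissue in each TTC-stained section, and per-section areas are integrated over the slice thickness. It assumes grayscale images with the background already masked, and it illustrates the approach rather than reproducing the authors' released tool.

    ```python
    # Software-assisted infarct volumetry via automatic thresholding: Otsu's
    # method splits pale (infarcted) from red (viable) tissue per section, and
    # areas are integrated over the slice thickness. Background is NaN-masked.
    import numpy as np
    from skimage.filters import threshold_otsu

    def infarct_volume(sections, pixel_mm=0.01, slice_mm=1.0):
        """sections: list of 2D grayscale arrays; NaN marks non-tissue background."""
        total_mm3 = 0.0
        for img in sections:
            tissue = img[~np.isnan(img)]
            t = threshold_otsu(tissue)             # data-driven cutoff per section
            infarct_px = np.sum(tissue > t)        # pale pixels = infarcted tissue
            total_mm3 += infarct_px * pixel_mm**2 * slice_mm
        return total_mm3

    # two synthetic 100x100 sections: dark viable tissue with a bright lesion
    rng = np.random.default_rng(1)
    sec = rng.normal(0.3, 0.03, (100, 100))
    sec[30:60, 30:60] = rng.normal(0.8, 0.03, (30, 30))
    print(f"{infarct_volume([sec, sec]):.3f} mm^3")    # ~0.18 mm^3
    ```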

  11. A Predictive Safety Management System Software Package Based on the Continuous Hazard Tracking and Failure Prediction Methodology

    NASA Technical Reports Server (NTRS)

    Quintana, Rolando

    2003-01-01

    The goal of this research was to integrate a previously validated and reliable safety model, called the Continuous Hazard Tracking and Failure Prediction Methodology (CHTFPM), into a software application. This led to the development of a predictive safety management information system (PSMIS). This means that the theory or principles of the CHTFPM were incorporated into a software package; hence, the PSMIS is referred to as the CHTFPM management information system (CHTFPM MIS). The purpose of the PSMIS is to reduce the time and manpower required to perform predictive studies as well as to facilitate the handling of the enormous quantities of information in this type of study. The CHTFPM theory encompasses the philosophy of looking at the concept of safety engineering from a new perspective: from a proactive, rather than a reactive, viewpoint. That is, corrective measures are taken before a problem occurs instead of after it has happened. The CHTFPM is thus a predictive safety methodology, because it foresees or anticipates accidents, system failures, and unacceptable risks, so that corrective action can be taken to prevent these unwanted issues. Consequently, the safety and reliability of systems or processes can be further improved by taking proactive and timely corrective actions.

  12. Effect of inlet cone pipe angle in catalytic converter

    NASA Astrophysics Data System (ADS)

    Amira Zainal, Nurul; Farhain Azmi, Ezzatul; Arifin Samad, Mohd

    2018-03-01

    The catalytic converter has had significant consequences for vehicle performance ever since it was launched into production. Nowadays, the geometric design of the catalytic converter has become critical for avoiding backpressure in the exhaust system. Backpressure reduces the performance of the vehicle and gradually increases fuel consumption. Consequently, this study aims to design various models of catalytic converter and to optimize the fluid flow inside the catalytic converter by changing the inlet cone pipe angle. Three different inlet cone pipe angles were assessed. The models were simulated in SolidWorks software to determine the optimum geometric design of the catalytic converter. The results showed that decreasing the divergence angle of the inlet cone pipe increases the performance of the catalytic converter.

  13. Preliminary description of the area navigation software for a microcomputer-based Loran-C receiver

    NASA Technical Reports Server (NTRS)

    Oguri, F.

    1983-01-01

    The development of new navigation software, and its implementation on a microcomputer (MOS 6502) to provide high-quality navigation information, is described. This software development provides Area/Route Navigation (RNAV) information from Time Differences (TDs) in raw form, using both an elliptical and a spherical Earth model. The software was prepared for the microcomputer-based Loran-C receiver. To compute the navigation information, a MOS 6502 microcomputer and a mathematical chip (AM 9511A) were combined with the Loran-C receiver. The final data reveal that this software does indeed provide accurate information with reasonable execution times.

  14. Bayesian Software Health Management for Aircraft Guidance, Navigation, and Control

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mbaya, Timmy; Mengshoel, Ole

    2011-01-01

    Modern aircraft, both piloted fly-by-wire commercial aircraft as well as UAVs, more and more depend on highly complex safety critical software systems with many sensors and computer-controlled actuators. Despite careful design and V&V of the software, severe incidents have happened due to malfunctioning software. In this paper, we discuss the use of Bayesian networks (BNs) to monitor the health of the on-board software and sensor system, and to perform advanced on-board diagnostic reasoning. We will focus on the approach to develop reliable and robust health models for the combined software and sensor systems.

  15. Positive consequences of SETI before detection

    NASA Astrophysics Data System (ADS)

    Tough, A.

    Even before a signal is detected, six positive consequences will result from the scientific search for extraterrestrial intelligence, usually called SETI. (1) Humanity's self-image: SETI has enlarged our view of ourselves and enhanced our sense of meaning. Increasingly, we feel a kinship with the civilizations whose signals we are trying to detect. (2) A fresh perspective: SETI forces us to think about how extraterrestrials might perceive us. This gives us a fresh perspective on our society's values, priorities, laws and foibles. (3) Questions: SETI is stimulating thought and discussion about several fundamental questions. (4) Education: some broad-gage educational programs have already been centered around SETI. (5) Tangible spin-offs: in addition to providing jobs for some people, SETI provides various spin-offs, such as search methods, computer software, data, and international scientific cooperation. (6) Future scenarios: SETI will increasingly stimulate us to think carefully about possible detection scenarios and their consequences, about our reply, and generally about the role of extraterrestrial communication in our long-term future. Such thinking leads, in turn, to fresh perspectives on the SETI enterprise itself.

  16. Cost Effective Development of Usable Systems: Gaps between HCI and Software Architecture Design

    NASA Astrophysics Data System (ADS)

    Folmer, Eelke; Bosch, Jan

    A software product with poor usability is likely to fail in a highly competitive market; therefore, software developing organizations are paying more and more attention to ensuring the usability of their software. Practice, however, shows that product quality (which includes usability, among other attributes) is not as high as it could be. Studies of software projects (Pressman, 2001) reveal that organizations spend a relatively large amount of money and effort on fixing usability problems during late-stage development. Some of these problems could have been detected and fixed much earlier. This avoidable rework leads to high costs and, because different tradeoffs have to be made during development (for example, between cost and quality), to systems with less than optimal usability. This problem has been around for a couple of decades, especially since software engineering (SE) and human-computer interaction (HCI) became disciplines in their own right. While both disciplines have developed, several gaps have appeared that are now receiving increased attention in the research literature. Major gaps of understanding have been identified, both between suggested practice and how software is actually developed in industry, and between the best practices of each of the fields (Carrol et al., 1994; Bass et al., 2001; Folmer and Bosch, 2002). In addition, there are gaps between the fields in terminology, concepts, education, and methods.

  17. Guidelines on What Constitutes Plagiarism and Electronic Tools to Detect it.

    PubMed

    Luksanapruksa, Panya; Millhouse, Paul W

    2016-04-01

    Plagiarism is a serious ethical problem among scientific publications. There are various definitions of plagiarism, and the major categories include unintentional (unsuitable paraphrasing or improper citations) and intentional. Intentional plagiarism includes mosaic plagiarism, plagiarism of ideas, plagiarism of text, and self-plagiarism. There are many Web sites and software packages that claim to detect plagiarism effectively. A violation of plagiarism laws can lead to serious consequences including author banning, loss of professional reputation, termination of a position, and even legal action.

  18. Reducing maintenance costs in agreement with CNC machine tools reliability

    NASA Astrophysics Data System (ADS)

    Ungureanu, A. L.; Stan, G.; Butunoi, P. A.

    2016-08-01

    Aligning maintenance strategy with reliability is a challenge due to the need to find an optimal balance between them. Because the various methods described in the relevant literature involve laborious calculations or the use of software that can be costly, this paper proposes a method that is easier to implement on CNC machine tools. The new method, called Consequence of Failure Analysis (CFA), is based on technical and economic optimization, aimed at obtaining the required level of performance with minimum investment and maintenance costs.

  19. Significant lexical relationships

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pedersen, T.; Kayaalp, M.; Bruce, R.

    Statistical NLP inevitably deals with a large number of rare events. As a consequence, NLP data often violates the assumptions implicit in traditional statistical procedures such as significance testing. We describe a significance test, an exact conditional test, that is appropriate for NLP data and can be performed using freely available software. We apply this test to the study of lexical relationships and demonstrate that the results obtained using this test are both theoretically more reliable and different from the results obtained using previously applied tests.
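    A common instance of such an exact conditional test is Fisher's exact test on a 2×2 contingency table of word co-occurrence counts, available in freely distributed software such as SciPy; the counts below are invented to mimic the sparse, rare-event tables typical of NLP:

    ```python
    # Exact conditional testing on a sparse bigram table: Fisher's exact test
    # stays valid where chi-square asymptotics break down. Counts are invented.
    from scipy.stats import fisher_exact, chi2_contingency

    #                 w2 follows   w2 absent
    table = [[4,          2],       # w1 present
             [10,    120000]]       # w1 absent  (rare-event counts, as in NLP)

    odds, p_exact = fisher_exact(table, alternative="greater")
    print(f"Fisher exact: odds ratio = {odds:.1f}, p = {p_exact:.2e}")

    # The asymptotic test is unreliable here: expected cell counts fall far
    # below the usual rule-of-thumb minimum of 5.
    chi2, p_asym, dof, expected = chi2_contingency(table)
    print(f"chi-square expected counts:\n{expected}")
    ```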

  20. Patterns of Interaction and Mathematical Thinking of High School Students in Classroom Environments That Include Use of Java-Based, Curriculum-Embedded Software

    ERIC Educational Resources Information Center

    Fonkert, Karen L.

    2012-01-01

    This study analyzes the nature of student interaction and discourse in an environment that includes the use of Java-based, curriculum-embedded mathematical software. The software "CPMP-Tools" was designed as part of the development of the second edition of the "Core-Plus Mathematics" curriculum. The use of the software on…

  1. Validation of highly reliable, real-time knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1988-01-01

    Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.

  2. Framework for Development of Object-Oriented Software

    NASA Technical Reports Server (NTRS)

    Perez-Poveda, Gus; Ciavarella, Tony; Nieten, Dan

    2004-01-01

    The Real-Time Control (RTC) Application Framework is a high-level software framework written in C++ that supports the rapid design and implementation of object-oriented application programs. This framework provides built-in functionality that solves common software development problems within distributed client-server, multi-threaded, and embedded programming environments. When using the RTC Framework to develop software for a specific domain, designers and implementers can focus entirely on the details of the domain-specific software rather than on creating custom solutions, utilities, and frameworks for the complexities of the programming environment. The RTC Framework was originally developed as part of a Space Shuttle Launch Processing System (LPS) replacement project called Checkout and Launch Control System (CLCS). As a result of the framework's development, CLCS software development time was reduced by 66 percent. The framework is generic enough for developing applications outside of the launch-processing system domain. Other applicable high-level domains include command and control systems and simulation/training systems.

  3. Integration of LCoS-SLM and LabVIEW based software to simulate fundamental optics, wave optics, and Fourier optics

    NASA Astrophysics Data System (ADS)

    Lyu, Bo-Han; Wang, Chen; Tsai, Chun-Wei

    2017-08-01

    Jasper Display Corp. (JDC) offers a high-reflectivity, high-resolution Liquid Crystal on Silicon - Spatial Light Modulator (LCoS-SLM) that includes an associated controller ASIC and LabVIEW-based modulation software. Based on this LCoS-SLM, also called the Education Kit (EDK), we provide a training platform that includes a series of optical theory lessons and experiments for university students. This EDK not only provides LabVIEW-based operation software to produce Computer Generated Holograms (CGH) that yield basic diffraction or holographic images, but also provides simulation software to verify the experimental results simultaneously. We believe that a robust LCoS-SLM, operation software, simulation software, training system, and training course can help students study fundamental optics, wave optics, and Fourier optics more easily. Building on this fundamental knowledge, they can develop their own skills and create new innovations in optoelectronic applications in the future.
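
    As a minimal sketch of how a phase-only CGH of the kind such operation software produces can be computed, the code below runs a few Gerchberg-Saxton iterations with NumPy; the target pattern, resolution, and iteration count are invented, and this is not the EDK's actual algorithm.

        import numpy as np

        # Desired far-field intensity: a bright off-axis square (illustrative).
        target = np.zeros((512, 512))
        target[200:240, 300:340] = 1.0
        amplitude = np.sqrt(target)

        rng = np.random.default_rng(0)
        field = amplitude * np.exp(2j * np.pi * rng.random(target.shape))
        for _ in range(30):
            slm = np.fft.ifft2(np.fft.ifftshift(field))
            slm = np.exp(1j * np.angle(slm))                   # phase-only constraint at the SLM
            field = np.fft.fftshift(np.fft.fft2(slm))
            field = amplitude * np.exp(1j * np.angle(field))   # enforce target amplitude far-field

        hologram = np.angle(slm)    # phase map to load onto the LCoS-SLM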

  4. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation we developed LFQbench, an R package that calculates metrics of precision and accuracy in label-free quantitative MS and reports the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
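
    A hedged Python analogue of the benchmarking idea (LFQbench itself is an R package): in hybrid proteome samples each species is spiked at a known ratio, so accuracy can be scored as the deviation of observed log-ratios from the expected value and precision as their spread. The intensities below are simulated and the metric definitions are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        # Expected log2(sample A / sample B) ratio per species in a hybrid sample.
        expected_log2 = {"HUMAN": 0.0, "YEAST": 1.0, "ECOLI": -2.0}

        for species, mu in expected_log2.items():
            obs = rng.normal(mu, 0.25, size=500)     # observed log2 ratio per protein
            accuracy = np.median(obs) - mu           # systematic deviation from truth
            precision = np.std(obs)                  # spread of the observed ratios
            print(f"{species}: accuracy={accuracy:+.3f}, precision={precision:.3f}")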

  5. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  6. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; Roser, Robert

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  7. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habib, Salman; Roser, Robert; LeCompte, Tom

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  8. Power API Prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-12-04

    The software serves two purposes. The first purpose of the software is to prototype the Sandia High Performance Computing Power Application Programming Interface Specification effort. The specification can be found at http://powerapi.sandia.gov. Prototypes of the specification were developed in parallel with the development of the specification. Release of the prototype will be instructive to anyone who intends to implement the specification. More specifically, our vendor collaborators will benefit from the availability of the prototype. The second purpose is in direct support of the PowerInsight power measurement device, which was co-developed with Penguin Computing. The software provides a cluster-wide measurement capability enabled by the PowerInsight device. The software can be used by anyone who purchases a PowerInsight device. The software will allow the user to easily collect power and energy information of a node that is instrumented with PowerInsight. The software can also be used as an example prototype implementation of the High Performance Computing Power Application Programming Interface Specification.

  9. Free and open source software for the manipulation of digital images.

    PubMed

    Solomon, Robert W

    2009-06-01

    Free and open source software is a type of software that is nearly as powerful as commercial software but is freely downloadable. This software can do almost everything that the expensive programs can. GIMP (gnu image manipulation program) is the free program that is comparable to Photoshop, and versions are available for Windows, Macintosh, and Linux platforms. This article briefly describes how GIMP can be installed and used to manipulate radiology images. It is no longer necessary to budget large amounts of money for high-quality software to achieve the goals of image processing and document creation because free and open source software is available for the user to download at will.

  10. Advanced software development workstation project: Engineering scripting language. Graphical editor

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Software development is widely considered to be a bottleneck in the development of complex systems, both in terms of development and in terms of maintenance of deployed systems. Cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software part composition system.

  11. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  12. Design and Validation of High Data Rate Ka-Band Software Defined Radio for Small Satellite

    NASA Technical Reports Server (NTRS)

    Xia, Tian

    2016-01-01

    The Design and Validation of High Data Rate Ka-Band Software Defined Radio for Small Satellite project will develop a novel Ka-band software defined radio (SDR) that is capable of establishing high-data-rate inter-satellite links with a throughput of 500 megabits per second (Mb/s) and providing millimeter ranging precision. The system will be designed to operate with high performance and reliability, robust against various interference effects and network anomalies. The Ka-band radio resulting from this work will improve upon state-of-the-art Ka-band radios in terms of dimensional size, mass and power dissipation, which limit their use in small satellites.

  13. High pressure single-crystal micro X-ray diffraction analysis with GSE_ADA/RSV software

    NASA Astrophysics Data System (ADS)

    Dera, Przemyslaw; Zhuravlev, Kirill; Prakapenka, Vitali; Rivers, Mark L.; Finkelstein, Gregory J.; Grubor-Urosevic, Ognjen; Tschauner, Oliver; Clark, Simon M.; Downs, Robert T.

    2013-08-01

    GSE_ADA/RSV is a free software package for custom analysis of single-crystal micro X-ray diffraction (SCμXRD) data, developed with particular emphasis on data from samples enclosed in diamond anvil cells and subject to high pressure conditions. The package has been in extensive use at the high pressure beamlines of Advanced Photon Source (APS), Argonne National Laboratory and Advanced Light Source (ALS), Lawrence Berkeley National Laboratory. The software is optimized for processing of wide-rotation images and includes a variety of peak intensity corrections and peak filtering features, which are custom-designed to make processing of high pressure SCμXRD easier and more reliable.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana L. Kelly

    Typical engineering systems in applications with high failure consequences such as nuclear reactor plants often employ redundancy and diversity of equipment in an effort to lower the probability of failure and therefore risk. However, it has long been recognized that dependencies exist in these redundant and diverse systems. Some dependencies, such as common sources of electrical power, are typically captured in the logic structure of the risk model. Others, usually referred to as intercomponent dependencies, are treated implicitly by introducing one or more statistical parameters into the model. Such common-cause failure models have limitations in a simulation environment. In addition, substantial subjectivity is associated with parameter estimation for these models. This paper describes an approach in which system performance is simulated by drawing samples from the joint distributions of dependent variables. The approach relies on the notion of a copula distribution, a notion which has been employed by the actuarial community for ten years or more, but which has seen only limited application in technological risk assessment. The paper also illustrates how equipment failure data can be used in a Bayesian framework to estimate the parameter values in the copula model. This approach avoids much of the subjectivity required to estimate parameters in traditional common-cause failure models. Simulation examples are presented for failures in time. The open-source software package R is used to perform the simulations. The open-source software package WinBUGS is used to perform the Bayesian inference via Markov chain Monte Carlo sampling.
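
    A minimal sketch of the copula idea in Python, assuming exponential failure-time margins joined by a Gaussian copula (the abstract does not specify these choices); the failure rate, correlation, and mission time below are invented.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        rho, rate, n = 0.8, 1.0e-3, 100_000    # correlation, failures/hour, samples

        # Gaussian copula: correlated normals -> correlated uniforms -> dependent margins.
        cov = [[1.0, rho], [rho, 1.0]]
        z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
        u = stats.norm.cdf(z)                        # the copula (correlated uniforms)
        t = stats.expon.ppf(u, scale=1.0 / rate)     # dependent failure times (hours)

        # Probability that both redundant trains fail within a hypothetical 24 h mission.
        print(np.mean((t[:, 0] < 24.0) & (t[:, 1] < 24.0)))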

  15. Using Ada: The deeper challenges

    NASA Technical Reports Server (NTRS)

    Feinberg, David A.

    1986-01-01

    The Ada programming language and the associated Ada Programming Support Environment (APSE) and Ada Run Time Environment (ARTE) provide the potential for significant life-cycle cost reductions in computer software development and maintenance activities. The Ada programming language itself is standardized, trademarked, and controlled via formal validation procedures. Though compilers are not yet production-ready as most would desire, the technology for constructing them is sufficiently well known and understood that time and money should suffice to correct current deficiencies. The APSE and ARTE are, on the other hand, significantly newer issues within most software development and maintenance efforts. Currently, APSE and ARTE are highly dependent on differing implementer concepts, strategies, and market objectives. Complex and sophisticated mission-critical computing systems require the use of a complete Ada-based capability, not just the programming language itself; yet the range of APSE and ARTE features which must actually be utilized can vary significantly from one system to another. As a consequence, the need to understand, objectively evaluate, and select differing APSE and ARTE capabilities and features is critical to the effective use of Ada and the life-cycle efficiencies it is intended to promote. It is the selection, collection, and understanding of APSE and ARTE which provide the deeper challenges of using Ada for real-life mission-critical computing systems. Some of the current issues which must be clarified, often on a case-by-case basis, in order to successfully realize the full capabilities of Ada are discussed.

  16. Energy analysis of holographic lenses for solar concentration

    NASA Astrophysics Data System (ADS)

    Marín-Sáez, Julia; Collados, M. Victoria; Chemisana, Daniel; Atencia, Jesús

    2017-05-01

    The use of volume and phase holographic elements in the design of photovoltaic solar concentrators has become very popular as an alternative to refractive systems, due to their high efficiency, low cost and possibilities for building integration. The angular and chromatic selectivity of volume holograms can affect their behavior as solar concentrators, and in holographic lenses this selectivity varies along the lens plane. Moreover, because holographic materials are not sensitive to the wavelengths at which solar cells are most efficient, the reconstruction wavelength usually differs from the recording one. As a consequence, not all points of the lens work at the Bragg condition for a given incident direction or wavelength. A software tool that calculates the direction and efficiency of solar rays at the output of a volume holographic element has been developed in this study. It allows analysis of the total energy that reaches the solar cell, taking into account the sun's movement, the solar spectrum and the sensitivity of the solar cell. The dependence of the collected energy on the recording wavelength is studied with this software. Because the recording angle varies along a holographic lens, some zones of the lens may not act as a volume hologram. The efficiency at the transition zones between volume and thin behavior in lenses recorded in Bayfol HX is analyzed experimentally in order to decide whether the energy of generated higher diffraction orders has to be included in the simulation.
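
    As a small companion to the volume-versus-thin discussion, the sketch below evaluates the Klein-Cook parameter Q = 2*pi*lambda*d / (n*Lambda^2), a common rule of thumb for volume (Q >> 1) versus thin-grating behavior, where lambda is the reconstruction wavelength, d the layer thickness, n the mean index and Lambda the local grating period. All values are illustrative assumptions, not measured Bayfol HX data, and this is not the authors' software.

        import numpy as np

        wavelength = 532e-9    # reconstruction wavelength (m), assumed
        thickness = 16e-6      # photopolymer layer thickness (m), assumed
        n = 1.5                # mean refractive index, assumed

        # The local grating period varies along a holographic lens; sample a few zones.
        for period in (0.4e-6, 1.0e-6, 5.0e-6):
            Q = 2 * np.pi * wavelength * thickness / (n * period**2)
            regime = "volume" if Q > 10 else "thin/transition"
            print(f"period {period * 1e6:.1f} um -> Q = {Q:6.1f} ({regime})")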

  17. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  18. Development of Efficient Authoring Software for e-Learning Contents

    NASA Astrophysics Data System (ADS)

    Kozono, Kazutake; Teramoto, Akemi; Akiyama, Hidenori

    Content creation has become an important problem in e-Learning systems. e-Learning contents should include figures and voice media for a high-level educational effect. However, the use of figures and voice complicates the operation of authoring software considerably. New authoring software, which can build e-Learning contents efficiently, has been developed to solve this problem. This paper reports the development of this authoring software.

  19. Assuring Software Reliability

    DTIC Science & Technology

    2014-08-01

    technologies and processes to achieve a required level of confidence that software systems and services function in the intended manner. 1.3 Security Example...that took three high-voltage lines out of service and a software failure (a race condition) that disabled the computing service that notified the... service had failed. Instead of analyzing the details of the alarm server failure, the reviewers asked why the following software assurance claim had

  20. The Implementation of Satellite Attitude Control System Software Using Object Oriented Design

    NASA Technical Reports Server (NTRS)

    Reid, W. Mark; Hansell, William; Phillips, Tom; Anderson, Mark O.; Drury, Derek

    1998-01-01

    NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions. The SMEX program has produced five satellites, three of which have been successfully launched. The remaining two spacecraft are scheduled for launch within the coming year. NASA has recently developed a prototype for the next-generation Small Explorer spacecraft (SMEX-Lite). This paper describes the object-oriented design (OOD) of the SMEX-Lite Attitude Control System (ACS) software. The SMEX-Lite ACS is three-axis controlled and is capable of performing sub-arc-minute pointing. This paper first describes the high-level requirements governing the SMEX-Lite ACS software architecture. Next, the context in which the software resides is explained. The paper describes the principles of encapsulation, inheritance, and polymorphism with respect to the implementation of an ACS software system. This paper also discusses the design of several ACS software components. Specifically, object-oriented designs are presented for sensor data processing, attitude determination, attitude control, and failure detection. Finally, this paper addresses the establishment of the ACS Foundation Class (AFC) Library. The AFC is a large software repository that requires a minimal amount of code modification to produce ACS software for future projects.
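
    A schematic Python sketch of the object-oriented principles described (encapsulation, inheritance, polymorphism) applied to ACS sensor processing; all class and method names are hypothetical and do not reflect the actual SMEX-Lite AFC interfaces.

        from abc import ABC, abstractmethod

        class Sensor(ABC):
            @abstractmethod
            def read(self) -> dict:
                """Return calibrated measurements in a common format."""

        class SunSensor(Sensor):
            def read(self) -> dict:
                return {"sun_vector": (1.0, 0.0, 0.0)}              # stubbed telemetry

        class Magnetometer(Sensor):
            def read(self) -> dict:
                return {"b_field": (2.1e-5, -4.0e-6, 3.3e-5)}       # stubbed telemetry

        def determine_attitude(sensors: list[Sensor]) -> dict:
            # Polymorphism: this routine iterates over Sensor references without
            # knowing concrete types, so adding a new sensor requires no change here.
            state = {}
            for s in sensors:
                state.update(s.read())
            return state

        print(determine_attitude([SunSensor(), Magnetometer()]))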

  1. Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven

    1997-01-01

    Systematic software construction offers the potential of elevating software engineering from an art form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.

  2. CEval: All-in-one software for data processing and statistical evaluations in affinity capillary electrophoresis.

    PubMed

    Dubský, Pavel; Ördögová, Magda; Malý, Michal; Riesová, Martina

    2016-05-06

    We introduce CEval software (downloadable for free at echmet.natur.cuni.cz) that was developed for quicker and easier electropherogram evaluation and further data processing in (affinity) capillary electrophoresis. This software allows automatic peak detection and evaluation of common peak parameters, such as migration time, area, and width. Additionally, the software includes a nonlinear regression engine that performs peak fitting with the Haarhoff-van der Linde (HVL) function, including an automated initial guess of the HVL function parameters. HVL is a fundamental peak-shape function in electrophoresis, from which the correct effective mobility of the analyte represented by the peak is evaluated. Effective mobilities of an analyte at various concentrations of a selector can further be stored and plotted in an affinity CE mode. Consequently, the mobility of the free analyte, μA, the mobility of the analyte-selector complex, μAS, and the apparent complexation constant, K′, are first guessed automatically from the linearized data plots and subsequently estimated by means of nonlinear regression. An option that allows two complexation dependencies to be fitted at once is especially convenient for enantioseparations. Statistical processing of these data is also included, which allowed us to: i) express the 95% confidence intervals for the μA, μAS and K′ least-squares estimates, and ii) perform hypothesis testing on the estimated parameters for the first time. We demonstrate the benefits of the CEval software by inspecting the complexation of tryptophan methyl ester with two cyclodextrins, neutral heptakis(2,6-di-O-methyl)-β-CD and charged heptakis(6-O-sulfo)-β-CD. Copyright © 2016 Elsevier B.V. All rights reserved.
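
    A sketch of the affinity-CE step that CEval automates: fitting effective mobility against selector concentration with the standard 1:1 complexation isotherm to estimate μA, μAS and K′ by nonlinear regression. The data points below are simulated, not taken from the paper, and SciPy's curve_fit stands in for CEval's regression engine.

        import numpy as np
        from scipy.optimize import curve_fit

        # Standard 1:1 complexation isotherm: mu_eff(c) = (mu_A + mu_AS*K*c) / (1 + K*c).
        def mu_eff(c, mu_a, mu_as, K):
            return (mu_a + mu_as * K * c) / (1.0 + K * c)

        c = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])       # selector concentration (mM)
        mu = np.array([20.1, 16.0, 13.8, 11.4, 8.8, 7.5])   # simulated effective mobility

        popt, pcov = curve_fit(mu_eff, c, mu, p0=(20.0, 5.0, 0.5))
        mu_a, mu_as, K = popt
        err = np.sqrt(np.diag(pcov))                        # 1-sigma parameter estimates
        print(f"mu_A={mu_a:.2f}  mu_AS={mu_as:.2f}  K'={K:.3f} (+/- {err[2]:.3f})")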

  3. IMMAN: free software for information theory-based chemometric analysis.

    PubMed

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches each. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, and the Jeffreys information measure is introduced for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing-value processing, dataset partitioning, and browsing. Moreover, single-parameter and ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, and comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA supervised algorithms. (Graphical abstract: graphic representation of Shannon's distribution for MD-calculating software.)
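
    A minimal sketch of one supervised criterion mentioned above, information gain after equal-interval discretization, used to rank features; the data matrix is random and the implementation is an illustrative assumption, not IMMAN code.

        import numpy as np

        def entropy(labels):
            _, counts = np.unique(labels, return_counts=True)
            p = counts / counts.sum()
            return -np.sum(p * np.log2(p))

        def information_gain(feature, y, bins=5):
            # Equal-interval discretization of the continuous descriptor.
            x = np.digitize(feature, np.histogram_bin_edges(feature, bins)[1:-1])
            h_y_given_x = sum(np.mean(x == v) * entropy(y[x == v]) for v in np.unique(x))
            return entropy(y) - h_y_given_x

        rng = np.random.default_rng(2)
        X = rng.normal(size=(200, 4))    # 4 candidate descriptors, random data
        y = (X[:, 2] + 0.1 * rng.normal(size=200) > 0).astype(int)

        gains = [information_gain(X[:, j], y) for j in range(X.shape[1])]
        print(np.argsort(gains)[::-1])   # descriptor ranking, most informative first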

  4. An Automated Method for Identifying Inconsistencies within Diagrammatic Software Requirements Specifications

    NASA Technical Reports Server (NTRS)

    Zhang, Zhong

    1997-01-01

    The development of large-scale, composite software in a geographically distributed environment is an evolutionary process. In such evolving systems, striving for consistency is complicated by many factors: development participants have various locations, skills, responsibilities, roles, opinions, languages, terminology, and degrees of abstraction. This naturally leads to many partial specifications, or viewpoints. These multiple views on the system being developed usually overlap and therefore give rise to the potential for inconsistency. Existing CASE tools do not efficiently manage inconsistencies in a distributed development environment for a large-scale project. Based on the ViewPoints framework, the WHERE (Web-Based Hypertext Environment for Requirements Evolution) toolkit aims to tackle inconsistency-management issues within geographically distributed software development projects. Consequently, the WHERE project helps make software more robust and supports the software assurance process. The long-term goal of the WHERE tools is inconsistency analysis and management in requirements specifications. A framework based on graph grammar theory and the TCMJAVA toolkit is proposed to detect inconsistencies among viewpoints. This systematic approach uses three basic operations (UNION, DIFFERENCE, INTERSECTION) to study the static behavior of graphic and tabular notations; from these operations, subgraph Query, Selection, Merge, and Replacement operations can be derived. The approach uses graph PRODUCTIONS (rewriting rules) to study the dynamic transformations of graphs. We discuss the feasibility of implementing these operations. We also present the process of porting the original TCM (Toolkit for Conceptual Modeling) project from C++ to the Java programming language in this thesis. A scenario based on the NASA International Space Station specification is discussed to show the applicability of our approach. Finally, conclusions and future work on inconsistency-management issues in the WHERE project are summarized.
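
    A toy sketch of the three basic operations applied to viewpoints, modeling each viewpoint's diagram as a set of directed edges; the requirement fragments are invented, and reducing typed graphs to plain edge sets is a deliberate simplification.

        # Two hypothetical viewpoints on the same system, as (source, target) edges.
        view_a = {("Sensor", "Controller"), ("Controller", "Actuator")}
        view_b = {("Sensor", "Controller"), ("Controller", "Logger")}

        union = view_a | view_b           # everything either viewpoint asserts
        intersection = view_a & view_b    # what the viewpoints agree on
        difference = view_a - view_b      # what only viewpoint A asserts

        print("agree on:", intersection)
        print("asserted only by A (candidate inconsistency):", difference)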

  5. OpenChrom: a cross-platform open source software for the mass spectrometric analysis of chromatographic data.

    PubMed

    Wenig, Philip; Odermatt, Juergen

    2010-07-30

    Today, data evaluation has become a bottleneck in chromatographic science. Analytical instruments equipped with automated samplers yield large amounts of measurement data, which need to be verified and analyzed. Since nearly every GC/MS instrument vendor offers its own data format and software tools, the consequences are problems with data exchange and a lack of comparability between analytical results. To address this situation, a number of commercial and non-profit software applications have been developed. These applications provide functionality to import and analyze several data formats but have shortcomings in terms of the transparency of the implemented analytical algorithms and/or are restricted to a specific computer platform. This work describes a native approach to handling chromatographic data files. The approach is extensible in functionality, with facilities to detect baselines; to detect, integrate, and identify peaks; and to compare mass spectra, as well as the ability to internationalize the application. Additionally, filters can be applied to the chromatographic data to enhance its quality, for example to remove background and noise. Extended operations such as do, undo, and redo are supported. OpenChrom is a software application to edit and analyze mass spectrometric chromatographic data. It is extensible in many different ways, depending on the demands of the users or the analytical procedures and algorithms, and it offers a customizable graphical user interface. The software is independent of the operating system because it is written in Java on the Rich Client Platform. OpenChrom is released under the Eclipse Public License 1.0 (EPL). There are no license constraints regarding extensions; they can be published under open-source as well as proprietary licenses. OpenChrom is available free of charge at http://www.openchrom.net.

  6. Locus of Control or Self-Esteem; Which One is the Best Predictor of Academic Achievement in Iranian College Students.

    PubMed

    Hosseini, Seyyed Nasrollah; Mirzaei Alavijeh, Mehdi; Karami Matin, Behzad; Hamzeh, Behrooz; Ashtarian, Hossein; Jalilian, Farzad

    2016-03-01

    Self-esteem and the behavioral consequences of an external or internal locus of control affect students' academic achievement. The aim of this study was to determine how well locus of control and self-esteem predict academic achievement among students. This cross-sectional study was conducted on 300 college students at Kermanshah University of Medical Sciences in 2014. Data collection tools comprised three sections: demographics, the Rotter internal-external locus of control scale, and the Coopersmith self-esteem inventory. Data were analyzed using SPSS software version 21. Results showed that 29.8% of the participants had an internal locus of control and 76.2% had high self-esteem. There was a significant correlation between self-esteem, locus of control, and academic achievement. Self-esteem accounted for 39.5% of the variation in academic achievement. It seems that interventions to increase self-esteem among students can help improve their academic achievement.

  7. Visualization and Analysis of MiRNA-Targets Interactions Networks.

    PubMed

    León, Luis E; Calligaris, Sebastián D

    2017-01-01

    MicroRNAs are a class of small, noncoding RNA molecules of 21-25 nucleotides in length that regulate gene expression by base-pairing with target mRNAs, mainly leading to down-regulation or repression of the target genes. MicroRNAs are involved in diverse regulatory pathways in normal and pathological conditions. In this context, it is highly important to identify the targets of a specific microRNA in order to understand the mechanism of its regulation and, consequently, its involvement in disease. However, experimental microRNA target identification is laborious and time-consuming. In silico prediction of microRNA targets is an extremely useful approach because one can identify potential mRNA targets, reduce the number of possibilities, and then validate a few microRNA-mRNA interactions in an in vitro experimental model. In this chapter, we describe, in a simple way, bioinformatics guidelines for using the miRWalk database and Cytoscape software to analyze microRNA-mRNA interactions through their visualization as a network.

  8. Tomography experiment of an integrated circuit specimen using 3 MeV electrons in the transmission electron microscope.

    PubMed

    Zhang, Hai-Bo; Zhang, Xiang-Liang; Wang, Yong; Takaoka, Akio

    2007-01-01

    The possibility of utilizing high-energy electron tomography to characterize the micron-scale three-dimensional (3D) structures of integrated circuits has been demonstrated experimentally. First, electron transmission through a tilted SiO2 film was measured with an ultrahigh-voltage electron microscope (ultra-HVEM) and analyzed from the point of view of elastic scattering of electrons, showing that linear attenuation of the logarithmic electron transmission still holds for effective specimen thicknesses up to 5 μm under a 2 MV accelerating voltage. Electron tomography of a micron-order-thick integrated circuit specimen including a Cu/via interconnect was then attempted with 3 MeV electrons in the ultra-HVEM. Serial projection images of the specimen tilted at different angles over a range of ±90° were acquired, and 3D reconstruction was performed from the images by means of the IMOD software package. Consequently, the 3D structures of the Cu lines, via, and void were revealed by cross sections and surface rendering.

  9. A Prompt Methodology to Georeference Complex Hypogea Environments

    NASA Astrophysics Data System (ADS)

    Troisi, S.; Baiocchi, V.; Del Pizzo, S.; Giannone, F.

    2017-02-01

    Complex underground structures and facilities occupy considerable space beneath our cities, and many of them are unsurveyed; cable ducts and drainage systems are no exception. Furthermore, inspection operations are often performed in critical air conditions that preclude or hamper a conventional survey. In this scenario, a prompt methodology to survey and georeference such facilities is often indispensable. A vision-based approach is proposed in this paper; the methodology provides a 3D model of the environment and the path followed by the camera using conventional photogrammetric/structure-from-motion software tools. The key role is played by the camera lens: a fisheye system was employed to obtain a very wide field of view (FOV) and therefore high overlap among the frames. The camera geometry corresponds to forward motion along the camera axis. Consequently, to avoid instability of the bundle adjustment algorithm, a preliminary camera calibration was carried out. A specific case study is reported, along with the accuracy achieved.

  10. Modernized build and test infrastructure for control software at ESO: highly flexible building, testing, and automatic quality practices for telescope control software

    NASA Astrophysics Data System (ADS)

    Pellegrin, F.; Jeram, B.; Haucke, J.; Feyrin, S.

    2016-07-01

    The paper describes the introduction of a new automated build and test infrastructure, based on the open-source software Jenkins, into the ESO Very Large Telescope control software, replacing the preexisting in-house solution. A brief introduction to software quality practices is given, together with a description of the previous solution, its limitations, and new upcoming requirements. The modifications required to adapt the new system are described, along with how these were applied to the current software and the results obtained. An overview of how the new system may be used in future projects is also presented.

  11. A high order approach to flight software development and testing

    NASA Technical Reports Server (NTRS)

    Steinbacher, J.

    1981-01-01

    The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.

  12. Structural Brain Imaging of Long-Term Anabolic-Androgenic Steroid Users and Nonusing Weightlifters.

    PubMed

    Bjørnebekk, Astrid; Walhovd, Kristine B; Jørstad, Marie L; Due-Tønnessen, Paulina; Hullstein, Ingunn R; Fjell, Anders M

    2017-08-15

    Prolonged high-dose anabolic-androgenic steroid (AAS) use has been associated with psychiatric symptoms and cognitive deficits, yet we have almost no knowledge of the long-term consequences of AAS use on the brain. The purpose of this study is to investigate the association between long-term AAS exposure and brain morphometry, including subcortical neuroanatomical volumes and regional cortical thickness. Male AAS users and weightlifters with no experience with AASs or any other equivalent doping substances underwent structural magnetic resonance imaging scans of the brain. The current paper is based upon high-resolution structural T1-weighted images from 82 current or past AAS users exceeding 1 year of cumulative AAS use and 68 non-AAS-using weightlifters. Images were processed with the FreeSurfer software to compare neuroanatomical volumes and cerebral cortical thickness between the groups. Compared to non-AAS-using weightlifters, the AAS group had thinner cortex in widespread regions and significantly smaller neuroanatomical volumes, including total gray matter, cerebral cortex, and putamen. Both volumetric and thickness effects remained relatively stable across different AAS subsamples comprising various degrees of exposure to AASs and also when excluding participants with previous and current non-AAS drug abuse. The effects could not be explained by differences in verbal IQ, intracranial volume, anxiety/depression, or attention or behavioral problems. This large-scale systematic investigation of AAS use on brain structure shows negative correlations between AAS use and brain volume and cortical thickness. Although the findings are correlational, they may serve to raise concern about the long-term consequences of AAS use on structural features of the brain. Copyright © 2016 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  13. Uncertainty and Sensitivity of Direct Economic Flood Damages: the FloodRisk Free and Open-Source Software

    NASA Astrophysics Data System (ADS)

    Albano, R.; Sole, A.; Mancusi, L.; Cantisani, A.; Perrone, A.

    2017-12-01

    The considerable increase in flood damages over the past decades has shifted attention in Europe from protection against floods to managing flood risks. In this context, expected damage assessment represents crucial information within the overall flood risk management process. The present paper proposes an open-source software package, called FloodRisk, that can operatively support stakeholders in decision-making processes with a what-if approach by carrying out rapid assessment of flood consequences in terms of direct economic damage and loss of human life. The evaluation of damage scenarios, through the use of the GIS software proposed here, is essential for cost-benefit or multi-criteria analysis of risk mitigation alternatives. However, considering that quantitative assessment of flood damage scenarios is characterized by intrinsic uncertainty, a scheme has been developed to identify and quantify the role of the input parameters in the total uncertainty of flood loss model applications in urban areas with mild terrain and complex topography. Using the concept of parallel models, the contribution of different modules and input parameters to the total uncertainty is quantified. The results of the present case study exhibit high epistemic uncertainty in the damage estimation module and, in particular, in the type and form of the damage functions used, which have been adapted and transferred from different geographic and socio-economic contexts because no depth-damage functions have been developed specifically for Italy. Considering that uncertainty and sensitivity depend considerably on local characteristics, the epistemic uncertainty associated with the risk estimate can be reduced by introducing additional information into the risk analysis. In light of these results, the need to produce and disseminate (open) data for developing micro-scale vulnerability curves is evident, as is the urgent need to push forward research into methods and models for assimilating uncertainties into decision-making processes.

  14. Information-computational platform for collaborative multidisciplinary investigations of regional climatic changes and their impacts

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2013-04-01

    Analysis of the growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts of researchers. To do this in a timely and reliable way, a modern information-computational infrastructure supporting integrated studies in the field of environmental sciences is needed. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for regional climate change investigations. The platform combines a modern web 2.0 approach, GIS functionality, and capabilities to run climate and meteorological models, process large geophysical datasets, and support relevant analysis. It also supports joint software development by distributed research groups and the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Runs of the integrated WRF and «Planet Simulator» models, preprocessing of modeling results, and visualization are also provided. All functions of the platform are accessible through a web portal using a common graphical web browser in the form of an interactive graphical user interface, which provides, in particular, selection of a geographical region of interest (pan and zoom), data layer manipulation (order, enable/disable, feature extraction), and visualization of results. The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary research efforts. Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic, and satellite monitoring datasets through the unified graphical web interface. Partial support of RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2, Projects 69, 131 and 140, and the APN CBA2012-16NSY project is acknowledged.

  15. Visualising biological data: a semantic approach to tool and database integration

    PubMed Central

    Pettifer, Steve; Thorne, David; McDermott, Philip; Marsh, James; Villéger, Alice; Kell, Douglas B; Attwood, Teresa K

    2009-01-01

    Motivation In the biological sciences, the need to analyse vast amounts of information has become commonplace. Such large-scale analyses often involve drawing together data from a variety of different databases, held remotely on the internet or locally on in-house servers. Supporting these tasks are ad hoc collections of data-manipulation tools, scripting languages and visualisation software, which are often combined in arcane ways to create cumbersome systems that have been customised for a particular purpose, and are consequently not readily adaptable to other uses. For many day-to-day bioinformatics tasks, the sizes of current databases, and the scale of the analyses necessary, now demand increasing levels of automation; nevertheless, the unique experience and intuition of human researchers is still required to interpret the end results in any meaningful biological way. Putting humans in the loop requires tools to support real-time interaction with these vast and complex data-sets. Numerous tools do exist for this purpose, but many do not have optimal interfaces, most are effectively isolated from other tools and databases owing to incompatible data formats, and many have limited real-time performance when applied to realistically large data-sets: much of the user's cognitive capacity is therefore focused on controlling the software and manipulating esoteric file formats rather than on performing the research. Methods To confront these issues, harnessing expertise in human-computer interaction (HCI), high-performance rendering and distributed systems, and guided by bioinformaticians and end-user biologists, we are building reusable software components that, together, create a toolkit that is both architecturally sound from a computing point of view, and addresses both user and developer requirements. Key to the system's usability is its direct exploitation of semantics, which, crucially, gives individual components knowledge of their own functionality and allows them to interoperate seamlessly, removing many of the existing barriers and bottlenecks from standard bioinformatics tasks. Results The toolkit, named Utopia, is freely available from . PMID:19534744

  16. Development of seismic tomography software for hybrid supercomputers

    NASA Astrophysics Data System (ADS)

    Nikitin, Alexandr; Serdyukov, Alexandr; Duchkov, Anton

    2015-04-01

    Seismic tomography is a technique used for computing velocity model of geologic structure from first arrival travel times of seismic waves. The technique is used in processing of regional and global seismic data, in seismic exploration for prospecting and exploration of mineral and hydrocarbon deposits, and in seismic engineering for monitoring the condition of engineering structures and the surrounding host medium. As a consequence of development of seismic monitoring systems and increasing volume of seismic data, there is a growing need for new, more effective computational algorithms for use in seismic tomography applications with improved performance, accuracy and resolution. To achieve this goal, it is necessary to use modern high performance computing systems, such as supercomputers with hybrid architecture that use not only CPUs, but also accelerators and co-processors for computation. The goal of this research is the development of parallel seismic tomography algorithms and software package for such systems, to be used in processing of large volumes of seismic data (hundreds of gigabytes and more). These algorithms and software package will be optimized for the most common computing devices used in modern hybrid supercomputers, such as Intel Xeon CPUs, NVIDIA Tesla accelerators and Intel Xeon Phi co-processors. In this work, the following general scheme of seismic tomography is utilized. Using the eikonal equation solver, arrival times of seismic waves are computed based on assumed velocity model of geologic structure being analyzed. In order to solve the linearized inverse problem, tomographic matrix is computed that connects model adjustments with travel time residuals, and the resulting system of linear equations is regularized and solved to adjust the model. The effectiveness of parallel implementations of existing algorithms on target architectures is considered. During the first stage of this work, algorithms were developed for execution on supercomputers using multicore CPUs only, with preliminary performance tests showing good parallel efficiency on large numerical grids. Porting of the algorithms to hybrid supercomputers is currently ongoing.
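
    A compact sketch of the linearized inversion step described above: solving the regularized normal equations (G^T G + lambda*I) dm = G^T dt for slowness updates dm from travel-time residuals dt. The tomographic matrix G here is random stand-in data rather than eikonal-solver output, and the dense solve is an illustrative simplification of what a production (and parallel) implementation would do.

        import numpy as np

        rng = np.random.default_rng(3)
        n_rays, n_cells = 400, 100
        G = rng.random((n_rays, n_cells))     # ray-path lengths per cell (stand-in data)
        dt = rng.normal(0.0, 0.01, n_rays)    # observed-minus-predicted travel times (s)

        lam = 1.0                             # Tikhonov regularization weight (assumed)
        A = G.T @ G + lam * np.eye(n_cells)   # regularized normal equations
        dm = np.linalg.solve(A, G.T @ dt)     # slowness model update
        print(dm[:5])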

  17. Unexpected Control Structure Interaction on International Space Station

    NASA Technical Reports Server (NTRS)

    Gomez, Susan F.; Platonov, Valery; Medina, Elizabeth A.; Borisenko, Alexander; Bogachev, Alexey

    2017-01-01

    On June 23, 2011, the International Space Station (ISS) was performing a routine 180 degree yaw maneuver in support of a Russian vehicle docking when the on board Russian Segment (RS) software unexpectedly declared two attitude thrusters failed and switched thruster configurations in response to unanticipated ISS dynamic motion. Flight data analysis after the maneuver indicated that higher than predicted structural loads had been induced at various locations on the United States (U.S.) segment of the ISS. Further analysis revealed that the attitude control system was firing thrusters in response to both structural flex and rigid body rates, which resonated the structure and caused high loads and fatigue cycles. It was later determined that the thruster themselves were healthy. The RS software logic, which was intended to react to thruster failures, had instead been heavily influenced by interaction between the control system and structural flex. This paper will discuss the technical aspects of the control structure interaction problem that led to the RS control system firing thrusters in response to structural flex, the factors that led to insufficient preflight analysis of the thruster firings, and the ramifications the event had on the ISS. An immediate consequence included limiting which thrusters could be used for attitude control. This complicated the planning of on-orbit thruster events and necessitated the use of suboptimal thruster configurations that increased propellant usage and caused thruster lifetime usage concerns. In addition to the technical aspects of the problem, the team dynamics and communication shortcomings that led to such an event happening in an environment where extensive analysis is performed in support of human space flight will also be examined. Finally, the technical solution will be presented, which required a multidisciplinary effort between the U.S. and Russian control system engineers and loads and dynamics structural engineers to develop and implement an extensive modification in the RS software logic for ISS attitude control thruster firings.

  18. FDSTools: A software package for analysis of massively parallel sequencing data with the ability to recognise and correct STR stutter and other PCR or sequencing noise.

    PubMed

    Hoogenboom, Jerry; van der Gaag, Kristiaan J; de Leeuw, Rick H; Sijen, Titia; de Knijff, Peter; Laros, Jeroen F J

    2017-03-01

    Massively parallel sequencing (MPS) is on the verge of broad-scale application in forensic research and casework. The improved capability to analyse evidentiary traces representing unbalanced mixtures is often mentioned as one of the major advantages of this technique. However, most of the available software packages that analyse forensic short tandem repeat (STR) sequencing data are not well suited for high-throughput analysis of such mixed traces. The largest challenge is the presence of stutter artefacts in STR amplifications, which are not readily discerned from minor contributions. FDSTools is an open-source software solution developed for this purpose. The level of stutter formation is influenced by various aspects of the sequence, such as the length of the longest uninterrupted stretch occurring in an STR. When MPS is used, STRs are evaluated as sequence variants that each have particular stutter characteristics which can be precisely determined. FDSTools uses a database of reference samples to determine stutter and other systemic PCR or sequencing artefacts for each individual allele. In addition, stutter models are created for each repeating element in order to predict stutter artefacts for alleles that are not included in the reference set. This information is subsequently used to recognise and compensate for the noise in a sequence profile. The result is a better representation of the true composition of a sample. Using Promega Powerseq™ Auto System data from 450 reference samples and 31 two-person mixtures, we show that the FDSTools correction module decreases stutter ratios above 20% to below 3%. Consequently, much lower levels of contributions in the mixed traces are detected. FDSTools contains modules to visualise the data in an interactive format, allowing users to filter data with their own preferred thresholds. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
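
    A toy sketch of the stutter-compensation idea: subtracting each allele's expected minus-one-repeat stutter from the observed read counts so that minor contributions stand out. The counts and the single flat stutter ratio are invented; FDSTools instead derives per-allele stutter models from reference samples.

        # Observed reads per allele length at one hypothetical STR marker.
        observed = {13: 450, 12: 110, 11: 20}
        stutter_ratio = 0.12    # assumed expected stutter from length n to n-1

        corrected = dict(observed)
        for allele in sorted(observed, reverse=True):
            stutter = corrected[allele] * stutter_ratio
            if allele - 1 in corrected:
                corrected[allele - 1] = max(corrected[allele - 1] - stutter, 0.0)

        # Residual reads at lengths 12 and 11 may now reveal a minor contributor.
        print(corrected)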

  19. Next Generation Models for Storage and Representation of Microbial Biological Annotation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quest, Daniel J; Land, Miriam L; Brettin, Thomas S

    2010-01-01

    Background Traditional genome annotation systems were developed in a very different computing era, one where the World Wide Web was just emerging. Consequently, these systems are built as centralized black boxes focused on generating high quality annotation submissions to GenBank/EMBL supported by expert manual curation. The exponential growth of sequence data drives a growing need for increasingly higher quality and automatically generated annotation. Typical annotation pipelines utilize traditional database technologies, clustered computing resources, Perl, C, and UNIX file systems to process raw sequence data, identify genes, and predict and categorize gene function. These technologies tightly couple the annotation software system to hardware and third party software (e.g. relational database systems and schemas). This makes annotation systems hard to reproduce, inflexible to modification over time, difficult to assess, difficult to partition across multiple geographic sites, and difficult to understand for those who are not domain experts. These systems are not readily open to scrutiny and therefore not scientifically tractable. The advent of Semantic Web standards such as Resource Description Framework (RDF) and OWL Web Ontology Language (OWL) enables us to construct systems that address these challenges in a new comprehensive way. Results Here, we develop a framework for linking traditional data to OWL-based ontologies in genome annotation. We show how data standards can decouple hardware and third party software tools from annotation pipelines, thereby making annotation pipelines easier to reproduce and assess. An illustrative example shows how TURTLE (Terse RDF Triple Language) can be used as a human readable, but also semantically-aware, equivalent to GenBank/EMBL files. Conclusions The power of this approach lies in its ability to assemble annotation data from multiple databases across multiple locations into a representation that is understandable to researchers. In this way, all researchers, experimental and computational, will more easily understand the informatics processes constructing genome annotation and ultimately be able to help improve the systems that produce them.
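
    To make the TURTLE point concrete, here is a minimal rdflib sketch (the namespace and property names are invented for illustration, not a published annotation ontology) that expresses a GenBank-style gene feature as triples and serializes it as human-readable Turtle:

```python
from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import RDFS

# Hypothetical annotation namespace; a real pipeline would use a published ontology.
ANN = Namespace("http://example.org/annotation#")

g = Graph()
g.bind("ann", ANN)

gene = ANN["gene_0001"]
g.add((gene, RDF.type, ANN.Gene))
g.add((gene, RDFS.label, Literal("dnaA")))
g.add((gene, ANN.start, Literal(190)))
g.add((gene, ANN.end, Literal(1542)))
g.add((gene, ANN.strand, Literal("+")))
g.add((gene, ANN.product, Literal("chromosomal replication initiator protein DnaA")))

# Terse, human-readable, and still machine-queryable -- the point made above.
print(g.serialize(format="turtle"))
```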

  20. Adaptation of G-TAG Software for Validating Touch-and-Go Comet Surface Sampling Design Methodology

    NASA Technical Reports Server (NTRS)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    The G-TAG software tool was developed under the R&TD on Integrated Autonomous Guidance, Navigation, and Control for Comet Sample Return, and represents a novel, multi-body dynamics simulation software tool for studying TAG sampling. The G-TAG multi-body simulation tool provides a simulation environment in which a Touch-and-Go (TAG) sampling event can be extensively tested. TAG sampling requires the spacecraft to descend to the surface, contact the surface with a sampling collection device, and then ascend to a safe altitude. The TAG event lasts only a few seconds but is mission-critical with potentially high risk. Consequently, there is a need for the TAG event to be well characterized and studied by simulation and analysis in order for the proposal teams to converge on a reliable spacecraft design. This adaptation of the G-TAG tool was developed to support the Comet Odyssey proposal effort, and specifically focuses on comet sample return missions. In this application, the spacecraft descends to and samples from the surface of a comet. Performance of the spacecraft during TAG is assessed based on survivability and sample collection performance. For the adaptation of the G-TAG simulation tool to comet scenarios, models were developed that accurately describe the properties of the spacecraft, approach trajectories, and descent velocities, as well as the external forces and torques acting on the spacecraft. The adapted models of the spacecraft, descent profiles, and external sampling forces/torques are more sophisticated and customized for comets than those available in the basic G-TAG simulation tool. Scenarios implemented include the study of variations in requirements, spacecraft design (size, location, etc., of spacecraft components), and the environment (surface properties, slope, disturbances, etc.). The simulations, along with their visual representations using G-View, contributed to the Comet Odyssey New Frontiers proposal effort by indicating problems and/or benefits of different approaches and designs.
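
    A much-simplified flavour of such a simulation, reduced to one vertical degree of freedom with spring-damper surface contact and invented constants (the real G-TAG tool is a full multi-body simulator), can be sketched as follows:

```python
# 1-DOF touch-and-go sketch: descend, spring-damper surface contact, thruster ascent.
# All constants are illustrative only; G-TAG itself is a multi-body simulator.
m = 500.0            # spacecraft mass [kg]
k, c = 2.0e4, 800.0  # surface contact stiffness [N/m] and damping [N s/m]
g_comet = 1.0e-4     # comet surface gravity [m/s^2]
thrust = 50.0        # ascent thrust [N]
dt = 1.0e-3

z, v = 5.0, -0.1     # altitude [m], descent velocity [m/s]
touched = False
peak_force = 0.0
for step in range(400000):
    contact = -k*z - c*v if z < 0.0 else 0.0   # surface pushes up when z < 0
    touched = touched or z < 0.0
    f_thrust = thrust if touched else 0.0      # fire ascent thrusters after contact
    a = (contact + f_thrust) / m - g_comet
    v += a * dt
    z += v * dt
    peak_force = max(peak_force, contact)
    if touched and z > 5.0:                    # back at a safe altitude
        break
print(f"peak contact force {peak_force:.1f} N, ascent reached: {z > 5.0}")
```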

  1. Visualising biological data: a semantic approach to tool and database integration.

    PubMed

    Pettifer, Steve; Thorne, David; McDermott, Philip; Marsh, James; Villéger, Alice; Kell, Douglas B; Attwood, Teresa K

    2009-06-16

    In the biological sciences, the need to analyse vast amounts of information has become commonplace. Such large-scale analyses often involve drawing together data from a variety of different databases, held remotely on the internet or locally on in-house servers. Supporting these tasks are ad hoc collections of data-manipulation tools, scripting languages and visualisation software, which are often combined in arcane ways to create cumbersome systems that have been customized for a particular purpose, and are consequently not readily adaptable to other uses. For many day-to-day bioinformatics tasks, the sizes of current databases, and the scale of the analyses necessary, now demand increasing levels of automation; nevertheless, the unique experience and intuition of human researchers is still required to interpret the end results in any meaningful biological way. Putting humans in the loop requires tools to support real-time interaction with these vast and complex data-sets. Numerous tools do exist for this purpose, but many do not have optimal interfaces, most are effectively isolated from other tools and databases owing to incompatible data formats, and many have limited real-time performance when applied to realistically large data-sets: much of the user's cognitive capacity is therefore focused on controlling the software and manipulating esoteric file formats rather than on performing the research. To confront these issues, harnessing expertise in human-computer interaction (HCI), high-performance rendering and distributed systems, and guided by bioinformaticians and end-user biologists, we are building reusable software components that, together, create a toolkit that is both architecturally sound from a computing point of view, and addresses both user and developer requirements. Key to the system's usability is its direct exploitation of semantics, which, crucially, gives individual components knowledge of their own functionality and allows them to interoperate seamlessly, removing many of the existing barriers and bottlenecks from standard bioinformatics tasks. The toolkit, named Utopia, is freely available from http://utopia.cs.man.ac.uk/.

  2. Enhancing outpatient clinics management software by reducing patients' waiting time.

    PubMed

    Almomani, Iman; AlSarheed, Ahlam

    The Kingdom of Saudi Arabia (KSA) gives great attention to improving the quality of services provided by health care sectors, including outpatient clinics. One of the main drawbacks in outpatient clinics is the long waiting time for patients, which affects the level of patient satisfaction and the quality of services. This article addresses this problem by studying the Outpatient Management Software (OMS) and proposing solutions to reduce waiting times. Many hospitals around the world apply solutions to overcome the problem of long waiting times in outpatient clinics, such as hospitals in the USA, China, Sri Lanka, and Taiwan. These clinics have succeeded in reducing wait times by 15%, 78%, 60% and 50%, respectively. Such solutions depend mainly on adding more human resources or changing some business or management policies. The solutions presented in this article reduce waiting times by enhancing the software used to manage outpatient clinics services. Both quantitative and qualitative methods have been used to understand the current OMS and examine the level of patient satisfaction. Five main problems that may cause high or unmeasured waiting time have been identified: appointment type, ticket numbering, late doctor arrival, early patient arrival, and patients' distribution lists. These problems have been mapped to the corresponding OMS components. Solutions to the above problems have been introduced and evaluated analytically or by simulation experiments. Evaluation of the results shows a reduction in patient waiting time. Solving late doctor arrival issues can reduce the clinic service time by up to 20%, while solutions for early-arriving patients reduce vital time by 53.3%, clinic time by 20%, and the total waiting time by 30.3% overall. Finally, well-designed patient-distribution lists yield a 54.2% improvement. Improvements introduced to patients' waiting time will consequently affect patients' satisfaction and improve the quality of health care services. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.
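
    The kind of analysis behind the late-doctor-arrival figure can be approximated with a simple single-server simulation (a minimal sketch with invented arrival and service parameters, not the OMS itself):

```python
import random

# Single-clinic sketch: patients booked every 10 min, service ~ Exp(mean 9 min).
# All parameters are invented for illustration.
def mean_wait(doctor_delay_min, n_patients=40, slot=10.0, mean_service=9.0, seed=1):
    rng = random.Random(seed)
    server_free = doctor_delay_min       # doctor starts seeing patients late
    total_wait = 0.0
    for i in range(n_patients):
        arrival = i * slot               # punctual, evenly booked patients
        start = max(arrival, server_free)
        total_wait += start - arrival
        server_free = start + rng.expovariate(1.0 / mean_service)
    return total_wait / n_patients

print(f"on-time doctor : {mean_wait(0):5.1f} min average wait")
print(f"30 min late    : {mean_wait(30):5.1f} min average wait")
```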

  3. Software-Realized Scaffolding to Facilitate Programming for Science Learning.

    ERIC Educational Resources Information Center

    Guzdial, Mark

    1994-01-01

    Discussion of the use of programming as a learning activity focuses on software-realized scaffolding. Emile, software that facilitates programming for modeling and simulation in physics, is described, and results of an evaluation of the use of Emile with high school students are reported. (Contains 95 references.) (LRW)

  4. A Call for Bioimaging Software Usability

    PubMed Central

    Carpenter, Anne E.; Kamentsky, Lee; Eliceiri, Kevin W.

    2013-01-01

    Bioimaging software developed in a research setting often fails to be widely used by the scientific community. We suggest that, to maximize both the public’s and researchers’ investments, usability should be a more highly valued goal. We describe specific characteristics of usability towards which bioimaging software projects should aim. PMID:22743771

  5. Software Repository

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1983-01-01

    The Common Software Module Repository (CSMR) is a computerized library system with high product and service visibility to potential users. The online capabilities of the system allow both the librarian and the user to interact with the library. The librarian is responsible for maintaining information in the CSMR library. The user searches the library to locate software modules that meet his or her current needs.

  6. ESSCOTS for Learning: Transforming Commercial Software into Powerful Educational Tools.

    ERIC Educational Resources Information Center

    McArthur, David; And Others

    1995-01-01

    Gives an overview of Educational Support Systems based on commercial off-the-shelf software (ESSCOTS), and discusses the benefits of developing such educational software. Presents results of a study that revealed the learning processes of middle and high school students who used a geographical information system. (JMV)

  7. Future of Software Engineering Standards

    NASA Technical Reports Server (NTRS)

    Poon, Peter T.

    1997-01-01

    In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools as well as consumer-oriented systems.

  8. Evaluating Games-Based Learning

    ERIC Educational Resources Information Center

    Hainey, Thomas; Connolly, Thomas

    2010-01-01

    A highly important part of software engineering education is requirements collection and analysis, one of the initial stages of the Software Development Lifecycle. No other conceptual work is as difficult to rectify at a later stage or as damaging to the overall system if performed incorrectly. As software engineering is a field with a reputation…

  9. Analysis of Five Instructional Methods for Teaching Sketchpad to Junior High Students

    ERIC Educational Resources Information Center

    Wright, Geoffrey; Shumway, Steve; Terry, Ronald; Bartholomew, Scott

    2012-01-01

    This manuscript addresses a problem teachers of computer software applications face today: What is an effective method for teaching new computer software? Technology and engineering teachers, specifically those with communications and other related courses that involve computer software applications, face this problem when teaching computer…

  10. Campus-Based Practices for Promoting Student Success: Software Solutions. Research Brief

    ERIC Educational Resources Information Center

    Horn, Aaron S.; Reinert, Leah; Reis, Michael

    2015-01-01

    Colleges and universities are increasingly adopting various software solutions to raise degree completion rates and lower costs (Ferguson, 2012; Vendituoli, 2014; Yanosky, 2014). Student success software, also known as Integrated Planning and Advising Services (IPAS), appears to be in high demand among both students and faculty (Dahlstrom &…

  11. NASA PC software evaluation project

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kuan, Julie C.

    1986-01-01

    The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff to understand the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. The software categories evaluated include Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, and General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.

  12. Recent developments in software tools for high-throughput in vitro ADME support with high-resolution MS.

    PubMed

    Paiva, Anthony; Shou, Wilson Z

    2016-08-01

    The last several years have seen the rapid adoption of high-resolution MS (HRMS) for bioanalytical support of high-throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high-throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of the large data file sizes typical of HRMS analyses.

  13. RT-Syn: A real-time software system generator

    NASA Technical Reports Server (NTRS)

    Setliff, Dorothy E.

    1992-01-01

    This paper presents research into providing highly reusable and maintainable components by using automatic software synthesis techniques. The proposed approach uses domain knowledge combined with automatic software synthesis techniques to engineer large-scale mission-critical real-time software. The hypothesis centers on a software synthesis architecture that specifically incorporates application-specific (in this case real-time) knowledge. This architecture synthesizes complex system software to meet a behavioral specification and external interaction design constraints. Some examples of these external constraints are communication protocols, precisions, timing, and space limitations. The incorporation of application-specific knowledge facilitates the generation of mathematical software metrics which are used to narrow the design space, thereby making software synthesis tractable. Success has the potential to dramatically reduce mission-critical system life-cycle costs, not only by reducing development time but, more importantly, by facilitating the maintenance, modification, and extension of complex mission-critical software systems, activities which currently dominate life-cycle costs.

  14. A Validation Framework for the Long Term Preservation of High Energy Physics Data

    NASA Astrophysics Data System (ADS)

    Ozerov, Dmitri; South, David M.

    2014-06-01

    The study group on data preservation in high energy physics, DPHEP, is moving to a new collaboration structure, which will focus on the implementation of preservation projects, such as those described in the group's large scale report published in 2012. One such project is the development of a validation framework, which checks the compatibility of evolving computing environments and technologies with the experiments' software for as long as possible, with the aim of substantially extending the lifetime of the analysis software, and hence the usability of the data. The framework is designed to automatically test and validate the software and data of an experiment against changes and upgrades to the computing environment, as well as changes to the experiment software itself. Technically, this is realised using a framework capable of hosting a number of virtual machine images, built with different configurations of operating systems and the relevant software, including any necessary external dependencies.

  15. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    NASA Technical Reports Server (NTRS)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
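
    The classification scheme, learning a tree over module metrics that separates high-effort modules from the rest, follows the familiar pattern sketched below (scikit-learn on synthetic data; the original study used its own tree generator and 74 NASA-collected metrics):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 400
# Synthetic module metrics: size (SLOC), cyclomatic complexity, change count.
X = np.column_stack([
    rng.lognormal(6.0, 1.0, n),    # source lines
    rng.lognormal(2.0, 0.7, n),    # complexity
    rng.poisson(5, n),             # changes
])
# Synthetic label: "high effort" driven by size and complexity, plus noise.
y = ((X[:, 0] > 600) & (X[:, 1] > 8) | (rng.random(n) < 0.05)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print(f"holdout accuracy: {tree.score(X_te, y_te):.2f}")
print(export_text(tree, feature_names=["sloc", "complexity", "changes"]))
```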

  16. Metronome LKM: An open source virtual keyboard driver to measure experiment software latencies.

    PubMed

    Garaizar, Pablo; Vadillo, Miguel A

    2017-10-01

    Experiment software is often used to measure reaction times gathered with keyboards or other input devices. In previous studies, the accuracy and precision of time stamps has been assessed through several means: (a) generating accurate square wave signals from an external device connected to the parallel port of the computer running the experiment software, (b) triggering the typematic repeat feature of some keyboards to get an evenly separated series of keypress events, or (c) using a solenoid handled by a microcontroller to press the input device (keyboard, mouse button, touch screen) that will be used in the experimental setup. Despite the advantages of these approaches in some contexts, none of them can isolate the measurement error caused by the experiment software itself. Metronome LKM provides a virtual keyboard to assess an experiment's software. Using this open source driver, researchers can generate keypress events using high-resolution timers and compare the time stamps collected by the experiment software with those gathered by Metronome LKM (with nanosecond resolution). Our software is highly configurable (in terms of keys pressed, intervals, SysRq activation) and runs on 2.6-4.8 Linux kernels.
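
    A user-space analogue of the idea can be put together with python-evdev's UInput, which also creates a virtual keyboard, though without the kernel module's in-kernel bookkeeping (a hedged sketch; Metronome LKM itself is a Linux kernel driver):

```python
import time
from evdev import UInput, ecodes as e

# Create a virtual keyboard and emit evenly spaced keypresses, logging the
# monotonic time of each injection so it can be compared with the time stamps
# recorded by the experiment software under test.
# (User-space sketch only; requires access to /dev/uinput.)
ui = UInput()
injection_times = []
try:
    for _ in range(100):
        injection_times.append(time.monotonic_ns())
        ui.write(e.EV_KEY, e.KEY_SPACE, 1)   # key down
        ui.write(e.EV_KEY, e.KEY_SPACE, 0)   # key up
        ui.syn()                             # flush the event batch
        time.sleep(0.5)                      # 500 ms inter-press interval
finally:
    ui.close()

intervals = [(b - a) / 1e6 for a, b in zip(injection_times, injection_times[1:])]
print(f"mean inter-press interval: {sum(intervals)/len(intervals):.3f} ms")
```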

  17. Multimedia software to help caregivers cope.

    PubMed

    Chambers, Mary G; Connor, Samantha L; McGonigle, Mary; Diver, Mike G

    2003-01-01

    This report describes the design and evaluation of a software application to help carers cope when faced with caring problems and emergencies. The design process involved users at each stage to ensure the content of the software application was appropriate, and the research team carefully considered the requirements of disabled and elderly users. Focus group discussions and individual interviews were conducted in five European countries to ascertain the needs of caregivers in this area. The findings were used to design a three-part multimedia software application to help family caregivers prepare to cope with sudden, unexpected, and difficult situations that may arise during their time as a caregiver. This prototype then was evaluated via user trials and usability questionnaires to consider the usability and acceptance of the application and any changes that may be required. User acceptance of the software application was high, and the key features of usability such as content, appearance, and navigation were highly rated. In general, comments were positive and enthusiastic regarding the content of the software application and relevance to the caring situation. The software application has the potential to offer information and support to those who are caring for the elderly and disabled at home and to help them prepare for a crisis.

  18. Developing interpretable models with optimized set reduction for identifying high risk software components

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1993-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault frequency components so that testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents the Optimized Set Reduction approach for constructing such models, intended to fulfill specific software engineering needs. Our approach to classification is to measure the software system and build multivariate stochastic models for predicting high risk system components. We present experimental results obtained by classifying Ada components into two classes: is or is not likely to generate faults during system and acceptance test. Also, we evaluate the accuracy of the model and the insights it provides into the error making process.

  19. Hardware/software codesign for embedded RISC core

    NASA Astrophysics Data System (ADS)

    Liu, Peng

    2001-12-01

    This paper describes the hardware/software codesign method of the extendible embedded RISC core VIRGO, which is based on the MIPS-I instruction set architecture. VIRGO is described in the Verilog hardware description language, has a five-stage pipeline with a shared 32-bit cache/memory interface, and is controlled by a distributed control scheme. Every pipeline stage has one small controller, which controls the pipeline stage status and the cooperation among the pipeline stages. Since the description uses a high-level language and the control structure is distributed, the VIRGO core is highly extensible and can meet the requirements of the application. Taking the high-definition television MPEG2 MPHL decoder chip as a case study, we constructed a hardware/software codesign virtual prototyping machine that supports research on the VIRGO core instruction set architecture, system-on-chip memory size requirements, system-on-chip software, etc. The virtual prototyping machine platform also allows evaluation of the system-on-chip design and the RISC instruction set.

  20. Evaluation of an Area-Based matching algorithm with advanced shape models

    NASA Astrophysics Data System (ADS)

    Re, C.; Roncella, R.; Forlani, G.; Cremonese, G.; Naletto, G.

    2014-04-01

    Nowadays, the scientific institutions involved in planetary mapping are working on new strategies to produce accurate high resolution DTMs from space images at planetary scale, usually dealing with extremely large data volumes. From a methodological point of view, despite the introduction of a series of new algorithms for image matching (e.g. the Semi Global Matching) that yield superior results (especially because they usually produce smooth and continuous surfaces) with lower processing times, the preference in this field still goes to well established area-based matching techniques. Many efforts are consequently directed at improving each phase of the photogrammetric process, from image pre-processing to DTM interpolation. In this context, the Dense Matcher software (DM) developed at the University of Parma has been recently optimized to cope with very high resolution images provided by the most recent missions (LROC NAC and HiRISE), focusing mainly on improving the correlation phase and automating the process. Important changes have been made to the correlation algorithm, still maintaining its high performance in terms of precision and accuracy, by implementing an advanced version of the Least Squares Matching (LSM) algorithm. In particular, an iterative algorithm has been developed to adapt the geometric transformation in image resampling using different shape functions, as originally proposed by other authors in different applications.
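
    The core of an LSM-style matcher, iteratively solving a linearized least squares system for the transformation that best maps the template onto the search patch, reduces in the translation-only case to the sketch below (NumPy/SciPy; the Dense Matcher estimates a fuller affine shape function with radiometric parameters):

```python
import numpy as np
from scipy import ndimage

def lsm_translation(template, search, r0, c0, iters=25):
    """Estimate the sub-pixel (row, col) offset of `template` inside `search`,
    starting from the approximate location (r0, c0).

    Translation-only least squares matching: linearize the resampled patch
    around the current offset and solve the 2x2 normal equations per iteration.
    (Illustrative sketch; full LSM also estimates affine shape and radiometry.)
    """
    rows, cols = np.mgrid[0:template.shape[0], 0:template.shape[1]].astype(float)
    dr = dc = 0.0
    for _ in range(iters):
        warped = ndimage.map_coordinates(
            search, [rows + r0 + dr, cols + c0 + dc], order=1)
        grow, gcol = np.gradient(warped)          # image gradients (row, col)
        A = np.column_stack([grow.ravel(), gcol.ravel()])
        r = (template - warped).ravel()           # intensity residuals
        (step_r, step_c), *_ = np.linalg.lstsq(A, r, rcond=None)
        dr, dc = dr + step_r, dc + step_c
        if np.hypot(step_r, step_c) < 1e-4:       # convergence test
            break
    return dr, dc

# Synthetic check: shift a smooth image by (0.7, -1.3) px and recover the shift.
base = ndimage.gaussian_filter(np.random.default_rng(0).random((80, 80)), 3)
template = base[20:50, 20:50]
search = ndimage.shift(base, (0.7, -1.3))
print(lsm_translation(template, search, 20, 20))  # approximately (0.7, -1.3)
```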

  1. Meeting the memory challenges of brain-scale network simulation.

    PubMed

    Kunkel, Susanne; Potjans, Tobias C; Eppler, Jochen M; Plesser, Hans Ekkehard; Morrison, Abigail; Diesmann, Markus

    2011-01-01

    The development of high-performance simulation software is crucial for studying the brain connectome. Using connectome data to generate neurocomputational models requires software capable of coping with models on a variety of scales: from the microscale, investigating plasticity and dynamics of circuits in local networks, to the macroscale, investigating the interactions between distinct brain regions. Prior to any serious dynamical investigation, the first task of network simulations is to check the consistency of data integrated in the connectome and constrain ranges for yet unknown parameters. Thanks to distributed computing techniques, it is possible today to routinely simulate local cortical networks of around 10^5 neurons with up to 10^9 synapses on clusters and multi-processor shared-memory machines. However, brain-scale networks are orders of magnitude larger than such local networks, in terms of numbers of neurons and synapses as well as in terms of computational load. Such networks have been investigated in individual studies, but the underlying simulation technologies have neither been described in sufficient detail to be reproducible nor made publicly available. Here, we discover that as the network model sizes approach the regime of meso- and macroscale simulations, memory consumption on individual compute nodes becomes a critical bottleneck. This is especially relevant on modern supercomputers such as the Blue Gene/P architecture where the available working memory per CPU core is rather limited. We develop a simple linear model to analyze the memory consumption of the constituent components of neuronal simulators as a function of network size and the number of cores used. This approach has multiple benefits. The model enables identification of key contributing components to memory saturation and prediction of the effects of potential improvements to code before any implementation takes place. As a consequence, development cycles can be shorter and less expensive. Applying the model to our freely available Neural Simulation Tool (NEST), we identify the software components dominant at different scales, and develop general strategies for reducing the memory consumption, in particular by using data structures that exploit the sparseness of the local representation of the network. We show that these adaptations enable our simulation software to scale up to the order of 10,000 processors and beyond. As memory consumption issues are likely to be relevant for any software dealing with complex connectome data on such architectures, our approach and our findings should be useful for researchers developing novel neuroinformatics solutions to the challenges posed by the connectome project.
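
    The flavour of such a linear memory model (with invented coefficients; the paper develops and measures the actual NEST model) is easy to state in code, and it makes the bottleneck visible: terms tied to locally stored neurons and synapses shrink as nodes are added, while per-node bookkeeping that scales with the total network size does not:

```python
def memory_per_node_gb(n_neurons, n_synapses_per_neuron, n_nodes,
                       base=0.5, b_neuron=1.5e-6, b_synapse=4.0e-8,
                       b_infra=1.0e-9):
    """Estimate memory per compute node for a distributed neuronal simulation.

    Linear model in the spirit described above (all coefficients invented):
      base      -- fixed simulator footprint per node [GB]
      b_neuron  -- cost of each locally represented neuron [GB]
      b_synapse -- cost of each locally stored synapse [GB]
      b_infra   -- per-node bookkeeping that scales with *total* network size,
                   e.g. dense neuron lookup tables [GB]
    """
    local_neurons = n_neurons / n_nodes
    local_synapses = local_neurons * n_synapses_per_neuron
    return (base + b_neuron * local_neurons
            + b_synapse * local_synapses
            + b_infra * n_neurons)   # the term sparse data structures attack

# Doubling the node count halves the local terms but not the bookkeeping term,
# which is why it eventually dominates at brain scale.
for nodes in (1_000, 10_000, 100_000):
    print(nodes, f"{memory_per_node_gb(1e9, 1e4, nodes):.2f} GB/node")
```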

  2. Semi-automated software to measure luminal and stromal areas of choroid in optical coherence tomographic images.

    PubMed

    Sonoda, Shozo; Sakamoto, Taiji; Kakiuchi, Naoko; Shiihara, Hideki; Sakoguchi, Tomonori; Tomita, Masatoshi; Yamashita, Takehiro; Uchino, Eisuke

    2018-03-01

    To determine the capabilities of the "EyeGround" software in measuring choroidal cross sectional areas in optical coherence tomographic (OCT) images. Cross sectional, prospective study. The cross-sectional area of the subfoveal choroid within a 1500 µm diameter circle centered on the fovea was measured both with and without using the EyeGround software in the OCT images. The differences between the evaluation times and the results of the measurements were compared. The inter-rater, intra-rater, and inter-method agreements were determined. Fifty-one eyes of 51 healthy subjects were studied: 24 men and 27 women with an average age of 35.0 ± 8.8 years. The time for analyzing a single image was significantly shorter with the software at 3.2±1.1 min than without the software at 12.1±5.1 min (P <0.001). The inter-method correlation coefficient for the measurements of the whole choroid was high [0.989, 95% CI (0.981-0.994)]. With the software, the inter-rater correlation coefficient was significantly high [0.997, 95% CI (0.995-0.999)], and the intra-rater correlation coefficient was also significantly high [0.999, 95% CI (0.999-1.0)]. The EyeGround software can measure the choroidal area in OCT cross sectional images with good reproducibility and in a significantly shorter time. It can be a valuable tool for analyzing the choroid.

  3. Pilot/Vehicle display development from simulation to flight

    NASA Technical Reports Server (NTRS)

    Dare, Alan R.; Burley, James R., II

    1992-01-01

    The Pilot Vehicle Interface Group, Cockpit Technology Branch, Flight Management Division, at the NASA Langley Research Center is developing display concepts for air combat in the next generation of highly maneuverable aircraft. The High-Alpha Technology Program, under which the research is being done, is involved in flight tests of many new control and display concepts on the High-Alpha Research Vehicle, a highly modified F-18 aircraft. In order to support display concept development through flight testing, a software/hardware system is being developed which will support each phase of the project with little or no software modifications, thus saving thousands of manhours in software development time. Simulation experiments are in progress now and flight tests are slated to begin in FY1994.

  4. Effect of Unifocal versus Multifocal Lenses on Cervical Spine Posture in Patients with Presbyopia.

    PubMed

    Abbas, Rami L; Houri, Mohamad T; Rayyan, Mohammad M; Hamada, Hamada Ahmad; Saab, Ibtissam M

    2018-04-04

    There are many environmental considerations which may or may not lead to the development of faulty cervical mechanics. The design of near vision lenses could contribute to the development of such cervical dysfunction and consequently neck pain. Decision making regarding the proper type of lens prescription seems important for presbyopic individuals. To investigate the effect of unifocal and multifocal lenses on cervical posture. Thirty subjects (18 females and 12 males) with an age range from 40 to 64 years participated in the study. Each subject wore both unifocal and multifocal lenses consecutively, in random order, while reading. Lateral cervical spine X-ray films were then taken for each subject during each lens wear. X-ray films were analyzed with digital software (AutoCAD, 2D) to measure segmental angles of the cervical vertebrae (Occiput/C1, C1/C2, C2/C3, C3/C4, C4/C5, C5/C6, C6/C7, C3/C7, C0/C3, and Occiput/C7). Significantly higher extension angles in the segments C0/C7, C1/C2, C5/C6, C6/C7, and C3/C7 (p<0.05) were observed during multifocal lens wear, in contrast with higher flexion angles between C3/C4 and C4/C5 (p<0.05) during unifocal lens wear. Multifocal lens spectacles produce increased extension in the cervical vertebral angles when compared with the use of unifocal lenses.

  5. Estimating Consequences of MMOD Penetrations on ISS

    NASA Technical Reports Server (NTRS)

    Evans, H.; Hyde, James; Christiansen, E.; Lear, D.

    2017-01-01

    The threat from micrometeoroid and orbital debris (MMOD) impacts on space vehicles is often quantified in terms of the probability of no penetration (PNP). However, for large spacecraft, especially those with multiple compartments, a penetration may have a number of possible outcomes. The extent of the damage (diameter of hole, crack length or penetration depth), the location of the damage relative to critical equipment or crew, crew response, and even the time of day of the penetration are among the many factors that can affect the outcome. For the International Space Station (ISS), a Monte-Carlo style software code called Manned Spacecraft Crew Survivability (MSCSurv) is used to predict the probability of several outcomes of an MMOD penetration-broadly classified as loss of crew (LOC), crew evacuation (Evac), loss of escape vehicle (LEV), and nominal end of mission (NEOM). By generating large numbers of MMOD impacts (typically in the billions) and tracking the consequences, MSCSurv allows for the inclusion of a large number of parameters and models as well as enabling the consideration of uncertainties in the models and parameters. MSCSurv builds upon the results from NASA's Bumper software (which provides the probability of penetration and critical input data to MSCSurv) to allow analysts to estimate the probability of LOC, Evac, LEV, and NEOM. This paper briefly describes the overall methodology used by NASA to quantify LOC, Evac, LEV, and NEOM with particular emphasis on describing in broad terms how MSCSurv works and its capabilities and most significant models.
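
    In very broad strokes, the Monte-Carlo bookkeeping works like the sketch below (invented penetration-size distribution, hit probabilities, and outcome thresholds; MSCSurv's actual consequence models are far more detailed and are driven by Bumper output):

```python
import random

# Toy Monte-Carlo over MMOD penetrations: sample a damage size and location,
# classify the outcome, and tally probabilities. All numbers are invented;
# MSCSurv uses Bumper-derived penetration data and detailed consequence models.
rng = random.Random(42)
N = 1_000_000
tally = {"LOC": 0, "Evac": 0, "LEV": 0, "NEOM": 0}

for _ in range(N):
    hole_mm = rng.lognormvariate(0.0, 1.0)      # penetration hole diameter [mm]
    near_crew = rng.random() < 0.05             # hit close to a crewed volume?
    hit_escape_vehicle = rng.random() < 0.03    # hit on the escape vehicle?
    if hit_escape_vehicle and hole_mm > 5.0:
        tally["LEV"] += 1
    elif near_crew and hole_mm > 10.0:
        tally["LOC"] += 1
    elif hole_mm > 3.0:
        tally["Evac"] += 1                      # depressurization forces evacuation
    else:
        tally["NEOM"] += 1                      # small leak, nominal end of mission

for outcome, count in tally.items():
    print(f"P({outcome} | penetration) ~ {count / N:.4f}")
```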

  6. Predicting the Consequences of MMOD Penetrations on the International Space Station

    NASA Technical Reports Server (NTRS)

    Hyde, James; Christiansen, E.; Lear, D.; Evans

    2018-01-01

    The threat from micrometeoroid and orbital debris (MMOD) impacts on space vehicles is often quantified in terms of the probability of no penetration (PNP). However, for large spacecraft, especially those with multiple compartments, a penetration may have a number of possible outcomes. The extent of the damage (diameter of hole, crack length or penetration depth), the location of the damage relative to critical equipment or crew, crew response, and even the time of day of the penetration are among the many factors that can affect the outcome. For the International Space Station (ISS), a Monte-Carlo style software code called Manned Spacecraft Crew Survivability (MSCSurv) is used to predict the probability of several outcomes of an MMOD penetration-broadly classified as loss of crew (LOC), crew evacuation (Evac), loss of escape vehicle (LEV), and nominal end of mission (NEOM). By generating large numbers of MMOD impacts (typically in the billions) and tracking the consequences, MSCSurv allows for the inclusion of a large number of parameters and models as well as enabling the consideration of uncertainties in the models and parameters. MSCSurv builds upon the results from NASA's Bumper software (which provides the probability of penetration and critical input data to MSCSurv) to allow analysts to estimate the probability of LOC, Evac, LEV, and NEOM. This paper briefly describes the overall methodology used by NASA to quantify LOC, Evac, LEV, and NEOM with particular emphasis on describing in broad terms how MSCSurv works and its capabilities and most significant models.

  7. Enhancing GIS Capabilities for High Resolution Earth Science Grids

    NASA Astrophysics Data System (ADS)

    Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.

    2017-12-01

    Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.

  8. CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 11

    DTIC Science & Technology

    2005-11-01

    languages. Our discipline of software engineering has really experienced phenomenal growth right before our eyes. A sign that software design has... approach on a high level of abstraction. The main emphasis is on the identification and allocation of a needed functionality (e.g., a target tracker), rather... messaging software that is the backbone of teenage culture. As increasing security constraints will increase the cost of developing and maintaining any

  9. MYRaf: A new Approach with IRAF for Astronomical Photometric Reduction

    NASA Astrophysics Data System (ADS)

    Kilic, Y.; Shameoni Niaei, M.; Özeren, F. F.; Yesilyaprak, C.

    2016-12-01

    In this study, the design and development of the MYRaf software for astronomical photometric reduction are presented. MYRaf is an easy-to-use, reliable, and fast GUI tool for IRAF aperture photometry. MYRaf is an important step towards the automated software process of robotic telescopes; it is written in the general-purpose, high-level programming language Python, uses the Qt framework, and builds on IRAF, PyRAF, matplotlib, ginga, alipy, and SExtractor.

  10. Installing and Setting Up the Git Software Tool on OS X

    Science.gov Websites

    Learn how to install the Git software tool on OS X for use with the Peregrine system. Binary installer for OS X (easiest): you can download the latest version of git from http://git-scm.com

  11. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    PubMed

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer - which measures the major and minor axial lengths of the imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and then outputs the results in two different forms in spreadsheets for easy manipulation in subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides high-throughput computation and a quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and the noisy background that often plagues automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and images of diverse quality. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model for drug screens in industry and academia.
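
    The central step, fitting a closed snake to the spheroid boundary and reading off axial lengths, can be sketched with scikit-image's active_contour on a synthetic image (smoothing and snake parameters here are illustrative; SpheroidSizer adds its own initialization and quality-control tooling):

```python
import numpy as np
from skimage.draw import ellipse
from skimage.filters import gaussian
from skimage.segmentation import active_contour

# Synthetic "spheroid": a bright ellipse on a noisy background.
img = np.zeros((200, 200))
rr, cc = ellipse(100, 100, 40, 25)
img[rr, cc] = 1.0
img += np.random.default_rng(0).normal(0, 0.1, img.shape)

# Circular initialization around the object, then snake refinement.
t = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([100 + 60 * np.sin(t), 100 + 60 * np.cos(t)])  # (row, col)
snake = active_contour(gaussian(img, 3), init, alpha=0.015, beta=10.0, gamma=0.001)

# Approximate major/minor axial lengths from the contour via PCA of its points.
centered = snake - snake.mean(axis=0)
eigvals = np.linalg.eigvalsh(np.cov(centered.T))
minor, major = 2 * np.sqrt(2 * np.sort(eigvals))   # boundary variance -> full axis
print(f"major ~ {major:.1f} px, minor ~ {minor:.1f} px (truth: 80 x 50)")
```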

  12. Glioblastoma Segmentation: Comparison of Three Different Software Packages.

    PubMed

    Fyllingen, Even Hovig; Stensjøen, Anne Line; Berntsen, Erik Magnus; Solheim, Ole; Reinertsen, Ingerid

    2016-01-01

    To facilitate a more widespread use of volumetric tumor segmentation in clinical studies, there is an urgent need for reliable, user-friendly segmentation software. The aim of this study was therefore to compare three different software packages for semi-automatic brain tumor segmentation of glioblastoma, namely BrainVoyager™ QX, ITK-Snap and 3D Slicer, and to make data available for future reference. Pre-operative, contrast enhanced T1-weighted 1.5 or 3 Tesla Magnetic Resonance Imaging (MRI) scans were obtained in 20 consecutive patients who underwent surgery for glioblastoma. MRI scans were segmented twice in each software package by two investigators. Intra-rater, inter-rater and between-software agreement was compared by using differences of means with 95% limits of agreement (LoA), Dice's similarity coefficients (DSC) and Hausdorff distance (HD). Time expenditure of segmentations was measured using a stopwatch. Eighteen tumors were included in the analyses. Inter-rater agreement was highest for BrainVoyager with difference of means of 0.19 mL and 95% LoA from -2.42 mL to 2.81 mL. Between-software agreement and 95% LoA were very similar for the different software packages. Intra-rater, inter-rater and between-software DSC were ≥ 0.93 in all analyses. Time expenditure was approximately 41 min per segmentation in BrainVoyager, and 18 min per segmentation in both 3D Slicer and ITK-Snap. Our main findings were that there is a high agreement within and between the software packages in terms of small intra-rater, inter-rater and between-software differences of means and high Dice's similarity coefficients. Time expenditure was highest for BrainVoyager, but all software packages were relatively time-consuming, which may limit usability in an everyday clinical setting.
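
    Both agreement measures used here are straightforward to compute from binary segmentation masks; a minimal NumPy/SciPy sketch:

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(a, b):
    """Dice's similarity coefficient between two boolean masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def hausdorff(a, b):
    """Symmetric Hausdorff distance between the voxel sets of two masks."""
    pa, pb = np.argwhere(a), np.argwhere(b)
    return max(directed_hausdorff(pa, pb)[0], directed_hausdorff(pb, pa)[0])

# Two overlapping square "tumors" on a 2D grid (3D works identically).
m1 = np.zeros((64, 64), bool); m1[10:40, 10:40] = True
m2 = np.zeros((64, 64), bool); m2[14:44, 12:42] = True
print(f"DSC = {dice(m1, m2):.3f}, HD = {hausdorff(m1, m2):.1f} px")
```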

  13. Optimization of Vertical Double-Diffused Metal-Oxide Semiconductor (VDMOS) Power Transistor Structure for Use in High Frequencies and Medical Devices

    PubMed Central

    Farhadi, Rozita; Farhadi, Bita

    2014-01-01

    Power transistors, such as the vertical, double-diffused, metal-oxide semiconductor (VDMOS), are used extensively in the amplifier circuits of medical devices. The aim of this research was to construct a VDMOS power transistor with an optimized structure to enhance the operation of medical devices. First, boron was implanted in silicon by implanting unclamped inductive switching (UIS) and a Faraday shield. The Faraday shield was implanted in order to replace the gate-field parasitic capacitor on the entry part of the device. Also, implanting the UIS was used in order to decrease the effect of the parasitic bipolar junction transistor (BJT) of the VDMOS power transistor. The research tool used in this study was Silvaco software. By decreasing the transistor entry resistance in the optimized VDMOS structure, power losses and noise at the entry of the transistor were decreased, and, by increasing the breakdown voltage, the lifetime of the VDMOS transistor was increased, which resulted in increasing drain flow and decreasing Ron. This consequently resulted in enhancing the operation of high-frequency medical devices that use transistors, such as Radio Frequency (RF) and electrocardiograph machines. PMID:25763152

  14. Optimization of Vertical Double-Diffused Metal-Oxide Semiconductor (VDMOS) Power Transistor Structure for Use in High Frequencies and Medical Devices.

    PubMed

    Farhadi, Rozita; Farhadi, Bita

    2014-01-01

    Power transistors, such as the vertical, double-diffused, metal-oxide semiconductor (VDMOS), are used extensively in the amplifier circuits of medical devices. The aim of this research was to construct a VDMOS power transistor with an optimized structure to enhance the operation of medical devices. First, boron was implanted in silicon by implanting unclamped inductive switching (UIS) and a Faraday shield. The Faraday shield was implanted in order to replace the gate-field parasitic capacitor on the entry part of the device. Also, implanting the UIS was used in order to decrease the effect of the parasitic bipolar junction transistor (BJT) of the VDMOS power transistor. The research tool used in this study was Silvaco software. By decreasing the transistor entry resistance in the optimized VDMOS structure, power losses and noise at the entry of the transistor were decreased, and, by increasing the breakdown voltage, the lifetime of the VDMOS transistor was increased, which resulted in increasing drain flow and decreasing Ron. This consequently resulted in enhancing the operation of high-frequency medical devices that use transistors, such as Radio Frequency (RF) and electrocardiograph machines.

  15. [Standardization and modeling of surgical processes].

    PubMed

    Strauss, G; Schmitz, P

    2016-12-01

    Due to the technological developments around the operating room, surgery in the twenty-first century is undergoing a paradigm shift. Which technologies have already been integrated into the surgical routine? How can a favorable cost-benefit balance be achieved by the implementation of new software-based assistance systems? This article presents state-of-the-art technology as exemplified by a semi-automated operation system for otorhinolaryngology surgery. The main focus is on systems for the implementation of digital handbooks and navigational functions in situ. On the basis of continuous development in digital imaging, decisions may be facilitated by individual patient models, thus allowing procedures to be optimized. The ongoing digitization and linking of all relevant information enable a high level of standardization in terms of operating procedures. This may be used by assistance systems as a basis for complete documentation and high process reliability. Automation of processes in the operating room results in an increase in quality, precision and standardization, so that the effectiveness and efficiency of treatment can be improved; however, detrimental consequences, such as loss of skills and placing too much faith in technology, must be avoided by adapted training concepts.

  16. Efficient Smart CMOS Camera Based on FPGAs Oriented to Embedded Image Processing

    PubMed Central

    Bravo, Ignacio; Baliñas, Javier; Gardel, Alfredo; Lázaro, José L.; Espinosa, Felipe; García, Jorge

    2011-01-01

    This article describes an image processing system based on an intelligent ad-hoc camera, whose two principal elements are a high speed 1.2 megapixel Complementary Metal Oxide Semiconductor (CMOS) sensor and a Field Programmable Gate Array (FPGA). The latter is used to control the various sensor parameter configurations and, where desired, to receive and process the images captured by the CMOS sensor. The flexibility and versatility offered by the new FPGA families make it possible to incorporate microprocessors into these reconfigurable devices, and these are normally used for highly sequential tasks unsuitable for parallelization in hardware. For the present study, we used a Xilinx XC4VFX12 FPGA, which contains an internal Power PC (PPC) microprocessor. In turn, this contains a standalone system which manages the FPGA image processing hardware and endows the system with multiple software options for processing the images captured by the CMOS sensor. The system also incorporates an Ethernet channel for sending processed and unprocessed images from the FPGA to a remote node. Consequently, it is possible to visualize and configure system operation and captured and/or processed images remotely. PMID:22163739

  17. Needs of ergonomic design at control units in production industries.

    PubMed

    Levchuk, I; Schäfer, A; Lang, K-H; Gebhardt, Hj; Klussmann, A

    2012-01-01

    During the last decades, an increasing use of innovative technologies in manufacturing areas has been observed. A huge amount of physical workload was removed by the change from conventional machine tools to computer-controlled units, and CNC systems spread through current production processes. Because of this, machine operators today mostly have an observational function. This has increased static work (e.g., standing, sitting) and cognitive demands (e.g., process observation). Machine operators carry high responsibility, because mistakes may lead to human injuries as well as to product losses, and in consequence to high monetary losses for the company as well. For a CNC machine, being usable often means being efficient. Intuitive usability and an ergonomic organization of CNC workplaces can be an essential basis for reducing the risk of operating failures as well as physical complaints (e.g., pain or disease because of bad body posture during work). In contrast to conventional machines, CNC machines are equipped with both hardware and software. Intuitive and transparent operation of CNC systems is a prerequisite for quickly learning new systems. Within this study, a survey was carried out among trainees learning the operation of CNC machines.

  18. Student Use of Scaffolding Software: Relationships with Motivation and Conceptual Understanding

    ERIC Educational Resources Information Center

    Butler, Kyle A.; Lumpe, Andrew

    2008-01-01

    This study was designed to theoretically articulate and empirically assess the role of computer scaffolds. In this project, several examples of educational software were developed to scaffold the learning of students performing high level cognitive activities. The software used in this study, Artemis, focused on scaffolding the learning of…

  19. Novice and Experienced Instructional Software Developers: Effects on Materials Created with Instructional Software Templates

    ERIC Educational Resources Information Center

    Boot, Eddy W.; van Merrienboer, Jeroen J. G.; Veerman, Arja L.

    2007-01-01

    The development of instructional software is a complex process, posing high demands to the technical and didactical expertise of developers. Domain specialists rather than professional developers are often responsible for it, but authoring tools with pre-structured templates claim to compensate for this limited experience. This study compares…

  20. The Impact of Software on Associate Degree Programs in Electronic Engineering Technology.

    ERIC Educational Resources Information Center

    Hata, David M.

    1986-01-01

    Assesses the range and extent of computer assisted instruction software available in electronic engineering technology education. Examines the need for software skills in four areas: (1) high-level languages; (2) assembly language; (3) computer-aided engineering; and (4) computer-aided instruction. Outlines strategies for the future in three…

  1. Cooperative GN&C development in a rapid prototyping environment. [flight software design for space vehicles

    NASA Technical Reports Server (NTRS)

    Bordano, Aldo; Uhde-Lacovara, JO; Devall, Ray; Partin, Charles; Sugano, Jeff; Doane, Kent; Compton, Jim

    1993-01-01

    The Navigation, Control and Aeronautics Division (NCAD) at NASA-JSC is exploring ways of producing Guidance, Navigation and Control (GN&C) flight software faster, better, and cheaper. To achieve these goals NCAD established two hardware/software facilities that take an avionics design project from initial inception through high fidelity real-time hardware-in-the-loop testing. Commercially available software products are used to develop the GN&C algorithms in block diagram form and then automatically generate source code from these diagrams. A high fidelity real-time hardware-in-the-loop laboratory provides users with the capability to analyze mass memory usage within the targeted flight computer, verify hardware interfaces, conduct system level verification, performance, acceptance testing, as well as mission verification using reconfigurable and mission unique data. To evaluate these concepts and tools, NCAD embarked on a project to build a real-time 6 DOF simulation of the Soyuz Assured Crew Return Vehicle flight software. To date, a productivity increase of 185 percent has been seen over traditional NASA methods for developing flight software.

  2. Effect of Software Version on the Accuracy of an Intraoral Scanning Device.

    PubMed

    Haddadi, Yasser; Bahrami, Golnosh; Isidor, Flemming

    2018-04-06

    To investigate the impact of software version on the accuracy of an intraoral scanning device. A master tooth was scanned with a high-precision optical scanner and then 10 times with a CEREC Omnicam scanner with software versions 4.4.0 and 4.4.4. Discrepancies were measured using quality control software. Mean deviation for 4.4.0 was 36.2 ± 35 μm and for 4.4.4 was 20.7 ± 14.2 μm (P ≤ .001). Software version has a significant impact on the accuracy of an intraoral scanner. It is important that researchers also publish the software version of scanners when publishing their findings.

  3. Organizational management practices for achieving software process improvement

    NASA Technical Reports Server (NTRS)

    Kandt, Ronald Kirk

    2004-01-01

    The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.

  4. High-Precision Phenotyping of Grape Bunch Architecture Using Fast 3D Sensor and Automation.

    PubMed

    Rist, Florian; Herzog, Katja; Mack, Jenny; Richter, Robert; Steinhage, Volker; Töpfer, Reinhard

    2018-03-02

    Wine growers prefer cultivars with looser bunch architecture because of the decreased risk for bunch rot. As a consequence, grapevine breeders have to select seedlings and new cultivars with regard to appropriate bunch traits. Bunch architecture is a mosaic of different single traits, which makes phenotyping labor-intensive and time-consuming. In the present study, a fast and high-precision phenotyping pipeline was developed. The optical sensor Artec Spider 3D scanner (Artec 3D, L-1466, Luxembourg) was used to generate dense 3D point clouds of grapevine bunches under lab conditions, and an automated analysis software called 3D-Bunch-Tool was developed to extract different single 3D bunch traits, i.e., the number of berries, berry diameter, single berry volume, total volume of berries, convex hull volume of grapes, bunch width and bunch length. The method was validated on whole bunches of different grapevine cultivars and phenotypically variable breeding material. Reliable phenotypic data were obtained, showing highly significant correlations (up to r² = 0.95 for berry number) with ground truth data. Moreover, it was shown that the Artec Spider can be used directly in the field, where the acquired data show precision comparable to the lab application. This non-invasive and non-contact field application facilitates the first high-precision phenotyping pipeline based on 3D bunch traits in large plant sets.
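
    The convex-hull trait mentioned above is straightforward to express in code. The following is a minimal sketch, not the authors' 3D-Bunch-Tool, of how the convex hull volume of a bunch could be computed from an (N, 3) point cloud with SciPy; the synthetic point cloud and millimeter scaling are made-up stand-ins for real scanner output.

      # Hypothetical sketch of one trait-extraction step: convex hull
      # volume of a scanned grape bunch from a 3D point cloud.
      import numpy as np
      from scipy.spatial import ConvexHull

      def convex_hull_volume(points: np.ndarray) -> float:
          """Volume enclosed by the convex hull of an (N, 3) point cloud."""
          return ConvexHull(points).volume

      # Synthetic stand-in for scanner output (units assumed to be mm).
      rng = np.random.default_rng(0)
      cloud = rng.normal(size=(5000, 3)) * [30.0, 30.0, 60.0]
      print(f"convex hull volume: {convex_hull_volume(cloud):.0f} mm^3")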

  5. 49 CFR 195.452 - Pipeline integrity management in high consequence areas.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    49 CFR Transportation, Vol. 3, revised as of 2013-10-01. § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered… Each operator covered by this section must: (1) develop a written integrity management program that addresses the risks on…

  6. 49 CFR 195.452 - Pipeline integrity management in high consequence areas.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    49 CFR Transportation, Vol. 3, revised as of 2012-10-01. § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered… Each operator covered by this section must: (1) develop a written integrity management program that addresses the risks on…

  7. 49 CFR 195.452 - Pipeline integrity management in high consequence areas.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    49 CFR Transportation, Vol. 3, revised as of 2014-10-01. § 195.452 Pipeline integrity management in high consequence areas. (a) Which pipelines are covered… Each operator covered by this section must: (1) develop a written integrity management program that addresses the risks on…

  8. Unintended Consequences of Wearable Sensor Use in Healthcare. Contribution of the IMIA Wearable Sensors in Healthcare WG.

    PubMed

    Schukat, M; McCaldin, D; Wang, K; Schreier, G; Lovell, N H; Marschollek, M; Redmond, S J

    2016-11-10

    As wearable sensors take the consumer market by storm, and medical device manufacturers move to make their devices wireless and appropriate for ambulatory use, this revolution brings with it some unintended consequences, which we aim to discuss in this paper. We discuss some important unintended consequences, both beneficial and unwanted, which relate to: modifications of behavior; creation and use of big data sets; new security vulnerabilities; and unforeseen challenges faced by regulatory authorities, struggling to keep pace with recent innovations. Where possible, we propose potential solutions to unwanted consequences. Intelligent and inclusive design processes may mitigate unintended modifications in behavior. For big data, legislating access to and use of these data will be a legal and political challenge in the years ahead, as we trade the health benefits of wearable sensors against the risk to our privacy. The wireless and personal nature of wearable sensors also exposes them to a number of unique security vulnerabilities. Regulation plays an important role in managing these security risks, but also has the dual responsibility of ensuring that wearable devices are fit for purpose. However, the burden of validating the function and security of medical devices is becoming infeasible for regulators, given the many software apps and wearable sensors entering the market each year, which are only a subset of an even larger 'internet of things'. Wearable sensors may serve to improve wellbeing, but we must be vigilant against the occurrence of unintended consequences. With collaboration between device manufacturers, regulators, and end-users, we can balance the risk of unintended consequences occurring against the incredible benefit that wearable sensors promise to bring to the world.

  9. Unintended Consequences of Wearable Sensor Use in Healthcare

    PubMed Central

    McCaldin, D.; Wang, K.; Schreier, G.; Lovell, N. H.; Marschollek, M.; Redmond, S. J.

    2016-01-01

    Objectives: As wearable sensors take the consumer market by storm, and medical device manufacturers move to make their devices wireless and appropriate for ambulatory use, this revolution brings with it some unintended consequences, which we aim to discuss in this paper. Methods: We discuss some important unintended consequences, both beneficial and unwanted, which relate to: modifications of behavior; creation and use of big data sets; new security vulnerabilities; and unforeseen challenges faced by regulatory authorities, struggling to keep pace with recent innovations. Where possible, we propose potential solutions to unwanted consequences. Results: Intelligent and inclusive design processes may mitigate unintended modifications in behavior. For big data, legislating access to and use of these data will be a legal and political challenge in the years ahead, as we trade the health benefits of wearable sensors against the risk to our privacy. The wireless and personal nature of wearable sensors also exposes them to a number of unique security vulnerabilities. Regulation plays an important role in managing these security risks, but also has the dual responsibility of ensuring that wearable devices are fit for purpose. However, the burden of validating the function and security of medical devices is becoming infeasible for regulators, given the many software apps and wearable sensors entering the market each year, which are only a subset of an even larger ‘internet of things’. Conclusion: Wearable sensors may serve to improve wellbeing, but we must be vigilant against the occurrence of unintended consequences. With collaboration between device manufacturers, regulators, and end-users, we can balance the risk of unintended consequences occurring against the incredible benefit that wearable sensors promise to bring to the world. PMID:27830234

  10. Arsenic removal from contaminated groundwater by membrane-integrated hybrid plant: optimization and control using Visual Basic platform.

    PubMed

    Chakrabortty, S; Sen, M; Pal, P

    2014-03-01

    A simulation software (ARRPA) has been developed on the Microsoft Visual Basic platform for the optimization and control of a novel membrane-integrated arsenic separation plant, in the absence of any such software. The user-friendly, menu-driven software is based on a dynamic linearized mathematical model developed for the hybrid treatment scheme. The model captures the chemical kinetics in the pre-treating chemical reactor and the separation and transport phenomena involved in nanofiltration. The software has been validated through extensive experimental investigations. The agreement between the outputs of the computer simulation program and the experimental findings is excellent and consistent under varying operating conditions, reflecting the high degree of accuracy and reliability of the software. High values of the overall correlation coefficient (R² = 0.989) and Willmott d-index (0.989) indicate the capability of the software in analyzing the performance of the plant. The software permits pre-analysis and manipulation of input data, helps in optimization, and presents the performance of an integrated plant visually on a graphical platform. Performance analysis of the whole system as well as of the individual units is possible using the tool. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for the removal of arsenic from contaminated groundwater.
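
    For readers unfamiliar with the two agreement statistics quoted above, the sketch below computes both the coefficient of determination and Willmott's d-index; the observation/prediction arrays are invented for illustration and are not the study's data.

      # Agreement statistics reported for ARRPA: R^2 and Willmott's d.
      import numpy as np

      def r_squared(obs, pred):
          """Coefficient of determination: 1 - SS_res / SS_tot."""
          ss_res = np.sum((obs - pred) ** 2)
          ss_tot = np.sum((obs - obs.mean()) ** 2)
          return 1.0 - ss_res / ss_tot

      def willmott_d(obs, pred):
          """Willmott's index of agreement:
          d = 1 - sum((P-O)^2) / sum((|P-Obar| + |O-Obar|)^2)."""
          obar = obs.mean()
          num = np.sum((pred - obs) ** 2)
          den = np.sum((np.abs(pred - obar) + np.abs(obs - obar)) ** 2)
          return 1.0 - num / den

      obs = np.array([0.92, 0.85, 0.78, 0.70, 0.64])   # invented observations
      pred = np.array([0.93, 0.84, 0.79, 0.69, 0.66])  # invented predictions
      print(r_squared(obs, pred), willmott_d(obs, pred))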

  11. EEGgui: a program used to detect electroencephalogram anomalies after traumatic brain injury.

    PubMed

    Sick, Justin; Bray, Eric; Bregy, Amade; Dietrich, W Dalton; Bramlett, Helen M; Sick, Thomas

    2013-05-21

    Identifying and quantifying pathological changes in brain electrical activity is important for investigations of brain injury and neurological disease. An example is the development of epilepsy, a secondary consequence of traumatic brain injury. While certain epileptiform events can be identified visually from electroencephalographic (EEG) or electrocorticographic (ECoG) records, quantification of these pathological events has proved to be more difficult. In this study we developed MATLAB-based software to assist detection of pathological brain electrical activity following traumatic brain injury (TBI) and present our MATLAB code used for the analysis of the ECoG. The software was developed using MATLAB™ and features of the open-access EEGLAB. EEGgui is a graphical user interface in the MATLAB programming platform that allows scientists who are not proficient in computer programming to perform a number of elaborate analyses on ECoG signals. The different analyses include Power Spectral Density (PSD), Short Time Fourier analysis and Spectral Entropy (SE). ECoG records used for demonstration of this software were derived from rats that had undergone traumatic brain injury one year earlier. The software provides a graphical user interface for displaying ECoG activity and calculating normalized power density, using the fast Fourier transform of the major brain wave frequencies (Delta, Theta, Alpha, Beta1, Beta2 and Gamma). The software further detects events in which the power density for these frequency bands exceeds normal ECoG by more than 4 standard deviations. We found that epileptic events could be identified and distinguished from a variety of ECoG phenomena associated with normal changes in behavior. We further found that analysis of spectral entropy was less effective in distinguishing epileptic from normal changes in ECoG activity. The software presented here is a successful modification of EEGLAB in the MATLAB environment that allows detection of epileptiform ECoG signals in animals after TBI. The code allows import of large EEG or ECoG data records as standard text files and uses the fast Fourier transform as a basis for detection of abnormal events. The software can also be used to monitor injury-induced changes in spectral entropy if required. We hope that the software will be useful to other investigators in the field of traumatic brain injury and will stimulate future advances in the quantitative analysis of brain electrical activity after neurological injury or disease.
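
    The detection rule described above (band power exceeding normal ECoG by more than 4 standard deviations) can be sketched compactly. The following Python re-expression is illustrative only, not the authors' MATLAB/EEGLAB code; the band edges, epoch layout and function names are assumptions.

      # Illustrative band-power thresholding in the spirit of EEGgui.
      import numpy as np
      from scipy.signal import welch

      BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
               "beta1": (13, 20), "beta2": (20, 30), "gamma": (30, 50)}

      def band_powers(epoch, fs):
          """Integrate the Welch PSD over each frequency band."""
          f, psd = welch(epoch, fs=fs, nperseg=min(len(epoch), int(fs) * 2))
          powers = {}
          for name, (lo, hi) in BANDS.items():
              m = (f >= lo) & (f < hi)
              powers[name] = np.trapz(psd[m], f[m])
          return powers

      def flag_events(epochs, fs, band="delta", k=4.0):
          """Indices of epochs whose band power exceeds mean + k*SD, with
          the baseline taken from the record itself (an assumption here)."""
          p = np.array([band_powers(e, fs)[band] for e in epochs])
          return np.where(p > p.mean() + k * p.std())[0]

      fs = 256
      rng = np.random.default_rng(0)
      epochs = rng.normal(size=(60, fs * 4))  # sixty 4-second epochs
      epochs[42] *= 6                         # one artificially abnormal epoch
      print(flag_events(epochs, fs))          # -> [42]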

  12. Analysis of flood modeling through innovative geomatic methods

    NASA Astrophysics Data System (ADS)

    Zazo, Santiago; Molina, José-Luis; Rodríguez-Gonzálvez, Pablo

    2015-05-01

    A suitable assessment and management of exposure to natural flood risk requires exhaustive knowledge of the terrain. This study, primarily aimed at evaluating flood risk, first assesses the suitability of an innovative technique called Reduced Cost Aerial Precision Photogrammetry (RC-APP), based on a motorized ultra-light aircraft (ULM, Ultra-Light Motor) combined with the hybridization of reduced-cost sensors, for the acquisition of geospatial information. The RC-APP technique proves to be a more accurate and precise, more economical and less time-consuming geomatic product, and it is applied here in river engineering for geometric modeling and flood-risk assessment. Through RC-APP, a high-spatial-resolution image (orthophoto of 2.5 cm) and a Digital Elevation Model (DEM) with a 0.10 m mesh size and a high point density (about 100 points/m2), with altimetric accuracy of -0.02 ± 0.03 m, were obtained. These products provided a detailed knowledge of the terrain, subsequently used for hydraulic simulation, which allowed a better definition of the inundated area, with important implications for flood risk assessment and management. In this sense, the achieved DEM resolution of 0.10 m is especially interesting and useful in hydraulic simulations with 2D software. According to the results, the developed methodology and technology allow a more accurate riverbed representation than traditional techniques such as Light Detection and Ranging (LiDAR, with a Root-Mean-Square Error of ±0.50 m); the comparison revealed that RC-APP has an error one order of magnitude lower than the LiDAR method. Consequently, this technique arises as an efficient and appropriate tool, especially in areas with high exposure to flood risk. In hydraulic terms, the degree of detail achieved in the 3D model has significantly increased the knowledge of hydraulic variables in natural waterways.
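
    The LiDAR comparison above rests on a simple vertical-accuracy statistic. As a minimal sketch, assuming surveyed ground-truth checkpoints are available, the RMSE of DEM elevations can be computed as follows; the checkpoint values are fabricated for illustration.

      # Vertical RMSE of DEM elevations against surveyed checkpoints.
      import numpy as np

      def rmse(dem_z, truth_z):
          """Root-mean-square error between DEM and checkpoint elevations."""
          d = np.asarray(dem_z) - np.asarray(truth_z)
          return float(np.sqrt(np.mean(d ** 2)))

      truth = np.array([102.31, 98.77, 101.05, 99.42])   # surveyed heights, m
      dem = truth + np.random.default_rng(1).normal(-0.02, 0.03, truth.size)
      print(f"vertical RMSE: {rmse(dem, truth):.3f} m")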

  13. Software and electronic developments for TUG - T60 robotic telescope

    NASA Astrophysics Data System (ADS)

    Parmaksizoglu, M.; Dindar, M.; Kirbiyik, H.; Helhel, S.

    2014-12-01

    A robotic telescope is a telescope that can make observations without hands-on human control. Its low-level behavior is automatic and computer-controlled. Robotic telescopes usually run under the control of a scheduler, which provides high-level control by selecting astronomical targets for observation. The TUBITAK National Observatory (TUG) T60 robotic telescope is controlled by the open source OCAAS software, formally named TALON. This study introduces improvements to the TALON software along with new electronic and mechanical designs. The design changes and software improvements were implemented in the T60 telescope control software and successfully tested on the real system.

  14. Microcomputer software development facilities

    NASA Technical Reports Server (NTRS)

    Gorman, J. S.; Mathiasen, C.

    1980-01-01

    A more efficient and cost-effective method for developing microcomputer software is to utilize a host computer with high-speed peripheral support. Application programs such as cross assemblers, loaders, and simulators are implemented in the host computer for each of the microcomputers for which software development is a requirement. The host computer is configured to operate in a time-share mode for multiple users. The remote terminals, printers, and downloading capabilities provided are based on user requirements. With this configuration a user, either local or remote, can use the host computer for microcomputer software development. Once the software is developed (through the code and modular debug stage) it can be downloaded to the development system or emulator in a test area where hardware/software integration functions can proceed. The microcomputer software program sources reside in the host computer and can be edited, assembled, loaded, and then downloaded as required until the software development project has been completed.

  15. Development, Validation and Integration of the ATLAS Trigger System Software in Run 2

    NASA Astrophysics Data System (ADS)

    Keyes, Robert; ATLAS Collaboration

    2017-10-01

    The trigger system of the ATLAS detector at the LHC is a combination of hardware, firmware, and software, associated with various sub-detectors, that must seamlessly cooperate in order to select one collision of interest out of every 40,000 delivered by the LHC every millisecond. These proceedings discuss the challenges, organization and workflow of the ongoing trigger software development, validation, and deployment. The goal of the development is to ensure that the most up-to-date algorithms are used to optimize the performance of the experiment. The goal of the validation is to ensure the reliability and predictability of the software performance. Integration tests are carried out to ensure that the software deployed to the online trigger farm runs as desired during data-taking. Trigger software is validated by emulating online conditions using a benchmark run and mimicking the reconstruction that occurs during normal data-taking. This exercise is computationally demanding and thus runs on the ATLAS high performance computing grid with high priority. Performance metrics, ranging from low-level memory and CPU requirements to distributions and efficiencies of high-level physics quantities, are visualized and validated by a range of experts. This is a multifaceted critical task that ties together many aspects of the experimental effort and thus directly influences the overall performance of the ATLAS experiment.

  16. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey.

    PubMed

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    The importance of quality assurance in the software development process cannot be overemphasized, because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement, amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered quality planning, adherence to standardized processes and the inherent challenges, this work has been extended to include quality control, software process improvement and membership of international quality standards organizations. It also makes a comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. The qualitative research approach, specifically the use of questionnaire research instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, and this can be the cause of low patronage. Moreover, software practitioners are aware neither of international standards organizations nor of the required process improvement techniques; as such, their claimed standards are not aligned to those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha, and it proved reliable. For the software industry in developing countries to grow strong and become a viable source of external revenue, software quality assurance practices have to be taken seriously, because their effect is evident in the final product. Moreover, quality frameworks and tools that require minimal time and cost are highly needed in these countries.
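
    The internal-consistency check mentioned above follows a standard formula. Below is a minimal sketch of Cronbach's alpha, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score); the response matrix is a made-up example, not the survey's data.

      # Cronbach's alpha for a (respondents x items) score matrix.
      import numpy as np

      def cronbach_alpha(responses: np.ndarray) -> float:
          """Internal-consistency reliability of k questionnaire items."""
          k = responses.shape[1]
          item_vars = responses.var(axis=0, ddof=1)
          total_var = responses.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

      survey = np.array([[4, 5, 4, 3],   # invented Likert-style responses
                         [3, 4, 3, 3],
                         [5, 5, 4, 4],
                         [2, 3, 2, 2]])
      print(f"alpha = {cronbach_alpha(survey):.2f}")  # > 0.7 usually acceptable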

  17. Spectral Knowledge (SK-UTALCA): Software for Exploratory Analysis of High-Resolution Spectral Reflectance Data on Plant Breeding

    PubMed Central

    Lobos, Gustavo A.; Poblete-Echeverría, Carlos

    2017-01-01

    This article describes public, free software that provides efficient exploratory analysis of high-resolution spectral reflectance data. Spectral reflectance data can suffer from problems such as poor signal-to-noise ratios in various wavebands or invalid measurements due to changes in incoming solar radiation or operator fatigue leading to poor orientation of sensors. Thus, exploratory data analysis is essential to identify appropriate data for further analyses. This software overcomes the problem that analysis tools such as Excel are cumbersome to use for the high number of wavelengths and samples typically acquired in these studies. The software, Spectral Knowledge (SK-UTALCA), was initially developed for plant breeding, but it is also suitable for other studies such as precision agriculture, crop protection, ecophysiology, plant nutrition, and soil fertility. Various spectral reflectance indices (SRIs) are often used to relate crop characteristics to spectral data, and the software is loaded with 255 SRIs which can be applied quickly to the data. This article describes the architecture and functions of SK-UTALCA and the features of the data that led to the development of each of its modules. PMID:28119705

  18. Spectral Knowledge (SK-UTALCA): Software for Exploratory Analysis of High-Resolution Spectral Reflectance Data on Plant Breeding.

    PubMed

    Lobos, Gustavo A; Poblete-Echeverría, Carlos

    2016-01-01

    This article describes public, free software that provides efficient exploratory analysis of high-resolution spectral reflectance data. Spectral reflectance data can suffer from problems such as poor signal-to-noise ratios in various wavebands or invalid measurements due to changes in incoming solar radiation or operator fatigue leading to poor orientation of sensors. Thus, exploratory data analysis is essential to identify appropriate data for further analyses. This software overcomes the problem that analysis tools such as Excel are cumbersome to use for the high number of wavelengths and samples typically acquired in these studies. The software, Spectral Knowledge (SK-UTALCA), was initially developed for plant breeding, but it is also suitable for other studies such as precision agriculture, crop protection, ecophysiology, plant nutrition, and soil fertility. Various spectral reflectance indices (SRIs) are often used to relate crop characteristics to spectral data, and the software is loaded with 255 SRIs which can be applied quickly to the data. This article describes the architecture and functions of SK-UTALCA and the features of the data that led to the development of each of its modules.
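
    A spectral reflectance index of the kind SK-UTALCA batch-applies is typically a simple combination of wavebands. The sketch below computes NDVI from two reflectance bands as an illustration; the band centers (670 nm and 800 nm) and sample values are assumptions, and the software's own 255 SRIs are not reproduced here.

      # One example SRI: NDVI = (R_nir - R_red) / (R_nir + R_red).
      import numpy as np

      def ndvi(reflectance):
          """reflectance: dict mapping band center (nm) to per-sample arrays."""
          nir, red = reflectance[800], reflectance[670]
          return (nir - red) / (nir + red)

      samples = {670: np.array([0.05, 0.08, 0.12]),   # invented red reflectance
                 800: np.array([0.45, 0.40, 0.30])}   # invented NIR reflectance
      print(ndvi(samples))  # one index value per plant sample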

  19. Corrected High-Frame Rate Anchored Ultrasound with Software Alignment

    ERIC Educational Resources Information Center

    Miller, Amanda L.; Finch, Kenneth B.

    2011-01-01

    Purpose: To improve lingual ultrasound imaging with the Corrected High Frame Rate Anchored Ultrasound with Software Alignment (CHAUSA; Miller, 2008) method. Method: A production study of the IsiXhosa alveolar click is presented. Articulatory-to-acoustic alignment is demonstrated using a Tri-Modal 3-ms pulse generator. Images from 2 simultaneous…

  20. Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges

    NASA Astrophysics Data System (ADS)

    Maruping, Likoebe M.

    Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.

  1. Defect measurement and analysis of JPL ground software: a case study

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Spagnuolo, John N., Jr.

    2004-01-01

    Ground software systems at JPL must meet high assurance standards while remaining on schedule due to relatively immovable launch dates for the spacecraft that will be controlled by such systems. Toward this end, the Software Quality Improvement (SQI) project's Measurement and Benchmarking (M&B) team is collecting and analyzing defect data from JPL ground system software projects to build software defect prediction models. The aim of these models is to improve predictability with regard to software quality activities. Predictive models will quantitatively define typical trends for JPL ground systems as well as Critical Discriminators (CDs) that provide explanations for atypical deviations from the norm at JPL. CDs are software characteristics that can be estimated or foreseen early in a software project's planning. Thus, these CDs will assist in planning for the degree to which software quality activities for a project are likely to deviate from the norm for JPL ground systems, based on past experience across the lab.

  2. High Functioning Autism Spectrum Disorders in Adults: Consequences for Primary Caregivers Compared to Schizophrenia and Depression

    ERIC Educational Resources Information Center

    Grootscholten, Inge A. C.; van Wijngaarden, Bob; Kan, Cornelis C.

    2018-01-01

    Primary caregivers experience consequences from being in close contact to a person with autism spectrum disorder (ASD). This study used the Involvement Evaluation Questionnaire to explore the level of consequences of 104 caregivers involved with adults with High Functioning ASD (HF-ASD) and compared these with the consequences reported by…

  3. A design methodology for portable software on parallel computers

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Miller, Keith W.; Chrisman, Dan A.

    1993-01-01

    This final report for research that was supported by grant number NAG-1-995 documents our progress in addressing two difficulties in parallel programming. The first difficulty is developing software that will execute quickly on a parallel computer. The second difficulty is transporting software between dissimilar parallel computers. In general, we expect that more hardware-specific information will be included in software designs for parallel computers than in designs for sequential computers. This inclusion is an instance of portability being sacrificed for high performance. New parallel computers are being introduced frequently. To keep software running on the latest high-performance hardware, a developer almost continually faces yet another expensive software port. The aim of the proposed research is to create a design methodology that helps designers to more precisely control both portability and hardware-specific programming details. The proposed research emphasizes programming for scientific applications. We completed our study of the parallelizability of a subsystem of the NASA Earth Radiation Budget Experiment (ERBE) data processing system. This work is summarized in section two. A more detailed description is provided in Appendix A ('Programming Practices to Support Eventual Parallelism'). Mr. Chrisman, a graduate student, wrote and successfully defended a Ph.D. dissertation proposal which describes our research associated with the issues of software portability and high performance. The research tasks are specified in the proposal. The proposal 'A Design Methodology for Portable Software on Parallel Computers' is summarized in section three and is provided in its entirety in Appendix B. We are currently studying a proposed subsystem of the NASA Clouds and the Earth's Radiant Energy System (CERES) data processing system. This software is the proof-of-concept for the Ph.D. dissertation. We have implemented and measured the performance of a portion of this subsystem on the Intel iPSC/2 parallel computer. These results are provided in section four. Our future work is summarized in section five, our acknowledgements are stated in section six, and references for published papers associated with NAG-1-995 are provided in section seven.

  4. A novel tracking tool for the analysis of plant-root tip movements.

    PubMed

    Russino, A; Ascrizzi, A; Popova, L; Tonazzini, A; Mancuso, S; Mazzolai, B

    2013-06-01

    The growth process of roots consists of many activities, such as exploring the soil volume, mining minerals, avoiding obstacles and taking up water to fulfil the plant's primary functions, which are performed differently depending on environmental conditions. Root movements are strictly related to a root decision strategy, which helps plants to survive under stressful conditions by optimizing energy consumption. In this work, we present a novel image-analysis tool to study the kinematics of the root tip (apex), named analyser for root tip tracks (ARTT). The software implementation combines a segmentation algorithm with additional imaging filters to realize 2D tip detection. The resulting paths, or tracks, arise from the tip positions sampled in the images acquired during growth. ARTT works without markers, deals autonomously with newly emerging root tips, and handles massive amounts of data with minimal user interaction. Consequently, ARTT can be used for a wide range of applications and for the study of kinematics in different plant species. In particular, the study of root growth and behaviour could lead to the definition of novel principles for penetration and/or control paradigms for soil exploration and monitoring tasks. The software's capabilities were demonstrated by experimental trials performed with Zea mays and Oryza sativa.

  5. New Design for Rapid Prototyping of Digital Master Casts for Multiple Dental Implant Restorations

    PubMed Central

    Romero, Luis; Jiménez, Mariano; Espinosa, María del Mar; Domínguez, Manuel

    2015-01-01

    Aim: This study proposes the replacement of all the physical devices used in the manufacturing of conventional prostheses with digital tools, such as 3D scanners, CAD design software, 3D implant files, rapid prototyping machines or reverse engineering software, in order to develop laboratory working models from which to finish coatings for dental prostheses. Different types of dental prosthetic structures are used, which were adjusted by a non-rotatory threaded fixing system. Method: Using a digital process, the relative positions of the dental implants, soft tissue and adjacent teeth of edentulous or partially edentulous patients have been captured, and a master working model that accurately replicates the patient's oral cavity has been produced through the treatment of three-dimensional digital data. Results: Compared with the conventional master cast, the results show significant cost savings in attachments, as well as an increase in the quality of reproduction and accuracy of the master cast, with a consequent reduction in the number of patient consultation visits. The combination of three-dimensional software and hardware tools allows the optimization of the planning protocol for dental implant-supported rehabilitations, improving the predictability of clinical treatments and producing cost savings in master casts for restorations upon implants. PMID:26696528

  6. New Design for Rapid Prototyping of Digital Master Casts for Multiple Dental Implant Restorations.

    PubMed

    Romero, Luis; Jiménez, Mariano; Espinosa, María Del Mar; Domínguez, Manuel

    2015-01-01

    This study proposes the replacement of all the physical devices used in the manufacturing of conventional prostheses with digital tools, such as 3D scanners, CAD design software, 3D implant files, rapid prototyping machines or reverse engineering software, in order to develop laboratory working models from which to finish coatings for dental prostheses. Different types of dental prosthetic structures are used, which were adjusted by a non-rotatory threaded fixing system. Using a digital process, the relative positions of the dental implants, soft tissue and adjacent teeth of edentulous or partially edentulous patients have been captured, and a master working model that accurately replicates the patient's oral cavity has been produced through the treatment of three-dimensional digital data. Compared with the conventional master cast, the results show significant cost savings in attachments, as well as an increase in the quality of reproduction and accuracy of the master cast, with a consequent reduction in the number of patient consultation visits. The combination of three-dimensional software and hardware tools allows the optimization of the planning protocol for dental implant-supported rehabilitations, improving the predictability of clinical treatments and producing cost savings in master casts for restorations upon implants.

  7. Evaluation of a Multicolor, Single-Tube Technique To Enumerate Lymphocyte Subpopulations

    PubMed Central

    Colombo, F.; Cattaneo, A.; Lopa, R.; Portararo, P.; Rebulla, P.; Porretti, L.

    2008-01-01

    To evaluate the fully automated FACSCanto software, we compared lymphocyte subpopulation counts obtained using three-color FACSCalibur-CELLQuest and six-color FACSCanto-FACSCanto software techniques. High correlations were observed between data obtained with these techniques. Our study indicated that FACSCanto clinical software is accurate and sensitive in single-platform lymphocyte immunophenotyping. PMID:18448621

  8. The Effect of Firm Strategy and Corporate Performance on Software Market Growth in Emerging Regions

    ERIC Educational Resources Information Center

    Mertz, Sharon A.

    2013-01-01

    The purpose of this research is to evaluate the impact of firm strategies and corporate performance on enterprise software market growth in emerging regions. The emerging regions of Asia Pacific, Eastern Europe, the Middle East and Africa, and Latin America, currently represent smaller overall markets for software vendors, but exhibit high growth…

  9. A Guideline of Using Case Method in Software Engineering Courses

    ERIC Educational Resources Information Center

    Zainal, Dzulaiha Aryanee Putri; Razali, Rozilawati; Shukur, Zarina

    2014-01-01

    Software Engineering (SE) education has been reported to fall short in producing high quality software engineers. In seeking alternative solutions, Case Method (CM) is regarded as having potential to solve the issue. CM is a teaching and learning (T&L) method that has been found to be effective in Social Science education. In principle,…

  10. A discussion of higher order software concepts as they apply to functional requirements and specifications. [space shuttles and guidance

    NASA Technical Reports Server (NTRS)

    Hamilton, M.

    1973-01-01

    The entry guidance software functional requirements (requirements design phase), its architectural requirements (specifications design phase), and the entry guidance software verified code are discussed. It was found that the proper integration of designs at both the requirements and specifications levels is a high-priority consideration.

  11. The Virtual Genetics Lab II: Improvements to a Freely Available Software Simulation of Genetics

    ERIC Educational Resources Information Center

    White, Brian T.

    2012-01-01

    The Virtual Genetics Lab II (VGLII) is an improved version of the highly successful genetics simulation software, the Virtual Genetics Lab (VGL). The software allows students to use the techniques of genetic analysis to design crosses and interpret data to solve realistic genetics problems involving a hypothetical diploid insect. This is a brief…

  12. Teachers' Critical Evaluations of Dynamic Geometry Software Implementation in 1:1 Classrooms

    ERIC Educational Resources Information Center

    Ware, Jennifer; Stein, Sarah

    2014-01-01

    Although the use of dynamic software in high school mathematics in the United States has emerged as a research topic, little research has been conducted on how teachers integrate new software in relation to at-home technology networks. Interviews with eight mathematics teachers from four North Carolina counties participating in 1:1 laptop…

  13. Software Tools for Battery Design | Transportation Research | NREL

    Science.gov Websites

    Under NREL's computer-aided engineering initiative, software tools for battery design help battery designers, developers, and manufacturers create affordable, high-performance lithium-ion (Li-ion) batteries for next-generation electric-drive vehicles (EDVs).

  14. The positive bystander effect: passive bystanders increase helping in situations with high expected negative consequences for the helper.

    PubMed

    Fischer, Peter; Greitemeyer, Tobias

    2013-01-01

    The present field study investigated the interplay between the presence of a passive bystander (not present versus present) in a simulated bike theft and expected negative consequences (low versus high) in predicting intervention behavior when no physical victim is present. It was found that an additional bystander increases individual intervention in situations where the expected negative consequences for the helper in case of intervention were high (i.e., when the bike thief looks fierce) compared to situations where the expected negative consequences for the helper were low (i.e., when the bike thief does not look fierce). In contrast, no such effect for high vs. low expected negative consequences was observed when no additional bystander observed the critical situation. The results are discussed in light of previous laboratory findings on expected negative consequences and bystander intervention.

  15. FermiLib v0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MCCLEAN, JARROD; HANER, THOMAS; STEIGER, DAMIAN

    FermiLib is an open source software package designed to facilitate the development and testing of algorithms for simulations of fermionic systems on quantum computers. Fermionic simulations represent an important application of early quantum devices with many potential high-value targets, such as quantum chemistry for the development of new catalysts. This software strives to provide a link between the required domain expertise in specific fermionic applications and quantum computing, to enable more users to directly interface with, and develop for, these applications. It is an extensible Python library designed to interface with the high-performance quantum simulator ProjectQ, as well as application-specific software such as PSI4 from the domain of quantum chemistry. Such software is key to enabling effective user facilities in quantum computation research.

  16. Software/hardware distributed processing network supporting the Ada environment

    NASA Astrophysics Data System (ADS)

    Wood, Richard J.; Pryk, Zen

    1993-09-01

    A high-performance, fault-tolerant, distributed network has been developed, tested, and demonstrated. The network is based on the MIPS Computer Systems, Inc. R3000 RISC processor, VHSIC ASICs for high-speed, reliable inter-node communications, and compatible commercial memory and I/O boards. The network is an evolution of the Advanced Onboard Signal Processor (AOSP) architecture. It supports Ada application software with an Ada-implemented operating system. A six-node implementation (capable of expansion up to 256 nodes) of the RISC multiprocessor architecture provides 120 MIPS of scalar throughput, 96 Mbytes of RAM and 24 Mbytes of non-volatile memory. The network provides for all ground processing applications, has merit as a space-qualified RISC-based network, and interfaces to advanced Computer Aided Software Engineering (CASE) tools for application software development.

  17. Software Fault Tolerance: A Tutorial

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2000-01-01

    Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. Compounding the problems in building correct software is the difficulty in assessing the correctness of software for highly complex systems. After a brief overview of the software development processes, we note how hard-to-detect design faults are likely to be introduced during development and how software faults tend to be state-dependent and activated by particular input sequences. Although component reliability is an important quality measure for system level analysis, software reliability is hard to characterize and the use of post-verification reliability estimates remains a controversial issue. For some applications software safety is more important than reliability, and fault tolerance techniques used in those applications are aimed at preventing catastrophes. Single version software fault tolerance techniques discussed include system structuring and closure, atomic actions, inline fault detection, exception handling, and others. Multiversion techniques are based on the assumption that software built differently should fail differently and thus, if one of the redundant versions fails, it is expected that at least one of the other versions will provide an acceptable output. Recovery blocks, N-version programming, and other multiversion techniques are reviewed.
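
    The multiversion assumption described above is easiest to see in code. Below is a minimal sketch of N-version majority voting with three trivial stand-in versions (one seeded with a fault); it illustrates the technique and is not code from the tutorial.

      # Majority voting over N independently developed versions.
      from collections import Counter

      def n_version_vote(versions, x):
          """Return the majority output, or raise on an unmaskable failure."""
          outputs = [v(x) for v in versions]
          winner, count = Counter(outputs).most_common(1)[0]
          if count <= len(outputs) // 2:
              raise RuntimeError(f"no majority among outputs: {outputs}")
          return winner

      def v1(x): return x * x
      def v2(x): return x ** 2
      def v3(x): return x * x if x < 100 else x   # seeded fault for large x

      print(n_version_vote([v1, v2, v3], 7))      # 49: all versions agree
      print(n_version_vote([v1, v2, v3], 200))    # 40000: faulty v3 is outvoted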

  18. Perceptions of the software skills of graduates by employers in the financial services industry

    NASA Astrophysics Data System (ADS)

    Kyng, Tim; Tickle, Leonie; Wood, Leigh N.

    2013-12-01

    Software, particularly spreadsheet software, is ubiquitous in the financial services workplace. Yet little is known about the extent to which universities should, and do, prepare graduates for this aspect of the modern workplace. We have investigated this issue through a survey of financial services employers of graduates, the results of which are reported in this paper, as well as surveys of university graduates and academics, reported previously. Financial services employers rate software skills as important, would like their employees to be more highly skilled in the use of such software, and tend to prefer 'on-the-job' training rather than university training for statistical, database and specialized actuarial/financial software. There is a perception among graduates that employers do not provide adequate formal workplace training in the use of technical software.

  19. Development and Application of Collaborative Optimization Software for Plate-Fin Heat Exchanger

    NASA Astrophysics Data System (ADS)

    Chunzhen, Qiao; Ze, Zhang; Jiangfeng, Guo; Jian, Zhang

    2017-12-01

    This paper introduces the design ideas behind calculation software for plate-fin heat exchangers and presents application examples. Because designing and optimizing heat exchangers involves a large amount of computation, we used Visual Basic 6.0 as the development platform to build a basic calculation tool that reduces this computational burden. The design case is a plate-fin heat exchanger sized for boiler tail flue gas, and the software is based on the traditional design method for plate-fin heat exchangers. Using the software for the design and calculation of plate-fin heat exchangers effectively reduces the amount of computation while producing results consistent with traditional methods, and therefore has high practical value.
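
    The kind of hand calculation such software automates can be illustrated with the effectiveness-NTU method. The sketch below uses the standard approximation for a crossflow exchanger with both fluids unmixed, a common plate-fin idealization; the conductance and capacity rates are invented inputs, and this is not the paper's Visual Basic code.

      # Effectiveness-NTU approximation, crossflow, both fluids unmixed:
      # eps = 1 - exp((NTU^0.22 / Cr) * (exp(-Cr * NTU^0.78) - 1)).
      import math

      def effectiveness_crossflow_unmixed(ntu: float, cr: float) -> float:
          return 1.0 - math.exp((ntu ** 0.22 / cr)
                                * (math.exp(-cr * ntu ** 0.78) - 1.0))

      ua = 1200.0                   # overall conductance UA, W/K (assumed)
      c_min, c_max = 500.0, 800.0   # fluid capacity rates, W/K (assumed)
      ntu = ua / c_min
      eps = effectiveness_crossflow_unmixed(ntu, c_min / c_max)
      print(f"NTU = {ntu:.2f}, effectiveness = {eps:.2f}")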

  20. Effectiveness of back-to-back testing

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.; Eckhardt, David E.; Caglayan, Alper; Kelly, John P. J.

    1987-01-01

    Three models of back-to-back testing processes are described. Two models treat the case where there is no intercomponent failure dependence. The third model describes the more realistic case where there is correlation among the failure probabilities of the functionally equivalent components. The theory indicates that back-to-back testing can, under the right conditions, provide a considerable gain in software reliability. The models are used to analyze the data obtained in a fault-tolerant software experiment. It is shown that the expected gain is indeed achieved, and exceeded, provided the intercomponent failure dependence is sufficiently small. However, even with relatively high correlation, the use of several functionally equivalent components coupled with back-to-back testing may provide a considerable reliability gain. The implication of this finding is that multiversion software development is a feasible and cost-effective approach to providing highly reliable software components intended for fault-tolerant software systems, on condition that special attention is directed at early detection and elimination of correlated faults.
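
    A back-to-back test drives functionally equivalent components with the same inputs and flags any disagreement as a detected fault. The following minimal harness illustrates the idea with two trivial stand-in implementations; it is not the experiment's code.

      # Back-to-back testing of two functionally equivalent components.
      import random

      def implementation_a(x: int) -> int:
          return abs(x)

      def implementation_b(x: int) -> int:
          return -x if x < 0 else x   # intended to be equivalent to A

      def back_to_back(n_cases=10_000, seed=42):
          """Run both components on shared random inputs; collect mismatches."""
          rng = random.Random(seed)
          disagreements = []
          for _ in range(n_cases):
              x = rng.randint(-10**6, 10**6)
              if implementation_a(x) != implementation_b(x):
                  disagreements.append(x)   # a detected fault in one version
          return disagreements

      print(f"{len(back_to_back())} disagreements found")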

  1. Providing an empirical basis for optimizing the verification and testing phases of software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1992-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such fault density prediction models, intended to fulfill specific software engineering needs (i.e., dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) measure the software system to be considered; and (2) build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at the NASA/GSFC into two fault density classes: low and high. We also evaluate the accuracy of the model and the insights it provides into the software process.
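
    The classification step can be sketched with a modern stand-in. The snippet below fits a logistic-regression classifier to synthetic component metrics (size, complexity, revision count) and predicts a low/high fault-density class; the paper's actual multivariate stochastic models and the NASA/GSFC data are not reproduced here.

      # Stand-in low/high fault-density classifier on invented metrics.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Columns: lines of code, cyclomatic complexity, number of revisions.
      X = np.array([[120, 4, 2], [900, 25, 14], [300, 8, 3],
                    [1500, 40, 20], [80, 3, 1], [600, 18, 9]])
      y = np.array([0, 1, 0, 1, 0, 1])   # 0 = low, 1 = high fault density

      clf = LogisticRegression(max_iter=1000).fit(X, y)
      new_component = np.array([[700, 22, 11]])
      print("predicted class:", clf.predict(new_component)[0])
      print("P(high fault density):", clf.predict_proba(new_component)[0, 1])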

  2. HIGH-RATE FORMABILITY OF HIGH-STRENGTH ALUMINUM ALLOYS: A STUDY ON OBJECTIVITY OF MEASURED STRAIN AND STRAIN RATE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, Piyush; Rohatgi, Aashish; Stephens, Elizabeth V.

    2015-02-18

    Al alloy AA7075 sheets were deformed at room temperature at strain rates exceeding 1000 /s using the electrohydraulic forming (EHF) technique. A method that combines high-speed imaging and the digital image correlation technique, developed at Pacific Northwest National Laboratory, is used to investigate the high-strain-rate deformation behavior of AA7075. For strain-rate-sensitive materials, the ability to accurately model high-rate deformation behavior depends upon the ability to accurately quantify the strain rate the material is subjected to. This work investigates the objectivity of software-calculated strain and strain rate by varying different parameters within commonly used, commercially available digital image correlation software. Except very close to the time of crack opening, the calculated strains and strain rates are very consistent and independent of the adjustable parameters of the software.
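
    The quantities whose objectivity is examined above reduce to spatial and temporal derivatives of the measured displacement field. The sketch below computes engineering strain and a frame-to-frame strain rate from a synthetic 1D displacement profile; the subset spacing, frame interval and displacement values are invented.

      # Strain and strain rate from a (synthetic) DIC displacement field.
      import numpy as np

      def engineering_strain(u: np.ndarray, dx: float) -> np.ndarray:
          """eps_xx = du/dx from a 1D displacement profile u(x)."""
          return np.gradient(u, dx)

      dx, dt = 0.1e-3, 1.0e-6      # 0.1 mm subset spacing, 1 us between frames
      x = np.arange(0.0, 5e-3, dx)
      u_t0 = 0.020 * x             # displacement at frame t
      u_t1 = 0.025 * x             # displacement at frame t + dt
      eps0 = engineering_strain(u_t0, dx)
      eps1 = engineering_strain(u_t1, dx)
      strain_rate = (eps1 - eps0) / dt
      print(f"mean strain rate: {strain_rate.mean():.0f} 1/s")  # ~5000 /s here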

  3. Scout: high-performance heterogeneous computing made simple

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablin, James; Mc Cormick, Patrick; Herlihy, Maurice

    2011-01-26

    Researchers must often write their own simulation and analysis software. During this process they simultaneously confront both computational and scientific problems. Current strategies for aiding the generation of performance-oriented programs do not abstract the software development from the science. Furthermore, the problem is becoming increasingly complex and pressing with the continued development of many-core and heterogeneous (CPU-GPU) architectures. To achieve high performance, scientists must expertly navigate both software and hardware. Co-design between computer scientists and research scientists can alleviate but not solve this problem. The science community requires better tools for developing, optimizing, and future-proofing codes, allowing scientists to focus on their research while still achieving high computational performance. Scout is a parallel programming language and extensible compiler framework targeting heterogeneous architectures. It provides the abstraction required to buffer scientists from the constantly shifting details of hardware while still realizing high performance by encapsulating software and hardware optimization within a compiler framework.

  4. SEURAT: visual analytics for the integrated analysis of microarray data.

    PubMed

    Gribov, Alexander; Sill, Martin; Lück, Sonja; Rücker, Frank; Döhner, Konstanze; Bullinger, Lars; Benner, Axel; Unwin, Antony

    2010-06-03

    In translational cancer research, gene expression data is collected together with clinical data and genomic data arising from other chip based high throughput technologies. Software tools for the joint analysis of such high dimensional data sets together with clinical data are required. We have developed an open source software tool which provides interactive visualization capability for the integrated analysis of high-dimensional gene expression data together with associated clinical data, array CGH data and SNP array data. The different data types are organized by a comprehensive data manager. Interactive tools are provided for all graphics: heatmaps, dendrograms, barcharts, histograms, eventcharts and a chromosome browser, which displays genetic variations along the genome. All graphics are dynamic and fully linked so that any object selected in a graphic will be highlighted in all other graphics. For exploratory data analysis the software provides unsupervised data analytics like clustering, seriation algorithms and biclustering algorithms. The SEURAT software meets the growing needs of researchers to perform joint analysis of gene expression, genomical and clinical data.

  5. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2007-01-01

    NASA relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960's there has hardly been a spacecraft launched that does not have a computer on board that will provide command and control services. There have been recent incidents where software has played a role in high-profile mission failures and hazardous incidents. For example, the Mars Orbiter, Mars Polar Lander, the DART (Demonstration of Autonomous Rendezvous Technology), and MER (Mars Exploration Rover) Spirit anomalies were all caused or contributed to by software. The Mission Control Centers for the Shuttle, ISS, and unmanned programs are highly dependent on software for data displays, analysis, and mission planning. Despite this growing dependence on software control and monitoring, there has been little to no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Meanwhile, academia and private industry have been stepping forward with procedures and standards for safety critical systems and software, for example Dr. Nancy Leveson's book Safeware: System Safety and Computers. The NASA Software Safety Standard, originally published in 1997, was widely ignored due to its complexity and poor organization. It also focused on concepts rather than definite procedural requirements organized around a software project lifecycle. Led by NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard has recently undergone a significant update. This new standard provides the procedures and guidelines for evaluating a project for safety criticality and then lays out the minimum project lifecycle requirements to assure the software is created, operated, and maintained in the safest possible manner. This update of the standard clearly delineates the minimum set of software safety requirements for a project without detailing the implementation for those requirements. This allows the projects leeway to meet these requirements in many forms that best suit a particular project's needs and safety risk. In other words, it tells the project what to do, not how to do it. This update also incorporated advances in the state of the practice of software safety from academia and private industry. It addresses some of the more common issues now facing software developers in the NASA environment such as the use of Commercial-Off-the-Shelf Software (COTS), Modified OTS (MOTS), Government OTS (GOTS), and reused software. A team from across NASA developed the update, which has had NASA-wide internal reviews by software engineering, quality, safety, and project management, as well as expert external review. This presentation and paper will discuss the new NASA Software Safety Standard, its organization, and key features. It will start with a brief discussion of some NASA mission failures and incidents that had software as one of their root causes. It will then give a brief overview of the NASA Software Safety Process. This will include an overview of the key personnel responsibilities and functions that must be performed for safety-critical software.

  6. EngineSim: Turbojet Engine Simulator Adapted for High School Classroom Use

    NASA Technical Reports Server (NTRS)

    Petersen, Ruth A.

    2001-01-01

    EngineSim is an interactive educational computer program that allows users to explore the effect of engine operation on total aircraft performance. The software is supported by a basic propulsion web site called the Beginner's Guide to Propulsion, which includes educator-created, web-based activities for the classroom use of EngineSim. In addition, educators can schedule videoconferencing workshops in which EngineSim's creator demonstrates the software and discusses its use in the educational setting. This software is a product of NASA Glenn Research Center's Learning Technologies Project, an educational outreach initiative within the High Performance Computing and Communications Program.

  7. A New Look at NASA: Strategic Research In Information Technology

    NASA Technical Reports Server (NTRS)

    Alfano, David; Tu, Eugene (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides information on research undertaken by NASA to facilitate the development of information technologies. Specific ideas covered here include: 1) Bio/nano technologies: biomolecular and nanoscale systems and tools for assembly and computing; 2) Evolvable hardware: autonomous self-improving, self-repairing hardware and software for survivable space systems in extreme environments; 3) High Confidence Software Technologies: formal methods, high-assurance software design, and program synthesis; 4) Intelligent Controls and Diagnostics: Next generation machine learning, adaptive control, and health management technologies; 5) Revolutionary computing: New computational models to increase capability and robustness to enable future NASA space missions.

  8. Data-Driven Software Framework for Web-Based ISS Telescience

    NASA Technical Reports Server (NTRS)

    Tso, Kam S.

    2005-01-01

    Software that enables authorized users to monitor and control scientific payloads aboard the International Space Station (ISS) from diverse terrestrial locations equipped with Internet connections is undergoing development. This software reflects a data-driven approach to distributed operations. A Web-based software framework leverages prior developments in Java and Extensible Markup Language (XML) to create portable code and portable data, to which one can gain access via Web-browser software on almost any common computer. Open-source software is used extensively to minimize cost; the framework also accommodates enterprise-class server software to satisfy needs for high performance and security. To accommodate the diversity of ISS experiments and users, the framework emphasizes openness and extensibility. Users can take advantage of available viewer software to create their own client programs according to their particular preferences, and can upload these programs for custom processing of data, generation of views, and planning of experiments. The same software system, possibly augmented with a subset of data and additional software tools, could be used for public outreach by enabling public users to replay telescience experiments, conduct their experiments with simulated payloads, and create their own client programs and other custom software.
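
    The data-driven approach described above, in which portable XML data tells a generic client what to display, can be sketched as follows. The schema, element names, and telemetry values are invented for illustration and are not taken from the actual ISS framework.

    ```python
    # Illustrative sketch of a data-driven telemetry display: an XML document
    # (hypothetical schema) declares the channels to show, so the client code
    # needs no payload-specific logic. Not the actual ISS framework's format.
    import xml.etree.ElementTree as ET

    LAYOUT_XML = """
    <display payload="demo-experiment">
      <channel id="temp" label="Sample Temperature" units="C"/>
      <channel id="rpm"  label="Centrifuge Speed"   units="rpm"/>
    </display>
    """

    def render(layout_xml: str, telemetry: dict) -> None:
        """Render whatever channels the XML layout declares."""
        root = ET.fromstring(layout_xml)
        print(f"Payload: {root.get('payload')}")
        for ch in root.findall("channel"):
            value = telemetry.get(ch.get("id"), "n/a")
            print(f"  {ch.get('label')}: {value} {ch.get('units')}")

    render(LAYOUT_XML, {"temp": 21.4, "rpm": 3000})
    ```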

  9. Software IV and V Research Priorities and Applied Program Accomplishments Within NASA

    NASA Technical Reports Server (NTRS)

    Blazy, Louis J.

    2000-01-01

    The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high-performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance and the early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure that customer software risks are identified and that requirements are met or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: ensuring the safety of NASA missions, ensuring requirements are met, minimizing the programmatic and technological risks of software development and operations, improving software quality, reducing costs and time to delivery, and improving the science of software engineering.

  10. A Project Management Approach to Using Simulation for Cost Estimation on Large, Complex Software Development Projects

    NASA Technical Reports Server (NTRS)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    It is very difficult for project managers to develop accurate cost and schedule estimates for large, complex software development projects. None of the approaches or tools available today can estimate the true cost of software with a high degree of accuracy early in a project. This paper provides an approach that utilizes a software development process simulation model that considers and conveys the level of uncertainty that exists when developing an initial estimate. A NASA project will be analyzed using simulation and data from the Software Engineering Laboratory to show the benefits of such an approach.
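
    The paper's specific simulation model is not reproduced in the abstract; the sketch below illustrates the general idea of conveying estimate uncertainty by Monte Carlo sampling over uncertain inputs and reporting percentiles rather than a single point estimate. The distributions and parameters are assumptions, not Software Engineering Laboratory data.

    ```python
    # Illustrative Monte Carlo cost estimate: sample uncertain size and
    # productivity, then report percentiles instead of one point estimate.
    import random

    def simulate_effort(n_trials: int = 10_000) -> list[float]:
        efforts = []
        for _ in range(n_trials):
            ksloc = random.triangular(80, 200, 120)   # size in KSLOC: low, high, mode (assumed)
            productivity = random.uniform(2.0, 4.0)   # KSLOC per person-month (assumed)
            efforts.append(ksloc / productivity)      # effort in person-months
        return sorted(efforts)

    effort = simulate_effort()
    for p in (10, 50, 90):
        print(f"P{p} effort: {effort[len(effort) * p // 100]:.0f} person-months")
    ```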

  11. PINT, a New Pulsar Timing Software

    NASA Astrophysics Data System (ADS)

    Luo, Jing; Jenet, Fredrick A.; Ransom, Scott M.; Demorest, Paul; Van Haasteren, Rutger; Archibald, Anne

    2015-01-01

    We present PINT, a new pulsar timing software package. The pulsar timing community currently depends heavily on Tempo/Tempo2, a package for the analysis of pulsar data. However, high-accuracy pulsar timing projects, such as pulsar timing searches for gravitational waves, need an alternative package for cross-checking results. We are therefore developing Tempo-independent software with a different structure, whose modules are designed to be more isolated and easier to extend. Instead of C, we use Python as the programming language for its flexibility and powerful docstrings. Here we present the detailed design and the first results of the software.
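
    The modular structure described above can be sketched as independent delay components combined by a timing model, so that new physical effects can be added without touching existing ones. The classes and names below are hypothetical illustrations of that design idea, not PINT's actual API.

    ```python
    # Hypothetical sketch of a modular timing-model design, in the spirit of
    # the isolated, extensible components described above. Not PINT's real API.
    from abc import ABC, abstractmethod

    class DelayComponent(ABC):
        """One isolated physical effect contributing to pulse arrival delay."""
        @abstractmethod
        def delay(self, toa_mjd: float) -> float:
            """Delay in seconds at the given time of arrival (MJD)."""

    class ConstantOffset(DelayComponent):
        def __init__(self, offset_s: float) -> None:
            self.offset_s = offset_s
        def delay(self, toa_mjd: float) -> float:
            return self.offset_s

    class TimingModel:
        """Combines components; new effects plug in without changing old ones."""
        def __init__(self, components: list[DelayComponent]) -> None:
            self.components = components
        def total_delay(self, toa_mjd: float) -> float:
            return sum(c.delay(toa_mjd) for c in self.components)

    model = TimingModel([ConstantOffset(1.5e-6)])
    print(model.total_delay(58000.0))  # 1.5e-06
    ```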

  12. Estimation of combined sewer overflow discharge: a software sensor approach based on local water level measurements.

    PubMed

    Ahm, Malte; Thorndahl, Søren; Nielsen, Jesper E; Rasmussen, Michael R

    2016-12-01

    Combined sewer overflow (CSO) structures are constructed to discharge excess water effectively during heavy rainfall, to protect the urban drainage system from hydraulic overload. Consequently, most CSO structures are not constructed according to the basic hydraulic principles for ideal measurement weirs, and it can therefore be a challenge to quantify the discharges from CSOs. Quantification of CSO discharges is important in relation to the increased environmental awareness regarding the receiving water bodies. Furthermore, CSO discharge quantification is essential for closing the rainfall-runoff mass balance in combined sewer catchments; a closed mass balance is an advantage for calibrating all urban drainage models based on mass-balance principles. This study presents three different software sensor concepts based on local water level sensors, which can be used to estimate CSO discharge volumes from hydraulically complex CSO structures. The three concepts were tested and verified under real practical conditions. All three concepts proved accurate when compared with electromagnetic flow measurements.
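
    The paper's actual sensor models are not given in the abstract. As background, the ideal rectangular sharp-crested weir relation that real CSO structures deviate from, Q = Cd (2/3) sqrt(2g) b h^(3/2), can be sketched as follows; the discharge coefficient and geometry are illustrative assumptions.

    ```python
    # Illustrative only: ideal rectangular sharp-crested weir discharge,
    # Q = Cd * (2/3) * sqrt(2*g) * b * h**1.5. Real CSO structures deviate
    # from this ideal geometry, which motivates data-driven software sensors.
    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def weir_discharge(head_m: float, width_m: float, cd: float = 0.62) -> float:
        """Discharge (m^3/s) over an ideal rectangular weir at head h (m)."""
        return cd * (2.0 / 3.0) * math.sqrt(2.0 * G) * width_m * head_m ** 1.5

    # Assumed 2 m wide weir with 0.15 m of head over the crest.
    print(f"{weir_discharge(head_m=0.15, width_m=2.0):.3f} m^3/s")  # ~0.213 m^3/s
    ```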

  13. Application distribution model and related security attacks in VANET

    NASA Astrophysics Data System (ADS)

    Nikaein, Navid; Kanti Datta, Soumya; Marecar, Irshad; Bonnet, Christian

    2013-03-01

    In this paper, we present a model for application distribution and the related security attacks in dense vehicular ad hoc networks (VANET) and in sparse VANET, which forms a delay-tolerant network (DTN). We study the vulnerabilities of VANET to evaluate attack scenarios and introduce a new attacker's model as an extension to the work done in [6]. We then propose a VANET model that supports application distribution through proxy app stores on top of mobile platforms installed in vehicles, and study the steps of application distribution in detail. We identify key attacks (e.g. malware, spamming and phishing, software attacks, and threats to location privacy) for dense VANET and two attack scenarios for sparse VANET. We show that attacks can be launched by distributing malicious applications and by injecting malicious code into the On Board Unit (OBU) through the exploitation of OBU software security holes. The consequences of such security attacks are described. Finally, countermeasures, including the concept of sandboxing, are presented in depth.

  14. gSRT-Soft: a generic software application and some methodological guidelines to investigate implicit learning through visual-motor sequential tasks.

    PubMed

    Chambaron, Stéphanie; Ginhac, Dominique; Perruchet, Pierre

    2008-05-01

    Serial reaction time tasks and, more generally, the visual-motor sequential paradigms are increasingly popular tools in a variety of research domains, from studies on implicit learning in laboratory contexts to the assessment of residual learning capabilities of patients in clinical settings. A consequence of this success, however, is the increased variability in paradigms and the difficulty inherent in respecting the methodological principles that two decades of experimental investigations have made more and more stringent. The purpose of the present article is to address those problems. We present a user-friendly application that simplifies running classical experiments, but is flexible enough to permit a broad range of nonstandard manipulations for more specific objectives. Basic methodological guidelines are also provided, as are suggestions for using the software to explore unconventional directions of research. The most recent version of gSRT-Soft may be obtained for free by contacting the authors.
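
    gSRT-Soft's implementation is not detailed in the abstract; the sketch below illustrates the core loop of a serial reaction time task in general terms, presenting a repeating sequence and recording per-trial reaction times. It is a console-based stand-in with invented parameters, not how gSRT-Soft actually works.

    ```python
    # Hypothetical sketch of a serial reaction time (SRT) trial loop:
    # stimuli follow a repeating sequence and reaction times are recorded.
    # Console-based illustration only; not gSRT-Soft's implementation.
    import time

    SEQUENCE = [3, 1, 4, 2, 3, 2, 1, 4]  # illustrative repeating positions

    def run_block(n_trials: int = 16) -> list[float]:
        rts = []
        for trial in range(n_trials):
            target = SEQUENCE[trial % len(SEQUENCE)]
            start = time.perf_counter()
            response = int(input(f"Trial {trial + 1}: type the position shown ({target}): "))
            rts.append(time.perf_counter() - start)
            if response != target:
                print("  (incorrect response recorded)")
        return rts

    if __name__ == "__main__":
        reaction_times = run_block(4)
        print(f"Mean RT: {sum(reaction_times) / len(reaction_times):.3f} s")
    ```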

  15. Numerical investigation of flow on NACA4412 aerofoil with different aspect ratios

    NASA Astrophysics Data System (ADS)

    Demir, Hacımurat; Özden, Mustafa; Genç, Mustafa Serdar; Çağdaş, Mücahit

    2016-03-01

    In this study, the flow over a NACA4412 aerofoil was investigated both numerically and experimentally at different Reynolds numbers. The experiments were carried out in a low-speed wind tunnel at various angles of attack and two Reynolds numbers (25,000 and 50,000). The aerofoil was manufactured using a 3D printer with two aspect ratios (AR = 1 and AR = 3). Smoke-wire and oil flow visualization methods were used to visualize the surface flow patterns. The NACA4412 aerofoil was designed using SOLIDWORKS, the structural grid of the numerical model was constructed with ANSYS ICEM CFD meshing software, and ANSYS FLUENT™ software was used to perform the numerical calculations. The numerical results were compared with the experimental results. Bubble formation was observed in the CFD streamlines and the smoke-wire experiments at z/c = 0.4; the bubble shrank at z/c = 0.2 owing to the effects of tip vortices in both the numerical and experimental studies. Consequently, a good agreement was seen between the numerical and experimental results.
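
    For context on the flow regimes quoted above, the chord-based Reynolds number is Re = rho U c / mu. The sketch below evaluates it with standard sea-level air properties; the chord length is an assumption for illustration, not a value from the study.

    ```python
    # Illustrative chord-based Reynolds number, Re = rho * U * c / mu,
    # using standard sea-level air properties. Chord length is assumed.
    RHO = 1.225    # air density, kg/m^3
    MU = 1.81e-5   # dynamic viscosity, Pa*s

    def reynolds(velocity_m_s: float, chord_m: float) -> float:
        return RHO * velocity_m_s * chord_m / MU

    # With an assumed 0.1 m chord, Re = 50,000 corresponds to roughly 7.4 m/s.
    print(f"U for Re = 50000: {50000 * MU / (RHO * 0.1):.1f} m/s")
    ```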

  16. Near DC eddy current measurement of aluminum multilayers using MR sensors and commodity low-cost computer technology

    NASA Astrophysics Data System (ADS)

    Perry, Alexander R.

    2002-06-01

    Low Frequency Eddy Current (EC) probes are capable of measurement from 5 MHz down to DC through the use of Magnetoresistive (MR) sensors. Choosing components with appropriate electrical specifications allows them to be matched to the power and impedance characteristics of standard computer connectors. This permits direct attachment of the probe to inexpensive computers, thereby eliminating external power supplies, amplifiers and modulators that have heretofore precluded very low system purchase prices. Such price reduction is key to increased market penetration in General Aviation maintenance and consequent reduction in recurring costs. This paper examines our computer software CANDETECT, which implements this approach and permits effective probe operation. Results are presented to show the intrinsic sensitivity of the software and demonstrate its practical performance when seeking cracks in the underside of a thick aluminum multilayer structure. The majority of the General Aviation light aircraft fleet uses rivets and screws to attach sheet aluminum skin to the airframe, resulting in similar multilayer lap joints.

  17. Motofit - integrating neutron reflectometry acquisition, reduction and analysis into one, easy to use, package

    NASA Astrophysics Data System (ADS)

    Nelson, Andrew

    2010-11-01

    The efficient use of complex neutron scattering instruments is often hindered by the complex nature of their operating software. This complexity exists at each experimental step: data acquisition, reduction, and analysis, with each step being as important as the previous. For example, whilst command line interfaces are powerful for automated acquisition, they often reduce accessibility for novice users and can even reduce efficiency for advanced users. One solution is the development of a graphical user interface that allows the user to operate the instrument through a simple and intuitive "push button" approach. This approach was taken by the Motofit software package for the analysis of multiple-contrast reflectometry data. Here we describe the extension of this package to cover the data acquisition and reduction steps for the Platypus time-of-flight neutron reflectometer. Consequently, the complete operation of an instrument is integrated into a single, easy-to-use program, leading to efficient instrument usage.

  18. Creating Dynamic Learning Environment to Enhance Students’ Engagement in Learning Geometry

    NASA Astrophysics Data System (ADS)

    Sariyasa

    2017-04-01

    Learning geometry gives many benefits to students. It strengthens the development of deductive thinking and reasoning, and it provides an opportunity to improve visualisation and spatial ability. Some studies, however, have pointed out the difficulties that students encounter when learning geometry. A preliminary study by the author in Bali revealed that one of the main problems was teachers’ difficulty in delivering geometry instruction, partly due to the lack of appropriate instructional media. Dynamic learning environments built on dynamic geometry software are a promising solution to this problem. Employing GeoGebra software supported by a well-designed instructional process may result in more meaningful learning, and consequently students are motivated to engage in the learning process more deeply and actively. In this paper, we provide some examples of GeoGebra-aided learning activities that allow students to interactively explore and investigate geometry concepts and the properties of geometry objects. It is thus expected that such a learning environment will enhance students’ internalisation of geometry concepts.

  19. Software and the Scientist: Coding and Citation Practices in Geodynamics

    NASA Astrophysics Data System (ADS)

    Hwang, Lorraine; Fish, Allison; Soito, Laura; Smith, MacKenzie; Kellogg, Louise H.

    2017-11-01

    In geodynamics as in other scientific areas, computation has become a core component of research, complementing field observation, laboratory analysis, experiment, and theory. Computational tools for data analysis, mapping, visualization, modeling, and simulation are essential for all aspects of the scientific workflow. Specialized scientific software is often developed by geodynamicists for their own use, and this effort represents a distinctive intellectual contribution. Drawing on a geodynamics community that focuses on developing and disseminating scientific software, we assess the current practices of software development and attribution, as well as attitudes about the need and best practices for software citation. We analyzed publications by participants in the Computational Infrastructure for Geodynamics and conducted mixed-method surveys of the solid earth geophysics community. From this we learned that coding skills are typically learned informally. Participants considered good code to be trusted, reusable, readable, and not overly complex, and considered a good coder to be one who participates in the community in an open and reasonable manner, contributing to both long- and short-term community projects. Participants strongly supported citing software, as reflected by the high rate at which software packages were named in the literature and the high rate of citations in reference lists. Lacking, however, are clear instructions from developers on how to cite and education of users on what to cite. In addition, citations did not always lead to the discoverability of the resource. A unique identifier for the software package itself, community education, and citation tools would all contribute to better attribution practices.

  20. GammaLib and ctools. A software framework for the analysis of astronomical gamma-ray data

    NASA Astrophysics Data System (ADS)

    Knödlseder, J.; Mayer, M.; Deil, C.; Cayrou, J.-B.; Owen, E.; Kelley-Hoskins, N.; Lu, C.-C.; Buehler, R.; Forest, F.; Louge, T.; Siejkowski, H.; Kosack, K.; Gerard, L.; Schulz, A.; Martin, P.; Sanchez, D.; Ohm, S.; Hassan, T.; Brau-Nogué, S.

    2016-08-01

    The field of gamma-ray astronomy has seen important progress during the last decade, yet to date no common software framework has been developed for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib was written in C++ and all functionality is available in Python through an extension module. Based on this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools were written in Python and C++, and can be used either from the command line via shell scripts or directly from Python. In this paper we present the GammaLib and ctools software versions 1.0 that were released at the end of 2015. GammaLib and ctools are ready for the science analysis of Imaging Air Cherenkov Telescope event data, and also support the analysis of Fermi-LAT data and the exploitation of the COMPTEL legacy data archive. We propose using ctools as the science tools software for the Cherenkov Telescope Array Observatory.
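
    The modular ftools-style model mentioned above, in which each analysis step is a small tool driven by named parameters and chained from the shell or from Python, can be sketched generically as follows. The class and parameter names here are invented for illustration and are not the actual GammaLib/ctools API.

    ```python
    # Generic illustration of the modular "ftools"-style pattern: each
    # analysis step is a self-contained tool driven by named parameters,
    # usable from Python or chained from shell scripts. Hypothetical names;
    # not the actual GammaLib/ctools API.

    class Tool:
        """A self-contained analysis step with named parameters."""
        def __init__(self, name: str, **params) -> None:
            self.name = name
            self.params = params
        def run(self) -> dict:
            print(f"[{self.name}] running with {self.params}")
            return {"outfile": self.params.get("outfile", f"{self.name}.fits")}

    # Chain two hypothetical steps: simulate events, then fit a model.
    sim = Tool("obssim", inmodel="crab.xml", outfile="events.fits")
    events = sim.run()
    fit = Tool("like", inobs=events["outfile"], inmodel="crab.xml")
    fit.run()
    ```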
