Sample records for start-based software

  1. Recommended approach to software development, revision 3

    NASA Technical Reports Server (NTRS)

    Landis, Linda; Waligora, Sharon; Mcgarry, Frank; Pajerski, Rose; Stark, Mike; Johnson, Kevin Orlin; Cover, Donna

    1992-01-01

    Guidelines for an organized, disciplined approach to software development, based on studies conducted by the Software Engineering Laboratory (SEL) since 1976, are presented. The document describes methods and practices for each phase of a software development life cycle that starts with requirements definition and ends with acceptance testing. For each defined life cycle phase, guidelines are presented for the development process and its management, and for the products produced and their reviews.

  2. Proceedings of the 3rd International Workshop on a Research Agenda for Maintenance and Evolution of Service-Oriented Systems (MESOA 2009)

    DTIC Science & Technology

    2010-02-01

    through software-as-a-service (SaaS) (Nitu 2009, Sedayao 2008). In practice, an organization’s initial SOA implementation almost never attempts to cover... Nitu. "Configurability in SaaS (Software as a Service) Applications." Proceedings of the 2nd Annual Conference on India Software Engineering... and evolution of service-oriented systems. In 2007, the Software Engineering Institute started assembling a SOA Research Agenda based on a

  3. Common Grounds for Modelling Mathematics in Educational Software

    ERIC Educational Resources Information Center

    Neuper, Walther

    2010-01-01

    Two kinds of software, CAS and DGS, are starting to work towards mutual integration. This paper envisages common grounds for such integration based on principles of computer theorem proving (CTP). Presently, the CTP community seems to lack awareness as to which of their products' features might serve mathematics education from high-school to…

  4. Recommended approach to software development

    NASA Technical Reports Server (NTRS)

    Mcgarry, F. E.; Page, J.; Eslinger, S.; Church, V.; Merwarth, P.

    1983-01-01

    A set of guidelines for an organized, disciplined approach to software development is presented, based on data collected and studied from 46 flight dynamics software development projects. Methods and practices for each phase of a software development life cycle that starts with requirements analysis and ends with acceptance testing are described; maintenance and operation are not addressed. For each defined life cycle phase, guidelines are presented for the development process and its management, and for the products produced and their reviews.

  5. Layout Study and Application of Mobile App Recommendation Approach Based On Spark Streaming Framework

    NASA Astrophysics Data System (ADS)

    Wang, H. T.; Chen, T. T.; Yan, C.; Pan, H.

    2018-05-01

    For the domain of mobile phone app recommendation, an approach is proposed that combines a weighted Slope One algorithm with item-based collaborative filtering, improving on the cold-start and data-sparsity problems of traditional collaborative filtering. The recommendation algorithm is parallelized on the Spark platform, and the Spark Streaming real-time computing framework is introduced to improve the timeliness of app recommendations.
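
    As a rough illustration of the weighted Slope One scheme the abstract builds on, here is a minimal plain-Python sketch with invented toy ratings (the paper's Spark parallelization and streaming integration are not reproduced):

      # Weighted Slope One: predict a user's rating for a target item from
      # average pairwise rating deviations, weighted by co-rating counts.
      from collections import defaultdict

      def train(ratings):
          dev = defaultdict(float)   # (j, i) -> mean deviation of r_j - r_i
          freq = defaultdict(int)    # (j, i) -> number of co-rating users
          for user_ratings in ratings.values():
              for i, ri in user_ratings.items():
                  for j, rj in user_ratings.items():
                      if i != j:
                          dev[(j, i)] += rj - ri
                          freq[(j, i)] += 1
          for key in dev:
              dev[key] /= freq[key]
          return dev, freq

      def predict(ratings, user, target, dev, freq):
          num = den = 0.0
          for i, ri in ratings[user].items():
              if (target, i) in dev:
                  num += (dev[(target, i)] + ri) * freq[(target, i)]
                  den += freq[(target, i)]
          return num / den if den else None   # None: cold-start fallback needed

      ratings = {"u1": {"a": 5, "b": 3, "c": 2},
                 "u2": {"a": 3, "b": 4},
                 "u3": {"b": 2, "c": 5}}
      dev, freq = train(ratings)
      print(predict(ratings, "u2", "c", dev, freq))   # ~3.33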

  6. The Effect of Software Reusability on Information Theory Based Software Metrics

    DTIC Science & Technology

    1990-01-01

    of plans across programming languages and application areas, only a brief abstract treatment of non-contiguous "program parts" is mentioned in the... info->num = linenum; if (*info->text)... if (find(linenum)) patchup(linenum, 1); /* fix up old line numbers */ ... if (*info->text)... start

  7. R-189 (C-620) air compressor control logic software documentation. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, K.E.

    1995-06-08

    This document relates to the FFTF plant air compressors. Its purpose is to provide an updated Computer Software Description for the software to be used on the R-189 (C-620-C) air compressor programmable controllers. Logic software design changes were required to allow automatic starting of a compressor that had not been previously started.

  8. Getting started on metrics - Jet Propulsion Laboratory productivity and quality

    NASA Technical Reports Server (NTRS)

    Bush, M. W.

    1990-01-01

    A review is presented to describe the effort and difficulties of reconstructing fifteen years of JPL software history. In 1987 the collection and analysis of project data were started with the objective of creating laboratory-wide measures of quality and productivity for software development. As a result of this two-year Software Product Assurance metrics study, a rough measurement foundation for software productivity and software quality, and an order-of-magnitude quantitative baseline for software systems and subsystems are now available.

  9. A Remote Registration Based on MIDAS

    NASA Astrophysics Data System (ADS)

    JIN, Xin

    2017-04-01

    Software registration is often needed to protect the interests of software developers. This article describes a kind of remote software registration technique. The registration method is as follows: the registration information is placed in a database table; after the program starts, it checks the table for registration information, and if the software is registered the program runs normally. Otherwise, the user must input the serial number and register over the network on the remote server. If registration succeeds, the registration information is recorded in the database table. This remote registration method can protect the rights of software developers.
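
    A hedged sketch of the check-then-register flow just described, in Python; the table schema, server endpoint, and response protocol are hypothetical stand-ins, not details from the article:

      # On startup: check a local registration table; if empty, validate a
      # serial number against a remote server and record the result.
      # "licenses.example.com" and the OK-response protocol are invented.
      import sqlite3
      import urllib.request

      SERVER = "https://licenses.example.com/register"  # hypothetical endpoint

      def main():
          conn = sqlite3.connect("app.db")
          conn.execute("CREATE TABLE IF NOT EXISTS registration (serial TEXT)")
          if conn.execute("SELECT 1 FROM registration").fetchone():
              print("Already registered: starting normally.")
              return
          serial = input("Enter serial number: ")
          with urllib.request.urlopen(f"{SERVER}?serial={serial}") as resp:
              if resp.read().strip() == b"OK":
                  conn.execute("INSERT INTO registration VALUES (?)", (serial,))
                  conn.commit()
                  print("Registration succeeded: information recorded.")
              else:
                  print("Registration failed: exiting.")

      if __name__ == "__main__":
          main()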

  10. Computational Support for Technology- Investment Decisions

    NASA Technical Reports Server (NTRS)

    Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey

    2007-01-01

    Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
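
    As a toy illustration of budget-constrained, expected-utility portfolio selection (the utilities, costs, and budget below are invented; START's actual model also covers time phasing, enabling versus enhancing capabilities, and uncertainty):

      # Pick the subset of technology investments that maximizes total
      # expected utility under a budget cap. Brute force over subsets is
      # fine at toy scale; real portfolio tools use smarter optimization.
      from itertools import combinations

      projects = {"A": (4.0, 3), "B": (5.0, 4), "C": (3.0, 2)}  # name: (utility, cost)
      budget = 6

      feasible = (subset
                  for r in range(len(projects) + 1)
                  for subset in combinations(projects, r)
                  if sum(projects[p][1] for p in subset) <= budget)
      best = max(feasible, key=lambda s: sum(projects[p][0] for p in s))
      print(best, sum(projects[p][0] for p in best))   # ('B', 'C') 8.0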

  11. Windows VPN Set Up | High-Performance Computing | NREL

    Science.gov Websites

    it in your My Documents folder. Configure the client software using that conf file. Start the... Configure the Client Software: Start the Endian Connect App. You'll configure the connection using the hpcvpn-win.conf file, uncheck the "save password" link, and add your UserID. Start

  12. MATTS- A Step Towards Model Based Testing

    NASA Astrophysics Data System (ADS)

    Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.

    2016-08-01

    In this paper we describe a model-based approach to testing on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems, the software engineering process has to be improved in at least two respects: 1) software design and 2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time and resource consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines), generate abstract test cases, and then convert them to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
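
    A minimal sketch of deriving abstract test cases from a state machine model, in Python; the machine below is invented for illustration, not the on-board software model from the study:

      # Enumerate transition paths up to a bounded depth; each path yields an
      # abstract test case: an input (event) sequence plus the expected state.
      def test_cases(fsm, start, depth):
          stack = [(start, [])]
          while stack:
              state, inputs = stack.pop()
              if inputs:
                  yield inputs, state
              if len(inputs) < depth:
                  for event, nxt in fsm.get(state, {}).items():
                      stack.append((nxt, inputs + [event]))

      fsm = {"OFF": {"power_on": "STANDBY"},
             "STANDBY": {"arm": "ARMED", "power_off": "OFF"},
             "ARMED": {"fire": "STANDBY"}}
      for inputs, expected in test_cases(fsm, "OFF", depth=3):
          print(inputs, "->", expected)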

  13. Towards easing the configuration and new team member accommodation for open source software based portals

    NASA Astrophysics Data System (ADS)

    Fu, L.; West, P.; Zednik, S.; Fox, P. A.

    2013-12-01

    For simple portals such as vocabulary-based services, which contain small amounts of data and require only hyper-textual representation, it is often overkill to adopt the whole software stack of database, middleware and front end, or to use a general Web development framework as the starting point of development. Directly combining open source software is a much more favorable approach. However, our experience with the Coastal and Marine Spatial Planning Vocabulary (CMSPV) service portal shows that there are still issues, such as system configuration and accommodating new team members, that need to be handled carefully. In this contribution, we share our experience in the context of the CMSPV portal, and focus on the tools and mechanisms we've developed to ease the configuration job and the incorporation process of new project members. We discuss the configuration issues that arise when we don't have complete control over how the software in use is configured and need to follow existing configuration styles that may not be well documented, especially when multiple pieces of such software need to work together as a combined system. The CMSPV portal is built on two pieces of open source software that are still under rapid development: a Fuseki data server and an Epimorphics Linked Data API (ELDA) front end. Both lack mature documentation and tutorials. We developed comparison and labeling tools to ease the problem of system configuration. Another problem that slowed down the project is that project members came and went during the development process, so new members needed to start with a partially configured system and incomplete documentation left by old members. We developed documentation/tutorial maintenance mechanisms based on our comparison and labeling tools to make it easier for new members to be incorporated into the project. These tools and mechanisms also benefited other projects that reused the software components from the CMSPV system.

  14. Apple OS X VPN Set Up | High-Performance Computing | NREL

    Science.gov Websites

    software using that conf file and your UserID. Start the connection using your password plus the 6-digit OTP. Configure the Client Software: Start the Endian Connect App (it should have installed into Applications)... uncheck the "save password" link, and add your UserID. Start the app, and begin configuring the connection by clicking

  15. A Taxonomy of Object-Oriented Measures Modeling the Object-Oriented Space

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.; Weistroffer, H. Roland; Coppins, Richard J.

    1997-01-01

    In order to control the quality of software and the software development process, it is important to understand the measurement of software. A first step toward a better comprehension of software measurement is the categorization of software measures by some meaningful taxonomy. The most worthwhile taxonomy would capture the fundamental nature of the object-oriented (O-O) space. The principal characteristics of object-oriented software offer a starting point for such a categorization of measures. This paper introduces a taxonomy of measures based upon fourteen characteristics of object-oriented software gathered from the literature. This taxonomy allows us to easily see gaps or redundancies in the existing O-O measures. The taxonomy also clearly differentiates among taxa so that there is no ambiguity as to the taxon to which a measure belongs. The taxonomy has been populated with measures taken from the literature.

  16. A first-generation software product line for data acquisition systems in astronomy

    NASA Astrophysics Data System (ADS)

    López-Ruiz, J. C.; Heradio, Rubén; Cerrada Somolinos, José Antonio; Coz Fernandez, José Ramón; López Ramos, Pablo

    2008-07-01

    This article presents a case study on developing a software product line for data acquisition systems in astronomy, based on the Exemplar Driven Development methodology and the Exemplar Flexibilization Language tool. The main strategies for building the software product line are based on domain commonality and variability, incremental scope, and the use of existing artifacts. It is a lean methodology with little impact on the organization, suitable for small projects, which reduces product line start-up time. Software product lines focus on creating a family of products instead of individual products. This approach has spectacular benefits in reducing the time to market, maintaining the know-how, reducing development costs, and increasing the quality of new products. The maintenance of the products is also enhanced, since all the data acquisition systems share the same product line architecture.

  17. The LHCb Starterkit

    NASA Astrophysics Data System (ADS)

    Puig, Albert; LHCb Starterkit Team

    2017-10-01

    The vast majority of high-energy physicists use and produce software every day. Software skills are usually acquired “on the go” and dedicated training courses are rare. The LHCb Starterkit is a new training format for getting LHCb collaborators started in effectively using software to perform their research. The course focuses on teaching basic skills for research computing. Unlike traditional tutorials, we start with the basics, present all the material live, with a high degree of interactivity, and give priority to understanding the tools as opposed to handing out recipes that work “as if by magic”. The LHCb Starterkit was started by two young members of the collaboration inspired by the principles of Software Carpentry, and the material is created in a collaborative fashion using the tools we teach. Three successful entry-level workshops, as well as an advanced one, have taken place since the start of the initiative in 2015, and were taught largely by PhD students to other PhD students.

  18. Libre Software in Spanish Public Administrations

    NASA Astrophysics Data System (ADS)

    Ortega, Felipe; Lafuente, Isabel; Gato, Jose; González-Barahona, Jesús M.

    Libre software started to be used in Public Administrations in Spain during the 1990s, in some isolated but interesting experiences. During the early 2000s, and especially in some regional governments, libre software started to be considered an integral part of IT-related policies. In 2007, it was evident that many experiences related to libre software were running in Public Administrations with different levels of success. However, no study had looked into the details of these experiences, and no comprehensive analysis had been performed to better understand the different factors that affect them.

  19. Electronic Health Record for Intensive Care based on Usual Windows Based Software.

    PubMed

    Reper, Arnaud; Reper, Pascal

    2015-08-01

    In Intensive Care Units, the amount of data to be processed for patient care, the turnover of the patients, and the necessity for reliability and for review processes indicate the use of Patient Data Management Systems (PDMS) and electronic health records (EHR). To respond to the needs of an Intensive Care Unit and not be locked into proprietary software, we developed an EHR based on usual software and components. The software was designed as a client-server architecture running on the Windows operating system and powered by the Access database system. The client software was developed using the Visual Basic interface library. The application offers users the following functions: capture of medical notes, observations and treatments, nursing charts with administration of medications, scoring systems for classification, and possibilities to encode medical activities for billing processes. Since its deployment in September 2004, the EHR has been used to care for more than five thousand patients with the expected software reliability, and it has facilitated data management and review processes. Communications with other medical software were not developed from the start, and are realized by the use of a basic communication engine. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on usual software components, was able to respond to the medical needs of the local ICU environment. The use of Windows for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.

  1. Programs for Testing an SSME-Monitoring System

    NASA Technical Reports Server (NTRS)

    Lang, Andre; Cecil, Jimmie; Heusinger, Ralph; Freestone, Kathleen; Blue, Lisa; Wilkerson, DeLisa; McMahon, Leigh Anne; Hall, Richard B.; Varnavas, Kosta; Smith, Keary

    2007-01-01

    A suite of computer programs has been developed for special test equipment (STE) that is used in verification testing of the Health Management Computer Integrated Rack Assembly (HMCIRA), a ground-based system of analog and digital electronic hardware and software for "flight-like" testing for development of components of an advanced health-management system for the space shuttle main engine (SSME). The STE software enables the STE to simulate the analog input and the data flow of an SSME test firing from start to finish.

  2. Modeling the Object-Oriented Space Through Validated Measures

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.

    1996-01-01

    In order to truly understand software and the software development process, software measurement must be better understood. A beginning step toward a better understanding of software measurement is the categorization of the measurements by some meaningful taxonomy. The most meaningful taxonomy would capture the basic nature of the object-oriented (O-O) space. The interesting characteristics of object-oriented software offer a starting point for such a categorization of measures. A taxonomy has been developed based on fourteen characteristics of object-oriented software gathered from the literature. This taxonomy allows us to easily see gaps and redundancies in the O-O measures. The taxonomy also clearly differentiates among taxa so that there is no ambiguity as to the taxon to which a measure belongs. The taxonomy has been populated with thirty-two measures that have been validated in the narrow sense of Fenton, using measurement theory with Zuse's augmentation.

  3. Software quality in 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, C.

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  4. A Heuristic Evaluation of the Generalized Intelligent Framework for Tutoring (GIFT) Authoring Tools

    DTIC Science & Technology

    2016-03-01

    Software Is Difficult to Locate and Download from the GIFT Website 5 2.1.2 Issue: Unclear Process for Starting GIFT Software Installation 7 2.1.3 Issue... and change information availability as the user’s expertise in using the authoring tools grows. • Aesthetic and Minimalist Design: Dialogues should... public release; distribution is unlimited. 7 2.1.2 Issue: Unclear Process for Starting GIFT Software Installation. Users may not understand how to

  5. HTTM - Design and Implementation of a Type-2 Hypervisor for MIPS64 Based Systems

    NASA Astrophysics Data System (ADS)

    Ain, Qurrat ul; Anwar, Usama; Mehmood, Muhammad Amir; Waheed, Abdul

    2017-01-01

    Virtualization has emerged as an attractive software solution for many problems in the server domain. Recently, it has started to enrich the embedded systems domain by offering features such as hardware consolidation, security, and isolation. Our objective is to bring virtualization to high-end MIPS64 based systems, such as network routers, switches, wireless base stations, etc. For this purpose a Type-2 hypervisor is a viable software solution which is easy to deploy and requires no changes in the host system. In this paper we present the internal design of HTTM, a Type-2 hypervisor for MIPS64 based systems, and demonstrate its functional correctness by using Linux Testing Project (LTP) tests. Finally, we performed LMbench tests for performance evaluation.

  6. A preliminary architecture for building communication software from traffic captures

    NASA Astrophysics Data System (ADS)

    Acosta, Jaime C.; Estrada, Pedro

    2017-05-01

    Security analysts are tasked with identifying and mitigating network service vulnerabilities. A common problem associated with in-depth testing of network protocols is the availability of software that communicates across disparate protocols. Many times, the software required to communicate with these services is not publicly available. Developing this software is a time-consuming undertaking that requires expertise and understanding of the protocol specification. The work described in this paper aims at developing a software package that is capable of automatically creating communication clients by using packet capture (pcap) and TShark dissectors. Currently, our focus is on simple protocols with fixed fields. The methodologies developed as part of this work will extend to more complex protocols such as the Gateway Load Balancing Protocol (GLBP), Port Aggregation Protocol (PAgP), and Open Shortest Path First (OSPF). Thus far, we have architected a modular pipeline for an automatic traffic-based software generator. We start the transformation of captured network traffic by employing TShark to convert packets into a Packet Details Markup Language (PDML) file. The PDML file contains a parsed, textual representation of the packet data. Then, we extract field data and types, along with inter- and intra-packet dependencies. This information is then utilized to construct an XML file that encompasses the protocol state machine and field vocabulary. Finally, this XML is converted into executable code. Using our methodology, and as a starting point, we have succeeded in automatically generating software that communicates with other hosts using an automatically generated Internet Control Message Protocol (ICMP) client program.
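
    A minimal sketch of the first pipeline stage as described (TShark must be on the PATH; capture.pcap is a placeholder, and the later stages, from dependency extraction through code generation, are not shown):

      # Convert a capture to PDML with TShark, then pull out per-packet
      # field names and values: the raw material for inferring the field
      # vocabulary described above.
      import subprocess
      import xml.etree.ElementTree as ET

      pdml = subprocess.run(
          ["tshark", "-r", "capture.pcap", "-T", "pdml"],
          capture_output=True, check=True, text=True,
      ).stdout

      for packet in ET.fromstring(pdml).iter("packet"):
          for field in packet.iter("field"):
              name, value = field.get("name"), field.get("show")
              if name:
                  print(name, "=", value)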

  7. Online Searching with a Microcomputer--Getting Started.

    ERIC Educational Resources Information Center

    Casbon, Susan

    1983-01-01

    Based on online searching experiences on microcomputer at a small liberal arts college, this article outlines for the novice advantages and disadvantages of micro-searching, legal implications, future trends, and factors to consider in selecting hardware and software. A 16-item bibliography arranged in order of usefulness and 10 references are…

  8. Surveillance Jumps on the Network

    ERIC Educational Resources Information Center

    Raths, David

    2011-01-01

    Internet protocol (IP) network-based cameras and digital video management software are maturing, and many issues that have surrounded them, including bandwidth, data storage, ease of use, and integration are starting to become clearer as the technology continues to evolve. Prices are going down and the number of features is going up. Many school…

  9. Path Searching Based Fault Automated Recovery Scheme for Distribution Grid with DG

    NASA Astrophysics Data System (ADS)

    Xia, Lin; Qun, Wang; Hui, Xue; Simeng, Zhu

    2016-12-01

    Applying path searching based on distribution network topology has proven effective in relay setting software, and a path searching method that includes DG power sources is also applicable to the automatic generation and division of planned islands after a fault. This paper applies a path searching algorithm to the automatic division of planned islands after faults: starting from the fault-isolation switch and ending at each power source, and taking into account the line load traversed by the searching path and the important load integrated along the optimized searching path, an optimized division scheme is formed in which each planned island uses a DG as its power source and is balanced against the local important load. Finally, the COBASE software and the applied distribution network automation software are used to illustrate the effectiveness of the automatic restoration program.
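
    A hedged Python sketch of the island-division idea: expand outward from the fault-isolation switch, admitting load only while a DG's capacity allows, and keep the island if it reaches the DG (the graph, loads, and capacity are invented, not from the paper):

      # Greedy lowest-load-first expansion from the fault-isolation switch
      # toward a DG, respecting the DG's capacity limit.
      import heapq

      def island(graph, loads, start, dg, capacity):
          picked, total = set(), 0.0
          heap = [(loads.get(start, 0.0), start)]
          while heap:
              load, node = heapq.heappop(heap)
              if node in picked or total + load > capacity:
                  continue
              picked.add(node)
              total += load
              for nbr in graph.get(node, []):
                  if nbr not in picked:
                      heapq.heappush(heap, (loads.get(nbr, 0.0), nbr))
          return picked if dg in picked else set()

      graph = {"S": ["n1"], "n1": ["S", "n2", "DG"], "n2": ["n1"], "DG": ["n1"]}
      loads = {"S": 0.0, "n1": 2.0, "n2": 4.0, "DG": 0.0}
      print(island(graph, loads, "S", "DG", capacity=5.0))   # {'S', 'n1', 'DG'}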

  10. Comparison of Automated Atlas-Based Segmentation Software for Postoperative Prostate Cancer Radiotherapy

    PubMed Central

    Delpon, Grégory; Escande, Alexandre; Ruef, Timothée; Darréon, Julien; Fontaine, Jimmy; Noblet, Caroline; Supiot, Stéphane; Lacornerie, Thomas; Pasquier, David

    2016-01-01

    Automated atlas-based segmentation (ABS) algorithms have the potential to reduce the variability in volume delineation. Several vendors offer software packages that are mainly used for cranial, head and neck, and prostate cases. The present study compares the contours produced by a radiation oncologist to the contours computed by different automated ABS algorithms for prostate bed cases, including femoral heads, bladder, and rectum. Contour comparison was evaluated by different metrics such as volume ratio, Dice coefficient, and Hausdorff distance. Results depended on the volume of interest and showed some discrepancies between the different software packages. Automatic contours could be a good starting point for the delineation of organs, since efficient editing tools are provided by the different vendors. They should become an important help in the next few years for organ-at-risk delineation. PMID:27536556

  11. Software to Go--And It Goes!

    ERIC Educational Resources Information Center

    Abrams, Mary; Kurlychek, Ken

    1989-01-01

    This article describes the Software Evaluation Clearinghouse for Educators of the Hearing Impaired at Gallaudet University (Washington, DC). Software compatible with Apple and IBM hardware is collected, rated by clearinghouse members, and described in a printed catalog. Tips on starting a software lending library are offered. (PB)

  12. Autonomous system for Web-based microarray image analysis.

    PubMed

    Bozinov, Daniel

    2003-12-01

    Software-based feature extraction from DNA microarray images still requires human intervention on various levels. Manual adjustment of grid and metagrid parameters, precise alignment of superimposed grid templates and gene spots, or simply identification of large-scale artifacts have to be performed beforehand to reliably analyze DNA signals and correctly quantify their expression values. Ideally, a Web-based system with input solely confined to a single microarray image and a data table as output containing measurements for all gene spots would directly transform raw image data into abstracted gene expression tables. Sophisticated algorithms with advanced procedures for iterative correction can overcome the inherent challenges in image processing. Herein is introduced an integrated software system with a Java-based interface on the client side that allows for decentralized access and furthermore enables the scientist to instantly employ the most updated software version at any given time. This software tool extends PixClust, as used in Extractiff, incorporating Java Web Start deployment technology. Ultimately, this setup is destined for high-throughput pipelines in genome-wide medical diagnostics labs or microarray core facilities aimed at providing fully automated service to their users.

  13. Vision Videos Empower Students

    ERIC Educational Resources Information Center

    Patt, Mary Johnson

    2009-01-01

    Increasing numbers of school districts are starting the higher education drumbeat by the freshman year of high school, employing 21st-century technology such as the popular career-based software developed by Naviance to help students map their school and life journeys. But what is the first step in inspiring those teens to define and pursue their…

  14. Integrated Assessment and Improvement of the Quality Assurance System for the Cosworth Casting Process

    NASA Astrophysics Data System (ADS)

    Yousif, Dilon

    The purpose of this study was to improve the Quality Assurance (QA) System at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al Alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiBQ) developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of the 3XX Al Alloy(s) using IRC techniques. The impact of low melting point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).

  15. Using C to build a satellite scheduling expert system: Examples from the Explorer Platform planning system

    NASA Technical Reports Server (NTRS)

    Mclean, David R.; Tuchman, Alan; Potter, William J.

    1991-01-01

    A C-based artificial intelligence (AI) development effort which is based on a software tools approach is discussed, with emphasis on reusability and maintainability of code. The discussion starts with simple examples of how list processing can easily be implemented in C and then proceeds to the implementations of frames and objects which use dynamic memory allocation. The implementation of procedures which use depth-first search, constraint propagation, context switching, and a blackboard-like simulation environment is described. Techniques for managing the complexity of C-based AI software are noted, especially the object-oriented techniques of data encapsulation and incremental development. Finally, all these concepts are put together by describing the components of planning software called the Planning And Resource Reasoning (PARR) Shell. This shell has been successfully utilized for scheduling services of the Tracking and Data Relay Satellite System for the Earth Radiation Budget Satellite since May of 1987 and will be used for operations scheduling of the Explorer Platform in November of 1991.

  16. Genome-wide study of correlations between genomic features and their relationship with the regulation of gene expression.

    PubMed

    Kravatsky, Yuri V; Chechetkin, Vladimir R; Tchurikov, Nikolai A; Kravatskaya, Galina I

    2015-02-01

    The broad class of tasks in genetics and epigenetics can be reduced to the study of various features that are distributed over the genome (genome tracks). The rapid and efficient processing of the huge amount of data stored in the genome-scale databases cannot be achieved without the software packages based on the analytical criteria. However, strong inhomogeneity of genome tracks hampers the development of relevant statistics. We developed the criteria for the assessment of genome track inhomogeneity and correlations between two genome tracks. We also developed a software package, Genome Track Analyzer, based on this theory. The theory and software were tested on simulated data and were applied to the study of correlations between CpG islands and transcription start sites in the Homo sapiens genome, between profiles of protein-binding sites in chromosomes of Drosophila melanogaster, and between DNA double-strand breaks and histone marks in the H. sapiens genome. Significant correlations between transcription start sites on the forward and the reverse strands were observed in genomes of D. melanogaster, Caenorhabditis elegans, Mus musculus, H. sapiens, and Danio rerio. The observed correlations may be related to the regulation of gene expression in eukaryotes. Genome Track Analyzer is freely available at http://ancorr.eimb.ru/. © The Author 2015. Published by Oxford University Press on behalf of Kazusa DNA Research Institute.
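
    As a generic illustration of the track-versus-track correlation testing described above, a toy Python sketch that bins two tracks along a chromosome and compares their observed overlap with a shuffled null (this is not the Genome Track Analyzer statistic, which is analytical rather than permutation based; the data are simulated):

      # Two binary genome tracks over fixed-width bins, e.g. CpG-island bins
      # versus transcription-start-site bins; test whether their overlap
      # exceeds what random placement would give.
      import random

      def overlap(a, b):
          return sum(x and y for x, y in zip(a, b))

      random.seed(0)
      track_a = [random.random() < 0.3 for _ in range(10_000)]
      track_b = [a if random.random() < 0.5 else (random.random() < 0.3)
                 for a in track_a]   # partially correlated with track_a

      obs = overlap(track_a, track_b)
      null = []
      for _ in range(200):
          shuffled = track_b[:]
          random.shuffle(shuffled)
          null.append(overlap(track_a, shuffled))
      p = sum(n >= obs for n in null) / len(null)
      print(f"observed overlap = {obs}, permutation p ~ {p:.3f}")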

  17. Open source electronic health record and patient data management system for intensive care.

    PubMed

    Massaut, Jacques; Reper, Pascal

    2008-01-01

    In Intensive Care Units, the amount of data to be processed for patient care, the turnover of the patients, and the necessity for reliability and for review processes indicate the use of Patient Data Management Systems (PDMS) and electronic health records (EHR). To respond to the needs of an Intensive Care Unit and not be locked into proprietary software, we developed a PDMS and EHR based on open source software and components. The software was designed as a client-server architecture running on the Linux operating system and powered by the PostgreSQL database system. The client software was developed in C using the GTK interface library. The application offers users the following functions: capture of medical notes, observations and treatments, nursing charts with administration of medications, scoring systems for classification, and possibilities to encode medical activities for billing processes. Since its deployment in February 2004, the PDMS has been used to care for more than three thousand patients with the expected software reliability, and it has facilitated data management and review processes. Communications with other medical software were not developed from the start, and are realized by the use of the Mirth HL7 communication engine. Further upgrades of the system will include multi-platform support, use of a typed language with static analysis, and a configurable interface. The developed system, based on open source software components, was able to respond to the medical needs of the local ICU environment. The use of OSS for development allowed us to customize the software to the preexisting organization and contributed to the acceptability of the whole system.

  18. A Process for Evaluating Student Records Management Software. ERIC/AE Digest.

    ERIC Educational Resources Information Center

    Vecchioli, Lisa

    This digest provides practical advice on evaluating software for managing student records. An evaluation of record-keeping software should start with a process to identify all of the individual needs the software product must meet in order to be considered for purchase. The first step toward establishing an administrative computing system is…

  19. Modern Corneal Eye-Banking Using a Software-Based IT Management Solution.

    PubMed

    Kern, C; Kortuem, K; Wertheimer, C; Nilmayer, O; Dirisamer, M; Priglinger, S; Mayer, W J

    2018-01-01

    Increasing government legislation and regulations in manufacturing have led to additional documentation regarding the pharmaceutical product requirements of corneal grafts in the European Union. The aim of this project was to develop a software within a hospital information system (HIS) to support the documentation process, to improve the management of the patient waiting list and to increase informational flow between the clinic and eye bank. After an analysis of the current documentation process, a new workflow and software were implemented in our electronic health record (EHR) system. The software takes over most of the documentation and reduces the time required for record keeping. It guarantees real-time tracing of all steps during human corneal tissue processing from the start of production until allocation during surgery and includes follow-up within the HIS. Moreover, listing of the patient for surgery as well as waiting list management takes place in the same system. The new software for corneal eye banking supports the whole process chain by taking over both most of the required documentation and the management of the transplant waiting list. It may provide a standardized IT-based solution for German eye banks working within the same HIS.

  20. Universities Report More Licensing Income but Fewer Start-Ups in 2005

    ERIC Educational Resources Information Center

    Blumenstyk, Goldie

    2007-01-01

    According to a survey conducted by the Association of University Technology Managers, at least two dozen universities each earned more than $10-million from their licensing of rights to new drugs, software, and other inventions in the 2005 fiscal year. The number of institutions creating large numbers of spinoff companies based on their…

  1. Expert Systems for Libraries at SCIL [Small Computers in Libraries]'88.

    ERIC Educational Resources Information Center

    Kochtanek, Thomas R.; And Others

    1988-01-01

    Six brief papers on expert systems for libraries cover (1) a knowledge-based approach to database design; (2) getting started in expert systems; (3) using public domain software to develop a business reference system; (4) a music cataloging inquiry system; (5) linguistic analysis of reference transactions; and (6) a model of a reference librarian.…

  2. Functional Specifications for Computer Aided Training Systems Development and Management (CATSDM) Support Functions. Final Report.

    ERIC Educational Resources Information Center

    Hughes, John; And Others

    This report provides a description of a Computer Aided Training System Development and Management (CATSDM) environment based on state-of-the-art hardware and software technology, including recommendations for off-the-shelf systems to be utilized as a starting point in addressing the particular systematic training and instruction design and…

  3. Using the Git Software Tool on the Peregrine System | High-Performance

    Science.gov Websites

    branch workflow. Create a local branch called "experimental" based on the current master: git branch experimental. Use your branch (start working on that experimental branch): git checkout experimental; git pull origin experimental # work, work, work, commit.... Send the local branch to the repo: git push

  4. Web accessibility and open source software.

    PubMed

    Obrenović, Zeljko

    2009-07-01

    A Web browser provides a uniform user interface to different types of information. Making this interface universally accessible and more interactive is a long-term goal still far from being achieved. Universally accessible browsers require novel interaction modalities and additional functionalities, for which existing browsers tend to provide only partial solutions. Although functionality for Web accessibility can be found as open source and free software components, their reuse and integration is complex because they were developed in diverse implementation environments, following standards and conventions incompatible with the Web. To address these problems, we have started several activities that aim at exploiting the potential of open-source software for Web accessibility. The first of these activities is the development of Adaptable Multi-Interface COmmunicator (AMICO):WEB, an infrastructure that facilitates efficient reuse and integration of open source software components into the Web environment. The main contribution of AMICO:WEB is in enabling the syntactic and semantic interoperability between Web extension mechanisms and a variety of integration mechanisms used by open source and free software components. Its design is based on our experiences in solving practical problems where we have used open source components to improve accessibility of rich media Web applications. The second of our activities involves improving education, where we have used our platform to teach students how to build advanced accessibility solutions from diverse open-source software. We are also partially involved in the recently started Eclipse projects called Accessibility Tools Framework (ACTF), the aim of which is development of extensible infrastructure, upon which developers can build a variety of utilities that help to evaluate and enhance the accessibility of applications and content for people with disabilities. In this article we briefly report on these activities.

  5. A Comparison of Authoring Software for Developing Mathematics Self-Learning Software Packages.

    ERIC Educational Resources Information Center

    Suen, Che-yin; Pok, Yang-ming

    Four years ago, the authors started to develop a self-paced mathematics learning software called NPMaths by using an authoring package called Tencore. However, NPMaths had some weak points. A development team was hence formed to develop similar software called Mathematics On Line. This time the team used another development language called…

  6. Generalized implementation of software safety policies

    NASA Technical Reports Server (NTRS)

    Knight, John C.; Wika, Kevin G.

    1994-01-01

    As part of a research program in the engineering of software for safety-critical systems, we are performing two case studies. The first case study, which is well underway, is a safety-critical medical application. The second, which is just starting, is a digital control system for a nuclear research reactor. Our goal is to use these case studies to permit us to obtain a better understanding of the issues facing developers of safety-critical systems, and to provide a vehicle for the assessment of research ideas. The case studies are not based on the analysis of existing software development by others. Instead, we are attempting to create software for new and novel systems in a process that ultimately will involve all phases of the software lifecycle. In this abstract, we summarize our results to date in a small part of this project, namely the determination and classification of policies related to software safety that must be enforced to ensure safe operation. We hypothesize that this classification will permit a general approach to the implementation of a policy enforcement mechanism.

  7. Software architecture of biomimetic underwater vehicle

    NASA Astrophysics Data System (ADS)

    Praczyk, Tomasz; Szymak, Piotr

    2016-05-01

    Autonomous underwater vehicles are vehicles that are entirely or partly independent of human decisions. In order to obtain operational independence, the vehicles have to be equipped with specialized software. The main task of the software is to move the vehicle along a trajectory with collision avoidance. Moreover, the software also has to manage the different devices installed on the vehicle board, e.g. to start and stop cameras, sonars, etc. In addition to the software embedded on the vehicle board, software responsible for managing the vehicle by the operator is also necessary. Its task is to define the mission of the vehicle, to start and stop the mission, to send emergency commands, to monitor vehicle parameters, and to control the vehicle in remotely operated mode. An important objective of the software is also to support development and tests of other software components. To this end, a simulation environment is necessary, i.e. a simulation model of the vehicle and all its key devices, a model of the sea environment, and software to visualize the behavior of the vehicle. The paper presents the architecture of the software designed for a biomimetic autonomous underwater vehicle (BAUV) that is being constructed within the framework of a scientific project financed by the Polish National Center of Research and Development.

  8. Using Dynamic Geometry and Computer Algebra Systems in Problem Based Courses for Future Engineers

    ERIC Educational Resources Information Center

    Tomiczková, Svetlana; Lávicka, Miroslav

    2015-01-01

    It is a modern trend today, when formulating the curriculum of a geometry course at technical universities, to start from a real-life problem originating in technical praxis and subsequently to define which geometric theories and which skills are necessary for solving it. Nowadays, interactive and dynamic geometry software plays a more and more…

  9. Clinical, information and business process modeling to promote development of safe and flexible software.

    PubMed

    Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn

    2006-09-01

    Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.

  10. Uranus: a rapid prototyping tool for FPGA embedded computer vision

    NASA Astrophysics Data System (ADS)

    Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.

    2007-01-01

    The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate and fix design problems. This work presents Uranus, a software tool for simulation and evaluation of image processing algorithms, with support for migrating them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded operators in software and provides the necessary support to read and display image sequences as well as video files. The user can use the previously compiled soft-operators in a high-level processing chain, and code his own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected with a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.

  11. Adaptive Integration of Nonsmooth Dynamical Systems

    DTIC Science & Technology

    2017-10-11

    controlled time stepping method to interactively design running robots. [1] John Shepherd, Samuel Zapolsky, and Evan M. Drumwright, "Fast multi-body... Started working in simulation after attempting to use software like this to test software running on my robots. The libraries that produce these beautiful results have failed at simulating robotic manipulation. Postulate: It is easier to

  12. Parallel Logic Programming and Parallel Systems Software and Hardware

    DTIC Science & Technology

    1989-07-29

    Conference, Dallas TX, January 1985. (55) [Rous75] Roussel, P., "PROLOG: Manuel de Reference et d'Utilisation" ("PROLOG: Reference and User Manual"), Groupe d'Intelligence Artificielle, Universite d... completed. Tools were provided for software development using artificial intelligence techniques. AI software for massively parallel architectures was... using artificial intelligence techniques. AI software for massively parallel architectures was started. 1. Introduction: We describe research conducted

  13. Project Report: Automatic Sequence Processor Software Analysis

    NASA Technical Reports Server (NTRS)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on the spacecraft. The Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system that is comprised of scripts written in Perl, C shell, and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary, and then sends the resultant information to be radiated to the spacecraft.

  14. Implementation of Systematic Review Tools in IRIS | Science ...

    EPA Pesticide Factsheets

    Currently, the number of chemicals present in the environment exceeds the ability of public health scientists to efficiently screen the available data in order to produce well-informed human health risk assessments in a timely manner. For this reason, the US EPA’s Integrated Risk Information System (IRIS) program has started implementing new software tools in the hazard characterization workflow. These automated tools aid in multiple phases of the systematic review process, including scoping and problem formulation, literature search, and identification and screening of available published studies. The increased availability of these tools lays the foundation for automating or semi-automating multiple phases of the systematic review process. Some of these software tools include modules to facilitate a structured approach to study quality evaluation of human and animal data, although approaches are generally lacking for assessing complex mechanistic information; in particular, for “omics”-based evidence, tools are starting to become available to evaluate these types of studies. We will highlight how new software programs, online tools, and approaches for assessing study quality can be better integrated to allow for a more efficient and transparent workflow of the risk assessment process, as well as identify tool gaps that would benefit future risk assessments. Disclaimer: The views expressed here are those of the authors and do not necessarily represent the view

  15. Software Security Knowledge: Training

    DTIC Science & Technology

    2011-05-01

    eliminating those errors. It can be found at http://cwe.mitre.org/top25. Any programmer who writes code without being aware of those problems and... time on security. Ultimately, these reasons stem from an underlying problem in the software market. Because software is essentially a black box, it is... security of software and start to effect change in the software market. Nevertheless, we still frequently get pushback when we advocate for security

  16. The Profiles in Practice School Reporting Software.

    ERIC Educational Resources Information Center

    Griffin, Patrick

    "The Profiles in Practice: School Reporting Software" provides a framework for reports on different aspects of performance in an assessment program. This booklet is the installation guide and user manual for the Profiles in Practice software, which is included as a CD-ROM. The chapters of the guide are: (1) "Installation"; (2) "Starting the…

  17. A Software Technology Transition Entropy Based Engineering Model

    DTIC Science & Technology

    2002-03-01

    Systems Basics, p. 273) (Prigogine 1997, p. 81). It is not the place of this research to provide a mathematical formalism with theorems and lemmas. Rather... science). The ancient philosophers, Pythagoras, Protagoras, Socrates, and Plato, start the first discourse (the message) that has continued... unpacking of the technology "message" from Pythagoras. This process is characterized by accumulation learning, modeled by learning curves in

  18. Getting Started with AppleWorks Data Base. First Edition.

    ERIC Educational Resources Information Center

    Schlenker, Richard M.

    This manual is a hands-on teaching tool for beginning users of the AppleWorks database software. It was developed to allow Apple IIGS users who are generally familiar with their machine and its peripherals to build a simple AppleWorks database file using version 2.0 or 2.1 of the program, and to store, print, and manipulate the file. The materials…

  19. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle, from the delivery of requirements to the start of acceptance testing, are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  20. Parallel Fortran-MPI software for numerical inversion of the Laplace transform and its application to oscillatory water levels in groundwater environments

    USGS Publications Warehouse

    Zhan, X.

    2005-01-01

    A parallel Fortran-MPI (Message Passing Interface) software package for numerical inversion of the Laplace transform, based on a Fourier series method, was developed to meet the need to solve computationally intensive problems involving the response of oscillatory water levels to hydraulic tests in a groundwater environment. The software is a parallel version of ACM (Association for Computing Machinery) Transactions on Mathematical Software (TOMS) Algorithm 796. Running 38 test examples indicated that implementing MPI techniques on a distributed-memory architecture speeds up the processing and improves efficiency. Applications to oscillatory water levels in a well during aquifer tests are presented to illustrate how this package can be applied to solve complicated environmental problems involving differential and integral equations. The package is free and is easy to use for people with little or no previous experience with MPI who wish to get off to a quick start in parallel computing. © 2004 Elsevier Ltd. All rights reserved.
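
    The abstract does not reproduce the inversion formula, but the Fourier-series family of methods it refers to is well documented. A minimal sketch of one common form (written here in the style of Crump's method; the exact variant implemented in TOMS Algorithm 796 may differ): given a transform F(s), the function f(t) on 0 < t < 2T is approximated as

    ```latex
    f(t) \approx \frac{e^{\gamma t}}{T}\left[\tfrac{1}{2}F(\gamma)
        + \sum_{k=1}^{N} \operatorname{Re}\!\left(
          F\!\left(\gamma + \frac{ik\pi}{T}\right) e^{ik\pi t/T}\right)\right],
        \qquad 0 < t < 2T
    ```

    Each of the N + 1 transform evaluations is independent of the others, which is what makes the sum embarrassingly parallel and a natural fit for distributing across MPI processes.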

  1. influx_s: increasing numerical stability and precision for metabolic flux analysis in isotope labelling experiments.

    PubMed

    Sokol, Serguei; Millard, Pierre; Portais, Jean-Charles

    2012-03-01

    The problem of stationary metabolic flux analysis based on isotope labelling experiments first appeared in the early 1950s and was essentially solved in the early 2000s. Several algorithms and software packages are available for this problem. However, the generic stochastic algorithms (simulated annealing or evolutionary algorithms) currently used in these software packages require a lot of time to achieve acceptable precision. For deterministic algorithms, a common drawback is the lack of convergence stability for ill-conditioned systems or when started from a random point. In this article, we present a new deterministic algorithm with significantly increased numerical stability and accuracy of flux estimation compared with commonly used algorithms. It requires relatively short CPU time (from several seconds to several minutes on a standard PC architecture) to estimate fluxes in the central carbon metabolism network of Escherichia coli. The software package influx_s implementing this algorithm is distributed under an open-source licence at http://metasys.insa-toulouse.fr/software/influx/. Supplementary data are available at Bioinformatics online.

  2. Control Program for an Optical-Calibration Robot

    NASA Technical Reports Server (NTRS)

    Johnston, Albert

    2005-01-01

    A computer program provides semiautomatic control of a moveable robot used to perform optical calibration of video-camera-based optoelectronic sensor systems that will be used to guide automated rendezvous maneuvers of spacecraft. The function of the robot is to move a target and hold it at specified positions. With the help of limit switches, the software first centers or finds the target. Then the target is moved to a starting position. Thereafter, with the help of an intuitive graphical user interface, an operator types in coordinates of specified positions, and the software responds by commanding the robot to move the target to the positions. The software has capabilities for correcting errors and for recording data from the guidance-sensor system being calibrated. The software can also command that the target be moved in a predetermined sequence of motions between specified positions and can be run in an advanced control mode in which, among other things, the target can be moved beyond the limits set by the limit switches.

  3. Software Development for EECU Platform of Turbofan Engine

    NASA Astrophysics Data System (ADS)

    Kim, Bo Gyoung; Kwak, Dohyup; Kim, Byunghyun; Choi, Hee ju; Kong, Changduk

    2017-04-01

    Turbofan engine operation depends on a number of hardware and software components. The engine is controlled by the Electronic Engine Control Unit (EECU). In order to control the engine, the EECU communicates with an aircraft system, the Actuator Drive Unit (ADU), the Engine Power Unit (EPU), and sensors on the engine. This paper investigates the process from engine start to take-off, designs the EECU software modes, and defines the communication data format. The software is implemented according to the designed software modes.

  4. Reviews, Software.

    ERIC Educational Resources Information Center

    Science Teacher, 1988

    1988-01-01

    Reviews two software programs for Apple series computers. Includes "Orbital Mech," a basic planetary orbital simulation for the Macintosh, and "START: Stimulus and Response Tools for Experiments in Memory, Learning, Cognition, and Perception," a program that demonstrates basic psychological principles and experiments. (CW)

  5. Announcing a Community Effort to Create an Information Model for Research Software Archives

    NASA Astrophysics Data System (ADS)

    Million, C.; Brazier, A.; King, T.; Hayes, A.

    2018-04-01

    An effort has started to create recommendations and standards for the archiving of planetary science research software. The primary goal is to define an information model that is consistent with OAIS standards.

  6. A software tool for advanced MRgFUS prostate therapy planning and follow up

    NASA Astrophysics Data System (ADS)

    van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.

    2017-03-01

    US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have recently started clinical evaluation. Although MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS on localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed, and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab, with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate the therapy success, by synchronization and display of pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.

  7. NMR-based automated protein structure determination.

    PubMed

    Würz, Julia M; Kazemi, Sina; Schmidt, Elena; Bagaria, Anurag; Güntert, Peter

    2017-08-15

    NMR spectra analysis for protein structure determination can now in many cases be performed by automated computational methods. This overview of the computational methods for NMR protein structure analysis presents recent automated methods for signal identification in multidimensional NMR spectra, sequence-specific resonance assignment, collection of conformational restraints, and structure calculation, as implemented in the CYANA software package. These algorithms are sufficiently reliable and integrated into one software package to enable the fully automated structure determination of proteins starting from NMR spectra without manual interventions or corrections at intermediate steps, with an accuracy of 1-2 Å backbone RMSD in comparison with manually solved reference structures. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Globus Quick Start Guide. Globus Software Version 1.1

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Globus Project is a community effort, led by Argonne National Laboratory and the University of Southern California's Information Sciences Institute. Globus is developing the basic software infrastructure for computations that integrate geographically distributed computational and information resources.

  9. Implementing the concurrent operation of sub-arrays in the ALMA correlator

    NASA Astrophysics Data System (ADS)

    Amestica, Rodrigo; Perez, Jesus; Lacasse, Richard; Saez, Alejandro

    2016-07-01

    The ALMA correlator processes the digitized signals from 64 individual antennas to produce a grand total of 2016 correlated baselines, with runtime-selectable lag resolution and integration time. The on-line software system can process a maximum of 125M visibilities per second, producing an archiving data rate close to one sixteenth of that figure (7.8M visibilities per second, with a network transfer limit of 60 MB/sec). Mechanisms in the correlator hardware design make it possible to split the total number of antennas in the array into smaller subsets, or sub-arrays, such that they can share correlator resources while executing independent observations. The software part of the sub-system is responsible for configuring and scheduling correlator resources in such a way that observations among independent sub-arrays occur simultaneously while internally sharing correlator resources under a cooperative arrangement. Configuration of correlator modes through the CAN-bus interface and periodic geometric delay updates are the most relevant activities to schedule concurrently while observations happen at the same time among a number of sub-arrays. For that to work correctly, the software interface to sub-arrays schedules shared correlator resources sequentially before observations actually start on each sub-array. Start times for specific observations are optimized and reported back to the higher-level observing software. Once that initial sequential phase has taken place, simultaneous executions and recording of correlated data across different sub-arrays move forward concurrently, sharing the local network to broadcast results to other software sub-systems. This paper presents an overview of the different hardware and software actors within the correlator sub-system that implement the concurrency and synchronization needed for seamless and simultaneous operation of multiple sub-arrays, the limitations stemming from the resource-sharing nature of the correlator and from the digital technology available in the correlator hardware, and the milestones so far reached by this new ALMA feature.
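
    The two-phase pattern described above (sequential configuration of shared resources, then concurrent observation) can be sketched with ordinary threading primitives. This is an illustrative toy, not the ALMA correlator software; all names and timings are hypothetical.

    ```python
    # Toy illustration: a lock serializes the configuration phase (shared
    # correlator resources, CAN-bus mode setup, delay updates), after which
    # all sub-arrays observe concurrently.
    import threading
    import time

    config_lock = threading.Lock()           # serializes shared-resource access

    def run_subarray(subarray_id: int) -> None:
        # Sequential phase: only one sub-array configures at a time.
        with config_lock:
            time.sleep(0.1)                  # stand-in for mode configuration
            start_time = time.time() + 1.0   # optimized start, reported upward
        # Concurrent phase: wait for the agreed start, then observe in
        # parallel with the other sub-arrays.
        time.sleep(max(0.0, start_time - time.time()))
        print(f"sub-array {subarray_id} observing")

    threads = [threading.Thread(target=run_subarray, args=(sid,)) for sid in range(3)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    ```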

  10. Critical Design Decisions of The Planck LFI Level 1 Software

    NASA Astrophysics Data System (ADS)

    Morisset, N.; Rohlfs, R.; Türler, M.; Meharga, M.; Binko, P.; Beck, M.; Frailis, M.; Zacchei, A.

    2010-12-01

    The PLANCK satellite, with two on-board instruments, a Low Frequency Instrument (LFI) and a High Frequency Instrument (HFI), was launched on May 14th, 2009 on an Ariane 5. The ISDC Data Centre for Astrophysics in Versoix, Switzerland has developed and maintains the Planck LFI Level 1 software for the Data Processing Centre (DPC) in Trieste, Italy. The main tasks of the Level 1 processing are to retrieve the daily available scientific and housekeeping (HK) data of the LFI instrument, the Sorption Cooler and the 4K Cooler from the Mission Operation Centre (MOC) in Darmstadt; to sort them by time and by type (detector, observing mode, etc.); to extract the spacecraft attitude information from auxiliary files; to flag the data according to several criteria; and to archive the resulting Time Ordered Information (TOI), which will then be used to produce maps of the sky in different spectral bands. The outputs of the Level 1 software are the TOI files in FITS format, later ingested into the Data Management Component (DMC) database. This software has been used during different phases of the LFI instrument development. We started by reusing some ISDC components for the LFI Qualification Model (QM), and we completely reworked the software for the Flight Model (FM). This was motivated by critical design decisions taken jointly with the DPC. The main questions were: (a) the choice of the data format: FITS or DMC? (b) the design of the pipelines: use of the Planck Process Coordinator (ProC) or a simple Perl script? (c) do we adapt the existing QM software or do we restart from scratch? The timeline and available manpower are also important issues to be taken into account. We present here the orientation of our choices and discuss their pertinence based on the experience of the final pre-launch tests and the start of real Planck LFI operations.

  11. Synesthetic art through 3-D projection: The requirements of a computer-based supermedium

    NASA Technical Reports Server (NTRS)

    Mallary, Robert

    1989-01-01

    A computer-based form of multimedia art is proposed that uses the computer to fuse aspects of painting, sculpture, dance, music, film, and other media into a one-to-one synesthesia of image and sound for spatially synchronous 3-D projection. Called synesthetic art, this conversion of many varied media into an aesthetically unitary experience determines the character and requirements of the system and its software. During the start-up phase, computer stereographic systems are unsuitable for software development. Eventually, a new type of illusory-projective supermedium will be required to achieve the needed combination of large-format projection and convincing real-life presence, and to handle the vast amount of 3-D visual and acoustic information required. The influence of the concept on the author's research and creative work is illustrated through two examples.

  12. An analysis of the sliding pressure start-up of SCWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, F.; Yang, J.; Li, H.

    In this paper, the preliminary sliding pressure start-up system and scheme of the supercritical water-cooled reactor in CGNPC (CGN-SCWR) are proposed. Thermal-hydraulic behavior during the start-up procedure was analyzed in detail by employing the advanced reactor subchannel analysis software ATHAS. The maximum cladding temperature (MCT) and core power of the fuel assembly during the whole start-up process were investigated comparatively. The results show that the recommended start-up scheme meets the design requirements from a thermal-hydraulic perspective. (authors)

  13. PhET: The Best Education Software You Can't Buy

    NASA Astrophysics Data System (ADS)

    Dubson, M.; Duncan, D. K.

    2009-12-01

    Project PhET provides free educational software in the form of stand-alone Java and Flash simulations and associated classroom materials. Our motto is "It's the best educational software that money can buy, except you can't buy it, because it's free." You can start playing with PhET sims right now at http://phet.colorado.edu and add to our 1 million hits per month. PhET originally stood for Physics Education Technology, but we now include other science fields, so PhET is now a brand name. Our site has about 80 simulations, mostly in physics and math, but also in chemistry, geology, and biology. Based on careful research and student interviews, our sims have no instructions, because no one reads instructions. These simulations can be used in lecture demonstrations, classroom activities, and homework assignments. The PhET site includes a long list of user-tested classroom activities and teacher tips.

  14. Evaluating computer capabilities in a primary care practice-based research network.

    PubMed

    Ariza, Adolfo J; Binns, Helen J; Christoffel, Katherine Kaufer

    2004-01-01

    We wanted to assess computer capabilities in a primary care practice-based research network and to understand how receptive the practices were to new ideas for automation of practice activities and research. This study was conducted among members of the Pediatric Practice Research Group (PPRG). A survey to assess computer capabilities was developed to explore hardware types, software programs, Internet connectivity and data transmission; views on privacy and security; and receptivity to future electronic data collection approaches. Of the 40 PPRG practices participating in the study during the autumn of 2001, all used IBM-compatible systems. Of these, 45% used stand-alone desktops, 40% had networked desktops, and approximately 15% used laptops and minicomputers. A variety of software packages were used, with most practices (82%) having software for some aspect of patient care documentation, patient accounting (90%), business support (60%), and management reports and analysis (97%). The main obstacles to expanding use of computers in patient care were insufficient staff training (63%) and privacy concerns (82%). If provided with training and support, most practices indicated they were willing to consider an array of electronic data collection options for practice-based research activities. There is wide variability in hardware and software use in the pediatric practice setting. Implementing electronic data collection in the PPRG would require a substantial start-up effort and ongoing training and support at the practice site.

  15. Clinical software development for the Web: lessons learned from the BOADICEA project

    PubMed Central

    2012-01-01

    Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389

  16. Clinical software development for the Web: lessons learned from the BOADICEA project.

    PubMed

    Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F

    2012-04-10

    In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.

  17. Methods for cost estimation in software project management

    NASA Astrophysics Data System (ADS)

    Briciu, C. V.; Filip, I.; Indries, I. I.

    2016-02-01

    The speed with which the processes used in the software development field have changed makes forecasting the overall costs of a software project very difficult. Many researchers have considered this task unachievable, but there is a group of scientists for whom it can be solved using already-known mathematical methods (e.g. multiple linear regression) and newer techniques such as genetic programming and neural networks. The paper presents a solution for building cost estimation models for software project management using genetic algorithms, starting from the PROMISE datasets related to the COCOMO 81 model. In the first part of the paper, a summary of the major achievements in the research area of estimating overall project costs is presented, together with a description of the existing software development process models. In the last part, a basic mathematical model for genetic programming is proposed, including a description of the chosen fitness function and chromosome representation. The perspective of the described model is linked with the current reality of software development, taking the software product life cycle as a basis, along with the current challenges and innovations in the software development area. Based on the author's experience and the analysis of existing models and product life cycles, it was concluded that estimation models should be adapted to new technologies and emerging systems, and that they depend largely on the chosen software development method.
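
    To make this setup concrete, here is a minimal sketch, not the authors' implementation: a toy genetic algorithm evolving the coefficients (a, b) of the basic COCOMO 81 effort equation, effort = a · KLOC^b, against a small effort dataset, with mean magnitude of relative error (MMRE) as the fitness. The data points and all GA parameters are illustrative stand-ins for the PROMISE/COCOMO 81 records mentioned above.

    ```python
    # Toy GA for calibrating basic-COCOMO coefficients (illustrative only).
    import random

    data = [(10.0, 24.0), (50.0, 145.0), (100.0, 342.0)]  # (KLOC, person-months), made up

    def mmre(a: float, b: float) -> float:
        """Mean magnitude of relative error over the dataset (lower is better)."""
        return sum(abs(actual - a * kloc**b) / actual for kloc, actual in data) / len(data)

    def evolve(pop_size=40, generations=200):
        pop = [(random.uniform(0.5, 5.0), random.uniform(0.8, 1.5)) for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda ab: mmre(*ab))          # rank by fitness
            survivors = pop[: pop_size // 2]            # truncation selection
            children = []
            while len(children) < pop_size - len(survivors):
                (a1, b1), (a2, b2) = random.sample(survivors, 2)
                a = 0.5 * (a1 + a2) + random.gauss(0, 0.05)   # crossover + mutation
                b = 0.5 * (b1 + b2) + random.gauss(0, 0.01)
                children.append((a, b))
            pop = survivors + children
        return min(pop, key=lambda ab: mmre(*ab))

    best_a, best_b = evolve()
    print(f"effort ~ {best_a:.2f} * KLOC^{best_b:.2f}, MMRE = {mmre(best_a, best_b):.3f}")
    ```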

  18. Optimal design of the rotor geometry of line-start permanent magnet synchronous motor using the bat algorithm

    NASA Astrophysics Data System (ADS)

    Knypiński, Łukasz

    2017-12-01

    In this paper, an algorithm for the optimization of the excitation system of line-start permanent magnet synchronous motors is presented. On the basis of this algorithm, software was developed in the Borland Delphi environment. The software consists of two independent modules: an optimization solver, and a module including the mathematical model of a synchronous motor with self-start ability. The optimization module contains the bat algorithm procedure. The mathematical model of the motor was developed in the Ansys Maxwell environment. In order to determine the functional parameters of the motor, additional scripts in the Visual Basic language were developed. Selected results of the optimization calculation are presented and compared with results from the particle swarm optimization algorithm.
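
    For readers unfamiliar with the bat algorithm, the following is a compact sketch of its standard form (frequency, velocity and position updates with loudness and pulse-rate adaptation). In the paper the objective is evaluated through an Ansys Maxwell motor model; here a simple test function stands in, and all parameter values are illustrative.

    ```python
    # Standard bat algorithm sketch with a stand-in objective (minimization).
    import math
    import random

    def objective(x):                        # stand-in for the motor performance metric
        return sum(xi * xi for xi in x)      # sphere function

    def bat_algorithm(dim=2, n_bats=20, iters=200,
                      fmin=0.0, fmax=2.0, alpha=0.9, gamma=0.9):
        X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_bats)]
        V = [[0.0] * dim for _ in range(n_bats)]
        loud = [1.0] * n_bats                # loudness A_i
        rate = [0.5] * n_bats                # pulse emission rate r_i
        best = min(X, key=objective)
        for t in range(1, iters + 1):
            avg_loud = sum(loud) / n_bats
            for i in range(n_bats):
                f = fmin + (fmax - fmin) * random.random()        # frequency
                V[i] = [v + (x - b) * f for v, x, b in zip(V[i], X[i], best)]
                cand = [x + v for x, v in zip(X[i], V[i])]
                if random.random() > rate[i]:                     # local walk near best
                    cand = [b + 0.01 * random.gauss(0, 1) * avg_loud for b in best]
                if random.random() < loud[i] and objective(cand) < objective(X[i]):
                    X[i] = cand                                   # accept the move
                    loud[i] *= alpha                              # quieter ...
                    rate[i] = 0.5 * (1 - math.exp(-gamma * t))    # ... and more pulses
                if objective(X[i]) < objective(best):
                    best = list(X[i])
        return best

    print(bat_algorithm())   # should approach [0, 0]
    ```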

  19. A microcomputer interface for a digital audio processor-based data recording system.

    PubMed

    Croxton, T L; Stump, S J; Armstrong, W M

    1987-10-01

    An inexpensive interface is described that performs direct transfer of digitized data from the digital audio processor and video cassette recorder based data acquisition system designed by Bezanilla (1985, Biophys. J., 47:437-441) to an IBM PC/XT microcomputer. The FORTRAN callable software that drives this interface is capable of controlling the video cassette recorder and starting data collection immediately after recognition of a segment of previously collected data. This permits piecewise analysis of long intervals of data that would otherwise exceed the memory capability of the microcomputer.

  20. A microcomputer interface for a digital audio processor-based data recording system.

    PubMed Central

    Croxton, T L; Stump, S J; Armstrong, W M

    1987-01-01

    An inexpensive interface is described that performs direct transfer of digitized data from the digital audio processor and video cassette recorder based data acquisition system designed by Bezanilla (1985, Biophys. J., 47:437-441) to an IBM PC/XT microcomputer. The FORTRAN callable software that drives this interface is capable of controlling the video cassette recorder and starting data collection immediately after recognition of a segment of previously collected data. This permits piecewise analysis of long intervals of data that would otherwise exceed the memory capability of the microcomputer. PMID:3676444

  1. Simulation Platform: a cloud-based online simulation environment.

    PubMed

    Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro

    2011-09-01

    For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have been designed solely to archive model files, but the databases should give users a chance to validate the models before downloading them. In this paper, we report our on-going project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software packages, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave, are pre-installed. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.

  2. Reprint of: Simulation Platform: a cloud-based online simulation environment.

    PubMed

    Yamazaki, Tadashi; Ikeno, Hidetoshi; Okumura, Yoshihiro; Satoh, Shunji; Kamiyama, Yoshimi; Hirata, Yutaka; Inagaki, Keiichiro; Ishihara, Akito; Kannon, Takayuki; Usui, Shiro

    2011-11-01

    For multi-scale and multi-modal neural modeling, it is necessary to handle multiple neural models described at different levels seamlessly. Database technology will become more important for these studies, specifically for downloading and handling neural models seamlessly and effortlessly. To date, conventional neuroinformatics databases have been designed solely to archive model files, but the databases should give users a chance to validate the models before downloading them. In this paper, we report our on-going project to develop a cloud-based web service for online simulation called "Simulation Platform". Simulation Platform is a cloud of virtual machines running GNU/Linux. On a virtual machine, various software packages, including developer tools such as compilers and libraries, popular neural simulators such as GENESIS, NEURON and NEST, and scientific software such as Gnuplot, R and Octave, are pre-installed. When a user posts a request, a virtual machine is assigned to the user, and the simulation starts on that machine. The user remotely accesses the machine through a web browser and carries out the simulation, without the need to install any software but a web browser on the user's own computer. Therefore, Simulation Platform is expected to eliminate impediments to handling multiple neural models that require multiple software packages. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. Software quality for 1997 - what works and what doesn't?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, C.

    1997-11-01

    This presentation provides a view of software quality for 1997 - what works and what doesn't. For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels.

  4. Integrating MPI and deduplication engines: a software architecture roadmap.

    PubMed

    Baksi, Dibyendu

    2009-03-01

    The objective of this paper is to clarify the major concepts related to the architecture and design of patient identity management software systems, so that an implementor looking to solve a specific integration problem in the context of a Master Patient Index (MPI) and a deduplication engine can address the relevant issues. The ideas presented are illustrated in the context of a reference use case from the Integrating the Healthcare Enterprise Patient Identifier Cross-referencing (IHE PIX) profile. Sound software engineering principles using the latest design paradigm of model-driven architecture (MDA) are applied to define different views of the architecture. The main contribution of the paper is a clear software architecture roadmap for implementors of patient identity management systems. Conceptual design in terms of static and dynamic views of the interfaces is provided as an example of a platform-independent model. This makes the roadmap applicable to any specific MPI solution, deduplication library or software platform. Stakeholders in need of integration of MPIs and deduplication engines can evaluate vendor-specific solutions and software platform technologies in terms of fundamental concepts and can make informed decisions that preserve investment. This also allows freedom from vendor lock-in and the ability to kick-start integration efforts based on a solid architecture.

  5. Program Setup Time and Learning Curves associated with "ready to fly" Drone Mapping Hardware and Software.

    NASA Astrophysics Data System (ADS)

    Wilcox, T.

    2016-12-01

    How quickly can students (and educators) get started using a "ready to fly" UAS and popular publicly available photogrammetric mapping software for student research at the undergraduate level? This poster presentation focuses on the challenges of starting up your own drone-mapping program for undergraduate research in a compressed timescale of three months. Particular focus will be given to learning the operation of the platforms, hardware and software interface challenges, and using these electronic systems in real-world field settings that pose a range of physical challenges to both operators and equipment. We will be using a combination of the popular DJI Phantom UAS and Pix4D mapping software to investigate mass wasting processes and potential hazards present in public lands popular with recreational users. Projects are aimed at characterizing active geological hazards that operate on short timescales and may include gully headwall erosion in Flaming Geyser State Park and potential landslide instability within Capital State Forest, both in the Puget Sound region of Washington State.

  6. GPU Based Software Correlators - Perspectives for VLBI2010

    NASA Technical Reports Server (NTRS)

    Hobiger, Thomas; Kimura, Moritaka; Takefuji, Kazuhiro; Oyama, Tomoaki; Koyama, Yasuhiro; Kondo, Tetsuro; Gotoh, Tadahiro; Amagai, Jun

    2010-01-01

    Shaped by historical separation and driven by the requirements of the PC gaming industry, Graphics Processing Units (GPUs) have evolved into massively parallel processing systems that have entered the area of non-graphics-related applications. Although a single processing core on the GPU is much slower and provides less functionality than its counterpart on the CPU, the huge number of these small processing entities outperforms the classical processors when the application can be parallelized. Thus, in recent years various radio astronomical projects have started to make use of this technology, either to realize the correlator on this platform or to establish the post-processing pipeline with GPUs. Therefore, the feasibility of GPUs as a choice for a VLBI correlator is being investigated, including the pros and cons of this technology. Additionally, a GPU-based software correlator will be reviewed with respect to energy consumption per GFlop/sec and cost per GFlop/sec.
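
    As an illustration of why correlation maps so well onto massively parallel hardware, here is a toy FX-style correlation for a single baseline; every frequency channel (and, in a full array, every baseline) is an independent multiply-accumulate. NumPy stands in for the GPU here, and the array sizes are arbitrary.

    ```python
    # Toy FX software correlator for one baseline: channelize ("F"),
    # cross-multiply and integrate ("X"). Each channel is independent work.
    import numpy as np

    rng = np.random.default_rng(0)
    n_spectra, n_chan = 1000, 1024
    x = rng.standard_normal((n_spectra, n_chan))   # voltage stream, antenna 1
    y = rng.standard_normal((n_spectra, n_chan))   # voltage stream, antenna 2

    X = np.fft.rfft(x, axis=1)                     # F step: channelize
    Y = np.fft.rfft(y, axis=1)
    visibility = (X * np.conj(Y)).mean(axis=0)     # X step: cross-multiply, integrate
    print(visibility.shape)
    ```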

  7. Software model of a machine vision system based on the common house fly.

    PubMed

    Madsen, Robert; Barrett, Steven; Wilcox, Michael

    2005-01-01

    The vision system of the common house fly has many properties, such as hyperacuity and a parallel structure, that would be advantageous in a machine vision system. A software model has been developed which is ultimately intended to be a tool to guide the design of an analog real-time vision system. The model starts by laying out cartridges over an image. The cartridges are analogous to the ommatidia of the fly's eye and each contain seven photoreceptors with a Gaussian profile. The spacing between photoreceptors is variable, providing for more or less detail as needed. The cartridges provide information on what type of features they see, and neighboring cartridges share information to construct a feature map.
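
    A minimal sketch of the sampling scheme described above, assuming a hexagonal layout: each cartridge holds seven Gaussian-profile photoreceptors (one central, six neighbors at a configurable spacing). The geometry and sigma values are illustrative assumptions, not the authors' parameters.

    ```python
    # Gaussian-profile photoreceptor sampling for one "cartridge" (illustrative).
    import numpy as np

    def gaussian_response(image, cx, cy, sigma=2.0):
        """Gaussian-weighted intensity seen by one photoreceptor at (cx, cy)."""
        h, w = image.shape
        ys, xs = np.mgrid[0:h, 0:w]
        weights = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma**2))
        return float((weights * image).sum() / weights.sum())

    def cartridge_responses(image, cx, cy, spacing=4.0):
        """Seven photoreceptors: one central, six on a hexagon of given spacing."""
        angles = np.deg2rad(np.arange(0, 360, 60))
        offsets = [(0.0, 0.0)] + [(spacing * np.cos(a), spacing * np.sin(a)) for a in angles]
        return [gaussian_response(image, cx + dx, cy + dy) for dx, dy in offsets]

    img = np.random.rand(64, 64)              # stand-in image
    print(cartridge_responses(img, 32, 32))   # seven overlapping Gaussian samples
    ```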

  8. From a paper-based to an electronic registry in physiotherapy.

    PubMed

    Buyl, Ronald; Nyssen, Marc

    2008-01-01

    During the past decade the healthcare industry has evolved from paper-based storage of clinical data into the digital era. Electronic healthcare records play a crucial role in meeting the growing need for integrated data storage and data communication. In this context a new law was issued in Belgium on December 7th, 2005, which requires physiotherapists (but also nurses and speech therapists) to keep an electronic version of the registry. This (electronic) registry contains all physiotherapeutic acts, starting from January 1, 2007. Up until that day, a paper version of the registry had to be created every month. This article describes the development of an electronic version of the registry that not only meets all legal constraints, but also makes it possible to verify the traceability and inalterability of the generated documents by means of SHA-256 codes. One of the major concerns of the process was that the rationale behind the electronic registry should conform well to the common practice of the physiotherapist. Therefore we opted for periodically recording a standardized "image" of the controllable data, from the patient database of the software system, into the XML registry messages. The proposed XSLT schema can also form a basis for the development of tools that can be used by the controlling authorities. Hopefully the electronic registry for physiotherapists will be a first step towards the future development of a fully integrated electronic physiotherapy record. By means of a certification procedure for the software systems, we succeeded in developing a user-friendly system that enables end-users who use a quality-labeled software package to automatically produce all the legally necessary documents concerning the registry. Moreover, we hope that this development will be an incentive for non-users to start working in an electronic way.
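
    The abstract does not give the message format, but the role of the SHA-256 codes can be sketched as a hash chain: if each periodic registry message incorporates the hash of its predecessor, any later alteration is detectable. A minimal illustration of that principle (the actual Belgian XML registry format is not reproduced here):

    ```python
    # Hash-chained registry messages: tamper-evidence via SHA-256 (illustrative).
    import hashlib

    def entry_hash(payload: str, prev_hash: str) -> str:
        """Hash of this registry message, chained to its predecessor."""
        return hashlib.sha256((prev_hash + payload).encode("utf-8")).hexdigest()

    def build_chain(messages):
        prev = "0" * 64                        # genesis value
        chain = []
        for msg in messages:
            prev = entry_hash(msg, prev)
            chain.append(prev)
        return chain

    def verify_chain(messages, chain) -> bool:
        return chain == build_chain(messages)  # recompute and compare

    msgs = ["2007-01: acts ...", "2007-02: acts ..."]
    chain = build_chain(msgs)
    print(verify_chain(msgs, chain))           # True
    msgs[0] = "2007-01: altered"
    print(verify_chain(msgs, chain))           # False: tampering detected
    ```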

  9. STOPP/START version 2-development of software applications: easier said than done?

    PubMed

    Anrys, Pauline; Boland, Benoît; Degryse, Jean-Marie; De Lepeleire, Jan; Petrovic, Mirko; Marien, Sophie; Dalleur, Olivia; Strauven, Goedele; Foulon, Veerle; Spinewine, Anne

    2016-09-01

    Explicit criteria, such as the STOPP/START criteria, are increasingly used both in clinical practice and in research to identify potentially inappropriate prescribing in older people. In an article on the STOPP/START criteria version 2, O'Mahony et al have pointed out the advantages of developing computerised criteria. Both clinical decision support systems to support healthcare professionals and software applications to automatically detect inappropriate prescribing in research studies can be developed. In the process of developing such tools, difficulties may occur. In the context of a research study, we have developed an algorithm to automatically apply STOPP/START criteria version 2 to our research database. We comment in this paper on different kinds of difficulties encountered and make suggestions that could be taken into account when developing the next version of the criteria. © The Author 2016. Published by Oxford University Press on behalf of the British Geriatrics Society. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. Teaching with Technology: Literature and Software.

    ERIC Educational Resources Information Center

    Allen, Denise

    1994-01-01

    Reviews five computer programs and compact disc-read only memory (CD-ROM) products designed to improve students' reading and problem-solving skills: (1) "Reading Realities" (Teacher Support Software); (2) "Kid Rhymes" (Creative Pursuits); (3) "First-Start Biographies" (Troll Associates); (4) "My Silly CD of ABCs" (Discis Classroom Editions); and…

  11. Proceedings of the Eighteenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The workshop provided a forum for software practitioners from around the world to exchange information on the measurement, use, and evaluation of software methods, models, and tools. This year, approximately 450 people attended the workshop, which consisted of six sessions on the following topics: the Software Engineering Laboratory, measurement, technology assessment, advanced concepts, process, and software engineering issues in NASA. Three presentations were given in each of the topic areas. The content of those presentations and the research papers detailing the work reported are included in these proceedings. The workshop concluded with a tutorial session on how to start an Experience Factory.

  12. Design, Fabrication and Testing of Two Different Laboratory Prototypes of CSI-based Induction Heating Units

    NASA Astrophysics Data System (ADS)

    Roy, M.; Sengupta, M.

    2012-09-01

    Induction heating is a non-contact heating process that became popular due to its energy efficiency. Current source inverter (CSI) based induction heating units are commonly used in industry. Most of these CSIs are thyristor based, since thyristors of higher ratings are easily available. As these are load-commutated apparatus, a start-up circuit is needed to initiate commutation. In this paper the design and fabrication of two laboratory prototypes are presented. The first one, an SCR-based CSI-fed controlled induction heating unit (IHU), has been tested with two different types of start-up procedures. Thereafter, the fabrication and performance of an IGBT-based CSI are compared with the thyristor-based CSI for a 2 kW, 10 kHz application. Both types of CSI were fully fabricated in the laboratory along with the IHU. Performance analysis and simulation of the two different CSIs have been done using SequelGUI2. The triggering pulses for the inverter devices (both the CSI devices and the auxiliary thyristor of the start-up circuit) have been generated, and closed-loop control has been done, on an FPGA platform built around an Altera Cyclone EP1C12Q240C device, which can be programmed using the Quartus II software. Close agreement between simulated and experimental results highlights the accuracy of the experimental work.

  13. Evolution of Secondary Software Businesses: Understanding Industry Dynamics

    NASA Astrophysics Data System (ADS)

    Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko

    The primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. This kind of outsourcing of an enterprise's internal software development activity is a common means of starting a new software business serving a vertical software market. It combines knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry, in order to quantify its interaction with the primary software industry during the period 2000-2003. On the basis of the empirical data, we present a model for the evolution of a secondary software business, which makes the industry dynamics explicit. It represents the shift from internal software developed for competitive advantage to development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.

  14. Descriptive Morphology Terms For MAMA software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruggiero, Christy E.; Porter, Reid B.

    The table on the following pages lists a set of morphology terms for describing materials. We have organized these terms by category. Software users are welcome to suggest other terms that are needed to accurately describe materials. This list is intended as an initial starting point for generating a consensus terminology list.

  15. GESTALT: A Framework for Redesign of Educational Software

    ERIC Educational Resources Information Center

    Puustinen, M.; Baker, M.; Lund, K.

    2006-01-01

    Design of educational multimedia rarely starts from scratch, but rather by attempting to reuse existing software. Although redesign has been an issue in research on evaluation and on learning objects, how it should be carried out in a principled way has remained relatively unexplored. Furthermore, understanding how empirical research on…

  16. No-Fail Software Gifts for Kids.

    ERIC Educational Resources Information Center

    Buckleitner, Warren

    1996-01-01

    Reviews children's software packages: (1) "Fun 'N Games"--nonviolent games and activities; (2) "Putt-Putt Saves the Zoo"--matching, logic games, and animal facts; (3) "Big Job"--12 logic games with video from job sites; (4) "JumpStart First Grade"--15 activities introducing typical school lessons; and (5) "Read, Write, & Type!"--progressively…

  17. EPA's science blog: "It All Starts with Science"; Article title: "EPA's Solvent Substitution Software Tool, PARIS III"

    EPA Science Inventory

    EPA's solvent substitution software tool, PARIS III is provided by the EPA for free, and can be effective and efficiently used to help environmentally-conscious individuals find better and greener solvent mixtures for many different common industrial processes. People can downlo...

  18. OpenSatKit Enables Quick Startup for CubeSat Missions

    NASA Technical Reports Server (NTRS)

    McComas, David; Melton, Ryan

    2017-01-01

    The software required to develop, integrate, and operate a spacecraft is substantial, regardless of whether it is a large or small satellite. Even getting started can be a monumental task. To solve this problem, NASA's Core Flight System (cFS), NASA's 42 spacecraft dynamics simulator, and Ball Aerospace's COSMOS ground system have been integrated into a kit called OpenSatKit that provides a complete and open-source software solution for starting a new satellite mission. Users can have a working system with flight software, dynamics simulation, and a ground command and control system up and running within hours. Every satellite mission requires three primary categories of software to function. The first is Flight Software (FSW), which provides the onboard control of the satellite and its payload(s). NASA's cFS provides a great platform for developing this software. Second, while developing a satellite on Earth, it is necessary to simulate the satellite's orbit, attitude, and actuators, to ensure that the systems that control these aspects will work correctly in the real environment. NASA's 42 simulator provides these functionalities. Finally, the ground has to be able to communicate with the satellite, monitor its performance and health, and display its data. Additionally, test scripts have to be written to verify the system on the ground. Ball Aerospace's COSMOS command and control system provides this functionality. Once OpenSatKit is up and running, the next step is to customize the platform and get it running on the end target. Starting from a fully working system makes porting the cFS from Linux to a user's platform much easier. An example Raspberry Pi target is included in the kit so users can gain experience working with a low-cost hardware target. All users can benefit from OpenSatKit, but the greatest impact and benefits will be for SmallSat missions with constrained budgets and small software teams. This paper describes OpenSatKit's system design, the steps necessary to retarget the system to the Raspberry Pi, and future plans. OpenSatKit is a free, fully functional spacecraft software system that we hope will greatly benefit the SmallSat community.

  19. Computational Infrastructure for Geodynamics (CIG)

    NASA Astrophysics Data System (ADS)

    Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.

    2004-12-01

    Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts, and although this approach has proven successful, its limitations are now starting to show as we try to share codes and algorithms or when we want to recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community, from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to leverage and develop long-term strategic partnerships with open-source development efforts within the larger thrusts of scientific computing and geoinformatics. These strategic partnerships are essential as the frontier has moved into multi-scale and multi-physics problems in which many investigators now want to use simulation software for data interpretation, data assimilation, and hypothesis testing.

  20. Variations in Sleep and Performance by Duty Start Time in Short Haul Operations

    NASA Technical Reports Server (NTRS)

    Flynn-Evans, Erin

    2016-01-01

    Prior studies have confirmed that commercial airline pilots experience circadian phase shifts and short sleep duration following travel with layovers in different time zones. Few studies have examined the impact of early and late starts on the sleep and circadian phase of airline pilots who return to their domicile after each duty period. We recruited 44 pilots (4 female) from a short-haul commercial airline to participate in a study examining sleep and circadian phase over four duty schedules (baseline, early starts, mid-day starts, late starts). Each duty schedule was five days long, separated by three rest days. Participants completed the rosters in the same order. Sleep outcomes were estimated using wrist-worn actigraphy (Actiware software, Respironics, Bend, OR) and daily sleep diaries. Thirteen participants volunteered to collect urine samples for the assessment of 6-sulfatoxymelatonin (aMT6s). Urine samples were collected in four-hourly bins during the day and eight-hourly bins during sleep episodes, for 24 hours immediately following each experimental duty schedule. The aMT6s results were fit to a cosine in order to obtain the acrophase and thereby estimate circadian phase. Univariate statistics were calculated for acrophase changes, schedule start times and sleep times. All statistical analyses were computed using SAS software (Cary, NC).
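
    The acrophase estimation described above is a standard cosinor fit, which reduces to linear least squares. A minimal sketch with a 24-hour period and illustrative (not study) data:

    ```python
    # Cosinor fit: y ~ M + beta*cos(wt) + gamma*sin(wt), linear in (M, beta, gamma).
    import numpy as np

    t = np.array([2, 6, 10, 14, 18, 22], dtype=float)   # bin midpoints (hours)
    y = np.array([30, 18, 8, 6, 12, 25], dtype=float)   # aMT6s (arbitrary units)

    omega = 2 * np.pi / 24.0
    A = np.column_stack([np.ones_like(t), np.cos(omega * t), np.sin(omega * t)])
    M, beta, gamma = np.linalg.lstsq(A, y, rcond=None)[0]

    amplitude = np.hypot(beta, gamma)
    acrophase_h = (np.arctan2(gamma, beta) / omega) % 24   # time of fitted peak
    print(f"mesor={M:.1f}, amplitude={amplitude:.1f}, acrophase={acrophase_h:.1f} h")
    ```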

  1. WalkThrough Example Procedures for MAMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruggiero, Christy E.; Gaschen, Brian Keith; Bloch, Jeffrey Joseph

    This documentation is a growing set of walk-through examples of analyses using the MAMA V2.0 software. It does not cover all the features or possibilities of the MAMA software, but will address using many of the basic analysis tools to quantify particle size and shape in an image. This document will continue to evolve as additional procedures and examples are added. The starting assumption is that the MAMA software has been successfully installed.

  2. Proceedings of the Fourth International Workshop on a Research Agenda for Maintenance and Evolution of Service-Oriented Systems (MESOA 2010)

    DTIC Science & Technology

    2011-09-01

    service-oriented systems • Software-as-a-Service (SaaS) • social network infrastructures • Internet marketing • mobile computing • context awareness...Maintenance and Evolution of Service-Oriented Systems (MESOA 2010), organized by members of the Carnegie Mellon Software Engineering Institute's...CMU/SEI-2011-SR-008...Workshop Introduction: The Software Engineering Institute (SEI) started developing a service-oriented architecture

  3. Algorithms and software used in selecting structure of machine-training cluster based on neurocomputers

    NASA Astrophysics Data System (ADS)

    Romanchuk, V. A.; Lukashenko, V. V.

    2018-05-01

    A technique for the functioning of a control system for a computing cluster based on neurocomputers is proposed. Particular attention is paid to the method of choosing the structure of the computing cluster, because the existing methods are not effective for this specialized hardware base - neurocomputers, which are highly parallel computing devices with an architecture different from the von Neumann architecture. The developed algorithm for choosing the computational structure of a cloud cluster is described, starting from the direction of data transfer in the flow control graph of the program and its adjacency matrix.

  4. Energy reconstruction of hadrons in highly granular combined ECAL and HCAL systems

    NASA Astrophysics Data System (ADS)

    Israeli, Y.

    2018-05-01

    This paper discusses the hadronic energy reconstruction of two combined electromagnetic and hadronic calorimeter systems using physics prototypes of the CALICE collaboration: the silicon-tungsten electromagnetic calorimeter (Si-W ECAL) with the scintillator-SiPM based analog hadron calorimeter (AHCAL), and the scintillator-tungsten electromagnetic calorimeter (ScECAL) with the AHCAL. These systems were operated in hadron beams at CERN and FNAL, permitting the study of the performance of combined ECAL and HCAL systems. Two techniques for the energy reconstruction are used: a standard reconstruction based on calibrated sub-detector energy sums, and one based on a software compensation algorithm making use of the local energy density information provided by the high granularity of the detectors. The software compensation-based algorithm improves the hadronic energy resolution by up to 30% compared to the standard reconstruction. The combined-system data show energy resolutions comparable to those achieved for data with showers starting only in the AHCAL, and therefore demonstrate the success of the inter-calibration of the different sub-systems, despite their different geometries and different readout technologies.
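
    The software compensation idea summarized above can be written schematically. In one parametrization used in the calorimetry literature (the exact functional form and binning in this analysis may differ), each hit energy E_i is weighted by a function of its local energy density ρ_i:

    ```latex
    E_{\mathrm{reco}} = \sum_{i} \omega(\rho_i)\, E_i,
    \qquad \omega(\rho) = p_1\, e^{p_2 \rho} + p_3
    ```

    with the parameters p_1, p_2, p_3 determined by minimizing the spread of E_reco around the known beam energy, so that dense, electromagnetic-like deposits are down-weighted relative to sparse hadronic ones.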

  5. Steamer II: Steamer prototype component inventory and user interface commands. Technical report, 1988-1989

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickieson, J.L.; Thode, W.F.; Newbury, K.

    1988-12-01

    Over the last several years, the Navy Personnel Research and Development Center has produced a prototype simulation of a 1200-psi steam plant. This simulation, called Steamer, is installed on an expensive Symbolics minicomputer at the Surface Warfare Officers School, Pacific, Coronado, California. The fundamental research goal of the Steamer prototype system was to evaluate the potential of what was then new artificial intelligence (AI) hardware and software technology for supporting the construction of computer-based training systems using graphic representations of complex, dynamic systems. The area of propulsion engineering was chosen for a number of reasons. This document describes the Steamer prototype system components and user interface commands and establishes a starting point for designing, developing, and implementing Steamer II. Careful examination of the actual program code produced an inventory that describes the hardware, system software, application software, and documentation for the Steamer prototype system. Exercising all menu options systematically produced an inventory of all Steamer prototype user interface commands.

  6. Open access for ALICE analysis based on virtualization technology

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Gheata, M.; Schutz, Y.

    2015-12-01

    Open access is an important lever for long-term data preservation in a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime, it is crucial that third-party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE-specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.

  7. Overview of software development at the parabolic dish test site

    NASA Technical Reports Server (NTRS)

    Miyazono, C. K.

    1985-01-01

    The development history of the data acquisition and data analysis software is discussed. The software development occurred between 1978 and 1984 in support of solar energy module testing at the Jet Propulsion Laboratory's Parabolic Dish Test Site, located within Edwards Test Station. The development went through incremental stages, starting with a simple single-user BASIC set of programs and progressing to the relatively complex multi-user FORTRAN system that was used until the termination of the project. Additional software in support of testing is discussed, including software for a meteorological subsystem and the Test Bed Concentrator Control Console interface. Conclusions and recommendations for further development are discussed.

  8. EUGENE'HOM: A generic similarity-based gene finder using multiple homologous sequences.

    PubMed

    Foissac, Sylvain; Bardou, Philippe; Moisan, Annick; Cros, Marie-Josée; Schiex, Thomas

    2003-07-01

    EUGENE'HOM is a gene prediction software for eukaryotic organisms based on comparative analysis. EUGENE'HOM is able to take into account multiple homologous sequences from more or less closely related organisms. It integrates the results of TBLASTX analysis, splice site and start codon prediction and a robust coding/non-coding probabilistic model which allows EUGENE'HOM to handle sequences from a variety of organisms. The current target of EUGENE'HOM is plant sequences. The EUGENE'HOM web site is available at http://genopole.toulouse.inra.fr/bioinfo/eugene/EuGeneHom/cgi-bin/EuGeneHom.pl.

  9. Use of Soft Computing Technologies for a Qualitative and Reliable Engine Control System for Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)

    2001-01-01

    The problem to be addressed in this paper is to explore how the use of Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance by development of a qualitative and reliable engine control system (QRECS). Specifically, this will be addressed by enhancing rocket engine control using SCT, innovative data mining tools, and sound software engineering practices used in Marshall's Flight Software Group (FSG). The principal goals for addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks), some of which are described briefly in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC), which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for the design, development, implementation, and testing of a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations. The overall intent is to demonstrate that by employing soft computing technologies, the quality and reliability of the overall approach to engine controller development is further improved and vehicle safety is further ensured. The final product that this paper proposes is an approach to development of an alternative low-cost engine controller that would be capable of performing in unique vision spacecraft vehicles requiring low-cost advanced avionics architectures for autonomous operations from engine pre-start to engine shutdown.

  10. Software Carpentry: lessons learned

    PubMed Central

    Wilson, Greg

    2016-01-01

    Since its start in 1998, Software Carpentry has evolved from a week-long training course at the US national laboratories into a worldwide volunteer effort to improve researchers' computing skills. This paper explains what we have learned along the way, the challenges we now face, and our plans for the future. PMID:24715981

  11. The Scientific Uplink and User Support System for SIRTF

    NASA Astrophysics Data System (ADS)

    Heinrichsen, I.; Chavez, J.; Hartley, B.; Mei, Y.; Potts, S.; Roby, T.; Turek, G.; Valjavec, E.; Wu, X.

    The Space Infrared Telescope Facility (SIRTF) is one of NASA's Great Observatory missions, scheduled for launch in 2001. As such its ground segment design is driven by the requirement to provide strong support for the entire astronomical community starting with the call for Legacy Proposals in early 2000. In this contribution, we present the astronomical user interface and the design of the server software that comprises the Scientific Uplink System for SIRTF. The software architecture is split into three major parts: A front-end Java application deployed to the astronomical community providing the capabilities to visualize and edit proposals and the associated lists of observations. This observer toolkit provides templates to define all parameters necessary to carry out the required observations. A specialized version of this software, based on the same overall architecture, is used internal to the SIRTF Science Center to prepare calibration and engineering observations. A Weblogic (TM) based middleware component brokers the transactions with the servers, astronomical image and catalog sources as well as the SIRTF operational databases. Several server systems perform the necessary computations, to obtain resource estimates, target visibilities and to access the instrument models for signal to noise calculations. The same server software is used internally at a later stage to derive the detailed command sequences needed by the SIRTF instruments and spacecraft to execute a given observation.

  12. Control software and electronics architecture design in the framework of the E-ELT instrumentation

    NASA Astrophysics Data System (ADS)

    Di Marcantonio, P.; Coretti, I.; Cirami, R.; Comari, M.; Santin, P.; Pucillo, M.

    2010-07-01

    During the last years the European Southern Observatory (ESO), in collaboration with other European astronomical institutes, has started several feasibility studies for the E-ELT (European Extremely Large Telescope) instrumentation and post-focal adaptive optics. The goal is to create a flexible suite of instruments to deal with the wide variety of scientific questions astronomers would like to see solved in the coming decades. In this framework the INAF-Astronomical Observatory of Trieste (INAF-AOTs) is currently responsible for carrying out the analysis and the preliminary study of the architecture of the electronics and control software of three instruments: CODEX (control software and electronics) and OPTIMOS-EVE/OPTIMOS-DIORAMAS (control software). To cope with the increased complexity and the new requirements for stability, precision, real-time latency and communication among sub-systems imposed by these instruments, new solutions have been investigated by our group. In this paper we present the proposed software and electronics architecture, based on a distributed common framework centered on the Component/Container model that uses OPC Unified Architecture as a standard layer to communicate with COTS components from three different vendors. We describe three working prototypes that have been set up in our laboratory and discuss their performance, integration complexity and ease of deployment.
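
    As an illustration of the OPC UA layer mentioned above, the sketch below reads one value from an OPC UA server using the community python-opcua package. It is a generic client, not the E-ELT framework itself, and the endpoint URL and node identifier are placeholders.

        # Generic OPC UA read (assumes: pip install opcua); URL and node id are
        # placeholders, not actual instrument addresses.
        from opcua import Client

        client = Client("opc.tcp://localhost:4840/instrument/server/")
        client.connect()
        try:
            node = client.get_node("ns=2;i=1234")   # hypothetical sensor node
            print("value:", node.get_value())
        finally:
            client.disconnect()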

  13. Applying object-oriented software engineering at the BaBar collaboration

    NASA Astrophysics Data System (ADS)

    Jacobsen, Bob; BaBar Collaboration Reconstruction Software Group

    1997-02-01

    The BaBar experiment at SLAC will start taking data in 1999. We are attempting to build its reconstruction software using good software engineering practices, including the use of object-oriented technology. We summarize our experience to date with analysis and design activities, training, CASE and documentation tools, C++ programming practice and similar topics. The emphasis is on the practical issues of simultaneously introducing new techniques to a large collaboration while under a deadline for system delivery.

  14. How Do I Start a Property Records System?

    ERIC Educational Resources Information Center

    Whyman, Wynne

    2003-01-01

    A property records system organizes data to be utilized by a camp's facilities department and integrated into other areas. Start by deciding what records to keep and allotting the time. Then develop consistent procedures, including organizing data, creating a catalog, making back-up copies, and integrating procedures. Use software tools. A good…

  15. Software framework for the upcoming MMT Observatory primary mirror re-aluminization

    NASA Astrophysics Data System (ADS)

    Gibson, J. Duane; Clark, Dusty; Porter, Dallan

    2014-07-01

    Details of the software framework for the upcoming in-situ re-aluminization of the 6.5m MMT Observatory (MMTO) primary mirror are presented. This framework includes: 1) a centralized key-value store and data structure server for data exchange between software modules, 2) a newly developed hardware-software interface for faster data sampling and better hardware control, 3) automated control algorithms based upon empirical testing, modeling, and simulation of the aluminization process, 4) re-engineered graphical user interfaces (GUIs) that use state-of-the-art web technologies, and 5) redundant relational databases for data logging. The redesign of the software framework has several objectives: 1) automated process control to provide more consistent and uniform mirror coatings, 2) optional manual control of the aluminization process, 3) modular design to allow flexibility in process control and software implementation, 4) faster data sampling and logging rates to better characterize the approximately 100-second aluminization event, and 5) synchronized "real-time" web application GUIs to provide all users with exactly the same data. The framework has been implemented as four modules interconnected by a data store/server. The four modules are integrated into two Linux system services that start automatically at boot time and remain running at all times. Performance of the software framework is assessed through extensive testing within 2.0 meter and smaller coating chambers at the Sunnyside Test Facility. The redesigned software framework helps ensure that a better performing and longer lasting coating will be achieved during the re-aluminization of the MMTO primary mirror.
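
    The "centralized key-value store and data structure server" in item 1 matches what Redis provides; assuming Redis and the redis-py client, data exchange between a producer module and the GUIs could look like the sketch below. Key names and telemetry fields are invented.

        # Publish/read coating telemetry through Redis (assumed broker; invented keys).
        import json
        import redis

        r = redis.Redis(host="localhost", port=6379, decode_responses=True)

        sample = {"t": 42.17, "chamber_pressure_torr": 2.3e-5, "filament_current_a": 95.0}
        r.set("coating:latest", json.dumps(sample))       # latest value for GUIs
        r.rpush("coating:history", json.dumps(sample))    # append-only log
        r.publish("coating:updates", json.dumps(sample))  # push to live subscribers

        latest = json.loads(r.get("coating:latest"))      # consumer side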

  16. [The use of open source software in graphic anatomic reconstructions and in biomechanic simulations].

    PubMed

    Ciobanu, O

    2009-01-01

    The objective of this study was to obtain three-dimensional (3D) images and to perform biomechanical simulations starting from DICOM images obtained by computed tomography (CT). Open-source software was used to prepare digitized 2D images of tissue sections and to create 3D reconstructions from the segmented structures. Finally, the 3D images were used in open-source software to perform biomechanical simulations. This study demonstrates the applicability and feasibility of currently available open-source software for 3D reconstruction and biomechanical simulation. The use of open-source software may improve the efficiency of investments in imaging technologies and in CAD/CAM technologies for implant and prosthesis fabrication, which otherwise require expensive specialized software.

  17. Process Management inside ATLAS DAQ

    NASA Astrophysics Data System (ADS)

    Alexandrov, I.; Amorim, A.; Badescu, E.; Burckhart-Chromek, D.; Caprini, M.; Dobson, M.; Duval, P. Y.; Hart, R.; Jones, R.; Kazarov, A.; Kolos, S.; Kotov, V.; Liko, D.; Lucio, L.; Mapelli, L.; Mineev, M.; Moneta, L.; Nassiakou, M.; Pedro, L.; Ribeiro, A.; Roumiantsev, V.; Ryabov, Y.; Schweiger, D.; Soloviev, I.; Wolters, H.

    2002-10-01

    The Process Management component of the online software of the future ATLAS experiment data acquisition system is presented. The purpose of the Process Manager is to perform basic job control of the software components of the data acquisition system. It is capable of starting, stopping and monitoring the status of those components on the data acquisition processors independent of the underlying operating system. Its architecture is designed on the basis of a server-client model using CORBA-based communication. The server part relies on C++ software agent objects acting as an interface between the local operating system and client applications. Among the major design challenges for the software agents were achieving the maximum possible degree of autonomy and creating processes that are aware of dynamic conditions in their environment and able to determine corresponding actions. Issues such as the performance of the agents in terms of the time needed for process creation and destruction, the scalability of the system with respect to the final ATLAS configuration, and minimizing the use of hardware resources were also of critical importance. Besides the details given on the architecture and the implementation, we also present scalability and performance test results for the Process Manager system.
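
    The basic job-control role described above (start, stop and status of components) can be sketched compactly. The sketch below uses plain subprocesses instead of CORBA software agents, and the component name is a placeholder.

        # Minimal start/stop/status job control, conceptually similar to the
        # Process Manager's role (not the ATLAS implementation).
        import subprocess

        class ProcessAgent:
            def __init__(self):
                self.procs = {}

            def start(self, name, argv):
                self.procs[name] = subprocess.Popen(argv)

            def status(self, name):
                p = self.procs.get(name)
                if p is None:
                    return "unknown"
                return "running" if p.poll() is None else f"exited({p.returncode})"

            def stop(self, name):
                p = self.procs.get(name)
                if p is not None and p.poll() is None:
                    p.terminate()
                    p.wait(timeout=5)

        agent = ProcessAgent()
        agent.start("readout", ["sleep", "30"])   # placeholder component
        print(agent.status("readout"))
        agent.stop("readout")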

  18. Pulsed Neutron Powder Diffraction for Materials Science

    NASA Astrophysics Data System (ADS)

    Kamiyama, T.

    2008-03-01

    Accelerator-based neutron diffraction began at Tohoku University at the end of the 1960s and was succeeded by four spallation neutron facilities with proton accelerators at the High Energy Accelerator Research Organization (Japan), Argonne National Laboratory and Los Alamos National Laboratory (USA), and Rutherford Appleton Laboratory (UK). Since then, the next-generation source has been pursued for 20 years, and 1 MW-class spallation neutron sources will appear in about three years in three parts of the world: Japan, the UK and the USA. The Japan Proton Accelerator Research Complex (J-PARC), a collaborative project between KEK and JAEA, is one of them. The aim of the talk is to describe J-PARC and the neutron diffractometers being installed at its materials and life science facility. The materials and life science facility of J-PARC has 23 neutron beam ports and will start delivering the first 25 Hz neutron beam in May 2008. Until now, more than 20 proposals have been reviewed by the review committee, and accepted proposal groups have started to obtain funding. Those proposals include five polycrystalline diffractometers: a super-high-resolution powder diffractometer (SHRPD), a 0.2%-resolution powder diffractometer of Ibaraki prefecture (IPD), an engineering diffractometer (Takumi), a high-intensity S(Q) diffractometer (VSD), and a high-pressure dedicated diffractometer. SHRPD, Takumi and IPD are being designed and constructed by a joint team of KEK, JAEA and Ibaraki University, whose members come originally from the KEK powder group. These three instruments are expected to start in 2008. VSD is a super-high-intensity diffractometer with a best resolution of Δd/d = 0.3%. VSD can measure rapid time-dependent phenomena of crystalline materials as well as glass, liquid and amorphous materials. The pair distribution function will be routinely obtained by Fourier transformation of S(Q) data. The Q range of VSD will be as wide as 0.01 Å-1
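
    The S(Q)-to-pair-distribution-function step mentioned above is the standard sine Fourier transform G(r) = (2/π) ∫ Q [S(Q) − 1] sin(Qr) dQ. A numerical sketch with a toy structure factor:

        # Reduced pair distribution function from S(Q) by trapezoidal quadrature;
        # the structure factor below is synthetic, for illustration only.
        import numpy as np

        def pdf_from_sq(Q, S, r):
            integrand = Q * (S - 1.0) * np.sin(np.outer(r, Q))
            return (2.0 / np.pi) * np.trapz(integrand, Q, axis=1)

        Q = np.linspace(0.01, 25.0, 2000)          # 1/Angstrom, assumed range
        S = 1.0 + 0.5 * np.exp(-(Q - 3.0) ** 2)    # toy structure factor
        r = np.linspace(0.5, 10.0, 200)            # Angstrom
        G = pdf_from_sq(Q, S, r)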

  19. Design and Acquisition of Software for Defense Systems

    DTIC Science & Technology

    2018-02-14

    enterprise business systems and related information technology (IT) services, the role software plays in enabling and enhancing weapons systems often ... The information in this chart was compiled from Christian Hagen, Jeff Sorenson, Steven Hurt ... understanding to make an informed choice of final architecture. The Task Force found commercial practice starts with several competing architectures and

  20. Is Chinese Software Engineering Professionalizing or Not?: Specialization of Knowledge, Subjective Identification and Professionalization

    ERIC Educational Resources Information Center

    Yang, Yan

    2012-01-01

    Purpose: This paper aims to discuss the challenge for the classical idea of professionalism in understanding the Chinese software engineering industry after giving a close insight into the development of this industry as well as individual engineers with a psycho-societal perspective. Design/methodology/approach: The study starts with the general…

  1. Robust Control for the Mercury Laser Altimeter

    NASA Technical Reports Server (NTRS)

    Rosenberg, Jacob S.

    2006-01-01

    Mercury Laser Altimeter Science Algorithms is a software system for controlling the laser altimeter aboard the Messenger spacecraft, which is to enter into orbit about Mercury in 2011. The software will control the altimeter by dynamically modifying hardware inputs for gain, threshold, channel-disable flags, range-window start location, and range-window width, by using ranging information provided by the spacecraft and noise counts from instrument hardware. In addition, because of severe bandwidth restrictions, the software also selects returns for downlink.

  2. Software development for airborne radar

    NASA Astrophysics Data System (ADS)

    Sundstrom, Ingvar G.

    Some aspects of the development of software in a modern multimode airborne nose radar are described. First, an overview of where software is used in the radar units is presented. The development phases (system design, functional design, detailed design, function verification, and system verification) are then used as the starting point for the discussion. Methods, tools, and the most important documents are described. The importance of video flight recording in the early stages and the use of a digital signal generator for performance verification are emphasized. Some future trends are discussed.

  3. Software Technology for Adaptable, Reliable Systems (STARS). Software Architecture Seminar Report: Central Archive for Reusable Defense Software (CARDS)

    DTIC Science & Technology

    1994-01-29

    other processes, but that he arrived at his results in a different manner. Batory didn't start with idioms; he performed a domain analysis and ... abstracted idioms. Through domain analysis and domain modeling, new idioms can be found and the form of architecture can be the same. It was also questioned ... Programming 5. Consensus Definition of Architecture 6. Inductive Analysis of Current Exemplars 7. VHDL (Bailor) 8. Ontological Structuring

  4. Application of Docker Swarm cluster for testing programs, developed for system of devices within paradigm of Internet of things

    NASA Astrophysics Data System (ADS)

    Shichkina, Y. A.; Kupriyanov, M. S.; Moldachev, S. O.

    2018-05-01

    Descriptions of various Internet-connected devices now appear frequently on the Internet. For the efficient operation of the Industrial Internet of Things, it is necessary to provide a modern level of data processing, starting with collecting data from devices and ending with returning results to the devices in processed form. Current Internet of Things solutions are mainly focused on centralized designs, projecting the Internet of Things onto a set of cloud-based platforms that are open but limit the ability of participants to adapt these systems to their own problems. Therefore, it is often necessary to create specialized software for specific areas of the Internet of Things. This article describes a solution to the problem of virtualizing a system of devices based on Docker. This solution allows developers to test any software on any number of devices forming a mesh.
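
    A minimal version of this device-virtualization idea can be sketched with the Docker SDK for Python: launch N containers that each emulate a device, inspect them, and tear them down. For brevity the sketch uses plain containers rather than a Swarm service, and the image name is a placeholder for a device-emulation image.

        # Launch and tear down a fleet of simulated devices (assumes: pip install
        # docker and a local "device-sim:latest" image, both hypothetical here).
        import docker

        client = docker.from_env()
        nodes = [
            client.containers.run(
                "device-sim:latest",
                detach=True,
                name=f"iot-node-{i}",
                environment={"NODE_ID": str(i)},
            )
            for i in range(5)
        ]

        for node in nodes:
            node.reload()                 # refresh cached state
            print(node.name, node.status)

        for node in nodes:                # remove the simulated mesh
            node.remove(force=True)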

  5. Through-process modelling of texture and anisotropy in AA5182

    NASA Astrophysics Data System (ADS)

    Crumbach, M.; Neumann, L.; Goerdeler, M.; Aretz, H.; Gottstein, G.; Kopp, R.

    2006-07-01

    A through-process texture and anisotropy prediction for AA5182 sheet production from hot rolling through cold rolling and annealing is reported. Thermo-mechanical process data predicted by the finite element method (FEM) package T-Pack based on the software LARSTRAN were fed into a combination of physics based microstructure models for deformation texture (GIA), work hardening (3IVM), nucleation texture (ReNuc), and recrystallization texture (StaRT). The final simulated sheet texture was fed into a FEM simulation of cup drawing employing a new concept of interactively updated texture based yield locus predictions. The modelling results of texture development and anisotropy were compared to experimental data. The applicability to other alloys and processes is discussed.

  6. The Start of a Tech Revolution

    ERIC Educational Resources Information Center

    Dyrli, Kurt O.

    2009-01-01

    We are at the start of a revolution in the use of computers, one that analysts predict will rival the development of the PC in its significance. Companies such as Google, HP, Amazon, Sun Microsystems, Sony, IBM, and Apple are orienting their entire business models toward this change, and software maker SAS has announced plans for a $70 million…

  7. NASA's Software Safety Standard

    NASA Technical Reports Server (NTRS)

    Ramsay, Christopher M.

    2005-01-01

    NASA (National Aeronautics and Space Administration) relies more and more on software to control, monitor, and verify its safety critical systems, facilities and operations. Since the 1960s there has hardly been a spacecraft (manned or unmanned) launched that did not have a computer on board providing vital command and control services. Despite this growing dependence on software control and monitoring, there has been no consistent application of software safety practices and methodology to NASA's projects with safety critical software. Led by the NASA Headquarters Office of Safety and Mission Assurance, the NASA Software Safety Standard (NASA-STD-8719.13B) has recently undergone a significant update in an attempt to provide that consistency. This paper will discuss the key features of the new NASA Software Safety Standard. It will start with a brief history of the use and development of software in safety critical applications at NASA. It will then give a brief overview of the NASA Software Working Group and the approach it took to revise the software engineering process across the Agency.

  8. Monte Carlo Simulation Tool Installation and Operation Guide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguayo Navarrete, Estanislao; Ankney, Austin S.; Berguson, Timothy J.

    2013-09-02

    This document provides information on software and procedures for Monte Carlo simulations based on the Geant4 toolkit, the ROOT data analysis software and the CRY cosmic ray library. These tools have been chosen for their application to shield design and activation studies as part of the simulation task for the Majorana Collaboration. This document includes instructions for installation, operation and modification of the simulation code in a high cyber-security computing environment, such as the Pacific Northwest National Laboratory network. It is intended as a living document and will be periodically updated. It is a starting point for information collection by an experimenter, and is not the definitive source. Users should consult with one of the authors for guidance on how to find the most current information for their needs.

  9. PC-assisted translation of photogrammetric papers

    NASA Astrophysics Data System (ADS)

    Güthner, Karlheinz; Peipe, Jürgen

    A PC-based system for machine translation of photogrammetric papers from English into German and vice versa is described. The computer-assisted translating process is not intended to create a perfect interpretation of a text but to produce a rough rendering of the content of a paper. Starting with the original text, a continuous data flow to the translated version is achieved by means of hardware (scanner, personal computer, printer) and software (OCR, translation, word processing, DTP). An essential component of the system is a photogrammetric microdictionary which is currently being established. It is based on several sources, including e.g. the ISPRS Multilingual Dictionary.

  10. EUGÈNE'HOM: a generic similarity-based gene finder using multiple homologous sequences

    PubMed Central

    Foissac, Sylvain; Bardou, Philippe; Moisan, Annick; Cros, Marie-Josée; Schiex, Thomas

    2003-01-01

    EUGÈNE'HOM is a gene prediction software for eukaryotic organisms based on comparative analysis. EUGÈNE'HOM is able to take into account multiple homologous sequences from more or less closely related organisms. It integrates the results of TBLASTX analysis, splice site and start codon prediction and a robust coding/non-coding probabilistic model which allows EUGÈNE'HOM to handle sequences from a variety of organisms. The current target of EUGÈNE'HOM is plant sequences. The EUGÈNE'HOM web site is available at http://genopole.toulouse.inra.fr/bioinfo/eugene/EuGeneHom/cgi-bin/EuGeneHom.pl. PMID:12824408

  11. Just Another Gibbs Sampler (JAGS): Flexible Software for MCMC Implementation

    ERIC Educational Resources Information Center

    Depaoli, Sarah; Clifton, James P.; Cobb, Patrice R.

    2016-01-01

    A review of the software Just Another Gibbs Sampler (JAGS) is provided. We cover aspects related to history and development and the elements a user needs to know to get started with the program, including (a) definition of the data, (b) definition of the model, (c) compilation of the model, and (d) initialization of the model. An example using a…

  12. DoD Application Store: Enabling C2 Agility?

    DTIC Science & Technology

    2014-06-01

    Framework, will include automated delivery of software patches, web applications, widgets and mobile application packages. The envisioned DoD ... Marketplace within the Ozone Widget Framework ... current needs. DoD has started to make inroads within this environment with several Programs of Record (PoR) embracing widgets and other mobile

  13. Control software for two dimensional airfoil tests using a self-streamlining flexible walled transonic test section

    NASA Technical Reports Server (NTRS)

    Wolf, S. W. D.; Goodyer, M. J.

    1982-01-01

    Operation of the Transonic Self-Streamlining Wind Tunnel (TSWT) involved on-line data acquisition with automatic wall adjustment. A tunnel run consisted of streamlining the walls from known starting contours in iterative steps and acquiring model data. Each run performs what is described as a streamlining cycle. The associated software is presented.

  14. Teachers, Computers & Kids: Recipes for Success in Early Childhood Settings. Kids and Computers, Number 1.

    ERIC Educational Resources Information Center

    Crowe, Suzy; Penney, Elaine

    This book is the first volume in the "Kids and Computers" series, a series of books designed to help adults easily use high-quality, developmentally appropriate software with children. After reviewing the basics of selected software packages (how to start the program, stop the program, move around, and use special keys) several ideas and…

  15. Keeping Things Interesting: A Reuse Case Study

    NASA Astrophysics Data System (ADS)

    Troisi, V.; Swick, R.; Seufert, E.

    2006-12-01

    Software reuse has several obvious advantages. By taking advantage of the experience and skill of colleagues one not only saves time, money and resources, but can also jump-start a project that might otherwise have floundered from the start, or not even have been possible. One of the least talked about advantages of software reuse is that it helps keep the work interesting for the developers. Reuse prevents developers from spending time and energy writing software solutions to problems that have already been solved, and frees them to concentrate on solving new problems, developing new components, and doing things that have never been done before. At the National Snow and Ice Data Center we are fortunate our user community has some unique needs that aren't met by mainstream solutions. Consequently we look for reuse opportunities wherever possible so we can focus on the tasks that add value for our user community. This poster offers a case study of one thread through a decade of reuse at NSIDC that has involved eight different development efforts to date.

  16. Connecting Primary Health Care: A Comprehensive Pilot Study.

    PubMed

    Maghsoudloo, Mehran; Abolhassani, Farid; Lotfibakhshaiesh, Nasrin

    2016-07-01

    The collection of data within primary health care facilities in Iran is essentially paper-based, focused on family health and the monitoring of infectious and non-infectious diseases. Owing to the paper-based nature of these tasks, timely decision making is difficult if not impossible. As part of an on-going electronic health record implementation project at Tehran University of Medical Sciences, and for the first time in the region, four urban healthcare facilities have been connected to their headquarters and beyond in a comprehensive pilot project covering all aspects of primary health care for the last four years. Without delving into the technical aspects of its software engineering processes, the progress of the implementation is reported, a selection of summarized data is presented, and the experience gained thus far is discussed. Four years on, if time is any measure to go by, it is safe to accept that the software architecture and electronic health record structural model implemented are robust and yet extensible. The aims and duration of a pilot study should be clearly defined prior to its start and managed until its completion. Resistance to change, and particularly to information technology, apart from its technical aspects, is also rooted in human factors.

  17. Analytical approaches to image orientation and stereo digitization applied in the Bundlab software. (Polish title: Rozwiazania analityczne zwiazane z obsluga procesu orientacji zdjec oraz wykonywaniem opracowan wektorowych w programie Bundlab)

    NASA Astrophysics Data System (ADS)

    Kolecki, J.

    2015-12-01

    The Bundlab software has been developed mainly for academic and research applications. This work can be treated as a report describing the current state of development of the program, focusing especially on the analytical solutions. First, the overall characteristics of the software are provided. Then the image orientation procedure is described, starting from relative orientation. The applied solution is based on the coplanarity equation parametrized with the essential matrix. The problem is reformulated so that it can be solved using methods of algebraic geometry, and the solution is then refined by least-squares optimization. The formation of the image block from the oriented models, as well as the absolute orientation procedure, was implemented using the Horn approach as a base algorithm. The second part of the paper is devoted to the tools and methods applied in the stereo digitization module; the solutions that support the user and improve accuracy are given. Within the paper a few exemplary applications and products are mentioned. The work finishes with concepts for further development and improvement of existing functions.
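
    A generic version of the essential-matrix relative orientation step can be sketched with OpenCV. This is not the Bundlab implementation (which reformulates the problem algebraically and refines it by least squares); the calibration matrix and matched points below are stand-ins.

        # Relative orientation from matched points via the essential matrix.
        import cv2
        import numpy as np

        K = np.array([[1500.0, 0.0, 960.0],
                      [0.0, 1500.0, 540.0],
                      [0.0, 0.0, 1.0]])                 # assumed interior orientation

        pts1 = np.random.rand(50, 2) * [1920, 1080]     # stand-ins for real matches
        pts2 = pts1 + np.random.rand(50, 2) * 5.0

        E, inliers = cv2.findEssentialMat(pts1, pts2, K,
                                          method=cv2.RANSAC, threshold=1.0)
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)  # rotation, baseline direction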

  18. Managing Scientific Software Complexity with Bocca and CCA

    DOE PAGES

    Allan, Benjamin A.; Norris, Boyana; Elwasif, Wael R.; ...

    2008-01-01

    In high-performance scientific software development, the emphasis is often on short time to first solution. Even when the development of new components mostly reuses existing components or libraries and only small amounts of new code must be created, dealing with the component glue code and software build processes to obtain complete applications is still tedious and error-prone. Component-based software meant to reduce complexity at the application level increases complexity to the extent that the user must learn and remember the interfaces and conventions of the component model itself. To address these needs, we introduce Bocca, the first tool to enable application developers to perform rapid component prototyping while maintaining robust software-engineering practices suitable to HPC environments. Bocca provides project management and a comprehensive build environment for creating and managing applications composed of Common Component Architecture components. Of critical importance for high-performance computing (HPC) applications, Bocca is designed to operate in a language-agnostic way, simultaneously handling components written in any of the languages commonly used in scientific applications: C, C++, Fortran, Python and Java. Bocca automates the tasks related to the component glue code, freeing the user to focus on the scientific aspects of the application. Bocca embraces the philosophy pioneered by Ruby on Rails for web applications: start with something that works, and evolve it to the user's purpose.

  19. A toolbox for developing bioinformatics software

    PubMed Central

    Potrzebowski, Wojciech; Puton, Tomasz; Rother, Magdalena; Wywial, Ewa; Bujnicki, Janusz M.

    2012-01-01

    Creating useful software is a major activity of many scientists, including bioinformaticians. Nevertheless, software development in an academic setting is often unsystematic, which can lead to problems associated with maintenance and long-term availability. Unfortunately, well-documented software development methodology is difficult to adopt, and technical measures that directly improve bioinformatic programming have not been described comprehensively. We have examined 22 software projects and have identified a set of practices for software development in an academic environment. We found them useful for planning a project, supporting the involvement of experts (e.g. experimentalists), and promoting higher quality and maintainability of the resulting programs. This article describes 12 techniques that facilitate a quick start into software engineering. We describe 3 of the 22 projects in detail and give many examples to illustrate the usage of particular techniques. We expect this toolbox to be useful for many bioinformatics programming projects and to the training of scientific programmers. PMID:21803787

  20. Breaking the hype cycle: using the computer effectively with learners with intellectual disabilities.

    PubMed

    Lloyd, Jan; Moni, Karen B; Jobling, Anne

    2006-06-01

    There has been huge growth in the use of information technology (IT) in classrooms for learners of all ages. It has been suggested that computers in the classroom encourage independent and self-paced learning, provide immediate feedback and improve self-motivation and self-confidence. Concurrently there is increasing interest in the role of technology in educational programs for individuals with intellectual disabilities. However, although many claims are made about the benefits of computers and software packages, there is limited evidence-based information to support these claims. Researchers are now starting to look at the specific instructional-design features hypothesised to facilitate educational outcomes, rather than over-emphasising graphics and sounds. Research undertaken as part of a post-school program (Latch-On: Literacy and Technology - Hands On) at the University of Queensland investigated the use of computers by young adults with intellectual disabilities. The aims of the research reported in this paper were to address the challenges identified in the 'hype' surrounding different pieces of educational software and to develop a means of systematically analysing software for use in teaching programs.

  1. JSBML: a flexible Java library for working with SBML.

    PubMed

    Dräger, Andreas; Rodriguez, Nicolas; Dumousseau, Marine; Dörr, Alexander; Wrzodek, Clemens; Le Novère, Nicolas; Zell, Andreas; Hucka, Michael

    2011-08-01

    The specifications of the Systems Biology Markup Language (SBML) define standards for storing and exchanging computer models of biological processes in text files. In order to perform model simulations, graphical visualizations and other software manipulations, an in-memory representation of SBML is required. We developed JSBML for this purpose. In contrast to prior implementations of SBML APIs, JSBML has been designed from the ground up for the Java programming language, and can therefore be used on all platforms supported by a Java Runtime Environment. This offers important benefits for Java users, including the ability to distribute software as Java Web Start applications. JSBML supports all SBML Levels and Versions through Level 3 Version 1, and we have strived to maintain the highest possible degree of compatibility with the popular library libSBML. JSBML also supports modules that can facilitate the development of plugins for end user applications, as well as ease migration from a libSBML-based backend. Source code, binaries and documentation for JSBML can be freely obtained under the terms of the LGPL 2.1 from the website http://sbml.org/Software/JSBML.

  2. Use of XML and Java for collaborative petroleum reservoir modeling on the Internet

    NASA Astrophysics Data System (ADS)

    Victorine, John; Watney, W. Lynn; Bhattacharya, Saibal

    2005-11-01

    The GEMINI (Geo-Engineering Modeling through INternet Informatics) is a public-domain, web-based freeware that is made up of an integrated suite of 14 Java-based software tools to accomplish on-line, real-time geologic and engineering reservoir modeling. GEMINI facilitates distant collaborations for small company and academic clients, negotiating analyses of both single and multiple wells. The system operates on a single server and an enterprise database. External data sets must be uploaded into this database. Feedback from GEMINI users provided the impetus to develop Stand Alone Web Start Applications of GEMINI modules that reside in and operate from the user's PC. In this version, the GEMINI modules run as applets, which may reside in local user PCs, on the server, or Java Web Start. In this enhanced version, XML-based data handling procedures are used to access data from remote and local databases and save results for later access and analyses. The XML data handling process also integrates different stand-alone GEMINI modules enabling the user(s) to access multiple databases. It provides flexibility to the user to customize analytical approach, database location, and level of collaboration. An example integrated field-study using GEMINI modules and Stand Alone Web Start Applications is provided to demonstrate the versatile applicability of this freeware for cost-effective reservoir modeling.

  3. Use of XML and Java for collaborative petroleum reservoir modeling on the Internet

    USGS Publications Warehouse

    Victorine, J.; Watney, W.L.; Bhattacharya, S.

    2005-01-01

    The GEMINI (Geo-Engineering Modeling through INternet Informatics) is a public-domain, web-based freeware that is made up of an integrated suite of 14 Java-based software tools to accomplish on-line, real-time geologic and engineering reservoir modeling. GEMINI facilitates distant collaborations for small company and academic clients, negotiating analyses of both single and multiple wells. The system operates on a single server and an enterprise database. External data sets must be uploaded into this database. Feedback from GEMINI users provided the impetus to develop Stand Alone Web Start Applications of GEMINI modules that reside in and operate from the user's PC. In this version, the GEMINI modules run as applets, which may reside in local user PCs, on the server, or Java Web Start. In this enhanced version, XML-based data handling procedures are used to access data from remote and local databases and save results for later access and analyses. The XML data handling process also integrates different stand-alone GEMINI modules enabling the user(s) to access multiple databases. It provides flexibility to the user to customize analytical approach, database location, and level of collaboration. An example integrated field-study using GEMINI modules and Stand Alone Web Start Applications is provided to demonstrate the versatile applicability of this freeware for cost-effective reservoir modeling. © 2005 Elsevier Ltd. All rights reserved.

  4. Arch-Axis Coefficient Optimization of Long-Span Deck-Type Concrete-Filled Steel Tubular Arch Bridge

    NASA Astrophysics Data System (ADS)

    Liu, Q. J.; Wan, S.; Liu, H. C.

    2017-11-01

    This paper is based on the Nanpuxi super-major bridge, under construction on the highway from Wencheng, Zhejiang province, to Taishun. A finite element model of the whole bridge is constructed using the Midas Civil finite element software. The most adverse load combination in the specification is taken into consideration to determine the method of calculating the arch-axis coefficient of a long-span deck-type concrete-filled steel tubular arch bridge. In doing so, this paper aims at providing a reference for similar engineering projects.
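
    For context, deck-type CFST arch bridges commonly use a catenary arch axis y(ξ) = f/(m − 1) · (cosh(kξ) − 1) with cosh k = m, where m is the arch-axis coefficient, f the rise and ξ the normalized half-span coordinate; optimizing m then amounts to evaluating candidate values in the finite element model. A small sketch with assumed span and rise (not the Nanpuxi values):

        # Catenary arch-axis ordinates for candidate arch-axis coefficients m.
        import numpy as np

        def arch_axis(xi, f, m):
            k = np.arccosh(m)             # from the boundary condition cosh(k) = m
            return f / (m - 1.0) * (np.cosh(k * xi) - 1.0)

        f = 80.0                          # assumed rise (m)
        xi = np.linspace(0.0, 1.0, 5)     # crown (0) to springing (1)
        for m in (1.2, 1.5, 1.9):         # candidate coefficients
            print(m, np.round(arch_axis(xi, f, m), 2))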

  5. Using PATIMDB to Create Bacterial Transposon Insertion Mutant Libraries

    PubMed Central

    Urbach, Jonathan M.; Wei, Tao; Liberati, Nicole; Grenfell-Lee, Daniel; Villanueva, Jacinto; Wu, Gang; Ausubel, Frederick M.

    2015-01-01

    PATIMDB is a software package for facilitating the generation of transposon mutant insertion libraries. The software has two main functions: process tracking and automated sequence analysis. The process tracking function specifically includes recording the status and fates of multiwell plates and samples in various stages of library construction. Automated sequence analysis refers specifically to the pipeline of sequence analysis starting with ABI files from a sequencing facility and ending with insertion location identifications. The protocols in this unit describe installation and use of PATIMDB software. PMID:19343706

  6. Advanced Computing Technologies for Rocket Engine Propulsion Systems: Object-Oriented Design with C++

    NASA Technical Reports Server (NTRS)

    Bekele, Gete

    2002-01-01

    This document explores the use of advanced computer technologies with an emphasis on object-oriented design to be applied in the development of software for a rocket engine to improve vehicle safety and reliability. The primary focus is on phase one of this project, the smart start sequence module. The objectives are: 1) To use current sound software engineering practices, object-orientation; 2) To improve on software development time, maintenance, execution and management; 3) To provide an alternate design choice for control, implementation, and performance.

  7. GSOSTATS Database: USAF Synchronous Satellite Catalog Data Conversion Software. User's Guide and Software Maintenance Manual, Version 2.1

    NASA Technical Reports Server (NTRS)

    Mallasch, Paul G.; Babic, Slavoljub

    1994-01-01

    The United States Air Force (USAF) provides NASA Lewis Research Center with monthly reports containing the Synchronous Satellite Catalog and the associated Two Line Mean Element Sets. The USAF Synchronous Satellite Catalog supplies satellite orbital parameters collected by an automated monitoring system and provided to Lewis Research Center as text files on magnetic tape. Software was developed to facilitate automated formatting, data normalization, cross-referencing, and error correction of Synchronous Satellite Catalog files before loading into the NASA Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS). This document contains the User's Guide and Software Maintenance Manual with information necessary for installation, initialization, start-up, operation, error recovery, and termination of the software application. It also contains implementation details, modification aids, and software source code adaptations for use in future revisions.

  8. Impact of Growing Business on Software Processes

    NASA Astrophysics Data System (ADS)

    Nikitina, Natalja; Kajko-Mattsson, Mira

    When growing their businesses, software organizations should not only put effort into developing and executing their business strategies, but also into managing and improving their internal software development processes and aligning them with business growth strategies. Only in this way can they ensure that their businesses grow in a healthy and sustainable way. In this paper, we map one software company's business growth along the course of its historical events and identify its impact on the company's software production processes and capabilities. The impact concerns benefits, challenges, problems and lessons learned. The most important lesson learned is that although business growth became a stimulus for starting to think about and improve software processes, the organization lacked guidelines to aid it in aligning its processes with business growth. Finally, the paper generates research questions providing a platform for future research.

  9. Bladder cancer mapping in Libya based on standardized morbidity ratio and log-normal model

    NASA Astrophysics Data System (ADS)

    Alhdiri, Maryam Ahmed; Samat, Nor Azah; Mohamed, Zulkifley

    2017-05-01

    Disease mapping comprises a set of statistical techniques that produce maps of rates based on estimated mortality, morbidity, and prevalence. A traditional approach to measuring the relative risk of a disease is the Standardized Morbidity Ratio (SMR), the ratio of the observed to the expected number of cases in an area; it has the greatest uncertainty when the disease is rare or the geographical area is small. Therefore, Bayesian models or statistical smoothing based on the log-normal model have been introduced to address this shortcoming of the SMR. This study estimates the relative risk for bladder cancer incidence in Libya from 2006 to 2007 based on the SMR and the log-normal model, which were fitted to data using WinBUGS software. The study starts with a brief review of these models, beginning with the SMR method and followed by the log-normal model, which is then applied to bladder cancer incidence in Libya. All results are compared using maps and tables. The study concludes that the log-normal model gives better relative risk estimates than the classical method and can overcome the SMR problem when there are no observed bladder cancer cases in an area.
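
    The SMR itself is simply observed over expected counts. The toy sketch below, with invented districts and rates, also shows the instability the paper targets: an area with zero observed cases gets SMR = 0 regardless of how small its population is.

        # SMR = observed / expected, with expected counts from a reference rate.
        observed   = {"district_a": 12, "district_b": 3, "district_c": 0}
        population = {"district_a": 150_000, "district_b": 40_000, "district_c": 9_000}
        reference_rate = 6.5e-5           # assumed incidence per person-year

        for d in observed:
            expected = reference_rate * population[d]
            print(f"{d}: O={observed[d]}, E={expected:.2f}, SMR={observed[d] / expected:.2f}")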

  10. ICEG2D (v2.0) - An Integrated Software Package for Automated Prediction of Flow Fields for Single-Element Airfoils With Ice Accretion

    NASA Technical Reports Server (NTRS)

    Thompson David S.; Soni, Bharat K.

    2001-01-01

    An integrated geometry/grid/simulation software package, ICEG2D, is being developed to automate computational fluid dynamics (CFD) simulations for single- and multi-element airfoils with ice accretions. The current version, ICEG2D (v2.0), was designed to automatically perform four primary functions: (1) generate a grid-ready surface definition based on the geometrical characteristics of the iced airfoil surface, (2) generate high-quality structured and generalized grids starting from a defined surface definition, (3) generate the input and restart files needed to run the structured grid CFD solver NPARC or the generalized grid CFD solver HYBFL2D, and (4) using the flow solutions, generate solution-adaptive grids. ICEG2D (v2.0) can be operated either in a batch mode using a script file or in an interactive mode by entering directives from a command line within a Unix shell. This report summarizes activities completed in the first two years of a three-year research and development program to address automation issues related to CFD simulations for airfoils with ice accretions. As well as describing the technology employed in the software, this document serves as a user's manual providing installation and operating instructions. An evaluation of the software is also presented.

  11. Comparison of 2 resident learning tools-interactive screen-based simulated case scenarios versus problem-based learning discussions: a prospective quasi-crossover cohort study.

    PubMed

    Rajan, Shobana; Khanna, Ashish; Argalious, Maged; Kimatian, Stephen J; Mascha, Edward J; Makarova, Natalya; Nada, Eman M; Elsharkawy, Hesham; Firoozbakhsh, Farhad; Avitsian, Rafi

    2016-02-01

    Simulation-based learning is emerging as an alternative educational tool in this era of a relative shortfall of teaching anesthesiologists. The objective of the study is to assess whether screen-based (interactive computer simulated) case scenarios are more effective than problem-based learning discussions (PBLDs) in improving test scores 4 and 8 weeks after these interventions in anesthesia residents during their first neuroanesthesia rotation. Prospective, nonblinded quasi-crossover study. Cleveland Clinic. Anesthesiology residents. Two case scenarios were delivered from the Anesoft software as screen-based sessions, and parallel scripts were developed for 2 PBLDs. Each resident underwent both types of training sessions, starting with the PBLD session, and the 2 cases were alternated each month (ie, in 1 month, the screen-based intervention used case 1 and the PBLD used case 2, and vice versa for the next month). Test scores before the rotation (baseline), immediately after the rotation (4 weeks after the start of the rotation), and 8 weeks after the start of rotation were collected on each topic from each resident. The effect of training method on improvement in test scores was assessed using a linear mixed-effects model. Compared to the departmental standard of PBLD, the simulation method did not improve either the 4- or 8-week mean test scores (P = .41 and P = .40 for training method effect on 4- and 8-week scores, respectively). Resident satisfaction with the simulation module on a 5-point Likert scale showed subjective evidence of a positive impact on resident education. Screen-based simulators were not more effective than PBLD for education during the neuroanesthesia rotation in anesthesia residency. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. GST-PRIME: an algorithm for genome-wide primer design.

    PubMed

    Leister, Dario; Varotto, Claudio

    2007-01-01

    The profiling of mRNA expression based on DNA arrays has become a powerful tool to study genome-wide transcription of genes in a number of organisms. GST-PRIME is a software package created to facilitate large-scale primer design for the amplification of probes to be immobilized on arrays for transcriptome analyses, although it can also be applied in low-throughput approaches. GST-PRIME allows highly efficient, direct amplification of gene-sequence tags (GSTs) from genomic DNA (gDNA), starting from annotated genome or transcript sequences. GST-PRIME provides a user-friendly platform for automatic primer design, and despite the relative simplicity of the algorithm, experimental tests in the model plant species Arabidopsis thaliana confirmed the reliability of the software. This chapter describes the algorithm used for primer design, its input and output files, and the installation of the standalone package and its use.
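
    To make the kind of filtering a primer-design step performs concrete, the toy sketch below scans candidate 20-mers and keeps those passing simple GC-content and Wallace-rule melting-temperature checks. It is not the GST-PRIME algorithm, and the sequence is invented.

        # Toy primer filter: GC fraction plus the Wallace rule Tm = 2(A+T) + 4(G+C).
        def gc_fraction(seq):
            return (seq.count("G") + seq.count("C")) / len(seq)

        def wallace_tm(seq):
            at = seq.count("A") + seq.count("T")
            return 2 * at + 4 * (len(seq) - at)

        gst = "ATGGCGTACGTTAGCCTGAAGGCTTACCGGATCGATCGTACCGGAAGCTT"  # made-up GST
        candidates = [gst[i:i + 20] for i in range(10)]
        good = [p for p in candidates
                if 0.4 <= gc_fraction(p) <= 0.6 and 52 <= wallace_tm(p) <= 62]
        print(good)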

  13. Application of program generation technology in solving heat and flow problems

    NASA Astrophysics Data System (ADS)

    Wan, Shui; Wu, Bangxian; Chen, Ningning

    2007-05-01

    Based on a new DIY concept of software development, an automatic program-generating technology built into a software system called the Finite Element Program Generator (FEPG) provides a platform for developing programs, through which a scientific researcher can submit a specific physico-mathematical problem to the system for solution in a direct and convenient way. For solving flow and heat problems with the finite element method, stabilization techniques and fractional-step methods are adopted to overcome the numerical difficulties caused mainly by dominant convection. Several benchmark problems are given in this paper as examples to illustrate the usage and the advantages of the automatic program generation technique, including the flow in a lid-driven cavity, the starting flow in a circular pipe, the natural convection in a square cavity, and the flow past a circular cylinder. They also serve to verify the algorithms.

  14. Long distance education for croatian nurses with open source software.

    PubMed

    Radenovic, Aleksandar; Kalauz, Sonja

    2006-01-01

    The Croatian Nursing Informatics Association (CNIA) was established as a result of continuing work on promoting nursing informatics in Croatia. The main goals of CNIA are promoting nursing informatics and educating nurses about nursing informatics and the use of information technology in the nursing process. At the start of its work, CNIA developed three courses on nursing informatics, all designed with the support of long-distance education using open-source software. The courses are: A - 'From Data to Wisdom', B - 'Introduction to Nursing Informatics' and C - 'Nursing Informatics I'. Courses A and B are prerequisites for course C. The technology used to implement these online courses is based on the open-source Learning Management System (LMS) Claroline, a free online collaborative learning platform. Courses are divided into two modules/days: on the first day participants follow a classical classroom approach, and on the second day they continue with e-learning from home. These courses represent the first nursing informatics courses, and the first long-distance education for nurses, in Croatia.

  15. Tatool: a Java-based open-source programming framework for psychological studies.

    PubMed

    von Bastian, Claudia C; Locher, André; Ruflin, Michael

    2013-03-01

    Tatool (Training and Testing Tool) was developed to assist researchers with programming training software, experiments, and questionnaires. Tatool is Java-based, and thus is a platform-independent and object-oriented framework. The architecture was designed to meet the requirements of experimental designs and provides a large number of predefined functions that are useful in psychological studies. Tatool comprises features crucial for training studies (e.g., configurable training schedules, adaptive training algorithms, and individual training statistics) and allows for running studies online via Java Web Start. The accompanying "Tatool Online" platform provides the possibility to manage studies and participants' data easily with a Web-based interface. Tatool is published open source under the GNU Lesser General Public License, and is available at www.tatool.ch.
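
    One of the training features listed above, an adaptive training algorithm, can be illustrated with a tiny staircase rule. The sketch below is a generic 1-up/1-down example in Python under assumed step sizes and bounds, not Tatool's Java implementation.

      # adjust task difficulty trial by trial: up after a correct answer,
      # down after an error, keeping the trainee near threshold
      def next_level(level, correct, step=1, lo=1, hi=20):
          level = level + step if correct else level - step
          return max(lo, min(hi, level))

      level = 5
      for outcome in [True, True, False, True, False, False, True]:
          level = next_level(level, outcome)
      print(f"difficulty level after block: {level}")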

  16. Intelligent Agents for Design and Synthesis Environments: My Summary

    NASA Technical Reports Server (NTRS)

    Norvig, Peter

    1999-01-01

    This presentation gives a summary of intelligent agents for design synthesis environments. We'll start with the conclusions, and work backwards to justify them. First, an important assumption is that agents (whatever they are) are good for software engineering. This is especially true for software that operates in an uncertain, changing environment. The "real world" of physical artifacts is like that: uncertain in what we can measure, changing in that things are always breaking down, and we must interact with non-software entities. The second point is that software engineering techniques can contribute to good design. There may have been a time when we wanted to build simple artifacts containing little or no software. But modern aircraft and spacecraft are complex, and rely on a great deal of software. So better software engineering leads to better designed artifacts, especially when we are designing a series of related artifacts and can amortize the costs of software development. The third point is that agents are especially useful for design tasks, above and beyond their general usefulness for software engineering, and the usefulness of software engineering to design.

  17. Software for Improved Extraction of Data From Tape Storage

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2003-01-01

    A computer program has been written to replace the original software of Racal Storeplex Delta tape recorders, which are used at Stennis Space Center. The original software could be activated by a command-line interface only; the present software offers the option of a command-line or graphical user interface. The present software also offers the option of batch-file operation (activation by a file that contains command lines for operations performed consecutively). The present software is also more reliable than was the original software: The original software was plagued by several deficiencies that made it difficult to execute, modify, and test. In addition, when using the original software to extract data that had been recorded within specified intervals of time, the resolution with which one could control starting and stopping times was no finer than about a second (or, in some cases, several seconds). In contrast, the present software is capable of controlling playback times to within 1/100 second of times specified by the user, assuming that the tape-recorder clock is accurate to within 1/100 second.

  18. Software for Improved Extraction of Data From Tape Storage

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2002-01-01

    A computer program has been written to replace the original software of Racal Storeplex Delta tape recorders, which are still used at Stennis Space Center but have been discontinued by the manufacturer. Whereas the original software could be activated by a command-line interface only, the present software offers the option of a command-line or graphical user interface. The present software also offers the option of batch-file operation (activation by a file that contains command lines for operations performed consecutively). The present software is also more reliable than was the original software: The original software was plagued by several deficiencies that made it difficult to execute, modify, and test. In addition, when using the original software to extract data that had been recorded within specified intervals of time, the resolution with which one could control starting and stopping times was no finer than about a second (or, in some cases, several seconds). In contrast, the present software is capable of controlling playback times to within 1/100 second of times specified by the user, assuming that the tape-recorder clock is accurate to within 1/100 second.

  19. Peak Doctor v 1.0.0 Labview Version

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garner, Scott

    2014-05-29

    PeakDoctor software works interactively with its user to analyze raw gamma-ray spectroscopic data. The goal of the software is to produce a list of energies and areas of all of the peaks in the spectrum, as accurately as possible. It starts by performing an energy calibration, creating a function that describes how energy can be related to channel number. Next, the software determines which channels in the raw histogram are in the Compton continuum and which channels are parts of a peak. Then the software fits the Compton continuum with cubic polynomials. The last step is to fit all of the peaks with Gaussian functions, thus producing the list.
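
    A minimal sketch of the final fitting step is shown below: a single Gaussian peak fitted on top of a cubic continuum with scipy. It illustrates the general technique, not PeakDoctor's own code; the synthetic spectrum, initial guesses, and model form are assumptions.

      import numpy as np
      from scipy.optimize import curve_fit

      # one peak region: Gaussian peak plus a smooth cubic continuum
      def peak_model(ch, amp, mu, sigma, c0, c1, c2, c3):
          gauss = amp * np.exp(-0.5 * ((ch - mu) / sigma) ** 2)
          continuum = c0 + c1 * ch + c2 * ch**2 + c3 * ch**3
          return gauss + continuum

      rng = np.random.default_rng(0)
      ch = np.arange(400, 500, dtype=float)
      truth = peak_model(ch, 800, 450, 3.0, 50, -0.02, 0, 0)
      counts = rng.poisson(truth).astype(float)   # synthetic spectrum with counting noise

      p0 = [counts.max(), ch[np.argmax(counts)], 2.0, counts.min(), 0, 0, 0]
      popt, _ = curve_fit(peak_model, ch, counts, p0=p0)
      area = popt[0] * popt[2] * np.sqrt(2 * np.pi)  # Gaussian area = amp * sigma * sqrt(2*pi)
      print(f"centroid channel = {popt[1]:.2f}, peak area = {area:.0f} counts")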

  20. FreeTure: A Free software to capTure meteors for FRIPON

    NASA Astrophysics Data System (ADS)

    Audureau, Yoan; Marmo, Chiara; Bouley, Sylvain; Kwon, Min-Kyung; Colas, François; Vaubaillon, Jérémie; Birlan, Mirel; Zanda, Brigitte; Vernazza, Pierre; Caminade, Stephane; Gattecceca, Jérôme

    2014-02-01

    The Fireball Recovery and Interplanetary Observation Network (FRIPON) is a French project, started in 2014, that will monitor the sky using 100 all-sky cameras to detect meteors and to retrieve the related meteorites on the ground. Several meteor-detection software packages already exist, but some are proprietary and some are hardware dependent. We present here the open-source software for meteor detection to be installed on the FRIPON network's stations. The software runs on Linux with gigabit Ethernet cameras, and we plan to make it cross-platform. This paper focuses on the meteor-detection method used in the pipeline and on the software's present capabilities.
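
    As a rough illustration of the kind of detection logic such pipelines use, the sketch below flags frames in which many pixels brighten sharply relative to the previous frame. This is a generic frame-differencing example, not FreeTure's actual method; the thresholds and the synthetic frames are assumptions.

      import numpy as np

      # flag a frame as a candidate event when enough pixels brighten sharply
      def detect_transients(frames, diff_thresh=40, min_pixels=25):
          events = []
          prev = frames[0].astype(np.int16)
          for idx, frame in enumerate(frames[1:], start=1):
              cur = frame.astype(np.int16)
              changed = np.count_nonzero(cur - prev > diff_thresh)
              if changed >= min_pixels:
                  events.append(idx)
              prev = cur
          return events

      rng = np.random.default_rng(1)
      frames = rng.integers(0, 20, size=(10, 64, 64), dtype=np.uint8)
      frames[5, 30:34, 10:40] += 120   # synthetic streak in frame 5
      print(detect_transients(frames))  # -> [5]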

  1. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    NASA Astrophysics Data System (ADS)

    Zollo, Aldo

    2016-04-01

    RISS S.r.l. is a spin-off company recently born from the initiative of the research group constituting the Seismology Laboratory of the Department of Physics of the University of Naples Federico II. RISS is an innovative start-up, based on the decade-long experience of its members in earthquake monitoring systems and seismic data analysis, and its major goal is to transform the most recent innovations of scientific research into technological products and prototypes. With this aim, RISS has recently started developing new software, an elegant solution to manage and analyse seismic data and to create automatic earthquake bulletins. The software was initially developed to manage data recorded at the ISNet network (Irpinia Seismic Network), a network of seismic stations deployed in the Southern Apennines along the active fault system responsible for the November 23, 1980, MS 6.9 Irpinia earthquake. The software, however, is fully exportable and can be used to manage data from different networks, with any kind of station geometry or network configuration, and is able to provide reliable estimates of earthquake source parameters, whatever the background seismicity level of the area of interest. Here we present the real-time automated procedures and the analyses performed by the software package, which is essentially a chain of different modules, each aimed at the automatic computation of a specific source parameter. The P-wave arrival times are first detected on the real-time data stream, and the software then performs phase association and event binding. As soon as an event is automatically detected by the binder, the earthquake location coordinates and the origin time are rapidly estimated using a probabilistic, non-linear exploration algorithm. The software is then able to automatically provide three different magnitude estimates. First, the local magnitude (Ml) is computed using the peak-to-peak amplitude of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic-wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps on a Google map are produced for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, the software automatically discriminates between local earthquakes that occurred within the network and regional/teleseismic events that occurred outside it. Finally, for the largest events, if a sufficient number of P-wave polarity readings is available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database, and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with the event location and stations, as well as a table listing all the events with the associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps.
Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level analysis of seismic data, and it may represent a relevant tool not only for seismologists but also for non-expert external users who are interested in seismological data. The software is a valid tool for the automatic analysis of the background seismicity at different time scales and can be a relevant tool for the monitoring of both natural and induced seismicity.
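
    As an illustration of the first of the three magnitude estimates, the sketch below computes a station Ml from a peak Wood-Anderson amplitude using the Hutton and Boore (1987) attenuation relation and takes the network value as the median over stations. This is a textbook-style example, not the RISS implementation; the amplitudes, distances, and the choice of attenuation relation are assumptions.

      import numpy as np

      # station-level local magnitude from a peak Wood-Anderson amplitude (mm)
      # at a hypocentral distance (km), Hutton & Boore (1987) relation
      def local_magnitude(peak_wa_amplitude_mm, hypocentral_distance_km):
          a, r = peak_wa_amplitude_mm, hypocentral_distance_km
          return np.log10(a) + 1.110 * np.log10(r / 100.0) + 0.00189 * (r - 100.0) + 3.0

      # assumed per-station (amplitude, distance) pairs for illustration
      readings = [(0.8, 35.0), (0.5, 52.0), (1.1, 28.0)]
      station_ml = [local_magnitude(a, r) for a, r in readings]
      print(f"network Ml = {np.median(station_ml):.2f}")  # robust network estimate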

  2. Clinical evaluation of atlas and deep learning based automatic contouring for lung cancer.

    PubMed

    Lustberg, Tim; van Soest, Johan; Gooding, Mark; Peressutti, Devis; Aljabar, Paul; van der Stoep, Judith; van Elmpt, Wouter; Dekker, Andre

    2018-02-01

    Contouring of organs at risk (OARs) is an important but time-consuming part of radiotherapy treatment planning. The aim of this study was to investigate whether software-generated contours, created institutionally, save time when used as a starting point for manual OAR contouring for lung cancer patients. Twenty CT scans of stage I-III NSCLC patients were used to compare user-adjusted contours, initialized from either atlas-based or deep-learning autocontours, against manual delineation. The lungs, esophagus, spinal cord, heart and mediastinum were contoured for this study. The time to perform the manual tasks was recorded. With a median time of 20 min for manual contouring, the total median time saved was 7.8 min when using atlas-based contouring and 10 min for deep-learning contouring. Both atlas-based and deep-learning adjustment times were significantly lower than the manual contouring time for all OARs, except the left lung and esophagus in the atlas-based workflow. User adjustment of software-generated contours is a viable strategy to reduce the contouring time of OARs for lung radiotherapy while conforming to local clinical standards. In addition, deep-learning contouring shows promising results compared to existing solutions. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  3. Coastal zone environment measurements at Sakhalin Island using autonomous mobile robotic system

    NASA Astrophysics Data System (ADS)

    Tyugin, Dmitry; Kurkin, Andrey; Zaytsev, Andrey; Zeziulin, Denis; Makarov, Vladimir

    2017-04-01

    To perform continuous, complex measurements of environmental characteristics in coastal zones, an autonomous mobile robotic system (AMRS) was built. The main advantage of such a system compared to manual measurements is the ability to quickly relocate the equipment and start measuring. The AMRS can transport a set of sensors and an appropriate power source over long distances. The equipment installed on the AMRS includes: a modern high-tech ship's radar, «Micran», for sea-wave measurements; a multiparameter platform, WXT 520, for weather monitoring; a high-precision GPS/GLONASS receiver, OS-203, for georeferencing; a laser-scanner platform based on two Sick LMS-511 scanners, which can provide 3D distance measurements up to 80 m along the AMRS route; and a rugged quad-core fanless computer, Matrix MXE-5400, for data collection and recording. The equipment is controlled by high-performance modular software developed specially for the AMRS. In the summer of 2016 an experiment was conducted, with measurements taken in the coastal zone of Sakhalin Island (Russia). The measuring system of the AMRS was started in automatic mode under software control. As a result, a large volume of data covering different weather conditions was collected and processed into a database. Of particular interest is a period of three-point storm detected on June 2, 2016. Further work will address the processing of the measured environmental characteristics and the verification of numerical models against the collected data. The presented results were obtained with the support of the Russian president's scholarship for young scientists and graduate students №SP-193.2015.5.

  4. Parallelization of Rocket Engine System Software (Press)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet

    1996-01-01

    The main goal is to assess parallelization requirements for the Rocket Engine Numeric Simulator (RENS) project, which, aside from gathering information on liquid-propelled rocket engines and setting forth requirements, involves a large FORTRAN-based package at NASA Lewis Research Center and TDK software developed by SUBR/UWF. The ultimate aim is to develop, test, integrate, and suitably deploy a family of software packages covering various aspects and facets of liquid-propellant rocket engines. At present, all project efforts by the funding agency, NASA Lewis Research Center, and the HBCU participants are disseminated over the Internet using World Wide Web home pages. Considering the obvious expense of actual field trials, the potential benefits of software simulators are enormous. When realized, these benefits will be analogous to those provided by numerous CAD/CAM packages and flight-training simulators. According to the overall task assignments, Hampton University's role is to collect all available software, place it in a common format, assess and evaluate it, define interfaces, and provide integration. Most importantly, HU's mission is to ensure that real-time performance is achieved. This involves source-code translation, porting, and distribution. The porting will be done in two phases: first, all software will be placed on the Cray X-MP platform using FORTRAN. After testing and evaluation on the Cray X-MP, the code will be translated to C++ and ported to the parallel nCUBE platform. We are also evaluating another option: distributed processing over local area networks using Sun NFS, Ethernet, and TCP/IP. Considering the heterogeneous nature of the present software (e.g., it first started as an expert system using LISP machines and now involves FORTRAN code), the effort is expected to be quite challenging.

  5. From implant planning to surgical execution: an integrated approach for surgery in oral implantology.

    PubMed

    Chiarelli, Tommaso; Franchini, Federico; Lamma, Achille; Lamma, Evelina; Sansoni, Tommaso

    2012-03-01

    Using oral implantology software and transferring the preoperative planning into a stereolithographic model, prosthodontists can produce the related surgical guide. This procedure has some disadvantages: the invasiveness of bone-supported stents, a lack of references due to scattering, and the non-negligible cost of stereolithography. An alternative solution is presented that yields an ideal surgical stent (non-invasive, precise, and cheap). This work focuses on the third phase of a fully 3D approach to oral implant planning, which starts by CT-scanning a patient wearing a marker-equipped radiological stent, continues with purpose-built preoperative planning software, and finishes by producing the ideal surgical template. A 5-axis, bur-equipped robot has been designed that is able to reproduce the milling vectors planned by the software. Software-robot interfacing has been achieved by properly matching the stent reference frame with the software and robot coordinate systems. Invasiveness has been avoided by deriving the surgical stent from the wax-up of the mucosa-supported radiological mask. Scattering can be ignored because the surgical stent is independent of the radiographic bone structure. The production cost has been strongly reduced by avoiding the stereolithographic model. Finally, the precision of the software-robot interfacing has been validated by digitally comparing a multi-marker base and its planning transfer. Average position and orientation errors (0.283 mm ± 0.073 mm and 1.798° ± 0.496°, respectively) were significantly better than those achieved using methods based on stereolithography (1.45 mm ± 1.42 mm and 7.25° ± 2.67°, respectively, with a general best maximum translation discrepancy of about 1.1 mm). This paper describes the last step of a fully 3D approach in which implant planning can be done in a 3D environment, and the correct position, orientation and depth of the planned implants are easily computed and transferred to the surgical phase. Copyright © 2011 John Wiley & Sons, Ltd.
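
    The frame-matching step described above is, in essence, a rigid registration between two coordinate systems from matched fiducial points. The sketch below shows one standard way to compute it, the SVD-based Kabsch algorithm; it is a generic illustration under assumed marker coordinates, not the authors' implementation.

      import numpy as np

      def rigid_transform(src, dst):
          """Find rotation R and translation t minimizing ||R@src + t - dst||."""
          src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
          H = (src - src_c).T @ (dst - dst_c)         # cross-covariance matrix
          U, _, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
          t = dst_c - R @ src_c
          return R, t

      # assumed markers measured in the planning-software frame vs. the robot frame
      src = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], dtype=float)
      theta = np.deg2rad(15)
      R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                         [np.sin(theta),  np.cos(theta), 0],
                         [0, 0, 1]])
      dst = src @ R_true.T + np.array([5.0, -2.0, 1.0])
      R, t = rigid_transform(src, dst)
      print(np.allclose(R @ src.T + t[:, None], dst.T))  # True: frames are matched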

  6. Computer simulations of austenite decomposition of microalloyed 700 MPa steel during cooling

    NASA Astrophysics Data System (ADS)

    Pohjonen, Aarne; Paananen, Joni; Mourujärvi, Juho; Manninen, Timo; Larkiola, Jari; Porter, David

    2018-05-01

    We present computer simulations of austenite decomposition to ferrite and bainite during cooling. The phase transformation model is based on Johnson-Mehl-Avrami-Kolmogorov (JMAK) type equations. The model is parameterized by numerical fitting to continuous-cooling data obtained with a Gleeble thermo-mechanical simulator, and it can be used to calculate the transformation behaviour along any cooling path. The phase transformation model has been coupled with heat conduction simulations. The model includes separate parameters to account for the incubation stage and for the kinetics after the transformation has started. The incubation time is calculated by inversion of the CCT transformation start time. For the heat conduction simulations we employed our own parallelized 2-dimensional finite-difference code. In addition, the transformation model was implemented as a subroutine in the commercial finite-element software Abaqus, which allows the model to be used in various engineering applications.
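
    A compact sketch of this two-stage modelling idea is given below: the incubation stage is consumed with a Scheil additivity sum over an inverted CCT start-time curve, after which the transformed fraction follows JMAK kinetics advanced with the fictitious-time (additivity) rule. All functional forms and parameter values are invented for illustration; the paper's fitted parameters are not reproduced.

      import numpy as np

      # assumed C-shaped incubation-time curve tau(T) in seconds and an assumed
      # JMAK rate coefficient k(T); neither reproduces the paper's fitted values
      def tau(T):
          return 1.0 + 0.005 * (T - 650.0) ** 2

      def k(T):
          return 0.02 * np.exp(-((T - 600.0) / 80.0) ** 2)

      n_avrami, dt = 2.0, 0.1            # Avrami exponent, time step (s)
      T0, cooling_rate = 900.0, 20.0     # start temperature (degC), cooling rate (degC/s)

      scheil_sum, started, X, t, T = 0.0, False, 0.0, 0.0, T0
      while T > 300.0:
          if not started:
              scheil_sum += dt / tau(T)              # consume the incubation stage
              started = scheil_sum >= 1.0
          else:
              # additivity rule: fictitious time that would give the current X
              # at this temperature, then advance JMAK kinetics by dt
              t_fict = (-np.log(1.0 - X) / k(T)) ** (1.0 / n_avrami) if X > 0 else 0.0
              X = 1.0 - np.exp(-k(T) * (t_fict + dt) ** n_avrami)
          t += dt
          T = T0 - cooling_rate * t

      print(f"transformed fraction on reaching 300 degC: X = {X:.3f}")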

  7. Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project

    NASA Astrophysics Data System (ADS)

    Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo

    2017-04-01

    The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in version 2.0 (Endrizzi et al, 2014), which was released as a free, open-source software project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. This weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrated way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as its starting point the scientifically tested and published 2.0 version. This version, together with several test cases based on recently published or available GEOtop applications (Cordano and Rigon, 2013, WRR; Kollet et al, 2016, WRR), provides the baseline code and a number of reference results as benchmarks. Comparison and scientific validation can then be performed for each software re-engineering activity carried out on the package. To keep track of every single change, the package is published in its own GitHub repository, geotopmodel.github.io/geotop/, under the GPL v3.0 license. A continuous-integration mechanism using Travis-CI has been enabled on the repository's master and main development branches. The use of the CMake configuration tool and the test suite (easily managed with ctest) greatly reduces the burden of installation and enhances portability across different compilers and operating-system platforms. The package is also complemented by several software tools that provide web-based visualization of results through R packages, in particular "shiny" (Chang et al, 2016), "geotopbricks" and "geotopOptim2" (Cordano et al, 2016), which allow rapid and efficient scientific validation of new examples and tests. The software re-engineering activities are still under development. However, our first results are promising enough to eventually reach a robust and stable software project that manages a complex, state-of-the-art hydrological model like GEOtop in a flexible way and integrates it into wider workflows.

  8. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    NASA Technical Reports Server (NTRS)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.

  9. Mining Program Source Code for Improving Software Quality

    DTIC Science & Technology

    2013-01-01

    conduct static verification on the software application under analysis to detect defects around APIs. The remaining fields of this record list papers published in peer-reviewed journals that acknowledge ARO support from the start of the project, including a paper by Tao Xie, Suresh Thummalapenta, and David Lo received 05/06/2013.

  10. Industry Versus DoD: A Comparative Study of Software Reuse

    DTIC Science & Technology

    1994-09-01

    development costs and production time. By no means have they perfected reuse, but some corporations are starting to reap the benefits of their reuse...and cultural resistance (Garry, 1992). Reusable code is not a cure-all for programmers and does not always provide significant benefits. Quite often...and benefits, quality, achievable reuse goals, domain analysis, staff experience, development, and recognition of the effort involved (IEEE Software

  11. IUE Data Analysis Software for Personal Computers

    NASA Technical Reports Server (NTRS)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton, P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  12. EDENetworks: a user-friendly software to build and analyse networks in biogeography, ecology and population genetics.

    PubMed

    Kivelä, Mikko; Arnaud-Haond, Sophie; Saramäki, Jari

    2015-01-01

    The recent application of graph-based network theory analysis to biogeography, community ecology and population genetics has created a need for user-friendly software, which would allow a wider accessibility to and adaptation of these methods. EDENetworks aims to fill this void by providing an easy-to-use interface for the whole analysis pipeline of ecological and evolutionary networks starting from matrices of species distributions, genotypes, bacterial OTUs or populations characterized genetically. The user can choose between several different ecological distance metrics, such as Bray-Curtis or Sorensen distance, or population genetic metrics such as FST or Goldstein distances, to turn the raw data into a distance/dissimilarity matrix. This matrix is then transformed into a network by manual or automatic thresholding based on percolation theory or by building the minimum spanning tree. The networks can be visualized along with auxiliary data and analysed with various metrics such as degree, clustering coefficient, assortativity and betweenness centrality. The statistical significance of the results can be estimated either by resampling the original biological data or by null models based on permutations of the data. © 2014 John Wiley & Sons Ltd.
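
    To make the matrix-to-network step concrete, the sketch below turns a small site-by-species abundance table into Bray-Curtis distances and then thresholds them into a graph, choosing the smallest threshold at which the graph becomes connected as a simple stand-in for the percolation-based automatic thresholding mentioned above. The data and the connectivity criterion are illustrative assumptions, not EDENetworks code.

      import numpy as np
      import networkx as nx
      from scipy.spatial.distance import pdist, squareform

      rng = np.random.default_rng(2)
      abundances = rng.integers(0, 50, size=(8, 30))          # 8 sites x 30 species
      D = squareform(pdist(abundances, metric="braycurtis"))  # pairwise distance matrix

      def threshold_network(D, threshold):
          """Connect nodes whose pairwise distance falls below the threshold."""
          G = nx.Graph()
          n = D.shape[0]
          G.add_nodes_from(range(n))
          for i in range(n):
              for j in range(i + 1, n):
                  if D[i, j] < threshold:
                      G.add_edge(i, j, weight=D[i, j])
          return G

      # simple automatic choice: smallest threshold at which the graph connects
      for thr in np.sort(D[np.triu_indices_from(D, k=1)]):
          G = threshold_network(D, thr + 1e-12)
          if nx.is_connected(G):
              break
      mean_deg = 2 * G.number_of_edges() / G.number_of_nodes()
      print(f"threshold = {thr:.3f}, mean degree = {mean_deg:.2f}")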

  13. Using C to build a satellite scheduling expert system: Examples from the Explorer platform planning system

    NASA Technical Reports Server (NTRS)

    Mclean, David R.; Tuchman, Alan; Potter, William J.

    1991-01-01

    Recently, many expert systems have been developed in a LISP environment and then ported to the real-world C environment before the final system is delivered. This situation may require that the entire system be completely rewritten in C, and it may actually result in a system that is put together as quickly as possible with little regard for maintainability and further evolution. With the introduction of high-performance UNIX and X-Windows based workstations, many of the advantages of developing a first system in the LISP environment have become questionable. A C-based AI development effort is described which is based on a software-tools approach with emphasis on reusability and maintainability of code. The discussion starts with simple examples of how list processing can easily be implemented in C and then proceeds to the implementation of frames and objects using dynamic memory allocation. The implementation of procedures which use depth-first search, constraint propagation, context switching and a blackboard-like simulation environment is described. Techniques for managing the complexity of C-based AI software are noted, especially the object-oriented techniques of data encapsulation and incremental development. Finally, all these concepts are brought together in a description of the components of planning software called the Planning And Resource Reasoning (PARR) shell. This shell has been successfully utilized for scheduling services of the Tracking and Data Relay Satellite System for the Earth Radiation Budget Satellite since May 1987 and will be used for operations scheduling of the Explorer Platform in November 1991.

  14. Towards a Community Environmental Observation Network

    NASA Astrophysics Data System (ADS)

    Mertl, Stefan; Lettenbichler, Anton

    2014-05-01

    The Community Environmental Observation Network (CEON) is dedicated to the development of a free sensor network to collect and distribute environmental data (e.g. ground shaking, climate parameters). The data collection will be done with contributions from citizens, research institutions and public authorities such as communities or schools. This will lead to a large, freely available database which can be used for public information, research, the arts, and more. To start a free sensor network, the most important step is to provide easy access to free data-collection and data-distribution tools, and the initial aims of the CEON project are dedicated to developing these tools. A high-quality data logger based on open hardware and free software is being developed, and a suite of existing free software for near-real-time data communication and distribution over the Internet will be assembled. The development focuses foremost on the collection of data related to the deformation of the earth (such as ground shaking and the surface displacement of mass movements and glaciers) and on the collection of climate data; extension to other measurements will be considered in the design. The data logger is built using open hardware prototyping platforms such as the BeagleBone Black and Arduino. Its main features are: a 24-bit analog-to-digital converter; a GPS module for time reference and positioning; wireless mesh networking using Optimized Link State Routing; near-real-time data transmission and communication; and near-real-time differential GNSS positioning using the RTKLIB software. The CEON project is supported by the Internet Foundation Austria (IPA) within the NetIdee 2013 call.

  15. Spindle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2013-04-04

    Spindle is software infrastructure that solves file-system scalability problems associated with starting dynamically linked applications in HPC environments. When an HPC application starts up thousands of processes at once, and those processes simultaneously access a shared file system to look for shared libraries, it can cause significant performance problems for both the application and other users. Spindle scalably coordinates the distribution of shared libraries to an application to avoid hammering the shared file system.

  16. Effects of Home and School Computer Use on School Readiness and Cognitive Development among Head Start Children: A Randomized Controlled Pilot Trial

    ERIC Educational Resources Information Center

    Li, Xiaoming; Atkins, Melissa S.; Stanton, Bonita

    2006-01-01

    Data from 122 Head Start children were analyzed to examine the impact of computer use on school readiness and psychomotor skills. Children in the experimental group were given the opportunity to work on a computer for 15-20 minutes per day with their choice of developmentally appropriate educational software, while the control group received a…

  17. Developing Teaching Material Software Assisted for Numerical Methods

    NASA Astrophysics Data System (ADS)

    Handayani, A. D.; Herman, T.; Fatimah, S.

    2017-09-01

    The NCTM vision highlights two imperatives for school mathematics: knowing the mathematics of the 21st century, and continuing to improve mathematics education to answer the challenges of a changing world. One of the competencies associated with the great challenges of the 21st century is the use of aids and tools (including IT), such as knowing the range of tools available for mathematical activity. One of the significant challenges in mathematics learning is how to teach students abstract concepts. Here, technology in the form of mathematics learning software can be used more widely to embed abstract concepts in mathematics. In mathematics learning, the use of mathematical software can make high-level mathematical activity easier for students to take in. Technology can strengthen student learning by delivering numerical, graphic, and symbolic content without students spending time on complex manual computation. The purpose of this research is to design and develop software-assisted teaching materials for numerical methods. The development process starts with the defining step, in which the learning material is specified based on information from the early analysis of learners, materials, and supporting tasks; this is followed by the design step and, finally, the development step. The resulting software-assisted teaching materials for numerical methods are valid in content, and the validators assessed the materials as good and usable with little revision.

  18. How can GPs drive software changes to improve healthcare for Aboriginal and Torres Strait Islanders peoples?

    PubMed

    Kehoe, Helen

    2017-01-01

    Changes to the software used in general practice could improve the collection of the Aboriginal and Torres Strait Islander status of all patients, and boost access to healthcare measures specifically for Aboriginal and Torres Strait Islander peoples provided directly or indirectly by general practitioners (GPs). Despite longstanding calls for improvements to general practice software to better support Aboriginal and Torres Strait Islander health, little change has been made. The aim of this article is to promote software improvements by identifying desirable software attributes and encouraging GPs to promote their adoption. Establishing strong links between collecting Aboriginal and Torres Strait Islander status, clinical decision supports, and uptake of GP-mediated health measures specifically for Aboriginal and Torres Strait Islander peoples - and embedding these links in GP software - is a long overdue reform. In the absence of government initiatives in this area, GPs are best placed to advocate for software changes, using the model described here as a starting point for action.

  19. The control system of a 2kW@20K helium refrigerator

    NASA Astrophysics Data System (ADS)

    Pan, W.; Wu, J. H.; Li, Qing; Liu, L. Q.; Li, Qiang

    2017-12-01

    The automatic control of a helium refrigerator covers three aspects: one-button start and stop control, safety protection control, and cooling-capacity control. The 2kW@20K helium refrigerator's control system uses the SIEMENS PLC S7-300, its related programming and configuration software Step7, and the industrial monitoring software WinCC to realize dynamic process control, real-time data monitoring, safety interlock control, and optimal control of the cooling capacity. This paper first describes the control architecture of the whole system in detail, including the communication configuration and the equipment involved; it then introduces the sequence-control strategy for the dynamic processes, including the machine's start and stop control modes and its safety interlock strategy; finally, it presents the precise control strategy for the machine's cooling capacity. The complete system achieves one-button starting and stopping, automatic fault protection and stable running at the target cooling capacity, and it helped complete the cold-helium pressurization test of aerospace products.

  20. An optimization program based on the method of feasible directions: Theory and users guide

    NASA Technical Reports Server (NTRS)

    Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.

    1994-01-01

    The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960s, many considerations are involved in its actual implementation as a computer code. The code includes a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed with a focus on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems with known solutions obtained by other optimization codes. The code has proved robust, solving a variety of problems from different starting points; however, it is inefficient in that it takes considerable CPU time compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.
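
    The heart of any feasible-directions code is the direction-finding subproblem. The sketch below poses the classical Zoutendijk form as a linear program (the report itself solves a quadratic program with an interior method, so this is a simplified stand-in): find a direction that is simultaneously downhill for the objective and inward for the active constraints. The example gradients are assumptions.

      import numpy as np
      from scipy.optimize import linprog

      def find_direction(grad_f, active_grads):
          """Zoutendijk subproblem: maximize sigma subject to
          grad_f . d + sigma <= 0 and grad_g . d + sigma <= 0 for active g."""
          n = len(grad_f)
          c = np.zeros(n + 1); c[-1] = -1.0                   # minimize -sigma
          rows = [np.append(grad_f, 1.0)]                     # descent condition
          rows += [np.append(g, 1.0) for g in active_grads]   # feasibility conditions
          bounds = [(-1.0, 1.0)] * n + [(0.0, None)]          # normalize d, sigma >= 0
          res = linprog(c, A_ub=np.array(rows), b_ub=np.zeros(len(rows)), bounds=bounds)
          d, sigma = res.x[:n], res.x[-1]
          return d, sigma  # sigma > 0 means a usable feasible direction exists

      # f(x) = x1^2 + x2^2 with active constraint g(x) = -x1 <= 0 at x = (0, 2)
      d, sigma = find_direction(np.array([0.0, 4.0]), [np.array([-1.0, 0.0])])
      print(d, sigma)  # direction points into the feasible region and downhill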

  1. Drug carrier in cancer therapy: A simulation study based on magnetic carrier substances

    NASA Astrophysics Data System (ADS)

    Adam, Tijjani; Dhahi, Th S.; Mohammed, Mohammed; Hashim, U.; Noriman, N. Z.; Dahham, Omar S.

    2017-09-01

    The principle of the magnetic carrier is to serve as a medium that delivers a drug directly to a specific site in order to kill tumor cells. Generally, there are seven stages of cancer, and in most patients the disease is only detected once it reaches stage four, by which point the cancer is difficult to destroy or cure. At an earlier stage, there is a real probability of destroying the tumor cells completely by sending the drug through a magnetic carrier directly to the nerve. Another way to destroy a tumor completely is by using deoxyribonucleic acid (DNA). This project is a simulation study based on magnetic carrier substances, carried out with the COMSOL Multiphysics software. The simulation model represents a permanent magnet, a blood vessel, surrounding tissue and air in 2D. The results show that the magnetic flux is high when the carrier is launched, decreases with distance to a minimum at about 0.009 m, and then increases again up to a maximum at 0.018 m.

  2. Mission Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Pisaich, Gregory; Flueckiger, Lorenzo; Neukom, Christian; Wagner, Mike; Buchanan, Eric; Plice, Laura

    2007-01-01

    The Mission Simulation Toolkit (MST) is a flexible software system for autonomy research. It was developed as part of the Mission Simulation Facility (MSF) project that was started in 2001 to facilitate the development of autonomous planetary robotic missions. Autonomy is a key enabling factor for robotic exploration. There has been a large gap between autonomy software (at the research level), and software that is ready for insertion into near-term space missions. The MST bridges this gap by providing a simulation framework and a suite of tools for supporting research and maturation of autonomy. MST uses a distributed framework based on the High Level Architecture (HLA) standard. A key feature of the MST framework is the ability to plug in new models to replace existing ones with the same services. This enables significant simulation flexibility, particularly the mixing and control of fidelity level. In addition, the MST provides automatic code generation from robot interfaces defined with the Unified Modeling Language (UML), methods for maintaining synchronization across distributed simulation systems, XML-based robot description, and an environment server. Finally, the MSF supports a number of third-party products including dynamic models and terrain databases. Although the communication objects and some of the simulation components that are provided with this toolkit are specifically designed for terrestrial surface rovers, the MST can be applied to any other domain, such as aerial, aquatic, or space.

  3. Rail-CR : railroad cognitive radio.

    DOT National Transportation Integrated Search

    2012-12-01

    Robust, reliable, and interoperable wireless communication devices or technologies are vital to the success of positive train control (PTC) systems. Accordingly, the railway industry has started adopting software-defined radios (SDRs) for packet-data...

  4. SiFAP: a Simple Sub-Millisecond Astronomical Photometer

    NASA Astrophysics Data System (ADS)

    Ambrosino, F.; Meddi, F.; Nesci, R.; Rossi, C.; Sclavi, S.; Bruni, I.

    2013-09-01

    A new fast photometer based on SiPM technology has been under development at the University of Rome "La Sapienza" since 2009. A first prototype was successfully tested by observing the Crab pulsar at the Loiano telescope of the Bologna Observatory. In this paper we describe the improvements made to the instrument: new cooled commercial sensors, a new version of our custom dedicated electronics, and upgraded timing-control software. Finally, we report the results obtained with this instrument on the Crab pulsar at the Loiano telescope in December 2012, demonstrating its performance and capabilities.

  5. Allele-sharing models: LOD scores and accurate linkage tests.

    PubMed

    Kong, A; Cox, N J

    1997-11-01

    Starting with a test statistic for linkage analysis based on allele sharing, we propose an associated one-parameter model. Under general missing-data patterns, this model allows exact calculation of likelihood ratios and LOD scores and has been implemented by a simple modification of existing software. Most important, accurate linkage tests can be performed. Using an example, we show that some previously suggested approaches to handling less than perfectly informative data can be unacceptably conservative. Situations in which this model may not perform well are discussed, and an alternative model that requires additional computations is suggested.
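
    As a toy illustration of a one-parameter allele-sharing test, the sketch below maximizes a log10 likelihood ratio of the form sum of log10(1 + delta*Z_i) over a single sharing parameter delta and reports the resulting LOD score. The sharing scores and the bound on delta are invented for illustration; this is not the authors' implementation.

      import numpy as np
      from scipy.optimize import minimize_scalar

      # invented normalized allele-sharing scores, one per pedigree
      Z = np.array([1.2, 0.4, -0.3, 0.9, 1.5, -0.1, 0.7, 0.2])

      def neg_log10_lr(delta):
          # log10 likelihood ratio of the delta-model against delta = 0 (no linkage)
          terms = 1.0 + delta * Z
          if np.any(terms <= 0):
              return np.inf
          return -np.sum(np.log10(terms))

      res = minimize_scalar(neg_log10_lr, bounds=(0.0, 0.6), method="bounded")
      print(f"delta-hat = {res.x:.3f}, LOD = {-res.fun:.3f}")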

  6. Allele-sharing models: LOD scores and accurate linkage tests.

    PubMed Central

    Kong, A; Cox, N J

    1997-01-01

    Starting with a test statistic for linkage analysis based on allele sharing, we propose an associated one-parameter model. Under general missing-data patterns, this model allows exact calculation of likelihood ratios and LOD scores and has been implemented by a simple modification of existing software. Most important, accurate linkage tests can be performed. Using an example, we show that some previously suggested approaches to handling less than perfectly informative data can be unacceptably conservative. Situations in which this model may not perform well are discussed, and an alternative model that requires additional computations is suggested. PMID:9345087

  7. Executable medical guidelines with Arden Syntax-Applications in dermatology and obstetrics.

    PubMed

    Seitinger, Alexander; Rappelsberger, Andrea; Leitich, Harald; Binder, Michael; Adlassnig, Klaus-Peter

    2016-08-12

    Clinical decision support systems (CDSSs) are being developed to assist physicians in processing extensive data and new knowledge based on recent scientific advances. Structured medical knowledge in the form of clinical alerts or reminder rules, decision trees or tables, clinical protocols or practice guidelines, score algorithms, and others constitutes the core of CDSSs. Several medical knowledge representation and guideline languages have been developed for the formal computerized definition of such knowledge. One of these languages is Arden Syntax for Medical Logic Systems, an International Health Level Seven (HL7) standard whose development started in 1989 and whose latest version, 2.10, was presented in 2014. In the present report we discuss Arden Syntax as a modern medical knowledge representation and processing language, and show that this language is not only well suited to define clinical alerts, reminders, and recommendations, but can also be used to implement and process computerized medical practice guidelines. We describe how contemporary software such as Java, server technologies, web services, and XML is used to implement CDSSs based on Arden Syntax, with special emphasis on clinical decision support (CDS) that employs practice guidelines as its clinical knowledge base. Two guideline-based applications using Arden Syntax for medical knowledge representation and processing were developed. The first is a software platform for implementing practice guidelines in dermatology; it employs fuzzy set theory and fuzzy logic to represent linguistic and propositional uncertainty in medical data, knowledge, and conclusions. The second implements a reminder system based on clinically published standard operating procedures in obstetrics to prevent deviations from state-of-the-art care; it generates a to-do list of necessary actions specifically tailored to the gestational week, labor, or delivery. Today, with the latest versions of Arden Syntax and the application of contemporary software development methods, Arden Syntax has become a powerful and versatile medical knowledge representation and processing language, well suited to implementing a large range of CDSSs, including those based on clinical practice guidelines. Moreover, such CDS can be provided and shared as a service by different medical institutions, redefining the sharing of medical knowledge. Arden Syntax is also highly flexible and gives developers the freedom to use up-to-date software design and programming patterns for external patient data access. Copyright © 2016. Published by Elsevier B.V.
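
    To give a flavour of the evoke/logic/action structure of a Medical Logic Module, the sketch below encodes two obstetric reminder rules in Python rather than in Arden Syntax itself; the patient fields, week thresholds, and care items are illustrative assumptions, not the published standard operating procedures.

      from dataclasses import dataclass

      @dataclass
      class Patient:
          gestational_week: int
          gbs_screen_done: bool
          anti_d_given: bool
          rh_negative: bool

      def obstetric_todo(p: Patient):
          """logic slot: compare patient data against week-specific care items."""
          todo = []
          if p.rh_negative and not p.anti_d_given and p.gestational_week >= 28:
              todo.append("administer anti-D prophylaxis")
          if not p.gbs_screen_done and 35 <= p.gestational_week <= 37:
              todo.append("perform GBS screening culture")
          return todo

      # evoke slot: run on each data update; action slot: show the to-do list
      print(obstetric_todo(Patient(36, gbs_screen_done=False,
                                   anti_d_given=False, rh_negative=True)))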

  8. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software activities combined. Software design methods should be the starting point for alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect: the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for the measurement of software complexity, which were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
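
    One simple way to build a relative complexity ranking of the kind described, sketched below under assumed metrics and module names (the paper's actual models are not reproduced), is to standardize several per-module measures and average them into a composite score.

      import numpy as np

      modules = ["parser", "scheduler", "io", "ui", "math"]
      metrics = np.array([            # rows: modules; cols: LOC, cyclomatic, fan-out
          [1200, 35, 12],
          [2400, 58, 20],
          [400,  10,  4],
          [1800, 22, 15],
          [900,  40,  6],
      ], dtype=float)

      z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)  # z-score each metric
      relative_complexity = z.mean(axis=1)                        # composite score

      for name, score in sorted(zip(modules, relative_complexity),
                                key=lambda pair: -pair[1]):
          print(f"{name:10s} {score:+.2f}")   # most maintenance-prone modules first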

  9. Segmentation of human brain using structural MRI.

    PubMed

    Helms, Gunther

    2016-04-01

    Segmentation of human brain using structural MRI is a key step of processing in imaging neuroscience. The methods have undergone a rapid development in the past two decades and are now widely available. This non-technical review aims at providing an overview and basic understanding of the most common software. Starting with the basis of structural MRI contrast in brain and imaging protocols, the concepts of voxel-based and surface-based segmentation are discussed. Special emphasis is given to the typical contrast features and morphological constraints of cortical and sub-cortical grey matter. In addition to the use for voxel-based morphometry, basic applications in quantitative MRI, cortical thickness estimations, and atrophy measurements as well as assignment of cortical regions and deep brain nuclei are briefly discussed. Finally, some fields for clinical applications are given.

  10. Battery Data MI Importer Template Quick Start Guide

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.

    2017-01-01

    In order to ensure the persistent availability and reliability of test data generated over the course of the project, the M-SHELLS Project has decided to store acquired test data, as well as associated pedigree information, in the Granta Materials Intelligence (MI) database. To facilitate that effort, an importer template and associated graphical user interface (GUI) software have been developed, with this guide providing the operating instructions for their use. The template and automation software GUI are contained in the BatteryDataImporter.xlsm Excel workbook, and are to be used to import M-SHELLS summary, or pedigree, data and the associated raw test data results into an importer template-based file, formatted in such a way as to be ready for immediate upload to the Test Data: Battery Performance table of the Granta MI database. The provided GUI enables the user to select the appropriate summary data file(s), with each file containing the required information to identify any associated raw test data file(s) to be processed. In addition to describing the setup and operation of the importer template and GUI software, this guide also provides instructions for uploading processed data to the database and for viewing the data following upload.

  11. Migration of the Gaudi and LHCb software repositories from CVS to Subversion

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Degaudenzi, H.; LHCb Collaboration

    2011-12-01

    A common code repository is of primary importance in a distributed development environment such as large HEP experiments. CVS (Concurrent Versions System) has been used in the past years at CERN for the hosting of shared software repositories, among which were the repositories for the Gaudi Framework and the LHCb software projects. Many developers around the world produced alternative systems to share code and revisions among several developers, mainly to overcome the limitations in CVS, and CERN has recently started a new service for code hosting based on the version control system Subversion. The differences between CVS and Subversion and the way the code was organized in Gaudi and LHCb CVS repositories required careful study and planning of the migration. Special care was used to define the organization of the new Subversion repository. To avoid as much as possible disruption in the development cycle, the migration has been gradual with the help of tools developed explicitly to hide the differences between the two systems. The principles guiding the migration steps, the organization of the Subversion repository and the tools developed will be presented, as well as the problems encountered both from the librarian and the user points of view.

  12. Development and Engineering Design in Support of "Rover Ranch": A K-12 Outreach Software Project

    NASA Technical Reports Server (NTRS)

    Pascali, Raresh

    2003-01-01

    A continuation of the initial development started in the summer of 1999, the work performed in support of the 'ROVer Ranch' project during the present fellowship dealt with concrete concept implementation and the resolution of related issues. The original work performed last summer focused on the initial examination and articulation of the concept treatment strategy and on audience and market analysis for the learning-technologies software. The present work focused on finalizing the set of parts to be made available for building an AERCam Sprint type robot and on defining, testing and implementing the process necessary to convert the design engineering files to VRML files. Through reverse engineering, an initial set of mission-critical systems was designed for beta testing in schools. The files were created in ProEngineer, exported to VRML 1.0, and converted to VRML 97 (VRML 2.0) for final integration in the software. Attributes for each part were assigned using an in-house-developed Java-based program. The final set of attributes for each system, their mutual interaction, and the identification of the relevant ones to be tracked still remain to be decided.

  13. Software Design Document MCC CSCI (1). Volume 2, Sections 2.18.1 - 2.22

    DTIC Science & Technology

    1991-06-01

    tparam: pointer to long int (standard C type). Internal variable td is a pointer whose typedef is declared in /simnet/libsrc/libassoc/assoc; endTimeList points to the last transaction pointer on the TimeList. The function call is AssocAddToStartOfTimeList(td, startTimeList, endTimeList); Table 2.20-42 describes the parameters used by this function, including td, a pointer with typedef declared in /simnet/libsrc/libassoc/assoc.

  14. Contract for Manpower and Personnel Research and Studies II (COMPRS-II) Annual Report - Year Four

    DTIC Science & Technology

    2002-10-01

    The Spanish Wonderlic and the Prueba de Aptitud Academica (PAA) will be evaluated in the pilot "Foreign Language Recruiting Initiative" project. Starting in October...implemented by linear programming software, which optimizes the Army's enlisted personnel classification system, while accounting for realistic

  15. Mining collections of compounds with Screening Assistant 2

    PubMed Central

    2012-01-01

    Background: High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. Results: We present Screening Assistant 2 (SA2), an open-source Java software dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database, and encapsulates several chemoinformatics methods, among which: providers management, interactive visualisation, scaffold analysis, diverse subset creation, descriptors calculation, sub-structure / SMARTS search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Conclusions: Screening Assistant 2 is a user-friendly, open-source software that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/. PMID:23327565

  16. Mining collections of compounds with Screening Assistant 2.

    PubMed

    Guilloux, Vincent Le; Arrault, Alban; Colliandre, Lionel; Bourg, Stéphane; Vayer, Philippe; Morin-Allory, Luc

    2012-08-31

    High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. We present Screening Assistant 2 (SA2), an open-source Java software dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database, and encapsulates several chemoinformatics methods, among which: providers management, interactive visualisation, scaffold analysis, diverse subset creation, descriptors calculation, sub-structure / SMARTS search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Screening Assistant 2 is a user-friendly, open-source software that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/.
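
    As a small illustration of the SMARTS-based filtering SA2 exposes, the sketch below removes compounds matching an undesired substructure. RDKit is used here purely as a stand-in for demonstration (SA2 itself is a Java application backed by MySQL), and the nitro-group filter is an invented example:

      from rdkit import Chem

      # A toy "undesired substructure" filter: reject molecules carrying a nitro group.
      NITRO = Chem.MolFromSmarts("[N+](=O)[O-]")

      library = [Chem.MolFromSmiles(s) for s in
                 ("c1ccccc1[N+](=O)[O-]", "CCO", "c1ccncc1")]
      kept = [Chem.MolToSmiles(m) for m in library if not m.HasSubstructMatch(NITRO)]
      print(kept)  # nitrobenzene is filtered out; ethanol and pyridine remain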

  17. ARC-2007-ACD07-0140-002

    NASA Image and Video Library

    2007-07-31

    David L. Iverson of NASA Ames Research Center, Moffett Field, California (in foreground) led development of computer software to monitor the condition of the gyroscopes that keep the International Space Station (ISS) properly oriented in space as the ISS orbits Earth. Also pictured is Charles Lee. During its development, researchers used the software to analyze archived gyroscope records. In these tests, users noticed problems with the gyroscopes long before the current systems flagged glitches. Testers trained the software using several months of normal space station gyroscope data collected by the International Space Station Mission Control Center at NASA Johnson Space Center, Houston. Promising test results convinced officials to start using the software in 2007.

  18. Hubble Systems Optimize Hospital Schedules

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Don Rosenthal, a former Ames Research Center computer scientist who helped design the Hubble Space Telescope's scheduling software, co-founded Allocade Inc. of Menlo Park, California, in 2004. Allocade's OnCue software helps hospitals reclaim unused capacity and optimize constantly changing schedules for imaging procedures. After starting to use the software, one medical center soon reported noticeable improvements in efficiency, including a 12 percent increase in procedure volume, 35 percent reduction in staff overtime, and significant reductions in backlog and technician phone time. Allocade now offers versions for outpatient and inpatient magnetic resonance imaging (MRI), ultrasound, interventional radiology, nuclear medicine, Positron Emission Tomography (PET), radiography, radiography-fluoroscopy, and mammography.

  19. Finding and Evaluating Online Resources on Complementary Health Approaches

    MedlinePlus

    ... is it selling something? Finding Health Information on the Internet: How To Start. To find accurate health information, ... thousands of mobile apps (a software program you access using your phone or other mobile device) that ...

  20. An Inquiry into the Cost of Post Deployment Software Support (PDSS)

    DTIC Science & Technology

    1989-09-01

    The increasing cost of software maintenance is taking a larger share of the military budget each year... increments as needed (3:59). The second page of the Form 75 starts with a section stating how the hours, and consequently the funds, will be allocated to... length of time required; the timeline can be in hourly, weekly, monthly, or quarterly increments. Some milestones included are formal approval, test

  1. Health Monitor for Multitasking, Safety-Critical, Real-Time Software

    NASA Technical Reports Server (NTRS)

    Zoerner, Roger

    2011-01-01

    The Health Manager can detect bad health before a failure occurs by periodically monitoring the application software, looking for code-corruption errors and sanity-checking each critical data value prior to use. A processor's memory can fail and corrupt the software, or the software can accidentally write to the wrong address and overwrite the executing software. This innovation continuously calculates a checksum of the software load to detect corrupted code, allowing a system to detect a failure before it happens. It monitors each software task (thread) so that if any task reports "bad health," or does not report to the Health Manager, the system is declared bad. The Health Manager reports overall system health to the outside world by outputting a square-wave signal. If the square wave stops, this indicates that system health is bad or the system is hung and cannot report. Either way, "bad health" can be detected, whether caused by an error, corrupted data, or a hung processor. A separate Health Monitor Task is started and runs periodically in a loop that blocks on a semaphore. Each monitored task registers with the Health Manager, which maintains a count for the task. The registering task must indicate whether it will run more or less often than the Health Manager. If the task runs more often than the Health Manager, the monitored task calls a health function that increments the count and verifies it did not exceed the max-count. When the periodic Health Manager runs, it verifies that the count did not exceed the max-count and zeroes it. If the task runs less often than the Health Manager, the periodic Health Manager increments the count; the monitored task zeroes the count, and both the Health Manager and the monitored task verify that the count did not exceed the max-count.
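
    The counting scheme above translates directly into code. The following minimal Python sketch is hypothetical (class and method names are invented, and Python threading stands in for the real-time tasking described above), but it shows the count/clear/check protocol for fast and slow tasks:

      import threading

      class HealthManager:
          """Sketch of the counter-based watchdog described above (names hypothetical)."""

          def __init__(self, max_count=3):
              self.max_count = max_count
              self.lock = threading.Lock()
              self.counts = {}    # task name -> heartbeat count
              self.faster = {}    # True if the task runs more often than the manager
              self.healthy = True

          def register(self, name, runs_faster_than_manager):
              with self.lock:
                  self.counts[name] = 0
                  self.faster[name] = runs_faster_than_manager

          def heartbeat(self, name):
              # Called each cycle by a task that runs more often than the manager;
              # if the manager stalls, the count climbs past max_count.
              with self.lock:
                  self.counts[name] += 1
                  if self.counts[name] > self.max_count:
                      self.healthy = False

          def clear(self, name):
              # Called by a task that runs less often than the manager.
              with self.lock:
                  self.counts[name] = 0

          def check(self):
              # Periodic manager pass. A fast task that stopped reporting leaves its
              # count at zero; a slow task that stopped clearing lets its count climb
              # past max_count. Either condition marks the system unhealthy.
              with self.lock:
                  for name, count in self.counts.items():
                      if self.faster[name]:
                          if count == 0 or count > self.max_count:
                              self.healthy = False
                          self.counts[name] = 0
                      else:
                          if count > self.max_count:
                              self.healthy = False
                          self.counts[name] = count + 1
              return self.healthy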

  2. Modern and prospective technologies for weather modification activities: A look at integrating unmanned aircraft systems

    NASA Astrophysics Data System (ADS)

    Axisa, Duncan; DeFelice, Tom P.

    2016-09-01

    Present-day weather modification technologies are scientifically based and have made controlled technological advances since the late 1990s and early 2000s. The technological advances directly related to weather modification have primarily been in decision-support and evaluation software and in modeling. However, there have been technological advances in other fields that may now be mature enough to start considering their usefulness for improving weather modification operational efficiency and evaluation accuracy. We consider the programmatic aspects underlying the development of new technologies for use in weather modification activities, identifying their potential benefits and limitations. We provide context and initial guidance for operators that might integrate unmanned aircraft systems technology in future weather modification operations.

  3. Constructing a working taxonomy of functional Ada software components for real-time embedded system applications

    NASA Technical Reports Server (NTRS)

    Wallace, Robert

    1986-01-01

    A major impediment to a systematic attack on Ada software reusability is the lack of an effective taxonomy for software component functions. The scope of all possible applications of Ada software is considered too great to allow the practical development of a working taxonomy. Instead, for the purposes herein, the scope of Ada software application is limited to device and subsystem control in real-time embedded systems. A functional approach is taken in constructing the taxonomy tree for the identified Ada domain. The use of modular software functions as a starting point fits well with the object-oriented programming philosophy of Ada. Examples of the types of functions represented within the working taxonomy are real-time kernels, interrupt service routines, synchronization and message passing, data conversion, digital filtering and signal conditioning, and device control. The constructed taxonomy is proposed as a framework from which a needs analysis can be performed to reveal voids in current Ada real-time embedded programming efforts for the Space Station.

  4. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-05-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  5. Impact of Requirements Quality on Project Success or Failure

    NASA Astrophysics Data System (ADS)

    Tamai, Tetsuo; Kamata, Mayumi Itakura

    We are interested in the relationship between the quality of the requirements specifications for software projects and the subsequent outcome of the projects. To examine this relationship, we investigated 32 projects started and completed between 2003 and 2005 by the software development division of a large company in Tokyo. The company has collected reliable data on requirements specification quality, as evaluated by software quality assurance teams, and overall project performance data relating to cost and time overruns. The data for requirements specification quality were first converted into a multidimensional space, with each dimension corresponding to an item of the recommended structure for software requirements specifications (SRS) defined in IEEE Std. 830-1998. We applied various statistical analysis methods to the SRS quality data and project outcomes.

  6. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-02-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  7. [Central online quality assurance in radiology: an IT solution exemplified by the German Breast Cancer Screening Program].

    PubMed

    Czwoydzinski, J; Girnus, R; Sommer, A; Heindel, W; Lenzen, H

    2011-09-01

    Physical-technical quality assurance is one of the essential tasks of the National Reference Centers in the German Breast Cancer Screening Program. For this purpose the mammography units are required to transfer the measured values of the constancy tests on a daily basis and all phantom images created for this purpose on a weekly basis to the reference centers. This is a serious logistical challenge. To meet these requirements, we developed an innovative software tool. By the end of 2005, we had already developed web-based software (MammoControl) allowing the transmission of constancy test results via entry forms. For automatic analysis and transmission of the phantom images, we then introduced an extension (MammoControl DIANA). This was based on Java, Java Web Start, the NetBeans Rich Client Platform, the Pixelmed Java DICOM Toolkit and the ImageJ library. MammoControl DIANA was designed to run locally in the mammography units. This allows automated on-site image analysis. Both results and compressed images can then be transmitted to the reference center. We developed analysis modules for the daily and monthly consistency tests and additionally for a homogeneity test. The software we developed facilitates the immediate availability of measurement results, phantom images, and DICOM header data in all reference centers. This allows both targeted guidance and short response time in the case of errors. We achieved a consistent IT-based evaluation with standardized tools for the entire screening program in Germany. © Georg Thieme Verlag KG Stuttgart · New York.
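
    To give a flavor of automated phantom-image analysis of this kind, the sketch below computes a simple homogeneity measure from ROI means. It uses pydicom and numpy rather than the Java/ImageJ stack named above, and the ROI geometry, file name and tolerance comparison are illustrative assumptions, not MammoControl DIANA's actual algorithm:

      import numpy as np
      import pydicom

      def roi_means(arr, size=100):
          """Mean pixel value in five square ROIs: centre plus four corners."""
          h, w = arr.shape
          spots = [(h // 2, w // 2), (size, size), (size, w - size),
                   (h - size, size), (h - size, w - size)]
          return [arr[r - size // 2:r + size // 2,
                      c - size // 2:c + size // 2].mean() for r, c in spots]

      ds = pydicom.dcmread("phantom.dcm")           # placeholder file name
      means = roi_means(ds.pixel_array.astype(float))
      deviation = (max(means) - min(means)) / np.mean(means)
      print(f"max ROI deviation: {deviation:.1%}")  # compare against site tolerance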

  8. SU-E-T-103: Development and Implementation of Web Based Quality Control Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Studinski, R; Taylor, R; Angers, C

    Purpose: Historically, many radiation medicine programs have maintained their quality control (QC) test results in paper records or Microsoft Excel worksheets. Both approaches present significant logistical challenges and are not well suited to data review and approval. Our group's aim has been to develop and implement web-based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment QC program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres, the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QATrack+ has 180 tests defined in its database and has collected ∼22,000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and CyberKnife units. Currently at least 5 other centres are known to be running QATrack+ clinically, forming the start of an international user community. Conclusion: QATrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.
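
    The "process control analysis" mentioned above can be illustrated with a short sketch. The following Python fragment derives Shewhart-style ±3σ control limits from a baseline set of QC measurements and flags readings that drift outside them; the function names and data are invented for illustration and are not QATrack+'s API:

      import statistics

      def control_limits(baseline):
          """Shewhart-style +/-3 sigma limits from a baseline run of QC results."""
          mean = statistics.fmean(baseline)
          sigma = statistics.stdev(baseline)
          return mean - 3 * sigma, mean + 3 * sigma

      def out_of_control(results, baseline):
          """Return (index, value) pairs falling outside the control limits."""
          low, high = control_limits(baseline)
          return [(i, x) for i, x in enumerate(results) if not low <= x <= high]

      # Example: daily accelerator output readings (arbitrary units).
      baseline = [100.1, 99.8, 100.0, 100.2, 99.9, 100.0, 100.1, 99.7]
      recent = [100.0, 100.3, 101.9, 99.9]
      print(out_of_control(recent, baseline))   # flags the 101.9 reading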

  9. Stewplan: software for creating forest stewardship plans (Version 1.3)

    Treesearch

    Peter D. Knopp; Mark J. Twery

    2003-01-01

    Describes the purpose and function of the Stewplan computer program. Provides instructions for loading Stewplan, a tutorial for getting started, and instructions for use. A copy of the program is included. [User's manual; CD-ROM].

  10. Enhancing interdisciplinary mathematics and biology education: a microarray data analysis course bridging these disciplines.

    PubMed

    Tra, Yolande V; Evans, Irene M

    2010-01-01

    BIO2010 put forth the goal of improving the mathematical educational background of biology students. The analysis and interpretation of microarray high-dimensional data can be very challenging and is best done by a statistician and a biologist working and teaching in a collaborative manner. We set up such a collaboration and designed a course on microarray data analysis. We started using Genome Consortium for Active Teaching (GCAT) materials and Microarray Genome and Clustering Tool software and added R statistical software along with Bioconductor packages. In response to student feedback, one microarray data set was fully analyzed in class, starting from preprocessing to gene discovery to pathway analysis using the latter software. A class project was to conduct a similar analysis where students analyzed their own data or data from a published journal paper. This exercise showed the impact that filtering, preprocessing, and different normalization methods had on gene inclusion in the final data set. We conclude that this course achieved its goals to equip students with skills to analyze data from a microarray experiment. We offer our insight about collaborative teaching as well as how other faculty might design and implement a similar interdisciplinary course.

  11. Enhancing Interdisciplinary Mathematics and Biology Education: A Microarray Data Analysis Course Bridging These Disciplines

    PubMed Central

    Evans, Irene M.

    2010-01-01

    BIO2010 put forth the goal of improving the mathematical educational background of biology students. The analysis and interpretation of microarray high-dimensional data can be very challenging and is best done by a statistician and a biologist working and teaching in a collaborative manner. We set up such a collaboration and designed a course on microarray data analysis. We started using Genome Consortium for Active Teaching (GCAT) materials and Microarray Genome and Clustering Tool software and added R statistical software along with Bioconductor packages. In response to student feedback, one microarray data set was fully analyzed in class, starting from preprocessing to gene discovery to pathway analysis using the latter software. A class project was to conduct a similar analysis where students analyzed their own data or data from a published journal paper. This exercise showed the impact that filtering, preprocessing, and different normalization methods had on gene inclusion in the final data set. We conclude that this course achieved its goals to equip students with skills to analyze data from a microarray experiment. We offer our insight about collaborative teaching as well as how other faculty might design and implement a similar interdisciplinary course. PMID:20810954

  12. A reconfigurable visual-programming library for real-time closed-loop cellular electrophysiology

    PubMed Central

    Biró, István; Giugliano, Michele

    2015-01-01

    Most software platforms for cellular electrophysiology are limited in terms of flexibility, hardware support, ease of use, or re-configuration and adaptation for non-expert users. Moreover, advanced experimental protocols requiring real-time closed-loop operation to investigate excitability, plasticity, and dynamics are largely inaccessible to users without moderate to substantial computer proficiency. Here we present an approach based on MATLAB/Simulink, exploiting the benefits of LEGO-like visual programming and configuration combined with a small but easily extendable library of functional software components. We provide and validate several examples, implementing conventional and more sophisticated experimental protocols such as dynamic clamp or the combined use of intracellular and extracellular methods involving closed-loop real-time control. The functionality of each of these examples is demonstrated with relevant experiments. These can be used as a starting point to create and support a larger variety of electrophysiological tools and methods, hopefully extending the range of default techniques and protocols currently employed in experimental labs across the world. PMID:26157385
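
    The dynamic-clamp protocol mentioned above reduces to a tight read-compute-write loop: measure membrane voltage, compute a conductance-based current, inject it. The toy Python sketch below closes that loop against a simulated passive membrane; all values and the forward-Euler "cell" are illustrative, and a real rig needs hard real-time I/O, which is what the Simulink-based platform provides:

      # Toy closed-loop "dynamic clamp": inject a conductance-based current into a
      # simulated passive membrane. A real setup replaces the model cell with
      # amplifier reads/writes executed under real-time guarantees.
      G_SYN, E_REV = 5e-9, 0.0                 # injected conductance (S), reversal (V)
      C_M, G_L, E_L = 100e-12, 5e-9, -0.070    # membrane capacitance, leak, rest
      DT = 1e-4                                # loop period (s)

      v = E_L
      for _ in range(1000):
          i_syn = G_SYN * (v - E_REV)          # read V, compute conductance current
          v += DT / C_M * (-G_L * (v - E_L) - i_syn)   # "inject" into the model cell
      print(f"steady-state V = {v * 1e3:.1f} mV")      # converges to about -35 mV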

  13. Use of the SPARC software program to calculate hydrolysis rate constants for the polymeric brominated flame retardants BC-58 and FR-1025.

    PubMed

    Rayne, Sierra; Forest, Kaya

    2016-01-01

    The SPARC software program was used to estimate the acid-catalyzed, neutral, and base-catalyzed hydrolysis rate constants for the polymeric brominated flame retardants BC-58 and FR-1025. Relatively rapid hydrolysis of BC-58, producing 2,4,6-tribromophenol (and ultimately tetrabromobisphenol A) as the hydrolytically stable end products from all potential hydrolysis reactions, is expected in both environmental and biological systems, with starting-material hydrolytic half-lives (t(1/2,hydr)) ranging from less than 1 h in marine systems, to several hours in cellular environments, and up to several weeks in slightly acidic fresh waters. Hydrolysis of FR-1025 to give 2,3,4,5,6-pentabromobenzyl alcohol is expected to be slower (t(1/2,hydr) from less than 0.5 years in marine systems up to several years in fresh waters) than that of BC-58, but is also expected to occur at rates that will contribute significantly to environmental and in vivo loadings of this compound.

  14. A Massively Parallel Computational Method of Reading Index Files for SOAPsnv.

    PubMed

    Zhu, Xiaoqian; Peng, Shaoliang; Liu, Shaojie; Cui, Yingbo; Gu, Xiang; Gao, Ming; Fang, Lin; Fang, Xiaodong

    2015-12-01

    SOAPsnv is the software used for identifying single nucleotide variation in cancer genes. However, its performance has yet to match the massive amount of data to be processed. Experiments reveal that the main performance bottleneck of the SOAPsnv software is the pileup algorithm. The original pileup algorithm's I/O process is time-consuming and reads input files inefficiently; its scalability is also poor. We therefore designed a new algorithm, named BamPileup, aiming to improve sequential read performance, and the new pileup algorithm implements a parallel read mode based on an index. Using this method, each thread can read data directly starting from a specific position. The results of experiments on the Tianhe-2 supercomputer show that, when reading data in a multi-threaded parallel I/O way, the processing time of the algorithm is reduced to 3.9 s and the application can achieve a speedup of up to 100×. Moreover, the scalability of the new algorithm is also satisfying.
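
    The essence of an index-based parallel read is that each worker seeks straight to a precomputed byte offset instead of scanning from the start of the file. The Python sketch below shows that pattern in its simplest form, with plain byte chunks and a naively computed index rather than BamPileup's actual BAM-aware index:

      import os
      from concurrent.futures import ThreadPoolExecutor

      def read_chunk(path, start, size):
          with open(path, "rb") as f:   # each worker gets its own file handle
              f.seek(start)             # jump straight to the indexed offset
              return f.read(size)

      def parallel_read(path, n_workers=4):
          total = os.path.getsize(path)
          chunk = (total + n_workers - 1) // n_workers
          index = [(i * chunk, min(chunk, max(0, total - i * chunk)))
                   for i in range(n_workers)]
          with ThreadPoolExecutor(max_workers=n_workers) as pool:
              parts = pool.map(lambda se: read_chunk(path, *se), index)
          return b"".join(parts)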

  15. Indoor Navigation by People with Visual Impairment Using a Digital Sign System

    PubMed Central

    Legge, Gordon E.; Beckmann, Paul J.; Tjan, Bosco S.; Havey, Gary; Kramer, Kevin; Rolkosky, David; Gage, Rachel; Chen, Muzi; Puchakayala, Sravan; Rangarajan, Aravindhan

    2013-01-01

    There is a need for adaptive technology to enhance indoor wayfinding by visually-impaired people. To address this need, we have developed and tested a Digital Sign System. The hardware and software consist of digitally-encoded signs widely distributed throughout a building, a handheld sign-reader based on an infrared camera, image-processing software, and a talking digital map running on a mobile device. Four groups of subjects—blind, low vision, blindfolded sighted, and normally sighted controls—were evaluated on three navigation tasks. The results demonstrate that the technology can be used reliably in retrieving information from the signs during active mobility, in finding nearby points of interest, and following routes in a building from a starting location to a destination. The visually impaired subjects accurately and independently completed the navigation tasks, but took substantially longer than normally sighted controls. This fully functional prototype system demonstrates the feasibility of technology enabling independent indoor navigation by people with visual impairment. PMID:24116156

  16. A Rapid Python-Based Methodology for Target-Focused Combinatorial Library Design.

    PubMed

    Li, Shiliang; Song, Yuwei; Liu, Xiaofeng; Li, Honglin

    2016-01-01

    The chemical space is so vast that only a small portion of it has been examined. As a complementary approach to systematically probe the chemical space, virtual combinatorial library design has had an enormous impact on generating novel and diverse structures for drug discovery. Despite these favorable contributions, high attrition rates in drug development, resulting mainly from lack of efficacy and from side effects, make it increasingly challenging to discover good chemical starting points. In most cases, focused libraries, which are restricted to particular regions of the chemical space, are deftly exploited to maximize hit rate and improve efficiency at the beginning of the drug discovery and development pipeline. This paper presents a valid methodology for fast target-focused combinatorial library design in both reaction-based and product-based ways, with library creation rates of approximately 70,000 molecules per second. Simple, quick and convenient operating procedures are the specific features of the method. SHAFTS, a hybrid 3D similarity calculation software, was embedded to help refine the size of the libraries and improve hit rates. Two target-focused (p38-focused and COX2-focused) libraries were constructed efficiently in this study. This rapid library enumeration method is portable and applicable to any other target for the identification of good chemical starting points, in combination with either structure-based or ligand-based virtual screening.
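
    Reaction-based enumeration of the kind described can be sketched in a few lines with RDKit, used here only as a generic stand-in for the paper's own Python tooling. An amide-coupling reaction SMARTS combines lists of acids and amines into a small virtual library:

      from rdkit import Chem
      from rdkit.Chem import AllChem

      # Amide coupling written as a reaction SMARTS: acid + primary amine -> amide.
      rxn = AllChem.ReactionFromSmarts(
          "[CX3:1](=O)[OX2H1].[NX3;H2:2]>>[C:1](=O)[N:2]")

      acids = [Chem.MolFromSmiles(s) for s in ("CC(=O)O", "OC(=O)c1ccccc1")]
      amines = [Chem.MolFromSmiles(s) for s in ("NCC", "NC1CC1")]

      library = set()
      for acid in acids:
          for amine in amines:
              for products in rxn.RunReactants((acid, amine)):
                  product = products[0]
                  Chem.SanitizeMol(product)
                  library.add(Chem.MolToSmiles(product))  # canonical SMILES dedupes
      print(len(library), "products:", sorted(library))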

  17. Spacelab cost reduction alternatives study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Alternative approaches to payload operations planning and control and flight crew training are defined for Spacelab payloads with the goals of lowering FY77 and FY78 costs for new starts, lowering the costs to achieve Spacelab operational capability, and minimizing the cost per Spacelab flight. These alternatives attempt to minimize duplication of hardware, software, and personnel, and the investment in supporting facilities and equipment. Of particular importance is the possible reduction of equipment, software, and manpower resources such as computational systems, trainers, and simulators.

  18. Getting started with Open-Hardware: Development and Control of Microfluidic Devices

    PubMed Central

    da Costa, Eric Tavares; Mora, Maria F.; Willis, Peter A.; do Lago, Claudimir L.; Jiao, Hong; Garcia, Carlos D.

    2014-01-01

    Understanding basic concepts of electronics and computer programming allows researchers to get the most out of the equipment found in their laboratories. Although a number of platforms have been specifically designed for the general public and are supported by a vast array of on-line tutorials, this subject is not normally included in university chemistry curricula. Aiming to provide the basic concepts of hardware and software, this article is focused on the design and use of a simple module to control a series of PDMS-based valves. The module is based on a low-cost microprocessor (Teensy) and open-source software (Arduino). The microvalves were fabricated using thin sheets of PDMS and patterned using CO2 laser engraving, providing a simple and efficient way to fabricate devices without the traditional photolithographic process or facilities. Synchronization of valve control enabled the development of two simple devices to perform injection (1.6 ± 0.4 μL/stroke) and mixing of different solutions. Furthermore, a practical demonstration of the utility of this system for microscale chemical sample handling and analysis was achieved performing an on-chip acid-base titration, followed by conductivity detection with an open-source low-cost detection system. Overall, the system provided a very reproducible (98%) platform to perform fluid delivery at the microfluidic scale. PMID:24823494

  19. PLUME-FEATHER, Referencing and Finding Software for Research and Education

    NASA Astrophysics Data System (ADS)

    Bénassy, O.; Caron, C.; Ferret-Canape, C.; Cheylus, A.; Courcelle, E.; Dantec, C.; Dayre, P.; Dostes, T.; Durand, A.; Facq, A.; Gambini, G.; Geahchan, E.; Helft, C.; Hoffmann, D.; Ingarao, M.; Joly, P.; Kieffer, J.; Larré, J.-M.; Libes, M.; Morris, F.; Parmentier, H.; Pérochon, L.; Porte, O.; Romier, G.; Rousse, D.; Tournoy, R.; Valeins, H.

    2014-06-01

    PLUME-FEATHER is a non-profit project created to Promote economicaL, Useful and Maintained softwarE For the Higher Education And THE Research communities. The site references software, mainly Free/Libre Open Source Software (FLOSS), from French universities and national research organisations (CNRS, INRA...), laboratories or departments, as well as other FLOSS software used and evaluated by users within these institutions. Each software product is represented by a reference card, which describes origin, aim, installation, cost (if applicable) and user experience from the point of view of an academic user for academic users. Presently over 1000 programs are referenced on PLUME by more than 900 contributors. Although the server is maintained by a French institution, it is open to international contributions in the academic domain. All contained and validated contents are visible to the anonymous public, whereas (presently more than 2000) registered users can contribute, from comments on single software reference cards up to help with the organisation and presentation of the referenced software products. The project was presented to the HEP community in 2012 for the first time [1]. This is an update on its status and a call for (further) contributions.

  20. A nonparametric significance test for sampled networks.

    PubMed

    Elliott, Andrew; Leicht, Elizabeth; Whitmore, Alan; Reinert, Gesine; Reed-Tsochas, Felix

    2018-01-01

    Our work is motivated by an interest in constructing a protein-protein interaction network that captures key features associated with Parkinson's disease. While there is an abundance of subnetwork construction methods available, it is often far from obvious which subnetwork is the most suitable starting point for further investigation. We provide a method to assess whether a subnetwork constructed from a seed list (a list of nodes known to be important in the area of interest) differs significantly from a randomly generated subnetwork. The proposed method uses a Monte Carlo approach. As different seed lists can give rise to the same subnetwork, we control for redundancy by constructing a minimal seed list as the starting point for the significance test. The null model is based on random seed lists of the same length as a minimum seed list that generates the subnetwork; in this random seed list the nodes have (approximately) the same degree distribution as the nodes in the minimum seed list. We use this null model to select subnetworks which deviate significantly from random on an appropriate set of statistics and might capture useful information for a real-world protein-protein interaction network. The software used in this paper is available for download at https://sites.google.com/site/elliottande/. The software is written in Python and uses the NetworkX library. Contact: ande.elliott@gmail.com or felix.reed-tsochas@sbs.ox.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press.
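
    Since the paper's software is written in Python with NetworkX, the null model lends itself to a compact sketch. The statistic used below (edge count of the seed-plus-neighbours subnetwork) and the degree-bucket resampling are simplified illustrations, not the authors' exact procedure:

      import random
      import networkx as nx

      def induced_subnetwork(G, seeds):
          """Subnetwork on the seed nodes plus their immediate neighbours."""
          nodes = set(seeds)
          for s in seeds:
              nodes.update(G.neighbors(s))
          return G.subgraph(nodes)

      def degree_matched_seeds(G, seeds):
          """Random seed list whose nodes roughly match the seeds' degrees."""
          by_degree = {}
          for node, deg in G.degree():
              by_degree.setdefault(deg, []).append(node)
          return [random.choice(by_degree[G.degree(s)]) for s in seeds]

      def empirical_p(G, seeds, trials=1000):
          """One-sided Monte Carlo p-value for the observed edge count."""
          observed = induced_subnetwork(G, seeds).number_of_edges()
          hits = sum(induced_subnetwork(G, degree_matched_seeds(G, seeds))
                     .number_of_edges() >= observed for _ in range(trials))
          return (hits + 1) / (trials + 1)

      G = nx.barabasi_albert_graph(500, 3, seed=1)
      print(empirical_p(G, seeds=[0, 1, 2, 3, 4]))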

  1. A nonparametric significance test for sampled networks

    PubMed Central

    Leicht, Elizabeth; Whitmore, Alan; Reinert, Gesine; Reed-Tsochas, Felix

    2018-01-01

    Motivation: Our work is motivated by an interest in constructing a protein–protein interaction network that captures key features associated with Parkinson’s disease. While there is an abundance of subnetwork construction methods available, it is often far from obvious which subnetwork is the most suitable starting point for further investigation. Results: We provide a method to assess whether a subnetwork constructed from a seed list (a list of nodes known to be important in the area of interest) differs significantly from a randomly generated subnetwork. The proposed method uses a Monte Carlo approach. As different seed lists can give rise to the same subnetwork, we control for redundancy by constructing a minimal seed list as the starting point for the significance test. The null model is based on random seed lists of the same length as a minimum seed list that generates the subnetwork; in this random seed list the nodes have (approximately) the same degree distribution as the nodes in the minimum seed list. We use this null model to select subnetworks which deviate significantly from random on an appropriate set of statistics and might capture useful information for a real-world protein–protein interaction network. Availability and implementation: The software used in this paper is available for download at https://sites.google.com/site/elliottande/. The software is written in Python and uses the NetworkX library. Contact: ande.elliott@gmail.com or felix.reed-tsochas@sbs.ox.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:29036452

  2. Template-based combinatorial enumeration of virtual compound libraries for lipids

    PubMed Central

    2012-01-01

    A variety of software packages are available for the combinatorial enumeration of virtual libraries for small molecules, starting from specifications of core scaffolds with attachment points and lists of R-groups as SMILES or SD files. Although SD files include atomic coordinates for core scaffolds and R-groups, it is not possible to control the 2-dimensional (2D) layout of the enumerated structures generated for virtual compound libraries because different packages generate different 2D representations for the same structure. We have developed a software package called LipidMapsTools for the template-based combinatorial enumeration of virtual compound libraries for lipids. Virtual libraries are enumerated for the specified lipid abbreviations using matching lists of pre-defined templates and chain abbreviations, instead of core scaffolds and lists of R-groups provided by the user. 2D structures of the enumerated lipids are drawn in a specific and consistent fashion adhering to the framework for representing lipid structures proposed by the LIPID MAPS consortium. LipidMapsTools is lightweight, relatively fast and contains no external dependencies. It is an open source package and freely available under the terms of the modified BSD license. PMID:23006594

  3. An AFDX Network for Spacecraft Data Handling

    NASA Astrophysics Data System (ADS)

    Deredempt, Marie-Helene; Kollias, Vangelis; Sun, Zhili; Canamares, Ernest; Ricco, Philippe

    2014-08-01

    In the aeronautical domain, the ARINC-664 Part 7 specification (AFDX) [4] provides the enabling technology for interfacing equipment in Integrated Modular Avionics (IMA) architectures. The complementary part of AFDX needed for complete interoperability, the Time and Space Partitioning concepts of ARINC 653 [1], was already studied as part of the space-domain ESA roadmap (i.e., the IMA4Space project). A standardized IMA-based architecture is already considered in the aeronautical domain as more flexible, reliable and secure. Integration and validation become simple, using a common set of tools and databases, and can be done in parts on different facilities with the same definition (hardware and software test benches, flight control or alarm test benches, simulators and flight test installations). In some areas, requirements in terms of data processing are quite similar in the space domain, and the concept could be applied to take advantage of the technology itself and of the panel of hardware and software solutions and tools available on the market. The Mission project (Methodology and assessment for the applicability of ARINC-664 (AFDX) in Satellite/Spacecraft on-board communicatION networks), an FP7 initiative for bringing terrestrial SME research into the space domain, started to evaluate the applicability of the standard in the space domain.

  4. Template-based combinatorial enumeration of virtual compound libraries for lipids.

    PubMed

    Sud, Manish; Fahy, Eoin; Subramaniam, Shankar

    2012-09-25

    A variety of software packages are available for the combinatorial enumeration of virtual libraries for small molecules, starting from specifications of core scaffolds with attachment points and lists of R-groups as SMILES or SD files. Although SD files include atomic coordinates for core scaffolds and R-groups, it is not possible to control the 2-dimensional (2D) layout of the enumerated structures generated for virtual compound libraries because different packages generate different 2D representations for the same structure. We have developed a software package called LipidMapsTools for the template-based combinatorial enumeration of virtual compound libraries for lipids. Virtual libraries are enumerated for the specified lipid abbreviations using matching lists of pre-defined templates and chain abbreviations, instead of core scaffolds and lists of R-groups provided by the user. 2D structures of the enumerated lipids are drawn in a specific and consistent fashion adhering to the framework for representing lipid structures proposed by the LIPID MAPS consortium. LipidMapsTools is lightweight, relatively fast and contains no external dependencies. It is an open source package and freely available under the terms of the modified BSD license.
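
    The template-based idea, enumerating from abbreviations plus pre-defined templates rather than user-supplied scaffolds and R-groups, can be mimicked in a few lines. In the Python toy below, both the headgroup template and the chain table are invented placeholders (pseudo-notation, for illustration of the mechanism only), not LipidMapsTools' real templates:

      # Toy enumeration from abbreviations: a headgroup template plus a chain table
      # stand in for the real, far more detailed templates.
      HEAD_TEMPLATES = {
          "PC": "glycerophosphocholine[sn1={sn1},sn2={sn2}]",
      }
      CHAINS = {
          "16:0": "acyl(C16,sat)",
          "18:1": "acyl(C18,1-ene)",
      }

      def enumerate_lipids(head, sn1_list, sn2_list):
          template = HEAD_TEMPLATES[head]
          return [f"{head}({sn1}/{sn2}) -> "
                  + template.format(sn1=CHAINS[sn1], sn2=CHAINS[sn2])
                  for sn1 in sn1_list for sn2 in sn2_list]

      for entry in enumerate_lipids("PC", ["16:0", "18:1"], ["16:0", "18:1"]):
          print(entry)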

  5. Where's My Data - WMD

    NASA Technical Reports Server (NTRS)

    Quach, William L.; Sesplaukis, Tadas; Owen-Mankovich, Kyran J.; Nakamura, Lori L.

    2012-01-01

    WMD provides a centralized interface to access data stored in the Mission Data Processing and Control System (MPCS) GDS (Ground Data Systems) databases during MSL (Mars Science Laboratory) Testbeds and ATLO (Assembly, Test, and Launch Operations) test sessions. The MSL project organizes its data based on venue (Testbed, ATLO, Ops), with each venue's data stored on a separate database, making it cumbersome for users to access data across the various venues. WMD allows sessions to be retrieved through a Web-based search using several criteria: host name, session start date, or session ID number. Sessions matching the search criteria will be displayed and users can then select a session to obtain and analyze the associated data. The uniqueness of this software comes from its collection of data retrieval and analysis features provided through a single interface. This allows users to obtain their data and perform the necessary analysis without having to worry about where and how to get the data, which may be stored in various locations. Additionally, this software is a Web application that only requires a standard browser without additional plug-ins, providing a cross-platform, lightweight solution for users to retrieve and analyze their data. This software solves the problem of efficiently and easily finding and retrieving data from thousands of MSL Testbed and ATLO sessions. WMD allows the user to retrieve their session in as little as one mouse click, and then to quickly retrieve additional data associated with the session.

  6. Communicating Off the Page.

    ERIC Educational Resources Information Center

    Block, Marylaine

    2001-01-01

    Discusses Web sites and Weblogs (or blogs) created by librarians as informal, interactive zines. Considers Web publishing software which makes it easier to start, motivation for self-publishing, and differences from trade publications; and provides a list of librarian blogs and zines. (LRW)

  7. Railway cognitive radio to enhance safety, security, and performance of positive train control.

    DOT National Transportation Integrated Search

    2013-02-01

    Robust and interoperable wireless communications are vital to Positive Train Control (PTC). The railway industry has started adopting software-defined radios (SDRs) for packet-data transmission. SDR systems realize previously fixed components as reco...

  8. Survey Email Scheduling and Monitoring in eRCTs (SESAMe): A Digital Tool to Improve Data Collection in Randomized Controlled Clinical Trials.

    PubMed

    Skonnord, Trygve; Steen, Finn; Skjeie, Holgeir; Fetveit, Arne; Brekke, Mette; Klovning, Atle

    2016-11-22

    Electronic questionnaires can ease data collection in randomized controlled trials (RCTs) in clinical practice. We found no existing software that could automate the sending of emails to participants enrolled in an RCT at different inclusion time points. Our aim was to develop suitable software to facilitate data collection in an ongoing multicenter RCT of low back pain (the Acuback study). For the Acuback study, we determined that we would need to send a total of 5130 emails to 270 patients recruited at different centers and at 19 different time points. The first version of the software was tested in a pilot study in November 2013 but was unable to deliver multiuser or Web-based access. We resolved these shortcomings in the next version, which we tested on the Web in February 2014. Our new version was able to schedule and send the required emails in the full-scale Acuback trial that started in March 2014. The system architecture evolved through an iterative, inductive process between the project study leader and the software programmer. The program was tested and updated when errors occurred. To evaluate the development of the software, we used a logbook, a research assistant dialogue, and Acuback trial participant queries. We have developed a Web-based app, Survey Email Scheduling and Monitoring in eRCTs (SESAMe), that monitors responses in electronic surveys and sends reminders by email or text message (short message service, SMS) to participants. The overall response rate for the 19 surveys in the Acuback study increased from 76.4% (655/857) before we introduced reminders to 93.11% (1149/1234) after the new function (P<.001). Further development will aim at encryption and secure data storage. The SESAMe software facilitates consecutive patient data collection in RCTs and can be used to increase response rates and quality of research, both in general practice and in other clinical trial settings. ©Trygve Skonnord, Finn Steen, Holgeir Skjeie, Arne Fetveit, Mette Brekke, Atle Klovning. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 22.11.2016.
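
    The core scheduling problem, sending each participant the same follow-up surveys at offsets measured from that participant's own enrolment date, can be sketched compactly. In the Python fragment below the offsets, addresses and the surrounding delivery machinery are invented placeholders (SESAMe's internals are not described beyond the abstract):

      from datetime import date, timedelta

      # Example follow-up offsets in days after each participant's own enrolment;
      # the Acuback study used 19 time points, these five are just placeholders.
      FOLLOW_UP_DAYS = [0, 7, 14, 30, 90]

      def due_surveys(participants, today):
          """Yield (email, day-offset) pairs whose survey falls due today."""
          for email, enrolled in participants.items():
              for day in FOLLOW_UP_DAYS:
                  if enrolled + timedelta(days=day) == today:
                      yield email, day

      participants = {"p1@example.org": date(2024, 3, 1),
                      "p2@example.org": date(2024, 3, 8)}
      for email, day in due_surveys(participants, date(2024, 3, 8)):
          print(f"send day-{day} survey link to {email}")  # a real system emails it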

  9. Comparison of effectiveness of class lecture versus workshop-based teaching of basic life support on acquiring practice skills among the health care providers

    PubMed Central

    Karim, Habib Md. Reazaul; Yunus, Md.; Bhattacharyya, Prithwis; Ahmed, Ghazal

    2016-01-01

    Background: Basic life support (BLS) is an integral part of emergency medical care. Studies have shown poor knowledge of it among health care providers, who are usually taught BLS by lecture-based teaching in classes. Objectives: This study is designed to assess the effectiveness of class lecture versus workshop-based teaching of BLS on acquiring practice skills on a mannequin. Methods: After ethical approval and informed consent from the participants, the present study was conducted among health care providers. Participants were grouped into lecture-based class teaching and workshop-based teaching. They were then asked to practice BLS on a mannequin (Resusci Anne with QCPR) and were evaluated on performance parameters based on American Heart Association BLS. Statistical analyses were done by Fisher's exact test using GraphPad InStat software, and P < 0.05 was taken as significant. Results: There were 55 participants in the lecture-based teaching group and 50 in the workshop-based teaching group. There was no statistical difference in recognition of arrest, checking pulse, and starting chest compression (P > 0.05). Though more than 83% of the lecture-based teaching group started chest compression, as compared with 96% of the workshop group, only 49% of the participants in the lecture-based group performed quality chest compression as compared to 82% of the other group (P = 0.0005). The workshop group also performed better bag-mask ventilation and defibrillation (P < 0.0001). Conclusion: Workshop-based BLS teaching is more effective, and lecture-based class teaching is better replaced in the medical education curriculum. PMID:27308252

  10. Comparison of effectiveness of class lecture versus workshop-based teaching of basic life support on acquiring practice skills among the health care providers.

    PubMed

    Karim, Habib Md Reazaul; Yunus, Md; Bhattacharyya, Prithwis; Ahmed, Ghazal

    2016-01-01

    Basic life support (BLS) is an integral part of emergency medical care. Studies have shown poor knowledge of it among health care providers, who are usually taught BLS by lecture-based teaching in classes. This study is designed to assess the effectiveness of class lecture versus workshop-based teaching of BLS on acquiring practice skills on a mannequin. After ethical approval and informed consent from the participants, the present study was conducted among health care providers. Participants were grouped into lecture-based class teaching and workshop-based teaching. They were then asked to practice BLS on a mannequin (Resusci Anne with QCPR) and were evaluated on performance parameters based on American Heart Association BLS. Statistical analyses were done by Fisher's exact test using GraphPad InStat software, and P < 0.05 was taken as significant. There were 55 participants in the lecture-based teaching group and 50 in the workshop-based teaching group. There was no statistical difference in recognition of arrest, checking pulse, and starting chest compression (P > 0.05). Though more than 83% of the lecture-based teaching group started chest compression, as compared with 96% of the workshop group, only 49% of the participants in the lecture-based group performed quality chest compression as compared to 82% of the other group (P = 0.0005). The workshop group also performed better bag-mask ventilation and defibrillation (P < 0.0001). Workshop-based BLS teaching is more effective, and lecture-based class teaching is better replaced in the medical education curriculum.
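
    The key comparison can be checked with a few lines of Python using scipy. The counts below are back-calculated from the reported percentages (49% of 55 lecture vs 82% of 50 workshop participants performing quality compression, i.e. 27 vs 41), so this is an illustration rather than the authors' raw data:

      from scipy.stats import fisher_exact

      table = [[27, 55 - 27],    # lecture group: quality compression yes / no
               [41, 50 - 41]]    # workshop group: quality compression yes / no
      odds_ratio, p_value = fisher_exact(table)
      print(f"odds ratio {odds_ratio:.2f}, P = {p_value:.4f}")  # P close to 0.0005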

  11. GPR data processing computer software for the PC

    USGS Publications Warehouse

    Lucius, Jeffrey E.; Powers, Michael H.

    2002-01-01

    The computer software described in this report is designed for processing ground penetrating radar (GPR) data on Intel-compatible personal computers running the MS-DOS operating system or MS Windows 3.x/95/98/ME/2000. The earliest versions of these programs were written starting in 1990. At that time, commercially available GPR software did not meet the processing and display requirements of the USGS. Over the years, the programs were refined and new features and programs were added. The collection of computer programs presented here can perform all basic processing of GPR data, including velocity analysis and generation of CMP stacked sections and data volumes, as well as create publication quality data images.
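
    As a flavor of the kind of processing these programs perform, CMP stacking amounts to averaging aligned traces so that coherent signal adds while noise cancels. A minimal numpy sketch on synthetic data (moveout correction is assumed already applied, which the real software handles):

      import numpy as np

      # Synthetic CMP gather: 24 traces of the same reflection plus noise.
      rng = np.random.default_rng(0)
      n_traces, n_samples = 24, 512
      signal = np.sin(2 * np.pi * 5 * np.linspace(0, 1, n_samples))
      gather = signal + 0.5 * rng.standard_normal((n_traces, n_samples))

      stacked = gather.mean(axis=0)   # the CMP stack: average across traces
      gain = np.std(gather[0] - signal) / np.std(stacked - signal)
      print(f"noise reduced ~{gain:.1f}x (expect ~sqrt(24) = 4.9)")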

  12. Implementation of an optimum profile guidance system on STOLAND

    NASA Technical Reports Server (NTRS)

    Flanagan, P. F.

    1978-01-01

    The implementation on the STOLAND airborne digital computer of an optimum profile guidance system for the augmentor wing jet STOL research aircraft is described. The major tasks were to implement the guidance and control logic in airborne computer software and to integrate the module with the existing STOLAND navigation, display, and autopilot routines. The optimum profile guidance system comprises an algorithm for synthesizing minimum-fuel trajectories for a wide range of starting positions in the terminal area and a control law for flying the aircraft automatically along the trajectory. The avionics software developed is described, along with a FORTRAN program that was constructed to reflect the modular structure and algorithms implemented in the avionics software.

  13. QuantiFly: Robust Trainable Software for Automated Drosophila Egg Counting.

    PubMed

    Waithe, Dominic; Rennert, Peter; Brostow, Gabriel; Piper, Matthew D W

    2015-01-01

    We report the development and testing of software called QuantiFly: an automated tool to quantify Drosophila egg laying. Many laboratories count Drosophila eggs as a marker of fitness. The existing method requires laboratory researchers to count eggs manually while looking down a microscope. This technique is both time-consuming and tedious, especially when experiments require daily counts of hundreds of vials. The basis of the QuantiFly software is an algorithm which applies and improves upon an existing advanced pattern recognition and machine-learning routine. The accuracy of the baseline algorithm is additionally increased in this study through correction of bias observed in the algorithm output. The QuantiFly software, which includes the refined algorithm, has been designed to be immediately accessible to scientists through an intuitive and responsive user-friendly graphical interface. The software is also open-source, self-contained, has no dependencies and is easily installed (https://github.com/dwaithe/quantifly). Compared to manual egg counts made from digital images, QuantiFly achieved average accuracies of 94% and 85% for eggs laid on transparent (defined) and opaque (yeast-based) fly media. Thus, the software is capable of detecting experimental differences in most experimental situations. Significantly, the advanced feature recognition capabilities of the software proved to be robust to food surface artefacts like bubbles and crevices. The user experience involves image acquisition, algorithm training by labelling a subset of eggs in images of some of the vials, followed by a batch analysis mode in which new images are automatically assessed for egg numbers. Initial training typically requires approximately 10 minutes, while subsequent image evaluation by the software is performed in just a few seconds. Given the average time per vial for manual counting is approximately 40 seconds, our software introduces a timesaving advantage for experiments starting with as few as 20 vials. We also describe an optional acrylic box to be used as a digital camera mount and to provide controlled lighting during image acquisition which will guarantee the conditions used in this study.

  14. QuantiFly: Robust Trainable Software for Automated Drosophila Egg Counting

    PubMed Central

    Waithe, Dominic; Rennert, Peter; Brostow, Gabriel; Piper, Matthew D. W.

    2015-01-01

    We report the development and testing of software called QuantiFly: an automated tool to quantify Drosophila egg laying. Many laboratories count Drosophila eggs as a marker of fitness. The existing method requires laboratory researchers to count eggs manually while looking down a microscope. This technique is both time-consuming and tedious, especially when experiments require daily counts of hundreds of vials. The basis of the QuantiFly software is an algorithm which applies and improves upon an existing advanced pattern recognition and machine-learning routine. The accuracy of the baseline algorithm is additionally increased in this study through correction of bias observed in the algorithm output. The QuantiFly software, which includes the refined algorithm, has been designed to be immediately accessible to scientists through an intuitive and responsive user-friendly graphical interface. The software is also open-source, self-contained, has no dependencies and is easily installed (https://github.com/dwaithe/quantifly). Compared to manual egg counts made from digital images, QuantiFly achieved average accuracies of 94% and 85% for eggs laid on transparent (defined) and opaque (yeast-based) fly media. Thus, the software is capable of detecting experimental differences in most experimental situations. Significantly, the advanced feature recognition capabilities of the software proved to be robust to food surface artefacts like bubbles and crevices. The user experience involves image acquisition, algorithm training by labelling a subset of eggs in images of some of the vials, followed by a batch analysis mode in which new images are automatically assessed for egg numbers. Initial training typically requires approximately 10 minutes, while subsequent image evaluation by the software is performed in just a few seconds. Given the average time per vial for manual counting is approximately 40 seconds, our software introduces a timesaving advantage for experiments starting with as few as 20 vials. We also describe an optional acrylic box to be used as a digital camera mount and to provide controlled lighting during image acquisition which will guarantee the conditions used in this study. PMID:25992957

  15. Free and Open Source Software for Geospatial in the field of planetary science

    NASA Astrophysics Data System (ADS)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analyses has spread quickly in the last ten years. The availability of OpenData and data from collaborative mapping projects has increased interest in tools, procedures and methods to handle spatially-related information. Free and Open Source Software projects devoted to geospatial data handling are enjoying considerable success, as the use of interoperable formats and protocols allows the user to choose which pipeline of tools and libraries is needed to solve a particular task, adapting the software scene to a specific problem. In particular, the Free and Open Source model of development mimics the scientific method very well, and researchers should be naturally encouraged to take part in the development process of these software projects, as this represents a very agile way to interact among several institutions. When it comes to planetary sciences, geospatial Free and Open Source Software is gaining a key role in projects that commonly involve different subjects in an international scenario. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are distributed along with their source code, and the interaction between user and developer is often very close, creating a continuum between these two figures. A very widely used library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other popular data formats used in planetary science, such as the Vicar format. The use of Geographic Information Systems in planetary science is now widespread, and Free and Open Source GIS, open GIS formats and network protocols allow existing tools and methods developed to solve Earth-based problems to be extended to the study of solar system bodies. A day in the working life of a researcher using Free and Open Source Software for geospatial work will be presented, as well as the benefits, and solutions to the possible drawbacks, of the effort required to use, support and contribute to such projects.

  16. Combining Rosetta with molecular dynamics (MD): A benchmark of the MD-based ensemble protein design.

    PubMed

    Ludwiczak, Jan; Jarmula, Adam; Dunin-Horkawicz, Stanislaw

    2018-07-01

    Computational protein design is a set of procedures for computing amino acid sequences that will fold into a specified structure. Rosetta Design, a commonly used software suite for protein design, allows for the effective identification of sequences compatible with a given backbone structure, while molecular dynamics (MD) simulations can thoroughly sample near-native conformations. We benchmarked a procedure in which Rosetta Design is started on MD-derived structural ensembles and showed that such a combined approach generates 20-30% more diverse sequences than currently available methods, with only a slight increase in computation time. Importantly, the increase in diversity is achieved without a loss in the quality of the designed sequences, assessed by their resemblance to natural sequences. We demonstrate that the MD-based procedure is also applicable to de novo design tasks started from backbone structures without any sequence information. In addition, we implemented a protocol that can be used to assess the stability of designed models and to select the best candidates for experimental validation. In sum, our results demonstrate that MD ensemble-based flexible backbone design can be a viable method for protein design, especially for tasks that require a large pool of diverse sequences. Copyright © 2018 Elsevier Inc. All rights reserved.
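
    The ensemble procedure can be driven by a very small script. The sketch below is our illustration, not the authors' code: it assumes MD snapshots have already been exported as PDB files and that Rosetta's fixed-backbone design application fixbb is installed; the resfile and paths are placeholders:

      # Illustrative driver for MD-ensemble fixed-backbone design (not the authors' code).
      import glob
      import subprocess

      for snapshot in sorted(glob.glob("snapshots/snapshot_*.pdb")):
          subprocess.run(
              ["fixbb",
               "-s", snapshot,                # one MD-derived backbone conformation
               "-resfile", "design.resfile",  # positions allowed to mutate (placeholder)
               "-nstruct", "10",              # designed sequences per snapshot
               "-out:path:all", "designs/"],
              check=True,
          )
      # Pooling designs over many near-native backbones is what yields the
      # reported 20-30% gain in sequence diversity.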

  17. A Study on Software-based Sensing Technology for Multiple Object Control in AR Video

    PubMed Central

    Jung, Sungmo; Song, Jae-gu; Hwang, Dae-Joon; Ahn, Jae Young; Kim, Seoksoo

    2010-01-01

    Research on Augmented Reality (AR) has recently received much attention. With it, the Machine-to-Machine (M2M) market has started to become active, and there are numerous efforts to apply the technology to real life in all sectors of society. To date, the M2M market has applied the existing marker-based AR technology in entertainment, business and other industries. With the existing marker-based AR technology, a designated object can only be loaded on the screen from one marker, and a marker has to be added to load the same object on the screen again. The relevant marker therefore has to be extracted and printed on screen so that multiple objects can be loaded. However, since the distance between markers is not measured in the process of detecting and copying markers, the markers can overlap and the objects then fail to be augmented. To solve this problem, a circle having the longest radius needs to be created from the focal point of a marker to be copied, so that no object is copied within the confines of the circle. In this paper, software-based sensing technology for multiple object detection and loading using PPHT has been developed, and overlapping-marker control according to multiple object control has been studied using the Bresenham and Mean Shift algorithms. PMID:22163444
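
    The exclusion rule described above reduces to a distance test between a candidate copy position and each existing marker's focal point. A minimal sketch (our illustration, not the published implementation):

      # Minimal sketch of the exclusion rule: no object may be copied within the
      # circle of longest radius drawn around an existing marker's focal point.
      import math

      def can_place(candidate, markers, radius):
          """candidate: (x, y) for the copy; markers: list of focal points (x, y)."""
          return all(math.hypot(candidate[0] - x, candidate[1] - y) >= radius
                     for (x, y) in markers)

      print(can_place((120, 80), [(40, 40), (200, 60)], radius=30))   # True: safe to copy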

  18. Using artificial neural networks to model aluminium based sheet forming processes and tools details

    NASA Astrophysics Data System (ADS)

    Mekras, N.

    2017-09-01

    In this paper, a methodology and a software system are presented for using Artificial Neural Networks (ANNs) to model aluminium-based sheet forming processes. ANN models are created by training the networks on experimental, trial and historical data records of process inputs and outputs. ANN models are useful in cases where mathematical models of the process are not accurate enough, are not well defined or are missing, e.g. for complex product shapes, new material alloys, new process requirements, micro-scale products, etc. Usually, after the design and modeling of the forming tools (die, punch, etc.) and before mass production, a set of trials takes place on the shop floor to finalize process and tool details such as the tools' minimum radii, die/punch clearance, press speed and process temperature, in relation to the material type, the sheet thickness and the quality achieved in the trials. Using data from the shop-floor trials and from forming theory, ANN models can be trained and used to estimate final process and tool details, hence supporting efficient set-up of processes and tools before mass production starts. The proposed ANN methodology and the respective software system are implemented within the EU H2020 project LoCoMaTech for the aluminium-based sheet forming process HFQ (solution Heat treatment, cold die Forming and Quenching).
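
    As an illustration of the training step (ours, not the project's system), a small feed-forward network can be fitted to trial records mapping process inputs to tool and process details; all numbers and column meanings below are invented placeholders:

      # Illustrative ANN fit on shop-floor trial records (placeholder data).
      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.preprocessing import StandardScaler

      # Inputs per trial: [sheet thickness (mm), temperature (C), press speed (mm/s)]
      X = np.array([[1.5, 450, 50], [2.0, 470, 60], [1.2, 430, 40], [1.8, 460, 55]])
      # Outputs per trial: [minimum tool radius (mm), die/punch clearance (mm)]
      y = np.array([[4.0, 1.65], [5.2, 2.20], [3.4, 1.30], [4.7, 1.95]])

      scaler = StandardScaler().fit(X)
      model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                           random_state=0).fit(scaler.transform(X), y)
      print(model.predict(scaler.transform([[1.6, 455, 52]])))   # estimated details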

  19. A study on software-based sensing technology for multiple object control in AR video.

    PubMed

    Jung, Sungmo; Song, Jae-Gu; Hwang, Dae-Joon; Ahn, Jae Young; Kim, Seoksoo

    2010-01-01

    Research on Augmented Reality (AR) has recently received much attention. With it, the Machine-to-Machine (M2M) market has started to become active, and there are numerous efforts to apply the technology to real life in all sectors of society. To date, the M2M market has applied the existing marker-based AR technology in entertainment, business and other industries. With the existing marker-based AR technology, a designated object can only be loaded on the screen from one marker, and a marker has to be added to load the same object on the screen again. The relevant marker therefore has to be extracted and printed on screen so that multiple objects can be loaded. However, since the distance between markers is not measured in the process of detecting and copying markers, the markers can overlap and the objects then fail to be augmented. To solve this problem, a circle having the longest radius needs to be created from the focal point of a marker to be copied, so that no object is copied within the confines of the circle. In this paper, software-based sensing technology for multiple object detection and loading using PPHT has been developed, and overlapping-marker control according to multiple object control has been studied using the Bresenham and Mean Shift algorithms.

  20. Requirement Metrics for Risk Identification

    NASA Technical Reports Server (NTRS)

    Hammer, Theodore; Huffman, Lenore; Wilson, William; Rosenberg, Linda; Hyatt, Lawrence

    1996-01-01

    The Software Assurance Technology Center (SATC) is part of the Office of Mission Assurance of the Goddard Space Flight Center (GSFC). The SATC's mission is to assist National Aeronautics and Space Administration (NASA) projects in improving the quality of software which they acquire or develop. The SATC's efforts are currently focused on the development and use of metric methodologies and tools that identify and assess risks associated with software performance and scheduled delivery. This starts at the requirements phase, where the SATC, in conjunction with software projects at GSFC and other NASA centers, is working to identify tools and metric methodologies to assist project managers in identifying and mitigating risks. This paper discusses requirement metrics currently being used at NASA in a collaborative effort between the SATC and the Quality Assurance Office at GSFC to utilize the information available through the application of requirements management tools.

  1. Proceedings of the Twenty-Fourth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    2000-01-01

    On December 1 and 2, the Software Engineering Laboratory (SEL), a consortium composed of NASA/Goddard, the University of Maryland, and CSC, held the 24th Software Engineering Workshop (SEW), the last of the millennium. Approximately 240 people attended the 2-day workshop. Day 1 was composed of four sessions: International Influence of the Software Engineering Laboratory; Object Oriented Testing and Reading; Software Process Improvement; and Space Software. For the first session, three internationally known software process experts discussed the influence of the SEL with respect to software engineering research. In the Space Software session, prominent representatives from three different NASA sites- GSFC's Marti Szczur, the Jet Propulsion Laboratory's Rick Doyle, and the Ames Research Center IV&V Facility's Lou Blazy- discussed the future of space software in their respective centers. At the end of the first day, the SEW sponsored a reception at the GSFC Visitors' Center. Day 2 also provided four sessions: Using the Experience Factory; A panel discussion entitled "Software Past, Present, and Future: Views from Government, Industry, and Academia"; Inspections; and COTS. The day started with an excellent talk by CSC's Frank McGarry on "Attaining Level 5 in CMM Process Maturity." Session 2, the panel discussion on software, featured NASA Chief Information Officer Lee Holcomb (Government), our own Jerry Page (Industry), and Mike Evangelist of the National Science Foundation (Academia). Each presented his perspective on the most important developments in software in the past 10 years, in the present, and in the future.

  2. Status of the DIRAC Project

    NASA Astrophysics Data System (ADS)

    Casajus, A.; Ciba, K.; Fernandez, V.; Graciani, R.; Hamar, V.; Mendez, V.; Poss, S.; Sapunov, M.; Stagni, F.; Tsaregorodtsev, A.; Ubeda, M.

    2012-12-01

    The DIRAC Project was initiated to provide a data processing system for the LHCb Experiment at CERN. It provides all the necessary functionality and performance to satisfy the current and projected future requirements of the LHCb Computing Model. A considerable restructuring of the DIRAC software was undertaken in order to turn it into a general-purpose framework for building distributed computing systems that can be used by various user communities in High Energy Physics and other scientific application domains. The CLIC and ILC-SID detector projects started to use DIRAC for their data production systems. The Belle Collaboration at KEK, Japan, has adopted a Computing Model based on the DIRAC system for its second phase starting in 2015. The CTA Collaboration uses DIRAC for data analysis tasks. A large number of other experiments are starting to use DIRAC or are evaluating this solution for their data processing tasks. DIRAC services are included as part of the production infrastructure of the GISELA Latin America grid. Similar services are provided for the users of the France-Grilles and IBERGrid National Grid Initiatives in France and Spain, respectively. The new communities using DIRAC have started to provide important contributions to its functionality. Recent additions include support for Amazon EC2 computing resources as well as other cloud management systems; a versatile File Replica Catalog with file metadata capabilities; and support for running MPI jobs in the pilot-based Workload Management System. Integration with existing application web portals, like WS-PGRADE, is demonstrated. In this paper we describe the current status of the DIRAC Project, recent developments of its framework and functionality, as well as the status of the rapidly evolving community of DIRAC users.

  3. Computerized Liquid Crystal Phase Identification by Neural Networks Analysis of Polarizing Microscopy Textures

    NASA Astrophysics Data System (ADS)

    Karaszi, Zoltan; Konya, Andrew; Dragan, Feodor; Jakli, Antal; CPIP/LCI; CS Dept. of Kent State University Collaboration

    Polarizing optical microscopy (POM) is traditionally the best-established method of studying liquid crystals; its use started already with Otto Lehmann in 1890. An expert who is familiar with the optics of anisotropic materials and the typical textures of liquid crystals can identify phases with relatively high confidence. However, for unambiguous identification other expensive and time-consuming experiments are usually needed. Replacement of the subjective, qualitative, human-eye-based liquid crystal texture analysis with quantitative computerized image analysis started only recently, and such techniques have been used to enhance the detection of smooth phase transitions and to determine the order parameter and birefringence of specific liquid crystal phases. We investigate whether the computer can recognize and name the phase in which a texture was taken. To judge the potential of reliable image recognition based on this procedure, we used 871 images of liquid crystal textures belonging to five main categories: Nematic, Smectic A, Smectic C, Cholesteric and Crystal, and used the neural network clustering technique included in the Java data-mining software package WEKA. A neural network trained on a set of 827 LC textures classified the remaining 44 textures with 80% accuracy.
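
    The classification step itself is conventional supervised learning. A minimal Python analogue of the WEKA workflow (ours, with placeholder feature vectors standing in for the real texture images) mirrors the 827/44 train/test split:

      # Minimal analogue of the texture-classification step (illustrative only).
      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPClassifier

      PHASES = ["Nematic", "Smectic A", "Smectic C", "Cholesteric", "Crystal"]
      X = np.random.rand(871, 256)                 # placeholder texture feature vectors
      y = np.random.randint(0, len(PHASES), 871)   # placeholder phase labels

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=44, random_state=0)
      clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(X_tr, y_tr)
      print("held-out accuracy:", clf.score(X_te, y_te))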

  4. The software analysis project for the Office of Human Resources

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use, with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for the retirement planning computer program entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow, with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required, given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil service employee with several years until retirement. The employee enters current salary and savings information as well as goals concerning salary at retirement, assumptions on inflation, and the return on investments. The program produces a picture of the employee's retirement income from all sources, based on the assumptions entered. A session showing features of the program was conducted for key personnel at the Center. After analysis, it was decided to offer the program through the Learning Center starting in August 1994.

  5. Time-marching multi-grid seismic tomography

    NASA Astrophysics Data System (ADS)

    Tong, P.; Yang, D.; Liu, Q.

    2016-12-01

    From classic ray-based traveltime tomography to state-of-the-art full waveform inversion, the nonlinearity of seismic inverse problems makes a good starting model essential for preventing the objective function from converging toward local minima. With a focus on building high-accuracy starting models, we propose the so-called time-marching multi-grid seismic tomography method in this study. The new seismic tomography scheme consists of a temporal time-marching approach and a spatial multi-grid strategy. We first divide the recording period of the seismic data into a series of time windows. Sequentially, the subsurface properties in each time window are iteratively updated starting from the final model of the previous time window. There are at least two advantages of the time-marching approach: (1) the information included in the seismic data of previous time windows is exploited to build the starting models of later time windows; (2) seismic data of later time windows can provide extra information to refine the subsurface images. Within each time window, we use a multi-grid method to decompose the scale of the inverse problem. Specifically, the unknowns of the inverse problem are first sampled on a coarse mesh to capture the macro-scale structure of the subsurface. Because of the low dimensionality, it is much easier to reach the global minimum on a coarse mesh. After that, finer meshes are introduced to recover the micro-scale properties. That is to say, the subsurface model is iteratively updated on multiple grids in every time window. We expect high-accuracy starting models to be generated for the second and later time windows. We will test this time-marching multi-grid method using our newly developed eikonal-based traveltime tomography software package tomoQuake. Real application results for the 2016 Kumamoto earthquake (Mw 7.0) region in Japan will be demonstrated.
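
    In outline, the scheme is a doubly nested loop over time windows and mesh spacings. The Python sketch below (ours, not tomoQuake) shows only the control flow, with the actual tomographic update left abstract:

      # Control-flow sketch of time-marching multi-grid tomography (illustrative;
      # invert_on_grid stands in for the eikonal-based traveltime update).
      def invert_on_grid(model, data, grid_spacing):
          ...   # iterative tomographic update on a mesh of the given spacing
          return model

      def time_marching_multigrid(windows, initial_model, spacings=(40.0, 20.0, 10.0)):
          """windows: per-time-window data; spacings: coarse -> fine mesh sizes."""
          model = initial_model
          for data in windows:        # time-marching over recording windows
              for h in spacings:      # multi-grid: coarse captures the macro scale,
                  model = invert_on_grid(model, data, h)   # finer meshes add detail
              # the final model of this window starts the next window
          return model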

  6. (abstract) Formal Inspection Technology Transfer Program

    NASA Technical Reports Server (NTRS)

    Welz, Linda A.; Kelly, John C.

    1993-01-01

    A Formal Inspection Technology Transfer Program, based on the inspection process developed by Michael Fagan at IBM, has been developed at JPL. The goal of this program is to support organizations wishing to use Formal Inspections to improve the quality of software and system level engineering products. The Technology Transfer Program provides start-up materials and assistance to help organizations establish their own Formal Inspection program. The course materials and certified instructors associated with the Technology Transfer Program have proven to be effective in classes taught at other NASA centers as well as at JPL. Formal Inspections (NASA tailored Fagan Inspections) are a set of technical reviews whose objective is to increase quality and reduce the cost of software development by detecting and correcting errors early. A primary feature of inspections is the removal of engineering errors before they amplify into larger and more costly problems downstream in the development process. Note that the word 'inspection' is used differently in software than in a manufacturing context. A Formal Inspection is a front-end quality enhancement technique, rather than a task conducted just prior to product shipment for the purpose of sorting defective systems (manufacturing usage). Formal Inspections are supporting and in agreement with the 'total quality' approach being adopted by many NASA centers.

  7. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    PubMed Central

    2011-01-01

    Background Since its inception, proteomics has essentially operated in a discovery mode with the goal of identifying and quantifying the maximal number of proteins in a sample. Increasingly, proteomic measurements are also supporting hypothesis-driven studies, in which a predetermined set of proteins is consistently detected and quantified in multiple samples. Selected reaction monitoring (SRM) is a targeted mass spectrometric technique that supports the detection and quantification of specific proteins in complex samples at high sensitivity and reproducibility. Here, we describe ATAQS, an integrated software platform that supports all stages of targeted, SRM-based proteomics experiments including target selection, transition optimization and post-acquisition data analysis. This software will significantly facilitate the use of targeted proteomic techniques and contribute to the generation of highly sensitive, reproducible and complete datasets that are particularly critical for the discovery and validation of targets in hypothesis-driven studies in systems biology. Result We introduce a new open source software pipeline, ATAQS (Automated and Targeted Analysis with Quantitative SRM), which consists of a number of modules that collectively support the SRM assay development workflow for targeted proteomic experiments (project management, generation of proteins, peptides and transitions, and validation of peptide detection by SRM). ATAQS provides a flexible pipeline for end-users by allowing the workflow to start or end at any point of the pipeline, and for computational biologists, by enabling the easy extension of Java algorithm classes for their own algorithm plug-in or connection via an external web site. This integrated system supports all steps in an SRM-based experiment and provides a user-friendly GUI that can be run by any operating system that allows the installation of the Mozilla Firefox web browser. Conclusions Targeted proteomics via SRM is a powerful new technique that enables the reproducible and accurate identification and quantification of sets of proteins of interest. ATAQS is the first open-source software that supports all steps of the targeted proteomics workflow. ATAQS also provides software API (Application Program Interface) documentation that enables the addition of new algorithms to each of the workflow steps. The software, installation guide and sample dataset can be found in http://tools.proteomecenter.org/ATAQS/ATAQS.html PMID:21414234

  8. An efficient, modular and simple tape archiving solution for LHC Run-3

    NASA Astrophysics Data System (ADS)

    Murray, S.; Bahyl, V.; Cancio, G.; Cano, E.; Kotlyar, V.; Kruse, D. F.; Leduc, J.

    2017-10-01

    The IT Storage group at CERN develops the software responsible for archiving to tape the custodial copy of the physics data generated by the LHC experiments. Physics run 3 will start in 2021 and will introduce two major challenges for which the tape archive software must be evolved. Firstly the software will need to make more efficient use of tape drives in order to sustain the predicted data rate of 150 petabytes per year as opposed to the current 50 petabytes per year. Secondly the software will need to be seamlessly integrated with EOS, which has become the de facto disk storage system provided by the IT Storage group for physics data. The tape storage software for LHC physics run 3 is code named CTA (the CERN Tape Archive). This paper describes how CTA will introduce a pre-emptive drive scheduler to use tape drives more efficiently, will encapsulate all tape software into a single module that will sit behind one or more EOS systems, and will be simpler by dropping support for obsolete backwards compatibility.

  9. ACTS: from ATLAS software towards a common track reconstruction software

    NASA Astrophysics Data System (ADS)

    Gumpert, C.; Salzburger, A.; Kiehn, M.; Hrdinka, J.; Calace, N.; ATLAS Collaboration

    2017-10-01

    Reconstruction of charged particles’ trajectories is a crucial task for most particle physics experiments. The high instantaneous luminosity achieved at the LHC leads to a high number of proton-proton collisions per bunch crossing, which has put the track reconstruction software of the LHC experiments through a thorough test. Preserving track reconstruction performance under increasingly difficult experimental conditions, while keeping the usage of computational resources at a reasonable level, is an inherent problem for many HEP experiments. Exploiting concurrent algorithms and using multivariate techniques for track identification are the primary strategies to achieve that goal. Starting from current ATLAS software, the ACTS project aims to encapsulate track reconstruction software into a generic, framework- and experiment-independent software package. It provides a set of high-level algorithms and data structures for performing track reconstruction tasks as well as fast track simulation. The software is developed with special emphasis on thread-safety to support parallel execution of the code and data structures are optimised for vectorisation to speed up linear algebra operations. The implementation is agnostic to the details of the detection technologies and magnetic field configuration which makes it applicable to many different experiments.

  10. PharmTeX: a LaTeX-Based Open-Source Platform for Automated Reporting Workflow.

    PubMed

    Rasmussen, Christian Hove; Smith, Mike K; Ito, Kaori; Sundararajan, Vijayakumar; Magnusson, Mats O; Niclas Jonsson, E; Fostvedt, Luke; Burger, Paula; McFadyen, Lynn; Tensfeldt, Thomas G; Nicholas, Timothy

    2018-03-16

    Every year, the pharmaceutical industry generates a large number of scientific reports related to drug research, development, and regulatory submissions. Many of these reports are created using text processing tools such as Microsoft Word. Given the large number of figures, tables, references, and other elements, this is often a tedious task involving hours of copying and pasting and substantial efforts in quality control (QC). In the present article, we present the LaTeX-based open-source reporting platform, PharmTeX, a community-based effort to make reporting simple, reproducible, and user-friendly. The PharmTeX creators put substantial effort into simplifying the sometimes complex elements of LaTeX into user-friendly functions that rely on advanced LaTeX and Perl code running in the background. This setup makes LaTeX much more accessible to users with no prior LaTeX experience. A software collection was compiled for users not wanting to manually install the required software components. The PharmTeX templates allow for the inclusion of tables directly from mathematical software output, as well as figures in several formats. Code listings can be included directly from source. No previous experience and only a few hours of training are required to start writing reports using PharmTeX. PharmTeX significantly reduces the time required for creating a scientific report fully compliant with regulatory and industry expectations. QC is made much simpler, since there is a direct link between analysis output and report input. PharmTeX makes available to report authors the strengths of LaTeX document processing without the need for extensive training.

  11. A CSP-Based Agent Modeling Framework for the Cougaar Agent-Based Architecture

    NASA Technical Reports Server (NTRS)

    Gracanin, Denis; Singh, H. Lally; Eltoweissy, Mohamed; Hinchey, Michael G.; Bohner, Shawn A.

    2005-01-01

    Cognitive Agent Architecture (Cougaar) is a Java-based architecture for large-scale distributed agent-based applications. A Cougaar agent is an autonomous software entity with behaviors that represent a real-world entity (e.g., a business process). A Cougaar-based Model Driven Architecture approach, currently under development, uses a description of the system's functionality (requirements) to automatically implement the system in Cougaar. The Communicating Sequential Processes (CSP) formalism is used for the formal validation of the generated system. Two main agent components, a blackboard and a plugin, are modeled as CSP processes. A set of channels represents communications between the blackboard and individual plugins. The blackboard is represented as a CSP process that communicates with every agent in the collection. The developed CSP-based Cougaar modeling framework provides a starting point for a more complete formal verification of the automatically generated Cougaar code. Currently it is used to verify the behavior of an individual agent in terms of CSP properties and to analyze the corresponding Cougaar society.

  12. Analysis of capacity and traffic operations impacts of the World Trade Bridge in Laredo

    DOT National Transportation Integrated Search

    2001-07-01

    Project 0-1800 pioneered the use of modern micro-simulation software to analyze the complex procedures involved in international border crossings. The animated models simulate the entire southbound commercial traffic flow, starting with U.S. Customs ...

  13. Laboratory Computing Resource Center

    Science.gov Websites

    Website of the Laboratory Computing Resource Center, with sections on systems, computing and data resources, purchasing resources, future plans, getting started, using LCRC software, best practices and policies, and getting help and support, along with the latest announcements.

  14. Integration of M&S (Modeling and Simulation), Software Design and DoDAF (Department of Defense Architecture Framework (RT 24)

    DTIC Science & Technology

    2012-04-09

    Only front-matter fragments of this report survived extraction; the table of contents indicates coverage of mappings between BPMN, SysML, and Arena; capabilities, activities, resources, and performers; a proof of concept; a BPMN 2.0 XML to Arena converter; and a BPMN 2.0 XML StartEvent excerpt.

  15. Antenna Controller Replacement Software

    NASA Technical Reports Server (NTRS)

    Chao, Roger Y.; Morgan, Scott C.; Strain, Martha M.; Rockwell, Stephen T.; Shimizu, Kenneth J.; Tehrani, Barzia J.; Kwok, Jaclyn H.; Tuazon-Wong, Michelle; Valtier, Henry; Nalbandi, Reza; hide

    2010-01-01

    The Antenna Controller Replacement (ACR) software accurately points and monitors the Deep Space Network (DSN) 70-m and 34-m high-efficiency (HEF) ground-based antennas that are used to track primarily spacecraft and, periodically, celestial targets. To track a spacecraft, or other targets, the antenna must be accurately pointed at the spacecraft, which can be very far away with very weak signals. ACR's conical scanning capability collects the signal in a circular pattern around the target, calculates the location of the strongest signal, and adjusts the antenna pointing to point directly at the spacecraft. A real-time, closed-loop servo control algorithm executed every 0.02 second allows accurate positioning of the antenna in order to track these distant spacecraft. Additionally, this advanced servo control algorithm provides better antenna-pointing performance in windy conditions. The ACR software provides high-level commands that give the DSN operator a very easy user interface. The operator only needs to enter two commands to start antenna, subreflector, and Master Equatorial tracking. The most accurate antenna pointing is accomplished by aligning the antenna to the Master Equatorial, which, because of its small size and sheltered location, has the most stable pointing. The antenna has hundreds of digital and analog monitor points. The ACR software provides compact displays to summarize the status of the antenna, subreflector, and the Master Equatorial. The ACR software has two major functions. First, it performs all of the steps required to accurately point the antenna (and subreflector and Master Equatorial) at the spacecraft (or celestial target). This involves controlling the antenna/subreflector/Master Equatorial hardware, initiating and monitoring the correct sequence of operations, calculating the position of the spacecraft relative to the antenna, executing the real-time servo control algorithm to maintain the correct position, and monitoring tracking performance.
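
    Conical-scan pointing correction can be illustrated compactly: the boresight is modulated in a circle and the offset toward the target is recovered from the first harmonic of the received power. A minimal sketch under simplifying assumptions (ours, not the ACR flight code; the gain slope is instrument-specific):

      # Illustrative conical-scan offset estimator (not the ACR flight code).
      import numpy as np

      def conscan_offset(angles, power, slope):
          """angles: scan positions (rad); power: received-power samples;
          slope: assumed linear gain slope vs. offset."""
          c = np.mean(power * np.cos(angles))
          s = np.mean(power * np.sin(angles))
          return 2 * c / slope, 2 * s / slope   # first harmonic points at the target

      angles = np.linspace(0, 2 * np.pi, 50, endpoint=False)  # one 0.02 s cycle per sample
      true_dx, true_dy = 0.003, -0.001                        # made-up offsets (deg)
      power = 1.0 + 0.5 * (true_dx * np.cos(angles) + true_dy * np.sin(angles))
      print(conscan_offset(angles, power, slope=0.5))         # ~ (0.003, -0.001)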

  16. Analyzing and designing object-oriented missile simulations with concurrency

    NASA Astrophysics Data System (ADS)

    Randorf, Jeffrey Allen

    2000-11-01

    A software object model for the six degree-of-freedom missile modeling domain is presented. As a precursor, a domain analysis of the missile modeling domain was started, based on the Feature-Oriented Domain Analysis (FODA) technique described by the Software Engineering Institute (SEI). It was subsequently determined that the FODA methodology is functionally equivalent to the Object Modeling Technique. The analysis used legacy software documentation and code from the ENDOSIM, KDEC, and TFrames 6-DOF modeling tools, along with other technical literature. The SEI Object Connection Architecture (OCA) was the template for designing the object model. Three variants of the OCA were considered: a reference structure, a recursive structure, and a reference structure augmented for flight vehicle modeling. The reference OCA design option was chosen to maintain simplicity without compromising the expressive power of the OMT model. The missile architecture was then analyzed for potential areas of concurrent computing. It was shown how protected objects could be used for data passing between OCA object managers, allowing concurrent access without changing the OCA reference design intent or structure. The implementation language was the 1995 release of Ada, and it was shown how OCA software components can be expressed as Ada child packages. While acceleration of several low-level and higher-level operations is possible on suitable hardware, there was a 33% degradation in the performance of a 4th-order Runge-Kutta integrator solving two simultaneous ordinary differential equations when using Ada tasking on a single-processor machine. The Defense Department's High Level Architecture (HLA) was introduced and explained in context with the OCA. It was shown that the HLA and OCA are not mutually exclusive but complementary architectures: HLA provides an interoperability solution, with the OCA as an architectural vehicle for software reuse. Further directions for implementing a 6-DOF missile modeling environment are discussed.
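
    The integrator in question is the standard 4th-order Runge-Kutta scheme; for reference, a compact version for two coupled first-order equations (the configuration used in the tasking experiment), written here in Python rather than Ada:

      # Classic RK4 step for a system of first-order ODEs.
      import numpy as np

      def rk4_step(f, t, y, h):
          k1 = f(t, y)
          k2 = f(t + h / 2, y + h / 2 * k1)
          k3 = f(t + h / 2, y + h / 2 * k2)
          k4 = f(t + h, y + h * k3)
          return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

      # Example: a harmonic oscillator written as two simultaneous equations.
      f = lambda t, y: np.array([y[1], -y[0]])
      y, t, h = np.array([1.0, 0.0]), 0.0, 0.01
      for _ in range(628):              # integrate roughly one period (2*pi)
          y = rk4_step(f, t, y, h)
          t += h
      print(y)                          # close to the initial state [1, 0]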

  17. It’s about time: How do sky surveys manage uncertainty about scientific needs many years into the future

    NASA Astrophysics Data System (ADS)

    Darch, Peter T.; Sands, Ashley E.

    2016-06-01

    Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. LSST was conceived in the 1990s; the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about the future needs of the astronomers who will use these data many years hence. Sources of uncertainty include the scientific questions to be posed, the astronomical phenomena to be studied, and the tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers' data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs. In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on the reuse potential of software and on enhancing the replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.

  18. HyperCard and Other Macintosh Applications in Astronomy Education

    NASA Astrophysics Data System (ADS)

    Meisel, D.

    1992-12-01

    For the past six years, Macintosh computers have been used in introductory astronomy classes and laboratories with HyperCard and other commercial Macintosh software. I will review some of the available software that has been found particularly useful in undergraduate situations. The review will start with HyperCard (a programmable "index card" system) since it is a mature multimedia platform for the Macintosh. Experiences with the Voyager, the TS-24, MathCad, NIH Image, and other programs as used by the author and George Mumford (Tufts University) in courses and workshops will be described.

  19. ATLAS@AWS

    NASA Astrophysics Data System (ADS)

    Gehrcke, Jan-Philip; Kluth, Stefan; Stonjek, Stefan

    2010-04-01

    We show how the ATLAS offline software is ported to the Amazon Elastic Compute Cloud (EC2). We prepare an Amazon Machine Image (AMI) on the basis of the standard ATLAS platform, Scientific Linux 4 (SL4). An instance of the SL4 AMI is then started on EC2, and we install and validate a recent release of the ATLAS offline software distribution kit. The installed software is archived as an image on the Amazon Simple Storage Service (S3) and can be quickly retrieved and connected to new SL4 AMI instances using the Amazon Elastic Block Store (EBS). ATLAS jobs can then configure against the release kit using the ATLAS configuration management tool (cmt) in the standard way. The output of jobs is exported to S3 before the SL4 AMI is terminated. Job status information is transferred to the Amazon SimpleDB service. The whole process of launching instances of our AMI, starting, monitoring and stopping jobs, and retrieving job output from S3 is controlled from a client machine using Python scripts implementing the Amazon EC2/S3 API via the boto library, working together with small scripts embedded in the SL4 AMI. We report our experience with setting up and operating the system using standard ATLAS job transforms.
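
    The client-side control flow can be sketched in a few lines against the boto 2-era API named in the abstract; the AMI ID, bucket and key names below are placeholders, not the project's values:

      # Stripped-down sketch of the client-side control flow (boto 2-style API).
      import time
      import boto

      ec2 = boto.connect_ec2()                      # credentials from the environment
      res = ec2.run_instances("ami-12345678",       # placeholder ID of the prepared AMI
                              instance_type="m1.large")
      instance = res.instances[0]
      while instance.state != "running":            # wait for the node to come up
          time.sleep(5)
          instance.update()

      # ... the job runs and exports its output to S3 before the AMI is terminated ...
      s3 = boto.connect_s3()
      key = s3.get_bucket("atlas-job-output").get_key("job42/AOD.pool.root")
      key.get_contents_to_filename("AOD.pool.root")  # retrieve job output locally

      ec2.terminate_instances([instance.id])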

  20. Experience Paper: Software Engineering and Community Codes Track in ATPESC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubey, Anshu; Riley, Katherine M.

    The Argonne Training Program in Extreme Scale Computing (ATPESC) was started by Argonne National Laboratory with the objective of expanding the ranks of better-prepared users of high performance computing (HPC) machines. One of the unique aspects of the program was the inclusion of a software engineering and community codes track. The inclusion was motivated by the observation that projects with a good scientific and software process were better able to meet their scientific goals. In this paper we present our experience of running the software track from the beginning of the program until now. We discuss the motivations, the reception, and the evolution of the track over the years. We welcome discussion and input from the community to enhance the track in ATPESC, and also to facilitate the inclusion of similar tracks in other HPC-oriented training programs.

  1. An embedded multi-core parallel model for real-time stereo imaging

    NASA Astrophysics Data System (ADS)

    He, Wenjing; Hu, Jian; Niu, Jingyu; Li, Chuanrong; Liu, Guangyu

    2018-04-01

    Real-time processing based on embedded systems will enhance the application capability of stereo imaging for LiDAR and hyperspectral sensors. Research on task partitioning and scheduling strategies for embedded multiprocessor systems started relatively late compared with that for PC computers. In this paper, a parallel model for stereo imaging on an embedded multi-core processing platform is studied and verified. After analyzing the computing load, throughput capacity and buffering requirements, a two-stage pipeline parallel model based on message transmission is established. This model can be applied to fast stereo imaging for airborne sensors with various characteristics. To demonstrate the feasibility and effectiveness of the parallel model, parallel software was designed using test-flight data, based on the 8-core DSP processor TMS320C6678. The results indicate that the design performed well in workload distribution and achieved a speed-up ratio of up to 6.4.
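
    The two-stage, message-passing pipeline maps naturally onto bounded queues between workers. The following Python sketch shows the pattern only (ours; the paper's implementation runs on the 8-core DSP, not in Python):

      # Pattern sketch of a two-stage, message-passing image pipeline.
      from multiprocessing import Process, Queue

      def stage1(inbox, outbox):                    # e.g. preprocessing/rectification
          for frame in iter(inbox.get, None):
              outbox.put(("rectified", frame))
          outbox.put(None)                          # propagate end-of-stream

      def stage2(inbox):                            # e.g. matching/geolocation
          for msg in iter(inbox.get, None):
              pass                                  # consume and process each message

      if __name__ == "__main__":
          q1, q2 = Queue(maxsize=8), Queue(maxsize=8)   # bounded: models buffer limits
          workers = [Process(target=stage1, args=(q1, q2)),
                     Process(target=stage2, args=(q2,))]
          for w in workers:
              w.start()
          for frame in range(100):                  # frames from the sensor stream
              q1.put(frame)
          q1.put(None)                              # end-of-stream marker
          for w in workers:
              w.join()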

  2. RRAM-based hardware implementations of artificial neural networks: progress update and challenges ahead

    NASA Astrophysics Data System (ADS)

    Prezioso, M.; Merrikh-Bayat, F.; Chakrabarti, B.; Strukov, D.

    2016-02-01

    Artificial neural networks have been receiving increasing attention due to their superior performance in many information processing tasks. Typically, scaling up the size of the network results in better performance and richer functionality. However, large neural networks are challenging to implement in software, and customized hardware is generally required for their practical implementation. In this work, we will discuss our group's recent efforts on the development of such custom hardware circuits, based on hybrid CMOS/memristor circuits, in particular of the CMOL variety. We will start by reviewing the basics of memristive devices and of CMOL circuits. We will then discuss our recent progress towards the demonstration of hybrid circuits, focusing on experimental and theoretical results for artificial neural networks based on crossbar-integrated metal oxide memristors. We will conclude the presentation with a discussion of the remaining challenges and the most pressing research needs.

  3. Lift-Off: Using Reference Imagery and Freehand Sketching to Create 3D Models in VR.

    PubMed

    Jackson, Bret; Keefe, Daniel F

    2016-04-01

    Three-dimensional modeling has long been regarded as an ideal application for virtual reality (VR), but current VR-based 3D modeling tools suffer from two problems that limit creativity and applicability: (1) the lack of control for freehand modeling, and (2) the difficulty of starting from scratch. To address these challenges, we present Lift-Off, an immersive 3D interface for creating complex models with a controlled, handcrafted style. Artists start outside of VR with 2D sketches, which are then imported and positioned in VR. Then, using a VR interface built on top of image processing algorithms, 2D curves within the sketches are selected interactively and "lifted" into space to create a 3D scaffolding for the model. Finally, artists sweep surfaces along these curves to create 3D models. Evaluations are presented for both long-term users and for novices who each created a 3D sailboat model from the same starting sketch. Qualitative results are positive, with the visual style of the resulting models of animals and other organic subjects as well as architectural models matching what is possible with traditional fine art media. In addition, quantitative data from logging features built into the software are used to characterize typical tool use and suggest areas for further refinement of the interface.

  4. Water masers in the Kronian system

    NASA Astrophysics Data System (ADS)

    Pogrebenko, Sergei V.; Gurvits, Leonid I.; Elitzur, Moshe; Cosmovici, Cristiano B.; Avruch, Ian M.; Pluchino, Salvatore; Montebugnoli, Stelio; Salerno, Emma; Maccaferri, Giuseppe; Mujunen, Ari; Ritakari, Jouko; Molera, Guifre; Wagner, Jan; Uunila, Minttu; Cimo, Giuseppe; Schilliro, Francesco; Bartolini, Marco

    The presence of water has long been considered a key condition for life in planetary environments. The Cassini mission discovered water vapour in the Kronian system by detecting absorption of UV emission from a background star (Hansen et al. 2006). Prompted by this discovery, we started an observational campaign to search for another manifestation of water vapour in the Kronian system: its maser emission at a frequency of 22 GHz (1.35 cm wavelength). Observations with the 32 m Medicina radio telescope (INAF-IRA, Italy) started in 2006 using Mk5A data recording and the JIVE-Huygens software correlator. Later on, an on-line spectrometer was used at Medicina. The 14 m Metsähovi radio telescope (TKK-MRO, Finland) joined the observational campaign in 2008 using a locally developed data capture unit and a software spectrometer. More than 300 hours of observations were collected in the 2006-2008 campaign with the two radio telescopes. The data were analysed at JIVE using the Doppler tracking technique to compensate the observed spectra for the radial Doppler shift of various bodies in the Kronian system (Pogrebenko et al. 2009). Here we report the observational results for Hyperion, Titan, Enceladus and Atlas, and their physical interpretation. Encouraged by these results, we started a campaign of follow-up observations including other radio telescopes.
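
    The Doppler tracking step amounts to shifting each spectrum by the radial velocity predicted for the target body; to first order in v/c, for the 22 GHz water line:

      # First-order Doppler tracking of the 22 GHz water maser line (illustrative).
      C = 299792.458       # speed of light, km/s
      F_REST = 22.23508    # rest frequency of the water maser line, GHz

      def topocentric_frequency(v_radial_km_s):
          """Predicted sky frequency for a body receding at v_radial (km/s)."""
          return F_REST * (1.0 - v_radial_km_s / C)

      print(topocentric_frequency(20.0))   # e.g. 20 km/s recession -> ~22.2336 GHz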

  5. What software tools can I use to view ERBE HDF data products?

    Atmospheric Science Data Center

    2014-12-08

    Visualize ERBE data with view_hdf, a visualization and analysis tool for accessing data stored in Hierarchical Data Format (HDF) and HDF-EOS. To view a file: start HDFView, select File, select Open, and select the file to be viewed. See also: ERBE Data Access.

  6. Installing and Setting Up Git Software Tool on Windows | High-Performance

    Science.gov Websites

    Instructions for installing and setting up the Git software tool on Windows for High-Performance Computing users. The walkthrough assumes work is kept in a "projects" folder, that the Git Bash options are chosen during installation, and that users will start off using the GUI.

  7. Home Energy Saver

    Science.gov Websites

    Press page for the Home Energy Saver, energy management software whose website computes possible savings for homeowners. For members of the media, it gathers press materials issued by Berkeley Lab.

  8. Porting and refurbishment of the WSS TNG control software

    NASA Astrophysics Data System (ADS)

    Caproni, Alessandro; Zacchei, Andrea; Vuerli, Claudio; Pucillo, Mauro

    2004-09-01

    The Workstation Software System (WSS) is the high-level control software of the Italian Galileo Galilei Telescope, located on La Palma in the Canary Islands, developed in the early 1990s for HP-UX workstations. WSS may be seen as a middle-layer software system that manages the communications between the real-time systems (VME), different workstations and high-level applications, providing a uniform distributed environment. The project to port the control software from the HP workstations to the Linux environment started at the end of 2001. It aimed to refurbish the control software by introducing some of the new software technologies and languages available for free in the Linux operating system. The project was realized by gradually substituting each HP workstation with a Linux PC, with the goal of avoiding major changes to the original software running under HP-UX. Three main phases characterized the project: creation of a simulated control room with several Linux PCs running WSS (to check all the functionality); insertion into the simulated control room of some HP machines (to check the mixed environment); and substitution of the HP workstations in the real control room. From a software point of view, the project introduces some new technologies, like multi-threading, and the possibility to develop high-level WSS applications with almost every programming language that implements Berkeley sockets. A library to develop Java applications has also been created and tested.

  9. Firefly Algorithm for Structural Search.

    PubMed

    Avendaño-Franco, Guillermo; Romero, Aldo H

    2016-07-12

    The problem of computational structure prediction of materials is approached using the firefly (FF) algorithm. Starting from the chemical composition, and optionally using prior knowledge of similar structures, the FF method is able to predict not only known stable structures but also a variety of novel competitive metastable structures. This article focuses on the strengths and limitations of the algorithm as a multimodal global searcher. The algorithm has been implemented in the software package PyChemia ( https://github.com/MaterialsDiscovery/PyChemia ), an open-source Python library for materials analysis. We present applications of the method to van der Waals clusters and crystal structures. The FF method is shown to be competitive when compared to other population-based global searchers.
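
    The core of the FF algorithm is the attraction move: dimmer candidate solutions step toward brighter (better-scoring) ones with a distance-damped step plus random noise. A minimal generic sketch (ours, not PyChemia's implementation):

      # Minimal generic firefly move:
      #   x_i <- x_i + beta0 * exp(-gamma * r^2) * (x_j - x_i) + alpha * noise
      import numpy as np

      def firefly_step(positions, brightness, beta0=1.0, gamma=0.1, alpha=0.05):
          new = positions.copy()
          for i in range(len(positions)):
              for j in range(len(positions)):
                  if brightness[j] > brightness[i]:   # j is brighter: i moves toward j
                      r2 = np.sum((positions[j] - positions[i]) ** 2)
                      new[i] += beta0 * np.exp(-gamma * r2) * (positions[j] - positions[i])
              new[i] += alpha * np.random.uniform(-0.5, 0.5, positions.shape[1])
          return new

      pos = np.random.rand(10, 3)                     # 10 candidates in 3-D
      bright = -np.sum((pos - 0.5) ** 2, axis=1)      # brighter near the optimum
      print(firefly_step(pos, bright)[0])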

  10. Large Scale Bacterial Colony Screening of Diversified FRET Biosensors

    PubMed Central

    Litzlbauer, Julia; Schifferer, Martina; Ng, David; Fabritius, Arne; Thestrup, Thomas; Griesbeck, Oliver

    2015-01-01

    Biosensors based on Förster Resonance Energy Transfer (FRET) between fluorescent protein mutants have started to revolutionize physiology and biochemistry. However, many types of FRET biosensors show relatively small FRET changes, making measurements with these probes challenging when used under sub-optimal experimental conditions. Thus, a major effort in the field currently lies in designing new optimization strategies for these types of sensors. Here we describe procedures for optimizing FRET changes by large scale screening of mutant biosensor libraries in bacterial colonies. We describe optimization of biosensor expression, permeabilization of bacteria, software tools for analysis, and screening conditions. The procedures reported here may help in improving FRET changes in multiple suitable classes of biosensors. PMID:26061878

  11. Operational Plan Ontology Model for Interconnection and Interoperability

    NASA Astrophysics Data System (ADS)

    Long, F.; Sun, Y. K.; Shi, H. Q.

    2017-03-01

    Aiming at the bottleneck that assistant decision-making systems face in processing operational plan data and information, this paper starts from an analysis of the problems of traditional plan representations and the technical advantages of ontologies. It then defines the elements of the operational plan ontology model and determines the basis of its construction, and builds a semi-knowledge-level operational plan ontology model. Finally, it examines the expression of operational plans based on the ontology model and their use by application software. The paper thus has theoretical significance and application value for improving the interconnection and interoperability of operational plans among assistant decision-making systems.

  12. Production of the next-generation library virtual tour.

    PubMed

    Duncan, J M; Roth, L K

    2001-10-01

    While many libraries offer overviews of their services through their Websites, only a small number of health sciences libraries provide Web-based virtual tours. These tours typically feature photographs of major service areas along with textual descriptions. This article describes the process for planning, producing, and implementing a next-generation virtual tour in which a variety of media elements are integrated: photographic images, 360-degree "virtual reality" views, textual descriptions, and contextual floor plans. Hardware and software tools used in the project are detailed, along with a production timeline and budget, tips for streamlining the process, and techniques for improving production. This paper is intended as a starting guide for other libraries considering an investment in such a project.

  13. Unambiguous UML Composite Structures: The OMEGA2 Experience

    NASA Astrophysics Data System (ADS)

    Ober, Iulian; Dragomir, Iulia

    Starting from version 2.0, UML introduced hierarchical composite structures, which are a very expressive way of defining complex software architectures, but which have a very loosely defined semantics in the standard. In this paper we propose a set of consistency rules that ensure UML composite structures are unambiguous and can be given a precise semantics. Our primary application of the static consistency rules defined in this paper is within the OMEGA UML profile [6], but these rules are general and applicable to other hierarchical component models based on the same concepts, such as MARTE GCM or SysML. The rule set has been formalized in OCL and is currently used in the OMEGA UML compiler.

  14. High-quality macromolecular graphics on mobile devices: a quick starter's guide.

    PubMed

    Yiu, Chin-Pang Benny; Chen, Yu Wai

    2014-01-01

    With the rise of tablets, truly portable molecular graphics are now available for wide use by scientists to share structural information in real time at a reasonable cost. We have surveyed the existing software available on Apple iPads and on Android tablets in order to make a recommendation to potential users, primarily based on the product features. Among 12 apps, iMolview (available on both platforms) stands out to be our choice, with PyMOL app (iOS) a close alternative and RCSB PDB Mobile viewer/NDKmol (both platforms) offering some uniquely useful functions. Finally, we include a tutorial on how to get started using iMolview to do some simple visualization in 10 min.

  15. Development of a landslide EWS based on rainfall thresholds for Tuscany Region, Italy

    NASA Astrophysics Data System (ADS)

    Rosi, Ascanio; Segoni, Samuele; Battistini, Alessandro; Rossi, Guglielmo; Catani, Filippo; Casagli, Nicola

    2017-04-01

    We present the set-up of a landslide EWS based on rainfall thresholds for the Tuscany region (central Italy), which shows a heterogeneous distribution of reliefs and precipitation. The work started with the definition of a single set of thresholds for the whole region, but this proved unsuitable for EWS purposes because of the heterogeneity of the Tuscan territory and because the analyses were affected by a high degree of subjectivity and were therefore not repeatable. To overcome this problem, we first implemented a software tool capable of defining rainfall thresholds objectively, since the subjectivity and non-repeatability of the analysis are among the main issues with such thresholds. This software, named MaCumBA, is largely automated and can analyse a high number of rainfall events in a short time to define several parameters of the threshold, such as the intensity (I) and the duration (D) of the rainfall event, the no-rain time gap (NRG: how many hours without rain are needed to consider two events as separate) and the equation describing the threshold. The possibility of quickly performing several analyses led to the decision to divide the territory into 25 homogeneous areas (named alert zones, AZ), so that a single threshold could be defined for each AZ. Two independent datasets of joint rainfall-landslide occurrences were used to define the thresholds: a calibration dataset (data from 2000 to 2007) and a validation dataset (2008-2009). Once the thresholds were defined, a WebGIS-based EWS was implemented. In this system it is possible to focus both on monitoring of real-time data and on forecasting at different lead times up to 48 h; forecasting data are collected from LAMI (Limited Area Model Italy) rainfall forecasts. The EWS works on the basis of the threshold parameters defined by MaCumBA (I, D, NRG). An important feature of the warning system is that the visualization of the thresholds in the WebGIS interface may vary in time depending on when the starting time of the rainfall event is set. The starting time of the rainfall event is therefore treated as a variable by the system: whenever new rainfall data are available, a recursive algorithm identifies the starting time for which the rainfall path is closest to, or overcomes, the threshold. This is considered the most hazardous condition and is the one displayed by the WebGIS interface. One more issue that surfaced after the EWS implementation was the time-limited validity of the thresholds. While rainfall thresholds can give good results, their validity is limited in time because of several factors, such as changes in pluviometric regime, land use and urban development; furthermore, the availability of new landslide data can lead to more robust results. For these reasons some of the thresholds defined for the Tuscany region were updated using new landslide data (from 2010 to March 2013). A comparison between updated and former thresholds clearly shows that the performance of an EWS can be enhanced if the thresholds are constantly updated.
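
    Operationally, each alert zone's threshold reduces to an intensity-duration curve plus the no-rain gap. The sketch below (ours, with invented parameter values rather than the Tuscan ones) shows the basic check performed for each candidate starting time:

      # Basic intensity-duration threshold check (illustrative; a, b and NRG
      # are invented here; MaCumBA fits them per alert zone).
      def exceeds_threshold(cumulative_mm, duration_h, a=8.0, b=-0.4):
          """True if mean intensity exceeds the I = a * D**b threshold curve."""
          return cumulative_mm / duration_h >= a * duration_h ** b

      def alarm(hourly_rain, nrg_h=12):
          """Scan candidate starting times (rainy hours preceded by >= NRG dry
          hours) and return one whose rainfall path overcomes the threshold."""
          n = len(hourly_rain)
          for start in range(n):
              if hourly_rain[start] == 0:
                  continue                  # an event starts with rain
              if any(hourly_rain[max(0, start - nrg_h):start]):
                  continue                  # no preceding dry gap: not a new event
              total, dur = sum(hourly_rain[start:]), n - start
              if exceeds_threshold(total, dur):
                  return start
          return None

      print(alarm([0] * 12 + [4, 7, 10, 6, 3]))   # -> 12: the event onset triggers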

  16. Interlock system for machine protection of the KOMAC 100-MeV proton linac

    NASA Astrophysics Data System (ADS)

    Song, Young-Gi

    2015-02-01

    The 100-MeV proton linear accelerator of the Korea Multi-purpose Accelerator Complex (KOMAC) has been developed, and beam service started this year after beam commissioning was completed. To protect sensitive and essential equipment during machine operation, a machine interlock system is required, and such a system has been implemented. The purpose of the interlock system is to shut off the beam when the radio-frequency (RF) systems or the ion source are unstable or a beam loss occurs. The interlock signals of the KOMAC linac come from a variety of sources, such as beam loss, RF and high-voltage converter modulator faults, and the fast closing valves of the vacuum windows at the beam lines. The system consists of a hardware-based interlock subsystem using analog circuits and a software-based interlock subsystem using an industrial programmable logic controller (PLC). The hardware-based interlock system has been fabricated, and it satisfies the response-time requirement, with measured results within 10 µs. The software interlock logic in the PLC has been connected to the Experimental Physics and Industrial Control System (EPICS) framework to integrate the various interlock signals and to control the machine components when an interlock occurs. This paper describes the design and the construction of the machine interlock system for the KOMAC 100-MeV linac.
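    One behaviour worth illustrating is latching: an interlock must keep the beam off until an explicit operator reset, even if the originating fault clears by itself. The sketch below is a generic software illustration with hypothetical signal names, not the KOMAC PLC logic.

    class BeamInterlock:
        """Latching beam-permit logic: any fault trips and stays tripped."""

        def __init__(self, sources):
            self.sources = sources            # e.g. beam loss, RF fault, ...
            self.tripped = set()

        def update(self, readings):
            """readings maps source name -> True if faulted this scan cycle."""
            for name in self.sources:
                if readings.get(name, False):
                    self.tripped.add(name)    # latch even if fault clears later
            return self.beam_permit()

        def beam_permit(self):
            return not self.tripped

        def reset(self):
            """Explicit operator reset once all faults are understood."""
            self.tripped.clear()

    ilk = BeamInterlock(["beam_loss", "rf_fault", "hvcm_fault", "vacuum_valve"])
    ilk.update({"rf_fault": True})            # permit drops ...
    ilk.update({})                            # ... and stays dropped
    assert not ilk.beam_permit()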

  17. Getting started with open-hardware: development and control of microfluidic devices.

    PubMed

    da Costa, Eric Tavares; Mora, Maria F; Willis, Peter A; do Lago, Claudimir L; Jiao, Hong; Garcia, Carlos D

    2014-08-01

    Understanding basic concepts of electronics and computer programming allows researchers to get the most out of the equipment found in their laboratories. Although a number of platforms have been specifically designed for the general public and are supported by a vast array of on-line tutorials, this subject is not normally included in university chemistry curricula. Aiming to provide the basic concepts of hardware and software, this article focuses on the design and use of a simple module to control a series of PDMS-based valves. The module is based on a low-cost microcontroller board (Teensy) and open-source software (Arduino). The microvalves were fabricated using thin sheets of PDMS and patterned using CO2 laser engraving, providing a simple and efficient way to fabricate devices without the traditional photolithographic process or facilities. Synchronization of valve control enabled the development of two simple devices to perform injection (1.6 ± 0.4 μL/stroke) and mixing of different solutions. Furthermore, a practical demonstration of the utility of this system for microscale chemical sample handling and analysis was achieved by performing an on-chip acid-base titration, followed by conductivity detection with an open-source low-cost detection system. Overall, the system provided a very reproducible (98%) platform to perform fluid delivery at the microfluidic scale. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
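    As a hedged sketch of how a host computer might drive such a module, the following sequences three valves to produce the per-stroke injections quoted above; the serial port name, baud rate and single-character open/close protocol are assumptions for illustration, not the published firmware's commands.

    import time
    import serial  # pyserial

    PORT = "/dev/ttyACM0"        # hypothetical port where the Teensy enumerates
    OPEN, CLOSE = b"O", b"C"     # hypothetical one-letter command protocol

    def actuate(link, valve, command):
        link.write(command + str(valve).encode())   # e.g. b"O1" opens valve 1

    def stroke(link, dwell=0.1):
        """One peristaltic cycle over valves 1-3 (nominally ~1.6 uL/stroke)."""
        for valve in (1, 2, 3):
            actuate(link, valve, OPEN)
            time.sleep(dwell)
            actuate(link, valve, CLOSE)

    with serial.Serial(PORT, 9600, timeout=1) as link:
        for _ in range(10):      # ten strokes, roughly 16 uL delivered
            stroke(link)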

  18. Development and Operations of the Astrophysics Data System

    NASA Technical Reports Server (NTRS)

    Murray, Stephen S.; Oliversen, Ronald (Technical Monitor)

    2003-01-01

    The ADS was represented at the AAS meeting with 3 poster papers and a demonstration booth. We have set up a mirror site of the Vizier database system of the CDS; this will replace the corresponding functionality of the ADC at Goddard. Preparations for the APS and LPSC meetings in March started; we will have demonstrations at both meetings. The ADS was represented with a poster at the joint AGU/EGU meeting in Nice, France. Discussions about the on-going collaboration between the ADS and the CDS were held in Strasbourg, France. The ADS was invited to organize a session about the ADS and its mirror sites at the next United Nations Workshop on Basic Space Sciences in the Developing World. Efforts are under way to enter the tables of contents of all conference proceedings in the SAO library into the ADS; this requires copying the tables of contents from all volumes in the library and having them typed in, and it will greatly enhance the coverage of the literature in the ADS. We started the development of a search system for the full text of all scanned material in the ADS, which will eventually allow our users search capabilities that so far do not exist in any form. In order to enable the full-text searching, we have purchased OCR software and are in the process of OCRing the scanned pages in the ADS. Efforts are in progress to handle the inclusion of data set identifiers in article manuscripts; the ADS will be the central system that will allow the journals to verify data set identifiers, and the "master verifier" has been implemented in prototype form at the ADS. We started to include more journals in Geosciences/Geophysics in the ADS. The Royal Astronomical Society has decided to archive its on-line journals in the ADS three years after publication, and we have started to process these older on-line articles in order to archive them in the ADS. Our mirror site in Korea now has a full article mirror. We developed XML output capability in the ADS, which will make it easier to exchange data with other data systems. We started the development of new indexing software that will eventually reduce the indexing time for a database from days to hours or less. The ADS was represented at the IAU General Assembly with a poster, and discussions were held with the IAU management about extending the ADS-IAU collaborations.

  19. Future-saving audiovisual content for Data Science: Preservation of geoinformatics video heritage with the TIB|AV-Portal

    NASA Astrophysics Data System (ADS)

    Löwe, Peter; Plank, Margret; Ziedorn, Frauke

    2015-04-01

    In data driven research, the access to, citation of, and preservation of the full triad consisting of journal article, research data, and research software has started to become good scientific practice. To foster the adoption of this practice, the significance of software tools has to be acknowledged, as they enable scientists to harness auxiliary audiovisual content in their research work. The advent of ubiquitous computer-based audiovisual recording and corresponding Web 2.0 hosting platforms like Youtube, Slideshare and GitHub has created new ecosystems for contextual information related to scientific software and data, which continue to grow both in size and variety of content. The current Web 2.0 platforms lack capabilities for long-term archiving and scientific citation, such as persistent identifiers allowing specific intervals of the overall content to be referenced. The audiovisual content currently shared by scientists ranges from commented how-to demonstrations of software handling, installation and data processing, to aggregated visual analytics of the evolution of software projects over time. Such content is a crucial addition to the scientific message, as it ensures that software-based data-processing workflows can be assessed, understood and reused in the future. In the context of data driven research, such content needs to be accessible via effective search capabilities, enabling the content to be retrieved and ensuring that the content producers receive credit for their efforts within the scientific community. Improved multimedia archiving and retrieval services for scientific audiovisual content which meet these requirements are currently being implemented by the scientific library community. This paper exemplifies the existing challenges, requirements, benefits and the potential of the preservation, accessibility and citability of such audiovisual content for the Open Source communities, based on the new audiovisual web service TIB|AV Portal of the German National Library of Science and Technology. The web-based portal allows for extended search capabilities based on enhanced metadata derived by automated video analysis. By combining state-of-the-art multimedia retrieval techniques such as speech, text, and image recognition with semantic analysis, content-based access to videos at the segment level is provided. Further, by using the open standard Media Fragment Identifier (MFID), a citable Digital Object Identifier is displayed for each video segment. In addition to the continuously growing footprint of contemporary content, the importance of vintage audiovisual information needs to be considered: this paper showcases the successful application of the TIB|AV-Portal in the preservation and provision of a newly discovered version of a GRASS GIS promotional video produced by the US Army Corps of Engineers Construction Engineering Research Laboratory (US-CERL) in 1987. The video provides insight into the constraints of the very early days of the GRASS GIS project, the oldest Free and Open Source Software (FOSS) GIS project, which has been active for over thirty years. GRASS itself has turned into a collaborative scientific platform and a repository of scientific peer-reviewed code and an algorithm/knowledge hub for future generations of scientists [1]. This is a reference case for future preservation activities regarding semantic-enhanced Web 2.0 content from geospatial software projects within Academia and beyond.
References: [1] Chemin, Y., Petras V., Petrasova, A., Landa, M., Gebbert, S., Zambelli, P., Neteler, M., Löwe, P.: GRASS GIS: a peer-reviewed scientific platform and future research Repository, Geophysical Research Abstracts, Vol. 17, EGU2015-8314-1, 2015 (submitted)

  20. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience

    PubMed Central

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac

    2017-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the underlying causes. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Such releases also seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit a high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards product line engineering for its flight software development, software for future space missions will be developed in a similar manner, and hence the analysis results for this mission can be considered as a baseline for future flight software missions. PMID:29278255

  1. ISEES: an institute for sustainable software to accelerate environmental science

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Schildhauer, M.; Fox, P. A.

    2013-12-01

    Software is essential to the full science lifecycle, spanning data acquisition, processing, quality assessment, data integration, analysis, modeling, and visualization. Software runs our meteorological sensor systems, our data loggers, and our ocean gliders. Every aspect of science is impacted by, and improved by, software. Scientific advances ranging from modeling climate change to the sequencing of the human genome have been rendered possible in the last few decades due to the massive improvements in the capabilities of computers to process data through software. This pivotal role of software in science is broadly acknowledged, while simultaneously being systematically undervalued through minimal investments in maintenance and innovation. As a community, we need to embrace the creation, use, and maintenance of software within science, and address problems such as code complexity, openness, reproducibility, and accessibility. We also need to fully develop new skills and practices in software engineering as a core competency in our earth science disciplines, starting with undergraduate and graduate education and extending into university and agency professional positions. The Institute for Sustainable Earth and Environmental Software (ISEES) is being envisioned as a community-driven activity that can facilitate and galvanize activities around scientific software in a way analogous to synthesis centers such as NCEAS and NESCent that have stimulated massive advances in ecology and evolution. We will describe the results of six workshops (Science Drivers, Software Lifecycles, Software Components, Workforce Development and Training, Sustainability and Governance, and Community Engagement) that were held in 2013 to envision such an institute. We will present community recommendations from these workshops and our strategic vision for how ISEES will address the technical issues in the software lifecycle, the sustainability of the whole software ecosystem, and the critical issue of computational training for the scientific community.

  2. Software Reliability Analysis of NASA Space Flight Software: A Practical Experience.

    PubMed

    Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S; Mcginnis, Issac

    2016-01-01

    In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the underlying causes. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Such releases also seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit a high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards product line engineering for its flight software development, software for future space missions will be developed in a similar manner, and hence the analysis results for this mission can be considered as a baseline for future flight software missions.

  3. Open Source Cloud-Based Technologies for Bim

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Karachaliou, E.; Valari, E.; Stylianidis, E.

    2018-05-01

    This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere, so they can view online 3D models using browsers. Nowadays, Cloud computing is progressively engaged in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups involved in complicated Architectural, Engineering and Construction (AEC) projects. In addition, the development of Open Source Software (OSS) has been growing rapidly, and its use is becoming increasingly widespread. Although BIM and Cloud technologies are extensively known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools will be used, from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software packages; these will be distributed and made freely available to a large community of professionals. The research work will be completed by benchmarking four Cloud-based BIM systems that present remarkable results: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System.

  4. ULFEM time series analysis package

    USGS Publications Warehouse

    Karl, Susan M.; McPhee, Darcy K.; Glen, Jonathan M. G.; Klemperer, Simon L.

    2013-01-01

    This manual describes how to use the Ultra-Low-Frequency ElectroMagnetic (ULFEM) software package. Casual users can read the quick-start guide and will probably not need any more information than this. For users who may wish to modify the code, we provide further description of the routines.

  5. Software and Systems Producibility Collaboration and Experimentation Environment (SPRUCE)

    DTIC Science & Technology

    2014-04-01

    represent course materials and assignments from Vanderbilt University's Dr. Gokhale's courses. 3.2.4. Communities of Interest Current list of...blogging platforms of Twitter, Facebook and LinkedIn today, these user interactions represent low-effort means for users to start getting involved. A

  6. Mapping Subsurface Structure at Guar Kepah by using Ground Penetrating Radar

    NASA Astrophysics Data System (ADS)

    Mansor, Hafizuddin; Rosli, Najmiah; Ismail, N. A.; Saidin, M.; Masnan, S. S. K.

    2018-04-01

    A Ground Penetrating Radar (GPR) survey was conducted at Guar Kepah to detect buried objects before the commencement of archaeological gallery construction. The study area covered around 20 m in length and 14 m in width. Fifteen GPR lines were surveyed from north to south, each 20 m long, spaced 1 m apart and parallel to each other. A 500 MHz closed antenna was used in this study. Surface findings were noted before the GPR survey started. The data were analysed and interpreted using Groundvision software, and several filters were applied to the radargrams to enhance the data. Based on the results, several anomalies were detected. The surface findings were also detected by GPR, producing hyperbolic curves in the radargrams. The subsurface layering was detected by the GPR survey. The anomalies were assigned to several classes based on the pattern of the signals obtained in the radargrams.

  7. A PC-based generator of surface ECG potentials for computer electrocardiograph testing.

    PubMed

    Franchi, D; Palagi, G; Bedini, R

    1994-02-01

    The system is composed of an electronic circuit, connected to a PC, whose outputs simulate virtual patients' limb and chest electrode potentials, starting from ECGs digitally collected by commercial interpretative electrocardiographs. Appropriate software manages the D/A conversion and lines up the original short-term signal in a ring buffer to generate continuous ECG traces. The device also permits the addition of artifacts and/or baseline wander/shifts on each lead separately. The system has been accurately tested, and statistical indices have been computed to quantify the reproduction accuracy, analyzing in the generated signal both the errors induced on fiducial-point measurements and the capability to retain diagnostic significance. The device, integrated with an annotated ECG database, constitutes a reliable and powerful system to be used in the quality assurance testing of computer electrocardiographs.
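    The ring-buffer idea described above can be sketched in a few lines: a short digitized segment is looped to form a continuous trace, and a slow sinusoid stands in for the baseline wander that the device can add per lead. The sampling rate, wander model and stand-in beat are illustrative assumptions, not the published hardware's parameters.

    import numpy as np

    FS = 500  # assumed sampling rate, Hz

    def continuous_ecg(segment, n_out, wander_amp=0.05, wander_hz=0.3):
        """Loop a short segment (ring buffer) and add baseline wander."""
        idx = np.arange(n_out) % len(segment)    # ring-buffer indexing
        trace = segment[idx]
        t = np.arange(n_out) / FS
        return trace + wander_amp * np.sin(2 * np.pi * wander_hz * t)

    beat = np.zeros(FS)          # stand-in for one digitized beat
    beat[FS // 2] = 1.0          # crude "R wave" spike for demonstration
    stream = continuous_ecg(beat, 10 * FS)   # ten seconds of looped signal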

  8. FPGA-accelerated algorithm for the regular expression matching system

    NASA Astrophysics Data System (ADS)

    Russek, P.; Wiatr, K.

    2015-01-01

    This article describes an algorithm to support a regular expression matching system. The goal was to achieve a system with attractive performance and low energy consumption. The basic idea of the algorithm comes from the concept of the Bloom filter. It starts from the extraction of static sub-strings from the regular expressions. The algorithm is devised to benefit from its decomposition into parts intended to be executed by custom hardware and by the central processing unit (CPU). A pipelined custom processor architecture is proposed and the software algorithm is explained accordingly. The software part of the algorithm was coded in C and runs on a processor from the ARM family. The hardware architecture was described in VHDL and implemented in a field programmable gate array (FPGA). The performance results and required resources of the above experiments are given. An example target application for the presented solution is computer and network security systems. The idea was tested on nearly 100,000 body-based viruses from the ClamAV virus database. The solution is intended for the emerging technology of clusters of low-energy computing nodes.
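    The Bloom-filter idea can be sketched as follows: static substrings extracted from the regular expressions are hashed into a bit array, and only the input windows that hit the filter are escalated to full regex matching on the CPU. The hash functions, window length and patterns below are illustrative choices; the paper's hardware pipeline is not reproduced.

    from hashlib import md5, sha1

    M = 1 << 16   # bit-array size (illustrative)

    def hashes(chunk):
        # Two independent hash values per chunk.
        yield int.from_bytes(md5(chunk).digest()[:4], "big") % M
        yield int.from_bytes(sha1(chunk).digest()[:4], "big") % M

    def build_filter(static_substrings, k):
        bits = bytearray(M // 8)
        for s in static_substrings:
            for h in hashes(s[:k]):                 # index first k bytes
                bits[h // 8] |= 1 << (h % 8)
        return bits

    def candidate_positions(data, bits, k):
        """Window positions that hit the filter: possible false positives,
        never false negatives, so the CPU only rechecks these."""
        for i in range(len(data) - k + 1):
            if all(bits[h // 8] >> (h % 8) & 1 for h in hashes(data[i:i + k])):
                yield i

    bits = build_filter([b"EICAR-STANDARD", b"MALWARE-MARK"], k=8)
    hits = list(candidate_positions(b"xxEICAR-STANDARD-TESTxx", bits, k=8))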

  9. Transitioning NWChem to the Next Generation of Manycore Machines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bylaska, Eric J.; Apra, E; Kowalski, Karol

    The NorthWest chemistry (NWChem) modeling software is a popular molecular chemistry simulation software that was designed from the start to work on massively parallel processing supercomputers [1-3]. It contains an umbrella of modules that today includes self-consistent field (SCF), second order Møller-Plesset perturbation theory (MP2), coupled cluster (CC), multiconfiguration self-consistent field (MCSCF), selected configuration interaction (CI), tensor contraction engine (TCE) many body methods, density functional theory (DFT), time-dependent density functional theory (TDDFT), real-time time-dependent density functional theory, pseudopotential plane-wave density functional theory (PSPW), band structure (BAND), ab initio molecular dynamics (AIMD), Car-Parrinello molecular dynamics (MD), classical MD, hybrid quantum mechanics/molecular mechanics (QM/MM), hybrid ab initio molecular dynamics/molecular mechanics (AIMD/MM), gauge independent atomic orbital nuclear magnetic resonance (GIAO NMR), conductor-like screening solvation model (COSMO), conductor-like screening solvation model based on density (COSMO-SMD), and reference interaction site model (RISM) solvation models, free energy simulations, reaction path optimization, parallel in time, among other capabilities [4]. Moreover, new capabilities continue to be added with each new release.

  10. PreSSAPro: a software for the prediction of secondary structure by amino acid properties.

    PubMed

    Costantini, Susan; Colonna, Giovanni; Facchiano, Angelo M

    2007-10-01

    PreSSAPro is a software tool, available to the scientific community as a free web service, designed to provide predictions of secondary structure starting from the amino acid sequence of a given protein. Predictions are based on our recently published work on the amino acid propensities for secondary structures, evaluated both in large but non-homogeneous protein data sets and in smaller but homogeneous data sets corresponding to protein structural classes, i.e. all-alpha, all-beta, or alpha-beta proteins. Predictions are improved by the use of propensities evaluated for the right protein class. PreSSAPro predicts the secondary structure according to the right protein class, if known, or gives multiple predictions with reference to the different structural classes. The comparison of these predictions represents a novel tool to evaluate which sequence regions can assume different secondary structures depending on the structural class assignment, in the perspective of identifying proteins able to fold into different conformations. The service is available at the URL http://bioinformatica.isa.cnr.it/PRESSAPRO/.
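    A toy illustration of the propensity-based scheme described above: each residue is assigned the secondary-structure state (H/E/C) with the highest propensity for its amino acid, with the table selected per structural class. The numbers below are placeholders, not the published propensity values.

    # Placeholder propensity tables, indexed by structural class.
    PROPENSITIES = {
        "all-alpha": {"A": {"H": 1.4, "E": 0.8, "C": 0.9},
                      "G": {"H": 0.6, "E": 0.8, "C": 1.5}},
        "all-beta":  {"A": {"H": 0.9, "E": 1.2, "C": 1.0},
                      "G": {"H": 0.5, "E": 0.9, "C": 1.4}},
    }

    def predict(sequence, protein_class):
        table = PROPENSITIES[protein_class]
        default = {"H": 1.0, "E": 1.0, "C": 1.0}
        out = []
        for aa in sequence:
            prop = table.get(aa, default)         # per-residue propensities
            out.append(max(prop, key=prop.get))   # state with max propensity
        return "".join(out)

    # Unknown class: compare the per-class predictions residue by residue.
    for cls in PROPENSITIES:
        print(cls, predict("AAGGAA", cls))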

  11. CMS Analysis School Model

    NASA Astrophysics Data System (ADS)

    Malik, S.; Shipsey, I.; Cavanaugh, R.; Bloom, K.; Chan, Kai-Feng; D'Hondt, J.; Klima, B.; Narain, M.; Palla, F.; Rolandi, G.; Schörner-Sadenius, T.

    2014-06-01

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and at the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized by CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS schools around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  12. Experimental and analytical investigation of inertial propulsion mechanisms and motion simulation of rigid multi-body mechanical systems

    NASA Astrophysics Data System (ADS)

    Almesallmy, Mohammed

    Methodologies are developed for the dynamic analysis of mechanical systems, with emphasis on inertial propulsion systems. This work adopted the Lagrangian methodology, the most efficient classical computational technique; our implementation of it is called the Equations of Motion Code (EOMC). The EOMC is applied to several simple dynamic mechanical systems for easier understanding of the method and to aid other investigators in developing equations of motion for any dynamic system. In addition, it is applied to a rigid multibody system, the Thomson IPS [Thomson 1986]. Furthermore, a simple symbolic algorithm is developed using Maple software, which can be used to convert any nonlinear nth-order ordinary differential equation (ODE) system into a 1st-order ODE system in a format ready to be used in Matlab software. As a side issue, but an equally important one, we have started corresponding with the U.S. Patent Office to persuade them that patent applications claiming gross linear motion based on inertial propulsion systems should be automatically rejected. The precedent is the rejection of patent applications involving perpetual motion machines.
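    The order reduction that such a symbolic step performs follows the standard substitution y1 = x, y2 = x', ..., yn = x^(n-1). A minimal sketch for a second-order example x'' = -c·x' - k·x, written in Python rather than Maple/Matlab and with illustrative coefficients:

    from scipy.integrate import solve_ivp

    C, K = 0.2, 4.0   # illustrative damping and stiffness

    def first_order(t, y):
        # y[0] = x, y[1] = x' : the standard order-reduction substitution
        return [y[1], -C * y[1] - K * y[0]]

    sol = solve_ivp(first_order, (0.0, 10.0), [1.0, 0.0], dense_output=True)
    x_at_5s = sol.sol(5.0)[0]   # displacement at t = 5 s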

  13. The changing nature of spacecraft operations: From the Vikings of the 1970's to the great observatories of the 1990's and beyond

    NASA Technical Reports Server (NTRS)

    Ledbetter, Kenneth W.

    1992-01-01

    Four trends in spacecraft flight operations are discussed which will reduce overall program costs. These trends are the use of high-speed, highly reliable data communications systems for distributing operations functions to more convenient and cost-effective sites; the improved capability for remote operation of sensors; a continued rapid increase in memory and processing speed of flight qualified computer chips; and increasingly capable ground-based hardware and software systems, notably those augmented by artificial intelligence functions. Changes reflected by these trends are reviewed starting from the NASA Viking missions of the early 70s, when mission control was conducted at one location using expensive and cumbersome mainframe computers and communications equipment. In the 1980s, powerful desktop computers and modems enabled the Magellan project team to operate the spacecraft remotely. In the 1990s, the Hubble Space Telescope project uses multiple color screens and automated sequencing software on small computers. Given a projection of current capabilities, future control centers will be even more cost-effective.

  14. Process model economics of xanthan production from confectionery industry wastewaters.

    PubMed

    Bajić, Bojana Ž; Vučurović, Damjan G; Dodić, Siniša N; Grahovac, Jovana A; Dodić, Jelena M

    2017-12-01

    In this research a process and cost model for a xanthan production facility was developed using process simulation software (SuperPro Designer®). This work represents a novelty in the field for two reasons. One is that xanthan gum has been produced from several wastes, but never from confectionery-industry wastewaters. The other, more important, is that the aforementioned software, which is intended exclusively for bioprocesses, is used to generate a base case, i.e. a starting point for transferring the technology to industrial scales. Previously acquired experimental knowledge about using confectionery wastewaters from five different factories as substitutes for the commercially used cultivation medium has been incorporated into the process model in order to assess the economic viability of implementing such substrates. A lower initial sugar content in the wastewater-based medium (28.41 g/L) compared to the synthetic medium (30.00 g/L) gave a lower xanthan content at the end of cultivation (23.98 and 26.27 g/L, respectively). Although this resulted in somewhat poorer economic parameters, they were still in the range of an investment of interest. The possibility of utilizing a cheap resource (waste) and reducing the pollution that would result from its disposal also has a positive effect on the environment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gabriele, Fatuzzo; Michele, Mangiameli, E-mail: amichele.mangiameli@dica.unict.it; Giuseppe, Mussumeci

    Laser scanning is a technology that allows the geometric survey of objects to be carried out in a short time, with a high level of detail and completeness, based on an emitted laser signal and the corresponding return signal. When the incident laser radiation hits the object to be detected, the radiation is reflected. The purpose is to build a three-dimensional digital model that allows the reality of the object to be reconstructed and studies to be conducted regarding design, restoration and/or conservation. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates with high density and accuracy, together with radiometric RGB values. In this case, the set of measured points is called a "point cloud" and allows the reconstruction of the Digital Surface Model. While post-processing is usually performed by closed-source software, whose copyright restricts free use, free and open-source software can improve performance considerably; indeed, the latter can be freely used and offers the possibility to inspect and even customize the source code. The work started at the Faculty of Engineering in Catania is aimed at finding a valuable free and open-source tool, MeshLab (an Italian software package for data processing), to be compared with a reference closed-source package for data processing, i.e. RapidForm. In this work, we compare the results obtained with MeshLab and Rapidform through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue.

  16. mMass 3: a cross-platform software environment for precise analysis of mass spectrometric data.

    PubMed

    Strohalm, Martin; Kavan, Daniel; Novák, Petr; Volný, Michael; Havlícek, Vladimír

    2010-06-01

    While tools for the automated analysis of MS and LC-MS/MS data are continuously improving, it is still often the case that at the end of an experiment, the mass spectrometrist will spend time carefully examining individual spectra. Current software support is mostly provided only by the instrument vendors, and the available software tools are often instrument-dependent. Here we present a new generation of mMass, a cross-platform environment for the precise analysis of individual mass spectra. The software covers a wide range of processing tasks such as import from various data formats, smoothing, baseline correction, peak picking, deisotoping, charge determination, and recalibration. Functions presented in the earlier versions, such as in silico digestion and fragmentation, were redesigned and improved. In addition to Mascot, an interface for ProFound has been implemented. A specific tool is available for isotopic pattern modeling to enable precise data validation. The largest available lipid database (from the LIPID MAPS Consortium) has been incorporated, and together with the new compound search tool, lipids can be rapidly identified. In addition, the user can define custom libraries of compounds and use them analogously. The new version of mMass is based on a stand-alone Python library, which provides the basic functionality for data processing and interpretation. This library can serve as a good starting point for other developers in their projects. Binary distributions of mMass, its source code, a detailed user's guide, and video tutorials are freely available from www.mmass.org.
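    The processing chain named above (smoothing, peak picking, deisotoping) follows a common pattern; the sketch below shows a generic threshold-based peak picker with a crude isotope-spacing filter. It is a self-contained illustration, not mMass's actual API.

    import numpy as np

    def pick_peaks(mz, intensity, snr=3.0):
        """Local maxima above snr times a crude median noise estimate."""
        noise = np.median(intensity) + 1e-12
        return [(mz[i], intensity[i]) for i in range(1, len(mz) - 1)
                if intensity[i - 1] < intensity[i] >= intensity[i + 1]
                and intensity[i] > snr * noise]

    def deisotope(peaks, charge=1, tol=0.02):
        """Drop peaks ~1.003/z heavier than a stronger, lighter neighbour."""
        spacing = 1.00335 / charge
        kept = []
        for mz, inten in sorted(peaks):
            if any(abs(mz - pmz - spacing) < tol and pint > inten
                   for pmz, pint in kept):
                continue          # isotope of an already-kept peak
            kept.append((mz, inten))
        return kept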

  17. Real-Time Data Display

    NASA Technical Reports Server (NTRS)

    Pedings, Marc

    2007-01-01

    RT-Display is a MATLAB-based data acquisition environment designed to use a variety of commercial off-the-shelf (COTS) hardware to digitize analog signals into a standard data format usable by other post-acquisition data analysis tools. This software presents the acquired data in real time using a variety of signal-processing algorithms. The acquired data are stored in a standard Operator Interactive Signal Processing Software (OISPS) data-formatted file. RT-Display is primarily configured to use the Agilent VXI (or equivalent) data acquisition boards used in systems such as MIDDAS (Multi-channel Integrated Dynamic Data Acquisition System). The software is generalized and deployable in almost any testing environment, without limitations or proprietary configuration for a specific test program or project. With the Agilent hardware configured and in place, users can start the program and, in one step, immediately begin digitizing multiple channels of data. Once the acquisition is completed, the data are converted into a common binary format that can also be translated to the specific formats used by external analysis software, such as OISPS and PC-Signal (a product of AI Signal Research Inc.). RT-Display at the time of this reporting was certified on Agilent hardware capable of acquisition at up to 196,608 samples per second. Data signals are presented to the user on-screen simultaneously for 16 channels. Each channel can be viewed individually, with a maximum capability of 160 signal channels (depending on hardware configuration). Current signal presentations include time data, fast Fourier transforms (FFT), and power spectral density (PSD) plots. Additional processing algorithms can be easily incorporated into this environment.
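    The three displays named above (time data, FFT, PSD) can be reproduced offline with standard tools once a channel has been exported to an array; a minimal sketch with a synthetic stand-in channel follows. The sample rate matches the maximum certified figure quoted above, but the tone and segment length are placeholders.

    import numpy as np
    from scipy import signal

    FS = 196_608                              # samples per second
    t = np.arange(FS) / FS
    x = np.sin(2 * np.pi * 1000 * t) + 0.1 * np.random.randn(FS)  # stand-in

    # FFT display: one-sided amplitude spectrum
    freqs = np.fft.rfftfreq(len(x), d=1 / FS)
    spectrum = np.abs(np.fft.rfft(x)) / len(x)

    # PSD display: Welch's method
    f_psd, psd = signal.welch(x, fs=FS, nperseg=4096)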

  18. SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences

    NASA Astrophysics Data System (ADS)

    Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.

    1994-11-01

    A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground-software-intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Deployment (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that relate product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.

  19. SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences

    NASA Technical Reports Server (NTRS)

    Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.

    1994-01-01

    A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground-software-intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Deployment (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that relate product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.

  20. A Software Designed For STP Data Plot and Analysis Based on Object-oriented Methodology

    NASA Astrophysics Data System (ADS)

    Lina, L.; Murata, K.

    2006-12-01

    In the present study, we design a system named "STARS (Solar-Terrestrial data Analysis and Reference System)". STARS provides a research environment in which researchers can refer to and analyse a variety of data with a single piece of software. The software design is based on the OMT (Object Modeling Technique), one of the object-oriented techniques, which has advantages in maintainability, reuse and long-term development of a system. At the Center for Information Technology, Ehime University, after designing STARS we have already started implementing it. The latest version, STARS5, was released in 2006. Any user can download the system from our WWW site (http://www.infonet.cite.ehime-u.ac.jp/STARS). The present paper is mainly devoted to the design of the data analysis software system. Throughout the design, we paid attention to keeping it flexible and applicable when other developers design software for similar purposes; if our model were particular to our own purpose only, it would be useless to other developers. In designing the domain object model, we carefully removed the parts that depend on system resources, e.g. hardware and software, and put those dependent parts into the application object model. In the present design, therefore, the domain object model and the utility object model are independent of computer resources. This helps other developers construct their own systems based on the present design: they simply modify their own application object models according to their system resources. This division of the design into dependent and independent parts across three object models is one of the advantages of the OMT. If the design of software is done completely along the lines of the OMT, implementation is rather simple and almost automatic: developers simply map their designs onto programs. If one creates "another STARS" with a different programming language such as Java, the programmer can simply follow the present design as long as the language is object-oriented. Researchers may want to add their own data to STARS; in this case, they simply add their own data class in the domain object model, because any satellite data has properties such as time or date that are inherited from the upper class. In this way, their effort is less than with older methodologies. In the OMT, the description format of the system is rather strictly standardized: when new developers join the STARS project, they only have to understand each model to obtain an overview of STARS, and can then follow the designs and documents to implement the system. The OMT thus makes it easy for a newcomer to join a project already under way.
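    A minimal sketch of the inheritance idea described above: satellite-specific data classes inherit common properties such as time and date from an upper class in the domain object model, so adding a new dataset means adding one subclass. Class and attribute names are illustrative, not the STARS design itself.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class ObservationData:                 # upper class in the domain model
        start: datetime
        end: datetime

        def overlaps(self, other):
            return self.start < other.end and other.start < self.end

    @dataclass
    class SatelliteData(ObservationData):  # inherits time/date handling
        satellite: str
        instrument: str

    @dataclass
    class GroundStationData(ObservationData):
        station: str

    # Adding a new dataset means adding one subclass; the time/date
    # comparison logic comes from the upper class unchanged.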

  1. The boot software of the control unit of the near infrared spectrograph of the Euclid space mission: technical specification

    NASA Astrophysics Data System (ADS)

    Gómez-Sáenz-de-Tejada, Jaime; Toledo-Moreo, Rafael; Colodro-Conde, Carlos; Pérez-Lizán, David; Fernández-Conde, Jesús; Sánchez-Prieto, Sebastián.

    2016-07-01

    The Near Infrared Spectrograph and Photometer (NISP) is one of the instruments on board the ESA EUCLID mission. The Boot Software (BSW) is in charge of initialization and communications after a reset occurs at the hardware level. The Universidad Politecnica de Cartagena and the Instituto de Astrofisica de Canarias are responsible for the Instrument Control Unit of the NISP (NI-ICU) within the Euclid Consortium. The NI-ICU BSW is developed by the Universidad de Alcalá, and its main functions are: communication with the S/C for memory management, self-tests, and the start of a patchable Application Software (ASW). This paper presents the status of the NI-ICU BSW definition and design at the end of the Technical Specification phase.

  2. Mars Science Laboratory Flight Software Boot Robustness Testing Project Report

    NASA Technical Reports Server (NTRS)

    Roth, Brian

    2011-01-01

    On the surface of Mars, the Mars Science Laboratory will boot up its flight computers every morning, having charged the batteries through the night. This boot process is complicated, critical, and affected by numerous hardware states that can be difficult to test. The hardware test beds do not facilitate running long sequences of back-to-back unattended automated tests, and although the software simulation has provided the necessary functionality and fidelity for this boot testing, it has not supported the full flexibility necessary for this task. Therefore, to perform this testing, a framework has been built around the software simulation that supports running automated tests while loading a variety of starting configurations for software and hardware states. This implementation has been tested against the nominal cases to validate the methodology, and support for configuring off-nominal cases is ongoing. The implication of this testing is that the introduction of input configurations that have so far proved difficult to test may reveal boot scenarios worth higher-fidelity investigation, while in other cases it will increase confidence in the robustness of the flight software boot process.
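    A schematic of such an automated sweep, assuming a simulator object that can be loaded with a starting configuration and run to boot completion; every name below is a hypothetical stand-in, since the project's simulation interfaces are not described in this abstract.

    import itertools

    # Hypothetical hardware/software state axes to sweep.
    BATTERY_LEVELS = ["low", "nominal", "full"]
    PRIME_COMPUTER = ["A", "B"]
    WAKE_REASONS = ["timer", "fault", "ground_command"]

    def run_boot_case(sim, config):
        sim.load_configuration(config)     # assumed simulator API
        sim.run_until_boot_complete()
        return sim.boot_succeeded()

    def sweep(sim_factory):
        failures = []
        for battery, cpu, wake in itertools.product(
                BATTERY_LEVELS, PRIME_COMPUTER, WAKE_REASONS):
            config = {"battery": battery, "prime": cpu, "wake": wake}
            if not run_boot_case(sim_factory(), config):
                failures.append(config)    # candidates for closer study
        return failures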

  3. Empowering citizens in international governance of nanotechnologies.

    PubMed

    Malsch, Ineke; Subramanian, Vrishali; Semenzin, Elena; Hristozov, Danail; Marcomini, Antonio; Mullins, Martin; Hester, Karena; McAlea, Eamonn; Murphy, Finbarr; Tofail, Syed A M

    The international dialogue on responsible governance of nanotechnologies engages a wide range of actors with conflicting as well as common interests. It is also characterised by a lack of evidence-based data on the uncertain risks of, in particular, engineered nanomaterials. The present paper aims at deepening understanding of the collective decision-making context at the international level using the grounded theory approach proposed by Glaser and Strauss in "The Discovery of Grounded Theory" (1967). It starts by discussing relevant concepts from different fields, including sociological and political studies of international relations as well as political philosophy and ethics. This analysis of current trends in international law making is taken as a starting point for exploring the role that a software decision support tool could play in multi-stakeholder global governance of nanotechnologies. These theoretical ideas are then compared with the current design of the SUN Decision Support System (SUNDS) under development in the European project on Sustainable Nanotechnologies (SUN, www.sun-fp7.eu). Through constant comparison, the ideas are also compared with the requirements of different stakeholders as expressed during a user workshop. This allows discussion points to be highlighted for further consideration.

  4. Empowering citizens in international governance of nanotechnologies

    NASA Astrophysics Data System (ADS)

    Malsch, Ineke; Subramanian, Vrishali; Semenzin, Elena; Hristozov, Danail; Marcomini, Antonio; Mullins, Martin; Hester, Karena; McAlea, Eamonn; Murphy, Finbarr; Tofail, Syed A. M.

    2015-05-01

    The international dialogue on responsible governance of nanotechnologies engages a wide range of actors with conflicting as well as common interests. It is also characterised by a lack of evidence-based data on the uncertain risks of, in particular, engineered nanomaterials. The present paper aims at deepening understanding of the collective decision-making context at the international level using the grounded theory approach proposed by Glaser and Strauss in "The Discovery of Grounded Theory" (1967). It starts by discussing relevant concepts from different fields, including sociological and political studies of international relations as well as political philosophy and ethics. This analysis of current trends in international law making is taken as a starting point for exploring the role that a software decision support tool could play in multi-stakeholder global governance of nanotechnologies. These theoretical ideas are then compared with the current design of the SUN Decision Support System (SUNDS) under development in the European project on Sustainable Nanotechnologies (SUN, www.sun-fp7.eu). Through constant comparison, the ideas are also compared with the requirements of different stakeholders as expressed during a user workshop. This allows discussion points to be highlighted for further consideration.

  5. MATISSE a web-based tool to access, visualize and analyze high resolution minor bodies observation

    NASA Astrophysics Data System (ADS)

    Zinzi, Angelo; Capria, Maria Teresa; Palomba, Ernesto; Antonelli, Lucio Angelo; Giommi, Paolo

    2016-07-01

    In recent years, planetary exploration missions have acquired data from minor bodies (i.e., dwarf planets, asteroids and comets) at a level of detail never reached before. Since these objects often present very irregular shapes (as in the case of comet 67P Churyumov-Gerasimenko, target of the ESA Rosetta mission), "classical" two-dimensional projections of observations are difficult to understand. With the aim of providing the scientific community with a tool to access, visualize and analyze data in a new way, the ASI Science Data Center started to develop MATISSE (Multi-purposed Advanced Tool for the Instruments for the Solar System Exploration - http://tools.asdc.asi.it/matisse.jsp) in late 2012. This tool allows web-based 3D visualization of data acquired by planetary exploration missions: the output can be either the straightforward projection of the selected observation over the shape model of the target body or the visualization of a higher-order product (average/mosaic, difference, ratio, RGB) computed directly online with MATISSE. Standard outputs of the tool also comprise downloadable files to be used with GIS software (GeoTIFF and ENVI formats) and very high-resolution 3D files to be viewed with the free software Paraview. So far, the first and most frequent use of the tool has been the visualization of data acquired by the VIRTIS-M instrument onboard Rosetta observing comet 67P. The success of this task, well represented by the good number of published works that used images made with MATISSE, confirmed the need for a different approach to correctly visualize data coming from irregularly shaped bodies. In the near future the datasets available to MATISSE will be extended, starting from the addition of VIR-Dawn observations of both Vesta and Ceres, and standard protocols will also be used to access data stored in external repositories, such as NASA ODE and the Planetary VO.
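    Once two observations are resampled onto the same shape-model facets, the higher-order products listed above reduce to simple per-facet array operations; a minimal numpy sketch with hypothetical values (NaN marking unobserved facets) follows.

    import numpy as np

    obs_a = np.array([0.12, 0.30, np.nan, 0.25])   # observation 1, per facet
    obs_b = np.array([0.10, 0.28, 0.20, np.nan])   # observation 2, per facet

    average = np.nanmean(np.vstack([obs_a, obs_b]), axis=0)   # average/mosaic
    difference = obs_a - obs_b                                # difference
    with np.errstate(invalid="ignore"):
        ratio = obs_a / obs_b                                 # ratio
    # An RGB product would stack three such layers as colour channels.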

  6. Cassini Attitude Control Flight Software: from Development to In-Flight Operation

    NASA Technical Reports Server (NTRS)

    Brown, Jay

    2008-01-01

    The Cassini Attitude and Articulation Control Subsystem (AACS) Flight Software (FSW) has achieved its intended design goals by successfully guiding and controlling the Cassini-Huygens planetary mission to Saturn and its moons. This paper gives an overview of the AACS FSW from early design, development, implementation, and test to its fruition: operating and maintaining spacecraft control over an eleven-year prime mission. Starting from the phases of FSW development, topics expand to FSW development methodology, achievements utilizing in-flight autonomy, and a summary of lessons learned during flight operations which can be useful to FSW in current and future spacecraft missions.

  7. System monitoring feedback in cinemas and harvesting energy of the air conditioning condenser

    NASA Astrophysics Data System (ADS)

    Pop, P. P.; Pop-Vadean, A.; Barz, C.; Latinovic, T.; Chiver, O.

    2017-05-01

    This article describes monitoring the degree of emotional involvement of a cinema audience during a film by measuring the concentration of CO2. The software processes the data obtained from the dispersed sensors and displays them during the film. The software will also trigger the start of the air-conditioning condenser, from which energy can be harvested by installing a piezoelectric device. Useful energy can be recovered from various waste streams produced in a cinema. The time lag between actions and changes in environmental systems means that decisions made now will affect subsequent generations and the future of our environment.

  8. Sensor Suitcase Tablet Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The Retrocommissioning Sensor Suitcase is targeted for use in small commercial buildings of less than 50,000 square feet of floor space that regularly receive basic services such as maintenance and repair, but don't have in-house energy-management staff or building experts. The Suitcase is designed to be easy to use by building maintenance staff or other professionals such as telecom and alarm technicians. The software in the hand-held unit is designed to guide the staff to input the building and system information, deploy the sensors in the proper locations, configure the sensor hardware, and start the data collection.

  9. Blending an Android Development Course with Software Engineering Concepts

    ERIC Educational Resources Information Center

    Chatzigeorgiou, Alexander; Theodorou, Tryfon L.; Violettas, George E.; Xinogalos, Stelios

    2016-01-01

    The tremendous popularity of mobile computing, and of Android in particular, has attracted millions of developers who see opportunities for building their own start-ups. As a consequence, Computer Science students express an increasing interest in the related technology of Java development for Android applications. Android projects are complex by…

  10. Getting Started with Microcomputers--A Practical Beginner's Guide.

    ERIC Educational Resources Information Center

    Davies, Norman F.

    1985-01-01

    Discusses the results of a questionnaire sent to experts in the field of computer assisted language learning. Covers such topics as: 1) points to consider before buying a microcomputer; 2) recommended brands and peripheral equipment; 3) software; 4) utilizing programming languages; and 5) literature and contact organizations. (SED)

  11. WinHPC System Programming | High-Performance Computing | NREL

    Science.gov Websites

    Programming WinHPC System Programming Learn how to build and run an MPI (message passing interface (mpi.h) and library (msmpi.lib) are. To build from the command line, run... Start > Intel Software Development Tools > Intel C++ Compiler Professional... > C++ Build Environment for applications running

  12. Interactive Videodisc: the "Why" and the "How." CALICO Monograph Volume 2, Spring 1991.

    ERIC Educational Resources Information Center

    Bush, Michael D.; And Others

    This monograph presents articles on interactive videodisc technology in language learning, ranging from the importance of a theoretical framework, the transition from theory to practice, getting started, design considerations, hypermedia, discovery environments, authoring software, workstation environments, and a look at the future of optical disc…

  13. Pages from the Desktop: Desktop Publishing Today.

    ERIC Educational Resources Information Center

    Crawford, Walt

    1994-01-01

    Discusses changes that have made desktop publishing appealing and reasonably priced. Hardware, software, and printer options for getting started and moving on, typeface developments, and the key characteristics of desktop publishing are described. The author's notes on 33 articles from the personal computing literature from January-March 1994 are…

  14. Getting Started in Multimedia Training: Cutting or Bleeding Edge?

    ERIC Educational Resources Information Center

    Anderson, Vicki; Sleezer, Catherine M.

    1995-01-01

    Defines multimedia, explores uses of multimedia training, and discusses the effects and challenges of adding multimedia such as graphics, photographs, full motion video, sound effects, or CD-ROMs to existing training methods. Offers planning tips, and suggests software and hardware tools to help set up multimedia training programs. (JMV)

  15. The Educators' Handbook to Interactive Videodisc.

    ERIC Educational Resources Information Center

    Schwartz, Ed

    This overview of interactive videodisc technology is designed to assist educators in finding the appropriate equipment and software for any specific application. The handbook may also serve as a starting point for many educators who know nothing of the technology and assist them in deciding whether this technology is worth pursuing as an…

  16. Is There a Microcomputer in Your Future? ComputerTown Thinks The Answer Is "Yes."

    ERIC Educational Resources Information Center

    Harvie, Barbara; Anton, Julie

    1983-01-01

    The services of ComputerTown, a nonprofit computer literacy project of the People's Computer Company in Menlo Park, California with 150 worldwide affiliates, are enumerated including getting started, funding sources, selecting hardware, software selection, support materials, administrative details, special offerings (classes, events), and common…

  17. Validation of the PCN Concept: Mobility, Traffic Flow Confidentiality and Protection Against Directed Attacks

    DTIC Science & Technology

    2010-11-01

    peer, racoon (IKE-daemon) will start authenticating using certificates. After a successful authentication, IPSec security associations will be set up... colour had credentials from one CA. Racoon and ipsec-tools are open-source software, implementing IKE and IPSec. Validation of the PCN Concept; Mobility

  18. The ATLAS Data Acquisition System in LHC Run 2

    NASA Astrophysics Data System (ADS)

    Panduro Vazquez, William; ATLAS Collaboration

    2017-10-01

    The LHC has been providing pp collisions with record luminosity and energy since the start of Run 2 in 2015. The Trigger and Data Acquisition system of the ATLAS experiment has been upgraded to deal with the increased performance required by this new operational mode. The dataflow system and associated network infrastructure have been reshaped in order to benefit from technological progress and to maximize the flexibility and efficiency of the data selection process. The new design is radically different from the previous implementation both in terms of architecture and performance, with the previous two-level structure merged into a single processing farm, performing incremental data collection and analysis. In addition, logical farm slicing, with each slice managed by a dedicated supervisor, has been dropped in favour of global management by a single farm master operating at 100 kHz. This farm master has also been integrated with a new software-based Region of Interest builder, replacing the previous VMEbus-based system. Finally, the Readout system has been completely refitted with new higher performance, lower footprint server machines housing a new custom front-end interface card. Here we will cover the overall design of the system, along with performance results from the start-up phase of LHC Run 2.

  19. Usability and utility evaluation of the web-based "Should I Start Insulin?" patient decision aid for patients with type 2 diabetes among older people.

    PubMed

    Lee, Yew Kong; Lee, Ping Yein; Ng, Chirk Jenn; Teo, Chin Hai; Abu Bakar, Ahmad Ihsan; Abdullah, Khatijah Lim; Khoo, Ee Ming; Hanafi, Nik Sherina; Low, Wah Yun; Chiew, Thiam Kian

    2018-01-01

    This study aimed to evaluate the usability (ease of use) and utility (impact on the user's decision-making process) of a web-based patient decision aid (PDA) among older-age users. A pragmatic, qualitative research design was used. We recruited patients with type 2 diabetes who were at the point of making a decision about starting insulin from a tertiary teaching hospital in Malaysia in 2014. Computer screen recording software was used to record the website browsing session, and in-depth interviews were conducted while playing back the website recording. The interviews were analyzed using the framework approach to identify usability and utility issues. Three cycles of iteration were conducted until no more major issues emerged. Thirteen patients participated: the median age was 65 years, 10 were men, nine had secondary education or a diploma, and four were graduates or had a postgraduate degree. Four usability issues were identified (navigation between pages and sections, a layout with open display, simple language, and equipment preferences). For utility, participants commented that the website influenced their decision about insulin in three ways: it provided information about insulin, it helped them deliberate choices using the option-attribute matrix, and it allowed them to involve others in their decision making by sharing the PDA summary printout.

  20. External Dependencies-Driven Architecture Discovery and Analysis of Implemented Systems

    NASA Technical Reports Server (NTRS)

    Ganesan, Dharmalingam; Lindvall, Mikael; Ron, Monica

    2014-01-01

    A method for architecture discovery and analysis of implemented systems (AIS) is disclosed. The premise of the method is that architecture decisions are inspired and influenced by the external entities that the software system makes use of. Examples of such external entities are COTS components, frameworks, and ultimately even the programming language itself and its libraries. Traces of these architecture decisions can thus be found in the implemented software and are manifested in the way software systems use such external entities. While this fact is often ignored in contemporary reverse engineering methods, the AIS method actively leverages and makes use of the dependencies on external entities as a starting point for architecture discovery. The AIS method is demonstrated using NASA's Space Network Access System (SNAS). The results show that, with abundant evidence, the method offers reusable and repeatable guidelines for discovering the architecture and locating potential risks (e.g. low testability, decreased performance) that are hidden deep in the implementation. The analysis is conducted by using external dependencies to identify, classify and review a minimal set of key source code files. Given the benefits of analyzing external dependencies as a way to discover architectures, it is argued that external dependencies deserve to be treated as first-class citizens during reverse engineering. The current structure of a knowledge base of external entities and analysis questions, with strategies for getting answers, is also discussed.
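
    Since the record above hinges on harvesting external dependencies from source code, a minimal sketch of that first step may help; the Python module-scanning below, with its import regex and the stdlib_or_local filter, is an illustrative assumption and not the AIS tooling itself.

      # Sketch of external-dependency harvesting in the spirit of AIS: map each
      # external entity (top-level imported module) to the files that use it.
      import os
      import re
      from collections import defaultdict

      IMPORT_RE = re.compile(r"^\s*(?:import|from)\s+([\w.]+)", re.MULTILINE)

      def external_dependencies(root, stdlib_or_local=()):
          """Map each top-level external entity to the source files using it."""
          uses = defaultdict(set)
          for dirpath, _, files in os.walk(root):
              for name in files:
                  if not name.endswith(".py"):
                      continue
                  path = os.path.join(dirpath, name)
                  with open(path, encoding="utf-8", errors="ignore") as f:
                      for module in IMPORT_RE.findall(f.read()):
                          top = module.split(".")[0]
                          if top not in stdlib_or_local:
                              uses[top].add(path)
          return uses

      # Entities with the highest fan-in mark likely architecture decisions and
      # the minimal set of key files to review first:
      # deps = external_dependencies("src/", stdlib_or_local={"os", "re", "myapp"})
      # for entity, files in sorted(deps.items(), key=lambda kv: -len(kv[1])):
      #     print(entity, len(files))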

  1. Assessment of access to bibliographic databases and telemetry databases in Astronomy: A groundswell for development.

    NASA Astrophysics Data System (ADS)

    Diaz-Merced, Wanda Liz; Casado, Johanna; Garcia, Beatriz; Aarnio, Alicia; Knierman, Karen; Monkiewicz, Jacqueline

    2018-01-01

    Big Data" is a subject that has taken special relevance today, particularly in Astrophysics, where continuous advances in technology are leading to ever larger data sets. A multimodal approach in perception of astronomical data data (achieved through sonification used for the processing of data) increases the detection of signals in very low signal-to-noise ratio limits and is of special importance to achieve greater inclusion in the field of Astronomy. In the last ten years, different software tools have been developed that perform the sonification of astronomical data from tables or databases, among them the best known and in multiplatform development are Sonification Sandbox, MathTrack, and xSonify.In order to determine the accessibility of software we propose to start carrying out a conformity analysis of ISO (International Standard Organization) 9241-171171: 2008. This standard establishes the general guidelines that must be taken into account for accessibility in software design, and it is applied to software used in work, public places, and at home. To analyze the accessibility of web databases, we take into account the "Web Content Content Accessibility Guidelines (WCAG) 2.0", accepted and published by ISO in the ISO / IEC 40500: 2012 standard.In this poster, we present a User Centered Design (UCD), Human Computer Interaction (HCI), and User Experience (UX) framework to address a non-segregational provision of access to bibliographic databases and telemetry databases in Astronomy. Our framework is based on an ISO evaluation on a selection of data bases such as ADS, Simbad and SDSS. The WCAG 2.0 and ISO 9241-171171: 2008 should not be taken as absolute accessibility standards: these guidelines are very general, are not absolute, and do not address particularities. They are not to be taken as a substitute for UCD, HCI, UX design and evaluation. Based on our results, this research presents the framework for a focus group and qualitative data analysis aimed to lay the foundations for the employment of UCD functionalities on astronomical databases.

  2. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which imposes major limitations in many ways, starting from limited processing power, storage, accessibility, availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The cloud application is developed using free and open source software, open standards, and prototype code, and presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate geospatial collaboration platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computer environment, creates a real-time multiuser collaboration platform, its programming-language code and components are interoperable, and it is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application over additional VMs and testing the scalability and availability of services.

  3. Bringing the Unidata IDV to the Cloud

    NASA Astrophysics Data System (ADS)

    Fisher, W. I.; Oxelson Ganter, J.

    2015-12-01

    Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While traditional software engineering provides a suite of tools and methodologies which may mitigate this issue, they are typically ignored by developers lacking a background in software engineering. Causing further problems, these methodologies are best applied at the start of a project; trying to apply them to an existing, mature project can require an immense effort. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software APIs. As a result of these issues, there exists a large body of software which is simultaneously critical to the scientists who depend upon it, and yet increasingly difficult to maintain. The solution to this problem was partially provided with the advent of cloud computing: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. When coupled with containerization technology such as Docker, we are able to easily bring the same visualization software to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved.

  4. Evaluation of the diagnostic accuracy of CareStart G6PD deficiency Rapid Diagnostic Test (RDT) in a malaria endemic area in Ghana, Africa.

    PubMed

    Adu-Gyasi, Dennis; Asante, Kwaku Poku; Newton, Sam; Dosoo, David; Amoako, Sabastina; Adjei, George; Amoako, Nicholas; Ankrah, Love; Tchum, Samuel Kofi; Mahama, Emmanuel; Agyemang, Veronica; Kayan, Kingsley; Owusu-Agyei, Seth

    2015-01-01

    Glucose-6-phosphate dehydrogenase (G6PD) deficiency is the most widespread enzyme defect that can result in red cell breakdown under oxidative stress when exposed to certain medicines, including antimalarials. We evaluated the diagnostic accuracy of the CareStart G6PD deficiency Rapid Diagnostic Test (RDT) as a point-of-care tool for screening G6PD deficiency. A cross-sectional study was conducted among 206 randomly selected and consented participants from a group with known G6PD deficiency status between February 2013 and June 2013. A maximum of 1.6 ml of capillary blood was used for G6PD deficiency screening with the CareStart G6PD RDT, using the Trinity qualitative and Trinity quantitative methods as the "gold standard". Samples were also screened for the presence of malaria parasites. Data entry and analysis were done using Microsoft Access 2010 and Stata Software version 12. The Kintampo Health Research Centre Institutional Ethics Committee granted ethical approval. The sensitivity (SE) and specificity (SP) of the CareStart G6PD deficiency RDT were 100% and 72.1%, respectively, compared to the Trinity quantitative method, and 98.9% and 96.2% compared to the Trinity qualitative method. Malaria infection status had no significant (P=0.199) effect on the performance of the G6PD RDT kit compared to the "gold standard". The outcome of this study suggests that the diagnostic performance of the CareStart G6PD deficiency RDT kit was high and acceptable for determining G6PD deficiency status in a high malaria endemic area in Ghana. The RDT kit presents an attractive tool for point-of-care G6PD deficiency rapid testing in areas with high temperatures and less expertise. The CareStart G6PD deficiency RDT kit could be used to screen malaria patients before administration of fixed-dose primaquine with artemisinin-based combination therapy.
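
    For readers unfamiliar with the accuracy measures quoted above, a worked example of the definitions follows; the 2x2 counts below are hypothetical, chosen only to illustrate how figures such as SE = 100% and SP near 72% arise, and are not the study's actual data.

      # Sensitivity and specificity from a confusion matrix against the gold
      # standard; all counts here are invented for illustration.
      def sensitivity(tp, fn):
          return tp / (tp + fn)   # true positives / all truly deficient

      def specificity(tn, fp):
          return tn / (tn + fp)   # true negatives / all truly normal

      # e.g. 40 deficient and 166 normal participants by the gold standard:
      tp, fn = 40, 0        # RDT flags every deficient sample -> SE = 100%
      tn, fp = 120, 46      # 46 normal samples flagged deficient -> SP ~ 72%
      print(f"SE = {sensitivity(tp, fn):.1%}, SP = {specificity(tn, fp):.1%}")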

  5. DSN Scheduling Engine

    NASA Technical Reports Server (NTRS)

    Clement, Bradley; Johnston, Mark; Wax, Allan; Chouinard, Caroline

    2008-01-01

    The DSN (Deep Space Network) Scheduling Engine targets all space missions that use DSN services. It allows clients to issue scheduling, conflict identification, conflict resolution, and status requests in XML over a Java Message Service interface. The scheduling requests may include new requirements that represent a set of tracks to be scheduled under some constraints. This program uses a heuristic local search to schedule a variety of schedule requirements, and is being infused into the Service Scheduling Assembly, a mixed-initiative scheduling application. The engine resolves conflicting schedules of resource allocation according to a range of existing and possible requirement specifications, including optional antennas; start-of-track and track-duration ranges; periodic tracks; locks on track start, duration, and allocated antenna; MSPA (multiple spacecraft per aperture); arraying/VLBI (very long baseline interferometry)/delta DOR (differential one-way ranging); continuous tracks; segmented tracks; gap-to-track ratio; and override or block-out of requirements. The scheduling models now include conflict identification for SOA (start of activity), BOT (beginning of track), RFI (radio frequency interference), and equipment constraints. This software will search through all possible allocations while providing a best-effort solution at any time. The engine reschedules to accommodate an individual emergency track in 0.2 seconds, and emergency antenna downtime in 0.2 seconds. The software handles a doubling of one mission's track requests over one week (to 42 total) in 2.7 seconds. Further tests will be performed in the context of actual schedules.
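
    As a minimal illustration of one kind of conflict identification mentioned above - two tracks allocated to the same antenna with overlapping time windows - the sketch below uses an assumed Track record; the engine's actual data model and heuristic local search are not reproduced here.

      # Detect overlaps between time-adjacent tracks on the same antenna.
      # The Track record and its field names are illustrative assumptions.
      from dataclasses import dataclass

      @dataclass
      class Track:
          mission: str
          antenna: str
          start: float      # e.g. hours from some epoch
          duration: float

      def antenna_conflicts(tracks):
          """Return pairs of consecutive tracks overlapping on one antenna."""
          conflicts = []
          last_on = {}
          for t in sorted(tracks, key=lambda t: (t.antenna, t.start)):
              prev = last_on.get(t.antenna)
              if prev and t.start < prev.start + prev.duration:
                  conflicts.append((prev, t))
              last_on[t.antenna] = t
          return conflicts

      tracks = [Track("M1", "DSS-14", 0.0, 4.0), Track("M2", "DSS-14", 3.5, 2.0)]
      print(antenna_conflicts(tracks))   # -> one overlapping pair on DSS-14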

  6. A research on the application of software defined networking in satellite network architecture

    NASA Astrophysics Data System (ADS)

    Song, Huan; Chen, Jinqiang; Cao, Suzhi; Cui, Dandan; Li, Tong; Su, Yuxing

    2017-10-01

    Software defined networking (SDN) is a new type of network architecture that decouples the control plane from the data plane of the traditional network, offers flexible configuration, and is a direction of next-generation terrestrial Internet development. The satellite network is an important part of the space-ground integrated information network, but the traditional satellite network has the disadvantages of difficult network topology maintenance and slow configuration. The application of SDN technology in satellite networks can solve the problems that traditional satellite networks face. At present, research on the application of SDN technology in satellite networks is still at a preliminary stage. In this paper, we start by introducing SDN technology and satellite network architecture. Then we mainly introduce software defined satellite network architectures, as well as a comparison of different software defined satellite network architectures and satellite network virtualization. Finally, the present research status and development trend of SDN technology in satellite networks are analyzed.

  7. CellProfiler and KNIME: open source tools for high content screening.

    PubMed

    Stöter, Martin; Niederlein, Antje; Barsacchi, Rico; Meyenhofer, Felix; Brandl, Holger; Bickle, Marc

    2013-01-01

    High content screening (HCS) has established itself in the world of the pharmaceutical industry as an essential tool for drug discovery and drug development. HCS is currently starting to enter the academic world and might become a widely used technology. Given the diversity of problems tackled in academic research, HCS could experience some profound changes in the future, mainly with more imaging modalities and smart microscopes being developed. The main limitations on the establishment of HCS in academia are flexibility and cost. Flexibility is important to be able to adapt the HCS setup to accommodate the multiple different assays typical of academia. Many cost factors cannot be avoided, but the costs of the software packages necessary to analyze large datasets can be reduced by using Open Source software. We present and discuss the Open Source software CellProfiler for image analysis and KNIME for data analysis and data mining, which provide software solutions that increase flexibility and keep costs low.

  8. Integrating Multibody Simulation and CFD: toward Complex Multidisciplinary Design Optimization

    NASA Astrophysics Data System (ADS)

    Pieri, Stefano; Poloni, Carlo; Mühlmeier, Martin

    This paper describes the use of integrated multidisciplinary analysis and optimization of a race car model on a predefined circuit. The objective is the definition of the most efficient geometric configuration that can guarantee the lowest lap time. In order to carry out this study it has been necessary to interface the design optimization software modeFRONTIER with the following software packages: CATIA v5, a three-dimensional CAD package, used for the definition of the parametric geometry; A.D.A.M.S./Motorsport, a multi-body dynamic simulation package; IcemCFD, a mesh generator, for the automatic generation of the CFD grid; and CFX, a Navier-Stokes code, for the prediction of fluid-dynamic forces. The process integration makes it possible to compute, for each geometrical configuration, a set of aerodynamic coefficients that are then used in the multibody simulation for the computation of the lap time. Finally, an automatic optimization procedure is started and the lap time minimized. The whole process is executed on a Linux cluster running CFD simulations in parallel.
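
    The workflow above chains geometry generation, CFD, and multibody simulation inside an optimization loop. The sketch below mimics that integration shape with mock evaluation functions standing in for the CATIA/IcemCFD/CFX and ADAMS runs; the functions, coefficients, and crude random search are illustrative assumptions only, since modeFRONTIER drives the real loop with far more sophisticated strategies.

      # Schematic process integration: geometry -> aero coefficients -> lap time.
      import random

      def aero_coefficients(geometry):
          # placeholder for mesh generation + CFD: returns (drag, downforce)
          wing, ride_height = geometry
          return 0.8 + 0.3 * wing, 2.0 * wing - 0.5 * ride_height

      def lap_time(drag, downforce):
          # placeholder for the multibody simulation on the predefined circuit
          return 90.0 + 4.0 * drag - 1.5 * downforce

      best = None
      for _ in range(200):                      # crude random search; the real
          geometry = (random.uniform(0, 1),     # optimizer would propose
                      random.uniform(0, 1))     # candidates more intelligently
          t = lap_time(*aero_coefficients(geometry))
          if best is None or t < best[0]:
              best = (t, geometry)
      print(best)                               # lowest lap time found and its geometry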

  9. 360° Film Brings Bombed Church to Life

    NASA Astrophysics Data System (ADS)

    Kwiatek, K.

    2011-09-01

    This paper explores how a computer-generated reconstruction of a church can be adapted to create a panoramic film that is presented in a panoramic viewer and also on a wrap-around projection system. It focuses on the fundamental principles of creating 360° films, not only in 3D modelling software, but also presents how to record 360° video using panoramic cameras inside the heritage site. These issues are explored in a case study of Charles Church in Plymouth, UK, which was bombed in 1941 and has never been rebuilt. The generation of a 3D model of the bombed church started from the creation of five spherical panoramas and through the use of Autodesk ImageModeler software. The processed files were imported and merged together in Autodesk 3ds Max, where a visualisation of the ruin was produced. A number of historical images were found, and this collection enabled the process of a virtual reconstruction of the site. The aspect of merging two still or two video panoramas (one from 3D modelling software, the other one recorded on the site) from the same locations or with the same trajectories is also discussed. The prototype of a 360° non-linear film tells the narrative of a wartime wedding that occurred in this church. The film was presented on two 360° screens where members of the audience could decide whether to continue the ceremony or to run away when the bombing of the church starts. 3D modelling software made it possible to render a number of different alternatives (360° images and 360° video). Immersive environments empower the visitor to imagine the building before it was destroyed.

  10. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  11. Functional analysis of the promoter of the molt-inhibiting hormone (mih) gene in mud crab Scylla paramamosain.

    PubMed

    Zhang, Xin; Huang, Danping; Jia, Xiwei; Zou, Zhihua; Wang, Yilei; Zhang, Ziping

    2018-04-01

    In this study, the 5'-flanking region of the molt-inhibiting hormone (MIH) gene was cloned by Tail-PCR. It is 2024 bp starting from the translation initiation site, and 1818 bp starting from the predicted transcription start site. Bioinformatic analysis predicted that the transcription start site is located 207 bp upstream of the start codon ATG, and that a TATA box is located 240 bp upstream of the start codon ATG. Potential transcription factor binding sites include Sp1, NF-1, Oct-1, Sox-2, RAP1, and so on. There are two CpG islands, located at -25 to +183 bp and -1451 to -1316 bp, respectively. The transfection results of luciferase reporter constructs showed that the core promoter region was located in the fragment -308 bp to -26 bp. NF-kappaB and RAP1 were essential for mih basal transcriptional activity. Three kinds of CA polymorphisms in the 5'-flanking sequence can influence mih promoter activity. These findings provide a genetic foundation for further research into mih transcriptional regulation. Copyright © 2017 Elsevier Inc. All rights reserved.
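
    Because the abstract mixes two coordinate origins - positions relative to the start codon ATG and positions relative to the predicted transcription start site (TSS), 207 bp upstream of ATG - a small converter makes the arithmetic explicit; the helper below is an illustration, not part of the study.

      # Convert "x bp upstream of ATG" into a TSS-relative coordinate. The 207 bp
      # offset is consistent with the reported lengths (2024 bp from ATG versus
      # 1818 bp from the TSS, a 206 bp difference up to the usual off-by-one in
      # coordinate conventions).
      TSS_UPSTREAM_OF_ATG = 207

      def atg_to_tss(pos_upstream_of_atg):
          """'x bp upstream of ATG' -> signed position relative to the TSS."""
          return -(pos_upstream_of_atg - TSS_UPSTREAM_OF_ATG)

      print(atg_to_tss(240))   # TATA box at 240 bp upstream of ATG -> -33 from TSS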

  12. The NASA Constellation Program Procedure System

    NASA Technical Reports Server (NTRS)

    Phillips, Robert G.; Wang, Lui

    2010-01-01

    NASA has used procedures to describe activities to be performed onboard vehicles by astronaut crew and on the ground by flight controllers since Apollo. Starting with later Space Shuttle missions and the International Space Station, NASA moved forward to electronic presentation of procedures. For the Constellation Program, another large step forward is being taken - to make procedures more interactive with the vehicle and to assist the crew in controlling the vehicle more efficiently and with less error. The overall name for the project is the Constellation Procedure Applications Software System (CxPASS). This paper describes some of the history behind this effort, the key concepts and operational paradigms that the work is based upon, and the actual products being developed to implement procedures for Constellation.

  13. Production of the next-generation library virtual tour

    PubMed Central

    Duncan, James M.; Roth, Linda K.

    2001-01-01

    While many libraries offer overviews of their services through their Websites, only a small number of health sciences libraries provide Web-based virtual tours. These tours typically feature photographs of major service areas along with textual descriptions. This article describes the process for planning, producing, and implementing a next-generation virtual tour in which a variety of media elements are integrated: photographic images, 360-degree “virtual reality” views, textual descriptions, and contextual floor plans. Hardware and software tools used in the project are detailed, along with a production timeline and budget, tips for streamlining the process, and techniques for improving production. This paper is intended as a starting guide for other libraries considering an investment in such a project. PMID:11837254

  14. A Joint Optimization Criterion for Blind DS-CDMA Detection

    NASA Astrophysics Data System (ADS)

    Durán-Díaz, Iván; Cruces-Alvarez, Sergio A.

    2006-12-01

    This paper addresses the problem of the blind detection of a desired user in an asynchronous DS-CDMA communications system with multipath propagation channels. Starting from the inverse filter criterion introduced by Tugnait and Li in 2001, we propose to tackle the problem in the context of the blind signal extraction methods for ICA. In order to improve the performance of the detector, we present a criterion based on the joint optimization of several higher-order statistics of the outputs. An algorithm that optimizes the proposed criterion is described, and its improved performance and robustness with respect to the near-far problem are corroborated through simulations. Additionally, a simulation using measurements on a real software-radio platform at 5 GHz has also been performed.
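
    The paper's exact criterion is not reproduced in this abstract; as a generic illustration of what jointly optimizing several higher-order statistics of the output can look like in blind extraction, one common form (the cumulant orders and weights here are assumptions of this sketch, not the authors' exact choice) is:

      J(\mathbf{w}) \;=\; \sum_{k \in \{3,4\}} \lambda_k \,\bigl|\operatorname{cum}_k\bigl(y(n)\bigr)\bigr|^{2},
      \qquad y(n) = \mathbf{w}^{H}\mathbf{x}(n), \qquad \lVert \mathbf{w} \rVert = 1,

    where y(n) is the extractor output, cum_k its k-th order cumulant (kurtosis for k = 4), and the unit-norm constraint fixes the scale; maximizing such a J favors maximally non-Gaussian outputs, which is the working principle of ICA-based extraction detectors.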

  15. Multifit / Polydefix : a framework for the analysis of polycrystal deformation using X-rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merkel, Sébastien; Hilairet, Nadège

    2015-06-27

    Multifit/Polydefix is an open source IDL software package for the efficient processing of diffraction data obtained in deformation apparatuses at synchrotron beamlines. Multifit allows users to decompose two-dimensional diffraction images into azimuthal slices, fit peak positions, shapes and intensities, and propagate the results to other azimuths and images. Polydefix is for analysis of deformation experiments. Starting from output files created in Multifit or other packages, it will extract elastic lattice strains, evaluate sample pressure and differential stress, and prepare input files for further texture analysis. The Multifit/Polydefix package is designed to make the tedious data analysis of synchrotron-based plasticity, rheology or other time-dependent experiments very straightforward and accessible to a wider community.
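
    A rough sketch of the first Multifit step - decomposing a two-dimensional diffraction image into azimuthal slices - is given below; the NumPy binning, slice count, and array shapes are illustrative assumptions, not the IDL implementation.

      # Group pixel intensities by azimuth around the beam center and sum them
      # into per-slice radial profiles, ready for subsequent peak fitting.
      import numpy as np

      def azimuthal_slices(image, center, n_slices=36, n_radial=500):
          ny, nx = image.shape
          y, x = np.indices((ny, nx))
          dx, dy = x - center[0], y - center[1]
          radius = np.hypot(dx, dy)
          azimuth = np.degrees(np.arctan2(dy, dx)) % 360.0
          slice_idx = (azimuth / (360.0 / n_slices)).astype(int)
          r_idx = np.minimum((radius / radius.max() * n_radial).astype(int),
                             n_radial - 1)
          profiles = np.zeros((n_slices, n_radial))
          np.add.at(profiles, (slice_idx, r_idx), image)
          return profiles   # one 1D pattern per azimuthal slice

      profiles = azimuthal_slices(np.random.rand(2048, 2048), center=(1024, 1024))
      print(profiles.shape)   # (36, 500)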

  16. Improving performance with knowledge management

    NASA Astrophysics Data System (ADS)

    Kim, Sangchul

    2018-06-01

    People and organizations are unable to easily locate their experience and knowledge, so meaningful data is usually fragmented, unstructured, not up-to-date and largely incomplete. Poor knowledge management (KM) leaves a company exposed to its knowledge base - or intellectual capital - walking out of the door each year, at a rate estimated at a minimum of 10%. Knowledge management can be defined as an emerging set of organizational design and operational principles, processes, organizational structures, applications and technologies that helps knowledge workers dramatically leverage their creativity and ability to deliver business value and finally reap a competitive advantage. This paper proposes various methods and software, starting with an understanding of the enterprise aspect, and aims to inspire those who want to use KM.

  17. Software Requirements Analysis as Fault Predictor

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Waiting until the integration and system test phase to discover errors leads to more costly rework than resolving those same errors earlier in the lifecycle. Costs increase even more significantly once a software system has become operational. We can assess the quality of system requirements, but do little to correlate this information either to system assurance activities or to long-term reliability projections - both of which remain unclear and anecdotal. Extending earlier work on requirements accomplished by the ARM tool, measuring requirements quality information against code complexity and test data for the same system may be used to predict specific software modules containing high-impact or deeply embedded faults now escaping into operational systems. Such knowledge would lead to more effective and efficient test programs. It may enable insight into whether a program should be maintained or started over.

  18. Hybrid Optimization Parallel Search PACKage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2009-11-10

    HOPSPACK is open source software for solving optimization problems without derivatives. Application problems may have a fully nonlinear objective function, bound constraints, and linear and nonlinear constraints. Problem variables may be continuous, integer-valued, or a mixture of both. The software provides a framework that supports any derivative-free type of solver algorithm. Through the framework, solvers request parallel function evaluation, which may use MPI (multiple machines) or multithreading (multiple processors/cores on one machine). The framework provides a Cache and Pending Cache of saved evaluations that reduces execution time and facilitates restarts. Solvers can dynamically create other algorithms to solve subproblems, a useful technique for handling multiple start points and integer-valued variables. HOPSPACK ships with the Generating Set Search (GSS) algorithm, developed at Sandia as part of the APPSPACK open source software project.
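
    To make the GSS idea concrete, here is a minimal serial sketch of a generating-set-search iteration with an evaluation cache, echoing the Cache described above; HOPSPACK's actual solver evaluates trial points in parallel and handles constraints and integer variables, none of which this toy version attempts.

      # Pattern search along +/- coordinate directions with step contraction.
      def gss_minimize(f, x0, step=1.0, tol=1e-6, max_evals=10_000):
          n = len(x0)
          dirs = [tuple(1.0 if j == i else 0.0 for j in range(n)) for i in range(n)]
          dirs += [tuple(-d for d in v) for v in dirs]       # +/- unit vectors
          cache = {}
          def eval_cached(x):                                # saved evaluations
              if x not in cache and len(cache) < max_evals:  # reduce re-work
                  cache[x] = f(x)
              return cache.get(x, float("inf"))
          x, fx = tuple(x0), eval_cached(tuple(x0))
          while step > tol:
              trial = [tuple(xi + step * di for xi, di in zip(x, d)) for d in dirs]
              fvals = [eval_cached(t) for t in trial]        # evaluated in
              best = min(range(len(trial)), key=fvals.__getitem__)  # parallel in
              if fvals[best] < fx:                           # the real framework
                  x, fx = trial[best], fvals[best]           # successful step
              else:
                  step *= 0.5                                # contract pattern
          return x, fx

      print(gss_minimize(lambda p: (p[0] - 1) ** 2 + p[1] ** 2, (5.0, 5.0)))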

  19. Sequence analysis of the canine mitochondrial DNA control region from shed hair samples in criminal investigations.

    PubMed

    Berger, C; Berger, B; Parson, W

    2012-01-01

    In recent years, evidence from domestic dogs has increasingly been analyzed by forensic DNA testing. Especially, canine hairs have proved most suitable and practical due to the high rate of hair transfer occurring between dogs and humans. Starting with the description of a contamination-free sample handling procedure, we give a detailed workflow for sequencing hypervariable segments (HVS) of the mtDNA control region from canine evidence. After the hair material is lysed and the DNA extracted by Phenol/Chloroform, the amplification and sequencing strategy comprises the HVS I and II of the canine control region and is optimized for DNA of medium-to-low quality and quantity. The sequencing procedure is based on the Sanger Big-dye deoxy-terminator method and the separation of the sequencing reaction products is performed on a conventional multicolor fluorescence detection capillary electrophoresis platform. Finally, software-aided base calling and sequence interpretation are addressed exemplarily.

  20. On Two-Scale Modelling of Heat and Mass Transfer

    NASA Astrophysics Data System (ADS)

    Vala, J.; Št'astník, S.

    2008-09-01

    Modelling of macroscopic behaviour of materials, consisting of several layers or components, whose microscopic (at least stochastic) analysis is available, as well as (more general) simulation of non-local phenomena, complicated coupled processes, etc., requires both deeper understanding of physical principles and development of mathematical theories and software algorithms. Starting from the (relatively simple) example of phase transformation in substitutional alloys, this paper sketches the general formulation of a nonlinear system of partial differential equations of evolution for the heat and mass transfer (useful in mechanical and civil engineering, etc.), corresponding to conservation principles of thermodynamics, both at the micro- and at the macroscopic level, and suggests an algorithm for scale-bridging, based on the robust finite element techniques. Some existence and convergence questions, namely those based on the construction of sequences of Rothe and on the mathematical theory of two-scale convergence, are discussed together with references to useful generalizations, required by new technologies.

  1. Do the Critical Success Factors from Learning Analytics Predict Student Outcomes?

    ERIC Educational Resources Information Center

    Strang, Kenneth David

    2016-01-01

    This article starts with a detailed literature review of recent studies that focused on using learning analytics software or learning management system data to determine the nature of any relationships between online student activity and their academic outcomes within university-level business courses. The article then describes how data was…

  2. Computer Architecture's Changing Role in Rebooting Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeBenedictis, Erik P.

    This paper looks back at how Windows 95 started the Wintel era, in which Microsoft Windows running on Intel x86 microprocessors dominated the computer industry and changed the world. Retaining the x86 instruction set across many generations let users buy new and more capable microprocessors without having to buy software to work with new architectures.

  3. Using Empirical Evidence in the Process of Proving: The Case of Dynamic Geometry

    ERIC Educational Resources Information Center

    Guven, Bulent; Cekmez, Erdem; Karatas, Ilhan

    2010-01-01

    With the emergence of Dynamic Geometry Software (DGS), a theoretical gap between the acquisition (inductive) and the justification (deductive) of a mathematical statement has started a debate. Some educators believe that deductive proof in geometry should be abandoned in favour of an experimental approach to mathematical justification. This…

  4. Effectiveness of Strategy Instruction Using Podcasts in Second Language Listening and Speaking

    ERIC Educational Resources Information Center

    Kang, Tingting

    2016-01-01

    Mobile devices have become a significant part of students' lives. The average number of hours that college students reported using their smartphones each day was nine hours--more than half of their daily waking hours (Roberts, Yaya, & Manolis, 2014). Although software developers and teachers have started to develop and incorporate various…

  5. NREL Announces Third Round of Start-Ups to Participate in the Wells Fargo

    Science.gov Websites

    Start-ups with innovative commercial building technologies that provide scalable solutions to reduce the energy impact of commercial buildings. Round 3 participants were referred to the program by partners including the University of Colorado Boulder.

  6. Network Simulation Training Instructor's Guide and Student Handouts. Series #B01038.

    ERIC Educational Resources Information Center

    Oklahoma State Dept. of Vocational and Technical Education, Stillwater. Curriculum and Instructional Materials Center.

    This training material provides the reader with information on the installation of and instruction in the use of the on-line Novell Netware V2.15 training software, and consists of an Instructor's Guide and Student Handouts. The instructor's guide includes the following sections: system requirements; installation; starting the tutorial; completing…

  7. Computer Architecture's Changing Role in Rebooting Computing

    DOE PAGES

    DeBenedictis, Erik P.

    2017-04-26

    This paper looks back at how Windows 95 started the Wintel era, in which Microsoft Windows running on Intel x86 microprocessors dominated the computer industry and changed the world. Retaining the x86 instruction set across many generations let users buy new and more capable microprocessors without having to buy software to work with new architectures.

  8. A Multimedia Telematics Network for On-the-Job Training, Tutoring and Assessment.

    ERIC Educational Resources Information Center

    Ferreira, J. M. Martins; MacKinnon, Lachlan; Desmulliez, Marc; Foulk, Patrick

    This paper describes an educational multimedia network developed in Advanced Software for Training and Evaluation of Processes (ASTEP). ASTEP started in February 1998 and was set up by a mixed industry-academia consortium with the objective of meeting the educational/training demands of the highly competitive microelectronics/semiconductor…

  9. Observing Student Working Styles when Using Graphic Calculators to Solve Mathematics Problems

    ERIC Educational Resources Information Center

    Berry, J.; Graham, E.; Smith, A.

    2006-01-01

    Some research studies, many of which used quantitative methods, have suggested that graphics calculators can be used to effectively enhance the learning of mathematics. More recently research studies have started to explore students' styles of working as they solve problems with technology. This paper describes the use of a software application…

  10. Accurate computation and continuation of homoclinic and heteroclinic orbits for singular perturbation problems

    NASA Technical Reports Server (NTRS)

    Vaughan, William W.; Friedman, Mark J.; Monteiro, Anand C.

    1993-01-01

    In earlier papers, Doedel and the authors have developed a numerical method and derived error estimates for the computation of branches of heteroclinic orbits for a system of autonomous ordinary differential equations in R^n. The idea of the method is to reduce a boundary value problem on the real line to a boundary value problem on a finite interval by using a local (linear or higher order) approximation of the stable and unstable manifolds. A practical limitation for the computation of homoclinic and heteroclinic orbits has been the difficulty in obtaining starting orbits. Typically these were obtained from a closed form solution or via a homotopy from a known solution. Here we consider extensions of our algorithm which allow us to obtain starting orbits on the continuation branch in a more systematic way as well as make the continuation algorithm more flexible. In applications, we use the continuation software package AUTO in combination with some initial value software. The examples considered include computation of homoclinic orbits in a singular perturbation problem and in a turbulent fluid boundary layer in the wall region problem.
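
    For orientation, the standard truncation underlying such computations (sketched here in generic notation, not the papers' exact statement) replaces the boundary value problem on the real line by one on a finite interval with projection boundary conditions:

      \dot{u}(t) = f\bigl(u(t)\bigr), \qquad t \in [T_{-}, T_{+}],
      P_{-}\bigl(u(T_{-}) - p_{-}\bigr) = 0, \qquad P_{+}\bigl(u(T_{+}) - p_{+}\bigr) = 0,

    where p_- and p_+ are the connected equilibria (p_- = p_+ for a homoclinic orbit), P_- projects onto the complement of the unstable eigenspace of Df(p_-), and P_+ onto the complement of the stable eigenspace of Df(p_+); the starting orbit enters as the initial guess for u on the finite interval.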

  11. Real-Time 3D Visualization

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Butler Hine, former director of the Intelligent Mechanism Group (IMG) at Ames Research Center, and five others partnered to start Fourth Planet, Inc., a visualization company that specializes in the intuitive visual representation of dynamic, real-time data over the Internet and Intranet. Over a five-year period, the then NASA researchers performed ten robotic field missions in harsh climes to mimic the end-to-end operations of automated vehicles trekking across another world under control from Earth. The core software technology for these missions was the Virtual Environment Vehicle Interface (VEVI). Fourth Planet has released VEVI4, the fourth generation of the VEVI software, and NetVision. VEVI4 is a cutting-edge computer graphics simulation and remote control applications tool. The NetVision package allows large companies to view and analyze in virtual 3D space such things as the health or performance of their computer network or locate a trouble spot on an electric power grid. Other products are forthcoming. Fourth Planet is currently part of the NASA/Ames Technology Commercialization Center, a business incubator for start-up companies.

  12. Evaluation of the free, open source software WordPress as electronic portfolio system in undergraduate medical education.

    PubMed

    Avila, Javier; Sostmann, Kai; Breckwoldt, Jan; Peters, Harm

    2016-06-03

    Electronic portfolios (ePortfolios) are used to document and support learning activities. E-portfolios with mobile capabilities allow even more flexibility. However, the development or acquisition of ePortfolio software is often costly, and at the same time, commercially available systems may not sufficiently fit the institution's needs. The aim of this study was to design and evaluate an ePortfolio system with mobile capabilities using a free and open source software solution. We created an online ePortfolio environment using the blogging software WordPress, selected on the basis of reported capability features of such software by a qualitative weight and sum method. Technical implementation and usability were evaluated by 25 medical students during their clinical training by quantitative and qualitative means using online questionnaires and focus groups. The WordPress ePortfolio environment allowed students a broad spectrum of activities - often documented via mobile devices - like collection of multimedia evidence, posting reflections, messaging, web publishing, ePortfolio searches, collaborative learning, knowledge management in a content management system including a wiki and RSS feeds, and the use of aid tools for studying. The students' experience with WordPress revealed a few technical problems, and this report provides workarounds. The WordPress ePortfolio was rated positively by the students as a content management system (67 % of the students), for exchange with other students (74 %), as a note pad for reflections (53 %) and for its potential as an information source for assessment (48 %) and exchange with a mentor (68 %). On the negative side, 74 % of the students in this pilot study did not find it easy to get started with the system, and 63 % rated the ePortfolio as not being user-friendly. Qualitative analysis indicated a need for more introductory information and training. It is possible to build an advanced ePortfolio system with mobile capabilities with the free and open source software WordPress. This allows institutions without proprietary software to build a sophisticated ePortfolio system adapted to their needs with relatively few resources. The implementation of WordPress should be accompanied by introductory courses in the use of the software and its apps in order to facilitate its usability.
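
    As one way the mobile documentation workflow described above could be automated, the sketch below posts a reflection through the WordPress REST API; the site URL, the credentials, and the availability of the REST API with application passwords on the installation are assumptions of this sketch, not details reported by the study.

      # Create a private post (a "reflection") on a WordPress site via its REST
      # API. All names and credentials below are placeholders.
      import requests

      def post_reflection(site, user, app_password, title, body):
          resp = requests.post(
              f"{site}/wp-json/wp/v2/posts",
              auth=(user, app_password),        # WordPress application password
              json={"title": title, "content": body, "status": "private"},
              timeout=10,
          )
          resp.raise_for_status()
          return resp.json()["id"]              # id of the newly created post

      # post_reflection("https://eportfolio.example.org", "student01",
      #                 "xxxx xxxx xxxx xxxx",
      #                 "Ward round reflection", "Today I observed ...")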

  13. Médicarte software developed for the Quebec microprocessor health card project.

    PubMed

    Lavoie, G; Tremblay, L; Durant, P; Papillon, M J; Bérubé, J; Fortin, J P

    1995-01-01

    The Quebec Patient Smart Card Project is a Provincial Government initiative under the responsibility of the Régie de l'assurance-maladie du Québec (Quebec Health Insurance Board). Development, implementation, and assessment duties were assigned to a team from Université Laval, which in turn joined a group from the Direction de la santé publique du Bas-St-Laurent in Rimouski, where the experiment is taking place. The pilot project seeks to evaluate the use and acceptance of a microprocessor card as a way to improve the exchange of clinical information between card users and various health professionals. The card can be best described as a résumé containing information pertinent to an individual's health history. It is not a complete medical file; rather, it is a summary to be used as a starting point for a discussion between health professionals and patients. The target population is composed of persons 60 years and over, pregnant women, infants under 18 months, and the residents of a small town located in the target area, St-Fabien, regardless of age. The health professionals involved are general practitioners, specialists, pharmacists, nurses, and ambulance personnel. Participation in the project is on a voluntary basis. Each health care provider participating in the project has a personal identification number (PIN) and must use both an access card and a user card to access information. This prevents unauthorized access to a patient's card and allows the staff to sign and date information entered onto the patient card. To test the microprocessor card, we developed software based on a problem-oriented approach integrating diagnosis, investigations, treatments, and referrals. This software is not an expert system that constrains the clinician to a particular decisional algorithm. Instead, the software supports the physician in decision making. The software was developed with a graphical interface (Windows 3.1) to maximize its user friendliness. A version of the software was developed for each of the four groups of health care providers involved. In addition we designed an application to interface with existing pharmaceutical software. For practical reasons and to make it possible to differentiate between the different access profiles, the information stored on the card is divided in several blocks: Identification, Emergency, History (personal and family), Screening Tests, Vaccinations, Drug Profile, General follow-up, and some Specific follow-ups (Pregnancy, Ophthalmology, Kidney failure, Cardiology, Pediatrics, Diabetes, Pneumology, Specific parameters). Over 14,000 diagnoses and symptoms are classified with four levels of precision, the codification being based on the ICPC (International Classification for Primary Care). The software contains different applications to assist the clinician in decision making. A "Drug Advisor" helps the prescriber by detecting possible interactions between drugs, giving indications (doses) and contraindications, cautions, potential side-effects and therapeutic alternatives. There is also a prevention module providing recommendations for vaccination and periodic examinations based on the patient's age and sex. The pharmaceutical, vaccination, and screening tests data banks are updated every six months. These sections of the software are accessible to access card holders at any time, even without a patient card, and constitute in themselves an interesting clinical tool.
We developed a software server (SCAM) allowing the different applications to access the data in a memory card regardless of the type of memory card used. Using a single high level command language, this server provides a standardized utilization of memory cards from various manufacturers. It ensures the compatibility of the applications using the card as a storage medium. (abstract truncated)

  14. Integration of a CAS/DGS as a CAD system in the mathematics curriculum for architecture students

    NASA Astrophysics Data System (ADS)

    Falcón, R. M.

    2011-09-01

    Students of Architecture and Building Engineering Degrees work with Computer Aided Design systems daily in order to design and model architectonic constructions. Since this kind of software is based on the creation and transformation of geometrical objects, it seems to be a useful tool in Maths classes in order to capture the attention of the students. However, users of these systems cannot display the set of formulas and equations which constitute the basis of their study. Moreover, if they want to represent curves or surfaces starting from their corresponding equations, they have to define specific macros which require the knowledge of some computer language, or they have to create a table of points in order to convert a set of nodes into polylines, polysolids or splines. More specific concepts, like, for instance, those related to differential geometry, are not implemented in this kind of software, although they are taught in our Maths classes. In a very similar virtual environment, Computer Algebra and Dynamic Geometry Systems offer the possibility of implementing several concepts which can be found in the usual mathematics curriculum for Building Engineering: curves, surfaces and calculus. Specifically, the use of sliders related to the Euler angles, and the generation of tools which project 3D into 2D, facilitate the design and modelling of curves and rigid objects in space, starting from their parametric equations. In this article, we show the experience carried out with an experimental and a control group in the context of the Maths classes of the Building Engineering Degree of the University of Seville, where students have created their own building models by understanding and testing the usefulness of the mathematical concepts.
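
    A small sketch of the kind of tool described above - a parametric space curve rotated by Euler angles (the "sliders") and projected from 3D into 2D - follows; the helix example and the z-x-z rotation convention are assumptions chosen for illustration.

      # Rotate a parametric curve by Euler angles, then project orthographically
      # onto the drawing plane by dropping the z coordinate.
      import numpy as np

      def euler_zxz(alpha, beta, gamma):
          ca, sa = np.cos(alpha), np.sin(alpha)
          cb, sb = np.cos(beta), np.sin(beta)
          cg, sg = np.cos(gamma), np.sin(gamma)
          rz1 = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
          rx  = np.array([[1, 0, 0], [0, cb, -sb], [0, sb, cb]])
          rz2 = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
          return rz1 @ rx @ rz2

      t = np.linspace(0, 4 * np.pi, 200)
      helix = np.stack([np.cos(t), np.sin(t), 0.1 * t])   # x(t), y(t), z(t)
      rotated = euler_zxz(0.4, 0.8, 0.0) @ helix
      xy = rotated[:2]        # orthographic projection into the 2D plane
      print(xy.shape)         # (2, 200) -> ready to plot as a 2D polyline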

  15. Study of fault tolerant software technology for dynamic systems

    NASA Technical Reports Server (NTRS)

    Caglayan, A. K.; Zacharias, G. L.

    1985-01-01

    The major aim of this study is to investigate the feasibility of using systems-based failure detection, isolation and compensation (FDIC) techniques in building fault-tolerant software and extending them, whenever possible, to the domain of software fault tolerance. First, it is shown that systems-based FDIC methods can be extended to develop software error detection techniques by using system models for software modules. In particular, it is demonstrated that systems-based FDIC techniques can yield consistency checks that are easier to implement than acceptance tests based on software specifications. Next, it is shown that systems-based failure compensation techniques can be generalized to the domain of software fault tolerance in developing software error recovery procedures. Finally, the feasibility of using fault-tolerant software in flight software is investigated: possible system and version instabilities and functional performance degradation that may occur in N-Version programming applications to flight software are illustrated, and a comparative analysis of N-Version and recovery block techniques in the context of generic blocks in flight software is presented.
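
    For readers unfamiliar with the recovery block construct compared against N-Version programming above, a schematic sketch follows; the routines and the acceptance test are illustrative placeholders, and a systems-based consistency check could stand in for the specification-based test shown.

      # Recovery block: try the primary routine, validate the result with an
      # acceptance test, and fall back to an alternate on failure.
      import math

      def recovery_block(alternates, acceptance_test, *args):
          for routine in alternates:             # primary first, then alternates
              try:
                  result = routine(*args)
              except Exception:
                  continue                       # a crash counts as a failed test
              if acceptance_test(result):
                  return result
          raise RuntimeError("all alternates failed the acceptance test")

      primary   = lambda x: math.sqrt(x)         # raises ValueError for x < 0
      alternate = lambda x: math.sqrt(abs(x))    # more defensive fallback
      accept    = lambda r: r >= 0.0
      print(recovery_block([primary, alternate], accept, -2.0))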

  16. a N-D Virtual Notebook about the Basilica of S. Ambrogio in Milan: Information Modeling for the Communication of Historical Phases Subtraction Process

    NASA Astrophysics Data System (ADS)

    Stanga, C.; Spinelli, C.; Brumana, R.; Oreni, D.; Valente, R.; Banfi, F.

    2017-08-01

    This essay describes the combination of 3D solutions and software techniques with traditional studies and research in order to achieve an integrated digital documentation across performed surveys, collected data, and historical research. The approach of this study is based on the comparison of survey data with historical research, and on interpretations deduced from a data cross-check between the two mentioned sources. The case study is the Basilica of S. Ambrogio in Milan, one of the greatest monuments in the city, a pillar of Christianity and of the History of Architecture. It is characterized by a complex stratification of phases of restoration and transformation. Rediscovering the great richness of the traditional architectural notebook, which collected surveys and data, this research aims to realize a virtual notebook, based on a 3D model that supports the dissemination of the collected information. It can potentially be understood and accessed by anyone through the development of a mobile app. The 3D model was used to explore the different historical phases, starting from the recent layers to the oldest ones, through a virtual subtraction process, following the methods of the Archaeology of Architecture. Its components can be imported into parametric software and recognized both in their morphological and typological aspects. It is based on the concept of LoD and ReverseLoD in order to fit the accuracy required by each step of the research.

  17. 28 CFR 100.18 - Audit.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...

  18. 28 CFR 100.18 - Audit.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...

  19. 28 CFR 100.18 - Audit.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...

  20. 28 CFR 100.18 - Audit.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...

  1. 28 CFR 100.18 - Audit.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...

  2. An ontology based trust verification of software license agreement

    NASA Astrophysics Data System (ADS)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

    When we install or download software, a very long document appears stating the rights and obligations, and many people lack the patience to read or understand it. That may make users distrust the software. In this paper, we propose an ontology-based trust verification for Software License Agreements. First, this work proposes an ontology model for the domain of Software License Agreements. The domain ontology is constructed by the proposed methodology according to copyright laws and 30 software license agreements. The License Ontology can act as part of a generalized copyright-law knowledge model, and can also serve as a visualization of software licenses. Based on this proposed ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it can improve the accuracy of summarizing software licenses. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.
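
    As a toy illustration of matching license text against ontology concepts in the spirit of the approach above, the fragment below tags which rights and obligations an agreement mentions; the concept list and trigger phrases are invented and are not the paper's License Ontology.

      # Each concept carries trigger phrases; matches summarize which rights and
      # obligations the agreement text mentions.
      LICENSE_CONCEPTS = {
          "right:use":            ["grants you a license", "permitted to use"],
          "obligation:no-resale": ["may not sell", "shall not redistribute"],
          "obligation:privacy":   ["collects personal data", "usage statistics"],
      }

      def summarize_license(text):
          text = text.lower()
          return sorted(concept
                        for concept, phrases in LICENSE_CONCEPTS.items()
                        if any(p in text for p in phrases))

      agreement = ("The vendor grants you a license for one device. "
                   "You may not sell copies. The software collects personal data.")
      print(summarize_license(agreement))
      # ['obligation:no-resale', 'obligation:privacy', 'right:use']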

  3. Sourcing Lifecycle for Software as a Service (SAAS)

    NASA Astrophysics Data System (ADS)

    Santy; Sikkel, K.

    2014-03-01

    In recent years, Software as a Service (SaaS) has changed from a curiosity into an accepted, well-known concept. A key advantage of this model is that, by careful engineering, it is possible to leverage economies of scale to decrease the total cost of ownership compared to on-premises solutions. By using the guideline elaborated in this paper, companies interested in implementing SaaS will be led through the entire implementation cycle, starting before the company decides to implement SaaS and ending when the company decides to shift from the SaaS model to another model, or to another SaaS provider.

  4. Upgrade of DRAMA-ESA's Space Debris Mitigation Analysis Tool Suite

    NASA Astrophysics Data System (ADS)

    Gelhaus, Johannes; Sanchez-Ortiz, Noelia; Braun, Vitali; Kebschull, Christopher; de Oliveira, Joaquim Correia; Dominguez-Gonzalez, Raul; Wiedemann, Carsten; Krag, Holger; Vorsmann, Peter

    2013-08-01

    One decade ago, ESA started the development of the first version of the software tool called DRAMA (Debris Risk Assessment and Mitigation Analysis) to enable ESA space programs to assess their compliance with the recommendations in the European Code of Conduct for Space Debris Mitigation. This tool has been maintained, upgraded and extended over recent years and is now a combination of five individual tools, each addressing a different aspect of debris mitigation. This paper gives an overview of the new DRAMA software in general. The main tools ARES, OSCAR, MIDAS, CROC and SARA will be discussed, and the environment used by DRAMA will be explained briefly.

  5. Java Web Start based software for automated quantitative nuclear analysis of prostate cancer and benign prostate hyperplasia.

    PubMed

    Singh, Swaroop S; Kim, Desok; Mohler, James L

    2005-05-11

    Androgen acts via the androgen receptor (AR), and accurate measurement of AR protein expression levels is critical for prostate research. To demonstrate an application of this system, AR expression in paired specimens of benign prostate and prostate cancer from 20 African Americans and 20 Caucasian Americans was compared. A set of 200 immunopositive and 200 immunonegative nuclei was collected from the images using a macro developed in Image Pro Plus. Linear discriminant and logistic regression analyses were performed on the data to generate classification coefficients, which render the automated image analysis software independent of the type of immunostaining or image acquisition system used. The image analysis software performs local segmentation and uses nuclear shape and size to detect prostatic epithelial nuclei. AR expression is described by (a) the percentage of immunopositive nuclei; (b) the percentage of immunopositive nuclear area; and (c) the intensity of AR expression among immunopositive nuclei or areas. The percent positive nuclei and percent nuclear area were similar by race in both benign prostate hyperplasia and prostate cancer. In prostate cancer epithelial nuclei, African Americans exhibited 38% higher levels of AR immunostaining than Caucasian Americans (two-sided Student's t-tests; P < 0.05). The intensity of AR immunostaining was similar between races in benign prostate. The differences measured in the intensity of AR expression in prostate cancer were consistent with previous studies. Classification coefficients are required because immunostaining and image collection methods are not standardized across medical institutions and research laboratories, and they help customize the software for the specimen under study. The availability of a free, automated system creates new opportunities for testing, evaluation and use of this image analysis system by the many research groups who study nuclear protein expression.
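
    To illustrate how such classification coefficients can be produced, the sketch below fits a linear discriminant to a labelled set of nuclei. It is a minimal sketch only: the two per-nucleus features and the synthetic training data are hypothetical stand-ins, not the features or data used in the study.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(0)
      # Hypothetical per-nucleus features (e.g. mean staining intensity and
      # stained-area fraction); 200 immunopositive and 200 immunonegative
      # nuclei, matching the training set sizes mentioned in the abstract.
      X = np.vstack([rng.normal([0.8, 0.7], 0.1, (200, 2)),   # immunopositive
                     rng.normal([0.2, 0.1], 0.1, (200, 2))])  # immunonegative
      y = np.array([1] * 200 + [0] * 200)

      lda = LinearDiscriminantAnalysis().fit(X, y)
      # The fitted weights play the role of the classification coefficients,
      # decoupling scoring from the staining and image acquisition setup.
      print(lda.coef_, lda.intercept_)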

  6. Prediction of a Therapeutic Dose for Buagafuran, a Potent Anxiolytic Agent by Physiologically Based Pharmacokinetic/Pharmacodynamic Modeling Starting from Pharmacokinetics in Rats and Human.

    PubMed

    Yang, Fen; Wang, Baolian; Liu, Zhihao; Xia, Xuejun; Wang, Weijun; Yin, Dali; Sheng, Li; Li, Yan

    2017-01-01

    Physiologically based pharmacokinetic (PBPK)/pharmacodynamic (PD) models can contribute to animal-to-human extrapolation and therapeutic dose predictions. Buagafuran is a novel anxiolytic agent, and phase I clinical trials of buagafuran have been completed. In this paper, a potentially effective dose of buagafuran of 30 mg t.i.d. in humans was estimated based on the human brain concentration predicted by PBPK/PD modeling. The software GastroPlus™ was used to build the PBPK/PD model for buagafuran in rats, which related the brain tissue concentrations of buagafuran to the number of times the animals entered the open arms in the elevated plus-maze pharmacological model. Buagafuran concentrations in human plasma were fitted, and brain tissue concentrations were predicted, using a human PBPK model in which the predicted plasma profiles were in good agreement with observations. The results provide supportive data for the rational clinical use of buagafuran.

  7. An Open Source Low-Cost Automatic System for Image-Based 3d Digitization

    NASA Astrophysics Data System (ADS)

    Menna, F.; Nocerino, E.; Morabito, D.; Farella, E. M.; Perini, M.; Remondino, F.

    2017-11-01

    3D digitization of heritage artefacts, reverse engineering of industrial components and rapid prototyping-driven design are key topics today. Indeed, millions of archaeological finds all over the world need to be surveyed in 3D, either to allow convenient investigation by researchers, because they are inaccessible to visitors and scientists, or, unfortunately, because they are seriously endangered by wars and terrorist attacks. In the case of industrial and design components, on the other hand, there is often the need for deformation analyses or physical replicas starting from reality-based 3D digitisations. The paper is aligned with these needs and presents the realization of the ORION (arduinO Raspberry pI rOtating table for image based 3D recostructioN) prototype system, with its hardware and software components, providing critical insights about its modular design. ORION is an image-based 3D reconstruction system based on automated photogrammetric acquisition and processing. The system is being developed under a collaborative educational project between FBK Trento, the University of Trento and internship programs with high schools in the Trentino province (Italy).

  8. Busting out of crystallography's Sisyphean prison: from pencil and paper to structure solving at the press of a button: past, present and future of crystallographic software development, maintenance and distribution.

    PubMed

    Cranswick, Lachlan Michael David

    2008-01-01

    The history of crystallographic computing and use of crystallographic software is one which traces the escape from the drudgery of manual human calculations to a world where the user delegates most of the travail to electronic computers. In practice, this involves practising crystallographers communicating their thoughts to the crystallographic program authors, in the hope that new procedures will be implemented within their software. Against this background, the development of small-molecule single-crystal and powder diffraction software is traced. Starting with the analogue machines and the use of Hollerith tabulators of the late 1930s, it is shown that computing developments have been science-led, with new technologies being harnessed to solve pressing crystallographic problems. The development of software is also traced, with a final caution that few of the computations now performed daily are really understood by the program users. Unless a sufficient body of people continues to dismantle and re-build programs, the knowledge encoded in the old programs will become as inaccessible as the knowledge of how to build the Great Pyramid at Giza.

  9. Net-VISA used as a complement to standard software at the CTBTO: initial operational experience with next-generation software.

    NASA Astrophysics Data System (ADS)

    Le Bras, R. J.; Arora, N. S.; Kushida, N.; Kebede, F.; Feitio, P.; Tomuta, E.

    2017-12-01

    The International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has reached out to the broader scientific community through a series of conferences, the latest of which took place in June 2017 in Vienna, Austria. Stemming from this outreach effort, and following the start of research and development in 2009, the NET-VISA software, which takes a Bayesian modelling approach, has been developed to improve the key step of automatically associating seismic, hydro-acoustic, and infrasound detections. When compared with the current operational system, it has consistently been shown in off-line tests to improve the overlap with the analyst-reviewed Reviewed Event Bulletin (REB) by ten percent, to an average of 85% overlap, while the inconsistency rate remains essentially the same at about 50%. Testing by analysts in realistic conditions on a few days of data has also demonstrated the software's ability to find additional events which qualify for publication in the REB. Starting in August 2017, the automatic events produced by the software will be reviewed by analysts at the CTBTO, and we report on the initial evaluation of this introduction into operations.

  10. Combining Topological Hardware and Topological Software: Color-Code Quantum Computing with Topological Superconductor Networks

    NASA Astrophysics Data System (ADS)

    Litinski, Daniel; Kesselring, Markus S.; Eisert, Jens; von Oppen, Felix

    2017-07-01

    We present a scalable architecture for fault-tolerant topological quantum computation using networks of voltage-controlled Majorana Cooper pair boxes and topological color codes for error correction. Color codes have a set of transversal gates which coincides with the set of topologically protected gates in Majorana-based systems, namely, the Clifford gates. In this way, we establish color codes as providing a natural setting in which advantages offered by topological hardware can be combined with those arising from topological error-correcting software for full-fledged fault-tolerant quantum computing. We provide a complete description of our architecture, including the underlying physical ingredients. We start by showing that in topological superconductor networks, hexagonal cells can be employed to serve as physical qubits for universal quantum computation, and we present protocols for realizing topologically protected Clifford gates. These hexagonal-cell qubits allow for a direct implementation of open-boundary color codes with ancilla-free syndrome read-out and logical T gates via magic-state distillation. For concreteness, we describe how the necessary operations can be implemented using networks of Majorana Cooper pair boxes, and we give a feasibility estimate for error correction in this architecture. Our approach is motivated by nanowire-based networks of topological superconductors, but it could also be realized in alternative settings such as quantum-Hall-superconductor hybrids.

  11. Design and validation of a portable, inexpensive and multi-beam timing light system using the Nintendo Wii hand controllers.

    PubMed

    Clark, Ross A; Paterson, Kade; Ritchie, Callan; Blundell, Simon; Bryant, Adam L

    2011-03-01

    Commercial timing light systems (CTLS) provide precise measurement of athletes' running velocity; however, they are often expensive and difficult to transport. In this study an inexpensive, wireless and portable timing light system was created using the infrared camera in Nintendo Wii hand controllers (NWHC). Design: system creation with gold-standard validation. A Windows-based software program using NWHC to replicate a dual-beam timing gate was created. First, data collected during 2 m walking and running trials were validated against a 3D kinematic system. Second, data recorded during 5 m running trials at various intensities from standing or flying starts were compared to a single-beam CTLS and to the independent and average scores of three handheld stopwatch (HS) operators. Intraclass correlation coefficients and Bland-Altman plots were used to assess validity. Absolute error quartiles and the percentage of trials within absolute error threshold ranges were used to determine accuracy. The NWHC system was valid when compared against the 3D kinematic system (ICC = 0.99, median absolute error (MAR) = 2.95%). For the flying 5 m trials the NWHC system possessed excellent validity and precision (ICC = 0.97, MAR < 3%) when compared with the CTLS. In contrast, the NWHC system and the HS values during standing-start trials possessed only modest validity (ICC < 0.75) and accuracy (MAR > 8%). A NWHC timing light system is inexpensive, portable and valid for assessing running velocity. Errors in the 5 m standing-start trials may have been due to erroneous event detection by either the commercial or the NWHC-based timing light system.

  12. Poster — Thur Eve — 03: Application of the non-negative matrix factorization technique to [{sup 11}C]-DTBZ dynamic PET data for the early detection of Parkinson's disease

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Dong-Chang; Jans, Hans; McEwan, Sandy

    2014-08-15

    In this work, a class of non-negative matrix factorization (NMF) technique known as alternating non-negative least squares, combined with the projected gradient method, is used to analyze twenty-five [{sup 11}C]-DTBZ dynamic PET/CT brain data sets. For each subject, a two-factor model is assumed and two factors representing the striatum (factor 1) and the non-striatum (factor 2) tissues are extracted using the proposed NMF technique and the commercially available factor analysis software “Pixies”. The extracted factor 1 and 2 curves represent the binding site of the radiotracer and describe the uptake and clearance of the radiotracer by soft tissues in the brain, respectively. The proposed NMF technique uses prior information about the dynamic data to obtain sample time-activity curves representing the striatum and the non-striatum tissues. These curves are then used for “warm” starting the optimization. Factor solutions from the two methods are compared graphically and quantitatively. In healthy subjects, radiotracer uptake by factors 1 and 2 is approximately 35–40% and 60–65%, respectively. The solutions are also used to develop a factor-based metric for the detection of early, untreated Parkinson's disease. The metric stratifies healthy subjects from suspected Parkinson's patients (based on the graphical method). The analysis shows that both techniques produce comparable results with similar computational time. The “semi-automatic” approach used by the NMF technique allows clinicians to manually set a starting condition for “warm” starting the optimization in order to facilitate control and efficient interaction with the data.
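
    A minimal sketch of the alternating non-negative least squares idea with projected-gradient updates, including the "warm" start from user-supplied sample time-activity curves; the matrix shapes, step-size rule and parameter values are illustrative assumptions, not the study's actual implementation.

      import numpy as np

      def nmf_two_factor(V, H0=None, iters=500, seed=0):
          """Factor a non-negative matrix V (voxels x time) as W @ H with k = 2.

          H0 optionally "warm" starts the time-activity factors, mirroring the
          use of prior sample curves for striatum / non-striatum tissue.
          """
          rng = np.random.default_rng(seed)
          m, n = V.shape
          W = rng.random((m, 2))
          H = H0.copy() if H0 is not None else rng.random((2, n))
          for _ in range(iters):
              # Projected-gradient step on H (step 1/L, L a Lipschitz bound),
              # then projection onto the non-negative orthant.
              L = np.linalg.norm(W.T @ W, 2) + 1e-12
              H = np.maximum(H - (W.T @ (W @ H - V)) / L, 0.0)
              # Alternate: same step on W with H fixed.
              L = np.linalg.norm(H @ H.T, 2) + 1e-12
              W = np.maximum(W - ((W @ H - V) @ H.T) / L, 0.0)
          return W, H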

  13. Development and prospective evaluation of an automated software system for quality control of quantitative 99mTc-MAG3 renal studies.

    PubMed

    Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T

    2007-03-01

    Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.

  14. NHPP-Based Software Reliability Models Using Equilibrium Distribution

    NASA Astrophysics Data System (ADS)

    Xiao, Xiao; Okamura, Hiroyuki; Dohi, Tadashi

    Non-homogeneous Poisson processes (NHPPs) have gained much popularity in actual software testing phases to estimate the software reliability, the number of remaining faults in software and the software release timing. In this paper, we propose a new modeling approach for the NHPP-based software reliability models (SRMs) to describe the stochastic behavior of software fault-detection processes. The fundamental idea is to apply the equilibrium distribution to the fault-detection time distribution in NHPP-based modeling. We also develop efficient parameter estimation procedures for the proposed NHPP-based SRMs. Through numerical experiments, it can be concluded that the proposed NHPP-based SRMs outperform the existing ones in many data sets from the perspective of goodness-of-fit and prediction performance.
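
    The fundamental idea can be made concrete in a few lines: take a fault-detection time distribution F with mean mu, form its equilibrium distribution Fe(t) = (1/mu) * integral from 0 to t of (1 - F(x)) dx, and use omega * Fe(t) as the NHPP mean value function. The sketch below evaluates this numerically for a gamma detection-time distribution; the parameter values are illustrative only, not taken from the paper.

      import numpy as np
      from scipy.integrate import trapezoid
      from scipy.stats import gamma

      shape, rate = 2.0, 0.1          # illustrative gamma parameters
      mu = shape / rate               # mean fault-detection time
      omega = 100.0                   # assumed expected total number of faults

      def mean_value(t, npts=2000):
          """NHPP mean value function omega * Fe(t), Fe the equilibrium CDF."""
          x = np.linspace(0.0, t, npts)
          survival = gamma.sf(x, a=shape, scale=1.0 / rate)   # 1 - F(x)
          return omega * trapezoid(survival, x) / mu

      print(mean_value(10.0), mean_value(50.0), mean_value(500.0))
      # the last value approaches omega, i.e. all faults eventually detected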

  15. Certification of production-quality gLite Job Management components

    NASA Astrophysics Data System (ADS)

    Andreetto, P.; Bertocco, S.; Capannini, F.; Cecchi, M.; Dorigo, A.; Frizziero, E.; Giacomini, F.; Gianelle, A.; Mezzadri, M.; Molinari, E.; Monforte, S.; Prelz, F.; Rebatto, D.; Sgaravatto, M.; Zangrando, L.

    2011-12-01

    With the advent of the recent European Union (EU) funded projects aimed at achieving an open, coordinated and proactive collaboration among the European communities that provide distributed computing services, stricter requirements and quality standards will be imposed on middleware providers. Such a highly competitive and dynamic environment, organized to comply with a business-oriented model, has already started pursuing quality criteria, thus requiring rigorous procedures, interfaces and roles to be formally defined for each step of the software life-cycle. This will ensure quality-certified releases and updates of the Grid middleware. In the European Middleware Initiative (EMI), the release management for one or more components will be organized into Product Team (PT) units, fully responsible for delivering production-ready, quality-certified software and for coordinating with each other to contribute to the EMI release as a whole. This paper presents the certification process, with respect to integration, installation, configuration and testing, adopted at INFN by the Product Team responsible for the gLite Web-Service based Computing Element (CREAM CE) and for the Workload Management System (WMS). The resources used, the testbed layouts, the integration and deployment methods, and the certification steps taken to provide feedback to developers and to guarantee quality results are described.

  16. GIS application on modern Mexico

    NASA Astrophysics Data System (ADS)

    Prakash, Bharath

    This is a GIS-based tool for showcasing the history of modern Mexico, starting from the post-colonial era and extending to the elections of 2012. The tool is developed using simple language and is flexible so as to allow for future enhancements. The application consists of numerous images and textual information, as well as links, which can be used by primary and high school students to understand the history of modern Mexico, and by tourists to look for international airports and United States of America consulates. The software depicts the aftermath of the colonial era, the period of Spanish rule in Mexico. It covers various topics such as wars, politics, important personalities, drug cartels and violence. All these events are shown on GIS (Geographic Information Science) maps. The software can be customized according to user requirements and is developed using JAVA and GIS technology. The user interface is created using JAVA and MOJO, which contributes to effective learning and understanding of the concepts with ease. User interface features provided in this tool include zoom-in, zoom-out, legend editing, a location identifier, a print command, layer adding and numerous menu items.

  17. A new paradigm on battery powered embedded system design based on User-Experience-Oriented method

    NASA Astrophysics Data System (ADS)

    Wang, Zhuoran; Wu, Yue

    2014-03-01

    Battery sustainable time has recently been an active research topic in the development of battery-powered embedded products such as tablets and smart phones; it is determined by battery capacity and power consumption. Despite numerous efforts to improve battery capacity in the field of materials engineering, power consumption also plays an important role and is easier to improve in delivering a desirable user experience, especially considering the moderate advancement of batteries over the past decades. In this study, a new top-down modelling method, the User-Experience-Oriented Battery Powered Embedded System Design Paradigm, is proposed to estimate the target average power consumption, to guide hardware and software design, and eventually to approach the theoretical lowest power consumption at which the application can still provide full functionality. Starting from the 10-hour sustainable-time standard, the average working current is defined from the battery design capacity and set as a target. An implementation is then illustrated from the hardware perspective, summarized as Auto-Gating power management, and from the software perspective, which introduces a new algorithm, SleepVote, to guide system task design and scheduling.
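
    The target-setting step reduces to simple arithmetic; here is a small worked sketch under an assumed battery capacity (the capacity figure is illustrative, not from the paper).

      # Derive the target average working current from the 10-hour
      # sustainable-time standard described above.
      capacity_mah = 8000.0      # assumed design capacity of a tablet battery
      target_hours = 10.0        # sustainable-time standard
      target_avg_current_ma = capacity_mah / target_hours
      print(f"target average current: {target_avg_current_ma:.0f} mA")  # 800 mA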

  18. TOLNet Data Format for Lidar Ozone Profile & Surface Observations

    NASA Astrophysics Data System (ADS)

    Chen, G.; Aknan, A. A.; Newchurch, M.; Leblanc, T.

    2015-12-01

    The Tropospheric Ozone Lidar Network (TOLNet) is an interagency initiative started by NASA, NOAA, and EPA in 2011. TOLNet currently has six lidars and one ozonesonde station. TOLNet provides high-resolution spatio-temporal measurements of tropospheric (surface to tropopause) ozone and aerosol vertical profiles to address fundamental air-quality science questions. The TOLNet data format was developed by TOLNet members as a community standard for reporting ozone profile observations. The development of this new format was primarily based on the existing NDACC (Network for the Detection of Atmospheric Composition Change) format and the ICARTT (International Consortium for Atmospheric Research on Transport and Transformation) format. The main goal is to present the lidar observations in self-describing and easy-to-use data files. The TOLNet format is an ASCII format containing a general file header, individual profile headers, and the profile data; the last two components repeat for all profiles recorded in the file. The TOLNet format is both human and machine readable, as it adopts standard metadata entries and fixed variable names. In addition, software has been developed to check for format compliance. To be presented is a detailed description of the TOLNet format protocol and the scanning software.
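
    As a toy illustration of the compliance-checking software mentioned above, the sketch below verifies that assumed mandatory keys appear in the general file header before the first profile block. The header-key names, the profile marker, and the file name are hypothetical, since the protocol details are not reproduced here.

      def check_header(lines):
          """Toy TOLNet-style compliance check (key names are hypothetical)."""
          required = {"DATA_SOURCE", "STATION", "START_DATE"}   # assumed keys
          seen = set()
          for line in lines:
              if line.startswith("#PROFILE"):    # assumed profile delimiter
                  break
              key, _, _ = line.partition("=")
              if key.strip() in required:
                  seen.add(key.strip())
          return required - seen                  # empty set means compliant

      with open("tolnet_example.txt") as fh:      # hypothetical file name
          missing = check_header(fh)
      print("missing header keys:", sorted(missing) or "none")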

  19. ReadXplorer—visualization and analysis of mapped sequences

    PubMed Central

    Hilker, Rolf; Stadermann, Kai Bernd; Doppmeier, Daniel; Kalinowski, Jörn; Stoye, Jens; Straube, Jasmin; Winnebald, Jörn; Goesmann, Alexander

    2014-01-01

    Motivation: Fast algorithms and well-arranged visualizations are required for the comprehensive analysis of the ever-growing size of genomic and transcriptomic next-generation sequencing data. Results: ReadXplorer is a software offering straightforward visualization and extensive analysis functions for genomic and transcriptomic DNA sequences mapped on a reference. A unique specialty of ReadXplorer is the quality classification of the read mappings. It is incorporated in all analysis functions and displayed in ReadXplorer's various synchronized data viewers for (i) the reference sequence, its base coverage as (ii) normalizable plot and (iii) histogram, (iv) read alignments and (v) read pairs. ReadXplorer's analysis capability covers RNA secondary structure prediction, single nucleotide polymorphism and deletion–insertion polymorphism detection, genomic feature and general coverage analysis. Especially for RNA-Seq data, it offers differential gene expression analysis, transcription start site and operon detection as well as RPKM value and read count calculations. Furthermore, ReadXplorer can combine or superimpose coverage of different datasets. Availability and implementation: ReadXplorer is available as open-source software at http://www.readxplorer.org along with a detailed manual. Contact: rhilker@mikrobio.med.uni-giessen.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24790157

  20. CMS Analysis School Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malik, S.; Shipsey, I.; Cavanaugh, R.

    To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the earlier training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be a key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase the engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.

  1. Hazard Detection Software for Lunar Landing

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Johnson, Andrew E.; Werner, Robert A.; Montgomery, James F.

    2011-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing a system for safe and precise manned lunar landing that involves novel sensors as well as specific algorithms. ALHAT has selected imaging LIDAR (light detection and ranging) as the sensing modality for onboard hazard detection because imaging LIDARs can rapidly generate direct measurements of the lunar surface elevation from high altitude. Then, starting with the LIDAR-based Hazard Detection and Avoidance (HDA) algorithm developed for Mars landing, JPL has developed a mature set of HDA software for the manned lunar landing problem. Landing hazards exist everywhere on the Moon, and many of the more desirable landing sites are near the most hazardous terrain, so HDA is needed to autonomously and safely land payloads over much of the lunar surface. The HDA requirements used in the ALHAT project are to detect hazards that are 0.3 m tall or higher and slopes that are 5° or greater. Steep slopes, rocks, cliffs, and gullies are all hazards for landing and, by computing the local slope and roughness in an elevation map, all of these hazards can be detected. The algorithm in this innovation is used to measure slope and roughness hazards. In addition to detecting these hazards, the HDA capability also is able to find a safe landing site free of these hazards for a lunar lander with a diameter of 15 m over most of the lunar surface. This software includes an implementation of the HDA algorithm, software for generating simulated lunar terrain maps for testing, hazard detection performance analysis tools, and associated documentation. The HDA software has been deployed to Langley Research Center and integrated into the POST II Monte Carlo simulation environment. The high-fidelity Monte Carlo simulations determine the required ground spacing between LIDAR samples (ground sample distances) and the noise on the LIDAR range measurement. This simulation has also been used to determine the effect of viewing geometry on hazard detection performance. The software has also been deployed to Johnson Space Center and integrated into the ALHAT real-time hardware-in-the-loop testbed.

  2. When You Can’t Beat ’em, Join ’em: Leveraging ComplexityScience for Innovative Solutions

    DTIC Science & Technology

    2017-08-21

    Briefing charts from the ARE Technical Interchange Meeting, presented by Dr. Josef Schaff, NAVAIR 4.5 (Distribution Statement A). Topics include oscillating chemical reactions such as the Belousov-Zhabotinskii reaction as examples of complex systems, the commander's intent for a networked Navy, and the presenter's background in physics and software engineering (communications, video games, robotics), starting at NAWCAD (NADC) as a computer scientist/engineer.

  3. Conversion of DST Group Shape Optimisation Software for Increased Portability across Computing Platforms

    DTIC Science & Technology

    2016-05-01

    ... reduction achieved is small due to the starting shape being near optimal. The general arrangement and x-y coordinate system are shown in Figure 23.

  4. Using COMSOL Software on the Peregrine System | High-Performance Computing

    Science.gov Websites

    License availability can be checked with the following command: lmstat.comsol. COMSOL can be used by starting the COMSOL GUI. On a compute node, the following will bring up the COMSOL interface: module purge; module load comsol/5.3; comsol. A batch job can be run with the following command: comsol batch -inputfile myinputfile.mph -outputfile out.mph. The page also covers running a parallel COMSOL job.

  5. Teaching CAD at the University: Specifically Written or Commercial Software?

    ERIC Educational Resources Information Center

    Garcia, Ramon Rubio; Quiros, Javier Suarez; Santos, Ramon Gallego; Penin, Pedro I. Alvarez

    2007-01-01

    At most universities throughout the world Computer Aided Design is taught using commercial programs more suitable for business and industry than for teaching. This led us to write our own design program (GIcad) starting from the best-known standards on the market, but always avoiding unnecessary commands in the first steps of the learning process.…

  6. Using the Euclid RTP11.13 Repository in the SEC Environment

    DTIC Science & Technology

    2006-03-01

    ... of a wrong user/password combination. We found out that the user and password are hard-coded in the FCT software; it uses defaultEditor@rtp1113.INETI. The FCT will start, but when connecting to the Repository it fails because of the wrong user/password combination.

  7. Statistical techniques for sampling and monitoring natural resources

    Treesearch

    Hans T. Schreuder; Richard Ernst; Hugo Ramirez-Maldonado

    2004-01-01

    We present the statistical theory of inventory and monitoring from a probabilistic point of view. We start with the basics and show the interrelationships between designs and estimators illustrating the methods with a small artificial population as well as with a mapped realistic population. For such applications, useful open source software is given in Appendix 4....

  8. The DEEP-South: Scheduling and Data Reduction Software System

    NASA Astrophysics Data System (ADS)

    Yim, Hong-Suh; Kim, Myung-Jin; Bae, Youngho; Moon, Hong-Kyu; Choi, Young-Jun; Roh, Dong-Goo; the DEEP-South Team

    2015-08-01

    The DEep Ecliptic Patrol of the Southern sky (DEEP-South), started in October 2012, is currently in test runs with the first Korea Microlensing Telescope Network (KMTNet) 1.6 m wide-field telescope, located at CTIO in Chile. While the primary objective of the DEEP-South is the physical characterization of small bodies in the Solar System, it is expected to discover a large number of such bodies, many of them previously unknown. An automatic observation planning and data reduction software subsystem called "The DEEP-South Scheduling and Data reduction System" (the DEEP-South SDS) is currently being designed and implemented for observation planning, data reduction and analysis of huge amounts of data with minimum human interaction. The DEEP-South SDS consists of three software subsystems: the DEEP-South Scheduling System (DSS), the Local Data Reduction System (LDR), and the Main Data Reduction System (MDR). The DSS manages observation targets, makes decisions on target priority and observation methods, schedules nightly observations, and archives data using a Database Management System (DBMS). The LDR is designed to detect moving objects in CCD images, while the MDR conducts photometry and reconstructs lightcurves. Based on the analyses made at the LDR and the MDR, the DSS schedules follow-up observations to be conducted at the other KMTNet stations. By the end of 2015, we expect the DEEP-South SDS to achieve stable operation. We also plan to improve the SDS to accomplish a finely tuned observation strategy and more efficient data reduction in 2016.

  9. Development and Testing of Prototype Commercial Gasifier Sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zelepouga, Serguei; Moery, Nathan; Wu, Mengbai

    This report presents the results of the sensor development and testing at the Wabash River gasifier. The project work was initiated with modification of the sensor software (Task 2) to enable real-time temperature data acquisition, and to process and provide the obtained gasifier temperature information to the gasifier operators. The software modifications were conducted by North Carolina State University (NCSU) researchers. The modified software was tested at the Gas Technology Institute (GTI) combustion laboratory to assess the accuracy and repeatability of the temperature recognition algorithm. Task 3 focused on the sensor hardware modifications needed to improve the reliability of the sensor system. NCSU conducted numerical modeling of the sensor probe's purging flow. Based on the modeling results, the probe purging system was redesigned to prevent carbon particulate deposition on the probe's sapphire window. The modified design was evaluated and approved by the Wabash representative. The modified gasifier sensor was built and installed at the Wabash River gasifier on May 1, 2014 (Task 4). The sensor was tested from the startup of the gasifier on May 5, 2015 until the planned autumn gasifier outage starting at the beginning of October, 2015 (Task 5). The project team successfully demonstrated the Gasifier Sensor system's ability to monitor gasifier temperature while maintaining unobstructed optical access for six months without any maintenance. The sensor examination upon completion of the trial revealed that the system did not sustain any damage.

  10. BossPro: a biometrics-based obfuscation scheme for software protection

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan A.; Al-Assam, Hisham

    2013-05-01

    This paper proposes to integrate biometric-based key generation into an obfuscated interpretation algorithm to protect authentication application software from illegitimate use or reverse engineering. This is especially necessary for mCommerce because application programmes on mobile devices, such as smartphones and tablet PCs, are typically open to misuse by hackers. Therefore, the scheme proposed in this paper ensures that a correct interpretation/execution of the obfuscated program code of the authentication application requires a valid biometrically generated key of the actual person to be authenticated, in real time. Without this key, the real semantics of the program cannot be understood by an attacker even if he/she gains access to the application code. Furthermore, the security provided by this scheme can be a vital aspect in protecting any application running on mobile devices, which are increasingly used to perform business, financial or other security-related applications but are easily lost or stolen. The scheme starts by creating a personalised copy of the application based on the biometric key generated during an enrolment process with the authenticator, as well as a nonce created at the time of communication between the client and the authenticator. The obfuscated code is then shipped to the client's mobile device and integrated with biometric data extracted from the client in real time to form the unlocking key during execution. The novelty of this scheme is achieved by the close binding of the application program to the biometric key of the client, thus making the application unusable for others. Trials and experimental results on biometric key generation, based on clients' faces, and an implemented scheme prototype, based on the Android emulator, prove the concept and novelty of this proposed scheme.

  11. Hybrid region merging method for segmentation of high-resolution remote sensing images

    NASA Astrophysics Data System (ADS)

    Zhang, Xueliang; Xiao, Pengfeng; Feng, Xuezhi; Wang, Jiangeng; Wang, Zuo

    2014-12-01

    Image segmentation remains a challenging problem for object-based image analysis. In this paper, a hybrid region merging (HRM) method is proposed to segment high-resolution remote sensing images. HRM integrates the advantages of global-oriented and local-oriented region merging strategies into a unified framework. The globally most-similar pair of regions is used to determine the starting point of a growing region, which provides an elegant way to avoid the problem of starting point assignment and to enhance the optimization ability for local-oriented region merging. During the region growing procedure, the merging iterations are constrained within the local vicinity, so that the segmentation is accelerated and can reflect the local context, as compared with the global-oriented method. A set of high-resolution remote sensing images is used to test the effectiveness of the HRM method, and three region-based remote sensing image segmentation methods are adopted for comparison, including the hierarchical stepwise optimization (HSWO) method, the local-mutual best region merging (LMM) method, and the multiresolution segmentation (MRS) method embedded in eCognition Developer software. Both the supervised evaluation and visual assessment show that HRM performs better than HSWO and LMM by combining both their advantages. The segmentation results of HRM and MRS are visually comparable, but HRM can describe objects as single regions better than MRS, and the supervised and unsupervised evaluation results further prove the superiority of HRM.
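
    A compact sketch of the hybrid strategy described above: seed each growing region with the globally most-similar adjacent pair, then keep merging within the seed's local vicinity until no candidate beats the scale threshold. The adjacency bookkeeping and the dissimilarity callback are simplified assumptions for illustration, not the authors' implementation.

      def merge(adj, a, b):
          """Assumed helper: fuse region b into region a, repair adjacency."""
          adj[a] |= adj.pop(b)
          for nbrs in adj.values():
              if b in nbrs:
                  nbrs.discard(b)
                  nbrs.add(a)
          adj[a] -= {a, b}
          return a

      def hybrid_region_merging(adj, dissim, threshold):
          """adj: dict region-id -> set of neighbour ids; dissim(a, b) -> cost."""
          while True:
              # Global step: locate the globally most-similar adjacent pair.
              pairs = [(dissim(a, b), a, b)
                       for a in adj for b in adj[a] if a < b]
              if not pairs:
                  return adj
              cost, a, b = min(pairs)
              if cost > threshold:
                  return adj
              seed = merge(adj, a, b)
              # Local step: grow the seed within its own neighbourhood only.
              while adj[seed]:
                  c, n = min((dissim(seed, m), m) for m in adj[seed])
                  if c > threshold:
                      break
                  seed = merge(adj, seed, n)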

  12. Achieving Better Buying Power through Acquisition of Open Architecture Software Systems for Web-Based and Mobile Devices

    DTIC Science & Technology

    2015-05-01

    Report documentation (2015) for "Achieving Better Buying Power through Acquisition of Open Architecture Software Systems for Web-Based and Mobile Devices" by Walt Scacchi and Thomas... Topics include open architecture (OA) software systems and emerging challenges in achieving Better Buying Power (BBP) via OA software systems for Web-based and mobile devices.

  13. Incorporating Code-Based Software in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Doehler, Kirsten; Taylor, Laura

    2015-01-01

    This article is based on the experiences of two statistics professors who have taught students to write and effectively utilize code-based software in a college-level introductory statistics course. Advantages of using software and code-based software in this context are discussed. Suggestions are made on how to ease students into using code with…

  14. Toward full life cycle control: Adding maintenance measurement to the SEL

    NASA Technical Reports Server (NTRS)

    Rombach, H. Dieter; Ulery, Bradford T.; Valett, Jon D.

    1992-01-01

    Organization-wide measurement of software products and processes is needed to establish full life cycle control over software products. The Software Engineering Laboratory (SEL)--a joint venture between NASA GSFC, the University of Maryland, and Computer Sciences Corporation--started measurement of software development more than 15 years ago. Recently, the measurement of maintenance was added to the scope of the SEL. In this article, the maintenance measurement program is presented as an addition to the already existing and well-established SEL development measurement program and evaluated in terms of its immediate benefits and long-term improvement potential. Immediate benefits of this program for the SEL include an increased understanding of the maintenance domain, the differences and commonalities between development and maintenance, and the cause-effect relationships between development and maintenance. Initial results from a sample maintenance study are presented to substantiate these benefits. The long-term potential of this program includes the use of maintenance baselines to better plan and manage future projects and to improve development and maintenance practices for future projects wherever warranted.

  15. Speech recognition technology: an outlook for human-to-machine interaction.

    PubMed

    Erdel, T; Crooks, S

    2000-01-01

    Speech recognition, as an enabling technology in healthcare-systems computing, is a topic that has been discussed for quite some time but is just now coming to fruition. Traditionally, speech-recognition software has been constrained by hardware, but improved processors and increased memory capacities are starting to remove some of these limitations. With these barriers removed, companies that create software for the healthcare setting have the opportunity to write more successful applications. Among the criticisms of speech-recognition applications are high error rates and steep training curves. However, even in the face of such negative perceptions, there remain significant opportunities for speech recognition to allow healthcare providers and, more specifically, physicians to work more efficiently and ultimately spend more time with their patients and less time completing necessary documentation. This article will identify opportunities for the inclusion of speech-recognition technology in the healthcare setting and examine the major categories of speech-recognition software--continuous speech recognition, command and control, and text-to-speech. We will discuss the advantages and disadvantages of each area, the limitations of the software today, and how future trends might affect them.

  16. Parallel approach for bioinspired algorithms

    NASA Astrophysics Data System (ADS)

    Zaporozhets, Dmitry; Zaruba, Daria; Kulieva, Nina

    2018-05-01

    In the paper, a probabilistic parallel approach based on population heuristics, such as the genetic algorithm, is suggested. The authors propose using a multithreading approach at the micro level, at which new alternative solutions are generated. In each iteration, several threads can be started that independently use the same population to generate new solutions. After all threads have finished, a selection operator combines the obtained results into the new population. To confirm the effectiveness of the suggested approach, the authors have developed software with which experimental computations can be carried out, considering a classic optimization problem – finding a Hamiltonian cycle in a graph. Experiments show that, due to the parallel approach at the micro level, an increase in running speed is obtained on graphs with 250 or more vertices.
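
    A minimal sketch of the micro-level threading scheme for a tour (Hamiltonian cycle) encoding: each thread independently breeds children from the shared parent population, and a selection operator then combines all results into the next population. The crossover, mutation rate and fitness are illustrative choices, and in CPython the global interpreter lock limits the speed-up this toy version can actually show.

      import random
      from concurrent.futures import ThreadPoolExecutor

      def breed_batch(population, n_children, seed):
          """One thread's work: generate children from the shared population."""
          rng = random.Random(seed)
          children = []
          for _ in range(n_children):
              a, b = rng.sample(population, 2)
              cut = rng.randrange(1, len(a))
              head = a[:cut]
              child = head + [v for v in b if v not in head]  # order crossover
              if rng.random() < 0.2:                          # swap mutation
                  i, j = rng.sample(range(len(child)), 2)
                  child[i], child[j] = child[j], child[i]
              children.append(child)
          return children

      def parallel_iteration(population, tour_length, n_threads=4, per_thread=25):
          with ThreadPoolExecutor(max_workers=n_threads) as pool:
              batches = pool.map(breed_batch, [population] * n_threads,
                                 [per_thread] * n_threads, range(n_threads))
          pooled = population + [c for batch in batches for c in batch]
          pooled.sort(key=tour_length)   # selection combines the thread results
          return pooled[:len(population)]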

  17. Business diversification - In the businesses of desk calculator, semiconductor and liquid crystal

    NASA Astrophysics Data System (ADS)

    Asada, Atsushi

    This is a record of a lecture at the 27th Annual Meeting on Information Science and Technology. The lecturer, a staff member of Sharp Corp., explained the company's business diversification. The company started with electric appliances. After applying computer technology, it succeeded in the desk calculator business. Aiming at calculators for personal use, it entered the semiconductor business, and it developed its liquid crystal business to make calculators thinner. Based on these businesses, it expanded into OA appliances and developed businesses combining electric appliances and information, including distribution and marketing. Businesses in the 1990s will be expected to provide services by customizing hardware, software and systems, with efforts to enhance their added value.

  18. The MIGenAS integrated bioinformatics toolkit for web-based sequence analysis

    PubMed Central

    Rampp, Markus; Soddemann, Thomas; Lederer, Hermann

    2006-01-01

    We describe a versatile and extensible integrated bioinformatics toolkit for the analysis of biological sequences over the Internet. The web portal offers convenient interactive access to a growing pool of chainable bioinformatics software tools and databases that are centrally installed and maintained by the RZG. Currently, supported tasks comprise sequence similarity searches in public or user-supplied databases, computation and validation of multiple sequence alignments, phylogenetic analysis and protein–structure prediction. Individual tools can be seamlessly chained into pipelines allowing the user to conveniently process complex workflows without the necessity to take care of any format conversions or tedious parsing of intermediate results. The toolkit is part of the Max-Planck Integrated Gene Analysis System (MIGenAS) of the Max Planck Society available at (click ‘Start Toolkit’). PMID:16844980

  19. Adjoint Airfoil Optimization of Darrieus-Type Vertical Axis Wind Turbine

    NASA Astrophysics Data System (ADS)

    Fuchs, Roman; Nordborg, Henrik

    2012-11-01

    We present the feasibility of using an adjoint solver to optimize the torque of a Darrieus-type vertical axis wind turbine (VAWT). We start with a 2D cross section of a symmetrical airfoil and restrict ourselves to low solidity ratios to minimize blade-vortex interactions. The adjoint solver of the ANSYS FLUENT software package computes the sensitivities of the airfoil surface forces based on a steady flow field. Hence, we find the torque of a full revolution using a weighted average of the sensitivities at different wind speeds and angles of attack. The weights are computed analytically, and the range of angles of attack is given by the tip speed ratio. The airfoil geometry is then evolved, and the proposed methodology is evaluated by transient simulations.
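
    The range of angles of attack entering that weighted average follows from the tip speed ratio through standard Darrieus rotor geometry; below is a short sketch under an assumed tip-speed ratio (the relation is textbook geometry and the value of lam is illustrative; neither is quoted from the abstract).

      import numpy as np

      lam = 3.5                                    # assumed tip-speed ratio
      theta = np.linspace(0.0, 2.0 * np.pi, 721)   # azimuth over one revolution
      # Geometric angle of attack of a Darrieus blade (induction neglected):
      alpha = np.arctan2(np.sin(theta), lam + np.cos(theta))
      print(f"AoA range: {np.degrees(alpha).min():.1f} to "
            f"{np.degrees(alpha).max():.1f} degrees")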

  20. Characterization of natural puya sand extract of Central Kalimantan by using X-Ray Diffraction

    NASA Astrophysics Data System (ADS)

    Suastika, K. G.; Karelius, K.; Sudyana, I. N.

    2018-03-01

    Zircon sand extraction in this study uses natural sand material from the Kereng Pangi village of Central Kalimantan, also known as Puya sand. The Puya sand is extracted in three steps: first, magnetic separation; second, immersion in HCl; and third, reaction with NaOH. A sample from each extraction step is analyzed with X-Ray Diffraction (XRD). Based on quantitative analysis using the X'Pert HighScore Plus software, the samples are identified mostly as zircon (ZrSiO4) and silica (SiO2). After the immersion process with HCl, the silica content decreases and the zircon content climbs to 74%. In the reaction process with NaOH, the zircon content increases further to 88%.

  1. Repercussions of imprisonment for conjugal violence: discourses of men

    PubMed Central

    de Sousa, Anderson Reis; Pereira, Álvaro; Paixão, Gilvânia Patrícia do Nascimento; Pereira, Nadirlene Gomes; Campos, Luana Moura; Couto, Telmara Menezes

    2016-01-01

    ABSTRACT Objective: to know the consequences that men experience related to incarceration for conjugal violence. Methods: a qualitative study of 20 men in jail and indicted in criminal processes related to conjugal violence in a court specialized in family and domestic violence against women. The interviews were classified based on the Collective Subject Discourse method, using NVIVO® software. Results: the collective discourse shows that the experience of preventive imprisonment starts a process of family dismantling, social stigma, financial hardship and psycho-emotional symptoms such as phobia, depression, hypertension, and headaches. Conclusion: given the physical, mental and social consequences of the conjugal-violence-related imprisonment experience, it is urgent to look carefully into the somatization process as well as into prevention strategies regarding this process. PMID:27982312

  2. Molecular radiotherapy: the NUKFIT software for calculating the time-integrated activity coefficient.

    PubMed

    Kletting, P; Schimmel, S; Kestler, H A; Hänscheid, H; Luster, M; Fernández, M; Bröer, J H; Nosske, D; Lassmann, M; Glatting, G

    2013-10-01

    Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm which allows for an objective and reproducible determination of the time-integrated activity coefficient and its standard error. The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the data used. To estimate the values of the adjustable parameters, an objective function, depending on the data, the parameters of the error model, the fitting function and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness, the starting values are automatically determined using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard error of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions which are most supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented using MATLAB. To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT and SAAM numerical, a commercially available software tool, were compared. The automatic search for starting values was successfully tested for reproducibility. The quality criteria applied in conjunction with the Akaike information criterion allowed the selection of suitable functions. Function fit parameters and their standard errors estimated using SAAM numerical and NUKFIT showed differences of <1%. The differences for the time-integrated activity coefficients were also <1% (standard error between 0.4% and 3%). In general, the application of the software is user-friendly and the results are mathematically correct and reproducible. An application of NUKFIT is presented for three different clinical examples. The software tool, with its underlying methodology, can be employed to objectively and reproducibly estimate the time-integrated activity coefficient and its standard error for most time-activity data in molecular radiotherapy.
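
    A condensed sketch of the underlying methodology (not the NUKFIT code itself): fit competing sums of exponentials, select the function best supported by the data with the corrected Akaike information criterion, and integrate the winner analytically. The data points and starting values below are invented for illustration.

      import numpy as np
      from scipy.optimize import curve_fit

      def mono(t, a1, k1):
          return a1 * np.exp(-k1 * t)

      def bi(t, a1, k1, a2, k2):
          return a1 * np.exp(-k1 * t) + a2 * np.exp(-k2 * t)

      def aicc(y, yhat, p):
          """Corrected Akaike information criterion for a least-squares fit."""
          n, rss = len(y), float(np.sum((y - yhat) ** 2))
          return n * np.log(rss / n) + 2 * p + 2 * p * (p + 1) / (n - p - 1)

      t = np.array([0.5, 1.0, 4.0, 24.0, 48.0, 96.0, 144.0])        # hours
      y = np.array([0.87, 0.77, 0.36, 0.065, 0.038, 0.015, 0.006])  # activity

      fits = {}
      for name, f, p0 in [("mono", mono, [1.0, 0.1]),
                          ("bi", bi, [0.8, 0.3, 0.1, 0.02])]:
          popt, _ = curve_fit(f, t, y, p0=p0, maxfev=10000)
          fits[name] = (popt, aicc(y, f(t, *popt), len(popt)))

      best = min(fits, key=lambda name: fits[name][1])
      popt = fits[best][0]
      # Integral of sum(a_i * exp(-k_i * t)) over [0, inf) is sum(a_i / k_i).
      tiac = sum(popt[i] / popt[i + 1] for i in range(0, len(popt), 2))
      print(best, "time-integrated activity coefficient ~", round(tiac, 2))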

  3. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for the development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open-source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operations components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high-level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  4. Taking advantage of ground data systems attributes to achieve quality results in testing software

    NASA Technical Reports Server (NTRS)

    Sigman, Clayton B.; Koslosky, John T.; Hageman, Barbara H.

    1994-01-01

    During the software development life cycle process, basic testing starts with the development team. At the end of the development process, an acceptance test is performed for the user to ensure that the deliverable is acceptable. Ideally, the delivery is an operational product with zero defects. However, the goal of zero defects is normally not achieved but is successful to various degrees. With the emphasis on building low cost ground support systems while maintaining a quality product, a key element in the test process is simulator capability. This paper reviews the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) test tool that is used in the acceptance test process for unmanned satellite operations control centers. The TASS is designed to support the development, test and operational environments of the Goddard Space Flight Center (GSFC) operations control centers. The TASS uses the same basic architecture as the operations control center. This architecture is characterized by its use of distributed processing, industry standards, commercial off-the-shelf (COTS) hardware and software components, and reusable software. The TASS uses much of the same TPOCC architecture and reusable software that the operations control center developer uses. The TASS also makes use of reusable simulator software in the mission specific versions of the TASS. Very little new software needs to be developed, mainly mission specific telemetry communication and command processing software. By taking advantage of the ground data system attributes, successful software reuse for operational systems provides the opportunity to extend the reuse concept into the test area. Consistency in test approach is a major step in achieving quality results.

  5. Finite-Fault and Other New Capabilities of CISN ShakeAlert

    NASA Astrophysics Data System (ADS)

    Boese, M.; Felizardo, C.; Heaton, T. H.; Hudnut, K. W.; Hauksson, E.

    2013-12-01

    Over the past 6 years, scientists at Caltech, UC Berkeley, the Univ. of Southern California, the Univ. of Washington, the US Geological Survey, and ETH Zurich (Switzerland) have developed the 'ShakeAlert' earthquake early warning demonstration system for California and the Pacific Northwest. We have now started to transform this system into a stable end-to-end production system that will be integrated into the daily routine operations of the CISN and PNSN networks. To quickly determine the earthquake magnitude and location, ShakeAlert currently processes and interprets real-time data streams from several hundred seismic stations within the California Integrated Seismic Network (CISN) and the Pacific Northwest Seismic Network (PNSN). Based on these parameters, the 'UserDisplay' software predicts and displays the arrival and intensity of shaking at a given user site. Real-time ShakeAlert feeds are currently being shared with around 160 individuals, companies, and emergency response organizations to gather feedback about the system performance, to educate potential users about EEW, and to identify needs and applications of EEW in a future operational warning system. To improve the performance during large earthquakes (M>6.5), we have started to develop, implement, and test a number of new algorithms for the ShakeAlert system: the 'FinDer' (Finite Fault Rupture Detector) algorithm provides real-time estimates of the locations and extents of finite-fault ruptures from high-frequency seismic data. The 'GPSlip' algorithm estimates the fault slip along these ruptures using high-rate real-time GPS data. And, third, a new type of ground-motion prediction model, derived from over 415,000 rupture simulations along active faults in southern California, improves MMI intensity predictions for large earthquakes by accounting for finite-fault, rupture directivity, and basin response effects. FinDer and GPSlip are currently being tested, both in real time and offline, in a separate internal ShakeAlert installation at Caltech. Real-time position and displacement time series from around 100 GPS sensors are obtained in JSON format from RTK/PPP(AR) solutions using the RTNet software at USGS Pasadena. We have also started to investigate the use of onsite (in-receiver) processing using NetR9 with RTX and tracebuf2 output format. A number of changes to the ShakeAlert processing, the XML message format, and the use of this information in the UserDisplay software were necessary to handle the new finite-fault and slip information from the FinDer and GPSlip algorithms. In addition, we have developed a framework for end-to-end offline testing with archived and simulated waveform data using the Earthworm tankplayer. Detailed background information about the algorithms, processing, and results from these test runs will be presented.

  6. Managing Complexity - Developing the Node Control Software For The International Space Station

    NASA Technical Reports Server (NTRS)

    Wood, Donald B.

    2000-01-01

    On December 4th, 1998 at 3:36 AM STS-88 (the space shuttle Endeavor) was launched with the "Node 1 Unity Module" in its payload bay. After working on the Space Station program for a very long time, that launch was one of the most beautiful sights I had ever seen! As the Shuttle proceeded to rendezvous with the Russian-American module known as Zarya, I returned to Houston quickly to start monitoring the activation of the software I had spent the last 3 years working on. The FGB module (also known as "Zarya") was grappled by the shuttle robotic arm and connected to the Unity module. Crewmembers then hooked up the power and data connections between Zarya and Unity. On December 7th, 1998 at 9:49 PM CST the Node Control Software was activated. On December 15th, 1998, the Node-1/Zarya "cornerstone" of the International Space Station was left on-orbit. The Node Control Software (NCS) is the first software flown by NASA for the International Space Station (ISS). The ISS Program is considered the most complex international engineering effort ever undertaken. At last count some 18 countries are active partners in this global venture. NCS has performed all of its intended functions on orbit, over 200 miles above us. I'll be describing how we built the NCS software.

  7. CALCOM: a software for calculating the center of mass of proteins.

    PubMed

    Costantini, Susan; Paladino, Antonella; Facchiano, Angelo M

    2008-02-09

    The center of mass of a protein is an artificial point useful for detecting important and simple features of protein structure, shape, and association. CALCOM is a software tool which calculates the center of mass of a protein, starting from PDB protein structure files. In the case of protein complexes and of protein-small ligand complexes, the position of protein residues or of ligand atoms with respect to each protein subunit can be evaluated, as well as the distances among the centers of mass of the protein subunits, in order to compare different conformations and evaluate the relative motion of subunits. The service is available at the URL: http://bioinformatica.isa.cnr.it/CALCOM/.
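
    To make the computation concrete, here is a minimal sketch of a mass-weighted center of mass computed from PDB ATOM records; the small element-mass table, the carbon fallback, and the input file name are assumptions, and CALCOM's actual parsing and mass conventions may differ.

    ```python
    # Minimal sketch: mass-weighted center of mass from PDB ATOM records.
    # Assumptions: a tiny element-mass table and simple fixed-column parsing;
    # CALCOM's actual handling of chains, ligands and masses may differ.
    MASS = {"H": 1.008, "C": 12.011, "N": 14.007, "O": 15.999, "S": 32.06}

    def center_of_mass(pdb_path: str):
        total = 0.0
        cx = cy = cz = 0.0
        with open(pdb_path) as fh:
            for line in fh:
                if not line.startswith(("ATOM", "HETATM")):
                    continue
                x = float(line[30:38])   # PDB fixed columns for coordinates
                y = float(line[38:46])
                z = float(line[46:54])
                element = line[76:78].strip() or line[12:14].strip()
                m = MASS.get(element, 12.011)  # fall back to carbon mass
                total += m
                cx += m * x
                cy += m * y
                cz += m * z
        return cx / total, cy / total, cz / total

    print(center_of_mass("protein.pdb"))  # hypothetical input file
    ```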

  8. A New GRB follow-up Software at TUG

    NASA Astrophysics Data System (ADS)

    Dindar, M.; Parmaksizoglu, M.; Helhel, S.; Esenoglu, H.; Kirbiyik, H.

    2016-12-01

    A gamma-ray burst (GRB) optical photometric follow-up system at the TUBITAK (Scientific and Technological Research Council of Turkey) National Observatory (TUG) has been planned. It uses the 0.6 m Telescope (T60) and can automatically respond to GRB Coordinates Network (GCN) alerts. The telescope slews relatively fast, being able to point to a new target field within 30 s upon request. Whenever available, the 1 m T100 and 2.5 m RTT150 telescopes will be used in the future. As an example, in 2015 the GRB software system (to run server-side) at the T60 telescope responded autonomously to a GRB alert and started the observation as early as 129 s after the GRB trigger.

  9. Enhancing requirements engineering for patient registry software systems with evidence-based components.

    PubMed

    Lindoerfer, Doris; Mansmann, Ulrich

    2017-07-01

    Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches to the selection of software as well as the construction of proprietary systems are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates, and standards shows a broad consensus, but also differences on registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step toward creating standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Software Engineering Laboratory (SEL) data base reporting software user's guide and system description. Volume 1: Introduction and user's guide

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Reporting software programs provide formatted listings and summary reports of the Software Engineering Laboratory (SEL) data base contents. The operating procedures and system information for 18 different reporting software programs are described. Sample output reports from each program are provided.

  11. Integrating automated support for a software management cycle into the TAME system

    NASA Technical Reports Server (NTRS)

    Sunazuka, Toshihiko; Basili, Victor R.

    1989-01-01

    Software managers are interested in the quantitative management of software quality, cost, and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle: quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control, and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
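
    A one-glance illustration of the GQM idea follows; the goal, questions, and metrics below are invented for illustration and are not taken from the TAME or SQMAR work.

    ```python
    # Illustrative GQM (goal/question/metric) structure; content is invented.
    gqm = {
        "goal": "Improve the reliability of the flight dynamics subsystem",
        "questions": [
            {
                "question": "How many defects escape to acceptance testing?",
                "metrics": ["defects found in acceptance test / KLOC"],
            },
            {
                "question": "Is defect detection improving over releases?",
                "metrics": ["defect detection rate per inspection hour",
                            "trend of post-release defect density"],
            },
        ],
    }

    for q in gqm["questions"]:
        print(q["question"], "->", ", ".join(q["metrics"]))
    ```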

  12. Implementing Software Safety in the NASA Environment

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha S.; Radley, Charles F.

    1994-01-01

    Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer-related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies, and sound software and assurance practices, to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of the system to be built. Shortly thereafter, as the system requirements are being defined, the second iteration of hazard analyses takes place, the system hazard analysis (SHA). During the system requirements phase, decisions are made as to what functions of the system will be the responsibility of software. This is the most critical time to affect the safety of the software. From this point, software safety analyses as well as software engineering practices are the main focus for assuring safe software. While many of the steps proposed in this paper seem like just sound engineering practices, they are the best technical and most cost-effective means to assure safe software within a safe system.

  13. The Very Large Array Data Processing Pipeline

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako

    2018-01-01

    We present the VLA Pipeline, software that is part of the larger pipeline processing framework used for the Karl G. Jansky Very Large Array (VLA) and the Atacama Large Millimeter/submillimeter Array (ALMA) for both interferometric and single-dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a "context" structure for tracking the heuristic decisions and processing results. The pipeline "weblog" acts as the user interface for verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven-year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S-band (2-4 GHz). This 5500-hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and calibration and imaging results produced by the pipeline. Future development will include the testing of spectral line recipes, low signal-to-noise heuristics, and serving as a testing platform for science-ready data products. The pipeline is developed as part of the CASA software package by an international consortium of scientists and software developers based at the National Radio Astronomy Observatory (NRAO), the European Southern Observatory (ESO), and the National Astronomical Observatory of Japan (NAOJ).
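
    The following sketch illustrates the described pattern of atomic tasks that consult heuristics and record decisions and results in a shared "context"; the class and parameter names are invented for illustration and are not the actual CASA pipeline classes.

    ```python
    # Illustrative sketch of the described pattern: atomic pipeline tasks that
    # consult heuristics and record results in a shared "context". The class
    # names are invented; they are not CASA's actual pipeline classes.
    class Context:
        def __init__(self):
            self.decisions = {}   # heuristic choices made so far
            self.results = []     # per-stage results for the weblog

    class PipelineTask:
        name = "base"
        def heuristics(self, ctx: Context) -> dict:
            raise NotImplementedError
        def execute(self, ctx: Context) -> None:
            params = self.heuristics(ctx)
            ctx.decisions[self.name] = params
            ctx.results.append((self.name, f"ran with {params}"))

    class FlagCalibrators(PipelineTask):
        name = "flag_calibrators"
        def heuristics(self, ctx):
            return {"edge_channels": 3}  # invented example parameter

    class SolveBandpass(PipelineTask):
        name = "solve_bandpass"
        def heuristics(self, ctx):
            # A later stage can branch on earlier decisions in the context.
            edge = ctx.decisions["flag_calibrators"]["edge_channels"]
            return {"solint": "inf", "ignore_edge": edge}

    ctx = Context()
    for task in (FlagCalibrators(), SolveBandpass()):
        task.execute(ctx)
    print(ctx.results)
    ```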

  14. Software Patching Lessons Learned - Video Text Version | Energy Systems

    Science.gov Websites

    …eyes of an OEM, several different OEMs for many years. Then also in 2013, we were awarded a cooperative… validating for OEM solutions and those products is very different from validating with end users… when we… started reaching out to vendors, I was a bit taken aback at the discrepancy and how different…

  15. Unlocking Student Data Could Lead to "App Economy" for Colleges: Colleges Are Pressured to Open Up Student Data

    ERIC Educational Resources Information Center

    DeSantis, Nick

    2012-01-01

    College campuses are hothouses of data, including course schedules, degree requirements, and grades. But much of the information remains spread out across software systems or locked on university servers. Start-up companies want to pry the information loose from campus servers in order to offer personalized services that could transform the…

  16. Applied Linguistics Project: Student-Led Computer Assisted Research in High School EAL/EAP

    ERIC Educational Resources Information Center

    Bohát, Róbert; Rödlingová, Beata; Horáková, Nina

    2015-01-01

    The Applied Linguistics Project (ALP) started at the International School of Prague (ISP) in 2013. Every year, Grade 9 English as an Additional Language (EAL) students identify an area of learning in need of improvement and design a research method followed by data collection and analysis using basic computer software tools or online corpora.…

  17. Extension of TVCAI Project to Include Demonstration of Intelligent Videodisc System. Hardware, Software, and Courseware Implementation Component. Final Report.

    ERIC Educational Resources Information Center

    Brandt, Richard C.; Knapp, Barbara H.

    This project, stemming from work started under the National Science Foundation grant "Development of a Television Computer Assisted Instruction (TVCAI) System" SER-7806412, called for the transfer to videodisc of some of the videotape materials developed under the grant. Three efforts were included in the proposal: design and development…

  18. At Home with Stickybear: School Version with Lesson Plans (Ages 1-5). [CD-ROM].

    ERIC Educational Resources Information Center

    Highsmith, Joni Bitman

    Aimed at children ages 1 to 5, this software product is designed to give preschoolers an essential skills head start with sight and sounds that will capture their interest. Different navigation methods in the program are designed for ease of use and also to meet the motor skills of very young children. The accompanying printed teaching guide…

  19. Landscape Builder: software for the creation of initial landscapes for LANDIS from FIA data

    Treesearch

    William Dijak

    2013-01-01

    I developed Landscape Builder to create spatially explicit landscapes as starting conditions for LANDIS Pro 7.0 and LANDIS II landscape forest simulation models from classified satellite imagery and Forest Inventory and Analysis (FIA) data collected over multiple years. LANDIS Pro and LANDIS II models project future landscapes by simulating tree growth, tree species...

  20. Expedited Systems Engineering for Rapid Capability and Urgent Needs

    DTIC Science & Technology

    2012-12-31

    …rapid organizations start to differ from traditional ones, and there is a shift in energy, commitment, and knowledge. These findings are motivated by an analysis of effective… C.7.1 Description: Integration of Modeling and Simulation, Software Design, and…

  1. MSG Instant Messenger: Social Presence and Location for the "'Ad Hoc' Learning Experience"

    ERIC Educational Resources Information Center

    Little, Alex; Denham, Chris; Eisenstadt, Marc

    2008-01-01

    "Elearning2.0" promises to harness the power of three of today's most disruptive technologies: social software, elearning, and Web2.0. Our own work in this disruptive space takes as a starting premise that social networking is critical for learning: finding the right person can be more important than "scouring the web for an answer" particularly…

  2. GNAS Maintenance Control Center (GMCC) Design Qualification Test and Evaluation (DQT&E) Test Procedures

    DTIC Science & Technology

    1992-06-01

    connected to one MPS via BROUTERS and modems). b. MPS - RMMS interface to the GMCC. The IMCS software running on the MPS will be used to monitor the… Start Date: TODAYS DATE REGION: SO Sector: 56K 69. GPWS: Press F10 (PRODUCE REPORT) to generate the report. 70. GPWS: Press F12 (REPORT STATUS) to…

  3. Stickybear's Early Learning Activities: School Version with Lesson Plans (Ages 2-6). [CD-ROM].

    ERIC Educational Resources Information Center

    Highsmith, Joni Bitman

    Aimed at children ages 2 to 6, this software product is designed to give preschoolers an essential skills head start with sight and sounds that will capture their interest. The courseware provides two modes of play, allowing children to learn through prompted direction or through discovery. The program is bilingual, offering skill development in…

  4. The Nearly Forgotten Malay Folklore: Shall We Start with the Software?

    ERIC Educational Resources Information Center

    Abd Rahim, Normaliza

    2014-01-01

    The study focuses on the nearly forgotten Malay folklore in Malaysia. The objectives of the study were to identify and discuss the types of Malay folklore among primary school learners. The samples of the study were 100 male and female students at schools in Selangor. The samples were picked at random from several schools and they were given…

  5. Elements of strategic capability for software outsourcing enterprises based on the resource

    NASA Astrophysics Data System (ADS)

    Shi, Wengeng

    2011-10-01

    Software outsourcing enterprises are an emerging kind of high-tech enterprise, and the speed and scale of their rise have been remarkable. Beyond the preferential policies China grants to software outsourcing, these businesses possess their own capability to upgrade, a characteristic that software companies in general have not had. Viewed from the resource-based theory of the firm, we analyze the rare, valuable, and hard-to-imitate capabilities and resources that software outsourcing companies hold, and we attempt to give an initial framework for theoretical analysis on this basis.

  6. 31 CFR 538.533 - Exportation of certain services and software incident to Internet-based communications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... software incident to Internet-based communications. 538.533 Section 538.533 Money and Finance: Treasury....533 Exportation of certain services and software incident to Internet-based communications. (a) To the....S. persons, wherever located, to persons in Sudan of software necessary to enable the services...

  7. 31 CFR 560.540 - Exportation of certain services and software incident to Internet-based communications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... software incident to Internet-based communications. 560.540 Section 560.540 Money and Finance: Treasury... Licensing Policy § 560.540 Exportation of certain services and software incident to Internet-based... United States or by U.S. persons, wherever located, to persons in Iran of software necessary to enable...

  8. 31 CFR 538.533 - Exportation of certain services and software incident to Internet-based communications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... software incident to Internet-based communications. 538.533 Section 538.533 Money and Finance: Treasury....533 Exportation of certain services and software incident to Internet-based communications. (a) To the....S. persons, wherever located, to persons in Sudan of software necessary to enable the services...

  9. 31 CFR 560.540 - Exportation of certain services and software incident to Internet-based communications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... software incident to Internet-based communications. 560.540 Section 560.540 Money and Finance: Treasury....540 Exportation of certain services and software incident to Internet-based communications. (a) To the....S. persons, wherever located, to persons in Iran of software necessary to enable the services...

  10. 31 CFR 538.533 - Exportation of certain services and software incident to Internet-based communications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... software incident to Internet-based communications. 538.533 Section 538.533 Money and Finance: Treasury....533 Exportation of certain services and software incident to Internet-based communications. (a) To the....S. persons, wherever located, to persons in Sudan of software necessary to enable the services...

  11. 31 CFR 560.540 - Exportation of certain services and software incident to Internet-based communications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... software incident to Internet-based communications. 560.540 Section 560.540 Money and Finance: Treasury... Licensing Policy § 560.540 Exportation of certain services and software incident to Internet-based... United States or by U.S. persons, wherever located, to persons in Iran of software necessary to enable...

  12. 31 CFR 560.540 - Exportation of certain services and software incident to Internet-based communications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... software incident to Internet-based communications. 560.540 Section 560.540 Money and Finance: Treasury....540 Exportation of certain services and software incident to Internet-based communications. (a) To the....S. persons, wherever located, to persons in Iran of software necessary to enable the services...

  13. 31 CFR 538.533 - Exportation of certain services and software incident to Internet-based communications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... software incident to Internet-based communications. 538.533 Section 538.533 Money and Finance: Treasury....533 Exportation of certain services and software incident to Internet-based communications. (a) To the....S. persons, wherever located, to persons in Sudan of software necessary to enable the services...

  14. Evaluation of the Diagnostic Accuracy of CareStart G6PD Deficiency Rapid Diagnostic Test (RDT) in a Malaria Endemic Area in Ghana, Africa

    PubMed Central

    Adu-Gyasi, Dennis; Asante, Kwaku Poku; Newton, Sam; Dosoo, David; Amoako, Sabastina; Adjei, George; Amoako, Nicholas; Ankrah, Love; Tchum, Samuel Kofi; Mahama, Emmanuel; Agyemang, Veronica; Kayan, Kingsley; Owusu-Agyei, Seth

    2015-01-01

    Background Glucose-6-phosphate dehydrogenase (G6PD) deficiency is the most widespread enzyme defect and can result in red cell breakdown under oxidative stress when exposed to certain medicines, including antimalarials. We evaluated the diagnostic accuracy of the CareStart G6PD deficiency Rapid Diagnostic Test (RDT) as a point-of-care tool for screening G6PD deficiency. Methods A cross-sectional study was conducted among 206 randomly selected and consented participants from a group with known G6PD deficiency status between February 2013 and June 2013. A maximum of 1.6 ml of capillary blood was used per participant for G6PD deficiency screening with the CareStart G6PD RDT, with the Trinity qualitative and Trinity quantitative methods as the "gold standard". Samples were also screened for the presence of malaria parasites. Data entry and analysis were done using Microsoft Access 2010 and Stata software version 12. The Kintampo Health Research Centre Institutional Ethics Committee granted ethical approval. Results The sensitivity (SE) and specificity (SP) of the CareStart G6PD deficiency RDT were 100% and 72.1%, respectively, compared to the Trinity quantitative method, and 98.9% and 96.2% compared to the Trinity qualitative method. Malaria infection status had no significant (P=0.199) effect on the performance of the G6PD RDT test kit compared to the "gold standard". Conclusions The outcome of this study suggests that the diagnostic performance of the CareStart G6PD deficiency RDT kit was high and is acceptable for determining G6PD deficiency status in a high malaria endemic area in Ghana. The RDT kit presents an attractive tool for point-of-care G6PD deficiency rapid testing in areas with high temperatures and limited expertise. The CareStart G6PD deficiency RDT kit could be used to screen malaria patients before administration of fixed-dose primaquine with artemisinin-based combination therapy. PMID:25885097

  15. Neutron Source Facility Training Simulator Based on EPICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Young Soo; Wei, Thomas Y.; Vilim, Richard B.

    A plant operator training simulator has been developed for training plant operators as well as for design verification of the plant control system (PCS) and plant protection system (PPS) for the Kharkov Institute of Physics and Technology Neutron Source Facility. The simulator provides the operator interface for the whole plant, including the sub-critical assembly coolant loop, target coolant loop, secondary coolant loop, and other facility systems. The operator interface is implemented based on the Experimental Physics and Industrial Control System (EPICS), a comprehensive software development platform for distributed control systems. Since its development at Argonne National Laboratory, EPICS has been widely adopted in the experimental physics community, e.g. for control of accelerator facilities. This work is the first implementation for a nuclear facility. The main parts of the operator interface are the plant control panel and plant protection panel. The development involved implementation of the process variable database, sequence logic, and graphical user interface (GUI) for the PCS and PPS utilizing EPICS and related software tools, e.g. the sequencer for sequence logic and Control System Studio (CSS-BOY) for the graphical user interface. For functional verification of the PCS and PPS, a plant model is interfaced, which is a physics-based model of the facility coolant loops implemented as a numerical computer code. The training simulator was tested and demonstrated its effectiveness in various plant operation sequences, e.g. start-up, shut-down, maintenance, and refueling. It was also tested for verification of the plant protection system under various trip conditions.
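
    As a hedged illustration of how client software talks to such an EPICS-based system, the sketch below uses the pyepics client library to read, write, and monitor process variables; the PV names are hypothetical, and the facility's panels are actually built with CSS-BOY rather than Python.

    ```python
    # Hedged sketch of EPICS channel access from Python using pyepics.
    # PV names below are hypothetical placeholders, not the facility's names.
    from epics import PV, caget, caput

    # Read a process variable (e.g. a loop temperature) once.
    temp = caget("NSF:TargetLoop:Temp")          # hypothetical PV name
    print("target loop temperature:", temp)

    # Write a setpoint, waiting for the put to complete.
    caput("NSF:TargetLoop:FlowSetpoint", 12.5, wait=True)

    # Subscribe to a protection-system status PV and react to changes,
    # roughly what a panel widget does behind the scenes.
    def on_trip(pvname=None, value=None, **kw):
        if value:  # nonzero -> trip condition asserted
            print(f"{pvname}: PPS trip asserted!")

    trip_pv = PV("NSF:PPS:TripStatus")           # hypothetical PV name
    trip_pv.add_callback(on_trip)
    ```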

  16. Flexible and Low-Cost Measurements for Space Software Development- The Measurements Exploration Framework

    NASA Astrophysics Data System (ADS)

    Marculescu, Bogdan; Feldt, Robert; Torkar, Richard; Green, Lars-Goran; Liljegren, Thomas; Hult, Erika

    2011-08-01

    Verification and validation is an important part of software development and accounts for a significant portion of the costs associated with such a project. For developers of life- or mission-critical systems, such as software being developed for space applications, a balance must be reached between ensuring the quality of the system through extensive and rigorous testing and reducing costs so that the company can compete. Ensuring the quality of any system starts with a quality development process. To evaluate both the software development process and the product itself, measurements are needed. A balance must then be struck between ensuring the best possible quality of both process and product on the one hand, and reducing the cost of performing the measurements on the other. A number of measurements have already been defined and are being used. For some of these, data collection can be automated as well, further lowering the costs associated with implementing them. In practice, however, there may be situations where existing measurements are unsuitable for a variety of reasons. This paper describes a framework for creating low-cost, flexible measurements in areas where initial information is scarce. The framework, called the Measurements Exploration Framework, is aimed in particular at the space software development industry and was developed in such an environment.

  17. A Novel Rules Based Approach for Estimating Software Birthmark

    PubMed Central

    Binti Alias, Norma; Anwar, Sajid

    2015-01-01

    A software birthmark is a unique quality of software that can be used to detect software theft. Comparing the birthmarks of software can tell us whether one program or software product is a copy of another. Software theft and piracy are rapidly increasing problems of copying, stealing, and misusing software without the proper permission specified in the license agreement. The estimation of a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate software birthmarks based on the two most sought-after properties of birthmarks, that is, credibility and resilience. For this purpose, concepts from soft computing such as probabilistic and fuzzy computing have been taken into account, and fuzzy logic is used to estimate the properties of a birthmark. The proposed fuzzy rule based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363
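
    To illustrate the kind of fuzzy rule based estimation described, the sketch below scores a birthmark from credibility and resilience values in [0, 1] using triangular membership functions and a small Sugeno-style rule base; the membership breakpoints and rules are invented for illustration, not the paper's calibrated ones.

    ```python
    # Illustrative fuzzy estimation of birthmark quality from credibility and
    # resilience in [0, 1]. Membership functions and rules are invented.
    def tri(x: float, a: float, b: float, c: float) -> float:
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def memberships(x: float) -> dict:
        return {"low": tri(x, -0.5, 0.0, 0.5),
                "medium": tri(x, 0.0, 0.5, 1.0),
                "high": tri(x, 0.5, 1.0, 1.5)}

    # Rule base: (credibility level, resilience level) -> output score,
    # defuzzified as a weighted average (a zero-order Sugeno model).
    RULES = {("high", "high"): 1.0, ("high", "medium"): 0.7, ("medium", "high"): 0.7,
             ("medium", "medium"): 0.5, ("low", "high"): 0.3, ("high", "low"): 0.3,
             ("medium", "low"): 0.2, ("low", "medium"): 0.2, ("low", "low"): 0.0}

    def birthmark_score(credibility: float, resilience: float) -> float:
        mc, mr = memberships(credibility), memberships(resilience)
        num = den = 0.0
        for (lc, lr), out in RULES.items():
            w = min(mc[lc], mr[lr])  # rule firing strength (AND = min)
            num += w * out
            den += w
        return num / den if den else 0.0

    print(birthmark_score(0.9, 0.6))  # e.g. credible, moderately resilient
    ```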

  18. Molecular mechanisms of protein aggregation from global fitting of kinetic models.

    PubMed

    Meisl, Georg; Kirkegaard, Julius B; Arosio, Paolo; Michaels, Thomas C T; Vendruscolo, Michele; Dobson, Christopher M; Linse, Sara; Knowles, Tuomas P J

    2016-02-01

    The elucidation of the molecular mechanisms by which soluble proteins convert into their amyloid forms is a fundamental prerequisite for understanding and controlling disorders that are linked to protein aggregation, such as Alzheimer's and Parkinson's diseases. However, because of the complexity associated with aggregation reaction networks, the analysis of kinetic data of protein aggregation to obtain the underlying mechanisms represents a complex task. Here we describe a framework, using quantitative kinetic assays and global fitting, to determine and to verify a molecular mechanism for aggregation reactions that is compatible with experimental kinetic data. We implement this approach in a web-based software, AmyloFit. Our procedure starts from the results of kinetic experiments that measure the concentration of aggregate mass as a function of time. We illustrate the approach with results from the aggregation of the β-amyloid (Aβ) peptides measured using thioflavin T, but the method is suitable for data from any similar kinetic experiment measuring the accumulation of aggregate mass as a function of time; the input data are in the form of a tab-separated text file. We also outline general experimental strategies and practical considerations for obtaining kinetic data of sufficient quality to draw detailed mechanistic conclusions, and the procedure starts with instructions for extensive data quality control. For the core part of the analysis, we provide an online platform (http://www.amylofit.ch.cam.ac.uk) that enables robust global analysis of kinetic data without the need for extensive programming or detailed mathematical knowledge. The software automates repetitive tasks and guides users through the key steps of kinetic analysis: determination of constraints to be placed on the aggregation mechanism based on the concentration dependence of the aggregation reaction, choosing from several fundamental models describing assembly into linear aggregates and fitting the chosen models using an advanced minimization algorithm to yield the reaction orders and rate constants. Finally, we outline how to use this approach to investigate which targets potential inhibitors of amyloid formation bind to and where in the reaction mechanism they act. The protocol, from processing data to determining mechanisms, can be completed in <1 d.
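
    The essence of the global-fitting step can be sketched briefly: every dataset, taken at a different monomer concentration, is fitted with one shared set of rate parameters. The toy saturation model and parameter names below are assumptions for illustration; AmyloFit's actual nucleation models are more elaborate.

    ```python
    # Minimal sketch of *global* fitting: all datasets (different monomer
    # concentrations) share the same rate parameters. The toy model
    # M(t) = 1 - exp(-k * m0**n * t) stands in for AmyloFit's nucleation
    # models; it is an illustrative assumption, not AmyloFit's actual model.
    import numpy as np
    from scipy.optimize import least_squares

    def model(t, m0, k, n):
        return 1.0 - np.exp(-k * m0**n * t)

    def residuals(params, datasets):
        k, n = params
        res = []
        for m0, t, y in datasets:  # concentration, times, normalized mass
            res.append(model(t, m0, k, n) - y)
        return np.concatenate(res)

    # Synthetic example data at three monomer concentrations (arbitrary units).
    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 50)
    datasets = [(m0, t, model(t, m0, 0.02, 2.0) + 0.01 * rng.standard_normal(t.size))
                for m0 in (1.0, 2.0, 3.0)]

    fit = least_squares(residuals, x0=[0.1, 1.0], args=(datasets,))
    k_fit, n_fit = fit.x
    print(f"global fit: k={k_fit:.3g}, n={n_fit:.3g}")  # shared across datasets
    ```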

  19. Removing a barrier to computer-based outbreak and disease surveillance--the RODS Open Source Project.

    PubMed

    Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J

    2004-09-24

    Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.

  20. Software Engineering Laboratory (SEL) data base reporting software user's guide and system description. Volume 2: Program descriptions

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The structure and functions of each reporting software program for the Software Engineering Laboratory data base are described. Baseline diagrams, module descriptions, and listings of program generation files are included.

  1. Knowledge Based Software Assistant Conference Proceedings (4th) Held in Syracuse, New York on 12-14 September 1989

    DTIC Science & Technology

    1990-05-01

    Sanders Associates, Inc. A demonstration of knowledge-based support for the evolutionary development of software system requirements using… text… Conference Committee… Spin-Off Technologies… AN OVERVIEW OF RADC'S KNOWLEDGE BASED SOFTWARE ASSISTANT PROGRAM, Donald M. Elefante, Rome Air… The Knowledge-Based Software Assistant is a formally based, computer-mediated paradigm for the specification, development, evolution, and long-term…

  2. Research on software behavior trust based on hierarchy evaluation

    NASA Astrophysics Data System (ADS)

    Long, Ke; Xu, Haishui

    2017-08-01

    In view of the correlation of software behaviors, we evaluate software behavior credibility at two levels: control flow and data flow. At the control flow level, a method for tracing software behavior based on support vector machines (SVM) is proposed. At the data flow level, behavioral evidence evaluation based on a fuzzy decision analysis method is put forward.
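
    As a hedged sketch of the control flow level, the code below trains an SVM on bag-of-2-gram features of synthetic system-call traces to separate trusted from untrusted behavior; the feature choice and the toy traces are illustrative assumptions, not the paper's exact setup.

    ```python
    # Illustrative SVM classification of software behavior traces (control-flow
    # level). Traces and features are synthetic; the paper's setup may differ.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.svm import SVC

    # Toy system-call traces: trusted runs vs. one tampered run.
    traces = [
        "open read read write close",
        "open read write close",
        "open read read read write close",
        "open write write exec socket send",   # anomalous pattern
    ]
    labels = [1, 1, 1, 0]  # 1 = trusted behavior, 0 = untrusted

    # Bag of call 2-grams as a simple trace representation.
    vec = CountVectorizer(ngram_range=(2, 2), token_pattern=r"\w+")
    X = vec.fit_transform(traces)

    clf = SVC(kernel="linear").fit(X, labels)
    test = vec.transform(["open read write exec socket send"])
    print("trusted" if clf.predict(test)[0] == 1 else "untrusted")
    ```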

  3. Improving Software Sustainability: Lessons Learned from Profiles in Science.

    PubMed

    Gallagher, Marie E

    2013-01-01

    The Profiles in Science® digital library features digitized surrogates of historical items selected from the archival collections of the U.S. National Library of Medicine as well as collaborating institutions. In addition, it contains a database of descriptive, technical and administrative metadata. It also contains various software components that allow creation of the metadata, management of the digital items, and access to the items and metadata through the Profiles in Science Web site [1]. The choices made building the digital library were designed to maximize the sustainability and long-term survival of all of the components of the digital library [2]. For example, selecting standard and open digital file formats rather than proprietary formats increases the sustainability of the digital files [3]. Correspondingly, using non-proprietary software may improve the sustainability of the software--either through in-house expertise or through the open source community. Limiting our digital library software exclusively to open source software or to software developed in-house has not been feasible. For example, we have used proprietary operating systems, scanning software, a search engine, and office productivity software. We did this when either lack of essential capabilities or the cost-benefit trade-off favored using proprietary software. We also did so knowing that in the future we would need to replace or upgrade some of our proprietary software, analogous to migrating from an obsolete digital file format to a new format as the technological landscape changes. Since our digital library's start in 1998, all of its software has been upgraded or replaced, but the digitized items have not yet required migration to other formats. Technological changes that compelled us to replace proprietary software included the cost of product licensing, product support, incompatibility with other software, prohibited use due to evolving security policies, and product abandonment. Sometimes these changes happen on short notice, so we continually monitor our library's software for signs of endangerment. We have attempted to replace proprietary software with suitable in-house or open source software. When the replacement involves a standalone piece of software with a nearly equivalent version, such as replacing a commercial HTTP server with an open source HTTP server, the replacement is straightforward. Recently we replaced software that functioned not only as our search engine but also as the backbone of the architecture of our Web site. In this paper, we describe the lessons learned and the pros and cons of replacing this software with open source software.

  4. TH-CD-202-12: Online Inter-Beam Replanning Based On Real-Time Dose Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamerling, CP; Fast, MF; Ziegenhein, P

    Purpose: This work provides a proof-of-concept study for online replanning during treatment delivery for step-and-shoot prostate SBRT, based on real-time dose reconstruction. Online replanning is expected to improve the trade-off between target coverage and organ-at-risk dose in the presence of intra-fractional motion. Methods: We have implemented an online replanning workflow on top of our previously reported real-time dose reconstruction software which connects to an Elekta research linac. The treatment planning system DynaPlan was extended to (1) re-optimize and sequence treatment plans (in clockwise beam order) before each beam, based on actual delivered dose, in a timeframe limited by the gantry rotation between subsequent beams, and (2) send the respective segments to the delivery control software DynaTrack which starts/continues treatment immediately. To investigate the impact of a reduced safety margin, we have created and delivered (on a linac emulator) a conventional CTV+5/3mm (I) and a reduced CTV+1mm margin (II) treatment plan for a prostate patient. We have assessed CTV coverage with and without inter-beam replanning, all exposed to a gradual target shift of 0-5 mm in the posterior and inferior directions from the start until the end of delivery. Results: For the reconstructed conventional plan (I), D98 for the CTV was 100% of the D98 of the planned dose. For the reconstructed margin-reduced plan (II), D98 for the CTV was 95% of the planned D98 without replanning, but could be recovered to 99% by replanning for each beam. Plan (II) with replanning resulted in a decrease of bladder V90% by 88% and an increase of rectum V90% by 9% compared to the conventional plan (I). Dose calculation/accumulation was performed in <15 ms per MLC aperture, replanning in <15 s per beam. Conclusion: We have shown that online inter-beam replanning is technically feasible and potentially allows for a margin reduction. Future investigation considering motion-robust replanning optimization parameters is in progress. We acknowledge support of the MLC research from Elekta AB. This work is supported by Cancer Research UK under Programme C33589/A19908. Research at ICR is also supported by Cancer Research UK under Programme C33589/A19727 and NHS funding to the NIHR Biomedical Research Centre at RMH and ICR.

  5. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  6. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    PubMed

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based, since it uses publications on the specific problem as a surrogate for stakeholder interests, to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries.

  7. Launching large computing applications on a disk-less cluster

    NASA Astrophysics Data System (ADS)

    Schwemmer, Rainer; Caicedo Carvajal, Juan Manuel; Neufeld, Niko

    2011-12-01

    The LHCb Event Filter Farm system is based on a cluster of on the order of 1,500 disk-less Linux nodes. Each node runs one instance of the filtering application per core. The number of cores in our current production environment is 8 per machine on the old cluster and 12 per machine on the extension of the cluster. Each instance has to load about 1,000 shared libraries, weighing 200 MB, from several directory locations in a central repository. The repository is currently hosted on a SAN and exported via NFS. The libraries are all available in the local file system cache on every node. Loading a library still causes a huge number of requests to the server, though, because the loader will try to probe every available path. Measurements show there are between 100,000 and 200,000 calls per application-instance start-up. Multiplied by the number of cores in the farm, this translates into a veritable DDoS attack on the servers, which lasts several minutes. Since the application is restarted frequently, a better solution had to be found. Rolling out the software to the nodes is out of the question, because they have no disks and the software in its entirety is too large to put into a RAM disk. To solve this problem we developed a FUSE-based file system which acts as a permanent, controllable cache keeping the essential files in stock.
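
    A heavily simplified sketch of the approach, assuming the fusepy bindings: a read-only FUSE layer that answers loader probes locally and serves library contents from an in-memory, read-through cache. Paths and policy are illustrative; this is not the actual LHCb implementation.

    ```python
    # Simplified sketch (assuming the fusepy bindings) of a read-only FUSE
    # layer that caches shared libraries in memory, so repeated loader probes
    # never reach the NFS-backed repository. Illustrative, not LHCb's code.
    import errno, os, sys
    from fuse import FUSE, FuseOSError, Operations  # pip install fusepy

    class LibraryCache(Operations):
        def __init__(self, repo_root):
            self.repo = repo_root
            self.cache = {}  # path -> file contents, filled on first access

        def _real(self, path):
            return os.path.join(self.repo, path.lstrip("/"))

        def getattr(self, path, fh=None):
            try:
                st = os.lstat(self._real(path))
            except OSError:
                # Answering loader probes for nonexistent paths locally is
                # the whole point: no round trip to the NFS server.
                raise FuseOSError(errno.ENOENT)
            return {k: getattr(st, k) for k in
                    ("st_mode", "st_size", "st_uid", "st_gid",
                     "st_atime", "st_mtime", "st_ctime", "st_nlink")}

        def read(self, path, size, offset, fh):
            if path not in self.cache:           # read-through caching
                with open(self._real(path), "rb") as f:
                    self.cache[path] = f.read()
            return self.cache[path][offset:offset + size]

    if __name__ == "__main__":
        repo, mountpoint = sys.argv[1], sys.argv[2]
        FUSE(LibraryCache(repo), mountpoint, foreground=True, ro=True)
    ```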

  8. Evidence for soft bounds in Ubuntu package sizes and mammalian body masses.

    PubMed

    Gherardi, Marco; Mandrà, Salvatore; Bassetti, Bruno; Cosentino Lagomarsino, Marco

    2013-12-24

    The development of a complex system depends on the self-coordinated action of a large number of agents, often determining unexpected global behavior. The case of software evolution has great practical importance: knowledge of what is to be considered atypical can guide developers in recognizing and reacting to abnormal behavior. Although the initial framework of a theory of software exists, the current theoretical achievements do not fully capture existing quantitative data or predict future trends. Here we show that two elementary laws describe the evolution of package sizes in a Linux-based operating system: first, relative changes in size follow a random walk with non-Gaussian jumps; second, each size change is bounded by a limit that is dependent on the starting size, an intriguing behavior that we call "soft bound." Our approach is based on data analysis and on a simple theoretical model, which is able to reproduce empirical details without relying on any adjustable parameter and generates definite predictions. The same analysis allows us to formulate and support the hypothesis that a similar mechanism is shaping the distribution of mammalian body sizes, via size-dependent constraints during cladogenesis. Whereas generally accepted approaches struggle to reproduce the large-mass shoulder displayed by the distribution of extant mammalian species, this is a natural consequence of the softly bounded nature of the process. Additionally, the hypothesis that this model is valid has the relevant implication that, contrary to a common assumption, mammalian masses are still evolving, albeit very slowly.
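
    The two laws can be condensed into a toy simulation: heavy-tailed multiplicative jumps truncated by a starting-size-dependent ("soft") bound. The jump distribution and bound shape below are assumptions for illustration, not the paper's fitted forms.

    ```python
    # Toy simulation of the two empirical laws: non-Gaussian multiplicative
    # jumps plus a starting-size-dependent ("soft") bound on each change.
    # Jump law and bound shape are illustrative assumptions, not fitted forms.
    import numpy as np

    rng = np.random.default_rng(42)

    def soft_bound(sizes):
        # Assumed bound: smaller packages may change relatively more than
        # large ones, so the allowed log-change shrinks slowly with size.
        return 2.0 / (1.0 + np.maximum(np.log10(sizes), 0.0))

    def evolve(sizes, steps=100):
        sizes = np.asarray(sizes, dtype=float)
        for _ in range(steps):
            jumps = rng.laplace(0.0, 0.3, size=sizes.shape)   # heavy tails
            bound = soft_bound(sizes)
            jumps = np.clip(jumps, -bound, bound)             # soft bound
            sizes = sizes * np.exp(jumps)
        return sizes

    final = evolve(np.full(10_000, 100.0))  # 10k "packages" of initial size 100
    print(f"median {np.median(final):.0f}, 99th pct {np.percentile(final, 99):.0f}")
    ```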

  9. Developing a treatment planning process and software for improved translation of photodynamic therapy

    NASA Astrophysics Data System (ADS)

    Cassidy, J.; Zheng, Z.; Xu, Y.; Betz, V.; Lilge, L.

    2017-04-01

    Background: The majority of de novo cancers are diagnosed in low- and middle-income countries, which often lack the resources to provide adequate therapeutic options. Non- or minimally invasive therapies such as photodynamic therapy (PDT) or photothermal therapies could become part of the overall treatment options in these countries. However, widespread acceptance is hindered by the current empirical training of surgeons in these optical techniques and a lack of easily usable treatment-optimizing tools. Methods: Based on image processing programs such as ITK-SNAP and the publicly available FullMonte light propagation software, a workflow is proposed that allows for personalized PDT treatment planning: starting from contoured clinical CT or MRI images, through the generation of 3D tetrahedral models in silico and execution of the Monte Carlo simulation, to presentation of the 3D fluence rate (Φ [mW cm^-2]) distribution, a treatment plan optimizing photon source placement is developed. Results: Allowing 1-2 days for the installation of the required programs, novices can generate their first fluence (H [J cm^-2]) or Φ distribution in a matter of hours. This is reduced to tens of minutes with some training. Executing the photon simulation calculations is rapid and not the performance-limiting process. The largest sources of error are uncertainties in the contouring and unknown tissue optical properties. Conclusions: The presented FullMonte simulation is the fastest tetrahedral-based photon propagation program and provides the basis for PDT treatment planning processes, enabling a faster proliferation of low-cost, minimally invasive, personalized cancer therapies.
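
    For orientation, the core loop of a Monte Carlo light propagation code follows the standard absorption-weighted random walk sketched below; the optical coefficients are invented, the medium is infinite and homogeneous, and FullMonte's tetrahedral-mesh engine is far more elaborate.

    ```python
    # Standard Monte Carlo photon transport in an infinite homogeneous medium:
    # sample step lengths from Beer-Lambert, deposit absorbed weight, scatter
    # isotropically. Coefficients are invented; FullMonte's tetrahedral-mesh
    # engine is far more elaborate than this sketch.
    import numpy as np

    rng = np.random.default_rng(1)
    mu_a, mu_s = 0.1, 10.0          # absorption/scattering coefficients [1/mm]
    mu_t = mu_a + mu_s

    def run_photon(max_steps=1000, w_min=1e-4):
        pos = np.zeros(3)
        direction = np.array([0.0, 0.0, 1.0])
        w = 1.0                      # photon packet weight
        absorbed = []                # (position, deposited weight) records
        for _ in range(max_steps):
            s = -np.log(rng.random()) / mu_t      # free path length
            pos = pos + s * direction
            dep = w * mu_a / mu_t                 # absorbed fraction
            absorbed.append((pos.copy(), dep))
            w -= dep
            if w < w_min:                         # packet exhausted
                break
            # Isotropic scattering: new direction uniform on the sphere.
            cos_t = 2.0 * rng.random() - 1.0
            phi = 2.0 * np.pi * rng.random()
            sin_t = np.sqrt(1.0 - cos_t**2)
            direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
        return absorbed

    # Binning the absorbed weights into voxels would yield the fluence map.
    total = sum(dep for _, dep in run_photon())
    print(f"weight deposited by one packet: {total:.3f}")
    ```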

  10. Ub-ISAP: a streamlined UNIX pipeline for mining unique viral vector integration sites from next generation sequencing data.

    PubMed

    Kamboj, Atul; Hallwirth, Claus V; Alexander, Ian E; McCowage, Geoffrey B; Kramer, Belinda

    2017-06-17

    The analysis of viral vector genomic integration sites is an important component in assessing the safety and efficiency of patient treatment using gene therapy. Alongside this clinical application, integration site identification is a key step in the genetic mapping of viral elements in mutagenesis screens that aim to elucidate gene function. We have developed a UNIX-based vector integration site analysis pipeline (Ub-ISAP) that provides an automated workflow for integration site identification and annotation of both single and paired-end sequencing reads. Reads that contain viral sequences of interest are selected and aligned to the host genome, and unique integration sites are then classified as transcription start site-proximal, intragenic or intergenic. Ub-ISAP provides a reliable and efficient pipeline to generate large datasets for assessing the safety and efficiency of integrating vectors in clinical settings, with broader applications in cancer research. Ub-ISAP is available as an open source software package at https://sourceforge.net/projects/ub-isap/.
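
    The classification step can be pictured with a small sketch: given unique integration coordinates and a gene annotation, label each site as TSS-proximal, intragenic, or intergenic. The 5 kb window and the toy intervals are assumptions for illustration, not Ub-ISAP's actual thresholds.

    ```python
    # Illustrative classification of unique integration sites against a gene
    # annotation. The 5 kb TSS window and toy intervals are assumptions; the
    # actual Ub-ISAP thresholds and annotation handling may differ.
    TSS_WINDOW = 5_000  # bp around a transcription start site

    # Toy annotation: (chrom, gene_start, gene_end, strand)
    genes = [("chr1", 10_000, 30_000, "+"),
             ("chr1", 50_000, 52_000, "-")]

    def classify(chrom: str, pos: int) -> str:
        for g_chrom, start, end, strand in genes:
            if g_chrom != chrom:
                continue
            tss = start if strand == "+" else end
            if abs(pos - tss) <= TSS_WINDOW:
                return "TSS-proximal"
            if start <= pos <= end:
                return "intragenic"
        return "intergenic"

    for site in [("chr1", 12_500), ("chr1", 25_000), ("chr1", 80_000)]:
        print(site, "->", classify(*site))
    ```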

  11. Near field communication (NFC) model for arduino uno based security systems office system

    NASA Astrophysics Data System (ADS)

    Chairunnas, A.; Abdurrasyid, I.

    2018-03-01

    Many offices and companies that are starting to grow rapidly have rooms that only authorized people may enter and whose facilities only they may use; for example, rooms holding many very important files and documents, where access must be restricted to reduce the misuse of files by irresponsible persons. For this reason, a room-door security system was built using Near Field Communication (NFC) on an Android smartphone. The software used is the Arduino IDE. The hardware used in this system comprises an Arduino Uno R3, an NFC shield, a PIR sensor, a bell, an LED, a servo, a 16×2 LCD, and NFC on an Android smartphone. The system runs on two inputs based on a new technology, Near Field Communication (NFC) on an Android smartphone, and also uses the PIR sensor to detect unauthorized persons entering the room. If the correct password is entered, the door opens and the PIR sensor is switched off; if it is wrong, the bell rings.

  12. A new Scheme for ATLAS Trigger Simulation using Legacy Code

    NASA Astrophysics Data System (ADS)

    Galster, Gorm; Stelzer, Joerg; Wiedenmann, Werner

    2014-06-01

    Analyses at the LHC which search for rare physics processes or determine with high precision Standard Model parameters require accurate simulations of the detector response and the event selection processes. The accurate determination of the trigger response is crucial for the determination of overall selection efficiencies and signal sensitivities. For the generation and the reconstruction of simulated event data, the most recent software releases are usually used to ensure the best agreement between simulated data and real data. For the simulation of the trigger selection process, however, ideally the same software release that was deployed when the real data were taken should be used. This potentially requires running software dating many years back. Having a strategy for running old software in a modern environment thus becomes essential when data simulated for past years start to present a sizable fraction of the total. We examined the requirements and possibilities for such a simulation scheme within the ATLAS software framework and successfully implemented a proof-of-concept simulation chain. One of the greatest challenges was the choice of a data format which promises long term compatibility with old and new software releases. Over the time periods envisaged, data format incompatibilities are also likely to emerge in databases and other external support services. Software availability may become an issue, when e.g. the support for the underlying operating system might stop. In this paper we present the encountered problems and developed solutions, and discuss proposals for future development. Some ideas reach beyond the retrospective trigger simulation scheme in ATLAS as they also touch more generally aspects of data preservation.

  13. Implementation of the AES as a Hash Function for Confirming the Identity of Software on a Computer System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Randy R.; Bass, Robert B.; Kouzes, Richard T.

    2003-01-20

    This paper provides a brief overview of the implementation of the Advanced Encryption Standard (AES) as a hash function for confirming the identity of software resident on a computer system. The PNNL Software Authentication team chose to use a hash function to confirm software identity on a system for situations where: (1) there is limited time to perform the confirmation and (2) access to the system is restricted to keyboard or thumbwheel input and output can only be displayed on a monitor. PNNL reviewed three popular algorithms: the Secure Hash Algorithm - 1 (SHA-1), the Message Digest - 5 (MD-5), and the Advanced Encryption Standard (AES), and selected the AES to incorporate in the software confirmation tool we developed. This paper gives a brief overview of the SHA-1, MD-5, and the AES and cites references for further detail. It then explains the overall processing steps of the AES to reduce a large amount of generic data (the plain text, such as is present in memory and other data storage media in a computer system) to a small amount of data (the hash digest), which is a mathematically unique representation or signature of the former that can be displayed on a computer's monitor. This paper starts with a simple definition and example to illustrate the use of a hash function. It concludes with a description of how the software confirmation tool uses the hash function to confirm the identity of software on a computer system.
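
    The digest idea described above can be illustrated with any cryptographic hash. The sketch below uses SHA-256 from Python's hashlib purely for brevity; the PNNL tool builds its hash function from the AES, so this shows the concept, not their implementation.

      import hashlib

      # Illustrative only: reduce a large byte stream (e.g., a software image
      # read from storage) to a short digest that can be shown on a monitor.

      def software_fingerprint(path: str) -> str:
          h = hashlib.sha256()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(65536), b""):  # stream the file
                  h.update(chunk)
          return h.hexdigest()  # short signature an inspector can compare

      # Any bit flip in the input changes the digest completely:
      print(hashlib.sha256(b"firmware v1.0").hexdigest())
      print(hashlib.sha256(b"firmware v1.1").hexdigest())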

  14. AnClim and ProClimDB software for data quality control and homogenization of time series

    NASA Astrophysics Data System (ADS)

    Stepanek, Petr

    2015-04-01

    During the last decade, a software package consisting of AnClim, ProClimDB and LoadData for processing (mainly climatological) data has been created. This software offers a comprehensive solution for processing climatological time series, starting from loading the data from a central database (e.g. Oracle; the LoadData software), through data quality control and homogenization, to time series analysis, extreme value evaluation and RCM output verification and correction (the ProClimDB and AnClim software). The detection of inhomogeneities is carried out on a monthly scale through the application of AnClim, or newly by R functions called from ProClimDB, while quality control, the preparation of reference series and the correction of detected breaks are carried out by the ProClimDB software. The software combines many statistical tests, types of reference series and time scales (monthly, seasonal and annual, daily and sub-daily). These can be used to create an "ensemble" of solutions, which may be more reliable than any single method. The AnClim software is suitable for educational purposes, e.g. for students getting acquainted with methods used in climatology; built-in graphical tools and comparisons of various statistical tests help in better understanding a given method. ProClimDB, by contrast, is a tool aimed at processing large climatological datasets. Recently, R functions can be called from within the software, making it more efficient in data processing and capable of easily including new methods (when available in R). One example of its use is the easy comparison of methods for the correction of inhomogeneities in daily data (HOM of Paul Della-Marta, the SPLIDHOM method of Olivier Mestre, DAP - our own method, QM of Xiaolan Wang and others). The software is available, together with further information, at www.climahom.eu. Acknowledgement: this work was partially funded by the project "Building up a multidisciplinary scientific team focused on drought" No. CZ.1.07/2.3.00/20.0248.
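
    One of the classical break-detection tests combined in such packages is the standard normal homogeneity test (SNHT). Below is a minimal numpy sketch of its statistic, simplified in that it uses no reference series and omits the critical-value lookup.

      import numpy as np

      # Minimal SNHT sketch: standardize the series, then find the split point
      # that maximizes the two-segment mean-shift statistic.

      def snht(x: np.ndarray) -> tuple[int, float]:
          z = (x - x.mean()) / x.std(ddof=1)   # standardized candidate series
          n = len(z)
          t = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                        for k in range(1, n)])
          k_best = int(np.argmax(t)) + 1       # most likely break position
          return k_best, float(t.max())

      rng = np.random.default_rng(0)
      x = np.concatenate([rng.normal(10.0, 1.0, 60),    # homogeneous segment
                          rng.normal(11.5, 1.0, 40)])   # artificial +1.5 shift
      print(snht(x))  # break near index 60, with a large test statistic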

  15. Towards Archetypes-Based Software Development

    NASA Astrophysics Data System (ADS)

    Piho, Gunnar; Roost, Mart; Perkins, David; Tepandi, Jaak

    We present a framework for the archetypes based engineering of domains, requirements and software (Archetypes-Based Software Development, ABD). An archetype is defined as a primordial object that occurs consistently and universally in business domains and in business software systems. An archetype pattern is a collaboration of archetypes. Archetypes and archetype patterns are used to capture conceptual information into domain specific models that are utilized by ABD. The focus of ABD is on software factories - family-based development artefacts (domain specific languages, patterns, frameworks, tools, micro processes, and others) that can be used to build the family members. We demonstrate the usage of ABD for developing laboratory information management system (LIMS) software for the Clinical and Biomedical Proteomics Group, at the Leeds Institute of Molecular Medicine, University of Leeds.

  16. Web-based interactive drone control using hand gesture

    NASA Astrophysics Data System (ADS)

    Zhao, Zhenfei; Luo, Hao; Song, Guang-Hua; Chen, Zhou; Lu, Zhe-Ming; Wu, Xiaofeng

    2018-01-01

    This paper develops a drone control prototype based on web technology with the aid of hand gestures. The uplink control commands and downlink data (e.g., video) are transmitted over a WiFi link, and all information exchange is realized on the web. Control commands are translated from various predetermined hand gestures. Specifically, the hardware of this friendly interactive control system is composed of a quadrotor drone, a computer vision-based hand gesture sensor, and a cost-effective computer. The software is simplified to a web-based user interface program. Aided by natural hand gestures, this system significantly reduces the complexity of traditional human-computer interaction, making remote drone operation more intuitive. Meanwhile, a web-based automatic control mode is provided in addition to the hand gesture control mode. For both operation modes, no extra application program needs to be installed on the computer. Experimental results demonstrate the effectiveness and efficiency of the proposed system, including its control accuracy and operation latency. This system can be used in many applications, such as controlling a drone in a global positioning system denied environment or by handlers without professional drone control knowledge, since it is easy to get started.
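
    The gesture-to-command translation described above is essentially a lookup from recognized gestures to predetermined commands. A hypothetical Python sketch follows; the gesture names and the uplink call are invented placeholders, not the authors' protocol.

      # Hypothetical sketch: map a recognized hand gesture to a predetermined
      # drone command and send it over the web/WiFi uplink.

      GESTURE_COMMANDS = {
          "open_palm":  "hover",
          "fist":       "land",
          "swipe_up":   "ascend",
          "swipe_down": "descend",
      }

      def send_command(command: str) -> None:
          # In the real system this would be an HTTP/WebSocket message.
          print(f"uplink -> {command}")

      def on_gesture(gesture: str) -> None:
          command = GESTURE_COMMANDS.get(gesture)
          if command is None:
              return  # ignore unrecognized gestures instead of sending garbage
          send_command(command)

      on_gesture("open_palm")  # uplink -> hover
      on_gesture("wave")       # ignored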

  17. Web-based interactive drone control using hand gesture.

    PubMed

    Zhao, Zhenfei; Luo, Hao; Song, Guang-Hua; Chen, Zhou; Lu, Zhe-Ming; Wu, Xiaofeng

    2018-01-01

    This paper develops a drone control prototype based on web technology with the aid of hand gestures. The uplink control commands and downlink data (e.g., video) are transmitted over a WiFi link, and all information exchange is realized on the web. Control commands are translated from various predetermined hand gestures. Specifically, the hardware of this friendly interactive control system is composed of a quadrotor drone, a computer vision-based hand gesture sensor, and a cost-effective computer. The software is simplified to a web-based user interface program. Aided by natural hand gestures, this system significantly reduces the complexity of traditional human-computer interaction, making remote drone operation more intuitive. Meanwhile, a web-based automatic control mode is provided in addition to the hand gesture control mode. For both operation modes, no extra application program needs to be installed on the computer. Experimental results demonstrate the effectiveness and efficiency of the proposed system, including its control accuracy and operation latency. This system can be used in many applications, such as controlling a drone in a global positioning system denied environment or by handlers without professional drone control knowledge, since it is easy to get started.

  18. A simple rapid process for semi-automated brain extraction from magnetic resonance images of the whole mouse head.

    PubMed

    Delora, Adam; Gonzales, Aaron; Medina, Christopher S; Mitchell, Adam; Mohed, Abdul Faheem; Jacobs, Russell E; Bearer, Elaine L

    2016-01-15

    Magnetic resonance imaging (MRI) is a well-developed technique in neuroscience. Limitations in applying MRI to rodent models of neuropsychiatric disorders include the large number of animals required to achieve statistical significance, and the paucity of automation tools for the critical early step in processing, brain extraction, which prepares brain images for alignment and voxel-wise statistics. This novel, timesaving automation of template-based brain extraction ("skull-stripping") is capable of quickly and reliably extracting the brain from large numbers of whole head images in a single step. The method is simple to install, requires minimal user interaction, and is equally applicable to different types of MR images. Results were evaluated with Dice and Jaccard similarity indices and compared in 3D surface projections with other stripping approaches. Statistical comparisons demonstrate that individual variation of brain volumes is preserved. A downloadable software package, not otherwise available, for extraction of brains from whole head images is included here. This software tool increases speed, can be used with an atlas or a template from within the dataset, and produces masks that need little further refinement. Our new automation can be applied to any MR dataset, since the starting point is a template mask generated specifically for that dataset. The method reliably and rapidly extracts brain images from whole head images, rendering them usable for subsequent analytical processing. This software tool will accelerate the exploitation of mouse models for the investigation of human brain disorders by MRI. Copyright © 2015 Elsevier B.V. All rights reserved.
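
    The core masking step of template-based skull-stripping is a voxel-wise product of the head volume with a binary brain mask. A minimal numpy sketch is shown below; registration of the mask to each head, which the pipeline automates, is omitted.

      import numpy as np

      # Apply a binary brain mask (from an atlas or a within-dataset template)
      # voxel-wise to a whole-head volume, zeroing the non-brain tissue.

      def extract_brain(head: np.ndarray, mask: np.ndarray) -> np.ndarray:
          assert head.shape == mask.shape, "mask must be aligned to the volume"
          return head * (mask > 0)  # keep voxels inside the mask, zero the rest

      head = np.random.rand(64, 64, 32)   # stand-in for an MR head volume
      mask = np.zeros_like(head)
      mask[16:48, 16:48, 8:24] = 1        # toy 'brain' region
      brain = extract_brain(head, mask)
      print(brain[32].nonzero()[0].size > 0)  # brain voxels survive the mask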

  19. ActiWiz 3 – an overview of the latest developments and their application

    NASA Astrophysics Data System (ADS)

    Vincke, H.; Theis, C.

    2018-06-01

    In 2011 the ActiWiz code was developed at CERN in order to optimize the choice of materials for accelerator equipment from a radiological point of view. Since then the code has been extended to allow for calculating complete nuclide inventories and providing evaluations with respect to radiotoxicity, inhalation doses, etc. Until now the software included only pre-defined radiation environments for CERN's high-energy proton accelerators, which were based on FLUKA Monte Carlo calculations. Eventually the decision was taken to invest in a major revamping of the code. Starting with version 3 the software is no longer limited to pre-defined radiation fields but can, within a few seconds, also treat arbitrary environments for which fluence spectra are available. This has become possible due to the use of ~100 CPU years' worth of FLUKA Monte Carlo simulations as well as the JEFF cross-section library for neutrons < 20 MeV. The latest code version also allowed the efficient inclusion of 42 additional radiation environments of the LHC experiments, as well as considerably more flexibility in characterizing waste from CERN's Large Electron Positron collider (LEP). New fully integrated analysis functionalities, like automatic evaluation of difficult-to-measure nuclides and rapid assessment of the temporal evolution of quantities like radiotoxicity or dose rates, make the software a powerful tool for characterization, complementary to general purpose MC codes like FLUKA. In this paper an overview of the capabilities is given using recent examples from the domain of waste characterization as well as operational radiation protection.

  20. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... display the following: (i) The purchase or wager amount; (ii) Game results; and (iii) Any player credit balance. (2) Between plays of any game and until the start of the next play, or until the player selects a new game option such as purchase or wager amount or card selection, whichever is earlier, if not...

  1. 25 CFR 547.8 - What are the minimum technical software standards applicable to Class II gaming systems?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... display the following: (i) The purchase or wager amount; (ii) Game results; and (iii) Any player credit balance. (2) Between plays of any game and until the start of the next play, or until the player selects a new game option such as purchase or wager amount or card selection, whichever is earlier, if not...

  2. Advanced Control Systems for Aircraft Powerplants

    DTIC Science & Technology

    1980-02-01

    ...production of high-integrity software... Work on full-authority digital control for gas turbines was started at Rolls-Royce Limited... In order to fully understand the operation of the Secondary Power System Control Unit (SPSCU), we must first take a close look at... ...Read Only Memory; EPROM -- Erasable Read Only Memory; PLA -- Power Lever Angle; LVDT -- Linear Variable Differential Transformer... Preliminary design...

  3. Chips: A Tool for Developing Software Interfaces Interactively.

    DTIC Science & Technology

    1987-10-01

    ...of the application through the objects on the screen. Chips makes this easy by supplying simple and direct access to the source code and data... object-oriented programming, user interface management systems, programming environments... Typographic conventions: technical terms appearing in the... creating an environment in which we could do our work. This project could not have happened without him. Jeff Bonar started and managed the Chips...

  4. Running VisIt Software on the Peregrine System | High-Performance Computing

    Science.gov Websites

    ...kilobyte range. VisIt features a robust remote visualization capability: VisIt can be started on a local machine and used to visualize data on a remote compute cluster. The remote machine must be able to send... The VisIt module must be loaded as part of this process. To enable remote visualization, the 'module load...

  5. The increase in the starting torque of PMSM motor by applying of FOC method

    NASA Astrophysics Data System (ADS)

    Plachta, Kamil

    2017-05-01

    The article presents a field-oriented control (FOC) method for a permanent magnet synchronous motor equipped with optical sensors. This method allows wide-range regulation of the torque and rotational speed of the electric motor. The paper presents a mathematical model of the electric motor and the vector control method. Optical sensors have a shorter response time than inductive sensors, which allows a faster response of the electronic control system to changes in motor load. The motor driver is based on a digital signal processor which performs advanced mathematical operations in real time. The application of the Clarke and Park transformations in the software determines the rotor position angle. The presented solution provides smooth adjustment of the rotational speed in the first operating zone and reduces the dead zone of the torque in the second and third operating zones.
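
    The Clarke and Park transformations mentioned above have a compact closed form. The following Python sketch uses the amplitude-invariant Clarke convention, which is an assumption and may differ from the convention used in the paper.

      import math

      # Clarke: project three-phase currents onto a stationary alpha-beta
      # frame (assuming ia + ib + ic = 0). Park: rotate into the d-q frame
      # using the rotor angle, where FOC regulates DC-like quantities.

      def clarke(ia: float, ib: float) -> tuple[float, float]:
          alpha = ia
          beta = (ia + 2.0 * ib) / math.sqrt(3.0)
          return alpha, beta

      def park(alpha: float, beta: float, theta: float) -> tuple[float, float]:
          # theta: electrical rotor angle (from the optical sensors here)
          d = alpha * math.cos(theta) + beta * math.sin(theta)
          q = -alpha * math.sin(theta) + beta * math.cos(theta)
          return d, q

      # Balanced currents at the instant theta = 0 map to d = 1, q = 0:
      ia, ib = 1.0, -0.5
      print(park(*clarke(ia, ib), theta=0.0))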

  6. BEAMLINE-CONTROLLED STEERING OF SOURCE-POINT ANGLE AT THE ADVANCED PHOTON SOURCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, L.; Fystro, G.; Shang, H.

    An EPICS-based steering software system has been implemented for beamline personnel to directly steer the angle of the synchrotron radiation sources at the Advanced Photon Source. A script running on a workstation monitors "start steering" beamline EPICS records, and effects a steering given by the value of the "angle request" EPICS record. The new system makes the steering process much faster than before, although the older steering protocols can still be used. The robustness features of the original steering remain. Feedback messages are provided to the beamlines and the accelerator operators. Underpinning this new steering protocol is the recent refinement of the global orbit feedback process whereby feedforward of dipole corrector set points and orbit set points are used to create a local steering bump in a rapid and seamless way.

  7. Biochemia Medica has started using the CrossCheck plagiarism detection software powered by iThenticate

    PubMed Central

    Šupak-Smolčić, Vesna; Šimundić, Ana-Maria

    2013-01-01

    In February 2013, Biochemia Medica joined CrossRef, which enabled us to implement the CrossCheck plagiarism detection service. Therefore, all manuscripts submitted to Biochemia Medica are now first assigned to a Research Integrity Editor (RIE) before being sent for peer review. The RIE submits the text to CrossCheck analysis and is responsible for reviewing the results of the text similarity analysis. Based on the CrossCheck analysis results, the RIE subsequently provides a recommendation to the Editor-in-Chief (EIC) on whether the manuscript should be forwarded to peer review, corrected in the suspected parts prior to peer review, or immediately rejected. The final decision on the manuscript, however, rests with the EIC. We hope that our new policy and manuscript processing algorithm will help us to further increase the overall quality of our Journal. PMID:23894858

  8. Intellectual Property, Digital Technology and the Developing World

    NASA Astrophysics Data System (ADS)

    Pupillo, Lorenzo Maria

    This chapter provides an overview of how converging ICTs are challenging the traditional off-line copyright doctrine and suggests how developing countries should approach issues such as copyright in the digital world, software (protection, open source, reverse engineering), and database protection. The balance of the chapter is organized into three sections. After the introduction, the second section explains how digital technology is dramatically changing the entertainment industry, what the major challenges to the industry are, and what approaches the economic literature suggests for facing the structural changes that the digital revolution is bringing forward. Starting from the assumption that IPR frameworks need to be customized to countries' development needs, the third section makes recommendations on how developing countries should use copyright to support access to information and to creative industries.

  9. Taking a Position

    NASA Technical Reports Server (NTRS)

    1999-01-01

    "TerrAvoid" and "Position Integrity" combine Global Positioning Satellite (GPS) data with high-resolution maps of the Earth's topography. Dubbs & Severino, Inc., based in Irvine, California, has developed software that allows the system to be run on a battery-powered laptop in the cockpit. The packages, designed primarily for military sponsors and now positioned to hit the consumer market in coming months, came about as the result of the Jet Propulsion Laboratory's Technology Affiliates Program. Intended to give American industry assistance from NASA experts and to facilitate business use of intellectual property developed for the space program, the Technology Affiliates Program introduced the start-up company of Dubbs & Severino to JPL's Dr. Nevin Bryant four years ago. GeoTIFF is now in the public domain, and its use for commercial product development has evolved into an industry standard over the last year.

  10. Embedded parallel processing based ground control systems for small satellite telemetry

    NASA Technical Reports Server (NTRS)

    Forman, Michael L.; Hazra, Tushar K.; Troendly, Gregory M.; Nickum, William G.

    1994-01-01

    The use of networked terminals which utilize embedded processing techniques results in totally integrated, flexible, high speed, reliable, and scalable systems suitable for telemetry and data processing applications such as mission operations centers (MOC). Synergies of these terminals, coupled with the capability of each terminal to receive incoming data, allow the viewing of any defined display by any terminal from the start of data acquisition. There is no single point of failure (other than with network input) such as exists with configurations where all input data goes through a single front end processor and then to a serial string of workstations. Missions dedicated to NASA's ozone measurements program utilize the methodologies which are discussed, and result in a multimission configuration of low cost, scalable hardware and software which can be run by one flight operations team with low risk.

  11. A bootstrap based Neyman-Pearson test for identifying variable importance.

    PubMed

    Ditzler, Gregory; Polikar, Robi; Rosen, Gail

    2015-04-01

    Selection of the most informative features, leading to a small loss on future data, is arguably one of the most important steps in classification, data analysis and model selection. Several feature selection (FS) algorithms are available; however, due to the noise present in any data set, FS algorithms are typically accompanied by an appropriate cross-validation scheme. In this brief, we propose a statistical hypothesis test derived from the Neyman-Pearson lemma for determining if a feature is statistically relevant. The proposed approach can be applied as a wrapper to any FS algorithm, regardless of the FS criteria used by that algorithm, to determine whether a feature belongs in the relevant set. Perhaps more importantly, this procedure efficiently determines the number of relevant features given an initial starting point. We provide freely available software implementations of the proposed methodology.
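
    A simplified illustration of the bootstrap idea behind such a test (not the authors' exact Neyman-Pearson procedure): resample the data, recompute a relevance score per feature, and compare each feature against a permutation-based null.

      import numpy as np

      # Toy bootstrap sketch: how often does a feature's relevance score beat
      # the best score obtainable against a permuted (null) target?

      rng = np.random.default_rng(1)

      def score(x: np.ndarray, y: np.ndarray) -> float:
          return abs(np.corrcoef(x, y)[0, 1])  # toy relevance criterion

      def bootstrap_relevance(X, y, n_boot=200):
          n, d = X.shape
          wins = np.zeros(d)
          for _ in range(n_boot):
              idx = rng.integers(0, n, size=n)     # bootstrap resample
              Xb, yb = X[idx], y[idx]
              null = rng.permutation(yb)           # destroys real association
              null_score = max(score(Xb[:, j], null) for j in range(d))
              wins += [score(Xb[:, j], yb) > null_score for j in range(d)]
          return wins / n_boot  # per-feature fraction of resamples beating null

      X = rng.normal(size=(300, 3))
      y = 2 * X[:, 0] + rng.normal(size=300)  # only feature 0 is relevant
      print(bootstrap_relevance(X, y))        # feature 0 near 1.0, others low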

  12. Binding-Site Compatible Fragment Growing Applied to the Design of β2-Adrenergic Receptor Ligands.

    PubMed

    Chevillard, Florent; Rimmer, Helena; Betti, Cecilia; Pardon, Els; Ballet, Steven; van Hilten, Niek; Steyaert, Jan; Diederich, Wibke E; Kolb, Peter

    2018-02-08

    Fragment-based drug discovery is intimately linked to fragment extension approaches that can be accelerated using software for de novo design. Although computers allow for the facile generation of millions of suggestions, synthetic feasibility is often neglected. In this study we computationally extended, chemically synthesized, and experimentally assayed new ligands for the β2-adrenergic receptor (β2AR) by growing fragment-sized ligands. In order to address the synthetic tractability issue, our in silico workflow aims at derivatized products based on robust organic reactions. The study started from the predicted binding modes of five fragments. We suggested a total of eight diverse extensions that were easily synthesized, and further assays showed that four products had an improved affinity (up to 40-fold) compared to their respective initial fragments. The described workflow, which we call "growing via merging" and for which the key tools are available online, can improve early fragment-based drug discovery projects, making it a useful creative tool for medicinal chemists during structure-activity relationship (SAR) studies.

  13. Integrating medical devices in the operating room using service-oriented architectures.

    PubMed

    Ibach, Bastian; Benzko, Julia; Schlichting, Stefan; Zimolong, Andreas; Radermacher, Klaus

    2012-08-01

    With the increasing documentation requirements and communication capabilities of medical devices in the operating room, the integration and modular networking of these devices have become more and more important. Commercial integrated operating room systems are mainly proprietary developments that usually use proprietary communication standards and interfaces, which reduces the possibility of integrating devices from different vendors. To overcome these limitations, there is a need for an open standardized architecture that is based on standard protocols and interfaces, enabling the integration of devices from different vendors based on heterogeneous software and hardware components. Starting with an analysis of the requirements for device integration in the operating room and the techniques used for integrating devices in other industrial domains, a new concept for an integration architecture for the operating room based on the paradigm of a service-oriented architecture is developed. Standardized communication protocols and interface descriptions are used. As risk management is an important factor in the field of medical engineering, a risk analysis of the developed concept has been carried out and the first prototypes have been implemented.

  14. An Energy Saving Green Plug Device for Nonlinear Loads

    NASA Astrophysics Data System (ADS)

    Bloul, Albe; Sharaf, Adel; El-Hawary, Mohamed

    2018-03-01

    The paper presents a low-cost FACTS-based flexible fuzzy-logic-controlled modulated/switched tuned arm filter and Green Plug compensation (SFC-GP) scheme for single-phase nonlinear loads, ensuring both voltage stabilization and efficient energy utilization. The new Green Plug switched filter compensator (SFC), a modulated LC-filter/PWM-switched capacitive compensation device, is controlled using a fuzzy logic regulator to enhance power quality, improve the power factor at the source, and reduce switching transients and inrush current conditions as well as the harmonic content of the source current. The FACTS-based SFC-GP device is a member of a family of Green Plug/filter/compensation schemes used for efficient energy utilization, power quality enhancement and voltage/inrush current/soft-starting control using a dynamic error-driven fuzzy logic controller (FLC). The device with its fuzzy logic controller is validated in the Matlab/Simulink software environment, showing enhanced power quality (PQ), improved power factor and reduced inrush currents. This is achieved using modulated PWM switching of the filter-capacitive compensation scheme to cope with dynamic nonlinear and inrush cyclical loads.

  15. Effort Drivers Estimation for Brazilian Geographically Distributed Software Development

    NASA Astrophysics Data System (ADS)

    Almeida, Ana Carina M.; Souza, Renata; Aquino, Gibeon; Meira, Silvio

    To meet the requirements of today's fast-paced markets, it is important to develop projects on time and with the minimum use of resources; a good estimate is the key to achieving this goal. Several companies have started to work with geographically distributed teams for reasons of cost reduction and time-to-market. Some researchers indicate that this approach introduces new challenges, because the teams work in different time zones and may differ in culture and language, and it is already known that multisite development increases the software cycle time. Data from 15 DSD projects from 10 distinct companies were collected. The analysis shows the drivers that significantly impact the total effort planned to develop systems using the DSD approach in Brazil.

  16. Advanced Command Destruct System (ACDS) Enhanced Flight Termination System (EFTS)

    NASA Technical Reports Server (NTRS)

    Tow, David

    2009-01-01

    NASA Dryden started working towards a single-vehicle enhanced flight termination system (EFTS) in January 2008. NASA and AFFTC combined their efforts to work towards a final operating capability supporting multiple vehicles and multiple missions simultaneously, to be completed by the end of 2011. Initially, the system was developed to support one vehicle and one frequency per mission for unmanned aerial vehicles (UAVs) at NASA Dryden. By May 2008, 95% of the design and hardware builds were completed; however, a change in NASA Dryden's software safety scope and requirements caused delays after May 2008. This presentation reviews the initial and final operating capabilities for the Advanced Command Destruct System (ACDS), including command controller and configuration software development. A requirements summary is also provided.

  17. Automated mixed traffic transit vehicle microprocessor controller

    NASA Technical Reports Server (NTRS)

    Marks, R. A.; Cassell, P.; Johnston, A. R.

    1981-01-01

    An improved Automated Mixed Traffic Vehicle (AMTV) speed control system employing a microprocessor and a transistor chopper motor current controller is described, and its performance is presented in terms of velocity versus time curves. The on-board computer hardware and software systems are described, as is the software development system. All of the programming used in this controller was implemented in FORTRAN. This microprocessor controller made possible a number of safety features and improved the comfort associated with starting and stopping. In addition, most of the vehicle's performance characteristics can be altered by simple program parameter changes. A failure analysis of the microprocessor controller was generated and the results are included. Flow diagrams for the speed control algorithms and complete FORTRAN code listings are also included.

  18. Positional Catalogues of Saturn's and Jupiter's Moons

    NASA Astrophysics Data System (ADS)

    Yizhakevych, O.; Andruk, V.; Pakuliak, L.; Lukianchuk, V.; Shatokhina, S.

    In the framework of the UkrVO national project (http://ukr-vo.org/) we have started the processing of photographic observations of Saturn's (S1-S8) and Jupiter's (J6-J8) moons. Observations were conducted during 1961-1993 with the three astrographs DLFA, DWA and DAZ, and the Z600 reflector. Plate images were digitized as tif-files with commercial scanners. Image processing was carried out by a specific software package, developed at the MAO NASU, in the LINUX-MIDAS-ROMAFOT environment with Tycho2 as the reference catalogue. The obtained positions of the objects were compared online with the ones theoretically predicted by the IMCCE (Paris) (www.imcce.fr/sat). The rms error of the divergence between observed and calculated positions is 0.20'-0.35'.

  19. AstroBus On-Board Software

    NASA Astrophysics Data System (ADS)

    Biscarros, D.; Cantenot, C.; Séronie-Vivien, J.; Schmidt, G.

    The AstroBus on-board software is customisable software for ERC32-based avionics implementing standard ESA Packet Utilization Standard (PUS) functions. Its architecture, based on generic design templates and relying on a library providing the standard PUS TC, TM and event services, enhances its reusability across programs. Finally, the AstroBus on-board software development and validation environment is based on latest-generation tools providing an optimised customisation environment.

  20. E-learning for Critical Thinking: Using Nominal Focus Group Method to Inform Software Content and Design.

    PubMed

    Parker, Steve; Mayner, Lidia; Michael Gillham, David

    2015-12-01

    Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking.

  1. Development of a software for the design of custom-made hip prostheses using an open-source rapid application development environment.

    PubMed

    Viceconti, M; Testi, D; Gori, R; Zannoni, C

    2000-01-01

    The present work describes a technology transfer project called HIPCOM, devoted to re-engineering the process used by a medical device manufacturer to design custom-made hip prostheses. Although it started with insufficient support from the end-user management, a very tight schedule and a moderate budget, the project developed into what is considered by all partners a success story. In particular, the development of the design software, called the HIPCOM Interactive Design Environment (HIDE), was completed in less time than even the most optimistic expectation. The software was quite stable from its first beta version, and once introduced at the user site it fully replaced the original procedure in less than two months. One year after the early adoption, more than 80 custom-made prostheses had been designed with HIDE and the user had reported only two bugs, both cosmetic. The scope of the present work was to report the development experience and to investigate the reasons for these positive results, with particular reference to the development procedure and the software architecture. The choice of TCL/TK as the development language and the adoption of a well-defined software architecture were found to be the key success factors. Other important determinants were the adoption of an incremental software engineering strategy, well suited to small-to-medium projects, and the presence on the development staff of a technology transfer expert.

  2. Agile Methods for Open Source Safety-Critical Software

    PubMed Central

    Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-01-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and added process elements only as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion. PMID:21799545

  3. Agile Methods for Open Source Safety-Critical Software.

    PubMed

    Gary, Kevin; Enquobahrie, Andinet; Ibanez, Luis; Cheng, Patrick; Yaniv, Ziv; Cleary, Kevin; Kokoori, Shylaja; Muffih, Benjamin; Heidenreich, John

    2011-08-01

    The introduction of software technology in a life-dependent environment requires the development team to execute a process that ensures a high level of software reliability and correctness. Despite their popularity, agile methods are generally assumed to be inappropriate as a process family in these environments due to their lack of emphasis on documentation, traceability, and other formal techniques. Agile methods, notably Scrum, favor empirical process control, or small constant adjustments in a tight feedback loop. This paper challenges the assumption that agile methods are inappropriate for safety-critical software development. Agile methods are flexible enough to encourage the right amount of ceremony; therefore if safety-critical systems require greater emphasis on activities like formal specification and requirements management, then an agile process will include these as necessary activities. Furthermore, agile methods focus more on continuous process management and code-level quality than classic software engineering process models. We present our experiences on the image-guided surgical toolkit (IGSTK) project as a backdrop. IGSTK is an open source software project employing agile practices since 2004. We started with the assumption that a lighter process is better, focused on evolving code, and added process elements only as the need arose. IGSTK has been adopted by teaching hospitals and research labs, and used for clinical trials. Agile methods have matured since the academic community suggested they are not suitable for safety-critical systems almost a decade ago; we present our experiences as a case study for renewing the discussion.

  4. E-learning for Critical Thinking: Using Nominal Focus Group Method to Inform Software Content and Design

    PubMed Central

    Parker, Steve; Mayner, Lidia; Michael Gillham, David

    2015-01-01

    Background: Undergraduate nursing students are often confused by multiple understandings of critical thinking. In response to this situation, the Critiique for critical thinking (CCT) project was implemented to provide consistent structured guidance about critical thinking. Objectives: This paper introduces Critiique software, describes initial validation of the content of this critical thinking tool and explores wider applications of the Critiique software. Materials and Methods: Critiique is flexible, authorable software that guides students step-by-step through critical appraisal of research papers. The spelling of Critiique was deliberate, so as to acquire a unique web domain name and associated logo. The CCT project involved implementation of a modified nominal focus group process with academic staff working together to establish common understandings of critical thinking. Previous work established a consensus about critical thinking in nursing and provided a starting point for the focus groups. The study was conducted at an Australian university campus with the focus group guided by open ended questions. Results: Focus group data established categories of content that academic staff identified as important for teaching critical thinking. This emerging focus group data was then used to inform modification of Critiique software so that students had access to consistent and structured guidance in relation to critical thinking and critical appraisal. Conclusions: The project succeeded in using focus group data from academics to inform software development while at the same time retaining the benefits of broader philosophical dimensions of critical thinking. PMID:26835469

  5. Fluidica CFD software for fluids instruction

    NASA Astrophysics Data System (ADS)

    Colonius, Tim

    2008-11-01

    Fluidica is an open-source, freely available Matlab graphical user interface (GUI) to an immersed-boundary Navier-Stokes solver. The algorithm is programmed in Fortran and compiled into Matlab as a mex-function. The user can create external flows about arbitrarily complex bodies and collections of free vortices. The code runs fast enough for complex 2D flows to be computed and visualized in real time on the screen. This facilitates its use in homework and in the classroom for demonstrations of various potential-flow and viscous-flow phenomena. The GUI has been written with the goal of allowing the student to learn how to use the software as she goes along. The user can select which quantities are viewed on the screen, including contours of various scalars, velocity vectors, streamlines, particle trajectories, streaklines, and finite-time Lyapunov exponents. In this talk, we demonstrate the software in the context of worked classroom examples demonstrating lift and drag, starting vortices, separation, and vortex dynamics.
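
    The free-vortex demonstrations mentioned above rest on the induced velocity of point vortices. A small numpy sketch of that kernel is given below, in Python rather than the package's Matlab/Fortran, purely for illustration.

      import numpy as np

      # Each point vortex of circulation gamma induces a tangential velocity
      # of magnitude gamma / (2*pi*r) at distance r from its core.

      def induced_velocity(x, y, vortices):
          """Velocity at (x, y) induced by [(x0, y0, gamma), ...]."""
          u = v = 0.0
          for x0, y0, gamma in vortices:
              dx, dy = x - x0, y - y0
              r2 = dx * dx + dy * dy
              if r2 < 1e-12:
                  continue  # skip self-induction at the vortex core
              u += -gamma * dy / (2.0 * np.pi * r2)
              v += gamma * dx / (2.0 * np.pi * r2)
          return u, v

      # A counter-rotating pair induces a net velocity at its midpoint,
      # perpendicular to the line joining the two vortices:
      pair = [(-0.5, 0.0, 1.0), (0.5, 0.0, -1.0)]
      print(induced_velocity(0.0, 0.0, pair))  # (0.0, ~0.637)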

  6. Process of videotape making: presentation design, software, and hardware

    NASA Astrophysics Data System (ADS)

    Dickinson, Robert R.; Brady, Dan R.; Bennison, Tim; Burns, Thomas; Pines, Sheldon

    1991-06-01

    The use of technical videotape presentations for communicating abstractions of complex data is now becoming commonplace. While the use of videotapes in the day-to-day work of scientists and engineers is still in its infancy, their use at applications-oriented conferences is now growing rapidly. Despite these advancements, there is still very little written down about the process of making technical videotapes. For printed media, different presentation styles are well known for categories such as results reports, executive summary reports, and technical papers and articles. In this paper, the authors present ideas on technical videotape presentation design in a format that is worth referring to. They have started to document the ways in which the experience of media specialists, teaching professionals, and character animators can be applied to scientific animation. Software and hardware considerations are also discussed; for this portion, distinctions are drawn between the software and hardware required for computer animation (frame-at-a-time) productions and for live recorded interaction with a computer graphics display.

  7. Application of Gaia Analysis Software AGIS to Nano-JASMINE

    NASA Astrophysics Data System (ADS)

    Yamada, Y.; Lammers, U.; Gouda, N.

    2011-07-01

    The core data reduction for the Nano-JASMINE mission is planned to be done with Gaia's Astrometric Global Iterative Solution (AGIS). Nano-JASMINE is an ultra-small (35 kg) satellite for astrometry observations in Japan, and Gaia is ESA's large (over 1000 kg) next-generation astrometry mission. The accuracy of Nano-JASMINE is about 3 mas, comparable to that of the Hipparcos mission, Gaia's predecessor some 20 years ago. It is challenging for such a small satellite to perform real scientific observations. The collaboration for sharing software started in 2007. In addition to the similar design and operating principles of the two missions, this is possible thanks to the encapsulation of all Gaia-specific aspects of AGIS in a Parameter Database. Nano-JASMINE will be the test bench for the Gaia AGIS software. We present this idea in detail and the necessary practical steps to make AGIS work with Nano-JASMINE data. We also show the key mission parameters, goals, and the status of the data reduction for Nano-JASMINE.

  8. Comparison of BrainTool to other UML modeling and model transformation tools

    NASA Astrophysics Data System (ADS)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years, numerous approaches to generating software systems from models have been offered, targeting problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction, and software development automation using CASE tools is said to enable a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending, for the most advanced ones, with a code generator (possibly using a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor for defining new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  9. Simulation of Top Quark Pair Production as a Background for Higgs Events at the Compact Muon Solenoid

    NASA Astrophysics Data System (ADS)

    Justus, Christopher

    2005-04-01

    In this study, we simulated top-antitop (tt-bar) quark events at the Compact Muon Solenoid (CMS), an experiment presently being constructed at the Large Hadron Collider in Geneva, Switzerland. The tt-bar process is an important background for Higgs events. We used a chain of software to simulate and reconstruct processes that will occur inside the detector. CMKIN was used to generate and store Monte Carlo Events. OSCAR, a GEANT4 based CMS detector simulator, was used to simulate the CMS detector and how particles would interact with the detector. Next, we used ORCA to simulate the response of the readout electronics at CMS. Last, we used the Jet/MET Root maker to create root files of jets and missing energy. We are now using this software analysis chain to complete a systematic study of initial state radiation at hadron colliders. This study is essential because tt-bar is the main background for the Higgs boson and these processes are extremely sensitive to initial state radiation. Results of our initial state radiation study will be presented. We started this study at the new LHC Physics Center (LPC) located at Fermi National Accelerator Laboratory, and we are now completing the study at the University of Rochester.

  10. Visualizing NetCDF Files by Using the EverVIEW Data Viewer

    USGS Publications Warehouse

    Conzelmann, Craig; Romañach, Stephanie S.

    2010-01-01

    Over the past few years, modelers in South Florida have started using Network Common Data Form (NetCDF) as the standard data container format for storing hydrologic and ecologic modeling inputs and outputs. With its origins in the meteorological discipline, NetCDF was created by the Unidata Program Center at the University Corporation for Atmospheric Research, in conjunction with the National Aeronautics and Space Administration and other organizations. NetCDF is a portable, scalable, self-describing, binary file format optimized for storing array-based scientific data. Despite attributes which make NetCDF desirable to the modeling community, many natural resource managers have few desktop software packages which can consume NetCDF and unlock the valuable data contained within. The U.S. Geological Survey and the Joint Ecosystem Modeling group, an ecological modeling community of practice, are working to address this need with the EverVIEW Data Viewer. Available for several operating systems, this desktop software currently supports graphical displays of NetCDF data as spatial overlays on a three-dimensional globe and views of grid-cell values in tabular form. An included Open Geospatial Consortium compliant, Web-mapping service client and charting interface allows the user to view Web-available spatial data as additional map overlays and provides simple charting visualizations of NetCDF grid values.
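
    The self-describing nature of NetCDF noted above means a reader can discover dimensions, variables and metadata without prior knowledge of the file. A minimal sketch using the netCDF4 Python library follows; the file name and variable name are hypothetical.

      from netCDF4 import Dataset  # pip install netCDF4

      # Open a (hypothetical) model-output file and inspect its contents;
      # nothing about the file's structure needs to be known in advance.

      with Dataset("model_output.nc", "r") as nc:
          print(nc.dimensions)                      # e.g. time, lat, lon
          for name, var in nc.variables.items():
              print(name, var.dimensions, var.shape,
                    getattr(var, "units", "?"))     # self-described metadata
          stage = nc.variables["water_stage"][:]    # hypothetical variable
          print(stage.mean())                       # array-based numpy access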

  11. On the modification Highly Connected Subgraphs (HCS) algorithm in graph clustering for weighted graph

    NASA Astrophysics Data System (ADS)

    Albirri, E. R.; Sugeng, K. A.; Aldila, D.

    2018-04-01

    Nowadays, as technology and human civilization progress, almost all cities in the world are connected and the various places of the world are easier to visit; this is an impact of transportation technology and highway construction. Cities that are connected in this way can be represented by a graph, and graph clustering is one of the ways used to answer problems represented by graphs. Several graph clustering methods address this problem specifically; one of them is the Highly Connected Subgraphs (HCS) method. HCS identifies clusters based on the edge connectivity k(G) of a graph G: if k(G) > n/2, where n is the number of vertices of G, then G is called highly connected and constitutes a cluster. This research used a literature review, complemented by a program simulation. We modified the HCS algorithm to handle weighted graphs; the modification is located in the Process Phase, which cuts the connected graph G into two subgraphs H and H̄. We also implemented the program in the Octave 4.0.1 software and applied flight route mapping data from an Indonesian airline to our program.
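
    The HCS recursion described above can be sketched compactly with networkx. The sketch below assumes the weighted variant judges the minimum weighted cut (Stoer-Wagner) against n/2; it illustrates the recursion rather than reproducing the paper's exact modification.

      import networkx as nx

      # HCS sketch: if the minimum (weighted) cut exceeds n/2 the graph is
      # highly connected and returned as a cluster; otherwise cut and recurse.

      def hcs(G: nx.Graph, min_size: int = 3):
          if G.number_of_nodes() < min_size:
              return [set(G.nodes)]                     # too small to split
          cut_value, (p1, p2) = nx.stoer_wagner(G)      # global min weighted cut
          if cut_value > G.number_of_nodes() / 2:       # highly connected
              return [set(G.nodes)]
          return (hcs(G.subgraph(p1).copy(), min_size) +
                  hcs(G.subgraph(p2).copy(), min_size))

      G = nx.Graph()
      G.add_weighted_edges_from([("A", "B", 5), ("B", "C", 5), ("A", "C", 5),
                                 ("C", "D", 1),              # weak bridge
                                 ("D", "E", 5), ("E", "F", 5), ("D", "F", 5)])
      print(hcs(G))  # two triangles, split at the weak bridge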

  12. AstroImageJ: Image Processing and Photometric Extraction for Ultra-precise Astronomical Light Curves

    NASA Astrophysics Data System (ADS)

    Collins, Karen A.; Kielkopf, John F.; Stassun, Keivan G.; Hessman, Frederic V.

    2017-02-01

    ImageJ is a graphical user interface (GUI) driven, public domain, Java-based, software package for general image processing traditionally used mainly in life sciences fields. The image processing capabilities of ImageJ are useful and extendable to other scientific fields. Here we present AstroImageJ (AIJ), which provides an astronomy specific image display environment and tools for astronomy specific image calibration and data reduction. Although AIJ maintains the general purpose image processing capabilities of ImageJ, AIJ is streamlined for time-series differential photometry, light curve detrending and fitting, and light curve plotting, especially for applications requiring ultra-precise light curves (e.g., exoplanet transits). AIJ reads and writes standard Flexible Image Transport System (FITS) files, as well as other common image formats, provides FITS header viewing and editing, and is World Coordinate System aware, including an automated interface to the astrometry.net web portal for plate solving images. AIJ provides research grade image calibration and analysis tools with a GUI driven approach, and easily installed cross-platform compatibility. It enables new users, even at the level of undergraduate student, high school student, or amateur astronomer, to quickly start processing, modeling, and plotting astronomical image data with one tightly integrated software package.

  13. Arraycount, an algorithm for automatic cell counting in microwell arrays.

    PubMed

    Kachouie, Nezamoddin; Kang, Lifeng; Khademhosseini, Ali

    2009-09-01

    Microscale technologies have emerged as a powerful tool for studying and manipulating biological systems and miniaturizing experiments. However, the lack of software complementing these techniques has made it difficult to apply them to many high-throughput experiments. This work establishes Arraycount, an approach to automatically count cells in microwell arrays. The procedure consists of fluorescent microscope imaging of cells seeded in the microwells of a microarray system, followed by computer analysis of the images to recognize the array and count the cells inside each microwell. To start counting, green and red fluorescent images (representing live and dead cells, respectively) are extracted from the original image and processed separately. A template-matching algorithm is proposed in which pre-defined well and cell templates are matched against the red and green images to locate microwells and cells. Subsequently, local maxima in the correlation maps are determined and the local maxima maps are thresholded. At the end, the software records the cell counts for each detected microwell on the original image in high throughput. The automated counting was shown to be accurate compared with manual counting, with a difference of approximately 1-2 cells per microwell: depending on cell concentration, the absolute difference between manual and automatic counting measurements was 2.5-13%.
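
    The template-matching and peak-thresholding step described above can be sketched with scipy; the window size, threshold and suppression scheme below are illustrative assumptions, not Arraycount's actual parameters.

      import numpy as np
      from scipy.signal import correlate2d

      # Correlate a small cell template against a single-channel fluorescence
      # image, then repeatedly take the strongest correlation peak above a
      # threshold, suppressing a template-sized neighbourhood around each hit
      # so one cell is counted once. (A full implementation would normalize
      # the correlation locally.)

      def count_cells(image, template, threshold):
          corr = correlate2d(image, template, mode="same")
          h, w = template.shape
          count = 0
          while True:
              if corr.max() < threshold:
                  return count
              i, j = np.unravel_index(corr.argmax(), corr.shape)
              count += 1
              corr[max(0, i - h):i + h, max(0, j - w):j + w] = -np.inf

      img = np.zeros((32, 32))
      img[5:8, 5:8] = 1.0        # two bright 'cells'
      img[20:23, 12:15] = 1.0
      tmpl = np.ones((3, 3))
      print(count_cells(img, tmpl, threshold=4.0))  # -> 2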

  14. Single baseline GLONASS observations with VLBI: data processing and first results

    NASA Astrophysics Data System (ADS)

    Tornatore, V.; Haas, R.; Duev, D.; Pogrebenko, S.; Casey, S.; Molera Calvés, G.; Keimpema, A.

    2011-07-01

    Several tests to observe signals transmitted by GLONASS (GLObal NAvigation Satellite System) satellites have been performed using the geodetic VLBI (Very Long Baseline Interferometry) technique. The radio telescopes involved in these experiments were Medicina (Italy) and Onsala (Sweden), both equipped with L-band receivers. Observations at the stations were performed using the standard Mark4 VLBI data acquisition rack and Mark5A disk-based recorders. The goals of the observations were to develop and test the scheduling, signal acquisition and processing routines and to verify the full tracking pipeline, foreseeing the cross-correlation of the recorded data on the Onsala-Medicina baseline. The natural radio source 3c286 was used as a calibrator before the start of the satellite observation sessions. Delay models, including tropospheric and ionospheric corrections, that are consistent for both far-field and near-field sources are under development. Correlation of the calibrator signal has been performed using the DiFX software, while the satellite signals have been processed using the narrow-band approach with the Metsaehovi software and analysed with a near-field delay model. Delay models for both the calibrator and satellite signals, using the same geometrical, tropospheric and ionospheric models, are under investigation to make correlation of the satellite signals possible.

  15. Effect of air gap variation on the performance of single stator single rotor axial flux permanent magnet generator

    NASA Astrophysics Data System (ADS)

    Kasim, Muhammad; Irasari, Pudji; Hikmawan, M. Fathul; Widiyanto, Puji; Wirtayasa, Ketut

    2017-02-01

    The axial flux permanent magnet generator (AFPMG) has been widely used, especially for electricity generation. The effect of air gap variation on the characteristics and performance of a single rotor-single stator AFPMG is described in this paper. The effect of air gap length on the magnetic flux distribution, starting torque and MMF has been investigated. A two-dimensional finite element magnetic method, based on Maxwell's equations, has been deployed to model and simulate the characteristics of the machine. The analysis was done for two air gap lengths, 2 mm and 4 mm, using the 2D FEMM 4.2 software at no-load condition. Increasing the air gap length reduces the air-gap flux density: for the 2 mm air gap the maximum flux density was 1.04 T, while 0.73 T occurred for the 4 mm air gap. Based on the experimental results, the increased air gap also reduced the starting torque of the machine, from 39.2 Nm at 2 mm to 34.2 Nm when the air gap increased to 4 mm. Meanwhile, the MMF generated by the AFPMG decreased by around 22% at 50 Hz due to the reduction of magnetic flux induced in the stator windings. Overall, the results showed that air gap variation has a significant effect on the machine characteristics.

  16. A Software Engineering Approach based on WebML and BPMN to the Mediation Scenario of the SWS Challenge

    NASA Astrophysics Data System (ADS)

    Brambilla, Marco; Ceri, Stefano; Valle, Emanuele Della; Facca, Federico M.; Tziviskou, Christina

    Although Semantic Web Services are expected to produce a revolution in the development of Web-based systems, very few enterprise-wide design experiences are available; one of the main reasons is the lack of sound Software Engineering methods and tools for the deployment of Semantic Web applications. In this chapter, we present an approach to software development for the Semantic Web based on classical Software Engineering methods (i.e., formal business process development, computer-aided and component-based software design, and automatic code generation) and on semantic methods and tools (i.e., ontology engineering, semantic service annotation and discovery).

  17. A near-infrared fluorescence-based surgical navigation system imaging software for sentinel lymph node detection

    NASA Astrophysics Data System (ADS)

    Ye, Jinzuo; Chi, Chongwei; Zhang, Shuang; Ma, Xibo; Tian, Jie

    2014-02-01

    Sentinel lymph node (SLN) in vivo detection is vital in breast cancer surgery. A new near-infrared fluorescence-based surgical navigation system (SNS) imaging software, developed by our research group, is presented for SLN detection surgery in this paper. The software is based on the fluorescence-based surgical navigation hardware system (SNHS) developed in our lab, and is designed specifically for intraoperative imaging and postoperative data analysis. The surgical navigation imaging software consists of the following modules: the control module, the image grabbing module, the real-time display module, the data saving module and the image processing module. Several algorithms have been designed to achieve the required performance, for example an image registration algorithm based on correlation matching. Some of the key features of the software include: setting the control parameters of the SNS; acquiring, displaying and storing the intraoperative imaging data in real time automatically; and analysing and processing the saved image data. The developed software has been used to successfully detect the SLNs in 21 cases of breast cancer patients. In the near future, we plan to improve the software performance, and it will be extensively used for clinical purposes.

  18. Teaching Agile Software Engineering Using Problem-Based Learning

    ERIC Educational Resources Information Center

    El-Khalili, Nuha H.

    2013-01-01

    Many studies have reported the utilization of Problem-Based Learning (PBL) in teaching Software Engineering courses. However, these studies have different views of the effectiveness of PBL. This paper presents the design of an Advanced Software Engineering course for undergraduate Software Engineering students that uses PBL to teach them Agile…

  19. RELAP-7 Level 2 Milestone Report: Demonstration of a Steady State Single Phase PWR Simulation with RELAP-7

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David Andrs; Ray Berry; Derek Gaston

    The document contains the simulation results of a steady-state model PWR problem with the RELAP-7 code. The RELAP-7 code is the next-generation nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). This report summarizes the initial results of simulating a model steady-state single-phase PWR problem using the current version of the RELAP-7 code. The major purpose of this demonstration simulation is to show that the RELAP-7 code can be rapidly developed to simulate single-phase reactor problems. RELAP-7 is a new project started on October 1st, 2011. It will become the main reactor systems simulation toolkit for RISMC (Risk Informed Safety Margin Characterization) and the next-generation tool in the RELAP reactor safety/systems analysis application series (the replacement for RELAP5). The key to the success of RELAP-7 is the simultaneous advancement of physical models, numerical methods, and software design while maintaining a solid user perspective. Physical models include both PDEs (Partial Differential Equations) and ODEs (Ordinary Differential Equations) as well as experimentally based closure models. RELAP-7 will eventually utilize well-posed governing equations for multiphase flow, which can be strictly verified. Closure models used in RELAP5 and newly developed models will be reviewed and selected to reflect the progress made during the past three decades. RELAP-7 uses modern numerical methods, which allow implicit time integration, higher-order schemes in both time and space, and strongly coupled multi-physics simulations. RELAP-7 is written in the object-oriented programming language C++, and its development follows modern software design paradigms. The code is easy to read, develop, maintain, and couple with other codes. Most importantly, the modern software design allows the RELAP-7 code to evolve with time. RELAP-7 is a MOOSE-based application. MOOSE is a framework for solving computational engineering problems in a well-planned, managed, and coordinated way. By leveraging millions of lines of open-source software packages, such as PETSc (a nonlinear solver developed at Argonne National Laboratory) and libMesh (a finite element analysis package developed at the University of Texas), MOOSE significantly reduces the expense and time required to develop new applications. Numerical integration methods and mesh management for parallel computation are provided by MOOSE, so RELAP-7 code developers only need to focus on physics and user experience. By using the MOOSE development environment, the RELAP-7 code is developed following the same modern software design paradigms used for other MOOSE development efforts. There are currently over 20 different MOOSE-based applications, ranging from 3-D transient neutron transport and detailed 3-D transient fuel performance analysis to long-term material aging. Multi-physics and multi-dimensional analysis capabilities can be obtained by coupling RELAP-7 with other MOOSE-based applications and by leveraging capabilities developed by other DOE programs. This allows restricting the focus of RELAP-7 to systems-analysis-type simulations and gives priority to retaining and significantly extending RELAP5's capabilities.
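
    The abstract highlights implicit time integration as one of RELAP-7's modern numerical methods. As a hedged illustration of the concept only (a toy scalar problem, entirely unrelated to RELAP-7's actual implementation), the sketch below applies backward Euler with a Newton solve to a stiff ODE, where an explicit method would need far smaller time steps to remain stable.

        import numpy as np

        def backward_euler(f, dfdy, y0, t0, t1, dt):
            """Integrate y' = f(t, y) implicitly: at each step, solve
            y_new = y + dt * f(t + dt, y_new) with Newton's method."""
            t, y = t0, y0
            while t < t1 - 1e-12:
                y_new = y  # initial Newton guess
                for _ in range(20):
                    residual = y_new - y - dt * f(t + dt, y_new)
                    step = residual / (1.0 - dt * dfdy(t + dt, y_new))
                    y_new -= step
                    if abs(step) < 1e-12:
                        break
                t, y = t + dt, y_new
            return y

        # Stiff test problem: y' = -1000 * (y - cos(t)), y(0) = 0.
        f = lambda t, y: -1000.0 * (y - np.cos(t))
        dfdy = lambda t, y: -1000.0
        print(backward_euler(f, dfdy, 0.0, 0.0, 1.0, 0.05))  # ~cos(1) = 0.540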

  20. A Bibliography of Externally Published Works by the SEI Engineering Techniques Program

    DTIC Science & Technology

    1992-08-01

    Topic areas listed include media and virtual reality, model-based engineering, programming languages, reuse, software architectures, and software engineering as a discipline. Representative entries include "Knowledge-Based Engineering Environments," IEEE Expert 3, 2 (May 1988): 18-23, 26-32 [Klein89b], and "...Terms with Software Reuse Terminology: A Model-Based Approach," ACM SIGSOFT Software Engineering Notes 16, 2 (April 1991): 45-51. Audience: Practitioner.
