Sample records for software design properly

  1. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  2. A discussion of higher order software concepts as they apply to functional requirements and specifications. [space shuttles and guidance

    NASA Technical Reports Server (NTRS)

    Hamilton, M.

    1973-01-01

The entry guidance software functional requirements (requirements design phase), its architectural requirements (specifications design phase), and the entry guidance software verified code are discussed. It was found that the proper integration of designs at both the requirements and specifications levels is of high-priority consideration.

  3. User Interface Design for Dynamic Geometry Software

    ERIC Educational Resources Information Center

    Kortenkamp, Ulrich; Dohrmann, Christian

    2010-01-01

    In this article we describe long-standing user interface issues with Dynamic Geometry Software and common approaches to address them. We describe first prototypes of multi-touch-capable DGS. We also give some hints on the educational benefits of proper user interface design.

  4. Exploratory research for the development of a computer aided software design environment with the software technology program

    NASA Technical Reports Server (NTRS)

    Hardwick, Charles

    1991-01-01

    Field studies were conducted by MCC to determine areas of research of mutual interest to MCC and JSC. NASA personnel from the Information Systems Directorate and research faculty from UHCL/RICIS visited MCC in Austin, Texas to examine tools and applications under development in the MCC Software Technology Program. MCC personnel presented workshops in hypermedia, design knowledge capture, and design recovery on site at JSC for ISD personnel. The following programs were installed on workstations in the Software Technology Lab, NASA/JSC: (1) GERM (Graphic Entity Relations Modeler); (2) gIBIS (Graphic Issues Based Information System); and (3) DESIRE (Design Recovery tool). These applications were made available to NASA for inspection and evaluation. Programs developed in the MCC Software Technology Program run on the SUN workstation. The programs do not require special configuration, but they will require larger than usual amounts of disk space and RAM to operate properly.

  5. IGDS/TRAP Interface Program (ITIP). Software User Manual (SUM). [network flow diagrams for coal gasification studies

    NASA Technical Reports Server (NTRS)

    Jefferys, S.; Johnson, W.; Lewis, R.; Rich, R.

    1981-01-01

    This specification establishes the requirements, concepts, and preliminary design for a set of software known as the IGDS/TRAP Interface Program (ITIP). This software provides the capability to develop at an Interactive Graphics Design System (IGDS) design station process flow diagrams for use by the NASA Coal Gasification Task Team. In addition, ITIP will use the Data Management and Retrieval System (DMRS) to maintain a data base from which a properly formatted input file to the Time-Line and Resources Analysis Program (TRAP) can be extracted. This set of software will reside on the PDP-11/70 and will become the primary interface between the Coal Gasification Task Team and IGDS, DMRS, and TRAP. The user manual for the computer program is presented.

  6. Designing Birefringent Filters For Solid-State Lasers

    NASA Technical Reports Server (NTRS)

    Monosmith, Bryan

    1992-01-01

    Mathematical model enables design of filter assembly of birefringent plates as integral part of resonator cavity of tunable solid-state laser. Proper design treats polarization eigenstate of entire resonator as function of wavelength. Program includes software modules for variety of optical elements including Pockels cell, laser rod, quarter- and half-wave plates, Faraday rotator, and polarizers.

  7. Study of application of space telescope science operations software for SIRTF use

    NASA Technical Reports Server (NTRS)

    Dignam, F.; Stetson, E.; Allendoerfer, W.

    1985-01-01

    The design and development of the Space Telescope Science Operations Ground System (ST SOGS) was evaluated to compile a history of lessons learned that would benefit NASA's Space Infrared Telescope Facility (SIRTF). Forty-nine specific recommendations resulted and were categorized as follows: (1) requirements: a discussion of the content, timeliness and proper allocation of the system and segment requirements and the resulting impact on SOGS development; (2) science instruments: a consideration of the impact of the Science Instrument design and data streams on SOGS software; and (3) contract phasing: an analysis of the impact of beginning the various ST program segments at different times. Approximately half of the software design and source code might be useable for SIRTF. Transportability of this software requires, at minimum, a compatible DEC VAX-based architecture and VMS operating system, system support software similar to that developed for SOGS, and continued evolution of the SIRTF operations concept and requirements such that they remain compatible with ST SOGS operation.

  8. 40 CFR 85.1902 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... POLLUTION FROM MOBILE SOURCES Emission Defect Reporting Requirements § 85.1902 Definitions. For the purposes...) which affects any parameter or specification enumerated in appendix VIII of this part; or (2) A defect..., components, systems, software or elements of design which must function properly to ensure continued...

  9. 40 CFR 85.1902 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POLLUTION FROM MOBILE SOURCES Emission Defect Reporting Requirements § 85.1902 Definitions. For the purposes...) which affects any parameter or specification enumerated in appendix VIII of this part; or (2) A defect..., components, systems, software or elements of design which must function properly to assure continued...

  10. 40 CFR 85.1902 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... POLLUTION FROM MOBILE SOURCES Emission Defect Reporting Requirements § 85.1902 Definitions. For the purposes...) which affects any parameter or specification enumerated in appendix VIII of this part; or (2) A defect..., components, systems, software or elements of design which must function properly to ensure continued...

  11. 40 CFR 85.1902 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTION FROM MOBILE SOURCES Emission Defect Reporting Requirements § 85.1902 Definitions. For the purposes...) which affects any parameter or specification enumerated in appendix VIII of this part; or (2) A defect..., components, systems, software or elements of design which must function properly to ensure continued...

  12. Sensor Suitcase Tablet Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

The Retrocommissioning Sensor Suitcase is targeted for use in small commercial buildings of less than 50,000 square feet of floor space that regularly receive basic services such as maintenance and repair, but don't have in-house energy management staff or building experts. The Suitcase is designed to be easy to use by building maintenance staff or other professionals such as telecom and alarm technicians. The software in the hand-held device is designed to guide the staff to input the building and system information, deploy the sensors in the proper locations, configure the sensor hardware, and start the data collection.

  13. Development of Integrated Modular Avionics Application Based on Simulink and XtratuM

    NASA Astrophysics Data System (ADS)

    Fons-Albert, Borja; Usach-Molina, Hector; Vila-Carbo, Joan; Crespo-Lorente, Alfons

    2013-08-01

This paper presents an integral approach for designing avionics applications that meets the requirements for software development and execution in this application domain. Software design follows the Model-Based Design process and is performed in Simulink. This approach allows easy and quick testbench development and helps satisfy DO-178B requirements through the use of proper tools. The software execution platform is based on XtratuM, a minimal bare-metal hypervisor designed in our research group. XtratuM provides support for IMA-SP (Integrated Modular Avionics for Space) architectures. This approach allows the code generated from a Simulink model to be executed on top of Lithos as a XtratuM partition. Lithos is an ARINC-653-compliant RTOS for XtratuM. The paper concentrates on how to smoothly port Simulink designs to XtratuM, solving problems such as application partitioning, automatic code generation, real-time tasking, and interfacing. This process is illustrated with an autopilot design test using a flight simulator.

  14. HiCAT Software Infrastructure: Safe hardware control with object oriented Python

    NASA Astrophysics Data System (ADS)

    Moriarty, Christopher; Brooks, Keira; Soummer, Remi

    2018-01-01

High contrast imaging for Complex Aperture Telescopes (HiCAT) is a testbed designed to demonstrate coronagraphy and wavefront control for segmented on-axis space telescopes such as envisioned for LUVOIR. To limit air movements in the testbed room, software interfaces for several different hardware components were developed to completely automate operations. When developing software interfaces for many different pieces of hardware, unhandled errors are commonplace and can prevent the software from properly closing a hardware resource. Some fragile components (e.g. deformable mirrors) can be permanently damaged because of this. We present an object-oriented Python-based infrastructure to safely automate hardware control and optical experiments, specifically, conducting high-contrast imaging experiments while monitoring humidity and power status, with graceful shutdown processes even for unexpected errors. Python contains a construct called a “context manager” that allows you to define code to run when a resource is opened or closed. Context managers ensure that a resource is properly closed, even when unhandled errors occur. Harnessing the context manager design, we also use Python’s multiprocessing library to monitor humidity and power status without interrupting the experiment. Upon detecting a safety problem, the master process sends an event to the child process that triggers the context managers to gracefully close any open resources. This infrastructure allows us to queue up several experiments and safely operate the testbed without a human in the loop.

  15. Design and Testing of Space Telemetry SCA Waveform

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Handler, Louis M.; Quinn, Todd M.

    2006-01-01

    A Software Communications Architecture (SCA) Waveform for space telemetry is being developed at the NASA Glenn Research Center (GRC). The space telemetry waveform is implemented in a laboratory testbed consisting of general purpose processors, field programmable gate arrays (FPGAs), analog-to-digital converters (ADCs), and digital-to-analog converters (DACs). The radio hardware is integrated with an SCA Core Framework and other software development tools. The waveform design is described from both the bottom-up signal processing and top-down software component perspectives. Simulations and model-based design techniques used for signal processing subsystems are presented. Testing with legacy hardware-based modems verifies proper design implementation and dynamic waveform operations. The waveform development is part of an effort by NASA to define an open architecture for space based reconfigurable transceivers. Use of the SCA as a reference has increased understanding of software defined radio architectures. However, since space requirements put a premium on size, mass, and power, the SCA may be impractical for today s space ready technology. Specific requirements for an SCA waveform and other lessons learned from this development are discussed.

  16. The Environmental Control and Life Support System (ECLSS) advanced automation project

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, Ray

    1990-01-01

The objective of the Environmental Control and Life Support System (ECLSS) Advanced Automation Project is to influence the design of the initial and evolutionary Space Station Freedom Program (SSFP) ECLSS toward a man-made closed environment in which minimal flight and ground manpower is needed. Another objective is capturing ECLSS design and development knowledge for future missions. Our approach has been to (1) analyze the SSFP ECLSS, (2) envision as our goal a fully automated evolutionary environmental control system - an augmentation of the baseline - and (3) document the advanced software systems, hooks, and scars which will be necessary to achieve this goal. From this analysis, prototype software is being developed, and will be tested using air and water recovery simulations and hardware subsystems. In addition, the advanced software is being designed, developed, and tested using an automation software management plan and lifecycle tools. Automated knowledge acquisition, engineering, verification, and testing tools are being used to develop the software. In this way, we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the knowledge-based system tool community, and ensure proper visibility of our efforts.

  17. ARC-2007-ACD07-0140-001

    NASA Image and Video Library

    2007-07-31

    David L. Iverson of NASA Ames Research center, Moffett Field, California, led development of computer software to monitor the conditions of the gyroscopes that keep the International Space Station (ISS) properly oriented in space as the ISS orbits Earth. The gyroscopes are flywheels that control the station's attitude without the use of propellant fuel. NASA computer scientists designed the new software, the Inductive Monitoring System, to detect warning signs that precede a gyroscope's failure. According to NASA officials, engineers will add the new software tool to a group of existing tools to identify and track problems related to the gyroscopes. If the software detects warning signs, it will quickly warn the space station's mission control center.

  18. Kinematic analysis of the finger exoskeleton using MATLAB/Simulink.

    PubMed

    Nasiłowski, Krzysztof; Awrejcewicz, Jan; Lewandowski, Donat

    2014-01-01

A paralyzed or not fully functional part of the human body can be supported by a properly designed exoskeleton system with motoric abilities, which can help in rehabilitation or movement of a disabled/paralyzed limb. Both the suitably selected geometry and the specialized software are studied using the MATLAB environment. A finger exoskeleton was the base for the MATLAB/Simulink model. Specialized software such as MATLAB/Simulink gives us an opportunity to optimize calculations and reach precise results, which helps in the next steps of the design process. The calculations carried out yield information regarding the movement relation between three functionally connected actuators and show the distance and velocity changes during the whole simulation time.

  19. Technological Minimalism: A Cost-Effective Alternative for Course Design and Development.

    ERIC Educational Resources Information Center

    Lorenzo, George

    2001-01-01

    Discusses the use of minimum levels of technology, or technological minimalism, for Web-based multimedia course content. Highlights include cost effectiveness; problems with video streaming, the use of XML for Web pages, and Flash and Java applets; listservs instead of proprietary software; and proper faculty training. (LRW)

  20. Real-time PCR (qPCR) primer design using free online software.

    PubMed

    Thornton, Brenda; Basu, Chhandak

    2011-01-01

    Real-time PCR (quantitative PCR or qPCR) has become the preferred method for validating results obtained from assays which measure gene expression profiles. The process uses reverse transcription polymerase chain reaction (RT-PCR), coupled with fluorescent chemistry, to measure variations in transcriptome levels between samples. The four most commonly used fluorescent chemistries are SYBR® Green dyes and TaqMan®, Molecular Beacon or Scorpion probes. SYBR® Green is very simple to use and cost efficient. As SYBR® Green dye binds to any double-stranded DNA product, its success depends greatly on proper primer design. Many types of online primer design software are available, which can be used free of charge to design desirable SYBR® Green-based qPCR primers. This laboratory exercise is intended for those who have a fundamental background in PCR. It addresses the basic fluorescent chemistries of real-time PCR, the basic rules and pitfalls of primer design, and provides a step-by-step protocol for designing SYBR® Green-based primers with free, online software. Copyright © 2010 Wiley Periodicals, Inc.
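The "basic rules and pitfalls of primer design" mentioned in this record are easy to automate. The sketch below encodes a few widely used rules of thumb (primer length, GC content, a simple melting-temperature estimate via the Wallace rule, and a 3' GC clamp); the specific thresholds are common textbook defaults, not values taken from this article, and real design software applies many more checks (dimers, hairpins, specificity).

```python
# Minimal sketch of common SYBR Green primer-design sanity checks.
# Thresholds are typical rules of thumb, assumed for illustration.

def gc_content(primer: str) -> float:
    """Fraction of G/C bases in the primer sequence."""
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)

def wallace_tm(primer: str) -> float:
    """Basic melting-temperature estimate (Wallace rule):
    Tm = 2*(A+T) + 4*(G+C), reasonable only for short oligos."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def check_primer(primer: str) -> list[str]:
    """Return a list of flagged issues; empty list means all checks pass."""
    issues = []
    if not 18 <= len(primer) <= 24:
        issues.append("length outside 18-24 nt")
    if not 0.40 <= gc_content(primer) <= 0.60:
        issues.append("GC content outside 40-60%")
    if not 55 <= wallace_tm(primer) <= 65:
        issues.append("estimated Tm outside 55-65 C")
    if primer.upper()[-1] not in "GC":
        issues.append("no 3' GC clamp")
    return issues

print(check_primer("ATGCATGCATGCATGCATGC"))  # passes all four checks
```

Checks like these are what free online tools run behind the scenes; they matter for SYBR Green in particular because, as the abstract notes, the dye binds any double-stranded product, so non-specific amplification from poor primers is indistinguishable from signal.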

  1. REVEAL: Software Documentation and Platform Migration

    NASA Technical Reports Server (NTRS)

    Wilson, Michael A.; Veibell, Victoir T.

    2011-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and continued evolution of REVEAL capabilities. For this reason the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from what is currently used in the baseline system implementation. This presentation specifically describes the actions taken over a ten week period by two undergraduate student interns and serves as an overview of the content of the final report for that internship.

  2. A new practice-driven approach to develop software in a cyber-physical system environment

    NASA Astrophysics Data System (ADS)

    Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei

    2016-02-01

Cyber-physical system (CPS) is an emerging area, which cannot work efficiently without proper software handling of the data and business logic. Software and middleware are the soul of the CPS. The software development of CPS is a critical issue because of its complexity in a large-scale realistic system. Furthermore, the object-oriented approach (OOA) often used to develop CPS software needs some improvements according to the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It comes from practice and has evolved in software companies. It consists of (A) requirement analysis in an event-oriented way, (B) architecture design in a data-oriented way, (C) detailed design and coding in an object-oriented way, and (D) testing in an event-oriented way. It is a new approach based on OOA; the difference compared with OOA is that the proposed approach has different emphases and measures in every stage, making it more in accord with the characteristics of event-driven CPS. In CPS software development, one should focus on the events more than the functions or objects. A case study of a smart home system is designed to reveal the effectiveness of the approach. It shows that the approach is also easy to operate in practice owing to some simplifications. The running result illustrates the validity of this approach.

  3. Designing and assessing fixed dental prostheses 2 multimedia-based education in dentistry students.

    PubMed

    Jahandideh, Yousef; Roohi Balasi, Leila; Vadiati Saberi, Bardia; Dadgaran, Ideh

    2016-01-01

Background: Above all methods, effective learning results from decent training, acquired in the proper environment and encouraging creative methods. Computer-assisted training by educational software is considered a fundamental measure to improve medical and dentistry education systems. This study aims to design and assess fixed dental prostheses via 2 multimedia instructional contents at the Guilan dentistry school. Methods: This is a descriptive and cross-sectional study. First, the instructional content was analyzed. The software used to produce the multimedia was the iSpring Suite Ver. 7.0. After designing the instructional multimedia, this software was loaded into the LMS. Sixty-nine dentistry students in the 5th semester at the Guilan Dentistry School were selected via convenience sampling. At the end of the course, a structured questionnaire containing 26 items was handed to the students to evaluate the instructional multimedia quality. Results: Mean ±SD age was 24.68±3.24 years; 43 were women (62.4%) and 26 were men (37.6%), and the majority (76.8%) used the internet at home. A portion of 33.3% were inclined to use multimedia and the internet along with in-person training. About 60% rated the multimedia quality as good. Conclusion: Instructional multimedia designs that are compatible with lesson objectives and audiovisual facilities can have a great effect on student satisfaction. Preparing instructional multimedia makes the instructional content easily accessible for students, who can review it several times at the proper opportunity; if presented through an LMS, they are able to study the lesson subject wherever and whenever they access the internet.

  4. Distributed software framework and continuous integration in hydroinformatics systems

    NASA Astrophysics Data System (ADS)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

When encountering multiple and complicated models, multisource structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node, and clients. Based on it, a GIS-based decision support system for joint regulation of water quantity and water quality of group lakes in Wuhan, China is established.

  5. Design, Qualification, and On Orbit Performance of the CALIPSO Aerosol Lidar Transmitter

    NASA Technical Reports Server (NTRS)

    Hovis, Floyd E.; Witt, Greg; Sullivan, Edward T.; Le, Khoa; Weimer, Carl; Applegate, Jeff; Luck, William S., Jr.; Verhapen, Ron; Cisewski, Michael S.

    2007-01-01

The laser transmitter for the CALIPSO aerosol lidar mission has been operating on orbit as planned since June 2006. This document discusses the optical and laser system design and qualification process that led to this success. Space-qualifiable laser design guidelines included the use of mature laser technologies, the use of alignment-insensitive resonator designs, the development and practice of stringent contamination control procedures, the operation of all optical components at appropriately derated levels, and proper budgeting for the space qualification of the electronics and software.

  6. The Viability of a Software Tool to Assist Students in the Review of Literature

    ERIC Educational Resources Information Center

    Anderson, Timothy R.

    2013-01-01

    Most doctoral students are novice researchers and may not possess the skills to effectively conduct a comprehensive review of the literature and frame a problem designed to conduct original research. Students need proper training and tools necessary to critically evaluate, synthesize and organize literature. The purpose of this concurrent mixed…

  7. Software System Safety and the NASA Aeronautics Blueprint

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael; Hayhurst, Kelly J.

    2002-01-01

NASA's Aeronautics Blueprint lays out a research agenda for the Agency's aeronautics program. The word software appears only four times in this Blueprint, but the critical importance of safe and correct software to the fulfillment of the proposed research is evident on almost every page. Most of the technology solutions proposed to address challenges in aviation are software-dependent technologies. Of the fifty-two specific technology solutions described in the Blueprint, forty-one depend, at least in part, on software for success. For thirty-five of these forty-one, software is critical not only to success but also to human safety. That is, implementing the technology solutions will require using software in such a way that it may, if not specified, designed, and implemented properly, lead to fatal accidents. These results have at least two implications for the research based on the Blueprint: (1) knowledge about the current state-of-the-art and state-of-the-practice in software engineering and software system safety is essential, and (2) research into current unsolved problems in these software disciplines is also essential.

  8. Design and Analysis of Tooth Impact Test Rig for Spur Gear

    NASA Astrophysics Data System (ADS)

    Ghazali, Wafiuddin Bin Md; Aziz, Ismail Ali Bin Abdul; Daing Idris, Daing Mohamad Nafiz Bin; Ismail, Nurazima Binti; Sofian, Azizul Helmi Bin

    2016-02-01

This paper is about the design and analysis of a prototype tooth impact test rig for spur gears. The test rig was fabricated and an analysis was conducted to study its limitations and capabilities. The design of the rig was analysed to ensure that no problems occur during testing and that reliable data can be obtained. From the results of the analysis, the maximum load that can be applied, the factor of safety of the machine, and the stresses on the test rig parts were determined. This is important in the design consideration of the test rig. The materials used for the fabrication of the test rig were also discussed and analysed. MSC Nastran/Patran software was used to analyse the model, which was designed using SolidWorks 2014 software. Based on the results, limitations were found in the initial design, and the test rig design needs to be improved in order for the test rig to operate properly.

  9. Designing and assessing fixed dental prostheses 2 multimedia-based education in dentistry students

    PubMed Central

    Jahandideh, Yousef; Roohi Balasi, Leila; Vadiati Saberi, Bardia; Dadgaran, Ideh

    2016-01-01

Background: Above all methods, effective learning results from decent training, acquired in the proper environment and encouraging creative methods. Computer-assisted training by educational software is considered a fundamental measure to improve medical and dentistry education systems. This study aims to design and assess fixed dental prostheses via 2 multimedia instructional contents at the Guilan dentistry school. Methods: This is a descriptive and cross-sectional study. First, the instructional content was analyzed. The software used to produce the multimedia was the iSpring Suite Ver. 7.0. After designing the instructional multimedia, this software was loaded into the LMS. Sixty-nine dentistry students in the 5th semester at the Guilan Dentistry School were selected via convenience sampling. At the end of the course, a structured questionnaire containing 26 items was handed to the students to evaluate the instructional multimedia quality. Results: Mean ±SD age was 24.68±3.24 years; 43 were women (62.4%) and 26 were men (37.6%), and the majority (76.8%) used the internet at home. A portion of 33.3% were inclined to use multimedia and the internet along with in-person training. About 60% rated the multimedia quality as good. Conclusion: Instructional multimedia designs that are compatible with lesson objectives and audiovisual facilities can have a great effect on student satisfaction. Preparing instructional multimedia makes the instructional content easily accessible for students, who can review it several times at the proper opportunity; if presented through an LMS, they are able to study the lesson subject wherever and whenever they access the internet. PMID:28491830

  10. Evaluation Results of an Ontology-based Design Model of Virtual Environments for Upper Limb Motor Rehabilitation of Stroke Patients.

    PubMed

    Ramírez-Fernández, Cristina; Morán, Alberto L; García-Canseco, Eloísa; Gómez-Montalvo, Jorge R

    2017-03-23

    1) To enhance the content of an ontology for designing virtual environments (VEs) for upper limb motor rehabilitation of stroke patients according to the suggestions and comments of rehabilitation specialists and software developers, 2) to characterize the perceived importance level of the ontology, 3) to determine the perceived usefulness of the ontology, and 4) to identify the safety characteristics of the ontology for VEs design according to the rehabilitation specialists. Using two semi-structured Web questionnaires, we asked six rehabilitation specialists and six software developers to provide us with their perception regarding the level of importance and the usability of the ontology. From their responses we have identified themes related to perceived and required safety characteristics of the ontology. Significant differences in the importance level were obtained for the Stroke Disability, VE Configuration, Outcome Measures, and Safety Calibration classes, which were perceived as highly important by rehabilitation specialists. Regarding usability, the ontology was perceived by both groups with high usefulness, ease of use, learnability and intention of use. Concerning the thematic analysis of recommendations, eight topics for safety characteristics of the ontology were identified: adjustment of therapy strategies; selection and delimitation of movements; selection and proper calibration of the interaction device; proper selection of measuring instruments; gradual modification of the difficulty of the exercise; adaptability and variability of therapy exercises; feedback according to the capabilities of the patient; and real-time support for exercise training. The rehabilitation specialists and software developers confirmed the importance of the information contained in the ontology regarding motor rehabilitation of the upper limb. 
Their recommendations highlight the safety features and the advantages of the ontology as a guide for the effective design of VEs.

  11. Instructor/Operator Station Design Study.

    DTIC Science & Technology

    1982-04-01

components interact and are dependent one upon the other. A major issue in any design involving both hardware and software is establishing the proper...always begins at leg 1. The AUTOMATED TRAINING EXERCISE MAP display calls up a map of the gaming area for the selected exercise. The PRINTOUT...select the size of the gaming area in nautical miles. When the Aircraft comes within 100 miles of an in-tune station, the approach display for the in

  12. REVEAL: Software Documentation and Platform Migration

    NASA Technical Reports Server (NTRS)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and the continued evolution of REVEAL capabilities. For this reason, the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from those currently used in the baseline system implementation. This report specifically describes the actions taken over a ten-week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include: the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validated the efforts.

  13. CASE tools and UML: state of the ART.

    PubMed

    Agarwal, S

    2001-05-01

    With the increasing need for automated tools to assist complex systems development, software design methods are becoming popular. This article analyzes the state of the art in computer-aided software engineering (CASE) tools and the Unified Modeling Language (UML), focusing on their evolution, merits, and industry usage. It identifies managerial issues for the tools' adoption and recommends an action plan to select and implement them. While CASE and UML offer inherent advantages such as cheaper, shorter, and more efficient development cycles, they suffer from poor user satisfaction. The critical success factors for their implementation include, among others, management and staff commitment, proper corporate infrastructure, and user training.

  14. Assurance Policy Evaluation - Spacecraft and Strategic Systems

    DTIC Science & Technology

    2014-09-17

    electromechanical (EEE) parts, software, design and workmanship, work instructions, manufacturing and tooling, cleanrooms, electrostatic discharge ...T9001B.  An external group, called the Evaluation and Assessment Team, made up of product assurance subject matter experts from NSWC Corona performs...NSWC, Corona and SSP Technical Branch(es). The FTPE, performed every 3 years, is an objective evaluation of facility performance to assure proper

  15. Optical burst switching based satellite backbone network

    NASA Astrophysics Data System (ADS)

    Li, Tingting; Guo, Hongxiang; Wang, Cen; Wu, Jian

    2018-02-01

    We propose a novel time-slot-based optical burst switching (OBS) architecture for a GEO/LEO-based satellite backbone network. This architecture can provide high data transmission rates and high switching capacity. Furthermore, we design the control plane of this optical satellite backbone network, introducing software-defined networking (SDN) and network slicing (NS) technologies. Under the properly designed control mechanism, this backbone network is flexible enough to support various services with diverse transmission requirements. LEO access and handoff management in this network are also discussed.

  16. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Programs in use today generally have all of the function and information processing capabilities required to do their specified job. However, older programs usually use obsolete technology, are not integrated properly with other programs, and are difficult to maintain. Reengineering is becoming a prominent discipline as organizations try to move their systems to more modern and maintainable technologies. The Johnson Space Center (JSC) Software Technology Branch (STB) is researching and developing a system to support reengineering older FORTRAN programs into more maintainable forms that can also be more readily translated to modern languages such as FORTRAN 8x, Ada, or C. This activity has led to the development of maintenance strategies for design recovery and reengineering. These strategies include a set of standards, methodologies, and the concepts for a software environment to support design recovery and reengineering. A brief description of the problem being addressed and the approach that is being taken by the STB toward providing an economic solution to the problem is provided. A statement of the maintenance problems, the benefits and drawbacks of three alternative solutions, and a brief history of the STB experience in software reengineering are followed by the STB new FORTRAN standards, methodology, and the concepts for a software environment.

  17. Technical Note: Computer-Manufactured Inserts for Prosthetic Sockets

    PubMed Central

    Sanders, Joan E.; McLean, Jake B.; Cagle, John C.; Gardner, David W.; Allyn, Katheryn J.

    2016-01-01

    The objective of this research was to use computer-aided design software and a tabletop 3-D additive manufacturing system to design and fabricate custom plastic inserts for trans-tibial prosthesis users. Shape quality of inserts was tested right after they were inserted into participants' test sockets and again after four weeks of wear. Inserts remained properly positioned and intact throughout testing. Right after insertion the inserts caused the socket to be slightly under-sized, by a mean of 0.11 mm, approximately 55% of the thickness of a nylon sheath. After four weeks of wear the under-sizing was less, averaging 0.03 mm, approximately 15% of the thickness of a nylon sheath. Thus the inserts settled into the sockets over time. If existing prosthetic design software packages were enhanced to conduct insert design and to automatically generate fabrication files for manufacturing, then computer-manufactured inserts may offer advantages over traditional methods in terms of speed of fabrication, ease of design, modification, and record keeping. PMID:27212209

  18. Design of low noise imaging system

    NASA Astrophysics Data System (ADS)

    Hu, Bo; Chen, Xiaolai

    2017-10-01

    In order to meet the needs of engineering applications for a low-noise imaging system operating in global-shutter mode, a complete imaging system is designed based on the SCMOS (Scientific CMOS) image sensor CIS2521F. The paper introduces the hardware circuit and software system design. Based on an analysis of the key indexes and technologies of the imaging system, the paper selects chips and adopts SCMOS + FPGA + DDRII + Camera Link as the processing architecture. It then introduces the entire system workflow and the power supply and distribution unit design. The software system, which consists of the SCMOS control module, image acquisition module, data cache control module, and transmission control module, is implemented in Verilog and driven to work properly on a Xilinx FPGA. Imaging experiments show that the system provides a 2560 × 2160 pixel resolution at a maximum frame rate of 50 fps. The imaging quality of the system satisfies the requirements of the index.

  19. Software Dependability and Safety Evaluations ESA's Initiative

    NASA Astrophysics Data System (ADS)

    Hernek, M.

    ESA has allocated funds for an initiative to evaluate dependability and safety methods for software. The objectives of this initiative are: more extensive validation of safety and dependability techniques for software; and provision of valuable results to improve software quality, thus promoting the application of dependability and safety methods and techniques. ESA space systems are developed according to defined PA requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy, diversity, etc., varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions, and systems is determined, and by doing so the critical subsystems are identified, on which dependability and safety techniques are to be applied during development. Proper performance of software development requires a technical specification for the products at the beginning of the life cycle. Such a technical specification comprises both functional and non-functional requirements; the non-functional requirements address characteristics of the product such as quality, dependability, safety, and maintainability. Software in space systems is increasingly used in critical functions, and the trend towards more frequent use of COTS and reusable components poses new difficulties in terms of assuring reliable and safe systems. Because of this, software dependability and safety must be carefully analysed. ESA has identified and documented techniques, methods, and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system, and to verify/validate that the implemented software systems comply with these requirements [R1].

  20. Which Tibial Tray Design Achieves Maximum Coverage and Ideal Rotation: Anatomic, Symmetric, or Asymmetric? An MRI-based study.

    PubMed

    Stulberg, S David; Goyal, Nitin

    2015-10-01

    Two goals of tibial tray placement in TKA are to maximize coverage and establish proper rotation. Our purpose was to utilize MRI information obtained as part of PSI planning to determine the impact of tibial tray design on the relationship between coverage and rotation. MR images for 100 consecutive knees were uploaded into PSI software. Preoperative planning software was used to evaluate 3 different tray designs: anatomic, symmetric, and asymmetric. Approximately equally good coverage was achieved with all three trays. However, the anatomic compared to symmetric/asymmetric trays required less malrotation (0.3° vs 3.0/2.4°; P < 0.001), with a higher proportion of cases within 5° of neutral (97% vs 73/77%; P < 0.001). In this study, the anatomic tibia optimized the relationship between coverage and rotation. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. FAME, a microprocessor based front-end analysis and modeling environment

    NASA Technical Reports Server (NTRS)

    Rosenbaum, J. D.; Kutin, E. B.

    1980-01-01

    Higher order software (HOS) is a methodology for the specification and verification of large scale, complex, real time systems. The HOS methodology was implemented as FAME (front end analysis and modeling environment), a microprocessor based system for interactively developing, analyzing, and displaying system models in a low cost user-friendly environment. The nature of the model is such that when completed it can be the basis for projection to a variety of forms such as structured design diagrams, Petri-nets, data flow diagrams, and PSL/PSA source code. The user's interface with the analyzer is easily recognized by any current user of a structured modeling approach; therefore extensive training is unnecessary. Furthermore, when all the system capabilities are used one can check on proper usage of data types, functions, and control structures thereby adding a new dimension to the design process that will lead to better and more easily verified software designs.

  2. Energy performance evaluation of AAC

    NASA Astrophysics Data System (ADS)

    Aybek, Hulya

    The U.S. building industry constitutes the largest consumer of energy (i.e., electricity, natural gas, petroleum) in the world. The building sector uses almost 41 percent of the primary energy and approximately 72 percent of the available electricity in the United States. As global energy-generating resources are being depleted at exponential rates, the amount of energy consumed and wasted cannot be ignored. Professionals concerned about the environment have placed a high priority on finding solutions that reduce energy consumption while maintaining occupant comfort. Sustainable design and the judicious combination of building materials comprise one solution to this problem. A future including sustainable energy may result from using energy simulation software to accurately estimate energy consumption and from applying building materials that achieve the potential results derived through simulation analysis. Energy-modeling tools assist professionals with making informed decisions about energy performance during the early planning phases of a design project, such as determining the most advantageous combination of building materials, choosing mechanical systems, and determining building orientation on the site. By implementing energy simulation software to estimate the effect of these factors on the energy consumption of a building, designers can make adjustments to their designs during the design phase when the effect on cost is minimal. The primary objective of this research consisted of identifying a method with which to properly select energy-efficient building materials and involved evaluating the potential of these materials to earn LEED credits when properly applied to a structure. In addition, this objective included establishing a framework that provides suggestions for improvements to currently available simulation software that enhance the viability of the estimates concerning energy efficiency and the achievements of LEED credits. 
The primary objective was accomplished by conducting several simulation models to determine the relative energy efficiency of wood-framed, metal-framed, and Autoclaved Aerated Concrete (AAC) wall structures for both commercial and residential buildings.

  3. On public space design for Chinese urban residential area based on integrated architectural physics environment evaluation

    NASA Astrophysics Data System (ADS)

    Dong, J. Y.; Cheng, W.; Ma, C. P.; Tan, Y. T.; Xin, L. S.

    2017-04-01

    Residential public space is an important part of ecological residential design, and a proper physical environment in public spaces is of great significance to urban residences in China. Applying computer-aided design software to residential design can effectively prevent mismatches between design intent and actual conditions of use, as well as negative impacts on users caused by a poor architectural physics environment. The paper adopts a design method based on analyzing the architectural physics environment of residential public space. By analyzing and evaluating the various physical environments, a suitability assessment is obtained for residential public space, thereby guiding the space design.

  4. Automated Testing Experience of the Linear Aerospike SR-71 Experiment (LASRE) Controller

    NASA Technical Reports Server (NTRS)

    Larson, Richard R.

    1999-01-01

    System controllers must be fail-safe, low cost, flexible to software changes, able to output health and status words, and permit rapid retest qualification. The system controller designed and tested for the aerospike engine program was an attempt to meet these requirements. This paper describes (1) the aerospike controller design, (2) the automated simulation testing techniques, and (3) the real time monitoring data visualization structure. Controller cost was minimized by design of a single-string system that used an off-the-shelf 486 central processing unit (CPU). A linked-list architecture, with states (nodes) defined in a user-friendly state table, accomplished software changes to the controller. Proven to be fail-safe, this system reported the abort cause and automatically reverted to a safe condition for any first failure. A real time simulation and test system automated the software checkout and retest requirements. A program requirement to decode all abort causes in real time during all ground and flight tests assured the safety of flight decisions and the proper execution of mission rules. The design also included health and status words, and provided a real time analysis interpretation for all health and status data.
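    The linked-list, state-table architecture described above can be sketched in miniature. This is an illustrative pattern of my own, not the LASRE flight code; all names (State, Controller, "IGNITE", "RUN", "SAFE") are invented for the example, which shows the fail-safe behavior of recording the abort cause and reverting to a safe condition on any first failure.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

@dataclass
class State:
    """One node of the state table; `next_state` links to the following node."""
    name: str
    action: Callable[[], bool]   # returns True on success, False on failure
    next_state: Optional[str] = None

class Controller:
    """Walks the linked states; any first failure reverts to the safe state."""
    def __init__(self, safe_state: str):
        self.states: Dict[str, State] = {}
        self.safe_state = safe_state
        self.abort_cause: Optional[str] = None

    def add(self, state: State) -> None:
        self.states[state.name] = state

    def run(self, start: str) -> str:
        name = start
        while True:
            state = self.states[name]
            if not state.action():
                # Fail-safe: report the abort cause, revert to safe condition.
                self.abort_cause = f"failure in state '{state.name}'"
                return self.safe_state
            if state.next_state is None:
                return name          # sequence completed normally
            name = state.next_state

ctrl = Controller(safe_state="SAFE")
ctrl.add(State("IGNITE", action=lambda: True, next_state="RUN"))
ctrl.add(State("RUN", action=lambda: False))      # simulated first failure
assert ctrl.run("IGNITE") == "SAFE"
assert ctrl.abort_cause == "failure in state 'RUN'"
```

    Keeping the states in a plain table like this is what makes software changes cheap: a new sequence is a data edit, not a control-flow rewrite.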

  5. 49 CFR Appendix A to Part 238 - Schedule of Civil Penalties 1

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... movement from Class I or IA brake test 5,000 7,500 (c) Improper movement of en route defect 2,500 5,000 (2...) Failure to include required design features 5,000 7,500 (e) Failure to comply with hardware and software... properly test previously used equipment 7,500 11,000 (b)(1) Failure to develop plan 7,500 11,000 (b)(2...

  6. Students' different understandings of class diagrams

    NASA Astrophysics Data System (ADS)

    Boustedt, Jonas

    2012-03-01

    The software industry needs well-trained software designers, and one important aspect of software design is the ability to model software designs visually and understand what visual models represent. However, previous research indicates that software design is a difficult task for many students. This article reports empirical findings from a phenomenographic investigation of how students understand class diagrams, Unified Modeling Language (UML) symbols, and relations to object-oriented (OO) concepts. The informants were 20 Computer Science students from four different universities in Sweden. The results show qualitatively different ways to understand and describe UML class diagrams and the "diamond symbols" representing aggregation and composition. The purpose of class diagrams was understood in varied ways, from describing them as documentation to a more advanced view related to communication. Descriptions of class diagrams varied from seeing them as a specification of classes to a more advanced view in which they were described as showing hierarchic structures of classes and relations. The diamond symbols were seen as "relations"; a more advanced understanding distinguished the white and the black diamonds as different symbols for aggregation and composition. As a consequence of the results, it is recommended that UML be adopted in courses. It is briefly indicated how the phenomenographic results, in combination with variation theory, can be used by teachers to enhance students' possibilities of reaching an advanced understanding of phenomena related to UML class diagrams. Moreover, it is recommended that teachers put more effort into assessing skills in proper usage of the basic symbols and models, and that students be provided with opportunities to practise collaborative design, e.g. using whiteboards.
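    The aggregation/composition distinction the diamond symbols denote can be made concrete in code. The classes below are invented for illustration (they do not appear in the article): composition (filled diamond) binds the part's lifetime to the whole, while aggregation (hollow diamond) means the whole merely references parts that exist independently.

```python
class Engine:
    """Part held by composition: each Car builds and owns its Engine."""
    def __init__(self, power_kw):
        self.power_kw = power_kw

class Car:
    def __init__(self, power_kw):
        # Composition: the Engine is created inside the Car and no other
        # object holds a reference to it, so it is destroyed with the Car.
        self.engine = Engine(power_kw)

class Driver:
    def __init__(self, name):
        self.name = name

class CarPool:
    def __init__(self):
        # Aggregation: the pool only references Drivers created elsewhere,
        # which outlive the pool itself.
        self.drivers = []

    def register(self, driver):
        self.drivers.append(driver)

car = Car(power_kw=90)
alice = Driver("Alice")
pool = CarPool()
pool.register(alice)
del pool                        # the aggregate is gone...
assert alice.name == "Alice"    # ...but the independent part survives
assert car.engine.power_kw == 90
```

    In a whiteboard exercise of the kind the article recommends, students could be asked to draw the two diamond kinds for exactly this pair of relationships.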

  7. Evolution of a modular software network

    PubMed Central

    Fortuna, Miguel A.; Bonachela, Juan A.; Levin, Simon A.

    2011-01-01

    “Evolution behaves like a tinkerer” (François Jacob, Science, 1977). Software systems provide a singular opportunity to understand biological processes using concepts from network theory. The Debian GNU/Linux operating system allows us to explore the evolution of a complex network in a unique way. The modular design detected during its growth is based on the reuse of existing code in order to minimize costs during programming. The increase of modularity experienced by the system over time has not counterbalanced the increase in incompatibilities between software packages within modules. This negative effect is far from being a failure of design. A random process of package installation shows that the higher the modularity, the larger the fraction of packages working properly in a local computer. The decrease in the relative number of conflicts between packages from different modules avoids a failure in the functionality of one package spreading throughout the entire system. Some potential analogies with the evolutionary and ecological processes determining the structure of ecological networks of interacting species are discussed. PMID:22106260
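    The containment argument above (conflicts confined within modules keep a failure from spreading system-wide) can be illustrated with a toy simulation. This is my own construction under simplified assumptions, not the paper's Debian data: packages depend on a few other packages in their own module, conflict pairs are placed either inside one module or across two, and a package stops working when it conflicts with, or depends on, a broken package.

```python
import random

def working_fraction(n_modules=10, module_size=20, deps_per_pkg=2,
                     n_conflicts=5, within=True, trials=100, seed=0):
    """Mean fraction of packages still working after conflicts propagate."""
    rng = random.Random(seed)
    n = n_modules * module_size
    module = [i // module_size for i in range(n)]
    total = 0.0
    for _ in range(trials):
        # Each package depends on a few other packages in its own module.
        deps = {p: rng.sample([q for q in range(n)
                               if module[q] == module[p] and q != p],
                              deps_per_pkg)
                for p in range(n)}
        # Place conflict pairs either inside one module or across two.
        broken = set()
        for _ in range(n_conflicts):
            a = rng.randrange(n)
            pool = [q for q in range(n)
                    if q != a and (module[q] == module[a]) == within]
            broken.update((a, rng.choice(pool)))
        # A package also breaks if anything it depends on is broken.
        changed = True
        while changed:
            changed = False
            for p in range(n):
                if p not in broken and any(d in broken for d in deps[p]):
                    broken.add(p)
                    changed = True
        total += (n - len(broken)) / n
    return total / trials

modular = working_fraction(within=True)    # conflicts contained in modules
spread = working_fraction(within=False)    # conflicts cross module borders
assert modular > spread
```

    Because dependencies stay inside modules, a within-module conflict damages overlapping parts of one module, whereas a cross-module conflict seeds failures in two separate dependency clusters, which is the qualitative effect the abstract describes.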

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gharibyan, N.

    In order to fully characterize the NIF neutron spectrum, the SAND-II-SNL software was requested and received from the Radiation Safety Information Computational Center. The software is designed to determine the neutron energy spectrum through analysis of experimental activation data. However, given that the source code was developed on a SPARCstation 10, it is not compatible with current versions of FORTRAN. Accounts have been established through Lawrence Livermore National Laboratory's High Performance Computing in order to access different compilers for FORTRAN (e.g. pgf77, pgf90). Additionally, several of the subroutines included in the SAND-II-SNL package have required debugging efforts to allow for proper compiling of the code.

  9. The Development of a Dynamic Geomagnetic Cutoff Rigidity Model for the International Space Station

    NASA Technical Reports Server (NTRS)

    Smart, D. F.; Shea, M. A.

    1999-01-01

    We have developed a computer model of geomagnetic vertical cutoffs applicable to the orbit of the International Space Station. This model accounts for the change in geomagnetic cutoff rigidity as a function of geomagnetic activity level. This model was delivered to NASA Johnson Space Center in July 1999 and tested on the Space Radiation Analysis Group DEC-Alpha computer system to ensure that it will properly interface with other software currently used at NASA JSC. The software was designed for ease of being upgraded as other improved models of geomagnetic cutoff as a function of magnetic activity are developed.

  10. Managing the Software Development Process

    NASA Technical Reports Server (NTRS)

    Lubelczky, Jeffrey T.; Parra, Amy

    1999-01-01

    The goal of any software development project is to produce a product that is delivered on time, within the allocated budget, and with the capabilities expected by the customer; unfortunately, this goal is rarely achieved. However, a properly managed project in a mature software engineering environment can consistently achieve it. In this paper we provide an introduction to three project success factors: a properly managed project, a competent project manager, and a mature software engineering environment. We also present an overview of the benefits of a mature software engineering environment based on 24 years of data from the Software Engineering Lab, and suggest some first steps that an organization can take to begin benefiting from this environment. Since the depth and breadth of software engineering exceed the scope of this paper, various references are cited with the goal of raising awareness and encouraging further investigation into software engineering and project management practices.

  11. Application of Real Options Theory to DoD Software Acquisitions

    DTIC Science & Technology

    2009-02-20

    Future Combat Systems Program. Washington, DC: U.S. Government Printing Office. Damodaran, A. (2007). Investment Valuation: The Options To Expand... valuation methodology, when enhanced and properly formulated around a proposed or existing software investment employing the spiral development approach... The traditional real options valuation methodology, when enhanced and properly formulated

  12. JETSPIN: A specific-purpose open-source software for simulations of nanofiber electrospinning

    NASA Astrophysics Data System (ADS)

    Lauricella, Marco; Pontrelli, Giuseppe; Coluzza, Ivan; Pisignano, Dario; Succi, Sauro

    2015-12-01

    We present the open-source computer program JETSPIN, specifically designed to simulate the electrospinning process of nanofibers. Its capabilities are shown with proper reference to the underlying model, as well as a description of the relevant input variables and associated test-case simulations. The various interactions included in the electrospinning model implemented in JETSPIN are discussed in detail. The code is designed to exploit different computational architectures, from single to parallel processor workstations. This paper provides an overview of JETSPIN, focusing primarily on its structure, parallel implementations, functionality, performance, and availability.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reisman, D.J.

    A variety of issues must be addressed in development of software for information resources. One is accessibility and use of information. Another is that to properly design, abstract, index, and do quality control on a database requires the effort of well-trained and knowledgeable personnel as well as substantial financial resources. Transferring data to other locations has inherent difficulties, including those related to incompatibility. The main issue in developing health risk assessment databases is the needs of the user.

  14. Conceptual IT model

    NASA Astrophysics Data System (ADS)

    Arnaoudova, Kristina; Stanchev, Peter

    2015-11-01

    Business processes are the key asset of every organization, and the design of business process models is a foremost concern among an organization's functions. Business processes and their proper management depend heavily on the performance of software applications and technology solutions. This paper attempts to define a new conceptual model of an IT service provider; it can be examined as an IT-focused enterprise model, part of the Enterprise Architecture (EA) school.

  15. Software solutions manage the definition, operation, maintenance and configuration control of the National Ignition Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dobson, D; Churby, A; Krieger, E

    2011-07-25

    The National Ignition Facility (NIF) is the world's largest laser, composed of millions of individual parts brought together to form one massive assembly. Maintaining control of the physical definition, status, and configuration of this structure is a monumental undertaking, yet critical to the validity of the shot experiment data and the safe operation of the facility. The NIF business application suite of software provides the means to effectively manage the definition, build, operation, maintenance, and configuration control of all components of the National Ignition Facility. State-of-the-art computer-aided design software applications are used to generate a virtual model and assemblies. Engineering bills of material are controlled through the Enterprise Configuration Management System. This data structure is passed to the Enterprise Resource Planning system to create a manufacturing bill of material. Specific parts are serialized and then tracked along their entire lifecycle, providing visibility into the location and status of optical, target, and diagnostic components that are key to assessing pre-shot machine readiness. Nearly forty thousand items requiring preventive, reactive, and calibration maintenance are tracked through the System Maintenance & Reliability Tracking application to ensure proper operation. Radiological tracking applications ensure proper stewardship of radiological and hazardous materials and help provide a safe working environment for NIF personnel.

  16. Optimizing RF gun cavity geometry within an automated injector design system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofler, Alicia; Evtushenko, Pavel

    2011-03-28

    RF guns play an integral role in the success of several light sources around the world, and properly designed and optimized CW superconducting RF (SRF) guns can provide a path to higher average brightness. As the need for these guns grows, it is important to have automated optimization software tools that vary the geometry of the gun cavity as part of the injector design process. This will allow designers to improve existing designs for present installations, extend the utility of these guns to other applications, and develop new designs. An evolutionary algorithm (EA) based system can provide this capability because EAs can search in parallel a large parameter space (often non-linear) and, in a relatively short time, identify promising regions of the space for more careful consideration. The injector designer can then evaluate more cavity design parameters during the injector optimization process against the beam performance requirements of the injector. This paper describes an extension to the APISA software that allows the cavity geometry to be modified as part of the injector optimization, and provides examples of its application to existing RF and SRF gun designs.
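    The kind of search loop an EA-based optimizer runs can be sketched generically. The code below is an illustrative real-valued evolutionary loop of my own, not the APISA implementation, and the quadratic "figure of merit" stands in for what would really be an expensive field-solver evaluation of a candidate cavity geometry.

```python
import random

def evolve(fitness, bounds, pop_size=30, generations=60,
           mutation=0.1, seed=42):
    """Minimize `fitness` over a box-bounded parameter vector with a
    simple elitist (parents + offspring) evolutionary loop."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Each parent produces one Gaussian-perturbed child, clipped to bounds.
        children = [[min(max(x + rng.gauss(0, mutation * (hi - lo)), lo), hi)
                     for x, (lo, hi) in zip(parent, bounds)]
                    for parent in pop]
        # Truncation selection over parents and children keeps the best found.
        pop = sorted(pop + children, key=fitness)[:pop_size]
    return pop[0]

# Toy figure of merit: distance of (iris radius, wall angle) -- hypothetical
# geometry parameters -- from a fictitious optimum at (3.5, 7.0).
best = evolve(lambda g: (g[0] - 3.5) ** 2 + (g[1] - 7.0) ** 2,
              bounds=[(0.0, 10.0), (0.0, 10.0)])
assert abs(best[0] - 3.5) < 0.5 and abs(best[1] - 7.0) < 0.5
```

    The parallelism the abstract mentions enters naturally here: each generation's fitness evaluations are independent and can be farmed out to separate solver runs.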

  17. Realization of 2:1 MUX using Mach Zhender Interferometer structure and its application in selection of output signal of MOEMS pressure and temperature sensor

    NASA Astrophysics Data System (ADS)

    Jindal, Sumit Kumar; Raghuwanshi, Sanjeev Kumar

    2016-03-01

    In this paper we initially design a circular-diaphragm-based MOEMS pressure sensor and a thermistor-based temperature sensor. This is done with the help of an externally modulated LiNbO3 Mach-Zehnder Interferometer (MZI), which senses the input voltage signal and modulates it to give an output in the form of light intensity. This output is then calibrated to establish the proper relation between the applied input and the measured output. The next aspect is the use of the MZI as a 2:1 MUX whose two input lines are the pressure signal and the temperature signal. The MZI arrangement is then configured so that, based on the requirement, it selects the proper input signal and sends it to the output port for measurement. The design has been simulated in the Opti-BPM software.

  18. Technical note: Computer-manufactured inserts for prosthetic sockets.

    PubMed

    Sanders, Joan E; McLean, Jake B; Cagle, John C; Gardner, David W; Allyn, Katheryn J

    2016-08-01

    The objective of this research was to use computer-aided design software and a tabletop 3-D additive manufacturing system to design and fabricate custom plastic inserts for trans-tibial prosthesis users. Shape quality of inserts was tested right after they were inserted into participants' test sockets and again after four weeks of wear. Inserts remained properly positioned and intact throughout testing. Right after insertion the inserts caused the socket to be slightly under-sized, by a mean of 0.11 mm, approximately 55% of the thickness of a nylon sheath. After four weeks of wear the under-sizing was less, averaging 0.03 mm, approximately 15% of the thickness of a nylon sheath. Thus the inserts settled into the sockets over time. If existing prosthetic design software packages were enhanced to conduct insert design and to automatically generate fabrication files for manufacturing, then computer-manufactured inserts may offer advantages over traditional methods in terms of speed of fabrication, ease of design, modification, and record keeping. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.

  19. RNAiFold 2.0: a web server and software to design custom and Rfam-based RNA molecules.

    PubMed

    Garcia-Martin, Juan Antonio; Dotu, Ivan; Clote, Peter

    2015-07-01

    Several algorithms for RNA inverse folding have been used to design synthetic riboswitches, ribozymes and thermoswitches, whose activity has been experimentally validated. The RNAiFold software is unique among approaches for inverse folding in that (exhaustive) constraint programming is used instead of heuristic methods. For that reason, RNAiFold can generate all sequences that fold into the target structure or determine that there is no solution. RNAiFold 2.0 is a complete overhaul of RNAiFold 1.0, rewritten from the now defunct COMET language to C++. The new code properly extends the capabilities of its predecessor by providing a user-friendly pipeline to design synthetic constructs having the functionality of given Rfam families. In addition, the new software supports amino acid constraints, even for proteins translated in different reading frames from overlapping coding sequences; moreover, structure compatibility/incompatibility constraints have been expanded. With these features, RNAiFold 2.0 allows the user to design single RNA molecules as well as hybridization complexes of two RNA molecules. The web server, source code and Linux binaries are publicly accessible at http://bioinformatics.bc.edu/clotelab/RNAiFold2.0. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
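    The exhaustive flavor of inverse folding — enumerate candidate sequences and keep exactly those satisfying the structural constraints — can be suggested with a toy sketch. This is a drastic simplification of my own, not RNAiFold's constraint programming: it only checks that positions paired in a dot-bracket target carry Watson-Crick or wobble pairs, rather than verifying the full minimum-free-energy structure.

```python
from itertools import product

# Watson-Crick pairs plus the G-U wobble pair.
PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
         ("G", "U"), ("U", "G")}

def pair_table(structure):
    """Return (i, j) index pairs for matched brackets in dot-bracket notation."""
    stack, pairs = [], []
    for i, ch in enumerate(structure):
        if ch == "(":
            stack.append(i)
        elif ch == ")":
            pairs.append((stack.pop(), i))
    return pairs

def compatible_sequences(structure):
    """Exhaustively enumerate all sequences whose paired positions are
    complementary under the toy pairing rule above."""
    pairs = pair_table(structure)
    return ["".join(seq)
            for seq in product("ACGU", repeat=len(structure))
            if all((seq[i], seq[j]) in PAIRS for i, j in pairs)]

hits = compatible_sequences("((..))")
# 2 paired columns with 6 allowed pairs each, 2 free columns with 4 bases each
assert len(hits) == 6 * 6 * 4 * 4
assert "GGAACC" in hits
```

    A real inverse folder prunes this exponential space with constraint propagation and verifies folding with an energy model, which is what makes the exhaustive approach tractable in practice.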

  20. Optimisation of Design of Air Inlets in Air Distribution Channels of a Double-Skin Transparent Façade

    NASA Astrophysics Data System (ADS)

    Bielek, Boris; Szabó, Daniel; Palko, Milan; Rychtáriková, Monika

    2017-12-01

    This paper reports on an optimization of design of air inlets in naturally ventilated double-skin transparent facades; the design aims at the proper functioning of these facades from the point of view of their aerodynamic and hydrodynamic behaviour. A comparison was made of five different variants of ventilation louvers used in air openings with different shapes, positions and overall geometry. The aerodynamic response of the louvers was determined by 2D simulations using ANSYS software. The hydrodynamic properties were investigated by conducting driven-rain measurements in a large rain chamber at the Slovak University of Technology in Bratislava.

  1. SMS crew station (C and D panels and forward structures). CEI part 1: Detail specification, type 1 data

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Established are the requirements for performance, design, test and qualification of one type of equipment identified as SMS C&D panels and forward structures. This CEI is used to provide all hardware and wiring necessary for the C&D panels to be properly interfaced with the computer complex/signal conversion equipment (SCE), crew station, and software requirements as defined in other CEI specifications.

  2. Health software: a new CEI Guide for software management in medical environment.

    PubMed

    Giacomozzi, Claudia; Martelli, Francesco

    2016-01-01

    The increasing spread of software components in the healthcare context renders explanatory guides relevant and mandatory to interpret laws and standards, and to support safe management of software products in healthcare. In 2012 a working group was established for the above purposes at the Italian Electrotechnical Committee (CEI), composed of experts from the Italian National Institute of Health (ISS) and representatives of industry and of healthcare organizations. As a first outcome of the group's activity, Guide CEI 62-237 was published in February 2015. The Guide incorporates an innovative approach based on the proper contextualization of software products, whether medical devices or not, to the specific healthcare scenario, and addresses the risk management of IT systems. The Guide provides operators and manufacturers with interpretative support and many detailed examples to facilitate the proper contextualization and management of health software, in compliance with related European and international regulations and standards.

  3. IPMP Global Fit - A one-step direct data analysis tool for predictive microbiology.

    PubMed

    Huang, Lihan

    2017-12-04

    The objective of this work is to develop and validate a unified optimization algorithm for performing one-step global regression analysis of isothermal growth and survival curves for determination of kinetic parameters in predictive microbiology. The algorithm is incorporated with user-friendly graphical user interfaces (GUIs) to develop a data analysis tool, the USDA IPMP-Global Fit. The GUIs are designed to guide users to easily navigate through the data analysis process and properly select the initial parameters for different combinations of mathematical models. The software is developed for one-step kinetic analysis to directly construct tertiary models by minimizing the global error between the experimental observations and mathematical models. The current version of the software is specifically designed for constructing tertiary models with time and temperature as the independent variables. The software was tested with a total of 9 different combinations of primary and secondary models for growth and survival of various microorganisms. The results of data analysis show that this software provides accurate estimates of kinetic parameters. In addition, it can be used to improve the experimental design and data collection for more accurate estimation of kinetic parameters. IPMP-Global Fit can be used in combination with the regular USDA-IPMP for solving the inverse problems and developing tertiary models in predictive microbiology. Published by Elsevier B.V.
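
    The one-step global fit described above can be sketched in a few lines: all isothermal curves are fitted simultaneously by pooling the residuals of a combined primary/secondary model into a single regression. The models and values below are simplified illustrations (a linear-phase primary model with a Ratkowsky square-root secondary model), not the actual IPMP-Global Fit implementation.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    # Simplified one-step global fit (illustration only): a linear-phase
    # primary model log10 N(t) = logN0 + mu(T) * t with the Ratkowsky
    # square-root secondary model sqrt(mu) = b * (T - T0). All three
    # parameters are estimated against every isothermal curve at once.

    rng = np.random.default_rng(0)
    temps = [10.0, 20.0, 30.0]                 # storage temperatures, deg C
    t = np.linspace(0.0, 10.0, 8)              # sampling times, h

    def mu(T, b, T0):
        return (b * (T - T0)) ** 2             # growth rate from secondary model

    true_logN0, true_b, true_T0 = 3.0, 0.04, 2.0
    data = {T: true_logN0 + mu(T, true_b, true_T0) * t
               + rng.normal(0.0, 0.05, t.size) for T in temps}

    def residuals(p):
        logN0, b, T0 = p
        return np.concatenate([data[T] - (logN0 + mu(T, b, T0) * t)
                               for T in temps])

    fit = least_squares(residuals, x0=[2.0, 0.05, 0.0])
    logN0, b, T0 = fit.x
    print(round(logN0, 2), round(b, 3), round(T0, 1))
    ```

    Because every curve constrains the same three parameters, this one-step approach avoids the error accumulation of the traditional two-step (fit each curve, then fit the rates) procedure.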

  4. A study of universal modulation techniques applied to satellite data collection

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A universal modulation and frequency control system for use with data collection platform (DCP) transmitters is examined. The final design discussed can, under software/firmware control, generate all of the specific digital data modulation formats currently used in the NASA satellite data collection service and can simultaneously synthesize the proper RF carrier frequencies employed. A novel technique for DCP time and frequency control is presented. The emissions of NBS radio station WWV/WWVH are received, detected, and finally decoded in microcomputer software to generate a highly accurate time base for the platform; with the assistance of external hardware, the microcomputer also directs the recalibration of all DCP oscillators to achieve very high frequency accuracies and low drift rates versus temperature, supply voltage, and time. The final programmable DCP design also employs direct microcomputer control of data reduction, formatting, transmitter switching, and system power management.

  5. Magnetostriction measurement by four probe method

    NASA Astrophysics Data System (ADS)

    Dange, S. N.; Radha, S.

    2018-04-01

    The present paper describes the design and setting up of an indigenously developed magnetostriction (MS) measurement setup using the four-probe method at room temperature. A standard strain gauge is pasted with a special glue on the sample, and its change in resistance with applied magnetic field is measured using a Keithley nanovoltmeter and current source. An electromagnet with a field of up to 1.2 tesla is used to source the magnetic field. The sample is placed between the magnet poles using a self-designed and developed wooden probe stand, capable of moving in three mutually perpendicular directions. The nanovoltmeter and current source are interfaced with a PC using an RS232 serial interface. Software has been developed for logging and processing of data. Proper optimization of the measurement has been done through software to reduce the noise due to thermal emf and electromagnetic induction. The data acquired for some standard magnetic samples are presented. The sensitivity of the setup is 1 microstrain, with an error in measurement of up to 5%.
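
    The arithmetic behind a strain-gauge measurement like this one is compact: strain is the fractional resistance change divided by the gauge factor. A minimal sketch, assuming a typical foil-gauge factor of about 2.0 (the value is an assumption, not taken from the paper):

    ```python
    # Strain-gauge conversion: strain = (dR / R0) / GF, where GF is the
    # gauge factor (~2.0 for common foil gauges; assumed here).

    def microstrain(R0_ohm, dR_ohm, gauge_factor=2.0):
        """Return strain in microstrain (ppm) for a resistance change dR."""
        return (dR_ohm / R0_ohm) / gauge_factor * 1e6

    # A 120-ohm gauge changing by 0.00048 ohm corresponds to 2 microstrain,
    # on the same scale as the setup's quoted 1-microstrain sensitivity:
    print(round(microstrain(120.0, 0.00048), 3))   # -> 2.0
    ```

    At 1 microstrain the resistance change of a 120-ohm gauge is only ~0.24 milliohm, which is why a nanovoltmeter and thermal-emf suppression matter in this setup.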

  6. Optimum concentric circular array antenna with high gain and side lobe reduction at 5.8 GHz

    NASA Astrophysics Data System (ADS)

    Zaid, Mohammed; Rafiqul Islam, Md; Habaebi, Mohamed H.; Zahirul Alam, AHM; Abdullah, Khaizuran

    2017-11-01

    The significance of high-gain directional antennas stems from the need to cope with constantly progressing wireless communication systems. Given the low gain of the widely used microstrip antenna, combining multiple antennas in a proper geometry increases the gain with good directivity. Over other array forms, this paper uses the concentric circular array configuration for its compact structure and inherent symmetry in azimuth. The proposed array is composed of 9 elements on an FR-4 substrate and is designed for WLAN applications at 5.8 GHz. Antenna Magus software is used for synthesis, while CST software is used for optimization. The proposed array is designed with optimum inter-element spacing and number of elements, achieving a high directional gain of 15.7 dB compared to 14.2 dB in the available literature, with a high reduction in side lobe level to -17.6 dB.
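
    To illustrate why the concentric circular geometry gives a clean broadside beam, the array factor can be evaluated directly. The sketch below uses an assumed layout (one centre element plus an 8-element ring at half-wavelength radius, uniform excitation), not the paper's optimized design:

    ```python
    import numpy as np

    # Array factor of a small concentric circular array at 5.8 GHz:
    # centre element + 8-element ring (assumed half-wavelength radius).

    c = 3.0e8                                   # speed of light, m/s
    lam = c / 5.8e9                             # wavelength at 5.8 GHz
    k = 2 * np.pi / lam                         # wavenumber
    ring_r = 0.5 * lam                          # assumed ring radius
    elem_phi = np.arange(8) * 2 * np.pi / 8     # element azimuths on the ring

    theta = np.linspace(-np.pi / 2, np.pi / 2, 721)   # elevation cut, phi = 0
    af = np.ones_like(theta, dtype=complex)           # centre element
    for pn in elem_phi:
        af += np.exp(1j * k * ring_r * np.sin(theta) * np.cos(pn))

    af_db = 20 * np.log10(np.abs(af) / np.abs(af).max())
    print(f"broadside level: {af_db[360]:.1f} dB")    # pattern peak at theta = 0
    ```

    With all 9 elements excited in phase, the fields add coherently only at broadside; the paper's optimization of ring radii and element count then trades main-lobe gain against the side lobe level.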

  7. Designing a SCADA system simulator for fast breeder reactor

    NASA Astrophysics Data System (ADS)

    Nugraha, E.; Abdullah, A. G.; Hakim, D. L.

    2016-04-01

    SCADA (Supervisory Control and Data Acquisition) system simulator is a Human Machine Interface-based software that is able to visualize the process of a plant. This study describes the results of designing a SCADA system simulator that aims to facilitate the operator in monitoring, controlling, handling alarms, and accessing historical data and historical trends in a Nuclear Power Plant (NPP) of the Fast Breeder Reactor (FBR) type. This research simulated the FBR-type NPP at Kalpakkam in India. The simulator was developed using Wonderware InTouch 10 software and is equipped with a main menu, plant overview, area graphics, control display, set point display, alarm system, real-time trending, historical trending and a security system. The simulator properly simulates the principle of energy flow and the energy conversion process in an FBR-type NPP. This SCADA system simulator can be used as a training medium for prospective operators of FBR-type NPPs.

  8. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate ease in the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem. This indicates that a closer look at the output files is needed to determine the cause of the error.
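
    The verification step described here, rerunning sample problems and comparing the new outputs against stored reference outputs, can be sketched as follows. This is a hypothetical helper, not the actual MCNP install scripts; the directory layout and the `.out` suffix are assumptions:

    ```python
    import filecmp
    import pathlib

    # Hypothetical installation-verification sketch: every reference output
    # in ref_dir is compared byte-for-byte against the freshly generated
    # file of the same name in new_dir.

    def verify_outputs(new_dir, ref_dir):
        """Return names of reference outputs that differ or are missing."""
        mismatches = []
        for ref in sorted(pathlib.Path(ref_dir).glob("*.out")):
            new = pathlib.Path(new_dir) / ref.name
            if not new.exists() or not filecmp.cmp(new, ref, shallow=False):
                mismatches.append(ref.name)
        return mismatches   # an empty list means the comparison passed
    ```

    As the record notes, a reported difference is only a flag for closer inspection: benign differences such as timestamps or run statistics embedded in the output will also fail a byte-for-byte comparison.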

  9. Analyzing and designing object-oriented missile simulations with concurrency

    NASA Astrophysics Data System (ADS)

    Randorf, Jeffrey Allen

    2000-11-01

    A software object model for the six degree-of-freedom missile modeling domain is presented. As a precursor, a domain analysis of the missile modeling domain was started, based on the Feature-Oriented Domain Analysis (FODA) technique described by the Software Engineering Institute (SEI). It was subsequently determined that the FODA methodology is functionally equivalent to the Object Modeling Technique. The analysis used legacy software documentation and code from the ENDOSIM, KDEC, and TFrames 6-DOF modeling tools, as well as other technical literature. The SEI Object Connection Architecture (OCA) was the template for designing the object model. Three variants of the OCA were considered: a reference structure, a recursive structure, and a reference structure with augmentation for flight vehicle modeling. The reference OCA design option was chosen for maintaining simplicity while not compromising the expressive power of the OMT model. The missile architecture was then analyzed for potential areas of concurrent computing. It was shown how protected objects could be used for data passing between OCA object managers, allowing concurrent access without changing the OCA reference design intent or structure. The implementation language was the 1995 release of Ada. It was shown how OCA software components can be expressed as Ada child packages. While acceleration of several low-level and higher-level operations is possible on proper hardware, there was a 33% degradation of 4th-order Runge-Kutta integrator performance for two simultaneous ordinary differential equations using Ada tasking on a single-processor machine. The Defense Department's High Level Architecture was introduced and explained in context with the OCA. It was shown that the HLA and OCA are not mutually exclusive architectures, but complementary. HLA was shown as an interoperability solution, with the OCA as an architectural vehicle for software reuse. 
Further directions for implementing a 6-DOF missile modeling environment are discussed.
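
    The benchmark mentioned above, a 4th-order Runge-Kutta integration of two simultaneous ODEs, is a classic workload. A minimal sequential sketch (in Python rather than the study's Ada tasking implementation) looks like this:

    ```python
    # Minimal 4th-order Runge-Kutta step for a system of first-order ODEs,
    # the kind of workload whose tasking overhead the study measured.

    def rk4_step(f, t, y, h):
        k1 = f(t, y)
        k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
        k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
        k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
        return [yi + h / 6 * (a + 2 * b + 2 * c + d)
                for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

    # Example: the harmonic oscillator y'' = -y as two first-order ODEs.
    def oscillator(t, y):
        return [y[1], -y[0]]

    y, t, h = [1.0, 0.0], 0.0, 0.01
    for _ in range(628):                  # integrate to t = 6.28, ~one period
        y = rk4_step(oscillator, t, y, h)
        t += h
    print(round(y[0], 3))                 # cos(6.28), very close to 1.0
    ```

    The observed 33% tasking penalty is plausible for such a workload: each RK4 step chains four short dependent stage evaluations, so on a single processor the task synchronization cost cannot be hidden behind useful parallel work.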

  10. Jet Noise Reduction

    NASA Technical Reports Server (NTRS)

    Kenny, Patrick

    2004-01-01

    The Acoustics Branch is responsible for reducing noise levels for jet and fan components on aircraft engines. To do this, data must be measured and calibrated accurately to ensure validity of test results. This noise reduction is accomplished by modifications to hardware such as jet nozzles, and by the use of other experimental hardware such as fluidic chevrons, elliptic cores, and fluidic shields. To ensure validity of data calibration, a variety of software is used; part of the work described here was verifying that these programs work properly. This software adjusts the sound amplitude and frequency to be consistent with data taken on another day. These software programs were designed to make corrections for atmosphere, shear, attenuation, electronics, and background noise. All data can be converted to a one-foot lossless condition, using the proper software corrections, making a reading independent of weather and distance. Also, data can be transformed from model scale to full scale for noise predictions of a real flight. Other programs included calculations of Overall Sound Pressure Level (OASPL) and Effective Perceived Noise Level (EPNL). OASPL is the integration of sound with respect to frequency, and EPNL is weighted for a human's response to different sound frequencies and integrated with respect to time. With the proper software corrections, data taken in the NATR are useful in determining ways to reduce noise. Another program was written to display any difference between two or more data files; using this program and graphs of the data, the actual and predicted data can be compared. This software was tested on data collected at the Aero-Acoustic Propulsion Laboratory (AAPL) using a variety of window types and overlaps. Similarly, short scripts were written to test each individual program in the software suite for verification. Each graph displays both the original points and the adjusted points connected with lines. 
During this summer, data points were taken during a live experiment at the AAPL to measure Nozzle Acoustic Test Rig (NATR) background noise levels. Six condenser microphones were placed in strategic locations around the dome and the inlet tunnel to measure different noise sources. From the control room the jet was monitored with the help of video cameras and other sensors. The data points were recorded, reduced, and plotted, and will be used to plan future modifications to the NATR. The primary goal to create data reduction test programs and provide verification was completed. As a result of the internship, I learned C/C++, UNIX/LINUX, Excel, and acoustic data processing methods. I also recorded data at the AAPL, then processed and plotted it. These data would be useful to compare against existing data. In addition, I adjusted software to work on the Mac OSX platform. And I used the available training resources.
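
    The OASPL computation described above, integrating (in practice, energy-summing) sound pressure level over frequency bands, reduces to a one-line formula. A small sketch, with the band levels as made-up example inputs:

    ```python
    import math

    # Overall Sound Pressure Level is the energy sum of per-band levels:
    #   OASPL = 10 * log10( sum_i 10^(SPL_i / 10) )

    def oaspl(band_spl_db):
        return 10.0 * math.log10(sum(10.0 ** (l / 10.0) for l in band_spl_db))

    # Two equal 90 dB bands sum to ~93 dB (a doubling of acoustic energy):
    print(round(oaspl([90.0, 90.0]), 2))   # -> 93.01
    ```

    EPNL follows the same energy-summation idea but first applies frequency weighting for human hearing response and then integrates over the duration of the flyover event.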

  11. 45 CFR 153.350 - Risk adjustment data validation standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... implementation of any risk adjustment software and ensure proper validation of a statistically valid sample of... respect to implementation of risk adjustment software or as a result of data validation conducted pursuant... implementation of risk adjustment software or data validation. ...

  12. Parallel Domain Decomposition Formulation and Software for Large-Scale Sparse Symmetrical/Unsymmetrical Aeroacoustic Applications

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Watson, Willie R. (Technical Monitor)

    2005-01-01

    The overall objectives of this research work are to formulate and validate efficient parallel algorithms, and to efficiently design and implement computer software, for solving large-scale acoustic problems arising from the unified frameworks of finite element procedures. The adopted parallel Finite Element (FE) Domain Decomposition (DD) procedures should take full advantage of the multiple processing capabilities offered by most modern high performance computing platforms for efficient parallel computation. To achieve this objective, the formulation needs to integrate efficient sparse (and dense) assembly techniques, hybrid (or mixed) direct and iterative equation solvers, proper preconditioning strategies, unrolling strategies, and effective interprocessor communication schemes. Finally, the numerical performance of the developed parallel finite element procedures will be evaluated by solving a series of structural and acoustic (symmetrical and unsymmetrical) problems on different computing platforms. Comparisons with existing "commercialized" and/or "public domain" software are also included, whenever possible.

  13. The ISO SWS on-line system

    NASA Technical Reports Server (NTRS)

    Roelfsema, P. R.; Kester, D. J. M.; Wesselius, P. R.; Wieprech, E.; Sym, N.

    1992-01-01

    The software which is currently being developed for the Short Wavelength Spectrometer (SWS) of the Infrared Space Observatory (ISO) is described. The spectrometer has a wide range of capabilities in the 2-45 micron infrared band. SWS contains two independent gratings, one for the long and one for the short wavelength section of the band. With the gratings a spectral resolution of approximately 1000 to approximately 2500 can be obtained. The instrument also contains two Fabry-Pérot interferometers yielding a resolution between approximately 1000 and approximately 20000. Software is currently being developed for the acquisition, calibration, and analysis of SWS data. First, the software is required to run in a pipeline mode without human interaction, to process data as they are received from the telescope. Second, for testing and calibration of the instrument, as well as for evaluation of the planned operating procedures, the software should also be suitable for interactive use. Third, the same software will be used for long term characterization of the instrument. The software must work properly within the environment designed by the European Space Agency (ESA) for the spacecraft operations. As a result, strict constraints are put on I/O devices, throughput, etc.

  14. Software development without languages

    NASA Technical Reports Server (NTRS)

    Osborne, Haywood S.

    1988-01-01

    Automatic programming generally involves the construction of a formal specification; i.e., one which allows unambiguous interpretation by tools for the subsequent production of the corresponding software. Previous practical efforts in this direction have focused on the serious problems of: (1) designing the optimum specification language; and (2) mapping (translating or compiling) from this specification language to the program itself. The approach proposed bypasses the above problems. It postulates that the specification proper should be an intermediate form, with the sole function of containing information sufficient to facilitate construction of programs and also of matching documentation. Thus, the means of forming the intermediary becomes a human factors task rather than a linguistic one; human users will read documents generated from the specification, rather than the specification itself.

  15. Investigation of sludge re-circulating clarifiers design and optimization through numerical simulation.

    PubMed

    Davari, S; Lichayee, M J

    2003-01-01

    In steam thermal power plants (TPP) with open re-circulating wet cooling towers, elimination of water hardness and suspended solids (SS) is performed in clarifiers. Most of these clarifiers are of the high efficiency sludge re-circulating type (SRC), with capacities of 500-1,500 m3/hr. Improper design and/or mal-operation of clarifiers in TPPs results in working conditions below design capacity or production of soft water with improper quality (hardness and SS). This causes accumulation of deposits in heat exchangers, condenser tubes, cooling and service water pipes and boiler tubes, as well as increasing the ionic load of water at the demineralizing system inlet. It also increases chemical consumption and produces more liquid and solid waste. In this regard, a software program for optimal design and simulation of SRCs has been developed. Design parameters of existing SRCs in four TPPs in Iran were then used as inputs to the developed software program, and the resulting technical specifications were compared with the existing ones. In some cases improper design was the main cause of poor outlet water quality. In order to achieve proper efficiency, further investigations were made to obtain control parameters as well as design parameters for both mal-designed and/or mal-operated SRCs.

  16. Use of rapid prototyping drill template for the expansive open door laminoplasty: A cadaveric study.

    PubMed

    Rong, Xin; Wang, Beiyu; Chen, Hua; Ding, Chen; Deng, Yuxiao; Ma, Lipeng; Ma, Yanzhao; Liu, Hao

    2016-11-01

    Trough preparation is a technically demanding yet critical procedure for successful expansive open door laminoplasty (EOLP), requiring both proper position and appropriate bone removal. We aimed to use a specific rapid prototyping drill template to meet these requirements. The 3D model of the cadaveric cervical spine was reconstructed using the Mimics 17.0 and Geomagic Studio 12.0 software. The drilling template was designed in the 3-Matic software. The trough position was simulated at the medial margin of the facet joint. Two holders were designed on both sides. On the open side, the holder would just allow the drill to penetrate the ventral cortex of the lamina. On the hinge side, the holder was designed to keep the ventral cortex of the lamina intact. One orthopedic resident performed the surgery using the rapid prototyping drill template on four cadavers (template group). A control group of four cadavers was operated upon without the use of the template. The deviation of the final trough position from the simulated trough position was 0.18 mm ± 0.51 mm in the template group. All the troughs in the template group and 40% of the troughs in the control group were at the medial side of the facet joint. The complete hinge fracture rate was 5% in the template group, significantly lower than that (55%) in the control group (P=0.01). The rapid prototyping drill template could help the surgeon accomplish proper trough position and appropriate bone removal in EOLP on the cadaveric cervical spine. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. ADA (Trade Name) Software Engineering Education and Training Symposium (2nd) Held in Dallas, Texas on 9-11 June 1987.

    DTIC Science & Technology

    1987-06-11

    illustrative examples throughout. The book seems adequate for a beginner's class if the instructor complements the book with his or her own material. Ada: An...point, boolean, character, and enumeration) are taught, but proper declaration of types and subtypes is fully covered. Flowcharts are used to design the...placed on accurately following the stated requirements and sample run. Normally, students have one week to complete each project. A flowchart showing the

  18. Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report

    NASA Technical Reports Server (NTRS)

    Ossenfort, John

    2008-01-01

    As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system: essentially, an evaluation of how observable the system behavior is using the available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system is available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). 
The TEAMS toolset is intended to be a solution to span all phases of the system, from design and development through health management and maintenance. TEAMS-Designer is the model-building and testability analysis software in that suite.

  19. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion systems mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  20. The development of mathematics courseware for learning line and angle

    NASA Astrophysics Data System (ADS)

    Halim, Noor Dayana Abd; Han, Ong Boon; Abdullah, Zaleha; Yusup, Junaidah

    2015-05-01

    Learning software is a teaching aid which is often used in schools to increase students' motivation, attract students' attention and also improve the quality of the teaching and learning process. However, the development of learning software should follow the phases of an Instructional Design (ID) model, so that the process can be carried out systematically and in an orderly manner. Thus, this concept paper describes the application of the ADDIE model in the development of a mathematics learning courseware for learning Line and Angle, named CBL-Math. The ADDIE model consists of five consecutive phases: Analysis, Design, Development, Implementation and Evaluation. Each phase must be properly planned in order to achieve the stated objectives. Other than describing the processes occurring in each phase, this paper also demonstrates how the principles of the cognitive theory of multimedia learning are integrated in the developed courseware. The principles applied in the courseware reduce students' cognitive load while learning the topic of line and angle. With a well-prepared development process and the integration of appropriate principles, it is expected that the developed software can help students learn effectively and also increase students' achievement in the topic of Line and Angle.

  1. Resource Allocation Planning Helper (RALPH): Lessons learned

    NASA Technical Reports Server (NTRS)

    Durham, Ralph; Reilly, Norman B.; Springer, Joe B.

    1990-01-01

    The current task of the Resource Allocation Process includes the planning and apportionment of JPL's Ground Data System, composed of the Deep Space Network and Mission Control and Computing Center facilities. The addition of the data-driven, rule-based planning system, RALPH, has expanded the planning horizon from 8 weeks to 10 years and has resulted in large labor savings. Use of the system has also resulted in important improvements in science return through enhanced resource utilization. In addition, RALPH has been instrumental in supporting rapid turnaround for an increased volume of special what-if studies. The status of RALPH is briefly reviewed, with a focus on important lessons learned from the creation of a highly functional design team: through an evolutionary design and implementation period in which an AI shell was selected, prototyped, and ultimately abandoned, and through fundamental changes to the very process that spawned the tool kit. Principal topics include proper integration of software tools within the planning environment, transition from prototype to delivered software, changes in the planning methodology as a result of evolving software capabilities, and creation of the ability to develop and process generic requirements to allow planning flexibility.

  2. Automation of the Environmental Control and Life Support System

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, J. Ray

    1990-01-01

    The objective of the Environmental Control and Life Support System (ECLSS) Advanced Automation Project is to recommend and develop advanced software for the initial and evolutionary Space Station Freedom (SSF) ECLS system which will minimize the crew and ground manpower needed for operations. Another objective includes capturing ECLSS design and development knowledge for future missions. This report summarizes our results from Phase I, the ECLSS domain analysis phase, which we broke down into three steps: (1) analyze and document the baselined ECLS system; (2) envision as our goal an evolution to a fully automated regenerative life support system, built upon an augmented baseline; and (3) document the augmentations (hooks and scars) and advanced software systems which we see as necessary in achieving minimal manpower support for ECLSS operations. In addition, Phase I included development of an advanced software life cycle; automated knowledge acquisition, engineering, verification, and testing tools will be used in the development of the software, in preparation for Phases II and III, the development and integration phases, respectively. In this way we can capture ECLSS development knowledge for future use, develop more robust and complex software, provide feedback to the KBS tool community, and ensure proper visibility of our efforts.

  3. Distributed agile software development for the SKA

    NASA Astrophysics Data System (ADS)

    Wicenec, Andreas; Parsons, Rebecca; Kitaeff, Slava; Vinsen, Kevin; Wu, Chen; Nelson, Paul; Reed, David

    2012-09-01

    The SKA software will most probably be developed by many groups distributed across the globe and coming from different backgrounds, like industries and research institutions. The SKA software subsystems will have to cover a very wide range of different areas, but still they have to react and work together like a single system to achieve the scientific goals and satisfy the challenging data flow requirements. Designing and developing such a system in a distributed fashion requires proper tools and the setup of an environment to allow for efficient detection and tracking of interface and integration issues, in particular in a timely way. Agile development can provide much faster feedback mechanisms and also much tighter collaboration between the customer (scientist) and the developer. Continuous integration and continuous deployment, on the other hand, can provide much faster feedback of integration issues from the system level to the subsystem developers. This paper describes the results obtained from trialing a potential SKA development environment based on existing science software development processes like ALMA, the expected distribution of the groups potentially involved in the SKA development, and experience gained in the development of large scale commercial software projects.

  4. Implementing Kanban for agile process management within the ALMA Software Operations Group

    NASA Astrophysics Data System (ADS)

    Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge

    2014-07-01

    After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives to: (1) providing software support to tasks related to System Integration, Scientific Commissioning and Verification, as well as Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Due to their different stakeholders, these tasks vary widely in importance, lifespan and complexity. Aiming to provide the proper priority and traceability for every task without stressing our engineers, we introduced the Kanban methodology into our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we have found, and the solutions and adaptations that led us to our current but still evolving implementation, which has greatly improved our throughput, prioritization and problem traceability.
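
    The balancing of demand against throughput described above is Kanban's work-in-progress (WIP) limit: a task is pulled into a column only while that column is under its limit. A minimal sketch of the mechanism (column names and limits here are hypothetical, not taken from the ALMA group's actual board):

```python
class KanbanBoard:
    """Minimal WIP-limited board: a task may be pulled into a column
    only while that column is under its work-in-progress limit."""

    def __init__(self, wip_limits):
        self.wip_limits = wip_limits                      # column -> max tasks
        self.columns = {name: [] for name in wip_limits}  # column -> tasks

    def pull(self, task, column):
        """Try to pull a task into a column; refuse if the WIP limit is hit."""
        if len(self.columns[column]) >= self.wip_limits[column]:
            return False  # demand exceeds throughput: task waits upstream
        self.columns[column].append(task)
        return True

    def finish(self, task, column):
        """Completing a task frees capacity, letting the next task be pulled."""
        self.columns[column].remove(task)
```

Refusing the pull, rather than queueing the task inside the column, is what keeps the team's load bounded and makes the bottleneck visible.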

  5. Model driven development of clinical information systems using openEHR.

    PubMed

    Atalag, Koray; Yang, Hong Yul; Tempero, Ewan; Warren, Jim

    2011-01-01

    openEHR and the recent international standard (ISO 13606) defined a model driven software development methodology for health information systems. However there is little evidence in the literature describing implementation; especially for desktop clinical applications. This paper presents an implementation pathway using .Net/C# technology for Microsoft Windows desktop platforms. An endoscopy reporting application driven by openEHR Archetypes and Templates has been developed. A set of novel GUI directives has been defined and presented which guides the automatic graphical user interface generator to render widgets properly. We also reveal the development steps and important design decisions; from modelling to the final software product. This might provide guidance for other developers and form evidence required for the adoption of these standards for vendors and national programs alike.
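
    The GUI directives mentioned above steer an automatic user-interface generator: each archetype node gets a default widget from its data type, which a directive can override. A minimal sketch of the idea, assuming a hypothetical directive scheme keyed by archetype node path (the type names follow openEHR data types, but the mapping itself is illustrative, not the paper's actual directive set):

```python
# Default widget chosen from the archetype node's openEHR data type;
# a GUI directive attached to the node path overrides the default.
DEFAULT_WIDGETS = {
    "DV_TEXT": "text_box",
    "DV_CODED_TEXT": "drop_down",
    "DV_QUANTITY": "numeric_field",
    "DV_BOOLEAN": "check_box",
}

def render_widget(node_type, directives, node_path):
    """Pick the widget for one archetype node (hypothetical directive scheme)."""
    if node_path in directives:          # explicit GUI directive wins
        return directives[node_path]
    return DEFAULT_WIDGETS.get(node_type, "text_box")
```

This keeps clinical knowledge in the archetype and presentation hints in a thin, separate layer, which is the separation the openEHR methodology aims for.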

  6. The Need and Keys for a New Generation Network Adjustment Software

    NASA Astrophysics Data System (ADS)

    Colomina, I.; Blázquez, M.; Navarro, J. A.; Sastre, J.

    2012-07-01

    Orientation and calibration of photogrammetric and remote sensing instruments is a fundamental capacity of current mapping systems and a fundamental research topic. Neither digital remote sensing acquisition systems nor direct orientation gear, like INS and GNSS technologies, made block adjustment obsolete. On the contrary, the continuous flow of new primary data acquisition systems has challenged the capacity of the legacy block adjustment systems - in general network adjustment systems - in many aspects: extensibility, genericity, portability, large data sets capacity, metadata support and many others. In this article, we concentrate on the extensibility and genericity challenges that current and future network systems shall face. For this purpose we propose a number of software design strategies with emphasis on rigorous abstract modeling that help in achieving simplicity, genericity and extensibility together with the protection of intellectual property rights in a flexible manner. We illustrate our suggestions with the general design approach of GENA, the generic extensible network adjustment system of GeoNumerics.

  7. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2012-01-01

    NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  8. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2013-01-01

    NASA's Space Communication and Navigation(SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  9. Presenting an Evaluation Model for the Cancer Registry Software.

    PubMed

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer is increasingly growing, the cancer registry is of great importance as the main core of cancer control programs, and much different software has been designed for this purpose. Therefore, establishing a comprehensive evaluation model is essential to evaluate and compare a wide range of such software. In this study, the criteria of cancer registry software were determined by studying the documentation and two functional software products in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of validation, an agreement coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for the cancer registry software was presented. The evaluation model of this study contains a tool and a method of evaluation. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. The evaluation method of this study was chosen as a criteria-based evaluation method based on the findings. The model of this study encompasses the various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between the general criteria and the specific ones, while trying to fulfill the comprehensiveness of the criteria. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.
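
    The validation step, applying a 75% agreement threshold to the expert questionnaire responses, can be sketched as follows (criterion names and vote data are hypothetical, not taken from the study):

```python
def percent_agreement(responses):
    """Percentage of experts approving a criterion (responses are booleans)."""
    return 100.0 * sum(responses) / len(responses)

def validate_checklist(expert_votes, threshold=75.0):
    """Split checklist criteria into 'approved' and 'needs revision'
    according to the agreement threshold."""
    approved, revise = [], []
    for criterion, votes in expert_votes.items():
        target = approved if percent_agreement(votes) >= threshold else revise
        target.append(criterion)
    return approved, revise
```

Criteria falling below the threshold are revised and re-submitted to the expert panel, which is the iteration the study describes before the final model was approved.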

  10. Structural Analysis Using Computer Based Methods

    NASA Technical Reports Server (NTRS)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were presented to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  11. Water supply pipe dimensioning using hydraulic power dissipation

    NASA Astrophysics Data System (ADS)

    Sreemathy, J. R.; Rashmi, G.; Suribabu, C. R.

    2017-07-01

    Proper sizing of the pipe components of a water distribution network plays an important role in the overall design of any water supply system. Several approaches have been applied to design networks from an economical point of view. Traditional optimization techniques and population-based stochastic algorithms are widely used to optimize networks, but the use of these approaches is mostly limited to the research level due to difficulties in understanding by practicing engineers, design engineers and consulting firms. Moreover, the non-availability of commercial software for the optimal design of water distribution systems forces practicing engineers to adopt either trial-and-error or experience-based design. This paper presents a simple approach based on power dissipation in each pipeline as a parameter to design the network economically, though not to the level of the global minimum cost.
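
    The abstract does not give the paper's exact formulation, but the hydraulic power dissipated in a pipe is commonly written as P = ρ g Q h_f, with the friction head loss h_f here taken from the Darcy-Weisbach equation. A sketch under that assumption (all pipe data and the friction factor below are hypothetical):

```python
import math

RHO, G = 1000.0, 9.81  # water density (kg/m^3) and gravity (m/s^2)

def head_loss_darcy(q, d, length, f=0.02):
    """Darcy-Weisbach friction head loss h_f = f (L/D) V^2 / (2g),
    with mean velocity V = Q/A for a circular pipe of diameter D."""
    area = math.pi * d ** 2 / 4.0
    v = q / area
    return f * (length / d) * v ** 2 / (2.0 * G)

def power_dissipated(q, d, length, f=0.02):
    """Hydraulic power dissipated in the pipe, P = rho * g * Q * h_f (watts)."""
    return RHO * G * q * head_loss_darcy(q, d, length, f)
```

Since h_f scales roughly with D^-5 at fixed flow, enlarging a pipe sharply cuts its dissipation while raising its cost; ranking pipes by dissipated power is what lets the method target diameter increases where they buy the most.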

  12. Software selection based on analysis and forecasting methods, practised in 1C

    NASA Astrophysics Data System (ADS)

    Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.

    2015-09-01

    The research focuses on the built-in data analysis and forecasting mechanisms of the “1C: Enterprise 8” platform. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as the implementation and further maintenance of software. The research data allow creating new forecast models to schedule further software distribution.

  13. Management Aspects of Software Maintenance.

    DTIC Science & Technology

    1984-09-01

    educated in the complex nature of software maintenance to be able to properly evaluate and manage the software maintenance effort. In this... maintenance and improvement may be called "software evolution". The software manager must be educated in the complex nature of software maintenance to be... complaint of error or request for modification is also studied in order to determine what action needs to be taken. 2. Define Objective and Approach:

  14. Application of Real Options Theory to DoD Software Acquisitions

    DTIC Science & Technology

    2009-08-01

    The traditional real options valuation methodology, when enhanced and properly formulated around a proposed or existing software investment... The traditional real options valuation... founder and CEO of Real Options Valuation, Inc., a consulting, training, and software development firm specializing in strategic real options

  15. The new meaning of quality in the information age.

    PubMed

    Prahalad, C K; Krishnan, M S

    1999-01-01

    Software applications are now a mission-critical source of competitive advantage for most companies. They are also a source of great risk, as the Y2K bug has made clear. Yet many line managers still haven't confronted software issues--partly because they aren't sure how best to define the quality of the applications in their IT infrastructures. Some companies such as Wal-Mart and the Gap have successfully integrated the software in their networks, but most have accumulated an unwieldy number of incompatible applications--all designed to perform the same tasks. The authors provide a framework for measuring the performance of software in a company's IT portfolio. Quality traditionally has been measured according to a product's ability to meet certain specifications; other views of quality have emerged that measure a product's adaptability to customers' needs and a product's ability to encourage innovation. To judge software quality properly, argue the authors, managers must measure applications against all three approaches. Understanding the domain of a software application is an important part of that process. The domain is the body of knowledge about a user's needs and expectations for a product. Software domains change frequently based on how a consumer chooses to use, for example, Microsoft Word or a spreadsheet application. The domain can also be influenced by general changes in technology, such as the development of a new software platform. Thus, applications can't be judged only according to whether they conform to specifications. The authors discuss how to identify domain characteristics and software risks and suggest ways to reduce the variability of software domains.

  16. Application-Program-Installer Builder

    NASA Technical Reports Server (NTRS)

    Wolgast, Paul; Demore, Martha; Lowik, Paul

    2007-01-01

    A computer program builds application programming interfaces (APIs) and related software components for installing and uninstalling application programs in any of a variety of computers and operating systems that support the Java programming language in its binary form. This program is partly similar in function to commercial (e.g., InstallShield) software. This program is intended to enable satisfaction of a quasi-industry-standard set of requirements for a set of APIs that would enable such installation and uninstallation and that would avoid the pitfalls that are commonly encountered during installation of software. The requirements include the following: 1) Properly detecting prerequisites to an application program before performing the installation; 2) Properly registering component requirements; 3) Correctly measuring the required hard-disk space, including accounting for prerequisite components that have already been installed; and 4) Correctly uninstalling an application program. Correct uninstallation includes (1) detecting whether any component of the program to be removed is required by another program, (2) not removing that component, and (3) deleting references to requirements of the to-be-removed program for components of other programs so that those components can be properly removed at a later time.
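
    Requirement 4, correct uninstallation of shared components, amounts to reference counting: a component is deleted only when no remaining program requires it. A minimal sketch of the bookkeeping (the actual tool is Java-based; the names here are illustrative):

```python
class Installer:
    """Toy registry: shared components are reference-counted so that
    uninstalling one program never breaks another that still needs it."""

    def __init__(self):
        self.refcount = {}   # component -> number of programs requiring it
        self.programs = {}   # program -> list of required components

    def install(self, program, components):
        """Register a program and bump the refcount of each component."""
        self.programs[program] = list(components)
        for c in components:
            self.refcount[c] = self.refcount.get(c, 0) + 1

    def uninstall(self, program):
        """Drop the program's references; delete only orphaned components."""
        removed = []
        for c in self.programs.pop(program):
            self.refcount[c] -= 1
            if self.refcount[c] == 0:     # no other program needs it
                del self.refcount[c]
                removed.append(c)
        return removed                     # components actually deleted
```

The same refcount table also answers requirement 3: disk space for a prerequisite already present (refcount > 0) need not be counted again.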

  17. Visualization of unsteady computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Haimes, Robert

    1994-11-01

    A brief summary of the computer environment used for calculating three-dimensional unsteady Computational Fluid Dynamics (CFD) results is presented. This environment requires a supercomputer; massively parallel processors (MPPs) and clusters of workstations acting as a single MPP (by concurrently working on the same task) provide the required computational bandwidth for CFD calculations of transient problems. The cluster of reduced instruction set computer (RISC) workstations is a recent development enabled by the low cost and high performance that workstation vendors provide. The cluster, with the proper software, can act as a multiple instruction/multiple data (MIMD) machine. A new set of software tools is being designed specifically to address visualizing 3D unsteady CFD results in these environments. Three user's manuals for the parallel version of Visual3, pV3, revision 1.00 make up the bulk of this report.

  18. Visualization of unsteady computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1994-01-01

    A brief summary of the computer environment used for calculating three-dimensional unsteady Computational Fluid Dynamics (CFD) results is presented. This environment requires a supercomputer; massively parallel processors (MPPs) and clusters of workstations acting as a single MPP (by concurrently working on the same task) provide the required computational bandwidth for CFD calculations of transient problems. The cluster of reduced instruction set computer (RISC) workstations is a recent development enabled by the low cost and high performance that workstation vendors provide. The cluster, with the proper software, can act as a multiple instruction/multiple data (MIMD) machine. A new set of software tools is being designed specifically to address visualizing 3D unsteady CFD results in these environments. Three user's manuals for the parallel version of Visual3, pV3, revision 1.00 make up the bulk of this report.

  19. Integration of PGD-virtual charts into an engineering design process

    NASA Astrophysics Data System (ADS)

    Courard, Amaury; Néron, David; Ladevèze, Pierre; Ballere, Ludovic

    2016-04-01

    This article deals with the efficient construction of approximations of fields and quantities of interest used in geometric optimisation of complex shapes that can be encountered in engineering structures. The strategy, which is developed herein, is based on the construction of virtual charts that allow, once computed offline, to optimise the structure for a negligible online CPU cost. These virtual charts can be used as a powerful numerical decision support tool during the design of industrial structures. They are built using the proper generalized decomposition (PGD) that offers a very convenient framework to solve parametrised problems. In this paper, particular attention has been paid to the integration of the procedure into a genuine engineering design process. In particular, a dedicated methodology is proposed to interface the PGD approach with commercial software.

  20. Design and analysis of lifting tool assemblies to lift different engine block

    NASA Astrophysics Data System (ADS)

    Sawant, Arpana; Deshmukh, Nilaj N.; Chauhan, Santosh; Dabhadkar, Mandar; Deore, Rupali

    2017-07-01

    Engine blocks are required to be lifted from one place to another while they are being processed. The human effort required for this purpose is considerable, and the engine block may get damaged if it is not handled properly. There is a need to design a proper lifting tool which will be able to conveniently lift the engine block and place it at the desired position without any accident or damage to the engine block. In the present study, lifting tool assemblies are designed and analyzed in such a way that they may lift different categories of engine blocks. The lifting tool assembly consists of a lifting plate, lifting ring, cap screws and washers. A parametric model and assembly of the lifting tool is done in the 3D modelling software CREO 2.0 and analysis is carried out in ANSYS Workbench 16.0. A test block of weight equivalent to that of an engine block is considered for the purpose of analysis. In the preliminary study, without washers, the stresses obtained on the lifting tool exceeded the safety margin. In the present design, washers with appropriate dimensions are used, which helps bring the stresses on the lifting tool within the safety margin. Analysis is carried out to verify that the tool design meets the ASME BTH-1 required safety margin.
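
    For a simple section, the final check against the ASME BTH-1 safety margin reduces to comparing nominal stress with an allowable stress derived from the material's yield strength. A sketch under assumed values (the design factor of 2.0 is illustrative only; the actual BTH-1 factor depends on the design category, and the FE analysis in the paper of course evaluates far more than one nominal stress):

```python
def axial_stress(load_n, area_m2):
    """Nominal tensile stress in a lifting-plate section (Pa)."""
    return load_n / area_m2

def passes_margin(load_n, area_m2, yield_pa, design_factor=2.0):
    """Check the stress against an assumed design factor on yield strength.
    The real BTH-1 design factor depends on the lifter's design category."""
    return axial_stress(load_n, area_m2) <= yield_pa / design_factor
```

A section that fails this quick check is a candidate for the kind of geometry change (here, adding washers of appropriate dimensions) that the detailed analysis then confirms.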

  1. Does This Really Work? The Keys to Implementing New Technology while Providing Evidence that Technology Is Successful

    ERIC Educational Resources Information Center

    Sawtelle, Sara

    2008-01-01

    Proving that technology works is not as simple as proving that a new vendor for art supplies is more cost effective. Technology effectiveness requires both the right software and the right implementation. Just having the software is not enough. Proper planning, training, leadership, support, pedagogy, and software use--along with many other…

  2. A Generalized Method for Automatic Downhand and Wirefeed Control of a Welding Robot and Positioner

    NASA Technical Reports Server (NTRS)

    Fernandez, Ken; Cook, George E.

    1988-01-01

    A generalized method for controlling a six degree-of-freedom (DOF) robot and a two DOF positioner used for arc welding operations is described. The welding path is defined in the part reference frame, and robot/positioner joint angles of the equivalent eight DOF serial linkage are determined via an iterative solution. Three algorithms are presented: the first solution controls motion of the eight DOF mechanism such that proper torch motion is achieved while minimizing the sum-of-squares of joint displacements; the second algorithm adds two constraint equations to achieve torch control while maintaining part orientation so that welding occurs in the downhand position; and the third algorithm adds the ability to control the proper orientation of a wire feed mechanism used in gas tungsten arc (GTA) welding operations. A verification of these algorithms is given using ROBOSIM, a NASA developed computer graphic simulation software package design for robot systems development.
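
    The iterative solution that minimizes the sum-of-squares of joint displacements is in the family of damped least-squares (Levenberg-Marquardt) inverse kinematics. A sketch for a planar two-link arm (the link lengths, damping, step cap, and the reduction to 2 DOF are all illustrative; the paper's mechanism is an eight-DOF serial linkage):

```python
import math

L1, L2 = 1.0, 1.0  # link lengths (hypothetical)

def fk(q1, q2):
    """Forward kinematics of the planar two-link arm."""
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y

def ik_step(q1, q2, tx, ty, damping=0.1, max_step=0.2):
    """One damped least-squares step: dq = J^T (J J^T + lambda^2 I)^-1 e."""
    x, y = fk(q1, q2)
    ex, ey = tx - x, ty - y
    # analytic Jacobian of the planar two-link arm
    j11 = -L1 * math.sin(q1) - L2 * math.sin(q1 + q2)
    j12 = -L2 * math.sin(q1 + q2)
    j21 =  L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    j22 =  L2 * math.cos(q1 + q2)
    # solve (J J^T + lambda^2 I) w = e directly for the 2x2 case
    a11 = j11 * j11 + j12 * j12 + damping ** 2
    a12 = j11 * j21 + j12 * j22
    a22 = j21 * j21 + j22 * j22 + damping ** 2
    det = a11 * a22 - a12 * a12
    wx = ( a22 * ex - a12 * ey) / det
    wy = (-a12 * ex + a11 * ey) / det
    dq1 = j11 * wx + j21 * wy
    dq2 = j12 * wx + j22 * wy
    # cap the step so near-singular poses cannot cause wild joint motion
    biggest = max(abs(dq1), abs(dq2))
    if biggest > max_step:
        dq1 *= max_step / biggest
        dq2 *= max_step / biggest
    return q1 + dq1, q2 + dq2

def solve_ik(tx, ty, q1=0.3, q2=0.3, iters=300):
    """Iterate until the joint angles reproduce the target torch position."""
    for _ in range(iters):
        q1, q2 = ik_step(q1, q2, tx, ty)
    return q1, q2
```

The damping term is what turns the bare pseudoinverse into a least-squares solution that stays well behaved near singular configurations, which matters when the robot and positioner are treated as one redundant serial linkage.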

  3. SPod Progress Summary Slides | Science Inventory | US EPA

    EPA Pesticide Factsheets

    This presentation describes the draft “open source” design package for the SPod fenceline sensor. The SPod is a low cost, solar-powered system that combines wind field and air pollutant concentration measurements to detect emission plumes and help locate the source of emissions. The current design works only in “near-fenceline” applications where localized source emission plumes may be present. The SPod uses data analysis software (described elsewhere) to separate baseline drift from the plume signal of interest. This software is necessary for proper operation of the SPod. Because the SPod is designed to detect source emission plumes, it is not useful for ambient applications large distances away from sources. The current SPod detects a subset of air pollutants that can be ionized with a 10.6 eV photoionization detector (PID). In the future, other air pollutant sensors may be used. The purpose of this presentation and related postings is to advance design concepts in the low-cost fenceline sensor area with any interested parties. The SPod is a work in progress with continued advances incorporated on an ongoing basis. This document is posted on an EPA share drive along with other information that describes the use, operation and limitations of the SPod. These slides summarize the SPod design, purpose, and progress as of June 2016. These slides will be posted on the EPA SPod Share Site along with design information and other materials that communicate the design.

  4. Performance Evaluation and Software Design for EVA Robotic Assistant Stereo Vision Heads

    NASA Technical Reports Server (NTRS)

    DiPaolo, Daniel

    2003-01-01

    The purpose of this project was to aid the EVA Robotic Assistant project by evaluating and designing the necessary interfaces for two stereo vision heads - the TracLabs Biclops pan-tilt-verge head, and the Helpmate Zebra pan-tilt-verge head. The first half of the project consisted of designing the necessary software interface so that the other modules of the EVA Robotic Assistant had proper access to all of the functionality offered by each of the stereo vision heads. This half took most of the project time, due to a lack of ready-made CORBA drivers for either of the heads. Once this was overcome, the evaluation stage of the project began. The second half of the project was to take these interfaces and evaluate each of the stereo vision heads in terms of usefulness to the project. In the key project areas such as stability and reliability, the Zebra pan-tilt-verge head came out on top. However, the Biclops did have many advantages over the Zebra, such as lower power consumption, faster communications, and a simpler, cleaner API. Overall, the Biclops pan-tilt-verge head outperformed the Zebra pan-tilt-verge head.

  5. Views of Health Information Management Staff on the Medical Coding Software in Mashhad, Iran.

    PubMed

    Kimiafar, Khalil; Hemmati, Fatemeh; Banaye Yazdipour, Alireza; Sarbaz, Masoumeh

    2018-01-01

    Systematic evaluation of Health Information Technology (HIT) and users' views leads to the modification and development of these technologies in accordance with their needs. The purpose of this study was to investigate the views of Health Information Management (HIM) staff on the quality of medical coding software. A descriptive cross-sectional study was conducted between May and July 2016 in 26 hospitals (academic and non-academic) in Mashhad, north-eastern Iran. The study population consisted of the chairs of HIM departments and medical coders (58 staff). Data were collected through a valid and reliable questionnaire. The data were analyzed using SPSS version 16.0. In the views of staff, among the advantages of coding software, reducing coding time had the highest average (Mean=3.82) while cost reduction had the lowest average (Mean=3.20). Meanwhile, concern about losing job opportunities was the least important disadvantage (15.5%) of the use of coding software. In general, the results of this study showed that coding software has deficiencies in some cases. Designers and developers of health information coding software should pay more attention to technical aspects, in-work reminders, support for proper code selection through access to coding rules, maintenance services, links to other relevant databases, and the possibility of providing brief and detailed reports in different formats.

  6. Citation and Recognition of contributions using Semantic Provenance Knowledge Captured in the OPeNDAP Software Framework

    NASA Astrophysics Data System (ADS)

    West, P.; Michaelis, J.; Lebot, T.; McGuinness, D. L.; Fox, P. A.

    2014-12-01

    Providing proper citation and attribution for published data, derived data products, and the software tools used to generate them has always been an important aspect of scientific research. However, it is often the case that this type of detailed citation and attribution is lacking. This is in part because it often requires manual markup, since dynamic generation of this type of provenance information is not typically done by the tools used to access, manipulate, transform, and visualize data. In addition, the tools themselves lack the information needed to be properly cited. The OPeNDAP Hyrax Software Framework is a tool that provides access to, and the ability to constrain, manipulate, and transform, different types of data from different data formats into a common format, the DAP (Data Access Protocol), in order to derive new data products. A user, or another software client, specifies an HTTP URL in order to access a particular piece of data and appropriately transform it to suit a specific purpose of use. The resulting data products, however, do not contain any information about what data was used to create them, or the software process used to generate them, let alone information that would allow proper citing and attribution of downstream researchers and tool developers. We will present our approach to provenance capture in Hyrax, including a mechanism that can be used to report back to the hosting site any derived products, such as publications and reports, using the W3C PROV recommendation pingback service. We will demonstrate our utilization of Semantic Web and Web standards, the development of an information model that extends the PROV model for provenance capture, and the development of the pingback service. We will present our findings, as well as our practices for providing provenance information, visualization of the provenance information, and the development of pingback services, to better enable scientists and tool developers to be recognized and properly cited for their contributions.

  7. Improving INPE'S balloon ground facilities for operation of the protoMIRAX experiment

    NASA Astrophysics Data System (ADS)

    Mattiello-Francisco, F.; Rinke, E.; Fernandes, J. O.; Cardoso, L.; Cardoso, P.; Braga, J.

    2014-10-01

    The system requirements for reusing the scientific balloon ground facilities available at INPE were a challenge to the ground system engineers involved in the protoMIRAX X-ray astronomy experiment. A significant effort on software updating was required for the balloon ground station. Considering that protoMIRAX is a pathfinder for the MIRAX satellite mission, a ground infrastructure compatible with INPE's satellite operation approach would be useful and highly recommended to control and monitor the experiment during the balloon flights. This approach will make use of the SATellite Control System (SATCS), a software-based architecture developed at INPE for satellite commanding and monitoring. SATCS complies with the particular operational requirements of different satellites by using several customized object-oriented software elements and frameworks. We present the ground solution designed for protoMIRAX operation, the Control and Reception System (CRS). A new server computer, properly configured with Ethernet, has extended the existing ground station facilities with a switch, converters, and new software (OPS/SERVER) in order to support the available uplink and downlink channels being mapped to TCP/IP gateways required by SATCS. Currently, the CRS development is customizing the SATCS for the kernel functions of protoMIRAX command and telemetry processing. Design patterns, component-based libraries and metadata are widely used in the SATCS in order to extend the frameworks to address the Packet Utilization Standard (PUS) for ground-balloon communication, in compliance with the services provided by the data handling computer onboard the protoMIRAX balloon.

  8. A systematic approach to parameter selection for CAD-virtual reality data translation using response surface methodology and MOGA-II.

    PubMed

    Abidi, Mustufa Haider; Al-Ahmari, Abdulrahman; Ahmad, Ali

    2018-01-01

    Advanced graphics capabilities have enabled the use of virtual reality as an efficient design technique. The integration of virtual reality in the design phase still faces impediments because of issues linked to the integration of CAD and virtual reality software. A set of empirical tests using the selected conversion parameters was found to yield properly represented virtual reality models. The reduced model yields an R-sq (pred) value of 72.71% and an R-sq (adjusted) value of 86.64%, indicating that 86.64% of the response variability can be explained by the model. The R-sq (pred) is 67.45%, which is not very high, indicating that the model should be further reduced by eliminating insignificant terms. The reduced model yields an R-sq (pred) value of 73.32% and an R-sq (adjusted) value of 79.49%, indicating that 79.49% of the response variability can be explained by the model. Using the optimization software MODE Frontier (Optimization, MOGA-II, 2014), four types of response surfaces for the three considered response variables were tested for the data of the DOE. The parameter values obtained using the proposed experimental design methodology result in better graphics quality and other necessary design attributes.

  9. Controls/CFD Interdisciplinary Research Software Generates Low-Order Linear Models for Control Design From Steady-State CFD Results

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    1997-01-01

    The NASA Lewis Research Center is developing analytical methods and software tools to create a bridge between the controls and computational fluid dynamics (CFD) disciplines. Traditionally, control design engineers have used coarse nonlinear simulations to generate information for the design of new propulsion system controls. However, such traditional methods are not adequate for modeling the propulsion systems of complex, high-speed vehicles like the High Speed Civil Transport. To properly model the relevant flow physics of high-speed propulsion systems, one must use simulations based on CFD methods. Such CFD simulations have become useful tools for engineers that are designing propulsion system components. The analysis techniques and software being developed as part of this effort are an attempt to evolve CFD into a useful tool for control design as well. One major aspect of this research is the generation of linear models from steady-state CFD results. CFD simulations, often used during the design of high-speed inlets, yield high resolution operating point data. Under a NASA grant, the University of Akron has developed analytical techniques and software tools that use these data to generate linear models for control design. The resulting linear models have the same number of states as the original CFD simulation, so they are still very large and computationally cumbersome. Model reduction techniques have been successfully applied to reduce these large linear models by several orders of magnitude without significantly changing the dynamic response. The result is an accurate, easy to use, low-order linear model that takes less time to generate than those generated by traditional means. The development of methods for generating low-order linear models from steady-state CFD is most complete at the one-dimensional level, where software is available to generate models with different kinds of input and output variables. 
One-dimensional methods have been extended somewhat so that linear models can also be generated from two- and three-dimensional steady-state results. Standard techniques are adequate for reducing the order of one-dimensional CFD-based linear models. However, reduction of linear models based on two- and three-dimensional CFD results is complicated by very sparse, ill-conditioned matrices. Some novel approaches are being investigated to solve this problem.
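    The core idea of generating a linear model from steady-state operating-point data can be sketched with small-perturbation finite differences: perturb each state and input about the operating point and record how the outputs move. This is a hedged, minimal stand-in, not the University of Akron tool, which works on full CFD solutions:

```python
def linearize(f, x0, u0, eps=1e-6):
    """Build a linear output model y ~ C*dx + D*du about an operating point
    by one-sided finite differences.  Illustrative sketch only; the
    CFD-based tools described in the text are far more involved.

    f(x, u) -> list of outputs; x0 (states) and u0 (inputs) are lists.
    Returns the Jacobians (C, D) as nested lists.
    """
    y0 = f(x0, u0)
    m, n, p = len(y0), len(x0), len(u0)
    C = [[0.0] * n for _ in range(m)]
    D = [[0.0] * p for _ in range(m)]
    for j in range(n):                 # perturb each state
        xp = list(x0); xp[j] += eps
        yp = f(xp, u0)
        for i in range(m):
            C[i][j] = (yp[i] - y0[i]) / eps
    for j in range(p):                 # perturb each input
        up = list(u0); up[j] += eps
        yp = f(x0, up)
        for i in range(m):
            D[i][j] = (yp[i] - y0[i]) / eps
    return C, D
```

    For a CFD-based model the "function" is a full flow solve, which is why the resulting models start with as many states as the simulation and must then be reduced.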

  10. Software Cost Measuring and Reporting. One of the Software Acquisition Engineering Guidebook Series.

    DTIC Science & Technology

    1979-01-02

    ... through the peripherals. However, his interaction is usually minimal since, by definition, the automatic test ... performs its intended functions properly. Software estimating is still heavily dependent on experienced judgement. However, quantitative methods ... apply to systems of totally different content ... can be distributed to specialists who are most familiar with the work.

  11. [Design of medical equipment service management system].

    PubMed

    Jiang, Youhao; PengWen; Jiang, Ningfeng; Ma, Li; Kong, Lingwei; Yin, PeiHao; Sun, Cheng

    2012-09-01

    To develop a maintenance management system for medical equipment based on the HIS. The system contains special functions (including preventive maintenance, automatic job dispatch, performance assessment, etc.) which are very useful for keeping medical equipment in proper condition and improving the working efficiency of the staff, and it provides technical support for raising the level of maintenance management. The software design uses a combined C/S and B/S mode. The system gives the various clinical sections zero-maintenance clients and makes data manipulation and statistical reporting more convenient for the equipment management department. The system connects the subsystems closely and exchanges information among them continuously, forming a tight network structure. This provides a basis for future hospital-wide information integration.

  12. Integration of instrumentation and processing software of a laser speckle contrast imaging system

    NASA Astrophysics Data System (ADS)

    Carrick, Jacob J.

    Laser speckle contrast imaging (LSCI) has the potential to be a powerful tool in medicine, but more research in the field is required before it can be used properly. To help advance Michigan Tech's research in the field, a graphical user interface (GUI) was designed in Matlab to control the instrumentation of the experiments and to process the raw speckle images into contrast images while they are being acquired. The design of the system was successful, and it is currently being used by Michigan Tech's Biomedical Engineering department. This thesis describes the development of the LSCI GUI and offers a full introduction to the history, theory and applications of LSCI.
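    The conversion from raw speckle frames to contrast images mentioned above is conventionally the ratio of the local standard deviation to the local mean. A minimal pure-Python sketch of that spatial contrast computation (the thesis GUI itself is Matlab, and the window size here is illustrative):

```python
import math

def speckle_contrast(img, w=3):
    """Spatial speckle contrast K = sigma / mean over a w-by-w window.

    Pure-Python sketch of the standard LSCI contrast definition; the
    Matlab GUI described in the abstract computes the same quantity
    while frames are being acquired.  Border pixels are left at zero.
    """
    r = w // 2
    rows, cols = len(img), len(img[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(r, rows - r):
        for j in range(r, cols - r):
            win = [img[a][b] for a in range(i - r, i + r + 1)
                             for b in range(j - r, j + r + 1)]
            mean = sum(win) / len(win)
            var = sum((v - mean) ** 2 for v in win) / len(win)
            out[i][j] = math.sqrt(var) / mean if mean else 0.0
    return out
```

    Low contrast marks regions where moving scatterers (e.g. blood flow) blur the speckle pattern, which is what makes the quantity medically interesting.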

  13. Design of CMOS imaging system based on FPGA

    NASA Astrophysics Data System (ADS)

    Hu, Bo; Chen, Xiaolai

    2017-10-01

    In order to meet the needs of engineering applications for a high-dynamic-range CMOS camera operating in rolling shutter mode, a complete imaging system is designed around the CMOS image sensor NSC1105. The paper adopts CMOS + ADC + FPGA + Camera Link as the processing architecture and introduces the design and implementation of the hardware system. The camera software system, which consists of a CMOS timing drive module, an image acquisition module and a transmission control module, is designed in Verilog and runs on a Xilinx FPGA. The ISim simulator of ISE 14.6 is used for signal simulation. The imaging experimental results show that the system achieves a resolution of 1280 x 1024 pixels, a frame rate of 25 fps and a dynamic range of more than 120 dB. The imaging quality of the system satisfies the design requirements.

  14. Preventing Run-Time Bugs at Compile-Time Using Advanced C++

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neswold, Richard

    When writing software, we develop algorithms that tell the computer what to do at run-time. Our solutions are easier to understand and debug when they are properly modeled using class hierarchies, enumerations, and a well-factored API. Unfortunately, even with these design tools, we end up having to debug our programs at run-time. Worse still, debugging an embedded system changes its dynamics, making it tough to find and fix concurrency issues. This paper describes techniques using C++ to detect run-time bugs *at compile time*. A concurrency library, developed at Fermilab, is used for examples in illustrating these techniques.

  15. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  16. A precision device needs precise simulation: Software description of the CBM Silicon Tracking System

    NASA Astrophysics Data System (ADS)

    Malygina, Hanna; Friese, Volker; CBM Collaboration

    2017-10-01

    Precise modelling of detectors in simulations is the key to the understanding of their performance, which, in turn, is a prerequisite for the proper design choice and, later, for the achievement of valid physics results. In this report, we describe the implementation of the Silicon Tracking System (STS), the main tracking device of the CBM experiment, in the CBM software environment. The STS makes use of double-sided silicon micro-strip sensors with double metal layers. We present a description of transport and detector response simulation, including all relevant physical effects like charge creation and drift, charge collection, cross-talk and digitization. Of particular importance and novelty is the description of the time behaviour of the detector, since its readout will not be externally triggered but continuous. We also cover some aspects of local reconstruction, which in the CBM case has to be performed in real time and thus requires high-speed algorithms.
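    Among the response effects listed above, cross-talk between neighbouring strips is easy to picture with a toy model. As a purely illustrative sketch (the coupling coefficient is invented, not the measured CBM STS value), a fraction of each strip's collected charge leaks to its two neighbours:

```python
def apply_crosstalk(strips, coupling=0.05):
    """Toy model of capacitive cross-talk between neighbouring strips:
    each strip keeps (1 - 2*coupling) of its charge and leaks `coupling`
    to each neighbour.  The coupling value is purely illustrative, not
    a measured CBM STS figure."""
    n = len(strips)
    out = [0.0] * n
    for i, q in enumerate(strips):
        out[i] += q * (1.0 - 2.0 * coupling)
        if i > 0:
            out[i - 1] += q * coupling
        if i < n - 1:
            out[i + 1] += q * coupling
    return out
```

    In a real detector-response simulation this step sits between charge collection and digitization, and charge is conserved for hits away from the sensor edge.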

  17. SPSS and SAS programs for determining the number of components using parallel analysis and velicer's MAP test.

    PubMed

    O'Connor, B P

    2000-08-01

    Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
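    The logic of parallel analysis is simple to sketch, even though production implementations like the SPSS and SAS programs described handle many more options. The pure-Python sketch below retains components whose observed eigenvalues exceed the mean eigenvalues of random normal data of the same dimensions; the Jacobi routine stands in for the optimized eigensolvers statistical packages provide, and it is not O'Connor's code:

```python
import math
import random

def jacobi_eigenvalues(a, sweeps=100, tol=1e-12):
    """Eigenvalues of a symmetric matrix by cyclic Jacobi rotations."""
    a = [row[:] for row in a]
    n = len(a)
    for _ in range(sweeps):
        off = sum(a[i][j] ** 2 for i in range(n) for j in range(n) if i != j)
        if off < tol:
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(a[p][q]) < 1e-15:
                    continue
                theta = 0.5 * math.atan2(2.0 * a[p][q], a[q][q] - a[p][p])
                c, s = math.cos(theta), math.sin(theta)
                for k in range(n):            # rotate rows p and q
                    apk, aqk = a[p][k], a[q][k]
                    a[p][k] = c * apk - s * aqk
                    a[q][k] = s * apk + c * aqk
                for k in range(n):            # rotate columns p and q
                    akp, akq = a[k][p], a[k][q]
                    a[k][p] = c * akp - s * akq
                    a[k][q] = s * akp + c * akq
    return sorted((a[i][i] for i in range(n)), reverse=True)

def parallel_analysis(eigs_observed, n_cases, n_vars, n_sims=20, seed=1):
    """Horn's parallel analysis: keep components whose observed eigenvalue
    exceeds the mean eigenvalue of random normal data of the same shape."""
    rng = random.Random(seed)
    sums = [0.0] * n_vars
    for _ in range(n_sims):
        data = [[rng.gauss(0, 1) for _ in range(n_vars)] for _ in range(n_cases)]
        means = [sum(col) / n_cases for col in zip(*data)]
        sds = [math.sqrt(sum((row[j] - means[j]) ** 2 for row in data) / n_cases)
               for j in range(n_vars)]
        corr = [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in data)
                 / (n_cases * sds[i] * sds[j]) for j in range(n_vars)]
                for i in range(n_vars)]
        for k, ev in enumerate(jacobi_eigenvalues(corr)):
            sums[k] += ev
    random_means = [s / n_sims for s in sums]
    return sum(1 for obs, rnd in zip(eigs_observed, random_means) if obs > rnd)
```

    Unlike the eigenvalues-greater-than-one rule, the random baseline here adapts to the sample size and the number of variables.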

  18. 45 CFR 153.350 - Risk adjustment data validation standards.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    (a) General requirement. The State, or HHS on behalf of the State, must ensure proper implementation of any risk adjustment software and ensure proper validation of a statistically valid sample of...

  19. 45 CFR 153.350 - Risk adjustment data validation standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    (a) General requirement. The State, or HHS on behalf of the State, must ensure proper implementation of any risk adjustment software and ensure proper validation of a statistically valid sample of...

  20. QuantifyMe: An Open-Source Automated Single-Case Experimental Design Platform.

    PubMed

    Taylor, Sara; Sano, Akane; Ferguson, Craig; Mohan, Akshay; Picard, Rosalind W

    2018-04-05

    Smartphones and wearable sensors have enabled unprecedented data collection, with many products now providing feedback to users about recommended step counts or sleep durations. However, these recommendations do not provide personalized insights that have been shown to be best suited for a specific individual. A scientific way to find individualized recommendations and causal links is to conduct experiments using single-case experimental design; however, properly designed single-case experiments are not easy to conduct on oneself. We designed, developed, and evaluated a novel platform, QuantifyMe, for novice self-experimenters to conduct proper-methodology single-case self-experiments in an automated and scientific manner using their smartphones. We provide software for the platform that we used (available for free on GitHub), which provides the methodological elements to run many kinds of customized studies. In this work, we evaluate its use with four different kinds of personalized investigations, examining how variables such as sleep duration and regularity, activity, and leisure time affect personal happiness, stress, productivity, and sleep efficiency. We conducted a six-week pilot study (N = 13) to evaluate QuantifyMe. We describe the lessons learned developing the platform and recommendations for its improvement, as well as its potential for enabling personalized insights to be scientifically evaluated in many individuals, reducing the high administrative cost for advancing human health and wellbeing.

  1. QuantifyMe: An Open-Source Automated Single-Case Experimental Design Platform †

    PubMed Central

    Sano, Akane; Ferguson, Craig; Mohan, Akshay; Picard, Rosalind W.

    2018-01-01

    Smartphones and wearable sensors have enabled unprecedented data collection, with many products now providing feedback to users about recommended step counts or sleep durations. However, these recommendations do not provide personalized insights that have been shown to be best suited for a specific individual. A scientific way to find individualized recommendations and causal links is to conduct experiments using single-case experimental design; however, properly designed single-case experiments are not easy to conduct on oneself. We designed, developed, and evaluated a novel platform, QuantifyMe, for novice self-experimenters to conduct proper-methodology single-case self-experiments in an automated and scientific manner using their smartphones. We provide software for the platform that we used (available for free on GitHub), which provides the methodological elements to run many kinds of customized studies. In this work, we evaluate its use with four different kinds of personalized investigations, examining how variables such as sleep duration and regularity, activity, and leisure time affect personal happiness, stress, productivity, and sleep efficiency. We conducted a six-week pilot study (N = 13) to evaluate QuantifyMe. We describe the lessons learned developing the platform and recommendations for its improvement, as well as its potential for enabling personalized insights to be scientifically evaluated in many individuals, reducing the high administrative cost for advancing human health and wellbeing. PMID:29621133

  2. An Adaptable Power System with Software Control Algorithm

    NASA Technical Reports Server (NTRS)

    Castell, Karen; Bay, Mike; Hernandez-Pellerano, Amri; Ha, Kong

    1998-01-01

    A low cost, flexible and modular spacecraft power system design was developed in response to a call for an architecture that could accommodate multiple missions in the small to medium load range. Three upcoming satellites will use this design, with one launch date in 1999 and two in the year 2000. The design consists of modular hardware that can be scaled up or down, without additional cost, to suit missions in the 200 to 600 Watt orbital average load range. The design will be applied to satellite orbits that are circular or polar elliptical, and to a libration point orbit. Mission-unique adaptations are accomplished in software and firmware. In designing this advanced, adaptable power system, the major goals were reduction in weight, volume and cost. This power system design represents reductions in weight of 78 percent, volume of 86 percent and cost of 65 percent from previous comparable systems. The efforts to miniaturize the electronics without sacrificing performance have created streamlined power electronics with control functions residing in the system microprocessor. The power system design can handle any battery size up to 50 Amp-hours and any battery technology. The three current implementations will use both nickel cadmium and nickel hydrogen batteries ranging in size from 21 to 50 Amp-hours. Multiple batteries can be used by adding another battery module. Any solar cell technology can be used, and various array layouts can be incorporated with no change in Power System Electronics (PSE) hardware. Other features of the design are the standardized interfaces between cards and subsystems and immunity to radiation effects up to 30 krad Total Ionizing Dose (TID) and 35 MeV/cm(exp 2)-kg for Single Event Effects (SEE). The control algorithm for the power system resides in a radiation-hardened microprocessor. A table-driven software design allows for flexibility in mission-specific requirements. 
    By storing critical power system constants in memory, modifying the system code for other programs is simple. These constants can also be altered by ground command, or in response to an anomalous event. All critical power system functions have backup hardware functions to prevent a software or computer glitch from propagating. A number of battery charge control schemes can be implemented by selecting the proper control terms in the code. The architecture allows the design engineer to tune the system response to various system components and anticipated load profiles without costly alterations. A design trade was made between the size, weight and power dissipation of the electronics and the performance of the power bus under load variations. Linear, fine control is maintained with a streamlined electronics design. This paper describes the hardware design as well as the software control algorithm. The challenges of closing the system control loop digitally are discussed. Control loop margin and power system performance are presented. Lab measurements are shown and compared to the system response of a hardware model running actual flight software.
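    The table-driven pattern described, with critical constants held in memory and patchable by ground command, can be sketched compactly. All names and numbers below are hypothetical; they are not the flight software's actual tables or charge-control law:

```python
# Hypothetical mission tables: battery types and constants are invented.
CHARGE_CONTROL_TABLES = {
    "nicd_21ah": {"vt_limit": 1.45, "trickle_rate": 0.02, "k_taper": 0.8},
    "nih2_50ah": {"vt_limit": 1.55, "trickle_rate": 0.01, "k_taper": 0.6},
}

class PowerSystemController:
    """Sketch of the table-driven idea: mission constants live in a table
    that can be swapped at build time or patched by ground command, so the
    control code itself never changes between missions."""

    def __init__(self, table_name):
        self.constants = dict(CHARGE_CONTROL_TABLES[table_name])

    def patch_constant(self, key, value):
        """Stand-in for altering a constant by ground command."""
        if key not in self.constants:
            raise KeyError(key)
        self.constants[key] = value

    def charge_current_limit(self, battery_voltage, full_scale=10.0):
        """Taper the charge current (amps) as the per-cell voltage
        approaches the table's limit; hold a trickle at the limit."""
        headroom = max(0.0, self.constants["vt_limit"] - battery_voltage)
        return min(full_scale,
                   full_scale * self.constants["k_taper"] * headroom
                   + full_scale * self.constants["trickle_rate"])
```

    Swapping `"nicd_21ah"` for `"nih2_50ah"` retunes the loop for a different battery without touching the control code, which is the point of the architecture.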

  3. Effectiveness of GNSS disposal strategies

    NASA Astrophysics Data System (ADS)

    Alessi, E. M.; Rossi, A.; Valsecchi, G. B.; Anselmo, L.; Pardini, C.; Colombo, C.; Lewis, H. G.; Daquin, J.; Deleflie, F.; Vasile, M.; Zuiani, F.; Merz, K.

    2014-06-01

    The management of the Global Navigation Satellite Systems (GNSS) and of the Medium Earth Orbit (MEO) region as a whole is a subject that cannot be deferred, due to the growing exploitation and launch rate in that orbital regime. The advent of the European Galileo and the Chinese Beidou constellations has significantly added complexity to the system and calls for an adequate global view of the four constellations currently in operation. The operation procedures, including maintenance and disposal practices, of the constellations currently deployed were analyzed in order to assess a proper reference simulation scenario. The complex dynamics of the MEO region, with all the geopotential and lunisolar resonances, was studied to better identify the proper end-of-life orbit for every proposed strategy, taking into account and, whenever possible, exploiting the orbital dynamics in this peculiar region of space. The possibility of exploiting low-thrust propulsion or non-gravitational perturbations with passive de-orbiting devices (and a combination of the two) was analyzed, in view of possible applications in the design of future generations of the constellation satellites. Several upgrades to the long-term evolution software SDM and DAMAGE were undertaken to properly handle the constellation simulations in every aspect, from constellation maintenance to orbital dynamics. A thorough approach considering the full time-evolving covariance matrix associated with every object was implemented in SDM to compute the collision risk and associated maneuver rate for the constellation satellites. Once the software upgrades are completed, the effectiveness of the different disposal strategies will be analyzed in terms of residual collision risk and avoidance maneuver rate. This work was performed under ESA/GSP Contract no. 4000107201/12/F/MOS.

  4. The Source to S2K Conversion System.

    DTIC Science & Technology

    1978-12-01

    ... management system provides. As for all software production, the cost of writing this program is high, particularly considering it may be executed only ... research, and finally, implement the system using disciplined, structured software engineering principles. In order to properly document how these ... a complete read step is required (as done by the Michigan System and EXPRESS), or software support outside the conversion system (as in CODS) is required

  5. Design optimization of piezoresistive cantilevers for force sensing in air and water

    PubMed Central

    Doll, Joseph C.; Park, Sung-Jin; Pruitt, Beth L.

    2009-01-01

    Piezoresistive cantilevers fabricated from doped silicon or metal films are commonly used for force, topography, and chemical sensing at the micro- and macroscales. Proper design is required to optimize the achievable resolution by maximizing sensitivity while simultaneously minimizing the integrated noise over the bandwidth of interest. Existing analytical design methods are insufficient for modeling complex dopant profiles, design constraints, and nonlinear phenomena such as damping in fluid. Here we present an optimization method based on an analytical piezoresistive cantilever model. We use an existing iterative optimizer to minimize a performance goal, such as minimum detectable force. The design tool is available as open source software. Optimal cantilever design and performance are found to strongly depend on the measurement bandwidth and the constraints applied. We discuss results for silicon piezoresistors fabricated by epitaxy and diffusion, but the method can be applied to any dopant profile or material which can be modeled in a similar fashion, or extended to other microelectromechanical systems. PMID:19865512
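    The pairing of an analytical model with an existing iterative optimizer can be illustrated with a one-dimensional stand-in. The objective below is invented (a noise-like term that falls with doping dose against a sensitivity-loss term that grows with it); it is not the authors' cantilever model, and the optimizer is a textbook golden-section search rather than their actual solver:

```python
import math

def golden_section_minimize(f, lo, hi, tol=1e-8):
    """Derivative-free 1-D minimization over [lo, hi]; stands in here for
    the iterative optimizer the design tool wraps around its full model."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):          # minimum lies in [a, d]
            b, d = d, c
            c = b - invphi * (b - a)
        else:                    # minimum lies in [c, b]
            a, c = c, d
            d = a + invphi * (b - a)
    return 0.5 * (a + b)

# Invented objective: a noise-like term falling with dose versus a
# sensitivity-loss term rising with it.  Minimum is at dose = 10**(2/3).
def detectable_force(dose):
    return 1.0 / math.sqrt(dose) + 0.05 * dose
```

    The real tool minimizes quantities such as minimum detectable force over many coupled design variables and constraints, but the trade-off structure is the same.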

  6. Treatment delivery software for a new clinical grade ultrasound system for thermoradiotherapy.

    PubMed

    Novák, Petr; Moros, Eduardo G; Straube, William L; Myerson, Robert J

    2005-11-01

    A detailed description of a clinical grade Scanning Ultrasound Reflector Linear Array System (SURLAS) applicator was given in a previous paper [Med. Phys. 32, 230-240 (2005)]. In this paper we concentrate on the design, development, and testing of the personal computer (PC) based treatment delivery software that runs the therapy system. The SURLAS requires coordinated interaction between the therapy applicator and several peripheral devices for its proper and safe operation. One of the most important tasks was coordinating the input power sequences for the elements of two parallel opposed ultrasound arrays (eight 1.5 cm x 2 cm elements per array; arrays 1 and 2 operate at 1.9 and 4.9 MHz, respectively) with the position of a dual-face scanning acoustic reflector. To achieve this, the treatment delivery software can divide the applicator's treatment window into up to 64 sectors (minimum size of 2 cm x 2 cm) and control the power to each sector independently by adjusting the power output levels of the channels of a 16-channel radio-frequency generator. The software coordinates the generator outputs with the position of the reflector as it scans back and forth between the arrays. Individual sector control and dual-frequency operation allow the SURLAS to adjust power deposition in three dimensions to superficial targets coupled to its treatment window. The treatment delivery software also monitors and logs several parameters, such as temperatures acquired using a 16-channel thermocouple thermometry unit. Safety (in particular to patients) was the paramount concern and design criterion. Failure mode and effects analysis (FMEA) was applied to the applicator as well as to the entire therapy system in order to identify safety issues and rank their relative importance. This analysis led to the implementation of several safety mechanisms and a software structure in which each device communicates with the controlling PC independently of the others. 
In case of a malfunction in any part of the system or a violation of a user-defined safety criterion based on temperature readings, the software terminates treatment immediately and the user is notified. The software development process consisting of problem analysis, design, implementation, and testing is presented in this paper. Once the software was finished and integrated with the hardware, the therapy system was extensively tested. Results demonstrated that the software operates the SURLAS as intended with minimum risk to future patients.

  7. Informatics in Radiology (infoRAD): personal computer security: part 2. Software Configuration and file protection.

    PubMed

    Caruso, Ronald D

    2004-01-01

    Proper configuration of software security settings and proper file management are necessary and important elements of safe computer use. Unfortunately, the configuration of software security options is often not user friendly. Safe file management requires the use of several utilities, most of which are already installed on the computer or available as freeware. Among these file operations are setting passwords, defragmentation, deletion, wiping, removal of personal information, and encryption. For example, Digital Imaging and Communications in Medicine medical images need to be anonymized, or "scrubbed," to remove patient identifying information in the header section prior to their use in a public educational or research environment. The choices made with respect to computer security may affect the convenience of the computing process. Ultimately, the degree of inconvenience accepted will depend on the sensitivity of the files and communications to be protected and the tolerance of the user. Copyright RSNA, 2004
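    The DICOM "scrubbing" step mentioned above amounts to overwriting identifying header fields while leaving acquisition data untouched. Below is a stdlib-only sketch using a plain dict as a stand-in for the header; real DICOM files use numbered (group, element) tags and dedicated libraries, so the field names here are illustrative only:

```python
# Field names chosen for illustration; real DICOM anonymization operates
# on numbered tags and must follow the standard's de-identification rules.
IDENTIFYING_FIELDS = {"PatientName", "PatientID", "PatientBirthDate",
                      "InstitutionName", "ReferringPhysicianName"}

def scrub_header(header, replacement="ANONYMIZED"):
    """Return a copy of the header with identifying fields overwritten and
    everything else (modality, acquisition parameters, etc.) left intact."""
    return {k: (replacement if k in IDENTIFYING_FIELDS else v)
            for k, v in header.items()}
```

    Working on a copy keeps the original file intact, which matters when the scrubbed version is destined for a public teaching archive but the original must stay in the clinical record.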

  8. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

    Open software is essential for reproducible scientific exchange. R packages provide a platform for the development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analysis is currently available in the free software platform R, which is based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. The packages related to seismic analysis currently available on CRAN (Comprehensive R Archive Network, http://www.r-project.org/) are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.

  9. Reducing user error in dipstick urinalysis with a low-cost slipping manifold and mobile phone platform (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Smith, Gennifer T.; Dwork, Nicholas; Khan, Saara A.; Millet, Matthew; Magar, Kiran; Javanmard, Mehdi; Bowden, Audrey K.

    2017-03-01

    Urinalysis dipsticks were designed to revolutionize urine-based medical diagnosis. They are cheap, extremely portable, and have multiple assays patterned on a single platform. They were also meant to be incredibly easy to use. Unfortunately, there are many aspects in both the preparation and the analysis of the dipsticks that are plagued by user error. This high error is one reason that dipsticks have failed to flourish in both the at-home market and in low-resource settings. Sources of error include: inaccurate volume deposition, varying lighting conditions, inconsistent timing measurements, and misinterpreted color comparisons. We introduce a novel manifold and companion software for dipstick urinalysis that eliminates the aforementioned error sources. A micro-volume slipping manifold ensures precise sample delivery, an opaque acrylic box guarantees consistent lighting conditions, a simple sticker-based timing mechanism maintains accurate timing, and custom software that processes video data captured by a mobile phone ensures proper color comparisons. We show that the results obtained with the proposed device are as accurate and consistent as a properly executed dip-and-wipe method, the industry gold-standard, suggesting the potential for this strategy to enable confident urinalysis testing. Furthermore, the proposed all-acrylic slipping manifold is reusable and low in cost, making it a potential solution for at-home users and low-resource settings.
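    The color-comparison step that the companion software automates reduces, at its core, to matching a measured pad color against a set of reference colors. A hedged sketch with invented reference values (the actual app also controls lighting via the enclosed acrylic box and processes video rather than single readings):

```python
def classify_pad(measured_rgb, reference_levels):
    """Match a measured pad colour to the nearest reference colour by
    squared Euclidean distance in RGB space.  Sketch of the comparison
    step only; reference values below in the tests are invented, not a
    real dipstick chart."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(reference_levels,
               key=lambda lvl: dist2(measured_rgb, reference_levels[lvl]))
```

    Automating this nearest-color decision removes the subjective visual comparison that is one of the user-error sources listed above.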

  10. Use of Facial Recognition Software to Identify Disaster Victims With Facial Injuries.

    PubMed

    Broach, John; Yong, Rothsovann; Manuell, Mary-Elise; Nichols, Constance

    2017-10-01

    After large-scale disasters, victim identification frequently presents a challenge and a priority for responders attempting to reunite families and ensure proper identification of deceased persons. The purpose of this investigation was to determine whether currently commercially available facial recognition software can successfully identify disaster victims with facial injuries. Photos of 106 people were taken before and after application of moulage designed to simulate traumatic facial injuries. These photos, as well as photos from volunteers' personal photo collections, were analyzed by using facial recognition software to determine whether this technology could accurately identify a person with facial injuries. The study results suggest that a responder could expect a correct match between submitted photos and photos of injured patients between 39% and 45% of the time, and a much higher rate of correct returns if the submitted photos were of optimal quality, with correct matches exceeding 90% in most situations. The present results suggest that the use of this software would provide significant benefit to responders. Although a correct result was returned only about 40% of the time, this would still likely represent a benefit for a responder trying to identify hundreds or thousands of victims. (Disaster Med Public Health Preparedness. 2017;11:568-572).

  11. Development of a portable bicycle/pedestrian monitoring system for safety enhancement

    NASA Astrophysics Data System (ADS)

    Usher, Colin; Daley, W. D. R.

    2015-03-01

    Pedestrians involved in roadway accidents account for nearly 12 percent of all traffic fatalities and 59,000 injuries each year. Most injuries occur when pedestrians attempt to cross roads, and there have been noted differences in accident rates midblock vs. at intersections. Collecting data on pedestrian behavior is a time consuming manual process that is prone to error. This leads to a lack of quality information to guide the proper design of lane markings and traffic signals to enhance pedestrian safety. Researchers at the Georgia Tech Research Institute are developing and testing an automated system that can be rapidly deployed for data collection to support the analysis of pedestrian behavior at intersections and midblock crossings with and without traffic signals. This system will analyze the collected video data to automatically identify and characterize the number of pedestrians and their behavior. It consists of a mobile trailer with four high definition pan-tilt cameras for data collection. The software is custom designed and uses state of the art commercial pedestrian detection algorithms. We will be presenting the system hardware and software design, challenges, and results from the preliminary system testing. Preliminary results indicate the ability to provide representative quantitative data on pedestrian motion data more efficiently than current techniques.

  12. Design and implementation of a random neural network routing engine.

    PubMed

    Kocak, T; Seeber, J; Terzioglu, H

    2003-01-01

    The random neural network (RNN) is an analytically tractable spiked neural network model that has been implemented in software for a wide range of applications for over a decade. This paper presents a hardware implementation of the RNN model. Recently, the cognitive packet network (CPN) has been proposed as an alternative packet network architecture with no routing tables; instead, RNN-based reinforcement learning is used to route packets. In particular, we describe implementation details for the RNN-based routing engine of a CPN network processor chip: the smart packet processor (SPP). The SPP is a dual-port device that stores, modifies, and interprets the defining characteristics of multiple RNN models. In addition to hardware design improvements over the software implementation, such as the dual-access memory, output calculation step, and reduced output calculation module, this paper introduces a major modification to the reinforcement learning algorithm used in the original CPN specification such that the number of weight terms is reduced from 2n/sup 2/ to 2n. This not only yields significant memory savings, but also simplifies the calculations for the steady-state probabilities (neuron outputs in the RNN). Simulations have been conducted to confirm proper functionality of the isolated SPP design as well as of multiple SPPs in a networked environment.
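As an illustrative sketch of the steady-state probabilities mentioned above (the weights and external rates below are invented, not taken from the SPP or CPN), the analytic RNN model solves q_i = lambda_plus_i / (r_i + lambda_minus_i) by fixed-point iteration:

```python
# Hedged sketch of the random neural network (RNN) steady-state computation:
# each neuron i has excitation probability q_i = lam_plus_i / (r_i + lam_minus_i),
# where lam_plus/lam_minus accumulate excitatory/inhibitory traffic from other
# neurons. All numeric values here are illustrative assumptions.

def rnn_steady_state(w_plus, w_minus, ext_plus, ext_minus, tol=1e-9):
    """Iterate q_i = lam_plus_i / (r_i + lam_minus_i) to a fixed point."""
    n = len(ext_plus)
    # Firing rate of neuron i is its total outgoing weight.
    r = [sum(w_plus[i]) + sum(w_minus[i]) for i in range(n)]
    q = [0.0] * n
    while True:
        q_new = []
        for i in range(n):
            lam_plus = ext_plus[i] + sum(q[j] * w_plus[j][i] for j in range(n))
            lam_minus = ext_minus[i] + sum(q[j] * w_minus[j][i] for j in range(n))
            q_new.append(min(1.0, lam_plus / (r[i] + lam_minus)))
        if max(abs(a - b) for a, b in zip(q, q_new)) < tol:
            return q_new
        q = q_new

# Tiny two-neuron example with made-up rates.
q = rnn_steady_state(
    w_plus=[[0.0, 0.4], [0.2, 0.0]],
    w_minus=[[0.0, 0.1], [0.1, 0.0]],
    ext_plus=[0.3, 0.1],
    ext_minus=[0.05, 0.05],
)
```

The 2n-weight reduction described in the paper shrinks exactly the `w_plus`/`w_minus` storage that this iteration reads.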

  13. Subthreshold SPICE Model Optimization

    NASA Astrophysics Data System (ADS)

    Lum, Gregory; Au, Henry; Neff, Joseph; Bozeman, Eric; Kamin, Nick; Shimabukuro, Randy

    2011-04-01

    The first step in integrated circuit design is the simulation of the design in software to verify proper functionality and design requirements. Properties of the process are provided by fabrication foundries in the form of SPICE models, which contain the electrical data and physical properties of the basic circuit elements. A limitation of these models is that the data collected by the foundry only accurately models the saturation region. This is fine for most users, but the models are inadequate for accurate simulation of devices operating in the subthreshold region, which is why optimizing the current SPICE models to characterize the subthreshold region is so important. In order to accurately simulate this region of operation, MOSFETs of varying widths and lengths were fabricated and electrical test data collected. From the collected data, the parameters of the model files are optimized through parameter extraction rather than curve fitting. With the completed optimized models, the circuit designer is able to simulate circuit designs accurately in the subthreshold region.
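A hedged sketch of what subthreshold parameter extraction can look like (the model form and all numbers are illustrative, not the foundry's actual flow): the subthreshold drain current is commonly modeled as Id ≈ I0·exp(Vgs/(n·VT)), so a log-linear least-squares fit over measured points recovers the saturation current I0 and slope factor n directly:

```python
import math

# Illustrative parameter extraction for the subthreshold region: fit
# ln(Id) = ln(I0) + Vgs / (n * VT) by linear least squares.
# The "measurements" below are synthetic, generated from assumed values.

VT = 0.02585  # thermal voltage at ~300 K, in volts

def fit_subthreshold(vgs, ids):
    """Return (I0, n) from a log-linear least-squares fit of Id vs Vgs."""
    y = [math.log(i) for i in ids]
    m = len(vgs)
    sx, sy = sum(vgs), sum(y)
    sxx = sum(v * v for v in vgs)
    sxy = sum(v * yi for v, yi in zip(vgs, y))
    slope = (m * sxy - sx * sy) / (m * sxx - sx * sx)
    intercept = (sy - slope * sx) / m
    return math.exp(intercept), 1.0 / (slope * VT)

# Synthetic data generated from I0 = 1e-12 A, n = 1.5.
vgs = [0.10, 0.15, 0.20, 0.25, 0.30]
ids = [1e-12 * math.exp(v / (1.5 * VT)) for v in vgs]
i0, n = fit_subthreshold(vgs, ids)
```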

  14. Optimization of Microelectronic Devices for Sensor Applications

    NASA Technical Reports Server (NTRS)

    Cwik, Tom; Klimeck, Gerhard

    2000-01-01

    The NASA/JPL goal to reduce payload in future space missions while increasing mission capability demands miniaturization of active and passive sensors, analytical instruments, and communication systems, among others. Currently, typical system requirements include the detection of particular spectral lines, associated data processing, and communication of the acquired data to other systems. Advances in lithography and deposition methods result in more advanced devices for space application, while the sub-micron resolution currently available opens a vast design space. Though an experimental exploration of this widening design space (searching for optimized performance through repeated fabrication efforts) is infeasible, it does motivate the development of reliable software design tools. These tools necessitate models based on the fundamental physics and mathematics of the device to accurately model effects such as diffraction and scattering in opto-electronic devices, or bandstructure and scattering in heterostructure devices. The software tools must have convenient turn-around times and interfaces that allow effective usage. The first issue is addressed by the application of high-performance computers and the second by the development of graphical user interfaces driven by properly developed data structures. These tools can then be integrated into an optimization environment, and with the available memory capacity and computational speed of high-performance parallel platforms, simulation of optimized components can proceed. In this paper, specific applications to the electromagnetic modeling of infrared filtering, as well as heterostructure device design, will be presented using genetic algorithm global optimization methods.

  15. GPM Timeline Inhibits For IT Processing

    NASA Technical Reports Server (NTRS)

    Dion, Shirley K.

    2014-01-01

    The Safety Inhibit Timeline Tool was created as one approach to capturing and understanding inhibits and controls from IT through launch. The Global Precipitation Measurement (GPM) Mission, which launched from Japan in March 2014, was a joint mission under a partnership between the National Aeronautics and Space Administration (NASA) and the Japan Aerospace Exploration Agency (JAXA). GPM was one of the first NASA Goddard in-house programs that extensively used software controls. Using this tool during the GPM buildup allowed a thorough review of inhibit and safety-critical software design for hazardous subsystems such as the high-gain antenna boom, solar array, and instrument deployments, transmitter turn-on, propulsion system release, and instrument radar turn-on. The GPM safety team developed a methodology to document software safety as part of the standard hazard report. As a result of this process, a new tool, the safety inhibit timeline, was created for management of inhibits and their controls during spacecraft buildup and testing during IT at GSFC and at the launch range in Japan. The Safety Inhibit Timeline Tool was a pathfinder approach for reviewing software that controls the electrical inhibits, and it strengthens the Safety Analyst's understanding of the removal of inhibits during the IT process with safety-critical software. With this tool, the Safety Analyst can confirm the proper safe configuration of a spacecraft during each IT test, track inhibit and software configuration changes, and assess software criticality. In addition to clarifying inhibits and controls during IT, the tool allows the Safety Analyst to better communicate to engineers and management the changes in inhibit states during each phase of hardware and software testing and the associated safety risks. Lessons learned from participating in the GPM campaign at NASA and JAXA will be discussed during this session.

  16. A compact semiconductor digital interferometer and its applications

    NASA Astrophysics Data System (ADS)

    Britsky, Oleksander I.; Gorbov, Ivan V.; Petrov, Viacheslav V.; Balagura, Iryna V.

    2015-05-01

    The possibility of using semiconductor laser interferometers to measure displacements at the nanometer scale was demonstrated. Principles for building miniature digital Michelson interferometers based on semiconductor lasers were proposed. An advanced processing algorithm for the interferometer quadrature signals was designed, which reduced the restrictions on the speed of the measured movements. A miniature semiconductor digital Michelson interferometer was developed. The design of a precision temperature stabilization system, holding a miniature low-cost semiconductor laser to 0.01ºС, made it possible to use that laser, rather than a helium-neon laser, in a compact interferometer. Proper firmware and software were designed for real-time processing of the interferometer signals and their conversion into the corresponding displacements. As a result, relative displacements between 0 and 500 mm were measured with a resolution better than 1 nm. Advantages and disadvantages of the practical use of the compact semiconductor digital interferometer in seismometers for the measurement of shifts were shown.
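As a hypothetical illustration of the quadrature-signal processing described (the wavelength and the signals are simulated assumptions, not data from the actual instrument), the in-phase and quadrature channels give an optical phase via atan2; unwrapping that phase and scaling by λ/4π converts it to mirror displacement:

```python
import math

# Hedged sketch of Michelson quadrature processing: phase = atan2(Q, I),
# unwrapped sample-to-sample, then scaled to displacement. The wavelength
# is an assumed value for a generic semiconductor laser.

WAVELENGTH_NM = 650.0  # assumed laser wavelength

def phase_to_displacement(i_samples, q_samples):
    """Unwrap atan2(Q, I) and convert the phase to displacement in nm."""
    phases = [math.atan2(q, i) for i, q in zip(i_samples, q_samples)]
    unwrapped = [phases[0]]
    for p in phases[1:]:
        d = p - unwrapped[-1]
        # Remove 2*pi jumps between consecutive samples.
        d -= 2 * math.pi * round(d / (2 * math.pi))
        unwrapped.append(unwrapped[-1] + d)
    scale = WAVELENGTH_NM / (4 * math.pi)  # nm of mirror travel per radian
    return [(p - unwrapped[0]) * scale for p in unwrapped]

# Simulate a mirror moving 200 nm in 1 nm steps and recover the motion.
true_disp = [float(k) for k in range(201)]
i_sig = [math.cos(4 * math.pi * d / WAVELENGTH_NM) for d in true_disp]
q_sig = [math.sin(4 * math.pi * d / WAVELENGTH_NM) for d in true_disp]
disp = phase_to_displacement(i_sig, q_sig)
```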

  17. Designing an architectural style for Pervasive Healthcare systems.

    PubMed

    Rafe, Vahid; Hajvali, Masoumeh

    2013-04-01

    Nowadays, Pervasive Healthcare (PH) systems are considered an important research area. These systems have a dynamic structure and configuration; therefore, an appropriate method for designing such systems is necessary. The Publish/Subscribe architecture (pub/sub) is one of the convenient architectures to support such systems. PH systems are safety critical; hence, errors can bring disastrous results. To prevent such problems, a powerful analytical tool is required, so using a proper formal language such as graph transformation systems for developing these systems seems necessary. But even when software engineers use such high-level methodologies, errors may occur in the system under design. Hence, whether the model of the system satisfies all its requirements should be investigated automatically and formally. In this paper, a dynamic architectural style for developing PH systems is presented. Then, the behavior of these systems is modeled and evaluated using the GROOVE toolset. The results of the analysis show its high reliability.

  18. Invited article: Dielectric material characterization techniques and designs of high-Q resonators for applications from micro to millimeter-waves frequencies applicable at room and cryogenic temperatures.

    PubMed

    Le Floch, Jean-Michel; Fan, Y; Humbert, Georges; Shan, Qingxiao; Férachou, Denis; Bara-Maillet, Romain; Aubourg, Michel; Hartnett, John G; Madrangeas, Valerie; Cros, Dominique; Blondy, Jean-Marc; Krupka, Jerzy; Tobar, Michael E

    2014-03-01

    Dielectric resonators are key elements in many applications in micro to millimeter wave circuits, including ultra-narrow band filters and frequency-determining components for precision frequency synthesis. Distributed-layered and bulk low-loss crystalline and polycrystalline dielectric structures have become very important for building these devices. Proper design requires careful electromagnetic characterization of low-loss material properties. This includes exact simulation with precision numerical software and precise measurements of resonant modes. For example, we have developed the Whispering Gallery mode technique for microwave applications, which has now become the standard for characterizing low-loss structures. This paper will give some of the most common characterization techniques used in the micro to millimeter wave regime at room and cryogenic temperatures for designing high-Q dielectric loaded cavities.

  19. Ada Structure Design Language (ASDL)

    NASA Technical Reports Server (NTRS)

    Chedrawi, Lutfi

    1986-01-01

    An artist acquires all the necessary tools before painting a scene. By the same analogy, software engineers need the necessary tools to provide their designs with the proper means for implementation, and Ada provides these tools. Yet, as an artist's painting needs a brochure to accompany it for further explanation of the scene, an Ada design also needs a document along with it to show the design in its detailed structure and hierarchical order. Ada can be self-explanatory in small programs not exceeding fifty lines of code. But in a large environment, ranging from thousands of lines upward, Ada programs need to be well documented to be preserved and maintained. The language used to specify an Ada document is called the Ada Structure Design Language (ASDL). This language sets rules that help derive a well-formatted Ada detailed design document. The rules are defined to meet the needs of a project manager, a maintenance team, a programmer, and a systems designer. The design document templates, the document extractor, and the rules set forth by the ASDL are explained in detail.

  20. Design and Optimization of a Telemetric system for appliance in earthquake prediction

    NASA Astrophysics Data System (ADS)

    Bogdos, G.; Tassoulas, E.; Vereses, A.; Papapanagiotou, A.; Filippi, K.; Koulouras, G.; Nomicos, C.

    2009-04-01

    This project's aim is to design a telemetric system that collects data from a digitizer, transforms it into an appropriate form, and transfers it to a central system where on-line data elaboration takes place. On-line mathematical elaboration (fractal analysis) of pre-seismic electromagnetic signals and instant display may lead to safe earthquake prediction methodologies. Ad-hoc connections and heterogeneous topologies form the core network, while wired and wireless means cooperate for accurate and on-time transmission. The data are considered very sensitive, so transmission needs to be instant. All stations are situated in rural places in order to prevent electromagnetic interference; this imposes continuous monitoring and provision of backup data links. The central stations collect the data from every station and allocate them properly in a predefined database. Special software is designed to elaborate the incoming data mathematically and export it graphically. The development work included digitizer design, workstation software design, transmission protocol study and simulation on OPNET, database programming, mathematical data elaboration, and software development for graphical representation. The whole package was tested under laboratory conditions and then in real conditions. The project is of great interest to the scientific community should this platform eventually be implemented and installed at large scale in the Greek countryside. The platform is designed in such a way that data mining techniques and mathematical elaboration are possible and any extension can be adapted. The main specialization of this project is that these mechanisms and mathematical transformations can be applied to live data, which supports rapid interpretation of the real meaning of the measured and stored data.
The primary intention of this study is to ease the analysis process while encouraging the scientific community to pay attention to seismic activity in Greece by watching it on-line.
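The abstract does not name the fractal-analysis algorithm; one estimator commonly applied to such time series is Higuchi's fractal dimension, sketched below as an assumption-laden illustration on a synthetic signal (this is a generic method, not necessarily the one used in the project):

```python
import math

# Higuchi's method: estimate the fractal dimension of a 1-D signal from the
# scaling of normalized curve length L(k) with time scale k. A smooth ramp
# should come out near dimension 1; noisier signals approach 2.

def higuchi_fd(x, kmax=8):
    """Estimate the fractal dimension of a 1-D signal (Higuchi's method)."""
    n = len(x)
    log_k, log_l = [], []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            n_mk = (n - m - 1) // k
            if n_mk < 1:
                continue
            total = sum(abs(x[m + i * k] - x[m + (i - 1) * k])
                        for i in range(1, n_mk + 1))
            # Normalized curve length for this offset m and scale k.
            lengths.append(total * (n - 1) / (n_mk * k * k))
        log_k.append(math.log(1.0 / k))
        log_l.append(math.log(sum(lengths) / len(lengths)))
    # Least-squares slope of log L(k) against log(1/k) is the dimension.
    mk = len(log_k)
    sx, sy = sum(log_k), sum(log_l)
    sxx = sum(v * v for v in log_k)
    sxy = sum(a * b for a, b in zip(log_k, log_l))
    return (mk * sxy - sx * sy) / (mk * sxx - sx * sx)

# Synthetic test signal: a linear ramp (dimension ~1).
fd_line = higuchi_fd([float(i) for i in range(500)])
```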

  1. Desktop Parallax and Proper Motion: A Laboratory Exercise on Astrometry of Asteroids from Project CLEA

    NASA Astrophysics Data System (ADS)

    Marschall, L. A.; Snyder, G. A.; Good, R. F.; Hayden, M. B.; Cooper, P. R.

    1998-12-01

    Students in introductory and advanced astronomy classes can now experience the process of discovering asteroids, can measure proper motions, and can actually see the parallax of real astronomical objects on the screen, using a new set of computer-based exercises from Project CLEA. The heart of the exercise is a sophisticated astrometry program "Astrometry of Asteroids", which is a restricted version of CLEA's research software "Tools for Astrometry" described elsewhere at this meeting. The program, as used in the teaching lab, allows students to read and display digital images, co-align pairs of images using designated reference stars, blink and identify moving objects on the pairs, compare images with charts produced from the HST Guide Star Catalog (GSC), and fit equatorial coordinates to the images using designated reference stars from the GSC. Complete technical manuals for the exercise are provided for the use of the instructor, and a set of digital images, in FITS format, is included for the exercise. A student manual is provided for an exercise in which students go through the step-by-step process of determining the tangential velocity of an asteroid. Students first examine a series of images of a near-earth asteroid taken over several hours, blinking pairs to identify the moving object. They next measure the equatorial coordinates on a half-dozen images, and from this calculate an angular velocity of the object. Finally, using a pair of images of the asteroid taken simultaneously at the National Undergraduate Research Observatory (NURO) and at Colgate University, they measure the parallax of the asteroid, and thus its distance, which enables them to convert the angular velocity into a tangential velocity. An optional set of 10 pairs of images is provided, some of which contain asteroids, so that students can try to "find the asteroid" for themselves. 
The software is extremely flexible, and though materials are provided for a self-contained exercise, teachers can adapt the material to a wide variety of uses. The software and manuals are currently available on the Web. Project CLEA is supported by grants from Gettysburg College and the National Science Foundation.
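The final step of the exercise can be sketched numerically; all values below are invented stand-ins for student measurements, not data from the actual lab:

```python
import math

# Worked sketch of the parallax-to-tangential-velocity calculation: two
# simultaneous images from sites a known baseline apart give the asteroid's
# parallax, hence its distance; combined with the measured angular velocity
# this yields a tangential velocity. All numbers are illustrative.

BASELINE_KM = 300.0      # assumed effective NURO-Colgate baseline
parallax_arcsec = 30.0   # assumed measured parallactic shift

# Small-angle formula: distance = baseline / parallax (in radians).
parallax_rad = math.radians(parallax_arcsec / 3600.0)
distance_km = BASELINE_KM / parallax_rad

angular_velocity_arcsec_s = 0.05  # assumed from the blinked image pairs
omega_rad_s = math.radians(angular_velocity_arcsec_s / 3600.0)

# Tangential velocity = distance * angular velocity.
tangential_velocity_km_s = distance_km * omega_rad_s
```

Note that the unit conversions cancel: v = baseline × (angular velocity / parallax), so these assumed numbers give 0.5 km/s.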

  2. Blowout Monitor

    NASA Technical Reports Server (NTRS)

    1994-01-01

    C Language Integrated Production System (CLIPS), a NASA-developed software shell for developing expert systems, has been embedded in a PC-based expert system for training oil rig personnel in monitoring oil drilling. If oil drilling rigs are not properly monitored for possible blowouts, human life, property, and the environment are at risk. CLIPS is designed to permit the delivery of artificial intelligence on personal computers. A collection of rules is set up and, as facts become known, these rules are applied. In the Well Site Advisor, CLIPS provides the capability to accurately process, predict, and interpret well data in real time. CLIPS was provided to INTEQ by COSMIC.
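A minimal sketch of the rule-application pattern described ("a collection of rules is set up and, as facts become known, these rules are applied"); the rule and fact names are invented for illustration and are not from the actual Well Site Advisor:

```python
# Forward chaining in miniature: each rule is (set of conditions, conclusion).
# When all of a rule's conditions are known facts, its conclusion is asserted
# as a new fact, which may in turn trigger further rules.

rules = [
    ({"mud_return_increasing", "pit_volume_rising"}, "possible_kick"),
    ({"possible_kick", "pump_pressure_dropping"}, "advise_shut_in"),
]

def forward_chain(facts, rules):
    """Repeatedly fire any rule whose conditions are all satisfied."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

facts = forward_chain(
    {"mud_return_increasing", "pit_volume_rising", "pump_pressure_dropping"},
    rules,
)
```

In real CLIPS, the same idea is expressed with `defrule` and `assert`, and the Rete algorithm makes the matching efficient.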

  3. Traditional versus rule-based programming techniques: Application to the control of optional flight information

    NASA Technical Reports Server (NTRS)

    Ricks, Wendell R.; Abbott, Kathy H.

    1987-01-01

    To the software design community, the costs associated with a program's execution time and implementation are a great concern. It is always desirable, and sometimes imperative, that a programming technique be chosen that minimizes all costs for a given application or type of application. A study is described that compared cost-related factors of traditional programming techniques with those of rule-based programming techniques for a specific application. The results of this study favored the traditional approach regarding execution efficiency, but favored the rule-based approach regarding programmer productivity (implementation ease). Although this study examined a specific application, the results should be widely applicable.

  4. Design and implementation of projects with Xilinx Zynq FPGA: a practical case

    NASA Astrophysics Data System (ADS)

    Travaglini, R.; D'Antone, I.; Meneghini, S.; Rignanese, L.; Zuffa, M.

    The main advantage of using FPGAs with embedded processors is the availability of several additional high-performance resources in the same physical device; moreover, FPGA programmability allows designers to connect custom peripherals. Xilinx has designed a programmable device named Zynq-7000 (simply called Zynq in the following), which integrates programmable logic (identical to that of the other Xilinx series 7 devices) with a System on Chip (SoC) based on two embedded ARM processors. Since the two parts are deeply connected, designers benefit from the performance of a hardware SoC and from programmability as well. In this paper, a design developed by the Electronic Design Department at the Bologna Division of INFN is presented as a practical case of a project based on the Zynq device. It was developed using a commercial board called the ZedBoard, hosting an FMC mezzanine with a 12-bit 500 MS/s ADC. The Zynq FPGA on the ZedBoard receives digital outputs from the ADC and sends them to the acquisition PC, after proper formatting, through a Gigabit Ethernet link. The major focus of the paper is the methodology for developing a Zynq-based design with the Xilinx Vivado software, showing how to configure the SoC and connect it to the programmable logic. Firmware design techniques are presented; in particular, both VHDL and IP-core-based strategies are discussed. Further, the procedure for developing software for the embedded processor is presented. Finally, some debugging tools, such as the embedded logic analyzer, are shown. Advantages and disadvantages with respect to FPGAs without embedded processors are discussed.

  5. Final Report: CNC Micromachines LDRD No.10793

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JOKIEL JR., BERNHARD; BENAVIDES, GILBERT L.; BIEG, LOTHAR F.

    2003-04-01

    The three-year LDRD ''CNC Micromachines'' was successfully completed at the end of FY02. The project had four major breakthroughs in spatial motion control in MEMS: (1) A unified method for designing scalable planar and spatial on-chip motion control systems was developed. The method relies on the use of parallel kinematic mechanisms (PKMs) that, when properly designed, provide different types of motion on-chip without the need for post-fabrication assembly. (2) A new type of actuator was developed, the linear stepping track drive (LSTD), which provides open-loop linear position control that is scalable in displacement, output force, and step size. Several versions of this actuator were designed, fabricated, and successfully tested. (3) Different versions of XYZ translation-only and PTT motion stages were designed, successfully fabricated, and successfully tested, demonstrating absolutely that on-chip spatial motion control systems are not only possible but a reality. (4) Control algorithms, software, and infrastructure based on MATLAB were created and successfully implemented to drive the XYZ and PTT motion platforms in a controlled manner. The control software is capable of reading an M/G-code machine tool language file, decoding the instructions, and correctly calculating and applying position and velocity trajectories to the motion device's linear drive inputs to position the device platform along the trajectory specified by the input file. A full and detailed account of design methodology, theory, and experimental results (failures and successes) is provided.

  6. Developing a Pedagogical-Technical Framework to Improve Creative Writing

    ERIC Educational Resources Information Center

    Chong, Stefanie Xinyi; Lee, Chien-Sing

    2012-01-01

    There is much evidence of motivational and educational benefits from the use of learning software. However, there is a lack of research on the teaching of creative writing. This paper aims to bridge the following gaps: first, the need for a proper framework for scaffolding creative writing through learning software; second, the lack of…

  7. Concept of a Cloud Service for Data Preparation and Computational Control on Custom HPC Systems in Application to Molecular Dynamics

    NASA Astrophysics Data System (ADS)

    Puzyrkov, Dmitry; Polyakov, Sergey; Podryga, Viktoriia; Markizov, Sergey

    2018-02-01

    At the present stage of computer technology development, it is possible to study the properties and processes of complex systems at the molecular and even atomic level, for example, by means of molecular dynamics methods. The most interesting problems are related to the study of complex processes under real physical conditions. Solving such problems requires the use of high-performance computing systems of various types, for example, GRID systems and HPC clusters. Given the time-consuming computational tasks, software for automatic and unified monitoring of such computations is needed. A complex computational task can be performed over different HPC systems; this requires output data synchronization between the storage chosen by a scientist and the HPC system used for computations. The design of the computational domain is also challenging, requiring complex software tools and algorithms for proper atomistic data generation on HPC systems. The paper describes a prototype of a cloud service intended for the design of large-volume atomistic systems for further detailed molecular dynamics calculations, and for the management of those calculations, and presents the part of its concept aimed at initial data generation on HPC systems.
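A toy illustration of the atomistic data generation step (the lattice type and spacing are assumptions for illustration, not details of the described service): a generator that lays out atom coordinates on a simple cubic lattice of a given size is the minimal version of what such a design tool produces at scale:

```python
# Minimal sketch: generate atom positions for a simple cubic lattice.
# Lattice constant and dimensions are arbitrary illustrative values.

def cubic_lattice(nx, ny, nz, a):
    """Return (x, y, z) coordinates for an nx * ny * nz simple cubic lattice."""
    return [(i * a, j * a, k * a)
            for i in range(nx)
            for j in range(ny)
            for k in range(nz)]

# A 10 x 10 x 10 cell with an assumed 3.615-angstrom lattice constant.
atoms = cubic_lattice(10, 10, 10, 3.615)
```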

  8. Telemedicine for Developing Countries. A Survey and Some Design Issues.

    PubMed

    Combi, Carlo; Pozzani, Gabriele; Pozzi, Giuseppe

    2016-11-02

    Developing countries need telemedicine applications that help in many situations: when physicians are few relative to the population, when specialized physicians are not available, and when patients and physicians in rural villages need assistance in the delivery of health care. Moreover, the requirements of telemedicine applications for developing countries are somewhat more demanding than for developed countries. Indeed, further social, organizational, and technical aspects need to be considered for successful telemedicine applications in developing countries. We consider all the major telemedicine projects devoted to developing countries, as described in the scientific literature. On the basis of that literature, we define a specific taxonomy that allows a proper classification and a fast overview of telemedicine projects in developing countries. Moreover, considering both the literature and some recent direct experiences, we complete the overview by discussing some design issues to be taken into consideration when developing telemedicine software systems. We reviewed the major conferences and journals in depth and looked for reports on telemedicine projects. We provide the reader with a survey of the main projects and systems, from which we derived a taxonomy of features of telemedicine systems for developing countries. We also propose and discuss some classification criteria for design issues, based on the lessons learned in this research area. We highlight some challenges and recommendations to be considered when designing a telemedicine system for developing countries.

  9. Investigation of structure in the modular light pipe component for LED automotive lamp

    NASA Astrophysics Data System (ADS)

    Chen, Hsi-Chao; Zhou, Yang; Huang, Chien-Sheng; Jhong, Wan-Ling; Cheng, Bo-Wei; Jhang, Jhe-Ming

    2014-09-01

    Light-emitting diodes (LEDs) have the advantages of small size, long lifetime, fast response time (μs), low voltage, good mechanical properties, and environmental friendliness. Furthermore, LEDs can replace halogen lamps, avoiding mercury pollution and saving energy. Therefore, LEDs could replace traditional lamps in the future and become an important light source. The purpose of this study was to investigate the effects of the structure and length of the reflector component of an LED automotive lamp. The novel LED automotive lamp was assembled from several different modular columnar components. Optimized designs with different structures and lengths of the reflector were simulated with the TracePro software. The design had to meet vehicle regulations of the United Nations Economic Commission for Europe (UNECE), such as ECE-R19. The light pipe could be designed as a two-step structure; a proper structure was then constituted and LEDs of different power chosen to meet the luminous intensity requirements of the vehicle regulation. The simulation results show that the proper structure and length yield the best total luminous flux and a high luminous efficiency for the system, and the stray light meets the vehicle regulation ECE R19. Finally, the experimental results for the selected structure and length of the light pipe matched the simulation results to better than 80%.

  10. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return, with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  11. A Decision Support System for Planning, Control and Auditing of DoD Software Cost Estimation.

    DTIC Science & Technology

    1986-03-01

    is frequently used in U.S. Air Force software cost estimates. Barry Boehm's Constructive Cost Model (COCOMO) was recently selected for use...are considered basic to the proper development of software. Pressman [Ref. 11] addresses these basic elements in a manner which attempts to integrate...H., Jr., and Carlson, Eric D., Building Effective Decision Support Systems, Prentice-Hall, Englewood Cliffs, NJ, 1982. 11. Pressman, Roger S., A Practitioner's A...

  12. KCNSC Automated RAIL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Branson, Donald

    The KCNSC Automated RAIL (Rolling Action Item List) system provides an electronic platform to manage and escalate rolling action items within a business and manufacturing environment at Honeywell. The software enables a tiered approach to issue management, in which issues are escalated up a management chain based on team input and comparison against business metrics. The software manages action items at different levels of the organization and allows all users to discuss action items concurrently. In addition, the software drives accountability through timely emails and proper visibility during team meetings.
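A hypothetical sketch of the tiered escalation rule described (the tier thresholds and logic below are invented; the actual KCNSC system's rules are not given in the abstract):

```python
import datetime

# Invented tiered escalation: an open action item rises one management tier
# each time its age exceeds a threshold. Thresholds are assumed values.

TIER_THRESHOLD_DAYS = [7, 14, 30]  # tier 1 -> 2 -> 3 -> 4 escalation points

def escalation_tier(opened, today):
    """Return the management tier an open action item should sit at."""
    age = (today - opened).days
    tier = 1
    for threshold in TIER_THRESHOLD_DAYS:
        if age > threshold:
            tier += 1
    return tier

# An item open for 30 days has passed the 7- and 14-day thresholds.
today = datetime.date(2024, 5, 1)
tier = escalation_tier(datetime.date(2024, 4, 1), today)
```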

  13. Managing quality and compliance.

    PubMed

    McNeil, Alice; Koppel, Carl

    2015-01-01

    Critical care nurses assume vital roles in maintaining patient care quality. There are distinct facets to the process including standard setting, regulatory compliance, and completion of reports associated with these endeavors. Typically, multiple niche software applications are required and user interfaces are varied and complex. Although there are distinct quality indicators that must be tracked as well as a list of serious or sentinel events that must be documented and reported, nurses may not know the precise steps to ensure that information is properly documented and actually reaches the proper authorities for further investigation and follow-up actions. Technology advances have permitted the evolution of a singular software platform, capable of monitoring quality indicators and managing all facets of reporting associated with regulatory compliance.

  14. GeolOkit 1.0: a new Open Source, Cross-Platform software for geological data visualization in Google Earth environment

    NASA Astrophysics Data System (ADS)

    Triantafyllou, Antoine; Bastin, Christophe; Watlet, Arnaud

    2016-04-01

    GIS software suites are today's essential tools to gather and visualise geological data, to apply spatial and temporal analysis and, ultimately, to create and share interactive maps for further geoscience investigations. For these purposes, we developed GeolOkit: an open-source, freeware, lightweight software tool written in Python, a high-level, cross-platform programming language. GeolOkit is accessible through a graphical user interface designed to run in parallel with Google Earth. It is a highly user-friendly toolbox that allows geo-users to import their raw data (e.g. GPS tracks, sample locations, structural data, field pictures, maps), to use fast data analysis tools, and to plot the results in the Google Earth environment using KML code. This workflow requires no third-party software except Google Earth itself. GeolOkit comes with a large number of geoscience labels, symbols, colours and placemarks, and can process: (i) multi-point data, (ii) contours via several interpolation methods, (iii) discrete planar and linear structural data in 2D or 3D, supporting a wide range of structural input formats, (iv) clustered stereonets and rose diagrams, (v) drawn cross-sections as vertical sections, (vi) georeferenced maps and vectors, and (vii) field pictures, using either geotagging metadata from a camera's built-in GPS module or the same-day track of an external GPS. We invite you to discover all the functionalities of the GeolOkit software. As the project is under development, we welcome discussion of your needs, ideas and contributions to the GeolOkit project.
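As a hedged sketch of the KML-generation idea behind such a tool (this is a minimal illustration, not GeolOkit's actual code; names and coordinates are invented), sample locations can be turned into placemarks that Google Earth loads directly:

```python
# Minimal KML generation: convert (name, lon, lat) sample locations into a
# KML document. Note KML orders coordinates as lon,lat,alt.

def to_kml(placemarks):
    """Build a minimal KML document from (name, lon, lat) tuples."""
    body = "\n".join(
        "    <Placemark>\n"
        f"      <name>{name}</name>\n"
        f"      <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
        "    </Placemark>"
        for name, lon, lat in placemarks
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        "  <Document>\n" + body + "\n  </Document>\n</kml>\n"
    )

# Two invented sample locations.
kml = to_kml([("Sample A", 4.40, 50.45), ("Sample B", 4.41, 50.46)])
```

Writing this string to a `.kml` file and opening it in Google Earth displays the two points, which is the core of the "no third-party software" workflow the abstract describes.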

  15. Flight control system design factors for applying automated testing techniques

    NASA Technical Reports Server (NTRS)

    Sitz, Joel R.; Vernon, Todd H.

    1990-01-01

    The principal design features and operational experiences of the X-29 forward-swept-wing aircraft and the F-18 high alpha research vehicle (HARV) automated test systems are discussed. Operational experience in developing and using these automated testing techniques has highlighted the need to incorporate target system features that improve testability. Improved target system testability can be accomplished with the addition of non-real-time and real-time features. Online access to target system implementation details, unobtrusive real-time access to internal user-selectable variables, and proper software instrumentation are all desirable features of the target system. Test system and target system design issues must also be addressed during the early stages of target system development. Processing speeds of up to 20 million instructions/s and the development of high-bandwidth reflective memory systems have improved the ability to integrate the target system and test system for the application of automated testing techniques. It is concluded that new methods of designing testability into the target systems are required.

  16. Automated a complex computer aided design concept generated using macros programming

    NASA Astrophysics Data System (ADS)

    Rizal Ramly, Mohammad; Asrokin, Azharrudin; Abd Rahman, Safura; Zulkifly, Nurul Ain Md

    2013-12-01

    Changing a complex Computer Aided Design profile such as a car or aircraft surface has always been difficult and challenging. The capability of CAD software such as AutoCAD and CATIA shows that a simple CAD design configuration can be easily modified without hassle, but that is not the case for complex design configurations. Design changes help users to test and explore various configurations of the design concept before a model is produced. The purpose of this study is to look into macro programming as a parametric method for commercial aircraft design. Macro programming is a method in which the design configuration is changed by recording a script of commands, editing the data values and adding new command lines to create an element of parametric design. The steps and procedure for creating a macro program are discussed, along with some difficulties encountered during the process and the advantages of its usage. Generally, the advantages of macro programming as a parametric design method are: allowing flexibility for design exploration, increasing the usability of the design solution, keeping the proper elements contained by the model while restricting others, and real-time feedback on changes.
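    The record-then-edit idea described above can be illustrated in a few lines of Python: a recorded command script becomes a template whose data values are substituted to regenerate design variants. The command names below are invented for illustration and are not a real CATIA or AutoCAD macro API:

    ```python
    # Parametric-macro sketch: a "recorded" CAD command script is
    # treated as a template, and editing the data values regenerates
    # a new design variant. Command names are illustrative only.

    RECORDED_MACRO = """\
    new_profile chord={chord} span={span}
    loft sections={sections}
    fillet radius={fillet}
    """

    def generate_macro(**params):
        """Substitute parameter values into the recorded command script."""
        return RECORDED_MACRO.format(**params)

    variant_a = generate_macro(chord=1.2, span=8.0, sections=12, fillet=0.05)
    variant_b = generate_macro(chord=1.5, span=8.0, sections=12, fillet=0.05)
    ```

    Replaying each generated script in the CAD system would yield the two geometry variants, which is the exploration loop the abstract describes.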

  17. Rules of thumb to increase the software quality through testing

    NASA Astrophysics Data System (ADS)

    Buttu, M.; Bartolini, M.; Migoni, C.; Orlati, A.; Poppi, S.; Righini, S.

    2016-07-01

    Software maintenance typically accounts for 40-80% of overall project costs, and this considerable variability mostly depends on the software's internal quality: the more the software is designed and implemented to constantly welcome new changes, the lower the maintenance costs will be. Internal quality is typically enforced through testing, which in turn also affects development and maintenance costs. This is the reason why testing methodologies have become a major concern for any company that builds - or is involved in building - software. Although there is no testing approach that suits all contexts, we infer some general guidelines learned during the development of the Italian Single-dish COntrol System (DISCOS), a project aimed at producing the control software for the three INAF radio telescopes (the Medicina and Noto dishes, and the newly-built SRT). These guidelines concern both the development and the maintenance phases, and their ultimate goal is to maximize the DISCOS software quality through a Behavior-Driven Development (BDD) workflow alongside a continuous delivery pipeline. We consider different topics and patterns: the proper apportionment of tests (from end-to-end to low-level tests), the choice between hardware simulators and mocks, why and how to apply TDD and dependency injection to increase test coverage, the emerging technologies available for test isolation, bug fixing, how to protect the system from changes in external resources (firmware updates, hardware substitution, etc.) and, eventually, how to accomplish BDD starting from functional tests and going through integration and unit tests. We discuss the pros and cons of each solution and point out the motivations for our choices, either as general rules or narrowed to the context of the DISCOS project.
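    Two of the patterns mentioned above, dependency injection and mocks in place of hardware, can be sketched as follows. The `AntennaMount` class and its servo interface are invented for illustration and are not part of the DISCOS codebase:

    ```python
    # Dependency injection + mocked hardware, sketched with the
    # standard library. The device class is a made-up example.
    import unittest
    from unittest.mock import Mock

    class AntennaMount:
        """Invented example: a device that depends on an injected servo."""
        def __init__(self, servo):
            self.servo = servo        # injected, so tests can pass a mock

        def slew_to(self, azimuth):
            if not 0 <= azimuth < 360:
                raise ValueError("azimuth out of range")
            self.servo.move(azimuth)

    class TestAntennaMount(unittest.TestCase):
        def test_slew_commands_the_servo(self):
            servo = Mock()            # stands in for the real hardware
            AntennaMount(servo).slew_to(180)
            servo.move.assert_called_once_with(180)

        def test_invalid_azimuth_is_rejected(self):
            with self.assertRaises(ValueError):
                AntennaMount(Mock()).slew_to(400)

    suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestAntennaMount)
    result = unittest.TextTestRunner(verbosity=0).run(suite)
    ```

    Because the servo is injected rather than constructed inside the class, the same production code runs unmodified against the real driver, a hardware simulator, or a mock.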

  18. Use of Soft Computing Technologies for a Qualitative and Reliable Engine Control System for Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)

    2001-01-01

    The problem addressed in this paper is how Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance through the development of a qualitative and reliable engine control system (QRECS). Specifically, this is addressed by enhancing rocket engine control using SCT, innovative data mining tools, and the sound software engineering practices used in Marshall's Flight Software Group (FSG). The principal goals for addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks), some of which are briefed in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC), which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for designing, developing, implementing, and testing a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations. The overall intent is to demonstrate that by employing soft computing technologies, the quality and reliability of the overall approach to engine controller development is further improved and vehicle safety is further ensured. The final product that this paper proposes is an approach to the development of an alternative low-cost engine controller capable of performing in unique vision spacecraft requiring low-cost advanced avionics architectures for autonomous operations from engine pre-start to engine shutdown.
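    As a toy illustration of one of the soft computing technologies named above, the sketch below implements a three-rule fuzzy controller with triangular membership functions and a weighted-average defuzzifier. The rule set and throttle-delta outputs are invented, not the QRECS design:

    ```python
    # Toy fuzzy-logic controller sketch (illustrative only, not QRECS):
    # three triangular membership functions over a normalized pressure
    # error, defuzzified by a weighted average of the rule outputs.

    def tri(x, a, b, c):
        """Triangular membership: rises from a, peaks at b, falls to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def throttle_command(error):
        """Map a normalized chamber-pressure error in [-1, 1] to a
        throttle delta using three fuzzy rules."""
        rules = [
            (tri(error, -1.5, -1.0, 0.0), -0.2),  # negative error: throttle down
            (tri(error, -1.0,  0.0, 1.0),  0.0),  # near-zero error: hold
            (tri(error,  0.0,  1.0, 1.5),  0.2),  # positive error: throttle up
        ]
        num = sum(w * out for w, out in rules)
        den = sum(w for w, _ in rules)
        return num / den if den else 0.0
    ```

    Between the rule centers the output blends smoothly, which is the appeal of fuzzy control for the power-level transitions mentioned above.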

  19. NASA X-34 Technology in Motion

    NASA Technical Reports Server (NTRS)

    Beech, Geoffrey; Chandler, Kristie

    1997-01-01

    The X-34 technology development program is a joint industry/government project to develop, test, and operate a small, fully-reusable hypersonic flight vehicle. The objective is to demonstrate key technologies and operating concepts applicable to future reusable launch vehicles. Integrated in the vehicle are various systems to assure successful completion of mission objectives, including the Main Propulsion System (MPS). NASA-Marshall Space Flight Center (MSFC) is responsible for developing the X-34's MPS, including the design and complete build package for the propulsion system components. The X-34 will be powered by the Fastrac engine, which is currently in design and development at NASA-MSFC. Fastrac is a single-stage main engine that burns a mixture of liquid oxygen (LOX) and kerosene (RP-1). The interface between the MPS and the Fastrac engine is critical for proper system operation and for technologies applicable to future reusable launch vehicles. Deneb's IGRIP software package with the dynamic analysis option provided a key tool for conducting studies critical to this interface, as well as a mechanism to drive the design of the LOX and RP-1 feedlines. Kinematic models were created of the Fastrac engine and the feedlines for various design concepts. Based on kinematic simulation within Envision, design and joint limits were verified and system interference was controlled. It was also critical to the program to evaluate the effect of dynamic loads visually, providing a verification tool for dynamic analysis and, in some cases, uncovering areas that had not been considered. Deneb's software put the X-34 technology in motion and has been a key factor in meeting the strenuous design schedule.

  20. Emissivity of Rocket Plume Particulates

    DTIC Science & Technology

    1992-09-01

    V. EXPERIMENTAL RESULTS … 29. VI. CONCLUSIONS AND RECOMMENDATIONS … 32. APPENDIX A. CATS-E SOFTWARE…interfaced through the CATS-E Thermal Analysis software, which is MS-DOS based and can be run on any 286 or higher CPU. This system allows real-time…body source to establish the parameters required by the CATS program for proper microscope/scanner interface. A complete description of microscope

  1. Project Management Software: Proper Selection for Use Within Air Force Systems Command

    DTIC Science & Technology

    1989-09-01

    List of Figures … vi. List of Tables … vii. Abstract … viii. … List of Tables: Table 1, Stage 4 Evaluation Criteria and Weights … 53; Table 2, Price Ranges and Associated Grades … 65; … 11. Who-What-When, Chronos Software, Inc., San Francisco, CA, V1.09, $190, (800) 777-7907 … 12. Timepiece, Communication Dynamics, Inc., Portland, OR, V1.3

  2. Assessing Your Assets: Systems for Tracking and Managing IT Assets Can Save Time and Dollars

    ERIC Educational Resources Information Center

    Holub, Patricia A.

    2007-01-01

    The average school district loses more than $80,000 per year because of lost or damaged IT assets, according to a QED survey cosponsored by Follett Software Company. And many districts--59 percent--still use manual systems to track assets. Enter asset management systems. Software for managing assets, when implemented properly, can save time,…

  3. Implementation of Motion Simulation Software and Visual-Auditory Electronics for Use in a Low Gravity Robotic Testbed

    NASA Technical Reports Server (NTRS)

    Martin, William Campbell

    2011-01-01

    The Jet Propulsion Laboratory (JPL) is developing the All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) to assist in manned space missions. One of the proposed targets for this robotic vehicle is a near-Earth asteroid (NEA); such bodies typically exhibit a surface gravity of only a few micro-g. In order to properly test ATHLETE in such an environment, the development team has constructed an inverted Stewart platform testbed that acts as a robotic motion simulator. This project focused on creating physical simulation software that is able to predict how ATHLETE will function on and around a NEA. The corresponding platform configurations are calculated and then passed to the testbed to control ATHLETE's motion. In addition, imitation attitude control thrusters were designed and fabricated for use on ATHLETE. These utilize a combination of high-power LEDs and audio amplifiers to provide visual and auditory cues that correspond to the physics simulation.
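    The inverse kinematics that turn a commanded pose into Stewart-platform leg lengths are closed-form: each length is the distance from a base anchor to the transformed platform anchor. A minimal sketch with made-up anchor coordinates (not the ATHLETE testbed geometry), restricted to translation plus yaw for brevity:

    ```python
    # Stewart-platform inverse kinematics sketch. Anchor coordinates
    # are invented; a full 6-DOF pose would use a 3x3 rotation matrix.
    import math

    def leg_lengths(base_pts, plat_pts, translation, yaw):
        """Leg lengths for a pose given as translation plus yaw about z."""
        c, s = math.cos(yaw), math.sin(yaw)
        tx, ty, tz = translation
        lengths = []
        for (bx, by, bz), (px, py, pz) in zip(base_pts, plat_pts):
            # rotate the platform anchor, then translate it
            rx, ry, rz = c * px - s * py, s * px + c * py, pz
            lengths.append(math.dist((bx, by, bz), (rx + tx, ry + ty, rz + tz)))
        return lengths

    base = [(1.0, 0.0, 0.0), (-0.5, 0.87, 0.0), (-0.5, -0.87, 0.0)]
    plat = [(0.5, 0.0, 0.0), (-0.25, 0.43, 0.0), (-0.25, -0.43, 0.0)]
    lengths = leg_lengths(base, plat, (0.0, 0.0, 1.0), 0.0)
    ```

    The testbed's controller would solve this for every simulated pose and stream the resulting actuator set-points to the platform, which is the "configurations are calculated and then passed to the testbed" step above.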

  4. Modeling the Dynamics of Task Allocation and Specialization in Honeybee Societies

    NASA Astrophysics Data System (ADS)

    Hoogendoorn, Mark; Schut, Martijn C.; Treur, Jan

    The concept of organization has been studied in sciences such as social science and economics, but recently also in artificial intelligence [Furtado 2005, Giorgini 2004, McCallum 2005]. With the desire to analyze and design more complex systems consisting of larger numbers of agents (e.g., in nature, society, or software), the need arises for a concept at a higher level of abstraction than that of the agent. To this end, organizational modeling is becoming an established stage in the analysis and design of multi-agent systems, taking into consideration the environment of the organization. An environment can have a high degree of variability, which may require organizations to adapt to the environment's dynamics to ensure the continued proper functioning of the organization. Hence, such change processes are a crucial function of the organization and should be part of the organizational model.

  5. Ergonomics in the electronic library.

    PubMed Central

    Thibodeau, P L; Melamut, S J

    1995-01-01

    New technologies are changing the face of information services and how those services are delivered. Libraries spend a great deal of time planning the hardware and software implementations of electronic information services, but the human factors are often overlooked. Computers and electronic tools have changed the nature of many librarians' daily work, creating new problems, including stress, fatigue, and cumulative trauma disorders. Ergonomic issues need to be considered when designing or redesigning facilities for electronic resources and services. Libraries can prevent some of the common problems that appear in the digital workplace by paying attention to basic ergonomic issues when designing workstations and work areas. Proper monitor placement, lighting, workstation setup, and seating prevent many of the common occupational problems associated with computers. Staff training will further reduce the likelihood of ergonomic problems in the electronic workplace. PMID:7581189

  6. Integral equation and discontinuous Galerkin methods for the analysis of light-matter interaction

    NASA Astrophysics Data System (ADS)

    Baczewski, Andrew David

    Light-matter interaction is among the most enduring interests of the physical sciences. The understanding and control of this physics is of paramount importance to the design of myriad technologies, ranging from stained glass, to molecular sensing and characterization techniques, to quantum computers. The development of complex engineered systems that exploit this physics is predicated at least partially upon in silico design and optimization that properly capture the light-matter coupling. In this thesis, the details of computational frameworks that enable this type of analysis, based upon both Integral Equation and Discontinuous Galerkin formulations, will be explored. The primary focus is on the development of efficient and accurate software, with results corroborating both qualities. The secondary focus is on the use of these tools in the analysis of a number of exemplary systems.

  7. Human-computer interaction reflected in the design of user interfaces for general practitioners.

    PubMed

    Stoicu-Tivadar, Lacramioara; Stoicu-Tivadar, Vasile

    2006-01-01

    To address the problem of properly built health information systems in general practice as an important issue for their acceptance and use in clinical practice. We present how a national general practitioner (GP) network was built and put into practice, and several results of its activity seen from the clinicians' and the software application team's points of view. We used a multi-level incremental development appropriate for the conditions of the required information system. After the development of the first version of the software components (based on rapid prototyping) of the sentinel network, a questionnaire addressed the needs and improvements required by the health professionals. Based on the answers, the functionality of the system and the interface were improved to reflect the real needs expressed by the end-users. The network is functional and the collected data are being processed using statistical methods. The academic software team developed a GP application that is well received by the GPs in the network, as shown by the survey and by discussions during the training period. As an added confirmation, several GPs outside the network enrolled after seeing the software at work. A further confirmation came when, after the final presentation of the project results, a representative of the Romanian Society of Cardiology expressed the society's wish to access the data yielded by the network.

  8. Universal mechatronics coordinator

    NASA Astrophysics Data System (ADS)

    Muir, Patrick F.

    1999-11-01

    Mechatronic systems incorporate multiple actuators and sensors which must be properly coordinated to achieve the desired system functionality. Many mechatronic systems are designed as one-of-a-kind custom projects without consideration for facilitating future alterations and extensions to the current system. Thus, subsequent changes to the system are slow, difficult, and costly. It has become apparent that manufacturing processes, and thus the mechatronics which embody them, need to be agile in order to respond more quickly and easily to changing customer demands or market pressures. To achieve agility, both the hardware and software of the system need to be designed such that the creation of new systems and the alteration and extension of current systems are fast and easy. This paper describes the design of a Universal Mechatronics Coordinator (UMC) which facilitates agile setup and changeover of coordination software for mechatronic systems. The UMC is capable of sequencing continuous and discrete actions that are programmed as stimulus-response pairs, as state machines, or as a combination of the two. It facilitates the modular, reusable programming of continuous actions such as servo control algorithms, data collection code, and safety checking routines, and of discrete actions such as reporting achieved states and turning on/off binary devices. The UMC has been applied to the control of a z-theta assembly robot for the Minifactory project and is applicable to a spectrum of widely differing mechatronic systems.
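    The stimulus-response pairing described above can be sketched as a scan loop that fires every response whose stimulus predicate holds. This is an illustration of the concept, not the UMC implementation:

    ```python
    # Stimulus-response coordinator sketch: each pair binds a predicate
    # on system state to a discrete action; every scan cycle fires all
    # pairs whose stimulus currently holds. Names are illustrative.

    class Coordinator:
        def __init__(self):
            self.pairs = []               # list of (stimulus, response)

        def bind(self, stimulus, response):
            self.pairs.append((stimulus, response))

        def scan(self, state):
            """One scan cycle: run every response whose stimulus holds."""
            fired = []
            for stimulus, response in self.pairs:
                if stimulus(state):
                    response(state)
                    fired.append(response.__name__)
            return fired

    log = []
    def open_gripper(state): log.append("open_gripper")
    def report_overtemp(state): log.append("report_overtemp")

    umc = Coordinator()
    umc.bind(lambda s: s["at_target"], open_gripper)
    umc.bind(lambda s: s["temp"] > 80.0, report_overtemp)
    umc.scan({"at_target": True, "temp": 25.0})
    ```

    A state machine layers naturally on top of this: transitions become stimulus-response pairs whose responses update the current state, which is how the two styles combine.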

  9. NDAS Hardware Translation Layer Development

    NASA Technical Reports Server (NTRS)

    Nazaretian, Ryan N.; Holladay, Wendy T.

    2011-01-01

    The NASA Data Acquisition System (NDAS) project aims to replace all DAS software for NASA's rocket testing facilities. A software-hardware translation layer is required so the software can properly talk to the hardware. Since the hardware at each test stand varies, drivers for each stand must be written; these drivers act more like plugins for the software. If the software is used at E3, it should point to the E3 driver package; if it is used at B2, it should point to the B2 driver package. The driver packages should also contain hardware drivers that are universal to the DAS system. For example, since A1, A2, and B2 all use the Preston 8300AU signal conditioners, the driver for those three stands should be the same and updated collectively.
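    The plugin idea reads naturally as a per-stand driver registry. The sketch below echoes the stand and signal-conditioner names from the abstract, but the classes and registry are invented for illustration:

    ```python
    # Per-stand driver registry sketch. Stand names (A1, A2, B2, E3)
    # and the Preston 8300AU come from the abstract; the classes and
    # registry layout are illustrative assumptions.

    class Preston8300AU:
        """Shared signal-conditioner driver, reused verbatim by A1, A2, B2."""
        def read(self, channel):
            return f"preston8300au ch{channel}"

    class StandDriver:
        """A stand's driver package, wrapping its conditioner driver."""
        def __init__(self, conditioner):
            self.conditioner = conditioner

    class E3Driver:
        conditioner = None    # E3 uses different hardware (assumption)

    shared = Preston8300AU()  # one object, maintained collectively
    DRIVERS = {
        "A1": StandDriver(shared),
        "A2": StandDriver(shared),
        "B2": StandDriver(shared),
        "E3": E3Driver(),
    }

    def driver_for(stand):
        """The DAS core asks the registry for the stand's driver package."""
        return DRIVERS[stand]
    ```

    Pointing the software at a stand then reduces to one lookup, and a fix to the shared Preston driver reaches all three stands at once.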

  10. A system for automatic evaluation of simulation software

    NASA Technical Reports Server (NTRS)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the software requirements methodology, each component of the verification system has some element of simulation in it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first simulated digitally. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential to be used effectively in a simulation environment.

  11. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    NASA Astrophysics Data System (ADS)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  12. Mission planning, mission analysis and software formulation. Level C requirements for the shuttle mission control center orbital guidance software

    NASA Technical Reports Server (NTRS)

    Langston, L. J.

    1976-01-01

    The formulation of Level C requirements for guidance software was reported. Requirements for a PEG supervisor which controls all input/output interfaces with other processors and determines which PEG mode is to be utilized were studied in detail. A description of the two guidance modes for which Level C requirements have been formulated was presented. Functions required for proper execution of the guidance software were defined. The requirements for a navigation function that is used in the prediction logic of PEG mode 4 were discussed. It is concluded that this function is extracted from the current navigation FSSR.

  13. Airborne antenna pattern calculations

    NASA Technical Reports Server (NTRS)

    Knerr, T. J.; Mielke, R. R.

    1981-01-01

    Progress on the development of modeling software, the testing of this software against calculated data from program VPAP and measured patterns, and the calculation of roll plane patterns for general aviation aircraft is reported. Major objectives are the continued development of computer software for aircraft modeling and the use of this software and program OSUVOL to calculate principal plane and volumetric radiation patterns. The determination of proper placement of antennas on aircraft to meet the requirements of the Microwave Landing System is discussed. An overview of the work performed, and an example of a roll plane model for the Piper PA-31T Cheyenne aircraft with the resulting calculated roll plane radiation pattern, are included.

  14. "Proximal Sensing" capabilities for snow cover monitoring

    NASA Astrophysics Data System (ADS)

    Valt, Mauro; Salvatori, Rosamaria; Plini, Paolo; Salzano, Roberto; Giusti, Marco; Montagnoli, Mauro; Sigismondi, Daniele; Cagnati, Anselmo

    2013-04-01

    The seasonal snow cover represents one of the most important land cover classes for environmental studies in mountain areas, especially considering its variation over time. Snow cover and its extent play a relevant role in studies of atmospheric dynamics and the evolution of climate. It is also important for the analysis and management of water resources and for the management of tourist activities in mountain areas. Recently, webcam images collected at daily or even hourly intervals have been used to observe snow-covered areas; these images, properly processed, can be considered a very important environmental data source. Images captured by digital cameras become a useful tool at local scale, providing images even when cloud cover makes observation by satellite sensors impossible. When suitably processed, these images can be used for scientific purposes, having good resolution (at least 800x600 at 16 million colours) and a very good sampling frequency (hourly images taken throughout the year). Once stored in databases, these images therefore represent an important source of information for the study of recent climatic changes, for evaluating the available water resources and for analysing the daily surface evolution of the snow cover. The Snow-noSnow software has been specifically designed to automatically detect the extent of snow cover in webcam images with very limited human intervention. The software was tested on images collected in the Alps (ARPAV webcam network) and in the Apennines at a pilot station equipped for this project by CNR-IIA. The results obtained with Snow-noSnow are comparable to those achieved by photo-interpretation, and better than those obtained using the image segmentation routines implemented in commercial image processing software. Additionally, Snow-noSnow operates in a semi-automatic way and has a reduced processing time.
    The analysis of this kind of imagery can usefully support the interpretation of remote sensing images, especially those provided by high-spatial-resolution sensors. Keywords: snow cover monitoring, digital images, software, Alps, Apennines.
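    The abstract does not describe Snow-noSnow's algorithm, but the basic snow/no-snow pixel classification can be illustrated with a plain brightness threshold: snow is bright and nearly achromatic in daylight webcam frames. A toy sketch on a nested-list "image", with invented pixel values:

    ```python
    # Toy snow/no-snow pixel classifier (illustrative threshold only;
    # not the Snow-noSnow algorithm, which the abstract does not give).

    def snow_fraction(rgb_image, threshold=200):
        """Fraction of pixels whose R, G and B all exceed the threshold."""
        snow = total = 0
        for row in rgb_image:
            for r, g, b in row:
                total += 1
                if min(r, g, b) >= threshold:   # bright AND achromatic
                    snow += 1
        return snow / total if total else 0.0

    frame = [
        [(230, 235, 240), (220, 225, 228)],   # snow-covered slope
        [(90, 110, 60), (100, 120, 70)],      # bare ground / vegetation
    ]
    fraction = snow_fraction(frame)           # 0.5 for this toy frame
    ```

    Run hourly over a season of archived frames, even a crude per-pixel rule like this yields a snow-extent time series, the product the abstract attributes to the webcam archive.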

  15. LAMPAT and LAMPATNL User’s Manual

    DTIC Science & Technology

    2012-09-01

    nonlinearity. These tools are implemented as subroutines in the finite element software ABAQUS. This user's manual provides information on the proper…model, either through the General tab of the Edit Job dialog box in Abaqus/CAE or on the command line with user=(subroutine filename). Table 1…Selection of software product and subroutine: Static Analysis with Abaqus/Standard; Dynamic Analysis with Abaqus/Explicit; Linear, uncoupled

  16. Cost Estimating Cases: Educational Tools for Cost Analysts

    DTIC Science & Technology

    1993-09-01

    only appropriate documentation should be provided. In other words, students should not submit all of the documentation possible using ACEIT, only that…case was their lack of understanding of the ACEIT software used to conduct the estimate. Specifically, many students misinterpreted the cost…estimating relationships (CERs) embedded in the software. Additionally, few of the students were able to properly organize the ACEIT documentation output

  17. The Concept and Control Capabilities of Universal Electric Vehicle Prototype using LabView Software

    NASA Astrophysics Data System (ADS)

    Skowronek, Hubert; Waszczuk, Kamil; Kowalski, Maciej; Karolczak, Paweł; Baral, Bivek

    2016-10-01

    The concept of the drive control for a prototype electric car, designed for sale in the markets of developing countries, mainly in South Asia, is presented in the article. The basic requirements for this type of vehicle and the possibility of rapid prototyping of on-board equipment for preliminary tests are presented. The control system is composed of a PC and a myRIO measurement card and has two operating modes. In the first, changes in the parameters of each component can be simulated and the proper functioning of the program checked. In the second mode, instead of the simulation, the real object can be controlled.

  18. Isothermal dendritic growth: A low gravity experiment

    NASA Technical Reports Server (NTRS)

    Glicksman, M. E.; Hahn, R. C.; Lograsso, T. A.; Rubinstein, E. R.; Selleck, M. E.; Winsa, E.

    1988-01-01

    The Isothermal Dendritic Growth Experiment is an active crystal growth experiment designed to test dendritic growth theory at low undercoolings where convection prohibits such studies at 1 g. The experiment will be essentially autonomous, though limited in-flight interaction through a computer interface is planned. One of the key components of the apparatus will be a crystal growth chamber capable of achieving oriented single crystal dendritic growth. Recent work indicates that seeding the chamber with a crystal of the proper orientation will not, in and of itself, be sufficient to meet this requirement. Additional flight hardware and software required for the STS flight experiment are currently being developed at NASA Lewis Research Center and at Rensselaer Polytechnic Institute.

  19. Composite Design and Manufacturing Development for Human Spacecrafts

    NASA Technical Reports Server (NTRS)

    Litteken, Douglas; Lowry, David

    2013-01-01

    The Structural Engineering Division at the NASA Johnson Space Center (JSC) has begun work on lightweight, multi-functional pressurized composite structures. The first candidate vehicle for technology development is the Multi-Mission Space Exploration Vehicle (MMSEV) cabin, known as the Gen 2B cabin, which has been built at JSC by the Robotics Division. Of the habitable MMSEV vehicle prototypes designed to date, this is the first one specifically analyzed and tested to hold internal pressure and the only one made out of composite materials. This design uses a laminate base with zoned reinforcement and external stringers, intended to demonstrate certain capabilities, and to prepare for the next cabin design, which will be a composite sandwich panel construction with multi-functional capabilities. As part of this advanced development process, a number of new technologies were used to assist in the design and manufacturing process. One of the methods, new to JSC, was to build the Gen 2B cabin with Out of Autoclave technology to permit the creation of larger parts with fewer joints. An 8-ply pre-preg layup was constructed to form the cabin body. Prior to lay-up, a design optimization software called FiberSIM was used to create each ply pattern. This software is integrated with Pro/Engineer to allow for customized draping of each fabric ply over the complex tool surface. Slits and darts are made in the software model to create an optimal design that maintains proper fiber placement and orientation. The flat pattern of each ply is then exported and sent to an automated cutting table where the patterns are cut out of graphite material. Additionally, to assist in lay-up, a laser projection system (LPT) is used to project outlines of each ply directly onto the tool face for accurate fiber placement and ply build-up. Finally, as part of the OoA process, a large oven was procured to post-cure each part. 
After manufacturing was complete, the cabin underwent modal and pressure testing (in progress at the time of writing) and will go on to be outfitted and used for further operations.

  20. An Interactive Method of Characteristics Java Applet to Design and Analyze Supersonic Aircraft Nozzles

    NASA Technical Reports Server (NTRS)

    Benson, Thomas J.

    2014-01-01

    The Method of Characteristics (MOC) is a classic technique for designing supersonic nozzles. An interactive computer program using MOC has been developed to allow engineers to design and analyze supersonic nozzle flow fields. The program calculates the internal flow for many classic designs, such as a supersonic wind tunnel nozzle, an ideal 2D or axisymmetric nozzle, or a variety of plug nozzles. The program also calculates the plume flow produced by the nozzle and the external flow leading to the nozzle exit. The program can be used to assess the interactions between the internal, external and plume flows. By proper design and operation of the nozzle, it may be possible to lessen the strength of the sonic boom produced at the rear of supersonic aircraft. The program can also calculate non-ideal nozzles, such as simple cone flows, to determine flow divergence and nonuniformities at the exit, and their effect on the plume shape. The computer program is written in Java and is provided as freeware from the NASA Glenn central software server.
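    The characteristic net in an MOC nozzle design is built from the Prandtl-Meyer function ν(M); for a minimum-length 2D nozzle, the initial wall angle at the throat is ν(M_exit)/2. A minimal sketch of that standard gas-dynamics relation (not the applet's source code):

    ```python
    # Prandtl-Meyer function, the building block of MOC nozzle design
    # (standard gas dynamics; not taken from the applet's source).
    import math

    def prandtl_meyer(mach, gamma=1.4):
        """Prandtl-Meyer angle in degrees for a given Mach number."""
        g = (gamma + 1.0) / (gamma - 1.0)
        m2 = mach * mach - 1.0
        nu = math.sqrt(g) * math.atan(math.sqrt(m2 / g)) - math.atan(math.sqrt(m2))
        return math.degrees(nu)

    exit_mach = 2.0
    nu_exit = prandtl_meyer(exit_mach)        # ~26.38 deg for M=2, gamma=1.4
    wall_angle = nu_exit / 2.0                # minimum-length nozzle throat angle
    ```

    An MOC solver marches left- and right-running characteristics from the throat, using ν(M) to convert between flow angle and Mach number at each mesh intersection.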

  1. An automation of design and modelling tasks in NX Siemens environment with original software - cost module

    NASA Astrophysics Data System (ADS)

    Zbiciak, R.; Grabowik, C.; Janik, W.

    2015-11-01

    The design-constructional process is a creative activity which strives to fulfil, as well as possible at a given moment in time, all demands and needs formulated by a user, taking into account social, technical and technological advances. An engineer's knowledge, skills and innate abilities have the greatest influence on the final product's quality and cost. They also have a deciding influence on the product's technical and economic value. Taking the above into account, it seems advisable to build software tools that support an engineer in the process of manufacturing cost estimation. The Cost module is built from analytical procedures which are used for relative manufacturing cost estimation. As in the case of the Generator module, the Cost module was written in the object-oriented programming language C# in the Visual Studio environment. During the research, the following eight factors, which have the greatest influence on overall manufacturing cost, were distinguished and defined: (i) gear tooth type, i.e. straight or helical, (ii) gear wheel design shape, A or B, with or without a wheel hub, (iii) gear tooth module, (iv) number of teeth, (v) gear rim width, (vi) gear wheel material, (vii) heat treatment or thermochemical treatment, (viii) accuracy class. Knowledge of parameters (i) to (v) is indispensable for proper modelling of 3D gear wheel models in a CAD system environment. These parameters are also processed in the Cost module. The last three parameters, (vi) to (viii), are used exclusively in the Cost module. The estimation of relative manufacturing cost is based on indexes calculated for each particular parameter. Estimated in this way, the relative manufacturing cost gives an overview of the influence of the design parameters on the final gear wheel manufacturing cost. This relative manufacturing cost takes values in the range 0.00 to 1.00; the larger the index value, the higher the relative manufacturing cost.
Whether the proposed algorithm for relative manufacturing cost estimation was designed properly was verified by comparing the algorithm's results with those obtained from industry. This verification indicated that in most cases both groups of results are similar. Taking the above into account, it is possible to conclude that the Cost module might play a significant role in the design-constructional process by aiding an engineer at the stage of selecting among alternative gear wheel designs. It should be remembered that real manufacturing cost can differ significantly depending on the manufacturing techniques and the stock of machine tools available in a factory.
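
    As an illustration of the index-based scheme described above, the following sketch computes a relative cost in the 0.00-1.00 range as a weighted mean of per-parameter indexes. All index values, parameter names and weights here are hypothetical placeholders, not those of the authors' Cost module:

```python
# Hypothetical per-parameter cost indexes (0 = cheapest option, 1 = most costly).
PARAM_INDEX = {
    "tooth_type": {"straight": 0.2, "helical": 0.6},
    "shape":      {"A": 0.3, "B_with_hub": 0.7, "B_without_hub": 0.5},
    "material":   {"C45": 0.3, "18CrMo4": 0.8},
    "treatment":  {"none": 0.0, "heat": 0.5, "thermochemical": 0.9},
}

def relative_cost(choices, weights):
    """Weighted mean of per-parameter indexes, normalized to the 0.00-1.00 range."""
    total = sum(weights.values())
    score = sum(weights[p] * PARAM_INDEX[p][v] for p, v in choices.items())
    return score / total

# Illustrative weights: material and treatment dominate manufacturing cost here.
weights = {"tooth_type": 1.0, "shape": 1.0, "material": 2.0, "treatment": 2.0}
cost = relative_cost(
    {"tooth_type": "helical", "shape": "A", "material": "C45", "treatment": "heat"},
    weights,
)
```

    Because the result is normalized, two alternative gear wheel designs can be ranked directly by comparing their indexes, which is the role the abstract attributes to the Cost module.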

  2. Telemedicine for Developing Countries

    PubMed Central

    Combi, Carlo; Pozzani, Gabriele

    2016-01-01

    Summary Background Developing countries need telemedicine applications that help in many situations: when physicians are few relative to the population, when specialized physicians are not available, and when patients and physicians in rural villages need assistance in the delivery of health care. Moreover, the requirements of telemedicine applications for developing countries are somewhat more demanding than for developed countries. Indeed, further social, organizational, and technical aspects need to be considered for successful telemedicine applications in developing countries. Objective We consider all the major telemedicine projects devoted to developing countries, as described in the scientific literature. On the basis of this literature, we define a specific taxonomy that allows a proper classification and a fast overview of telemedicine projects in developing countries. Moreover, by considering both the literature and some recent direct experiences, we complete this overview by discussing some design issues to be taken into consideration when developing telemedicine software systems. Methods We reviewed the major conferences and journals in depth, and looked for reports on telemedicine projects. Results We provide the reader with a survey of the main projects and systems, from which we derived a taxonomy of features of telemedicine systems for developing countries. We also propose and discuss some classification criteria for design issues, based on the lessons learned in this research area. Conclusions We highlight some challenges and recommendations to be considered when designing a telemedicine system for developing countries. PMID:27803948

  3. Calibration of 3D ALE finite element model from experiments on friction stir welding of lap joints

    NASA Astrophysics Data System (ADS)

    Fourment, Lionel; Gastebois, Sabrina; Dubourg, Laurent

    2016-10-01

    In order to support the design of a process as complex as Friction Stir Welding (FSW) for the aeronautic industry, numerical simulation software requires (1) developing an efficient and accurate Finite Element (F.E.) formulation that allows predicting welding defects, (2) properly modeling the thermo-mechanical complexity of the FSW process and (3) calibrating the F.E. model against accurate measurements from FSW experiments. This work uses a parallel ALE formulation developed in the Forge® F.E. code to model the different possible defects (flashes and worm holes), while pin and shoulder threads are modeled by a new friction law at the tool/material interface. The FSW experiments require a complex tool with a scroll on the shoulder, which is instrumented to provide sensitive thermal data close to the joint. Calibration of the unknown material thermal coefficients, constitutive equation parameters and friction model from measured forces, torques and temperatures is carried out using two F.E. models, Eulerian and ALE, to reach a satisfactory agreement, assessed by the proper sensitivity of the simulation to process parameters.

  4. Software integration for automated stability analysis and design optimization of a bearingless rotor blade

    NASA Astrophysics Data System (ADS)

    Gunduz, Mustafa Emre

    Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization processes may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study proposes a method that enables engineers in some design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that, once the system is set up properly, the CAD model of the design, including all subsystems, will be automatically updated as soon as a new part or assembly is added to the design, or when an analysis and/or an optimization is performed and the geometry needs to be modified. Designers and engineers will be involved only in checking the latest design for errors or adding and removing features. Such a design process will take dramatically less time to complete; therefore, it should reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960s. The rotor is already an effective design with novel features. However, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the most crucial design variables of the objective function are identified.
With these variables, the Latin Hypercube Sampling method is used to probe the design space for several local minima and maxima. After analysis of numerous samples, an optimum configuration of the design, more stable than the initial design, is reached. The above process requires several software tools: CATIA as the CAD tool, ANSYS as the FEA tool, VABS for obtaining the cross-sectional structural properties, and DYMORE for the frequency and dynamic analysis of the rotor. MATLAB codes are also employed to generate input files and read output files of DYMORE. All these tools are connected using ModelCenter.
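
    The Latin Hypercube Sampling step mentioned above can be sketched in a few lines: each variable's range is cut into as many strata as there are samples, one point is drawn per stratum, and the strata are shuffled independently per axis. This is a minimal NumPy version; the two design-variable ranges are made-up placeholders, not the rotor's actual parameters:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """One Latin Hypercube draw over box `bounds` = [(lo, hi), ...]."""
    rng = np.random.default_rng(rng)
    n_dims = len(bounds)
    # Stratified uniform draws in [0, 1): one point per stratum per axis.
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        rng.shuffle(u[:, j])          # decouple the axes
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# e.g. 20 samples over two hypothetical blade design variables
samples = latin_hypercube(20, [(0.1, 0.5), (10.0, 45.0)], rng=0)
```

    Compared to plain random sampling, each one-dimensional projection of the sample set covers its whole range evenly, which is what makes LHS efficient for probing a design space with few expensive analyses.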

  5. GCS plan for software aspects of certification

    NASA Technical Reports Server (NTRS)

    Shagnea, Anita M.; Lowman, Douglas S.; Withers, B. Edward

    1990-01-01

    As part of the Guidance and Control Software (GCS) research project being sponsored by NASA to evaluate the failure processes of software, standard industry software development procedures are being employed. To ensure that these procedures are authentic, the guidelines outlined in the Radio Technical Commission for Aeronautics (RTCA) document DO-178A, entitled Software Considerations in Airborne Systems and Equipment Certification, were adopted. A major aspect of these guidelines is proper documentation. As such, this report, the plan for software aspects of certification, was produced in accordance with DO-178A. An overview is given of the GCS research project, including the goals of the project, project organization, and project schedules. The report also specifies the plans for all aspects of the project which relate to the certification of the GCS implementations developed under a NASA contract. These plans include decisions made regarding the software specification, accuracy requirements, configuration management, implementation development and verification, and the development of the GCS simulator.

  6. A simple and efficient alternative to implementing systematic random sampling in stereological designs without a motorized microscope stage.

    PubMed

    Melvin, Neal R; Poda, Daniel; Sutherland, Robert J

    2007-10-01

    When properly applied, stereology is a very robust and efficient method to quantify a variety of parameters from biological material. A common sampling strategy in stereology is systematic random sampling, which involves choosing a random start point outside the structure of interest, and sampling relevant objects at sites that are placed at pre-determined, equidistant intervals. This has proven to be a very efficient sampling strategy, and is used widely in stereological designs. At the microscopic level, this is most often achieved through the use of a motorized stage that facilitates the systematic random stepping across the structure of interest. Here, we report a simple, precise and cost-effective software-based alternative for accomplishing systematic random sampling under the microscope. We believe that this approach will facilitate the use of stereological designs that employ systematic random sampling in laboratories that lack the resources to acquire costly, fully automated systems.
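
    The sampling scheme itself, a uniformly random start followed by equidistant steps, is simple to state in code. The sketch below generates stage coordinates for such a grid; the section extent and grid step are illustrative values, not those of the authors' software:

```python
import random

def systematic_sites(extent_x, extent_y, step, rng=random):
    """Sampling sites with a uniformly random offset in [0, step) on each
    axis, then equidistant steps across the region of interest."""
    x0 = rng.uniform(0, step)
    y0 = rng.uniform(0, step)
    sites = []
    y = y0
    while y < extent_y:
        x = x0
        while x < extent_x:
            sites.append((x, y))
            x += step
        y += step
    return sites

# e.g. stage coordinates (in micrometres) for a 2000 x 1500 um section, 250 um grid
sites = systematic_sites(2000.0, 1500.0, 250.0)
```

    Because only the start point is random, every site count and spacing is fixed by the grid step, which keeps the estimator unbiased while covering the structure evenly.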

  7. FPGA-based voltage and current dual drive system for high frame rate electrical impedance tomography.

    PubMed

    Khan, Shadab; Manwaring, Preston; Borsic, Andrea; Halter, Ryan

    2015-04-01

    Electrical impedance tomography (EIT) is used to image the electrical property distribution of a tissue under test. An EIT system comprises complex hardware and software modules, which are typically designed for a specific application. Upgrading these modules is a time-consuming process, and requires rigorous testing to ensure proper functioning of new modules with the existing ones. To this end, we developed a modular and reconfigurable data acquisition (DAQ) system using National Instruments' (NI) hardware and software modules, which offer inherent compatibility over generations of hardware and software revisions. The system can be configured to use up to 32 channels. This EIT system can be used to interchangeably apply a current or voltage signal, and measure the tissue response in a semi-parallel fashion. A novel signal-averaging algorithm and a 512-point fast Fourier transform (FFT) computation block were implemented on the FPGA. FFT output bins were classified as signal or noise. Signal bins constitute a tissue's response to a pure or mixed tone signal. Signal bin data can be used for traditional applications, as well as for synchronous frequency-difference imaging. Noise bins were used to compute noise power on the FPGA. Noise power represents a metric of signal quality, and can be used to ensure proper tissue-electrode contact. Allocation of these computationally expensive tasks to the FPGA reduced the required bandwidth between the PC and the FPGA for high frame rate EIT. In the 16-channel configuration, with a signal-averaging factor of 8, the DAQ frame rate at 100 kHz exceeded 110 frames s⁻¹, and the signal-to-noise ratio exceeded 90 dB across the spectrum. Reciprocity error was found to be for frequencies up to 1 MHz. Static imaging experiments were performed on a high-conductivity inclusion placed in a saline-filled tank; the inclusion was clearly localized in the reconstructions obtained for both absolute current and voltage mode data.
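
    The bin-classification idea described above, 512-point FFT, bins near the excitation tones counted as signal, everything else as noise power, can be sketched as follows. This is an illustrative NumPy model, not the authors' FPGA implementation; the tone frequency, noise level and tolerance are invented for the example (the tone is placed on an exact FFT bin so no window is needed):

```python
import numpy as np

def classify_bins(samples, fs, tone_freqs, tol_bins=1):
    """512-point FFT; bins within tol_bins of an excitation tone are counted
    as signal bins, all remaining bins as noise bins."""
    n = 512
    power = np.abs(np.fft.rfft(samples[:n])) ** 2
    signal_mask = np.zeros_like(power, dtype=bool)
    for f in tone_freqs:
        k = int(round(f * n / fs))            # nearest FFT bin of the tone
        signal_mask[max(0, k - tol_bins):k + tol_bins + 1] = True
    return power[signal_mask].sum(), power[~signal_mask].sum()

fs = 100_000.0                # 100 kHz, as in the paper's frame-rate figure
tone = 50 * fs / 512          # tone placed on an exact FFT bin (no leakage)
t = np.arange(512) / fs
x = np.sin(2 * np.pi * tone * t) \
    + 1e-4 * np.random.default_rng(0).standard_normal(512)
sig_p, noise_p = classify_bins(x, fs, [tone])
snr_db = 10 * np.log10(sig_p / noise_p)   # noise power as a signal-quality metric
```

    A rising noise-power figure for a channel would then flag poor tissue-electrode contact, which is the use the abstract describes.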

  8. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick, Jennifer M.

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  9. Advantages and Challenges of 10-Gbps Transmission on High-Density Interconnect Boards

    NASA Astrophysics Data System (ADS)

    Yee, Chang Fei; Jambek, Asral Bahari; Al-Hadi, Azremi Abdullah

    2016-06-01

    This paper provides a brief introduction to high-density interconnect (HDI) technology and its implementation on printed circuit boards (PCBs). The advantages and challenges of implementing 10-Gbps signal transmission on high-density interconnect boards are discussed in detail. The advantages (e.g., smaller via dimension and via stub removal) and challenges (e.g., crosstalk due to smaller interpair separation) of HDI are studied by analyzing the S-parameter, time-domain reflectometry (TDR), and transmission-line eye diagrams obtained by three-dimensional electromagnetic modeling (3DEM) and two-dimensional electromagnetic modeling (2DEM) using Mentor Graphics HyperLynx and Keysight Advanced Design System (ADS) electronic computer-aided design (ECAD) software. HDI outperforms conventional PCB technology in terms of signal integrity, but proper routing topology should be applied to overcome the challenge posed by crosstalk due to the tight spacing between traces.

  10. Manikin families representing obese airline passengers in the US.

    PubMed

    Park, Hanjun; Park, Woojin; Kim, Yongkang

    2014-01-01

    Aircraft passenger spaces designed without proper anthropometric analyses can create serious problems for obese passengers, including: possible denial of boarding, excessive body pressures and contact stresses, postural fixity and related health hazards, and increased risks of emergency evacuation failure. In order to help address the obese passenger's accommodation issues, this study developed male and female manikin families that represent obese US airline passengers. Anthropometric data of obese individuals obtained from the CAESAR anthropometric database were analyzed through PCA-based factor analyses. For each gender, a 99% enclosure cuboid was constructed, and a small set of manikins was defined on the basis of each enclosure cuboid. Digital human models (articulated human figures) representing the manikins were created using a human CAD software program. The manikin families were utilized to develop design recommendations for selected aircraft seat dimensions. The manikin families presented in this study would greatly facilitate anthropometrically accommodating large airline passengers.

  11. Hierarchical MFMO Circuit Modules for an Energy-Efficient SDR DBF

    NASA Astrophysics Data System (ADS)

    Mar, Jeich; Kuo, Chi-Cheng; Wu, Shin-Ru; Lin, You-Rong

    The hierarchical multi-function matrix operation (MFMO) circuit modules are designed using the coordinate rotation digital computer (CORDIC) algorithm to realize the intensive computation of matrix operations. The paper emphasizes that the designed hierarchical MFMO circuit modules can be used to develop a power-efficient software-defined radio (SDR) digital beamformer (DBF). The formulas for the processing time of the scalable MFMO circuit modules implemented in a field programmable gate array (FPGA) are derived to allocate the proper logic resources for hardware reconfiguration. The hierarchical MFMO circuit modules are scalable to the changing number of array branches employed by the SDR DBF, achieving the purpose of power saving. The efficient reuse of the common MFMO circuit modules in the SDR DBF can also lead to energy reduction. Finally, the power dissipation and reconfiguration function in the different modes of the SDR DBF are observed in the experimental results.
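
    As a reminder of how CORDIC realizes rotations with only shifts, adds and a table of arctangents, here is a floating-point sketch of the rotation mode. It is illustrative only; the paper's modules are fixed-point FPGA circuits:

```python
import math

def cordic_rotate(x, y, angle, n_iters=32):
    """Rotate (x, y) by `angle` radians using shift-and-add CORDIC iterations."""
    # Precomputed elementary angles atan(2^-i) and the constant CORDIC gain.
    elementary = [math.atan(2.0 ** -i) for i in range(n_iters)]
    gain = 1.0
    for i in range(n_iters):
        gain /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    z = angle
    for i, a in enumerate(elementary):
        d = 1.0 if z >= 0 else -1.0           # drive residual angle to zero
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * a
    return x * gain, y * gain                 # compensate the accumulated gain

cx, cy = cordic_rotate(1.0, 0.0, math.pi / 6)   # ~ (cos 30 deg, sin 30 deg)
```

    In hardware the multiplications by 2^-i become bit shifts, which is why CORDIC-based modules are attractive for the power-constrained matrix arithmetic a DBF requires.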

  12. Orion Entry Display Feeder and Interactions with the Entry Monitor System

    NASA Technical Reports Server (NTRS)

    Baird, Darren; Bernatovich, Mike; Gillespie, Ellen; Kadwa, Binaifer; Matthews, Dave; Penny, Wes; Zak, Tim; Grant, Mike; Bihari, Brian

    2010-01-01

    The Orion spacecraft is designed to return astronauts to a landing within 10 km of the intended landing target from low Earth orbit, lunar direct-entry, and lunar skip-entry trajectories. While the landing is nominally controlled autonomously, the crew can fly precision entries manually in the event of an anomaly. The onboard entry displays will be used by the crew to monitor and manually fly the entry, descent, and landing, while the Entry Monitor System (EMS) will be used to monitor the health and status of the onboard guidance and the trajectory. The entry displays are driven by the entry display feeder, part of the EMS. The entry re-targeting module, also part of the EMS, provides all the data required to generate the capability footprint of the vehicle at any point in the trajectory, which is shown on the Primary Flight Display (PFD). It also provides caution and warning data and recommends the safest possible re-designated landing site when the nominal landing site is no longer within the capability of the vehicle. The PFD and the EMS allow the crew to manually fly an entry trajectory profile from entry interface until parachute deploy, with the flexibility to manually steer the vehicle to a selected landing site that best satisfies the priorities of the crew. The entry display feeder provides data from the EMS and other components of the GNC flight software to the displays at the proper rate and in the proper units. It also performs calculations that are specific to the entry displays and which are not made in any other component of the flight software. In some instances, it performs calculations identical to those performed by the onboard primary guidance algorithm to protect against a guidance system failure. These functions and the interactions between the entry display feeder and the other components of the EMS are described.

  13. Asteroid proper elements and secular resonances

    NASA Technical Reports Server (NTRS)

    Knezevic, Zoran; Milani, Andrea

    1992-01-01

    In a series of papers (e.g., Knezevic, 1991; Milani and Knezevic, 1990; 1991) we reported on the progress we were making in computing asteroid proper elements, both as regards their accuracy and long-term stability. Additionally, we reported on the efficiency and 'intelligence' of our software. At the same time, we studied the associated problems of resonance effects, and we introduced the new class of 'nonlinear' secular resonances; we determined the locations of these secular resonances in proper-element phase space and analyzed their impact on the asteroid family classification. Here we would like to summarize the current status of our work and possible further developments.

  14. 3D Printed, Customized Cranial Implant for Surgical Planning

    NASA Astrophysics Data System (ADS)

    Bogu, Venkata Phanindra; Ravi Kumar, Yennam; Asit Kumar, Khanra

    2018-06-01

    The main objective of the present work was to model a cranial implant and print it on an FDM machine (printer model used: Mojo). This is a peculiar case: the skull was damaged in the frontal, parietal and temporal regions, with a small portion of the frontal region damaged away from the sagittal plane, and the complexity lies in filling this frontal region with the proper curvature. The patient CT data (381 slices, each 0.488 mm thick) were processed in Mimics 14.1 software; the Mimics file was sent to 3-matic software, where the thickness of the skull was calculated at the sections where the cranial implant was needed, and the edges of the cranial implant were then corrected to prevent CSF (cerebrospinal fluid) leakage and ensure proper fitting. Finally, an average implant thickness of 2.5 mm was decided on, and the implant was printed on the FDM machine in ABS plastic.

  15. 3D Printed, Customized Cranial Implant for Surgical Planning

    NASA Astrophysics Data System (ADS)

    Bogu, Venkata Phanindra; Ravi Kumar, Yennam; Asit Kumar, Khanra

    2016-06-01

    The main objective of the present work was to model a cranial implant and print it on an FDM machine (printer model used: Mojo). This is a peculiar case: the skull was damaged in the frontal, parietal and temporal regions, with a small portion of the frontal region damaged away from the sagittal plane, and the complexity lies in filling this frontal region with the proper curvature. The patient CT data (381 slices, each 0.488 mm thick) were processed in Mimics 14.1 software; the Mimics file was sent to 3-matic software, where the thickness of the skull was calculated at the sections where the cranial implant was needed, and the edges of the cranial implant were then corrected to prevent CSF (cerebrospinal fluid) leakage and ensure proper fitting. Finally, an average implant thickness of 2.5 mm was decided on, and the implant was printed on the FDM machine in ABS plastic.

  16. Open source platform for collaborative construction of wearable sensor datasets for human motion analysis and an application for gait analysis.

    PubMed

    Llamas, César; González, Manuel A; Hernández, Carmen; Vegas, Jesús

    2016-10-01

    Nearly every practical improvement in modeling human motion is founded on a properly designed collection of data or datasets. These datasets must be made publicly available so that the community can validate and accept them. It is reasonable to concede that a collective, guided enterprise could serve to devise solid and substantial datasets as a result of a collaborative effort, in the same sense as the open software community does. In this way datasets could be complemented, extended and expanded in size with, for example, more individuals, samples and human actions. For this to be possible, some commitments must be made by the collaborators, one of which is sharing the same data acquisition platform. In this paper, we offer an affordable open-source hardware and software platform based on inertial wearable sensors, such that several groups could cooperate in the construction of datasets through common software suitable for collaboration. Some experimental results on the throughput of the overall system are reported, showing the feasibility of acquiring data from up to 6 sensors with a sampling frequency of no less than 118 Hz. Also, a proof-of-concept dataset is provided, comprising sampled data from 12 subjects suitable for gait analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. The determination of measures of software reliability

    NASA Technical Reports Server (NTRS)

    Maxwell, F. D.; Corn, B. C.

    1978-01-01

    Measurement of software reliability was carried out during the development of data base software for a multi-sensor tracking system. The failure ratio and failure rate were found to be consistent measures. Trend lines could be established from these measurements that provide good visualization of the progress on the job as a whole as well as on individual modules. Over one-half of the observed failures were due to factors associated with the individual run submission rather than with the code proper. Possible application of these findings for line management, project managers, functional management, and regulatory agencies is discussed. Steps for simplifying the measurement process and for use of these data in predicting operational software reliability are outlined.

  18. NASA's Optical Program on Ascension Island: Bringing MCAT to Life as the Eugene Stansbery-Meter Class Autonomous Telescope (ES-MCAT)

    NASA Astrophysics Data System (ADS)

    Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.

    In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA’s Orbital Debris Program Office (ODPO), in honour of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescopes and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom-written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing to include Geosynchronous survey, TLE (Two-line element) tracking of individual catalogued debris at all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA’s Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).

  19. NASA's Optical Program on Ascension Island: Bringing MCAT to Life as the Eugene Stansbery-Meter Class Autonomous Telescope (ES-MCAT)

    NASA Technical Reports Server (NTRS)

    Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.

    2017-01-01

    In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA's Orbital Debris Program Office (ODPO), in honor of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescopes and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing to include Geosynchronous survey, TLE (Two-line element) tracking of individual catalogued debris at all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA's Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).

  20. Disruption of Angiogenesis by Anthocyanin-Rich Extracts of Hibiscus sabdariffa

    PubMed Central

    Joshua, Madu; Okere, Christiana; Sylvester, O’Donnell; Yahaya, Muhammad; Precious, Omale; Dluya, Thagriki; Um, Ji-Yeon; Neksumi, Musa; Boyd, Jessica; Vincent-Tyndall, Jennifer; Choo, Dong-Won; Gutsaeva, Diana R.; Jahng, Wan Jin

    2017-01-01

    Abnormal vessel formations contribute to the progression of specific angiogenic diseases, including age-related macular degeneration. Adequate vessel growth and maintenance represent the coordinated process of endothelial cell proliferation, matrix remodeling, and differentiation. However, the molecular mechanism of the proper balance between angiogenic activators and inhibitors remains elusive. In addition, quantitative analysis of vessel formation has been challenging due to complex angiogenic morphology. We hypothesized that natural products containing conjugated double bonds, including anthocyanin extracts from Hibiscus sabdariffa, may control proper angiogenesis. The current study was designed to determine whether natural molecules from an African plant library modulate angiogenesis. Further, we questioned how the proper balance of anti- or pro-angiogenic signaling can be obtained in the vascular microenvironment by treating with anthocyanin or fatty acids, using the chick chorioallantoic membrane angiogenesis model in ovo. The angiogenic morphology was analyzed systematically by measuring twenty-one angiogenic indexes using Angiogenic Analyzer software. The chick chorioallantoic model demonstrated that anthocyanin-rich extracts inhibited angiogenesis in a time- and concentration-dependent manner. Molecular modeling analysis proposed that hibiscetin, a component of Hibiscus, may bind to the active site of vascular endothelial growth factor receptor 2 (VEGFR2) with a binding energy of ΔG = −8.42 kcal/mol. Our results provide evidence that anthocyanin is an angiogenic modulator that can be used to treat uncontrolled neovascular-related diseases, including age-related macular degeneration. PMID:28459020

  1. Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Frederick, J. M.; Hammond, G. E.

    2017-12-01

    Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, and changing funding sources (or a total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time, incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing the negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continue to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as the unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence.
We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A

  2. Design of apochromatic lens with large field and high definition for machine vision.

    PubMed

    Yang, Ao; Gao, Xingyu; Li, Mingfeng

    2016-08-01

Precise machine vision detection of a large object at a finite working distance (WD) requires that the lens have high resolution over a large field of view (FOV). In this case, the effect of the secondary spectrum on image quality is not negligible. According to the detection requirements, a high resolution apochromatic objective is designed and analyzed. The initial optical structure (IOS) is composed of three segments. Next, the secondary spectrum of the IOS is corrected by replacing glasses using the dispersion vector analysis method based on the Buchdahl dispersion equation. Other aberrations are optimized in the commercial optical design software ZEMAX by properly choosing the optimization function operands. The optimized optical structure (OOS) has an f-number (F/#) of 3.08, a FOV of φ60 mm, a WD of 240 mm, and a modulation transfer function (MTF) of more than 0.1 at 320 cycles/mm in all fields. The design requirements for a non-fluorite-material apochromatic objective lens with a large field and high definition for machine vision detection have been achieved.

  3. Reliability of simulated robustness testing in fast liquid chromatography, using state-of-the-art column technology, instrumentation and modelling software.

    PubMed

    Kormány, Róbert; Fekete, Jenő; Guillarme, Davy; Fekete, Szabolcs

    2014-02-01

The goal of this study was to evaluate the accuracy of simulated robustness testing using commercial modelling software (DryLab) and state-of-the-art stationary phases. For this purpose, a mixture of amlodipine and its seven related impurities was analyzed on short narrow-bore columns (50 × 2.1 mm, packed with sub-2 μm particles) providing short analysis times. The performance of the commercial modelling software for robustness testing was systematically compared to experimental measurements and DoE-based predictions. We have demonstrated that the reliability of the predictions was good, since the predicted retention times and resolutions were in good agreement with the experimental ones at the edges of the design space. On average, the relative errors in retention time were <1.0%, while the errors in predicted critical resolution ranged between 6.9 and 17.2%. Because simulated robustness testing requires significantly less experimental work than DoE-based predictions, we think that robustness could now be investigated at an early stage of method development. Moreover, column interchangeability, which is also an important part of robustness testing, was investigated considering five different C8 and C18 columns packed with sub-2 μm particles. Again, thanks to the modelling software, we proved that the separation was feasible on all columns within the same analysis time (less than 4 min), by proper adjustment of variables. Copyright © 2013 Elsevier B.V. All rights reserved.
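    The prediction accuracy reported above reduces to a simple relative-error computation between predicted and measured values; a minimal sketch (the numeric values below are hypothetical, not taken from the study):

    ```python
    def relative_errors_percent(predicted, measured):
        """Percent relative error |predicted - measured| / measured * 100
        for paired retention times (or resolutions)."""
        return [abs(p - m) / m * 100.0 for p, m in zip(predicted, measured)]

    # Hypothetical retention times in minutes (illustrative only)
    predicted = [1.52, 2.31, 3.08]
    measured = [1.50, 2.33, 3.10]
    errors = relative_errors_percent(predicted, measured)
    ```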

  4. Design of primers and probes for quantitative real-time PCR methods.

    PubMed

    Rodríguez, Alicia; Rodríguez, Mar; Córdoba, Juan J; Andrade, María J

    2015-01-01

Design of primers and probes is one of the most crucial factors affecting the success and quality of quantitative real-time PCR (qPCR) analyses, since accurate and reliable quantification depends on using efficient primers and probes. Primer and probe design should meet several criteria to find suitable candidates for specific qPCR assays. The formation of primer-dimers and other non-specific products should be avoided or reduced. This factor is especially important when designing primers for SYBR® Green protocols, but also when designing probes to ensure specificity of the developed qPCR protocol. To design primers and probes for qPCR, numerous software programs and websites are available, many of them free. These tools often apply default requirements for primers and probes, although new research advances in primer and probe design should be progressively added to the different algorithms. After a proper design, precise validation of the primers and probes is necessary. Specific considerations should be taken into account when designing primers and probes for multiplex qPCR and reverse transcription qPCR (RT-qPCR). This chapter provides guidelines for the design of suitable primers and probes and their subsequent validation through the development of singleplex qPCR, multiplex qPCR, and RT-qPCR protocols.
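    Design criteria of the kind described above (e.g. GC content and melting-temperature windows) can be screened programmatically; a minimal illustrative sketch using the simple Wallace rule for Tm, with threshold values chosen for illustration only, not drawn from the chapter:

    ```python
    def gc_content(seq):
        """Fraction of G/C bases in a primer sequence."""
        seq = seq.upper()
        return (seq.count("G") + seq.count("C")) / len(seq)

    def wallace_tm(seq):
        """Wallace-rule melting temperature 2*(A+T) + 4*(G+C),
        a rough estimate valid only for short oligonucleotides."""
        seq = seq.upper()
        at = seq.count("A") + seq.count("T")
        gc = seq.count("G") + seq.count("C")
        return 2 * at + 4 * gc

    def passes_basic_criteria(seq, gc_range=(0.4, 0.6), tm_range=(50, 65)):
        """Screen a candidate primer against illustrative design windows."""
        gc = gc_content(seq)
        tm = wallace_tm(seq)
        return gc_range[0] <= gc <= gc_range[1] and tm_range[0] <= tm <= tm_range[1]
    ```

    Real design tools add many further checks (primer-dimer and hairpin screening, 3' stability, amplicon length), which is why dedicated software is preferred in practice.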

  5. Caring for Your Videodiscs, CD-ROM Discs, and Players.

    ERIC Educational Resources Information Center

    Ekhaml, Leticia; Saygan, Bobby

    1993-01-01

    Presents guidelines for the proper care and handling of videodisc and CD-ROM hardware and software. Topics discussed include handling the equipment, moving, cleaning techniques, storage considerations, ventilation requirements, and climate control. (LRW)

  6. Software engineering and Ada in design

    NASA Technical Reports Server (NTRS)

O'Neill, Don

    1986-01-01

    Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process are examined. The revised Ada design language adaptation is revealed. This four level design methodology is detailed including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four level Ada design language adaptation.

  7. Modeling software systems by domains

    NASA Technical Reports Server (NTRS)

    Dippolito, Richard; Lee, Kenneth

    1992-01-01

    The Software Architectures Engineering (SAE) Project at the Software Engineering Institute (SEI) has developed engineering modeling techniques that both reduce the complexity of software for domain-specific computer systems and result in systems that are easier to build and maintain. These techniques allow maximum freedom for system developers to apply their domain expertise to software. We have applied these techniques to several types of applications, including training simulators operating in real time, engineering simulators operating in non-real time, and real-time embedded computer systems. Our modeling techniques result in software that mirrors both the complexity of the application and the domain knowledge requirements. We submit that the proper measure of software complexity reflects neither the number of software component units nor the code count, but the locus of and amount of domain knowledge. As a result of using these techniques, domain knowledge is isolated by fields of engineering expertise and removed from the concern of the software engineer. In this paper, we will describe kinds of domain expertise, describe engineering by domains, and provide relevant examples of software developed for simulator applications using the techniques.

  8. A Novel Rules Based Approach for Estimating Software Birthmark

    PubMed Central

    Binti Alias, Norma; Anwar, Sajid

    2015-01-01

Software birthmark is a unique characteristic of software that can be used to detect software theft. Comparing the birthmarks of two programs can tell us whether one is a copy of the other. Software theft and piracy are rapidly increasing problems of copying, stealing, and misusing software without proper permission, as specified in the license agreement. The estimation of a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate a software birthmark based on the two most sought-after properties of birthmarks, namely credibility and resilience. For this purpose, soft computing concepts such as probabilistic and fuzzy computing have been taken into account, and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363
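    A fuzzy rule based combination of credibility and resilience scores, as described above, can be sketched with triangular membership functions and singleton outputs. This is a generic illustration of the technique, not the paper's actual rule base; the membership breakpoints and output levels (0.2, 0.5, 0.9) are assumptions chosen for the example:

    ```python
    def tri(x, a, b, c):
        """Triangular membership function that peaks at b and is zero outside (a, c)."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def estimate_birthmark_quality(credibility, resilience):
        """Fuzzy-rule estimate of overall birthmark quality from two scores in [0, 1]:
        IF both high THEN strong; IF either low THEN weak; ELSE moderate.
        Defuzzified as a weighted average of singleton outputs (assumed levels)."""
        high = lambda x: tri(x, 0.5, 1.0, 1.5)   # 'high' peaks at 1.0
        low = lambda x: tri(x, -0.5, 0.0, 0.5)   # 'low' peaks at 0.0
        strong = min(high(credibility), high(resilience))
        weak = max(low(credibility), low(resilience))
        moderate = max(0.0, 1.0 - strong - weak)
        total = strong + weak + moderate
        return (0.9 * strong + 0.5 * moderate + 0.2 * weak) / total
    ```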

  9. Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.

    1992-01-01

    The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.

  10. Cost as a technology driver [in aerospace R and D]

    NASA Technical Reports Server (NTRS)

    Fitzgerald, P. E., Jr.; Savage, M.

    1976-01-01

Cost management as a guiding factor in the optimum development of technology, and the proper timing of cost-saving programs in the development of a system or technology with payoffs in development and operational advances, are discussed and illustrated. Advances enhancing hardware performance, or software advances raising productivity or reducing cost, are outlined, with examples drawn from: thermochemical thrust maximization, development of cryogenic storage tanks, improvements in fuel cells for Space Shuttle, design of a spacecraft pyrotechnic initiator, cost cutting by reduction in the number of parts to be joined, and cost cutting by dramatic reductions in circuit component number with small-scale double-diffused integrated circuitry. Program-focused supporting research and technology models are devised to aid the judicious timing of cost-conscious research programs.

  11. Instrumentation & Data Acquisition System (DAS) Engineer

    NASA Technical Reports Server (NTRS)

    Jackson, Markus Deon

    2015-01-01

The primary job of an Instrumentation and Data Acquisition System (DAS) Engineer is to properly measure physical phenomena of hardware using appropriate instrumentation and DAS equipment designed to record data during a specified test of the hardware. A DAS system includes a CPU or processor; a data storage device, such as a hard drive; a data communication bus, such as Universal Serial Bus; and software to control DAS processes such as calibrations, data recording, and data processing. It also includes signal conditioning amplifiers and certain sensors for specified measurements. My internship responsibilities have included testing and adjusting Pacific Instruments Model 9355 signal conditioning amplifiers and writing and performing checkout and calibration procedures, while learning the basics of instrumentation.

  12. Conception and design of a control and monitoring system for the mirror alignment of the CBM RICH detector

    NASA Astrophysics Data System (ADS)

    Adamczewski-Musch, J.; Akishin, P.; Becker, K.-H.; Belogurov, S.; Bendarouach, J.; Boldyreva, N.; Deveaux, C.; Dobyrn, V.; Dürr, M.; Eschke, J.; Förtsch, J.; Heep, J.; Höhne, C.; Kampert, K.-H.; Kochenda, L.; Kopfer, J.; Kravtsov, P.; Kres, I.; Lebedev, S.; Lebedeva, E.; Leonova, E.; Linev, S.; Mahmoud, T.; Michel, J.; Miftakhov, N.; Niebur, W.; Ovcharenko, E.; Patel, V.; Pauly, C.; Pfeifer, D.; Querchfeld, S.; Rautenberg, J.; Reinecke, S.; Riabov, Y.; Roshchin, E.; Samsonov, V.; Schetinin, V.; Tarasenkova, O.; Traxler, M.; Ugur, C.; Vznuzdaev, E.; Vznuzdaev, M.

    2017-12-01

The Compressed Baryonic Matter (CBM) experiment at the future Facility for Antiproton and Ion Research (FAIR) will investigate the phase diagram of strongly interacting matter at high net-baryon density and moderate temperature in A+A collisions. One of the key detectors of CBM for exploring this physics program is a Ring Imaging CHerenkov (RICH) detector for electron identification. For high performance of the RICH detector, precise mirror alignment is essential. A three-step correction cycle has been developed and will be discussed: first, a qualitative, fast check of the mirror positions; second, a quantitative determination of possible misalignments; and third, a software correction routine allowing proper functioning of the RICH under misalignment conditions.

  13. Designing a fully automated multi-bioreactor plant for fast DoE optimization of pharmaceutical protein production.

    PubMed

    Fricke, Jens; Pohlmann, Kristof; Jonescheit, Nils A; Ellert, Andree; Joksch, Burkhard; Luttmann, Reiner

    2013-06-01

The identification of optimal expression conditions for state-of-the-art production of pharmaceutical proteins is a very time-consuming and expensive process. In this report, a method for rapid and reproducible optimization of protein expression in an in-house designed small-scale BIOSTAT® multi-bioreactor plant is described. A newly developed BioPAT® MFCS/win Design of Experiments (DoE) module (Sartorius Stedim Systems, Germany) connects the process control system MFCS/win with the DoE software MODDE® (Umetrics AB, Sweden) and therefore enables the implementation of fully automated optimization procedures. As a proof of concept, a commercial Pichia pastoris strain KM71H was transformed for the expression of potential malaria vaccines. Compared to initial cultivation results, the DoE optimization procedure doubled intact protein secretion productivity. In a next step, robustness with regard to process parameter variability was proven around the determined optimum. A significantly improved pharmaceutical production process was thereby established within seven 24-hour cultivation cycles. Specifically, regarding the regulatory demands set out in the process analytical technology (PAT) initiative of the United States Food and Drug Administration (FDA), the combination of a highly instrumented, fully automated multi-bioreactor platform with proper cultivation strategies and extended DoE software solutions opens up promising benefits and opportunities for pharmaceutical protein production. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. A study on leakage radiation dose at ELV-4 electron accelerator bunker

    NASA Astrophysics Data System (ADS)

Chulan, Mohd Rizal Md; Yahaya, Redzuwan; Ghazali, Abu Bakar Mhd

    2014-09-01

Shielding is an important aspect of accelerator safety, and one of the most important elements of a bunker's shielding is the door. The bunker's door should be designed properly to minimize leakage radiation, which shall not exceed the permitted limit of 2.5 μSv/hr. In determining the leakage radiation dose passing through the door and through gaps between the door and the wall, 2-dimensional manual calculations are often used. This method is hard to perform because 2-dimensional visualization is limited and difficult to relate to the real situation, so estimated values are normally used. As a result, construction costs can be higher, because overestimation or underestimation requires costly modification of the bunker. In this study, two methods are therefore introduced to overcome the problem: simulation using MCNPX Version 2.6.0 software, and manual calculation using a 3-dimensional model from Autodesk Inventor 2010 software. The values from the two methods were then compared to real values from direct measurements using a Ludlum Model 3 survey meter with a Model 44-9 probe.

  15. LLCEDATA and LLCECALC for Windows version 1.0, Volume 1: User's manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McFadden, J.G.

LLCEDATA and LLCECALC for Windows are user-friendly computer software programs that work together to determine the proper waste designation, handling, and disposition requirements for Long Length Contaminated Equipment (LLCE). LLCEDATA reads from a variety of databases to produce an equipment data file (EDF) that represents a snapshot of both the LLCE and the tank it originates from. LLCECALC reads the EDF and a gamma assay (AV2) file that is produced by the Flexible Receiver Gamma Energy Analysis System. LLCECALC performs corrections to the AV2 file as it is being read and characterizes the LLCE. Both programs produce a variety of reports, including a characterization report and a status report. The status report documents each action taken by the user, LLCEDATA, and LLCECALC. Documentation for LLCEDATA and LLCECALC for Windows is available in three volumes. Volume 1 is a user's manual, which is intended as a quick reference for both LLCEDATA and LLCECALC. Volume 2 is a technical manual, and Volume 3 is a software verification and validation document.

  16. Comparative exploration of multidimensional flow cytometry software: a model approach evaluating T cell polyfunctional behavior.

    PubMed

    Spear, Timothy T; Nishimura, Michael I; Simms, Patricia E

    2017-08-01

    Advancement in flow cytometry reagents and instrumentation has allowed for simultaneous analysis of large numbers of lineage/functional immune cell markers. Highly complex datasets generated by polychromatic flow cytometry require proper analytical software to answer investigators' questions. A problem among many investigators and flow cytometry Shared Resource Laboratories (SRLs), including our own, is a lack of access to a flow cytometry-knowledgeable bioinformatics team, making it difficult to learn and choose appropriate analysis tool(s). Here, we comparatively assess various multidimensional flow cytometry software packages for their ability to answer a specific biologic question and provide graphical representation output suitable for publication, as well as their ease of use and cost. We assessed polyfunctional potential of TCR-transduced T cells, serving as a model evaluation, using multidimensional flow cytometry to analyze 6 intracellular cytokines and degranulation on a per-cell basis. Analysis of 7 parameters resulted in 128 possible combinations of positivity/negativity, far too complex for basic flow cytometry software to analyze fully. Various software packages were used, analysis methods used in each described, and representative output displayed. Of the tools investigated, automated classification of cellular expression by nonlinear stochastic embedding (ACCENSE) and coupled analysis in Pestle/simplified presentation of incredibly complex evaluations (SPICE) provided the most user-friendly manipulations and readable output, evaluating effects of altered antigen-specific stimulation on T cell polyfunctionality. This detailed approach may serve as a model for other investigators/SRLs in selecting the most appropriate software to analyze complex flow cytometry datasets. Further development and awareness of available tools will help guide proper data analysis to answer difficult biologic questions arising from incredibly complex datasets. 
© Society for Leukocyte Biology.
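    The combinatorics noted above (7 binary parameters yielding 2^7 = 128 positivity/negativity patterns) can be enumerated directly; the marker names below are a hypothetical panel for illustration, not the panel used in the study:

    ```python
    from itertools import product

    # Hypothetical 7-marker panel (illustrative names only)
    markers = ["IFNg", "IL2", "TNFa", "MIP1b", "IL17", "GMCSF", "CD107a"]

    # Every positive/negative combination a single cell could exhibit
    patterns = list(product([False, True], repeat=len(markers)))

    def label(pattern):
        """Human-readable label such as 'IFNg+ IL2- ...' for one pattern."""
        return " ".join(m + ("+" if on else "-") for m, on in zip(markers, pattern))
    ```

    Enumerating all 128 patterns per cell is exactly the bookkeeping that basic flow cytometry software cannot handle fully, motivating the dedicated tools compared in the study.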

  17. The Impact of Contextual Factors on the Security of Code

    DTIC Science & Technology

    2014-12-30

in which a system is resourced, overseen, managed and assured will have a lot to do with how successfully it performs in actual practice. Software is...ensure proper and adequate system assurance. Because of the high degree of skill and specialization required, details about software and systems are...whole has to be carefully coordinated in order to assure against the types of faults that are the basis for most of the exploits listed in the Common

  18. A New Framework for Software Visualization: A Multi-Layer Approach

    DTIC Science & Technology

    2006-09-01

primary target is an exploration of the current state of the area so that we can discover the challenges and propose solutions for them. The study ...Small define both areas of study to collectively be a part of Software Visualization. Visual Programming as 'Visual Programming' (VP) refers to...founded taxonomy, with the proper characteristics, can further investigation in any field of study. A common language or terminology and the existence of

  19. Empirical studies of software design: Implications for SSEs

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

    Implications for Software Engineering Environments (SEEs) are presented in viewgraph format for characteristics of projects studied; significant problems and crucial problem areas in software design for large systems; layered behavioral model of software processes; implications of field study results; software project as an ecological system; results of the LIFT study; information model of design exploration; software design strategies; results of the team design study; and a list of publications.

  20. 41 CFR 102-74.340 - Who is responsible for monitoring and controlling areas designated for smoking by an agency head...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... with proper signage? 102-74.340 Section 102-74.340 Public Contracts and Property Management Federal... designated for smoking by an agency head and for identifying those areas with proper signage? Agency heads... smoking and identifying these areas with proper signage. Suitable, uniform signs reading “Designated...

  1. 41 CFR 102-74.340 - Who is responsible for monitoring and controlling areas designated for smoking by an agency head...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... with proper signage? 102-74.340 Section 102-74.340 Public Contracts and Property Management Federal... designated for smoking by an agency head and for identifying those areas with proper signage? Agency heads... smoking and identifying these areas with proper signage. Suitable, uniform signs reading “Designated...

  2. 41 CFR 102-74.340 - Who is responsible for monitoring and controlling areas designated for smoking by an agency head...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... with proper signage? 102-74.340 Section 102-74.340 Public Contracts and Property Management Federal... designated for smoking by an agency head and for identifying those areas with proper signage? Agency heads... smoking and identifying these areas with proper signage. Suitable, uniform signs reading “Designated...

  3. 41 CFR 102-74.340 - Who is responsible for monitoring and controlling areas designated for smoking by an agency head...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... with proper signage? 102-74.340 Section 102-74.340 Public Contracts and Property Management Federal... designated for smoking by an agency head and for identifying those areas with proper signage? Agency heads... smoking and identifying these areas with proper signage. Suitable, uniform signs reading “Designated...

  4. 41 CFR 102-74.340 - Who is responsible for monitoring and controlling areas designated for smoking by an agency head...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... with proper signage? 102-74.340 Section 102-74.340 Public Contracts and Property Management Federal... designated for smoking by an agency head and for identifying those areas with proper signage? Agency heads... smoking and identifying these areas with proper signage. Suitable, uniform signs reading “Designated...

  5. SEDS1 mission software verification using a signal simulator

    NASA Technical Reports Server (NTRS)

    Pierson, William E.

    1992-01-01

The first flight of the Small Expendable Deployer System (SEDS1) is scheduled to fly as the secondary payload of a Delta II in March, 1993. The objective of the SEDS1 mission is to collect data to validate the concept of tethered satellite systems and to verify computer simulations used to predict their behavior. SEDS1 will deploy a 50 lb. instrumented satellite as an end mass using a 20 km tether. Langley Research Center is providing the end mass instrumentation, while the Marshall Space Flight Center is designing and building the deployer. The objective of the experiment is to test the SEDS design concept by demonstrating that the system will satisfactorily deploy the full 20 km tether without stopping prematurely, come to a smooth stop on the application of a brake, and cut the tether at the proper time after it swings to the local vertical. Also, SEDS1 will collect data which will be used to test the accuracy of tether dynamics models used to simulate this type of deployment. The experiment will last about 1.5 hours and complete approximately 1.5 orbits. Radar tracking of the Delta II and end mass is planned. In addition, the SEDS1 on-board computer will continuously record, store, and transmit mission data over the Delta II S-band telemetry system. The Data System will count tether windings as the tether unwinds, log the times of each turn and other mission events, monitor tether tension, and record the temperature of system components. A summary of the measurements taken during the SEDS1 mission is shown. The Data System will also control the tether brake and cutter mechanisms. Preliminary versions of two major sections of the flight software, the data telemetry modules and the data collection modules, were developed and tested under the 1990 NASA/ASEE Summer Faculty Fellowship Program. To facilitate the debugging of these software modules, a prototype SEDS Data System was programmed to simulate turn count signals.
During the 1991 summer program, the concept of simulating signals produced by the SEDS electronics systems and circuits was expanded and more precisely defined. During the 1992 summer program, the SEDS signal simulator was programmed to test the requirements of the SEDS Mission software, and this simulator will be used in the formal verification of the SEDS Mission Software. The formal test procedures specification was written which incorporates the use of the signal simulator to test the SEDS Mission Software and which incorporates procedures for testing the other major component of the SEDS software, the Monitor Software.

  6. Sampling theory and automated simulations for vertical sections, applied to human brain.

    PubMed

    Cruz-Orive, L M; Gelšvartas, J; Roberts, N

    2014-02-01

In recent years, there have been substantial developments in both magnetic resonance imaging techniques and automatic image analysis software. The purpose of this paper is to develop stereological image sampling theory (i.e. unbiased sampling rules) that can be used by image analysts for estimating geometric quantities such as surface area and volume, and to illustrate its implementation. The methods will ideally be applied automatically on segmented, properly sampled 2D images - although convenient manual application is always an option - and they are of wide applicability in many disciplines. In particular, the vertical sections design to estimate surface area is described in detail and applied to estimate the area of the pial surface and of the boundary between cortex and underlying white matter (i.e. subcortical surface area). For completeness, cortical volume and mean cortical thickness are also estimated. The aforementioned surfaces were triangulated in 3D with the aid of FreeSurfer software, which provided accurate surface area measures that served as gold standards. Furthermore, software was developed to produce digitized trace curves of the triangulated target surfaces automatically from virtual sections. From such traces, a new method (called the 'lambda method') is presented to estimate surface area automatically. In addition, with the new software, intersections could be counted automatically between the relevant surface traces and a cycloid test grid for the classical design. This capability, together with the aforementioned gold standard, enabled us to thoroughly check the performance and the variability of the different estimators by Monte Carlo simulations for studying the human brain. In particular, new methods are offered to split the total error variance into orientation, sectioning, and cycloid components.
The latter prediction was hitherto unavailable; one is proposed here and checked by way of simulations on a given set of digitized vertical sections with automatically superimposed cycloid grids of three different sizes. Concrete and detailed recommendations are given for implementing the methods. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
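    The cycloid intersection counts described above feed the classical stereological estimator S_V = 2·I/L (surface area per unit reference volume from intersections per unit test-line length); a minimal sketch of that standard estimator, not the paper's 'lambda method':

    ```python
    def surface_density(n_intersections, test_line_length):
        """Classical estimator S_V = 2 * I / L, where I is the number of
        intersections between the surface trace and the (cycloid) test lines,
        and L is the total test-line length within the reference space."""
        return 2.0 * n_intersections / test_line_length

    def total_surface_area(n_intersections, test_line_length, reference_volume):
        """Scale the surface density by the reference volume to get absolute area."""
        return surface_density(n_intersections, test_line_length) * reference_volume
    ```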

  7. Development of a Multi-Channel, High Frequency QRS Electrocardiograph

    NASA Technical Reports Server (NTRS)

    DePalma, Jude L.

    2003-01-01

    With the advent of the ISS era and the potential requirement for increased cardiovascular monitoring of crewmembers during extended EVAs, NASA flight surgeons would stand to benefit from an evolving technology that allows for a more rapid diagnosis of myocardial ischemia compared to standard electrocardiography. Similarly, during the astronaut selection process, NASA flight surgeons and other physicians would also stand to benefit from a completely noninvasive technology that, either at rest or during maximal exercise tests, is more sensitive than standard ECG in identifying the presence of ischemia. Perhaps most importantly, practicing cardiologists and emergency medicine physicians could greatly benefit from such a device as it could augment (or even replace) standard electrocardiography in settings where the rapid diagnosis of myocardial ischemia (or the lack thereof) is required for proper clinical decision-making. A multi-channel, high-frequency QRS electrocardiograph is currently under development in the Life Sciences Research Laboratories at JSC. Specifically the project consisted of writing software code, some of which contained specially-designed digital filters, which will be incorporated into an existing commercial software program that is already designed to collect, plot and analyze conventional 12-lead ECG signals on a desktop, portable or palm PC. The software will derive the high-frequency QRS signals, which will be analyzed (in numerous ways) and plotted alongside of the conventional ECG signals, giving the PC-viewing clinician advanced diagnostic information that has never been available previously in all 12 ECG leads simultaneously. After the hardware and software for the advanced digital ECG monitor have been fully integrated, plans are to use the monitor to begin clinical studies both on healthy subjects and on patients with known coronary artery disease in both the outpatient and hospital settings. 
The ultimate goal is to get the technology out into the clinical world, where it has the potential to save lives.

  8. Feasibility study, software design, layout and simulation of a two-dimensional fast Fourier transform machine for use in optical array interferometry

    NASA Technical Reports Server (NTRS)

    Boriakoff, Valentin; Chen, Wei

    1990-01-01

    The NASA-Cornell Univ.-Worcester Polytechnic Institute Fast Fourier Transform (FFT) chip, based on the systolic FFT computation architecture presented by Boriakoff, was carried through to an operating device design. The kernel of the system, a systolic inner-product floating point processor, was designed to be assembled into a systolic network that takes incoming data streams in pipeline fashion and provides FFT output at the same rate, word by word. It was thoroughly simulated for proper operation and passed a comprehensive set of tests showing no operational errors. The black-box specifications of the chip, which conform to the initial requirements of the design as specified by NASA, are given. The five subcells are described, and their high-level functional descriptions, logic diagrams, and simulation results are presented. Some modifications of the Read Only Memory (ROM) design were made after errors were found in it. Because a four-stage pipeline structure was used, simulation is more difficult than for an ordinary structure; the simulation methods are discussed. Chip signal protocols and chip pinout are explained.
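The systolic inner-product cell described above can be modeled behaviorally: each output cell performs a multiply-accumulate on the streamed input word with a running twiddle factor. The sketch below models that dataflow for a plain DFT; the chip's actual FFT network, word formats, and timing are not reproduced here:

```python
import numpy as np

def systolic_dft(x):
    """Model N inner-product cells: cell k accumulates sum_n x[n] * W**(n*k)
    as the input stream passes through, one word per clock."""
    N = len(x)
    k = np.arange(N)
    step = np.exp(-2j * np.pi * k / N)   # per-cell twiddle increment W**k
    coef = np.ones(N, dtype=complex)     # current twiddle W**(n*k) held by each cell
    acc = np.zeros(N, dtype=complex)
    for sample in x:
        acc += sample * coef             # multiply-accumulate in every cell
        coef *= step                     # advance each cell's twiddle for the next word
    return acc

x = np.random.default_rng(0).standard_normal(16)
X = systolic_dft(x)
```

The same streaming structure is what a systolic network exploits in hardware: data flows, coefficients stay local, and output emerges at the input word rate.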

  9. VARED: Verification and Analysis of Requirements and Early Designs

    NASA Technical Reports Server (NTRS)

    Badger, Julia; Throop, David; Claunch, Charles

    2014-01-01

    Requirements are a part of every project life cycle; everything going forward in a project depends on them. Good requirements are hard to write; there are few useful tools to test, verify, or check them; and it is difficult to properly marry them to the subsequent design, especially if the requirements are written in natural language. In fact, the inconsistencies and errors in the requirements, along with the difficulty in finding these errors, contribute greatly to the cost of the testing and verification stage of flight software projects [1]. Large projects tend to have several thousand requirements written at various levels by different groups of people. The design process is distributed, and a lack of widely accepted standards for requirements often results in a product that varies widely in style and quality. A simple way to improve this would be to standardize the design process using a set of tools and widely accepted requirements design constraints. The difficulty with this approach is finding the appropriate constraints and tools. Common complaints against the available tools concern ease of use, functionality, and available features. Also, although preferable, it is rare that these tools are capable of testing the quality of the requirements themselves.
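A small illustration of the kind of automated requirements-quality check the abstract says is rare: a hypothetical lint that flags ambiguous or unverifiable wording in natural-language requirements. The word lists and category names below are invented for illustration, not taken from VARED:

```python
import re

# Hypothetical "weak word" patterns that requirements-quality checkers commonly flag.
WEAK_PATTERNS = {
    "ambiguous": re.compile(r"\b(as appropriate|adequate|user-friendly|etc\.?|and/or)\b", re.I),
    "unverifiable": re.compile(r"\b(quickly|easily|sufficient|minimi[sz]e|maximi[sz]e)\b", re.I),
    "incomplete": re.compile(r"\b(TBD|TBR|TBS)\b"),
}

def lint_requirement(text):
    """Return the sorted categories of weak wording found in one requirement."""
    return sorted(cat for cat, pat in WEAK_PATTERNS.items() if pat.search(text))

bad = lint_requirement("The software shall respond quickly and log errors as appropriate.")
good = lint_requirement("The software shall log each command within 50 ms.")
```

A production checker would go further (consistency across levels, traceability to design), but even this surface pass catches wording that cannot be verified by test.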

  10. MULTIMEDIA EXPOSURE MODELING

    EPA Science Inventory

    This task addresses a number of issues that arise in multimedia modeling with an emphasis on interactions among the atmosphere and multiple other environmental media. Approaches for working with multiple types of models and the data sets are being developed. Proper software tool...

  11. ARC-2007-ACD07-0140-002

    NASA Image and Video Library

    2007-07-31

    David L. Iverson of NASA Ames Research Center, Moffett Field, California (in foreground) led development of computer software to monitor the condition of the gyroscopes that keep the International Space Station (ISS) properly oriented in space as the ISS orbits Earth. Charles Lee is also pictured. During its development, researchers used the software to analyze archived gyroscope records. In these tests, users noticed problems with the gyroscopes long before the current systems flagged glitches. Testers trained using several months of normal space station gyroscope data collected by the International Space Station Mission Control Center at NASA Johnson Space Center, Houston. Promising test results convinced officials to start using the software in 2007.

  12. SigmaPlot 2000, Version 6.00, SPSS Inc. Computer Software Test Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HURLBUT, S.T.

    2000-10-24

    SigmaPlot is a vendor software product used in conjunction with the supercritical fluid extraction Fourier transform infrared spectrometer (SFE-FTIR) system. This product converts the raw spectral data to useful area numbers. SigmaPlot will be used in conjunction with procedure ZA-565-301, ''Determination of Moisture by Supercritical Fluid Extraction and Infrared Detection.'' This test plan will be performed in conjunction with or prior to HNF-6936, ''HA-53 Supercritical Fluid Extraction System Acceptance Test Plan'', to perform analyses for water. The test will ensure that the software can be installed properly and will manipulate the analytical data correctly.

  13. Students' Different Understandings of Class Diagrams

    ERIC Educational Resources Information Center

    Boustedt, Jonas

    2012-01-01

    The software industry needs well-trained software designers, and one important aspect of software design is the ability to model software designs visually and understand what visual models represent. However, previous research indicates that software design is a difficult task for many students. This article reports empirical findings from a…

  14. Diatomite filtration of water for injection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olmsted, B.C. Jr.; Bell, G.R.

    1966-01-01

    A discussion is presented of the capabilities, problems, and answers in the performance of diatomite filters. The discussion includes a description of diatomite filtration, new developments, design criteria, and some case histories. Diatomite filters, when properly designed, installed, and applied, can provide effective clarification of waters for injection at low capital and operating costs. Design, installation, proper application, effectiveness, and capital and operating costs can be placed in proper perspective in the light of general experience and recent pilot plant tests in the southern California area. (30 refs.)

  15. Intelligent systems technology infrastructure for integrated systems

    NASA Technical Reports Server (NTRS)

    Lum, Henry

    1991-01-01

    A system infrastructure must be properly designed and integrated from the conceptual development phase to accommodate evolutionary intelligent technologies. Several technology development activities were identified that may have application to rendezvous and capture systems. Optical correlators in conjunction with fuzzy logic control might be used for the identification, tracking, and capture of either cooperative or non-cooperative targets without the intensive computational requirements associated with vision processing. A hybrid digital/analog system was developed and tested with a robotic arm. An aircraft refueling application demonstration is planned within two years; initially this demonstration will be ground-based, with a follow-on air-based demonstration. System dependability measurement and modeling techniques are being developed for fault management applications. This involves the use of incremental solution/evaluation techniques and modularized systems to facilitate reuse and to take advantage of natural partitions in system models. Though not yet commercially available and currently subject to accuracy limitations, technology is being developed to perform optical matrix operations to enhance computational speed. Optical terrain recognition, using camera image sequences processed with optical correlators, is being developed to determine position and velocity in support of lander guidance. The system is planned for testing in conjunction with the Dryden Flight Research Facility. Advanced architecture technology is defining open architecture design constraints, test bed concepts (processors, multiple hardware/software and multi-dimensional user support, knowledge/tool sharing infrastructure), and software engineering interface issues.

  16. NOTE: MMCTP: a radiotherapy research environment for Monte Carlo and patient-specific treatment planning

    NASA Astrophysics Data System (ADS)

    Alexander, A.; DeBlois, F.; Stroian, G.; Al-Yahya, K.; Heath, E.; Seuntjens, J.

    2007-07-01

    Radiotherapy research lacks a flexible computational research environment for Monte Carlo (MC) and patient-specific treatment planning. The purpose of this study was to develop a flexible software package on low-cost hardware with the aim of integrating new patient-specific treatment planning with MC dose calculations suitable for large-scale prospective and retrospective treatment planning studies. We designed the software package 'McGill Monte Carlo treatment planning' (MMCTP) for the research development of MC and patient-specific treatment planning. The MMCTP design consists of a graphical user interface (GUI), which runs on a simple workstation connected through the standard secure-shell protocol to a cluster for lengthy MC calculations. Treatment planning information (e.g., images, structures, beam geometry properties and dose distributions) is converted into a convenient MMCTP local file storage format designated the McGill RT format. MMCTP features include (a) DICOM_RT, RTOG and CADPlan CART format imports; (b) 2D and 3D visualization views for images, structure contours, and dose distributions; (c) contouring tools; (d) DVH analysis and dose matrix comparison tools; (e) external beam editing; (f) MC transport calculation from beam source to patient geometry for photon and electron beams. The MC input files, which are prepared from the beam geometry properties and patient information (e.g., images and structure contours), are uploaded and run on a cluster using shell commands controlled from the MMCTP GUI. The visualization, dose matrix operation and DVH tools offer extensive options for plan analysis and comparison between MC plans and plans imported from commercial treatment planning systems. The MMCTP GUI provides a flexible research platform for the development of patient-specific MC treatment planning for photon and electron external beam radiation therapy. Its impact lies in enabling systematic, platform-independent, large-scale MC treatment planning for different treatment sites. Patient recalculations were performed to validate the software and ensure proper functionality.
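As an illustration of the DVH analysis feature mentioned above, a cumulative dose-volume histogram can be sketched in a few lines: for each dose threshold, the fraction of the structure's voxels receiving at least that dose. The toy dose grid and mask below are assumptions, not MMCTP's actual data model:

```python
import numpy as np

def cumulative_dvh(dose, mask, thresholds):
    """Fraction of the structure volume receiving at least each threshold dose."""
    voxels = dose[mask]
    return np.array([(voxels >= t).mean() for t in thresholds])

dose = np.array([[1.0, 2.0], [3.0, 4.0]])       # toy dose grid (Gy)
mask = np.array([[True, True], [True, False]])  # toy structure contour
dvh = cumulative_dvh(dose, mask, [0.0, 2.5, 5.0])
```

Comparing two plans then reduces to comparing their DVH curves over a common set of thresholds.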

  17. Assessing efficiency of software production for NASA-SEL data

    NASA Technical Reports Server (NTRS)

    Vonmayrhauser, Anneliese; Roeseler, Armin

    1993-01-01

    This paper uses production models to identify and quantify efficient allocation of resources and key drivers of software productivity for project data in the NASA-SEL database. While the analysis allows identification of efficient projects, many of the metrics that could have supported a more detailed assessment are not at a level of measurement that permits production model analysis. Production models must be properly parameterized to be successful. This may mean a new look at which metrics are helpful for efficiency assessment.

  18. Software For Computer-Security Audits

    NASA Technical Reports Server (NTRS)

    Arndt, Kate; Lonsford, Emily

    1994-01-01

    Information relevant to potential breaches of security gathered efficiently. Automated Auditing Tools for VAX/VMS program includes following automated software tools performing noted tasks: Privileged ID Identification, program identifies users and their privileges to circumvent existing computer security measures; Critical File Protection, critical files not properly protected identified; Inactive ID Identification, identifications of users no longer in use found; Password Lifetime Review, maximum lifetimes of passwords of all identifications determined; and Password Length Review, minimum allowed length of passwords of all identifications determined. Written in DEC VAX DCL language.
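The password and inactivity reviews listed above can be sketched generically. The record fields, thresholds, and findings below are hypothetical illustrations; the real tools operate on VAX/VMS authorization data and are written in DEC VAX DCL, not Python:

```python
# Hypothetical user records; field names are illustrative, not the VMS UAF layout.
users = [
    {"id": "SYSOP", "pwd_length": 12, "pwd_lifetime_days": 30, "last_login_days": 2},
    {"id": "GUEST", "pwd_length": 4, "pwd_lifetime_days": 0, "last_login_days": 400},
]

def audit(users, min_length=8, max_lifetime=90, inactive_after=180):
    """Flag short passwords, unbounded/overlong lifetimes, and inactive IDs."""
    findings = []
    for u in users:
        if u["pwd_length"] < min_length:
            findings.append((u["id"], "password too short"))
        if u["pwd_lifetime_days"] == 0 or u["pwd_lifetime_days"] > max_lifetime:
            findings.append((u["id"], "password lifetime unbounded or too long"))
        if u["last_login_days"] > inactive_after:
            findings.append((u["id"], "inactive identification"))
    return findings

report = audit(users)
```

Gathering these findings automatically is the point of the tool suite: the auditor reviews a short report instead of every account by hand.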

  19. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper surveys methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  20. Monaural Sound Localization Based on Reflective Structure and Homomorphic Deconvolution

    PubMed Central

    Park, Yeonseok; Choi, Anthony

    2017-01-01

    The asymmetric structure around the receiver provides a particular time delay for each specific incoming propagation path. This paper designs a monaural sound localization system based on the reflective structure around the microphone. The reflective plates are placed to present a direction-wise time delay, which is naturally processed by convolution with the sound source. The received signal is separated to estimate the dominant time delay by using homomorphic deconvolution, which applies the real cepstrum and inverse cepstrum sequentially to derive the propagation response's autocorrelation. Once the localization system accurately estimates this information, the time delay model computes the corresponding reflection for localization. Because of structural limitations, the localization process estimates range and angle in two stages. A software toolchain spanning propagation physics and algorithm simulation was used to realize the optimal 3D-printed structure. Acoustic experiments in an anechoic chamber show that 79.0% of the study-range data from the isotropic signal is properly detected by the response value, and 87.5% of the specific-direction data from the study-range signal is properly estimated by the response time. The product of both rates gives an overall hit rate of 69.1%. PMID:28946625
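The delay estimation via the real cepstrum can be illustrated directly: a single reflection adds a ripple to the log magnitude spectrum whose inverse transform peaks at the delay. A minimal numpy sketch with an assumed synthetic source, reflection gain, and delay (not the paper's actual toolchain):

```python
import numpy as np

rng = np.random.default_rng(1)
N, delay, a = 1024, 25, 0.6
s = rng.standard_normal(N)     # broadband source
x = s.copy()
x[delay:] += a * s[:-delay]    # one reflection: x[n] = s[n] + a * s[n - delay]

# Real cepstrum: inverse FFT of the log magnitude spectrum
log_mag = np.log(np.abs(np.fft.rfft(x)) + 1e-12)
cepstrum = np.fft.irfft(log_mag, n=N)

# The reflection contributes a peak at quefrency == delay samples
est = int(np.argmax(cepstrum[5:N // 2]) + 5)
```

Mapping the estimated delay back to a direction then requires the known geometry of the reflective plates, which is the structure-design half of the paper.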

  1. An NAFP Project: Use of Object Oriented Methodologies and Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Baggs, Rhoda

    2007-01-01

    In the early problem-solution era of software programming, functional decomposition was mainly used to design and implement software solutions. In functional decomposition, functions and data are introduced as two separate entities during the design phase and are treated as such in the implementation phase. Functional decomposition makes use of refactoring by optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all of this is done during the implementation phase. This paper advocates the use of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionality. The combined use of object-oriented methodologies and design patterns to refactor should also reduce the overall software life-cycle cost while improving the software.
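A small hypothetical example of refactoring toward a design pattern while explicitly preserving external behavior, in the spirit of the paper. The Strategy pattern and the shipping-cost domain are illustrative choices, not taken from the paper:

```python
# Before: behavior buried in a conditional (functional-decomposition style)
def shipping_cost_v1(kind, weight):
    if kind == "ground":
        return 5.0 + 0.5 * weight
    elif kind == "air":
        return 12.0 + 1.5 * weight
    raise ValueError(kind)

# After: the same external behavior, refactored to the Strategy pattern;
# adding a new shipping kind no longer touches existing code paths.
class Ground:
    def cost(self, weight):
        return 5.0 + 0.5 * weight

class Air:
    def cost(self, weight):
        return 12.0 + 1.5 * weight

STRATEGIES = {"ground": Ground(), "air": Air()}

def shipping_cost_v2(kind, weight):
    return STRATEGIES[kind].cost(weight)
```

The refactoring is valid precisely because both versions return identical results for every input, which is what "preserving external functionality" means in practice.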

  2. Microwave Scanning System Correlations

    DTIC Science & Technology

    2010-08-11

    The following equipment is needed for each of the individual scanning systems: Handheld Scanner Equipment list 1. Dell Netbook (with the...proper software installed by Evisive) 2. Bluetooth USB port transmitter 3. Handheld Probe 4. USB to mini-USB Converter (links camera to netbook

  3. Knowledge.

    ERIC Educational Resources Information Center

    Online-Offline, 1999

    1999-01-01

    This theme issue on knowledge includes annotated listings of Web sites, CD-ROMs and computer software, videos, books, and additional resources that deal with knowledge and differences between how animals and humans learn. Sidebars discuss animal intelligence, learning proper behavior, and getting news from the Internet. (LRW)

  4. Copyright Survey Results.

    ERIC Educational Resources Information Center

    Botterbusch, Hope R.

    1992-01-01

    Reports results of a survey of copyright concerns that was conducted by the Association for Educational Communications and Technology. Areas addressed include video and television; copyright legislation; printed materials; music; audiovisual materials; and computer software. A checklist of proper copyright procedures is included. (six references)…

  5. Application of a personal computer for the uncoupled vibration analysis of wind turbine blade and counterweight assemblies

    NASA Technical Reports Server (NTRS)

    White, P. R.; Little, R. R.

    1985-01-01

    A research effort was undertaken to develop personal computer based software for vibrational analysis. The software was developed to analytically determine the natural frequencies and mode shapes for the uncoupled lateral vibrations of the blade and counterweight assemblies used in a single bladed wind turbine. The uncoupled vibration analysis was performed in both the flapwise and chordwise directions for static rotor conditions. The effects of rotation on the uncoupled flapwise vibration of the blade and counterweight assemblies were evaluated for various rotor speeds up to 90 rpm. The theory, used in the vibration analysis codes, is based on a lumped mass formulation for the blade and counterweight assemblies. The codes are general so that other designs can be readily analyzed. The input for the codes is generally interactive to facilitate usage. The output of the codes is both tabular and graphical. Listings of the codes are provided. Predicted natural frequencies of the first several modes show reasonable agreement with experimental results. The analysis codes were originally developed on a DEC PDP 11/34 minicomputer and then downloaded and modified to run on an ITT XTRA personal computer. Studies conducted to evaluate the efficiency of running the programs on a personal computer as compared with the minicomputer indicated that, with the proper combination of hardware and software options, the efficiency of using a personal computer exceeds that of a minicomputer.
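The lumped-mass formulation described above reduces the uncoupled vibration problem to a matrix eigenproblem for the natural frequencies. A minimal sketch for an assumed two-degree-of-freedom spring-mass chain (not the actual blade/counterweight model or its data):

```python
import numpy as np

# Two-DOF lumped-mass chain: wall -k- m -k- m (free end); values are assumptions
k, m = 2000.0, 5.0
K = np.array([[2 * k, -k], [-k, k]])  # stiffness matrix
M = m * np.eye(2)                     # lumped mass matrix

# Undamped natural frequencies from the generalized eigenproblem K v = w^2 M v.
# Because M is a scalar multiple of the identity here, this reduces to eig(K / m).
w2 = np.linalg.eigvalsh(K / m)        # ascending w^2 values
freqs_hz = np.sqrt(w2) / (2 * np.pi)
```

For this chain the eigenvalues are known in closed form, w^2 = (k/m)(3 ± sqrt(5))/2, which makes a convenient check on the numerical routine; a blade model simply uses larger K and M assembled from the lumped stations.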

  6. SSAGES: Software Suite for Advanced General Ensemble Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods, and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite.

  7. SSAGES: Software Suite for Advanced General Ensemble Simulations.

    PubMed

    Sidky, Hythem; Colón, Yamil J; Helfferich, Julian; Sikora, Benjamin J; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S; Reid, Daniel R; Sevgen, Emre; Thapar, Vikram; Webb, Michael A; Whitmer, Jonathan K; de Pablo, Juan J

    2018-01-28

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques-including adaptive biasing force, string methods, and forward flux sampling-that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.
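One of the enhanced sampling techniques named above, adaptive biasing force, can be sketched in one dimension: accumulate the running mean force on the collective variable per bin and apply its negative, so the free-energy barrier is progressively flattened. This is a toy illustration under assumed parameters, not SSAGES code or its API:

```python
import numpy as np

rng = np.random.default_rng(42)

def dU(x):  # gradient of the double-well potential U(x) = (x^2 - 1)^2
    return 4.0 * x * (x * x - 1.0)

# Adaptive biasing force on a 1D overdamped Langevin particle (assumed parameters)
kT, dt, nbins, lo, hi = 0.2, 0.01, 40, -2.0, 2.0
counts = np.zeros(nbins)
force_sum = np.zeros(nbins)   # running sum of the instantaneous force -U'(x) per bin
x = -1.0                      # start in the left well
traj_min = traj_max = x

for _ in range(200_000):
    b = min(nbins - 1, max(0, int((x - lo) / (hi - lo) * nbins)))
    f = -dU(x)
    counts[b] += 1
    force_sum[b] += f
    # once a bin is well sampled, cancel the estimated mean force acting there
    bias = -force_sum[b] / counts[b] if counts[b] > 100 else 0.0
    x += (f + bias) * dt + np.sqrt(2 * kT * dt) * rng.standard_normal()
    traj_min, traj_max = min(traj_min, x), max(traj_max, x)
```

With the bias converged, motion along the collective variable becomes nearly diffusive and both wells are visited; integrating the stored mean-force estimates then recovers the free-energy profile.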

  8. A hybrid integrated services digital network-internet protocol solution for resident education.

    PubMed

    Erickson, Delnora; Greer, Lester; Belard, Arnaud; Tinnel, Brent; O'Connell, John

    2010-05-01

    The purpose of this study was to explore the effectiveness of incorporating Web-based application sharing of virtual medical simulation software within a multipoint video teleconference (VTC) as a training tool in graduate medical education. National Capital Consortium Radiation Oncology Residency Program resident and attending physicians participated in dosimetry teaching sessions held via VTC using Acrobat Connect application sharing. Residents at remote locations could take turns designing radiation treatments using standard three-dimensional planning software, whereas instructors gave immediate feedback and demonstrated proper techniques. Immediately after each dosimetry lesson, residents were asked to complete a survey that evaluated the effectiveness of the session. At the end of a 3-month trial of using Adobe Connect, residents completed a final survey that compared this teaching technology to the prior VTC-alone method. The mean difference from equality across all quality measures from the weekly survey was 0.8, where 0 indicated neither enhanced nor detracted from the learning experience and 1 indicated a minor enhancement in the learning experience. The mean difference from equality across all measures from the final survey comparing use of application sharing with VTC to VTC alone was 1.5, where 1 indicated slightly better and 2 indicated a somewhat better experience. The teaching efficacy of multipoint VTC is perceived by medical residents to be more effective when complemented by application-sharing software such as Adobe Acrobat Connect.

  9. Computer simulation of on-orbit manned maneuvering unit operations

    NASA Technical Reports Server (NTRS)

    Stuart, G. M.; Garcia, K. D.

    1986-01-01

    Simulation of spacecraft on-orbit operations is discussed in reference to Martin Marietta's Space Operations Simulation laboratory's use of computer software models to drive a six-degree-of-freedom moving base carriage and two target gimbal systems. In particular, key simulation issues and related computer software models associated with providing real-time, man-in-the-loop simulations of the Manned Maneuvering Unit (MMU) are addressed, with special attention given to how effectively these models and motion systems simulate the MMU's actual on-orbit operations. The weightless effects of the space environment require the development of entirely new devices for locomotion. Since access to space is very limited, it is necessary to design, build, and test these new devices within the physical constraints of Earth using simulators. The simulation method discussed here is the technique of using computer software models to drive a Moving Base Carriage (MBC) capable of providing simultaneous six-degree-of-freedom motions. This method, utilized at Martin Marietta's Space Operations Simulation (SOS) laboratory, provides the ability to simulate the operation of manned spacecraft, provides the pilot with proper three-dimensional visual cues, and allows training of on-orbit operations. The purpose here is to discuss significant MMU simulation issues, the models developed in response to these issues, and how effectively these models simulate the MMU's actual on-orbit operations.

  10. SSAGES: Software Suite for Advanced General Ensemble Simulations

    NASA Astrophysics Data System (ADS)

    Sidky, Hythem; Colón, Yamil J.; Helfferich, Julian; Sikora, Benjamin J.; Bezik, Cody; Chu, Weiwei; Giberti, Federico; Guo, Ashley Z.; Jiang, Xikai; Lequieu, Joshua; Li, Jiyuan; Moller, Joshua; Quevillon, Michael J.; Rahimi, Mohammad; Ramezani-Dakhel, Hadi; Rathee, Vikramjit S.; Reid, Daniel R.; Sevgen, Emre; Thapar, Vikram; Webb, Michael A.; Whitmer, Jonathan K.; de Pablo, Juan J.

    2018-01-01

    Molecular simulation has emerged as an essential tool for modern-day research, but obtaining proper results and making reliable conclusions from simulations requires adequate sampling of the system under consideration. To this end, a variety of methods exist in the literature that can enhance sampling considerably, and increasingly sophisticated, effective algorithms continue to be developed at a rapid pace. Implementation of these techniques, however, can be challenging for experts and non-experts alike. There is a clear need for software that provides rapid, reliable, and easy access to a wide range of advanced sampling methods and that facilitates implementation of new techniques as they emerge. Here we present SSAGES, a publicly available Software Suite for Advanced General Ensemble Simulations designed to interface with multiple widely used molecular dynamics simulations packages. SSAGES allows facile application of a variety of enhanced sampling techniques—including adaptive biasing force, string methods, and forward flux sampling—that extract meaningful free energy and transition path data from all-atom and coarse-grained simulations. A noteworthy feature of SSAGES is a user-friendly framework that facilitates further development and implementation of new methods and collective variables. In this work, the use of SSAGES is illustrated in the context of simple representative applications involving distinct methods and different collective variables that are available in the current release of the suite. The code may be found at: https://github.com/MICCoM/SSAGES-public.

  11. Performance testing of 3D point cloud software

    NASA Astrophysics Data System (ADS)

    Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.

    2013-10-01

    LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirement involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available in the market, including open source suites; however, users often lack methodologies to verify their performance properly. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in point cloud loading time and CPU usage. However, it is not as strong as the commercial suites in the working set and commit size tests.
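The performance measures used in such a study (loading time, memory footprint) can be collected with a small harness. The loader below is a synthetic stand-in, not any of the tested suites, and the field names are illustrative:

```python
import time
import tracemalloc

def load_point_cloud(n):
    """Stand-in loader: builds a synthetic cloud of n (x, y, z) tuples."""
    return [(float(i), float(i) * 0.5, float(i) * 0.25) for i in range(n)]

def benchmark(fn, *args):
    """Measure wall-clock time and peak Python memory for one call."""
    tracemalloc.start()
    t0 = time.perf_counter()
    result = fn(*args)
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return {"elapsed_s": elapsed, "peak_bytes": peak, "points": len(result)}

stats = benchmark(load_point_cloud, 100_000)
```

Benchmarking closed-source suites requires OS-level counters (working set, commit size) rather than in-process instrumentation, but the harness structure, timing one operation at a time over identical datasets, is the same.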

  12. [Software for illustrating a cost-quality balance carried out by clinical laboratory practice].

    PubMed

    Nishibori, Masahiro; Asayama, Hitoshi; Kimura, Satoshi; Takagi, Yasushi; Hagihara, Michio; Fujiwara, Mutsunori; Yoneyama, Akiko; Watanabe, Takashi

    2010-09-01

    We have no proper reference indicating the quality of clinical laboratory practice, one that clearly illustrates that better medical tests require greater expense. The Japanese Society of Laboratory Medicine, concerned about the recent difficult medical economy, issued a committee report proposing a guideline for evaluating good laboratory practice. Following the guideline, we developed software that illustrates the cost-quality balance achieved by clinical laboratory practice. We encountered a number of controversial problems, for example: how to measure and weight each quality-related factor, how to calculate the costs of a laboratory test, and how to consider the characteristics of a clinical laboratory. Consequently, we finished only prototype software within the given period and budget. In this paper, the software implementation of the guideline and the above-mentioned problems are summarized. To stimulate these discussions, the operational software will be put on the Society's homepage for trial.

  13. Simulation Control Graphical User Interface Logging Report

    NASA Technical Reports Server (NTRS)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directs the startup and initialization flow of the various LCS software components; this ensures that a software component will not spin up until all of its dependencies have been configured properly. I also assisted hardware modelers in verifying the configuration of models after they were upgraded to a new software version, developing code that analyzes the MDL files to determine whether any errors were generated by the upgrade process. Another project assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

  14. The Commander’s Emergency Response Program: A Model for Future Implementation

    DTIC Science & Technology

    2010-04-07

    The INVEST-E methodology serves as a tool for commanders and their designated practitioners to properly select projects, increasing the effectiveness of CERP funds.

  15. Offering integrated medical equipment management in an application service provider model.

    PubMed

    Cruz, Antonio Miguel; Barr, Cameron; Denis, Ernesto Rodríguez

    2007-01-01

    With the advancement of medical technology and thus the complexity of the equipment under their care, clinical engineering departments (CEDs) must continue to make use of computerized tools in the management of departmental activities. Authors of this paper designed, installed, and implemented an application service provider (ASP) model at the laboratory level to offer value added management tools in an online format to CEDs. The project, designed to investigate how to help meet demands across multiple healthcare organizations and provide a means of access for organizations that otherwise might not be able to take advantage of the benefits of those tools, has been well received. Ten hospitals have requested the service, and five of those are ready to proceed with the implementation of the ASP. With the proposed centralized system architecture, the model has shown promise in reducing network infrastructure labor and equipment costs, benchmarking of equipment performance indicators, and developing avenues for proper and timely problem reporting. The following is a detailed description of the design process from conception to implementation of the five main software modules and supporting system architecture.

  16. Discussions on attitude determination and control system for micro/nano/pico-satellites considering survivability based on Hodoyoshi-3 and 4 experiences

    NASA Astrophysics Data System (ADS)

    Nakasuka, Shinichi; Miyata, Kikuko; Tsuruda, Yoshihiro; Aoyanagi, Yoshihide; Matsumoto, Takeshi

    2018-04-01

The recent advancement of micro/nano/pico-satellite technologies encourages many universities to develop three-axis stabilized satellites. As three-axis stabilization is a high-level technology requiring the proper functioning of various sensors, actuators, and control software, many early satellites failed in their initial operation phase because of a shortage of solar power generation, or could not realize even the initial steps of their missions because of unexpected attitude control system performance. These failures stem from not designing the satellite attitude determination and control system (ADCS) appropriately and not considering "satellite survivability." The ADCS should be designed so that even if some sensors or actuators do not work as expected, the satellite can survive and carry out at least part of its mission. This paper discusses how to realize an ADCS that takes satellite survivability into account, based on our experience in the design and in-orbit operation of the Hodoyoshi-3 and 4 satellites launched in 2014, which suffered various component anomalies but could still complete their missions.

  17. Modelling and structural analysis of skull/cranial implant: beyond mid-line deformities.

    PubMed

    Bogu, V Phanindra; Kumar, Y Ravi; Kumar Khanara, Asit

    2017-01-01

This computational study explores modelling and finite element analysis of a cranial implant under intracranial pressure (ICP) conditions, covering both the normal ICP range (7 mm Hg to 15 mm Hg) and increased ICP (>15 mm Hg). The implant fixation points govern implant behaviour under intracranial pressure; increasing the number of fixation points changes the deformation and equivalent stress, and finite element analysis provides valuable insight into both. The patient's computed tomography (CT) data are processed in Mimics software to obtain the mesh model, and the implant is modelled using a modified reverse-engineering technique in Rhinoceros software. This modelling method is applicable to all types of defects, including multiple defects and those beyond the mid-line. The implant is designed with both eight and ten fixation points. The mechanical deformation and equivalent (von Mises) stress are then calculated in ANSYS 15 software for distinct material properties: titanium alloy (Ti6Al4V), polymethyl methacrylate (PMMA), and polyether-ether-ketone (PEEK). Ti6Al4V shows the lowest deformation and PEEK the lowest equivalent stress; among all materials, PEEK shows a noticeably good overall result. A design concept was thus established, and more clinically relevant results can be expected with the implementation of realistic 3D-printed models in the future, allowing physicians to gain knowledge and decrease surgery time through proper planning.
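
    The equivalent (von Mises) stress reported from the ANSYS runs is a standard quantity; as a point of reference, it can be computed directly from the three principal stresses. A minimal sketch (the stress values are illustrative, not taken from the study):

```python
import math

def von_mises(s1, s2, s3):
    """Equivalent (von Mises) stress from three principal stresses (same units)."""
    return math.sqrt(((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2) / 2.0)

# Uniaxial case: the equivalent stress equals the applied stress.
print(von_mises(100.0, 0.0, 0.0))  # 100.0

# Hydrostatic case: equal principal stresses give zero equivalent stress.
print(von_mises(50.0, 50.0, 50.0))  # 0.0
```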

  18. Rapid Development of Custom Software Architecture Design Environments

    DTIC Science & Technology

    1999-08-01

This dissertation describes a new approach to capturing and using architectural design expertise in software architecture design environments. A language and tools are presented for capturing and encapsulating software architecture design expertise within a conceptual framework of architectural styles and design rules. The design expertise thus captured is supported with an incrementally configurable software architecture design environment.

  19. "Reliability of Fiber Optic LANs"

    NASA Astrophysics Data System (ADS)

    Coden, Michael; Scholl, Frederick; Hatfield, W. Bryan

    1987-02-01

    Fiber optic Local Area Network Systems are being used to interconnect increasing numbers of nodes. These nodes may include office computer peripherals and terminals, PBX switches, process control equipment and sensors, automated machine tools and robots, and military telemetry and communications equipment. The extensive shared base of capital resources in each system requires that the fiber optic LAN meet stringent reliability and maintainability requirements. These requirements are met by proper system design and by suitable manufacturing and quality procedures at all levels of a vertically integrated manufacturing operation. We will describe the reliability and maintainability of Codenoll's passive star based systems. These include LAN systems compatible with Ethernet (IEEE 802.3) and MAP (IEEE 802.4), and software compatible with IBM Token Ring (IEEE 802.5). No single point of failure exists in this system architecture.

  20. Learning characteristics of a space-time neural network as a tether skiprope observer

    NASA Technical Reports Server (NTRS)

    Lea, Robert N.; Villarreal, James A.; Jani, Yashvant; Copeland, Charles

    1993-01-01

    The Software Technology Laboratory at the Johnson Space Center is testing a Space Time Neural Network (STNN) for observing tether oscillations present during retrieval of a tethered satellite. Proper identification of tether oscillations, known as 'skiprope' motion, is vital to safe retrieval of the tethered satellite. Our studies indicate that STNN has certain learning characteristics that must be understood properly to utilize this type of neural network for the tethered satellite problem. We present our findings on the learning characteristics including a learning rate versus momentum performance table.
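
    The learning-rate-versus-momentum trade-off reported above is generic to momentum-based training. A minimal sketch of gradient descent with momentum on a one-dimensional quadratic, useful for seeing how the two hyperparameters interact (the objective and settings are illustrative, not the STNN configuration):

```python
def minimize(grad, x0, lr=0.1, momentum=0.9, steps=100):
    """Gradient descent with momentum: v <- m*v - lr*grad(x); x <- x + v."""
    x, v = x0, 0.0
    for _ in range(steps):
        v = momentum * v - lr * grad(x)
        x = x + v
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3);
# all three momentum settings converge to the minimum at x = 3,
# but with different transient behaviour.
for m in (0.0, 0.5, 0.9):
    print(m, minimize(lambda x: 2 * (x - 3), x0=0.0, lr=0.05, momentum=m))
```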

  1. Learning characteristics of a space-time neural network as a tether skiprope observer

    NASA Technical Reports Server (NTRS)

    Lea, Robert N.; Villarreal, James A.; Jani, Yashvant; Copeland, Charles

    1992-01-01

    The Software Technology Laboratory at JSC is testing a Space Time Neural Network (STNN) for observing tether oscillations present during retrieval of a tethered satellite. Proper identification of tether oscillations, known as 'skiprope' motion, is vital to safe retrieval of the tethered satellite. Our studies indicate that STNN has certain learning characteristics that must be understood properly to utilize this type of neural network for the tethered satellite problem. We present our findings on the learning characteristics including a learning rate versus momentum performance table.

  2. Design, Implementation, and Wide Pilot Deployment of FitForAll: An Easy to use Exergaming Platform Improving Physical Fitness and Life Quality of Senior Citizens.

    PubMed

    Konstantinidis, Evdokimos I; Billis, Antonis S; Mouzakidis, Christos A; Zilidou, Vasiliki I; Antoniou, Panagiotis E; Bamidis, Panagiotis D

    2016-01-01

Many platforms have emerged in response to the call for technology supporting active and healthy aging. Key requirements for any such e-health system, and for any subsequent business exploitation, are tailor-made design and proper evaluation. This paper presents the design, implementation, wide deployment, and evaluation of the low-cost physical-exercise gaming (exergaming) FitForAll (FFA) platform; system usability, user adherence to exercise, and efficacy are explored. The design of FFA is tailored to elderly populations, distilling literature guidelines and recommendations. The FFA architecture introduces standard physical exercise protocols into exergaming software engineering, as well as standard physical assessment tests for augmented adaptability through adjustable exercise intensity. This opens the way to next-generation exergaming software that may be more automatically and smartly adaptive. 116 elderly users piloted FFA five times per week during an eight-week controlled intervention. Usability evaluation was formally conducted (SUS and SUMI questionnaires). The control group consisted of a size-matched elderly group following cognitive training. Efficacy was assessed objectively through the senior fitness (Fullerton) test and subjectively through pre- versus post-intervention WHOQoL-BREF comparisons between groups. Adherence to schedule was measured by attendance logs. The global SUMI score was 68.33±5.85%, while the SUS score was 77.7. Good usability perception is reflected in a relatively high adherence of 82% for a daily two-month pilot schedule. Compared to the control group, elderly users of FFA significantly improved strength, flexibility, endurance, and balance, while showing a significant trend toward quality-of-life improvements. This is the first elderly-focused exergaming platform intensively evaluated with more than 100 participants. The use of formal tools makes the findings comparable to other studies and forms an elderly exergaming corpus.
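
    The SUS score of 77.7 reported above comes from the standard System Usability Scale scoring rule, which maps ten 1-5 Likert items to a 0-100 scale. A sketch of that rule (the response vectors are made-up examples, not study data):

```python
def sus_score(responses):
    """Standard SUS scoring (Brooke): ten 1-5 Likert items; odd-numbered
    items contribute (response - 1), even-numbered items (5 - response);
    the sum of contributions is scaled by 2.5 to give a 0-100 score."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0 (best possible)
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```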

  3. From implant planning to surgical execution: an integrated approach for surgery in oral implantology.

    PubMed

    Chiarelli, Tommaso; Franchini, Federico; Lamma, Achille; Lamma, Evelina; Sansoni, Tommaso

    2012-03-01

Using oral implantology software and transferring the preoperative plan to a stereolithographic model, prosthodontists can produce the related surgical guide. This procedure has some disadvantages: the invasiveness of a bone-supported stent, a lack of references due to scattering, and the non-negligible cost of stereolithography. An alternative solution is presented that yields an ideal surgical stent: non-invasive, precise, and cheap. This work focuses on the third phase of a fully 3D approach to oral implant planning, which starts by CT-scanning a patient wearing a marker-equipped radiological stent, continues with purpose-built preoperative planning software, and finishes by producing the ideal surgical template. A 5-axis bur-equipped robot has been designed that reproduces the milling vectors planned by the software. Software-robot interfacing has been achieved by properly matching the stent reference frame with the software and robot coordinate systems. Invasiveness is avoided by deriving the surgical stent from the mucosa-supported radiological mask wax-up. Scattering can be ignored because the surgical stent is independent of the bone-structure radiography. Production cost has been strongly reduced by avoiding the stereolithographic model. Finally, the precision of the software-robot interfacing has been validated by digitally comparing a multi-marker base with its planned transfer. Average position and orientation errors (0.283 mm ± 0.073 mm and 1.798° ± 0.496°, respectively) were significantly better than those achieved with methods based on stereolithography (1.45 mm ± 1.42 mm and 7.25° ± 2.67°, respectively, with a general best maximum translation discrepancy of about 1.1 mm). This paper describes the last step of a fully 3D approach in which implant planning is done in a 3D environment and the correct position, orientation, and depth of the planned implants are easily computed and transferred to the surgical phase.
Copyright © 2011 John Wiley & Sons, Ltd.
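
    The position and orientation errors reported above are, in general, computed as a Euclidean distance between positions and an angle between direction vectors. A sketch of these two standard error measures (the function names and sample vectors are illustrative, not taken from the authors' validation pipeline):

```python
import math

def position_error(planned, executed):
    """Euclidean distance between a planned and an executed position (e.g. mm)."""
    return math.dist(planned, executed)

def orientation_error(v1, v2):
    """Angle in degrees between a planned and an executed axis direction."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    # Clamp to [-1, 1] to guard against floating-point round-off.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

print(position_error((0, 0, 0), (0.2, 0.1, 0.2)))  # 0.3
print(orientation_error((0, 0, 1), (0, 1, 1)))     # 45.0
```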

  4. Intelligent Agents for Design and Synthesis Environments: My Summary

    NASA Technical Reports Server (NTRS)

    Norvig, Peter

    1999-01-01

    This presentation gives a summary of intelligent agents for design synthesis environments. We'll start with the conclusions, and work backwards to justify them. First, an important assumption is that agents (whatever they are) are good for software engineering. This is especially true for software that operates in an uncertain, changing environment. The "real world" of physical artifacts is like that: uncertain in what we can measure, changing in that things are always breaking down, and we must interact with non-software entities. The second point is that software engineering techniques can contribute to good design. There may have been a time when we wanted to build simple artifacts containing little or no software. But modern aircraft and spacecraft are complex, and rely on a great deal of software. So better software engineering leads to better designed artifacts, especially when we are designing a series of related artifacts and can amortize the costs of software development. The third point is that agents are especially useful for design tasks, above and beyond their general usefulness for software engineering, and the usefulness of software engineering to design.

  5. Design of tissue engineering scaffolds based on hyperbolic surfaces: structural numerical evaluation.

    PubMed

    Almeida, Henrique A; Bártolo, Paulo J

    2014-08-01

Tissue engineering represents a new field aiming at developing biological substitutes to restore, maintain, or improve tissue functions. In this approach, scaffolds provide temporary mechanical and vascular support for tissue regeneration while tissue in-growth is being formed. These scaffolds must be biocompatible and biodegradable, with appropriate porosity, pore structure and distribution, and optimal vascularization, offering both surface and structural compatibility. The challenge is to establish a proper balance between the porosity and the mechanical performance of the scaffold. This work investigates the use of two types of triply periodic minimal surfaces, Schwarz and Schoen, to design better biomimetic scaffolds with a high surface-to-volume ratio, high porosity, and good mechanical properties. The mechanical behaviour of these structures is assessed with the finite element software Abaqus, and the effect of two design parameters (thickness and surface radius) on porosity and mechanical behaviour is also evaluated. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.
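
    The Schwarz P surface used in such scaffold designs has the well-known implicit approximation cos x + cos y + cos z = 0; taking the solid wall as the region where |f| ≤ t links wall thickness to porosity. A rough sketch estimating porosity by grid sampling over one unit cell (the thresholds and grid size are illustrative assumptions, not the paper's parameters):

```python
import math

def schwarz_p(x, y, z):
    """Implicit approximation of the Schwarz P triply periodic minimal surface."""
    return math.cos(x) + math.cos(y) + math.cos(z)

def porosity(threshold, n=40):
    """Estimate the porosity of a scaffold whose solid wall is |f| <= threshold,
    by sampling one periodic unit cell [0, 2*pi)^3 on an n x n x n grid."""
    step = 2 * math.pi / n
    solid = sum(
        1
        for i in range(n)
        for j in range(n)
        for k in range(n)
        if abs(schwarz_p(i * step, j * step, k * step)) <= threshold
    )
    return 1 - solid / n**3

# A thicker wall (larger threshold) gives a lower porosity.
print(round(porosity(0.3), 3))
print(round(porosity(0.6), 3))
```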

  6. Display Developer for Firing Room Applications

    NASA Technical Reports Server (NTRS)

    Bowman, Elizabeth A.

    2013-01-01

The firing room at Kennedy Space Center (KSC) is responsible for all NASA human spaceflight launch operations; it is therefore vital that all displays within the firing room be properly tested, up to date, and user-friendly during a launch. The Ground Main Propulsion System (GMPS) requires a number of remote displays for Vehicle Integration and Launch (VIL) operations at KSC. My project is to develop remote displays for the GMPS using the Display Services and Framework (DSF) editor. These remote displays are based on model images provided by GMPS through PowerPoint. Using the DSF editor, the PowerPoint images can be recreated with active buttons associated with the correct Compact Unique Identifiers (CUIs). These displays will be documented in the Software Requirements and Design Specifications (SRDS) at the 90% GMPS design review. In the future, the remote displays will be available for other developers to improve, edit, or extend so that they may be incorporated into the firing room and used for launches.

  7. SEI Software Engineering Education Directory.

    DTIC Science & Technology

    1987-02-01

Textbook: Software Design and Development, Gilbert, Philip. Systems: CDC Cyber 170/750, CDC Cyber 170/760, DEC PDP 11/44, PRIME, AT&T 3B5, IBM PC, IBM XT, IBM RT, Macintosh, VAX 8300. Courses: Software System Development and Laboratory (CS 480/480L); Software Design and Development (CS 424).

  8. The design of the local monitor and control system of SKA dishes

    NASA Astrophysics Data System (ADS)

    Schillirò, F.; Baldini, V.; Becciani, U.; Cirami, R.; Costa, A.; Ingallinera, A.; Marassi, A.; Nicotra, G.; Nocita, C.; Riggi, S.; Trigilio, C.

    2016-08-01

The Square Kilometer Array (SKA) project aims at building the world's largest radio observatory, observing the sky with unprecedented sensitivity and collecting area. In the first phase of the project (SKA1), an array of dishes, SKA1-MID, will be built in South Africa. It will consist of 133 15-m dishes, which will include the MeerKAT array, for observations in the 0.350-20 GHz frequency band. Each antenna will be provided with a local monitor and control system (LMC), enabling operations both by the remote Telescope Manager system and by engineers and maintenance staff; it provides environments for telescope control (positioning, pointing, observational bands), metadata collection for monitoring and database storage, and management of operational modes and functional states for all telescope capabilities. In this paper we present the LMC software architecture developed for the detailed design (DD) phase, describing functional and physical interfaces with the monitored and controlled sub-elements and highlighting the data flow between each LMC module and its sub-element controllers on one side and the Telescope Manager on the other. We also describe the complete Product Breakdown Structure (PBS), created to optimize resource allocation in terms of computation and memory so that each element can perform its required tasks according to its requirements. Among these, response time and system reliability are the most important, considering the complexity of the SKA dish network and its isolated placement. The performance obtained by the software implementation using the TANGO framework will be discussed and matched against the technical requirements derived from the SKA science drivers.

  9. Integrated Design Software Predicts the Creep Life of Monolithic Ceramic Components

    NASA Technical Reports Server (NTRS)

    1996-01-01

Significant improvements in propulsion and power generation for the next century will require revolutionary advances in high-temperature materials and structural design. Advanced ceramics are candidate materials for these elevated-temperature applications. As design protocols emerge for these material systems, designers must be aware of several innate features, including the degrading ability of ceramics to carry sustained load. Usually, time-dependent failure in ceramics occurs because of two different, delayed-failure mechanisms: slow crack growth and creep rupture. Slow crack growth initiates at a preexisting flaw and continues until a critical crack length is reached, causing catastrophic failure. Creep rupture, on the other hand, occurs because of bulk damage in the material: void nucleation and coalescence that eventually leads to macrocracks which then propagate to failure. Successful application of advanced ceramics depends on proper characterization of material behavior and the use of an appropriate design methodology. The life of a ceramic component can be predicted with the NASA Lewis Research Center's Ceramics Analysis and Reliability Evaluation of Structures (CARES) integrated design programs. CARES/CREEP determines the expected life of a component under creep conditions, and CARES/LIFE predicts the component life due to fast fracture and subcritical crack growth. The previously developed CARES/LIFE program has been used in numerous industrial and Government applications.
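
    Slow crack growth in ceramics is commonly modelled with a power law, da/dt = A (K_I/K_Ic)^n with K_I = Y σ √(πa), which integrates to a closed-form time to failure under sustained stress. A sketch under that standard model (all constants here are illustrative assumptions, not CARES/LIFE values):

```python
import math

def slow_crack_life(sigma, a0, A=1e-3, n=10, K_Ic=4.0, Y=1.13):
    """Approximate time to failure under sustained stress sigma, for the
    slow-crack-growth power law da/dt = A*(K_I/K_Ic)**n with
    K_I = Y*sigma*sqrt(pi*a). Integrating a^(-n/2) da from the initial
    flaw size a0 gives t_f ~ 2*a0**((2-n)/2) / ((n-2)*A*C**n), where
    C = Y*sigma*sqrt(pi)/K_Ic (the critical-size term is negligible for n > 2)."""
    C = Y * sigma * math.sqrt(math.pi) / K_Ic
    return 2.0 * a0 ** ((2 - n) / 2) / ((n - 2) * A * C ** n)

# Higher sustained stress gives a (much) shorter predicted life.
print(slow_crack_life(300.0, 1e-6) < slow_crack_life(200.0, 1e-6))  # True
```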

  10. PD5: a general purpose library for primer design software.

    PubMed

    Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third-party applications, which involves file parsing, interfacing and data conversion that is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third-party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software, open for future development by the bioinformatics community. The PD5 software library is therefore published under the terms of the GNU General Public License, which guarantees access to source code and allows redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
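
    Primer-design toolkits of this kind typically include basic sequence metrics such as GC content and a quick melting-temperature estimate. A sketch of two such metrics using the classic Wallace rule (this is an illustration of the kind of building block described, not the PD5 API):

```python
def gc_content(primer):
    """Fraction of G/C bases in a primer sequence."""
    primer = primer.upper()
    return (primer.count("G") + primer.count("C")) / len(primer)

def wallace_tm(primer):
    """Wallace-rule melting temperature (deg C) for short primers (< ~14 nt):
    Tm = 2*(A+T) + 4*(G+C)."""
    primer = primer.upper()
    at = primer.count("A") + primer.count("T")
    gc = primer.count("G") + primer.count("C")
    return 2 * at + 4 * gc

print(gc_content("ATGCGC"))  # 4 of 6 bases are G/C -> 0.666...
print(wallace_tm("ATGCGC"))  # 2*2 + 4*4 = 20
```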

  11. Application of Design Patterns in Refactoring Software Design

    NASA Technical Reports Server (NTRS)

    Baggs, Rhoda; Shaykhian, Gholam Ali

    2007-01-01

Refactoring software design is a method of changing a software design while explicitly preserving its design functionality. The presented approach is to use design patterns as the basis for refactoring software design, and it is demonstrated by comparing design solutions through C++ programming language examples. Developing reusable components is also discussed; the paper shows that constructing such components can diminish the added burden of both refactoring and the use of design patterns.

  12. Apply Design Patterns to Refactor Software Design

    NASA Technical Reports Server (NTRS)

    Baggs, Rhoda; Shaykhian, Gholam Ali

    2007-01-01

Refactoring software design is a method of changing a software design while explicitly preserving its design functionality. The presented approach is to use design patterns as the basis for refactoring software design, and it is demonstrated by comparing design solutions through C++ programming language examples. Developing reusable components is also discussed; the paper shows that constructing such components can diminish the added burden of both refactoring and the use of design patterns.
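
    The abstract's comparison is made through C++ examples; the same refactoring idea can be sketched compactly in Python. Below, branching on a format code is refactored into the Strategy pattern, preserving behaviour while making each format a reusable component (a generic illustration, not the paper's example):

```python
# Before refactoring: branching on a type code inside the class.
class ReportBefore:
    def __init__(self, fmt):
        self.fmt = fmt

    def render(self, data):
        if self.fmt == "csv":
            return ",".join(data)
        elif self.fmt == "tsv":
            return "\t".join(data)
        raise ValueError(self.fmt)

# After refactoring with the Strategy pattern: each format is a small
# reusable component; behaviour is preserved while the design becomes
# extensible without modifying Report.
class CsvStrategy:
    def render(self, data):
        return ",".join(data)

class TsvStrategy:
    def render(self, data):
        return "\t".join(data)

class Report:
    def __init__(self, strategy):
        self.strategy = strategy

    def render(self, data):
        return self.strategy.render(data)

# Behaviour is preserved across the refactoring.
assert ReportBefore("csv").render(["a", "b"]) == Report(CsvStrategy()).render(["a", "b"])
```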

  13. Designing a Medical Tourism Website: A Qualitative Study.

    PubMed

    Samadbeik, Mahnaz; Asadi, Heshmatollah; Mohseni, Mohammad; Takbiri, Afsaneh; Moosavi, Ahmad; Garavand, Ali

    2017-02-01

Providing information plays a prominent role in attracting medical tourists, and proper medical information systems are among the most important tools for attracting them. Iran's ability in designing and implementing information networks has remained largely unknown. The current study aimed to explore the information needs for designing a medical tourism website. This qualitative study was conducted in 2015 to design a Hospital Medical-Tourism Website (HMTW). A purposive sampling method was used and data were gathered with a semi-structured questionnaire; in total, 12 faculty members and experts in the field of medical tourism were interviewed. Data were analyzed using the MAXQDA10 software. In total, 41 sub-themes and 10 themes were identified. The themes included the introduction of the hospital, a general guide for patients, tourism information, information on the hospital's physicians, costs, treatment follow-up, online hospital appointment scheduling on the website, statistics and news of hospital medical tourism, a photo gallery, and contacts. Among the themes, the participants most emphasized costs (100%), tourism information (91.6%), information on the hospital's physicians (83.3%), and treatment follow-up (83.3%). This profitable industry can be developed by considering the information requirements of a hospital medical tourism website.

  14. Designing a Medical Tourism Website: A Qualitative Study

    PubMed Central

    SAMADBEIK, Mahnaz; ASADI, Heshmatollah; MOHSENI, Mohammad; TAKBIRI, Afsaneh; MOOSAVI, Ahmad; GARAVAND, Ali

    2017-01-01

Background: Providing information plays a prominent role in attracting medical tourists, and proper medical information systems are among the most important tools for attracting them. Iran's ability in designing and implementing information networks has remained largely unknown. The current study aimed to explore the information needs for designing a medical tourism website. Methods: This qualitative study was conducted in 2015 to design a Hospital Medical-Tourism Website (HMTW). A purposive sampling method was used and data were gathered with a semi-structured questionnaire; in total, 12 faculty members and experts in the field of medical tourism were interviewed. Data were analyzed using the MAXQDA10 software. Results: In total, 41 sub-themes and 10 themes were identified. The themes included the introduction of the hospital, a general guide for patients, tourism information, information on the hospital's physicians, costs, treatment follow-up, online hospital appointment scheduling on the website, statistics and news of hospital medical tourism, a photo gallery, and contacts. Among the themes, the participants most emphasized costs (100%), tourism information (91.6%), information on the hospital's physicians (83.3%), and treatment follow-up (83.3%). Conclusion: This profitable industry can be developed by considering the information requirements of a hospital medical tourism website. PMID:28451562

  15. Methodical Design of Software Architecture Using an Architecture Design Assistant (ArchE)

    DTIC Science & Technology

    2005-04-01

Methodical Design of Software Architecture Using an Architecture Design Assistant (ArchE), Felix Bachmann and Mark Klein, Software Engineering Institute, Pittsburgh, PA 15213-3890. Requirements are important for architecture design; quality requirements and constraints are the most important.

  16. Reliability measurement during software development. [for a multisensor tracking system

    NASA Technical Reports Server (NTRS)

    Hecht, H.; Sturm, W. A.; Trattner, S.

    1977-01-01

    During the development of data base software for a multi-sensor tracking system, reliability was measured. The failure ratio and failure rate were found to be consistent measures. Trend lines were established from these measurements that provided good visualization of the progress on the job as a whole as well as on individual modules. Over one-half of the observed failures were due to factors associated with the individual run submission rather than with the code proper. Possible application of these findings for line management, project managers, functional management, and regulatory agencies is discussed. Steps for simplifying the measurement process and for use of these data in predicting operational software reliability are outlined.
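
    The failure ratio and failure rate used as measures above are simple per-interval statistics whose trend lines visualize progress. A minimal sketch (the weekly numbers are made up for illustration):

```python
def failure_ratio(failures, runs):
    """Failure ratio: failed runs / total runs in a test interval."""
    return failures / runs

def failure_rate(failures, hours):
    """Failure rate: failures per unit of execution time in a test interval."""
    return failures / hours

# Per-week data as (failures, runs, test-hours); a declining trend across
# intervals suggests the software is maturing.
weeks = [(12, 40, 20.0), (8, 50, 25.0), (3, 45, 22.5)]
ratios = [failure_ratio(f, r) for f, r, _ in weeks]
rates = [failure_rate(f, h) for f, _, h in weeks]
print(ratios)  # [0.3, 0.16, 0.0666...]
print(rates)
```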

  17. Software Design Methods for Real-Time Systems

    DTIC Science & Technology

    1989-12-01

This module describes the concepts and methods used in the software design of real-time systems. It outlines the characteristics of real-time systems, describes the role of software design in real-time system development, and surveys and compares some software design methods for real-time systems.

  18. Designing Educational Software for Tomorrow.

    ERIC Educational Resources Information Center

    Harvey, Wayne

    Designed to address the management and use of computer software in education and training, this paper explores both good and poor software design, calling for improvements in the quality of educational software by attending to design considerations that are based on general principles of learning rather than specific educational objectives. This…

  19. Investigating the key factors in designing a communication skills program for medical students: A qualitative study.

    PubMed

    Mahdi Hazavehei, Seyyed M; Karimi Moonaghi, Hossein; Moeini, Babak; Moghimbeigi, Abbas; Emadzadeh, Ali

    2015-11-01

Medical students have a serious need to acquire communication skills. In many medical schools, special curricula are developed to improve such skills, and effective training of communication skills requires expert curriculum design. The aim of this study was to explore the experiences and views of experts and stakeholders in order to design a suitable communication skills training program for medical students. The content analysis approach was used in this qualitative study. Forty-three participants were selected from the faculty, nurses, physicians, residents, and medical students at Mashhad University of Medical Sciences using purposive sampling. The data were collected through focus group discussions and semi-structured interviews. To ensure the accuracy of the data, the criteria of credibility, transferability, dependability, and confirmability were met. The data were analyzed with MAXQDA software using the Graneheim & Lundman model. The findings of this study consisted of two main themes, i.e., "the vast nature of the present communication skills training" and "administrative requirements of a training program for communication skills." The first theme included the educational needs of students, the problems associated with training people to have good communication skills, the importance of good communication skills in performing professional duties, communication skills and job requirements, the learning environment for communication skills, and the status of existing training programs for communication skills. Strategies and suitable methods for teaching communication skills, and methods of evaluating students in this regard, were also obtained. The findings of this study provide the elements required to design a proper, local model for teaching communication skills to medical students through analyzing the concepts of effective communication. 
The results of this study can be useful for medical faculties in designing a proper program for teaching medical students how to communicate effectively with patients and colleagues.

  20. Investigating the key factors in designing a communication skills program for medical students: A qualitative study

    PubMed Central

    Mahdi Hazavehei, Seyyed M.; Moonaghi, Hossein Karimi; Moeini, Babak; Moghimbeigi, Abbas; Emadzadeh, Ali

    2015-01-01

    Introduction Medical students have a serious need to acquire skills for communicating with others. In many medical schools, special curricula are developed to improve such skills. Effective training in communication skills requires expert curriculum design. The aim of this study was to explore the experiences and views of experts and stakeholders in order to design a suitable communication skills training program for medical students. Methods The content analysis approach was used in this qualitative study. Forty-three participants were selected from the faculty, nurses, physicians, residents, and medical students at Mashhad University of Medical Sciences using purposive sampling. The data were collected through focus group discussions and semi-structured interviews. To ensure the accuracy of the data, the criteria of credibility, transferability, dependability, and confirmability were met. The data were analyzed with MAXQDA software using the Graneheim & Lundman model. Results The findings of this study comprised two main themes, i.e., “The vast nature of the present communication skills training” and “administrative requirements of the training program regarding communication skills.” The first theme included the educational needs of students, the problems associated with training people to have good communication skills, the importance of good communication skills in performing professional duties, communication skills and job requirements, the learning environment for communication skills, and the status of existing communication skills training programs. Strategies and suitable methods for teaching communication skills, and methods of evaluating students in this regard, were also identified. Conclusion The findings of this study supply the elements required to design a proper, localized model for teaching communication skills to medical students, derived from an analysis of the concepts of effective communication.
The results of this study can be useful for medical faculties in designing a proper program for teaching medical students how to communicate effectively with patients and colleagues. PMID:26767096

  1. Repair, Evaluation, Maintenance, and Rehabilitation Research Program. Lubricants for Hydraulic Structures

    DTIC Science & Technology

    1989-08-01

    machinery design, precision machining, proper maintenance, and proper lubrication. Ordinarily, wear is thought of only in terms of abrasive wear occurring in...operate under this principle. However, the design must allow the plates to lift and tilt properly and provide sufficient area to lift the load. 38. Another...friction and wear to a minimum. Boundary Lubrication 42. Lubrication designed to protect against frictional effects when asperities meet is called

  2. Reducing Stressful Aspects of Information Technology in Public Services.

    ERIC Educational Resources Information Center

    Quinn, Brian

    1995-01-01

    Identifies sources of technological stress for public services librarians and patrons and proposes ways to reduce stress, including communicating with staff, implementing a system gradually, providing adequate training, creating proper documentation, planning, considering ergonomics in hardware and software selection, selecting a good interface,…

  3. The Ins and Outs of Access Control.

    ERIC Educational Resources Information Center

    Longworth, David

    1999-01-01

    Presents basic considerations when school districts plan to acquire an access-control system for their education facilities. Topics cover cards and readers, controllers, software, automation, card technology, expandability, price, specification of needs beyond the canned specifications already supplied, and proper usage training to cardholders.…

  4. Staying Secure for School Safety

    ERIC Educational Resources Information Center

    Youngkin, Minu

    2012-01-01

    Proper planning and preventive maintenance can increase school security and return on investment. Preventive maintenance begins with planning. Through careful planning, education institutions can determine what is working and if any equipment, hardware or software needs to be replaced or upgraded. When reviewing a school's safety and security…

  5. Software Maintenance.

    ERIC Educational Resources Information Center

    Cannon, Glenn; Jobe, Holly

    Proper cleaning and storage of audiovisual aids is outlined in this brief guide. Materials and equipment needed for first line maintenance are listed, as well as maintenance procedures for records, audio and video tape, film, filmstrips, slides, realia, models, prints, graphics, maps, and overhead transparencies. A 15-item quiz on software…

  6. Development and Application of Collaborative Optimization Software for Plate-fin Heat Exchanger

    NASA Astrophysics Data System (ADS)

    Chunzhen, Qiao; Ze, Zhang; Jiangfeng, Guo; Jian, Zhang

    2017-12-01

    This paper introduces the design ideas behind calculation software for plate-fin heat exchangers, together with application examples. Because designing and optimizing a heat exchanger involves a large amount of calculation, we used Visual Basic 6.0 as the development platform to build a basic calculation program that reduces this workload. The design case is a plate-fin heat exchanger sized for boiler tail flue gas, and the software is based on the traditional design method for plate-fin heat exchangers. Using the software for the design and calculation of plate-fin heat exchangers effectively reduces the amount of computation while giving results comparable to traditional methods, so it has considerable practical value.
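
    The abstract does not spell out the underlying sizing calculation, but the traditional design method it refers to typically rests on the effectiveness-NTU approach. The Python sketch below (the paper itself used Visual Basic 6.0; the correlation choice and all numbers here are illustrative assumptions, not values from the study) shows the kind of calculation such software automates:

```python
import math

def crossflow_effectiveness(ntu, cr):
    """Effectiveness of a crossflow exchanger, both fluids unmixed
    (standard textbook approximation)."""
    if cr == 0.0:
        return 1.0 - math.exp(-ntu)
    return 1.0 - math.exp((math.exp(-cr * ntu**0.78) - 1.0) * ntu**0.22 / cr)

def heat_duty(m_hot_cp, m_cold_cp, ua, t_hot_in, t_cold_in):
    """Heat transferred (W) for given inlets via the eps-NTU method.
    m_*_cp are capacity rates (mass flow * specific heat), W/K."""
    c_min, c_max = sorted((m_hot_cp, m_cold_cp))
    ntu = ua / c_min
    eps = crossflow_effectiveness(ntu, c_min / c_max)
    return eps * c_min * (t_hot_in - t_cold_in)

# Assumed capacity rates, UA, and flue-gas / water inlet temperatures
q = heat_duty(m_hot_cp=1200.0, m_cold_cp=1500.0, ua=900.0,
              t_hot_in=180.0, t_cold_in=25.0)
```

In real plate-fin software the UA value would itself come from fin-geometry and Colburn-factor correlations; here it is simply assumed.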

  7. Averting Denver Airports on a Chip

    NASA Technical Reports Server (NTRS)

    Sullivan, Kevin J.

    1995-01-01

    As a result of recent advances in software engineering capabilities, we are now in a more stable environment. De-facto hardware and software standards are emerging. Work on software architecture and design patterns signals a consensus on the importance of early system-level design decisions, and agreements on the uses of certain paradigmatic software structures. We now routinely build systems that would have been risky or infeasible a few years ago. Unfortunately, technological developments threaten to destabilize software design again. Systems designed around novel computing and peripheral devices will spark ambitious new projects that will stress current software design and engineering capabilities. Micro-electro-mechanical systems (MEMS) and related technologies provide the physical basis for new systems with the potential to produce this kind of destabilizing effect. One important response to anticipated software engineering and design difficulties is carefully directed engineering-scientific research. Two specific problems meriting substantial research attention are: A lack of sufficient means to build software systems by generating, extending, specializing, and integrating large-scale reusable components; and a lack of adequate computational and analytic tools to extend and aid engineers in maintaining intellectual control over complex software designs.

  8. GNSS Network Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Balodis, J.; Janpaule, I.; Haritonova, D.; Normand, M.; Silabriedis, G.; Zarinjsh, A.; Zvirgzds, J.

    2012-04-01

    Time series of GNSS station results for both the EUPOS®-RIGA and LATPOS networks have been developed at the Institute of Geodesy and Geoinformation (University of Latvia) using Bernese v5.0 software. The base stations were selected among the EPN and IGS stations surrounding Latvia. The base station selection varied between daily solutions; most frequently, 5 - 8 base stations were selected from the set {BOR1, JOEN, JOZE, MDVJ, METS, POLV, PULK, RIGA, TORA, VAAS, VISO, VLNS}. "Bad" base stations were rejected by the Bernese software depending on the quality of each station's data on a given day, which is why the selection differed from day to day. The resulting time series are analysed, and the question arose as to the nature of some outlying values. A seasonal effect in the behaviour of the network was identified when distance and elevation changes between stations were analysed, and the dependence on various influences was recognised.

  9. Ancestral haplotype-based association mapping with generalized linear mixed models accounting for stratification.

    PubMed

    Zhang, Z; Guillaume, F; Sartelet, A; Charlier, C; Georges, M; Farnir, F; Druet, T

    2012-10-01

    In many situations, genome-wide association studies are performed in populations presenting stratification. Mixed models including a kinship matrix accounting for genetic relatedness among individuals have been shown to correct for population and/or family structure. Here we extend this methodology to generalized linear mixed models which properly model data under various distributions. In addition we perform association with ancestral haplotypes inferred using a hidden Markov model. The method was shown to properly account for stratification under various simulated scenarios presenting population and/or family structure. Use of ancestral haplotypes resulted in higher power than SNPs on simulated datasets. Application to real data demonstrates the usefulness of the developed model. Full analysis of a dataset with 4600 individuals and 500 000 SNPs was performed in 2 h 36 min and required 2.28 Gb of RAM. The software GLASCOW can be freely downloaded from www.giga.ulg.ac.be/jcms/prod_381171/software. francois.guillaume@jouy.inra.fr Supplementary data are available at Bioinformatics online.
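
    The core of the kinship correction in models of this kind is generalized least squares with a covariance matrix built from relatedness. A toy numpy sketch (the family structure, variance components, and effect size are invented for illustration; GLASCOW itself additionally handles non-Gaussian traits via generalized linear mixed models and tests ancestral haplotypes rather than single SNPs):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
# Toy kinship: two families of 100, relatedness 0.5 within a family
K = np.kron(np.eye(2), np.full((100, 100), 0.5)) + 0.5 * np.eye(n)
snp = rng.binomial(2, 0.3, n).astype(float)

# Phenotype = SNP effect + polygenic term (covariance ~ K) + noise
g = rng.multivariate_normal(np.zeros(n), K)
y = 0.8 * snp + g + rng.normal(0, 1, n)

V = K + np.eye(n)                    # assumed variance components (both 1.0)
Vi = np.linalg.inv(V)
X = np.column_stack([np.ones(n), snp])
beta = np.linalg.solve(X.T @ Vi @ X, X.T @ Vi @ y)   # GLS estimate
```

Ignoring K here (ordinary least squares) would inflate test statistics whenever the trait and the SNP both track family membership; the V-weighting is what "accounts for stratification".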

  10. The Resolved Stellar Populations Early Release Science Program

    NASA Astrophysics Data System (ADS)

    Gilbert, Karoline; Weisz, Daniel; Resolved Stellar Populations ERS Program Team

    2018-06-01

    The Resolved Stellar Populations Early Release Science Program (PI D. Weisz) will observe Local Group targets covering a range of stellar densities and star formation histories, including a globular cluster, an ultra-faint dwarf galaxy, and a star-forming dwarf galaxy. Using observations of these diverse targets we will explore a broad science program: we will measure star formation histories, the sub-solar stellar initial mass function, and proper motions, perform studies of evolved stars, and map extinction in the target fields. Our observations will be of high archival value for other science such as calibrating stellar evolution models, studying variable stars, and searching for metal-poor stars. We will determine optimal observational setups and develop data reduction techniques that will be common to JWST studies of resolved stellar populations. We will also design, test, and release point spread function (PSF) fitting software specific to NIRCam and NIRISS, required for the crowded stellar regime. Prior to the Cycle 2 Call for Proposals, we will release PSF fitting software, matched HST and JWST catalogs, and clear documentation and step-by-step tutorials (such as Jupyter notebooks) for reducing crowded stellar field data and producing resolved stellar photometry catalogs, as well as for specific resolved stellar photometry science applications.

  11. Cassini Tour Atlas Automated Generation

    NASA Technical Reports Server (NTRS)

    Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.

    2011-01-01

    During the Cassini spacecraft's cruise phase and nominal mission, the Cassini Science Planning Team developed and maintained an online database of geometric and timing information called the Cassini Tour Atlas. The Tour Atlas consisted of several hundred megabytes of EVENTS mission planning software outputs, tables, plots, and images used by mission scientists for observation planning. Each time the nominal mission trajectory was altered or tweaked, a new Tour Atlas had to be regenerated manually. In the early phases of Cassini's Equinox Mission planning, an a priori estimate suggested that mission tour designers would develop approximately 30 candidate tours within a short period of time. A separate Tour Atlas was required for each trajectory so that Cassini scientists could quickly and thoroughly analyze the science opportunities in each candidate tour and select the optimal series of orbits for science return. The task of manually generating that number of trajectory analyses in the allotted time would have been impossible, so the entire task was automated using code written in five different programming languages. This software automates the generation of the Cassini Tour Atlas database. It performs with one UNIX command what previously took a day or two of human labor.

  12. csaw: a Bioconductor package for differential binding analysis of ChIP-seq data using sliding windows

    PubMed Central

    Lun, Aaron T.L.; Smyth, Gordon K.

    2016-01-01

    Chromatin immunoprecipitation with massively parallel sequencing (ChIP-seq) is widely used to identify binding sites for a target protein in the genome. An important scientific application is to identify changes in protein binding between different treatment conditions, i.e. to detect differential binding (DB). This can reveal potential mechanisms through which changes in binding may contribute to the treatment effect. The csaw package provides a framework for the de novo detection of differentially bound genomic regions. It uses a window-based strategy to summarize read counts across the genome. It exploits existing statistical software to test for significant differences in each window. Finally, it clusters windows into regions for output and controls the false discovery rate properly over all detected regions. The csaw package can handle arbitrarily complex experimental designs involving biological replicates. It can be applied to both transcription factor and histone mark datasets, and, more generally, to any type of sequencing data measuring genomic coverage. csaw performs favorably against existing methods for de novo DB analyses on both simulated and real data. csaw is implemented as an R software package and is freely available from the open-source Bioconductor project. PMID:26578583
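
    csaw itself is an R/Bioconductor package, but its window-based strategy — count reads in overlapping windows, then cluster significant windows into regions — can be sketched language-agnostically. A simplified Python illustration (read positions and the count threshold are made up; csaw's real test is a negative-binomial model via edgeR, not a raw count cutoff):

```python
def window_counts(read_starts, chrom_len, width=100, spacing=50):
    """Count reads whose start position falls in each sliding window."""
    starts = range(0, max(chrom_len - width, 0) + 1, spacing)
    return [(s, s + width, sum(s <= r < s + width for r in read_starts))
            for s in starts]

def merge_windows(windows, min_count, gap=100):
    """Keep windows passing the threshold and merge those within `gap` bp
    into regions, mimicking csaw's clustering of windows for output."""
    regions = []
    for start, end, count in windows:
        if count < min_count:
            continue
        if regions and start - regions[-1][1] <= gap:
            regions[-1][1] = max(regions[-1][1], end)
        else:
            regions.append([start, end])
    return [tuple(r) for r in regions]

counts = window_counts([10, 15, 20, 400, 410, 420, 430], chrom_len=600)
regions = merge_windows(counts, min_count=3)
```

Clustering before reporting is what makes it possible to control the false discovery rate over regions rather than over the (heavily overlapping) windows.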

  13. Shuttle mission simulator software conceptual design

    NASA Technical Reports Server (NTRS)

    Burke, J. F.

    1973-01-01

    Software conceptual designs (SCD) are presented for meeting the simulator requirements for the shuttle missions. The major areas of the SCD discussed include: malfunction insertion, flight software, applications software, systems software, and computer complex.

  14. Software Assurance: Five Essential Considerations for Acquisition Officials

    DTIC Science & Technology

    2007-05-01

    • address security concerns in the software development life cycle (SDLC)? • Are there formal software quality...What threat modeling process, if any, is used when designing the software? What analysis, design, and construction tools are used by your software design...the-shelf (COTS), government off-the-shelf (GOTS), open-source, embedded, and legacy software. Attackers exploit unintentional vulnerabilities or

  15. OSI for hardware/software interoperability

    NASA Astrophysics Data System (ADS)

    Wood, Richard J.; Harvey, Donald L.; Linderman, Richard W.; Gardener, Gary A.; Capraro, Gerard T.

    1994-03-01

    There is a need in public safety for real-time data collection and transmission from one or more sensors. The Rome Laboratory and the Ballistic Missile Defense Organization are pursuing an effort to bring the benefits of Open System Architectures (OSA) to embedded systems within the Department of Defense. When developed properly, OSA provides interoperability, commonality, graceful upgradeability, survivability, and hardware/software transportability, greatly minimizing life-cycle, integration, and support costs. Architectural flexibility can be achieved to take advantage of commercial accomplishments by basing these developments on vendor-neutral, commercially accepted standards and protocols.

  16. A self-referential HOWTO on release engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galassi, Mark C.

    Release engineering is a fundamental part of the software development cycle: it is the point at which quality control is exercised and bug fixes are integrated. The way in which software is released also gives the end user her first experience of a software package, while in scientific computing release engineering can guarantee reproducibility. For these reasons and others, the release process is a good indicator of the maturity and organization of a development team. Software teams often do not put in place a release process at the beginning. This is unfortunate because the team does not have early and continuous execution of test suites, and it does not exercise the software in the same conditions as the end users. I describe an approach to release engineering based on the software tools developed and used by the GNU project, together with several specific proposals related to packaging and distribution. I do this in a step-by-step manner, demonstrating how this very paper is written and built using proper release engineering methods. Because many aspects of release engineering are not exercised in the building of the paper, the accompanying software repository also contains examples of software libraries.

  17. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... standards. (f) The reviewer shall analyze all Fault Tree Analyses (FTA), Failure Mode and Effects... for each product vulnerability cited by the reviewer; (4) Identification of any documentation or... not properly followed; (6) Identification of the software verification and validation procedures, as...

  18. Software Prototyping: Designing Systems for Users.

    ERIC Educational Resources Information Center

    Spies, Phyllis Bova

    1983-01-01

    Reports on a major change in the computer software development process--the prototype model, i.e., implementation of a skeletal system that is enhanced during interaction with users. Covers expensive and unreliable software, software design errors, the traditional development approach, resources required for prototyping, success stories, and the systems designer's role…

  19. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    NASA Technical Reports Server (NTRS)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.

  20. Software archeology: a case study in software quality assurance and design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macdonald, John M; Lloyd, Jane A; Turner, Cameron J

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  1. Language and Program for Documenting Software Design

    NASA Technical Reports Server (NTRS)

    Kleine, H.; Zepko, T. M.

    1986-01-01

    Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication among all members of a software design team and provides for the production of informative documentation on the design effort. Using an SDDL-generated document to analyze a design makes it possible to eliminate many errors that would otherwise not be detected until coding and testing are attempted. The SDDL processor program translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, freeing the designer's energy for creative effort. The SDDL processor program is written in PASCAL.

  2. Automatic extraction and visualization of object-oriented software design metrics

    NASA Astrophysics Data System (ADS)

    Lakshminarayana, Anuradha; Newman, Timothy S.; Li, Wei; Talburt, John

    2000-02-01

    Software visualization is a graphical representation of software characteristics and behavior. Certain modes of software visualization can be useful in isolating problems and identifying unanticipated behavior. In this paper we present a new approach to aid understanding of object-oriented software through 3D visualization of software metrics that can be extracted from the design phase of software development. The focus of the paper is a metric extraction method and a new collection of glyphs for multi-dimensional metric visualization. Our approach utilizes the extensibility interface of a popular CASE tool to access and automatically extract the metrics from Unified Modeling Language class diagrams. Following the extraction of the design metrics, 3D visualizations of these metrics are generated for each class in the design, utilizing intuitively meaningful 3D glyphs that are representative of the ensemble of metrics. Extraction and visualization of design metrics can aid software developers in the early study and understanding of design complexity.
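
    The paper extracts metrics from UML class diagrams through a CASE tool's extensibility interface; the same kind of class-level design metrics (in the Chidamber-Kemerer spirit) can be illustrated directly on code via introspection. A small Python sketch with a hypothetical class hierarchy:

```python
import inspect

# Hypothetical design under measurement
class Shape:
    def area(self): ...
    def perimeter(self): ...

class Polygon(Shape):
    def n_sides(self): ...

class Triangle(Polygon):
    def area(self): ...    # override

def dit(cls):
    """Depth of Inheritance Tree: longest path up to the root (object)."""
    return max((dit(b) for b in cls.__bases__), default=-1) + 1

def noc(cls):
    """Number of Children: count of direct subclasses."""
    return len(cls.__subclasses__())

def wmc(cls):
    """Weighted Methods per Class (unit weights): methods defined on the class."""
    return sum(1 for _, m in vars(cls).items() if inspect.isfunction(m))
```

Each class's metric vector (here DIT, NOC, WMC) is exactly the kind of ensemble the paper maps onto a 3D glyph.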

  3. Research on Visualization Design Method in the Field of New Media Software Engineering

    NASA Astrophysics Data System (ADS)

    Deqiang, Hu

    2018-03-01

    As science and technology develop and market competition and user demand intensify, a new design and application method has emerged in the field of new media software engineering: the visualization design method. Applying visualization design to new media software engineering not only improves the operational efficiency of new media software engineering projects but, more importantly, enhances the quality of software development through appropriate media of communication and transformation; on this basis, it also continuously promotes the progress and development of new media software engineering in China. This article therefore provides a concrete analysis of the application of the visualization design method in new media software engineering, starting from an overview of visualization design methods and building on a systematic analysis of the underlying technology.

  4. Designing an Optimized Novel Femoral Stem

    PubMed Central

    Babaniamansour, Parto; Ebrahimian-Hosseinabadi, Mehdi; Zargar-Kharazi, Anousheh

    2017-01-01

    Background: After total hip arthroplasty, patients may experience several problems. Implant loosening is one of the most significant, resulting in thigh pain and even revision surgery. The difference between the Young's moduli of bone and metal causes stress shielding, atrophy, and subsequent implant loosening. Materials and Methods: In this paper, femoral stem stiffness is reduced through a novel biomechanical and biomaterial design that combines proper design parameters, a porous surface coating, and software modeling of the geometry. The parametric design of the femoral stem is based on clinical reports. Results: An optimized model for the femoral stem is proposed: a curved, tapered stem with a trapezoidal cross-section and a particular neck and offset, with a fully porous surface. Analysis of the designed stem showed that a Ti6Al4V stem covered with a 1.5-mm-thick layer of 50% porosity has a stiffness of 77 GPa, 30% less than a stem without any porosity. The porous surface allows the stem to fix biologically, decreasing the probability of prosthesis loosening. Conclusion: By optimizing the femoral stem geometry (size and shape) and adding a porous surface whose stiffness is intermediate between bone and implant, a more efficient hip joint prosthesis with more durable fixation was achieved, owing to better stress transmission from the implant to the bone. PMID:28840118
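
    The reported stiffness reduction can be sanity-checked with a back-of-the-envelope model: Gibson-Ashby scaling for the porous layer's modulus and an area-weighted average over the cross-section. The stem radius below is an assumption (the abstract does not give one), so this is only a plausibility check, not the authors' finite-element analysis:

```python
import math

E_solid = 110.0    # GPa, typical Ti6Al4V elastic modulus
porosity = 0.5     # 50% porous coating, from the abstract
t_layer = 1.5      # mm coating thickness, from the abstract
r_stem = 7.0       # mm overall stem radius -- assumed, not in the abstract

# Gibson-Ashby scaling for open-cell foams: E ~ E_solid * (relative density)^2
E_porous = E_solid * (1.0 - porosity) ** 2

# Area-weighted axial modulus of solid core + porous shell
a_total = math.pi * r_stem**2
a_core = math.pi * (r_stem - t_layer) ** 2
E_eff = (E_solid * a_core + E_porous * (a_total - a_core)) / a_total
```

With these assumed dimensions the area-weighted modulus lands close to the reported 77 GPa, i.e. roughly a 30% reduction relative to fully dense Ti6Al4V.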

  5. Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel

    NASA Astrophysics Data System (ADS)

    Xie, Yanmin

    2011-08-01

    Nowadays, sheet metal forming process design is not a trivial task due to the complex issues to be taken into account (conflicting design goals, forming of complex shapes, and so on), and optimization methods have been widely applied to it. Proper design methods that reduce time and costs therefore have to be developed, mostly based on computer-aided procedures. At the same time, variations arising during manufacturing may significantly influence final product quality, rendering optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software LS-DYNA is used to simulate the complex sheet metal stamping process, and a Kriging metamodel is adopted to map the relation between input process parameters and part quality. The robust design model for the sheet metal forming process integrates adaptive importance sampling with the Kriging model in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion indicates where additional training samples should be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method. The final results indicate the feasibility of the proposed method for multi-response robust design.
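
    A Kriging metamodel is, in essence, Gaussian-process interpolation of expensive simulation results. A numpy-only sketch (the kernel choice, length scale, and toy "process response" are assumptions; production work would estimate hyperparameters and obtain the training responses from an FE solver such as LS-DYNA):

```python
import numpy as np

def kriging_fit(X, y, length=1.0, nugget=1e-8):
    """Simple Kriging: solve for weights under an RBF (Gaussian) covariance."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * length**2)) + nugget * np.eye(len(X))
    return np.linalg.solve(K, y)

def kriging_predict(X, weights, Xnew, length=1.0):
    """Interpolate the response surface at new process-parameter settings."""
    d2 = ((Xnew[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * length**2)) @ weights

# Toy response over two normalized process parameters (stand-in for FE runs)
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
y = X[:, 0] ** 2 + X[:, 1]
w = kriging_fit(X, y)
pred = kriging_predict(X, w, np.array([[0.5, 0.5]]))
```

Because the metamodel is cheap to evaluate, the adaptive importance sampling loop described in the abstract can afford the many evaluations that robustness assessment requires.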

  6. The Application of the EIS in Li-ion Batteries Measurement

    NASA Astrophysics Data System (ADS)

    Zhai, N. S.; Li, M. W.; Wang, W. L.; Zhang, D. L.; Xu, D. G.

    2006-10-01

    This paper investigates the measurement and determination of the electrochemical impedance spectroscopy (EIS) of lithium-ion batteries and the application of EIS to battery classification. The lithium-ion battery has found extensive applications due to its inherent advantages over other batteries, and for proper, sustainable performance it is necessary to check the uniformity of the cells. In this paper, the equivalent circuit of the lithium-ion battery is analyzed; a DSP-based hardware circuit and the software that calculates the EIS of the battery are designed and evaluated. The parameters of the lithium-ion equivalent circuit are determined, with the parameter values obtained by the least-squares method, and the application of Principal Component Analysis (PCA) to battery classification is analyzed.
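
    The equivalent-circuit analysis referred to above can be illustrated with the simplified Randles cell: a series resistance R_s plus a charge-transfer resistance R_ct in parallel with a double-layer capacitance C_dl. The sketch below generates a spectrum from assumed parameter values and reads them back off the Nyquist semicircle (the paper fits parameters by least squares; the read-off here is the classical limiting-case shortcut):

```python
import numpy as np

def randles_impedance(freq, r_s, r_ct, c_dl):
    """Impedance of a simplified Randles cell: R_s in series with R_ct || C_dl."""
    w = 2 * np.pi * freq
    return r_s + r_ct / (1 + 1j * w * r_ct * c_dl)

freq = np.logspace(-2, 5, 400)                    # Hz, low to high
z = randles_impedance(freq, r_s=0.05, r_ct=0.20, c_dl=1.0)  # ohms, farads

# Classical semicircle analysis of the spectrum:
r_s_est = z.real[-1]                 # high-frequency intercept on the real axis
r_ct_est = z.real[0] - r_s_est       # semicircle diameter (low-freq intercept)
f_peak = freq[np.argmax(-z.imag)]    # apex frequency: 2*pi*f = 1/(R_ct*C_dl)
c_dl_est = 1 / (2 * np.pi * f_peak * r_ct_est)
```

Comparing these per-cell parameter estimates across a batch is one way to check the cell uniformity the abstract emphasizes.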

  7. Automatization of hardware configuration for plasma diagnostic system

    NASA Astrophysics Data System (ADS)

    Wojenski, A.; Pozniak, K. T.; Kasprowicz, G.; Kolasinski, P.; Krawczyk, R. D.; Zabolotny, W.; Linczuk, P.; Chernyshova, M.; Czarski, T.; Malinowski, K.

    2016-09-01

    Soft X-ray plasma measurement systems are mostly multi-channel, high-performance systems. For a modular construction, it is necessary to perform sophisticated system discovery in parallel with automatic system configuration. This paper describes the structure of a modular system designed for tokamak plasma soft X-ray measurements, along with the concept of system discovery and subsequent automatic configuration. The FCS application (FMC/FPGA Configuration Software) is used to run the sophisticated system setup with automatic verification of proper configuration. To keep further system configurations flexible (e.g., user setups), a common communication interface is also described. The approach presented here is related to the automatic system firmware building presented in previous papers. Modular construction and multi-channel measurement are key requirements for SXR diagnostics using GEM detectors.

  8. Endovascular abdominal aortic aneurysm sizing and case planning using the TeraRecon Aquarius workstation.

    PubMed

    Lee, W Anthony

    2007-01-01

    The gold standard for preoperative evaluation of an aortic aneurysm is a computed tomography angiogram (CTA). Three-dimensional reconstruction and analysis of the computed tomography data set is enormously helpful, and even sometimes essential, in proper sizing and planning for endovascular stent graft repair. To a large extent, it has obviated the need for conventional angiography for morphologic evaluation. The TeraRecon Aquarius workstation (San Mateo, Calif) represents a highly sophisticated but user-friendly platform utilizing a combination of task-specific hardware and software specifically designed to rapidly manipulate large Digital Imaging and Communications in Medicine (DICOM) data sets and provide surface-shaded and multiplanar renderings in real-time. This article discusses the basics of sizing and planning for endovascular abdominal aortic aneurysm repair and the role of 3-dimensional analysis using the TeraRecon workstation.

  9. Integrated restructurable flight control system demonstration results

    NASA Technical Reports Server (NTRS)

    Weiss, Jerold L.; Hsu, John Y.

    1987-01-01

    The purpose of this study was to examine the complementary capabilities of several restructurable flight control system (RFCS) concepts through the integration of these technologies into a complete system. Performance issues were addressed through a re-examination of RFCS functional requirements, and through a qualitative analysis of the design issues that, if properly addressed during integration, will lead to the highest possible degree of fault-tolerant performance. Software developed under previous phases of this contract and under NAS1-18004 was modified and integrated into a complete RFCS subroutine for NASA's B-737 simulation. The integration of these modules involved the development of methods for dealing with the mismatch between the outputs of the failure detection module and the input requirements of the automatic control system redesign module. The performance of this demonstration system was examined through extensive simulation trials.

  10. Minimalist identification system based on venous map for security applications

    NASA Astrophysics Data System (ADS)

    Jacinto G., Edwar; Martínez S., Fredy; Martínez S., Fernando

    2015-07-01

    This paper proposes a technique and an algorithm used to build a device for identifying people through the processing of a low-resolution camera image. The infrared channel is the only information needed: sensing the blood's response at the proper wavelength yields a preliminary snapshot of the vascular map of the back of the hand. The software uses this information to extract the characteristics of the user in a limited area (region of interest, ROI), unique for each user, which is applicable to biometric access control devices. Recognition prototypes of this kind are usually expensive, but in this minimalist design the biometric equipment uses only a low-cost camera and an adapted matrix of IR emitters, yielding an economical and versatile prototype without sacrificing the high level of effectiveness that characterizes this kind of identification method.

  11. The Utility of Free Software for Gravity and Magnetic Advanced Data Processing

    NASA Astrophysics Data System (ADS)

    Grandis, Hendra; Dahrin, Darharta

    2017-04-01

    The lack of computational tools, i.e. software, often hinders the proper teaching and application of geophysical data processing in academic institutions in Indonesia. Although there are academic licensing options for commercial software, such options are still well beyond the financial capability of some academic institutions. Academic community members (both lecturers and students) are expected to be creative and resourceful in overcoming such a situation. Therefore, the ability to write computer programs or codes is a necessity. However, many computer programs, and even complete software packages, are freely available on the internet. Generally, the utility of such freely distributed software is limited to demonstration, or to visualizing and exchanging data. The paper discusses the utility of Geosoft’s Oasis Montaj Viewer along with the USGS GX programs, both of which are available for free. Useful gravity and magnetic advanced data processing (i.e. gradient calculation, spectral analysis, etc.) can be performed “correctly”, without the approximations that sometimes lead to dubious results and interpretations.

  12. Theoretical and software considerations for general dynamic analysis using multilevel substructured models

    NASA Technical Reports Server (NTRS)

    Schmidt, R. J.; Dodds, R. H., Jr.

    1985-01-01

    The dynamic analysis of complex structural systems using the finite element method and multilevel substructured models is presented. The fixed-interface method is selected for substructure reduction because of its efficiency, accuracy, and adaptability to restart and reanalysis. This method is extended to the reduction of substructures which are themselves composed of reduced substructures. The implementation and performance of the method in a general purpose software system is emphasized. Solution algorithms consistent with the chosen data structures are presented. It is demonstrated that successful finite element software requires the use of software executives to supplement the algorithmic language. The complexity of the implementation of restart and reanalysis procedures illustrates the need for executive systems to support the noncomputational aspects of the software. It is shown that significant computational efficiencies can be achieved through proper use of substructuring and reduction techniques without sacrificing solution accuracy. The restart and reanalysis capabilities and the flexible procedures for multilevel substructured modeling give economical yet accurate analyses of complex structural systems.
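
    The fixed-interface (Craig-Bampton) reduction named in the abstract can be sketched numerically: interior DOFs are condensed through static constraint modes plus a truncated set of fixed-interface normal modes. Below is a minimal sketch on a hypothetical 4-DOF spring-mass chain, not the paper's implementation; when all interior modes are kept, the reduction is exact.

```python
import numpy as np
from scipy.linalg import eigh

# Hypothetical 4-DOF spring-mass chain, DOFs already ordered
# (interior 0-2, boundary/interface 3); stiffness k, unit masses.
k = 1000.0
K = k * np.array([[ 2.0, -1.0,  0.0,  0.0],
                  [-1.0,  2.0, -1.0,  0.0],
                  [ 0.0, -1.0,  2.0, -1.0],
                  [ 0.0,  0.0, -1.0,  1.0]])
M = np.eye(4)

ni, nb = 3, 1                        # interior / boundary DOF counts
Kii, Kib = K[:ni, :ni], K[:ni, ni:]
Mii = M[:ni, :ni]

# Static constraint modes: interior response to a unit boundary displacement.
Psi = -np.linalg.solve(Kii, Kib)

# Fixed-interface normal modes (boundary clamped); keep n_keep of them.
w2, Phi = eigh(Kii, Mii)
n_keep = ni                          # all interior modes kept -> exact reduction
Phi = Phi[:, :n_keep]

# Craig-Bampton transformation and reduced system matrices.
T = np.block([[Phi, Psi],
              [np.zeros((nb, n_keep)), np.eye(nb)]])
Kr, Mr = T.T @ K @ T, T.T @ M @ T

full = eigh(K, M, eigvals_only=True)
reduced = eigh(Kr, Mr, eigvals_only=True)
```

    Truncating `n_keep` below the interior DOF count gives the usual approximate reduction; here, with all modes retained, `reduced` reproduces the full eigenvalues.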

  13. Projecting manpower to attain quality

    NASA Technical Reports Server (NTRS)

    Rone, K. Y.

    1983-01-01

    In these days of soaring software costs it becomes increasingly important to properly manage a software development project. One element of the management task is the projection and tracking of the manpower required to perform the task. In addition, since the total cost of the task is directly related to the initial quality built into the software, it becomes a necessity to project the development manpower in a way to attain that quality. An approach to projecting and tracking manpower with quality in mind is described. The resulting model is useful as a projection tool but must be validated in order to be used as an on-going software cost engineering tool. A procedure is developed to facilitate the tracking of model projections and actual data to allow the model to be tuned. Finally, since the model must be used in an environment of overlapping development activities on a progression of software elements in development and maintenance, a manpower allocation model is developed for use in a steady-state development/maintenance environment.

  14. Using Docker Compose for the Simple Deployment of an Integrated Drug Target Screening Platform.

    PubMed

    List, Markus

    2017-06-10

    Docker virtualization allows software tools to be executed in an isolated and controlled environment referred to as a container. In Docker containers, dependencies are provided exactly as intended by the developer; consequently, they simplify the distribution of scientific software and foster reproducible research. The Docker paradigm is that each container encapsulates one particular software tool. However, to analyze complex biomedical data sets, it is often necessary to combine several software tools into elaborate workflows. To address this challenge, several Docker containers need to be instantiated and properly integrated, which complicates the software deployment process unnecessarily. Here, we demonstrate how an extension to Docker, Docker Compose, can be used to mitigate these problems by providing a unified setup routine that deploys several tools in an integrated fashion. We demonstrate the power of this approach by example of a Docker Compose setup for a drug target screening platform consisting of five integrated web applications and shared infrastructure, deployable in just two lines of code.
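
    As a sketch of the approach the abstract describes, a minimal hypothetical `docker-compose.yml` might tie a web application to shared database infrastructure. Service names, images, and ports here are illustrative assumptions, not the platform's actual configuration.

```yaml
# Hypothetical two-service compose file: one app container plus a shared
# database; Docker Compose starts and wires both with a single command.
version: "3"
services:
  webapp:
    image: example/drug-screening-app:latest   # illustrative image name
    ports:
      - "8080:8080"
    depends_on:
      - db
  db:
    image: postgres:13
    environment:
      POSTGRES_PASSWORD: example
```

    With such a file in the working directory, `docker-compose up -d` instantiates and integrates every listed container in one step, which is what makes a two-line deployment plausible.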

  15. A Novel Automated Method for Analyzing Cylindrical Computed Tomography Data

    NASA Technical Reports Server (NTRS)

    Roth, D. J.; Burke, E. R.; Rauser, R. W.; Martin, R. E.

    2011-01-01

    A novel software method is presented that is applicable for analyzing cylindrical and partially cylindrical objects inspected using computed tomography. This method involves unwrapping and re-slicing data so that the CT data from a cylindrical object can be viewed as a series of 2-D sheets in the vertical direction, in addition to the volume rendering and normal plane views provided by traditional CT software. The method is based on interior and exterior surface edge detection and, under proper conditions, is fully automated, requiring no input from the user except the correct voxel dimension from the CT scan. The software is available from NASA in 32- and 64-bit versions that can be applied to gigabyte-sized data sets, processing data either in random access memory or primarily on the computer hard drive. Please inquire with the presenting author if further interested. This software differentiates itself from other re-slicing solutions through its complete automation and advanced processing and analysis capabilities.
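
    The unwrap-and-re-slice idea can be illustrated as a polar resampling of each axial slice at a fixed radius. This is a simplified sketch under assumed inputs (known center and radius); the NASA tool additionally detects the interior and exterior surfaces automatically.

```python
import numpy as np

def unwrap_cylinder(volume, center, radius, n_theta=360):
    """Resample one cylindrical shell of a CT volume (z, y, x) into a flat
    2-D sheet indexed by (z, theta), using nearest-neighbor lookup."""
    cy, cx = center
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    ys = np.clip(np.rint(cy + radius * np.sin(theta)).astype(int),
                 0, volume.shape[1] - 1)
    xs = np.clip(np.rint(cx + radius * np.cos(theta)).astype(int),
                 0, volume.shape[2] - 1)
    return volume[:, ys, xs]            # shape: (n_z, n_theta)

# Synthetic volume: a bright ring of radius 8 around (16, 16) in every slice,
# standing in for the wall of a cylindrical part.
_, y, x = np.mgrid[0:4, 0:32, 0:32]
vol = (np.abs(np.hypot(y - 16, x - 16) - 8.0) < 1.0).astype(float)
sheet = unwrap_cylinder(vol, center=(16, 16), radius=8.0)
```

    Stepping `radius` from the detected interior to the exterior surface yields the stack of vertical 2-D sheets the abstract describes.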

  16. Design and Effects of Scenario Educational Software.

    ERIC Educational Resources Information Center

    Keegan, Mark

    1993-01-01

    Describes the development of educational computer software called scenario software that was designed to incorporate advances in cognitive, affective, and physiological research. Instructional methods are outlined; the need to change from didactic methods to discovery learning is explained; and scenario software design features are discussed. (24…

  17. An empirical study of software design practices

    NASA Technical Reports Server (NTRS)

    Card, David N.; Church, Victor E.; Agresti, William W.

    1986-01-01

    Software engineers have developed a large body of software design theory and folklore, much of which was never validated. The results of an empirical study of software design practices in one specific environment are presented. The practices examined affect module size, module strength, data coupling, descendant span, unreferenced variables, and software reuse. Measures characteristic of these practices were extracted from 887 FORTRAN modules developed for five flight dynamics software projects monitored by the Software Engineering Laboratory (SEL). The relationship of these measures to cost and fault rate was analyzed using a contingency table procedure. The results show that some recommended design practices, despite their intuitive appeal, are ineffective in this environment, whereas others are very effective.

  18. Three-dimensional reconstruction of teeth and jaws based on segmentation of CT images using watershed transformation.

    PubMed

    Naumovich, S S; Naumovich, S A; Goncharenko, V G

    2015-01-01

    The objective of the present study was the development and clinical testing of a three-dimensional (3D) reconstruction method for teeth and the bone tissue of the jaw on the basis of CT images of the maxillofacial region. 3D reconstruction was performed using specially designed original software based on watershed transformation. Computed tomograms in digital imaging and communications in medicine format, obtained on multispiral CT and CBCT scanners, were used to create 3D models of the teeth and jaws. The processing algorithm performs stepwise threshold image segmentation, with markers placed in multiplanar-projection mode in the areas corresponding to the teeth and the bone tissue. The developed software initially creates coarse 3D models of the entire dentition and the jaw; subsequent procedures then refine the model of the jaw and cut the dentition into separate teeth. Proper selection of the segmentation threshold is particularly important for CBCT images, which have low contrast and a high noise level. The developed semi-automatic algorithm for processing multispiral and cone beam computed tomograms allows 3D models of teeth to be created and separated from the bone tissue of the jaws. The software is easy to install in a dentist's workplace, has an intuitive interface and takes little time in processing. The obtained 3D models can be used for solving a wide range of scientific and clinical tasks.
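
    The marker-guided thresholding step can be sketched in a few lines: threshold the slice, label connected components, and keep only the components containing a user-placed seed point. This is a simplified stand-in for illustration, not the authors' watershed implementation.

```python
import numpy as np
from scipy import ndimage

def segment_with_markers(image, markers, threshold):
    """Threshold the image, label connected components, and keep only the
    components that contain a user-placed marker (seed point)."""
    labels, _ = ndimage.label(image >= threshold)
    keep = sorted({int(labels[pt]) for pt in markers if labels[pt] != 0})
    return np.isin(labels, keep)

# Synthetic slice: two bright blobs; one marker selects only the first blob.
img = np.zeros((20, 20))
img[2:6, 2:6] = 1.0        # blob standing in for a "tooth"
img[12:16, 12:16] = 1.0    # separate blob standing in for "bone"
seg = segment_with_markers(img, markers=[(3, 3)], threshold=0.5)
```

    Repeating this at successively refined thresholds, with markers in both tooth and bone regions, mirrors the stepwise separation the abstract describes.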

  19. Nerves of Steel: a Low-Cost Method for 3D Printing the Cranial Nerves.

    PubMed

    Javan, Ramin; Davidson, Duncan; Javan, Afshin

    2017-10-01

    Steady-state free precession (SSFP) magnetic resonance imaging (MRI) can demonstrate details down to the cranial nerve (CN) level. High-resolution three-dimensional (3D) visualization can now quickly be performed at the workstation. However, we are still limited by visualization on flat screens. The emerging technologies in rapid prototyping or 3D printing overcome this limitation. It comprises a variety of automated manufacturing techniques, which use virtual 3D data sets to fabricate solid forms in a layer-by-layer technique. The complex neuroanatomy of the CNs may be better understood and depicted by the use of highly customizable advanced 3D printed models. In this technical note, after manually perfecting the segmentation of each CN and brain stem on each SSFP-MRI image, initial 3D reconstruction was performed. The bony skull base was also reconstructed from computed tomography (CT) data. Autodesk 3D Studio Max, available through freeware student/educator license, was used to three-dimensionally trace the 3D reconstructed CNs in order to create smooth graphically designed CNs and to assure proper fitting of the CNs into their respective neural foramina and fissures. This model was then 3D printed with polyamide through a commercial online service. Two different methods are discussed for the key segmentation and 3D reconstruction steps, by either using professional commercial software, i.e., Materialise Mimics, or utilizing a combination of the widely available software Adobe Photoshop, as well as a freeware software, OsiriX Lite.

  20. CosmoQuest: A Glance at Citizen Science Building

    NASA Astrophysics Data System (ADS)

    Richardson, Matthew; Grier, Jennifer; Gay, Pamela; Lehan, Cory; Buxner, Sanlyn; CosmoQuest Team

    2018-01-01

    CosmoQuest is a virtual research facility focused on engaging people - citizen scientists - from across the world in authentic research projects designed to enhance our knowledge of the cosmos around us. Using image data acquired by NASA missions, our citizen scientists are first trained to identify specific features within the data and then asked to identify those features across large datasets. Responses submitted by the citizen scientists are then stored in our database, where they await analysis and eventual publication by CosmoQuest staff and collaborating professional research scientists. While it is clear that the driving power behind our projects is the eyes and minds of our citizen scientists, it is CosmoQuest’s custom software, Citizen Science Builder (CSB), that enables the citizen science to be accomplished. On the front end, CosmoQuest’s CSB software allows for the creation of web interfaces that users can access to perform image annotation through both drawing tools and questions that can accompany images. These tools include: using geometric shapes to identify regions within an image, tracing image attributes using freeform line tools, and flagging features within images. Additionally, checkboxes, dropdowns, and free response boxes may be used to collect information. On the back end, this software is responsible for the proper storage of all data, which allows project staff to perform periodic data quality checks and track the progress of each project. In this poster we present these available tools and resources and seek potential collaborations.

  1. Computers and Cognitive Development at Work

    ERIC Educational Resources Information Center

    Roth, Wolff-Michael; Lee, Yew-Jin

    2006-01-01

    Data-logging exercises in science classrooms assume that with the proper scaffolding and provision of contexts by instructors, pupils are able to meaningfully comprehend the experimental variables under investigation. From a case study of knowing and learning in a fish hatchery using real-time computer statistical software, we show that…

  2. An Interactive Computer-Based Training Program for Beginner Personal Computer Maintenance.

    ERIC Educational Resources Information Center

    Summers, Valerie Brooke

    A computer-assisted instructional program, which was developed for teaching beginning computer maintenance to employees of Unisys, covered external hardware maintenance, proper diskette care, making software backups, and electro-static discharge prevention. The procedure used in developing the program was based upon the Dick and Carey (1985) model…

  3. Systems Librarian and Automation Review.

    ERIC Educational Resources Information Center

    Schuyler, Michael

    1992-01-01

    Discusses software sharing on computer networks and the need for proper bandwidth; and describes the technology behind FidoNet, a computer network made up of electronic bulletin boards. Network features highlighted include front-end mailers, Zone Mail Hour, Nodelist, NetMail, EchoMail, computer conferences, tosser and scanner programs, and host…

  4. Back to the Source, or It's A You-Bet-Your-Business Game!

    ERIC Educational Resources Information Center

    Galvin, Wayne W.

    1987-01-01

    Many administrators are signing contracts for software products that leave their institutions completely unprotected in the event of a default by the vendor. It is proper for a customer to include contractual provisions whereby they may gain legal access to the program source code. (MLW)

  5. Advanced CNC Programming (EZ-CAM). 439-366.

    ERIC Educational Resources Information Center

    Casey, Joe

    This document contains two units for an advanced course in computer numerical control (CNC) for computer-aided manufacturing. It is intended to familiarize students with the principles and techniques necessary to create proper CNC programs using computer software. Each unit consists of an introduction, instructional objectives, learning materials,…

  6. Performance Analysis of Saturated Induction Motors by Virtual Tests

    ERIC Educational Resources Information Center

    Ojaghi, M.; Faiz, J.; Kazemi, M.; Rezaei, M.

    2012-01-01

    Many undergraduate-level electrical machines textbooks give detailed treatments of the performance of induction motors. Students can deepen this understanding of motor performance by performing the appropriate practical work in laboratories or in simulation using proper software packages. This paper considers various common and less-common tests…

  7. Software for simulation of a computed tomography imaging spectrometer using optical design software

    NASA Astrophysics Data System (ADS)

    Spuhler, Peter T.; Willer, Mark R.; Volin, Curtis E.; Descour, Michael R.; Dereniak, Eustace L.

    2000-11-01

    Our imaging spectrometer simulation software, known as Eikon, should improve and speed up the design of a Computed Tomography Imaging Spectrometer (CTIS). Eikon uses existing ray-tracing software to simulate a virtual instrument, enabling designers to run through the design, calibration, and data acquisition virtually, saving significant cost and time when designing an instrument. We anticipate that Eikon simulations will improve future CTIS designs by allowing engineers to explore more instrument options.

  8. Design software for reuse

    NASA Technical Reports Server (NTRS)

    Tracz, Will

    1990-01-01

    Viewgraphs are presented on the designing of software for reuse. Topics include terminology, software reuse maxims, the science of programming, an interface design example, a modularization example, and reuse and implementation guidelines.

  9. National plan to enhance aviation safety through human factors improvements

    NASA Technical Reports Server (NTRS)

    Foushee, Clay

    1990-01-01

    The purpose of this section of the plan is to establish a development and implementation strategy plan for improving safety and efficiency in the Air Traffic Control (ATC) system. These improvements will be achieved through the proper applications of human factors considerations to the present and future systems. The program will have four basic goals: (1) prepare for the future system through proper hiring and training; (2) develop a controller work station team concept (managing human errors); (3) understand and address the human factors implications of negative system results; and (4) define the proper division of responsibilities and interactions between the human and the machine in ATC systems. This plan addresses six program elements which together address the overall purpose. The six program elements are: (1) determine principles of human-centered automation that will enhance aviation safety and the efficiency of the air traffic controller; (2) provide new and/or enhanced methods and techniques to measure, assess, and improve human performance in the ATC environment; (3) determine system needs and methods for information transfer between and within controller teams and between controller teams and the cockpit; (4) determine how new controller work station technology can optimally be applied and integrated to enhance safety and efficiency; (5) assess training needs and develop improved techniques and strategies for selection, training, and evaluation of controllers; and (6) develop standards, methods, and procedures for the certification and validation of human engineering in the design, testing, and implementation of any hardware or software system element which affects information flow to or from the human.

  10. Software engineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan

    1993-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.

  11. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure responsive user-software interaction through a rich graphical user interface while achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite than in an existing optimal design software tool. PopED lite is now used in real drug discovery projects, and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss the proposed design, test another design, etc.). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
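
    The core evaluation, scoring a candidate design by a function of the Fisher Information Matrix, can be sketched for a hypothetical mono-exponential model with a discrete search over sampling times. This is an illustration of the D-optimality idea only; PopED lite handles far richer PK/PD models and constraints.

```python
import numpy as np
from itertools import combinations

def fim_det(times, A=10.0, k=0.5):
    """D-optimality score for the model C(t) = A*exp(-k*t): determinant of
    the Fisher Information Matrix J^T J built from the sensitivities
    dC/dA and dC/dk evaluated at the candidate sampling times."""
    t = np.asarray(times, dtype=float)
    e = np.exp(-k * t)
    J = np.column_stack([e, -A * t * e])     # columns: dC/dA, dC/dk
    return float(np.linalg.det(J.T @ J))

# Discrete global search over a candidate grid: pick the best 2 sampling
# times, mirroring the tool's discrete optimization over feasible designs.
grid = np.linspace(0.5, 8.0, 16)
best = max(combinations(grid, 2), key=fim_det)
```

    Constraints (e.g. a limited number of samples per animal) enter by restricting which time combinations are enumerated.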

  12. ClassCompass: A Software Design Mentoring System

    ERIC Educational Resources Information Center

    Coelho, Wesley; Murphy, Gail

    2007-01-01

    Becoming a quality software developer requires practice under the guidance of an expert mentor. Unfortunately, in most academic environments, there are not enough experts to provide any significant design mentoring for software engineering students. To address this problem, we present a collaborative software design tool intended to maximize an…

  13. Cost benefits of advanced software: A review of methodology used at Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Joglekar, Prafulla N.

    1993-01-01

    To assist rational investments in advanced software, a formal, explicit, and multi-perspective cost-benefit analysis methodology is proposed. The methodology can be implemented through a six-stage process which is described and explained. The current practice of cost-benefit analysis at KSC is reviewed in the light of this methodology. The review finds that there is a vicious circle operating. Unsound methods lead to unreliable cost-benefit estimates. Unreliable estimates convince management that cost-benefit studies should not be taken seriously. Then, given external demands for cost-benefit estimates, management encourages software engineers to somehow come up with the numbers for their projects. Lacking the expertise needed to do a proper study, courageous software engineers with vested interests use ad hoc and unsound methods to generate some estimates. In turn, these estimates are unreliable, and the cycle continues. The proposed methodology should help KSC to break out of this cycle.

  14. Exploration of freely available web-interfaces for comparative homology modelling of microbial proteins.

    PubMed

    Nema, Vijay; Pal, Sudhir Kumar

    2013-01-01

    This study was conducted to find the best-suited freely available software for modelling of proteins by taking a few sample proteins. The proteins used ranged from small to large in size, with available crystal structures for the purpose of benchmarking. Key players like Phyre2, Swiss-Model, CPHmodels-3.0, Homer, (PS)2, (PS)(2)-V(2) and Modweb were used for the comparison and model generation. The benchmarking process was done for four proteins, Icl, InhA, and KatG of Mycobacterium tuberculosis and RpoB of Thermus thermophilus, to identify the best-suited software. The parameters compared during analysis gave relatively better values for Phyre2 and Swiss-Model. This comparative study showed that Phyre2 and Swiss-Model produce good models of small and large proteins as compared to the other screened software. The other software tools were also good but were often less successful in providing full-length, properly folded structures.

  15. A Mechanized Decision Support System for Academic Scheduling.

    DTIC Science & Technology

    1986-03-01

    ...an operational system called software. The first step in the development phase is Design. Designers distribute software control by factoring the Data... Subject terms: Scheduling, Decision Support System, Software Design... scheduling system. It will also examine software-design techniques to identify the most appropriate methodology for this problem. Chapter 3 will...

  16. Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed

    DTIC Science & Technology

    2006-12-01

    Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed, by Blake D. Eikenberry... Engineer Degree... Approved for public release; distribution is unlimited.

  17. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1987-01-01

    The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.

  18. Contingency theoretic methodology for agent-based web-oriented manufacturing systems

    NASA Astrophysics Data System (ADS)

    Durrett, John R.; Burnell, Lisa J.; Priest, John W.

    2000-12-01

    The development of distributed, agent-based, web-oriented, N-tier information systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, and computing and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change the way they view the software design process: from solving a problem to dynamically creating teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational-information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are treated as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes, and well researched in the organizational sciences, are presented. Guidelines that can be utilized in the creation of component-based distributed software are discussed.

  19. Software Coherence in Multiprocessor Memory Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Bolosky, William Joseph

    1993-01-01

    Processors are becoming faster and multiprocessor memory interconnection systems are not keeping up. Therefore, it is necessary to have threads and the memory they access as near one another as possible. Typically, this involves putting memory or caches with the processors, which gives rise to the problem of coherence: if one processor writes an address, any other processor reading that address must see the new value. This coherence can be maintained by the hardware or with software intervention. Systems of both types have been built in the past; the hardware-based systems tended to outperform the software ones. However, the ratio of processor to interconnect speed is now so high that the extra overhead of the software systems may no longer be significant. This issue is explored both by implementing a software-maintained system and by introducing and using the technique of offline optimal analysis of memory reference traces. The study finds that in properly built systems, software-maintained coherence can perform comparably to or even better than hardware-maintained coherence. The architectural features necessary for efficient software coherence to be profitable include a small page size, a fast trap mechanism, and the ability to execute instructions while remote memory references are outstanding.
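
    The write-invalidate idea behind software-maintained coherence can be sketched as a toy model (an illustration of the protocol, not the thesis's page-based implementation): a write updates shared memory and invalidates every other processor's copy, so a later read elsewhere must re-fetch the fresh value.

```python
# Toy write-invalidate coherence: each "processor" caches values locally;
# writes invalidate all other copies in software.
class Processor:
    def __init__(self, shared, peers):
        self.shared = shared      # backing store shared by all processors
        self.cache = {}           # local copies of addresses
        self.peers = peers        # all processors, for invalidation

    def read(self, addr):
        if addr not in self.cache:        # miss: fetch from shared memory
            self.cache[addr] = self.shared[addr]
        return self.cache[addr]

    def write(self, addr, value):
        self.shared[addr] = value
        self.cache[addr] = value
        for p in self.peers:              # software invalidation step
            if p is not self:
                p.cache.pop(addr, None)

shared = {0: "old"}
procs = []
p1, p2 = Processor(shared, procs), Processor(shared, procs)
procs.extend([p1, p2])
stale = p2.read(0)        # p2 caches the old value
p1.write(0, "new")        # updates memory, invalidates p2's copy
fresh = p2.read(0)        # miss forces a re-fetch of the new value
```

    The cost of the explicit invalidation loop is the "software overhead" weighed against hardware coherence in the abstract.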

  20. Multidisciplinary Concurrent Design Optimization via the Internet

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Kelkar, Atul G.; Koganti, Gopichand

    2001-01-01

    A methodology is presented which uses commercial design and analysis software and the Internet to perform concurrent multidisciplinary optimization. The methodology provides a means to develop multidisciplinary designs without requiring that all software be accessible from the same local network. The procedures are amenable to design and development teams whose members, expertise and respective software are not geographically located together. This methodology facilitates multidisciplinary teams working concurrently on a design problem of common interest. Partition of design software to different machines allows each constituent software to be used on the machine that provides the most economy and efficiency. The methodology is demonstrated on the concurrent design of a spacecraft structure and attitude control system. Results are compared to those derived from performing the design with an autonomous FORTRAN program.

  1. Assessment of the primary stability of root analog zirconia implants designed using cone beam computed tomography software by means of the Periotest® device: An ex vivo study. A preliminary report.

    PubMed

    Matys, Jacek; Świder, Katarzyna; Flieger, Rafał; Dominiak, Marzena

    2017-08-01

    Primary implant stability is a fundamental prerequisite for successful osseointegration, which determines the prosthetic reconstruction time. The aim of the present study was to assess the quality and precision of modern cone beam computed tomography (CBCT) software in preparing root analog zirconia implants (RAZIs), by measuring their primary stability by means of the Periotest device. Thirteen pig jaws with properly erupted first premolar (P1) teeth were used in the study. A CBCT examination was conducted in the area of the P1 tooth in each mandible. A 3-dimensional (3D) view of each tooth was designed from the CBCT scan. The created 3D images were used to prepare root analog zirconia implants milled from a medical-grade zirconia block by means of laboratory milling. The RAZIs and titanium implants were placed into the alveolar socket after the tooth had been removed. The primary stability of the teeth before extraction (G1), the RAZIs (G2) and the titanium implants (G3) was checked with the Periotest device. The mean results in PTV were 15.9, 3.35 and 12.7 for the G1, G2 and G3 groups, respectively. RAZIs under immediate loading achieved significantly higher primary stability (a lower Periotest value) compared to the teeth and the titanium implants. A modern CBCT device allows us to design a precise image of an extracted tooth for the purpose of manufacturing a root analog implant. An additional feature of the surgical protocol using RAZIs is the possibility of avoiding an augmentation procedure, which reduces the whole cost of the treatment.

  2. Using software agents to preserve individual health data confidentiality in micro-scale geographical analyses.

    PubMed

    Kamel Boulos, Maged N; Cai, Qiang; Padget, Julian A; Rushton, Gerard

    2006-04-01

    Confidentiality constraints often preclude the release of disaggregate data about individuals, which limits the types and accuracy of the results of geographical health analyses that could be done. Access to individually geocoded (disaggregate) data often involves lengthy and cumbersome procedures through review boards and committees for approval (and sometimes is not possible). Moreover, current data confidentiality-preserving solutions compatible with fine-level spatial analyses either lack flexibility or yield less than optimal results (because of confidentiality-preserving changes they introduce to disaggregate data), or both. In this paper, we present a simulation case study to illustrate how some analyses cannot be (or will suffer if) done on aggregate data. We then quickly review some existing data confidentiality-preserving techniques, and move on to explore a solution based on software agents with the potential of providing flexible, controlled (software-only) access to unmodified confidential disaggregate data and returning only results that do not expose any person-identifiable details. The solution is thus appropriate for micro-scale geographical analyses where no person-identifiable details are required in the final results (i.e., only aggregate results are needed). Our proposed software agent technique also enables post-coordinated analyses to be designed and carried out on the confidential database(s), as needed, compared to a more conventional solution based on the Web Services model that would only support a rigid, pre-coordinated (pre-determined) and rather limited set of analyses. The paper also provides an exploratory discussion of mobility, security, and trust issues associated with software agents, as well as possible directions/solutions to address these issues, including the use of virtual organizations. 
Successful partnerships between stakeholder organizations, proper collaboration agreements, clear policies, and unambiguous interpretations of laws and regulations are also much needed to support and ensure the success of any technological solution.
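The aggregate-only idea behind the agent approach can be sketched as follows (a hypothetical, minimal illustration; the function names, the small-cell threshold, and the record layout are invented, not the paper's design):

```python
# Sketch of an "aggregate-only" analysis agent: it computes directly on
# confidential disaggregate records but releases only aggregate counts,
# suppressing small cells that could identify individuals.

def aggregate_only_agent(records, region_of, min_cell=5):
    """Count cases per region; suppress counts below min_cell."""
    counts = {}
    for person in records:
        region = region_of(person)
        counts[region] = counts.get(region, 0) + 1
    # None marks a suppressed (too-small) cell in the released result.
    return {r: (n if n >= min_cell else None) for r, n in counts.items()}

# The disaggregate records never leave the function above; only the
# aggregate, suppression-filtered dictionary is returned to the analyst.
records = [{"id": i, "zip": "A" if i < 8 else "B"} for i in range(10)]
result = aggregate_only_agent(records, region_of=lambda p: p["zip"])
assert result == {"A": 8, "B": None}   # region "B" has only 2 cases
```

A real software agent would add authentication, mobility, and auditing around this core, but the contract is the same: person-identifiable details go in, only aggregate results come out.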

  3. AEDT Software Requirements Documents - Draft

    DOT National Transportation Integrated Search

    2007-01-25

    This software requirements document serves as the basis for designing and testing the Aviation Environmental Design Tool (AEDT) software. The intended audience for this document consists of the following groups: the AEDT designers, developers, and te...

  4. Reuseable Objects Software Environment (ROSE): Introduction to Air Force Software Reuse Workshop

    NASA Technical Reports Server (NTRS)

    Cottrell, William L.

    1994-01-01

    The Reusable Objects Software Environment (ROSE) is a common, consistent, consolidated implementation of software functionality using modern object oriented software engineering including designed-in reuse and adaptable requirements. ROSE is designed to minimize abstraction and reduce complexity. A planning model for the reverse engineering of selected objects through object oriented analysis is depicted. Dynamic and functional modeling are used to develop a system design, the object design, the language, and a database management system. The return on investment for a ROSE pilot program and timelines are charted.

  5. Comparative analysis of numerical models of pipe handling equipment used in offshore drilling applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pawlus, Witold, E-mail: witold.p.pawlus@ieee.org; Ebbesen, Morten K.; Hansen, Michael R.

    Design of offshore drilling equipment is a task that involves not only analysis of strict machine specifications and safety requirements but also consideration of changeable weather conditions and a harsh environment. These challenges call for a multidisciplinary approach and make the design process complex. Various modeling software products are currently available to aid design engineers in their effort to test and redesign equipment before it is manufactured. However, given the number of available modeling tools and methods, the choice of a proper modeling methodology is not obvious and, in some cases, troublesome. Therefore, we present a comparative analysis of two popular approaches used in modeling and simulation of mechanical systems: multibody and analytical modeling. A gripper arm of an offshore vertical pipe handling machine is selected as a case study for which both models are created. In contrast to some other works, the current paper shows verification of both systems by benchmarking their simulation results against each other. Criteria such as modeling effort and result accuracy are evaluated to assess which modeling strategy is the most suitable given its eventual application.

  6. Effect of strain rate and temperature on mechanical properties of selected building Polish steels

    NASA Astrophysics Data System (ADS)

    Moćko, Wojciech; Kruszka, Leopold

    2015-09-01

    Computer programs of the CAD type are now a basic tool for designing various structures under impact loading. Applying numerical calculations can substantially reduce the time required for the design stage of such projects. However, proper use of computer-aided design requires input data for the numerical software, including elastic-plastic models of the structural materials. This work applies the constitutive model developed by Rusinek and Klepaczko (RK) to the modelling of the mechanical behaviour of selected grades of structural steel (St0S, St3SX, 18GS and 34GS) and presents results of experimental and empirical analyses describing the dynamic elastic-plastic behaviour of the tested materials over a wide range of temperatures. To calibrate the RK constitutive model, series of compression tests over a wide range of strain rates, including static, quasi-static and dynamic investigations at lowered, room and elevated temperatures, were carried out on two testing stands: a servo-hydraulic machine and a split Hopkinson bar. The results were analysed to determine the influence of temperature and strain rate on the visco-plastic response of the tested steels, and show good correlation with experimental data.

  7. Expert system for adhesive selection of composite material joints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allen, R.B.; Vanderveldt, H.H.

    The development of composite joining is still in its infancy and much is yet to be learned. Consequently, this field is developing rapidly and new advances occur with great regularity. The need for up-to-date information and expertise in the engineering and planning of composite materials, especially in critical applications, is acute. The American Joining Institute's (AJI) JOINEXCELL, an off-line intelligent planner for joining composite materials, is an intelligent engineering/planning software system that incorporates the knowledge of several experts and can be expanded as these developments occur. The Phase I effort of JOINEXCELL produced an expert system for adhesive selection, JOINADSELECT, for composite material joints. The expert system successfully selects from over 26 different adhesive families for 44 separate material types and hundreds of application situations. Through a series of design questions the expert system selects the proper adhesive for each particular design. Performing this "off-line" engineering planning by computer allows the decision to be made with full knowledge of the latest information about materials and joining procedures. JOINADSELECT can greatly expedite the joining design process, thus yielding cost savings.

  8. Construction of the Dependence Matrix Based on the TRIZ Contradiction Matrix in OOD

    NASA Astrophysics Data System (ADS)

    Ma, Jianhong; Zhang, Quan; Wang, Yanling; Luo, Tao

    In Object-Oriented software design (OOD), the design of classes and objects, the definition of class interfaces and inheritance levels, and the determination of dependency relations have a serious impact on the reusability and flexibility of a system. For a concrete design problem, how to select the right solution from the hundreds of available design schemas has become a focus of designers' attention. After analyzing many practical software designs and Object-Oriented design patterns, this paper constructs a dependence matrix for the Object-Oriented software design field, referring to the contradiction matrix of TRIZ (Theory of Inventive Problem Solving) proposed by the Soviet innovation master Altshuller. As practice indicates, it provides an intuitive, common and standardized method for designers to choose the right design schema, makes research and communication more effective, and also improves software development efficiency and software quality.
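The lookup such a matrix supports can be sketched as a simple table keyed by conflicting design attributes (the attribute names and cell contents below are illustrative inventions, not the matrix constructed in the paper; the pattern names are standard GoF patterns):

```python
# Sketch of a TRIZ-style dependence matrix for OOD: each cell maps the
# conflict "improve attribute X at the cost of attribute Y" to candidate
# design patterns that are known to resolve that kind of contradiction.

DEPENDENCE_MATRIX = {
    ("reusability", "flexibility"): ["Strategy", "Template Method"],
    ("flexibility", "complexity"):  ["Facade", "Mediator"],
    ("extensibility", "coupling"):  ["Observer", "Abstract Factory"],
}

def suggest_patterns(improve, worsen):
    """Return candidate patterns for a given attribute conflict."""
    return DEPENDENCE_MATRIX.get((improve, worsen), [])

assert suggest_patterns("flexibility", "complexity") == ["Facade", "Mediator"]
assert suggest_patterns("speed", "memory") == []   # no entry: no suggestion
```

The value of the approach lies in populating the cells from analyzed designs, so that designers consult a shared table instead of rediscovering the right schema case by case.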

  9. Toward a Formal Model of the Design and Evolution of Software

    DTIC Science & Technology

    1988-12-20

    ...it should have the flexibility to support a variety of design methodologies, be comprehensive enough to encompass the gamut of software lifecycle activities, and be precise enough to provide the...

  10. Computerized design of controllers using data models

    NASA Technical Reports Server (NTRS)

    Irwin, Dennis; Mitchell, Jerrel; Medina, Enrique; Allwine, Dan; Frazier, Garth; Duncan, Mark

    1995-01-01

    The major contributions of the grant effort have been the enhancement of the Compensator Improvement Program (CIP), which resulted in the Ohio University CIP (OUCIP) package, and the development of the Model and Data-Oriented Computer Aided Design System (MADCADS). Incorporation of direct z-domain designs into CIP was tested and determined to be numerically ill-conditioned for the type of lightly damped problems for which the development was intended. Therefore, it was decided to pursue the development of z-plane designs in the w-plane, and to make this conversion transparent to the user. The analytical development needed for this feature, as well as that needed for including compensator damping ratios and DC gain specifications, closed loop stability requirements, and closed loop disturbance rejection specifications into OUCIP are all contained in Section 3. OUCIP was successfully tested with several example systems to verify proper operation of existing and new features. The extension of the CIP philosophy and algorithmic approach to handle modern multivariable controller design criteria was implemented and tested. Several new algorithms for implementing the search approach to modern multivariable control system design were developed and tested. This analytical development, most of which was incorporated into the MADCADS software package, is described in Section 4, which also includes results of the application of MADCADS to the MSFC ACES facility and the Hubble Space Telescope.

  11. [Research progress of probe design software of oligonucleotide microarrays].

    PubMed

    Chen, Xi; Wu, Zaoquan; Liu, Zhengchun

    2014-02-01

    DNA microarrays have become an essential medical genetic diagnostic tool owing to their high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are now available to perform this work. Each package targets different sequence types and shows different advantages and limitations. In this article, the research and development of this software is reviewed against three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review will help users choose appropriate probe-design software. It will also reduce the costs of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and the commercialization of high-performance probe design software.
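One of the shared criteria, melting temperature, can be estimated with the classic Wallace rule, Tm = 2(A+T) + 4(G+C) degrees C. This is a standard approximation for short oligos only; production probe-design software uses more accurate nearest-neighbor thermodynamic models. A minimal sketch:

```python
# Wallace-rule melting temperature and GC content for short oligo probes.

def melting_temp_wallace(seq):
    """Tm in degrees C by the Wallace rule (short oligos only)."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

def gc_content(seq):
    """Fraction of G/C bases in the sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

probe = "ATGCGCTA"                      # 4 A/T bases, 4 G/C bases
assert melting_temp_wallace(probe) == 24   # 2*4 + 4*4
assert gc_content(probe) == 0.5
```

Probe-design tools combine a Tm filter like this with specificity checks (uniqueness against the target genome) and sensitivity checks (secondary structure, cross-hybridization).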

  12. SEPAC flight software detailed design specifications, volume 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The detailed design specifications (as built) for the SEPAC Flight Software are defined. The design includes a description of the total software system and of each individual module within the system. The design specifications describe the decomposition of the software system into its major components. The system structure is expressed in the following forms: the control-flow hierarchy of the system, the data-flow structure of the system, the task hierarchy, the memory structure, and the software to hardware configuration mapping. The component design description includes details on the following elements: register conventions, module (subroutine) invocation, module functions, interrupt servicing, data definitions, and database structure.

  13. Designing Computerized Provider Order Entry Software in Iran: The Nurses' and Physicians' Viewpoints.

    PubMed

    Khammarnia, Mohammad; Sharifian, Roxana; Zand, Farid; Keshtkaran, Ali; Barati, Omid

    2016-09-01

    This study aimed to identify the functional requirements of computerized provider order entry software and to design this software in Iran. The study was conducted using document review, interviews, and focus group discussions at Shiraz University of Medical Sciences, a leading medical center in Iran, in 2013-2015. The study sample consisted of physicians (n = 12) and nurses (n = 2) in the largest hospital in the southern part of Iran and information technology experts (n = 5) at Shiraz University of Medical Sciences. Functional requirements of the computerized provider order entry system were examined in three phases. Finally, the functional requirements were distributed across four levels, and accordingly, the computerized provider order entry software was designed. The software had seven main dimensions: (1) data entry, (2) drug interaction management system, (3) warning system, (4) treatment services, (5) ability to write in the software, (6) reporting from all sections of the software, and (7) technical capabilities of the software. The nurses and physicians emphasized quick access to the computerized provider order entry software, the order prescription section, and the applicability of the software. The software had some items that had not been mentioned in other studies. Ultimately, the software was designed by a company specializing in hospital information systems in Iran. This study was the first specific investigation of computerized provider order entry software design in Iran. Based on the results, it is suggested that this software be implemented in hospitals.

  14. Exploring machine-learning-based control plane intrusion detection techniques in software defined optical networks

    NASA Astrophysics Data System (ADS)

    Zhang, Huibin; Wang, Yuqiao; Chen, Haoran; Zhao, Yongli; Zhang, Jie

    2017-12-01

    In software defined optical networks (SDON), the centralized control plane may encounter numerous intrusion threats which compromise the security level of provisioned services. In this paper, the issue of control plane security is studied and two machine-learning-based control plane intrusion detection techniques are proposed for SDON, with properly selected features such as bandwidth, route length, etc. We validate the feasibility and efficiency of the proposed techniques by simulations. Results show that an accuracy of 83% for intrusion detection can be achieved with the proposed machine-learning-based techniques.
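The feature-based classification idea can be sketched with a toy 1-nearest-neighbor classifier over the kind of features the paper names (bandwidth, route length). This is an illustrative stand-in, not the paper's actual model or data:

```python
# Toy intrusion detection: classify a control plane request as "normal" or
# "intrusion" by the label of its nearest neighbor in feature space.

def nearest_neighbor_classify(train, query):
    """train: list of ((bandwidth, route_len), label) pairs."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(train, key=lambda item: dist2(item[0], query))
    return label

train = [
    ((10.0, 3), "normal"),    ((12.0, 4), "normal"),
    ((95.0, 9), "intrusion"), ((88.0, 8), "intrusion"),
]
assert nearest_neighbor_classify(train, (11.0, 3)) == "normal"
assert nearest_neighbor_classify(train, (90.0, 9)) == "intrusion"
```

A practical detector would normalize features, use many more of them, and train a stronger model, but the pipeline shape (feature extraction, then supervised classification of requests) is the same.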

  15. Design study of Software-Implemented Fault-Tolerance (SIFT) computer

    NASA Technical Reports Server (NTRS)

    Wensley, J. H.; Goldberg, J.; Green, M. W.; Kutz, W. H.; Levitt, K. N.; Mills, M. E.; Shostak, R. E.; Whiting-Okeefe, P. M.; Zeidler, H. M.

    1982-01-01

    Software-implemented fault tolerant (SIFT) computer design for commercial aviation is reported. A SIFT design concept is addressed. Alternate strategies for physical implementation are considered. Hardware and software design correctness is addressed. System modeling and effectiveness evaluation are considered from a fault-tolerant point of view.

  16. Using Software Design Methods in CALL

    ERIC Educational Resources Information Center

    Ward, Monica

    2006-01-01

    The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…

  17. Autonomous robot software development using simple software components

    NASA Astrophysics Data System (ADS)

    Burke, Thomas M.; Chung, Chan-Jin

    2004-10-01

    Developing software to control a sophisticated lane-following, obstacle-avoiding, autonomous robot can be demanding and beyond the capabilities of novice programmers - but it doesn't have to be. A creative software design, utilizing only basic image processing and a little algebra, has been employed to control the LTU-AISSIG autonomous robot - a contestant in the 2004 Intelligent Ground Vehicle Competition (IGVC). This paper presents a software design equivalent to that used during the IGVC, but with much of the complexity removed. The result is an autonomous robot software design that is robust, reliable, and can be implemented by programmers with a limited understanding of image processing. This design provides a solid basis for further work in autonomous robot software, as well as an interesting and achievable robotics project for students.
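The "basic image processing and a little algebra" style can be sketched as follows (a hypothetical simplification in the spirit of the paper, not its actual code: the image is a list of rows of 0/1 pixels, 1 meaning a bright lane marking):

```python
# Lane following with a centroid: average the columns of lane pixels in the
# bottom image row and steer toward that centroid.

def steering_offset(image, row=-1):
    """Lane-center offset from image center, in pixels (+ = steer right)."""
    pixels = image[row]
    lane = [x for x, v in enumerate(pixels) if v == 1]
    if not lane:
        return None                      # no lane visible: hold course
    centroid = sum(lane) / len(lane)     # average column of lane pixels
    return centroid - (len(pixels) - 1) / 2.0

# A 9-pixel-wide frame whose bottom row has lane marking on the right:
frame = [[0] * 9, [0, 0, 0, 0, 0, 0, 1, 1, 0]]
assert steering_offset(frame) == 2.5     # centroid 6.5, image center 4.0
```

Thresholding a camera frame to 0/1 pixels and feeding the offset to a proportional steering command is enough for a first working lane follower, which is what makes the project achievable for students.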

  18. Development of Probabilistic Life Prediction Methodologies and Testing Strategies for MEMS and CMC's

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama

    2003-01-01

    This effort is to investigate probabilistic life prediction methodologies for ceramic matrix composites and MicroElectroMechanical Systems (MEMS) and to analyze designs that determine stochastic properties of MEMS. For CMC's this includes a brief literature survey regarding lifing methodologies. Also of interest for MEMS is the design of a proper test for the Weibull size effect in thin film (bulge test) specimens. The Weibull size effect is a consequence of a stochastic strength response predicted from the Weibull distribution. Confirming that MEMS strength is controlled by the Weibull distribution will enable the development of a probabilistic design methodology for MEMS - similar to the GRC developed CARES/Life program for bulk ceramics. A main objective of this effort is to further develop and verify the ability of the Ceramics Analysis and Reliability Evaluation of Structures/Life (CARES/Life) code to predict the time-dependent reliability of MEMS structures subjected to multiple transient loads. A second set of objectives is to determine the applicability/suitability of the CARES/Life methodology for CMC analysis, what changes would be needed to the methodology and software, and if feasible, run a demonstration problem. Also important is an evaluation of CARES/Life coupled to the ANSYS Probabilistic Design System (PDS) and the potential of coupling transient reliability analysis to the ANSYS PDS.
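The Weibull size effect the bulge test probes can be stated compactly: for a brittle material, the failure probability at stress sigma scales with specimen volume (or area) V as P_f = 1 - exp(-(V/V0)(sigma/sigma0)^m). A minimal sketch with illustrative parameter values (not MEMS data):

```python
# Two-parameter Weibull failure probability with the volume size effect:
# larger specimens sample more flaws, so they fail at lower stress.

import math

def weibull_failure_prob(sigma, volume, sigma0=1.0e9, v0=1.0, m=10.0):
    """P_f = 1 - exp(-(V/V0) * (sigma/sigma0)**m)."""
    return 1.0 - math.exp(-(volume / v0) * (sigma / sigma0) ** m)

# At the same stress, ten times the volume gives a higher failure
# probability - the size effect the thin-film specimens are tested for.
p_small = weibull_failure_prob(0.8e9, volume=1.0)
p_large = weibull_failure_prob(0.8e9, volume=10.0)
assert p_large > p_small
assert 0.0 < p_small < 1.0
```

Confirming that measured MEMS strengths follow this distribution is what would justify applying a CARES/Life-style probabilistic design methodology to them.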

  19. Designing Control System Application Software for Change

    NASA Technical Reports Server (NTRS)

    Boulanger, Richard

    2001-01-01

    The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner which transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change, and the effort of that change, can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.

  20. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process - a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability to all reactor system simulation scenarios.

  1. FMT (Flight Software Memory Tracker) For Cassini Spacecraft-Software Engineering Using JAVA

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.; Uffelman, Hal; Wax, Allan H.

    1997-01-01

    The software engineering design of the Flight Software Memory Tracker (FMT) Tool is discussed in this paper. FMT is a ground analysis software set, consisting of utilities and procedures, designed to track the flight software, i.e., images of memory load and updatable parameters of the computers on-board Cassini spacecraft. FMT is implemented in Java.

  2. Make Your Voice Heard!

    ERIC Educational Resources Information Center

    Branzburg, Jeffrey

    2006-01-01

    A podcast is a method of distributing multimedia files, usually (but not limited to) audio in the MP3 format, over the Internet to subscribers. Anybody can be a subscriber--one only needs the proper software to receive the subscription. In this article, the author discusses how to create one's own podcast. Before creating the podcast, one needs a…

  3. Tablets in K-12 Education: Integrated Experiences and Implications

    ERIC Educational Resources Information Center

    An, Heejung, Ed.; Alon, Sandra, Ed.; Fuentes, David, Ed.

    2015-01-01

    The inclusion of new and emerging technologies in the education sector has been a topic of interest to researchers, educators, and software developers alike in recent years. Utilizing the proper tools in a classroom setting is a critical factor in student success. "Tablets in K-12 Education: Integrated Experiences and Implications"…

  4. Fitting program for linear regressions according to Mahon (1996)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trappitsch, Reto G.

    2018-01-09

    This program takes the user's input data and fits a linear regression to it using the prescription presented by Mahon (1996). Compared to the commonly used York fit, this method has the correct prescription for measurement error propagation. This software should facilitate the proper fitting of measurements through a simple interface.

  5. Apps. Accessibility and Usability by People with Visual Disabilities

    ERIC Educational Resources Information Center

    Olmedo-Moreno, Eva María; López-Delgado, Adrian

    2015-01-01

    The increasing use of ICT devices, such as smartphones and tablets, requires the development of proper software and apps to facilitate the socio-educative life of citizens in smart cities: adaptive educational resources, leisure and entertainment facilities, and mobile payment services, among others. Undoubtedly, all of this is opening a new age with more…

  6. Perceived Utility of Typesetting Homework in Post-Calculus Mathematics Courses

    ERIC Educational Resources Information Center

    Quinlan, James; Tennenhouse, Craig

    2016-01-01

    Too often our students submit incomplete homework that is disorganized, unclear, and nonlinear. Typesetting with LATEX, although time consuming for those new to the software, strengthens communication by forcing organization and proper notation required by the precise, formal language of mathematics. In this manuscript we report on a study of 42…
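The organization and notation that typesetting enforces can be illustrated with a short, hypothetical homework snippet in LaTeX (the problem and its numbers are invented for illustration):

```latex
% Minimal typeset homework solution: LaTeX forces explicit structure
% and proper mathematical notation.
\documentclass{article}
\usepackage{amsmath}
\begin{document}

\textbf{Problem 3.} Evaluate $\displaystyle\int_0^1 x^2 \, dx$.

\textbf{Solution.} By the power rule for antiderivatives,
\begin{align*}
  \int_0^1 x^2 \, dx = \left[ \frac{x^3}{3} \right]_0^1 = \frac{1}{3}.
\end{align*}

\end{document}
```

Even at this scale, the student must name the problem, state the method, and write the integral in formal notation rather than ad hoc shorthand.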

  7. Music Teacher Perceptions of a Model of Technology Training and Support in Virginia

    ERIC Educational Resources Information Center

    Welch, Lee Arthur

    2013-01-01

    A plethora of technology resources currently exists for the music classroom of the twenty-first century, including digital audio and video, music software, electronic instruments, Web 2.0 tools and more. Research shows a strong need for professional development for teachers to properly implement and integrate instructional technology resources…

  8. The Software Design Document: More than a User's Manual.

    ERIC Educational Resources Information Center

    Bowers, Dennis

    1989-01-01

    Discusses the value of creating design documentation for computer software so that it may serve as a model for similar design efforts. Components of the software design document are described, including program flowcharts, graphic representation of screen displays, storyboards, and evaluation procedures. An example is given using HyperCard. (three…

  9. Current And Future Directions Of Lens Design Software

    NASA Astrophysics Data System (ADS)

    Gustafson, Darryl E.

    1983-10-01

    The most effective environment for doing lens design continues to evolve as new computer hardware and software tools become available. Important recent hardware developments include low-cost but powerful interactive multi-user 32-bit computers with virtual memory that are totally software-compatible with prior, larger and more expensive members of the family, and a rapidly growing variety of graphics devices for both hard-copy and screen graphics, including many with color capability. In addition, with optical design software readily accessible in many forms, optical design has become a part-time activity for a large number of engineers instead of being restricted to a small number of full-time specialists. A designer interface that is friendly for the part-time user while remaining efficient for the full-time designer is thus becoming more important as well as more practical. Along with these developments, software tools in other scientific and engineering disciplines are proliferating. Thus, the optical designer is less and less unique in his use of computer-aided techniques and faces the challenge and opportunity of efficiently communicating his designs to other computer-aided-design (CAD), computer-aided-manufacturing (CAM), structural, thermal, and mechanical software tools. This paper will address the impact of these developments on the current and future directions of the CODE V™ optical design software package, its implementation, and the resulting lens design environment.

  10. Experimental investigation and CFD simulation of multi-pipe earth-to-air heat exchangers (EAHEs) flow performance

    NASA Astrophysics Data System (ADS)

    Amanowicz, Łukasz; Wojtkowiak, Janusz

    2017-11-01

    In this paper the experimentally obtained flow characteristics of multi-pipe earth-to-air heat exchangers (EAHEs) were used to validate an EAHE flow performance numerical model prepared by means of the CFD software Ansys Fluent. Cut-cell meshing and the k-ɛ realizable turbulence model, with default coefficient values and enhanced wall treatment, were used. The total pressure losses and the airflow in each pipe of the multi-pipe exchangers were investigated both experimentally and numerically. The results show that the airflow in the pipes of multi-pipe EAHE structures is not equal. The validated numerical model can be used for proper design of multi-pipe EAHEs from the flow-performance point of view. The influence of EAHE geometrical parameters on the total pressure losses and on the airflow division between the exchanger pipes can also be analysed. Using CFD to design EAHEs can help HVAC (Heating, Ventilation and Air Conditioning) engineers optimize the geometrical structure of multi-pipe EAHEs in order to save energy and decrease the operational costs of low-energy buildings.
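Why the airflow divides unevenly among parallel pipes can be sketched with a back-of-envelope calculation. Assuming laminar flow for simplicity (real EAHEs are often turbulent, which the CFD model handles), the Hagen-Poiseuille law gives dp = 32·mu·L·v/D², so under a common pressure drop the mean velocity in each pipe is v = dp·D²/(32·mu·L). The pipe lengths and pressure drop below are illustrative, not the paper's geometry:

```python
# Laminar flow split among parallel pipes of different length under a
# common pressure drop (Hagen-Poiseuille).

MU_AIR = 1.8e-5   # dynamic viscosity of air, Pa*s

def mean_velocity(dp, length, diameter, mu=MU_AIR):
    """Laminar mean velocity for pressure drop dp over one pipe."""
    return dp * diameter ** 2 / (32.0 * mu * length)

dp = 0.5                       # Pa, same drop across every parallel pipe
lengths = [30.0, 35.0, 40.0]   # m: pipes farther from the manifold are longer
v = [mean_velocity(dp, L, 0.2) for L in lengths]

# The shortest pipe carries the most flow, so the division is unequal:
assert v[0] > v[1] > v[2]
```

This is exactly the imbalance the validated CFD model quantifies, and what a designer balances by adjusting manifold and pipe geometry.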

  11. Design and Implementation of a Brain Computer Interface System for Controlling a Robotic Claw

    NASA Astrophysics Data System (ADS)

    Angelakis, D.; Zoumis, S.; Asvestas, P.

    2017-11-01

    The aim of this paper is to present the design and implementation of a brain-computer interface (BCI) system that can control a robotic claw. The system is based on the Emotiv Epoc headset, which provides simultaneous recording of 14 EEG channels as well as wireless connectivity by means of the Bluetooth protocol. The system is initially trained to decode what the user thinks into properly formatted data. The headset communicates with a personal computer, which runs a dedicated software application implemented under the Processing integrated development environment. The application acquires the data from the headset and issues suitable commands to an Arduino Uno board. The board decodes the received commands and produces corresponding signals for a servo motor that controls the position of the robotic claw. The system was tested successfully on a healthy male subject, aged 28 years. The results are promising, taking into account that no specialized hardware was used. However, tests on a larger number of users are necessary in order to draw solid conclusions regarding the performance of the proposed system.

  12. New asphalt mix design system for Oklahoma department of transportation : final report.

    DOT National Transportation Integrated Search

    2013-03-01

    Oklahoma Department of Transportation (ODOT) has been using the Superpave mix design software for several years. The original Superpave mix design software was built around Fox Database and did not meet ODOT requirements. The software currently being...

  13. Software Requirements Engineering Methodology (Development)

    DTIC Science & Technology

    1979-06-01

    Higher Order Software [20]; and the Michael Jackson Design Methodology [21]. Although structured programming constructs have proven to be more useful...reviewed here. Similarly, the manual techniques for software design (e.g., HIPO Diagrams, Nassi-Schneidermann charts, Top-Down Design, the Michael ... Jackson Design Methodology, Yourdon’s Structured Design) are not addressed. 6.1.3 Research Programs There are a number of research programs underway

  14. "Mobile Nurse" platform for ubiquitous medicine.

    PubMed

    Struzik, Z R; Yoshiuchi, K; Sone, M; Ishikawa, T; Kikuchi, H; Kumano, H; Watsuji, T; Natelson, B H; Yamamoto, Y

    2007-01-01

    We introduce "Mobile Nurse" (MN) - an emerging platform for the practice of ubiquitous medicine. By bringing the patient care traditionally provided by clinical nurses on duty into the dynamic setting of daily life, MN aims at integral data collection and a shorter response time to the patient. MN is also capable of intelligent interaction with the patient and is able to learn from the patient's behavior and disease-sign evaluation for improved personalized treatment. In this paper, we outline the most essential concepts behind the hardware, software and methodological designs of MN. We provide an example of the implementation, and elaborate on the possible future impact on medical practice and biomedical science research. The main innovation of MN, setting it apart from current tele-medicine systems, is the ability to integrate the patient's signs and symptoms on site, providing medical professionals with powerful tools to elucidate disease mechanisms, to make proper diagnoses and to prescribe treatment.

  15. Multimedia telehomecare system using standard TV set.

    PubMed

    Guillén, S; Arredondo, M T; Traver, V; García, J M; Fernández, C

    2002-12-01

    Nowadays, a very large number of patients need specific health support at home. The deployment of broadband communication networks is making feasible the provision of home care services with a proper quality of service. This paper presents a telehomecare multimedia platform that runs over the integrated services digital network and internet protocol using the videoconferencing standards H.320 and H.323, with a standard TV set for patient interaction. The platform allows online remote monitoring of ECG, heart sound, and blood pressure. Usability, affordability, and interoperability were considered in the design and development of its hardware and software components. A first evaluation of technical and usability aspects was carried out with 52 patients of a private clinic and 10 university students. Results show a high rating (mean = 4.33, standard deviation (SD) = 1.63 on a five-point Likert scale) in the users' global perception of the quality of images, voice, and feeling of virtual presence.

  16. Calibration of gamma-ray detectors using Gaussian photopeak fitting in the multichannel spectra with a LabVIEW-based digital system

    NASA Astrophysics Data System (ADS)

    Schlattauer, Leo; Parali, Levent; Pechousek, Jiri; Sabikoglu, Israfil; Celiktas, Cuneyt; Tektas, Gozde; Novak, Petr; Jancar, Ales; Prochazka, Vit

    2017-09-01

    This paper reports on the development of a gamma-ray spectroscopic system for the (i) recording and (ii) processing of spectra. The data read-out unit consists of a PCI digital oscilloscope, a personal computer and the LabVIEW™ programming environment. Pulse-height spectra of various sources were recorded with two NaI(Tl) detectors and analyzed, demonstrating proper usage of the detectors. A multichannel analyzer implements the Gaussian photopeak fitting. The presented method provides results that are in agreement with those obtained from commercial spectroscopy systems. Each individual hardware or software unit can be further utilized in different spectrometric user systems. An application of the developed system for research and teaching purposes regarding the design of digital spectrometric systems has been successfully tested at the laboratories of the Department of Experimental Physics.
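    The core of photopeak analysis is estimating the peak position and width from a multichannel spectrum. The sketch below is illustrative only (not the authors' LabVIEW code): where the paper performs a nonlinear Gaussian fit, we use simple statistical moments over a region of interest, which give the same parameters for a clean, symmetric peak.

    ```python
    # Illustrative sketch: estimate the centroid and width of a photopeak
    # from a multichannel spectrum using statistical moments over a region
    # of interest. A full implementation would do a least-squares Gaussian fit.
    import math

    def photopeak_moments(counts, lo, hi):
        """Return (centroid_channel, sigma) of the peak in channels [lo, hi)."""
        total = sum(counts[lo:hi])
        centroid = sum(ch * counts[ch] for ch in range(lo, hi)) / total
        var = sum(counts[ch] * (ch - centroid) ** 2 for ch in range(lo, hi)) / total
        return centroid, math.sqrt(var)

    # Synthetic Gaussian photopeak centred on channel 50 with sigma = 4.
    spectrum = [int(1000 * math.exp(-((ch - 50) ** 2) / (2 * 4.0 ** 2)))
                for ch in range(100)]
    centroid, sigma = photopeak_moments(spectrum, 30, 70)
    print(round(centroid), round(sigma, 1))
    ```

    With an energy calibration (two or more known photopeak energies against their centroid channels), the same centroid estimate converts channels to keV.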

  17. UAV Cooperation Architectures for Persistent Sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, R S; Kent, C A; Jones, E D

    2003-03-20

    With the number of small, inexpensive Unmanned Air Vehicles (UAVs) increasing, it is feasible to build multi-UAV sensing networks. In particular, by using UAVs in conjunction with unattended ground sensors, a degree of persistent sensing can be achieved. With proper UAV cooperation algorithms, sensing is maintained even though exceptional events, e.g., the loss of a UAV, have occurred. In this paper a cooperation technique that allows multiple UAVs to perform coordinated, persistent sensing with unattended ground sensors over a wide area is described. The technique automatically adapts the UAV paths so that, on average, the amount of time that any sensor has to wait for a UAV revisit is minimized. We also describe the Simulation, Tactical Operations and Mission Planning (STOMP) software architecture. This architecture is designed to help simulate and operate distributed sensor networks where multiple UAVs are used to collect data.
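    One simple policy with the revisit-minimizing flavour described in the abstract can be sketched as follows. This greedy rule (always visit the sensor that has waited longest) is an assumption for illustration, not the paper's cooperation algorithm, and it ignores UAV travel time.

    ```python
    # Hedged sketch of a greedy persistent-sensing policy (an assumption,
    # not the paper's algorithm): each step, a UAV visits the ground sensor
    # that has waited longest, which tends to bound the revisit wait.

    def next_sensor(wait_times):
        """Pick the index of the sensor with the longest wait since last visit."""
        return max(range(len(wait_times)), key=lambda i: wait_times[i])

    def simulate(wait_times, steps):
        """Greedy revisit simulation; each step one visit resets one wait."""
        waits = list(wait_times)
        for _ in range(steps):
            target = next_sensor(waits)
            waits = [w + 1 for w in waits]  # time advances for every sensor
            waits[target] = 0               # the visited sensor's wait resets
        return waits

    print(simulate([5, 1, 3], 3))  # → [2, 0, 1]
    ```

    A real planner would also weigh travel distance and UAV loss events; the greedy wait-time rule is only the baseline the adaptive path planning improves upon.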

  18. A Web-based open-source database for the distribution of hyperspectral signatures

    NASA Astrophysics Data System (ADS)

    Ferwerda, J. G.; Jones, S. D.; Du, Pei-Jun

    2006-10-01

    With the coming of age of field spectroscopy as a non-destructive means to collect information on the physiology of vegetation, there is a need for storage of signatures and, more importantly, their metadata. Without the proper organisation of metadata, the signatures themselves are of limited use. In order to facilitate re-distribution of data, a database for the storage and distribution of hyperspectral signatures and their metadata was designed. The database was built using open-source software and can be used by the hyperspectral community to share their data. Data is uploaded through a simple web-based interface. The database recognizes the major file formats of ASD, GER and International Spectronics. The database source code is available for download through the hyperspectral.info web domain, and we invite suggestions for additions and modifications to the database, to be submitted through the online forums on the same website.

  19. Full extraction methods to retrieve effective refractive index and parameters of a bianisotropic metamaterial based on material dispersion models

    NASA Astrophysics Data System (ADS)

    Hsieh, Feng-Ju; Wang, Wei-Chih

    2012-09-01

    This paper discusses two improved methods for retrieving the effective refractive indices, impedances, and material properties, such as permittivity (ɛ) and permeability (μ), of metamaterials. The first method, modified from Kong's retrieval method, allows effective constitutive parameters to be solved over all frequencies, including the anti-resonant band where the imaginary parts of ɛ or μ are negative. The second method is based on genetic algorithms and optimization of properly defined goal functions to retrieve parameters of the Drude and Lorentz dispersion models. Equations for the effective refractive index and impedance at arbitrary reference planes are derived. Split-ring resonator-rod based metamaterials operating at terahertz frequencies are designed and investigated with the proposed methods. The retrieved material properties and parameters are used to regenerate S-parameters, which are compared with simulation results generated by CST Microwave Studio software.
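    The Lorentz dispersion model the second method fits has the standard single-oscillator form ɛ(ω) = ɛ∞ + Δɛ·ω₀²/(ω₀² − ω² − iγω). The sketch below evaluates that formula; the numeric parameter values are illustrative assumptions, not retrieved values from the paper.

    ```python
    # Sketch of the single-oscillator Lorentz dispersion model referenced in
    # the abstract; the parameter values below are illustrative assumptions.
    # eps(w) = eps_inf + delta_eps * w0^2 / (w0^2 - w^2 - 1j*gamma*w)

    def lorentz_eps(w, eps_inf=1.0, delta_eps=2.0, w0=1.0e12, gamma=5.0e10):
        """Complex relative permittivity of a single Lorentz oscillator."""
        return eps_inf + delta_eps * w0**2 / (w0**2 - w**2 - 1j * gamma * w)

    # Far below resonance the model reduces to the static value eps_inf + delta_eps.
    print(round(lorentz_eps(0.0).real, 3))  # → 3.0
    ```

    A genetic-algorithm retrieval in this spirit would adjust (ɛ∞, Δɛ, ω₀, γ) until S-parameters regenerated from ɛ(ω) match the measured or simulated ones.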

  20. Application Of Interferometry To Optical Components And Systems Evaluation

    NASA Astrophysics Data System (ADS)

    Houston, Joseph B., Jr.

    1982-05-01

    Interferometry provides opticians and lens designers with the ability to evaluate optical components and systems quantitatively. A variety of interferometers and interferometric test procedures have evolved over the past several decades. This evolution has stimulated an ever-increasing amount of interest in using a new generation of instrumentation and computer software for solving cost and schedule problems both in the shop and at field test sites. Optical engineers and their customers continue to gain confidence in their abilities to perform several operations such as assure component quality, analyze and optimize lens assemblies, and accurately predict end-item performance. In this paper, a set of typical test situations are addressed and some standard instrumentation is described, as a means of illustrating the special advantages of interferometric testing. Emphasis will be placed on the proper application of currently available hardware and some of the latest proven techniques.

  1. The Last Meter: Blind Visual Guidance to a Target.

    PubMed

    Manduchi, Roberto; Coughlan, James M

    2014-01-01

    Smartphone apps can use object recognition software to provide information to blind or low vision users about objects in the visual environment. A crucial challenge for these users is aiming the camera properly to take a well-framed picture of the desired target object. We investigate the effects of two fundamental constraints of object recognition - frame rate and camera field of view - on a blind person's ability to use an object recognition smartphone app. The app was used by 18 blind participants to find visual targets beyond arm's reach and approach them to within 30 cm. While we expected that a faster frame rate or wider camera field of view should always improve search performance, our experimental results show that in many cases increasing the field of view does not help, and may even hurt, performance. These results have important implications for the design of object recognition systems for blind users.

  2. OHD/HL - National Weather Hydrology Laboratory

    Science.gov Websites

    Resources and services. Design and Programming Standards and Guidelines: General Programming; C; C++; FORTRAN; Java v 2.0 (updated 3/28/2008); Java v 1.9; Korn and Bash Shell. Software Design Phase Guidelines: OHD Design Specification Template; OHD Design Specification Example. Software Peer Review Guidelines and Checklists. Software...

  3. Investigation of asphalt content design for open-graded bituminous mixes.

    DOT National Transportation Integrated Search

    1974-01-01

    Several design procedures associated with determining the proper asphalt content for open-graded bituminous mixes were investigated. Also considered was the proper amount of tack coat that should be placed on the old surface prior to paving operation...

  4. Performance Characteristic Mems-Based IMUs for UAVs Navigation

    NASA Astrophysics Data System (ADS)

    Mohamed, H. A.; Hansen, J. M.; Elhabiby, M. M.; El-Sheimy, N.; Sesay, A. B.

    2015-08-01

    Accurate 3D reconstruction has become essential for non-traditional mapping applications such as urban planning, mining industry, environmental monitoring, navigation, surveillance, pipeline inspection, infrastructure monitoring, landslide hazard analysis, indoor localization, and military simulation. The needs of these applications cannot be satisfied by traditional mapping, which is based on dedicated data acquisition systems designed for mapping purposes. Recent advances in hardware and software development have made it possible to conduct accurate 3D mapping without using costly and high-end data acquisition systems. Low-cost digital cameras, laser scanners, and navigation systems can provide accurate mapping if they are properly integrated at the hardware and software levels. Unmanned Aerial Vehicles (UAVs) are emerging as a mobile mapping platform that can provide additional economical and practical advantages. However, such economical and practical requirements need navigation systems that can provide an uninterrupted navigation solution. Hence, testing the performance characteristics of Micro-Electro-Mechanical Systems (MEMS) or other low-cost navigation sensors for various UAV applications is an important research task. This work focuses on studying the performance characteristics under different manoeuvres using inertial measurements integrated with single-point positioning, Real-Time Kinematic (RTK) positioning, and additional navigational aiding sensors. Furthermore, the performance of the inertial sensors is tested during Global Positioning System (GPS) signal outages.

  5. Hardware and software implementation of a low power attitude control and determination system for cubesats

    NASA Astrophysics Data System (ADS)

    Frey, Jesse

    In recent years there has been a growing interest in smaller satellites. Smaller satellites are cheaper to build and launch than larger satellites. One form factor, the CubeSat, is especially popular with universities and is a 10 cm cube. Being smaller means that the mass and power budgets are tighter and as such new ways must be developed to cope with these constraints. Traditional attitude control systems often use reaction wheels with gas thrusters which present challenges on a CubeSat. Many CubeSats use magnetic attitude control which uses the Earth's magnetic field to torque the satellite into the proper orientation. Magnetic attitude control systems fall into two main categories: active and passive. Active control is often achieved by running current through a coil to produce a dipole moment, while passive control uses the dipole moment from permanent magnets that consume no power. This thesis describes a system that uses twelve hard magnetic torquers along with a magnetometer. The torquers only consume current when their dipole moment is flipped, thereby significantly reducing power requirements compared with traditional active control. The main focus of this thesis is on the design, testing and fabrication of CubeSat hardware and software in preparation for launch.
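    The physics all magnetic attitude control relies on is the torque on a dipole in the geomagnetic field, τ = m × B. The sketch below just evaluates that cross product; the dipole moment and field values are made-up examples, not the thesis hardware's numbers.

    ```python
    # Illustrative sketch of the physics behind magnetic attitude control:
    # the torque on a magnetic dipole m in the Earth's field B is t = m x B.
    # The numeric values are assumed examples, not the thesis hardware's.

    def cross(a, b):
        """3-D vector cross product."""
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    m = (0.2, 0.0, 0.0)      # torquer dipole moment, A*m^2 (assumed)
    B = (0.0, 3.0e-5, 0.0)   # Earth's field in the body frame, tesla (typical LEO magnitude)
    torque = cross(m, B)
    print(torque)  # torque acts about the +z axis
    ```

    Because torque is always perpendicular to B, a purely magnetic system cannot torque about the local field direction, which is one reason the thesis uses twelve torquers in different orientations.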

  6. Plug-and-play design approach to smart harness for modular small satellites

    NASA Astrophysics Data System (ADS)

    Mughal, M. Rizwan; Ali, Anwar; Reyneri, Leonardo M.

    2014-02-01

    A typical satellite involves many different components that vary in bandwidth demand. Sensors that require a very low data rate may reside on a simple two- or three-wire interface such as I2C, SPI, etc. Complex sensors that require high data rate and bandwidth may reside on an optical interface. The AraMiS architecture is an enhanced capability architecture with different satellite configurations. Although keeping the low-cost and COTS approach of CubeSats, it extends the modularity concept as it also targets different satellite shapes and sizes. But modularity moves beyond the mechanical structure: the tiles also have thermo-mechanical, harness and signal-processing functionalities. Further modularizing the system, every tile can also host a variable number of small sensors, actuators or payloads, connected using a plug-and-play approach. Every subsystem is housed in a small daughter board and is supplied, by the main tile, with power and data distribution functions, power and data harness, mechanical support and is attached and interconnected with space-grade spring-loaded connectors. The tile software is also modular and allows a quick adaptation to specific subsystems. The basic software for the CPU is properly hardened to guarantee high level of radiation tolerance at very low cost.

  7. Analytical Design of Evolvable Software for High-Assurance Computing

    DTIC Science & Technology

    2001-02-14

    Mathematical expression for the Total Sum of Squares which measures the variability that results when all values are treated as a combined sample coming from...primarily interested in background on software design and high-assurance computing, research in software architecture generation or evaluation...respectively. Those readers solely interested in the validation of a software design approach should at the minimum read Chapter 6 followed by Chapter

  8. Transforming Aggregate Object-Oriented Formal Specifications to Code

    DTIC Science & Technology

    1999-03-01

    integration issues associated with a formal-based software transformation system, such as the source specification, the problem space architecture, design architecture... design transforms, and target software transforms. Software is critical in today's Air Force, yet its specification, design, and development

  9. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for a software specification language and a database verifier are presented.

  10. ASTRID© - Advanced Solar Tubular ReceIver Design: A powerful tool for receiver design and optimization

    NASA Astrophysics Data System (ADS)

    Frantz, Cathy; Fritsch, Andreas; Uhlig, Ralf

    2017-06-01

    In solar tower power plants the receiver is one of the critical components. It converts the solar radiation into heat and must withstand high heat flux densities and high daily or even hourly gradients (due to the passage of clouds). For this reason, the challenge during receiver design is to find a reasonable compromise between receiver efficiency, reliability, lifetime and cost. There is a strong interaction between the heliostat field, the receiver and the heat transfer fluid; therefore, a proper receiver design needs to consider these components within the receiver optimization. There are several design and optimization tools for receivers, but most of them focus only on the receiver, ignoring the heliostat field and other parts of the plant. In recent years DLR developed the ASTRID code for tubular receiver concept simulation. The code comprises both a high-detail and a low-detail model. The low-detail model utilizes a number of simplifications which allow the user to screen a high number of receiver concepts for optimization purposes. The high-detail model uses an FE model and is able to compute local absorber and salt temperatures with high accuracy. One key strength of the ASTRID code is its interface to a ray-tracing software which simulates realistic heat flux distributions on the receiver surface. The results generated by the ASTRID code have been validated by CFD simulations and measurement data.

  11. Software reengineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III

    1991-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost-effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost-effective manner than using older technologies. A beta version of the environment was released in March 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.

  12. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
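    The run-time monitoring idea mentioned in the survey can be sketched generically. This is a minimal illustration of the concept, not any specific satellite's monitoring system: a wrapper checks an invariant on each output and substitutes a safe fallback when the invariant is violated, the way fault-tolerant designs catch errors that escape verification.

    ```python
    # Minimal, generic sketch of run-time monitoring (an illustration of the
    # concept, not a flight system): outputs violating an invariant are
    # replaced by a safe fallback value as a recovery action.

    def monitored(func, invariant, fallback):
        """Wrap func so outputs violating the invariant are replaced."""
        def wrapper(*args):
            result = func(*args)
            if not invariant(result):
                return fallback  # recovery action instead of a bad output
            return result
        return wrapper

    # Example: a (possibly faulty) attitude estimate must stay in [-180, 180] degrees.
    estimate = monitored(lambda x: x * 2, lambda r: -180 <= r <= 180, 0)
    print(estimate(90), estimate(120))  # 180 passes the check; 240 triggers the fallback
    ```

    In a flight context the fallback would typically be a safe-mode action or a vote among redundant computations rather than a constant, but the monitor-and-recover structure is the same.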

  13. Design Optimization Toolkit: Users' Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilo Valentin, Miguel Alejandro

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface that allows easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  14. Review & Peer Review of “Parameters for Properly Designed and Operated Flares” Documents

    EPA Pesticide Factsheets

    This page contains two 2012 memoranda on the review of EPA's parameters for properly designed and operated flares. One details the process of peer review, and the other provides background information and specific charge questions to the panel.

  15. ATM Technology Demonstration-1 Phase II Boeing Configurable Graphical Display (CGD) Software Design Description

    NASA Technical Reports Server (NTRS)

    Wilber, George F.

    2017-01-01

    This Software Description Document (SDD) captures the design for developing the Flight Interval Management (FIM) system Configurable Graphics Display (CGD) software. Specifically this SDD describes aspects of the Boeing CGD software and the surrounding context and interfaces. It does not describe the Honeywell components of the CGD system. The SDD provides the system overview, architectural design, and detailed design with all the necessary information to implement the Boeing components of the CGD software and integrate them into the CGD subsystem within the larger FIM system. Overall system and CGD system-level requirements are derived from the CGD SRS (in turn derived from the Boeing System Requirements Design Document (SRDD)). Display and look-and-feel requirements are derived from Human Machine Interface (HMI) design documents and working group recommendations. This Boeing CGD SDD is required to support the upcoming Critical Design Review (CDR).

  16. Software Tools for Battery Design | Transportation Research | NREL

    Science.gov Websites

    Software Tools for Battery Design. Under the Computer-Aided ... effort, these software tools help battery designers, developers, and manufacturers create affordable, high-performance lithium-ion (Li-ion) batteries for next-generation electric-drive vehicles (EDVs). An image of a simulation of a battery pack.

  17. Attributes and Behaviors of Performance-Centered Systems.

    ERIC Educational Resources Information Center

    Gery, Gloria

    1995-01-01

    Examines attributes, characteristics, and behaviors of performance-centered software packages that are emerging in the consumer software marketplace and compares them with large-scale systems software being designed by internal information systems staffs and vendors of large-scale software designed for financial, manufacturing, processing, and…

  18. Specifications for Thesaurus Software.

    ERIC Educational Resources Information Center

    Milstead, Jessica L.

    1991-01-01

    Presents specifications for software that is designed to support manual development and maintenance of information retrieval thesauri. Evaluation of existing software and design of custom software is discussed, requirements for integration with larger systems and for the user interface are described, and relationships among terms are discussed.…

  19. CosmoQuest: Better Citizen Science Through Education

    NASA Technical Reports Server (NTRS)

    Gay, P. L.; Lehan, C.; Bracey, G.; Yamani, A.; Francis, M.; Durrell, P.; Spivey, C.; Noel-Storr, J.; Buxner, S.; Cobb, W.; et al.

    2016-01-01

    In the modern era, NASA SMD missions and facilities are producing data at a rate too great for the science community to maximally utilize. While software can help, what is really needed is additional eyes, hands, and minds - help we can find in the form of citizen scientist volunteers. The CosmoQuest virtual research facility has demonstrated through published research results that classroom students and the public can, with proper training and support from Subject Matter Experts (SMEs), fill roles more traditionally filled by university students. The research question behind CosmoQuest's creation was simple: if students and the public are provided a properly scaffolded experience that mirrors that of researchers, will they come and perform as well as our students, and can they rise to become research collaborators? In creating CosmoQuest, we started with a core of citizen science portals, educational materials for both students and life-long learners, and collaboration areas. These three primary focuses mirror the research, courses, and collaboration spaces that form the foundation of a university department. We then went on to add the features that make a center stand out: seminars in the form of Google Hangouts on Air, planetarium content through our Science on the Half Sphere program, and even the chance to vicariously attend conferences through live blogging by our team members. With this design for a virtual research facility, the answer to our foundational question has been a resounding yes; the public can aid us in doing science provided they are properly trained. To meet the needs of our population we have developed four areas of engagement: research, education, media, and community.

  20. Recent mass spectrometry-based techniques and considerations for disulfide bond characterization in proteins.

    PubMed

    Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather

    2018-04-01

    Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and ensure proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in the protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. 
Graphical Abstract: This review of characterization methods for disulfide bonds in proteins focuses on three critical components: sample preparation, mass spectrometry data, and software tools.

  1. The Five 'R's' for Developing Trusted Software Frameworks to increase confidence in, and maximise reuse of, Open Source Software.

    NASA Astrophysics Data System (ADS)

    Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens

    2015-04-01

    Recent investments in HPC, cloud and petascale data stores have dramatically increased the scale and resolution at which earth science challenges can now be tackled. These new infrastructures are highly parallelised, and to fully utilise them and access the large volumes of earth science data now available, a new approach to software stack engineering needs to be developed. The size, complexity and cost of the new infrastructures mean any software deployed has to be reliable, trusted and reusable. Increasingly, software is available via open source repositories, but these usually only enable code to be discovered and downloaded. It is hard for a scientist to judge the suitability and quality of individual codes: rarely is there information on how and where codes can be run, what the critical dependencies are, and in particular, on the version requirements and licensing of the underlying software stack. A trusted software framework is proposed to enable reliable software to be discovered, accessed and then deployed on multiple hardware environments. More specifically, this framework will enable those who generate the software, and those who fund its development, to gain credit for the effort, IP, time and dollars spent, and will facilitate quantification of the impact of individual codes. For scientific users, the framework delivers reviewed and benchmarked scientific software with mechanisms to reproduce results. The trusted framework will have five separate but connected components: Register, Review, Reference, Run, and Repeat. 1) The Register component will facilitate discovery of relevant software from multiple open source code repositories. The registration process should include information about licensing and the hardware environments the code can run on, define appropriate validation (testing) procedures, and list the critical dependencies.
2) The Review component targets verification of the software, typically against a set of benchmark cases. This will be achieved by linking the code in the software framework to peer review forums such as Mozilla Science or appropriate journals (e.g. Geoscientific Model Development) to help users know which codes to trust. 3) Referencing will be accomplished by linking the software framework to services such as Figshare or ImpactStory that help disseminate and measure the impact of scientific research, including program code. 4) The Run component will draw on information supplied in the registration process, the benchmark cases described in the review, and other relevant information to instantiate the scientific code on the selected environment. 5) The Repeat component will tap into existing provenance workflow engines that automatically capture information relating to a particular run of the software, including identification of all input and output artefacts and of all elements and transactions within the workflow. The proposed trusted software framework will enable users to rapidly discover and access reliable code, reduce the time to deploy it, and greatly facilitate sharing, reuse and reinstallation of code. Properly designed, it could scale out to massively parallel systems and be accessed nationally and internationally for multiple use cases, including supercomputer centres, cloud facilities, and local computers.
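The registration metadata listed above (licensing, target hardware environments, validation procedure, critical dependencies) can be sketched as a simple record. The field names below are illustrative assumptions, not the framework's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class SoftwareRegistration:
    """Illustrative registration record for the proposed Register component."""
    name: str
    repository_url: str
    license: str
    hardware_environments: list          # e.g. ["HPC", "cloud", "workstation"]
    validation_procedure: str            # how the code should be validated
    dependencies: dict = field(default_factory=dict)  # package -> version requirement

    def is_deployable_on(self, environment: str) -> bool:
        # The Run component would only instantiate a code on an
        # environment that was declared at registration time.
        return environment in self.hardware_environments
```

A Run component could then consult `is_deployable_on` before attempting deployment, rejecting environments the registrant never validated.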

  2. General Mission Analysis Tool (GMAT) Acceptance Test Plan [Draft

    NASA Technical Reports Server (NTRS)

    Dove, Edwin; Hughes, Steve

    2007-01-01

The information presented in this Acceptance Test Plan document shows the current status of the General Mission Analysis Tool (GMAT). GMAT is a software system developed by NASA Goddard Space Flight Center (GSFC) in collaboration with the private sector. The GMAT development team continuously performs acceptance tests in order to verify that the software continues to operate properly after updates are made. The GMAT development team consists of NASA/GSFC Code 583 software developers, NASA/GSFC Code 595 analysts, and contractors of varying professions. GMAT was developed to provide a development approach that maintains involvement from the private sector and academia, encourages collaborative funding from multiple government agencies and the private sector, and promotes the transfer of technology from government funded research to the private sector. GMAT contains many capabilities, such as integrated formation flying modeling and MATLAB compatibility. The propagation capabilities in GMAT allow for fully coupled dynamics modeling of multiple spacecraft, in any flight regime. Other capabilities in GMAT include: user definable coordinate systems, 3-D graphics in any coordinate system GMAT can calculate, 2-D plots, branch commands, solvers, optimizers, GMAT functions, planetary ephemeris sources including DE405, DE200, SLP and analytic models, script events, impulsive and finite maneuver models, and many more. GMAT runs on Windows, Mac, and Linux platforms. Both the Graphical User Interface (GUI) and the GMAT engine were built and tested on all of the mentioned platforms. GMAT was designed for intuitive use from both the GUI and with an importable script language similar to that of MATLAB.
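Continuous acceptance testing of the kind described, re-verifying behaviour after each update, is often built around comparing current outputs to stored baselines within a numeric tolerance. The sketch below is a generic illustration of that idea, not GMAT's actual test driver:

```python
def within_tolerance(current, baseline, tol=1e-9):
    """Element-wise comparison of two sequences of numeric results."""
    if len(current) != len(baseline):
        return False
    return all(abs(c - b) <= tol for c, b in zip(current, baseline))

def run_acceptance_suite(cases):
    """cases: iterable of (name, compute_fn, baseline) triples.

    Each compute_fn re-runs one test scenario; a case fails when its
    fresh output drifts from the stored baseline. Returns failing names.
    """
    failures = []
    for name, compute, baseline in cases:
        if not within_tolerance(compute(), baseline):
            failures.append(name)
    return failures
```

In practice each `compute_fn` would invoke the system under test (for example via its script interface) and parse the resulting output files.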

  3. Software wizards to adjust keyboard and mouse settings for people with physical impairments

    PubMed Central

    Koester, Heidi; Simpson, Richard; Mankowski, Jennifer

    2013-01-01

    Context/objective This study describes research behind two software wizards that help users with physical impairments adjust their keyboard and mouse settings to meet their specific needs. The Keyboard Wizard and Pointing Wizard programs help ensure that keyboard and pointing devices are properly configured for an individual, and reconfigured as the user's needs change. We summarize four effectiveness studies and six usability studies. Methods Studies involved participants whose physical impairments affect their ability to use a keyboard and mouse. Effectiveness studies used an A-B-A design, with condition A using default Windows settings and condition B using wizard-recommended settings. Primary data were performance metrics for text entry and target acquisition. Usability studies asked participants to run through each wizard, with no outside guidance. Primary data were completion time, errors made, and user feedback. Results The wizards were effective at recommending new settings for users who needed them and not recommending them for users who did not. Sensitivity for StickyKeys, pointer speed, and object size algorithms was 100%. Specificity for StickyKeys and pointer speed was over 80%, and 50% for object size. For those who needed settings changes, the recommendations improved performance, with speed increases ranging from 9 to 59%. Accuracy improved significantly with the wizard recommendations, eliminating up to 100% of errors. Users ran through the current wizard software in less than 6 minutes. Ease-of-use rating averaged over 4.5 on a scale of 1 to 5. Conclusion The wizards are a simple yet effective way of adjusting Windows to accommodate physical impairments. PMID:23820146
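The sensitivity and specificity figures quoted for the wizards' recommendation algorithms follow the standard confusion-matrix definitions; a minimal computation:

```python
def sensitivity(true_pos, false_neg):
    """Fraction of users who needed a settings change that the wizard flagged."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of users who did not need a change that the wizard left alone."""
    return true_neg / (true_neg + false_pos)
```

A sensitivity of 100% means no user who needed a change was missed; the 50% specificity reported for object size means half of the users who did not need that change were still offered it.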

  4. Introducing high performance distributed logging service for ACS

    NASA Astrophysics Data System (ADS)

    Avarias, Jorge A.; López, Joao S.; Maureira, Cristián; Sommer, Heiko; Chiozzi, Gianluca

    2010-07-01

The ALMA Common Software (ACS) is a software framework that provides the infrastructure for the Atacama Large Millimeter Array and other projects. ACS, based on CORBA, offers basic services and common design patterns for distributed software. Every properly built system needs to be able to log status and error information. Logging in a single computer scenario can be as easy as using fprintf statements. However, a distributed system must provide a way to centralize all logging data in a single place without overloading the network or complicating the applications. ACS provides a complete logging service infrastructure in which every log has an associated priority and timestamp, allowing filtering at different levels of the system (application, service and clients). Currently the ACS logging service uses an implementation of the CORBA Telecom Log Service in a customized way, using only a minimal subset of the features provided by the standard. The most relevant feature used by ACS is the ability to treat the logs as event data that gets distributed over the network in a publisher-subscriber paradigm. For this purpose the CORBA Notification Service, which is resource intensive, is used. On the other hand, the Data Distribution Service (DDS) provides an alternative standard for publisher-subscriber communication for real-time systems, offering better performance and featuring decentralized message processing. This document describes how the new high performance logging service of ACS has been modeled and developed using DDS, replacing the Telecom Log Service. Benefits and drawbacks are analyzed, and a benchmark is presented comparing the two implementations.
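The publisher-subscriber distribution of prioritized, timestamped logs described above can be illustrated with a small in-process sketch. Real CORBA Notification Service or DDS APIs are considerably more involved; the class and method names here are illustrative only:

```python
import time

class LogBus:
    """Minimal in-process stand-in for a publisher-subscriber log channel."""

    def __init__(self):
        self.subscribers = []  # list of (min_priority, callback) pairs

    def subscribe(self, min_priority, callback):
        # Each client registers the lowest priority it cares about,
        # mimicking per-client log-level filtering.
        self.subscribers.append((min_priority, callback))

    def publish(self, priority, message):
        # Every log carries a priority and a timestamp, as in ACS.
        record = {"priority": priority, "timestamp": time.time(), "message": message}
        for min_priority, callback in self.subscribers:
            if priority >= min_priority:
                callback(record)
```

The point of the pattern is that publishers never know who is listening; adding an archiver or a GUI console is just one more `subscribe` call.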

  5. SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE

    NASA Technical Reports Server (NTRS)

    Kleine, H.

    1994-01-01

Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing is attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology.
The SDDL syntax consists of keywords to invoke design structures and a collection of directives which control processor actions. The designer has complete control over the choice of keywords, commanding the capabilities of the processor in a way which is best suited to communicating the intent of the design. The SDDL processor translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, thereby freeing the designer's energy for the creative effort. Document formatting includes graphical highlighting of structure logic, accentuation of structure escapes and module invocations, logic error detection, and special handling of title pages and text segments. The SDDL generated document contains software design summary information including module invocation hierarchy, module cross reference, and cross reference tables of user selected words or phrases appearing in the document. The basic forms of the methodology are module and block structures and the module invocation statement. A design is stated in terms of modules that represent problem abstractions which are complete and independent enough to be treated as separate problem entities. Blocks are lower-level structures used to build the modules. Both kinds of structures may have an initiator part, a terminator part, an escape segment, or a substructure. The SDDL processor is written in PASCAL for batch execution on a DEC VAX series computer under VMS. SDDL was developed in 1981 and last updated in 1984.
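One of the summaries mentioned above, the module invocation cross-reference, can be approximated by a single pass over keyword-marked design text. Since SDDL lets the designer choose the keywords, the MODULE and EXECUTE keywords below are only an assumed convention, not SDDL's fixed syntax:

```python
def invocation_cross_reference(design_lines, module_kw="MODULE", invoke_kw="EXECUTE"):
    """Map each module to the modules it invokes, from keyword-marked lines."""
    xref, current = {}, None
    for line in design_lines:
        words = line.split()
        if len(words) >= 2 and words[0] == module_kw:
            current = words[1]          # start of a new module structure
            xref[current] = []
        elif len(words) >= 2 and words[0] == invoke_kw and current:
            xref[current].append(words[1])  # module invocation statement
    return xref
```

The real processor also tracks block structures, escapes, and user-selected phrases, but the cross-reference table is essentially this kind of keyword-driven scan.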

  6. Drone Defense System Architecture for U.S. Navy Strategic Facilities

    DTIC Science & Technology

    2017-09-01

evaluation and weapons assignment (TEWA) to properly address threats. This report follows a systems engineering process to develop a software architecture... C-UAS requires a central system to connect these new and existing systems. The central system uses data fusion and threat evaluation and weapons assignment...

  7. Opening up the Collaborative Problem-Solving Process to Solvers

    ERIC Educational Resources Information Center

    Robison, Tyler

    2013-01-01

    In software systems, having features of openness means that some of the internal components of the system are made available for examination by users. Researchers have looked at different effects of open systems a great deal in the area of educational technology, but also in areas outside of education. Properly used, openness has the potential to…

  8. Software Technology for Adaptable, Reliable Systems (STARS): UUS40 - Risk-Reduction Reasoning-Based Development Paradigm Tailored to Navy C2 Systems

    DTIC Science & Technology

    1991-07-30

Management reviews, engineering and WBS - Spiral 0-5... Risk Management Planning - Spiral 0-5... Proper initial planning - Spiral 0-1... Reusability issues for trusted systems are associated closely with maintenance issues. Reuse theory and practice for highly trusted systems will require

  9. Storage system software solutions for high-end user needs

    NASA Technical Reports Server (NTRS)

    Hogan, Carole B.

    1992-01-01

    Today's high-end storage user is one that requires rapid access to a reliable terabyte-capacity storage system running in a distributed environment. This paper discusses conventional storage system software and concludes that this software, designed for other purposes, cannot meet high-end storage requirements. The paper also reviews the philosophy and design of evolving storage system software. It concludes that this new software, designed with high-end requirements in mind, provides the potential for solving not only the storage needs of today but those of the foreseeable future as well.

  10. Learning & Personality Types: A Case Study of a Software Design Course

    ERIC Educational Resources Information Center

    Ahmed, Faheem; Campbell, Piers; Jaffar, Ahmad; Alkobaisi, Shayma; Campbell, Julie

    2010-01-01

The software industry has continued to grow over the past decade and there is now a need to provide education and hands-on training to students in the various phases of the software life cycle. Software design is one of the vital phases of the software development cycle. Psychological theories assert that not everybody is fit for all kinds of tasks as…

  11. 78 FR 40149 - Scientific Information Request on Chronic Urinary Retention (CUR) Treatment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-03

    ... improve the quality of this review. AHRQ is conducting this comparative effectiveness review pursuant to..., study period, design, methodology, indication and diagnosis, proper use instructions, inclusion and... study number, the study period, design, methodology, indication and diagnosis, proper use instructions...

  12. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  13. Exploration of freely available web-interfaces for comparative homology modelling of microbial proteins

    PubMed Central

    Nema, Vijay; Pal, Sudhir Kumar

    2013-01-01

Aim: This study was conducted to find the best suited freely available software for modelling of proteins, using a few sample proteins. The proteins used ranged from small to large in size and had crystal structures available for the purpose of benchmarking. Key players like Phyre2, Swiss-Model, CPHmodels-3.0, Homer, (PS)2, (PS)2-V2, and Modweb were used for the comparison and model generation. Results: The benchmarking process was done for four proteins, Icl, InhA, and KatG of Mycobacterium tuberculosis and RpoB of Thermus thermophilus, to identify the most suitable software. Parameters compared during the analysis gave relatively better values for Phyre2 and Swiss-Model. Conclusion: This comparative study showed that Phyre2 and Swiss-Model make good models of small and large proteins as compared to the other screened software, which was also good but often not efficient in providing a full-length and properly folded structure. PMID:24023424

  14. Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform

    NASA Astrophysics Data System (ADS)

    Liu, H. S.; Liao, H. M.

    2015-08-01

A direct geo-referencing system uses remote sensing technology to quickly capture images, GPS tracks, and camera position. These data allow the construction of large volumes of images with geographic coordinates, so that users can take measurements directly on the images. In order to calculate positioning properly, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate imagery, coordinates and camera position; however, POS is very expensive, and users cannot use the results immediately because the position information is not embedded into the images. For reasons of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate and synchronize the signals, we calculate positioning with the open source library OpenCV. Finally, we use the open source panorama browser Panini and integrate everything into the open source GIS software Quantum GIS. In this way a complete data collection and processing system can be constructed.
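Synchronising the sensors then amounts to mapping each image's capture time onto the GPS track. A minimal linear-interpolation sketch, assuming timestamps in seconds and positions as (lat, lon) pairs (a real system would also handle camera orientation):

```python
def position_at(track, t):
    """Linearly interpolate a [(timestamp, lat, lon), ...] track at time t.

    Times outside the track are clamped to the first or last fix.
    """
    track = sorted(track)
    if t <= track[0][0]:
        return track[0][1:]
    if t >= track[-1][0]:
        return track[-1][1:]
    for (t0, la0, lo0), (t1, la1, lo1) in zip(track, track[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)          # fraction of the segment elapsed
            return (la0 + f * (la1 - la0), lo0 + f * (lo1 - lo0))
```

Each image's shutter timestamp, as recorded by the microcontroller, would be passed through a function like this to obtain the coordinates embedded with the image.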

  15. Designing for Change: Minimizing the Impact of Changing Requirements in the Later Stages of a Spaceflight Software Project

    NASA Technical Reports Server (NTRS)

    Allen, B. Danette

    1998-01-01

    In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.
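One common family of techniques for absorbing late requirements changes is to move volatile decisions out of control logic and into data tables, so that a changed requirement edits a table entry rather than code. The sketch below is a generic, hypothetical illustration (the opcodes and command names are invented, not taken from the LITE systems):

```python
# Volatile requirement: which command each uplinked opcode triggers.
# A late requirements change edits this table, not the dispatch logic.
COMMAND_TABLE = {
    0x01: ("LASER_ON", 2.0),        # (command name, timeout in seconds)
    0x02: ("LASER_OFF", 1.0),
    0x03: ("DUMP_TELEMETRY", 5.0),
}

def dispatch(opcode):
    """Look up a command; unknown opcodes are rejected, not crashed on."""
    entry = COMMAND_TABLE.get(opcode)
    if entry is None:
        return ("REJECT", 0.0)
    return entry
```

Because the dispatch logic never enumerates commands itself, adding, renaming, or retiming a command late in the project touches only the table.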

  16. Warfighting Concepts to Future Weapon System Designs (WARCON)

    DTIC Science & Technology

    2003-09-12

Software design documents... A Material List... Cost information that may support... Final Engineering Process Maps... The document may include the design of the system as derived from the SRD (engineering design, software development). MTS Technologies, Inc. ... It is important to establish a standard, formal document early in the development phase... software engineers produce the vision of the design effort. As

  17. Software Infrastructure for Computer-aided Drug Discovery and Development, a Practical Example with Guidelines.

    PubMed

    Moretti, Loris; Sartori, Luca

    2016-09-01

In the field of Computer-Aided Drug Discovery and Development (CADDD), a proper software infrastructure is essential for everyday investigations. The creation of such an environment should be carefully planned and implemented with certain features in order to be productive and efficient. Here we describe a solution that integrates standard computational services into a functional unit that empowers modelling applications for drug discovery. This system allows users with various levels of expertise to run in silico experiments automatically, without the burden of file formatting for different software, while managing the actual computation, keeping track of the activities, and graphically rendering the structural outcomes. To showcase the potential of this approach, the performance of five different docking programs on an HIV-1 protease test set is presented. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. A software platform for the analysis of dermatology images

    NASA Astrophysics Data System (ADS)

    Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon

    2017-11-01

The purpose of this paper is to present a software platform, developed in the Python programming environment, that can be used for the processing and analysis of dermatology images. The platform provides the capability of reading a file that contains a dermatology image and supports image formats such as Windows bitmaps, JPEG, JPEG2000, portable network graphics, and TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image. The automated selection of a ROI includes filtering for smoothing the image and thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool for a dermatologist. Furthermore, it could be used to classify images from other anatomical parts, such as breast or lung, after proper re-training of the classification algorithms.
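The automated ROI selection described (smoothing followed by thresholding) can be sketched in pure Python on a 2D list of grey levels; a production platform would use an image-processing library instead:

```python
def mean_filter(image):
    """3x3 moving-average smoothing over a 2D list of grey levels."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Average the pixel with its in-bounds neighbours.
            vals = [image[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out

def threshold(image, level):
    """Binary ROI mask: 1 where the (smoothed) intensity exceeds the level."""
    return [[1 if v > level else 0 for v in row] for row in image]
```

Chaining the two, `threshold(mean_filter(img), level)`, yields a candidate ROI mask; the smoothing step suppresses isolated noisy pixels before the cut is applied.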

  19. PHM Enabled Autonomous Propellant Loading Operations

    NASA Technical Reports Server (NTRS)

    Walker, Mark; Figueroa, Fernando

    2017-01-01

    The utility of Prognostics and Health Management (PHM) software capability applied to Autonomous Operations (AO) remains an active research area within aerospace applications. The ability to gain insight into which assets and subsystems are functioning properly, along with the derivation of confident predictions concerning future ability, reliability, and availability, are important enablers for making sound mission planning decisions. When coupled with software that fully supports mission planning and execution, an integrated solution can be developed that leverages state assessment and estimation for the purposes of delivering autonomous operations. The authors have been applying this integrated, model-based approach to the autonomous loading of cryogenic spacecraft propellants at Kennedy Space Center.

  20. The Applicability of Proposed Object-Oriented Metrics to Developer Feedback in Time to Impact Development

    NASA Technical Reports Server (NTRS)

    Neal, Ralph D.

    1996-01-01

This paper looks closely at each of the software metrics generated by the McCabe Object-Oriented Tool(TM) and its ability to convey timely information to developers. The metrics are examined for meaningfulness in terms of the scale assignable to the metric by the rules of measurement theory and the software dimension being measured. Recommendations are made as to the proper use of each metric and its ability to influence development at an early stage. The metrics of the McCabe Object-Oriented Tool(TM) set were selected because of the tool's use in a couple of NASA IV&V projects.

  1. IMS software developments for the detection of chemical warfare agent

    NASA Technical Reports Server (NTRS)

    Klepel, ST.; Graefenhain, U.; Lippe, R.; Stach, J.; Starrock, V.

    1995-01-01

Interference compounds like gasoline, diesel, or smoke from burning wood or fuel are present in common battlefield situations. These compounds can cause detectors to respond with a false positive or interfere with the detector's ability to respond to target compounds such as chemical warfare agents. To ensure proper response of the ion mobility spectrometer to chemical warfare agents, two special software packages were developed and incorporated into the Bruker RAID-1. The programs suppress interfering signals caused by car exhaust or smoke gases resulting from burning materials, and correct for the influence of variable sample gas humidity, which is important for the detection and quantification of blister agents like mustard gas or lewisite.

  2. Taming the Viper: Software Upgrade for VFAUser and Viper

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DORIN,RANDALL T.; MOSER III,JOHN C.

    2000-08-08

This report describes the procedure and properties of the software upgrade for the Vibration Performance Recorder. The upgrade will check the 20 memory cards for proper read/write operation. The upgrade was successfully installed and uploaded into the Viper and the field laptop. The memory checking routine must run overnight to complete the test, although the laptop need only be connected to the Viper unit until the downloading routine is finished. The routine has limited ability to recognize incomplete or corrupt header and footer files. The routine requires 400 Megabytes of free hard disk space. There is one minor technical flaw, detailed in the conclusion.

  3. The KASE approach to domain-specific software systems

    NASA Technical Reports Server (NTRS)

    Bhansali, Sanjay; Nii, H. Penny

    1992-01-01

Designing software systems, like all design activities, is a knowledge-intensive task. Several studies have found that the predominant cause of failures among system designers is lack of knowledge: knowledge about the application domain, knowledge about design schemes, knowledge about design processes, etc. The goal of domain-specific software design systems is to explicitly represent knowledge relevant to a class of applications and use it to partially or completely automate various aspects of designing systems within that domain. The hope is that this would reduce the intellectual burden on the human designers and lead to more efficient software development. In this paper, we present a domain-specific system built on top of KASE, a knowledge-assisted software engineering environment being developed at the Stanford Knowledge Systems Laboratory. We introduce the main ideas underlying the construction of domain-specific systems within KASE, illustrate the application of the idea in the synthesis of a system for tracking aircraft from radar signals, and discuss some of the issues in constructing domain-specific systems.

  4. Automatic Requirements Specification Extraction from Natural Language (ARSENAL)

    DTIC Science & Technology

    2014-10-01

designers, implementers) involved in the design of software systems. However, natural language descriptions can be informal, incomplete, imprecise... communication of technical descriptions between the various stakeholders (e.g., customers, designers, implementers) involved in the design of software systems... the accuracy of the natural language processing stage, the degree of automation, and robustness to noise... Software systems operate in

  5. How Well Can Existing Software Support Processes Accomplish Sustainment of a Non-Developmental Item-Based Acquisition Strategy

    DTIC Science & Technology

    2017-04-06

Research Hypothesis... Research Design... user community and of accommodating advancing software applications by the vendors. Research Design: My approach to this project was to conduct... design descriptions, requirements specifications, test documentation, interface requirement specifications, product specifications, and software

  6. Design Features of Pedagogically-Sound Software in Mathematics.

    ERIC Educational Resources Information Center

    Haase, Howard; And Others

    Weaknesses in educational software currently available in the domain of mathematics are discussed. A technique that was used for the design and production of mathematics software aimed at improving problem-solving skills which combines sound pedagogy and innovative programming is presented. To illustrate the design portion of this technique, a…

  7. User-Centered Design Guidelines for Collaborative Software for Intelligence Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Endert, Alexander

    In this position paper we discuss the necessity of using User-Centered Design (UCD) methods in order to design collaborative software for the intelligence community. We discuss a number of studies of collaboration in the intelligence community and use this information to provide some guidelines for collaboration software.

  8. Software design studies emphasizing Project LOGOS

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results of a research project on the development of computer software are presented. Research funds of $200,000 were expended over a three year period for software design and projects in connection with Project LOGOS (computer-aided design and certification of computing systems). Abstracts of theses prepared during the project are provided.

  9. Tools Lighten Designs, Maintain Structural Integrity

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Collier Research Corporation of Hampton, Virginia, licensed software developed at Langley Research Center to reduce design weight through the use of composite materials. The first license of NASA-developed software, it has now been used in everything from designing next-generation cargo containers, to airframes, rocket engines, ship hulls, and train bodies. The company now has sales of the NASA-derived software topping $4 million a year and has recently received several Small Business Innovation Research (SBIR) contracts to apply its software to nearly all aspects of the new Orion crew capsule design.

  10. Software Design Document SAF Workstation. Volume 1, Sections 1.0 - 2.4. 3.4.86

    DTIC Science & Technology

    1991-06-01

SIMNET Software Design Document for the SAF Workstation CSCI (CSCI 6). Volume 1 of 2, Sections 1.0 - 2.4.3.4.86. June 1991. Approved for public release; distribution unlimited.

  11. Ethics in computer software design and development

    Treesearch

    Alan J. Thomson; Daniel L. Schmoldt

    2001-01-01

    Over the past 20 years, computer software has become integral and commonplace for operational and management tasks throughout agricultural and natural resource disciplines. During this software infusion, however, little thought has been afforded human impacts, both good and bad. This paper examines current ethical issues of software system design and development in...

  12. A new transiently chaotic flow with ellipsoid equilibria

    NASA Astrophysics Data System (ADS)

    Panahi, Shirin; Aram, Zainab; Jafari, Sajad; Pham, Viet-Thanh; Volos, Christos; Rajagopal, Karthikeyan

    2018-03-01

In this article, a simple autonomous transiently chaotic flow with cubic nonlinearities is proposed. This system exhibits some unusual features, such as having a surface of equilibria. We describe some dynamical properties and behaviours of this system in terms of eigenvalue structures, bifurcation diagrams, time series, and phase portraits. Various behaviours of this system, such as periodic and transiently chaotic dynamics, can be produced by setting the parameters to proper values. Our system belongs to a newly introduced category of transiently chaotic systems: systems with hidden attractors. The transiently chaotic behaviour of our proposed system has been implemented and tested with the OrCAD-PSpice software. We have found a proper qualitative similarity between circuit and simulation results.

  13. Coordination and organization of security software process for power information application environment

    NASA Astrophysics Data System (ADS)

    Wang, Qiang

    2017-09-01

As an important part of software engineering, the software process decides the success or failure of a software product. The design and development features of a security software process are discussed, as are the necessity and present significance of using such a process. The process for security software and its testing, coordinated with the functional software, is discussed in depth. The process includes requirement analysis, design, coding, debug and testing, submission, and maintenance. For each phase, the paper proposes subprocesses to support software security. As an example, the paper introduces the above process into the power information platform.
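The phase sequence described above (requirement analysis, design, coding, debug and testing, submission, maintenance) can be sketched as a gated pipeline in which each phase is paired with a security subprocess; all names below are illustrative, not taken from the paper.

```python
# Hypothetical sketch: a security-aware software process as an ordered
# pipeline of phases, each gated by a security subprocess. Phase and
# subprocess names are illustrative only.

PHASES = [
    ("requirement analysis", "threat modeling"),
    ("design",               "attack-surface review"),
    ("coding",               "static analysis"),
    ("debug and testing",    "penetration testing"),
    ("submission",           "release sign-off"),
    ("maintenance",          "vulnerability patching"),
]

def run_process(completed_security_steps):
    """Return the phases whose security gate has been passed, in order."""
    passed = []
    for phase, security_step in PHASES:
        if security_step not in completed_security_steps:
            break  # a phase may not proceed until its security gate passes
        passed.append(phase)
    return passed

print(run_process({"threat modeling", "attack-surface review"}))
# -> ['requirement analysis', 'design']
```

The point of the gate is that later phases cannot start while an earlier security subprocess is incomplete, which mirrors the paper's coordination of functional and security work.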

  14. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  15. RARtool: A MATLAB Software Package for Designing Response-Adaptive Randomized Clinical Trials with Time-to-Event Outcomes.

    PubMed

    Ryeznik, Yevgen; Sverdlov, Oleksandr; Wong, Weng Kee

    2015-08-01

Response-adaptive randomization designs are becoming increasingly popular in clinical trial practice. In this paper, we present RARtool, user interface software developed in MATLAB for designing response-adaptive randomized comparative clinical trials with censored time-to-event outcomes. The RARtool software can compute different types of optimal treatment allocation designs, and it can simulate response-adaptive randomization procedures targeting selected optimal allocations. Through simulations, an investigator can assess design characteristics under a variety of experimental scenarios and select the best procedure for practical implementation. We illustrate the utility of our RARtool software by redesigning a survival trial from the literature.
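As one concrete example of response-adaptive randomization, the classical randomized play-the-winner urn can be simulated in a few lines. This is a generic textbook scheme for binary outcomes, not RARtool's procedures for censored time-to-event data.

```python
# Hypothetical sketch of the randomized play-the-winner urn, a classical
# response-adaptive randomization scheme: a success on an arm adds a ball
# for that arm, skewing future allocations toward the better treatment.
import random

def simulate_rpw(n_patients, p_success=(0.7, 0.4), seed=42):
    """Allocate n_patients to arms 0/1; return per-arm allocation counts."""
    rng = random.Random(seed)
    urn = [0, 1]                      # start with one ball per arm
    counts = [0, 0]
    for _ in range(n_patients):
        arm = rng.choice(urn)         # draw a ball -> assigned arm
        counts[arm] += 1
        success = rng.random() < p_success[arm]
        # success adds a ball for the same arm, failure for the other arm
        urn.append(arm if success else 1 - arm)
    return counts

counts = simulate_rpw(200)
print(counts)  # the better arm (index 0) tends to receive more patients
```

Running such simulations over many seeds and scenarios is exactly the kind of design assessment the abstract describes, although RARtool targets optimal allocations rather than an urn rule.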

  16. A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints.

    PubMed

    Sundharam, Sakthivel Manikandan; Navet, Nicolas; Altmeyer, Sebastian; Havet, Lionel

    2018-02-20

Model-Driven Engineering (MDE) is widely applied in industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he may neglect or give less weight to other engineering aspects (for instance, real-time software engineering or energy efficiency). This can leave some of the functional and non-functional requirements unmet. In this work, we present a co-design framework based on timing tolerance contracts to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation; software design, verified by a novel schedulability analysis; and run-time verification, by monitoring the execution of the models on the target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system.

  17. A Model-Driven Co-Design Framework for Fusing Control and Scheduling Viewpoints

    PubMed Central

    Navet, Nicolas; Havet, Lionel

    2018-01-01

Model-Driven Engineering (MDE) is widely applied in industry to develop new software functions and integrate them into the existing run-time environment of a Cyber-Physical System (CPS). The design of a software component involves designers from various viewpoints such as control theory, software engineering, safety, etc. In practice, while a designer from one discipline focuses on the core aspects of his field (for instance, a control engineer concentrates on designing a stable controller), he may neglect or give less weight to other engineering aspects (for instance, real-time software engineering or energy efficiency). This can leave some of the functional and non-functional requirements unmet. In this work, we present a co-design framework based on timing tolerance contracts to address such design gaps between control and real-time software engineering. The framework consists of three steps: controller design, verified by jitter margin analysis along with co-simulation; software design, verified by a novel schedulability analysis; and run-time verification, by monitoring the execution of the models on the target. This framework builds on CPAL (Cyber-Physical Action Language), an MDE design environment based on model interpretation, which enforces a timing-realistic behavior in simulation through timing and scheduling annotations. The application of our framework is exemplified in the design of an automotive cruise control system. PMID:29461489

  18. Geometric modeling for computer aided design

    NASA Technical Reports Server (NTRS)

    Schwing, James L.; Olariu, Stephen

    1995-01-01

The primary goal of this grant has been the design and implementation of software to be used in the conceptual design of aerospace vehicles, particularly focused on the elements of geometric design, graphical user interfaces, and the interaction of the multitude of software typically used in this engineering environment. This has resulted in the development of several analysis packages and design studies, including two major software systems currently used in the conceptual-level design of aerospace vehicles: SMART, the Solid Modeling Aerospace Research Tool, and EASIE, the Environment for Software Integration and Execution. Additional software tools were designed and implemented to address the needs of the engineer working in the conceptual design environment. SMART provides conceptual designers with a rapid prototyping capability and several engineering analysis capabilities. In addition, SMART has a carefully engineered user interface that makes it easy to learn and use. Finally, a number of specialty characteristics have been built into SMART which allow it to be used efficiently as a front-end geometry processor for other analysis packages. EASIE provides a set of interactive utilities that simplify the task of building and executing computer-aided design systems consisting of diverse, stand-alone analysis codes, streamlining the exchange of data between programs, reducing errors, and improving efficiency. EASIE provides both a methodology and a collection of software tools to ease the task of coordinating engineering design and analysis codes.

  19. Efficacy of a Newly Designed Cephalometric Analysis Software for McNamara Analysis in Comparison with Dolphin Software.

    PubMed

    Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad

    2015-01-01

Cephalometric norms of McNamara analysis have been studied in various populations due to their optimal efficiency. Dolphin cephalometric software greatly facilitates the conduct of this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English would greatly help Farsi-speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this diagnostic, descriptive study, 150 lateral cephalograms of normal occlusion individuals were selected in Mashhad and Qazvin, two major cities of Iran mainly populated with Fars ethnicity, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and the new software. The cephalometric software was designed using Microsoft Visual C++ in Windows XP. Measurements made with the new software were compared with those of Dolphin software on both series of cephalograms. The validity and reliability were tested using the intra-class correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin. This confirms the validity and optimal efficacy of the newly designed software (ICC 0.570-1.0). According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning and assessment of treatment outcome.
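The intra-class correlation used to compare the two programs can be computed from first principles. The sketch below implements a generic two-way consistency ICC(3,1) for two measurement series; it is a standard formula, not the study's actual code.

```python
# Hypothetical pure-Python ICC(3,1) (two-way mixed, single measures,
# consistency) for two measurement series, e.g. the same cephalometric
# variable measured by two software packages.

def icc_consistency(x, y):
    """ICC(3,1) for two raters/instruments measuring the same subjects."""
    n, k = len(x), 2
    grand = sum(x + y) / (n * k)
    subj_means = [(a + b) / k for a, b in zip(x, y)]
    rater_means = [sum(x) / n, sum(y) / n]
    ss_total = sum((v - grand) ** 2 for v in x + y)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_rater = n * sum((m - grand) ** 2 for m in rater_means)
    ms_subj = ss_subj / (n - 1)
    ms_error = (ss_total - ss_subj - ss_rater) / ((n - 1) * (k - 1))
    return (ms_subj - ms_error) / (ms_subj + (k - 1) * ms_error)

# Identical series agree perfectly; a constant offset does not reduce
# consistency (it would reduce absolute agreement, a different ICC form).
print(icc_consistency([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # -> 1.0
print(icc_consistency([1.0, 2.0, 3.0], [2.0, 3.0, 4.0]))  # -> 1.0
```

The consistency form is the natural choice here because a systematic calibration offset between two programs should not, by itself, count against their agreement in rank and spacing.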

  20. The National Health Insurance Scheme (NHIS): a survey of knowledge and opinions of Nigerian dentists' in Lagos.

    PubMed

    Adeniyi, A A; Onajole, A T

    2010-03-01

This study was designed to assess the knowledge and perceptions of Nigerian dentists regarding the National Health Insurance Scheme (NHIS). A cross-sectional descriptive survey was conducted among 250 dentists employed in private and public dental clinics in Lagos State, Nigeria. The survey instrument was a self-administered questionnaire designed to assess their knowledge of and attitudes towards the scheme. Data analysis was done using the Epi-Info statistical software (version 6.04). Statistical tools used included measures of central tendency, frequency distribution and the chi-square test. A total of 216 dentists (a response rate of 82.4%) participated in this study. Most respondents, 132 (61.1%), had a fair knowledge of the NHIS, while 22 (10.2%) and 62 (28.7%) had poor and good knowledge respectively. The majority (70.4%) viewed the NHIS as a good idea that will succeed if properly implemented. Most (76.6%) respondents also believed that the scheme will improve access to oral health services, affordability of services (71.4%), availability of services (68.3%) and recognition of dentistry as a profession (62.4%). Most of the respondents (66.2%) considered oral health care not properly positioned in the NHIS, and 154 respondents (74.4%) found the current position of oral health in the NHIS unacceptable. A good number of the respondents (77.3%) would like dentistry to operate at the primary care level in the NHIS. The majority of the dentists involved in this study had some knowledge of the NHIS, were generally positively disposed towards the scheme, and viewed it as a good idea.

  1. Design and implementation of Skype USB user gateway software

    NASA Astrophysics Data System (ADS)

    Qi, Yang

    2017-08-01

With the widespread application of VoIP, clients with private protocols have become more and more popular, and Skype is one of the representatives. How to connect Skype to the PSTN using only the Skype client has become a topic of growing interest. This paper describes the design and implementation of software for a USB user gateway; with it, a Skype user can communicate freely with PSTN phones. A finite state machine (FSM) is designed as the core of the software, and Skype control is separated from the USB gateway control. In this way, the communication becomes more flexible and efficient. In actual user testing, the software obtained good results.
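A table-driven finite state machine of the kind the abstract describes might look like the following; the states and events are invented for illustration, as the paper's actual FSM is not reproduced here.

```python
# Hypothetical table-driven FSM for a Skype/PSTN gateway's call control.
# States and events are illustrative; the paper's real FSM is not shown.

TRANSITIONS = {
    ("idle",    "offhook"): "dialing",
    ("dialing", "connect"): "in_call",
    ("dialing", "onhook"):  "idle",
    ("in_call", "onhook"):  "idle",
}

class CallFSM:
    def __init__(self):
        self.state = "idle"

    def handle(self, event):
        # Unknown (state, event) pairs leave the state unchanged.
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

fsm = CallFSM()
for ev in ["offhook", "connect", "onhook"]:
    print(ev, "->", fsm.handle(ev))
```

Keeping transitions in a table rather than nested conditionals is one plausible reading of why the FSM core makes the gateway "flexible and efficient": new call scenarios become new table rows, not new code paths.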

  2. Programming Language Software For Graphics Applications

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  3. 78 FR 52929 - Scientific Information Request on Imaging Tests for the Diagnosis and Staging of Pancreatic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... improve the quality of this review. AHRQ is conducting this comparative effectiveness review pursuant to..., study period, design, methodology, indication and diagnosis, proper use instructions, inclusion and... including a study number, the study period, design, methodology, indication and diagnosis, proper use...

  4. Ideas in Practice (3): A Simulated Laboratory Experience in Digital Design.

    ERIC Educational Resources Information Center

    Cleaver, Thomas G.

    1988-01-01

    Gives an example of the use of a simplified logic simulator in a logic design course. Discusses some problems in logic design classes, commercially available software, and software problems. Describes computer-aided engineering (CAE) software. Lists 14 experiments in the simulated laboratory and presents students' evaluation of the course. (YP)

  5. ModSAF Software Architecture Design and Overview Document

    DTIC Science & Technology

    1993-12-20

ModSAF Software Architecture Design and Overview Document, Version 1.0, 20 December 1993 (AD-A282 740). Advanced Distributed Simulation Technology, Contract N61339-91-D-O00, Delivery Order (0021), ModSAF (CDRL A004).

  6. 49 CFR Appendix C to Part 236 - Safety Assurance Criteria and Processes

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... system (all its elements including hardware and software) must be designed to assure safe operation with... unsafe errors in the software due to human error in the software specification, design, or coding phases... (hardware or software, or both) are used in combination to ensure safety. If a common mode failure exists...

  7. Software for quantitative analysis of radiotherapy: overview, requirement analysis and design solutions.

    PubMed

    Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O

    2013-06-01

Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements on a software system for quantitative analysis of radiotherapy. Further we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as an open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
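The "algorithmic decoupling via dose iterator pattern" mentioned above can be illustrated schematically: an analysis algorithm consumes doses through an iterator and never depends on how the dose grid is stored. This Python sketch is an analogy to the C++ RTToolbox design, not its actual API.

```python
# Hypothetical illustration of the dose-iterator idea: an analysis
# algorithm (here, a maximum-dose statistic) is written against a plain
# iterator of dose values, decoupled from the grid's storage layout.

class FlatDoseGrid:
    """Dose grid stored as a flat list (one possible backend)."""
    def __init__(self, values):
        self._values = values
    def doses(self):
        yield from self._values

class SparseDoseGrid:
    """Dose grid stored sparsely as {voxel_index: dose} over n voxels."""
    def __init__(self, n, nonzero):
        self._n, self._nonzero = n, nonzero
    def doses(self):
        for i in range(self._n):
            yield self._nonzero.get(i, 0.0)

def max_dose(grid):
    # Works for any backend exposing .doses(); the analysis algorithm
    # never touches the underlying storage.
    return max(grid.doses())

print(max_dose(FlatDoseGrid([0.1, 2.5, 1.0])))  # -> 2.5
print(max_dose(SparseDoseGrid(4, {2: 3.0})))    # -> 3.0
```

The same decoupling lets new statistics (mean dose, dose-volume points) be added without knowing anything about grid formats, which is the requirement the paper's design solution addresses.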

  8. Incorporating Manual and Autonomous Code Generation

    NASA Technical Reports Server (NTRS)

    McComas, David

    1998-01-01

Code can be generated manually or by code-generation tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with automatic code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the application software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an automatic code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the manual-to-automatic code interface; (2) applying object-oriented design to the manual flight code; (3) implementing the object-oriented design in C.

  9. Feasibility study of an Integrated Program for Aerospace-vehicle Design (IPAD) system. Volume 5: Design of the IPAD system. Part 2: System design. Part 3: General purpose utilities, phase 1, task 2

    NASA Technical Reports Server (NTRS)

    Garrocq, C. A.; Hurley, M. J.

    1973-01-01

Viable designs are presented of various elements of the IPAD framework software, data base management system, and required new languages in relation to the capabilities of operating systems software. A thorough evaluation was made of the basic system functions to be provided by each software element, its requirements defined in the conceptual design, the operating system features affecting its design, and the engineering/design functions which it was intended to enhance.

  10. Computer software.

    PubMed

    Rosenthal, L E

    1986-10-01

Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  11. Automating Software Design Metrics.

    DTIC Science & Technology

    1984-02-01

High quality software is of interest to both the software engineering community and its users. ... the contributions of many other software engineering efforts, most notably [MCC 77] and [Boe 83b], which have defined and refined a framework for quantifying ... Software metrics can be useful within the context of an integrated software engineering environment.

  12. Control-structure-thermal interactions in analysis of lunar telescopes

    NASA Technical Reports Server (NTRS)

    Thompson, Roger C.

    1992-01-01

    The lunar telescope project was an excellent model for the CSTI study because a telescope is a very sensitive instrument, and thermal expansion or mechanical vibration of the mirror assemblies will rapidly degrade the resolution of the device. Consequently, the interactions are strongly coupled. The lunar surface experiences very large temperature variations that range from approximately -180 C to over 100 C. Although the optical assemblies of the telescopes will be well insulated, the temperature of the mirrors will inevitably fluctuate in a similar cycle, but of much smaller magnitude. In order to obtain images of high quality and clarity, allowable thermal deformations of any point on a mirror must be less than 1 micron. Initial estimates indicate that this corresponds to a temperature variation of much less than 1 deg through the thickness of the mirror. Therefore, a lunar telescope design will most probably include active thermal control, a means of controlling the shape of the mirrors, or a combination of both systems. Historically, the design of a complex vehicle was primarily a sequential process in which the basic structure was defined without concurrent detailed analyses or other subsystems. The basic configuration was then passed to the different teams responsible for each subsystem, and their task was to produce a workable solution without requiring major alterations to any principal components or subsystems. Consequently, the final design of the vehicle was not always the most efficient, owing to the fact that each subsystem design was partially constrained by the previous work. This procedure was necessary at the time because the analysis process was extremely time-consuming and had to be started over with each significant alteration of the vehicle. 
With recent advances in the power and capacity of small computers, and the parallel development of powerful software in structural, thermal, and control system analysis, it is now possible to produce very detailed analyses of intermediate designs in a much shorter period of time. The subsystems can thus be designed concurrently, and alterations in the overall design can be quickly adopted into each analysis; the design becomes an iterative process in which it is much easier to experiment with new ideas, configurations, and components. Concurrent engineering has the potential to produce efficient, highly capable designs because the effect of one subsystem on another can be assessed in much more detail at a very early point in the program. The research program consisted of several tasks: scale a prototype telescope assembly to a 1 m aperture, develop a model of the telescope assembly by using finite element (FEM) codes that are available on site, determine structural deflections of the mirror surfaces due to the temperature variations, develop a prototype control system to maintain the proper shape of the optical elements, and most important of all, demonstrate the concurrent engineering approach with this example. In addition, the software used for the finite element models and thermal analysis was relatively new within the Program Development Office and had yet to be applied to systems this large or complex; understanding the software and modifying it for use with this project was also required. The I-DEAS software by Structural Dynamics Research Corporation (SDRC) was used to build the finite element models, and TMG developed by Maya Heat Transfer Technologies, Ltd. (which runs as an I-DEAS module) was used for the thermal model calculations. All control system development was accomplished with MATRIXx by Integrated Systems, Inc.

  13. Integrated design optimization research and development in an industrial environment

    NASA Astrophysics Data System (ADS)

    Kumar, V.; German, Marjorie D.; Lee, S.-J.

    1989-04-01

An overview is given of a design optimization project that has been in progress at the GE Research and Development Center for the past few years. The objective of this project is to develop a methodology and a software system for design automation and optimization of structural/mechanical components and systems. The effort focuses on research and development issues and also on optimization applications that can be related to real-life industrial design problems. The overall technical approach is based on integration of numerical optimization techniques, finite element methods, CAE and software engineering, and artificial intelligence/expert systems (AI/ES) concepts. The role of each of these engineering technologies in the development of a unified design methodology is illustrated. A software system DESIGN-OPT has been developed for both size and shape optimization of structural components subjected to static as well as dynamic loadings. By integrating this software with an automatic mesh generator, a geometric modeler and an attribute specification computer code, a software module SHAPE-OPT has been developed for shape optimization. Details of these software packages together with their applications to some 2- and 3-dimensional design problems are described.

  14. Integrated design optimization research and development in an industrial environment

    NASA Technical Reports Server (NTRS)

    Kumar, V.; German, Marjorie D.; Lee, S.-J.

    1989-01-01

An overview is given of a design optimization project that has been in progress at the GE Research and Development Center for the past few years. The objective of this project is to develop a methodology and a software system for design automation and optimization of structural/mechanical components and systems. The effort focuses on research and development issues and also on optimization applications that can be related to real-life industrial design problems. The overall technical approach is based on integration of numerical optimization techniques, finite element methods, CAE and software engineering, and artificial intelligence/expert systems (AI/ES) concepts. The role of each of these engineering technologies in the development of a unified design methodology is illustrated. A software system DESIGN-OPT has been developed for both size and shape optimization of structural components subjected to static as well as dynamic loadings. By integrating this software with an automatic mesh generator, a geometric modeler and an attribute specification computer code, a software module SHAPE-OPT has been developed for shape optimization. Details of these software packages together with their applications to some 2- and 3-dimensional design problems are described.

  15. New High Proper Motion Stars from the Digitized Sky Survey. II. Northern Stars with 0.5" yr-1 < μ < 2.0" yr-1 at High Galactic Latitudes

    NASA Astrophysics Data System (ADS)

    Lépine, Sébastien; Shara, Michael M.; Rich, R. Michael

    2003-08-01

    In a continuation of our systematic search for high proper motion stars in the Digitized Sky Survey, we have completed the analysis of northern sky fields at Galactic latitudes above 25°. With the help of our SUPERBLINK software, a powerful automated blink comparator developed by us, we have identified 1146 stars in the magnitude range 8

  16. DEVELOPMENT OF A SOFTWARE DESIGN TOOL FOR HYBRID SOLAR-GEOTHERMAL HEAT PUMP SYSTEMS IN HEATING- AND COOLING-DOMINATED BUILDINGS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yavuzturk, C. C.; Chiasson, A. D.; Filburn, T. P.

This project provides an easy-to-use, menu-driven software tool for designing hybrid solar-geothermal heat pump systems (GHP) for both heating- and cooling-dominated buildings. No such design tool currently exists. In heating-dominated buildings, the design approach takes advantage of glazed solar collectors to effectively balance the annual thermal loads on the ground with renewable solar energy. In cooling-dominated climates, the design approach takes advantage of relatively low-cost, unglazed solar collectors as the heat rejecting component. The primary benefit of hybrid GHPs is the reduced initial cost of the ground heat exchanger (GHX). Furthermore, solar thermal collectors can be used to balance the ground loads over the annual cycle, thus making the GHX fully sustainable; in heating-dominated buildings, the hybrid energy source (i.e., solar) is renewable, in contrast to a typical fossil fuel boiler or electric resistance as the hybrid component; in cooling-dominated buildings, use of unglazed solar collectors as a heat rejecter allows for passive heat rejection, in contrast to a cooling tower that consumes a significant amount of energy to operate; and hybrid GHPs can expand the market by allowing a reduced GHX footprint in both heating- and cooling-dominated climates. The design tool allows for the straightforward design of innovative GHP systems that currently pose a significant design challenge. The project lays the foundations for proper and reliable design of hybrid GHP systems, overcoming a series of difficult and cumbersome steps without the use of a system simulation approach, and without an automated optimization scheme. As new technologies and design concepts emerge, sophisticated design tools and methodologies must accompany them and be made usable for practitioners. Lack of reliable design tools results in reluctance of practitioners to implement more complex systems.
A menu-driven software tool for the design of hybrid solar GHP systems is provided that is based on mathematically robust, validated models. An automated optimization tool is used to balance ground loads and is incorporated into the simulation engine. With knowledge of the building loads, thermal properties of the ground, the borehole heat exchanger configuration, the heat pump peak hourly and seasonal COP for heating and cooling, the critical heat pump design entering fluid temperature, and the thermal performance of a solar collector, the total GHX length can be calculated along with the area of a supplemental solar collector array and the corresponding reduced GHX length. An economic analysis module allows for the calculation of the lowest capital cost combination of solar collector area and GHX length. ACKNOWLEDGMENTS This project was funded by the United States Department of Energy DOE-DE-FOA-0000116, Recovery Act Geothermal Technologies Program: Ground Source Heat Pumps. The lead contractor, The University of Hartford, was supported by The University of Dayton and Oak Ridge National Laboratory. All funding and support for this project as well as contributions of graduate and undergraduate students from the contributing institutions are gratefully acknowledged.
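The load-balancing idea behind the hybrid design can be caricatured with a simple annual energy balance: size the solar array so that annual heat extraction from the ground matches annual heat rejection. The formula and numbers below are illustrative only and are not the tool's validated models.

```python
# Hypothetical back-of-envelope sizing: the solar collector array offsets
# the net annual ground-load imbalance. Not the design tool's actual model.

def collector_area_to_balance(annual_heating_kwh, annual_cooling_kwh,
                              collector_yield_kwh_per_m2):
    """Collector area (m^2) needed to offset net annual heat extraction."""
    imbalance = annual_heating_kwh - annual_cooling_kwh  # net extraction
    if imbalance <= 0:
        return 0.0  # cooling-dominated: glazed collectors add no benefit
    return imbalance / collector_yield_kwh_per_m2

print(collector_area_to_balance(60000, 20000, 400))  # -> 100.0
```

A balanced annual ground load is what makes the GHX "fully sustainable" in the abstract's sense: the ground temperature does not drift year over year, so the GHX can be sized for the balanced load rather than the raw heating load.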

  17. Recommendations for research design of telehealth studies.

    PubMed

    Chumbler, Neale R; Kobb, Rita; Brennan, David M; Rabinowitz, Terry

    2008-11-01

    Properly designed randomized controlled trials (RCTs) are the gold standard to use when examining the effectiveness of telehealth interventions on clinical outcomes. Some published telehealth studies have employed well-designed RCTs. However, such methods are not always feasible and practical in particular settings. This white paper addresses not only the need for properly designed RCTs, but also offers alternative research designs, such as quasi-experimental designs, and statistical techniques that can be employed to rigorously assess the effectiveness of telehealth studies. This paper further offers design and measurement recommendations aimed at and relevant to administrative decision-makers, policymakers, and practicing clinicians.

  18. The Environment for Application Software Integration and Execution (EASIE) version 1.0. Volume 1: Executive overview

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Davis, John S.

    1989-01-01

    The Environment for Application Software Integration and Execution (EASIE) provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. Volume 1, Executive Overview, gives an overview of the functions provided by EASIE and describes their use. Three operational design systems based upon the EASIE software are briefly described.

  19. An object-oriented software approach for a distributed human tracking motion system

    NASA Astrophysics Data System (ADS)

    Micucci, Daniela L.

    2003-06-01

    Tracking is a composite job involving the co-operation of autonomous activities which exploit a complex information model and rely on a distributed architecture. Both information and activities must be classified and related in several dimensions: abstraction levels (what is modelled and how information is processed); topology (where the modelled entities are); time (when entities exist); strategy (why something happens); responsibilities (who is in charge of processing the information). A proper Object-Oriented analysis and design approach leads to a modular architecture where information about conceptual entities is modelled at each abstraction level via classes and intra-level associations, whereas inter-level associations between classes model the abstraction process. Both information and computation are partitioned according to level-specific topological models. They are also placed in a temporal framework modelled by suitable abstractions. Domain-specific strategies control the execution of the computations. Computational components perform both intra-level processing and intra-level information conversion. The paper overviews the phases of the analysis and design process, presents major concepts at each abstraction level, and shows how the resulting design turns into a modular, flexible and adaptive architecture. Finally, the paper sketches how the conceptual architecture can be deployed into a concrete distributed architecture by relying on an experimental framework.

  20. Learning Chemistry from Good and (Why Not?) Problematic Results: Kinetics of the pH-Independent Hydrolysis of 4-Nitrophenyl Chloroformate

    ERIC Educational Resources Information Center

    El Seoud, Omar A.; Galgano, Paula D.; Arêas, Elizabeth P. G.; Moraes, Jamille M.

    2015-01-01

    The determination of kinetic data is central to reaction mechanism; science courses usually include experiments on chemical kinetics. Thanks to PC-controlled data acquisition and availability of software, the students calculate rate constants, whether the experiment has been done properly or not. This contrasts with their experience in, e.g.,…

  1. Structural Technology Evaluation and Analysis Program (STEAP). Delivery Order 0046: Multiscale Modeling of Composite Structures Subjected to Cyclic Loading

    DTIC Science & Technology

    2012-09-01

    on transformation field analysis [19], proper orthogonal decomposition [63], eigenstrains [23], and others [1, 29, 39] have brought significant... commercial finite element software (Abaqus) along with the user material subroutine utility (UMAT) is employed to solve these problems. In this section... Symmetric Coefficients TFA: Transformation Field Analysis UMAT: User Material Subroutine

  2. Learning Performance with Interactive Simulations in Medical Education: Lessons Learned from Results of Learning Complex Physiological Models with the HAEMOdynamics SIMulator

    ERIC Educational Resources Information Center

    Holzinger, Andreas; Kickmeier-Rust, Michael D.; Wassertheurer, Sigi; Hessinger, Michael

    2009-01-01

    Objective: Since simulations are often accepted uncritically, with excessive emphasis being placed on technological sophistication at the expense of underlying psychological and educational theories, we evaluated the learning performance of simulation software, in order to gain insight into the proper use of simulations for application in medical…

  3. Development and Implementation of Methods and Means for Achieving a Uniform Functional Coating Thickness

    NASA Astrophysics Data System (ADS)

    Shishlov, A. V.; Sagatelyan, G. R.; Shashurin, V. D.

    2017-12-01

    A mathematical model is proposed to calculate the growth rate of the thin-film coating thickness at various points on a flat substrate surface during planetary motion of the substrate, which makes it possible to calculate an expected coating thickness distribution. A corresponding software package has been developed. The coefficients used for computer simulation are determined experimentally.
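The record gives no model details, but the general idea of such a calculation can be sketched: integrate a deposition rate over the point's epicyclic trajectory (carousel revolution plus substrate spin). Every name, the falloff law, and all parameter values below are illustrative assumptions, not the authors' model.

```python
import math

def coating_thickness(r_point, t_total, rate_at_center=1.0,
                      carousel_radius=0.2, omega_rev=1.0, omega_rot=5.0,
                      source_falloff=0.05, steps=10000):
    """Numerically integrate deposition at a point sitting at radius
    r_point on a substrate that spins (omega_rot) while its carrier
    revolves (omega_rev). The rate law rate_at_center / (1 + d^2/falloff),
    with d the distance from the source axis, is an assumed placeholder."""
    dt = t_total / steps
    thickness = 0.0
    for i in range(steps):
        t = i * dt
        # Epicyclic position: carousel revolution + substrate rotation.
        x = carousel_radius * math.cos(omega_rev * t) + r_point * math.cos(omega_rot * t)
        y = carousel_radius * math.sin(omega_rev * t) + r_point * math.sin(omega_rot * t)
        d2 = x * x + y * y
        thickness += rate_at_center / (1.0 + d2 / source_falloff) * dt
    return thickness
```

Evaluating this at several radii gives the kind of expected thickness distribution the abstract refers to; the experimentally determined coefficients would replace the placeholder rate law.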

  4. Influence of the power law index on the fiber breakage during injection molding by numerical simulations

    NASA Astrophysics Data System (ADS)

    Desplentere, Frederik; Six, Wim; Bonte, Hilde; Debrabandere, Eric

    2013-04-01

    In predictive engineering for polymer processes, the proper prediction of material microstructure from known processing conditions and constituent material properties is a critical step toward properly predicting bulk properties in the finished composite. Operating within the context of long-fiber thermoplastics (LFT, length > 15 mm), this investigation concentrates on the influence of the power law index on the final fiber length distribution within the injection molded part. To realize this, the Autodesk Simulation Moldflow Insight Scandium 2013 software has been used. In this software, a fiber breakage algorithm is available from this release onward. Using virtual material data with realistic viscosity levels makes it possible to separate the influence of the power law index on fiber breakage from the other material and process parameters. Applying standard settings for the fiber breakage parameters results in an obvious influence on the fiber length distribution through the thickness of the part and also as a function of position in the part. Finally, the influence of the shear rate constant within the fiber breakage model has been investigated, illustrating the possibility of fitting the virtual fiber length distribution to experimental data, where available.
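The power law index studied above sets how strongly melt viscosity thins with shear rate. A minimal illustration of the standard power-law (Ostwald-de Waele) relation follows; the consistency index K and index n are arbitrary illustrative values, not Moldflow material data.

```python
def power_law_viscosity(shear_rate, K=1000.0, n=0.35):
    """Power-law viscosity: eta = K * gamma_dot**(n - 1).
    n < 1 gives shear-thinning behavior, typical of fiber-filled
    thermoplastic melts; smaller n means faster thinning."""
    return K * shear_rate ** (n - 1.0)
```

A lower index n makes viscosity, and hence the local hydrodynamic loading on fibers, vary more sharply across the flow, which is why the index plausibly matters to the predicted breakage.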

  5. Experiences in integrating auto-translated state-chart designs for model checking

    NASA Technical Reports Server (NTRS)

    Pingree, P. J.; Benowitz, E. G.

    2003-01-01

    In the complex environment of JPL's flight missions with increasing dependency on advanced software designs, traditional software validation methods of simulation and testing are being stretched to adequately cover the needs of software development.

  6. Reflight certification software design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The PDSS/IMC Software Design Specification for the Payload Development Support System (PDSS)/Image Motion Compensator (IMC) is contained. The PDSS/IMC is to be used for checkout and verification of the IMC flight hardware and software by NASA/MSFC.

  7. Design requirements for SRB production control system. Volume 3: Package evaluation, modification and hardware

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The software package evaluation was designed to analyze commercially available, field-proven production control or manufacturing resource planning management technology and software packages. The analysis was conducted by comparing SRB production control software requirements and the conceptual system design to software package capabilities. The methodology of evaluation and the findings at each stage of evaluation are described. Topics covered include: vendor listing; request for information (RFI) document; RFI response rate and quality; RFI evaluation process; and capabilities versus requirements.

  8. Research and Design Issues Concerning the Development of Educational Software for Children. Technical Report No. 14.

    ERIC Educational Resources Information Center

    Char, Cynthia

    Several research and design issues to be considered when creating educational software were identified by a field test evaluation of three types of innovative software created at Bank Street College: (1) Probe, software for measuring and graphing temperature data; (2) Rescue Mission, a navigation game that illustrates the computer's use for…

  9. Teacher-Designed Software for Interactive Linear Equations: Concepts, Interpretive Skills, Applications & Word-Problem Solving.

    ERIC Educational Resources Information Center

    Lawrence, Virginia

    No longer just a user of commercial software, the 21st century teacher is a designer of interactive software based on theories of learning. This software, a comprehensive study of straightline equations, enhances conceptual understanding, sketching, graphic interpretive and word problem solving skills as well as making connections to real-life and…

  10. The Design and Realization of Radio Telescope Control Software in Windows XP System with VC++

    NASA Astrophysics Data System (ADS)

    Zhao, Rong-Bing; Aili, Yu; Zhang, Jin; Yu, Yun

    2007-03-01

    The main function of the radio telescope control software is to drive the radio telescope to track the target accurately. The design of the radio telescope control software is based on the Windows XP system with VC++. The functions of the software, the communication mode, and the user interface are introduced in this article.

  11. Software Development Group. Software Review Center. Microcomputing Working Paper Series.

    ERIC Educational Resources Information Center

    Perkey, Nadine; Smith, Shirley C.

    Two papers describe the roles of the Software Development Group (SDG) and the Software Review Center (SRC) at Drexel University. The first paper covers the primary role of the SDG, which is designed to assist Drexel faculty with the technical design and programming of courseware for the Apple Macintosh microcomputer; the relationship of the SDG…

  12. User-Centered Design Guidelines for Collaborative Software for Intelligence Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scholtz, Jean; Endert, Alexander N.

    In this position paper we discuss the necessity of using User-Centered Design (UCD) methods in order to design collaborative software for the intelligence community. We present some standing issues in collaborative software based on existing work within the intelligence community. Based on this information we present opportunities to address some of these challenges.

  13. Rational Design Methodology.

    DTIC Science & Technology

    1978-09-01

    This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies... of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a

  14. Sighten Final Technical Report DEEE0006690 Deploying an integrated and comprehensive solar financing software platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Conlan

    Over the project, Sighten built a comprehensive software-as-a-service (SaaS) platform to automate and streamline the residential solar financing workflow. Before the project period, significant time and money were spent by companies on front-end tools related to system design and proposal creation, but comparatively few resources were available to support the many back-end calculations and data management processes that underpin third-party financing. Without a tool like Sighten, the solar financing process involved passing information from the homeowner prospect into separate tools for system design and financing, and then later to reporting tools including Microsoft Excel, CRM software, in-house software, outside software, and offline, manual processes. Passing data between tools and attempting to connect disparate systems results in inefficiency and inaccuracy for the industry. Sighten was built to consolidate all financial and solar-related calculations in a single software platform. It significantly improves upon the accuracy of these calculations and exposes sophisticated new analysis tools, resulting in a rigorous, efficient, and cost-effective toolset for scaling residential solar.
Widely deploying a platform like Sighten’s significantly and immediately impacts the residential solar space in several important ways: 1) standardizing and improving the quality of all quantitative calculations involved in the residential financing process, most notably project finance, system production and reporting calculations; 2) representing a true step change in terms of reporting and analysis capabilities by maintaining more accurate data and exposing sophisticated tools around simulation, tranching, and financial reporting, among others, to all stakeholders in the space; 3) allowing a broader group of developers/installers/finance companies to access the capital markets by providing an out-of-the-box toolset that handles the execution of running investor capital through a rooftop solar financing program. Standardizing and improving all calculations, improving data quality, and exposing new analysis tools previously unavailable affects investment in the residential space in several important ways: 1) lowering the cost of capital for existing capital providers by mitigating uncertainty and de-risking the solar asset class; 2) attracting new, lower cost investors to the solar asset class as reporting and data quality resemble standards of more mature asset classes; 3) increasing the prevalence of liquidity options for investors through back leverage, securitization, or secondary sale by providing the tools necessary for lenders, ratings agencies, etc. to properly understand a portfolio of residential solar assets. During the project period, Sighten successfully built and scaled a commercially ready tool for the residential solar market. The software solution built by Sighten has been deployed with key target customer segments identified in the award deliverables: solar installers, solar developers/channel managers, and solar financiers, including lenders. Each of these segments greatly benefits from the availability of the Sighten toolset.

  15. A New Control System Software for SANS BATAN Spectrometer in Serpong, Indonesia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bharoto; Putra, Edy Giri Rachman

    2010-06-22

    The original main control system of the 36-meter small-angle neutron scattering (SANS) BATAN Spectrometer (SMARTer) has been replaced with a new one due to the malfunction of the main computer. For that reason, new control system software handling all of the control systems was also developed in order to put the spectrometer back into operation. The developed software controls components such as the rotational movement of the six-pinhole system, the vertical movement of the four-section neutron guide system with a total length of 16.5 m, the two-directional movement of a neutron beam stopper, and the forward-backward movement of a 2D position sensitive detector (2D-PSD) along 16.7 m. A Visual Basic program running on the Windows operating system was employed to develop the software, and it can be operated from other remote computers on the local area network. All device positions and the command menu are displayed graphically in the main window, and each device can be controlled by clicking the corresponding button. These features are necessary for user-friendly control system software. Finally, the new software has been tested on a complete SANS experiment and works properly.

  17. Choosing a software design method for real-time Ada applications: JSD process inversion as a means to tailor a design specification to the performance requirements and target machine

    NASA Technical Reports Server (NTRS)

    Withey, James V.

    1986-01-01

    The validity of real-time software is determined by its ability to execute on a computer within the time constraints of the physical system it is modeling. In many applications the time constraints are so critical that the details of process scheduling are elevated to the requirements analysis phase of the software development cycle. It is not uncommon to find specifications for a real-time cyclic executive program included in or assumed by such requirements. It was found that preliminary designs structured around this implementation obscure the data flow of the real-world system being modeled, and that it is consequently difficult and costly to maintain, update, and reuse the resulting software. A cyclic executive is a software component that schedules and implicitly synchronizes the real-time software through periodic and repetitive subroutine calls. Therefore, a design method is sought that allows the deferral of process scheduling to the later stages of design. The appropriate scheduling paradigm must be chosen given the performance constraints, the target environment, and the software's lifecycle. The concept of process inversion is explored with respect to the cyclic executive.
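The cyclic executive the abstract describes (fixed, periodic subroutine calls providing implicit synchronization) can be sketched as a toy frame loop. Real flight code would be in Ada or HAL/S; this Python sketch, with its hypothetical frame layout, only illustrates the scheduling pattern.

```python
import time

def cyclic_executive(minor_frames, minor_period_s, cycles):
    """Toy cyclic executive: each minor frame runs its tasks to
    completion in a fixed order (the implicit synchronization), then
    idles out the remainder of its fixed period. One pass over all
    minor frames constitutes a major frame."""
    for _ in range(cycles):
        for tasks in minor_frames:             # minor frames, in order
            start = time.monotonic()
            for task in tasks:                 # periodic subroutine calls
                task()
            remaining = minor_period_s - (time.monotonic() - start)
            if remaining > 0:
                time.sleep(remaining)
```

The fixed call order is exactly what the abstract says obscures the system's data flow: task interaction is encoded in frame placement rather than expressed explicitly, which is why the paper seeks to defer scheduling decisions.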

  18. Software and resources for computational medicinal chemistry

    PubMed Central

    Liao, Chenzhong; Sitzmann, Markus; Pugliese, Angelo; Nicklaus, Marc C

    2011-01-01

    Computer-aided drug design plays a vital role in drug discovery and development and has become an indispensable tool in the pharmaceutical industry. Computational medicinal chemists can take advantage of all kinds of software and resources in the computer-aided drug design field for the purposes of discovering and optimizing biologically active compounds. This article reviews software and other resources related to computer-aided drug design approaches, putting particular emphasis on structure-based drug design, ligand-based drug design, chemical databases and chemoinformatics tools. PMID:21707404

  19. Proofreading using an assistive software homophone tool: compensatory and remedial effects on the literacy skills of students with reading difficulties.

    PubMed

    Lange, Alissa A; Mulhern, Gerry; Wylie, Judith

    2009-01-01

    The present study investigated the effects of using an assistive software homophone tool on the assisted proofreading performance and unassisted basic skills of secondary-level students with reading difficulties. Students aged 13 to 15 years proofread passages for homophonic errors under three conditions: with the homophone tool, with homophones highlighted only, or with no help. The group using the homophone tool significantly outperformed the other two groups on assisted proofreading and outperformed the others on unassisted spelling, although not significantly. Remedial (unassisted) improvements in automaticity of word recognition, homophone proofreading, and basic reading were found over all groups. Results elucidate the differential contributions of each function of the homophone tool and suggest that with the proper training, assistive software can help not only students with diagnosed disabilities but also those with generally weak reading skills.
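The homophone tool studied above highlights candidate homophones and offers alternatives. A minimal sketch of that flagging behavior follows; the tiny word list and the function are illustrative stand-ins, not the assistive software's actual lexicon or API.

```python
HOMOPHONE_SETS = [  # tiny illustrative list, not the tool's real lexicon
    {"their", "there", "they're"},
    {"to", "too", "two"},
    {"write", "right"},
]

def flag_homophones(text):
    """Return {word: alternatives} for each word in `text` that belongs
    to a known homophone set -- a sketch of the 'highlight and offer
    alternatives' function described in the study."""
    flags = {}
    for word in text.lower().split():
        w = word.strip(".,!?;:")
        for group in HOMOPHONE_SETS:
            if w in group:
                flags[w] = sorted(group - {w})
    return flags
```

Note this sketch corresponds to the "highlighted only" condition plus alternatives; the study's point is that presenting alternatives (not mere highlighting) drove the assisted proofreading gains.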

  20. A Functional Description of a Digital Flight Test System for Navigation and Guidance Research in the Terminal Area

    NASA Technical Reports Server (NTRS)

    Hegarty, D. M.

    1974-01-01

    A guidance, navigation, and control system, the Simulated Shuttle Flight Test System (SS-FTS), when interfaced with existing aircraft systems, provides a research facility for studying concepts for landing the space shuttle orbiter and conventional jet aircraft. The SS-FTS, which includes a general-purpose computer, performs all computations for precisely following a prescribed approach trajectory while properly managing the vehicle energy to allow safe arrival at the runway and landing within prescribed dispersions. The system contains hardware and software provisions for navigation with several combinations of possible navigation aids that have been suggested for the shuttle. The SS-FTS can be reconfigured to study different guidance and navigation concepts by changing only the computer software, and adapted to receive different radio navigation information through minimum hardware changes. All control laws, logic, and mode interlocks reside solely in the computer software.

  1. Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory

    NASA Astrophysics Data System (ADS)

    Stoeckel, Gerhard P.; Doyle, Keith B.

    2013-09-01

    Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.

  2. An XML-based method for astronomy software designing

    NASA Astrophysics Data System (ADS)

    Liao, Mingxue; Aili, Yusupu; Zhang, Jin

    An XML-based method for standardizing software design is introduced, analyzed, and successfully applied in renovating the hardware and software of the digital clock at Urumqi Astronomical Station. The basic strategy for eliciting time information from the new digital clock, FT206, in the antenna control program is introduced. With FT206, the need to compute how many centuries have passed since a certain day using sophisticated formulas is eliminated, and it is no longer necessary to set the correct UT time on the computer controlling the antenna, because the year, month, and day are all deduced from the Julian day stored in FT206 rather than from the computer time. With an XML-based method and standard for software design, various existing design methods are unified and communication and collaboration between developers are facilitated, making an Internet-based mode of software development possible. The trend of development of the XML-based design method is predicted.
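Deducing year, month, and day from a stored Julian day, as the abstract describes, is a standard integer computation; the Fliegel-Van Flandern algorithm below is one well-known way to do it (the function name is ours, and this is an illustration, not the station's code).

```python
def jd_to_gregorian(jdn):
    """Convert a Julian day number to (year, month, day) in the
    Gregorian calendar using the Fliegel-Van Flandern integer
    algorithm. All divisions are integer (floor) divisions."""
    l = jdn + 68569
    n = 4 * l // 146097
    l = l - (146097 * n + 3) // 4
    i = 4000 * (l + 1) // 1461001
    l = l - 1461 * i // 4 + 31
    j = 80 * l // 2447
    d = l - 2447 * j // 80
    l = j // 11
    m = j + 2 - 12 * l
    y = 100 * (n - 49) + i + l
    return y, m, d
```

For example, Julian day 2451545 corresponds to 2000-01-01, which is the sort of date bookkeeping the clock hardware makes unnecessary for the control computer.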

  3. An overview of software design languages. [for Galileo spacecraft Command and Data Subsystems

    NASA Technical Reports Server (NTRS)

    Callender, E. D.

    1980-01-01

    The nature and use of design languages and associated processors that are used in software development are reviewed with reference to development work on the Galileo spacecraft project, a Jupiter orbiter scheduled for launch in 1984. The major design steps are identified (functional design, architectural design, detailed design, coding, and testing), and the purpose, functions and the range of applications of design languages are examined. Then the general character of any design language is analyzed in terms of syntax and semantics. Finally, the differences and similarities between design languages are illustrated by examining two specific design languages: the Software Design and Documentation Language and the Problem Statement Language/Problem Statement Analyzer.

  4. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very High Level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine executable form. Very high level design methods range from general domain independent methods to approaches implementable for specific applications or domains. Applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming and other methods different approaches for higher level software design are being developed. Though one finds that a given approach does not always fall exactly in any specific class, this paper provides a classification for very high level design methods including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths and feasibility for future expansion toward automatic development of software systems.

  5. Integrated testing and verification system for research flight software design document

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.

  6. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  7. IPAD project overview

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.

    1980-01-01

    To respond to national needs for improved productivity in engineering design and manufacturing, a NASA supported joint industry/government project is underway denoted Integrated Programs for Aerospace-Vehicle Design (IPAD). The objective is to improve engineering productivity through better use of computer technology. It focuses on development of technology and associated software for integrated company-wide management of engineering information. The project has been underway since 1976 under the guidance of an Industry Technical Advisory Board (ITAB) composed of representatives of major engineering and computer companies and in close collaboration with the Air Force Integrated Computer-Aided Manufacturing (ICAM) program. Results to date on the IPAD project include an in-depth documentation of a representative design process for a large engineering project, the definition and design of computer-aided design software needed to support that process, and the release of prototype software to integrate selected design functions. Ongoing work concentrates on development of prototype software to manage engineering information, and initial software is nearing release.

  8. Protein evolution analysis of S-hydroxynitrile lyase by complete sequence design utilizing the INTMSAlign software.

    PubMed

    Nakano, Shogo; Asano, Yasuhisa

    2015-02-03

    Development of software and methods for design of complete sequences of functional proteins could contribute to studies of protein engineering and protein evolution. To this end, we developed the INTMSAlign software, and used it to design functional proteins and evaluate their usefulness. The software could assign both consensus and correlation residues of target proteins. We generated three protein sequences with S-selective hydroxynitrile lyase (S-HNL) activity, which we call designed S-HNLs; these proteins folded as efficiently as the native S-HNL. Sequence and biochemical analysis of the designed S-HNLs suggested that accumulation of neutral mutations occurs during the process of S-HNLs evolution from a low-activity form to a high-activity (native) form. Taken together, our results demonstrate that our software and the associated methods could be applied not only to design of complete sequences, but also to predictions of protein evolution, especially within families such as esterases and S-HNLs.
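The abstract says the software assigns consensus residues of target proteins. A toy stand-in for that consensus step is sketched below; the function, the frequency threshold, and the gap handling are our assumptions, not INTMSAlign's actual method.

```python
from collections import Counter

def consensus_residues(alignment, threshold=0.6):
    """Column-wise consensus from a multiple sequence alignment:
    return {column: residue} for columns where one residue's frequency
    (gaps '-' excluded) meets the threshold. Sequences are equal-length
    strings; threshold and gap handling are illustrative assumptions."""
    consensus = {}
    for col in range(len(alignment[0])):
        counts = Counter(seq[col] for seq in alignment if seq[col] != '-')
        if not counts:
            continue
        residue, count = counts.most_common(1)[0]
        if count / sum(counts.values()) >= threshold:
            consensus[col] = residue
    return consensus
```

Correlation residues (the abstract's other output) would additionally require pairwise column statistics, which this sketch omits.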

  9. Protein evolution analysis of S-hydroxynitrile lyase by complete sequence design utilizing the INTMSAlign software

    NASA Astrophysics Data System (ADS)

    Nakano, Shogo; Asano, Yasuhisa

    2015-02-01

    Development of software and methods for design of complete sequences of functional proteins could contribute to studies of protein engineering and protein evolution. To this end, we developed the INTMSAlign software, and used it to design functional proteins and evaluate their usefulness. The software could assign both consensus and correlation residues of target proteins. We generated three protein sequences with S-selective hydroxynitrile lyase (S-HNL) activity, which we call designed S-HNLs; these proteins folded as efficiently as the native S-HNL. Sequence and biochemical analysis of the designed S-HNLs suggested that accumulation of neutral mutations occurs during the evolution of S-HNLs from a low-activity form to a high-activity (native) form. Taken together, our results demonstrate that our software and the associated methods could be applied not only to design of complete sequences, but also to predictions of protein evolution, especially within families such as esterases and S-HNLs.

  10. Practical research on the teaching of Optical Design

    NASA Astrophysics Data System (ADS)

    Fan, Changjiang; Ren, Zhijun; Ying, Chaofu; Peng, Baojin

    2017-08-01

    Optical design, together with applied optics, forms a complete system from basic theory to application theory, and it plays a very important role in professional education. In order to improve senior undergraduates' understanding of optical design, this course is divided into three parts: theoretical knowledge, software design, and product processing. Through the theoretical knowledge, students can master aberration theory and the design principles of typical optical systems. By using ZEMAX (an imaging design software), TRACEPRO (a lighting optical design software), and SOLIDWORKS or PROE (mechanical design software), students can establish a complete model of an optical system. Students can then use the carving machine located in the lab, or at cooperative units, to process the model. Through these three parts, students learn the necessary practical knowledge and improve their learning and analysis abilities; they also get enough practice to promote their creative abilities, so that they can gradually change from learners of scientific theory into optical engineers.

  11. Journal of Open Source Software (JOSS): design and first-year review

    NASA Astrophysics Data System (ADS)

    Smith, Arfon M.

    2018-01-01

    JOSS is a free and open-access journal that publishes articles describing research software across all disciplines. It has the dual goals of improving the quality of the software submitted and providing a mechanism for research software developers to receive credit. While designed to work within the current merit system of science, JOSS addresses the dearth of rewards for key contributions to science made in the form of software. JOSS publishes articles that encapsulate the scholarship contained in the software itself, and its rigorous peer review targets the software components: functionality, documentation, tests, continuous integration, and the license. A JOSS article contains an abstract describing the purpose and functionality of the software, references, and a link to the software archive. JOSS published more than 100 articles in its first year, many from the scientific Python ecosystem (including a number of articles related to astronomy and astrophysics). JOSS is a sponsored project of the nonprofit organization NumFOCUS and is an affiliate of the Open Source Initiative. In this presentation, I describe the motivation, design, and progress of the Journal of Open Source Software (JOSS) and how it compares to other avenues for publishing research software in astronomy.

  12. Using CASE Software to Teach Undergraduates Systems Analysis and Design.

    ERIC Educational Resources Information Center

    Wilcox, Russell E.

    1988-01-01

    Describes the design and delivery of a college course for information system students utilizing a Computer-Aided Software Engineering program. Discusses class assignments, cooperative learning, student attitudes, and the advantages of using this software in the course. (CW)

  13. Software analysis handbook: Software complexity analysis and software reliability estimation and prediction

    NASA Technical Reports Server (NTRS)

    Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron

    1994-01-01

    This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
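    One widely used quantitative measure of the kind the handbook's complexity analysis relies on is McCabe's cyclomatic complexity. The sketch below computes it for Python source by counting decision points in the AST; it only illustrates the type of metric involved and does not reproduce the handbook's own tools:

    ```python
    # Illustrative sketch of McCabe's cyclomatic complexity for Python
    # code: complexity = 1 + number of decision points. The set of node
    # types counted here is a simplification of full McCabe counting.
    import ast

    DECISION_NODES = (ast.If, ast.For, ast.While,
                      ast.BoolOp, ast.ExceptHandler, ast.IfExp)

    def cyclomatic_complexity(source):
        """Count decision points in the parsed source and add one."""
        tree = ast.parse(source)
        decisions = sum(isinstance(node, DECISION_NODES)
                        for node in ast.walk(tree))
        return 1 + decisions

    sample = """
    def classify(x):
        if x < 0:
            return "negative"
        elif x == 0:
            return "zero"
        return "positive"
    """.replace("\n    ", "\n")  # dedent the embedded snippet

    print(cyclomatic_complexity(sample))  # 3 (two If nodes + 1)
    ```

    Functions scoring high on such metrics are the "critical components" and "risk areas" that complexity analysis flags for extra review and testing.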

  14. Comprehensive evaluation of untargeted metabolomics data processing software in feature detection, quantification and discriminating marker selection.

    PubMed

    Li, Zhucui; Lu, Yan; Guo, Yufeng; Cao, Haijie; Wang, Qinhong; Shui, Wenqing

    2018-10-31

    Data analysis represents a key challenge for untargeted metabolomics studies, and it commonly requires extensive processing of thousands of metabolite peaks included in raw high-resolution MS data. Although a number of software packages have been developed to facilitate untargeted data processing, they have not been comprehensively scrutinized for their capability in feature detection, quantification, and discriminating marker selection using a well-defined benchmark sample set. In this study, we acquired a benchmark dataset from standard mixtures consisting of 1100 compounds with specified concentration ratios, including 130 compounds with significant variation of concentrations. The five software packages evaluated here (MS-Dial, MZmine 2, XCMS, MarkerView, and Compound Discoverer) showed similar performance in detection of true features derived from compounds in the mixtures. However, significant differences between the packages were observed in relative quantification of true features in the benchmark dataset. MZmine 2 outperformed the other software in terms of quantification accuracy, and it reported the most true discriminating markers together with the fewest false markers. Furthermore, we assessed the selection of discriminating markers by the different packages using both the benchmark dataset and a real-case metabolomics dataset, and propose combined usage of two packages to increase the confidence of biomarker identification. Our findings from this comprehensive evaluation of untargeted metabolomics software should help guide future improvements of these widely used bioinformatics tools and enable users to properly interpret their metabolomics results.
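    With a benchmark of known concentration ratios, quantification accuracy can be scored by comparing each feature's measured intensity ratio against its expected ratio. The sketch below is illustrative only; the compound names and ratios are made up, and the paper's actual scoring procedure is not reproduced here:

    ```python
    # Hedged sketch of benchmark-based quantification scoring: the
    # absolute log2 difference between a measured intensity ratio and
    # the known (expected) concentration ratio of each compound.
    import math

    def quantification_errors(measured, expected):
        """Absolute log2 error per compound present in both dicts."""
        return {name: abs(math.log2(measured[name]) -
                          math.log2(expected[name]))
                for name in expected if name in measured}

    # Hypothetical benchmark: expected fold changes between two mixtures
    expected_ratios = {"compound_A": 2.0, "compound_B": 1.0, "compound_C": 0.5}
    measured_ratios = {"compound_A": 1.8, "compound_B": 1.1, "compound_C": 0.55}

    for name, err in quantification_errors(measured_ratios,
                                           expected_ratios).items():
        print(f"{name}: log2 error = {err:.3f}")
    ```

    A package with consistently small log2 errors, and with its reported discriminating markers matching the compounds whose ratios truly differ, would score well on the criteria the study evaluates.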

  15. Space Telecommunications Radio Architecture (STRS)

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    2006-01-01

    A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development, such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture, and other aspects is considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing, and software maintenance. While software defined radios offer greater flexibility, they also pose challenges for radio development in the space environment in terms of size, mass, power consumption, and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies are key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and give a brief overview of a candidate architecture under consideration for space-based platforms.

  16. Space Telecommunications Radio Architecture (STRS): Technical Overview

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.

    2006-01-01

    A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development, such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture, and other aspects is considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing, and software maintenance. While software defined radios offer greater flexibility, they also pose challenges for radio development in the space environment in terms of size, mass, power consumption, and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies are key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and give a brief overview of a candidate architecture under consideration for space-based platforms.

  17. NASA's SDR Standard: Space Telecommunications Radio System

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Johnson, Sandra K.

    2007-01-01

    A software defined radio (SDR) architecture used in space-based platforms proposes to standardize certain aspects of radio development, such as interface definitions, functional control and execution, and application software and firmware development. NASA has chartered a team to develop an open software defined radio hardware and software architecture to support NASA missions and determine the viability of an Agency-wide standard. A draft concept of the proposed standard has been released and discussed among organizations in the SDR community. Appropriate leveraging of the JTRS SCA, OMG's SWRadio Architecture, and other aspects is considered. A standard radio architecture offers potential value by employing common waveform software instantiation, operation, testing, and software maintenance. While software defined radios offer greater flexibility, they also pose challenges for radio development in the space environment in terms of size, mass, power consumption, and available technology. An SDR architecture for space must recognize and address the constraints of space flight hardware and systems, along with flight heritage and culture. NASA is actively participating in the development of technology and standards related to software defined radios. As NASA considers a standard radio architecture for space communications, input and coordination from government agencies, industry, academia, and standards bodies are key to a successful architecture. The unique aspects of space require thorough investigation of relevant terrestrial technologies properly adapted to space. The talk will describe NASA's current effort to investigate SDR applications to space missions and give a brief overview of a candidate architecture under consideration for space-based platforms.

  18. Open source software to control Bioflo bioreactors.

    PubMed

    Burdge, David A; Libourel, Igor G L

    2014-01-01

    Bioreactors are designed to support highly controlled environments for the growth of tissues, cell cultures, or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third-party or custom-designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering the existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible, free, and open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software can control a Bioflo system and does not require the installation of LabVIEW.

  19. Open Source Software to Control Bioflo Bioreactors

    PubMed Central

    Burdge, David A.; Libourel, Igor G. L.

    2014-01-01

    Bioreactors are designed to support highly controlled environments for the growth of tissues, cell cultures, or microbial cultures. A variety of bioreactors are commercially available, often including sophisticated software to enhance the functionality of the bioreactor. However, experiments that the bioreactor hardware can support, but that were not envisioned during the software design, cannot be performed without developing custom software. In addition, support for third-party or custom-designed auxiliary hardware is often sparse or absent. This work presents flexible open source freeware for the control of bioreactors of the Bioflo product family. The functionality of the software includes setpoint control, data logging, and protocol execution. Auxiliary hardware can be easily integrated and controlled through an integrated plugin interface without altering the existing software. Simple experimental protocols can be entered as a CSV scripting file, and a Python-based protocol execution model is included for more demanding conditional experimental control. The software was designed to be a more flexible, free, and open source alternative to the commercially available solution. The source code and various auxiliary hardware plugins are publicly available for download from https://github.com/LibourelLab/BiofloSoftware. In addition to the source code, the software was compiled and packaged as a self-installing file for 32- and 64-bit Windows operating systems. The compiled software can control a Bioflo system and does not require the installation of LabVIEW. PMID:24667828
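    The CSV protocol idea described above can be sketched as follows. The column names (time_min, parameter, setpoint) are hypothetical; the real BiofloSoftware file layout may differ, so consult the repository linked above for the actual format:

    ```python
    # Hedged sketch of parsing a CSV protocol file into timed setpoint
    # steps. Column names and values here are invented for illustration
    # and are not taken from the BiofloSoftware documentation.
    import csv
    import io

    protocol_csv = """time_min,parameter,setpoint
    0,temperature,37.0
    0,agitation,200
    120,agitation,400
    """.replace("\n    ", "\n")  # dedent the embedded CSV text

    def load_protocol(text):
        """Parse protocol rows into (time, parameter, value) steps."""
        reader = csv.DictReader(io.StringIO(text))
        return [(float(row["time_min"]), row["parameter"],
                 float(row["setpoint"]))
                for row in reader]

    for t, name, value in load_protocol(protocol_csv):
        print(f"t={t:g} min: set {name} -> {value:g}")
    ```

    A scheduler in the controller would then apply each setpoint when its time arrives; conditional logic beyond fixed times is what the Python-based execution model mentioned above is for.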

  20. On the systematics in apparent proper motions of radio sources observed by VLBI

    NASA Astrophysics Data System (ADS)

    Raposo-Pulido, V.; Lambert, S.; Capitaine, N.; Nilsson, T.; Heinkelmann, R.; Schuh, H.

    2015-08-01

    For about twenty years, several authors have been investigating the systematics in the apparent proper motions of radio source positions. In some cases, the theoretical work developed (Pyne et al., 1996) could not be assessed because of the small number of VLBI observations. In other cases, the effects attributed to apparent proper motion could not be related successfully because there was no statistically significant evidence (MacMillan, 2005). In this work we provide considerations about the estimation of the coefficients of spherical harmonics, based on the three-step procedure used by Titov et al. (2011) and Titov and Lambert (2013). The early stage of this work has been to compare, step by step, the computation and estimation processes of the Calc/Solve (http://gemini.gsfc.nasa.gov/solve/) and VieVS (Böhm et al., 2012) software packages. To achieve this, the results were analyzed and compared with the previous study by Titov and Lambert (2013).
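    The spherical-harmonic coefficients estimated in this kind of analysis come from expanding the apparent proper-motion field over the sky in vector spherical harmonics. The expansion below follows common VLBI usage; the exact notation and normalization may differ in detail from the cited papers:

    ```latex
    % Vector-spherical-harmonics expansion of the apparent proper-motion
    % field \boldsymbol{\mu}(\alpha,\delta) over right ascension and
    % declination; notation is generic, not copied from the cited works.
    \boldsymbol{\mu}(\alpha,\delta) =
      \sum_{l=1}^{l_{\max}} \sum_{m=-l}^{l}
      \left( a^{E}_{lm}\,\mathbf{Y}^{E}_{lm}(\alpha,\delta)
           + a^{M}_{lm}\,\mathbf{Y}^{M}_{lm}(\alpha,\delta) \right)
    ```

    Here the $\mathbf{Y}^{E}_{lm}$ (spheroidal, "electric") and $\mathbf{Y}^{M}_{lm}$ (toroidal, "magnetic") terms separate the field into curl-free and divergence-free parts; the degree $l=1$ magnetic coefficients describe a global rotation of the frame and the degree $l=1$ electric coefficients a dipole (glide) pattern, which is why the low-degree coefficients are the physically interesting targets of the estimation.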
