Designing Educational Software for Tomorrow.
ERIC Educational Resources Information Center
Harvey, Wayne
Designed to address the management and use of computer software in education and training, this paper explores both good and poor software design, calling for improvements in the quality of educational software by attending to design considerations that are based on general principles of learning rather than specific educational objectives. This…
Application and Implications of Agent Technology for Librarians.
ERIC Educational Resources Information Center
Nardi, Bonnie A.; O'Day, Vicki L.
1998-01-01
Examines intelligent software agents, presents nine design principles aimed specifically at the technology perspective (to personalize task performance and general principles), and discusses what librarians can do that software agents (agents defined as activity-aware software programs) cannot do. Describes an information ecology that integrates…
Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O
2013-06-01
Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptual and general requirements on a software system for quantitative analysis of radiotherapy. Further, we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptual problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via a dose iterator pattern; analysis database design). As a proof of concept, we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as an open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
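The abstract above mentions algorithmic decoupling via a dose iterator pattern. The following is a minimal Python sketch of that idea only: analysis algorithms are written against an abstract per-voxel dose iterator rather than a concrete storage format. The class and function names are illustrative assumptions, not the RTToolbox API (which is a C++ library).

```python
# Sketch of the dose-iterator idea: analysis code depends only on an abstract
# per-voxel dose interface, not on how doses are stored. Names are illustrative.
from abc import ABC, abstractmethod
from typing import Iterator

class DoseIterator(ABC):
    """Abstract per-voxel dose access."""
    @abstractmethod
    def __iter__(self) -> Iterator[float]:
        ...

class ArrayDoseIterator(DoseIterator):
    """One concrete backend: doses already held in memory as a flat sequence."""
    def __init__(self, doses):
        self._doses = list(doses)
    def __iter__(self):
        return iter(self._doses)

def mean_dose(dose_it: DoseIterator) -> float:
    """Analysis algorithm written against the iterator, not the storage format."""
    total, n = 0.0, 0
    for d in dose_it:
        total += d
        n += 1
    return total / n if n else 0.0

def voxels_at_least(dose_it: DoseIterator, threshold_gy: float) -> int:
    """Count voxels receiving at least `threshold_gy` (a building block for DVH points)."""
    return sum(1 for d in dose_it if d >= threshold_gy)

if __name__ == "__main__":
    doses = ArrayDoseIterator([1.2, 2.0, 2.4, 0.8])
    print(mean_dose(doses), voxels_at_least(doses, 2.0))
```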
Screen Design Principles of Computer-Aided Instructional Software for Elementary School Students
ERIC Educational Resources Information Center
Berrin, Atiker; Turan, Bülent Onur
2017-01-01
This study aims to present primary school students' views about current educational software interfaces, and to propose principles for educational software screens. The study was carried out with a general screening model. Sample group of the study consisted of sixth grade students in Sehit Ögretmen Hasan Akan Elementary School. In this context,…
Agile: From Software to Mission Systems
NASA Technical Reports Server (NTRS)
Trimble, Jay; Shirley, Mark; Hobart, Sarah
2017-01-01
To maximize efficiency and flexibility in Mission Operations System (MOS) design, we are evolving principles from agile and lean methods for software, to the complete mission system. This allows for reduced operational risk at reduced cost, and achieves a more effective design through early integration of operations into mission system engineering and flight system design. The core principles are assessment of capability through demonstration, risk reduction through targeted experiments, early test and deployment, and maturation of processes and tools through use.
ERIC Educational Resources Information Center
Adnan, Nor Hafizah; Ritzhaupt, Albert D.
2018-01-01
The failure of many instructional design initiatives is often attributed to poor instructional design. Current instructional design models do not provide much insight into design processes for creating e-learning instructional solutions. Given the similarities between the fields of instructional design and software engineering, instructional…
Teacher's Guide to SERAPHIM Software II. Chemical Principles.
ERIC Educational Resources Information Center
Bogner, Donna J.
Designed to assist chemistry teachers in selecting appropriate software programs, this publication is the second in a series of six teacher's guides from Project SERAPHIM, a program sponsored by the National Science Foundation. This guide is keyed to the chapters of the text "Chemical Principles." Program suggestions are arranged in the…
Writing Better Software for Economics Principles Textbooks.
ERIC Educational Resources Information Center
Walbert, Mark S.
1989-01-01
Examines computer software currently available with most introductory economics textbooks. Compares what is available with what should be available in order to meet the goal of effectively using the microcomputer to teach economic principles. Recommends 14 specific pedagogical changes that should be made in order to improve current designs. (LS)
ERIC Educational Resources Information Center
van der Aa, H. J., Comp.; And Others
This 249 item, mostly annotated bibliography cites literature on the general themes of basic principles, hardware, software and application studies of data bases. The broad categories are principles, development possibilities, organizational design, bibliographies, economic aspects, data structure-design, file organization, programing, aviation,…
Designing Design into an Advanced Desktop Publishing Course (A Teaching Tip).
ERIC Educational Resources Information Center
Guthrie, Jim
1995-01-01
Describes an advanced desktop publishing course that combines instruction in a few advanced techniques for using software with extensive discussion of such design principles as consistency, proportion, asymmetry, appropriateness, contrast, and color. Describes computer hardware and software, class assignments, problems, and the rationale for such…
JPL Facilities and Software for Collaborative Design: 1994 - Present
NASA Technical Reports Server (NTRS)
DeFlorio, Paul A.
2004-01-01
The viewgraph presentation provides an overview of the history of the JPL Project Design Center (PDC) and, since 2000, the Center for Space Mission Architecture and Design (CSMAD). The discussion includes PDC objectives and scope; mission design metrics; distributed design; a software architecture timeline; facility design principles; optimized design for group work; CSMAD plan view, facility design, and infrastructure; and distributed collaboration tools.
General object-oriented software development
NASA Technical Reports Server (NTRS)
Seidewitz, Edwin V.; Stark, Mike
1986-01-01
Object-oriented design techniques are gaining increasing popularity for use with the Ada programming language. A general approach to object-oriented design is presented which synthesizes the principles of previous object-oriented methods into the overall software life-cycle, providing transitions from specification to design and from design to code. It therefore provides the basis for a general object-oriented development methodology.
Object Oriented Learning Objects
ERIC Educational Resources Information Center
Morris, Ed
2005-01-01
We apply the object oriented software engineering (OOSE) design methodology for software objects (SOs) to learning objects (LOs). OOSE extends and refines design principles for authoring dynamic reusable LOs. Our learning object class (LOC) is a template from which individualised LOs can be dynamically created for, or by, students. The properties…
Constraint-Driven Software Design: An Escape from the Waterfall Model.
ERIC Educational Resources Information Center
de Hoog, Robert; And Others
1994-01-01
Presents the principles of a development methodology for software design based on a nonlinear, product-driven approach that integrates quality aspects. Two examples are given to show that the flexibility needed for building high quality systems leads to integrated development environments in which methodology, product, and tools are closely…
Teaching Reprint File Management: Basic Principles and Software Programs.
ERIC Educational Resources Information Center
Wood, Elizabeth H.
1989-01-01
Describes a workshop for teaching library users how to manage reprint files which was developed at the University of Southern California Norris Medical Library. Software programs designed for this purpose are suggested, and a sidebar lists software features to consider. (eight references) (MES)
1991-09-01
level are, by necessity, designed to be accomplished by one or a few students in the course of a single academic term. Moreover, the software is seldom...that are covered in Computer Science curricula today, but with more of an engineering structure added. A stronger engineering design component is...ing, and sound software design principles found throughout Ada, and they are unambiguously specified. These are not features which were grafted onto a
ERIC Educational Resources Information Center
Drachova-Strang, Svetlana V.
2013-01-01
As computing becomes ubiquitous, software correctness has a fundamental role in ensuring the safety and security of the systems we build. To design and develop software correctly according to their formal contracts, CS students, the future software practitioners, need to learn a critical set of skills that are necessary and sufficient for…
Ada Software Design Methods Formulation.
1982-10-01
[Garbled OCR fragment of a personnel-classification cluster table; the recoverable content lists job titles such as Principal Scientific Programmer, Junior Programmer, Senior Scientific Programmer, and Senior Software Engineer together with cluster numbers and similarity values.]
Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design
NASA Technical Reports Server (NTRS)
Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.
2003-01-01
A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.
"Citizen Jane": Rethinking Design Principles for Closing the Gender Gap in Computing.
ERIC Educational Resources Information Center
Raphael, Chad
This paper identifies three rationales in the relevant literature for closing the gender gap in computing: economic, cultural and political. Each rationale implies a different set of indicators of present inequalities, disparate goals for creating equality, and distinct principles for software and web site design that aims to help girls overcome…
Imprinting Community College Computer Science Education with Software Engineering Principles
ERIC Educational Resources Information Center
Hundley, Jacqueline Holliday
2012-01-01
Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and…
Design and Implementation of a REST API for the Human Well Being Index (HWBI)
Interoperable software development uses principles of component reuse, systems integration, flexible data transfer, and standardized ontological documentation to promote access, reuse, and integration of code. While interoperability principles are increasingly considered technolo...
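As a hedged illustration of the interoperability principles summarized above, the sketch below exposes a well-being indicator over a REST endpoint using Flask. The endpoint path, parameter names, and scoring data are hypothetical and are not the actual HWBI API.

```python
# Minimal REST sketch (Flask). Endpoint names and data are illustrative only,
# not the HWBI service interface.
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Hypothetical pre-computed well-being scores keyed by county FIPS code.
SCORES = {"12086": {"hwbi": 52.3}, "51059": {"hwbi": 58.1}}

@app.route("/api/v1/hwbi/<fips>", methods=["GET"])
def get_score(fips: str):
    """Return the well-being score for one county as JSON."""
    record = SCORES.get(fips)
    if record is None:
        abort(404, description="unknown FIPS code")
    return jsonify({"fips": fips, **record})

if __name__ == "__main__":
    app.run(port=5000)
```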
Automated software development workstation
NASA Technical Reports Server (NTRS)
Prouty, Dale A.; Klahr, Philip
1988-01-01
A workstation is being developed that provides a computational environment for all NASA engineers across application boundaries, which automates reuse of existing NASA software and designs, and efficiently and effectively allows new programs and/or designs to be developed, catalogued, and reused. The generic workstation is made domain specific by specialization of the user interface, capturing engineering design expertise for the domain, and by constructing/using a library of pertinent information. The incorporation of software reusability principles and expert system technology into this workstation provide the obvious benefits of increased productivity, improved software use and design reliability, and enhanced engineering quality by bringing engineering to higher levels of abstraction based on a well tested and classified library.
NASA software specification and evaluation system: Software verification/validation techniques
NASA Technical Reports Server (NTRS)
1977-01-01
NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.
Agile: From Software to Mission System
NASA Technical Reports Server (NTRS)
Trimble, Jay; Shirley, Mark H.; Hobart, Sarah Groves
2016-01-01
The Resource Prospector (RP) is an in-situ resource utilization (ISRU) technology demonstration mission, designed to search for volatiles at the Lunar South Pole. This is NASA's first near real time tele-operated rover on the Moon. The primary objective is to search for volatiles at one of the Lunar Poles. The combination of short mission duration, a solar powered rover, and the requirement to explore shadowed regions makes for an operationally challenging mission. To maximize efficiency and flexibility in Mission System design and thus to improve the performance and reliability of the resulting Mission System, we are tailoring Agile principles that we have used effectively in ground data system software development and applying those principles to the design of elements of the mission operations system.
A general observatory control software framework design for existing small and mid-size telescopes
NASA Astrophysics Data System (ADS)
Ge, Liang; Lu, Xiao-Meng; Jiang, Xiao-Jun
2015-07-01
A general framework for observatory control software would help to improve the efficiency of observation and operation of telescopes, and would also be advantageous for remote and joint observations. We describe a general framework for observatory control software, which considers principles of flexibility and inheritance to meet the expectations from observers and technical personnel. This framework includes observation scheduling, device control and data storage. The design is based on a finite state machine that controls the whole process.
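To make the finite-state-machine idea above concrete, here is a minimal Python sketch of a controller whose observation sequence is driven by explicit states and an allowed-transition table. The states and transitions are illustrative assumptions, not the authors' actual framework.

```python
# Toy finite state machine for an observation cycle; states and transitions
# are invented for illustration.
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    SCHEDULED = auto()
    SLEWING = auto()
    EXPOSING = auto()
    STORING = auto()

# Allowed transitions: the controller refuses anything not listed here.
TRANSITIONS = {
    State.IDLE: {State.SCHEDULED},
    State.SCHEDULED: {State.SLEWING, State.IDLE},
    State.SLEWING: {State.EXPOSING},
    State.EXPOSING: {State.STORING},
    State.STORING: {State.IDLE},
}

class ObservatoryController:
    def __init__(self):
        self.state = State.IDLE

    def transition(self, new_state: State) -> None:
        if new_state not in TRANSITIONS[self.state]:
            raise RuntimeError(f"illegal transition {self.state.name} -> {new_state.name}")
        print(f"{self.state.name} -> {new_state.name}")
        self.state = new_state

if __name__ == "__main__":
    c = ObservatoryController()
    for s in (State.SCHEDULED, State.SLEWING, State.EXPOSING, State.STORING, State.IDLE):
        c.transition(s)
```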
NASA Astrophysics Data System (ADS)
Loh, Ben Tun-Bin
2003-07-01
The demand for students to engage in complex student-driven and information-rich inquiry investigations poses challenges to existing learning environments. Students are not familiar with this style of work, and lack the skills, tools, and expectations it demands, often forging blindly forward in the investigation. If students are to be successful, they need to learn to be reflective inquirers, periodically stepping back from an investigation to evaluate their work. The fundamental goal of my dissertation is to understand how to design learning environments to promote and support reflective inquiry. I have three basic research questions: how to define this mode of work, how to help students learn it, and understanding how it facilitates reflection when enacted in a classroom. I take an exploratory approach in which, through iterative cycles of design, development, and reflection, I develop principles of design for reflective inquiry, instantiate those principles in the design of a software environment, and test that software in the context of classroom work. My work contributes to the understanding of reflective inquiry in three ways: First, I define a task model that describes the kinds of operations (cognitive tasks) that students should engage in as reflective inquirers. These operations are defined in terms of two basic tasks: articulation and inscription, which serve as catalysts for externalizing student thinking as objects of and triggers for reflection. Second, I instantiate the task model in the design of software tools (the Progress Portfolio). And, through proof of concept pilot studies, I examine how the task model and tools helped students with their investigative classroom work. Finally, I take a step back from these implementations and articulate general design principles for reflective inquiry with the goal of informing the design of other reflective inquiry learning environments. There are three design principles: (1) Provide a designated work space for reflection activities to focus student attention on reflection. (2) Help students create and use artifacts that represent their work and their thinking as a means to create referents for reflection. (3) Support and take advantage of social processes that help students reflect on their own work.
Practical research on the teaching of Optical Design
NASA Astrophysics Data System (ADS)
Fan, Changjiang; Ren, Zhijun; Ying, Chaofu; Peng, Baojin
2017-08-01
Optical design, together with applied optics, forms a complete system from basic theory to application, and it plays a very important role in professional education. In order to improve senior undergraduates' understanding of optical design, this course is divided into three parts: theoretical knowledge, software design and product processing. Through learning theoretical knowledge, students master aberration theory and the design principles of typical optical systems. By using ZEMAX (an imaging design software), TRACEPRO (a lighting design software), and SOLIDWORKS or PROE (mechanical design software), students can establish a complete model of an optical system. Students can use the carving machine located in the lab, or cooperative units, to process the model. Through these three parts, students learn the necessary practical knowledge, improve their learning and analysis abilities, and get enough practice to develop their creative abilities, so that they can gradually change from learners of scientific theory into optics engineers.
Description of the IV + V System Software Package.
ERIC Educational Resources Information Center
Microcomputers for Information Management: An International Journal for Library and Information Services, 1984
1984-01-01
Describes the IV + V System, a software package designed by the Institut fur Maschinelle Dokumentation for the United Nations General Information Programme and UNISIST to support automation of local information and documentation services. Principal program features and functions outlined include input/output, databank, text image, output, and…
NASA Technical Reports Server (NTRS)
Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell
1991-01-01
The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.
GESTALT: A Framework for Redesign of Educational Software
ERIC Educational Resources Information Center
Puustinen, M.; Baker, M.; Lund, K.
2006-01-01
Design of educational multimedia rarely starts from scratch, but rather by attempting to reuse existing software. Although redesign has been an issue in research on evaluation and on learning objects, how it should be carried out in a principled way has remained relatively unexplored. Furthermore, understanding how empirical research on…
Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.
ERIC Educational Resources Information Center
Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.
2002-01-01
Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…
ERIC Educational Resources Information Center
Hung, Wei-Chen; Smith, Thomas J.; Harris, Marian S.; Lockard, James
2010-01-01
This study adopted design and development research methodology (Richey & Klein, "Design and development research: Methods, strategies, and issues," 2007) to systematically investigate the process of applying instructional design principles, human-computer interaction, and software engineering to a performance support system (PSS) for behavior…
Modeling Web-Based Educational Systems: Process Design Teaching Model
ERIC Educational Resources Information Center
Rokou, Franca Pantano; Rokou, Elena; Rokos, Yannis
2004-01-01
Using modeling languages is essential to the construction of educational systems based on software engineering principles and methods. Furthermore, the instructional design is undoubtedly the cornerstone of the design and development of educational systems. Although several methodologies and languages have been proposed for the specification of…
The development of mathematics courseware for learning line and angle
NASA Astrophysics Data System (ADS)
Halim, Noor Dayana Abd; Han, Ong Boon; Abdullah, Zaleha; Yusup, Junaidah
2015-05-01
Learning software is a teaching aid which is often used in schools to increase students' motivation, attract students' attention and also improve the quality of the teaching and learning process. However, the development of learning software should follow the phases of an Instructional Design (ID) model so that the process can be carried out systematically and in an orderly manner. Thus, this concept paper describes the application of the ADDIE model in the development of a mathematics learning courseware for learning Line and Angle, named CBL-Math. The ADDIE model consists of five consecutive phases: Analysis, Design, Development, Implementation and Evaluation. Each phase must be properly planned in order to achieve the stated objectives. Besides describing the processes occurring in each phase, this paper also demonstrates how the principles of the cognitive theory of multimedia learning are integrated in the developed courseware. The principles applied in the courseware reduce students' cognitive load while learning the topic of line and angle. With a well-prepared development process and the integration of appropriate principles, it is expected that the developed software can help students learn effectively and also increase their achievement in the topic of Line and Angle.
IDEA: Identifying Design Principles in Educational Applets
ERIC Educational Resources Information Center
Underwood, Jody S.; Hoadley, Christopher; Lee, Hollylynne Stohl; Hollebrands, Karen; DiGiano, Chris; Renninger, K. Ann
2005-01-01
The Internet is increasingly being used as a medium for educational software in the form of miniature applications (e.g., applets) to explore concepts in a domain. One such effort in mathematics education, the Educational Software Components of Tomorrow (ESCOT) project, created 42 miniature applications each consisting of a context, a set of…
A Virtual World Workshop Environment for Learning Agile Software Development Techniques
ERIC Educational Resources Information Center
Parsons, David; Stockdale, Rosemary
2012-01-01
Multi-User Virtual Environments (MUVEs) are the subject of increasing interest for educators and trainers. This article reports on a longitudinal project that seeks to establish a virtual agile software development workshop hosted in the Open Wonderland MUVE, designed to help learners to understand the basic principles of some core agile software…
The image-guided surgery toolkit IGSTK: an open source C++ software toolkit.
Enquobahrie, Andinet; Cheng, Patrick; Gary, Kevin; Ibanez, Luis; Gobbi, David; Lindseth, Frank; Yaniv, Ziv; Aylward, Stephen; Jomier, Julien; Cleary, Kevin
2007-11-01
This paper presents an overview of the image-guided surgery toolkit (IGSTK). IGSTK is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. It is intended for fast prototyping and development of image-guided surgery applications. The toolkit was developed through a collaboration between academic and industry partners. Because IGSTK was designed for safety-critical applications, the development team has adopted lightweight software processes that emphasize safety and robustness while, at the same time, supporting geographically separated developers. A software process that is philosophically similar to agile software methods was adopted, emphasizing iterative, incremental, and test-driven development principles. The guiding principle in the architecture design of IGSTK is patient safety. The IGSTK team implemented a component-based architecture and used state machine software design methodologies to improve the reliability and safety of the components. Every IGSTK component has a well-defined set of features that are governed by state machines. The state machine ensures that the component is always in a valid state and that all state transitions are valid and meaningful. Realizing that the continued success and viability of an open source toolkit depends on a strong user community, the IGSTK team is following several key strategies to build an active user community. These include maintaining a users and developers' mailing list, providing documentation (application programming interface reference document and book), presenting demonstration applications, and delivering tutorial sessions at relevant scientific conferences.
Valizadeh, Sousan; Feizalahzadeh, Hossein; Avari, Mina; Virani, Faza
2016-07-01
Medication errors are risk factors for patients' health and may have irrecoverable effects. These errors include medication miscalculations by nurses and nursing students. This study aimed to design a multimedia application in the field of education for drug calculations in order to compare its effectiveness with the lecture method. This study selected 82 nursing students of Tabriz University of Medical Sciences in their second and third semesters in 2015. They were pre-tested by a researcher-made multiple-choice questionnaire on their knowledge of drug administration principles and ability to carry out medicinal calculations before training and were then divided through a random block design into two groups of intervention (education with designed software) and control (lecturing) based on the mean grade of previous semesters and the pre-test score. The knowledge and ability post-test was performed using the same questions after 4 weeks of training, and the data were analyzed with IBM SPSS 20 using independent samples t-test, paired-samples t-test, and ANCOVA. Drug calculation ability significantly increased after training in both the control and experimental groups (p<0.05). However, no significant difference emerged between the two groups in terms of medicinal calculation ability after training (p>0.05). The results showed that both training methods had no significant effect on study participants' knowledge of medicinal principles (p>0.05), whereas the score of knowledge of medicinal principles in the control group increased non-significantly. The results of the Kolmogorov-Smirnov test show that, since p>0.05, the data in the variable of knowledge of drug prescription principles and ability of medicinal calculations had a normal distribution. The use of educational software has no significant effect on nursing students' drug knowledge or medicinal calculation ability. However, an e-learning program can reduce the lecture time and cost of repeated topics, such as medication, suggesting that it can be an effective component in nurse education programs.
Software-Engineering Process Simulation (SEPS) model
NASA Technical Reports Server (NTRS)
Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.
1992-01-01
The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform postmortem assessments.
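The sketch below illustrates the system-dynamics style of simulation referred to above: a discrete-time loop in which remaining work, workforce, and schedule pressure feed back on productivity. The equations and coefficients are invented for illustration and are not the SEPS model.

```python
# Toy system-dynamics loop: schedule pressure boosts output but also generates rework.
# All coefficients are illustrative, not SEPS parameters.
def simulate(total_tasks=1000.0, staff=10.0, nominal_rate=1.0, deadline=120, dt=1):
    done, t = 0.0, 0
    while done < total_tasks and t < 3 * deadline:
        remaining = total_tasks - done
        time_left = max(deadline - t, 1)
        pressure = (remaining / (staff * nominal_rate)) / time_left   # >1 means behind schedule
        productivity = nominal_rate * min(1.5, max(0.5, pressure))    # pressure raises output, capped
        rework = 0.1 * productivity * min(1.0, max(0.0, pressure - 1.0))  # haste causes rework
        done += staff * (productivity - rework) * dt
        t += dt
    return t

if __name__ == "__main__":
    print("completion time:", simulate(), "days")
```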
ERIC Educational Resources Information Center
Cárdenas-Claros, Mónica Stella
2015-01-01
This paper reports on the findings of two qualitative exploratory studies that sought to investigate design features of help options in computer-based L2 listening materials. Informed by principles of participatory design, language learners, software designers, language teachers, and a computer programmer worked collaboratively in a series of…
NASA Astrophysics Data System (ADS)
Drachova-Strang, Svetlana V.
As computing becomes ubiquitous, software correctness has a fundamental role in ensuring the safety and security of the systems we build. To design and develop software correctly according to their formal contracts, CS students, the future software practitioners, need to learn a critical set of skills that are necessary and sufficient for reasoning about software correctness. This dissertation presents a systematic approach to both introducing these reasoning skills into the curriculum, and assessing how well the students have learned them. Specifically, it introduces a comprehensive Reasoning Concept Inventory (RCI) that captures the fine details of basic reasoning skills that are ideally learned across the undergraduate curriculum to reason about software correctness, to develop high quality software, and to understand why software works as specified. The RCI forms the basis for developing learning outcomes that help educators to assess the adequacy of current techniques and pinpoint necessary improvements. This dissertation contains results from experimentation and assessment over the past few years in multiple CS courses. The results show that the finer principles of mathematical reasoning of software correctness can be taught effectively and continuously improved with the help of the RCI using suitable teaching practices, and supporting methods and tools.
Design of web platform for science and engineering in the model of open market
NASA Astrophysics Data System (ADS)
Demichev, A. P.; Kryukov, A. P.
2016-09-01
This paper presents the design and operation algorithms of a web platform for convenient, secure and effective remote interaction, on open-market principles, between users and providers of scientific application software and databases.
A Mobile-Based E-Learning System
ERIC Educational Resources Information Center
Ojokoh, Bolanle Adefowoke; Doyeni, Olubimtan Ayo; Adewale, Olumide Sunday; Isinkaye, Folasade Olubusola
2013-01-01
E-learning is an innovative approach for delivering electronically mediated, well-designed, learner-centred interactive learning environments by utilizing internet and digital technologies with respect to instructional design principles. This paper presents the application of Software Development techniques in the development of a Mobile Based…
Measuring and assessing maintainability at the end of high level design
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.
1993-01-01
Software architecture appears to be one of the main factors affecting software maintainability. Therefore, in order to predict and assess maintainability early in the development process, we need to be able to measure the high-level design characteristics that affect the change process. To this end, we propose a measurement approach based on precise assumptions derived from the change process; it builds on object-oriented design principles and is partially language independent. We define metrics for cohesion, coupling, and visibility in order to capture the difficulty of isolating, understanding, designing, and validating changes.
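As a hedged illustration of the kind of design-level metric discussed above (not the authors' exact definitions), the sketch below computes simple fan-in and fan-out coupling counts for each module from a declared dependency map.

```python
# Generic fan-in/fan-out coupling counts over a module dependency map.
# The metric definitions here are illustrative, not the paper's metrics.
from collections import defaultdict

DEPENDS_ON = {
    "gui":         {"control", "model"},
    "control":     {"model", "persistence"},
    "model":       set(),
    "persistence": {"model"},
}

def coupling(dep_map):
    fan_out = {m: len(deps) for m, deps in dep_map.items()}
    fan_in = defaultdict(int)
    for m, deps in dep_map.items():
        for d in deps:
            fan_in[d] += 1
    return {m: {"fan_out": fan_out[m], "fan_in": fan_in[m]} for m in dep_map}

if __name__ == "__main__":
    for module, counts in coupling(DEPENDS_ON).items():
        print(module, counts)
```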
NASA Technical Reports Server (NTRS)
Helfrich, Reinhard
1987-01-01
The concepts of software engineering which allow a user of the finite element method to describe a model, to collect and check the model data in a data base, and to form the matrices required for a finite element calculation are examined. Next, the components of the model description are presented, including the mesh tree, the topology, the configuration, the kinematic boundary conditions, the data for each element, and the loads. The possibilities for description and review of the data are considered. The concept of segments for the modularization of the programs follows the components of the model description. The significance of the mesh tree as a global guiding structure is explained in view of the principle of the unity of the model, the mesh tree, and the data base. The user-friendly aspects of the software system are summarized: the principle of language communication, the data generators, error processing, and data security.
Toward More Critical Reviewing and Analysis of CD-ROM User Software Interfaces.
ERIC Educational Resources Information Center
Zink, Steven D.
1991-01-01
Criticizes reviews of library CD-ROM products as being uncritical of the user interface and advocates a more rigorous evaluation, not only to aid potential buyers, but as a way to influence manufacturers. Congressional Information Services' Masterfile 2 is evaluated in the context of Heckel's "Principles of Friendly Software Design." (24…
Robot-operated quality control station based on the UTT method
NASA Astrophysics Data System (ADS)
Burghardt, Andrzej; Kurc, Krzysztof; Szybicki, Dariusz; Muszyńska, Magdalena; Nawrocki, Jacek
2017-03-01
This paper presents a robotic test stand for the ultrasonic transmission tomography (UTT) inspection of stator vane thickness. The article presents the method of the test stand design in Autodesk Robot Structural Analysis Professional 2013 software suite. The performance of the designed test stand solution was simulated in the RobotStudio software suite. The operating principle of the test stand measurement system is presented with a specific focus on the measurement strategy. The results of actual wall thickness measurements performed on stator vanes are presented.
A high order approach to flight software development and testing
NASA Technical Reports Server (NTRS)
Steinbacher, J.
1981-01-01
The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.
Wilson, Anna J; Dehaene, Stanislas; Pinel, Philippe; Revkin, Susannah K; Cohen, Laurent; Cohen, David
2006-05-30
Adaptive game software has been successful in remediation of dyslexia. Here we describe the cognitive and algorithmic principles underlying the development of similar software for dyscalculia. Our software is based on current understanding of the cerebral representation of number and the hypotheses that dyscalculia is due to a "core deficit" in number sense or in the link between number sense and symbolic number representations. "The Number Race" software trains children on an entertaining numerical comparison task, by presenting problems adapted to the performance level of the individual child. We report full mathematical specifications of the algorithm used, which relies on an internal model of the child's knowledge in a multidimensional "learning space" consisting of three difficulty dimensions: numerical distance, response deadline, and conceptual complexity (from non-symbolic numerosity processing to increasingly complex symbolic operations). The performance of the software was evaluated both by mathematical simulations and by five weeks of use by nine children with mathematical learning difficulties. The results indicate that the software adapts well to varying levels of initial knowledge and learning speeds. Feedback from children, parents and teachers was positive. A companion article describes the evolution of number sense and arithmetic scores before and after training. The software, open-source and freely available online, is designed for learning disabled children aged 5-8, and may also be useful for general instruction of normal preschool children. The learning algorithm reported is highly general, and may be applied in other domains.
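The sketch below illustrates the general idea described above of adapting problem difficulty within a multidimensional learning space: the estimate of the child's level steps up after correct answers and down after errors on each dimension. The update rule and the encoding of the dimensions are simplified assumptions, not the published algorithm.

```python
# Simplified adaptive-difficulty loop over three difficulty dimensions in [0, 1].
# The update rule is an illustration, not the published "Number Race" algorithm.
import random

class LearningSpace:
    def __init__(self):
        self.level = {"distance": 0.1, "deadline": 0.1, "complexity": 0.1}

    def next_problem(self):
        """Sample a problem near the current estimated level on each dimension."""
        return {k: min(1.0, max(0.0, v + random.uniform(-0.05, 0.05)))
                for k, v in self.level.items()}

    def update(self, correct: bool):
        """Step the estimate toward harder settings on success, easier on failure."""
        step = 0.05 if correct else -0.08
        for k in self.level:
            self.level[k] = min(1.0, max(0.0, self.level[k] + step))

if __name__ == "__main__":
    space = LearningSpace()
    for _ in range(20):
        space.next_problem()
        space.update(correct=random.random() < 0.75)   # simulated child performance
    print(space.level)
```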
Calculating phase diagrams using PANDAT and panengine
NASA Astrophysics Data System (ADS)
Chen, S.-L.; Zhang, F.; Xie, F.-Y.; Daniel, S.; Yan, X.-Y.; Chang, Y. A.; Schmid-Fetzer, R.; Oates, W. A.
2003-12-01
Knowledge of phase equilibria or phase diagrams and thermodynamic properties is important in alloy design and materials-processing simulation. In principle, stable phase equilibrium is uniquely determined by the thermodynamic properties of the system, such as the Gibbs energy functions of the phases. PANDAT, a new computer software package for multicomponent phase-diagram calculation, was developed under the guidance of this principle.
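As a hedged illustration of the principle stated above (equilibrium follows from minimizing the total Gibbs energy), the sketch below finds the two-phase equilibrium of a toy binary system with invented parabolic Gibbs energy curves. It assumes SciPy is available and is in no way the PANDAT/PanEngine implementation.

```python
# Toy two-phase binary equilibrium by direct Gibbs energy minimization.
# Gibbs energy curves are invented parabolas; not PANDAT/PanEngine.
from scipy.optimize import minimize

def g_alpha(x):   # Gibbs energy of phase alpha vs. composition x of component B
    return 10.0 * (x - 0.2) ** 2

def g_beta(x):    # Gibbs energy of phase beta
    return 10.0 * (x - 0.8) ** 2 + 0.5

def equilibrium(x_overall=0.5):
    def total_g(v):
        f, xa, xb = v              # fraction of alpha, composition of alpha, composition of beta
        return f * g_alpha(xa) + (1 - f) * g_beta(xb)
    mass_balance = {"type": "eq",
                    "fun": lambda v: v[0] * v[1] + (1 - v[0]) * v[2] - x_overall}
    res = minimize(total_g, x0=[0.5, 0.3, 0.7],
                   bounds=[(0, 1)] * 3, constraints=[mass_balance])
    return res.x

if __name__ == "__main__":
    f, xa, xb = equilibrium(0.5)
    print(f"alpha fraction {f:.2f}, x_alpha {xa:.2f}, x_beta {xb:.2f}")
```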
Flexibility Requirements concerning the Design of Synchronous E-Learning Systems
ERIC Educational Resources Information Center
Jahn, Matthias; Piesche, Claudia; Jablonski, Stefan
2012-01-01
Purpose: Today's requirements concerning successful learning support comprise a variety of application scenarios. Therefore, the development of supporting software preferably aims at modular design. This article discusses requirements regarding flexibility of e-learning systems and presents important principles, which should be met by successful…
Rezaei-Hachesu, Peyman; Pesianian, Esmaeil; Mohammadian, Mohsen
2016-02-01
A radiology information system (RIS) must be well designed in order to reduce workload and improve the quality of services. Heuristic evaluation is one of the methods that uncovers usability problems with the least time, cost and resources. The aim of the present study is to evaluate the usability of RISs in hospitals. This is a cross-sectional descriptive study (2015) that uses the heuristic evaluation method to evaluate the usability of the RIS used in 3 hospitals of Tabriz city. The data were collected using a standard checklist based on the 13 principles of Nielsen's heuristic evaluation method. Usability of the RISs was investigated based on the number of components observed from Nielsen's principles, and usability problems were identified from the number of non-observed components as well as non-existent or unrecognizable components. By evaluating the RISs in hospitals 1, 2 and 3, the total numbers of observed components were 173, 202 and 196, respectively. It was concluded that, with an average of 190 of the 291 components related to Nielsen's 13 principles observed, the usability of the RISs in the studied population is 65.41%. Furthermore, usability problems amounted to 26.35%. The established and visible nature of some components, such as application response time, visual feedback, colors, and the view, design and arrangement of software objects, draws more attention to them as principal components in UI design. Also, incorrect analysis before system design leads to a lack of attention to secondary needs such as software help and security issues.
Towards a general object-oriented software development methodology
NASA Technical Reports Server (NTRS)
Seidewitz, ED; Stark, Mike
1986-01-01
An object is an abstract software model of a problem domain entity. Objects are packages of both data and the operations on that data (Goldberg 83, Booch 83). The Ada (tm) package construct is representative of this general notion of an object. Object-oriented design is the technique of using objects as the basic unit of modularity in systems design. The Software Engineering Laboratory at the Goddard Space Flight Center is currently involved in a pilot program to develop a flight dynamics simulator in Ada (approximately 40,000 statements) using object-oriented methods. Several authors have applied object-oriented concepts to Ada (e.g., Booch 83, Cherry 85). It was found that these methodologies are limited. As a result, a more general approach was synthesized which allows a designer to apply powerful object-oriented principles to a wide range of applications and at all stages of design. An overview is provided of this approach. Further, how object-oriented design fits into the overall software life-cycle is considered.
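The following minimal Python sketch illustrates the notion above that an object packages data together with the operations on that data, analogous to an Ada package exporting operations while hiding state. The orbit-state example and its straight-line propagation are invented for illustration.

```python
# An object as a package of data plus the operations on that data.
# The orbit-state example is invented for illustration.
class OrbitState:
    """Hides its state vector; clients interact only through exported operations."""
    def __init__(self, position_km, velocity_kms):
        self._position = list(position_km)     # encapsulated data
        self._velocity = list(velocity_kms)

    def propagate(self, dt_s: float) -> None:
        """Operation exported by the object: advance the state (straight-line toy model)."""
        self._position = [p + v * dt_s for p, v in zip(self._position, self._velocity)]

    def position(self):
        return tuple(self._position)

if __name__ == "__main__":
    state = OrbitState([7000.0, 0.0, 0.0], [0.0, 7.5, 0.0])
    state.propagate(10.0)
    print(state.position())
```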
Hufnagel, S; Harbison, K; Silva, J; Mettala, E
1994-01-01
This paper describes a new method for the evolutionary determination of user requirements and system specifications called scenario-based engineering process (SEP). Health care professional workstations are critical components of large scale health care system architectures. We suggest that domain-specific software architectures (DSSAs) be used to specify standard interfaces and protocols for reusable software components throughout those architectures, including workstations. We encourage the use of engineering principles and abstraction mechanisms. Engineering principles are flexible guidelines, adaptable to particular situations. Abstraction mechanisms are simplifications for management of complexity. We recommend object-oriented design principles, graphical structural specifications, and formal components' behavioral specifications. We give an ambulatory care scenario and associated models to demonstrate SEP. The scenario uses health care terminology and gives patients' and health care providers' system views. Our goal is to have a threefold benefit. (i) Scenario view abstractions provide consistent interdisciplinary communications. (ii) Hierarchical object-oriented structures provide useful abstractions for reuse, understandability, and long term evolution. (iii) SEP and health care DSSA integration into computer aided software engineering (CASE) environments. These environments should support rapid construction and certification of individualized systems, from reuse libraries.
Reference software implementation for GIFTS ground data processing
NASA Astrophysics Data System (ADS)
Garcia, R. K.; Howell, H. B.; Knuteson, R. O.; Martin, G. D.; Olson, E. R.; Smuga-Otto, M. J.
2006-08-01
Future satellite weather instruments such as high spectral resolution imaging interferometers pose a challenge to the atmospheric science and software development communities due to the immense data volumes they will generate. An open-source, scalable reference software implementation demonstrating the calibration of radiance products from an imaging interferometer, the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS), is presented. This paper covers essential design principles laid out in summary system diagrams, lessons learned during implementation and preliminary test results from the GIFTS Information Processing System (GIPS) prototype.
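As a hedged sketch of the kind of processing such a calibration pipeline performs, the code below applies a generic two-point (hot/cold blackbody) radiometric calibration, interpolating scene counts between two reference views. It is not the GIFTS/GIPS implementation, and the numbers are invented.

```python
# Generic two-point radiometric calibration between hot and cold blackbody views.
# Illustrative only; not the GIPS implementation, and all values are invented.
import numpy as np

def calibrate(scene_counts, cold_counts, hot_counts, rad_cold, rad_hot):
    """Map raw scene counts to radiance using two blackbody reference views."""
    gain = (rad_hot - rad_cold) / (hot_counts - cold_counts)
    return rad_cold + gain * (scene_counts - cold_counts)

if __name__ == "__main__":
    scene = np.array([120.0, 150.0, 130.0])   # raw scene counts per channel
    cold = np.array([100.0, 110.0, 105.0])
    hot = np.array([200.0, 220.0, 210.0])
    print(calibrate(scene, cold, hot, rad_cold=20.0, rad_hot=95.0))
```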
A CS1 Pedagogical Approach to Parallel Thinking
ERIC Educational Resources Information Center
Rague, Brian William
2010-01-01
Almost all collegiate programs in Computer Science offer an introductory course in programming primarily devoted to communicating the foundational principles of software design and development. The ACM designates this introduction to computer programming course for first-year students as CS1, during which methodologies for solving problems within…
The Implementation of Satellite Attitude Control System Software Using Object Oriented Design
NASA Technical Reports Server (NTRS)
Reid, W. Mark; Hansell, William; Phillips, Tom; Anderson, Mark O.; Drury, Derek
1998-01-01
NASA established the Small Explorer (SMEX) program in 1988 to provide frequent opportunities for highly focused and relatively inexpensive space science missions. The SMEX program has produced five satellites, three of which have been successfully launched. The remaining two spacecraft are scheduled for launch within the coming year. NASA has recently developed a prototype for the next generation Small Explorer spacecraft (SMEX-Lite). This paper describes the object-oriented design (OOD) of the SMEX-Lite Attitude Control System (ACS) software. The SMEX-Lite ACS is three-axis controlled and is capable of performing sub-arc-minute pointing. This paper first describes high level requirements governing the SMEX-Lite ACS software architecture. Next, the context in which the software resides is explained. The paper describes the principles of encapsulation, inheritance, and polymorphism with respect to the implementation of an ACS software system. This paper will also discuss the design of several ACS software components. Specifically, object-oriented designs are presented for sensor data processing, attitude determination, attitude control, and failure detection. Finally, this paper will address the establishment of the ACS Foundation Class (AFC) Library. The AFC is a large software repository, requiring a minimal amount of code modifications to produce ACS software for future projects.
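To illustrate the inheritance and polymorphism mentioned above, here is a minimal Python sketch in which an attitude-determination routine is written against an abstract sensor interface. The class names, measurement values, and the trivial "averaging" placeholder are invented; this is not the SMEX-Lite design.

```python
# Polymorphic sensor components feeding a single attitude-determination routine.
# Names, values, and the placeholder math are invented for illustration.
from abc import ABC, abstractmethod

class Sensor(ABC):
    @abstractmethod
    def read_body_vector(self):
        """Return a reference direction measured in body coordinates."""

class SunSensor(Sensor):
    def read_body_vector(self):
        return (1.0, 0.0, 0.02)

class Magnetometer(Sensor):
    def read_body_vector(self):
        return (0.1, 0.9, 0.4)

def determine_attitude(sensors):
    """Placeholder 'determination': average the sensor vectors. Real ACS code would
    run a proper estimator here; the point is that it only sees the Sensor interface."""
    n = len(sensors)
    vectors = [s.read_body_vector() for s in sensors]
    return tuple(sum(v[i] for v in vectors) / n for i in range(3))

if __name__ == "__main__":
    print(determine_attitude([SunSensor(), Magnetometer()]))
```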
Analyzing qualitative data with computer software.
Weitzman, E A
1999-01-01
OBJECTIVE: To provide health services researchers with an overview of the qualitative data analysis process and the role of software within it; to provide a principled approach to choosing among software packages to support qualitative data analysis; to alert researchers to the potential benefits and limitations of such software; and to provide an overview of the developments to be expected in the field in the near future. DATA SOURCES, STUDY DESIGN, METHODS: This article does not include reports of empirical research. CONCLUSIONS: Software for qualitative data analysis can benefit the researcher in terms of speed, consistency, rigor, and access to analytic methods not available by hand. Software, however, is not a replacement for methodological training. PMID:10591282
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ouyang, Lizhi
Advanced Ultra Supercritical Boiler (AUSC) technology requires materials that can operate in a corrosive environment at temperatures and pressures as high as 760°C (or 1400°F) and 5000 psi, respectively, while at the same time maintaining good ductility at low temperature. We develop automated simulation software tools to enable fast, large-scale screening studies of candidate designs. While direct evaluation of creep rupture strength and ductility is currently not feasible, properties such as energy, elastic constants, surface energy, interface energy, and stacking fault energy can be used to assess relative ductility and creep strength. We implemented software to automate the complex calculations and minimize human input in the tedious screening studies, which involve model structure generation, settings for first-principles calculations, and results analysis and reporting. The software developed in the project, and the library of computed mechanical properties of phases found in ferritic steels (many of them complex solid solutions estimated for the first time), will certainly help the development of low-cost ferritic steel for AUSC.
Design and implementation of telephone dialer based on Arduino
NASA Astrophysics Data System (ADS)
Ma, Zilong; Lei, Ying
2017-03-01
This paper introduces a system design scheme for a telephone dialer based on Arduino, including the design principle, the hardware and software design, and the experimental results. The scheme is based on the dual-tone multi-frequency (DTMF) dialing mode, using the Arduino UNO as the main controller: the serial port sends the telephone number to be dialed, and a speaker outputs the synthesized audio.
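A minimal Python sketch of the DTMF principle itself: each key is encoded as the sum of one low-group and one high-group sine tone, using the standard DTMF frequency table. This illustrates the dialing principle only; it is not the Arduino firmware from the paper.

```python
# Generate DTMF samples for a digit: sum of one low-group and one high-group tone.
# Standard DTMF frequency table; the sample generation is generic, not the paper's firmware.
import math

LOW = {"1": 697, "2": 697, "3": 697, "4": 770, "5": 770, "6": 770,
       "7": 852, "8": 852, "9": 852, "*": 941, "0": 941, "#": 941}
HIGH = {"1": 1209, "2": 1336, "3": 1477, "4": 1209, "5": 1336, "6": 1477,
        "7": 1209, "8": 1336, "9": 1477, "*": 1209, "0": 1336, "#": 1477}

def dtmf_samples(digit: str, duration_s=0.2, rate_hz=8000):
    """Return raw audio samples (floats in [-1, 1]) encoding one keypress."""
    f_lo, f_hi = LOW[digit], HIGH[digit]
    n = int(duration_s * rate_hz)
    return [0.5 * math.sin(2 * math.pi * f_lo * i / rate_hz)
            + 0.5 * math.sin(2 * math.pi * f_hi * i / rate_hz) for i in range(n)]

if __name__ == "__main__":
    samples = dtmf_samples("5")
    print(len(samples), samples[:4])
```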
Automatic design of magazine covers
NASA Astrophysics Data System (ADS)
Jahanian, Ali; Liu, Jerry; Tretter, Daniel R.; Lin, Qian; Damera-Venkata, Niranjan; O'Brien-Strain, Eamonn; Lee, Seungyon; Fan, Jian; Allebach, Jan P.
2012-03-01
In this paper, we propose a system for automatic design of magazine covers that quantifies a number of concepts from art and aesthetics. Our solution to automatic design of this type of media has been shaped by input from professional designers, magazine art directors and editorial boards, and journalists. Consequently, a number of principles in design and rules in designing magazine covers are delineated. Several techniques are derived and employed in order to quantify and implement these principles and rules in the format of a software framework. At this stage, our framework divides the task of design into three main modules: layout of magazine cover elements, choice of color for masthead and cover lines, and typography of cover lines. Feedback from professional designers on our designs suggests that our results are congruent with their intuition.
Wilson, Anna J; Dehaene, Stanislas; Pinel, Philippe; Revkin, Susannah K; Cohen, Laurent; Cohen, David
2006-01-01
Background Adaptive game software has been successful in remediation of dyslexia. Here we describe the cognitive and algorithmic principles underlying the development of similar software for dyscalculia. Our software is based on current understanding of the cerebral representation of number and the hypotheses that dyscalculia is due to a "core deficit" in number sense or in the link between number sense and symbolic number representations. Methods "The Number Race" software trains children on an entertaining numerical comparison task, by presenting problems adapted to the performance level of the individual child. We report full mathematical specifications of the algorithm used, which relies on an internal model of the child's knowledge in a multidimensional "learning space" consisting of three difficulty dimensions: numerical distance, response deadline, and conceptual complexity (from non-symbolic numerosity processing to increasingly complex symbolic operations). Results The performance of the software was evaluated both by mathematical simulations and by five weeks of use by nine children with mathematical learning difficulties. The results indicate that the software adapts well to varying levels of initial knowledge and learning speeds. Feedback from children, parents and teachers was positive. A companion article [1] describes the evolution of number sense and arithmetic scores before and after training. Conclusion The software, open-source and freely available online, is designed for learning disabled children aged 5–8, and may also be useful for general instruction of normal preschool children. The learning algorithm reported is highly general, and may be applied in other domains. PMID:16734905
Reengineering legacy software to object-oriented systems
NASA Technical Reports Server (NTRS)
Pitman, C.; Braley, D.; Fridge, E.; Plumb, A.; Izygon, M.; Mears, B.
1994-01-01
NASA has a legacy of complex software systems that are becoming increasingly expensive to maintain. Reengineering is one approach to modernizing these systems. Object-oriented technology, other modern software engineering principles, and automated tools can be used to reengineer the systems and will help to keep maintenance costs of the modernized systems down. The Software Technology Branch at the NASA/Johnson Space Center has been developing and testing reengineering methods and tools for several years. The Software Technology Branch is currently providing training and consulting support to several large reengineering projects at JSC, including the Reusable Objects Software Environment (ROSE) project, which is reengineering the flight analysis and design system (over 2 million lines of FORTRAN code) into object-oriented C++. Many important lessons have been learned during the past years; one of these is that the design must never be allowed to diverge from the code during maintenance and enhancement. Future work on open, integrated environments to support reengineering is being actively planned.
Evaluation of simSchool: An Instructional Simulation for Pre-Service Teachers
ERIC Educational Resources Information Center
Deale, Deb; Pastore, Ray
2014-01-01
This study uses theory-based design principles to evaluate the effectiveness of an instructional simulation, simSchool. It begins by examining the simulation and evaluation literature, followed by an evaluation of the simSchool software. It is a Web-based simulation designed to emulate various students (reactions) in order to provide practice for…
Development for equipment of the milk macromolecules content detection
NASA Astrophysics Data System (ADS)
Ding, Guochao; Li, Weimin; Shang, Tingyi; Xi, Yang; Gao, Yunli; Zhou, Zhen
We developed an experimental device for rapid and accurate detection of milk macromolecule content. The device is based on the principle of laser light scattering: the ratio of scattered light to transmitted light is used to characterize the macromolecule content of the ingredients. A peristaltic pump provides automatic input and output of the milk samples, and a weak-signal detection amplifier circuit built around the ICL7650 measures the ratio. The real-time operating system μC/OS-II is the core of the software design of the whole system. Experimental data show that the device achieves fast, real-time measurement of milk macromolecules.
Verifying Architectural Design Rules of the Flight Software Product Line
NASA Technical Reports Server (NTRS)
Ganesan, Dharmalingam; Lindvall, Mikael; Ackermann, Chris; McComas, David; Bartholomew, Maureen
2009-01-01
This paper presents experiences of verifying architectural design rules of the NASA Core Flight Software (CFS) product line implementation. The goal of the verification is to check whether the implementation is consistent with the CFS architectural rules derived from the developer's guide. The results indicate that consistency checking helps a) identifying architecturally significant deviations that were eluded during code reviews, b) clarifying the design rules to the team, and c) assessing the overall implementation quality. Furthermore, it helps connecting business goals to architectural principles, and to the implementation. This paper is the first step in the definition of a method for analyzing and evaluating product line implementations from an architecture-centric perspective.
Strategies for a Creative Future with Computer Science, Quality Design and Communicability
NASA Astrophysics Data System (ADS)
Cipolla Ficarra, Francisco V.; Villarreal, Maria
The current work presents the importance of the two-way triad between computer science, design and communicability. It demonstrates how the quality principles of software engineering are not universal, since they are disappearing from university training. Besides, a short analysis of the term "creativity" makes apparent the existence of plagiarism as a human factor that damages the future of communicability applied to the on-line and off-line contents of open software. A set of measures and guidelines is presented so that the triad works correctly again in the coming years, to foster the qualitative design of interactive systems on-line and/or off-line.
Towards a general object-oriented software development methodology
NASA Technical Reports Server (NTRS)
Seidewitz, ED; Stark, Mike
1986-01-01
Object diagrams were used to design a 5,000-statement team training exercise and to design the entire dynamics simulator. The object diagrams are also being used to design another 50,000-statement Ada system and a personal computer based system that will be written in Modula II. The design methodology evolves out of these experiences as well as the limitations of other methods that were studied. Object diagrams, abstraction analysis, and associated principles provide a unified framework which encompasses concepts from Yourdon, Booch, and Cherry. This general object-oriented approach handles high-level system design, possibly with concurrency, through object-oriented decomposition down to a completely functional level. How object-oriented concepts can be used in other phases of the software life-cycle, such as specification and testing, is being studied concurrently.
An Ada Object Oriented Missile Flight Simulation
1991-09-01
This thesis uses the Ada programming language in the design and development of an air-to-air missile flight simulation with...object oriented techniques and sound software engineering principles. The simulation is designed to be more understandable, modifiable, efficient and...
Software design of a remote real-time ECG monitoring system
NASA Astrophysics Data System (ADS)
Yu, Chengbo; Tao, Hongyan
2005-12-01
Heart disease is one of the main diseases that threaten the health and lives of human beings. At present, typical remote ECG monitoring systems have the disadvantages of a short testing distance and a limited number of monitoring lines. Because of accidents and paroxysmal disease, ECG monitoring has extended from the hospital to the family; remote ECG monitoring through the Internet therefore has real value and significance. The principle and software design method of a remote dynamic ECG monitor are presented and discussed. The monitoring software is programmed in Delphi and based on a client-server interactive mode. The application program, which makes use of multithreading technology, is shown to perform in an excellent manner. The program handles remote client connections and ECG processing, i.e. receiving, real-time display, recording and replay of ECG data. The system can connect many clients simultaneously and perform real-time monitoring of patients.
A virtual environment for medical radiation collaborative learning.
Bridge, Pete; Trapp, Jamie V; Kastanis, Lazaros; Pack, Darren; Parker, Jacqui C
2015-06-01
A software-based environment was developed to provide practical training in medical radiation principles and safety. The Virtual Radiation Laboratory application allowed students to conduct virtual experiments using simulated diagnostic and radiotherapy X-ray generators. The experiments were designed to teach students about the inverse square law, half value layer and radiation protection measures and utilised genuine clinical and experimental data. Evaluation of the application was conducted in order to ascertain the impact of the software on students' understanding, satisfaction and collaborative learning skills and also to determine potential further improvements to the software and guidelines for its continued use. Feedback was gathered via an anonymous online survey consisting of a mixture of Likert-style questions and short answer open questions. Student feedback was highly positive with 80 % of students reporting increased understanding of radiation protection principles. Furthermore 72 % enjoyed using the software and 87 % of students felt that the project facilitated collaboration within small groups. The main themes arising in the qualitative feedback comments related to efficiency and effectiveness of teaching, safety of environment, collaboration and realism. Staff and students both report gains in efficiency and effectiveness associated with the virtual experiments. In addition students particularly value the visualisation of "invisible" physical principles and increased opportunity for experimentation and collaborative problem-based learning. Similar ventures will benefit from adopting an approach that allows for individual experimentation while visualizing challenging concepts.
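The physics behind two of the virtual experiments is standard and easy to illustrate. A short, hedged sketch (not the application's own code) of the inverse square law and half value layer (HVL) attenuation, with arbitrary example values:

```python
# Illustration of two radiation principles taught by the virtual laboratory:
# the inverse square law and half value layer (HVL) attenuation.

def inverse_square(dose_rate_at_d0: float, d0: float, d: float) -> float:
    """Dose rate at distance d, given the rate at reference distance d0."""
    return dose_rate_at_d0 * (d0 / d) ** 2

def attenuated(dose_rate: float, thickness: float, hvl: float) -> float:
    """Dose rate after an absorber of the given thickness (same units as hvl)."""
    return dose_rate * 0.5 ** (thickness / hvl)
    # equivalently: dose_rate * exp(-ln(2) * thickness / hvl)

print(inverse_square(100.0, d0=1.0, d=2.0))       # 25.0 -- doubling distance quarters the rate
print(attenuated(100.0, thickness=6.0, hvl=3.0))  # 25.0 -- two HVLs halve the rate twice
```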
Software for Fermat's Principle and Lenses
ERIC Educational Resources Information Center
Mihas, Pavlos
2012-01-01
Fermat's principle is considered as a unifying concept. It is usually presented erroneously as a "least time principle". In this paper we present some software that shows cases of maxima and minima and the application of Fermat's principle to the problem of focusing in lenses. (Contains 12 figures.)
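As a hedged numerical illustration (not the software described in the paper), Fermat's principle for refraction can be shown by minimizing the travel time over the crossing point on an interface; the minimizing point reproduces Snell's law. The geometry and refractive indices below are arbitrary example values:

```python
# Fermat's principle illustration: light travels from A in medium 1 (index n1)
# to B in medium 2 (index n2); the crossing point x on the interface y = 0
# minimizes the travel time. Parameters are illustrative only.
import math

C = 3.0e8  # speed of light in vacuum, m/s

def travel_time(x, a=1.0, b=1.0, d=2.0, n1=1.0, n2=1.5):
    """Time from A=(0, a) to B=(d, -b), crossing the interface at (x, 0)."""
    t1 = n1 * math.hypot(x, a) / C
    t2 = n2 * math.hypot(d - x, b) / C
    return t1 + t2

# Brute-force minimization over candidate crossing points.
xs = [i * 2.0 / 10000 for i in range(10001)]
x_best = min(xs, key=travel_time)

# Check Snell's law at the minimizing point: n1*sin(theta1) == n2*sin(theta2).
sin1 = x_best / math.hypot(x_best, 1.0)
sin2 = (2.0 - x_best) / math.hypot(2.0 - x_best, 1.0)
print(x_best, 1.0 * sin1, 1.5 * sin2)  # the last two values nearly agree
```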
Design of virtual SCADA simulation system for pressurized water reactor
NASA Astrophysics Data System (ADS)
Wijaksono, Umar; Abdullah, Ade Gafar; Hakim, Dadang Lukman
2016-02-01
The virtual SCADA system is a software-based Human-Machine Interface that can visualize the process of a plant. This paper describes the results of a virtual SCADA system design that aims to demonstrate the operating principle of a pressurized water reactor nuclear power plant. The simulation uses technical data of Nuclear Power Plant Unit Olkiluoto 3 in Finland. The system was developed using Wonderware InTouch and is equipped with manual books for each component, animation links, alarm systems, real-time and historical trending, and a security system. The results showed that, in general, the system can clearly demonstrate the principles of energy flow and energy conversion in pressurized water reactors. This virtual SCADA simulation system can be used as instructional media for learning the operating principle of a pressurized water reactor.
Design of a Single Motor Based Leg Structure with the Consideration of Inherent Mechanical Stability
NASA Astrophysics Data System (ADS)
Taha Manzoor, Muhammad; Sohail, Umer; Noor-e-Mustafa; Nizami, Muhammad Hamza Asif; Ayaz, Yasar
2017-07-01
The fundamental aspect of designing a legged robot is constructing a leg design that is robust and presents a simple control problem. In this paper, we have successfully designed a robotic leg based on a unique four bar mechanism with only one motor per leg. The leg design parameters used in our platform are extracted from design principles used in biological systems, multiple iterations and previous research findings. These principles guide a robotic leg to have minimal mechanical passive impedance, low leg mass and inertia, a suitable foot trajectory utilizing a practical balance between leg kinematics and robot usage, and the resultant inherent mechanical stability. The designed platform also exhibits the key feature of self-locking. Theoretical tools and software iterations were used to derive these practical features and yield an intuitive sense of the required leg design parameters.
NASA Technical Reports Server (NTRS)
Chow, Edward; Spence, Matthew Chew; Pell, Barney; Stewart, Helen; Korsmeyer, David; Liu, Joseph; Chang, Hsin-Ping; Viernes, Conan; Gogorth, Andre
2003-01-01
This paper discusses the challenges and security issues inherent in building complex cross-organizational collaborative projects and software systems within NASA. By applying the design principles of compartmentalization, organizational hierarchy and inter-organizational federation, the Secured Advanced Federated Environment (SAFE) is laying the foundation for a collaborative virtual infrastructure for the NASA community. A key element of SAFE is the Micro Security Domain (MSD) concept, which balances the need to collaborate and the need to enforce enterprise and local security rules. With the SAFE approach, security is an integral component of enterprise software and network design, not an afterthought.
Competitive Phylogenetics: A Laboratory Exercise
ERIC Educational Resources Information Center
McCabe, Declan J.
2014-01-01
This exercise demonstrates the principle of parsimony in constructing cladograms. Although it is designed using mammalian cranial characters, the activity could be adapted for characters from any group of organisms. Students score categorical traits on skulls and record the data in a spreadsheet. Using the Mesquite software package, students…
Agile Learning: Sprinting through the Semester
ERIC Educational Resources Information Center
Lang, Guido
2017-01-01
This paper introduces agile learning, a novel pedagogical approach that applies the processes and principles of agile software development to the context of learning. Agile learning is characterized by short project cycles, called sprints, in which a usable deliverable is fully planned, designed, built, tested, reviewed, and launched. An…
The ORT Open Tech Robotics and Automation Literacy Course.
ERIC Educational Resources Information Center
Sharon, Dan; And Others
1987-01-01
Presents an overview of a course on robotics and automation developed by the Organization for Rehabilitation through Training (ORT) to be offered through an open learning environment in the United Kingdom. Highlights include hardware and software requirements, an educational model, design principles, and future developments. (LRW)
"Extreme Programming" in a Bioinformatics Class
ERIC Educational Resources Information Center
Kelley, Scott; Alger, Christianna; Deutschman, Douglas
2009-01-01
The importance of Bioinformatics tools and methodology in modern biological research underscores the need for robust and effective courses at the college level. This paper describes such a course designed on the principles of cooperative learning based on a computer software industry production model called "Extreme Programming" (EP).…
ERIC Educational Resources Information Center
Virtue, Alicia; Dean, Ellen; Matheson, Molly
2014-01-01
More and more of today's scholars conduct their research in a digital realm rather than using a print collection. The University of Arizona Libraries Guide on the Side tutorial software offers an opportunity to apply the principles of active learning with real world research scenarios. This paper reports on the design and introduction of…
Integrating MPI and deduplication engines: a software architecture roadmap.
Baksi, Dibyendu
2009-03-01
The objective of this paper is to clarify the major concepts related to architecture and design of patient identity management software systems so that an implementor looking to solve a specific integration problem in the context of a Master Patient Index (MPI) and a deduplication engine can address the relevant issues. The ideas presented are illustrated in the context of a reference use case from Integrating the Health Enterprise Patient Identifier Cross-referencing (IHE PIX) profile. Sound software engineering principles using the latest design paradigm of model driven architecture (MDA) are applied to define different views of the architecture. The main contribution of the paper is a clear software architecture roadmap for implementors of patient identity management systems. Conceptual design in terms of static and dynamic views of the interfaces is provided as an example of platform independent model. This makes the roadmap applicable to any specific solutions of MPI, deduplication library or software platform. Stakeholders in need of integration of MPIs and deduplication engines can evaluate vendor specific solutions and software platform technologies in terms of fundamental concepts and can make informed decisions that preserve investment. This also allows freedom from vendor lock-in and the ability to kick-start integration efforts based on a solid architecture.
Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G
2013-01-16
Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
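The paper's actual acceptance criteria and code are not reproduced here; as a hedged sketch of the unit-testing practice it advocates, a QC calculation such as percent deviation from a target value can be covered by simple automated tests (the 20% tolerance is a placeholder, not the paper's rule):

```python
# Illustration of unit-testing a QC calculation (thresholds are placeholders,
# not the paper's actual acceptance criteria). Run with: pytest this_file.py
def percent_deviation(measured: float, target: float) -> float:
    """Signed percent deviation of a measured QC result from its target."""
    return 100.0 * (measured - target) / target

def passes_qc(measured: float, target: float, tolerance_pct: float = 20.0) -> bool:
    """True if the QC result is within the allowed percent deviation."""
    return abs(percent_deviation(measured, target)) <= tolerance_pct

def test_passes_within_tolerance():
    assert passes_qc(measured=108.0, target=100.0)

def test_fails_outside_tolerance():
    assert not passes_qc(measured=130.0, target=100.0)
```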
Practical Findings from Applying the PSD Model for Evaluating Software Design Specifications
NASA Astrophysics Data System (ADS)
Räisänen, Teppo; Lehto, Tuomas; Oinas-Kukkonen, Harri
This paper presents practical findings from applying the PSD model to evaluating the support for persuasive features in software design specifications for a mobile Internet device. On the one hand, our experiences suggest that the PSD model fits relatively well for evaluating design specifications. On the other hand, the model would benefit from more specific heuristics for evaluating each technique to avoid unnecessary subjectivity. Better distinction between the design principles in the social support category would also make the model easier to use. Practitioners who have no theoretical background can apply the PSD model to increase the persuasiveness of the systems they design. The greatest benefit of the PSD model for researchers designing new systems may be achieved when it is applied together with a sound theory, such as the Elaboration Likelihood Model. Using the ELM together with the PSD model, one may increase the chances for attitude change.
GOATS Image Projection Component
NASA Technical Reports Server (NTRS)
Haber, Benjamin M.; Green, Joseph J.
2011-01-01
When doing mission analysis and design of an imaging system in orbit around the Earth, answering the fundamental question of imaging performance requires an understanding of the image products that will be produced by the imaging system. GOATS software represents a series of MATLAB functions to provide for geometric image projections. Unique features of the software include function modularity, a standard MATLAB interface, easy-to-understand first-principles-based analysis, and the ability to perform geometric image projections of framing type imaging systems. The software modules are created for maximum analysis utility, and can all be used independently for many varied analysis tasks, or used in conjunction with other orbit analysis tools.
A Nuclear Reactions Primer with Computers.
ERIC Educational Resources Information Center
Calle, Carlos I.; Roach, Jennifer A.
1987-01-01
Described is a microcomputer software program NUCLEAR REACTIONS designed for college level students and in use at Sweet Briar College (Sweet Briar, VA). The program is written in Microsoft Basic Version 2.1 for the Apple Macintosh Microcomputer. It introduces two conservation principles: (1) conservation of charge; and (2) conservation of nucleon…
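The two conservation principles are easy to state concretely. A hedged sketch (not the original BASIC program) that checks them for a proposed reaction:

```python
# Check conservation of charge (Z) and nucleon number (A) for a nuclear
# reaction, with each particle given as a (Z, A) pair.
# Example: 14N + 4He -> 17O + 1H (Rutherford's transmutation reaction).
def balanced(reactants, products):
    z_in = sum(z for z, a in reactants)
    a_in = sum(a for z, a in reactants)
    z_out = sum(z for z, a in products)
    a_out = sum(a for z, a in products)
    return z_in == z_out and a_in == a_out

nitrogen_14, alpha = (7, 14), (2, 4)
oxygen_17, proton = (8, 17), (1, 1)
print(balanced([nitrogen_14, alpha], [oxygen_17, proton]))  # True
```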
[Multimag-M magnetotherapy system of the new generation].
Borisov, A G; Grigor'ev, E M; Gurzhin, S G; Zhulev, V I; Kriakov, V G; Proshin, E M
2007-01-01
The Multimag-M microprocessor chronomagnetotherapy system of the new generation is described. The system provides on-line diagnosis of the pulse parameters and the breathing rate during a biotechnical feedback session. The requirements for the system software, as well as its specific features and design principles, are considered.
Optics derotator servo control system for SONG Telescope
NASA Astrophysics Data System (ADS)
Xu, Jin; Ren, Changzhi; Ye, Yu
2012-09-01
The Stellar Oscillations Network Group (SONG) is an initiative which aims at designing and building a ground-based network of 1 m telescopes dedicated to the study of phenomena occurring in the time domain. The Chinese standard node of SONG is a 1 m diameter, F/37 Alt-Az telescope. The optics derotator control system of the SONG telescope adopts the development model of "Industrial Computer + UMAC Motion Controller + Servo Motor". The industrial computer is the core processing part of the motion control, the motion control card (UMAC) handles the details of the motion control, and the servo amplifier accepts control commands from the UMAC and drives the servo motor. Position feedback comes from the encoder, forming a closed-loop control system. This paper describes in detail the hardware and software design of the optics derotator servo control system. In terms of hardware design, the principle, structure, and control algorithm of the derotator servo system are analyzed and explored. In terms of software design, the paper proposes a system software architecture based on object-oriented programming.
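The paper's closed-loop structure (UMAC controller, servo amplifier, encoder feedback) is not reproduced here; a minimal discrete position-loop sketch, with a purely illustrative PI law and first-order plant model, conveys the idea:

```python
# Minimal closed-loop position control sketch for a derotator axis.
# The gains and the first-order plant model are illustrative only; the real
# system uses a UMAC motion controller, servo amplifier and encoder feedback.
def simulate(target_deg=10.0, kp=4.0, ki=0.8, dt=0.01, steps=500):
    position = 0.0   # encoder reading, degrees
    velocity = 0.0   # deg/s
    integral = 0.0
    for _ in range(steps):
        error = target_deg - position
        integral += error * dt
        command = kp * error + ki * integral       # velocity command to the amplifier
        # crude first-order response of motor + load to the velocity command
        velocity += (command - velocity) * dt / 0.05
        position += velocity * dt
    return position

print(simulate())  # settles near the 10-degree target
```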
Application of Human Augmentics: A Persuasive Asthma Inhaler.
Grossman, Brent; Conner, Steve; Mosnaim, Giselle; Albers, Joshua; Leigh, Jason; Jones, Steve; Kenyon, Robert
2017-03-01
This article describes a tailored health intervention delivered on a mobile phone platform, integrating low-literacy design strategies and basic principles of behavior change, to promote increased adherence and asthma control among underserved minority adolescents. We based the intervention and design principles on theories of Human Augmentics and the Elaboration Likelihood Model. We tested the efficacy of using electronic monitoring devices that incorporate informative and persuasive elements to improve adherence to a prescribed daily medication regimen intended to reduce use of asthma rescue medications. We describe the theoretical framework, hardware and software systems, and results of user testing for design purposes and a clinical pilot study incorporating use of the device and software by the targeted population. The results of the clinical pilot study showed an 83% completion rate for the treatment as well as improved adherence. Of note, 8% and 58% of participants achieved clinically significant adherence targets at baseline and last week of the study, respectively. Rescue asthma medication use decreased from a median of 3 puffs per week at baseline to 0 puffs per week during the last week of the study. Copyright © 2017 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Sapp, Wendy
2007-01-01
This article presents the universal design features that were identified during the alpha development of a scheduler software program, known as MySchoolDayOnline, for use in schools, and provides preliminary research on the usability of these features. The study presented here investigated the accessibility and usability of MySchoolDayOnline for…
Human factor implications of the Eurocopter AS332L-1 Super Puma cockpit
NASA Technical Reports Server (NTRS)
Padfield, R. Randall
1993-01-01
The purpose of this paper is to identify and describe some of the human factor problems which can occur in the cockpit of a modern civilian helicopter. After examining specific hardware and software problems in the cockpit design of the Eurocopter (Aerospatiale) AS332L-1 Super Puma, the author proposes several principles that can be used to avoid similar human factors problems in the design of future cockpits. These principles relate to the use and function of warning lights, the design of autopilots in two-pilot aircraft, and the labeling of switches and warning lights, specifically with respect to abbreviations and translations from languages other than English. In the final section of the paper, the author describes current trends in society which he suggests should be taken into consideration when designing future aircraft cockpits.
Software for the Integration of Multiomics Experiments in Bioconductor.
Ramos, Marcel; Schiffer, Lucas; Re, Angela; Azhar, Rimsha; Basunia, Azfar; Rodriguez, Carmen; Chan, Tiffany; Chapman, Phil; Davis, Sean R; Gomez-Cabrero, David; Culhane, Aedin C; Haibe-Kains, Benjamin; Hansen, Kasper D; Kodali, Hanish; Louis, Marie S; Mer, Arvind S; Riester, Markus; Morgan, Martin; Carey, Vince; Waldron, Levi
2017-11-01
Multiomics experiments are increasingly commonplace in biomedical research and add layers of complexity to experimental design, data integration, and analysis. R and Bioconductor provide a generic framework for statistical analysis and visualization, as well as specialized data classes for a variety of high-throughput data types, but methods are lacking for integrative analysis of multiomics experiments. The MultiAssayExperiment software package, implemented in R and leveraging Bioconductor software and design principles, provides for the coordinated representation of, storage of, and operation on multiple diverse genomics data. We provide the unrestricted multiple 'omics data for each cancer tissue in The Cancer Genome Atlas as ready-to-analyze MultiAssayExperiment objects and demonstrate in these and other datasets how the software simplifies data representation, statistical analysis, and visualization. The MultiAssayExperiment Bioconductor package reduces major obstacles to efficient, scalable, and reproducible statistical analysis of multiomics data and enhances data science applications of multiple omics datasets. Cancer Res; 77(21); e39-42. ©2017 AACR.
Design of optical transmitting antenna with enhance performance in visible light communication
NASA Astrophysics Data System (ADS)
Kuang, Dang; Wang, Jianping; Lu, Huimin
2016-10-01
An optical transmitting antenna for visible light communication (VLC) is designed in this work. The antenna is positioned in front of the light-emitting diode (LED) source to change the lighting distribution, in order to achieve uniform received power. The design method is based on the principle of the physical optical lens. According to the law of energy conservation and Snell's law, the antenna is designed by establishing an energy mapping between the luminous flux emitted by an LED source with a Lambertian distribution and the target plane. The coordinates of the antenna model are obtained in MATLAB, and the solid antenna model is generated from these coordinates in the 3D modeling software AutoCAD. The ray-tracing software TracePro is used to trace rays through the antenna and validate the irradiance maps. The uniformity of illumination and received power of the designed VLC system is improved from approximately 35% to over 83%.
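The energy-mapping step can be illustrated independently of the paper's lens geometry: for a Lambertian LED, the flux emitted within half-angle θ grows as sin²θ, and equating flux fractions with a uniformly illuminated disc gives the ray-to-target mapping. A hedged sketch with arbitrary parameters (the actual antenna surface would then be solved from Snell's law, which is not shown here):

```python
# Energy-mapping sketch for a Lambertian LED illuminating a disc uniformly.
# Flux within half-angle theta of a Lambertian source ~ sin(theta)**2,
# flux within radius r of a uniformly lit disc        ~ r**2.
# Equating the two fractions gives the target radius each ray must reach.
import math

def target_radius(theta, theta_max=math.radians(60), disc_radius=1.0):
    """Radius on the target plane assigned to the ray emitted at angle theta."""
    fraction = math.sin(theta) ** 2 / math.sin(theta_max) ** 2
    return disc_radius * math.sqrt(fraction)

for deg in (0, 15, 30, 45, 60):
    print(deg, round(target_radius(math.radians(deg)), 3))
```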
Insights into Global Health Practice from the Agile Software Development Movement
Flood, David; Chary, Anita; Austad, Kirsten; Diaz, Anne Kraemer; García, Pablo; Martinez, Boris; Canú, Waleska López; Rohloff, Peter
2016-01-01
Global health practitioners may feel frustration that current models of global health research, delivery, and implementation are overly focused on specific interventions, slow to provide health services in the field, and relatively ill-equipped to adapt to local contexts. Adapting design principles from the agile software development movement, we propose an analogous approach to designing global health programs that emphasizes tight integration between research and implementation, early involvement of ground-level health workers and program beneficiaries, and rapid cycles of iterative program improvement. Using examples from our own fieldwork, we illustrate the potential of ‘agile global health’ and reflect on the limitations, trade-offs, and implications of this approach. PMID:27134081
Two Eyes, 3D: Stereoscopic Design Principles
NASA Astrophysics Data System (ADS)
Price, Aaron; Subbarao, M.; Wyatt, R.
2013-01-01
Two Eyes, 3D is an NSF-funded research project about how people perceive highly spatial objects when shown with 2D or stereoscopic ("3D") representations. As part of the project, we produced a short film about SN 2011fe. The high definition film has been rendered in both 2D and stereoscopic formats. It was developed according to a set of stereoscopic design principles we derived from the literature and past experience producing and studying stereoscopic films. Study participants take a pre- and post-test that involves a spatial cognition assessment and scientific knowledge questions about Type Ia supernovae. For the evaluation, participants use iPads in order to record spatial manipulation of the device and look for elements of embodied cognition. We will present early results and also describe the stereoscopic design principles and the rationale behind them. All of our content and software is available under open source licenses. More information is at www.twoeyes3d.org.
Technology transfer in software engineering
NASA Technical Reports Server (NTRS)
Bishop, Peter C.
1989-01-01
The University of Houston-Clear Lake is the prime contractor for the AdaNET Research Project under the direction of NASA Johnson Space Center. AdaNET was established to promote the principles of software engineering to the software development industry. AdaNET will contain not only environments and tools, but also concepts, principles, models, standards, guidelines and practices. Initially, AdaNET will serve clients from the U.S. government and private industry who are working in software development. It will seek new clients from those who have not yet adopted the principles and practices of software engineering. Some of the goals of AdaNET are to become known as an objective, authoritative source of new software engineering information and parts, to provide easy access to information and parts, and to keep abreast of innovations in the field.
The Design of the Digital Multiplexer based on Power Carrier Communication on Sports Venues
NASA Astrophysics Data System (ADS)
Lu, Ming-jing; Liang, Li; Yu, Xiao-yan
In this paper, a dual-CPU, low-power, low-cost digital multiplexer is designed based on a full study of power line carrier communication, to satisfy the needs of power line communication transmission systems, especially in sports venues. The article elaborates the hardware and software design principles of the digital multiplexer in detail, carries out simulation using a single-chip microcomputer simulator, and achieves satisfactory results through debugging.
1991-12-01
...abstract data type is, what an object-oriented design is, and how to apply "software engineering" principles to the design of both of them. ... Program (ASVP), a research and development effort by two aerospace contractors to redesign and implement subsets of two existing flight simulators in... The effort addresses how to implement a simulator designed using the SEI OOD Paradigm on a distributed, parallel, multiple instruction, multiple data (MIMD...
Gooding, Owen W
2004-06-01
The use of parallel synthesis techniques with statistical design of experiment (DoE) methods is a powerful combination for the optimization of chemical processes. Advances in parallel synthesis equipment and easy to use software for statistical DoE have fueled a growing acceptance of these techniques in the pharmaceutical industry. As drug candidate structures become more complex at the same time that development timelines are compressed, these enabling technologies promise to become more important in the future.
Software Engineering Principles for Courseware Development.
ERIC Educational Resources Information Center
Magel, Kenneth
1980-01-01
Courseware (computer based curriculum materials) development should follow the lessons learned by software engineers. The most important of 28 principles of software development presented here include a stress on human readability, the importance of early planning and analysis, the need for independent evaluation, and the need to be flexible.…
NASA Astrophysics Data System (ADS)
Pang, Zhibo; Zheng, Lirong; Tian, Junzhe; Kao-Walter, Sharon; Dubrova, Elena; Chen, Qiang
2015-01-01
In-home health care services based on the Internet-of-Things are promising to resolve the challenges caused by the ageing of the population, but the existing research is rather scattered and shows a lack of interoperability. In this article, a business-technology co-design methodology is proposed for cross-boundary integration of in-home health care devices and services. In this framework, three key elements of a solution (business model, device and service integration architecture and information system integration architecture) are organically integrated and aligned. In particular, a cooperative Health-IoT ecosystem is formulated, and the information systems of all stakeholders are integrated in a cooperative health cloud as well as extended to patients' homes through the in-home health care station (IHHS). Design principles of the IHHS include the reuse of the 3C platform, certification of the Health Extension, interoperability and extendibility, convenient and trusted software distribution, standardised and secure electronic health care record handling, effective service composition and efficient data fusion. These principles are applied to the design of an IHHS solution called iMedBox. Detailed device and service integration architecture and hardware and software architecture are presented and verified by an implemented prototype. The quantitative performance analysis and field trials have confirmed the feasibility of the proposed design methodology and solution.
NASA Astrophysics Data System (ADS)
Furuya, Haruhisa; Hiratsuka, Mitsuyoshi
This article overviews the historical transition of legal protection of computer software contracts in the United States and presents how it should function under the Uniform Commercial Code and its amended Article 2B, the Uniform Computer Information Transactions Act, and the recently approved "Principles of the Law of Software Contracts".
Distilling Design Patterns From Agile Curation Case Studies
NASA Astrophysics Data System (ADS)
Benedict, K. K.; Lenhardt, W. C.; Young, J. W.
2016-12-01
In previous work the authors have argued that there is a need to take a new look at the data management lifecycle. Our core argument is that the data management lifecycle needs to be in essence deconstructed and rebuilt. As part of this process we also argue that much can be gained from applying ideas, concepts, and principles from agile software development methods. To be sure we are not arguing for a rote application of these agile software approaches, however, given various trends related to data and technology, it is imperative to update our thinking about how to approach the data management lifecycle, recognize differing project scales, corresponding variations in structure, and alternative models for solving the problems of scientific data curation. In this paper we will describe what we term agile curation design patterns, borrowing the concept of design patterns from the software world and we will present some initial thoughts on agile curation design patterns as informed by a sample of data curation case studies solicited from participants in agile data curation meeting sessions conducted in 2015-16.
The Azimuth Project: an Open-Access Educational Resource
NASA Astrophysics Data System (ADS)
Baez, J. C.
2012-12-01
The Azimuth Project is an online collaboration of scientists, engineers and programmers who are volunteering their time to do something about a wide range of environmental problems. The project has several aspects: 1) a wiki designed to make reliable, sourced information easy to find and accessible to technically literate nonexperts, 2) a blog featuring expository articles and news items, 3) a project to write programs that explain basic concepts of climate physics and illustrate principles of good open-source software design, and 4) a project to develop mathematical tools for studying complex networked systems. We discuss the progress so far and some preliminary lessons. For example, enlisting the help of experts outside academia highlights the problems with pay-walled journals and the benefits of open access, as well as differences between how software development is done commercially, in the free software community, and in academe.
The Design and Analysis of the Hydraulic-pressure Seal of the Engine Box
NASA Astrophysics Data System (ADS)
Chen, Zhenya; Shen, Xingquan; Xin, Zhijie; Guo, Tingting; Liao, Kewei
2017-12-01
According to the sealing requirements of the engine casing, a three-dimensional solid model of the engine box is established using NX software. Two sealing pressure schemes are designed based on an analysis of the casing structure: one locates the seal with two pins on one side, the other clamps and fastens it with a cylinder; the reasons why the former scheme has a lower cost are clarified. The forces and deformation of the former scheme are analyzed using finite element analysis software and NX, and the results prove that the pressure scheme can meet the actual needs of the program. The composition and basic principles of the manual pressure and hydraulic system are illustrated, and the feasibility of the sealing scheme is verified by experiment, providing a reference for future hydrostatic pressure test programs.
Secure Encapsulation and Publication of Biological Services in the Cloud Computing Environment
Zhang, Weizhe; Wang, Xuehui; Lu, Bo; Kim, Tai-hoon
2013-01-01
Secure encapsulation and publication for bioinformatics software products based on web service are presented, and the basic function of biological information is realized in the cloud computing environment. In the encapsulation phase, the workflow and function of bioinformatics software are conducted, the encapsulation interfaces are designed, and the runtime interaction between users and computers is simulated. In the publication phase, the execution and management mechanisms and principles of the GRAM components are analyzed. The functions such as remote user job submission and job status query are implemented by using the GRAM components. The services of bioinformatics software are published to remote users. Finally the basic prototype system of the biological cloud is achieved. PMID:24078906
Software handlers for process interfaces
NASA Technical Reports Server (NTRS)
Bercaw, R. W.
1976-01-01
The principles involved in the development of software handlers for custom interfacing problems are discussed. Handlers for the CAMAC standard are examined in detail. The types of transactions that must be supported have been established by standards groups, eliminating conflicting requirements arising out of different design philosophies and applications. Implementation of the standard handlers has been facilitated by standardization of hardware. The necessary local processors can be placed in the handler when it is written or at run time by means of input/output directives, or they can be built into a high-performance input/output processor. The full benefits of these process interfaces will only be realized when software requirements are incorporated uniformly into the hardware.
Object-Oriented Programming When Developing Software in Geology and Geophysics
NASA Astrophysics Data System (ADS)
Ahmadulin, R. K.; Bakanovskaya, L. N.
2017-01-01
The paper reviews the role of object-oriented programming when developing software in geology and geophysics. The main stages have been identified at which it is worthwhile to apply principles of object-oriented programming when developing software in geology and geophysics. The research was based on a number of problems solved in the Geology and Petroleum Production Institute. Distinctive features of these problems are given and areas of application of the object-oriented approach are identified. Developing applications in the sphere of geology and geophysics has shown that the process of creating such products is simplified by the use of object-oriented programming, primarily when designing structures for data storage and graphical user interfaces.
Human-Automation Integration: Principle and Method for Design and Evaluation
NASA Technical Reports Server (NTRS)
Billman, Dorrit; Feary, Michael
2012-01-01
Future space missions will increasingly depend on integration of complex engineered systems with their human operators. It is important to ensure that the systems that are designed and developed do a good job of supporting the needs of the work domain. Our research investigates methods for needs analysis. We included analysis of work products (plans for regulation of the space station) as well as work processes (tasks using current software), in a case study of Attitude Determination and Control Officers (ADCO) planning work. This allows comparing how well different designs match the structure of the work to be supported. Redesigned planning software that better matches the structure of work was developed and experimentally assessed. The new prototype enabled substantially faster and more accurate performance in plan revision tasks. This success suggests the approach to needs assessment and its use in design and evaluation is promising, and merits investigation in future research.
High-Level Performance Modeling of SAR Systems
NASA Technical Reports Server (NTRS)
Chen, Curtis
2006-01-01
SAUSAGE (Still Another Utility for SAR Analysis that's General and Extensible) is a computer program for modeling the performance of synthetic-aperture radar (SAR) or interferometric synthetic-aperture radar (InSAR or IFSAR) systems. The user is assumed to be familiar with the basic principles of SAR imaging and interferometry. Given design parameters (e.g., altitude, power, and bandwidth) that characterize a radar system, the software predicts various performance metrics (e.g., signal-to-noise ratio and resolution). SAUSAGE is intended to be a general software tool for quick, high-level evaluation of radar designs; it is not meant to capture all the subtleties, nuances, and particulars of specific systems. SAUSAGE was written to facilitate the exploration of engineering tradeoffs within the multidimensional space of design parameters. Typically, this space is examined through an iterative process of adjusting the values of the design parameters and examining the effects of the adjustments on the overall performance of the system at each iteration. The software is designed to be modular and extensible to enable consideration of a variety of operating modes and antenna beam patterns, including, for example, strip-map and spotlight SAR acquisitions, polarimetry, burst modes, and squinted geometries.
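SAUSAGE's internal models are not public here; as a hedged example of the kind of high-level metric such a tool predicts, the textbook stripmap SAR resolution estimates depend only on chirp bandwidth and antenna length:

```python
# Hedged example of high-level SAR performance metrics of the kind such a
# tool predicts (textbook stripmap approximations, not SAUSAGE's own model).
C = 299792458.0  # speed of light, m/s

def slant_range_resolution(bandwidth_hz: float) -> float:
    """Slant-range resolution, c / (2 B)."""
    return C / (2.0 * bandwidth_hz)

def azimuth_resolution(antenna_length_m: float) -> float:
    """Stripmap azimuth resolution, approximately L / 2, independent of range."""
    return antenna_length_m / 2.0

print(slant_range_resolution(80e6))  # ~1.87 m for an 80 MHz chirp
print(azimuth_resolution(10.0))      # 5.0 m for a 10 m antenna
```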
Estimation of sample size and testing power (part 5).
Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo
2012-02-01
Estimation of sample size and testing power is an important component of research design. This article introduces methods for estimating sample size and testing power for difference tests on quantitative and qualitative data with the single-group design, the paired design or the crossover design. Specifically, the article introduces the formulas for sample size and testing power estimation for difference tests on quantitative and qualitative data under these three designs, shows how the estimation can be realized with the formulas and with the POWER procedure of the SAS software, and elaborates with examples, which will help researchers implement the repetition principle.
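The article's own formulas are not reproduced here; as a hedged illustration, the standard normal-approximation formula for a quantitative outcome in a single-group or paired design, n = (z_{1-α/2} + z_{1-β})² σ² / δ², can be computed as follows (this is a generic textbook formula, not the SAS POWER procedure itself):

```python
# Normal-approximation sample size for detecting a mean difference delta
# with standard deviation sigma in a single-group or paired design.
import math
from scipy.stats import norm

def sample_size(delta: float, sigma: float, alpha: float = 0.05, power: float = 0.9) -> int:
    """Smallest n satisfying the two-sided normal-approximation formula."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    n = (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2
    return math.ceil(n)

print(sample_size(delta=2.0, sigma=5.0))  # about 66 subjects (or pairs) for alpha=0.05, power=0.9
```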
Bonsai: an event-based framework for processing and controlling data streams
Lopes, Gonçalo; Bonacchi, Niccolò; Frazão, João; Neto, Joana P.; Atallah, Bassam V.; Soares, Sofia; Moreira, Luís; Matias, Sara; Itskov, Pavel M.; Correia, Patrícia A.; Medina, Roberto E.; Calcaterra, Lorenza; Dreosti, Elena; Paton, Joseph J.; Kampff, Adam R.
2015-01-01
The design of modern scientific experiments requires the control and monitoring of many different data streams. However, the serial execution of programming instructions in a computer makes it a challenge to develop software that can deal with the asynchronous, parallel nature of scientific data. Here we present Bonsai, a modular, high-performance, open-source visual programming framework for the acquisition and online processing of data streams. We describe Bonsai's core principles and architecture and demonstrate how it allows for the rapid and flexible prototyping of integrated experimental designs in neuroscience. We specifically highlight some applications that require the combination of many different hardware and software components, including video tracking of behavior, electrophysiology and closed-loop control of stimulation. PMID:25904861
Biomimetic optimization research on wind noise reduction of an asymmetric cross-section bar.
Zhang, Yingchao; Meng, Weijiang; Fan, Bing; Tang, Wenhui
2016-01-01
In this paper, we used the principle of biomimetics to design two-dimensional and three-dimensional bar sections and used computational fluid dynamics software to numerically simulate and analyse the aerodynamic noise, in order to reduce drag and noise. An owl wing shape was used for the initial design of the section geometry; then the feathered form of an owl wing, the v-shaped micro-grooves of a shark's skin, the tubercles of a humpback whale's flipper, and the stripy surface of a scallop's shell were used to inspire surface features, added to the initial section and three-dimensional shape. Through computational aeroacoustic simulations, we obtained the aerodynamic characteristics and the noise levels of the models. These biomimetic models dramatically decreased noise levels.
a Mini Multi-Gas Detection System Based on Infrared Principle
NASA Astrophysics Data System (ADS)
Zhijian, Xie; Qiulin, Tan
2006-12-01
To address the problems of gas accidents in coal mines and of household safety when using gas, a new integrated, miniaturized infrared detection system has been developed. The infrared optical detection principle used in the system is analyzed, and the approach of extending single-gas detection to multi-gas detection is explained. Through research on the cell structure, an integrated, miniaturized gas cell has been devised. Data transmission over a Controller Area Network (CAN) bus is explained. With a single-chip microcomputer (SCM) for intelligent processing, the functional block diagram of the gas detection system is designed, and its hardware and software systems are analyzed and devised. The system meets the technical requirements of low power consumption, small volume and a large measuring range, and is able to perform multi-gas detection.
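The abstract does not spell out the optical relation; nondispersive infrared gas sensing commonly rests on the Beer-Lambert law, which a hedged sketch can illustrate (the absorption coefficient and path length below are placeholders, not values from the paper):

```python
# Beer-Lambert sketch for nondispersive infrared (NDIR) gas sensing:
# I = I0 * exp(-k * c * L). Inverting gives concentration from the measured
# ratio I/I0. The coefficient k and path length L below are placeholders.
import math

K_ABS = 0.25     # absorption coefficient, 1/(% * cm) -- hypothetical
PATH_CM = 10.0   # optical path length of the gas cell, cm -- hypothetical

def concentration(i_measured: float, i_reference: float) -> float:
    """Gas concentration (%) from the active and reference channel signals."""
    transmittance = i_measured / i_reference
    return -math.log(transmittance) / (K_ABS * PATH_CM)

print(round(concentration(0.78, 1.0), 3))  # ~0.099 % for these placeholder values
```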
NASA Astrophysics Data System (ADS)
McNamara, Laura A.; Berg, Leif; Butler, Karin; Klein, Laura
2017-05-01
Even as remote sensing technology has advanced in leaps and bounds over the past decade, the remote sensing community lacks interfaces and interaction models that facilitate effective human operation of our sensor platforms. Interfaces that make great sense to electrical engineers and flight test crews can be anxiety-inducing to operational users who lack professional experience in the design and testing of sophisticated remote sensing platforms. In this paper, we reflect on an 18-month collaboration in which our Sandia National Laboratories research team partnered with an industry software team to identify and fix critical issues in a widely-used sensor interface. Drawing on basic principles from cognitive and perceptual psychology and interaction design, we provide simple, easily learned guidance for minimizing common barriers to system learnability, memorability, and user engagement.
The Development of Ada (Trademark) Software for Secure Environments
1986-05-23
Telecommunications environment. This paper discusses software security and seeks to demonstrate how the Ada programming language can be utilized as a tool... complexity. We use abstraction in our lives every day to control complexity; the principles of abstraction for software engineering are no different... systems. These features directly support the modern software engineering principles discussed in the previous section. This is not surprising...
The research of binocular vision ranging system based on LabVIEW
NASA Astrophysics Data System (ADS)
Li, Shikuan; Yang, Xu
2017-10-01
Based on a study of the principle of binocular parallax ranging, a binocular vision ranging system is designed and built. The stereo matching algorithm is realized in LabVIEW software, and the camera calibration and distance measurement are completed. The error analysis shows that the system is fast and effective and can be used in corresponding industrial applications.
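The LabVIEW implementation is not shown; the underlying parallax relation, depth Z = f·B/d for focal length f, baseline B and disparity d, is simple to state. A hedged sketch with illustrative camera parameters (not those of the paper's rig):

```python
# Binocular parallax ranging: depth from disparity, Z = f * B / d.
# The camera parameters here are illustrative placeholders.
def depth_from_disparity(disparity_px: float,
                         focal_px: float = 1200.0,   # focal length in pixels
                         baseline_m: float = 0.12) -> float:
    """Depth (m) of a point from its disparity between the two cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

print(depth_from_disparity(48.0))  # 3.0 m for these illustrative parameters
```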
Development (design and systematization) of HMS Group pump ranges
NASA Astrophysics Data System (ADS)
Tverdokhleb, I.; Yamburenko, V.
2017-08-01
The article reveals the need to develop pump range charts for different applications and describes the main principles used by HMS Group. Some modern approaches to pump selection are reviewed, and the need for pump compliance with international standards and modern customer requirements is highlighted. Even though pump design types are similar across applications, they need adjustment to specific requirements, which leads manufacturers to develop their own particular design for each pump range. Having wide pump ranges for different applications enables the creation of pump selection software, helping manufacturers prepare high-quality quotations in the shortest time.
Software development environments: Present and future, appendix D
NASA Technical Reports Server (NTRS)
Riddle, W. E.
1980-01-01
Computerized environments which facilitate the development of appropriately functioning software systems are discussed. Their current status is reviewed and several trends exhibited by their history are identified. A number of principles, some at (slight) variance with the historical trends, are suggested and it is argued that observance of these principles is critical to achieving truly effective and efficient software development support environments.
The Muon Ionization Cooling Experiment User Software
NASA Astrophysics Data System (ADS)
Dobbs, A.; Rajaram, D.;
2017-10-01
The Muon Ionization Cooling Experiment (MICE) is a proof-of-principle experiment designed to demonstrate muon ionization cooling for the first time. MICE is currently on Step IV of its data taking programme, where transverse emittance reduction will be demonstrated. The MICE Analysis User Software (MAUS) is the reconstruction, simulation and analysis framework for the MICE experiment. MAUS is used for both offline data analysis and fast online data reconstruction and visualization to serve MICE data taking. This paper provides an introduction to MAUS, describing the central Python and C++ based framework, the data structure, and the code management and testing procedures.
NASA Technical Reports Server (NTRS)
Bremmer, D. A.
1986-01-01
The feasibility of some off-the-shelf microprocessors and state-of-the-art software is assessed (1) as a development system for the principal investigator (PI) in the design of the experiment model, (2) as an example of available technology application for future PIs' experiments, (3) as a system capable of being interactive in the PCTC's simulation of the dedicated experiment processor (DEP), preferably by bringing the PI's DEP software directly into the simulation model, (4) as a system having bus compatibility with host VAX simulation computers, (5) as a system readily interfaced with mock-up panels and information displays, and (6) as a functional system for post-mission data analysis.
Designing Tracking Software for Image-Guided Surgery Applications: IGSTK Experience
Enquobahrie, Andinet; Gobbi, David; Turek, Matt; Cheng, Patrick; Yaniv, Ziv; Lindseth, Frank; Cleary, Kevin
2009-01-01
Objective Many image-guided surgery applications require tracking devices as part of their core functionality. The Image-Guided Surgery Toolkit (IGSTK) was designed and developed to interface tracking devices with software applications incorporating medical images. Methods IGSTK was designed as an open source C++ library that provides the basic components needed for fast prototyping and development of image-guided surgery applications. This library follows a component-based architecture with several components designed for specific sets of image-guided surgery functions. At the core of the toolkit is the tracker component that handles communication between a control computer and navigation device to gather pose measurements of surgical instruments present in the surgical scene. The representations of the tracked instruments are superimposed on anatomical images to provide visual feedback to the clinician during surgical procedures. Results The initial version of the IGSTK toolkit has been released in the public domain and several trackers are supported. The toolkit and related information are available at www.igstk.org. Conclusion With the increased popularity of minimally invasive procedures in health care, several tracking devices have been developed for medical applications. Designing and implementing high-quality and safe software to handle these different types of trackers in a common framework is a challenging task. It requires establishing key software design principles that emphasize abstraction, extensibility, reusability, fault-tolerance, and portability. IGSTK is an open source library that satisfies these needs for the image-guided surgery community. PMID:20037671
Designing the user interface: strategies for effective human-computer interaction
NASA Astrophysics Data System (ADS)
Shneiderman, B.
1998-03-01
In revising this popular book, Ben Shneiderman again provides a complete, current and authoritative introduction to user-interface design. The user interface is the part of every computer system that determines how people control and operate that system. When the interface is well designed, it is comprehensible, predictable, and controllable; users feel competent, satisfied, and responsible for their actions. Shneiderman discusses the principles and practices needed to design such effective interaction. Based on 20 years experience, Shneiderman offers readers practical techniques and guidelines for interface design. He also takes great care to discuss underlying issues and to support conclusions with empirical results. Interface designers, software engineers, and product managers will all find this book an invaluable resource for creating systems that facilitate rapid learning and performance, yield low error rates, and generate high user satisfaction. Coverage includes the human factors of interactive software (with a new discussion of diverse user communities), tested methods to develop and assess interfaces, interaction styles such as direct manipulation for graphical user interfaces, and design considerations such as effective messages, consistent screen design, and appropriate color.
Digital avionics systems - Principles and practices (2nd revised and enlarged edition)
NASA Technical Reports Server (NTRS)
Spitzer, Cary R.
1993-01-01
The state of the art in digital avionics systems is surveyed. The general topics addressed include: establishing avionics system requirements; avionics systems essentials in data bases, crew interfaces, and power; fault tolerance, maintainability, and reliability; architectures; packaging and fitting the system into the aircraft; hardware assessment and validation; software design, assessment, and validation; determining the costs of avionics.
Low-cost optical data acquisition system for blade vibration measurement
NASA Technical Reports Server (NTRS)
Posta, Stephen J.
1988-01-01
A low cost optical data acquisition system was designed to measure deflection of vibrating rotor blade tips. The basic principle of the new design is to record raw data, which is a set of blade arrival times, in memory and to perform all processing by software following a run. This approach yields a simple and inexpensive system with the least possible hardware. Functional elements of the system were breadboarded and operated satisfactorily during rotor simulations on the bench, and during a data collection run with a two-bladed rotor in the Lewis Research Center Spin Rig. Software was written to demonstrate the sorting and processing of data stored in the system control computer, after retrieval from the data acquisition system. The demonstration produced an accurate graphical display of deflection versus time.
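The report's processing details aside, the core blade tip-timing relation is that a vibratory deflection shifts a blade's arrival time at the optical probe; at rotor angular speed Ω the tip deflection is approximately R·Ω·Δt, where Δt is the deviation from the expected arrival time. A hedged sketch with illustrative numbers:

```python
# Blade tip-timing sketch: tip deflection from the deviation of the measured
# arrival time from the expected (rigid-rotor) arrival time.
# deflection ~ R * omega * dt, with R the tip radius and omega in rad/s.
import math

def tip_deflection(measured_t: float, expected_t: float,
                   tip_radius_m: float, rpm: float) -> float:
    """Tangential tip deflection (m) implied by the arrival-time deviation."""
    omega = 2.0 * math.pi * rpm / 60.0
    return tip_radius_m * omega * (measured_t - expected_t)

# A blade arriving 20 microseconds late at 3000 rpm on a 0.5 m radius rotor:
print(tip_deflection(10.000020e-3, 10.0e-3, 0.5, 3000.0))  # ~3.1e-3 m
```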
Monks, K; Molnár, I; Rieger, H-J; Bogáti, B; Szabó, E
2012-04-06
Robust HPLC separations lead to fewer analysis failures and better method transfer as well as providing an assurance of quality. This work presents the systematic development of an optimal, robust, fast UHPLC method for the simultaneous assay of two APIs of an eye drop sample and their impurities, in accordance with Quality by Design principles. Chromatography software is employed to effectively generate design spaces (Method Operable Design Regions), which are subsequently employed to determine the final method conditions and to evaluate robustness prior to validation. Copyright © 2011 Elsevier B.V. All rights reserved.
The design of a small flow optical sensor of particle counter
NASA Astrophysics Data System (ADS)
Zhan, Yongbo; zhang, Jianwei; Zeng, Jianxiong; Li, Bin; Chen, Lu
2018-01-01
Based on the principle of Mie scattering, we design a small-flow optical sensor for a particle counter. First, the laser illumination system was simulated and designed with the ZEMAX optical design software, and a uniform light intensity over the photosensitive area was obtained. The gas circuit structure was also designed according to fluid mechanics theory. Then, Mie scattering calculation software combined with geometric modeling was used to design the spherical reflection system on the basis of the object-image distance formula. Finally, the optical sensor was tested together with a self-designed pre-amplification and high-speed processing circuit. The test results show that, compared with the Kanomax 3886 reference instrument at particle spraying flows of 2.5, 3.0 and 3.5 SCFH, the counting efficiency of the 0.3 μm size channel is above 70%, the 0.5 μm and 1.0 μm channels both reach more than 90%, and the dispersion coefficient of each channel is very nearly the same.
Design for Run-Time Monitor on Cloud Computing
NASA Astrophysics Data System (ADS)
Kang, Mikyung; Kang, Dong-In; Yun, Mira; Park, Gyung-Leen; Lee, Junghoon
Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design a Run-Time Monitor (RTM), which is system software to monitor application behavior at run-time, analyze the collected information, and optimize resources on cloud computing. RTM monitors application software through library instrumentation as well as underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.
Software design and implementation concepts for an interoperable medical communication framework.
Besting, Andreas; Bürger, Sebastian; Kasparick, Martin; Strathen, Benjamin; Portheine, Frank
2018-02-23
The new IEEE 11073 service-oriented device connectivity (SDC) standard proposals for networked point-of-care and surgical devices constitute the basis for improved interoperability due to their vendor independence. To accelerate the distribution of the standard, a reference implementation is indispensable. However, the implementation of such a framework has to overcome several non-trivial challenges. First, the high level of complexity of the underlying standard must be reflected in the software design. An efficient implementation has to consider the limited resources of the underlying hardware. Moreover, the framework's purpose of realizing a distributed system demands a high degree of reliability of the framework itself and of its internal mechanisms. Additionally, a framework must provide an easy-to-use and fail-safe application programming interface (API). In this work, we address these challenges by discussing suitable software engineering principles and practical coding guidelines. A descriptive model is developed that identifies key strategies. General feasibility is shown by outlining environments in which our implementation has been utilized.
ICE System: Interruptible control expert system. M.S. Thesis
NASA Technical Reports Server (NTRS)
Vezina, James M.
1990-01-01
The Interruptible Control Expert (ICE) System is based on an architecture designed to provide a strong foundation for real-time production rule expert systems. Three principles are adopted to guide the development of ICE. First, a practical delivery platform must be provided: no specialized hardware can be used to compensate for deficiencies in the software design. Second, knowledge of the environment and the rule base is exploited to improve the performance of a delivered system. The third principle of ICE is to respond to the most critical event, at the expense of more trivial tasks. Minimal time is spent on classifying the potential importance of environmental events, with the majority of the time used to find the responses. A feature of the system, derived from all three principles, is the lack of working memory. By using a priori information, a fixed amount of memory can be specified for the hardware platform. The absence of working memory removes the danger of garbage collection during continuous operation of the controller.
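Two of the stated principles, statically sized storage (no working memory, hence no garbage collection) and responding to the most critical event first, can be sketched as follows. The event structure and priorities are hypothetical illustrations, not the ICE rule format.

```cpp
// Sketch: fixed-size event table, highest-criticality event serviced first.
// No dynamic allocation after start-up, so no garbage collection can occur.
#include <array>
#include <cstdio>

struct Event { int id; int criticality; bool pending; };   // higher = more critical

int main() {
    // All storage is sized a priori for the target platform.
    std::array<Event, 4> events{{ {1, 2, true}, {2, 9, true}, {3, 5, false}, {4, 7, true} }};

    // One control cycle: find the most critical pending event and respond to it.
    int best = -1;
    for (int i = 0; i < static_cast<int>(events.size()); ++i) {
        if (events[i].pending && (best < 0 || events[i].criticality > events[best].criticality))
            best = i;
    }
    if (best >= 0) {
        std::printf("responding to event %d (criticality %d)\n",
                    events[best].id, events[best].criticality);
        events[best].pending = false;       // trivial tasks wait for later cycles
    }
}
```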
Report on the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3)
NASA Astrophysics Data System (ADS)
Katz, Daniel S.; Choi, Sou-Cheng T.; Niemeyer, Kyle E.; Hetherington, James; Löffler, Frank; Gunter, Dan; Idaszak, Ray; Brandt, Steven R.; Miller, Mark A.; Gesing, Sandra; Jones, Nick D.; Weber, Nic; Marru, Suresh; Allen, Gabrielle; Penzenstadler, Birgit; Venters, Colin C.; Davis, Ethan; Hwang, Lorraine; Todorov, Ilian; Patra, Abani; de Val-Borro, Miguel
2016-02-01
This report records and discusses the Third Workshop on Sustainable Software for Science: Practice and Experiences (WSSSPE3). The report includes a description of the keynote presentation of the workshop, which served as an overview of sustainable scientific software. It also summarizes a set of lightning talks in which speakers highlighted to-the-point lessons and challenges pertaining to sustaining scientific software. The final and main contribution of the report is a summary of the discussions, future steps, and future organization for a set of self-organized working groups on topics including developing pathways to funding scientific software; constructing useful common metrics for crediting software stakeholders; identifying principles for sustainable software engineering design; reaching out to research software organizations around the world; and building communities for software sustainability. For each group, we include a point of contact and a landing page that can be used by those who want to join that group's future activities. The main challenge left by the workshop is to see if the groups will execute these activities that they have scheduled, and how the WSSSPE community can encourage this to happen.
Specificity of software cooperating with an optoelectronic sensor in the pulse oximeter system
NASA Astrophysics Data System (ADS)
Cysewska-Sobusiak, Anna; Wiczynski, Grzegorz; Jedwabny, Tomasz
1995-06-01
The specifics of a two-part software package that controls the optoelectronic sensor of a computer-aided system for noninvasive measurement of arterial blood oxygen saturation, as well as selected parameters of the peripheral pulse waveform, are described. The system relies on the principles of the transmission variant of pulse oximetry, the only noninvasive method for this measurement. The software coordinates the cooperation of an IBM PC compatible microcomputer with the sensor and a specialized card. This novel card is a key part of the measuring system, whose range of applications is wider than that of commonly available pulse oximeters. The user-friendly, multitasking, non-preemptive MS Windows graphical environment was used to develop the part of the program presented here. With this environment, sophisticated tasks of the software package can be performed without excessive complication.
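For context, the transmission pulse-oximetry principle referred to above is usually implemented as a "ratio of ratios" of the pulsatile (AC) and non-pulsatile (DC) signal components at the red and infrared wavelengths, mapped to SpO2 through an empirical calibration. The sketch below uses a generic linear calibration with illustrative coefficients; a real instrument relies on its own calibration curve.

```cpp
// Sketch of the ratio-of-ratios computation used in pulse oximetry.
// The linear calibration SpO2 = a - b*R is a generic empirical form; the
// coefficients below are illustrative, not those of any particular device.
#include <cstdio>

double spo2_from_ratio(double ac_red, double dc_red, double ac_ir, double dc_ir) {
    double R = (ac_red / dc_red) / (ac_ir / dc_ir);   // "ratio of ratios"
    const double a = 110.0, b = 25.0;                 // illustrative calibration
    return a - b * R;
}

int main() {
    // Illustrative AC/DC amplitudes extracted from the photoplethysmographic signals.
    double spo2 = spo2_from_ratio(0.013, 1.0, 0.025, 1.0);
    std::printf("estimated SpO2 = %.1f %%\n", spo2);
}
```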
Principles and tools for collaborative entity-based intelligence analysis.
Bier, Eric A; Card, Stuart K; Bodnar, John W
2010-01-01
Software tools that make it easier for analysts to collaborate as a natural part of their work will lead to better analysis that is informed by more perspectives. We are interested to know if software tools can be designed that support collaboration even as they allow analysts to find documents and organize information (including evidence, schemas, and hypotheses). We have modified the Entity Workspace system, described previously, to test such designs. We have evaluated the resulting design in both a laboratory study and a study where it is situated with an analysis team. In both cases, effects on collaboration appear to be positive. Key aspects of the design include an evidence notebook optimized for organizing entities (rather than text characters), information structures that can be collapsed and expanded, visualization of evidence that emphasizes events and documents (rather than emphasizing the entity graph), and a notification system that finds entities of mutual interest to multiple analysts. Long-term tests suggest that this approach can support both top-down and bottom-up styles of analysis.
Hybrid Energy System Design of Micro Hydro-PV-biogas Based Micro-grid
NASA Astrophysics Data System (ADS)
Nishrina; Abdullah, A. G.; Risdiyanto, A.; Nandiyanto, ABD
2017-03-01
A hybrid renewable energy system is an arrangement of one or more renewable energy sources, possibly combined with conventional energy. This paper describes simulation results for a hybrid renewable power system based on the potential available at an educational institution in Indonesia. The HOMER software was used to simulate and analyse the system, both in terms of optimization and in economic terms. This software is built around 3 main principles: simulation, optimization, and sensitivity analysis. Overall, the presented results show that the software can identify a feasible hybrid power system that could be realized. The proposed system configuration can supply the entire demand in the case study area using about three quarters of the electricity produced, leaving roughly one quarter of the generated energy as excess electricity.
A Clustering-Based Approach to Enriching Code Foraging Environment.
Niu, Nan; Jin, Xiaoyu; Niu, Zhendong; Cheng, Jing-Ru C; Li, Ling; Kataev, Mikhail Yu
2016-09-01
Developers often spend valuable time navigating and seeking relevant code in software maintenance. Currently, there is a lack of theoretical foundations to guide tool design and evaluation to best shape the code base to developers. This paper contributes a unified code navigation theory in light of the optimal food-foraging principles. We further develop a novel framework for automatically assessing the foraging mechanisms in the context of program investigation. We use the framework to examine to what extent the clustering of software entities affects code foraging. Our quantitative analysis of long-lived open-source projects suggests that clustering enriches the software environment and improves foraging efficiency. Our qualitative inquiry reveals concrete insights into real developer's behavior. Our research opens the avenue toward building a new set of ecologically valid code navigation tools.
A Decision-Based Methodology for Object Oriented-Design
1988-12-16
...assume the validity of the object-oriented and software engineering principles involved, and define and prototype a generic, language-independent... ...meaningful labels for variables; abstraction requires the ability to define new types that relieve the programmer from having to know or mess with...
ERIC Educational Resources Information Center
Tang, Michael; David, Hyerle; Byrne, Roxanne; Tran, John
2012-01-01
This paper is a mathematical (Boolean) analysis of a set of cognitive maps called Thinking Maps[R], based on Albert Upton's semantic principles developed in his seminal works, Design for Thinking (1961) and Creative Analysis (1961). Albert Upton can be seen as a brilliant thinker who was before his time or after his time depending on the future of…
Generalized adjustment by least squares ( GALS).
Elassal, A.A.
1983-01-01
The least-squares principle is universally accepted as the basis for adjustment procedures in the allied fields of geodesy, photogrammetry and surveying. A prototype software package for Generalized Adjustment by Least Squares (GALS) is described. The package is designed to perform all least-squares-related functions in a typical adjustment program. GALS is capable of supporting development of adjustment programs of any size or degree of complexity. -Author
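The least-squares principle behind such a package can be sketched with its simplest instance, a weighted straight-line fit solved through the 2×2 normal equations; the observations and weights below are invented for illustration, and GALS itself handles far more general adjustment models.

```cpp
// Sketch: weighted least-squares fit of y = a + b*x via the 2x2 normal equations.
// Data and weights are invented for illustration.
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> x = {0.0, 1.0, 2.0, 3.0};
    std::vector<double> y = {1.1, 2.9, 5.2, 6.8};
    std::vector<double> w = {1.0, 1.0, 2.0, 1.0};   // observation weights

    double Sw = 0, Sx = 0, Sy = 0, Sxx = 0, Sxy = 0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        Sw  += w[i];
        Sx  += w[i] * x[i];
        Sy  += w[i] * y[i];
        Sxx += w[i] * x[i] * x[i];
        Sxy += w[i] * x[i] * y[i];
    }
    // Normal equations:  [Sw  Sx ][a]   [Sy ]
    //                    [Sx  Sxx][b] = [Sxy]
    double det = Sw * Sxx - Sx * Sx;
    double a = (Sy * Sxx - Sx * Sxy) / det;
    double b = (Sw * Sxy - Sx * Sy) / det;
    std::printf("a = %.3f, b = %.3f\n", a, b);
}
```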
JDFTx: Software for joint density-functional theory
Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.; ...
2017-11-14
Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.
Warning system against locomotive driving wheel flaccidity
NASA Astrophysics Data System (ADS)
Luo, Peng
2014-09-01
The causes of locomotive driving wheel flaccidity are discussed. A warning system against locomotive driving wheel flaccidity is designed using infrared temperature measurement and Hall sensor measurement techniques. The design scheme of the system and the principle of detecting driving wheel flaccidity with temperature and Hall sensors are introduced, and the threshold temperature for the infrared alarm is determined. The circuit system is designed around a microcontroller, and the software is written in assembly language. The experiment of measuring the flaccid displacement with the Hall sensor is simulated. The results show that the system runs well with high reliability and low cost and has broad prospects for application and popularization.
Software system design for the non-null digital Moiré interferometer
NASA Astrophysics Data System (ADS)
Chen, Meng; Hao, Qun; Hu, Yao; Wang, Shaopu; Li, Tengfei; Li, Lin
2016-11-01
Aspheric optical components are an indispensable part of modern optical systems. With advances in aspheric optical element fabrication techniques, high-precision figure error testing of aspheric surfaces has become an urgent issue. We proposed a digital Moiré interferometer technique (DMIT) based on the partial compensation principle for aspheric and freeform surface measurement. Unlike a traditional interferometer, DMIT consists of a real and a virtual interferometer. The virtual interferometer is simulated with the Zemax software to perform phase-shifting and alignment. The results are obtained through a series of calculations using the real interferogram and the computer-generated virtual interferograms. DMIT requires a specific, reliable software system to ensure its normal operation. Image acquisition and data processing are two important parts of this system, and realizing the connection between the real and virtual interferometers is also a challenge. In this paper, we present a software system design for DMIT with a friendly user interface and robust data processing features, enabling us to acquire the figure error of the measured asphere. We chose Visual C++ as the software development platform and control the ideal interferometer using hybrid programming with Zemax. After image acquisition and data transmission, the system calls image processing algorithms written in Matlab to calculate the figure error of the measured asphere. We tested the software system experimentally: the measurement of an aspheric surface was realized, proving the feasibility of the software system.
Automated Transfer Vehicle (ATV) Critical Safety Software Overview
NASA Astrophysics Data System (ADS)
Berthelier, D.
2002-01-01
The European Automated Transfer Vehicle (ATV) is an unmanned transportation system designed to dock to the International Space Station (ISS) and to contribute to its logistic servicing. Concisely, ATV control is realized by a nominal flight control function (using computers, software, sensors and actuators). To cover the extreme situations in which this nominal chain cannot ensure a safe trajectory with respect to the ISS, and in which unsafe free-drift trajectories can be encountered, a segregated proximity flight safety function is activated. This function relies notably on a segregated computer, the Monitoring and Safing Unit (MSU); when a major ATV malfunction is detected, the ATV is then controlled by the MSU software. This software is therefore critical, because an MSU software failure could result in catastrophic consequences. This paper provides an overview both of the software's functions and of the software development and validation method, which is specific given its criticality. The first part of the paper briefly describes the proximity flight safety chain. The second part deals with the software functions: the MSU software is in charge of monitoring the nominal computers and the ATV corridors, using its own navigation algorithms, and, if an abnormal situation is detected, it takes charge of ATV control during the Collision Avoidance Manoeuvre (CAM), consisting of an attitude-controlled braking boost, followed by a post-CAM manoeuvre: Sun-pointed ATV attitude control for up to 24 hours on a safe trajectory. The principles of the monitoring, navigation and control algorithms are presented. The third part of the paper describes the development and validation process: functional studies of the algorithms, Ada coding and unit validation; integration and validation of the algorithms' Ada code on a specific non-real-time MATLAB/SIMULINK simulator; and the global software functional engineering phase, with architectural design, unit testing, integration and validation on the target computer.
Automatic Molecular Design using Evolutionary Techniques
NASA Technical Reports Server (NTRS)
Globus, Al; Lawton, John; Wipke, Todd; Saini, Subhash (Technical Monitor)
1998-01-01
Molecular nanotechnology is the precise, three-dimensional control of materials and devices at the atomic scale. An important part of nanotechnology is the design of molecules for specific purposes. This paper describes early results using genetic software techniques to automatically design molecules under the control of a fitness function. The fitness function must be capable of determining which of two arbitrary molecules is better for a specific task. The software begins by generating a population of random molecules. The population is then evolved towards greater fitness by randomly combining parts of the better individuals to create new molecules. These new molecules then replace some of the worst molecules in the population. The unique aspect of our approach is that we apply genetic crossover to molecules represented by graphs, i.e., sets of atoms and the bonds that connect them. We present evidence suggesting that crossover alone, operating on graphs, can evolve any possible molecule given an appropriate fitness function and a population containing both rings and chains. Prior work evolved strings or trees that were subsequently processed to generate molecular graphs. In principle, genetic graph software should be able to evolve other graph representable systems such as circuits, transportation networks, metabolic pathways, computer networks, etc.
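The generate-evaluate-replace loop described above can be outlined generically. In the sketch below the "molecule" is reduced to a fixed-length genome, and the fitness function and one-point crossover are placeholders; the paper's actual operator is a crossover on molecular graphs (atoms and bonds).

```cpp
// Generic sketch of the evolutionary loop described above: random initial
// population, combine parts of better individuals, replace the worst.
// The genome, fitness, and crossover here are placeholders, not molecular graphs.
#include <algorithm>
#include <cstdio>
#include <random>
#include <vector>

using Genome = std::vector<int>;

double fitness(const Genome& g) {              // placeholder fitness function
    double s = 0;
    for (int v : g) s += v;
    return s;
}

int main() {
    std::mt19937 rng(42);
    std::uniform_int_distribution<int> gene(0, 9);
    std::vector<Genome> pop(20, Genome(8));
    for (auto& g : pop) for (auto& v : g) v = gene(rng);   // random initial population

    for (int step = 0; step < 1000; ++step) {
        // Sort so the best individuals come first, the worst last.
        std::sort(pop.begin(), pop.end(),
                  [](const Genome& a, const Genome& b) { return fitness(a) > fitness(b); });
        // Crossover two of the better individuals (placeholder one-point crossover).
        std::uniform_int_distribution<std::size_t> top(0, pop.size() / 2 - 1);
        const Genome& p1 = pop[top(rng)];
        const Genome& p2 = pop[top(rng)];
        std::uniform_int_distribution<std::size_t> cut(1, p1.size() - 1);
        std::size_t c = cut(rng);
        Genome child(p1.begin(), p1.begin() + c);
        child.insert(child.end(), p2.begin() + c, p2.end());
        pop.back() = child;                    // replace one of the worst
    }
    std::printf("best fitness after evolution: %.1f\n", fitness(pop.front()));
}
```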
NASA Astrophysics Data System (ADS)
Vdovin, R. A.; Smelov, V. G.
2017-02-01
This work describes the experience of manufacturing a turbine rotor for a micro-engine. It demonstrates the design principles for a complex investment casting process that combines the use of the ProCast software with rapid prototyping techniques. At the virtual modelling stage, in addition to optimizing the process parameters, the casting structure was improved to obtain a defect-free section. The real production stage demonstrated the performance and suitability of rapid prototyping techniques for the manufacture of geometrically complex engine-building parts.
Use of Soft Computing Technologies For Rocket Engine Control
NASA Technical Reports Server (NTRS)
Trevino, Luis C.; Olcmen, Semih; Polites, Michael
2003-01-01
The problem addressed in this paper is to explore how Soft Computing Technologies (SCT) could be employed to further improve overall engine system reliability and performance. Specifically, this will be presented by enhancing rocket engine control and engine health management (EHM) using SCT coupled with conventional control technologies and the sound software engineering practices used in Marshall's Flight Software Group. The principal goals are to improve software management, software development time and maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control and EHM methodologies, but to provide alternative design choices for control, EHM, implementation, performance, and sustaining engineering. The approaches outlined in this paper require knowledge in the fields of rocket engine propulsion, software engineering for embedded systems, and soft computing technologies (i.e., neural networks, fuzzy logic, and Bayesian belief networks), much of which is presented in this paper. The first targeted demonstration rocket engine platform is the MC-1 (formerly the FASTRAC engine), which is simulated with hardware and software in the Marshall Avionics & Software Testbed laboratory that…
Autonomous dexterous end-effectors for space robotics
NASA Technical Reports Server (NTRS)
Bekey, George A.; Iberall, Thea; Liu, Huan
1989-01-01
The development of a knowledge-based controller is summarized for the Belgrade/USC robot hand, a five-fingered end effector, designed for maximum autonomy. The biological principles of the hand and its architecture are presented. The conceptual and software aspects of the grasp selection system are discussed, including both the effects of the geometry of the target object and the task to be performed. Some current research issues are presented.
Bochev, P.; Edwards, H. C.; Kirby, R. C.; ...
2012-01-01
Intrepid is a Trilinos package for advanced discretizations of Partial Differential Equations (PDEs). The package provides a comprehensive set of tools for local, cell-based construction of a wide range of numerical methods for PDEs. This paper describes the mathematical ideas and software design principles incorporated in the package. We also provide representative examples showcasing the use of Intrepid both in the context of numerical PDEs and the more general context of data analysis.
Implementing Software Safety in the NASA Environment
NASA Technical Reports Server (NTRS)
Wetherholt, Martha S.; Radley, Charles F.
1994-01-01
Until recently, NASA did not consider allowing computers total control of flight systems. Human operators, via hardware, have constituted the ultimate safety control. In an attempt to reduce costs, NASA has come to rely more and more heavily on computers and software to control space missions. (For example, software is now planned to control most of the operational functions of the International Space Station.) Thus the need for systematic software safety programs has become crucial for mission success. Concurrent engineering principles dictate that safety should be designed into software up front, not tested into the software after the fact. 'Cost of Quality' studies have statistics and metrics to prove the value of building quality and safety into the development cycle. Unfortunately, most software engineers are not familiar with designing for safety, and most safety engineers are not software experts. Software written to specifications which have not been safety analyzed is a major source of computer-related accidents. Safer software is achieved step by step throughout the system and software life cycle. It is a process that includes requirements definition, hazard analyses, formal software inspections, safety analyses, testing, and maintenance. The greatest emphasis is placed on clearly and completely defining system and software requirements, including safety and reliability requirements. Unfortunately, development and review of requirements are the weakest link in the process. While some of the more academic methods, e.g. mathematical models, may help bring about safer software, this paper proposes the use of currently approved software methodologies and sound software and assurance practices to show how, to a large degree, safety can be designed into software from the start. NASA's approach today is to first conduct a preliminary system hazard analysis (PHA) during the concept and planning phase of a project. This determines the overall hazard potential of the system to be built. Shortly thereafter, as the system requirements are being defined, the second iteration of hazard analyses takes place: the system hazard analysis (SHA). During the system requirements phase, decisions are made as to which functions of the system will be the responsibility of software. This is the most critical time to affect the safety of the software. From this point, software safety analyses as well as software engineering practices are the main focus for assuring safe software. While many of the steps proposed in this paper seem like just sound engineering practices, they are the best technical and most cost-effective means to assure safe software within a safe system.
Gibbons, Michael C; Lowry, Svetlana Z; Patterson, Emily S
2014-01-01
Background There is growing recognition that design flaws in health information technology (HIT) lead to increased cognitive work, impact workflows, and produce other undesirable user experiences that contribute to usability issues and, in some cases, patient harm. These usability issues may in turn contribute to HIT utilization disparities and patient safety concerns, particularly among “non-typical” HIT users and their health care providers. Health care disparities are associated with poor health outcomes, premature death, and increased health care costs. HIT has the potential to reduce these disparate outcomes. In the computer science field, it has long been recognized that embedded cultural assumptions can reduce the usability, usefulness, and safety of HIT systems for populations whose characteristics differ from “stereotypical” users. Among these non-typical users, inappropriate embedded design assumptions may contribute to health care disparities. It is unclear how to address potentially inappropriate embedded HIT design assumptions once detected. Objective The objective of this paper is to explain HIT universal design principles derived from the human factors engineering literature that can help to overcome potential usability and/or patient safety issues that are associated with unrecognized, embedded assumptions about cultural groups when designing HIT systems. Methods Existing best practices, guidance, and standards in software usability and accessibility were subjected to a 5-step expert review process to identify and summarize those best practices, guidance, and standards that could help identify and/or address embedded design assumptions in HIT that could negatively impact patient safety, particularly for non-majority HIT user populations. An iterative consensus-based process was then used to derive evidence-based design principles from the data to address potentially inappropriate embedded cultural assumptions. Results Design principles that may help identify and address embedded HIT design assumptions are available in the existing literature. Conclusions Evidence-based HIT design principles derived from existing human factors and informatics literature can help HIT developers identify and address embedded cultural assumptions that may underlie HIT-associated usability and patient safety concerns as well as health care disparities. PMID:27025349
Automated UHPLC separation of 10 pharmaceutical compounds using software-modeling.
Zöldhegyi, A; Rieger, H-J; Molnár, I; Fekhretdinova, L
2018-03-20
Human mistakes are still one of the main underlying causes of regulatory issues and, in compliance with the FDA's Data Integrity requirements and Analytical Quality by Design (AQbD), must be eliminated. To develop smooth, fast and robust methods that are free of human failures, a state-of-the-art automation approach is presented. For the scope of this study, a commercial software package (DryLab) and a model mixture of 10 drugs were subjected to testing. Following AQbD principles, the best available working point was selected and confirmatory experimental runs, i.e. the six worst cases of the conducted robustness calculation, were performed. The simulated results were found to be in excellent agreement with the experimental ones, proving the usefulness and effectiveness of automated, software-assisted analytical method development. Copyright © 2018. Published by Elsevier B.V.
User-centered design of multi-gene sequencing panel reports for clinicians.
Cutting, Elizabeth; Banchero, Meghan; Beitelshees, Amber L; Cimino, James J; Fiol, Guilherme Del; Gurses, Ayse P; Hoffman, Mark A; Jeng, Linda Jo Bone; Kawamoto, Kensaku; Kelemen, Mark; Pincus, Harold Alan; Shuldiner, Alan R; Williams, Marc S; Pollin, Toni I; Overby, Casey Lynnette
2016-10-01
The objective of this study was to develop a high-fidelity prototype for delivering multi-gene sequencing panel (GS) reports to clinicians that simulates the user experience of a final application. The delivery and use of GS reports can occur within complex and high-paced healthcare environments. We employ a user-centered software design approach in a focus group setting in order to facilitate gathering rich contextual information from a diverse group of stakeholders potentially impacted by the delivery of GS reports relevant to two precision medicine programs at the University of Maryland Medical Center. Responses from focus group sessions were transcribed, coded and analyzed by two team members. Notification mechanisms and information resources preferred by participants from our first phase of focus groups were incorporated into scenarios and the design of a software prototype for delivering GS reports. The goal of our second phase of focus group, to gain input on the prototype software design, was accomplished through conducting task walkthroughs with GS reporting scenarios. Preferences for notification, content and consultation from genetics specialists appeared to depend upon familiarity with scenarios for ordering and delivering GS reports. Despite familiarity with some aspects of the scenarios we proposed, many of our participants agreed that they would likely seek consultation from a genetics specialist after viewing the test reports. In addition, participants offered design and content recommendations. Findings illustrated a need to support customized notification approaches, user-specific information, and access to genetics specialists with GS reports. These design principles can be incorporated into software applications that deliver GS reports. Our user-centered approach to conduct this assessment and the specific input we received from clinicians may also be relevant to others working on similar projects. Copyright © 2016 Elsevier Inc. All rights reserved.
Analysis of a mammography teaching program based on an affordance design model.
Luo, Ping; Eikman, Edward A; Kealy, William; Qian, Wei
2006-12-01
The wide use of computer technology in education, particularly in mammogram reading, calls for e-learning evaluation. The existing media-comparison studies, learner attitude evaluations, and performance tests are problematic. Based on an affordance design model, this study examined an existing e-learning program on mammogram reading. The selection criteria included content relatedness, representativeness, e-learning orientation, image quality, program completeness, and accessibility. A case study was conducted to examine the affordance features, functions, and presentations of the selected software. Data collection and analysis methods included interviews, protocol-based document analysis, and usability tests and inspection; some descriptive statistics were also calculated. The examination of PBE showed that the educational software provides a number of purpose-built tools. The learner can use these tools to optimize displays, scan images, compare different projections, mark regions of interest, construct a descriptive report, assess learning outcomes, and compare one's decisions with the experts' decisions. Further, PBE provides resources for the learner to construct knowledge and skills, including a categorized image library, a term-searching function, and some teaching links. In addition, users found it easy to navigate and carry out tasks, and they reacted positively toward PBE's navigation system, instructional aids, layout, pace and flow of information, graphics, and other presentation design. The software provides learners with cognitive tools that support their perceptual problem-solving processes and extend their capabilities. Learners can internalize the mental models of mammogram reading through multiple perceptual triangulations, sensitization to related features, semantic description of mammogram findings, and expert-guided semantic report construction. The design of these cognitive tools and the software interface matches the findings and principles of human learning and instructional design. Working with PBE's case-based simulations and categorized gallery, learners can enrich their experience and transfer it to their jobs.
Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing
Kang, Mikyung; Kang, Dong-In; Crago, Stephen P.; Park, Gyung-Leen; Lee, Junghoon
2011-01-01
Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM) which is a system software to monitor the application behavior at run-time, analyze the collected information, and optimize cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation as well as underlying hardware through a performance counter optimizing its computing configuration based on the analyzed data. PMID:22163811
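The monitor-analyze-adapt cycle described for RTM can be outlined as below. The counter source and the adaptation policy are placeholders: a real RTM instruments library calls and reads hardware performance counters rather than the simulated values used here.

```cpp
// Sketch of a run-time monitor loop: collect measurements, analyze, adapt.
// The "performance counter" here is simulated; a real RTM would read hardware
// counters and instrument library calls.
#include <cstdio>

struct Sample { double cpu_utilization; double memory_utilization; };

Sample read_counters(int step) {               // placeholder for instrumentation
    return { 0.50 + 0.05 * step, 0.30 };
}

int threads_for(const Sample& s, int current) {
    if (s.cpu_utilization > 0.80) return current + 1;   // simple adaptation policy
    if (s.cpu_utilization < 0.20 && current > 1) return current - 1;
    return current;
}

int main() {
    int worker_threads = 2;
    for (int step = 0; step < 10; ++step) {
        Sample s = read_counters(step);                     // monitor
        worker_threads = threads_for(s, worker_threads);    // analyze + adapt
        std::printf("step %d: cpu=%.2f -> %d worker threads\n",
                    step, s.cpu_utilization, worker_threads);
    }
}
```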
NASA Technical Reports Server (NTRS)
Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga; Stephens, Philip; Iijima, Bryron A.
2013-01-01
Modeling and imaging the Earth's ionosphere as well as understanding its structures, inhomogeneities, and disturbances is a key part of NASA's Heliophysics Directorate science roadmap. This invention provides a design tool for scientific missions focused on the ionosphere. It is a scientifically important and technologically challenging task to assess the impact of a new observation system quantitatively on our capability of imaging and modeling the ionosphere. This question is often raised whenever a new satellite system is proposed, a new type of data is emerging, or a new modeling technique is developed. The proposed constellation would be part of a new observation system with more low-Earth orbiters tracking more radio occultation signals broadcast by Global Navigation Satellite System (GNSS) than those offered by the current GPS and COSMIC observation system. A simulation system was developed to fulfill this task. The system is composed of a suite of software that combines the Global Assimilative Ionospheric Model (GAIM) including first-principles and empirical ionospheric models, a multiple- dipole geomagnetic field model, data assimilation modules, observation simulator, visualization software, and orbit design, simulation, and optimization software.
ERIC Educational Resources Information Center
Biju, Soly Mathew
2008-01-01
Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…
Estimation of sample size and testing power (Part 4).
Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo
2012-01-01
Sample size estimation is necessary for any experimental or survey research, and an appropriate estimation based on known information and statistical knowledge is of great significance. This article introduces methods of sample size estimation for difference tests with a one-factor, two-level design, including sample size estimation formulas and their realization, based on the formulas and on the POWER procedure of the SAS software, for both quantitative and qualitative data. In addition, this article presents worked examples, which will help researchers implement the repetition principle during the research design phase.
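One widely used formula consistent with the quantitative one-factor, two-level case described above is the per-group sample size for a two-sided comparison of two means (standard notation; the worked numbers are illustrative):

```latex
% Per-group sample size for a two-sided test comparing two means
% (common variance sigma^2, detectable difference delta, significance alpha, power 1-beta)
n \;=\; \frac{2\,\left(z_{1-\alpha/2} + z_{1-\beta}\right)^{2}\,\sigma^{2}}{\delta^{2}}
```

For example, with σ = 10, δ = 5, α = 0.05 (z ≈ 1.96) and 80% power (z ≈ 0.84), n ≈ 2(2.80)²(100)/25 ≈ 63 per group; the SAS POWER procedure mentioned above automates such calculations.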
On Designing Lightweight Threads for Substrate Software
NASA Technical Reports Server (NTRS)
Haines, Matthew
1997-01-01
Existing user-level thread packages employ a 'black box' design approach, where the implementation of the threads is hidden from the user. While this approach is often sufficient for application-level programmers, it hides critical design decisions that system-level programmers must be able to change in order to provide efficient service for high-level systems. By applying the principles of Open Implementation Analysis and Design, we construct a new user-level threads package that supports common thread abstractions and a well-defined meta-interface for altering the behavior of these abstractions. As a result, system-level programmers will have the advantages of using high-level thread abstractions without having to sacrifice performance, flexibility or portability.
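The "well-defined meta-interface" idea can be sketched as a thread package whose scheduling policy is an explicit, replaceable hook rather than a hidden implementation detail. The names below are hypothetical and do not correspond to the package described in the paper.

```cpp
// Sketch of an open-implementation style thread abstraction: the usual
// high-level interface plus a meta-interface for swapping the scheduling policy.
// Names are hypothetical; this is not the paper's actual package.
#include <cstdio>
#include <deque>
#include <functional>

struct Task { int id; std::function<void()> body; };

class SchedulerPolicy {                          // meta-interface hook
public:
    virtual ~SchedulerPolicy() = default;
    virtual Task pick(std::deque<Task>& ready) = 0;
};

class FifoPolicy : public SchedulerPolicy {
public:
    Task pick(std::deque<Task>& ready) override {
        Task t = ready.front(); ready.pop_front(); return t;
    }
};

class LifoPolicy : public SchedulerPolicy {
public:
    Task pick(std::deque<Task>& ready) override {
        Task t = ready.back(); ready.pop_back(); return t;
    }
};

int main() {
    std::deque<Task> ready;
    for (int i = 0; i < 3; ++i)
        ready.push_back({i, [i] { std::printf("task %d ran\n", i); }});

    LifoPolicy policy;                           // system-level programmer's choice
    while (!ready.empty()) policy.pick(ready).body();
}
```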
Numerical evaluation of an innovative cup layout for open volumetric solar air receivers
NASA Astrophysics Data System (ADS)
Cagnoli, Mattia; Savoldi, Laura; Zanino, Roberto; Zaversky, Fritz
2016-05-01
This paper proposes an innovative volumetric solar absorber design to be used in high-temperature air receivers of solar power tower plants. The innovative absorber, a so-called CPC-stacked-plate configuration, applies the well-known principle of a compound parabolic concentrator (CPC) for the first time in a volumetric solar receiver, heating air to high temperatures. The proposed absorber configuration is analyzed numerically, applying first the open-source ray-tracing software Tonatiuh in order to obtain the solar flux distribution on the absorber's surfaces. Next, a Computational Fluid Dynamic (CFD) analysis of a representative single channel of the innovative receiver is performed, using the commercial CFD software ANSYS Fluent. The solution of the conjugate heat transfer problem shows that the behavior of the new absorber concept is promising, however further optimization of the geometry will be necessary in order to exceed the performance of the classical absorber designs.
Programmable multi-node quantum network design and simulation
NASA Astrophysics Data System (ADS)
Dasari, Venkat R.; Sadlier, Ronald J.; Prout, Ryan; Williams, Brian P.; Humble, Travis S.
2016-05-01
Software-defined networking offers a device-agnostic programmable framework to encode new network functions. Externally centralized control plane intelligence allows programmers to write network applications and to build functional network designs. OpenFlow is a key protocol widely adopted to build programmable networks because of its programmability, flexibility and ability to interconnect heterogeneous network devices. We simulate the functional topology of a multi-node quantum network that uses programmable network principles to manage quantum metadata for protocols such as teleportation, superdense coding, and quantum key distribution. We first show how the OpenFlow protocol can manage the quantum metadata needed to control the quantum channel. We then use numerical simulation to demonstrate robust programmability of a quantum switch via the OpenFlow network controller while executing an application of superdense coding. We describe the software framework implemented to carry out these simulations and we discuss near-term efforts to realize these applications.
Designing a SCADA system simulator for fast breeder reactor
NASA Astrophysics Data System (ADS)
Nugraha, E.; Abdullah, A. G.; Hakim, D. L.
2016-04-01
A SCADA (Supervisory Control and Data Acquisition) system simulator is Human Machine Interface-based software that visualizes the process of a plant. This study describes the design of a SCADA system simulator that aims to help the operator monitor and control the plant, handle alarms, and access historical data and historical trends in a Nuclear Power Plant (NPP) of the Fast Breeder Reactor (FBR) type. The research simulated the FBR-type NPP at Kalpakkam, India. The simulator was developed using Wonderware InTouch 10 software and is equipped with a main menu, plant overview, area graphics, control displays, set point displays, an alarm system, real-time trending, historical trending and a security system. The simulator properly reproduces the principles of energy flow and the energy conversion process in an FBR-type NPP, and it can be used as a training medium for prospective FBR-type NPP operators.
Yan, Koon-Kiu; Fang, Gang; Bhardwaj, Nitin; Alexander, Roger P; Gerstein, Mark
2010-05-18
The genome has often been called the operating system (OS) for a living organism. A computer OS is described by a regulatory control network termed the call graph, which is analogous to the transcriptional regulatory network in a cell. To apply our firsthand knowledge of the architecture of software systems to understand cellular design principles, we present a comparison between the transcriptional regulatory network of a well-studied bacterium (Escherichia coli) and the call graph of a canonical OS (Linux) in terms of topology and evolution. We show that both networks have a fundamentally hierarchical layout, but there is a key difference: The transcriptional regulatory network possesses a few global regulators at the top and many targets at the bottom; conversely, the call graph has many regulators controlling a small set of generic functions. This top-heavy organization leads to highly overlapping functional modules in the call graph, in contrast to the relatively independent modules in the regulatory network. We further develop a way to measure evolutionary rates comparably between the two networks and explain this difference in terms of network evolution. The process of biological evolution via random mutation and subsequent selection tightly constrains the evolution of regulatory network hubs. The call graph, however, exhibits rapid evolution of its highly connected generic components, made possible by designers' continual fine-tuning. These findings stem from the design principles of the two systems: robustness for biological systems and cost effectiveness (reuse) for software systems.
Software for real-time control of a tidal liquid ventilator.
Heckman, J L; Hoffman, J; Shaffer, T H; Wolfson, M R
1999-01-01
The purpose of this project was to develop and test computer software and control algorithms designed to operate a tidal liquid ventilator. The tests were executed on a 90-MHz Pentium PC with 16 MB RAM and a prototype liquid ventilator. The software was designed using Microsoft Visual C++ (Ver. 5.0) and the Microsoft Foundation Classes. It uses a graphic user interface, is multithreaded, runs in real time, and has a built-in simulator that facilitates user education in liquid-ventilation principles. The operator can use the software to specify ventilation parameters such as the frequency of ventilation, the tidal volume, and the inspiratory-expiratory time ratio. Commands are implemented via control of the pump speed and by setting the position of two two-way solenoid-controlled valves. Data for use in monitoring and control are gathered by analog-to-digital conversion. Control strategies are implemented to maintain lung volumes and airway pressures within desired ranges, according to limits set by the operator. Also, the software allows the operator to define the shape of the flow pulse during inspiration and expiration, and to optimize perfluorochemical liquid transfer while minimizing airway pressures and maintaining the desired tidal volume. The operator can stop flow during inspiration and expiration to measure alveolar pressures. At the end of expiration, the software stores all user commands and 30 ventilation parameters into an Excel spreadsheet for later review and analysis. Use of these software and control algorithms affords user-friendly operation of a tidal liquid ventilator while providing precise control of ventilation parameters.
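The operator-specified parameters mentioned above (ventilation frequency, tidal volume, inspiratory-expiratory time ratio) map onto per-breath timing and mean pump flow through simple arithmetic, sketched here with invented settings; the actual controller additionally closes the loop on measured lung volumes and airway pressures.

```cpp
// Sketch: derive per-breath timing and mean pump flow from operator settings.
// Values are illustrative; a real controller also uses measured pressures/volumes.
#include <cstdio>

int main() {
    double frequency = 5.0;        // breaths per minute (liquid ventilation is slow)
    double tidal_volume = 0.15;    // litres per breath
    double ie_ratio = 0.5;         // inspiratory:expiratory time ratio (1:2)

    double breath_period = 60.0 / frequency;                     // seconds
    double t_insp = breath_period * ie_ratio / (1.0 + ie_ratio); // inspiration time
    double t_exp  = breath_period - t_insp;                      // expiration time

    double insp_flow = tidal_volume / t_insp;   // mean pump flow during inspiration
    double exp_flow  = tidal_volume / t_exp;    // mean flow during expiration

    std::printf("period %.1f s: inspire %.1f s at %.3f L/s, expire %.1f s at %.3f L/s\n",
                breath_period, t_insp, insp_flow, t_exp, exp_flow);
}
```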
Farinango, Charic D; Benavides, Juan S; Cerón, Jesús D; López, Diego M; Álvarez, Rosa E
2018-01-01
Previous studies have demonstrated the effectiveness of information and communication technologies to support healthy lifestyle interventions. In particular, personal health record systems (PHR-Ss) empower self-care, which is essential to support lifestyle changes. Approaches such as user-centered design (UCD), which is already a standard within the software industry (ISO 9241-210:2010), provide specifications and guidelines to guarantee user acceptance and quality of eHealth systems. However, no PHR-S for metabolic syndrome (MS) developed following the recommendations of the ISO 9241-210:2010 specification has been found in the literature. The aim of this study was to describe the development of a PHR-S for the management of MS according to the principles and recommendations of the ISO 9241-210 standard. The proposed PHR-S was developed using a formal software development process which, in addition to the traditional activities of any software process, included the principles and recommendations of the ISO 9241-210 standard. To gather user information, a survey of 1,187 individuals, eight interviews, and a focus group with seven people were conducted. Throughout five iterations, three prototypes were built, and potential users evaluated each prototype. The quality attributes of efficiency, effectiveness, and user satisfaction were assessed using metrics defined in the ISO/IEC 25022 standard. The following results were obtained: 1) a technology profile of 1,187 individuals at risk for MS from the city of Popayan, Colombia, showing that 75.2% of the people use the Internet and 51% had a smartphone; 2) a PHR-S to manage MS, with five main functionalities: recording the five MS risk factors, sharing these measures with health care professionals, and three educational modules on nutrition, stress management, and physical activity; and 3) usability tests on each prototype, with the following results: 100% effectiveness, 100% efficiency, and 84.2 points on the System Usability Scale. The software development methodology used was based on the ISO 9241-210 standard, which allowed the development team to maintain a focus on users' needs and requirements throughout the project and resulted in increased satisfaction and acceptance of the system. Additionally, the establishment of a multidisciplinary team allowed the application of considerations not only from the disciplines of software engineering and health sciences but also from other disciplines such as graphic design and media communication. Finally, usability testing allowed the observation of flaws in the designs, which helped to improve the solution.
2016-04-01
...infrastructure. The work is motivated by the fact that today's clouds are very static, uniform, and predictable, allowing attackers who identify a... vulnerability in one of the services or infrastructure components to spread their effect to other, mission-critical services. Our goal is to integrate into... clouds by elevating continuous change, evolution, and misinformation as first-rate design principles of the cloud's infrastructure. Our work is...
NASA Astrophysics Data System (ADS)
Lunt, T.; Fuchs, J. C.; Mank, K.; Feng, Y.; Brochard, F.; Herrmann, A.; Rohde, V.; Endstrasser, N.; ASDEX Upgrade Team
2010-11-01
A generally available and easy-to-use viewer for the simultaneous visualisation of the ASDEX Upgrade vacuum vessel computer aided design models, diagnostics and magnetic geometry, solutions of 3D plasma simulation codes and 2D camera images was developed. Here we report on the working principle of this software and give several examples of its technical and scientific application.
Software Compensates Electronic-Nose Readings for Humidity
NASA Technical Reports Server (NTRS)
Zhou, Hanying
2007-01-01
A computer program corrects for the effects of humidity on the readouts of an array of chemical sensors (an "electronic nose"). To enable the use of this program, the array must incorporate an independent humidity sensor in addition to sensors designed to detect analytes other than water vapor. The basic principle of the program was described in "Compensating for Effects of Humidity on Electronic Noses" (NPO-30615), NASA Tech Briefs, Vol. 28, No. 6 (June 2004), page 63. To recapitulate: The output of the humidity sensor is used to generate values that are subtracted from the outputs of the other sensors to correct for contributions of humidity to those readings. Hence, in principle, what remains after corrections are the contributions of the analytes only. The outputs of the non-humidity sensors are then deconvolved to obtain the concentrations of the analytes. In addition, the humidity reading is retained as an analyte reading in its own right. This subtraction of the humidity background increases the ability of the software to identify such events as spills in which contaminants may be present in small concentrations and accompanied by large changes in humidity.
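A minimal numerical sketch of the background-subtraction idea (the sensor count, humidity-response coefficients, and readings are hypothetical, not the NPO-30615 implementation):

```python
import numpy as np

# Hypothetical calibration: per-sensor response (arbitrary units) to a unit change
# in relative humidity, measured beforehand with clean, humidified air.
HUMIDITY_COEFFS = np.array([0.8, 0.3, 1.1, 0.5])

def compensate_humidity(sensor_outputs, humidity_reading, baseline_humidity):
    """Subtract the humidity contribution from each chemical-sensor output."""
    delta_h = humidity_reading - baseline_humidity
    return np.asarray(sensor_outputs) - HUMIDITY_COEFFS * delta_h

# Example: a small spill raises the analyte signals while humidity jumps by 25 %RH.
raw = np.array([22.1, 8.4, 29.0, 13.9])
corrected = compensate_humidity(raw, humidity_reading=65.0, baseline_humidity=40.0)
print(corrected)   # residual responses attributed to the analytes only
```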
Design and realization of assessment software for DC-bias of transformers
NASA Astrophysics Data System (ADS)
Liu, Chang; Liu, Lian-guang; Yuan, Zhong-chen
2013-03-01
A transformer working at its rated state will be partially saturated under DC bias, and its magnetizing current will be distorted with various harmonics, increasing reactive power demand and causing other associated phenomena that threaten the safe operation of the power grid. This paper establishes a transformer saturation circuit model of DC bias, based on the duality principle and the J-A theory, which reflects the hysteresis characteristics of the iron core, and develops software that can assess the effects of transformer DC bias using hybrid C#.NET and MATLAB programming on the Microsoft .NET platform. The software is able to simulate the magnetizing current for different structures and to assess the saturation level of transformers and the influence of the associated phenomena according to the transformer parameters and the DC equivalent voltage. It provides an effective method to assess the impact on transformers of magnetic storm disasters and of the earthing current of HVDC projects.
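As a rough illustration of how the harmonic content of a distorted magnetizing current can be quantified, the sketch below drives a synthetic saturation curve (a simple arctan stand-in with a bias term, not the paper's J-A hysteresis model) with a sinusoidal flux and reports harmonic amplitudes from an FFT:

```python
import numpy as np

# Synthetic stand-in for a saturated, DC-biased core: an arctan magnetizing curve
# shifted by a bias term (illustrative only; the paper uses a J-A based model).
def magnetizing_current(flux, bias=0.2):
    return 0.05 * flux + 2.0 * np.arctan(5.0 * (flux + bias))

f, fs, cycles = 50.0, 10_000.0, 10
t = np.arange(0.0, cycles / f, 1.0 / fs)
i_mag = magnetizing_current(np.sin(2 * np.pi * f * t))   # distorted current

# Harmonic amplitudes relative to the fundamental; the bias term produces the
# even harmonics characteristic of DC-biased transformers.
spectrum = 2.0 * np.abs(np.fft.rfft(i_mag)) / len(i_mag)
freqs = np.fft.rfftfreq(len(i_mag), 1.0 / fs)
fundamental = spectrum[np.argmin(np.abs(freqs - f))]
for n in range(2, 6):
    amp = spectrum[np.argmin(np.abs(freqs - n * f))]
    print(f"harmonic {n}: {amp / fundamental:.1%} of fundamental")
```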
Optical design of free face reflective headlamps
NASA Astrophysics Data System (ADS)
Cen, Zhao Feng; Li, Xiao Tong; Deng, Shi Tao
2005-02-01
Headlamps are installed at the front of automobiles for road lighting, and standards such as the ECE regulations govern their illumination and anti-dazzle performance. Free face reflective (FFR) headlamps are increasingly applied, and the light distribution design of the FFR mirror has become an important subject in the field of automobile components. In this paper the surface shape of FFR headlamps is analyzed and described as a multi-partition aspherical surface with a few simple parameters. According to the fundamental principles of geometrical optics, and using ray tracing with energy transport, millions of real rays emitted from the low-beam and high-beam filaments are traced, and the relative illumination intensity on a test screen 25 m from the automobile is obtained. The paper discusses the description of FFR mirrors, presents the FFR headlamp design algorithm, gives the flow chart, and describes the light distribution simulation software with its user-friendly interfaces. In the software's light distribution graphic interface, the illuminated area can be dragged to a given position while the parameters of the current mirror partition are updated automatically. With this software, FFR headlamps satisfying the criteria can be designed very quickly and the 3D coordinates of any point on the mirror obtained, which makes CAM of FFR headlamps straightforward.
Ten recommendations for software engineering in research.
Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph
2014-01-01
Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software.
Computing design principles for robotic telescopes
NASA Astrophysics Data System (ADS)
Bowman, Mark K.; Ford, Martyn J.; Lett, Robert D. J.; McKay, Derek J.; Mücke-Herzberg, Dorothy; Norbury, Martin A.
2002-12-01
Telescopes capable of making observing decisions independent of human supervision have become a reality in the 21st century. These new telescopes are likely to replace automated systems as the telescopes of choice. A fully robotic implementation offers not only reduced operating costs, but also significant gains in scientific output over automated or remotely operated systems. The design goals are to maximise the telescope operating time and minimise the cost of diagnosis and repair. However, the demands of a robotic telescope greatly exceed those of its remotely operated counterpart, and the design of the computing system is key to its operational performance. This paper outlines the challenges facing the designer of these computing systems, and describes some of the principles of design which may be applied. Issues considered include automatic control and efficiency, system awareness, robustness and reliability, access, security and safety, as well as ease-of-use and maintenance. These requirements cannot be considered simply within the context of the application software. Hence, this paper takes into account operating system, hardware and environmental issues. Consideration is also given to accommodating different levels of manual control within robotic telescopes, as well as methods of accessing and overriding the system in the event of failure.
Chou, Ting-Chao
2006-09-01
The median-effect equation derived from the mass-action law principle at equilibrium-steady state via mathematical induction and deduction for different reaction sequences and mechanisms and different types of inhibition has been shown to be the unified theory for the Michaelis-Menten equation, Hill equation, Henderson-Hasselbalch equation, and Scatchard equation. It is shown that dose and effect are interchangeable via defined parameters. This general equation for the single drug effect has been extended to the multiple drug effect equation for n drugs. These equations provide the theoretical basis for the combination index (CI)-isobologram equation that allows quantitative determination of drug interactions, where CI < 1, = 1, and > 1 indicate synergism, additive effect, and antagonism, respectively. Based on these algorithms, computer software has been developed to allow automated simulation of synergism and antagonism at all dose or effect levels. It displays the dose-effect curve, median-effect plot, combination index plot, isobologram, dose-reduction index plot, and polygonogram for in vitro or in vivo studies. This theoretical development, experimental design, and computerized data analysis have facilitated dose-effect analysis for single drug evaluation or carcinogen and radiation risk assessment, as well as for drug or other entity combinations in a vast field of disciplines of biomedical sciences. In this review, selected examples of applications are given, and step-by-step examples of experimental designs and real data analysis are also illustrated. The merging of the mass-action law principle with mathematical induction-deduction has been proven to be a unique and effective scientific method for general theory development. The median-effect principle and its mass-action law based computer software are gaining increased applications in biomedical sciences, from how to effectively evaluate a single compound or entity to how to beneficially use multiple drugs or modalities in combination therapies.
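A minimal sketch of the median-effect and combination index calculations described above (the dose-effect values are hypothetical, this is not the published software, and the mutually exclusive form of the CI is assumed):

```python
import numpy as np

def median_effect_params(doses, fa):
    """Fit m and Dm from the median-effect plot: log(fa/fu) = m log(D) - m log(Dm)."""
    doses, fa = np.asarray(doses, float), np.asarray(fa, float)
    y = np.log10(fa / (1.0 - fa))
    m, b = np.polyfit(np.log10(doses), y, 1)
    return m, 10.0 ** (-b / m)          # slope m, median-effect dose Dm

def dose_for_effect(fa, m, Dm):
    """Invert the median-effect equation: D = Dm * (fa / (1 - fa))**(1/m)."""
    return Dm * (fa / (1.0 - fa)) ** (1.0 / m)

def combination_index(d1, d2, fa, params1, params2):
    """CI < 1 synergism, CI = 1 additive effect, CI > 1 antagonism."""
    return d1 / dose_for_effect(fa, *params1) + d2 / dose_for_effect(fa, *params2)

# Hypothetical single-drug dose-effect data (fractions affected).
p1 = median_effect_params([1, 2, 4, 8], [0.20, 0.40, 0.65, 0.85])
p2 = median_effect_params([5, 10, 20, 40], [0.15, 0.35, 0.60, 0.80])
# A combination of 2 + 10 dose units observed to affect 70% of the system:
print(round(combination_index(2.0, 10.0, 0.70, p1, p2), 2))
```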
NASA Astrophysics Data System (ADS)
Radygin, V. Y.; Lukyanova, N. V.; Kupriyanov, D. Yu.
2017-01-01
The transformation of learning management systems over the last two decades was investigated. The features of using e-learning systems for in-class education were discussed. The necessity of integrating the e-learning system with the student performance control system was shown. The influence of the choice of student ranking system on students' motivation was described. Our own approach to the choice of e-learning system design principles and technologies is suggested.
Computer presentation of data in science.
NASA Astrophysics Data System (ADS)
Simmonds, D.; Reynolds, L.
Contents: How this book was created. Foreword. 1. Introduction. 2. Choosing your system and software. 3. Working methods. 4. Preparing manuscripts and camera-ready copy. 5. Principles of typography and layout. 6. Using type and space to show the structure of text. 7. Artwork creation and drawing tips. 8. Posters, slides and OHP transparencies. 9. Designing with colour. Glossaries 1 and 2. Appendix 1: Copyfitting. Appendix 2: Signatures and imposition. Appendix 3: Publishing and the law. Appendix 4: Working comfort.
Studies to design and develop improved remote manipulator systems
NASA Technical Reports Server (NTRS)
Hill, J. W.; Sword, A. J.
1973-01-01
The remote manipulator control considered is based on several levels of automatic supervision which derive manipulator commands from an analysis of sensor states and task requirements. The principal sensors are manipulator joint position, tactile, and current sensors. The tactile sensor states can be displayed visually in perspective, replicated in the operator's control handle, or perceived by the automatic supervisor. Studies are reported on control organization, operator performance, and system performance measures. Unusual hardware and software details are described.
The Need for Vendor Source Code at NAS. Revised
NASA Technical Reports Server (NTRS)
Carter, Russell; Acheson, Steve; Blaylock, Bruce; Brock, David; Cardo, Nick; Ciotti, Bob; Poston, Alan; Wong, Parkson; Chancellor, Marisa K. (Technical Monitor)
1997-01-01
The Numerical Aerodynamic Simulation (NAS) Facility has a long-standing practice of maintaining buildable source code for installed hardware. There are two reasons for this: NAS's designated pathfinding role, and the need to maintain a smoothly running operational capacity given the widely diversified nature of the vendor installations. NAS needs to maintain support capabilities when vendors are not able to; to diagnose and remedy hardware or software problems where applicable; and to support ongoing system software development activities whether or not the relevant vendors feel support is justified. This note provides an informal history of these activities at NAS, and brings together the general principles that drive the requirement that systems integrated into the NAS environment run binaries built from source code, onsite.
Third-Party Software's Trust Quagmire.
Voas, J; Hurlburt, G
2015-12-01
Current software development has trended toward integrating independent software sub-functions to create more complete software systems. These sub-functions are often not homegrown; instead, they are developed by unknown third-party organizations and reside in software marketplaces owned or controlled by others. Such software sub-functions raise legitimate concerns about quality, origin, functionality, security, and interoperability, to name a few. This article surveys key technical difficulties in confidently building systems from acquired software sub-functions by calling out the principal software supply chain actors.
Development of an ICT in IBSE course for science teachers: A design-based research
NASA Astrophysics Data System (ADS)
Tran, Trinh-Ba
2018-01-01
Integration of ICT tools for measuring with sensors, analyzing video, and modelling into Inquiry-Based Science Education (IBSE) is a globally recognized need. The challenge for teachers is how to turn manipulation of equipment and software into manipulation of ideas. We have developed a short ICT in IBSE course to prepare and support science teachers to teach inquiry-based activities with ICT tools. Within the framework of design-based research, we first defined the pedagogical principles from the literature, developed core materials for teacher learning, explored boundary conditions of the training in different countries, and elaborated set-ups of the course for the Dutch, Slovak, and Vietnamese contexts. Next, we taught and evaluated three iterative cycles of the Dutch course set-ups for pre-service science teachers from four teacher-education institutes nationwide. In each cycle, data on teacher learning were collected via observations, questionnaires, interviews, and documents. These data were then analyzed to address the questions about faithful implementation and effectiveness of the course. Following the same approach, we taught and evaluated two cycles of the Slovak course set-ups for in-service science teachers in the context of the national accreditation programme for teacher professional development. In addition, we investigated the applicability of the final Dutch course set-up in the context of the physics-education master program in Vietnam, with adaptations geared to educational and cultural differences. Through the iterations of implementation, evaluation, and revision, the course objectives were eventually achieved to a certain extent; the pedagogical principles and core materials proved to be effective and applicable in different contexts. We started this research and design project with the pedagogical principles and concluded it with these principles (i.e. complete theory-practice cycle, depth first, distributed learning, and ownership of learning) as the core of the basic design of the ICT in IBSE course. These principles can be considered as independent, validated educational products, which teacher educators can "buy into" and use for broader aims than only "ICT in IBSE" integration. Pedagogical principles establish the theoretical model underlying the course design, provide guidelines and structure to the (re)design, implementation, evaluation, and optimization process, and help to communicate the design-based research to others. The role of pedagogical principles in design-based research is indeed essential. Moreover, we incorporated a robustness test and a generalizability/transferability test as a further step in our design-based research and achieved successful outcomes with this step. Consequently, we strongly recommend the testing of the design product in routine implementation conditions and in considerably different contexts (e.g. different programmes or even countries) as part of design-based research.
Software engineering and the role of Ada: Executive seminar
NASA Technical Reports Server (NTRS)
Freedman, Glenn B.
1987-01-01
The objective was to introduce the basic terminology and concepts of software engineering and Ada. The life cycle model is reviewed, the goals and principles of software engineering are applied, and an introductory understanding of the features of the Ada language is gained. Topics addressed include: the software crisis; the mandate of the Space Station Program; the software life cycle model; software engineering; and Ada under the software engineering umbrella.
Aslam, Tariq M; Tahir, Humza J; Parry, Neil R A; Murray, Ian J; Kwak, Kun; Heyes, Richard; Salleh, Mahani M; Czanner, Gabriela; Ashworth, Jane
2016-10-01
To report on the utility of a computer tablet-based method for automated testing of visual acuity in children based on the principles of game design. We describe the testing procedure and present repeatability as well as agreement of the score with accepted visual acuity measures. Reliability and validity study. Setting: Manchester Royal Eye Hospital Pediatric Ophthalmology Outpatients Department. Total of 112 sequentially recruited patients. For each patient 1 eye was tested with the Mobile Assessment of Vision by intERactIve Computer for Children (MAVERIC-C) system, consisting of a software application running on a computer tablet, housed in a bespoke viewing chamber. The application elicited touch screen responses using a game design to encourage compliance and automatically acquire visual acuity scores of participating patients. Acuity was then assessed by an examiner with a standard chart-based near ETDRS acuity test before the MAVERIC-C assessment was repeated. Reliability of MAVERIC-C near visual acuity score and agreement of MAVERIC-C score with near ETDRS chart for visual acuity. Altogether, 106 children (95%) completed the MAVERIC-C system without assistance. The vision scores demonstrated satisfactory reliability, with test-retest VA scores having a mean difference of 0.001 (SD ±0.136) and limits of agreement of 2 SD (LOA) of ±0.267. Comparison with the near ETDRS chart showed agreement with a mean difference of -0.0879 (±0.106) with LOA of ±0.208. This study demonstrates promising utility for software using a game design to enable automated testing of acuity in children with ophthalmic disease in an objective and accurate manner. Copyright © 2016 Elsevier Inc. All rights reserved.
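For reference, the repeatability statistics quoted above (mean difference, SD, and limits of agreement taken as ±2 SD) can be computed as follows (a minimal sketch with hypothetical paired logMAR scores, not the study data):

```python
import numpy as np

# Hypothetical paired logMAR scores from a test-retest session (not study data).
test1 = np.array([0.30, 0.18, 0.48, 0.10, 0.60, 0.22])
test2 = np.array([0.28, 0.20, 0.50, 0.12, 0.55, 0.26])

diff = test1 - test2
mean_diff = diff.mean()
loa = 2.0 * diff.std(ddof=1)      # limits of agreement reported as +/- 2 SD

print(f"mean difference: {mean_diff:+.3f} logMAR")
print(f"limits of agreement: +/- {loa:.3f} logMAR")
```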
ERIC Educational Resources Information Center
Rong, Guoping; Shao, Dong
2012-01-01
The importance of delivering software process courses to software engineering students has been more and more recognized in China in recent years. However, students usually cannot fully appreciate the value of software process courses by only learning methodology and principle in the classroom. Therefore, a process-specific project course was…
Neinstein, Aaron; Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh
2016-03-01
Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application ("app"), Blip, to visualize the data. Tidepool's software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool's open source, cloud model for health data interoperability is applicable to other healthcare use cases. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association.
Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh
2016-01-01
Objective Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. Materials and Methods An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Results Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application (“app”), Blip, to visualize the data. Tidepool’s software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. Discussion By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. Conclusion The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool’s open source, cloud model for health data interoperability is applicable to other healthcare use cases. PMID:26338218
Design of multi-mode compatible image acquisition system for HD area array CCD
NASA Astrophysics Data System (ADS)
Wang, Chen; Sui, Xiubao
2014-11-01
In line with the current trends toward digitization and high definition in video surveillance, a multimode-compatible image acquisition system for an HD area array CCD is designed. The hardware and software designs of the color video capture system for the HD area array CCD KAI-02150 from Truesense Imaging are analyzed, and the structural parameters of the HD area array CCD and the color video acquisition principle of the system are introduced. The CCD control sequence and the timing logic of the whole capture system are then realized. The noise in the video signal (kTC noise and 1/f noise) is filtered using the Correlated Double Sampling (CDS) technique to enhance the signal-to-noise ratio of the system. Compatible hardware and software designs for two other image sensors of the same series, KAI-04050 and KAI-08050, are put forward; these two HD image sensors provide four million and eight million effective pixels, respectively. A Field Programmable Gate Array (FPGA) is adopted as the key controller of the system to perform a top-down modular design, which realizes the hardware design in software and improves development efficiency. Finally, the required timing drive signals are simulated accurately using the Quartus II 12.1 development platform together with VHDL. The simulation results indicate that the driving circuit is characterized by a simple framework, low power consumption, and strong anti-interference ability, meeting the demands of miniaturization and high definition.
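Correlated Double Sampling removes the reset (kTC) noise by subtracting, for each pixel, the reset-level sample from the signal-level sample; the common kTC term cancels in the difference. A minimal numerical illustration (noise magnitudes and offsets are made up, not the FPGA implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n_pixels = 100_000
signal = 500.0                                # true pixel signal (arbitrary DN)

# Reset (kTC) noise is fixed between the two samples of the same pixel readout,
# while read noise is independent for each sample.
ktc = rng.normal(0.0, 30.0, n_pixels)
reset_level = 1000.0 + ktc + rng.normal(0.0, 5.0, n_pixels)            # sample 1
signal_level = 1000.0 + ktc + signal + rng.normal(0.0, 5.0, n_pixels)  # sample 2

cds_output = signal_level - reset_level       # kTC term cancels in the difference

print("noise without CDS:", np.std(signal_level))   # ~sqrt(30^2 + 5^2)
print("noise with CDS   :", np.std(cds_output))     # ~sqrt(2) * 5
```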
Taveira-Gomes, Tiago; Ferreira, Patrícia; Taveira-Gomes, Isabel; Severo, Milton; Ferreira, Maria Amélia
2016-08-01
Computer-based learning (CBL) has been widely used in medical education, and reports regarding its usage and effectiveness have ranged broadly. Most work has been done on the effectiveness of CBL approaches versus traditional methods, and little has been done on the comparative effects of CBL versus CBL methodologies. These findings urged other authors to recommend such studies in hopes of improving knowledge about which CBL methods work best in which settings. In this systematic review, we aimed to characterize recent studies of the development of software platforms and interventions in medical education, search for common points among studies, and assess whether recommendations for CBL research are being taken into consideration. We conducted a systematic review of the literature published from 2003 through 2013. We included studies written in English, specifically in medical education, regarding either the development of instructional software or interventions using instructional software, during training or practice, that reported learner attitudes, satisfaction, knowledge, skills, or software usage. We conducted 2 latent class analyses to group articles according to platform features and intervention characteristics. In addition, we analyzed references and citations for abstracted articles. We analyzed 251 articles. The number of publications rose over time, and they encompassed most medical disciplines, learning settings, and training levels, totaling 25 different platforms specifically for medical education. We uncovered 4 latent classes for educational software, characteristically making use of multimedia (115/251, 45.8%), text (64/251, 25.5%), Web conferencing (54/251, 21.5%), and instructional design principles (18/251, 7.2%). We found 3 classes for intervention outcomes: knowledge and attitudes (175/212, 82.6%), knowledge, attitudes, and skills (11.8%), and online activity (12/212, 5.7%). About a quarter of the articles (58/227, 25.6%) did not hold references or citations in common with other articles. The number of common references and citations increased in articles reporting instructional design principles (P=.03), articles measuring online activities (P=.01), and articles citing a review by Cook and colleagues on CBL (P=.04). There was an association between number of citations and studies comparing CBL versus CBL, independent of publication date (P=.02). Studies in this field vary highly, and a high number of software systems are being developed. It seems that past recommendations regarding CBL interventions are being taken into consideration. A move into a more student-centered model, a focus on implementing reusable software platforms for specific learning contexts, and the analysis of online activity to track and predict outcomes are relevant areas for future research in this field.
Web-Based Software for Managing Research
NASA Technical Reports Server (NTRS)
Hoadley, Sherwood T.; Ingraldi, Anthony M.; Gough, Kerry M.; Fox, Charles; Cronin, Catherine K.; Hagemann, Andrew G.; Kemmerly, Guy T.; Goodman, Wesley L.
2007-01-01
aeroCOMPASS is a software system, originally designed to aid in the management of wind tunnels at Langley Research Center, that could be adapted to provide similar aid to other enterprises in which research is performed in common laboratory facilities by users who may be geographically dispersed. Included in aeroCOMPASS is Web-interface software that provides a single, convenient portal to a set of project- and test-related software tools and other application programs. The heart of aeroCOMPASS is a user-oriented document-management software subsystem that enables geographically dispersed users to easily share and manage a variety of documents. A principle of "write once, read many" is implemented throughout aeroCOMPASS to eliminate the need for multiple entry of the same information. The Web framework of aeroCOMPASS provides links to client-side application programs that are fully integrated with databases and server-side application programs. Other subsystems of aeroCOMPASS include ones for reserving hardware, tracking of requests and feedback from users, generating interactive notes, administration of a customer-satisfaction questionnaire, managing execution of tests, managing archives of metadata about tests, planning tests, and providing online help and instruction for users.
On the Development of Multi-Step Inverse FEM with Shell Model
NASA Astrophysics Data System (ADS)
Huang, Y.; Du, R.
2005-08-01
The inverse or one-step finite element approach is increasingly used in the sheet metal stamping industry to predict strain distribution and the initial blank shape in the preliminary design stage. Based on the existing theory, there are two types of method: one is based on the principle of virtual work and the other on the principle of extreme work. Much research has been conducted to improve the accuracy of simulation results. For example, based on the virtual work principle, Batoz et al. developed a new method using triangular DKT shell elements in which the bending and unbending effects are considered. Based on the principle of extreme work, Majlessi et al. proposed the multi-step inverse approach with membrane elements and applied it to an axisymmetric part, and Lee et al. presented an axisymmetric shell element model to solve a similar problem. In this paper, a new multi-step inverse method is introduced with no limitation on the workpiece shape. It is a shell element model based on the virtual work principle. The new method is validated by comparison with the commercial software system (PAMSTAMP®). The comparison results indicate that the accuracy is good.
Generic Software Architecture for Launchers
NASA Astrophysics Data System (ADS)
Carre, Emilien; Gast, Philippe; Hiron, Emmanuel; Leblanc, Alain; Lesens, David; Mescam, Emmanuelle; Moro, Pierre
2015-09-01
The definition and reuse of generic software architecture for launchers is not common, for several reasons: the number of European launcher families is very small (Ariane 5 and Vega over the last decades); the real-time constraints (reactivity and determinism needs) are very hard; and low levels of versatility are required (often implying an ad hoc development of the launcher mission). In comparison, satellites are often built on a generic platform made up of reusable hardware building blocks (processors, star-trackers, gyroscopes, etc.) and reusable software building blocks (middleware, TM/TC, On Board Control Procedure, etc.). While some of these reasons remain valid (e.g. the limited number of developments), the increase in available CPU power today makes achievable an approach based on a generic time-triggered middleware (ensuring the full determinism of the system) and a centralised mission and vehicle management (offering more flexibility in the design and facilitating long-term maintenance). This paper presents an example of a generic software architecture which could be envisaged for future launchers, based on the principles described above and supported by model-driven engineering and automatic code generation.
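A time-triggered design runs a fixed cyclic schedule in which each slot always executes the same task, which is what makes the timeline deterministic. A minimal sketch of that execution pattern (the slot length, task names, and run_cycle helper are assumptions, not the launcher middleware):

```python
import time

# Hypothetical 100 ms major cycle split into four fixed 25 ms slots; each slot
# always runs the same task, which is what makes the timeline deterministic.
SLOT_S = 0.025
SCHEDULE = ["read_sensors", "navigation", "guidance_control", "telemetry"]

def run_cycle(tasks, cycle_start):
    for i, name in enumerate(SCHEDULE):
        tasks[name]()                                  # must finish within its slot
        deadline = cycle_start + (i + 1) * SLOT_S
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)                      # idle until the slot boundary

tasks = {name: (lambda n=name: print("run", n)) for name in SCHEDULE}
t0 = time.monotonic()
for cycle in range(3):                                 # three major cycles
    run_cycle(tasks, t0 + cycle * len(SCHEDULE) * SLOT_S)
```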
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shah, Anuj; Castleton, Karl J.; Hoopes, Bonnie L.
2004-06-01
The study of the release and effects of chemicals in the environment and their associated risks to humans is central to public and private decision making. FRAMES 1.X, the Framework for Risk Analysis in Multimedia Environmental Systems, is a systems modeling software platform developed by Pacific Northwest National Laboratory (PNNL) that helps scientists study the release and effects of chemicals on a source-to-outcome basis and create environmental models for similar risk assessment and management problems. The unique aspect of FRAMES is the ability to dynamically introduce software modules representing individual components of a risk assessment (e.g., source release of contaminants, fate and transport in various environmental media, exposure, etc.) within a software framework, manipulate their attributes, and run simulations to obtain results. This paper outlines the fundamental constituents of FRAMES 2.X, an enhanced version of FRAMES 1.X, that greatly improve the ability of module developers to "plug" their self-developed software modules into the system. The basic design, the underlying principles, and a discussion of the guidelines for module developers are presented.
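The "plug a module into the framework" idea can be illustrated with a simple registry of components sharing a common interface (a minimal sketch; RiskModule, the module names, and the toy formulas are hypothetical and do not reflect the FRAMES 2.X API):

```python
from abc import ABC, abstractmethod

class RiskModule(ABC):
    """Minimal contract a pluggable risk-assessment component might satisfy."""
    name: str

    @abstractmethod
    def run(self, inputs: dict) -> dict: ...

REGISTRY: dict[str, RiskModule] = {}

def register(module: RiskModule) -> None:
    REGISTRY[module.name] = module

class SourceRelease(RiskModule):
    name = "source_release"
    def run(self, inputs):
        # toy formula: a fixed fraction of the inventory is released per day
        return {"flux_kg_per_day": inputs["inventory_kg"] * 0.001}

class RiverTransport(RiskModule):
    name = "river_transport"
    def run(self, inputs):
        # toy dilution into a river with an assumed flow
        return {"concentration_mg_L": inputs["flux_kg_per_day"] / 50.0}

# Assemble a source-to-outcome chain dynamically from registered modules.
for m in (SourceRelease(), RiverTransport()):
    register(m)

state = {"inventory_kg": 200.0}
for step in ("source_release", "river_transport"):
    state.update(REGISTRY[step].run(state))
print(state)
```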
Software Engineering Principles 3-14 August 1981,
1981-08-01
small disk used (but not that of the extended mass storage or large disk option); it is very fast (about 1/5 the speed of the primary memory, where the disk was 1/10000 for access); and ... programmed and tested - must be correct and fast. D. Choice of right synchronization operations: Design problem. 1. Several mentioned in literature.
Methods and analysis of realizing randomized grouping.
Hu, Liang-Ping; Bao, Xiao-Lei; Wang, Qi
2011-07-01
Randomization is one of the four basic principles of research design. The meaning of randomization includes two aspects: one is to randomly select samples from the population, which is known as random sampling; the other is to randomly group all the samples, which is called randomized grouping. Randomized grouping can be subdivided into three categories: completely, stratified and dynamically randomized grouping. This article mainly introduces the steps of complete randomization, the definition of dynamic randomization and the realization of random sampling and grouping by SAS software.
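The article realizes these steps in SAS; a minimal Python sketch of complete and stratified randomization following the same logic (the subject IDs, group count, and seed are illustrative):

```python
import random

def complete_randomization(subject_ids, n_groups=2, seed=2011):
    """Shuffle all subjects once, then deal them into groups of (nearly) equal size."""
    rng = random.Random(seed)
    ids = list(subject_ids)
    rng.shuffle(ids)
    return {g: ids[g::n_groups] for g in range(n_groups)}

def stratified_randomization(subject_strata, n_groups=2, seed=2011):
    """Complete randomization carried out separately within each stratum."""
    strata = {}
    for sid, stratum in subject_strata.items():
        strata.setdefault(stratum, []).append(sid)
    groups = {g: [] for g in range(n_groups)}
    for offset, members in enumerate(strata.values()):
        for g, ids in complete_randomization(members, n_groups, seed + offset).items():
            groups[g].extend(ids)
    return groups

# Twelve hypothetical subjects; sex is the stratification factor.
print(complete_randomization(range(1, 13)))
print(stratified_randomization({i: ("male" if i % 2 else "female") for i in range(1, 13)}))
```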
NASA Astrophysics Data System (ADS)
Gould, C. A.; Shammas, N. Y. A.; Grainger, S.; Taylor, I.; Simpson, K.
2012-06-01
This paper documents the 3D modeling and simulation of a three couple thermoelectric module using the Synopsys Technology Computer Aided Design (TCAD) semiconductor simulation software. Simulation results are presented for thermoelectric power generation, cooling and heating, and successfully demonstrate the basic thermoelectric principles. The 3D TCAD simulation model of a three couple thermoelectric module can be used in the future to evaluate different thermoelectric materials, device structures, and improve the efficiency and performance of thermoelectric modules.
2003-06-01
greater detail in the next section, is to achieve these principles. Besides the fact that these principles illustrate the essence of agile software ... like e.g. ADLER, JASMIN, SAMOC or HEROS. In all of these projects the framework for the process model was the Vorgehensmodell (V-Model) of the ... practical essence of the solutions to manage projects within the constraints of cost, schedule, functionality and quality and ways to get the
15 CFR 734.2 - Important EAR terms and principles.
Code of Federal Regulations, 2010 CFR
2010-01-01
... technology and software not subject to the EAR are described in §§ 734.7 through 734.11 and supplement no. 1... of items subject to the EAR out of the United States, or release of technology or software subject to... source code and object code software subject to the EAR. (2) Export of technology or software. (See...
15 CFR 734.2 - Important EAR terms and principles.
Code of Federal Regulations, 2012 CFR
2012-01-01
... technology and software not subject to the EAR are described in §§ 734.7 through 734.11 and supplement no. 1... of items subject to the EAR out of the United States, or release of technology or software subject to... source code and object code software subject to the EAR. (2) Export of technology or software. (See...
15 CFR 734.2 - Important EAR terms and principles.
Code of Federal Regulations, 2014 CFR
2014-01-01
... technology and software not subject to the EAR are described in §§ 734.7 through 734.11 and supplement no. 1... of items subject to the EAR out of the United States, or release of technology or software subject to... source code and object code software subject to the EAR. (2) Export of technology or software. (See...
15 CFR 734.2 - Important EAR terms and principles.
Code of Federal Regulations, 2013 CFR
2013-01-01
... technology and software not subject to the EAR are described in §§ 734.7 through 734.11 and supplement no. 1... of items subject to the EAR out of the United States, or release of technology or software subject to... source code and object code software subject to the EAR. (2) Export of technology or software. (See...
15 CFR 734.2 - Important EAR terms and principles.
Code of Federal Regulations, 2011 CFR
2011-01-01
... technology and software not subject to the EAR are described in §§ 734.7 through 734.11 and supplement no. 1... of items subject to the EAR out of the United States, or release of technology or software subject to... source code and object code software subject to the EAR. (2) Export of technology or software. (See...
Copyright: Know Your Electronic Rights!
ERIC Educational Resources Information Center
Valauskas, Edward J.
1992-01-01
Defines copyright and examines the interests of computer software publishers. Issues related to the rights of libraries in the circulation of software are discussed, including the fair use principle, software vendors' licensing agreements, and cooperation between libraries and vendors. An inset describes procedures for internal auditing of…
Zhang, Kam Y. J.
2013-01-01
One of the underlying principles in drug discovery is that a biologically active compound is complementary in shape and molecular recognition features to its receptor. This principle implies that molecules binding to the same receptor may share some common features. Here, we have investigated whether electrostatic similarity can be used for the discovery of small molecule protein-protein interaction inhibitors (SMPPIIs). We have developed a method that can be used to evaluate the similarity of electrostatic potentials between small molecules and known protein ligands. This method was implemented in a software tool called EleKit. Analyses of all SMPPII structures available at the time of the research indicate that SMPPIIs bear some similarity of electrostatic potential with the ligand proteins of the same receptor. This is especially true for the more polar SMPPIIs. Retrospective analysis of several successful SMPPIIs has shown the applicability of EleKit in the design of new SMPPIIs. PMID:24130741
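As an illustration of comparing electrostatic potentials sampled on a shared grid, a Hodgkin-style similarity index can be computed as below (a minimal sketch; this is not the actual EleKit algorithm, and the grid values are synthetic):

```python
import numpy as np

def hodgkin_similarity(potential_a, potential_b):
    """Hodgkin-style similarity of two potentials sampled on the same grid points.

    Returns 1.0 for identical fields, 0 for orthogonal fields, and negative
    values when the potentials have opposing signs.
    """
    a = np.ravel(potential_a)
    b = np.ravel(potential_b)
    return 2.0 * np.dot(a, b) / (np.dot(a, a) + np.dot(b, b))

# Toy example: the small molecule's potential roughly tracks the protein
# ligand's potential over a shared grid around the binding interface.
rng = np.random.default_rng(1)
protein_ligand = rng.normal(size=1000)
small_molecule = 0.8 * protein_ligand + 0.3 * rng.normal(size=1000)
print(round(hodgkin_similarity(protein_ligand, small_molecule), 3))
```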
Farinango, Charic D; Benavides, Juan S; Cerón, Jesús D; López, Diego M; Álvarez, Rosa E
2018-01-01
Background Previous studies have demonstrated the effectiveness of information and communication technologies to support healthy lifestyle interventions. In particular, personal health record systems (PHR-Ss) empower self-care, essential to support lifestyle changes. Approaches such as the user-centered design (UCD), which is already a standard within the software industry (ISO 9241-210:2010), provide specifications and guidelines to guarantee user acceptance and quality of eHealth systems. However, no single PHR-S for metabolic syndrome (MS) developed following the recommendations of the ISO 9241-210:2010 specification has been found in the literature. Objective The aim of this study was to describe the development of a PHR-S for the management of MS according to the principles and recommendations of the ISO 9241-210 standard. Methods The proposed PHR-S was developed using a formal software development process which, in addition to the traditional activities of any software process, included the principles and recommendations of the ISO 9241-210 standard. To gather user information, a survey sample of 1,187 individuals, eight interviews, and a focus group with seven people were performed. Throughout five iterations, three prototypes were built. Potential users of each system evaluated each prototype. The quality attributes of efficiency, effectiveness, and user satisfaction were assessed using metrics defined in the ISO/IEC 25022 standard. Results The following results were obtained: 1) a technology profile from 1,187 individuals at risk for MS from the city of Popayan, Colombia, identifying that 75.2% of the people use the Internet and 51% had a smartphone; 2) a PHR-S to manage MS developed (the PHR-S has the following five main functionalities: record the five MS risk factors, share these measures with health care professionals, and three educational modules on nutrition, stress management, and a physical activity); and 3) usability tests on each prototype obtaining the following results: 100% effectiveness, 100% efficiency, and 84.2 points in the system usability scale. Conclusion The software development methodology used was based on the ISO 9241-210 standard, which allowed the development team to maintain a focus on user’s needs and requirements throughout the project, which resulted in an increased satisfaction and acceptance of the system. Additionally, the establishment of a multidisciplinary team allowed the application of considerations not only from the disciplines of software engineering and health sciences but also from other disciplines such as graphical design and media communication. Finally, usability testing allowed the observation of flaws in the designs, which helped to improve the solution. PMID:29386903
Layered approach to workstation design for medical image viewing
NASA Astrophysics Data System (ADS)
Haynor, David R.; Zick, Gregory L.; Heritage, Marcus B.; Kim, Yongmin
1992-07-01
Software engineering principles suggest that complex software systems are best constructed from independent, self-contained modules, thereby maximizing the portability, maintainability, and modifiability of the produced code. This principle is important in the design of medical imaging workstations, where further developments in technology (CPU, memory, interface devices, displays, network connections) are required for clinically acceptable workstations, and it is desirable to provide different hardware platforms with the "same look and feel" for the user. In addition, the set of desired functions is relatively well understood, but the optimal user interface for delivering these functions on a clinically acceptable workstation still differs by department, specialty, and individual preference. At the University of Washington, we are developing a viewing station based on the IBM RISC/6000 computer and on new technologies that are just becoming commercially available. These include advanced voice recognition systems and an ultra-high-speed network. We are developing a set of specifications and a conceptual design for the workstation, and will be producing a prototype. This paper presents our current concepts concerning the architecture and software system design of the future prototype. Our conceptual design specifies requirements for a Database Application Programming Interface (DBAPI) and for a User API (UAPI). The DBAPI consists of a set of subroutine calls that define the admissible transactions between the workstation and an image archive. The UAPI describes the requests a user interface program can make of the workstation. It incorporates basic display and image processing functions, yet is specifically designed to allow extensions to the basic set at the application level. We will discuss the fundamental elements of the two APIs and illustrate their application to workstation design.
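A hedged sketch of what such a DBAPI contract might look like as an abstract interface (the method names, StudyRecord fields, and InMemoryArchive back end are hypothetical, not the paper's actual API):

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class StudyRecord:
    study_id: str
    patient_id: str
    modality: str

class DatabaseAPI(ABC):
    """Hypothetical transactions between the workstation and an image archive."""

    @abstractmethod
    def query_studies(self, patient_id: str) -> list[StudyRecord]: ...

    @abstractmethod
    def fetch_images(self, study_id: str) -> list[bytes]: ...

class InMemoryArchive(DatabaseAPI):
    def __init__(self):
        self._studies = {"p1": [StudyRecord("s1", "p1", "CR")]}
        self._images = {"s1": [b"\x00" * 16]}

    def query_studies(self, patient_id):
        return self._studies.get(patient_id, [])

    def fetch_images(self, study_id):
        return self._images.get(study_id, [])

# A user-interface layer talks only to the DatabaseAPI abstraction, so the same
# UI code can run against any archive back end that implements the contract.
archive: DatabaseAPI = InMemoryArchive()
for study in archive.query_studies("p1"):
    print(study.modality, len(archive.fetch_images(study.study_id)), "image(s)")
```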
Design of decision support interventions for medication prescribing.
Horsky, Jan; Phansalkar, Shobha; Desai, Amrita; Bell, Douglas; Middleton, Blackford
2013-06-01
Describe optimal design attributes of clinical decision support (CDS) interventions for medication prescribing, emphasizing perceptual, cognitive and functional characteristics that improve human-computer interaction (HCI) and patient safety. Findings from published reports on successes, failures and lessons learned during implementation of CDS systems were reviewed and interpreted with regard to HCI and software usability principles. We then formulated design recommendations for CDS alerts that would reduce unnecessary workflow interruptions and allow clinicians to make informed decisions quickly, accurately and without extraneous cognitive and interactive effort. Excessive alerting that tends to distract clinicians rather than provide effective CDS can be reduced by designing only high-severity alerts as interruptive dialog boxes and presenting less severe warnings without an explicit response requirement, by curating system knowledge bases to suppress warnings with low clinical utility and by integrating contextual patient data into the decision logic. Recommended design principles include parsimonious and consistent use of color and language, a minimalist approach to the layout of information and controls, the use of font attributes to convey hierarchy and visual prominence of important data over supporting information, the inclusion of relevant patient data in the context of the alert and allowing clinicians to respond with one or two clicks. Although HCI and usability principles are well established and robust, CDS and EHR system interfaces rarely conform to the best known design conventions and are seldom conceived and designed well enough to be truly versatile and dependable tools. These relatively novel interventions still require careful monitoring, research and analysis of their track records to mature. Clarity and specificity of alert content and optimal perceptual and cognitive attributes, for example, are essential for providing effective decision support to clinicians. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Architected Agile Solutions for Software-Reliant Systems
NASA Astrophysics Data System (ADS)
Boehm, Barry; Lane, Jo Ann; Koolmanojwong, Supannika; Turner, Richard
Systems are becoming increasingly reliant on software due to needs for rapid fielding of "70% capabilities," interoperability, net-centricity, and rapid adaptation to change. The latter need has led to increased interest in agile methods of software development, in which teams rely on shared tacit interpersonal knowledge rather than explicit documented knowledge. However, such systems often need to be scaled up to higher levels of performance and assurance, requiring stronger architectural support. Several organizations have recently transformed themselves by developing successful combinations of agility and architecture that can scale to projects of up to 100 personnel. This chapter identifies a set of key principles for such architected agile solutions for software-reliant systems, provides guidance for how much architecting is enough, and illustrates the key principles with several case studies.
Design and indoor testing of a compact optical concentrator
NASA Astrophysics Data System (ADS)
Zheng, Cheng; Li, Qiyuan; Rosengarten, Gary; Hawkes, Evatt; Taylor, Robert A.
2017-01-01
We propose and analyze designs for stationary and compact optical concentrators. The designs are based on a catadioptric assembly with a linear focus line. They have a focal distance of around 10 to 15 cm with a concentration ratio (4.5 to 5.9 times). The concentrator employs an internal linear-tracking mechanism, making it suitable for rooftop solar applications. The optical performance of the collector has been simulated with ray tracing software (Zemax), and laser-based indoor experiments were carried out to validate this model. The results show that the system is capable of achieving an average optical efficiency of around 66% to 69% during the middle 6 (sunniest) h of the day. The design process and principles described in this work will help enable a new class of rooftop solar thermal concentrators.
SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences
NASA Astrophysics Data System (ADS)
Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.
1994-11-01
A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Development (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that related product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.
SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences
NASA Technical Reports Server (NTRS)
Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.
1994-01-01
A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Development (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that related product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.
Virtual Immunology: Software for Teaching Basic Immunology
ERIC Educational Resources Information Center
Berçot, Filipe Faria; Fidalgo-Neto, Antônio Augusto; Lopes, Renato Matos; Faggioni, Thais; Alves, Luiz Anastácio
2013-01-01
As immunology continues to evolve, many educational methods have found difficulty in conveying the degree of complexity inherent in its basic principles. Today, the teaching-learning process in such areas has been improved with tools such as educational software. This article introduces "Virtual Immunology," a software program available…
Research and emulation of ranging in BPON system
NASA Astrophysics Data System (ADS)
Yang, Guangxiang; Tao, Dexin; He, Yan
2005-12-01
Ranging is one of the key technologies in the ATM-based Broadband Passive Optical Network (BPON) system. It is complex for software designers and difficult to test. In order to simplify the ranging procedure, enhance its efficiency, and find an appropriate method to verify it, a new ranging procedure that fully satisfies the requirements specified in ITU-T G.983.1, together with a verification method, is proposed in this paper. A ranging procedure without the serial number (SN) searching function, called one-by-one ranging, is developed for the case of a cold PON with cold optical network units (ONUs). The ranging procedure includes flow charts for the OLT and the ONU respectively. Using the network emulation software OPNET, the BPON system is modeled and the ranging procedure is simulated. The emulation results show that the presented ranging procedure can effectively eliminate collisions between the burst-mode signals of different ONUs, which are ranged one by one under the control of the OLT, while also enhancing ranging efficiency. As all of the message formats used in this research conform to ITU-T G.983.1, the ranging procedure meets the protocol specifications with good interoperability and is compatible with products of other manufacturers. The present study of ranging procedures provides guidelines and principles and removes some difficulties in the software design.
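The core of one-by-one ranging is that the OLT measures each ONU's round-trip time in isolation and then assigns an equalization delay so that every ONU appears to sit at the same logical distance, which prevents upstream burst collisions. A minimal sketch with illustrative numbers (not the G.983.1 message formats):

```python
# Round-trip times measured by the OLT while ranging each ONU one by one (microseconds).
measured_rtt = {"onu1": 41.2, "onu2": 57.8, "onu3": 100.4}

# Target equalized round-trip delay: at least the largest measured RTT
# (in practice set by the maximum logical reach of the PON, e.g. 20 km).
target_delay = max(measured_rtt.values())

# Each ONU is told to add an equalization delay so that RTT + delay is identical
# for all ONUs, aligning their upstream bursts to the OLT's grant schedule.
equalization_delay = {onu: round(target_delay - rtt, 1)
                      for onu, rtt in measured_rtt.items()}
print(equalization_delay)   # {'onu1': 59.2, 'onu2': 42.6, 'onu3': 0.0}
```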
NASA Astrophysics Data System (ADS)
Yang, Kun; Xu, Quan-li; Peng, Shuang-yun; Cao, Yan-bo
2008-10-01
Based on an analysis of the necessity of GIS applications in earthquake disaster prevention, this paper discusses in depth the spatial integration of urban earthquake disaster loss evaluation models and visualization technologies, using network development methods such as COM/DCOM, ActiveX and ASP, as well as spatial database development methods such as OO4O and ArcSDE based on the ArcGIS software packages. Meanwhile, following software engineering principles, a solution for an urban earthquake emergency response decision support system based on GIS technologies is also proposed, covering the system's logical structure, the technical route, the system realization methods, and the function structure. Finally, the user interfaces of the test system are also presented.
Monitoring system of multiple fire fighting based on computer vision
NASA Astrophysics Data System (ADS)
Li, Jinlong; Wang, Li; Gao, Xiaorong; Wang, Zeyong; Zhao, Quanke
2010-10-01
With the high demand for fire control in spacious buildings, computer vision is playing an increasingly important role. This paper presents a new monitoring system for fighting multiple fires based on computer vision and color detection. The system can locate the fire position and then extinguish the fire by itself. The system structure, working principle, fire localization, hydrant angle adjustment, and system calibration are described in detail, and the design of the relevant hardware and software is introduced. The principle and process of color detection and image processing are given as well. The system ran well in tests, offering high reliability, low cost, and easy node expansion, and has a bright prospect for application and popularization.
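The color-detection step could, for instance, be sketched as HSV thresholding followed by a centroid computation (a minimal illustration with OpenCV and NumPy; the threshold values, minimum pixel count, and the locate_fire helper are assumptions, not the paper's implementation):

```python
import cv2
import numpy as np

# Illustrative HSV band for flame-like colors (bright reds/oranges/yellows);
# a real deployment would calibrate these thresholds to the camera and scene.
LOWER = np.array([0, 120, 180], dtype=np.uint8)
UPPER = np.array([35, 255, 255], dtype=np.uint8)

def locate_fire(frame_bgr, min_pixels=200):
    """Return the (x, y) centroid of flame-colored pixels, or None if too few."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    ys, xs = np.nonzero(mask)
    if xs.size < min_pixels:
        return None
    return float(xs.mean()), float(ys.mean())   # used to aim the hydrant

# Synthetic test frame: a bright orange patch on a dark background.
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[100:140, 200:260] = (0, 128, 255)         # BGR orange
print(locate_fire(frame))
```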
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, Dustin Yewell
Echo™ is a MATLAB-based software package designed for robust and scalable analysis of complex data workflows. An alternative to tedious, error-prone conventional processes, Echo is based on three transformative principles for data analysis: self-describing data, name-based indexing, and dynamic resource allocation. The software takes an object-oriented approach to data analysis, intimately connecting measurement data with associated metadata. Echo operations in an analysis workflow automatically track and merge metadata and computation parameters to provide a complete history of the process used to generate final results, while automated figure and report generation tools eliminate the potential to mislabel those results. History reporting and visualization methods provide straightforward auditability of analysis processes. Furthermore, name-based indexing on metadata greatly improves code readability for analyst collaboration and reduces opportunities for errors to occur. Echo efficiently manages large data sets using a framework that seamlessly allocates resources such that only the necessary computations to produce a given result are executed. Echo provides a versatile and extensible framework, allowing advanced users to add their own tools and data classes tailored to their own specific needs. Applying these transformative principles and powerful features, Echo greatly improves analyst efficiency and quality of results in many application areas.
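A minimal Python sketch of the self-describing, name-indexed data idea (Echo itself is MATLAB-based; the Channel and Dataset classes here are hypothetical and only illustrate keeping data, metadata, and processing history together):

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Channel:
    """A measurement bound to its metadata and processing history (illustrative)."""
    name: str
    data: np.ndarray
    units: str
    history: list = field(default_factory=list)

class Dataset:
    def __init__(self, channels):
        self._by_name = {c.name: c for c in channels}   # name-based indexing

    def __getitem__(self, name):
        return self._by_name[name]

    def apply(self, name, func, note):
        ch = self._by_name[name]
        ch.data = func(ch.data)
        ch.history.append(note)                         # auditable provenance

ds = Dataset([Channel("accel_x", np.array([0.1, 0.4, 0.2]), "g")])
ds.apply("accel_x", lambda x: x - x.mean(), "remove DC offset")
print(ds["accel_x"].units, ds["accel_x"].history)
```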
Expanding the term "Design Space" in high performance liquid chromatography (I).
Monks, K E; Rieger, H-J; Molnár, I
2011-12-15
The current article presents a novel approach to applying Quality by Design (QbD) principles to the development of high pressure reversed phase liquid chromatography (HPLC) methods. Four common critical parameters in HPLC--gradient time, temperature, pH of the aqueous eluent, and stationary phase--are evaluated within the Quality by Design framework by the means of computer modeling software and a column database, to a satisfactory degree. This work proposes the establishment of two mutually complimentary Design Spaces to fully depict a chromatographic method; one Column Design Space (CDS) and one Eluent Design Space (EDS) to describe the influence of the stationary phase and of the mobile phase on the separation selectivity, respectively. The merge of both Design Spaces into one is founded on the continuous nature of the mobile phase influence on retention and the great variety of the stationary phases available. Copyright © 2011 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ying, Jia-ju; Chen, Yu-dan; Liu, Jie; Wu, Dong-sheng; Lu, Jun
2016-10-01
Misalignment of the binocular optical axes of a photoelectric instrument directly degrades observation performance. A binocular optical axis parallelism digital calibration system is designed. On the basis of the calibration principle for the binocular optical axes of photoelectric instruments, the system scheme is designed and the digital calibration system is realized; it includes four modules: a multiband parallel light tube, optical axis translation, an image acquisition system and the software system. According to the different characteristics of the thermal infrared imager and the low-light-level night viewer, different algorithms are used to localize the center of the cross reticle, so that binocular optical axis parallelism calibration is realized for both low-light-level night viewers and thermal infrared imagers.
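One simple option for locating a cross-reticle centre is an intensity-weighted centroid of pixels above a threshold; this is only an illustrative stand-in, since the paper uses different (sensor-specific) algorithms for the thermal imager and the night viewer.

```python
import numpy as np

def reticle_center(image, threshold=None):
    """Estimate the cross-reticle centre as the intensity-weighted centroid
    of pixels above a threshold (illustrative; not the paper's algorithms)."""
    img = np.asarray(image, dtype=float)
    if threshold is None:
        threshold = img.mean() + 2 * img.std()
    mask = img > threshold
    ys, xs = np.nonzero(mask)
    w = img[ys, xs]
    return (xs * w).sum() / w.sum(), (ys * w).sum() / w.sum()

# synthetic cross centred near (x, y) = (64, 40)
img = np.zeros((80, 128))
img[40, :] = 1.0
img[:, 64] = 1.0
print(reticle_center(img))   # approximately (64, 40)
```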
Detection and classification of human body odor using an electronic nose.
Wongchoosuk, Chatchawal; Lutz, Mario; Kerdcharoen, Teerakiat
2009-01-01
An electronic nose (E-nose) has been designed and equipped with software that can detect and classify human armpit body odor. An array of metal oxide sensors was used for detecting volatile organic compounds. The measurement circuit employs a voltage divider resistor to measure the sensitivity of each sensor. This E-nose was controlled by in-house developed software through a portable USB data acquisition card with a principal component analysis (PCA) algorithm implemented for pattern recognition and classification. Because gas sensor sensitivity in the detection of armpit odor samples is affected by humidity, we propose a new method and algorithms combining hardware/software for the correction of the humidity noise. After the humidity correction, the E-nose showed the capability of detecting human body odor and distinguishing the body odors from two persons in a relative manner. The E-nose is still able to recognize people, even after application of deodorant. In conclusion, this is the first report of the application of an E-nose for armpit odor recognition.
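A minimal sketch of the processing chain described above, using scikit-learn: a humidity correction followed by PCA on the sensor-array responses. The humidity correction shown here (regressing out the linear humidity contribution) and all data are illustrative assumptions, not the paper's specific algorithm.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Synthetic stand-in: X holds sensor-array responses (samples x sensors),
# h holds the humidity reading recorded with each sample.
rng = np.random.default_rng(0)
h = rng.uniform(40, 80, size=100)                       # relative humidity, %
odor = rng.normal(size=(100, 1))                        # latent odor signal
X = odor @ rng.normal(size=(1, 8)) + 0.05 * h[:, None] + 0.1 * rng.normal(size=(100, 8))

# One possible humidity correction: remove the part of each sensor response
# that is linearly explained by humidity (illustrative, not the paper's method).
X_corr = X - LinearRegression().fit(h[:, None], X).predict(h[:, None])

# Principal component analysis for pattern recognition on the corrected responses.
scores = PCA(n_components=2).fit_transform(X_corr)
print(scores.shape)   # (100, 2): coordinates used to separate odor patterns
```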
Application of Gaia Analysis Software AGIS to Nano-JASMINE
NASA Astrophysics Data System (ADS)
Yamada, Y.; Lammers, U.; Gouda, N.
2011-07-01
The core data reduction for the Nano-JASMINE mission is planned to be done with Gaia's Astrometric Global Iterative Solution (AGIS). Nano-JASMINE is an ultra-small (35 kg) Japanese satellite for astrometry observations, and Gaia is ESA's large (over 1000 kg) next-generation astrometry mission. The accuracy of Nano-JASMINE is about 3 mas, comparable to that of the Hipparcos mission, Gaia's predecessor some 20 years ago. It is challenging for such a small satellite to perform real scientific observations. The collaboration for sharing software started in 2007. In addition to the similar design and operating principles of the two missions, this is possible thanks to the encapsulation of all Gaia-specific aspects of AGIS in a Parameter Database. Nano-JASMINE will be the test bench for the Gaia AGIS software. We present this idea in detail and the necessary practical steps to make AGIS work with Nano-JASMINE data. We also show the key mission parameters, goals, and status of the data reduction for Nano-JASMINE.
Capturing Requirements for Autonomous Spacecraft with Autonomy Requirements Engineering
NASA Astrophysics Data System (ADS)
Vassev, Emil; Hinchey, Mike
2014-08-01
The Autonomy Requirements Engineering (ARE) approach has been developed by Lero - the Irish Software Engineering Research Center within the mandate of a joint project with ESA, the European Space Agency. The approach is intended to help engineers develop missions for unmanned exploration, often with limited or no human control. Such robotics space missions rely on the most recent advances in automation and robotic technologies where autonomy and autonomic computing principles drive the design and implementation of unmanned spacecraft [1]. To tackle the integration and promotion of autonomy in software-intensive systems, ARE combines generic autonomy requirements (GAR) with goal-oriented requirements engineering (GORE). Using this approach, software engineers can determine what autonomic features to develop for a particular system (e.g., a space mission) as well as what artifacts that process might generate (e.g., goals models, requirements specification, etc.). The inputs required by this approach are the mission goals and the domain-specific GAR reflecting specifics of the mission class (e.g., interplanetary missions).
Yan, Koon-Kiu; Fang, Gang; Bhardwaj, Nitin; Alexander, Roger P.; Gerstein, Mark
2010-01-01
The genome has often been called the operating system (OS) for a living organism. A computer OS is described by a regulatory control network termed the call graph, which is analogous to the transcriptional regulatory network in a cell. To apply our firsthand knowledge of the architecture of software systems to understand cellular design principles, we present a comparison between the transcriptional regulatory network of a well-studied bacterium (Escherichia coli) and the call graph of a canonical OS (Linux) in terms of topology and evolution. We show that both networks have a fundamentally hierarchical layout, but there is a key difference: The transcriptional regulatory network possesses a few global regulators at the top and many targets at the bottom; conversely, the call graph has many regulators controlling a small set of generic functions. This top-heavy organization leads to highly overlapping functional modules in the call graph, in contrast to the relatively independent modules in the regulatory network. We further develop a way to measure evolutionary rates comparably between the two networks and explain this difference in terms of network evolution. The process of biological evolution via random mutation and subsequent selection tightly constrains the evolution of regulatory network hubs. The call graph, however, exhibits rapid evolution of its highly connected generic components, made possible by designers’ continual fine-tuning. These findings stem from the design principles of the two systems: robustness for biological systems and cost effectiveness (reuse) for software systems. PMID:20439753
Usable Interface Design for Everyone
NASA Astrophysics Data System (ADS)
de Castro Lozano, Carlos; Salcines, Enrique García; Sainz de Abajo, Beatriz; Burón Fernández, F. Javier; Ramírez, José Miguel; Recellado, José Gabriel Zato; Montoya, Rafael Sanchez; Bell, John; Marin, Francisco Alcantud
When designing "interfaces for everyone" for interactive systems, it is important to consider factors such as cost, the intended market, the state of the environment, etc. User interfaces are fundamental for the developmental process in any application, and its design must be contemplated from the start. Of the distinct parts of a system (hardware and software), it is the interface that permits the user access to computer resources. The seven principles of "Universal Design" or "Design for Everyone" focus on a universal usable design, but at the same time acknowledge the influences of internal and external factors. Structural changes in social and health services could provide an increase in the well-being of a country's citizens through the use of self-care programming and proactive management/prevention of disease. Automated home platforms can act as an accessibility instrument which permits users to avoid, compensate, mitigate, or neutralize the deficiencies and dependencies caused by living alone.
Building achromatic refractive beam shapers
NASA Astrophysics Data System (ADS)
Laskin, Alexander; Shealy, David
2014-10-01
Achromatic beam shapers provide beam shaping over a certain spectral band and are very important for various laser techniques, such as applications based on ultra-short pulse lasers with pulse widths <100 fs, confocal microscopy, multicolour holography, and life-sciences fluorescence techniques, where several lasers in the 405-650 nm spectral range are used simultaneously. The conditions of energy re-distribution and zero wave aberration are strictly fulfilled in ordinary plano-aspheric lens pair beam shapers for a single wavelength only. Hence, these beam shapers work efficiently over a relatively narrow spectrum of a few nanometres. To provide acceptable beam quality for refractive beam shaping over a wide spectrum, an achromatizing design condition should be added. Consequently, the typical beam shaper design contains more than two lenses; to avoid damage and other undesirable effects, the lenses of the beam shaper should be air-spaced. We suggest a two-step method of designing the beam shaper: 1) achromatize each plano-aspheric lens using a buried achromatizing surface ("chromatic radius"), so that each beam shaper component becomes a cemented doublet lens; 2) "split" the cemented lenses and realize an air-spaced lens design using optical design software. This method allows an achromatic design principle to be used during the first step of the design, after which the design is refined using optimization software. We present examples of this design procedure for an achromatic Keplerian beam shaper and for an achromatic Galilean-type beam shaper. Experimental results of the operation of refractive beam shapers are presented as well.
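The achromatizing step above relies on pairing two glasses of different dispersion in a cemented doublet. A minimal sketch of the classical thin-lens achromat condition (element powers weighted by inverse Abbe numbers cancel) is shown below; the glass values are illustrative and this is not the authors' exact design procedure for the buried "chromatic radius".

```python
def achromat_powers(phi_total, V1, V2):
    """Thin-lens achromatic doublet: split total power phi_total between two
    glasses with Abbe numbers V1, V2 so that phi1/V1 + phi2/V2 = 0."""
    phi1 = phi_total * V1 / (V1 - V2)
    phi2 = -phi_total * V2 / (V1 - V2)
    return phi1, phi2

# Example: 100 mm focal-length doublet from crown-like (V~64) and flint-like (V~36) glasses.
phi1, phi2 = achromat_powers(1 / 100.0, 64.2, 36.4)
print(f"f1 = {1/phi1:.1f} mm (positive crown), f2 = {1/phi2:.1f} mm (negative flint)")
```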
Software Writing Skills for Your Research - Lessons Learned from Workshops in the Geosciences
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin
2016-04-01
Findings presented in scientific papers are based on data and software. Once in a while they come along with data - but rarely with software. However, the software used to obtain the findings plays a crucial role in the scientific work. Nevertheless, software is rarely seen as publishable. Thus researchers may not be able to reproduce the findings without the software, which conflicts with the principle of reproducibility in science. For both the writing of publishable software and the reproducibility issue, the quality of the software is of utmost importance. For many programming scientists the treatment of source code, e.g. code design, version control, documentation, and testing, is associated with additional work that is not covered by the primary research task. This includes the adoption of processes following the software development life cycle. However, the adoption of software engineering rules and best practices has to be recognized and accepted as part of the scientific performance. Most scientists have little incentive to improve code and do not publish code, because software engineering habits are rarely practised by researchers or students. Software engineering skills are not passed on to followers in the way that paper-writing skills are. Thus it is often felt that the software or code produced is not publishable. The quality of software and its source code has a decisive influence on the quality of the research results obtained and on their traceability. So establishing best practices from software engineering to serve scientific needs is crucial for the success of scientific software. Even though scientists use existing software and code, e.g. from open source software repositories, only a few contribute their code back into the repositories. Writing and opening code for Open Science means that subsequent users are able to run the code, e.g. through the provision of sufficient documentation, sample data sets, tests and comments, which in turn can be verified by adequate and qualified reviews. This assumes that scientists learn to write and release code and software as they learn to write and publish papers. With this in mind, software could be valued and assessed as a contribution to science. But this requires the relevant skills, which can then be passed to colleagues and followers. Therefore, the GFZ German Research Centre for Geosciences performed three workshops in 2015 to address the passing of software writing skills to young scientists, the next generation of researchers in the Earth, planetary and space sciences. Experiences in running these workshops and the lessons learned will be summarized in this presentation. The workshops have received support and funding from Software Carpentry, a volunteer organization whose goal is to make scientists more productive, and their work more reliable, by teaching them basic computing skills, and from FOSTER (Facilitate Open Science Training for European Research), a two-year, EU-funded (FP7) project whose goal is to produce a European-wide training programme that will help to incorporate Open Access approaches into existing research methodologies and to integrate Open Science principles and practice into the current research workflow by targeting young researchers and other stakeholders.
Software Requirements Specification for an Ammunition Management System
1986-09-01
This thesis takes the form of a software requirements specification. Such a specification, according to Pressman [Ref. 7], establishes a complete... defined by Pressman, is depicted in Figure 1.1 (Generalized Software Life Cycle). The common thread which binds the various phases together... The application of software engineering principles requires an established methodology. This methodology, according to Pressman [Ref. 8: p. 15], is an...
ERIC Educational Resources Information Center
Interuniversity Communications Council (EDUCOM), Washington, DC.
The purpose of this brochure, which was produced as a service to the academic community, is to provide a brief outline of what can and cannot be done legally with software, and to clarify the implications and restrictions of the U.S. Copyright Law. Relevant facts concerning copying software precede the EDUCOM statement of principle on intellectual…
Reconstruction of Cyber and Physical Software Using Novel Spread Method
NASA Astrophysics Data System (ADS)
Ma, Wubin; Deng, Su; Huang, Hongbin
2018-03-01
Cyber and physical software has received attention for many years, since 2010. Many researchers would disagree with deploying the traditional Spread Method for the reconstruction of cyber and physical software, which embodies the key principles of cyber-physical system reconstruction. NSM (Novel Spread Method), our new methodology for the reconstruction of cyber and physical software, is proposed as a solution to these challenges.
Software Carpentry and the Hydrological Sciences
NASA Astrophysics Data System (ADS)
Ahmadia, A. J.; Kees, C. E.; Farthing, M. W.
2013-12-01
Scientists are spending an increasing amount of time building and using hydrology software. However, most scientists are never taught how to do this efficiently. As a result, many are unaware of tools and practices that would allow them to write more reliable and maintainable code with less effort. As hydrology models increase in capability and enter use by a growing number of scientists and their communities, it is important that scientific software development practices scale up to meet the challenges posed by increasing software complexity, lengthening software lifecycles, a growing number of stakeholders and contributors, and a broadened developer base that extends from application domains to high performance computing centers. Many of these challenges in complexity, lifecycles, and developer base have been successfully met by the open source community, and there are many lessons to be learned from their experiences and practices. Additionally, there is much wisdom to be found in the results of research studies conducted on software engineering itself. Software Carpentry aims to bridge the gap between the current state of software development and these known best practices for scientific software development, with a focus on hands-on exercises and practical advice based on the following principles: 1. Write programs for people, not computers. 2. Automate repetitive tasks. 3. Use the computer to record history. 4. Make incremental changes. 5. Use version control. 6. Don't repeat yourself (or others). 7. Plan for mistakes. 8. Optimize software only after it works. 9. Document design and purpose, not mechanics. 10. Collaborate. We discuss how these best practices, arising from solid foundations in research and experience, have been shown to help improve scientists' productivity and the reliability of their software.
Pathways to lean software development: An analysis of effective methods of change
NASA Astrophysics Data System (ADS)
Hanson, Richard D.
This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Each of the proponents of new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome these barriers. The barriers identified were resistance to change, risk and reward issues, and management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.
NASA Technical Reports Server (NTRS)
Pi, Xiaoqing; Mannucci, Anthony J.; Verkhoglyadova, Olga P.; Stephens, Philip; Wilson, Brian D.; Akopian, Vardan; Komjathy, Attila; Lijima, Byron A.
2013-01-01
ISOGAME is designed and developed to assess quantitatively the impact of new observation systems on the capability of imaging and modeling the ionosphere. With ISOGAME, one can perform observation system simulation experiments (OSSEs). A typical OSSE using ISOGAME would involve: (1) simulating various ionospheric conditions on global scales; (2) simulating ionospheric measurements made from a constellation of low-Earth-orbiters (LEOs), particularly Global Navigation Satellite System (GNSS) radio occultation data, and from ground-based global GNSS networks; (3) conducting ionospheric data assimilation experiments with the Global Assimilative Ionospheric Model (GAIM); and (4) analyzing modeling results with visualization tools. ISOGAME can provide quantitative assessment of the accuracy of assimilative modeling with the observation system of interest. Observation systems other than those based on GNSS can also be analyzed. The system is composed of a suite of software that combines the GAIM, including a 4D first-principles ionospheric model and data assimilation modules, an International Reference Ionosphere (IRI) model that has been developed by international ionospheric research communities, an observation simulator, visualization software, and orbit design, simulation, and optimization software. The core GAIM model used in ISOGAME is based on the GAIM++ code (written in C++) that includes a new high-fidelity geomagnetic field representation (multi-dipole). New visualization tools and analysis algorithms for the OSSEs are now part of ISOGAME.
A Practical Guide To Developing Effective Web-based Learning
Cook, David A; Dupras, Denise M
2004-01-01
OBJECTIVE Online learning has changed medical education, but many “educational” websites do not employ principles of effective learning. This article will assist readers in developing effective educational websites by integrating principles of active learning with the unique features of the Web. DESIGN Narrative review. RESULTS The key steps in developing an effective educational website are: Perform a needs analysis and specify goals and objectives; determine technical resources and needs; evaluate preexisting software and use it if it fully meets your needs; secure commitment from all participants and identify and address potential barriers to implementation; develop content in close coordination with website design (appropriately use multimedia, hyperlinks, and online communication) and follow a timeline; encourage active learning (self-assessment, reflection, self-directed learning, problem-based learning, learner interaction, and feedback); facilitate and plan to encourage use by the learner (make website accessible and user-friendly, provide time for learning, and motivate learners); evaluate learners and course; pilot the website before full implementation; and plan to monitor online communication and maintain the site by resolving technical problems, periodically verifying hyperlinks, and regularly updating content. CONCLUSION Teaching on the Web involves more than putting together a colorful webpage. By consistently employing principles of effective learning, educators will unlock the full potential of Web-based medical education. PMID:15209610
The Flight Telerobotic Servicer (FTS) - A focus for automation and robotics on the Space Station
NASA Technical Reports Server (NTRS)
Hinkal, Sanford W.; Andary, James F.; Watzin, James G.; Provost, David E.
1987-01-01
The concept, fundamental design principles, and capabilities of the FTS, a multipurpose telerobotic system for use on the Space Station and Space Shuttle, are discussed. The FTS is intended to assist the crew in the performance of extravehicular tasks; the telerobot will also be used on the Orbital Maneuvering Vehicle to service free-flyer spacecraft. The FTS will be capable of both teleoperation and autonomous operation; eventually it may also utilize ground control. By careful selection of the functional architecture and a modular approach to the hardware and software design, the FTS can accept developments in artificial intelligence and newer, more advanced sensors, such as machine vision and collision avoidance.
Finding intrinsic rewards by embodied evolution and constrained reinforcement learning.
Uchibe, Eiji; Doya, Kenji
2008-12-01
Understanding the design principle of reward functions is a substantial challenge both in artificial intelligence and neuroscience. Successful acquisition of a task usually requires not only rewards for goals, but also for intermediate states to promote effective exploration. This paper proposes a method for designing 'intrinsic' rewards of autonomous agents by combining constrained policy gradient reinforcement learning and embodied evolution. To validate the method, we use Cyber Rodent robots, in which collision avoidance, recharging from battery packs, and 'mating' by software reproduction are three major 'extrinsic' rewards. We show in hardware experiments that the robots can find appropriate 'intrinsic' rewards for the vision of battery packs and other robots to promote approach behaviors.
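A minimal sketch of the policy-gradient ingredient described above, in Python: a softmax policy trained by REINFORCE on a reward that is the sum of an extrinsic term and a weighted intrinsic term. This is not the Cyber Rodent implementation; the constraint-handling part of the constrained policy gradient is omitted, and the intrinsic-reward weight that the paper tunes by embodied evolution is simply fixed here. All rewards are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n_actions = 3
theta = np.zeros(n_actions)          # softmax policy parameters
w_intrinsic = 0.5                    # intrinsic-reward weight (fixed here; evolved in the paper)
alpha = 0.1                          # learning rate

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def extrinsic_reward(a):             # sparse task reward (hypothetical)
    return 1.0 if a == 2 else 0.0

def intrinsic_reward(a):             # shaping signal, e.g. approaching a battery pack
    return [0.2, 0.5, 0.1][a]

for _ in range(2000):
    p = softmax(theta)
    a = rng.choice(n_actions, p=p)
    r = extrinsic_reward(a) + w_intrinsic * intrinsic_reward(a)
    grad_log = -p
    grad_log[a] += 1.0               # gradient of log pi(a) for a softmax policy
    theta += alpha * r * grad_log    # vanilla policy-gradient (REINFORCE) update

print(softmax(theta))                # probability mass concentrates on the rewarded action
```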
Development and test of a model for designing interactive CD-ROMs for teaching nursing skills.
Jeffries, P R
2000-01-01
The use of interactive multimedia is well documented in the education literature as a medium for learning. Many schools of nursing and healthcare agencies purchase commercially-made CD-ROM products, and, in other cases, educators develop their own. Since nurses are increasingly designing CD-ROMs, they must be aware of the instructional design needed to develop comprehensive and effective CD-ROMs that do not compromise the quality of education. This article describes a process for developing and testing an interactive, multimedia CD-ROM on oral medication administration, using an instructional design model based on Chickering and Gamson's Principles of Good Practices in Education. Results from testing the model are reported. The findings can be used to guide the work of nurse educators who are interested in developing educational software.
The General Mission Analysis Tool (GMAT): Current Features And Adding Custom Functionality
NASA Technical Reports Server (NTRS)
Conway, Darrel J.; Hughes, Steven P.
2010-01-01
The General Mission Analysis Tool (GMAT) is a software system for trajectory optimization, mission analysis, trajectory estimation, and prediction developed by NASA, the Air Force Research Lab, and private industry. GMAT's design and implementation are based on four basic principles: open source visibility for both the source code and design documentation; platform independence; modular design; and user extensibility. The system, released under the NASA Open Source Agreement, runs on Windows, Mac and Linux. User extensions, loaded at run time, have been built for optimization, trajectory visualization, force model extension, and estimation, by parties outside of GMAT's development group. The system has been used to optimize maneuvers for the Lunar Crater Observation and Sensing Satellite (LCROSS) and ARTEMIS missions and is being used for formation design and analysis for the Magnetospheric Multiscale Mission (MMS).
The design of preamplifier and ADC circuit base on weak e-optical signal
NASA Astrophysics Data System (ADS)
Fen, Leng; Ying-ping, Yang; Ya-nan, Yu; Xiao-ying, Xu
2011-02-01
Combined with the requirements for processing weak electro-optical signals in a QPD detection system, this article introduces the circuit principle of designing a preamplifier and ADC circuit with I/V conversion, an instrumentation amplifier, a low-pass filter and 16-bit A/D conversion. The article also discusses the circuit's noise suppression and isolation according to the characteristics of the weak signal, and gives a method of software correction. Finally, the weak signal was tested against a Keithley 2000, and good results were obtained.
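For orientation, the relations behind the stages named above are the transimpedance (I/V) gain, the first-order low-pass cutoff, and the ADC resolution. The component values below are illustrative assumptions, not the paper's design values.

```python
import math

def transimpedance_output(photocurrent_a, r_feedback_ohm):
    """I/V conversion stage: output voltage of an ideal transimpedance amplifier."""
    return photocurrent_a * r_feedback_ohm

def rc_lowpass_cutoff(r_ohm, c_farad):
    """-3 dB cutoff of a first-order RC low-pass filter used to suppress noise."""
    return 1.0 / (2.0 * math.pi * r_ohm * c_farad)

def adc_lsb(v_ref, bits=16):
    """Voltage resolution of the 16-bit A/D conversion stage."""
    return v_ref / (2 ** bits)

print(transimpedance_output(10e-9, 10e6))   # 10 nA photocurrent, 10 MOhm feedback -> 0.1 V
print(rc_lowpass_cutoff(10e3, 100e-9))      # ~159 Hz cutoff
print(adc_lsb(5.0))                         # ~76 uV per least significant bit
```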
Ferreira, Patrícia; Taveira-Gomes, Isabel; Severo, Milton; Ferreira, Maria Amélia
2016-01-01
Background Computer-based learning (CBL) has been widely used in medical education, and reports regarding its usage and effectiveness have ranged broadly. Most work has been done on the effectiveness of CBL approaches versus traditional methods, and little has been done on the comparative effects of CBL versus CBL methodologies. These findings urged other authors to recommend such studies in hopes of improving knowledge about which CBL methods work best in which settings. Objective In this systematic review, we aimed to characterize recent studies of the development of software platforms and interventions in medical education, search for common points among studies, and assess whether recommendations for CBL research are being taken into consideration. Methods We conducted a systematic review of the literature published from 2003 through 2013. We included studies written in English, specifically in medical education, regarding either the development of instructional software or interventions using instructional software, during training or practice, that reported learner attitudes, satisfaction, knowledge, skills, or software usage. We conducted 2 latent class analyses to group articles according to platform features and intervention characteristics. In addition, we analyzed references and citations for abstracted articles. Results We analyzed 251 articles. The number of publications rose over time, and they encompassed most medical disciplines, learning settings, and training levels, totaling 25 different platforms specifically for medical education. We uncovered 4 latent classes for educational software, characteristically making use of multimedia (115/251, 45.8%), text (64/251, 25.5%), Web conferencing (54/251, 21.5%), and instructional design principles (18/251, 7.2%). We found 3 classes for intervention outcomes: knowledge and attitudes (175/212, 82.6%), knowledge, attitudes, and skills (11.8%), and online activity (12/212, 5.7%). About a quarter of the articles (58/227, 25.6%) did not hold references or citations in common with other articles. The number of common references and citations increased in articles reporting instructional design principles (P=.03), articles measuring online activities (P=.01), and articles citing a review by Cook and colleagues on CBL (P=.04). There was an association between number of citations and studies comparing CBL versus CBL, independent of publication date (P=.02). Conclusions Studies in this field vary highly, and a high number of software systems are being developed. It seems that past recommendations regarding CBL interventions are being taken into consideration. A move into a more student-centered model, a focus on implementing reusable software platforms for specific learning contexts, and the analysis of online activity to track and predict outcomes are relevant areas for future research in this field. PMID:27480053
F-Nets and Software Cabling: Deriving a Formal Model and Language for Portable Parallel Programming
NASA Technical Reports Server (NTRS)
DiNucci, David C.; Saini, Subhash (Technical Monitor)
1998-01-01
Parallel programming is still being based upon antiquated sequence-based definitions of the terms "algorithm" and "computation", resulting in programs which are architecture dependent and difficult to design and analyze. By focusing on obstacles inherent in existing practice, a more portable model is derived here, which is then formalized into a model called Soviets which utilizes a combination of imperative and functional styles. This formalization suggests more general notions of algorithm and computation, as well as insights into the meaning of structured programming in a parallel setting. To illustrate how these principles can be applied, a very-high-level graphical architecture-independent parallel language, called Software Cabling, is described, with many of the features normally expected from today's computer languages (e.g. data abstraction, data parallelism, and object-based programming constructs).
C++, object-oriented programming, and astronomical data models
NASA Technical Reports Server (NTRS)
Farris, A.
1992-01-01
Contemporary astronomy is characterized by increasingly complex instruments and observational techniques, higher data collection rates, and large data archives, placing severe stress on software analysis systems. The object-oriented paradigm represents a significant new approach to software design and implementation that holds great promise for dealing with this increased complexity. The basic concepts of this approach will be characterized in contrast to more traditional procedure-oriented approaches. The fundamental features of object-oriented programming will be discussed from a C++ programming language perspective, using examples familiar to astronomers. This discussion will focus on objects, classes and their relevance to the data type system; the principle of information hiding; and the use of inheritance to implement generalization/specialization relationships. Drawing on the object-oriented approach, features of a new database model to support astronomical data analysis will be presented.
Data Management System (DMS) testbed user's manual development, volumes 1 and 2
NASA Technical Reports Server (NTRS)
Mcbride, John G.; Cohen, Norman
1986-01-01
A critical review of the network communication services contained in the Tinman User's Manual for Data Management System Test Bed (Tinman DMS User's Manual) is presented. The review is from the perspective of applying modern software engineering principles and using the Ada language effectively to ensure the test bed network communication services provide a robust capability. Overall the material on network communication services reflects a reasonably good grasp of the Ada language. Language features are appropriately used for most services. Design alternatives are offered to provide improved system performance and a basis for better application software development. Section two contains a review and suggests clarifications of the Statement of Policies and Services contained in Appendix B of the Tinman DMS User's Manual. Section three contains a review of the Network Communication Services and section four contains concluding comments.
UAF: a generic OPC unified architecture framework
NASA Astrophysics Data System (ADS)
Pessemier, Wim; Deconinck, Geert; Raskin, Gert; Saey, Philippe; Van Winckel, Hans
2012-09-01
As an emerging Service Oriented Architecture (SOA) specifically designed for industrial automation and process control, the OPC Unified Architecture specification should be regarded as an attractive candidate for controlling scientific instrumentation. Even though an industry-backed standard such as OPC UA can offer substantial added value to these projects, its inherent complexity poses an important obstacle for adopting the technology. Building OPC UA applications requires considerable effort, even when taking advantage of a COTS Software Development Kit (SDK). The OPC Unified Architecture Framework (UAF) attempts to reduce this burden by introducing an abstraction layer between the SDK and the application code in order to achieve a better separation of the technical and the functional concerns. True to its industrial origin, the primary requirement of the framework is to maintain interoperability by staying close to the standard specifications, and by expecting the minimum compliance from other OPC UA servers and clients. UAF can therefore be regarded as a software framework to quickly and comfortably develop and deploy OPC UA-based applications, while remaining compatible with third party OPC UA-compliant toolkits, servers (such as PLCs) and clients (such as SCADA software). In the first phase, as covered by this paper, only the client-side of UAF has been tackled in order to transparently handle discovery, session management, subscriptions, monitored items etc. We describe the design principles and internal architecture of our open-source software project, the first results of the framework running at the Mercator Telescope, and we give a preview of the planned server-side implementation.
Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems
NASA Technical Reports Server (NTRS)
Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)
2003-01-01
Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.
Software-aided automatic laser optoporation and transfection of cells
Georg Breunig, Hans; Uchugonova, Aisada; Batista, Ana; König, Karsten
2015-01-01
Optoporation, the permeabilization of a cell membrane by laser pulses, has emerged as a powerful non-invasive and highly efficient technique to induce transfection of cells. However, the usual tedious manual targeting of individual cells significantly limits the addressable cell number. To overcome this limitation, we present an experimental setup with custom-made software control for computer-automated cell optoporation. The software evaluates the image contrast of cell contours, automatically designates cell locations for laser illumination, centres those locations in the laser focus, and executes the illumination. By software-controlled meandering of the sample stage, in principle all cells in a typical cell culture dish can be targeted without further user interaction. The automation allows for a significant increase in the number of treatable cells compared to a manual approach. For a laser illumination duration of 100 ms, 7-8 positions on different cells can be targeted every second inside the area of the microscope field of view. The experimental capabilities of the setup are illustrated in experiments with Chinese hamster ovary cells. Furthermore, the influence of laser power is discussed, with attention to post-treatment cell survival and optoporation efficiency rates. PMID:26053047
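A sketch of how contrast-based target designation could look, using OpenCV (version 4 API): segment cell contours and take each contour centroid as an illumination point. This is a simplified stand-in for the paper's custom software; the stage and laser calls in the commented loop are placeholder names for a hardware interface that is not described here.

```python
import cv2
import numpy as np

def find_cell_targets(gray, min_area=200):
    """Designate one illumination point per detected cell contour
    (contrast-based segmentation; illustrative, not the paper's software)."""
    blur = cv2.GaussianBlur(gray, (5, 5), 0)
    _, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    targets = []
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        m = cv2.moments(c)
        targets.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))  # contour centroid
    return targets

# Synthetic test image: one bright "cell" centred near (120, 80).
img = np.zeros((200, 200), np.uint8)
cv2.circle(img, (120, 80), 30, 255, -1)
print(find_cell_targets(img))   # approximately [(120.0, 80.0)]

# Hypothetical automation loop (placeholders, not a real API):
# for x, y in find_cell_targets(camera_frame):
#     move_stage_to(x, y)          # centre the target in the laser focus
#     fire_laser(duration_ms=100)  # execute the illumination
```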
A Method for Assessing the Accuracy of a Photogrammetry System for Precision Deployable Structures
NASA Technical Reports Server (NTRS)
Moore, Ashley
2005-01-01
The measurement techniques used to validate analytical models of large deployable structures are an integral part of the technology development process and must be precise and accurate. Photogrammetry and videogrammetry are viable, accurate, and unobtrusive methods for measuring such large structures. Photogrammetry uses software to determine the three-dimensional position of a target from camera images. Videogrammetry is based on the same principle, except that a series of timed images is analyzed. This work addresses the accuracy of a digital photogrammetry system used for measurement of large, deployable space structures at JPL. First, photogrammetry tests are performed on a precision space truss test article, and the images are processed using Photomodeler software. The accuracy of the Photomodeler results is determined through comparison with measurements of the test article taken by an external testing group using the VSTARS photogrammetry system. These two measurements are then compared with the Australis photogrammetry software, which simulates a measurement test to predict its accuracy. The software is then used to study how particular factors, such as camera resolution and placement, affect system accuracy, to help design the setup for the videogrammetry system that will offer the highest level of accuracy for measurement of deploying structures.
ERIC Educational Resources Information Center
Science Teacher, 1988
1988-01-01
Reviews two software programs for Apple series computers. Includes "Orbital Mech," a basic planetary orbital simulation for the Macintosh, and "START: Stimulus and Response Tools for Experiments in Memory, Learning, Cognition, and Perception," a program that demonstrates basic psychological principles and experiments. (CW)
NASA Astrophysics Data System (ADS)
Bai, Yang; Chen, Shufen; Fu, Li; Fang, Wei; Lu, Junjun
2005-01-01
An optical pulse generation device with a bit rate above 10 Gbit/s is key to achieving high-speed, broadband optical fiber communication network systems. We propose a novel high-speed optical transmission module (TM) consisting of a Ti:Er:LiNbO3 waveguide laser and a Mach-Zehnder-type encoding modulator on the same Er-doped substrate. Following the ITU-T standard, we design a 10 Gbit/s transmission module at 1.53 μm on a Z-cut, Y-propagation LiNbO3 substrate. A dynamic model and the corresponding numerical code are used to analyze the waveguide laser, while the electro-optic effect is used to design the modulator. The working principle, key technologies, and typical characteristic parameters of the module are given. The transmission module has a high extinction ratio and a low driving voltage, which supplies an efficient, miniaturized light source for wavelength division multiplexing (WDM) systems. In addition, the relation of the laser gain to the cavity parameters, as well as the relation of the electro-optic modulator bandwidth to some key factors, is discussed. The designed module structure is simulated with BPM software and HFSS software.
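The extinction ratio and driving voltage quoted above are linked through the standard Mach-Zehnder intensity-modulator transfer function. The sketch below uses that textbook relation with an illustrative half-wave voltage; it is not based on the module's measured parameters.

```python
import numpy as np

def mz_transmission(v_drive, v_pi, v_bias=0.0, extinction_ratio_db=30.0):
    """Intensity transfer of a Mach-Zehnder modulator:
    T = 0.5 * (1 + m * cos(pi * (V + Vb) / Vpi)), with the modulation depth m
    set by the finite extinction ratio."""
    er = 10 ** (extinction_ratio_db / 10.0)
    m = (er - 1.0) / (er + 1.0)
    return 0.5 * (1.0 + m * np.cos(np.pi * (v_drive + v_bias) / v_pi))

v_pi = 4.0   # half-wave voltage in volts (illustrative assumption)
print(mz_transmission(0.0, v_pi))      # "on" state, transmission near 1
print(mz_transmission(v_pi, v_pi))     # "off" state, limited by the extinction ratio
```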
High-Performance Java Codes for Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Riley, Christopher; Chatterjee, Siddhartha; Biswas, Rupak; Biegel, Bryan (Technical Monitor)
2001-01-01
The computational science community is reluctant to write large-scale, computationally intensive applications in Java due to concerns over Java's poor performance, despite the claimed software engineering advantages of its object-oriented features. Naive Java implementations of numerical algorithms can perform poorly compared to corresponding Fortran or C implementations. To achieve high performance, Java applications must be designed with good performance as a primary goal. This paper presents the object-oriented design and implementation of two real-world applications from the field of Computational Fluid Dynamics (CFD): a finite-volume fluid flow solver (LAURA, from NASA Langley Research Center), and an unstructured mesh adaptation algorithm (2D_TAG, from NASA Ames Research Center). This work builds on our previous experience with the design of high-performance numerical libraries in Java. We examine the performance of the applications using the currently available Java infrastructure and show that the Java version of the flow solver LAURA performs almost within a factor of 2 of the original procedural version. Our Java version of the mesh adaptation algorithm 2D_TAG performs within a factor of 1.5 of its original procedural version on certain platforms. Our results demonstrate that object-oriented software design principles are not necessarily inimical to high performance.
A Guideline of Using Case Method in Software Engineering Courses
ERIC Educational Resources Information Center
Zainal, Dzulaiha Aryanee Putri; Razali, Rozilawati; Shukur, Zarina
2014-01-01
Software Engineering (SE) education has been reported to fall short in producing high quality software engineers. In seeking alternative solutions, Case Method (CM) is regarded as having potential to solve the issue. CM is a teaching and learning (T&L) method that has been found to be effective in Social Science education. In principle,…
Success Factors for Using Case Method in Teaching and Learning Software Engineering
ERIC Educational Resources Information Center
Razali, Rozilawati; Zainal, Dzulaiha Aryanee Putri
2013-01-01
The Case Method (CM) has long been used effectively in Social Science education. Its potential use in Applied Science such as Software Engineering (SE) however has yet to be further explored. SE is an engineering discipline that concerns the principles, methods and tools used throughout the software development lifecycle. In CM, subjects are…
Shachak, Aviv; Dow, Rustam; Barnsley, Jan; Tu, Karen; Domb, Sharon; Jadad, Alejandro R; Lemieux-Charles, Louise
2013-06-04
Tutorials and user manuals are important forms of impersonal support for using software applications including electronic medical records (EMRs). Differences between user- and vendor documentation may indicate support needs, which are not sufficiently addressed by the official documentation, and reveal new elements that may inform the design of tutorials and user manuals. What are the differences between user-generated tutorials and manuals for an EMR and the official user manual from the software vendor? Effective design of tutorials and user manuals requires careful packaging of information, balance between declarative and procedural texts, an action and task-oriented approach, support for error recognition and recovery, and effective use of visual elements. No previous research compared these elements between formal and informal documents. We conducted a mixed methods study. Seven tutorials and two manuals for an EMR were collected from three family health teams and compared with the official user manual from the software vendor. Documents were qualitatively analyzed using a framework analysis approach in relation to the principles of technical documentation described above. Subsets of the data were quantitatively analyzed using cross-tabulation to compare the types of error information and visual cues in screen captures between user- and vendor-generated manuals. The user-developed tutorials and manuals differed from the vendor-developed manual in that they contained mostly procedural and not declarative information; were customized to the specific workflow, user roles, and patient characteristics; contained more error information related to work processes than to software usage; and used explicit visual cues on screen captures to help users identify window elements. These findings imply that to support EMR implementation, tutorials and manuals need to be customized and adapted to specific organizational contexts and workflows. The main limitation of the study is its generalizability. Future research should address this limitation and may explore alternative approaches to software documentation, such as modular manuals or participatory design.
Evolution of the ATLAS Software Framework towards Concurrency
NASA Astrophysics Data System (ADS)
Jones, R. W. L.; Stewart, G. A.; Leggett, C.; Wynne, B. M.
2015-05-01
The ATLAS experiment has successfully used its Gaudi/Athena software framework for data taking and analysis during the first LHC run, with billions of events successfully processed. However, the design of Gaudi/Athena dates from early 2000 and the software and the physics code has been written using a single threaded, serial design. This programming model has increasing difficulty in exploiting the potential of current CPUs, which offer their best performance only through taking full advantage of multiple cores and wide vector registers. Future CPU evolution will intensify this trend, with core counts increasing and memory per core falling. Maximising performance per watt will be a key metric, so all of these cores must be used as efficiently as possible. In order to address the deficiencies of the current framework, ATLAS has embarked upon two projects: first, a practical demonstration of the use of multi-threading in our reconstruction software, using the GaudiHive framework; second, an exercise to gather requirements for an updated framework, going back to the first principles of how event processing occurs. In this paper we report on both these aspects of our work. For the hive based demonstrators, we discuss what changes were necessary in order to allow the serially designed ATLAS code to run, both to the framework and to the tools and algorithms used. We report on what general lessons were learned about the code patterns that had been employed in the software and which patterns were identified as particularly problematic for multi-threading. These lessons were fed into our considerations of a new framework and we present preliminary conclusions on this work. In particular we identify areas where the framework can be simplified in order to aid the implementation of a concurrent event processing scheme. Finally, we discuss the practical difficulties involved in migrating a large established code base to a multi-threaded framework and how this can be achieved for LHC Run 3.
SYRMEP Tomo Project: a graphical user interface for customizing CT reconstruction workflows.
Brun, Francesco; Massimi, Lorenzo; Fratini, Michela; Dreossi, Diego; Billé, Fulvio; Accardo, Agostino; Pugliese, Roberto; Cedola, Alessia
2017-01-01
When considering the acquisition of experimental synchrotron radiation (SR) X-ray CT data, the reconstruction workflow cannot be limited to the essential computational steps of flat fielding and filtered back projection (FBP). More refined image processing is often required, usually to compensate for artifacts and enhance the quality of the reconstructed images. In principle, it would be desirable to optimize the reconstruction workflow at the facility during the experiment (beamtime). However, several practical factors affect the image reconstruction part of the experiment, and users are likely to conclude the beamtime with sub-optimal reconstructed images. Through an application example, this article presents SYRMEP Tomo Project (STP), an open-source software tool conceived to let users design custom CT reconstruction workflows. STP has been designed for post-beamtime (off-line) use and for re-reconstruction of past archived data at the user's home institution, where simple computing resources are available. Releases of the software can be downloaded at the Elettra Scientific Computing group GitHub repository https://github.com/ElettraSciComp/STP-Gui.
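For readers unfamiliar with the two "essential computational steps" mentioned above, the sketch below implements flat-field correction followed by FBP with scikit-image. It is a minimal illustration with synthetic data and hypothetical array shapes, not STP's pipeline, which adds the artifact-compensation steps the abstract refers to.

```python
import numpy as np
from skimage.transform import iradon

def flat_field(projections, flat, dark, eps=1e-6):
    """Flat-field correction followed by the negative log (Beer-Lambert) transform."""
    norm = (projections - dark) / np.clip(flat - dark, eps, None)
    return -np.log(np.clip(norm, eps, None))

def fbp_slice(sinogram, angles_deg):
    """Filtered back projection of one sinogram (detector pixels x angles)."""
    return iradon(sinogram, theta=angles_deg, circle=True)

# Hypothetical shapes: projections (n_angles, n_det), flat/dark fields (n_det,)
n_angles, n_det = 180, 128
angles = np.linspace(0.0, 180.0, n_angles, endpoint=False)
proj = np.full((n_angles, n_det), 0.8)      # synthetic, uniform absorber
flat = np.ones(n_det)
dark = np.zeros(n_det)
sino = flat_field(proj, flat, dark).T       # iradon expects (n_det, n_angles)
print(fbp_slice(sino, angles).shape)        # reconstructed slice, roughly n_det x n_det
```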
Composing, Analyzing and Validating Software Models
NASA Astrophysics Data System (ADS)
Sheldon, Frederick T.
1998-10-01
This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principle work this summer has been to review and refine the agenda that were carried forward from last summer. Formal specifications provide good support for designing a functionally correct system, however they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability for a system, but they may be too abstract and cumbersome from the stand point of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.
Design and setup of intermittent-flow respirometry system for aquatic organisms.
Svendsen, M B S; Bushnell, P G; Steffensen, J F
2016-01-01
Intermittent-flow respirometry is an experimental protocol for measuring oxygen consumption in aquatic organisms that utilizes the best features of closed (stop-flow) and flow-through respirometry while eliminating (or at least reducing) some of their inherent problems. By interspersing short periods of closed-chamber oxygen consumption measurements with regular flush periods, accurate oxygen uptake rate measurements can be made without the accumulation of waste products, particularly carbon dioxide, which may confound results. Automating the procedure with easily available hardware and software further reduces error by allowing many measurements to be made over long periods thereby minimizing animal stress due to acclimation issues. This paper describes some of the fundamental principles that need to be considered when designing and carrying out automated intermittent-flow respirometry (e.g. chamber size, flush rate, flush time, chamber mixing, measurement periods and temperature control). Finally, recent advances in oxygen probe technology and open source automation software will be discussed in the context of assembling relatively low cost and reliable measurement systems. © 2015 The Fisheries Society of the British Isles.
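The core calculation behind each closed measurement phase is the oxygen uptake rate obtained from the slope of oxygen concentration over time, scaled by the effective respirometer volume and the animal's mass. A minimal sketch of that standard calculation follows; the chamber, fish, and oxygen values are synthetic illustrations, not data from the paper.

```python
import numpy as np

def mo2(time_s, o2_mg_per_l, chamber_volume_l, animal_volume_l, animal_mass_kg):
    """Oxygen uptake rate (mg O2 / kg / h) from one closed measurement phase:
    slope of [O2] vs time, times the effective water volume, per unit body mass."""
    slope_per_s, _ = np.polyfit(time_s, o2_mg_per_l, 1)     # mg O2 / L / s
    effective_volume = chamber_volume_l - animal_volume_l   # water actually respired from
    return -slope_per_s * effective_volume / animal_mass_kg * 3600.0

# 10-minute closed phase in a 5 L chamber with a 0.2 kg fish (~0.2 L displacement).
t = np.arange(0, 600, 30)                          # s
o2 = 8.0 - 2.0e-4 * t                              # mg/L, linear decline (synthetic)
print(round(mo2(t, o2, 5.0, 0.2, 0.2), 1))         # ~17.3 mg O2 / kg / h
```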
An ORCID based synchronization framework for a national CRIS ecosystem.
Mendes Moreira, João; Cunha, Alcino; Macedo, Nuno
2015-01-01
PTCRIS (Portuguese Current Research Information System) is a program aiming at the creation and sustained development of a national integrated information ecosystem, to support research management according to the best international standards and practices. This paper reports on the experience of designing and prototyping a synchronization framework for PTCRIS based on ORCID (Open Researcher and Contributor ID). This framework embraces the "input once, re-use often" principle, and will enable a substantial reduction of the research output management burden by allowing automatic information exchange between the various national systems. The design of the framework followed best practices in rigorous software engineering, namely well-established principles in the research field of consistency management, and relied on formal analysis techniques and tools for its validation and verification. The notion of consistency between the services was formally specified and discussed with the stakeholders before the technical aspects on how to preserve said consistency were explored. Formal specification languages and automated verification tools were used to analyze the specifications and generate usage scenarios, useful for validation with the stakeholder and essential to certificate compliant services.
Computation of Sensitivity Derivatives of Navier-Stokes Equations using Complex Variables
NASA Technical Reports Server (NTRS)
Vatsa, Veer N.
2004-01-01
Accurate computation of sensitivity derivatives is becoming an important item in Computational Fluid Dynamics (CFD) because of recent emphasis on using nonlinear CFD methods in aerodynamic design, optimization, stability and control related problems. Several techniques are available to compute gradients or sensitivity derivatives of desired flow quantities or cost functions with respect to selected independent (design) variables. Perhaps the most common and oldest method is to use straightforward finite-differences for the evaluation of sensitivity derivatives. Although very simple, this method is prone to errors associated with choice of step sizes and can be cumbersome for geometric variables. The cost per design variable for computing sensitivity derivatives with central differencing is at least equal to the cost of three full analyses, but is usually much larger in practice due to difficulty in choosing step sizes. Another approach gaining popularity is the use of Automatic Differentiation software (such as ADIFOR) to process the source code, which in turn can be used to evaluate the sensitivity derivatives of preselected functions with respect to chosen design variables. In principle, this approach is also very straightforward and quite promising. The main drawback is the large memory requirement because memory use increases linearly with the number of design variables. ADIFOR software can also be cumbersome for large CFD codes and has not yet reached a full maturity level for production codes, especially in parallel computing environments.
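The complex-variable approach of the title evaluates the function once at x + ih and takes the imaginary part divided by h; because no subtraction is involved, the step size can be made arbitrarily small without the cancellation error that limits finite differences. A minimal sketch on a scalar test function (not a CFD code) follows.

```python
import numpy as np

def complex_step(f, x, h=1e-30):
    """Sensitivity derivative df/dx via the complex-step method: f is evaluated
    once at x + i*h; no subtraction, so h can be tiny without round-off error."""
    return np.imag(f(x + 1j * h)) / h

def central_diff(f, x, h=1e-6):
    """Conventional central finite difference, for comparison."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

f = lambda x: np.exp(x) * np.sin(x)
fprime = lambda x: np.exp(x) * (np.sin(x) + np.cos(x))   # analytic reference

x0 = 1.5
print(abs(complex_step(f, x0) - fprime(x0)))   # near machine precision
print(abs(central_diff(f, x0) - fprime(x0)))   # several orders of magnitude larger,
                                               # limited by the step-size trade-off
```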
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise
2006-09-01
The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies the responsibilities of ASC management and software project teams in implementing the software quality practices and in assessing progress towards achieving their software quality goals.
Rice, Ian; Gagnon, Dany; Gallagher, Jere; Boninger, Michael
2010-01-01
As considerable progress has been made in laboratory-based assessment of manual wheelchair propulsion biomechanics, the necessity to translate this knowledge into new clinical tools and treatment programs becomes imperative. The objective of this study was to describe the development of a manual wheelchair propulsion training program aimed to promote the development of an efficient propulsion technique among long-term manual wheelchair users. Motor learning theory principles were applied to the design of biomechanical feedback-based learning software, which allows for random discontinuous real-time visual presentation of key spatiotemporal and kinetic parameters. This software was used to train a long-term wheelchair user on a dynamometer during 3 low-intensity wheelchair propulsion training sessions over a 3-week period. Biomechanical measures were recorded with a SmartWheel during over ground propulsion on a 50-m level tile surface at baseline and 3 months after baseline. Training software was refined and administered to a participant who was able to improve his propulsion technique by increasing contact angle while simultaneously reducing stroke cadence, mean resultant force, peak and mean moment out of plane, and peak rate of rise of force applied to the pushrim after training. The proposed propulsion training protocol may lead to favorable changes in manual wheelchair propulsion technique. These changes could limit or prevent upper limb injuries among manual wheelchair users. In addition, many of the motor learning theory-based techniques examined in this study could be applied to training individuals in various stages of rehabilitation to optimize propulsion early on.
Rice, Ian; Gagnon, Dany; Gallagher, Jere; Boninger, Michael
2010-01-01
Background/Objective: As considerable progress has been made in laboratory-based assessment of manual wheelchair propulsion biomechanics, the necessity to translate this knowledge into new clinical tools and treatment programs becomes imperative. The objective of this study was to describe the development of a manual wheelchair propulsion training program aimed to promote the development of an efficient propulsion technique among long-term manual wheelchair users. Methods: Motor learning theory principles were applied to the design of biomechanical feedback-based learning software, which allows for random discontinuous real-time visual presentation of key spatio-temporal and kinetic parameters. This software was used to train a long-term wheelchair user on a dynamometer during 3 low-intensity wheelchair propulsion training sessions over a 3-week period. Biomechanical measures were recorded with a SmartWheel during over ground propulsion on a 50-m level tile surface at baseline and 3 months after baseline. Results: Training software was refined and administered to a participant who was able to improve his propulsion technique by increasing contact angle while simultaneously reducing stroke cadence, mean resultant force, peak and mean moment out of plane, and peak rate of rise of force applied to the pushrim after training. Conclusions: The proposed propulsion training protocol may lead to favorable changes in manual wheelchair propulsion technique. These changes could limit or prevent upper limb injuries among manual wheelchair users. In addition, many of the motor learning theory–based techniques examined in this study could be applied to training individuals in various stages of rehabilitation to optimize propulsion early on. PMID:20397442
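For context, here is a minimal sketch (not the study's training software) of how two of the feedback variables above, stroke cadence and contact angle, might be computed from SmartWheel-style signals; the contact threshold and the synthetic data are assumptions, and the pairing of push starts and ends assumes the record begins and ends off the pushrim.

```python
import numpy as np

def push_metrics(time_s, wheel_angle_deg, mz_nm, contact_threshold_nm=1.0):
    """Return (cadence in pushes/s, mean contact angle in degrees)."""
    in_contact = mz_nm > contact_threshold_nm
    starts = np.where(np.diff(in_contact.astype(int)) == 1)[0] + 1
    ends = np.where(np.diff(in_contact.astype(int)) == -1)[0] + 1
    n = min(len(starts), len(ends))
    contact_angles = wheel_angle_deg[ends[:n]] - wheel_angle_deg[starts[:n]]
    return n / (time_s[-1] - time_s[0]), contact_angles.mean()

t = np.linspace(0, 10, 2000)                                # 10 s of data at 200 Hz
angle = 90.0 * t                                            # wheel angle while rolling, deg
mz = 8.0 * np.clip(np.sin(2 * np.pi * t), 0, None)          # roughly one push per second
print(push_metrics(t, angle, mz))
```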
Waggle: A Framework for Intelligent Attentive Sensing and Actuation
NASA Astrophysics Data System (ADS)
Sankaran, R.; Jacob, R. L.; Beckman, P. H.; Catlett, C. E.; Keahey, K.
2014-12-01
Advances in sensor-driven computation and computationally steered sensing will greatly enable future research in fields including environmental and atmospheric sciences. We will present "Waggle," an open-source hardware and software infrastructure developed with two goals: (1) reducing the separation and latency between sensing and computing and (2) improving the reliability and longevity of sensing-actuation platforms in challenging and costly deployments. Inspired by "deep-space probe" systems, the Waggle platform design includes features that can support longitudinal studies, deployments with varying communication links, and remote management capabilities. Waggle lowers the barrier for scientists to incorporate real-time data from their sensors into their computations and to manipulate the sensors or provide feedback through actuators. A standardized software and hardware design allows quick addition of new sensors/actuators and associated software in the nodes and enables them to be coupled with computational codes both in situ and on external compute infrastructure. The Waggle framework currently drives the deployment of two observational systems - a portable and self-sufficient weather platform for study of small-scale effects in Chicago's urban core and an open-ended distributed instrument in Chicago that aims to support several research pursuits across a broad range of disciplines including urban planning, microbiology and computer science. Built around open-source software, hardware, and Linux OS, the Waggle system comprises two components - the Waggle field-node and the Waggle cloud-computing infrastructure. The Waggle field-node affords a modular, scalable, fault-tolerant, secure, and extensible platform for hosting sensors and actuators in the field. It supports in situ computation and data storage, and integration with cloud-computing infrastructure. The Waggle cloud infrastructure is designed with the goal of scaling to several hundreds of thousands of Waggle nodes. It supports aggregating data from sensors hosted by the nodes, staging computation, relaying feedback to the nodes and serving data to end-users. We will discuss the Waggle design principles and their applicability to various observational research pursuits, and demonstrate its capabilities.
User interface design principles for the SSM/PMAD automated power system
NASA Technical Reports Server (NTRS)
Jakstas, Laura M.; Myers, Chris J.
1991-01-01
Martin Marietta has developed a user interface for the space station module power management and distribution (SSM/PMAD) automated power system testbed which provides human access to the functionality of the power system, as well as exemplifying current techniques in user interface design. The testbed user interface was designed to enable an engineer to operate the system easily without having significant knowledge of computer systems, as well as provide an environment in which the engineer can monitor and interact with the SSM/PMAD system hardware. The design of the interface supports a global view of the most important data from the various hardware and software components, as well as enabling the user to obtain additional or more detailed data when needed. The components and representations of the SSM/PMAD testbed user interface are examined. An engineer's interactions with the system are also described.
NASA Astrophysics Data System (ADS)
Dyer, Mark; Grey, Thomas; Kinnane, Oliver
2017-11-01
It has become increasingly common for tasks traditionally carried out by engineers to be undertaken by technicians and technologists with access to sophisticated computers and software that can often perform complex calculations that were previously the responsibility of engineers. Not surprisingly, this development raises serious questions about the future role of engineers and the education needed to address these changes in technology as well as emerging priorities ranging from societal to environmental challenges. In response to these challenges, a new design module was created for undergraduate engineering students to design and build temporary shelters for a wide variety of end users, from refugees to the homeless and children. Even though the module provided guidance on principles of design thinking and methods for observing users' needs through field studies, the students found it difficult to respond to the needs of specific end users and instead focused more on purely technical issues.
New Digisonde for research and monitoring applications
NASA Astrophysics Data System (ADS)
Reinisch, B. W.; Galkin, I. A.; Khmyrov, G. M.; Kozlov, A. V.; Bibl, K.; Lisysyan, I. A.; Cheney, G. P.; Huang, X.; Kitrosser, D. F.; Paznukhov, V. V.; Luo, Y.; Jones, W.; Stelmash, S.; Hamel, R.; Grochmal, J.
2009-02-01
The new Digisonde-4D, while preserving the basic principles of the Digisonde family, introduces important hardware and software changes that implement the latest capabilities of new digital radio frequency (RF) circuitry and embedded computers. The "D" refers to digital transmitters and receivers in which no analog circuitry is used for conversion between the baseband and the RF. In conjunction with the new hardware design, new software solutions offer significantly enhanced measurement flexibility, enhanced signal selectivity, and new types of data, e.g., the complete set of time domain samples of all four antenna signals suitable for independent scientific analysis. With the new method of mitigating in-band RF interference, the ionogram running time can be made as short as a couple of seconds. The h'(f) precision ranging technique with an accuracy of better than 1 km can be used on a routine basis. The 4D model runs the new ARTIST-5 ionogram autoscaling software which reports in real time the required data for assimilation in ionospheric models. The paper highlights technical advances of the new Digisonde for research and monitoring applications.
Software engineering principles applied to large healthcare information systems--a case report.
Nardon, Fabiane Bizinella; de A Moura, Lincoln
2007-01-01
São Paulo is the largest city in Brazil and one of the largest cities in the world. In 2004, São Paulo City Department of Health decided to implement a Healthcare Information System to support managing healthcare services and provide an ambulatory health record. The resulting information system is one of the largest public healthcare information systems ever built, with more than 2 million lines of code. Although statistics show that most software projects fail, and the risks for the São Paulo initiative were enormous, the information system was completed on-time and on-budget. In this paper, we discuss the software engineering principles adopted that made it possible to accomplish the project's goals, hoping that sharing the experience of this project will help other healthcare information systems initiatives to succeed.
fMRI paradigm designing and post-processing tools
James, Jija S; Rajesh, PG; Chandran, Anuvitha VS; Kesavadas, Chandrasekharan
2014-01-01
In this article, we first review some aspects of functional magnetic resonance imaging (fMRI) paradigm designing for major cognitive functions by using stimulus delivery systems like Cogent, E-Prime, Presentation, etc., along with their technical aspects. We also review the stimulus presentation possibilities (block, event-related) for visual or auditory paradigms and their advantage in both clinical and research settings. The second part mainly focuses on various fMRI data post-processing tools such as Statistical Parametric Mapping (SPM) and Brain Voyager, and discusses the particulars of the various preprocessing steps involved (realignment, co-registration, normalization, smoothing) in these software packages and also the statistical analysis principles of General Linear Modeling for final interpretation of a functional activation result. PMID:24851001
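As a hedged illustration of the General Linear Modeling step mentioned above (independent of SPM or Brain Voyager), the sketch below builds a block-design regressor, convolves it with a simple haemodynamic response function and estimates beta weights by ordinary least squares; the HRF shape, timings and simulated voxel data are illustrative only.

```python
import numpy as np
from scipy.stats import gamma

def hrf(t):
    """A simple double-gamma-style haemodynamic response function (illustrative)."""
    return gamma.pdf(t, 6) - 0.35 * gamma.pdf(t, 16)

tr, n_scans = 2.0, 120
frame_times = np.arange(n_scans) * tr
boxcar = ((frame_times % 40) < 20).astype(float)             # 20 s task / 20 s rest blocks
task_reg = np.convolve(boxcar, hrf(np.arange(0, 32, tr)))[:n_scans]
X = np.column_stack([task_reg, np.ones(n_scans)])            # design matrix: task + constant

rng = np.random.default_rng(0)
y = 0.8 * task_reg + rng.standard_normal(n_scans)            # simulated voxel time series
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid_var = np.sum((y - X @ beta) ** 2) / (n_scans - X.shape[1])
t_stat = beta[0] / np.sqrt(resid_var * np.linalg.inv(X.T @ X)[0, 0])
print(beta[0], t_stat)
```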
bioWidgets: data interaction components for genomics.
Fischer, S; Crabtree, J; Brunk, B; Gibson, M; Overton, G C
1999-10-01
The presentation of genomics data in a perspicuous visual format is critical for its rapid interpretation and validation. Relatively few public database developers have the resources to implement sophisticated front-end user interfaces themselves. Accordingly, these developers would benefit from a reusable toolkit of user interface and data visualization components. We have designed the bioWidget toolkit as a set of JavaBean components. It includes a wide array of user interface components and defines an architecture for assembling applications. The toolkit is founded on established software engineering design patterns and principles, including componentry, Model-View-Controller, factored models and schema neutrality. As a proof of concept, we have used the bioWidget toolkit to create three extendible applications: AnnotView, BlastView and AlignView.
Collecting Unsolicited User-Generated Change Requests
2015-12-01
change requests, although the core principles of the steps apply equally to non-software change requests (Champagne and April, 2014: pp 6-9). The...Capabilities Integration and Development System (JCIDS). JCIDS Manual. Washington: CJCS, 12 February 2015. Champagne, Roger and Alain April. "Software
48 CFR 852.211-70 - Service data manuals.
Code of Federal Regulations, 2011 CFR
2011-10-01
... applicable, flow charts and narrative descriptions of software shall be provided. If programming is either...) Section IV, Principles of Operation. This section shall describe in narrative form the principles of... the recommended frequency of performance shall be included for visual inspection, cleaning...
Marathon: An Open Source Software Library for the Analysis of Markov-Chain Monte Carlo Algorithms
Rechner, Steffen; Berger, Annabell
2016-01-01
We present the software library marathon, which is designed to support the analysis of sampling algorithms that are based on the Markov-Chain Monte Carlo principle. The main application of this library is the computation of properties of so-called state graphs, which represent the structure of Markov chains. We demonstrate applications and the usefulness of marathon by investigating the quality of several bounding methods on four well-known Markov chains for sampling perfect matchings and bipartite graphs. In a set of experiments, we compute the total mixing time and several of its bounds for a large number of input instances. We find that the upper bound gained by the famous canonical path method is often several orders of magnitude larger than the total mixing time and deteriorates with growing input size. In contrast, the spectral bound is found to be a precise approximation of the total mixing time. PMID:26824442
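Independent of the marathon library's own API, the following sketch computes the spectral bound discussed above for a small reversible chain, t_mix(ε) ≤ ln(1/(ε·π_min)) / (1 − λ*), where λ* is the second-largest eigenvalue modulus of the transition matrix.

```python
import numpy as np

def spectral_mixing_bound(P, pi, eps=0.25):
    """Upper bound on the mixing time of a reversible chain with transition matrix P."""
    d = np.sqrt(pi)
    S = (d[:, None] * P) / d[None, :]            # similar, symmetric matrix for reversible P
    eigvals = np.sort(np.abs(np.linalg.eigvalsh(S)))
    lam_star = eigvals[-2]                       # second-largest eigenvalue modulus
    return np.log(1.0 / (eps * pi.min())) / (1.0 - lam_star)

# Lazy random walk on a cycle of 8 states (uniform stationary distribution)
n = 8
P = 0.5 * np.eye(n) + 0.25 * (np.roll(np.eye(n), 1, 0) + np.roll(np.eye(n), -1, 0))
print(spectral_mixing_bound(P, np.full(n, 1.0 / n)))
```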
PLACE: an open-source python package for laboratory automation, control, and experimentation.
Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper
2015-02-01
In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation. © 2014 Society for Laboratory Automation and Screening.
Modeling Complex Cross-Systems Software Interfaces Using SysML
NASA Technical Reports Server (NTRS)
Mandutianu, Sanda; Morillo, Ron; Simpson, Kim; Liepack, Otfrid; Bonanne, Kevin
2013-01-01
The complex flight and ground systems for NASA human space exploration are designed, built, operated and managed as separate programs and projects. However, each system relies on one or more of the other systems in order to accomplish specific mission objectives, creating a complex, tightly coupled architecture. Thus, there is a fundamental need to understand how each system interacts with the others. To determine if a model-based system engineering approach could be utilized to assist with understanding the complex system interactions, the NASA Engineering and Safety Center (NESC) sponsored a task to develop an approach for performing cross-system behavior modeling. This paper presents the results of applying Model Based Systems Engineering (MBSE) principles using the System Modeling Language (SysML) to define cross-system behaviors and how they map to cross-system software interfaces documented in system-level Interface Control Documents (ICDs).
NASA Technical Reports Server (NTRS)
Trimble, Jay Phillip
2014-01-01
The Resource Prospector Mission seeks to rove the lunar surface with an in-situ resource utilization payload in search of volatiles at a polar region. The mission operations system (MOS) will need to perform the short-duration mission while taking advantage of the near real time control that the short one-way light time to the Moon provides. To maximize our use of limited resources for the design and development of the MOS we are utilizing agile and lean methods derived from our previous experience with applying these methods to software. By using methods such as "say it then sim it" we will spend less time in meetings and more time focused on the one outcome that counts - the effective utilization of our assets on the Moon to meet mission objectives.
Design of laser afocal zoom expander system
NASA Astrophysics Data System (ADS)
Jiang, Lian; Zeng, Chun-Mei; Hu, Tian-Tian
2018-01-01
Because its output beam diameter is variable, a laser afocal zoom beam expander can be used in a light-sheet illumination microscope to observe samples of different sizes. Based on the principle of the afocal zoom system, a laser collimation and beam expansion system with a total length of less than 110 mm, six spherical lenses and an expansion ratio of 10 is designed using the Zemax software. The system is designed for a laser with a wavelength of 532 nm, a divergence angle of less than 4 mrad and an incident beam diameter of 4 mm. With the combination of six spherical lenses, the beam divergence angle is 0.4 mrad at the maximum magnification ratio, and the RMS wavefront values at the different magnification ratios are less than λ/4. The design is simple in structure, easy to manufacture and align, and has practical value.
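A quick sanity check of the afocal-expander relations behind these numbers (not the Zemax design itself): an expansion ratio M multiplies the beam diameter by M and divides the divergence by M.

```python
def expand(beam_diameter_mm, divergence_mrad, magnification):
    """Ideal afocal beam expander: diameter scales up, divergence scales down."""
    return beam_diameter_mm * magnification, divergence_mrad / magnification

d_out, theta_out = expand(4.0, 4.0, 10.0)
print(d_out, theta_out)   # 40 mm output beam and 0.4 mrad residual divergence,
                          # consistent with the values reported above
```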
Study on nondestructive detection system based on x-ray for wire ropes conveyer belt
NASA Astrophysics Data System (ADS)
Miao, Changyun; Shi, Boya; Wan, Peng; Li, Jie
2008-03-01
A nondestructive detection system based on X-rays for wire rope conveyer belts is designed using X-ray detection technology. In this paper the X-ray detection principle is analyzed and a design scheme for the system is presented; image processing of the conveyer belt is studied and the image processing algorithms are given; the X-ray acquisition and receiving board is designed with an FPGA and a DSP; and the system software is programmed in C#.NET on the WinXP/Win2000 platform. Experiments indicate that the system can perform remote real-time inspection of wire rope conveyer belt images, find faults and raise an alarm in time. The system is intuitive, operates in real time and is highly accurate. It can be used for fault detection of wire rope conveyer belts in mines, ports, terminals and other fields.
The optical design of solar spectrograph
NASA Astrophysics Data System (ADS)
Zhang, Yang; Pan, Wen-Qiang; Meng, Xiang-Yue; Lv, Xian-Kui; Feng, Jie; Zhu, Jia-Wei; Zhang, Xiao-Xiao; Li, Lei; Yang, Wei-Ping
2017-08-01
At the beginning of this paper, we briefly describe the theory of the spectrograph and the operating principle of the grating. Based on spectrometer theory and optical theory we design a solar spectrograph through analysis and calculation; the working waveband of this solar spectrograph is between 510 nm and 540 nm. According to the design data, we determine the blaze angle of the grating and the focal length of the collimator. Because of the collimator in the optical structure, astigmatism exists in the system, so we add a cylindrical lens to correct it. The optical system uses a white-pupil design and a folded light path to keep the whole system simple. Finally, using the calculated design parameters, we simulate the system in the Zemax software; the resulting RMS spot radius is only 4 μm at 520 nm. It is worth noting that only the resolution near the reference wavelength (520 nm) meets the design requirements.
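For readers unfamiliar with the sizing step described here, a small illustration of the grating equation mλ = d(sin α + sin β) around 520 nm follows; the groove density and incidence angle are assumed values, since the actual grating parameters are not given in the abstract.

```python
import numpy as np

def diffraction_angle(wavelength_nm, grooves_per_mm, incidence_deg, order=1):
    d_nm = 1e6 / grooves_per_mm                               # groove spacing in nm
    s = order * wavelength_nm / d_nm - np.sin(np.radians(incidence_deg))
    return np.degrees(np.arcsin(s))

def angular_dispersion(wavelength_nm, grooves_per_mm, incidence_deg, order=1):
    beta = np.radians(diffraction_angle(wavelength_nm, grooves_per_mm, incidence_deg, order))
    return order / ((1e6 / grooves_per_mm) * np.cos(beta))    # rad per nm

print(diffraction_angle(520, 1200, 10.0))     # assumed 1200 gr/mm grating at 10 deg incidence
print(angular_dispersion(520, 1200, 10.0))
```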
Design Considerations of a Virtual Laboratory for Advanced X-ray Sources
NASA Astrophysics Data System (ADS)
Luginsland, J. W.; Frese, M. H.; Frese, S. D.; Watrous, J. J.; Heileman, G. L.
2004-11-01
The field of scientific computation has greatly advanced in the last few years, resulting in the ability to perform complex computer simulations that can predict the performance of real-world experiments in a number of fields of study. Among the forces driving this new computational capability is the advent of parallel algorithms, allowing calculations in three-dimensional space with realistic time scales. Electromagnetic radiation sources driven by high-voltage, high-current electron beams offer an area to further push the state-of-the-art in high fidelity, first-principles simulation tools. The physics of these x-ray sources combine kinetic plasma physics (electron beams) with dense fluid-like plasma physics (anode plasmas) and x-ray generation (bremsstrahlung). There are a number of mature techniques and software packages for dealing with the individual aspects of these sources, such as Particle-In-Cell (PIC), Magneto-Hydrodynamics (MHD), and radiation transport codes. The current effort is focused on developing an object-oriented software environment using the Rational© Unified Process and the Unified Modeling Language (UML) to provide a framework where multiple 3D parallel physics packages, such as a PIC code (ICEPIC), a MHD code (MACH), and a x-ray transport code (ITS) can co-exist in a system-of-systems approach to modeling advanced x-ray sources. Initial software design and assessments of the various physics algorithms' fidelity will be presented.
Research and development of the laser tracker measurement system
NASA Astrophysics Data System (ADS)
Zhang, Z. L.; Zhou, W. H.; Lao, D. B.; Yuan, J.; Dong, D. F. F.; Ji, R. Y. Y.
2013-01-01
The working principle and system design of the laser tracker measurement system are introduced, as well as the key technologies and solutions in the implementation of the system. The design and implementation of the hardware and configuration of the software are mainly researched. The components of the hardware include distance measuring unit, angle measuring unit, tracking and servo control unit and electronic control unit. The distance measuring devices include the relative distance measuring device (IFM) and the absolute distance measuring device (ADM). The main component of the angle measuring device, the precision rotating stage, is mainly comprised of the precision axis and the encoders which are both set in the tracking head. The data processing unit, tracking and control unit and power supply unit are all set in the control box. The software module is comprised of the communication module, calibration and error compensation module, data analysis module, database management module, 3D display module and the man-machine interface module. The prototype of the laser tracker system has been accomplished and experiments have been carried out to verify the proposed strategies of the hardware and software modules. The experiments showed that the IFM distance measuring error is within 0.15mm, the ADM distance measuring error is within 3.5mm and the angle measuring error is within 3" which demonstrates that the preliminary prototype can realize fundamental measurement tasks.
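As a minimal sketch (not the authors' implementation), the three raw observables of such a tracker, the slant distance from the IFM/ADM and the two encoder angles, combine into a 3D point by the usual spherical-to-Cartesian relation; the last line shows how the quoted 3" angular error corresponds to roughly 0.15 mm of lateral error at a 10 m range.

```python
import math

def tracker_point(distance_m, azimuth_deg, elevation_deg):
    """Convert tracker observables (range plus two angles) to Cartesian coordinates."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return x, y, z

print(tracker_point(10.0, 30.0, 15.0))
print(10.0 * math.radians(3.0 / 3600.0) * 1000.0)   # ~0.145 mm lateral error from 3" at 10 m
```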
An embedded laser marking controller based on ARM and FPGA processors.
Dongyun, Wang; Xinpiao, Ye
2014-01-01
Laser marking is an important branch of laser information processing technology. Existing laser marking machines based on a PC and the Windows operating system are large and inconvenient to move, and they cannot work outdoors or in other harsh environments. To compensate for these disadvantages, this paper proposes an embedded laser marking controller based on ARM and FPGA processors. Based on the principle of laser galvanometer scanning marking, the hardware and software were designed for the application. Experiments showed that this new embedded laser marking controller controls the galvanometers synchronously and can achieve precise marking.
NASA Technical Reports Server (NTRS)
Simpson, J. J.; Frouin, R.
1996-01-01
Grant activities accomplished during this reporting period are summarized. The contributions of the principal investigator are reported under four categories: (1) AVHRR (Advanced Very High Resolution Radiometer) data; (2) GOES (Geostationary Operational Environmental Satellite) data; (3) system software design; and (4) ATSR (Along Track Scanning Radiometer) data. The contributions of the associate investigator are reported for: (1) longwave irradiance at the surface; (2) methods to derive surface shortwave irradiance; and (3) estimating PAR (photosynthetically active radiation) at the surface. Several papers have resulted. Abstracts for each paper are provided.
The Ames Virtual Environment Workstation: Implementation issues and requirements
NASA Technical Reports Server (NTRS)
Fisher, Scott S.; Jacoby, R.; Bryson, S.; Stone, P.; Mcdowall, I.; Bolas, M.; Dasaro, D.; Wenzel, Elizabeth M.; Coler, C.; Kerr, D.
1991-01-01
This presentation describes recent developments in the implementation of a virtual environment workstation in the Aerospace Human Factors Research Division of NASA's Ames Research Center. Introductory discussions are presented on the primary research objectives and applications of the system and on the system's current hardware and software configuration. Principal attention is then focused on unique issues and problems encountered in the workstation's development, with emphasis on its ability to meet original design specifications for computational graphics performance and for the associated human factors requirements necessary to provide a compelling sense of presence and efficient interaction in the virtual environment.
The Effects of Embedding Generative Cognitive Strategies in Science Software.
ERIC Educational Resources Information Center
Barba, Robertta H.; Merchant, Linda J.
1990-01-01
Discussed is whether embedding generative cognitive strategies in microcomputer courseware improves student performance on cognitive assessment measures and on insect classification tasks. The effects of transactional software on students' knowledge of insect anatomy and principles of insect classification were also investigated. (KR)
Tailoring Software Inspections for Aspect-Oriented Programming
ERIC Educational Resources Information Center
Watkins, Charlette Ward
2009-01-01
Aspect-Oriented Software Development (AOSD) is a new approach that addresses limitations inherent in conventional programming, especially the principle of separation of concerns by emphasizing the encapsulation and modularization of crosscutting concerns through a new abstraction, the "aspect." Aspect-oriented programming is an emerging AOSD…
Student Development of Educational Software: Spin-Offs from Classroom Use of DIAS.
ERIC Educational Resources Information Center
Harrington, John A., Jr.; And Others
1988-01-01
Describes several college courses which encourage students to develop computer software programs in the areas of remote sensing and geographic information systems. A microcomputer-based tutorial package, the Digital Image Analysis System (DIAS), teaches the principles of digital processing. (LS)
ERIC Educational Resources Information Center
Dewhurst, D. G.; And Others
1989-01-01
An interactive computer-assisted learning program written for the BBC microcomputer to teach the basic principles of genetic engineering is described. Discussed are the hardware requirements, software, use of the program, and assessment. (Author/CW)
Virtual immunology: software for teaching basic immunology.
Berçot, Filipe Faria; Fidalgo-Neto, Antônio Augusto; Lopes, Renato Matos; Faggioni, Thais; Alves, Luiz Anastácio
2013-01-01
As immunology continues to evolve, many educational methods have found difficulty in conveying the degree of complexity inherent in its basic principles. Today, the teaching-learning process in such areas has been improved with tools such as educational software. This article introduces "Virtual Immunology," a software program available free of charge in Portuguese and English, which can be used by teachers and students in physiology, immunology, and cellular biology classes. We discuss the development of the initial two modules: "Organs and Lymphoid Tissues" and "Inflammation" and the use of interactive activities to provide microscopic and macroscopic understanding in immunology. Students, both graduate and undergraduate, were questioned along with university level professors about the quality of the software and intuitiveness of use, facility of navigation, and aesthetic organization using a Likert scale. An overwhelmingly satisfactory result was obtained with both students and immunology teachers. Programs such as "Virtual Immunology" are offering more interactive, multimedia approaches to complex scientific principles that increase student motivation, interest, and comprehension. © 2013 by The International Union of Biochemistry and Molecular Biology.
ProteoWizard: open source software for rapid proteomics tools development.
Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag
2008-11-01
The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers of the mzML data format, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge, at http://proteowizard.sourceforge.net. This website also provides code examples, and documentation. It is our hope the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.
The digital compensation technology system for automotive pressure sensor
NASA Astrophysics Data System (ADS)
Guo, Bin; Li, Quanling; Lu, Yi; Luo, Zai
2011-05-01
Piezoresistive pressure sensors, made of semiconductor silicon and based on the piezoresistive effect, have many desirable characteristics. However, because of the temperature sensitivity of semiconductors, the performance of a silicon sensor also changes with temperature, and a pressure sensor without temperature drift cannot yet be produced. This paper briefly describes the principles of such sensors, the function of the pressure sensor and the various types of compensation method, and presents a detailed digital compensation scheme for an automotive pressure sensor. Mixed analog-digital signal conditioning is used, adopting the MAX1452 signal conditioning chip, the AVR microcontroller ATmega128 and other components. The hardware circuits of the digital pressure sensor and of the microcontroller are designed, together with the microcontroller software: the sensor hardware circuit implements the correction and compensation of the sensor, the microcontroller hardware circuit controls the correction and compensation, and the microcontroller software carries out the compensation arithmetic. Finally, the sensor output is measured and compared with uncompensated data; the results indicate that the accuracy of the compensated sensor output is clearly better than that of the uncompensated sensor, improving the compensation accuracy and also increasing the stability of the pressure sensor.
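A hedged sketch of the kind of digital compensation step the microcontroller performs: fit the offset-versus-temperature behaviour from a few calibration points and subtract it from the raw reading. The coefficients and data below are invented for illustration and do not reproduce the actual MAX1452-based scheme.

```python
import numpy as np

def calibrate(temps_c, raw_offsets):
    """Fit a quadratic offset-vs-temperature model from calibration data."""
    return np.polyfit(temps_c, raw_offsets, 2)

def compensate(raw_pressure_kpa, temp_c, coeffs):
    """Subtract the temperature-dependent offset from a raw pressure reading."""
    return raw_pressure_kpa - np.polyval(coeffs, temp_c)

coeffs = calibrate([-20, 0, 25, 60, 85], [0.42, 0.18, 0.0, -0.21, -0.35])   # made-up data
print(compensate(101.30, 60.0, coeffs))
```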
Digital Learning Characteristics and Principles of Information Resources Knowledge Structuring
ERIC Educational Resources Information Center
Belichenko, Margarita; Davidovitch, Nitza; Kravchenko, Yuri
2017-01-01
Analysis of the principles of knowledge representation in information systems has led to the need to improve knowledge structuring, driven by the development of software components and the new possibilities of information technologies. The article combines methodological aspects of structuring knowledge and effective usage of information…
[Optical Design of Miniature Infrared Gratings Spectrometer Based on Planar Waveguide].
Li, Yang-yu; Fang, Yong-hua; Li, Da-cheng; Liu, Yang
2015-03-01
In order to miniaturize an infrared spectrometer, we analyze current optical designs of miniature spectrometers and propose a method for designing a miniature infrared grating spectrometer based on a planar waveguide. A common miniature spectrometer uses miniature optical elements to reduce the size of the system, which also shrinks the effective aperture, so its performance drops. The miniaturization principle of a planar waveguide spectrometer is different. In a planar waveguide spectrometer, light propagation is confined to a thin planar waveguide, as if the whole optical system were squashed flat. In the direction parallel to the planar waveguide, the light passing through the slit is collimated, dispersed and focused, and a spectral image is formed in the detector plane; this propagation is similar to that in a common miniature spectrometer. In the direction perpendicular to the planar waveguide, light undergoes multiple reflections between the upper and lower surfaces of the waveguide as it propagates, so the corresponding optical elements can be very small in the vertical direction, which reduces the size of the optical system while the spectrometer performance remains good. The design method for the planar waveguide spectrometer can be separated into two parts: Czerny-Turner structure design and planar waveguide structure design. First, using aberration theory, an aberration-corrected (spherical aberration, coma, focal curve) Czerny-Turner structure is obtained, and the operating wavelength range and spectral resolution are fixed. Then, using geometrical optics, a planar waveguide structure is designed to reduce the system size and correct the astigmatism; this structure comprises a planar waveguide and two cylindrical lenses. Finally, the two parts are modeled together in optical design software and optimized as a whole. An infrared planar waveguide spectrometer is designed using this method. The operating wavelength range is 8-12 μm, the numerical aperture is 0.22, and the linear array detector contains 64 elements. Using the Zemax software, the design is optimized and analyzed. The results indicate that the size of the optical system is 130 mm x 125 mm x 20 mm and the spectral resolution is 80 nm, which satisfies the design requirements. This method can therefore be used to design a miniature spectrometer with no moving parts and a size of several cubic centimeters.
The Source to S2K Conversion System.
1978-12-01
management system provides. As for all software production, the cost of writing this program is high, particularly considering it may be executed only...research, and (3) finally, implement the system using disciplined, structured software engineering principles. In order to properly document how these...complete read step is required (as done by the Michigan System and EXPRESS) or software support outside the conversion system (as in CODS) is required
The Regulation of Medical Computer Software as a “Device” under the Food, Drug, and Cosmetic Act
Brannigan, Vincent
1986-01-01
Recent developments in computer software have raised the possibility that federal regulators may claim to control medical computer software as a “device” under the Food, Drug and Cosmetic Act. The purpose of this paper is to analyze the FDCA to determine whether computer software is included in the statutory scheme, examine constitutional arguments relating to computer software, and discuss regulatory principles that should be taken into account when deciding appropriate regulation. This paper is limited to computer program output used by humans in deciding appropriate medical therapy for a patient.
NASA Astrophysics Data System (ADS)
Guo, Limin; Liu, Youqiang; Huang, Rui; Wang, Zhiyong
2017-06-01
High-concentration PV systems rely on large Fresnel lenses that must be precisely oriented toward the Sun to maintain a high concentration ratio. In this paper we propose a new Fresnel lens design method combining equal-width and equal-height grooves, based on the principle of maximizing the energy in the focused spot. In the ring band near the center of the Fresnel lens, the equal-width groove design is applied, and when a given condition is reached, the equal-height groove design is introduced near the edges of the lens, which ensures that all the lens grooves are planar. We build an example Fresnel lens design model in SolidWorks and simulate it with the ZEMAX software. An experimental test platform is built, and the correctness of the simulation is confirmed by experiments. The experimental results show that the concentrating efficiency of this example is 69.3%, slightly lower than the simulated value of 75.1%.
Use of Writing with Symbols 2000 Software to Facilitate Emergent Literacy Development
ERIC Educational Resources Information Center
Parette, Howard P.; Boeckmann, Nichole M.; Hourcade, Jack J.
2008-01-01
This paper outlines the use of the "Writing with Symbols 2000" software to facilitate emergent literacy development. The program's use of pictures incorporated with text has great potential to help young children with and without disabilities acquire fundamental literacy concepts about print, phonemic awareness, alphabetic principle, vocabulary…
ERIC Educational Resources Information Center
Jones, Lawrence; Graham, Ian
1986-01-01
Reviews the main principles of interfacing and discusses the software developed to perform kinetic data capture and analysis with a BBC microcomputer linked to a recording spectrophotometer. Focuses on the steps in software development. Includes results of a lactate dehydrogenase assay. (ML)
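As an illustration of the kinetic analysis described (not the BBC software itself), the initial reaction rate can be taken as the slope of the early, linear part of an absorbance-versus-time trace, for example NADH absorbance at 340 nm in an LDH assay; the data below are simulated.

```python
import numpy as np

def initial_rate(time_s, absorbance, window=10):
    """Initial rate in absorbance units per minute from the first `window` points."""
    slope, _ = np.polyfit(time_s[:window], absorbance[:window], 1)
    return slope * 60.0

t = np.arange(0, 60, 2.0)
a340 = 1.2 - 0.004 * t + 0.001 * np.random.default_rng(1).standard_normal(len(t))
print(initial_rate(t, a340))    # close to -0.24 AU/min for this simulated trace
```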
A Survey of Display Hardware and Software.
ERIC Educational Resources Information Center
Poore, Jesse H., Jr.; And Others
Reported are two papers which deal with the fundamentals of display hardware and software in computer systems. The first report presents the basic principles of display hardware in terms of image generation from buffers presumed to be loaded and controlled by a digital computer. The concepts surrounding the electrostatic tube, the electromagnetic…
Efficient Predictions of Excited State for Nanomaterials Using Aces 3 and 4
2017-12-20
by first-principle methods in the software package ACES by using large parallel computers, growing to the exascale. Subject terms: computer modeling, excited states, optical properties, structure, stability, activation barriers, first-principle methods, parallel computing. Progress with new density functional methods…
Origami: An Active Learning Exercise for Scrum Project Management
ERIC Educational Resources Information Center
Sibona, Christopher; Pourreza, Saba; Hill, Stephen
2018-01-01
Scrum is a popular project management model for iterative delivery of software that subscribes to Agile principles. This paper describes an origami active learning exercise to teach the principles of Scrum in management information systems courses. The exercise shows students how Agile methods respond to changes in requirements during project…
NASA Astrophysics Data System (ADS)
Gu, Yu; Li, Qiang; Xu, Bao-Jun; Zhao, Zhe
2014-01-01
We present a new polymer quartz piezoelectric crystal sensor that takes a quartz piezoelectric crystal as the base material and a nanometer-scale nonmetallic polymer thin film as the surface coating, based on the principle of the quartz crystal microbalance (QCM). The new sensor can be used to detect the characteristic constituents of a volatile liquid. A mechanical model of the new sensor was built, whose structure was a thin circular plate composed of polytef/quartz piezoelectric crystal/polytef, with a diameter of 8 mm and a thickness of 170 μm. The vibration state of the model was simulated with the ANSYS software after the physical parameters and the boundary conditions of the new sensor were set. Based on the experimental results, we examined a frequency range from 9.995850 MHz to 9.997225 MHz; 17 frequencies and vibration modes were obtained within this range. We found a special frequency fsp of 9.996358 MHz. When the resonant frequency of the sensor's mechanical model reached this special frequency, a special phenomenon occurred: the amplitude of the center point O of the mechanical model reached its maximum value, and at the same time the absolute difference between the frequency simulated with ANSYS and the experimentally measured stable frequency reached its minimum. The research showed that the design of the new polymer quartz piezoelectric crystal sensor conforms well to the QCM principle. The special frequency fsp subsequently became one of the most important parameters in the new sensor design.
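For context on the QCM principle invoked here, the Sauerbrey relation links a frequency shift to adsorbed mass, Δf = -2·f0²·Δm / (A·sqrt(ρq·μq)); the sketch below uses a fundamental near the abstract's ~10 MHz but an assumed electrode area, so the number is illustrative only.

```python
import math

def sauerbrey_df(delta_mass_g, f0_hz=10e6, area_cm2=0.5):
    """Frequency shift (Hz) predicted by the Sauerbrey equation for an added mass."""
    rho_q = 2.648         # quartz density, g/cm^3
    mu_q = 2.947e11       # quartz shear modulus, g cm^-1 s^-2
    return -2.0 * f0_hz ** 2 * delta_mass_g / (area_cm2 * math.sqrt(rho_q * mu_q))

print(sauerbrey_df(1e-9))   # about -0.45 Hz for 1 ng on a 10 MHz crystal of 0.5 cm^2
```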
NASA Technical Reports Server (NTRS)
Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)
2001-01-01
The problem to be addressed in this paper is to explore how Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance by development of a qualitative and reliable engine control system (QRECS). Specifically, this will be addressed by enhancing rocket engine control using SCT, innovative data mining tools, and sound software engineering practices used in Marshall's Flight Software Group (FSG). The principal goals for addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks); some of which are briefed in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC) which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for design, development, implementation, and testing of a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations. The overall intent is to demonstrate that by employing soft computing technologies, the quality and reliability of the overall approach to engine controller development is further improved and vehicle safety is further ensured. The final product that this paper proposes is an approach to development of an alternative low cost engine controller that would be capable of performing in unique vision spacecraft vehicles requiring low cost advanced avionics architectures for autonomous operations from engine pre-start to engine shutdown.
Development of a knowledge management system for complex domains.
Perott, André; Schader, Nils; Bruder, Ralph; Leonhardt, Jörg
2012-01-01
Deutsche Flugsicherung GmbH, the German Air Navigation Service Provider, follows a systematic approach, called HERA, for investigating incidents. The HERA analysis shows a distinctive occurrence of incidents in German air traffic control in which the visual perception of information plays a key role. The reasons can be partially traced back to workstation design, where basic ergonomic rules and principles are not sufficiently followed by the designers in some cases. In cooperation with the Institute of Ergonomics in Darmstadt the DFS investigated possible approaches that may support designers in implementing ergonomic systems. None of the currently available tools were found to be able to meet the identified user requirements holistically. Therefore it was suggested to develop an enhanced software tool called Design Process Guide. The name Design Process Guide indicates that this tool exceeds the classic functions of currently available Knowledge Management Systems. It offers "design element" based access, shows processual and content related topics, and shows the implications of certain design decisions. Furthermore, it serves as documentation, detailing why a designer made a decision under a particular set of conditions.
Smart cards: a specific application in the hospital.
Güler, I; Zengin, R M; Sönmez, M
1998-12-01
Computers have the ability to process and access tremendous amounts of information in our daily lives. But, now, individuals have this ability by carrying a smart card in their own wallets. These cards provide us the versatility, power, and security of computers. This study begins with a short description of smart cards and their advantages. Then, an electronic circuit that is designed for healthcare application in hospitals is introduced. This circuit functions as a smart card holder identifier, access controller for hospital doors and also can be used as a smart card reader/writer. Design steps of this electronic circuit, operation principles, serial communication with P.C., and the software are examined. Finally a complete access control network for hospital doors that functions with smart cards is discussed.
Kim, Min H.; Yoon, Hargsoon; Choi, Sang H.; Zhao, Fei; Kim, Jongsung; Song, Kyo D.; Lee, Uhn
2016-01-01
Real-time monitoring of extracellular neurotransmitter concentration offers great benefits for diagnosis and treatment of neurological disorders and diseases. This paper presents the study design and results of a miniaturized and wireless optical neurotransmitter sensor (MWONS) for real-time monitoring of brain dopamine concentration. MWONS is based on fluorescent sensing principles and comprises a microspectrometer unit, a microcontroller for data acquisition, and a Bluetooth wireless network for real-time monitoring. MWONS has a custom-designed application software that controls the operation parameters for excitation light sources, data acquisition, and signal processing. MWONS successfully demonstrated a measurement capability with a limit of detection down to a 100 nanomole dopamine concentration, and high selectivity to ascorbic acid (90:1) and uric acid (36:1). PMID:27834927
Kim, Min H; Yoon, Hargsoon; Choi, Sang H; Zhao, Fei; Kim, Jongsung; Song, Kyo D; Lee, Uhn
2016-11-10
Real-time monitoring of extracellular neurotransmitter concentration offers great benefits for diagnosis and treatment of neurological disorders and diseases. This paper presents the study design and results of a miniaturized and wireless optical neurotransmitter sensor (MWONS) for real-time monitoring of brain dopamine concentration. MWONS is based on fluorescent sensing principles and comprises a microspectrometer unit, a microcontroller for data acquisition, and a Bluetooth wireless network for real-time monitoring. MWONS has a custom-designed application software that controls the operation parameters for excitation light sources, data acquisition, and signal processing. MWONS successfully demonstrated a measurement capability with a limit of detection down to a 100 nanomole dopamine concentration, and high selectivity to ascorbic acid (90:1) and uric acid (36:1).
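A common way to state a detection limit of the kind quoted above (not necessarily the authors' exact procedure) is LOD = 3·σ_blank / calibration slope; the calibration points and blank readings below are invented.

```python
import numpy as np

def limit_of_detection(conc_nM, signal, blank_readings):
    """LOD in the same concentration units as conc_nM."""
    slope, _ = np.polyfit(conc_nM, signal, 1)
    return 3.0 * np.std(blank_readings, ddof=1) / slope

conc = np.array([0.0, 100.0, 250.0, 500.0, 1000.0])       # made-up calibration points, nM
sig = 0.012 * conc + 0.05                                  # made-up fluorescence response
print(limit_of_detection(conc, sig, np.array([0.049, 0.052, 0.047, 0.051, 0.050])))
```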
Liu, M; Wei, L; Zhang, J
2006-01-01
Missing data in clinical trials are inevitable. We highlight the ICH guidelines and CPMP points to consider on missing data. Specifically, we outline how we should consider missing data issues when designing, planning and conducting studies to minimize missing data impact. We also go beyond the coverage of the above two documents, provide a more detailed review of the basic concepts of missing data and frequently used terminologies, and examples of the typical missing data mechanism, and discuss technical details and literature for several frequently used statistical methods and associated software. Finally, we provide a case study where the principles outlined in this paper are applied to one clinical program at protocol design, data analysis plan and other stages of a clinical trial.
Dugas, Martin; Dugas-Breit, Susanne
2014-01-01
Design, execution and analysis of clinical studies involve several stakeholders with different professional backgrounds. Typically, principal investigators are familiar with standard office tools, data managers apply electronic data capture (EDC) systems and statisticians work with statistics software. Case report forms (CRFs) specify the data model of study subjects, evolve over time and consist of hundreds to thousands of data items per study. To avoid erroneous manual transformation work, a conversion tool for different representations of study data models was designed. It can convert between office format, EDC and statistics format. In addition, it supports semantic annotations, which enable precise definitions for data items. A reference implementation is available as open source package ODMconverter at http://cran.r-project.org.
Brain MRI volumetry in a single patient with mild traumatic brain injury.
Ross, David E; Castelvecchi, Cody; Ochs, Alfred L
2013-01-01
This letter to the editor describes the case of a 42 year old man with mild traumatic brain injury and multiple neuropsychiatric symptoms which persisted for a few years after the injury. Initial CT scans and MRI scans of the brain showed no signs of atrophy. Brain volume was measured using NeuroQuant®, an FDA-approved, commercially available software method. Volumetric cross-sectional (one point in time) analysis also showed no atrophy. However, volumetric longitudinal (two points in time) analysis showed progressive atrophy in several brain regions. This case illustrated in a single patient the principle discovered in multiple previous group studies, namely that the longitudinal design is more powerful than the cross-sectional design for finding atrophy in patients with traumatic brain injury.
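A minimal sketch of the longitudinal comparison described: atrophy is assessed as the percent (and annualized) change in a regional volume between two scans, rather than from a single scan against norms; the volumes and interval below are hypothetical.

```python
def percent_volume_change(vol_t1_ml, vol_t2_ml, interval_years):
    """Total and annualized percent volume change between two time points."""
    change = 100.0 * (vol_t2_ml - vol_t1_ml) / vol_t1_ml
    return change, change / interval_years

print(percent_volume_change(vol_t1_ml=152.0, vol_t2_ml=149.0, interval_years=1.5))
```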
A compact semiconductor digital interferometer and its applications
NASA Astrophysics Data System (ADS)
Britsky, Oleksander I.; Gorbov, Ivan V.; Petrov, Viacheslav V.; Balagura, Iryna V.
2015-05-01
The possibility of using semiconductor laser interferometers to measure displacements at the nanometer scale was demonstrated. Design principles for miniature digital Michelson interferometers based on semiconductor lasers were proposed. An advanced processing algorithm for the interferometer quadrature signals was designed, which reduced the restrictions on the speed of the measured movements. A miniature semiconductor digital Michelson interferometer was developed. The design of a precision temperature stabilization system with 0.01ºC accuracy for the miniature low-cost semiconductor laser made it possible to use it, rather than a helium-neon laser, in a compact interferometer. Appropriate firmware and software were designed for real-time processing of the interferometer signals and their conversion into the corresponding displacements. As a result, relative displacements in the range 0-500 mm were measured with a resolution better than 1 nm. The advantages and disadvantages of the practical use of the compact semiconductor digital interferometer in seismometers for displacement measurement were shown.
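A small sketch of the quadrature-signal processing step described above: the two 90°-shifted channels give a wrapped phase, and unwrapping converts it to displacement via λ/(4π), reflecting the double pass in a Michelson arm. The 650 nm wavelength and the simulated mirror ramp are assumptions.

```python
import numpy as np

def displacement_from_quadrature(i_signal, q_signal, wavelength_nm=650.0):
    """Displacement in nm from in-phase and quadrature interferometer channels."""
    phase = np.unwrap(np.arctan2(q_signal, i_signal))
    return phase * wavelength_nm / (4.0 * np.pi)

true_disp = np.linspace(0.0, 300.0, 1000)            # simulated 300 nm mirror ramp
phi = 4.0 * np.pi * true_disp / 650.0
print(displacement_from_quadrature(np.cos(phi), np.sin(phi))[-1])   # ~300 nm recovered
```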
Software for imaging phase-shift interference microscope
NASA Astrophysics Data System (ADS)
Malinovski, I.; França, R. S.; Couceiro, I. B.
2018-03-01
In recent years an absolute interference microscope was created at the National Metrology Institute of Brazil (INMETRO). By its principle of operation, the instrument is an imaging phase-shifting interferometer (PSI) equipped with two stabilized lasers of different colour as traceable reference wavelength sources. We report here some progress in the development of the software for this instrument, as well as the status of its ongoing internal validation and verification. In contrast with the standard PSI method, a different phase-evaluation methodology is applied. Therefore, instrument-specific procedures for software validation and verification are adapted and discussed.
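Since the paper notes that it departs from the standard PSI phase evaluation, the standard four-step relation is shown below for comparison only: with frames shifted by 90°, φ = atan2(I4 − I2, I1 − I3), and height follows as φ·λ/(4π). The wavelength and intensities are illustrative.

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four frames with 0, 90, 180 and 270 degree shifts."""
    return np.arctan2(i4 - i2, i1 - i3)

def height_nm(phase_rad, wavelength_nm):
    return phase_rad * wavelength_nm / (4.0 * np.pi)

frames = [1.0 + np.cos(0.8 + k * np.pi / 2) for k in range(4)]   # one pixel, phi = 0.8 rad
phi = four_step_phase(*frames)
print(phi, height_nm(phi, 633.0))     # recovers 0.8 rad, about 40 nm at an assumed 633 nm
```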
Development status of the life marker chip instrument for ExoMars
NASA Astrophysics Data System (ADS)
Sims, Mark R.; Cullen, David C.; Rix, Catherine S.; Buckley, Alan; Derveni, Mariliza; Evans, Daniel; Miguel García-Con, Luis; Rhodes, Andrew; Rato, Carla C.; Stefinovic, Marijan; Sephton, Mark A.; Court, Richard W.; Bulloch, Christopher; Kitchingman, Ian; Ali, Zeshan; Pullan, Derek; Holt, John; Blake, Oliver; Sykes, Jonathan; Samara-Ratna, Piyal; Canali, Massimiliano; Borst, Guus; Leeuwis, Henk; Prak, Albert; Norfini, Aleandro; Geraci, Ennio; Tavanti, Marco; Brucato, John; Holm, Nils
2012-11-01
The Life Marker Chip (LMC) is one of the instruments being developed for possible flight on the 2018 ExoMars mission. The instrument uses solvents to extract organic compounds from samples of martian regolith and to transfer the extracts to dedicated detectors based around the use of antibodies. The scientific aims of the instrument are to detect organics in the form of biomarkers that might be associated with extinct life, extant life or abiotic sources of organics. The instrument relies on a novel surfactant-based solvent system and bespoke, commercial and research-developed antibodies against a number of distinct biomarkers or molecular types. The LMC comprises of a number of subsystems designed to accept up to four discrete samples of martian regolith or crushed rock, implement the solvent extraction, perform microfluidic-based multiplexed antibody-assays for biomarkers and other targets, optically detect the fluorescent output of the assays, control the internal instrument pressure and temperature, in addition to the associated instrument control electronics and software. The principle of operation, the design and the instrument development status as of December 2011 are reported here. The instrument principle can be extended to other configurations and missions as needed.
Software engineering and Ada in design
NASA Technical Reports Server (NTRS)
Oneill, Don
1986-01-01
Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process, is examined. The revised Ada design language adaptation is revealed. This four-level design methodology is detailed, including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four-level Ada design language adaptation.
Thakkar, Jay; Barry, Tony; Thiagalingam, Aravinda; Redfern, Julie; McEwan, Alistair L; Rodgers, Anthony
2016-01-01
Background Mobile health (mHealth) has huge potential to deliver preventative health services. However, there is a paucity of literature on theoretical constructs, technical, practical, and regulatory considerations that enable delivery of such services. Objectives The objective of this study was to outline the key considerations in the development of a text message-based mHealth program; thus providing broad recommendations and guidance to future researchers designing similar programs. Methods We describe the key considerations in designing the intervention with respect to functionality, technical infrastructure, data management, software components, regulatory requirements, and operationalization. We also illustrate some of the potential issues and decision points utilizing our experience of developing text message (short message service, SMS) management systems to support 2 large randomized controlled trials: TEXT messages to improve MEDication adherence & Secondary prevention (TEXTMEDS) and Tobacco, EXercise and dieT MEssages (TEXT ME). Results The steps identified in the development process were: (1) background research and development of the text message bank based on scientific evidence and disease-specific guidelines, (2) pilot testing with target audience and incorporating feedback, (3) software-hardware customization to enable delivery of complex personalized programs using prespecified algorithms, and (4) legal and regulatory considerations. Additional considerations in developing text message management systems include: balancing the use of customized versus preexisting software systems, the level of automation versus need for human inputs, monitoring, ensuring data security, interface flexibility, and the ability for upscaling. Conclusions A merging of expertise in clinical and behavioral sciences, health and research data management systems, software engineering, and mobile phone regulatory requirements is essential to develop a platform to deliver and manage support programs to hundreds of participants simultaneously as in TEXT ME and TEXTMEDS trials. This research provides broad principles that may assist other researchers in developing mHealth programs. PMID:27847350
Thakkar, Jay; Barry, Tony; Thiagalingam, Aravinda; Redfern, Julie; McEwan, Alistair L; Rodgers, Anthony; Chow, Clara K
2016-11-15
Mobile health (mHealth) has huge potential to deliver preventative health services. However, there is a paucity of literature on theoretical constructs, technical, practical, and regulatory considerations that enable delivery of such services. The objective of this study was to outline the key considerations in the development of a text message-based mHealth program; thus providing broad recommendations and guidance to future researchers designing similar programs. We describe the key considerations in designing the intervention with respect to functionality, technical infrastructure, data management, software components, regulatory requirements, and operationalization. We also illustrate some of the potential issues and decision points utilizing our experience of developing text message (short message service, SMS) management systems to support 2 large randomized controlled trials: TEXT messages to improve MEDication adherence & Secondary prevention (TEXTMEDS) and Tobacco, EXercise and dieT MEssages (TEXT ME). The steps identified in the development process were: (1) background research and development of the text message bank based on scientific evidence and disease-specific guidelines, (2) pilot testing with target audience and incorporating feedback, (3) software-hardware customization to enable delivery of complex personalized programs using prespecified algorithms, and (4) legal and regulatory considerations. Additional considerations in developing text message management systems include: balancing the use of customized versus preexisting software systems, the level of automation versus need for human inputs, monitoring, ensuring data security, interface flexibility, and the ability for upscaling. A merging of expertise in clinical and behavioral sciences, health and research data management systems, software engineering, and mobile phone regulatory requirements is essential to develop a platform to deliver and manage support programs to hundreds of participants simultaneously as in TEXT ME and TEXTMEDS trials. This research provides broad principles that may assist other researchers in developing mHealth programs. ©Jay Thakkar, Tony Barry, Aravinda Thiagalingam, Julie Redfern, Alistair L McEwan, Anthony Rodgers, Clara K Chow. Originally published in JMIR Mhealth and Uhealth (http://mhealth.jmir.org), 15.11.2016.
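As an illustration of the "prespecified algorithms" for personalized delivery mentioned above, the sketch below shows one plausible way a message could be drawn from a curated text bank according to a participant's profile; the tag scheme, weekly count and all names are hypothetical and not taken from the TEXTMEDS or TEXT ME systems.

```python
import random
from dataclasses import dataclass

@dataclass
class Message:
    text: str
    tags: set   # e.g. {"smoker", "inactive"} -- illustrative categories only

def pick_weekly_messages(message_bank, participant_tags, n=4, seed=None):
    """Select n messages whose tags match the participant's profile.

    Mirrors, in spirit, a prespecified selection algorithm drawing from a
    curated message bank; the tag scheme and weekly count are assumptions.
    """
    rng = random.Random(seed)
    eligible = [m for m in message_bank if (m.tags & participant_tags) or not m.tags]
    return rng.sample(eligible, min(n, len(eligible)))

bank = [
    Message("Try a 10 minute walk after lunch today.", {"inactive"}),
    Message("Cravings pass in a few minutes - drink water and wait.", {"smoker"}),
    Message("Swap one sugary drink for water this week.", set()),
]
for m in pick_weekly_messages(bank, {"smoker"}, n=2, seed=1):
    print(m.text)
```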
Dow, Rustam; Barnsley, Jan; Tu, Karen; Domb, Sharon; Jadad, Alejandro R.; Lemieux-Charles, Louise
2015-01-01
Research problem Tutorials and user manuals are important forms of impersonal support for using software applications including electronic medical records (EMRs). Differences between user- and vendor-generated documentation may indicate support needs, which are not sufficiently addressed by the official documentation, and reveal new elements that may inform the design of tutorials and user manuals. Research question What are the differences between user-generated tutorials and manuals for an EMR and the official user manual from the software vendor? Literature review Effective design of tutorials and user manuals requires careful packaging of information, balance between declarative and procedural texts, an action and task-oriented approach, support for error recognition and recovery, and effective use of visual elements. No previous research has compared these elements between formal and informal documents. Methodology We conducted a mixed methods study. Seven tutorials and two manuals for an EMR were collected from three family health teams and compared with the official user manual from the software vendor. Documents were qualitatively analyzed using a framework analysis approach in relation to the principles of technical documentation described above. Subsets of the data were quantitatively analyzed using cross-tabulation to compare the types of error information and visual cues in screen captures between user- and vendor-generated manuals. Results and discussion The user-developed tutorials and manuals differed from the vendor-developed manual in that they contained mostly procedural and not declarative information; were customized to the specific workflow, user roles, and patient characteristics; contained more error information related to work processes than to software usage; and used explicit visual cues on screen captures to help users identify window elements. These findings imply that to support EMR implementation, tutorials and manuals need to be customized and adapted to specific organizational contexts and workflows. The main limitation of the study is its generalizability. Future research should address this limitation and may explore alternative approaches to software documentation, such as modular manuals or participatory design. PMID:26190888
NASA Technical Reports Server (NTRS)
Keller, Richard M. (Editor); Barstow, David; Lowry, Michael R.; Tong, Christopher H.
1992-01-01
The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface.
Hatamleh, Muhanad M; Yeung, Elizabeth; Osher, Jonas; Huppa, Chrisopher
2017-05-01
Hemimandibular hyperplasia is characterized by an obvious overgrowth in the size of the mandible on one side, which can extend up to the midline causing facial asymmetry. Surgical resection of the overgrowth depends heavily on the skill and experience of the surgeon. This report describes a novel methodology of applying three-dimensional computer-aided-design and computer-aided-manufacturing principles to improve the outcome of surgery in 2 mandibular hyperplasia patients. Both patients had a cone beam computed tomography (CBCT) scan performed. CMF Pro Plan software (v. 2.1) was used to process the scan data into virtual 3-dimensional models of the maxilla and mandible. Head tilt was adjusted manually by following a horizontal reference. Facial asymmetry secondary to mandibular hypertrophy was obvious on frontal and lateral views. Simulation functions were applied, including mirror imaging of the unaffected mandibular side onto the hyperplastic side, and the position was optimized by translation and orientation functions. Reconstruction of virtual symmetry was assessed and checked by running 3-dimensional measurements. Then, subtraction functions were used to create a 3-dimensional template defining the outline of the lower mandibular osteotomy needed. The precision of the mandibular teeth was enhanced by amalgamating the CBCT scan with an e-cast scan of the patient's lower teeth. 3-Matic software (v. 10.0) was used to design cutting guide(s) that define the amount of overgrowth to be resected. The top section of the guide rested on the teeth, ensuring stability and accuracy while positioning it. The guide design was exported as an .stl file and printed on an in-house 3-dimensional printer in biocompatible resin. The three-dimensional technologies of both software packages (CMF Pro Plan and 3-Matic) provide accurate and reliable methods for the diagnosis, treatment planning, and design of cutting guides that optimize surgical correction of hemimandibular hyperplasia in a timely and cost-effective manner.
Common Grounds for Modelling Mathematics in Educational Software
ERIC Educational Resources Information Center
Neuper, Walther
2010-01-01
Two kinds of software, CAS and DGS, are starting to work towards mutual integration. This paper envisages common grounds for such integration based on principles of computer theorem proving (CTP). Presently, the CTP community seems to lack awareness as to which of their products' features might serve mathematics education from high-school to…
PBL-SEE: An Authentic Assessment Model for PBL-Based Software Engineering Education
ERIC Educational Resources Information Center
dos Santos, Simone C.
2017-01-01
The problem-based learning (PBL) approach has been successfully applied to teaching software engineering thanks to its principles of group work, learning by solving real problems, and learning environments that match the market realities. However, the lack of well-defined methodologies and processes for implementing the PBL approach represents a…
Using Cognitive Tutor Software in Learning Linear Algebra Word Concept
ERIC Educational Resources Information Center
Yang, Kai-Ju
2015-01-01
This paper reports on a study of twelve 10th grade students using Cognitive Tutor, a math software program, to learn linear algebra word concept. The study's purpose was to examine whether students' mathematics performance as it is related to using Cognitive Tutor provided evidence to support Koedlinger's (2002) four instructional principles used…
Applying Minimal Manual Principles for Documentation of Graphical User Interfaces.
ERIC Educational Resources Information Center
Nowaczyk, Ronald H.; James, E. Christopher
1993-01-01
Investigates the need to include computer screens in documentation for software using a graphical user interface. Describes the uses and purposes of "minimal manuals" and their principles. Studies student reaction to their use of one of three on-screen manuals: screens, icon, and button. Finds some benefit for including icon and button…
ERIC Educational Resources Information Center
Haley, M.
2013-01-01
The purpose of this study was to investigate whether or not there have been successful applications of lean manufacturing principles in highly variable defense IT environments. Specifically, the study assessed if implementation of the lean philosophies by a defense organization yielded repeatable, predictable results in software release schedules…
Flow analysis of new type propulsion system for UV’s
NASA Astrophysics Data System (ADS)
Eimanis, M.; Auzins, J.
2017-10-01
This paper presents an original design of an autonomous underwater vehicle in which thrust force is created by the helicoidal shape of the hull rather than by screw propellers. Propulsion force is created by counter-rotating bow and stern parts. The middle part of the vehicle functions as a cargo compartment containing all control mechanisms and communications. It is made of elastic material and contains a Cardan-joint mechanism, actuated by bending drives, which allows the direction of the vehicle to be changed. A bending drive velocity control algorithm for the automatic control of the vehicle's movement direction is proposed. The dynamics of the AUV are simulated using the multibody simulation software MSC Adams. For the simulation of water resistance forces and torques, surrogate polynomial metamodels are created on the basis of computer experiments with CFD software. For the flow interaction with the model geometry, a simplified vehicle model is submerged in a fluid medium using dedicated CFD software, following the same idea as in wind tunnel experiments. The simulation results are compared with measurements of the AUV prototype created at the Institute of Mechanics of Riga Technical University. Experiments with the prototype showed good agreement with the simulation results and confirmed the effectiveness and future potential of the proposed principle.
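The surrogate polynomial metamodel idea can be illustrated with a small sketch: fit an inexpensive polynomial to a handful of CFD-style samples and query it inside the simulation loop instead of the costly solver. The sample values, variable names and use of scikit-learn are assumptions for illustration, not the study's actual data or tooling.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Hypothetical CFD samples: rows are (speed, angle of attack), target is drag force.
X = np.array([[0.5, 0.0], [0.5, 5.0], [1.0, 0.0], [1.0, 5.0], [1.5, 0.0],
              [1.5, 5.0], [2.0, 0.0], [2.0, 5.0], [2.0, 10.0]])
y = np.array([1.2, 1.5, 4.6, 5.4, 10.1, 11.6, 18.0, 20.3, 24.9])

# Quadratic polynomial metamodel standing in for the expensive CFD solver.
surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surrogate.fit(X, y)

# The multibody simulation can now query the cheap surrogate at each time step.
print(surrogate.predict([[1.25, 2.5]]))
```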
Nano-JASMINE: use of AGIS for the next astrometric satellite
NASA Astrophysics Data System (ADS)
Yamada, Y.; Gouda, N.; Lammers, U.
The core data reduction for the Nano-JASMINE mission is planned to be done with Gaia's Astrometric Global Iterative Solution (AGIS). The collaboration started in 2007, prompted by Uwe Lammers' proposal. In addition to the similar design and operating principles of the two missions, this is possible thanks to the encapsulation of all Gaia-specific aspects of AGIS in a Parameter Database. Nano-JASMINE will be the test bench for the Gaia AGIS software. We present this idea in detail and the practical steps necessary to make AGIS work with Nano-JASMINE data. We also show the key mission parameters, goals, and status of the data reduction for Nano-JASMINE.
An Embedded Laser Marking Controller Based on ARM and FPGA Processors
Dongyun, Wang; Xinpiao, Ye
2014-01-01
Laser marking is an important branch of laser information processing technology. Existing laser marking machines, based on a PC and the Windows operating system, are large and inconvenient to move, and they cannot work outdoors or in other harsh environments. To compensate for these disadvantages, this paper proposes an embedded laser marking controller based on ARM and FPGA processors. Based on the principle of laser galvanometer scanning marking, the hardware and software were designed for the application. Experiments showed that this new embedded laser marking controller controls the galvanometers synchronously and can achieve precise marking. PMID:24772028
Bradley, Kevin M; Benner, Steven A
2014-01-01
Synthetic biologists wishing to self-assemble large DNA (L-DNA) constructs from small DNA fragments made by automated synthesis need fragments that hybridize predictably. Such predictability is difficult to obtain with oligonucleotides built from just the four standard nucleotides. Natural DNA's peculiar combination of strong and weak G:C and A:T pairs, the context-dependence of the strengths of those pairs, unimolecular strand folding that competes with desired interstrand hybridization, and non-Watson-Crick interactions available to standard DNA, all contribute to this unpredictability. In principle, adding extra nucleotides to the genetic alphabet can improve the predictability and reliability of autonomous DNA self-assembly, simply by increasing the information density of oligonucleotide sequences. These extra nucleotides are now available as parts of artificially expanded genetic information systems (AEGIS), and tools are now available to generate entirely standard DNA from AEGIS DNA during PCR amplification. Here, we describe the OligArch (for "oligonucleotide architecting") software, an application that permits synthetic biologists to engineer optimally self-assembling DNA constructs from both six- and eight-letter AEGIS alphabets. This software has been used to design oligonucleotides that self-assemble to form complete genes from 20 or more single-stranded synthetic oligonucleotides. OligArch is therefore a key element of a scalable and integrated infrastructure for the rapid and designed engineering of biology.
Hybrid Modeling Improves Health and Performance Monitoring
NASA Technical Reports Server (NTRS)
2007-01-01
Scientific Monitoring Inc. was awarded a Phase I Small Business Innovation Research (SBIR) project by NASA's Dryden Flight Research Center to create a new, simplified health-monitoring approach for flight vehicles and flight equipment. The project developed a hybrid physical model concept that provided a structured approach to simplifying complex design models for use in health monitoring, allowing the output or performance of the equipment to be compared to what the design models predicted, so that deterioration or impending failure could be detected before there would be an impact on the equipment's operational capability. Based on the original modeling technology, Scientific Monitoring released I-Trend, a commercial health- and performance-monitoring software product named for its intelligent trending, diagnostics, and prognostics capabilities, as part of the company's complete ICEMS (Intelligent Condition-based Equipment Management System) suite of monitoring and advanced alerting software. I-Trend uses the hybrid physical model to better characterize the nature of health or performance alarms that result in "no fault found" false alarms. Additionally, the use of physical principles helps I-Trend identify problems sooner. I-Trend technology is currently in use in several commercial aviation programs, and the U.S. Air Force recently tapped Scientific Monitoring to develop next-generation engine health-management software for monitoring its fleet of jet engines. Scientific Monitoring has continued the original NASA work, this time under a Phase III SBIR contract with a joint NASA-Pratt & Whitney aviation security program on propulsion-controlled aircraft under missile-damaged aircraft conditions.
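The hybrid physical-model monitoring idea boils down to comparing measured performance against what the design model predicts and alerting on the residual. The sketch below illustrates that comparison in a deliberately simplified form; the 5% tolerance, the single performance parameter and all names are hypothetical, not taken from I-Trend.

```python
def check_health(measured, predicted, tolerance=0.05):
    """Flag deterioration when measured performance drifts from the model.

    'predicted' stands in for the output of a design/physical model of the
    equipment; the 5% tolerance and single-parameter comparison are
    illustrative assumptions only.
    """
    residual = (measured - predicted) / predicted
    if abs(residual) > tolerance:
        return f"ALERT: {residual:+.1%} deviation from model prediction"
    return f"OK: {residual:+.1%} within tolerance"

# Example: model predicts 1000 thrust units, sensors measure 930 -> flagged.
print(check_health(measured=930.0, predicted=1000.0))
```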
Basic design principles of colorimetric vision systems
NASA Astrophysics Data System (ADS)
Mumzhiu, Alex M.
1998-10-01
Color measurement is an important part of overall production quality control in the textile, coating, plastics, food, paper and other industries. Color measurement instruments such as colorimeters and spectrophotometers, used for production quality control, have many limitations. In many applications they cannot be used for a variety of reasons and have to be replaced with human operators. Machine vision has great potential for color measurement. The components for color machine vision systems, such as broadcast-quality 3-CCD cameras, fast and inexpensive PCI frame grabbers, and sophisticated image processing software packages, are available. However, the machine vision industry has only started to approach the color domain. The few color machine vision systems on the market, produced by the largest machine vision manufacturers, have very limited capabilities. A lack of understanding that a vision-based color measurement system can fail if it ignores the basic principles of colorimetry is the main reason for the slow progress of color vision systems. The purpose of this paper is to clarify how color measurement principles have to be applied to vision systems and how the electro-optical design features of colorimeters have to be modified in order to implement them in vision systems. The subject far exceeds the limitations of a journal paper, so only the most important aspects are discussed. An overview of the major areas of application for colorimetric vision systems is given. Finally, the reasons why some customers are happy with their vision systems and some are not are analyzed.
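One basic colorimetric principle the paper alludes to is that camera RGB values must first be mapped into a device-independent colour space before they can be treated as colour measurements. The sketch below shows such a mapping using the standard linear sRGB-to-XYZ (D65) matrix; a real vision system would substitute its own calibration matrix measured against reference colour patches.

```python
import numpy as np

# Standard linear sRGB -> CIE XYZ (D65) matrix; a real camera needs its own
# calibration matrix, measured against reference color patches.
RGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                       [0.2126, 0.7152, 0.0722],
                       [0.0193, 0.1192, 0.9505]])

def camera_rgb_to_xyz(rgb_linear):
    """Map linearized camera RGB (0..1) to device-independent CIE XYZ."""
    return RGB_TO_XYZ @ np.asarray(rgb_linear)

# A mid-grey patch: X, Y, Z come out roughly equal, with Y ~ 0.5.
print(camera_rgb_to_xyz([0.5, 0.5, 0.5]))
```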
The Principles for Successful Scientific Data Management Revisited
NASA Astrophysics Data System (ADS)
Walker, R. J.; King, T. A.; Joy, S. P.
2005-12-01
It has been 23 years since the National Research Council's Committee on Data Management and Computation (CODMAC) published its famous list of principles for successful scientific data management that have provided the framework for modern space science data management. CODMAC outlined seven principles: 1. Scientific Involvement in all aspects of space science missions. 2. Scientific Oversight of all scientific data-management activities. 3. Data Availability - Validated data should be made available to the scientific community in a timely manner. They should include appropriate ancillary data, and complete documentation. 4. Facilities - A proper balance between cost and scientific productivity should be maintained. 5. Software - Transportable well documented software should be available to process and analyze the data. 6. Scientific Data Storage - The data should be preserved in retrievable form. 7. Data System Funding - Adequate data funding should be made available at the outset of missions and protected from overruns. In this paper we will review the lessons learned in trying to apply these principles to space derived data. The Planetary Data System created the concept of data curation to carry out the CODMAC principles. Data curators are scientists and technologists who work directly with the mission scientists to create data products. The efficient application of the CODMAC principles requires that data curators and the mission team start early in a mission to plan for data access and archiving. To build the data products the planetary discipline adopted data access and documentation standards and has adhered to them. The data curators and mission team work together to produce data products and make them available. However even with early planning and agreement on standards the needs of the science community frequently far exceed the available resources. This is especially true for smaller principal investigator run missions. We will argue that one way to make data systems for small missions more effective is for the data curators to provide software tools to help develop the mission data system.
Design and applications of a multimodality image data warehouse framework.
Wong, Stephen T C; Hoo, Kent Soo; Knowlton, Robert C; Laxer, Kenneth D; Cao, Xinhau; Hawkins, Randall A; Dillon, William P; Arenson, Ronald L
2002-01-01
A comprehensive data warehouse framework is needed, which encompasses imaging and non-imaging information in supporting disease management and research. The authors propose such a framework, describe general design principles and system architecture, and illustrate a multimodality neuroimaging data warehouse system implemented for clinical epilepsy research. The data warehouse system is built on top of a picture archiving and communication system (PACS) environment and applies an iterative object-oriented analysis and design (OOAD) approach and recognized data interface and design standards. The implementation is based on a Java CORBA (Common Object Request Broker Architecture) and Web-based architecture that separates the graphical user interface presentation, data warehouse business services, data staging area, and backend source systems into distinct software layers. To illustrate the practicality of the data warehouse system, the authors describe two distinct biomedical applications--namely, clinical diagnostic workup of multimodality neuroimaging cases and research data analysis and decision threshold on seizure foci lateralization. The image data warehouse framework can be modified and generalized for new application domains.
Design and Applications of a Multimodality Image Data Warehouse Framework
Wong, Stephen T.C.; Hoo, Kent Soo; Knowlton, Robert C.; Laxer, Kenneth D.; Cao, Xinhau; Hawkins, Randall A.; Dillon, William P.; Arenson, Ronald L.
2002-01-01
A comprehensive data warehouse framework is needed, which encompasses imaging and non-imaging information in supporting disease management and research. The authors propose such a framework, describe general design principles and system architecture, and illustrate a multimodality neuroimaging data warehouse system implemented for clinical epilepsy research. The data warehouse system is built on top of a picture archiving and communication system (PACS) environment and applies an iterative object-oriented analysis and design (OOAD) approach and recognized data interface and design standards. The implementation is based on a Java CORBA (Common Object Request Broker Architecture) and Web-based architecture that separates the graphical user interface presentation, data warehouse business services, data staging area, and backend source systems into distinct software layers. To illustrate the practicality of the data warehouse system, the authors describe two distinct biomedical applications—namely, clinical diagnostic workup of multimodality neuroimaging cases and research data analysis and decision threshold on seizure foci lateralization. The image data warehouse framework can be modified and generalized for new application domains. PMID:11971885
Biomimetics in the design of a robotic exoskeleton for upper limb therapy
NASA Astrophysics Data System (ADS)
Baniqued, Paul Dominick E.; Dungao, Jade R.; Manguerra, Michael V.; Baldovino, Renann G.; Abad, Alexander C.; Bugtai, Nilo T.
2018-02-01
Current methodologies in designing robotic exoskeletons for upper limb therapy simplify the complex requirements of the human anatomy. As a result, such devices tend to compromise safety and biocompatibility with the intended user. However, a new design methodology uses biological analogues as inspiration to address these technical issues. This approach follows that of biomimetics, a design principle that uses the extraction and transfer of useful information from natural morphologies and processes to solve technical design issues. In this study, a biomimetic approach was applied to the design of a 5-degree-of-freedom robotic exoskeleton for upper limb therapy. A review of biomimetics is first presented, along with its current contribution to the design of rehabilitation robots. With a proposed methodological framework, the design for an upper limb robotic exoskeleton was generated using CATIA software. The design was inspired by the morphology of the bones and the muscle force transmission of the upper limbs. Finally, the full design assembly presented integrates features extracted from the biological analogue. The successful execution of a biomimetic design methodology makes a case for providing safer and more biocompatible robots for rehabilitation.
Empirical studies of software design: Implications for SSEs
NASA Technical Reports Server (NTRS)
Krasner, Herb
1988-01-01
Implications for Software Engineering Environments (SEEs) are presented in viewgraph format for characteristics of projects studied; significant problems and crucial problem areas in software design for large systems; layered behavioral model of software processes; implications of field study results; software project as an ecological system; results of the LIFT study; information model of design exploration; software design strategies; results of the team design study; and a list of publications.
In silico design of ligand triggered RNA switches.
Findeiß, Sven; Hammer, Stefan; Wolfinger, Michael T; Kühnl, Felix; Flamm, Christoph; Hofacker, Ivo L
2018-04-13
This contribution sketches a work flow to design an RNA switch that is able to adopt two structural conformations in a ligand-dependent way. A well characterized RNA aptamer, i.e., one with a known Kd and adaptive structural features, is an essential ingredient of the described design process. We exemplify the principles using the well-known theophylline aptamer throughout this work. The aptamer in its ligand-binding competent structure represents one structural conformation of the switch, while an alternative fold that disrupts the binding-competent structure forms the other conformation. To keep it simple, we do not incorporate any regulatory mechanism to control transcription or translation. We elucidate a commonly used design process by explicitly dissecting and explaining the necessary steps in detail. We developed a novel objective function which specifies the mechanism of this simple, ligand-triggered riboswitch, and describe an extensive in silico analysis pipeline to evaluate important kinetic properties of the designed sequences. This protocol and the developed software can be easily extended or adapted to fit novel design scenarios and thus can serve as a template for future needs. Copyright © 2018. Published by Elsevier Inc.
Design for Verification: Using Design Patterns to Build Reliable Systems
NASA Technical Reports Server (NTRS)
Mehlitz, Peter C.; Penix, John; Koga, Dennis (Technical Monitor)
2003-01-01
Components so far have been mainly used in commercial software development to reduce time to market. While some effort has been spent on formal aspects of components, most of this was done in the context of programming language or operating system framework integration. As a consequence, increased reliability of composed systems is mainly regarded as a side effect of a more rigid testing of pre-fabricated components. In contrast to this, Design for Verification (D4V) puts the focus on component specific property guarantees, which are used to design systems with high reliability requirements. D4V components are domain specific design pattern instances with well-defined property guarantees and usage rules, which are suitable for automatic verification. The guaranteed properties are explicitly used to select components according to key system requirements. The D4V hypothesis is that the same general architecture and design principles leading to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the limitations of conventional reliability assurance measures, such as too large a state space or too many execution paths.
NASA Astrophysics Data System (ADS)
Saffar, Seha; Azni Jafar, Fairul; Jamaludin, Zamberi
2016-02-01
A case study was selected as the method to collect data in an actual industry setting. The study aimed to assess the influence of an automated material handling system in the automotive industry by proposing a new design of the integration system through simulation, and by analyzing the significant effects and influence of the system. The modelling tools used are CAD software (Delmia & Quest). Preliminary data gathering in phase 1 collects all related data from the actual industry situation and is expected to produce guidelines and constraints for designing the new integration system. In phase 2, a design concept is developed using 10 principles of design consideration for manufacturing. A full factorial design is used as the design of experiment in order to compare the measured performance of the integration system with that of the current system in the case study. From the experimental results, an ANOVA analysis is carried out to study the measured performance. The influence of the automated material handling system is thus expected to be visible in the improvement made to the system.
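To illustrate what a full factorial design of experiment looks like in practice, the sketch below enumerates every combination of a few hypothetical factors; the factor names and levels are assumptions, not those of the case study.

```python
from itertools import product

# Hypothetical factors and levels for a material-handling experiment;
# the names and values are illustrative, not taken from the case study.
factors = {
    "conveyor_speed": ["low", "high"],
    "buffer_size":    [5, 10, 20],
    "num_agvs":       [1, 2],
}

# Full factorial design: every combination of every level (2 x 3 x 2 = 12 runs).
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
for i, run in enumerate(runs, 1):
    # Each run would be simulated (e.g. in Quest) and its throughput recorded,
    # then the results fed into an ANOVA to find significant factors.
    print(i, run)
```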
Event Display for the Visualization of CMS Events
NASA Astrophysics Data System (ADS)
Bauerdick, L. A. T.; Eulisse, G.; Jones, C. D.; Kovalskyi, D.; McCauley, T.; Mrak Tadel, A.; Muelmenstaedt, J.; Osborne, I.; Tadel, M.; Tu, Y.; Yagil, A.
2011-12-01
During the last year the CMS experiment engaged in consolidation of its existing event display programs. The core of the new system is based on the Fireworks event display program, which was by design directly integrated with the CMS Event Data Model (EDM) and the light version of the software framework (FWLite). The Event Visualization Environment (EVE) of the ROOT framework is used to manage a consistent set of 3D and 2D views, selection, user-feedback and user-interaction with the graphics windows; several EVE components were developed by CMS in collaboration with the ROOT project. In event display operation, simple plugins are registered into the system to perform conversion from EDM collections into their visual representations, which are then managed by the application. Full event navigation and filtering as well as collection-level filtering is supported. The same data-extraction principle can also be applied when Fireworks eventually operates as a service within the full software framework.
NASA Technical Reports Server (NTRS)
Vranish, John M.
1991-01-01
A capacitive proximity/tactile sensor with unique performance capabilities ('capaciflector' or capacitive reflector) is being developed by NASA/Goddard Space Flight Center (GSFC) for use on robots and payloads in space in the interests of safety, efficiency, and ease of operation. Specifically, this sensor will permit robots and their attached payloads to avoid collisions in space with humans and other objects and to dock these payloads in a cluttered environment. The sensor is simple, robust, and inexpensive to manufacture with obvious and recognized commercial possibilities. Accordingly, NASA/GSFC, in conjunction with industry, is embarking on an effort to 'spin' this technology off into the private sector. This effort includes prototypes aimed at commercial applications. The principles of operation of these prototypes are described along with hardware, software, modelling, and test results. The hardware description includes both the physical sensor in terms of a flexible printed circuit board and the electronic circuitry. The software description will include filtering and detection techniques. The modelling will involve finite element electric field analysis and will underline techniques used for design optimization.
Computer Assisted REhabilitation (CARE) Lab: A novel approach towards Pediatric Rehabilitation 2.0.
Olivieri, Ivana; Meriggi, Paolo; Fedeli, Cristina; Brazzoli, Elena; Castagna, Anna; Roidi, Marina Luisa Rodocanachi; Angelini, Lucia
2018-01-01
Pediatric Rehabilitation therapists have always worked using a variety of off-the-shelf or custom-made objects and devices, more recently including computer based systems. These Information and Communication Technology (ICT) solutions vary widely in complexity, from easy-to-use interactive videogame consoles originally intended for entertainment purposes to sophisticated systems specifically developed for rehabilitation. This paper describes the principles underlying an innovative "Pediatric Rehabilitation 2.0" approach, based on the combination of suitable ICT solutions and traditional rehabilitation, which has been progressively refined while building up and using a computer-assisted rehabilitation laboratory. These principles are thus summarized in the acronym EPIQ, to account for the terms Ecological, Personalized, Interactive and Quantitative. The paper also presents the laboratory, which has been designed to meet the children's rehabilitation needs and to empower therapists in their work. The laboratory is equipped with commercial hardware and specially developed software called VITAMIN: a virtual reality platform for motor and cognitive rehabilitation.
Deep space network software cost estimation model
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1981-01-01
A parametric software cost estimation model prepared for Deep Space Network (DSN) Data Systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit DSN software life cycle statistics. The estimation model output scales a standard DSN Work Breakdown Structure skeleton, which is then input into a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
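As a rough illustration of a parametric software cost model of this kind (not the DSN model itself), the sketch below scales a size-based effort estimate by multipliers that would come from questionnaire-style responses; the coefficients and driver values are illustrative only.

```python
def estimate_effort(kloc, multipliers, a=2.8, b=1.05):
    """Parametric effort estimate in person-months.

    effort = a * KLOC**b * product(cost drivers), in the spirit of classic
    parametric models; the coefficients and driver values here are purely
    illustrative, not the calibrated DSN values.
    """
    effort = a * kloc ** b
    for m in multipliers.values():
        effort *= m
    return effort

# Multipliers derived from questionnaire-style responses (hypothetical values).
drivers = {"required_reliability": 1.15, "team_experience": 0.90, "tool_support": 0.95}
print(f"{estimate_effort(32.0, drivers):.1f} person-months")
```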
Yaghoobpour Tari, Shima; Wachowicz, Keith; Gino Fallone, B
2017-04-21
A prototype rotating hybrid magnetic resonance imaging system and linac has been developed to allow for simultaneous imaging and radiation delivery parallel to B0. However, the design of a compact magnet capable of rotation in a small vault with sufficient patient access and a typical clinical source-to-axis distance (SAD) is challenging. This work presents a novel superconducting magnet design as a proof of concept that allows for a reduced SAD and ample patient access by moving the superconducting coils to the side of the yoke. The yoke and pole-plate structures are shaped to direct the magnetic flux appropriately. The outer surface of the pole plate is optimized subject to the minimization of a cost function, which evaluates the uniformity of the magnetic field over an ellipsoid. The magnetic field calculations required in this work are performed with the 3D finite element method software package Opera-3D. Each tentative design strategy is virtually modeled in this software package, which is externally controlled by MATLAB, with its key geometries defined as variables. The optimization variables are the thickness of the pole plate at control points distributed over the pole plate surface. A novel design concept, a superconducting non-axial magnet, is introduced, which could create a large uniform B0 magnetic field with fewer geometric restrictions. This non-axial 0.5 T superconducting magnet has a moderately reduced SAD of 123 cm and a vertical patient opening of 68 cm. This work is presented as a proof of principle to investigate the feasibility of a non-axial magnet with the coils located around the yoke, and the results encourage future design optimizations to maximize the benefits of this non-axial design.
NASA Astrophysics Data System (ADS)
Yaghoobpour Tari, Shima; Wachowicz, Keith; Fallone, B. Gino
2017-04-01
A prototype rotating hybrid magnetic resonance imaging system and linac has been developed to allow for simultaneous imaging and radiation delivery parallel to B0. However, the design of a compact magnet capable of rotation in a small vault with sufficient patient access and a typical clinical source-to-axis distance (SAD) is challenging. This work presents a novel superconducting magnet design as a proof of concept that allows for a reduced SAD and ample patient access by moving the superconducting coils to the side of the yoke. The yoke and pole-plate structures are shaped to direct the magnetic flux appropriately. The outer surface of the pole plate is optimized subject to the minimization of a cost function, which evaluates the uniformity of the magnetic field over an ellipsoid. The magnetic field calculations required in this work are performed with the 3D finite element method software package Opera-3D. Each tentative design strategy is virtually modeled in this software package, which is externally controlled by MATLAB, with its key geometries defined as variables. The optimization variables are the thickness of the pole plate at control points distributed over the pole plate surface. A novel design concept, a superconducting non-axial magnet, is introduced, which could create a large uniform B0 magnetic field with fewer geometric restrictions. This non-axial 0.5 T superconducting magnet has a moderately reduced SAD of 123 cm and a vertical patient opening of 68 cm. This work is presented as a proof of principle to investigate the feasibility of a non-axial magnet with the coils located around the yoke, and the results encourage future design optimizations to maximize the benefits of this non-axial design.
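A minimal sketch of the kind of uniformity cost function described above follows: it scores a candidate pole-plate shape by the peak-to-peak variation of B0 sampled over the ellipsoid, in parts per million. Using ppm peak-to-peak is an assumption for illustration; the authors' exact cost function and the Opera-3D/MATLAB coupling are not reproduced here.

```python
import numpy as np

def uniformity_cost(field_samples_tesla):
    """Peak-to-peak inhomogeneity of B0 over sample points, in parts per million.

    The samples would come from the FEM solver evaluated on an ellipsoidal
    surface; ppm peak-to-peak is one common uniformity metric, assumed here
    rather than taken from the authors' exact definition.
    """
    b = np.asarray(field_samples_tesla)
    return (b.max() - b.min()) / b.mean() * 1e6

# Hypothetical field values on the ellipsoid for one candidate pole-plate shape.
samples = np.array([0.50001, 0.50003, 0.49999, 0.50002, 0.50000])
print(f"{uniformity_cost(samples):.1f} ppm")   # an optimizer would minimize this
```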
Design of mini-multi-gas monitoring system based on IR absorption
NASA Astrophysics Data System (ADS)
Tan, Qiu-lin; Zhang, Wen-dong; Xue, Chen-yang; Xiong, Ji-jun; Ma, You-chun; Wen, Fen
2008-07-01
In this paper, a novel non-dispersive infrared (IR) gas detection system is described. Conventional devices typically include several primary components: a broadband source (usually an incandescent filament), a rotating chopper shutter, a narrow-band filter, a sample tube and a detector. In contrast, we mainly use a miniature multi-channel detector, electrical modulation, and a miniature gas-cell structure. To address the problems of gas accidents in coal mines, as well as household safety issues arising from gas use, this new integrated, miniaturized IR detection system with no moving parts has been developed. It is based on the principle that certain gases absorb infrared radiation at specific (and often unique) wavelengths. The infrared detection optics principle used in developing the system is analyzed. The idea of multi-gas detection is introduced and developed through the analysis of single-gas detection. Through research on the design of the cell structure, an integrated, miniaturized gas cell has been devised. With a single-chip microcomputer (SCM) providing the intelligence, the functional block diagram of the gas detection system is designed, and its hardware and software are analyzed and developed. Data transmission over a controller area network (CAN) bus and a wireless data transmission mode are explained. The system meets the technical requirements of low power consumption, small volume and wide measurement range, and is able to realize multi-gas detection.
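The absorption principle behind such a system can be illustrated with the Beer-Lambert law: the drop in transmitted IR intensity at a gas-specific wavelength is converted to a concentration. The sketch below shows that calculation with placeholder values; the absorptivity and path length are not calibrated constants for any particular gas or filter band.

```python
import math

def gas_concentration(i_measured, i_reference, absorptivity, path_length_cm):
    """Estimate gas concentration from IR transmission via the Beer-Lambert law.

    absorbance A = -log10(I / I0) = epsilon * c * L, so c = A / (epsilon * L).
    The absorptivity below is a placeholder, not a calibrated constant for
    any specific gas or filter band.
    """
    absorbance = -math.log10(i_measured / i_reference)
    return absorbance / (absorptivity * path_length_cm)

# Example: 80% transmission through a 10 cm miniature gas cell.
print(gas_concentration(i_measured=0.8, i_reference=1.0,
                        absorptivity=0.05, path_length_cm=10.0))
```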
Information modeling system for blast furnace control
NASA Astrophysics Data System (ADS)
Spirin, N. A.; Gileva, L. Y.; Lavrov, V. V.
2016-09-01
Modern iron and steel works are, as a rule, equipped with powerful distributed control systems (DCS) and databases. Implementation of a DCS solves the problems of storage, control, protection, entry, editing and retrieval of information, as well as generation of required reporting data. The most advanced and promising approach is to use decision support information technologies based on a complex of mathematical models. A model-based decision support system for control of blast furnace smelting has been designed and put into operation. The basis of the model system is a complex of mathematical models created using the principle of natural mathematical modeling. This principle provides for the construction of mathematical models on two levels. The first-level model is a basic state model which makes it possible to assess the vector of system parameters using field data and blast furnace operation results. It is also used to calculate the adjustment (adaptation) coefficients of the predictive block of the system. The second-level model is a predictive model designed to assess the design parameters of the blast furnace process when there are changes in melting conditions relative to its current state. The tasks for which software has been developed are described. Characteristics of the main subsystems of the blast furnace process as an object of modeling and control - the thermal state of the furnace, blast, gas dynamic and slag conditions of blast furnace smelting - are presented.
Visual Design Principles: An Empirical Study of Design Lore
ERIC Educational Resources Information Center
Kimball, Miles A.
2013-01-01
Many books, designers, and design educators talk about visual design principles such as balance, contrast, and alignment, but with little consistency. This study uses empirical methods to explore the lore surrounding design principles. The study took the form of two stages: a quantitative literature review to determine what design principles are…
Students' Different Understandings of Class Diagrams
ERIC Educational Resources Information Center
Boustedt, Jonas
2012-01-01
The software industry needs well-trained software designers and one important aspect of software design is the ability to model software designs visually and understand what visual models represent. However, previous research indicates that software design is a difficult task to many students. This article reports empirical findings from a…
NASA Astrophysics Data System (ADS)
Gunduz, Mustafa Emre
Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization processes may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study aims to propose a method that enables engineers in some design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that, when the system is set up properly, the CAD model of the design, including all subsystems, will be automatically updated as soon as a new part or assembly is added to the design, or when an analysis and/or an optimization is performed and the geometry needs to be modified. Designers and engineers will be involved only in checking the latest design for errors or adding/removing features. Such a design process will take dramatically less time to complete; therefore, it should reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960s. The rotor is already an effective design with novel features. However, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the most crucial design variables of the objective function are identified. With these variables, the Latin Hypercube Sampling method is used to probe the design space for several local minima and maxima. After analysis of numerous samples, an optimum configuration of the design that is more stable than the initial design is reached. The above process requires several software tools: CATIA as the CAD tool, ANSYS as the FEA tool, VABS for obtaining the cross-sectional structural properties, and DYMORE for the frequency and dynamic analysis of the rotor. MATLAB codes are also employed to generate input files and read output files of DYMORE. All these tools are connected using ModelCenter.
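A brief sketch of Latin Hypercube Sampling, the space-probing step described above, follows; the design-variable names and bounds are invented for illustration and are not the rotor's actual parameters.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Latin Hypercube Sample of a design space.

    Each variable's range is split into n_samples equal strata and exactly one
    point is drawn per stratum; the strata are then shuffled independently per
    variable so every sample covers each dimension evenly.
    """
    rng = np.random.default_rng(rng)
    n_vars = len(bounds)
    # One stratified uniform draw per (sample, variable).
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        u[:, j] = rng.permutation(u[:, j])   # decorrelate the columns
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# Hypothetical variables: blade stiffness, tip mass, lag damper coefficient.
samples = latin_hypercube(20, bounds=[(1e4, 5e4), (0.5, 2.0), (100.0, 400.0)], rng=42)
print(samples.shape)   # (20, 3); each row is one candidate design to analyze
```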
Towards a supported common NEAMS software stack
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cormac Garvey
2012-04-01
The NEAMS IPSCs are developing multidimensional, multiphysics, multiscale simulation codes based on first principles that will be capable of predicting all aspects of current and future nuclear reactor systems. These new breeds of simulation codes will include rigorous verification, validation and uncertainty quantification checks to quantify the accuracy and quality of the simulation results. The resulting NEAMS IPSC simulation codes will be an invaluable tool in designing the next generation of nuclear reactors and will also contribute to a speedier process in the acquisition of licenses from the NRC for new reactor designs. Due to the high resolution of the models, the complexity of the physics and the added computational resources needed to quantify the accuracy and quality of the results, the NEAMS IPSC codes will require large HPC resources to carry out the production simulation runs.
Use of computers in dysmorphology.
Diliberti, J H
1988-01-01
As a consequence of the increasing power and decreasing cost of digital computers, dysmorphologists have begun to explore a wide variety of computerised applications in clinical genetics. Of considerable interest are developments in the areas of syndrome databases, expert systems, literature searches, image processing, and pattern recognition. Each of these areas is reviewed from the perspective of the underlying computer principles, existing applications, and the potential for future developments. Particular emphasis is placed on the analysis of the tasks performed by the dysmorphologist and the design of appropriate tools to facilitate these tasks. In this context the computer and associated software are considered paradigmatically as tools for the dysmorphologist and should be designed accordingly. Continuing improvements in the ability of computers to manipulate vast amounts of data rapidly make the development of increasingly powerful tools for the dysmorphologist highly probable. PMID:3050092
NASA Astrophysics Data System (ADS)
Amalia, A.; Gunawan, D.; Hardi, S. M.; Rachmawati, D.
2018-02-01
The Internal Quality Assurance System (in Indonesian: SPMI, Sistem Penjaminan Mutu Internal) is a systemic quality assurance activity for higher education in Indonesia. SPMI must be carried out by all higher education institutions in Indonesia based on the Regulation of the Minister of Research, Technology and Higher Education of the Republic of Indonesia Number 62 of 2016. Implementation of SPMI must follow the SPMI principles: independent, standardized, accurate, well planned and sustainable, documented, and systematic. To support the SPMI cycle properly, universities need supporting software to monitor all SPMI activities. In reality, however, many universities struggle to build such an SPMI monitoring system; one obstacle is that determining system requirements that support the SPMI principles is difficult. In this paper, we focus on the initial phase of requirements engineering, namely requirements elicitation. Unlike other methods that collect system requirements from users and stakeholders, we derive the system requirements for the SPMI principles from the SPMI guideline book. The results of this paper can be used as an option when determining SPMI software requirements, and can also help developers and users understand the SPMI scenario, thereby overcoming misunderstandings between the two parties.
Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process
NASA Technical Reports Server (NTRS)
Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom
1997-01-01
The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product is what determines its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper reviews methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.
NASA Technical Reports Server (NTRS)
Shaykhian, Gholam Ali; Baggs, Rhoda
2007-01-01
In the early problem-solution era of software programming, functional decompositions were mainly used to design and implement software solutions. In functional decompositions, functions and data are introduced as two separate entities during the design phase, and are followed as such in the implementation phase. Functional decompositions make use of refactoring through optimizing the algorithms, grouping similar functionalities into common reusable functions, and using abstract representations of data where possible; all these are done during the implementation phase. This paper advocates the use of object-oriented methodologies and design patterns as the centerpieces of refactoring software solutions. Refactoring software is a method of changing software design while explicitly preserving its external functionalities. The combined use of object-oriented methodologies and design patterns to refactor should also reduce overall software life cycle cost while improving the software.
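To make the idea of refactoring with design patterns concrete, the sketch below shows a small, generic example (not drawn from the paper) in which branching logic is restructured into a Strategy-style design while the external behavior is explicitly preserved.

```python
# Before refactoring: branching logic embedded in one function (harder to extend).
def shipping_cost_before(order_total, method):
    if method == "ground":
        return 5.0 if order_total < 50 else 0.0
    elif method == "express":
        return 15.0
    raise ValueError(method)

# After refactoring: a Strategy-style design keeps external behavior identical
# while each pricing rule becomes its own replaceable object.
class GroundShipping:
    def cost(self, order_total):
        return 5.0 if order_total < 50 else 0.0

class ExpressShipping:
    def cost(self, order_total):
        return 15.0

STRATEGIES = {"ground": GroundShipping(), "express": ExpressShipping()}

def shipping_cost_after(order_total, method):
    return STRATEGIES[method].cost(order_total)

# External functionality is preserved by the refactoring.
assert shipping_cost_before(30, "ground") == shipping_cost_after(30, "ground")
assert shipping_cost_before(80, "express") == shipping_cost_after(80, "express")
```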
Rizo, Javier; Zeng, Emily; Kientz, Julie A; Ries, Richard; Otis, Chad; Hernandez, Kayla
2018-01-01
Background Smoking rates in the United States have been reduced in the past decades to 15% of the general population. However, up to 88% of people with psychiatric symptoms still smoke, leading to high rates of disease and mortality. Therefore, there is a great need to develop smoking cessation interventions that have adequate levels of usability and can reach this population. Objective The objective of this study was to report the rationale, ideation, design, user research, and final specifications of a novel smoking cessation app for people with serious mental illness (SMI) that will be tested in a feasibility trial. Methods We used a variety of user-centered design methods and materials to develop the tailored smoking cessation app. This included expert panel guidance, a set of design principles and theory-based smoking cessation content, development of personas and paper prototyping, usability testing of the app prototype, establishment of the app's core vision and design specification, and collaboration with a software development company. Results We developed Learn to Quit, a smoking cessation app designed and tailored to individuals with SMI that incorporates the following: (1) evidence-based smoking cessation content from Acceptance and Commitment Therapy and US Clinical Practice Guidelines for smoking cessation aimed at providing skills for quitting while addressing mental health symptoms, (2) a set of behavioral principles to increase retention and comprehension of smoking cessation content, (3) a gamification component to encourage and sustain app engagement during a 14-day period, (4) an app structure and layout designed to minimize usability errors in people with SMI, and (5) a set of stories and visuals that communicate smoking cessation concepts and skills in simple terms. Conclusions Despite its increasing importance, the design and development of mHealth technology is typically underreported, hampering scientific innovation. This report describes the systematic development of the first smoking cessation app tailored to people with SMI, a population with very high rates of nicotine addiction, and offers new design strategies to engage this population. mHealth developers in smoking cessation and related fields could benefit from a design strategy that capitalizes on the role of visual engagement, storytelling, and the systematic application of behavior analytic principles to deliver evidence-based content. PMID:29339346
A Full-Featured User Friendly CO2-EOR and Sequestration Planning Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savage, Bill
This project addressed the development of an integrated software solution that includes a graphical user interface, numerical simulation, visualization tools and optimization processes for reservoir simulation modeling of CO2-EOR. The objective was to assist the industry in the development of domestic energy resources by expanding the application of CO2-EOR technologies, and ultimately to maximize the CO2 sequestration capacity of the U.S. The software resulted in a field-ready application for the industry to address the current CO2-EOR technologies. The software has been made available to the public without restrictions and with user friendly operating documentation and tutorials. The software (executable only) can be downloaded from NITEC's website at www.nitecllc.com. This integrated solution enables the design, optimization and operation of CO2-EOR processes for small and mid-sized operators, who currently cannot afford the expensive, time intensive solutions that the major oil companies enjoy. Based on one estimate, small oil fields comprise 30% of the total economic resource potential for the application of CO2-EOR processes in the U.S. This corresponds to 21.7 billion barrels of incremental, technically recoverable oil using the current "best practices", and 31.9 billion barrels using "next-generation" CO2-EOR techniques. The project included a Case Study of a prospective CO2-EOR candidate field in Wyoming by a small independent, Linc Energy Petroleum Wyoming, Inc. NITEC LLC has an established track record of developing innovative and user friendly software. The Principal Investigator is an experienced manager and engineer with expertise in software development, numerical techniques, and GUI applications. Unique, presently-proprietary NITEC technologies have been integrated into this application to further its ease of use and technical functionality.
NASA Astrophysics Data System (ADS)
Salleh, Khalijah Mohd; Abdullah, Abu Bakar Bin
2008-05-01
An explorative study was carried out to confirm Malaysian Physics teachers' perception that Archimedes' principle is a difficult topic for secondary level students. The interview method was used for data collection. The study sample comprised nine teachers from national secondary schools in Miri, Sarawak. The data were analysed qualitatively using the Atlas-ti version 5.2 software. The findings of the study showed that i) Archimedes' principle, as compared to Bernoulli's and Pascal's, is the most difficult principle of hydrodynamics for students, ii) more time was given to the teaching and learning (TL) of Archimedes' principle compared to the other two principles, and iii) the major TL problems include conceptual understanding, application of physics principles and ideas, and lack of mathematical skills. These findings indicate the need to develop corresponding instructional materials and learning kits that can assist students' understanding of Archimedes' principle.
An Exploratory Review of Design Principles in Constructivist Gaming Learning Environments
ERIC Educational Resources Information Center
Rosario, Roberto A. Munoz; Widmeyer, George R.
2009-01-01
Creating a design theory for a Constructivist Gaming Learning Environment necessitates, among other things, the establishment of design principles. These principles have the potential to help designers produce games in which users achieve higher levels of learning. This paper focuses on twelve design principles: Probing, Distributed, Multiple Routes,…
Software-engineering challenges of building and deploying reusable problem solvers.
O'Connor, Martin J; Nyulas, Csongor; Tu, Samson; Buckeridge, David L; Okhmatovskaia, Anna; Musen, Mark A
2009-11-01
Problem solving methods (PSMs) are software components that represent and encode reusable algorithms. They can be combined with representations of domain knowledge to produce intelligent application systems. A goal of research on PSMs is to provide principled methods and tools for composing and reusing algorithms in knowledge-based systems. The ultimate objective is to produce libraries of methods that can be easily adapted for use in these systems. Despite the intuitive appeal of PSMs as conceptual building blocks, in practice, these goals are largely unmet. There are no widely available tools for building applications using PSMs and no public libraries of PSMs available for reuse. This paper analyzes some of the reasons for the lack of widespread adoption of PSM techniques and illustrates our analysis by describing our experiences developing a complex, high-throughput software system based on PSM principles. We conclude that many fundamental principles in PSM research are useful for building knowledge-based systems. In particular, the task-method decomposition process, which provides a means for structuring knowledge-based tasks, is a powerful abstraction for building systems of analytic methods. However, despite the power of PSMs in the conceptual modeling of knowledge-based systems, software engineering challenges have been seriously underestimated. The complexity of integrating control knowledge modeled by developers using PSMs with the domain knowledge that they model using ontologies creates a barrier to widespread use of PSM-based systems. Nevertheless, the surge of recent interest in ontologies has led to the production of comprehensive domain ontologies and of robust ontology-authoring tools. These developments present new opportunities to leverage the PSM approach.
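For readers unfamiliar with the task-method decomposition the authors describe, the following minimal Python sketch (not taken from the paper; all task and method names are hypothetical) illustrates the abstraction: a task delegates to an interchangeable method, which either solves it directly against domain knowledge or decomposes it into subtasks.

```python
# Minimal sketch of task-method decomposition, the PSM abstraction described
# above. Task and method names are hypothetical and not taken from the paper.

class Method:
    """A reusable problem-solving method: either solves a task directly
    or decomposes it into an ordered chain of subtasks."""
    def __init__(self, name, solve=None, subtasks=None):
        self.name = name
        self.solve = solve              # callable(domain_knowledge, inputs) -> result
        self.subtasks = subtasks or []

class Task:
    """A knowledge-based task with interchangeable candidate methods."""
    def __init__(self, name):
        self.name = name
        self.methods = []

    def register(self, method):
        self.methods.append(method)

    def run(self, domain_knowledge, inputs):
        method = self.methods[0]        # trivial method-selection strategy
        if method.solve is not None:
            return method.solve(domain_knowledge, inputs)
        result = inputs
        for sub in method.subtasks:     # feed each subtask the previous result
            result = sub.run(domain_knowledge, result)
        return result

# Example: a toy "detect-outbreak" task decomposed into two subtasks,
# each solved by a simple method against a toy domain knowledge base.
classify = Task("classify-cases")
classify.register(Method("threshold-classifier",
                         solve=lambda kb, x: [c for c in x if c >= kb["case_threshold"]]))
aggregate = Task("aggregate-counts")
aggregate.register(Method("simple-count", solve=lambda kb, x: len(x)))

detect = Task("detect-outbreak")
detect.register(Method("decompose", subtasks=[classify, aggregate]))

knowledge = {"case_threshold": 3}
print(detect.run(knowledge, [1, 4, 5, 2, 7]))   # -> 3
```

The point of the pattern is that methods are reusable components recombined with different domain knowledge, which is the property the authors argue is hard to realize at software-engineering scale.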
Rapid Development of Custom Software Architecture Design Environments
1999-08-01
This dissertation describes a new approach to capturing and using architectural design expertise in software architecture design environments. A language and tools are presented for capturing and encapsulating software architecture design expertise within a conceptual framework of architectural styles and design rules. The design expertise thus captured is supported with an incrementally configurable software architecture design environment.
ERIC Educational Resources Information Center
McCabe, Deborah Brown; Meuter, Matthew L.
2011-01-01
There has been an explosion of classroom technologies, yet there is a lack of research investigating the connection between classroom technology and student learning. This research project explores faculty usage of classroom-based course management software, student usage and opinions of these software tools, and an exploration of whether or not…
Open Data and Open Science for better Research in the Geo and Space Domain
NASA Astrophysics Data System (ADS)
Ritschel, B.; Seelus, C.; Neher, G.; Iyemori, T.; Koyama, Y.; Yatagai, A. I.; Murayama, Y.; King, T. A.; Hughes, S.; Fung, S. F.; Galkin, I. A.; Hapgood, M. A.; Belehaki, A.
2015-12-01
The main open data principles were worked out in the run-up to, and finally adopted in, the Open Data Charter at the G8 summit in Lough Erne, Northern Ireland, in June 2013. Important principles also apply to science data, such as Open Data by Default; Quality and Quantity; Useable by All; Releasing Data for Improved Governance; and Releasing Data for Innovation. There is also an explicit relationship to such high-value areas as earth observation, education, and geospatial data. The European Union implementation plan of the Open Data Charter identifies, among other things, objectives such as making data available in an open format, enabling semantic interoperability, ensuring quality, documentation and, where appropriate, reconciliation across different data sources, implementing software solutions allowing easy management, publication, or visualization of datasets, and simplifying clearance of intellectual property rights. Open Science is not just a list of long-known principles but stands for many initiatives and projects aimed at better handling of scientific data and openly shared scientific knowledge. It is also about transparency in methodology and data collection, availability and reuse of scientific data, public accessibility of scientific communication, and the use of social media to facilitate scientific collaboration. Some projects concentrate on open sharing of free and open-source software and even hardware in the form of processing capabilities. In addition, questions about the mashup of data and publications and an open peer-review process are addressed. Following the principles of open data and open science, the newest results of the collaboration efforts in mashing up the data servers related to the Japanese IUGONET, the European Union ESPAS, and the GFZ ISDC semantic Web projects are presented here. The semantic-Web-based approach for the mashup focuses on the design and implementation of a common but still distributed data catalog based on semantic interoperability, including transparent access to data in relational databases. References: https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/207772/Open_Data_Charter.pdf and http://www.openscience.org/blog/wp-content/uploads/2013/06/OpenSciencePoster.pdf
Experimental software engineering: Seventeen years of lessons in the SEL
NASA Technical Reports Server (NTRS)
Mcgarry, Frank E.
1992-01-01
Seven key principles developed by the Software Engineering Laboratory (SEL) at the Goddard Space Flight Center (GSFC) of the National Aeronautics and Space Administration (NASA) are described. For the past 17 years, the SEL has been experimentally analyzing the development of production software as varying techniques and methodologies are applied in this one environment. The SEL has collected, archived, and studied detailed measures from more than 100 flight dynamics projects, thereby gaining significant insight into the effectiveness of numerous software techniques, as well as extensive experience in the overall effectiveness of 'Experimental Software Engineering'. This experience has helped formulate follow-on studies in the SEL, and it has helped other software organizations better understand just what can be accomplished and what cannot be accomplished through experimentation.
Intelligent Agents for Design and Synthesis Environments: My Summary
NASA Technical Reports Server (NTRS)
Norvig, Peter
1999-01-01
This presentation gives a summary of intelligent agents for design synthesis environments. We'll start with the conclusions, and work backwards to justify them. First, an important assumption is that agents (whatever they are) are good for software engineering. This is especially true for software that operates in an uncertain, changing environment. The "real world" of physical artifacts is like that: uncertain in what we can measure, changing in that things are always breaking down, and we must interact with non-software entities. The second point is that software engineering techniques can contribute to good design. There may have been a time when we wanted to build simple artifacts containing little or no software. But modern aircraft and spacecraft are complex, and rely on a great deal of software. So better software engineering leads to better designed artifacts, especially when we are designing a series of related artifacts and can amortize the costs of software development. The third point is that agents are especially useful for design tasks, above and beyond their general usefulness for software engineering, and the usefulness of software engineering to design.
ERIC Educational Resources Information Center
Journal of Chemical Education, 1989
1989-01-01
Reviews three chemistry software programs at the high school and college general chemistry level for the Apple II family. Includes "Chemical Nomenclature and Balancing Equations,""Principles of Stoichiometry," and "Solubility." (MVL)
Why and how Mastering an Incremental and Iterative Software Development Process
NASA Astrophysics Data System (ADS)
Dubuc, François; Guichoux, Bernard; Cormery, Patrick; Mescam, Jean Christophe
2004-06-01
One of the key issues regularly mentioned in the current software crisis of the space domain is related to the software development process that must be performed while the system definition is not yet frozen. This is especially true for complex systems like launchers or space vehicles. Several more or less mature solutions are under study by EADS SPACE Transportation and are presented in this paper. The basic principle is to develop the software through an iterative and incremental process instead of the classical waterfall approach, with the following advantages: - It permits systematic management and incorporation of requirements changes over the development cycle at minimal cost. As far as possible, the most dimensioning requirements are analyzed and developed in priority, so that the architecture concept can be validated very early without the details. - A software prototype is available very quickly. It improves the communication between system and software teams, as it enables checking very early and efficiently the common understanding of the system requirements. - It allows the software team to complete a whole development cycle very early, and thus to become quickly familiar with the software development environment (methodology, technology, tools...). This is particularly important when the team is new, or when the environment has changed since the previous development. In any case, it greatly improves the learning curve of the software team. These advantages seem very attractive, but efficiently mastering an iterative development process is not so easy and raises difficulties such as: - How to freeze one configuration of the system definition as a development baseline, while most of the system requirements are completely and naturally unstable? - How to distinguish stable/unstable and dimensioning/standard requirements? - How to plan the development of each increment? - How to link classical waterfall development milestones with an iterative approach: when should the classical reviews be performed: Software Specification Review? Preliminary Design Review? Critical Design Review? Code Review? Etc. Several solutions envisaged or already deployed by EADS SPACE Transportation will be presented, both from a methodological and a technological point of view: - How the MELANIE EADS ST internal methodology improves the concurrent engineering activities between GNC, software and simulation teams in a very iterative and reactive way. - How the CMM approach can help by better formalizing the Requirements Management and Planning processes. - How automatic code generation with “certified” tools (SCADE) can still dramatically shorten the development cycle. The presentation concludes with an evaluation of the cost and schedule reduction based on a pilot application, comparing figures from two similar projects: one following the classical waterfall process, the other an iterative and incremental approach.
SEI Software Engineering Education Directory.
1987-02-01
Directory entries list software design and development courses (e.g., Software System Development and Laboratory, CS 480/480L; Software Design and Development, CS 424) using the textbook Software Design and Development by Philip Gilbert, offered on systems including CDC Cyber 170/750 and 170/760, DEC PDP 11/44, PRIME, AT&T 3B5, IBM PC, XT, and RT, Macintosh, and VAX 8300.
PD5: a general purpose library for primer design software.
Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda
2013-01-01
Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantees access to source code and allows redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.
Application of Design Patterns in Refactoring Software Design
NASA Technical Reports Server (NTRS)
Baggs, Rhoda; Shaykhian, Gholam Ali
2007-01-01
Refactoring software design is a method of changing software design while explicitly preserving its unique design functionalities. Presented approach is to utilize design patterns as the basis for refactoring software design. Comparison of a design solution will be made through C++ programming language examples to exploit this approach. Developing reusable component will be discussed, the paper presents that the construction of such components can diminish the added burden of both refactoring and the use of design patterns.
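The paper's examples are in C++; as a language-neutral illustration of the general idea (this is a sketch of ours, not the paper's code), the following Python fragment refactors a conditional-heavy function into the Strategy pattern while preserving its observable behavior.

```python
# Illustrative sketch (not the paper's C++ example): refactoring a
# conditional-heavy function into the Strategy pattern while preserving
# its external behavior.

# Before refactoring: behavior selected by an if/elif chain.
def compute_shipping_before(order_total, mode):
    if mode == "standard":
        return 5.0
    elif mode == "express":
        return 5.0 + 0.1 * order_total
    raise ValueError(mode)

# After refactoring: each behavior is a reusable strategy component.
class ShippingStrategy:
    def cost(self, order_total):
        raise NotImplementedError

class StandardShipping(ShippingStrategy):
    def cost(self, order_total):
        return 5.0

class ExpressShipping(ShippingStrategy):
    def cost(self, order_total):
        return 5.0 + 0.1 * order_total

def compute_shipping_after(order_total, strategy):
    return strategy.cost(order_total)

# The refactoring preserves functionality:
assert compute_shipping_before(100, "express") == compute_shipping_after(100, ExpressShipping())
```

New shipping behaviors can now be added as new strategy classes without touching existing code, which is the kind of reusable component the abstract refers to.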
In-Memory Graph Databases for Web-Scale Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castellana, Vito G.; Morari, Alessandro; Weaver, Jesse R.
RDF databases have emerged as one of the most relevant ways for organizing, integrating, and managing exponentially growing, often heterogeneous, and not rigidly structured data for a variety of scientific and commercial fields. In this paper we discuss the solutions integrated in GEMS (Graph database Engine for Multithreaded Systems), a software framework for implementing RDF databases on commodity, distributed-memory high-performance clusters. Unlike the majority of current RDF databases, GEMS has been designed from the ground up to primarily employ graph-based methods. This is reflected in all the layers of its stack. The GEMS framework is composed of: a SPARQL-to-C++ compiler, a library of data structures and related methods to access and modify them, and a custom runtime providing lightweight software multithreading, network message aggregation, and a partitioned global address space. We provide an overview of the framework, detailing its components and how they have been closely designed and customized to address issues of graph methods applied to large-scale datasets on clusters. We discuss in detail the principles that enable automatic translation of queries (expressed in SPARQL, the query language of choice for RDF databases) to graph methods, and identify differences with respect to other RDF databases.
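As a hedged illustration of the kind of SPARQL basic graph pattern that such a compiler must translate into graph traversals, the short Python example below uses rdflib (unrelated to GEMS, and used here only for demonstration) to match a two-triple pattern; GEMS would instead emit equivalent C++ graph-walking code.

```python
# Hedged illustration: a SPARQL basic graph pattern of the kind a
# SPARQL-to-graph-methods compiler must translate into graph traversals.
# rdflib is used here purely for demonstration; it is not part of GEMS.
from rdflib import Graph

data = """
@prefix ex: <http://example.org/> .
ex:alice ex:knows ex:bob .
ex:bob   ex:knows ex:carol .
ex:alice ex:worksAt ex:acme .
"""

g = Graph()
g.parse(data=data, format="turtle")

# A two-triple basic graph pattern: friends-of-friends of ex:alice.
query = """
PREFIX ex: <http://example.org/>
SELECT ?fof WHERE {
    ex:alice ex:knows ?friend .
    ?friend  ex:knows ?fof .
}
"""
for row in g.query(query):
    print(row.fof)   # -> http://example.org/carol
```

Each triple pattern corresponds to an edge traversal; joining them on the shared variable ?friend is what a graph-based engine implements as a neighbor-expansion step rather than a relational join.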
Development and evaluation of an instrumented linkage system for total knee surgery.
Walker, Peter S; Wei, Chih-Shing; Forman, Rachel E; Balicki, M A
2007-10-01
The principles and application of total knee surgery using optical tracking have been well demonstrated, but electromagnetic tracking may offer further advantages. We asked whether an instrumented linkage that attaches directly to the bone can maintain the accuracy of the optical and electromagnetic systems but be quicker, more convenient, and less expensive to use. Initial testing using a table-mounted digitizer to navigate a drill guide for placing pins to mount a cutting guide demonstrated the feasibility in terms of access and availability. A first version (called the Mark 1) instrumented linkage designed to fix directly to the bone was constructed and software was written to carry out a complete total knee replacement procedure. The results showed the system largely fulfilled these goals, but some surgeons found that using a visual display for pin placement was difficult and time consuming. As a result, a second version of a linkage system (called the K-Link) was designed to further develop the concept. User-friendly flexible software was developed for facilitating each step quickly and accurately while the placement of cutting guides was facilitated. We concluded that an instrumented linkage system could be a useful and potentially lower-cost option to the current systems for total knee replacement and could possibly have application to other surgical procedures.
NASA Astrophysics Data System (ADS)
Arkadov, G. V.; Zhukavin, A. P.; Kroshilin, A. E.; Parshikov, I. A.; Solov'ev, S. L.; Shishov, A. V.
2014-10-01
The article describes the "Virtual Digital VVER-Based Nuclear Power Plant" computerized system comprising a totality of verified initial data (sets of input data for a model intended for describing the behavior of nuclear power plant (NPP) systems in design and emergency modes of their operation) and a unified system of new-generation computation codes intended for carrying out coordinated computation of the variety of physical processes in the reactor core and NPP equipment. Experiments with the demonstration version of the "Virtual Digital VVER-Based NPP" computerized system have shown that it is in principle possible to set up a unified system of computation codes in a common software environment for carrying out interconnected calculations of various physical phenomena at NPPs constructed according to the standard AES-2006 project. With the full-scale version of the "Virtual Digital VVER-Based NPP" computerized system put in operation, the concerned engineering, design, construction, and operating organizations will have access to all necessary information relating to the NPP power unit project throughout its entire lifecycle. The domestically developed commercial-grade software product, operating as an independent application within the project, will bring additional competitive advantages in the modern market of nuclear power technologies.
NASA Technical Reports Server (NTRS)
Wiener, Earl L. (Editor); Nagel, David C. (Editor)
1988-01-01
The fundamental principles of human-factors (HF) analysis for aviation applications are examined in a collection of reviews by leading experts, with an emphasis on recent developments. The aim is to provide information and guidance to the aviation community outside the HF field itself. Topics addressed include the systems approach to HF, system safety considerations, the human senses in flight, information processing, aviation workloads, group interaction and crew performance, flight training and simulation, human error in aviation operations, and aircrew fatigue and circadian rhythms. Also discussed are pilot control; aviation displays; cockpit automation; HF aspects of software interfaces; the design and integration of cockpit-crew systems; and HF issues for airline pilots, general aviation, helicopters, and ATC.
Middle-aged women's preferred theory-based features in mobile physical activity applications.
Ehlers, Diane K; Huberty, Jennifer L
2014-09-01
The purpose of this study was to describe which theory-based behavioral and technological features middle-aged women prefer to be included in a mobile application designed to help them adopt and maintain regular physical activity (PA). Women aged 30 to 64 years (N = 120) completed an online survey measuring their demographics and mobile PA application preferences. The survey was developed upon behavioral principles of Social Cognitive Theory, recent mobile app research, and technology adoption principles of the Unified Theory of Acceptance and Use of Technology. Frequencies were calculated and content analyses conducted to identify which features women most preferred. Behavioral features that help women self-regulate their PA (PA tracking, goal-setting, progress monitoring) were most preferred. Technological features that enhance perceived effort expectancy and playfulness were most preferred. Many women reported the desire to interact and compete with others through the application. Theory-based PA self-regulation features and theory-based design features that improve perceived effort expectancy and playfulness may be most beneficial in a mobile PA application for middle-aged women. Opportunities to interact with other people and the employment of social, game-like activities may also be attractive. Interdisciplinary engagement of experts in PA behavior change, technology adoption, and software development is needed.
Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN
NASA Astrophysics Data System (ADS)
Frederick, J. M.; Hammond, G. E.
2017-12-01
Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, changing funding sources (or total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing its negative side effects? This presentation discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continues to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well as unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
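The automated testing mentioned above typically takes the form of regression tests that compare new simulator output against stored reference results; the sketch below is a generic illustration only (the tolerance and the synthetic "gold" data are hypothetical, and this is not PFLOTRAN's actual test harness).

```python
# Generic sketch of a regression test of the kind mentioned above:
# compare new simulator output against a stored "gold" reference within
# a tolerance. The data and tolerance here are illustrative only; this
# is not PFLOTRAN's actual test harness.
import numpy as np

def regression_check(new, gold, rtol=1e-6, atol=1e-12):
    """Fail loudly if the new output drifts from the stored reference."""
    new, gold = np.asarray(new), np.asarray(gold)
    assert new.shape == gold.shape, "output shape changed"
    if not np.allclose(new, gold, rtol=rtol, atol=atol):
        worst = np.max(np.abs(new - gold))
        raise AssertionError(f"regression detected: max abs difference {worst:.3e}")

# Illustrative use: 'gold' would normally be read from a reference file
# produced by a trusted earlier version of the simulator.
gold = np.array([0.0, 0.12, 0.45, 0.80, 0.97, 1.0])   # e.g. tracer breakthrough curve
new_run = gold + 1e-9                                  # pretend output of the current build
regression_check(new_run, gold)
print("regression check passed")
```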
Methodical Design of Software Architecture Using an Architecture Design Assistant (ArchE)
2005-04-01
Methodical Design of Software Architecture Using an Architecture Design Assistant (ArchE), Felix Bachmann and Mark Klein, Software Engineering Institute, Pittsburgh, PA 15213-3890. The presentation argues that quality requirements and constraints are the most important inputs to architecture design.
Software Design Methods for Real-Time Systems
1989-12-01
This module describes the concepts and methods used in the software design of real-time systems. It outlines the characteristics of real-time systems, describes the role of software design in real-time system development, and surveys and compares some software design methods for real-time systems.
NASA Astrophysics Data System (ADS)
Bai, Xian-Xu; Zhong, Wei-Min; Zou, Qi; Zhu, An-Ding; Sun, Jun
2018-07-01
Based on the structural design concept of ‘functional integration’, this paper proposes the principle of a power-generated magnetorheological energy absorber with velocity self-sensing capability (PGMREA), which integrates a controllable damping mechanism and a mechanical energy-electrical energy conversion mechanism within a single structure and provides multiple functions, including controllable damping, power generation and velocity self-sensing. The controllable damping mechanism consists of an annular gap and a ball screw. The annular gap is filled with MR fluid that operates in pure shear mode under a controllable electromagnetic field. The rotational damping torque generated by the controllable damping mechanism is translated into a linear damping force via the ball screw. The mechanical energy-electrical energy conversion mechanism is realized by the ball screw and a generator composed of a permanent magnet rotor and a generator stator. The ball screw based mechanical energy-electrical energy conversion mechanism converts the mechanical energy of excitations to electrical energy for storage or directly to power the controllable damping mechanism of the PGMREA. The velocity self-sensing capability of the PGMREA is achieved via signal processing using the mechanical energy-electrical energy conversion information. Based on the principle of the proposed PGMREA, the mathematical model of the PGMREA is established, including the damping force, generated power and self-sensing velocity. The electromagnetic circuit of the PGMREA is simulated and verified via the finite element analysis software ANSYS. The developed PGMREA prototype is experimentally tested on a servo-hydraulic testing system. The model-based predicted results and the experimental results are compared and analyzed.
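For the ball-screw stage described above, the standard textbook kinematics for an ideal screw of lead $l$ and efficiency $\eta$ (a generic relation, not taken from the paper) connect the rotary and linear quantities:

$$ F = \frac{2\pi\,\eta}{l}\,T, \qquad \dot{x} = \frac{l}{2\pi}\,\omega, $$

where $T$ and $\omega$ are the torque and angular velocity on the rotary side and $F$ and $\dot{x}$ the force and velocity on the linear side; the proportionality between $\omega$ (and hence the generator output) and the excitation velocity $\dot{x}$ is presumably what the self-sensing signal processing exploits.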
Design and optimization for the occupant restraint system of vehicle based on a single freedom model
NASA Astrophysics Data System (ADS)
Zhang, Junyuan; Ma, Yue; Chen, Chao; Zhang, Yan
2013-05-01
Throughout the vehicle crash event, the interactions between the vehicle, occupant, and restraint system (VOR) are complicated and highly non-linear. CAE and physical tests are the most widely used tools in vehicle passive safety development, but they can only be applied once a detailed 3D model or physical samples are available. Often some design errors and imperfections are difficult to correct at that stage, and a large amount of time is needed. A restraint system concept design approach based on a single-degree-of-freedom occupant-vehicle model (SDOF) is proposed in this paper. The interactions between the restraint system parameters and the occupant responses in a crash are studied from the viewpoint of mechanics and energy. A discrete input and an iterative algorithm are applied to the SDOF model to obtain the occupant responses quickly for arbitrary excitations (impact pulses) in MATLAB. By studying the relationships between the ridedown efficiency, the restraint stiffness, and the occupant response, the design principle of the restraint stiffness aimed at reducing the occupant injury level during conceptual design is presented. Higher ridedown efficiency means more occupant energy absorbed by the vehicle, but the research results show that higher ridedown efficiency does not mean a lower occupant injury level. A proper restraint system design principle depends on two aspects. On one hand, the restraint system should lead to as high a ridedown efficiency as possible; at the same time, the restraint system should make maximum use of the survival space to reduce the occupant deceleration level. As an example, an optimization of a passenger vehicle restraint system is carried out with the concept design method above, and the final results are validated by MADYMO, the most widely used software in restraint system design, and by a sled test. Consequently, a guideline and method for occupant restraint system concept design is established in this paper.
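The authors implement the SDOF occupant-vehicle model in MATLAB; the following is a minimal Python sketch of the same kind of calculation (explicit time stepping of an occupant mass on a linear restraint, driven by a prescribed half-sine crash pulse), with all parameter values illustrative rather than taken from the paper.

```python
# Minimal sketch (Python, not the authors' MATLAB code) of a single-degree-
# of-freedom occupant model: occupant mass m restrained by a linear restraint
# stiffness k, excited by a prescribed vehicle deceleration pulse.
# All parameter values are illustrative only.
import numpy as np

m = 75.0          # occupant mass, kg
k = 4.0e4         # restraint stiffness, N/m
dt = 1.0e-4       # time step, s
t = np.arange(0.0, 0.12, dt)

# Discrete crash pulse: half-sine vehicle deceleration, peak 30 g over 80 ms.
g = 9.81
a_vehicle = np.where(t < 0.08, 30.0 * g * np.sin(np.pi * t / 0.08), 0.0)

# x = occupant displacement relative to the vehicle (forward excursion).
x, v = 0.0, 0.0
x_hist, a_occ_hist = [], []
for a_v in a_vehicle:
    a_rel = a_v - (k / m) * x        # vehicle deceleration drives the occupant forward
    v += a_rel * dt                  # explicit Euler integration
    x += v * dt
    x_hist.append(x)
    a_occ_hist.append((k / m) * x)   # occupant deceleration from the restraint force

print(f"peak occupant excursion: {max(x_hist) * 1000:.0f} mm")
print(f"peak occupant deceleration: {max(a_occ_hist) / g:.1f} g")
```

Sweeping the stiffness k in such a model is what lets the concept-design approach trade off ridedown efficiency against occupant excursion before any detailed 3D model exists.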
Exploring symmetry as an avenue to the computational design of large protein domains.
Fortenberry, Carie; Bowman, Elizabeth Anne; Proffitt, Will; Dorr, Brent; Combs, Steven; Harp, Joel; Mizoue, Laura; Meiler, Jens
2011-11-16
It has been demonstrated previously that symmetric, homodimeric proteins are energetically favored, which explains their abundance in nature. It has been proposed that such symmetric homodimers underwent gene duplication and fusion to evolve into protein topologies that have a symmetric arrangement of secondary structure elements--"symmetric superfolds". Here, the ROSETTA protein design software was used to computationally engineer a perfectly symmetric variant of imidazole glycerol phosphate synthase and its corresponding symmetric homodimer. The new protein, termed FLR, adopts the symmetric (βα)(8) TIM-barrel superfold. The protein is soluble and monomeric and exhibits two-fold symmetry not only in the arrangement of secondary structure elements but also in sequence and at atomic detail, as verified by crystallography. When cut in half, FLR dimerizes readily to form the symmetric homodimer. The successful computational design of FLR demonstrates progress in our understanding of the underlying principles of protein stability and presents an attractive strategy for the in silico construction of larger protein domains from smaller pieces.
A Verification Method for MASOES.
Perozo, N; Aguilar Perozo, J; Terán, O; Molina, H
2013-02-01
MASOES is an agent architecture for designing and modeling self-organizing and emergent systems. This architecture describes the elements, relationships, and mechanisms, both at the individual and the collective levels, that favor the analysis of self-organizing and emergent phenomena without mathematically modeling the system. In this paper, a method is proposed for verifying MASOES from the point of view of design in order to study the self-organizing and emergent behaviors of the modeled systems. The verification criteria are set according to what is proposed in MASOES for modeling self-organizing and emergent systems and the principles of the wisdom-of-crowds paradigm and the fuzzy cognitive map (FCM) theory. The verification method for MASOES has been implemented in a tool called FCM Designer and has been tested to model a community of free software developers that works under the bazaar style as well as a Wikipedia community in order to study their behavior and determine their self-organizing and emergent capacities.
High sensitivity capacitive MEMS microphone with spring supported diaphragm
NASA Astrophysics Data System (ADS)
Mohamad, Norizan; Iovenitti, Pio; Vinay, Thurai
2007-12-01
Capacitive microphones (condenser microphones) work on the principle of variable capacitance and voltage produced by the movement of their electrically charged diaphragm and back plate in response to sound pressure. There has been considerable research carried out to increase the sensing performance of microphones while reducing their size to cater for various modern applications such as mobile communication and hearing aid devices. This paper reviews the development and current performance of several condenser MEMS microphone designs, and introduces a microphone with a spring-supported diaphragm to further improve condenser microphone performance. Numerical analysis using Coventor FEM software shows that this new microphone design has a higher mechanical sensitivity compared to the existing edge-clamped flat diaphragm condenser MEMS microphone. The spring-supported diaphragm is shown to have a flat frequency response up to 7 kHz and to be more stable under variations of the diaphragm residual stress. The microphone is designed to be easily fabricated using existing silicon fabrication technology, and its stability against residual stress increases its reproducibility.
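For orientation, the lumped-parameter relations behind such a capacitive microphone (standard textbook approximations, not results of this paper) treat the spring-supported diaphragm as a piston of area $A$ on a suspension of stiffness $k$ above a gap $d$:

$$ C_0 = \frac{\varepsilon_0 A}{d}, \qquad S_\mathrm{mech} = \frac{w}{p} \approx \frac{A}{k}, \qquad \Delta C \approx \frac{\varepsilon_0 A}{d^2}\, w, $$

where $w$ is the diaphragm deflection under sound pressure $p$; a softer suspension (smaller $k$) raises the mechanical sensitivity, which is the effect the spring-supported design targets.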
Moving code - Sharing geoprocessing logic on the Web
NASA Astrophysics Data System (ADS)
Müller, Matthias; Bernard, Lars; Kadner, Daniel
2013-09-01
Efficient data processing is a long-standing challenge in remote sensing. Effective and efficient algorithms are required for product generation in ground processing systems, event-based or on-demand analysis, environmental monitoring, and data mining. Furthermore, the increasing number of survey missions and the exponentially growing data volume in recent years have created demand for better software reuse as well as an efficient use of scalable processing infrastructures. Solutions that address both demands simultaneously have begun to slowly appear, but they seldom consider the possibility to coordinate development and maintenance efforts across different institutions, community projects, and software vendors. This paper presents a new approach to share, reuse, and possibly standardise geoprocessing logic in the field of remote sensing. Drawing from the principles of service-oriented design and distributed processing, this paper introduces moving-code packages as self-describing software components that contain algorithmic code and machine-readable descriptions of the provided functionality, platform, and infrastructure, as well as basic information about exploitation rights. Furthermore, the paper presents a lean publishing mechanism by which to distribute these packages on the Web and to integrate them in different processing environments ranging from monolithic workstations to elastic computational environments or "clouds". The paper concludes with an outlook toward community repositories for reusable geoprocessing logic and their possible impact on data-driven science in general.
Thyroid Cancer and Tumor Collaborative Registry (TCCR).
Shats, Oleg; Goldner, Whitney; Feng, Jianmin; Sherman, Alexander; Smith, Russell B; Sherman, Simon
2016-01-01
A multicenter, web-based Thyroid Cancer and Tumor Collaborative Registry (TCCR, http://tccr.unmc.edu) allows for the collection and management of various data on thyroid cancer (TC) and thyroid nodule (TN) patients. The TCCR is coupled with OpenSpecimen, an open-source biobank management system, to annotate biospecimens obtained from the TCCR subjects. The demographic, lifestyle, physical activity, dietary habits, family history, medical history, and quality of life data are provided and may be entered into the registry by subjects. Information on diagnosis, treatment, and outcome is entered by the clinical personnel. The TCCR uses advanced technical and organizational practices, such as (i) metadata-driven software architecture (design); (ii) modern standards and best practices for data sharing and interoperability (standardization); (iii) Agile methodology (project management); (iv) Software as a Service (SaaS) as a software distribution model (operation); and (v) the confederation principle as a business model (governance). This allowed us to create a secure, reliable, user-friendly, and self-sustainable system for TC and TN data collection and management that is compatible with various end-user devices and easily adaptable to a rapidly changing environment. Currently, the TCCR contains data on 2,261 subjects and data on more than 28,000 biospecimens. Data and biological samples collected by the TCCR are used in developing diagnostic, prevention, treatment, and survivorship strategies against TC.
NASA Technical Reports Server (NTRS)
Mckay, Charles W.; Feagin, Terry; Bishop, Peter C.; Hallum, Cecil R.; Freedman, Glenn B.
1987-01-01
The principal focus of one of the RICIS (Research Institute for Computing and Information Systems) components is computer systems and software engineering in-the-large of the lifecycle of large, complex, distributed systems which: (1) evolve incrementally over a long time; (2) contain non-stop components; and (3) must simultaneously satisfy a prioritized balance of mission and safety critical requirements at run time. This focus is extremely important because of the contribution of the scaling direction problem to the current software crisis. The Computer Systems and Software Engineering (CSSE) component addresses the lifecycle issues of three environments: host, integration, and target.
Development and Application of Collaborative Optimization Software for Plate - fin Heat Exchanger
NASA Astrophysics Data System (ADS)
Chunzhen, Qiao; Ze, Zhang; Jiangfeng, Guo; Jian, Zhang
2017-12-01
This paper introduces the design ideas of calculation software for plate-fin heat exchangers and gives application examples. Because of the large calculation workload involved in designing and optimizing heat exchangers, we used Visual Basic 6.0 as the development platform to build a basic calculation software package that reduces this workload. The design case is a plate-fin heat exchanger designed for boiler tail flue gas. The software is based on the traditional design method for plate-fin heat exchangers. Using the software for the design and calculation of plate-fin heat exchangers effectively reduces the amount of computation and, compared with traditional methods, has high practical value.
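As an indication of the kind of rating calculation such software automates, the sketch below is a generic effectiveness-NTU calculation in Python; it is not the authors' Visual Basic code, the counterflow relation is used only as an illustration, and all values are hypothetical.

```python
# Generic effectiveness-NTU rating calculation of the kind the software
# described above automates (illustrative Python, not the authors'
# Visual Basic code). Counterflow relation; all values are hypothetical.
import math

def effectiveness_counterflow(ntu, cr):
    """Counterflow heat exchanger effectiveness for capacity ratio cr = Cmin/Cmax."""
    if abs(cr - 1.0) < 1e-12:
        return ntu / (1.0 + ntu)
    return (1.0 - math.exp(-ntu * (1.0 - cr))) / (1.0 - cr * math.exp(-ntu * (1.0 - cr)))

# Hot flue gas versus cold air (illustrative values).
C_hot, C_cold = 1200.0, 1000.0        # capacity rates, W/K
UA = 1500.0                            # overall conductance, W/K
T_hot_in, T_cold_in = 180.0, 20.0      # inlet temperatures, degC

C_min, C_max = min(C_hot, C_cold), max(C_hot, C_cold)
ntu = UA / C_min
eps = effectiveness_counterflow(ntu, C_min / C_max)
q = eps * C_min * (T_hot_in - T_cold_in)   # heat duty, W
print(f"NTU = {ntu:.2f}, effectiveness = {eps:.2f}, duty = {q / 1000:.1f} kW")
```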
Averting Denver Airports on a Chip
NASA Technical Reports Server (NTRS)
Sullivan, Kevin J.
1995-01-01
As a result of recent advances in software engineering capabilities, we are now in a more stable environment. De-facto hardware and software standards are emerging. Work on software architecture and design patterns signals a consensus on the importance of early system-level design decisions, and agreements on the uses of certain paradigmatic software structures. We now routinely build systems that would have been risky or infeasible a few years ago. Unfortunately, technological developments threaten to destabilize software design again. Systems designed around novel computing and peripheral devices will spark ambitious new projects that will stress current software design and engineering capabilities. Micro-electro-mechanical systems (MEMS) and related technologies provide the physical basis for new systems with the potential to produce this kind of destabilizing effect. One important response to anticipated software engineering and design difficulties is carefully directed engineering-scientific research. Two specific problems meriting substantial research attention are: A lack of sufficient means to build software systems by generating, extending, specializing, and integrating large-scale reusable components; and a lack of adequate computational and analytic tools to extend and aid engineers in maintaining intellectual control over complex software designs.
NASA Technical Reports Server (NTRS)
Yakimovsky, Y.
1974-01-01
An approach to simultaneous interpretation of objects in complex structures so as to maximize a combined utility function is presented. Results of the application of a computer software system to assign meaning to regions in a segmented image based on the principles described in this paper and on a special interactive sequential classification learning system, which is referenced, are demonstrated.
Learning from hackers: open-source clinical trials.
Dunn, Adam G; Day, Richard O; Mandl, Kenneth D; Coiera, Enrico
2012-05-02
Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. A similar gap was addressed in the software industry by their open-source software movement. Here, we examine how the social and technical principles of the movement can guide the growth of an open-source clinical trial community.
Telescope Array Control System Based on Wireless Touch Screen Platform
NASA Astrophysics Data System (ADS)
Fu, X. N.; Huang, L.; Wei, J. Y.
2016-07-01
GWAC (Ground-based Wide Angle Cameras) are the ground-based observational instruments of the Sino-French SVOM (Space Variable Objects Monitor) astronomical satellite mission, and Mini-GWAC is a pathfinder and supplement of GWAC. In the context of the Mini-GWAC telescope array, this paper introduces the design and implementation of a telescope array control system that communicates through wireless serial interface modules. We describe the development and implementation of the system in detail in terms of control system principle, system hardware structure, software design, experiment, and test. The system uses a touch-control PC based on the Windows CE system as the upper computer, with the wireless transceiver module and a PLC (Programmable Logic Controller) as the core. It has the advantages of low cost, reliable data transmission, and simple operation. So far, the control system has been applied to Mini-GWAC successfully.
Telescope Array Control System Based on Wireless Touch Screen Platform
NASA Astrophysics Data System (ADS)
Fu, Xia-nan; Huang, Lei; Wei, Jian-yan
2017-10-01
Ground-based Wide Angle Cameras (GWAC) are the ground-based observational facility for the SVOM (Space Variable Objects Monitor) astronomical satellite of Sino-French cooperation, and Mini-GWAC is the pathfinder and supplement of GWAC. In the context of the Mini-GWAC telescope array, this paper introduces the design and implementation of a telescope array control system based on a wireless touch screen platform. We describe the development and implementation of the system in detail in terms of control system principle, system hardware structure, software design, experiment, and test. The system uses a touch-control PC based on the Windows CE system as the upper computer, while the wireless transceiver module and a PLC (Programmable Logic Controller) serve as the system kernel. It has the advantages of low cost, reliable data transmission, and simple operation. The control system has been successfully applied to Mini-GWAC.
Optical aurora detectors: using natural optics to motivate education and outreach
NASA Astrophysics Data System (ADS)
Shaw, Joseph A.; Way, Jesse M.; Pust, Nathan J.; Nugent, Paul W.; Coate, Hans; Balster, Daniel
2009-06-01
Natural optical phenomena enjoy a level of interest sufficiently high among a wide array of people to provide ideal education and outreach opportunities. The aurora promotes particularly high interest, perhaps because of its relative rarity in the areas of the world where most people live. A project is being conducted at Montana State University to use common interest and curiosity about auroras to motivate learning and outreach through the design and deployment of optical sensor systems that detect the presence of an auroral display and send cell phone messages to alert interested people. Project participants learn about the physics and optics of the aurora, basic principles of optical system design, radiometric calculations and calibrations, electro-optical detectors, electronics, embedded computer systems, and computer software. The project is moving into a stage where it will provide greatly expanded outreach and education opportunities as optical aurora detector kits are created and disbursed to colleges around our region.
Inauen, A; Jenny, G J; Bauer, G F
2012-06-01
This article focuses on organizational analysis in workplace health promotion (WHP) projects. It shows how this analysis can be designed such that it provides rational data relevant to the further context-specific and goal-oriented planning of WHP and equally supports individual and organizational change processes implied by WHP. Design principles for organizational analysis were developed on the basis of a narrative review of the guiding principles of WHP interventions and organizational change as well as the scientific principles of data collection. Further, the practical experience of WHP consultants who routinely conduct organizational analysis was considered. This resulted in a framework with data-oriented and change-oriented design principles, addressing the following elements of organizational analysis in WHP: planning the overall procedure, data content, data-collection methods and information processing. Overall, the data-oriented design principles aim to produce valid, reliable and representative data, whereas the change-oriented design principles aim to promote motivation, coherence and a capacity for self-analysis. We expect that the simultaneous consideration of data- and change-oriented design principles for organizational analysis will strongly support the WHP process. We finally illustrate the applicability of the design principles to health promotion within a WHP case study.
IRAF: Lessons for Project Longevity
NASA Astrophysics Data System (ADS)
Fitzpatrick, M.
2012-09-01
Although sometimes derided as a product of the 80's (or more generously, as a legacy system), the fact that IRAF remains a productive work environment for many astronomers today is a testament to one of its core design principles, portability. This idea has meaning beyond a survey of platforms in use at the peak of a project's active development; for true longevity, a project must be able to weather completely unimagined OS, hardware, data, staffing and political environments. A lack of attention to the broader issues of portability, or the true lifespan of a software system (e.g. archival science may extend for years beyond a given mission, upgraded or similar instruments may be developed that require the same reduction/analysis techniques, etc) might require costly new software development instead of simple code re-use. Additionally, one under-appreciated benefit to having a long history in the community is the trust that users have established in the science results produced by a particular system. However a software system evolves architecturally, preserving this trust (and by implication, the applications themselves) is the key to continued success. In this paper, we will discuss how the system architecture has allowed IRAF to navigate the many changes in computing since it was first released. It is hoped that the lessons learned can be adopted by software systems being built today so that they too can survive long enough to one day earn the distinction of being called a legacy system.
NASA Astrophysics Data System (ADS)
Priatna, Nanang
2017-08-01
The use of Information and Communication Technology (ICT) in mathematics instruction helps students build conceptual understanding. One of the software products used in mathematics instruction is GeoGebra. The program enables simple visualization of complex geometric concepts and helps improve students' understanding of geometric concepts. Instruction applying brain-based learning principles is oriented toward naturally empowering the brain's potential, enabling students to build their own knowledge. One of the goals of mathematics instruction in school is to develop mathematical communication ability. Mathematical representation is regarded as a part of mathematical communication. It is a description, expression, symbolization, or modeling of mathematical ideas/concepts as an attempt to clarify meanings or seek solutions to the problems encountered by students. The research aims to develop a learning model and teaching materials applying the principles of brain-based learning aided by GeoGebra to improve junior high school students' mathematical representation ability. It adopted a quasi-experimental method with a non-randomized control group pretest-posttest design and a 2x3 factorial model. Analysis of the data shows that the increase in the mathematical representation ability of students who received mathematics instruction applying brain-based learning principles aided by GeoGebra was greater than that of students given conventional instruction, both as a whole and within each category of students' initial mathematical ability.
A Framework for Seamless Interoperation of Heterogeneous Distributed Software Components
2005-05-01
The research addressed a) interoperability, b) distributed resource discovery, and c) validation of quality requirements. Principles and prototypical systems were created to demonstrate the successful completion of the research.
Shuttle mission simulator software conceptual design
NASA Technical Reports Server (NTRS)
Burke, J. F.
1973-01-01
Software conceptual designs (SCD) are presented for meeting the simulator requirements for the shuttle missions. The major areas of the SCD discussed include: malfunction insertion, flight software, applications software, systems software, and computer complex.
Software Assurance: Five Essential Considerations for Acquisition Officials
2007-05-01
• …address security concerns in the software development life cycle (SDLC)? • Are there formal software quality… What threat modeling process, if any, is used when designing the software? What analysis, design, and construction tools are used by your software design… …off-the-shelf (COTS), government off-the-shelf (GOTS), open-source, embedded, and legacy software. Attackers exploit unintentional vulnerabilities or…
ERIC Educational Resources Information Center
Cremers, Petra H. M.; Wals, Arjen E. J.; Wesselink, Renate; Mulder, Martin
2017-01-01
Educational design research yields design knowledge, often in the form of design principles or guidelines that provide the rationale or "know-why" for the design of educational interventions. As such, design principles can be utilized by designers in contexts other than the research context in which they were generated. Although research…
High-Fidelity Roadway Modeling and Simulation
NASA Technical Reports Server (NTRS)
Wang, Jie; Papelis, Yiannis; Shen, Yuzhong; Unal, Ozhan; Cetin, Mecit
2010-01-01
Roads are an essential feature in our daily lives. With the advances in computing technologies, 2D and 3D road models are employed in many applications, such as computer games and virtual environments. Traditional road models were generated by professional artists manually using modeling software tools such as Maya and 3ds Max. This approach requires both highly specialized and sophisticated skills and massive manual labor. Automatic road generation based on procedural modeling can create road models using specially designed computer algorithms or procedures, dramatically reducing the tedious manual editing needed for road modeling. But most existing procedural modeling methods for road generation put emphasis on the visual effects of the generated roads, not their geometrical and architectural fidelity. This limitation seriously restricts the applicability of the generated road models. To address this problem, this paper proposes a high-fidelity roadway generation method that takes into account road design principles practiced by civil engineering professionals; as a result, the generated roads can support not only general applications such as games and simulations, in which roads are used as 3D assets, but also demanding civil engineering applications, which require accurate geometrical models of roads. The inputs to the proposed method include road specifications, civil engineering road design rules, terrain information, and the surrounding environment. The proposed method then generates, in real time, 3D roads that have both high visual and geometrical fidelity. This paper discusses in detail the procedures that convert 2D roads specified in shapefiles into 3D roads, as well as the civil engineering road design principles involved. The proposed method can be used in many applications that have stringent requirements on high-precision 3D models, such as driving simulations and road design prototyping. Preliminary results demonstrate the effectiveness of the proposed method.
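As a toy illustration of one step in such a pipeline, the sketch below lifts a 2D centerline (as would be read from a shapefile) onto a terrain surface and emits left/right edge vertices for a constant-width road; the terrain function is hypothetical, and a production system would additionally enforce the civil-engineering rules the paper describes (curvature limits, superelevation, grade).

```python
# Toy sketch of one step in a procedural road pipeline like the one
# described above: lift a 2D centerline onto a terrain surface and emit
# left/right edge vertices for a road of constant width. The terrain
# function is hypothetical; real systems also enforce civil-engineering
# design rules (curvature limits, superelevation, grade).
import numpy as np

def terrain_height(x, y):
    # Placeholder analytic terrain; a real system would sample a DEM.
    return 2.0 * np.sin(0.01 * x) + 1.5 * np.cos(0.008 * y)

def extrude_road(centerline_xy, width=7.0):
    """centerline_xy: (N, 2) array of 2D centerline points (e.g. from a shapefile)."""
    pts = np.asarray(centerline_xy, dtype=float)
    tangents = np.gradient(pts, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    normals = np.stack([-tangents[:, 1], tangents[:, 0]], axis=1)  # 2D left normal
    z = terrain_height(pts[:, 0], pts[:, 1])
    center3d = np.column_stack([pts, z])
    left = np.column_stack([pts + 0.5 * width * normals, z])
    right = np.column_stack([pts - 0.5 * width * normals, z])
    return center3d, left, right

centerline = [[0, 0], [50, 10], [100, 40], [150, 90]]
center3d, left, right = extrude_road(centerline)
print(center3d.round(2))
```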
Optimizing micromixer design for enhancing dielectrophoretic microconcentrator performance.
Lee, Hsu-Yi; Voldman, Joel
2007-03-01
We present an investigation into optimizing micromixer design for enhancing dielectrophoretic (DEP) microconcentrator performance. DEP-based microconcentrators use the dielectrophoretic force to collect particles on electrodes. Because the DEP force generated by electrodes decays rapidly away from the electrodes, DEP-based microconcentrators are only effective at capturing particles from a limited cross section of the input liquid stream. Adding a mixer can circulate the input liquid, increasing the probability that particles will drift near the electrodes for capture. Because mixers for DEP-based microconcentrators aim to circulate particles, rather than mix two species, design specifications for such mixers may be significantly different from those for conventional mixers. Here we investigated, numerically and experimentally, the effect of patterned-groove micromixers on particle trapping efficiency in DEP-based microconcentrators. We used modeling software to simulate the particle motion due to the various forces on the particle (DEP, hydrodynamic, etc.), allowing us to predict trapping efficiency. We also conducted trapping experiments and measured the capture efficiency of different micromixer configurations, including the slanted groove, staggered herringbone, and herringbone mixers. Finally, we used these analyses to illustrate the design principles of mixers for DEP-based concentrators.
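As a rough illustration of the kind of particle-motion simulation described above, the sketch below integrates one particle's trajectory under Stokes drag and a DEP force that decays with height above an electrode plane; the force law, decay length, and all parameter values are illustrative assumptions, not the authors' model.

    # Sketch: Euler integration of a particle in a microchannel, advected by flow
    # and pulled toward an electrode plane at z = 0 by a DEP force that decays
    # with height.  All parameter values are assumed for illustration.
    import math

    eta = 1e-3          # water viscosity, Pa*s
    r = 5e-6            # particle radius, m
    drag = 6 * math.pi * eta * r

    def dep_force(z):
        # Attractive (downward) force decaying away from the electrodes.
        return -5e-12 * math.exp(-z / 20e-6)   # N, assumed 20 um decay length

    def track(z0, u_flow=200e-6, length=2e-3, dt=1e-3):
        """Return True if the particle is captured before leaving the channel."""
        x, z = 0.0, z0
        while x < length:
            vz = dep_force(z) / drag           # Stokes drag balance
            z = max(z + vz * dt, 0.0)
            x += u_flow * dt
            if z < 1e-6:                       # within 1 um of the electrode: trapped
                return True
        return False

    heights = [h * 1e-6 for h in range(5, 100, 10)]
    captured = sum(track(h) for h in heights)
    print("capture efficiency ~ %.0f%%" % (100.0 * captured / len(heights)))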
Study on the high-frequency laser measurement of slot surface difference
NASA Astrophysics Data System (ADS)
Bing, Jia; Lv, Qiongying; Cao, Guohua
2017-10-01
To measure slot surface difference in large-scale mechanical assembly, this paper designs a double-galvanometer pulsed laser scanning system based on high-frequency laser scanning technology and the laser detection imaging principle. The laser probe scanning system architecture consists of three parts: laser ranging, mechanical scanning, and data acquisition and processing. The laser ranging part uses a high-frequency laser range finder to measure distance information over the target shape, producing large volumes of point cloud data. The mechanical scanning part includes a high-speed rotary table, a high-speed transit, and the related structural design, so that the whole system can perform three-dimensional laser scanning of the target along the designed scanning path. The data processing part is built around FPGA hardware with LabVIEW software; it processes, at high speed, the point cloud data collected by the laser range finder, performs fitting calculations on the point cloud, and establishes a three-dimensional model of the target, thereby realizing laser scanning imaging.
NASA Technical Reports Server (NTRS)
Valley, Lois
1989-01-01
The SPS product, Classic-Ada, is a software tool that supports object-oriented Ada programming with powerful inheritance and dynamic binding. Object Oriented Design (OOD) is an easy, natural development paradigm, but it is not supported by Ada. Following the DOD Ada mandate, SPS developed Classic-Ada to provide a tool which supports OOD and implements code in Ada. It consists of a design language, a code generator and a toolset. As a design language, Classic-Ada supports the object-oriented principles of information hiding, data abstraction, dynamic binding, and inheritance. It also supports natural reuse and incremental development through inheritance and code factoring, and allows Ada and Classic-Ada, as well as dynamic binding and static binding, to be mixed in the same program. Only nine new constructs were added to Ada to provide object-oriented design capabilities. The Classic-Ada code generator translates user application code into fully compliant, ready-to-run, standard Ada. The Classic-Ada toolset is fully supported by SPS and consists of an object generator, a builder, a dictionary manager, and a reporter. Demonstrations of Classic-Ada and the Classic-Ada Browser were given at the workshop.
Software Prototyping: Designing Systems for Users.
ERIC Educational Resources Information Center
Spies, Phyllis Bova
1983-01-01
Reports on major change in computer software development process--the prototype model, i.e., implementation of skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…
Toward the First Data Acquisition Standard in Synthetic Biology.
Sainz de Murieta, Iñaki; Bultelle, Matthieu; Kitney, Richard I
2016-08-19
This paper describes the development of a new data acquisition standard for synthetic biology. This comprises the creation of a methodology that is designed to capture all the data, metadata, and protocol information associated with biopart characterization experiments. The new standard, called DICOM-SB, is based on the highly successful Digital Imaging and Communications in Medicine (DICOM) standard in medicine. A data model is described which has been specifically developed for synthetic biology. The model is a modular, extensible data model for the experimental process, which can optimize data storage for large amounts of data. DICOM-SB also includes services orientated toward the automatic exchange of data and information between modalities and repositories. DICOM-SB has been developed in the context of systematic design in synthetic biology, which is based on the engineering principles of modularity, standardization, and characterization. The systematic design approach utilizes the design, build, test, and learn design cycle paradigm. DICOM-SB has been designed to be compatible with and complementary to other standards in synthetic biology, including SBOL. In this regard, the software provides effective interoperability. The new standard has been tested by experiments and data exchange between Nanyang Technological University in Singapore and Imperial College London.
NASA Astrophysics Data System (ADS)
Brezgin, V. I.; Brodov, Yu M.; Kultishev, A. Yu
2017-11-01
The report reviews methods for improving steam turbine unit design and operation based on the application of modern information technologies. In accordance with the life cycle support methodology, a conceptual model of the information support system for the main life cycle (LC) stages of a steam turbine unit is suggested. A classification system is proposed that ensures the creation of sustainable information links between the engineering team (manufacturer's plant) and customer organizations (power plants). The report proposes, studies, and justifies the principle of extending parameterization beyond geometric constructions in the process of designing and improving steam turbine unit equipment. It presents a steam turbine equipment design methodology based on a newly developed oil-cooler design system implemented by the authors. This design system combines a construction subsystem, characterized by extensive use of family tables and templates, with a computation subsystem that includes a methodology for zone-by-zone thermal-hydraulic design calculations of oil coolers. The report also presents data about the developed software for operational monitoring and assessment of equipment parameters, as well as its implementation at five power plants.
Software archeology: a case study in software quality assurance and design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macdonald, John M; Lloyd, Jane A; Turner, Cameron J
2009-01-01
Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.
Language and Program for Documenting Software Design
NASA Technical Reports Server (NTRS)
Kleine, H.; Zepko, T. M.
1986-01-01
Software Design and Documentation Language (SDDL) provides effective communication medium to support design and documentation of complex software applications. SDDL supports communication among all members of software design team and provides for production of informative documentation on design effort. Use of SDDL-generated document to analyze design makes it possible to eliminate many errors not detected until coding and testing attempted. SDDL processor program translates designer's creative thinking into effective document for communication. Processor performs as many automatic functions as possible, freeing designer's energy for creative effort. SDDL processor program written in PASCAL.
Automatic extraction and visualization of object-oriented software design metrics
NASA Astrophysics Data System (ADS)
Lakshminarayana, Anuradha; Newman, Timothy S.; Li, Wei; Talburt, John
2000-02-01
Software visualization is a graphical representation of software characteristics and behavior. Certain modes of software visualization can be useful in isolating problems and identifying unanticipated behavior. In this paper we present a new approach to aid understanding of object-oriented software through 3D visualization of software metrics that can be extracted from the design phase of software development. The focus of the paper is a metric extraction method and a new collection of glyphs for multi-dimensional metric visualization. Our approach utilizes the extensibility interface of a popular CASE tool to access and automatically extract the metrics from Unified Modeling Language class diagrams. Following the extraction of the design metrics, a 3D visualization of these metrics is generated for each class in the design, utilizing intuitively meaningful 3D glyphs that are representative of the ensemble of metrics. Extraction and visualization of design metrics can aid software developers in the early study and understanding of design complexity.
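The extraction step can be illustrated with a toy example: instead of a CASE tool's extensibility interface, the sketch below uses Python introspection to compute two simple class-level metrics (method count and inheritance depth) and maps them to hypothetical glyph dimensions; the metric set and mapping are illustrative, not the authors' method.

    # Sketch: compute simple class-level design metrics via introspection and map
    # them to glyph dimensions.  A real system would instead read UML class
    # diagrams through a CASE tool's extensibility interface.
    import inspect

    class Shape:
        def __init__(self): self.name = "shape"
        def area(self): return 0.0

    class Circle(Shape):
        def __init__(self, r):
            super().__init__()
            self.r = r
        def area(self): return 3.14159 * self.r ** 2
        def perimeter(self): return 2 * 3.14159 * self.r

    def class_metrics(cls):
        methods = [m for m, _ in inspect.getmembers(cls, inspect.isfunction)]
        depth = len(cls.__mro__) - 2          # exclude the class itself and object
        return {"methods": len(methods), "inheritance_depth": depth}

    def glyph(metrics):
        # Hypothetical mapping of metrics onto 3D glyph dimensions.
        return (metrics["methods"], metrics["inheritance_depth"] + 1, 1)

    for cls in (Shape, Circle):
        m = class_metrics(cls)
        print(cls.__name__, m, "glyph dims:", glyph(m))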
Software cost/resource modeling: Deep space network software cost estimation model
NASA Technical Reports Server (NTRS)
Tausworthe, R. J.
1980-01-01
A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), the Rome Air Development Center, the University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
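The flavor of such a parametric model can be sketched with a simple COCOMO-style effort equation in which questionnaire responses become multiplicative cost drivers; the coefficients and driver values below are generic illustrations, not the calibrated JPL/DSN parameters.

    # Sketch of a parametric effort model in the COCOMO spirit:
    #   effort = a * size^b * product(cost drivers)
    # Coefficients and driver values are assumed for illustration only.
    def effort_person_months(ksloc, drivers, a=3.0, b=1.12):
        m = 1.0
        for value in drivers.values():
            m *= value
        return a * ksloc ** b * m

    drivers = {
        "task_difficulty": 1.15,     # prompted-response rating, >1 means harder
        "environment": 0.95,         # good tooling slightly reduces effort
        "experience": 0.90,          # experienced team
    }

    pm = effort_person_months(32.0, drivers)
    print("estimated effort: %.1f person-months" % pm)
    # A schedule could then be obtained by spreading this effort over a standard
    # work breakdown structure and feeding it to a PERT/CPM scheduler.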
Expert system verification and validation study: ES V/V Workshop
NASA Technical Reports Server (NTRS)
French, Scott; Hamilton, David
1992-01-01
The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) to expert systems. To achieve this, some background on V&V as applied to conventionally implemented software is required. Part one will discuss the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one will also overview some common analysis techniques that are applied when performing V&V of software. All of these materials will be presented based on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.
Research on Visualization Design Method in the Field of New Media Software Engineering
NASA Astrophysics Data System (ADS)
Deqiang, Hu
2018-03-01
As science and technology develop, market competition intensifies, and user demand grows, a new design and application method has emerged in the field of new media software engineering: the visualization design method. Applying the visualization design method to new media software engineering not only improves operational efficiency but, more importantly, enhances the quality of software development through appropriate media of communication and transformation; on this basis, it also continuously promotes the progress and development of new media software engineering in China. Therefore, this article analyzes the application of visualization design methods in new media software engineering, starting from an overview of visualization design methods and building on a systematic analysis of the underlying technology.
Case Study: Accelerating Process Improvement by Integrating the TSP and CMMI
2007-06-01
Could software development teams and individuals apply similar principles to improve their work? Watts S. Humphrey, a founder of the process… was an authorized PSP instructor. At Schwalb's urging, Watts Humphrey briefed the SLT on the PSP and TSP, and after the briefing, the team… [Humphrey 96] Humphrey, Watts S. Introduction to the Personal Software Process. Boston, MA: Addison-Wesley Publishing Company, Inc., 1996 (ISBN…
Li, Z; Fan, Y; Chen, G
1999-07-01
Based on the principle of thermodilution, coronary sinus blood flow can be computed once the temperatures of the blood, the indicator, and the blood-indicator mixture are obtained. This system is an intelligent slave module built around a single-chip microcomputer. The hardware structure and principles and the software flow chart are described in detail.
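For orientation, a commonly used continuous-infusion thermodilution relation (after Ganz) computes sinus flow from the three temperatures and the indicator infusion rate; the 1.08 constant (density and specific-heat ratio of saline indicator to blood) and the example values below are assumptions for illustration, not taken from this paper.

    # Sketch: continuous-infusion thermodilution (Ganz-type relation).
    #   Q_cs = 1.08 * Q_i * ((T_b - T_i) / (T_b - T_m) - 1)
    # where Q_i is the indicator infusion rate and 1.08 accounts for the assumed
    # density * specific-heat ratio of saline indicator to blood.
    def coronary_sinus_flow(t_blood, t_indicator, t_mixture, q_indicator):
        return 1.08 * q_indicator * ((t_blood - t_indicator) / (t_blood - t_mixture) - 1.0)

    # Example with illustrative numbers: 37 C blood, 20 C saline infused at
    # 40 mL/min, mixture measured at 33 C.
    q = coronary_sinus_flow(37.0, 20.0, 33.0, 40.0)
    print("coronary sinus flow ~ %.0f mL/min" % q)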
Rácz, Norbert; Kormány, Róbert; Fekete, Jenő; Molnár, Imre
2015-04-10
Column technology needs further improvement even today. To obtain information on batch-to-batch repeatability, intelligent modeling software was applied. Twelve columns from the same production process, but from different batches, were compared in this work. In this paper, the retention parameters of these columns with real-life sample solutes were studied. The following parameters were selected for measurement: gradient time, temperature and pH. Based on the calculated results, batch-to-batch repeatability of BEH columns was evaluated. Two parallel measurements on two columns from the same batch were performed to obtain information about the quality of packing. Calculating the average of the individual working points at the highest critical resolution (Rs,crit), it was found that the robustness, calculated with a newly released robustness module, had a success rate of >98% among the predicted 3^6 = 729 experiments for all 12 columns. With the help of retention modeling, all substances could be separated independently of the batch and/or packing, using the same conditions, with high robustness of the experiments. Copyright © 2015 Elsevier B.V. All rights reserved.
Information Management for Unmanned Systems: Combining DL-Reasoning with Publish/Subscribe
NASA Astrophysics Data System (ADS)
Moser, Herwig; Reichelt, Toni; Oswald, Norbert; Förster, Stefan
Sharing capabilities and information between collaborating entities by using modern information and communication technology is a core principle in complex distributed civil or military mission scenarios. Previous work proved the suitability of Service-oriented Architectures for modelling and sharing the participating entities' capabilities. Albeit providing a satisfactory model for capability sharing, pure service-orientation curtails expressiveness for information exchange as opposed to dedicated data-centric communication principles. In this paper we introduce an Information Management System which combines OWL ontologies and automated reasoning with Publish/Subscribe systems, providing a shared but decoupled data model. While confirming existing related research results, we emphasise the novel application of, and lack of practical experience with, Semantic Web technologies in areas other than originally intended: aiding decision support and software design in the context of a mission scenario for an unmanned system. Experiments within a complex simulation environment show the immediate benefits of a semantic information-management and -dissemination platform: clear separation of concerns in code and data model, increased service re-usability and extensibility, as well as regulation of data flow and the respective system behaviour through declarative rules.
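The data-centric side of such a system can be reduced to a very small topic-based publish/subscribe core; the sketch below shows only that decoupling idea, without the OWL ontology or DL-reasoning layer described in the paper, and all names and topics are illustrative.

    # Minimal topic-based publish/subscribe broker illustrating the decoupled
    # data model; the semantic (OWL/DL-reasoning) layer described in the paper
    # would sit on top of the topic-matching step and is omitted here.
    from collections import defaultdict

    class Broker:
        def __init__(self):
            self.subscribers = defaultdict(list)   # topic -> list of callbacks

        def subscribe(self, topic, callback):
            self.subscribers[topic].append(callback)

        def publish(self, topic, message):
            # Deliver to every subscriber of the topic; publisher and subscribers
            # never reference each other directly.
            for callback in self.subscribers[topic]:
                callback(topic, message)

    broker = Broker()
    broker.subscribe("track/air", lambda t, m: print("decision aid got", t, m))
    broker.subscribe("track/air", lambda t, m: print("logger got", t, m))
    broker.publish("track/air", {"id": 42, "position": (12.3, 45.6)})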
Design and Effects of Scenario Educational Software.
ERIC Educational Resources Information Center
Keegan, Mark
1993-01-01
Describes the development of educational computer software called scenario software that was designed to incorporate advances in cognitive, affective, and physiological research. Instructional methods are outlined; the need to change from didactic methods to discovery learning is explained; and scenario software design features are discussed. (24…
An empirical study of software design practices
NASA Technical Reports Server (NTRS)
Card, David N.; Church, Victor E.; Agresti, William W.
1986-01-01
Software engineers have developed a large body of software design theory and folklore, much of which was never validated. The results of an empirical study of software design practices in one specific environment are presented. The practices examined affect module size, module strength, data coupling, descendant span, unreferenced variables, and software reuse. Measures characteristic of these practices were extracted from 887 FORTRAN modules developed for five flight dynamics software projects monitored by the Software Engineering Laboratory (SEL). The relationship of these measures to cost and fault rate was analyzed using a contingency table procedure. The results show that some recommended design practices, despite their intuitive appeal, are ineffective in this environment, whereas others are very effective.
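The contingency-table procedure mentioned above can be illustrated as follows; the counts are invented for illustration and are not the SEL data.

    # Sketch of the contingency-table analysis: cross-tabulate a design practice
    # (e.g. modules following a size guideline vs not) against fault outcome and
    # test for association with chi-square.  Counts below are invented.
    from scipy.stats import chi2_contingency

    #                 low fault rate   high fault rate
    table = [[120, 35],    # modules following the practice
             [80,  70]]    # modules not following the practice

    chi2, p, dof, expected = chi2_contingency(table)
    print("chi2 = %.2f, dof = %d, p = %.4f" % (chi2, dof, p))
    if p < 0.05:
        print("practice and fault rate appear associated in this sample")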
Holt, Jerred; Bennett, Kevin B; Flach, John M
2015-01-01
Two sets of design principles for analogical visual displays, based on the concepts of emergent features and perceptual objects, are described. An interpretation of previous empirical findings for three displays (bar graph, polar graphic, alphanumeric) is provided from both perspectives. A fourth display (configural coordinate) was designed using principles of ecological interface design (i.e. direct perception). An experiment was conducted to evaluate performance (accuracy and latency of state identification) with these four displays. Numerous significant effects were obtained and a clear rank ordering of performance emerged (from best to worst): configural coordinate, bar graph, alphanumeric and polar graphic. These findings are consistent with principles of design based on emergent features; they are inconsistent with principles based on perceptual objects. Some limitations of the configural coordinate display are discussed and a redesign is provided. Practitioner Summary: Principles of ecological interface design, which emphasise the quality of very specific mappings between domain, display and observer constraints, are described; these principles are applicable to the design of all analogical graphical displays.
Software for simulation of a computed tomography imaging spectrometer using optical design software
NASA Astrophysics Data System (ADS)
Spuhler, Peter T.; Willer, Mark R.; Volin, Curtis E.; Descour, Michael R.; Dereniak, Eustace L.
2000-11-01
Our imaging spectrometer simulation software, known under the name Eikon, should improve and speed up the design of a Computed Tomography Imaging Spectrometer (CTIS). Eikon uses existing ray-tracing software to simulate a virtual instrument. Eikon enables designers to virtually run through the design, calibration and data acquisition, saving significant cost and time when designing an instrument. We anticipate that Eikon simulations will improve future designs of CTIS by allowing engineers to explore more instrument options.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert
2005-01-01
The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.
NASA Technical Reports Server (NTRS)
Tracz, Will
1990-01-01
Viewgraphs are presented on the designing of software for reuse. Topics include terminology, software reuse maxims, the science of programming, an interface design example, a modularization example, and reuse and implementation guidelines.
PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.
Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter
2016-04-01
Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface, and at the same time, achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was on average, over 14 test problems, 30 times faster in PopED lite compared to an already existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss proposed design, test another design, etc). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
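The core computation, evaluating a candidate design through a function of the Fisher Information Matrix, can be sketched for a one-compartment PK model with D-optimality (the log-determinant of the FIM); the model, parameter values, and candidate designs below are illustrative assumptions, not PopED lite's implementation.

    # Sketch: rank candidate sampling-time designs for a one-compartment model
    #   C(t) = (dose / V) * exp(-k * t)
    # by D-optimality, i.e. log det of the Fisher Information Matrix (FIM)
    # under additive noise.  Model, parameters and designs are assumed.
    import numpy as np

    dose, V, k, sigma = 100.0, 10.0, 0.3, 0.5

    def jacobian(times):
        # Analytic derivatives of C(t) with respect to (V, k).
        t = np.asarray(times, dtype=float)
        c = dose / V * np.exp(-k * t)
        dC_dV = -c / V
        dC_dk = -t * c
        return np.column_stack([dC_dV, dC_dk])

    def log_det_fim(times):
        J = jacobian(times)
        fim = J.T @ J / sigma**2
        sign, logdet = np.linalg.slogdet(fim)
        return logdet if sign > 0 else -np.inf

    designs = {"early": [0.5, 1, 2], "spread": [0.5, 3, 8], "late": [6, 8, 10]}
    for name, t in designs.items():
        print(name, "log det FIM = %.2f" % log_det_fim(t))
    best = max(designs, key=lambda name: log_det_fim(designs[name]))
    print("D-optimal choice among candidates:", best)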
NASA Astrophysics Data System (ADS)
Wentzcovitch, R. M.; Da Silveira, P. R.; Wu, Z.; Yu, Y.
2013-12-01
Today first-principles calculations in mineral physics play a fundamental role in our understanding of the Earth. They complement experiments by expanding the pressure and temperature range over which properties can be obtained and provide access to atomic-scale phenomena. Since the wealth of predictive first-principles results can hardly be communicated in printed form, we have developed online applications where published results can be reproduced/verified online and extensive unpublished results can be generated in customized form. So far these applications have included thermodynamic properties of end-member phases and thermal elastic properties of end-member phases and a few solid solutions. Extension of this software infrastructure to include other properties is in principle straightforward. This contribution will review the nature of the results that can be generated (methods, thermodynamic domain, list of minerals, properties, etc.) and the nature of the software infrastructure. These applications are part of a more extensive cyberinfrastructure operating in XSEDE - the VLab Science Gateway [1]. [1] https://www.xsede.org/web/guest/gateways-listing Research supported by NSF grants ATM-0428744 and EAR-1047629.
Meslin, Eric M; Schwartz, Peter H
2015-01-01
Ethics should guide the design of electronic health records (EHR), and recognized principles of bioethics can play an important role. This approach was recently adopted by a team of informaticists who are designing and testing a system where patients exert granular control over who views their personal health information. While this method of building ethics in from the start of the design process has significant benefits, questions remain about how useful the application of bioethics principles can be in this process, especially when principles conflict. For instance, while the ethical principle of respect for autonomy supports a robust system of granular control, the principles of beneficence and nonmaleficence counsel restraint due to the danger of patients being harmed by restrictions on provider access to data. Conflict between principles has long been recognized by ethicists and has even motivated attacks on approaches that state and apply principles. In this paper, we show how using ethical principles can help in the design of EHRs by first explaining how ethical principles can and should be used generally, and then by discussing how attention to details in specific cases can show that the tension between principles is not as bad as it initially appeared. We conclude by suggesting ways in which the application of these (and other) principles can add value to the ongoing discussion of patient involvement in their health care. This is a new approach to linking principles to informatics design that we expect will stimulate further interest.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, Huaying, E-mail: zhaoh3@mail.nih.gov; Schuck, Peter, E-mail: zhaoh3@mail.nih.gov
2015-01-01
Global multi-method analysis for protein interactions (GMMA) can increase the precision and complexity of binding studies for the determination of the stoichiometry, affinity and cooperativity of multi-site interactions. The principles and recent developments of biophysical solution methods implemented for GMMA in the software SEDPHAT are reviewed, their complementarity in GMMA is described and a new GMMA simulation tool set in SEDPHAT is presented. Reversible macromolecular interactions are ubiquitous in signal transduction pathways, often forming dynamic multi-protein complexes with three or more components. Multivalent binding and cooperativity in these complexes are often key motifs of their biological mechanisms. Traditional solution biophysical techniques for characterizing the binding and cooperativity are very limited in the number of states that can be resolved. A global multi-method analysis (GMMA) approach has recently been introduced that can leverage the strengths and the different observables of different techniques to improve the accuracy of the resulting binding parameters and to facilitate the study of multi-component systems and multi-site interactions. Here, GMMA is described in the software SEDPHAT for the analysis of data from isothermal titration calorimetry, surface plasmon resonance or other biosensing, analytical ultracentrifugation, fluorescence anisotropy and various other spectroscopic and thermodynamic techniques. The basic principles of these techniques are reviewed and recent advances in view of their particular strengths in the context of GMMA are described. Furthermore, a new feature in SEDPHAT is introduced for the simulation of multi-method data. In combination with specific statistical tools for GMMA in SEDPHAT, simulations can be a valuable step in the experimental design.
NASA Astrophysics Data System (ADS)
Kouznetsov, A.; Cully, C. M.; Knudsen, D. J.
2016-12-01
Changes in D-region ionization caused by energetic particle precipitation are monitored by the Array for Broadband Observations of VLF/ELF Emissions (ABOVE) - a network of receivers deployed across Western Canada. The observed amplitudes and phases of subionospherically propagating VLF signals from distant artificial transmitters depend sensitively on the free electron population created by precipitation of energetic charged particles. Those include both primary (electrons, protons and heavier ions) and secondary (cascades of ionized particles and electromagnetic radiation) components. We have designed and implemented a full-scale model to predict the received VLF signals based on first-principles charged particle transport calculations coupled to the Long Wavelength Propagation Capability (LWPC) software. Calculations of ionization rates and free electron densities are based on the MCNP-6 (general-purpose Monte Carlo N-Particle) software, taking advantage of its capability for coupled neutron/photon/electron transport and its novel library of cross-sections for low-energy electron and photon interactions with matter. Cosmic-ray calculations of background ionization are based on source spectra obtained both from direct PAMELA cosmic-ray spectrum measurements and from the recently implemented MCNP 6 galactic cosmic-ray source, scaled using our (Calgary) neutron monitor measurement results. Conversion from calculated fluxes (MCNP F4 tallies) to ionization rates for low-energy electrons is based on the total ionization cross-sections for oxygen and nitrogen molecules from the National Institute of Standards and Technology. We use our model to explore the complexity of the physical processes affecting VLF propagation.
LG based decision aid for naval tactical action officer's (TAO) workstation
NASA Astrophysics Data System (ADS)
Stilman, Boris; Yakhnis, Vladimir; Umanskiy, Oleg; Boyd, Ron
2005-05-01
In the increasingly NetCentric battlespace of the 21st century, Stilman Advanced Strategies Linguistic Geometry software has the potential to revolutionize the way that the Navy fights in two key areas: as a Tactical Decision Aid and for creating a relevant Common Operating Picture. Incorporating STILMAN's software into a prototype Tactical Action Officers (TAO) workstation as a Tactical Decision Aid (TDA) will allow warfighters to manage their assets more intelligently and effectively. This prototype workstation will be developed using human-centered design principles and will be an open, component-based architecture for combat control systems for future small surface combatants. It will integrate both uninhabited vehicles and onboard sensors and weapon systems across a squadron of small surface combatants. In addition, the hypergame representation of complex operations provides a paradigm for the presentation of a common operating picture to operators and personnel throughout the command hierarchy. In the hypergame technology there are game levels that span the range from the tactical to the global strategy level, with each level informing the others. This same principle will be applied to presenting the relevant common operating picture to operators. Each operator will receive a common operating picture that is appropriate for their level in the command hierarchy. The area covered by this operating picture and the level of detail contained within it will be dependent upon the specific tasks the operator is performing (supervisory vice tactical control) and the level of the operator (or command personnel) within the command hierarchy. Each level will inform the others to keep the picture concurrent and up-to-date.
ClassCompass: A Software Design Mentoring System
ERIC Educational Resources Information Center
Coelho, Wesley; Murphy, Gail
2007-01-01
Becoming a quality software developer requires practice under the guidance of an expert mentor. Unfortunately, in most academic environments, there are not enough experts to provide any significant design mentoring for software engineering students. To address this problem, we present a collaborative software design tool intended to maximize an…
78 FR 32988 - Core Principles and Other Requirements for Designated Contract Markets; Correction
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-03
... COMMODITY FUTURES TRADING COMMISSION 17 CFR Part 38 RIN 3038-AD09 Core Principles and Other... regarding Core Principles and Other Requirements for Designated Contract Markets by inserting a missing... regarding Core Principles and Other Requirements for Designated Contract Markets (77 FR 36612, June 19, 2012...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.
2012-01-15
The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.
Stiffness Parameter Design of Suspension Element of Under-Chassis-Equipment for A Rail Vehicle
NASA Astrophysics Data System (ADS)
Ma, Menglin; Wang, Chengqiang; Deng, Hai
2017-06-01
According to the frequency configuration requirements for the vibration of railway under-chassis equipment, the three-dimensional stiffness of the suspension elements of the under-chassis equipment is designed based on both the static principle and the dynamic principle. The design results for a concrete engineering case show that, compared with the design method based on the static principle, the three-dimensional stiffness of the suspension elements designed by the dynamic-principle method is more uniform. The frequency and decoupling-degree analysis shows that the calculated frequency of the under-chassis equipment under the two design methods is basically the same as the predetermined frequency. Compared with the static-principle design method, the dynamic-principle design method keeps the decoupling degree high and effectively reduces the coupling vibration of the corresponding vibration modes, which can effectively reduce fatigue damage to the key parts of the suspension elements.
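The frequency check described above follows from the basic rigid-body relation f = (1/2π)·sqrt(k/m) per direction; the sketch below picks per-mount stiffnesses to hit target suspension frequencies for a given equipment mass, with all numbers invented for illustration.

    # Sketch: choose per-direction suspension stiffness so that the rigid-body
    # frequency f = (1 / (2*pi)) * sqrt(k_total / m) matches a target.  The
    # target frequencies and equipment mass are invented for illustration.
    import math

    m = 600.0                      # under-chassis equipment mass, kg
    n_mounts = 4                   # suspension elements sharing the load
    targets_hz = {"vertical": 8.0, "lateral": 6.0, "longitudinal": 6.0}

    for direction, f in targets_hz.items():
        k_total = m * (2 * math.pi * f) ** 2          # N/m for the whole equipment
        k_per_mount = k_total / n_mounts
        print("%-12s f = %.1f Hz -> k per mount = %.0f kN/m"
              % (direction, f, k_per_mount / 1000.0))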
Design Principles for the Information Architecture of a SMET Education Digital Library.
ERIC Educational Resources Information Center
Dong, Andy; Agogino, Alice M.
This implementation paper introduces principles for the information architecture of an educational digital library, principles that address the distinction between designing digital libraries for education and designing digital libraries for information retrieval in general. Design is a key element of any successful product. Good designers and…
A Mechanized Decision Support System for Academic Scheduling.
1986-03-01
…an operational system called software. The first step in the development phase is design. Designers distribute software control by factoring the data… Subject terms: scheduling, decision support system, software design. …scheduling system. It will also examine software design techniques to identify the most appropriate methodology for this problem. Chapter 3 will…
2006-12-01
Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed, by Blake D. Eikenberry… Engineer Degree… Approved for public release; distribution is unlimited.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1987-01-01
The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.
Schüller, Andreas; Suhartono, Marcel; Fechner, Uli; Tanrikulu, Yusuf; Breitung, Sven; Scheffer, Ute; Göbel, Michael W; Schneider, Gisbert
2008-02-01
Principles of fragment-based molecular design are presented and discussed in the context of de novo drug design. The underlying idea is to dissect known drug molecules in fragments by straightforward pseudo-retro-synthesis. The resulting building blocks are then used for automated assembly of new molecules. A particular question has been whether this approach is actually able to perform scaffold-hopping. A prospective case study illustrates the usefulness of fragment-based de novo design for finding new scaffolds. We were able to identify a novel ligand disrupting the interaction between the Tat peptide and TAR RNA, which is part of the human immunodeficiency virus (HIV-1) mRNA. Using a single template structure (acetylpromazine) as reference molecule and a topological pharmacophore descriptor (CATS), new chemotypes were automatically generated by our de novo design software Flux. Flux features an evolutionary algorithm for fragment-based compound assembly and optimization. Pharmacophore superimposition and docking into the target RNA suggest perfect matching between the template molecule and the designed compound. Chemical synthesis was straightforward, and bioactivity of the designed molecule was confirmed in a FRET assay. This study demonstrates the practicability of de novo design to generating RNA ligands containing novel molecular scaffolds.
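A heavily simplified view of evolutionary fragment-based assembly is sketched below: candidates are fragment lists, fitness is the similarity of a crude descriptor vector to the template's, and variation is by mutation from a fragment library; this is a generic GA skeleton under assumed data, not the Flux software or the CATS descriptor.

    # Generic evolutionary skeleton for fragment-based assembly.  The fragment
    # library, descriptor and template values are hypothetical illustrations.
    import random

    FRAGMENTS = {            # name -> (rings, H-bond donors, H-bond acceptors)
        "phenyl": (1, 0, 0), "piperazine": (1, 1, 2), "amide": (0, 1, 1),
        "pyridine": (1, 0, 1), "methyl": (0, 0, 0), "carboxyl": (0, 1, 2),
    }
    TEMPLATE = (3, 1, 2)     # descriptor of the reference molecule (assumed)

    def descriptor(frags):
        return tuple(sum(FRAGMENTS[f][i] for f in frags) for i in range(3))

    def fitness(frags):
        d = descriptor(frags)
        return -sum((a - b) ** 2 for a, b in zip(d, TEMPLATE))   # higher is better

    def mutate(frags):
        child = list(frags)
        child[random.randrange(len(child))] = random.choice(list(FRAGMENTS))
        return child

    random.seed(1)
    population = [[random.choice(list(FRAGMENTS)) for _ in range(3)] for _ in range(20)]
    for _ in range(50):
        population.sort(key=fitness, reverse=True)
        parents = population[:10]
        population = parents + [mutate(random.choice(parents)) for _ in range(10)]

    best = max(population, key=fitness)
    print("best fragment set:", best, "descriptor:", descriptor(best))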
The Development of a Fiber Optic Raman Temperature Measurement System for Rocket Flows
NASA Technical Reports Server (NTRS)
Degroot, Wim A.
1992-01-01
A fiber-optic Raman diagnostic system for H2/O2 rocket flows is currently under development. This system is designed for measurement of temperature and major species concentrations in the combustion chamber and part of the nozzle of a 100 Newton thrust rocket currently undergoing testing. This paper describes a measurement system based on the spontaneous Raman scattering phenomenon. An analysis of the principles behind the technique is given. Software is developed to measure temperature and major species concentrations by comparing theoretical Raman scattering spectra with experimentally obtained spectra. Equipment selection and the experimental approach are summarized. This experimental program is part of an ongoing effort to evaluate Navier-Stokes based analyses for this class of rocket.
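The temperature-fitting step, comparing theoretical spectra with a measured one, can be sketched as a least-squares scan over a temperature grid; the four-line Boltzmann toy "spectrum" below stands in for a real Raman spectral model and all values are purely illustrative.

    # Sketch: recover temperature by least-squares comparison of a measured
    # spectrum against theoretical spectra on a temperature grid.  The Boltzmann
    # toy model below stands in for a real Raman spectral model.
    import math

    K_B = 1.380649e-23            # Boltzmann constant, J/K
    ENERGIES = [0.0, 6.0e-21, 1.5e-20, 2.8e-20]   # line energies, J (assumed)

    def theoretical_spectrum(temperature):
        weights = [math.exp(-e / (K_B * temperature)) for e in ENERGIES]
        total = sum(weights)
        return [w / total for w in weights]

    def fit_temperature(measured, t_grid):
        def sse(t):
            model = theoretical_spectrum(t)
            return sum((m - s) ** 2 for m, s in zip(measured, model))
        return min(t_grid, key=sse)

    true_T = 2400.0               # K, a combustion-like temperature (assumed)
    measured = theoretical_spectrum(true_T)       # pretend this came from the detector
    grid = range(300, 4001, 10)
    print("fitted temperature: %d K" % fit_temperature(measured, grid))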
Magnetic field simulation and shimming analysis of 3.0T superconducting MRI system
NASA Astrophysics Data System (ADS)
Yue, Z. K.; Liu, Z. Z.; Tang, G. S.; Zhang, X. C.; Duan, L. J.; Liu, W. C.
2018-04-01
The 3.0T superconducting magnetic resonance imaging (MRI) system has become the mainstream of modern clinical MRI because of its high field intensity and high degree of uniformity and stability, and it has broad prospects in scientific research and other fields. We analyze the principles of magnet design in this paper. We also perform the magnetic field simulation and shimming analysis of the first 3.0T/850 superconducting MRI system in the world using the Ansoft Maxwell simulation software. We guide the production and optimization of the prototype based on the results of the simulation analysis, so that the magnetic field strength, magnetic field uniformity and magnetic field stability of the prototype achieve the expected targets.
Improving performance with knowledge management
NASA Astrophysics Data System (ADS)
Kim, Sangchul
2018-06-01
People and organizations are often unable to easily locate their experience and knowledge, so meaningful data is usually fragmented, unstructured, out of date and largely incomplete. Poor knowledge management (KM) leaves a company vulnerable to its knowledge base - or intellectual capital - walking out of the door each year, at a rate estimated at a minimum of 10%. Knowledge management (KM) can be defined as an emerging set of organizational design and operational principles, processes, organizational structures, applications and technologies that helps knowledge workers dramatically leverage their creativity and ability to deliver business value and, finally, to reap a competitive advantage. This paper proposes various methods and software, starting with an understanding of the enterprise aspect, and offers inspiration to those who want to use KM.
Energy Systems Integration News | Energy Systems Integration Facility
…DOE-funded research projects that are integrating cybersecurity controls with power systems principles… Management, a hardware and software system that mimics the communications, power systems, and cybersecurity…
Vilardaga, Roger; Rizo, Javier; Zeng, Emily; Kientz, Julie A; Ries, Richard; Otis, Chad; Hernandez, Kayla
2018-01-16
Smoking rates in the United States have been reduced in the past decades to 15% of the general population. However, up to 88% of people with psychiatric symptoms still smoke, leading to high rates of disease and mortality. Therefore, there is a great need to develop smoking cessation interventions that have adequate levels of usability and can reach this population. The objective of this study was to report the rationale, ideation, design, user research, and final specifications of a novel smoking cessation app for people with serious mental illness (SMI) that will be tested in a feasibility trial. We used a variety of user-centered design methods and materials to develop the tailored smoking cessation app. This included expert panel guidance, a set of design principles and theory-based smoking cessation content, development of personas and paper prototyping, usability testing of the app prototype, establishment of app's core vision and design specification, and collaboration with a software development company. We developed Learn to Quit, a smoking cessation app designed and tailored to individuals with SMI that incorporates the following: (1) evidence-based smoking cessation content from Acceptance and Commitment Therapy and US Clinical Practice Guidelines for smoking cessation aimed at providing skills for quitting while addressing mental health symptoms, (2) a set of behavioral principles to increase retention and comprehension of smoking cessation content, (3) a gamification component to encourage and sustain app engagement during a 14-day period, (4) an app structure and layout designed to minimize usability errors in people with SMI, and (5) a set of stories and visuals that communicate smoking cessation concepts and skills in simple terms. Despite its increasing importance, the design and development of mHealth technology is typically underreported, hampering scientific innovation. This report describes the systematic development of the first smoking cessation app tailored to people with SMI, a population with very high rates of nicotine addiction, and offers new design strategies to engage this population. mHealth developers in smoking cessation and related fields could benefit from a design strategy that capitalizes on the role visual engagement, storytelling, and the systematic application of behavior analytic principles to deliver evidence-based content. ©Roger Vilardaga, Javier Rizo, Emily Zeng, Julie A Kientz, Richard Ries, Chad Otis, Kayla Hernandez. Originally published in JMIR Serious Games (http://games.jmir.org), 16.01.2018.
Contingency theoretic methodology for agent-based web-oriented manufacturing systems
NASA Astrophysics Data System (ADS)
Durrett, John R.; Burnell, Lisa J.; Priest, John W.
2000-12-01
The development of distributed, agent-based, web-oriented, N-tier Information Systems (IS) must be supported by a design methodology capable of responding to the convergence of shifts in business process design, organizational structure, computing, and telecommunications infrastructures. We introduce a contingency theoretic model for the use of open, ubiquitous software infrastructure in the design of flexible organizational IS. Our basic premise is that developers should change in the way they view the software design process from a view toward the solution of a problem to one of the dynamic creation of teams of software components. We postulate that developing effective, efficient, flexible, component-based distributed software requires reconceptualizing the current development model. The basic concepts of distributed software design are merged with the environment-causes-structure relationship from contingency theory; the task-uncertainty of organizational- information-processing relationships from information processing theory; and the concept of inter-process dependencies from coordination theory. Software processes are considered as employees, groups of processes as software teams, and distributed systems as software organizations. Design techniques already used in the design of flexible business processes and well researched in the domain of the organizational sciences are presented. Guidelines that can be utilized in the creation of component-based distributed software will be discussed.
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
The TAME Project: Towards improvement-oriented software environments
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Rombach, H. Dieter
1988-01-01
Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.
Multidisciplinary Concurrent Design Optimization via the Internet
NASA Technical Reports Server (NTRS)
Woodard, Stanley E.; Kelkar, Atul G.; Koganti, Gopichand
2001-01-01
A methodology is presented which uses commercial design and analysis software and the Internet to perform concurrent multidisciplinary optimization. The methodology provides a means to develop multidisciplinary designs without requiring that all software be accessible from the same local network. The procedures are amenable to design and development teams whose members, expertise and respective software are not geographically located together. This methodology facilitates multidisciplinary teams working concurrently on a design problem of common interest. Partition of design software to different machines allows each constituent software to be used on the machine that provides the most economy and efficiency. The methodology is demonstrated on the concurrent design of a spacecraft structure and attitude control system. Results are compared to those derived from performing the design with an autonomous FORTRAN program.
AEDT Software Requirements Documents - Draft
DOT National Transportation Integrated Search
2007-01-25
This software requirements document serves as the basis for designing and testing the Aviation Environmental Design Tool (AEDT) software. The intended audience for this document consists of the following groups: the AEDT designers, developers, and te...
Design of ocular for optical sight with long exit pupil distance
NASA Astrophysics Data System (ADS)
Zhu, Zhongyao; Li, Yuyao; Tian, Ailing
2017-02-01
To prevent recoil injury to shooters using an optical sight on artillery or firearms, and to address the usage problems posed by shooters' eye masks, headbands and gas masks, an ocular with a long exit pupil distance has been designed for the optical sighting system. The optical properties and aberration characteristics of an ocular with long exit pupil distance are analyzed, and a structure with positive-positive-negative three lens groups is put forward. According to aberration theory and the isoplanatic imaging principle, the focal power assignment expression is deduced by an analytical method. Using the optical design software ZEMAX, the ocular with long exit pupil distance has been designed: the focal length of the system is 20 mm, the exit pupil diameter is 4 mm, the field angle is 40°, the exit pupil distance is 41 mm, and the relative eye relief is greater than 2. The design results show that with this method the transfer functions of each field are all greater than 0.15 at 45 lp/mm, which meets the use requirements of visual optical instruments.
Research-oriented teaching in optical design course and its function in education
NASA Astrophysics Data System (ADS)
Cen, Zhaofeng; Li, Xiaotong; Liu, Xiangdong; Deng, Shitao
2008-03-01
The principles and operation plans of research-oriented teaching in the course of computer-aided optical design are presented, especially the mode of research in the practice course. The program includes a contract definition phase, project organization and execution, and post-project evaluation and discussion. Modes of academic organization are used in the practice course of computer-aided optical design. In this course the students complete their design projects in research teams through an autonomous group approach and cooperative exploration. In this research process they experience interpersonal relationships in modern society, the importance of cooperation in a team, the functions of each individual, the relationships between team members, and the competition and cooperation within one academic group and with other groups, and come to know themselves objectively. In the design practice, knowledge from many academic fields is applied, including applied optics, computer programming, engineering software, etc. This interdisciplinary character is very useful for academic research and prepares the students for innovation by integrating knowledge from interdisciplinary fields. As practice has shown, this teaching mode plays a very important part in developing abilities in engineering, cooperation, digesting knowledge at a high level, and problem analysis and solving.
Next Gen One Portal Usability Evaluation
NASA Technical Reports Server (NTRS)
Cross, E. V., III; Perera, J. S.; Hanson, A. M.; English, K.; Vu, L.; Amonette, W.
2018-01-01
Each exercise device on the International Space Station (ISS) has a unique, customized software system interface with unique layouts / hierarchy, and operational principles that require significant crew training. Furthermore, the software programs are not adaptable and provide no real-time feedback or motivation to enhance the exercise experience and/or prevent injuries. Additionally, the graphical user interfaces (GUI) of these systems present information through multiple layers resulting in difficulty navigating to the desired screens and functions. These limitations of current exercise device GUI's lead to increased crew time spent on initiating, loading, performing exercises, logging data and exiting the system. To address these limitations a Next Generation One Portal (NextGen One Portal) Crew Countermeasure System (CMS) was developed, which utilizes the latest industry guidelines in GUI designs to provide an intuitive ease of use approach (i.e., 80% of the functionality gained within 5-10 minutes of initial use without/limited formal training required). This is accomplished by providing a consistent interface using common software to reduce crew training, increase efficiency & user satisfaction while also reducing development & maintenance costs. Results from the usability evaluations showed the NextGen One Portal UI having greater efficiency, learnability, memorability, usability and overall user experience than the current Advanced Resistive Exercise Device (ARED) UI used by astronauts on ISS. Specifically, the design of the One-Portal UI as an app interface similar to those found on the Apple and Google's App Store, assisted many of the participants in grasping the concepts of the interface with minimum training. Although the NextGen One-Portal UI was shown to be an overall better interface, observations by the test facilitators noted specific exercise tasks appeared to have a significant impact on the NextGen One-Portal UI efficiency. Future updates to the NextGen One Portal UI will address these inefficiencies.
Design principles for engaging and retaining virtual citizen scientists.
Wald, Dara M; Longo, Justin; Dobell, A R
2016-06-01
Citizen science initiatives encourage volunteer participants to collect and interpret data and contribute to formal scientific projects. The growth of virtual citizen science (VCS), facilitated through websites and mobile applications since the mid-2000s, has been driven by a combination of software innovations and mobile technologies, growing scientific data flows without commensurate increases in resources to handle them, and the desire of internet-connected participants to contribute to collective outputs. However, the increasing availability of internet-based activities requires individual VCS projects to compete for the attention of volunteers and promote their long-term retention. We examined program and platform design principles that might allow VCS initiatives to compete more effectively for volunteers, increase productivity of project participants, and retain contributors over time. We surveyed key personnel engaged in managing a sample of VCS projects to identify the principles and practices they pursued for these purposes and led a team in a heuristic evaluation of volunteer engagement, website or application usability, and participant retention. We received 40 completed survey responses (33% response rate) and completed a heuristic evaluation of 20 VCS program sites. The majority of the VCS programs focused on scientific outcomes, whereas the educational and social benefits of program participation, variables that are consistently ranked as important for volunteer engagement and retention, were incidental. Evaluators indicated usability, across most of the VCS program sites, was higher and less variable than the ratings for participant engagement and retention. In the context of growing competition for the attention of internet volunteers, increased attention to the motivations of virtual citizen scientists may help VCS programs sustain the necessary engagement and retention of their volunteers. © 2016 Society for Conservation Biology.
Reuseable Objects Software Environment (ROSE): Introduction to Air Force Software Reuse Workshop
NASA Technical Reports Server (NTRS)
Cottrell, William L.
1994-01-01
The Reusable Objects Software Environment (ROSE) is a common, consistent, consolidated implementation of software functionality using modern object oriented software engineering including designed-in reuse and adaptable requirements. ROSE is designed to minimize abstraction and reduce complexity. A planning model for the reverse engineering of selected objects through object oriented analysis is depicted. Dynamic and functional modeling are used to develop a system design, the object design, the language, and a database management system. The return on investment for a ROSE pilot program and timelines are charted.
Interface Specifications for the A-7E Shared Services Module.
1982-09-08
To illustrate the principles, the onboard software for the Navy’s A-7E aircraft will be redesigned and rewritten. The Shared Services module provides...purpose of the Shared Services module is to allow the remainder of the software to remain unchanged when the requirements-based rules for these values and...services change. This report describes the modular structure of the Shared Services module, and contains the abstract interface specifications for all
Information Interface Related Standards, Guidelines, and Recommended Practices.
1985-07-01
Application Workshop, IEEE, October 1984; (11) Software Portability and Standards, by Ingemar Dahlstrand, Ellis Horwood Ltd., 1984; (12) World of EDP...March 29, 1985, p. 42-48; 9. "IBM's Topview Plays to Poor Reviews of Early Users," Computer World, March 4, 1985, p. 5; 10. "Lack of Software Standards...Information Symbols; ISO/TR 7239-1984 - Development and Principles for Application of Public Information Symbols; ISO/TR 8545-1984 - Technical Drawings
A research review of quality assessment for software
NASA Technical Reports Server (NTRS)
1991-01-01
Measures were recommended to assess the quality of software submitted to the AdaNet program. The quality factors that are important to software reuse are explored and methods of evaluating those factors are discussed. Quality factors important to software reuse are: correctness, reliability, verifiability, understandability, modifiability, and certifiability. Certifiability is included because documented factors about a software component, such as its efficiency, portability, and development history, constitute a class of factors that are important to some users, not important at all to others, and impossible for AdaNet to distinguish between a priori. The quality factors may be assessed in different ways. There are a few quantitative measures which have been shown to indicate software quality. However, it is believed that there exist many factors that indicate quality but have not been empirically validated due to their subjective nature. These subjective factors are characterized by the way in which they support the software engineering principles of abstraction, information hiding, modularity, localization, confirmability, uniformity, and completeness.
Zhang, Fan; Briones, Andrea; Soloviev, Mikhail
2016-01-01
This chapter describes the principles of selection of antigenic peptides for the development of anti-peptide antibodies for use in microarray-based multiplex affinity assays and also with mass-spectrometry detection. The methods described here are mostly applicable to small- to medium-scale arrays. Although the same principles of peptide selection would be suitable for larger-scale arrays (with 100+ features), the actual informatics software and printing methods may well be different. Because of the sheer number of proteins/peptides to be processed and analyzed, dedicated software capable of processing all the proteins, together with enterprise-level array robotics, may be necessary for larger-scale efforts. This report aims to provide practical advice to those who develop or use arrays with up to ~100 different peptide or protein features.
Construction of the Dependence Matrix Based on the TRIZ Contradiction Matrix in OOD
NASA Astrophysics Data System (ADS)
Ma, Jianhong; Zhang, Quan; Wang, Yanling; Luo, Tao
In Object-Oriented software design (OOD), the design of classes and objects, the definition of class interfaces and inheritance levels, and the determination of dependency relations have a serious impact on the reusability and flexibility of the system. How to select the right solution for a concrete design problem from among hundreds of design schemas has become a focus of attention for designers. After analyzing many practical software design schemas and Object-Oriented design patterns, this paper constructs a dependence matrix for the Object-Oriented software design field, modeled on the contradiction matrix of TRIZ (Theory of Inventive Problem Solving) proposed by the Soviet inventor and innovation theorist Altshuller. As practice indicates, it provides an intuitive, common, and standardized method for designers to choose the right design schema, makes research and communication more effective, and also improves software development efficiency and software quality.
Toward a Formal Model of the Design and Evolution of Software
1988-12-20
It should have the flexibility to support a variety of design methodologies, be comprehensive enough to encompass the gamut of software lifecycle activities, and be precise enough to provide the...
SynGenics Optimization System (SynOptSys)
NASA Technical Reports Server (NTRS)
Ventresca, Carol; McMilan, Michelle L.; Globus, Stephanie
2013-01-01
The SynGenics Optimization System (SynOptSys) software application optimizes a product with respect to multiple, competing criteria using statistical Design of Experiments, Response-Surface Methodology, and the Desirability Optimization Methodology. The user is not required to be skilled in the underlying math; thus, SynOptSys can help designers and product developers overcome the barriers that prevent them from using powerful techniques to develop better products in a less costly manner. SynOptSys is applicable to the design of any product or process with multiple criteria to meet, and at least two factors that influence achievement of those criteria. The user begins with a selected solution principle or system concept and a set of criteria that needs to be satisfied. The criteria may be expressed in terms of documented desirements or defined responses that the future system needs to achieve. Documented desirements can be imported into SynOptSys or created and documented directly within SynOptSys. Subsequent steps include identifying factors, specifying model order for each response, designing the experiment, running the experiment and gathering the data, analyzing the results, and determining the specifications for the optimized system. The user may also enter textual information as the project progresses. Data is easily edited within SynOptSys, and the software design enables full traceability within any step in the process, and facilitates reporting as needed. SynOptSys is unique in the way responses are defined and the nuances of the goodness associated with changes in response values for each of the responses of interest. The Desirability Optimization Methodology provides the basis of this novel feature. Moreover, this is a complete, guided design and optimization process tool with embedded math that can remain invisible to the user. It is not a standalone statistical program; it is a design and optimization system.
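The Desirability Optimization Methodology referenced above has a well-known general form: each response is mapped to an individual desirability in [0, 1] and the individual values are combined with a geometric mean. The sketch below illustrates only that general idea; SynOptSys's internal implementation is not public, and the function names, bounds, and response values here are illustrative assumptions.

```python
import numpy as np

def desirability_larger_is_better(y, lo, hi, weight=1.0):
    """Map a response to [0, 1]: 0 at/below lo, 1 at/above hi (Derringer-Suich style)."""
    return np.clip((y - lo) / (hi - lo), 0.0, 1.0) ** weight

def overall_desirability(desirabilities):
    """Combine individual desirabilities with a geometric mean."""
    d = np.asarray(desirabilities, dtype=float)
    return float(np.prod(d) ** (1.0 / len(d)))

# Illustrative example: two competing responses predicted by fitted response-surface models.
strength = 42.0   # hypothetical "larger is better" response
cost = 7.5        # hypothetical "smaller is better" response (negate to reuse the same mapping)
d1 = desirability_larger_is_better(strength, lo=30.0, hi=50.0)
d2 = desirability_larger_is_better(-cost, lo=-10.0, hi=-5.0)
print(overall_desirability([d1, d2]))  # value in [0, 1]; an optimizer would maximize this
```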
[Research progress of probe design software of oligonucleotide microarrays].
Chen, Xi; Wu, Zaoquan; Liu, Zhengchun
2014-02-01
The DNA microarray has become an essential medical genetic diagnostic tool owing to its high throughput, miniaturization, and automation. The design and selection of oligonucleotide probes are critical for preparing high-quality gene chips. Several probe design software packages have been developed and are now available to perform this work. Each package targets different sequences and shows different advantages and limitations. In this article, the research and development of these packages are reviewed against three main criteria: specificity, sensitivity, and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review should help users choose an appropriate probe design package; it should also reduce the cost of microarrays, improve their application efficiency, and promote both the research and development (R&D) and the commercialization of high-performance probe design software.
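Melting temperature is one of the three screening criteria named in this review. As a rough illustration of how a probe-design pipeline might pre-screen candidate oligos, the sketch below uses two textbook first-pass approximations (the Wallace rule for short oligos and a basic GC-content formula for longer ones); it is not the algorithm of any specific package reviewed, and production tools typically use more accurate nearest-neighbor thermodynamic models.

```python
def melting_temperature(seq: str) -> float:
    """Rough Tm estimate for a DNA oligo, in degrees C.

    Uses the Wallace rule (2 C per A/T, 4 C per G/C) for oligos up to 13 nt,
    and the basic GC-content approximation 64.9 + 41*(GC - 16.4)/N otherwise.
    Both are first-pass textbook estimates, not nearest-neighbor thermodynamics.
    """
    s = seq.upper()
    at = s.count("A") + s.count("T")
    gc = s.count("G") + s.count("C")
    n = at + gc
    if n == 0:
        raise ValueError("empty sequence")
    if n <= 13:
        return 2.0 * at + 4.0 * gc
    return 64.9 + 41.0 * (gc - 16.4) / n

print(melting_temperature("ATGCATGCATGC"))             # short oligo: Wallace rule
print(melting_temperature("ATGCATGCATGCATGCATGCAT"))   # longer oligo: GC%-based estimate
```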
SEPAC flight software detailed design specifications, volume 1
NASA Technical Reports Server (NTRS)
1982-01-01
The detailed design specifications (as built) for the SEPAC Flight Software are defined. The design includes a description of the total software system and of each individual module within the system. The design specifications describe the decomposition of the software system into its major components. The system structure is expressed in the following forms: the control-flow hierarchy of the system, the data-flow structure of the system, the task hierarchy, the memory structure, and the software to hardware configuration mapping. The component design description includes details on the following elements: register conventions, module (subroutine) invocation, module functions, interrupt servicing, data definitions, and database structure.
Image enhancement software for underwater recovery operations: User's manual
NASA Astrophysics Data System (ADS)
Partridge, William J.; Therrien, Charles W.
1989-06-01
This report describes software for performing image enhancement on live or recorded video images. The software was developed for operational use during underwater recovery operations at the Naval Undersea Warfare Engineering Station. The image processing is performed on an IBM-PC/AT compatible computer equipped with hardware to digitize and display video images. The software provides the capability to perform contrast enhancement and similar functions in real time through hardware lookup tables, to automatically perform histogram equalization, and to capture one or more frames and either average them or apply one of several processing algorithms to a captured frame. The report is in the form of a user manual for the software and includes guided tutorial and reference sections. A Digital Image Processing Primer in the appendix explains the principal concepts used in the image processing.
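Histogram equalization, mentioned above, reduces to building a lookup table from the cumulative histogram of a frame, which is also how it maps naturally onto the hardware lookup tables the report describes. The NumPy sketch below shows that step for an 8-bit grayscale frame; it is a generic illustration, not code from the Naval Undersea Warfare Engineering Station software.

```python
import numpy as np

def equalization_lut(frame: np.ndarray) -> np.ndarray:
    """Build a 256-entry lookup table that equalizes an 8-bit grayscale frame."""
    hist = np.bincount(frame.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]                     # first nonzero CDF value
    if cdf[-1] == cdf_min:                        # flat image: return identity LUT
        return np.arange(256, dtype=np.uint8)
    lut = np.round((cdf - cdf_min) / float(cdf[-1] - cdf_min) * 255.0)
    return np.clip(lut, 0, 255).astype(np.uint8)

# Applying the LUT to a frame is exactly what a hardware lookup table would do.
frame = (np.random.rand(480, 640) * 128).astype(np.uint8)   # synthetic low-contrast frame
equalized = equalization_lut(frame)[frame]
```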
Capi text V.1--data analysis software for nailfold skin capillaroscopy.
Dobrev, Hristo P
2007-01-01
Nailfold skin capillaroscopy is a simple non-invasive method used to assess conditions of disturbed microcirculation such as Raynaud's phenomenon, acrocyanosis, perniones, connective tissue diseases, psoriasis, diabetes mellitus, neuropathy and vibration disease. The aim was to develop data analysis software to assist the documentation and analysis of a capillaroscopic investigation. SOFTWARE DESCRIPTION: The programme is based on a modular principle. The module "Nomenclatures" includes menus for the patients' data. The module "Examinations" includes menus for all general and specific aspects of the medical examination and capillaroscopic investigations. The modules "Settings" and "Information" include customization menus for the programme. The results of nailfold capillaroscopy can be printed in a short or expanded form. The software allows physicians to perform quick searches using various specified criteria and to prepare analyses and reports. This software programme will facilitate the work of any practitioner who performs nailfold skin capillaroscopy.
Architecture of a framework for providing information services for public transport.
García, Carmelo R; Pérez, Ricardo; Lorenzo, Alvaro; Quesada-Arencibia, Alexis; Alayón, Francisco; Padrón, Gabino
2012-01-01
This paper presents OnRoute, a framework for developing and running ubiquitous software that provides information services to passengers of public transportation, including payment systems and on-route guidance services. To achieve a high level of interoperability, accessibility and context awareness, OnRoute uses the ubiquitous computing paradigm. To guarantee the quality of the software produced, the reliable software principles used in critical contexts, such as automotive systems, are also considered by the framework. The main components of its architecture (run-time, system services, software components and development discipline) and how they are deployed in the transportation network (stations and vehicles) are described in this paper. Finally, to illustrate the use of OnRoute, the development of a guidance service for travellers is explained.
LUMA: A many-core, Fluid-Structure Interaction solver based on the Lattice-Boltzmann Method
NASA Astrophysics Data System (ADS)
Harwood, Adrian R. G.; O'Connor, Joseph; Sanchez Muñoz, Jonathan; Camps Santasmasas, Marta; Revell, Alistair J.
2018-01-01
The Lattice-Boltzmann Method at the University of Manchester (LUMA) project was commissioned to build a collaborative research environment in which researchers of all abilities can study fluid-structure interaction (FSI) problems in engineering applications from aerodynamics to medicine. It is built on the principles of accessibility, simplicity and flexibility. The LUMA software at the core of the project is a capable FSI solver with turbulence modelling and many-core scalability as well as a wealth of input/output and pre- and post-processing facilities. The software has been validated and several major releases benchmarked on supercomputing facilities internationally. The software architecture is modular and arranged logically using a minimal amount of object-orientation to maintain a simple and accessible software.
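LUMA itself is a C++ code with grid refinement, turbulence modelling, and FSI coupling; for readers unfamiliar with the underlying method, the short Python sketch below shows only the core single-grid D2Q9 BGK collide-and-stream update that lattice-Boltzmann solvers are built around. It is a generic illustration of the method, not LUMA source code, and the grid size and relaxation time are arbitrary.

```python
import numpy as np

# Minimal single-step D2Q9 BGK lattice-Boltzmann update with periodic boundaries.
W = np.array([4/9] + [1/9]*4 + [1/36]*4)                       # lattice weights
C = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],
              [1,1],[-1,1],[-1,-1],[1,-1]])                    # lattice velocities

def equilibrium(rho, u):
    cu = np.einsum('id,xyd->ixy', C, u)                        # c_i . u
    usq = np.einsum('xyd,xyd->xy', u, u)
    return W[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def step(f, tau=0.6):
    rho = f.sum(axis=0)                                        # macroscopic density
    u = np.einsum('id,ixy->xyd', C, f) / rho[..., None]        # macroscopic velocity
    f = f - (f - equilibrium(rho, u)) / tau                    # BGK collision
    for i, (cx, cy) in enumerate(C):                           # streaming
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    return f

nx, ny = 64, 32
f = equilibrium(np.ones((nx, ny)), np.zeros((nx, ny, 2)))      # uniform fluid at rest
f = step(f)
```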
Research on infrared small-target tracking technology under complex background
NASA Astrophysics Data System (ADS)
Liu, Lei; Wang, Xin; Chen, Jilu; Pan, Tao
2012-10-01
In this paper, the basic principles and implementation flow charts of a series of target tracking algorithms are described. On this foundation, moving-target tracking software based on OpenCV is developed on the MFC development platform. Several tracking algorithms are integrated in this software, including the Kalman filter tracking method and the Camshift tracking method. To explain the software clearly, its framework and functions are described. Finally, the implementation process and results are analyzed, and the tracking algorithms are evaluated both subjectively and objectively. This work is relevant to the application of infrared target tracking technology.
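Of the tracking methods named above, the Kalman filter is the most compact to illustrate. The sketch below configures OpenCV's Kalman filter for a constant-velocity point target and runs the usual per-frame predict/correct loop that a small-target tracker performs; the measurement values here are synthetic placeholders, the noise covariances are assumptions, and the detector that would supply real measurements is not shown.

```python
import numpy as np
import cv2

# Constant-velocity Kalman filter over state (x, y, vx, vy) with (x, y) measurements.
kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2       # assumed model noise
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1   # assumed detector noise

for frame_idx in range(100):
    prediction = kf.predict()                     # predicted (x, y, vx, vy) for this frame
    # In a real tracker the detector output for this frame would be used here;
    # we substitute synthetic (x, y) measurements moving along a straight line.
    measured = np.array([[float(frame_idx)], [0.5 * frame_idx]], dtype=np.float32)
    kf.correct(measured)                          # fuse the measurement into the state
```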
Khammarnia, Mohammad; Sharifian, Roxana; Zand, Farid; Keshtkaran, Ali; Barati, Omid
2016-09-01
This study aimed to identify the functional requirements of computerized provider order entry software and design this software in Iran. This study was conducted using review documentation, interview, and focus group discussions in Shiraz University of Medical Sciences, as the medical pole in Iran, in 2013-2015. The study sample consisted of physicians (n = 12) and nurses (n = 2) in the largest hospital in the southern part of Iran and information technology experts (n = 5) in Shiraz University of Medical Sciences. Functional requirements of the computerized provider order entry system were examined in three phases. Finally, the functional requirements were distributed in four levels, and accordingly, the computerized provider order entry software was designed. The software had seven main dimensions: (1) data entry, (2) drug interaction management system, (3) warning system, (4) treatment services, (5) ability to write in software, (6) reporting from all sections of the software, and (7) technical capabilities of the software. The nurses and physicians emphasized quick access to the computerized provider order entry software, order prescription section, and applicability of the software. The software had some items that had not been mentioned in other studies. Ultimately, the software was designed by a company specializing in hospital information systems in Iran. This study was the first specific investigation of computerized provider order entry software design in Iran. Based on the results, it is suggested that this software be implemented in hospitals.
Styranivska, Oksana; Kliuchkovska, Nataliia; Mykyyevych, Nataliya
2017-01-01
To analyze the stress-strain states of bone and abutment teeth during the use of different prosthetic designs of fixed partial dentures, using relevant mathematical modeling principles. The use of Comsol Multiphysics 3.5 (Comsol AB, Sweden) software for the mathematical modeling of stress-strain states provided numerical data for analytical interpretation in three different clinical scenarios with fixed dentures on different abutment teeth and a demountable prosthetic denture with a saddle-shaped intermediate part. Microsoft Excel (Microsoft Office 2017) was used to evaluate the absolute errors of the stress and strain parameters of each abutment tooth across the three modeled scenarios and the normal condition, and to summarize the data in tables. In comparison with the fixed prosthetic denture supported by the canine, first premolar, and third molar, stresses at the same abutment teeth with the demountable denture with the saddle-shaped intermediate part decreased: by 2.8 times at the mesial abutment tooth, by 6.1 times at the distal crown, and by 11.1 times at the intermediate part; correspondingly, the deformation level decreased by 3.1, 1.9, and 1.4 times at each area. The mathematical modeling showed that complications arising during the use of fixed partial dentures are based on the overload of the abutment teeth and are caused by the deformation process inside the intermediate section of the prosthetic construction.
Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Bonvini, Marco; Piette, Mary Ann
Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
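As a minimal illustration of the kind of rule such hybrid FDD tools evaluate, the sketch below compares a measured cooling-plant efficiency against a model-predicted value and flags a fault when the shortfall exceeds a tolerance. The function, threshold, and variable names are illustrative assumptions, not the tool described in the paper.

```python
def efficiency_fault(measured_power_kw: float, cooling_load_kw: float,
                     predicted_cop: float, tolerance: float = 0.15) -> bool:
    """Flag a fault when measured COP falls short of the physics-based prediction.

    predicted_cop would come from an engineering model (e.g., an efficiency curve
    evaluated at the current load and conditions); the threshold rule itself is
    the simple data-driven half of the hybrid approach.
    """
    measured_cop = cooling_load_kw / measured_power_kw
    shortfall = (predicted_cop - measured_cop) / predicted_cop
    return shortfall > tolerance

# Example: the model predicts COP 5.0; the plant delivers 900 kW of cooling for 220 kW of power.
print(efficiency_fault(measured_power_kw=220.0, cooling_load_kw=900.0, predicted_cop=5.0))
```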
Laser pulse coded signal frequency measuring device based on DSP and CPLD
NASA Astrophysics Data System (ADS)
Zhang, Hai-bo; Cao, Li-hua; Geng, Ai-hui; Li, Yan; Guo, Ru-hai; Wang, Ting-feng
2011-06-01
Laser pulse coding is an anti-jamming measure used in semi-active laser-guided weapons. Because the guidance signals use a pulse-coding mode and the received signals are weak, the frequency measurement process requires complex calculations that exploit the time correlation of the laser pulse code signal in order to meet the demands of optoelectronic countermeasures against semi-active laser-guided weapons. To complete an accurate frequency measurement in a short time, the pulse arrival-time series is autocorrelated, the signal repetition period is calculated, and the code type is then identified so that the signal can be decoded from the time values, the number of pulses, and their order within one signal cycle. Using a CPLD and a DSP as the signal processing chips, a laser-guidance signal frequency measurement device was designed, and its signal processing capability was improved through appropriate software algorithms. This article introduces the frequency measurement principle of the device, describes its hardware components, system operation, and software, and analyzes the impact of several system factors on measurement accuracy. The experimental results indicate that the system improves measurement accuracy while preserving the small volume, real-time operation, anti-interference capability, and low power consumption of the laser pulse frequency measuring device. The practicality and reliability of the design have been demonstrated experimentally.
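The period-measurement step described above, autocorrelating the pulse arrival-time series to find the repetition period, can be sketched in a few lines. The version below rasterizes the arrival times into a binary sequence and autocorrelates it; it is a software model of the principle only, not the DSP/CPLD implementation, and the sampling step, search window, and synthetic pulse train are assumptions.

```python
import numpy as np

def repetition_period_from_toa(toa, dt=1e-6, search_min=1e-4):
    """Estimate the repetition period of a pulse train from its arrival times.

    Rasterizes the time-of-arrival list into a binary sequence sampled at dt,
    autocorrelates it, and returns the lag of the strongest peak beyond
    search_min seconds (the zero-lag peak is excluded).
    """
    toa = np.asarray(toa, dtype=float)
    toa = toa - toa.min()
    n = int(np.ceil(toa.max() / dt)) + 1
    x = np.zeros(n)
    x[np.round(toa / dt).astype(int)] = 1.0
    acf = np.correlate(x, x, mode="full")[n - 1:]      # lags 0 .. n-1
    start = int(search_min / dt)
    lag = start + int(np.argmax(acf[start:]))
    return lag * dt

# Synthetic pulse train: 1 ms repetition period with small timing jitter.
rng = np.random.default_rng(1)
toa = np.arange(20) * 1e-3 + rng.normal(0.0, 0.2e-6, 20)
print(repetition_period_from_toa(toa))   # close to 1e-3 s
```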
Evaluation of a computer-based approach to teaching acid/base physiology.
Rawson, Richard E; Quinlan, Kathleen M
2002-12-01
Because acid/base physiology is a difficult subject for most medical and veterinary students, the first author designed a software program, Acid/Base Primer, that would help students with this topic. The Acid/Base Primer was designed and evaluated within a conceptual framework of basic educational principles. Seventy-five first-year veterinary students (of 81; 93% response rate) participated in this study. Students took both a pre- and posttest of content understanding. After completing the Acid/Base Primer in pairs, each student filled out a survey evaluating the features of the program and describing his/her use and experience of it. Four pairs of students participated in interviews that elaborated on the surveys. Scores improved from 53 +/- 2% on the pretest to 74 +/- 1% on an immediate posttest. On surveys and in interviews, students reported that the program helped them construct their own understanding of acid/base physiology and prompted discussions in pairs of students when individual understandings differed. The case-based format provided anchors and a high degree of relevance. Repetition of concepts helped students develop a more complex network of understanding. Questions in the program served to scaffold the learning process by providing direction, accentuating the relevant features of the cases, and provoking discussion. Guidelines for software development were generated on the basis of the findings and relevant educational literature.
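The quantitative core of this subject is the Henderson-Hasselbalch relation for the bicarbonate buffer system, the kind of calculation a program such as the Acid/Base Primer walks students through. The sketch below is a generic textbook computation (pKa 6.1, CO2 solubility factor 0.03 mmol/L per mmHg); it is not code from the Acid/Base Primer itself, and the second example's interpretation is illustrative.

```python
import math

def blood_ph(hco3_mmol_per_l: float, pco2_mmhg: float) -> float:
    """Henderson-Hasselbalch equation for the bicarbonate buffer system.

    pH = 6.1 + log10([HCO3-] / (0.03 * pCO2)), with pCO2 in mmHg and
    [HCO3-] in mmol/L (standard textbook constants).
    """
    return 6.1 + math.log10(hco3_mmol_per_l / (0.03 * pco2_mmhg))

print(round(blood_ph(24.0, 40.0), 2))   # normal values -> approximately 7.40
print(round(blood_ph(15.0, 30.0), 2))   # low bicarbonate with respiratory compensation
```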
Enhancing the Therapy Experience Using Principles of Video Game Design.
Folkins, John Wm; Brackenbury, Tim; Krause, Miriam; Haviland, Allison
2016-02-01
This article considers the potential benefits that applying design principles from contemporary video games may have on enhancing therapy experiences. Six principles of video game design are presented, and their relevance for enriching clinical experiences is discussed. The motivational and learning benefits of each design principle have been discussed in the education literature as having positive impacts on student motivation and learning and are related here to aspects of clinical practice. The essential experience principle suggests connecting all aspects of the experience around a central emotion or cognitive connection. The discovery principle promotes indirect learning in focused environments. The risk-taking principle addresses the uncertainties clients face when attempting newly learned skills in novel situations. The generalization principle encourages multiple opportunities for skill transfer. The reward system principle directly relates to the scaffolding of frequent and varied feedback in treatment. Last, the identity principle can assist clients in using their newly learned communication skills to redefine self-perceptions. These principles highlight areas for research and interventions that may be used to reinforce or advance current practice.
Design study of Software-Implemented Fault-Tolerance (SIFT) computer
NASA Technical Reports Server (NTRS)
Wensley, J. H.; Goldberg, J.; Green, M. W.; Kutz, W. H.; Levitt, K. N.; Mills, M. E.; Shostak, R. E.; Whiting-Okeefe, P. M.; Zeidler, H. M.
1982-01-01
Software-implemented fault tolerant (SIFT) computer design for commercial aviation is reported. A SIFT design concept is addressed. Alternate strategies for physical implementation are considered. Hardware and software design correctness is addressed. System modeling and effectiveness evaluation are considered from a fault-tolerant point of view.
Using Software Design Methods in CALL
ERIC Educational Resources Information Center
Ward, Monica
2006-01-01
The phrase "software design" is not one that arouses the interest of many CALL practitioners, particularly those from a humanities background. However, software design essentials are simply logical ways of going about designing a system. The fundamentals include modularity, anticipation of change, generality and an incremental approach. While CALL…
75 FR 80571 - Core Principles and Other Requirements for Designated Contract Markets
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-22
Part II: Commodity Futures Trading Commission, 17 CFR Parts 1, 16, and 38, RIN 3038-AD09 -- Core Principles and Other Requirements for Designated Contract Markets... 1. Subpart B--Designation as Contract Market; 2. Subpart C--Compliance With Rules; i. Proposed Sec...
Human factors certification in the development of future air traffic control systems
NASA Technical Reports Server (NTRS)
Evans, Alyson E.
1994-01-01
If human factors certification of aviation technologies aims to encompass the wide range of issues which need to be addressed for any new system, then human factors involvement must be present throughout the whole design process in a manner which relates to final certification. A certification process cannot simply be applied to the final product of design. Standards and guidelines will be required by designers at the outset of design for reference in preparing for certification. The most effective use of human factors principles, methods, and measures is made as part of an iterative design process, leading to a system which reflects these as far as possible. This particularly applies where the technology is complex and may be represented by a number of components or sub-systems. Some aspects of the system are best certified during early prototyping, when there is still scope to make changes to software or hardware. At this stage in design, financial and/or time pressures will not rule out the possibility of necessary changes, as may be the case later. Other aspects of the system will be best certified during the final phases of design when the system is in a more complete form and in a realistic environment.
Autonomous robot software development using simple software components
NASA Astrophysics Data System (ADS)
Burke, Thomas M.; Chung, Chan-Jin
2004-10-01
Developing software to control a sophisticated lane-following, obstacle-avoiding, autonomous robot can be demanding and beyond the capabilities of novice programmers - but it doesn't have to be. A creative software design utilizing only basic image processing and a little algebra has been employed to control the LTU-AISSIG autonomous robot - a contestant in the 2004 Intelligent Ground Vehicle Competition (IGVC). This paper presents a software design equivalent to that used during the IGVC, but with much of the complexity removed. The result is an autonomous robot software design that is robust, reliable, and can be implemented by programmers with a limited understanding of image processing. This design provides a solid basis for further work in autonomous robot software, as well as an interesting and achievable robotics project for students.
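In the spirit of the "basic image processing and a little algebra" claim above, the sketch below thresholds bright lane markings in the lower half of a grayscale frame and converts the centroid offset into a steering value. It is a deliberately simplified toy with assumed threshold and gain values; it is not the LTU-AISSIG team's actual design.

```python
import numpy as np

def steering_command(gray_frame, threshold=200, gain=0.005):
    """Toy lane-following step: threshold bright lane pixels in the lower half
    of a grayscale frame, find their centroid, and steer toward it.

    Returns a steering value (negative = left, positive = right).
    """
    h, w = gray_frame.shape
    roi = gray_frame[h // 2:, :]                  # look only at the lower half
    ys, xs = np.nonzero(roi > threshold)          # bright (lane) pixels
    if xs.size == 0:
        return 0.0                                # no lane found: go straight
    offset = xs.mean() - (w / 2.0)                # centroid offset from image center
    return gain * offset

frame = np.zeros((480, 640), dtype=np.uint8)
frame[300:, 400:410] = 255                        # synthetic lane stripe right of center
print(steering_command(frame))                    # positive -> steer right
```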
Designing Control System Application Software for Change
NASA Technical Reports Server (NTRS)
Boulanger, Richard
2001-01-01
The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner which transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change and the effort of that change can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.
Thyroid Cancer and Tumor Collaborative Registry (TCCR)
Shats, Oleg; Goldner, Whitney; Feng, Jianmin; Sherman, Alexander; Smith, Russell B.; Sherman, Simon
2016-01-01
A multicenter, web-based Thyroid Cancer and Tumor Collaborative Registry (TCCR, http://tccr.unmc.edu) allows for the collection and management of various data on thyroid cancer (TC) and thyroid nodule (TN) patients. The TCCR is coupled with OpenSpecimen, an open-source biobank management system, to annotate biospecimens obtained from the TCCR subjects. The demographic, lifestyle, physical activity, dietary habits, family history, medical history, and quality of life data are provided and may be entered into the registry by subjects. Information on diagnosis, treatment, and outcome is entered by the clinical personnel. The TCCR uses advanced technical and organizational practices, such as (i) metadata-driven software architecture (design); (ii) modern standards and best practices for data sharing and interoperability (standardization); (iii) Agile methodology (project management); (iv) Software as a Service (SaaS) as a software distribution model (operation); and (v) the confederation principle as a business model (governance). This allowed us to create a secure, reliable, user-friendly, and self-sustainable system for TC and TN data collection and management that is compatible with various end-user devices and easily adaptable to a rapidly changing environment. Currently, the TCCR contains data on 2,261 subjects and data on more than 28,000 biospecimens. Data and biological samples collected by the TCCR are used in developing diagnostic, prevention, treatment, and survivorship strategies against TC. PMID:27168721
The role of 3D printing in treating craniomaxillofacial congenital anomalies.
Lopez, Christopher D; Witek, Lukasz; Torroni, Andrea; Flores, Roberto L; Demissie, David B; Young, Simon; Cronstein, Bruce N; Coelho, Paulo G
2018-05-20
Craniomaxillofacial congenital anomalies comprise approximately one third of all congenital birth defects and include deformities such as alveolar clefts, craniosynostosis, and microtia. Current surgical treatments commonly require the use of autogenous graft materials, which are difficult to shape, limited in supply, associated with donor site morbidity, and cannot grow with a maturing skeleton. Our group has demonstrated that 3D printed bio-ceramic scaffolds can generate vascularized bone within large, critical-sized defects (defects too large to heal spontaneously) of the craniomaxillofacial skeleton. Furthermore, these scaffolds are also able to function as a delivery vehicle for a new osteogenic agent with a well-established safety profile. The same 3D printers and imaging software platforms have been leveraged by our team to create sterilizable patient-specific intraoperative models for craniofacial reconstruction. For microtia repair, the current standard of care surgical guide is a two-dimensional drawing taken from the contralateral ear. Our laboratory has used 3D printers and open source software platforms to design personalized microtia surgical models. In this review, we report on the advancements in tissue engineering principles, digital imaging software platforms and 3D printing that have culminated in the application of this technology to repair large bone defects in skeletally immature transitional models and provide in-house manufactured, sterilizable patient-specific models for craniofacial reconstruction. © 2018 Wiley Periodicals, Inc.
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.
Next Generation Nuclear Plant Methods Research and Development Technical Program Plan -- PLN-2498
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard R. Schultz; Abderrafi M. Ougouag; David W. Nigg
2008-09-01
One of the great challenges of designing and licensing the Very High Temperature Reactor (VHTR) is to confirm that the intended VHTR analysis tools can be used confidently to make decisions and to assure all that the reactor systems are safe and meet the performance objectives of the Generation IV Program. The research and development (R&D) projects defined in the Next Generation Nuclear Plant (NGNP) Design Methods Development and Validation Program will ensure that the tools used to perform the required calculations and analyses can be trusted. The Methods R&D tasks are designed to ensure that the calculational envelope of the tools used to analyze the VHTR reactor systems encompasses, or is larger than, the operational and transient envelope of the VHTR itself. The Methods R&D focuses on the development of tools to assess the neutronic and thermal fluid behavior of the plant. The fuel behavior and fission product transport models are discussed in the Advanced Gas Reactor (AGR) program plan. Various stress analysis and mechanical design tools will also need to be developed and validated and will ultimately also be included in the Methods R&D Program Plan. The calculational envelope of the neutronics and thermal-fluids software tools intended to be used on the NGNP is defined by the scenarios and phenomena that these tools can calculate with confidence. The software tools can only be used confidently when the results they produce have been shown to be in reasonable agreement with first-principle results, thought-problems, and data that describe the “highly ranked” phenomena inherent in all operational conditions and important accident scenarios for the VHTR.
Next Generation Nuclear Plant Methods Technical Program Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard R. Schultz; Abderrafi M. Ougouag; David W. Nigg
2010-12-01
One of the great challenges of designing and licensing the Very High Temperature Reactor (VHTR) is to confirm that the intended VHTR analysis tools can be used confidently to make decisions and to assure all that the reactor systems are safe and meet the performance objectives of the Generation IV Program. The research and development (R&D) projects defined in the Next Generation Nuclear Plant (NGNP) Design Methods Development and Validation Program will ensure that the tools used to perform the required calculations and analyses can be trusted. The Methods R&D tasks are designed to ensure that the calculational envelope of the tools used to analyze the VHTR reactor systems encompasses, or is larger than, the operational and transient envelope of the VHTR itself. The Methods R&D focuses on the development of tools to assess the neutronic and thermal fluid behavior of the plant. The fuel behavior and fission product transport models are discussed in the Advanced Gas Reactor (AGR) program plan. Various stress analysis and mechanical design tools will also need to be developed and validated and will ultimately also be included in the Methods R&D Program Plan. The calculational envelope of the neutronics and thermal-fluids software tools intended to be used on the NGNP is defined by the scenarios and phenomena that these tools can calculate with confidence. The software tools can only be used confidently when the results they produce have been shown to be in reasonable agreement with first-principle results, thought-problems, and data that describe the “highly ranked” phenomena inherent in all operational conditions and important accident scenarios for the VHTR.
Next Generation Nuclear Plant Methods Technical Program Plan -- PLN-2498
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard R. Schultz; Abderrafi M. Ougouag; David W. Nigg
2010-09-01
One of the great challenges of designing and licensing the Very High Temperature Reactor (VHTR) is to confirm that the intended VHTR analysis tools can be used confidently to make decisions and to assure all that the reactor systems are safe and meet the performance objectives of the Generation IV Program. The research and development (R&D) projects defined in the Next Generation Nuclear Plant (NGNP) Design Methods Development and Validation Program will ensure that the tools used to perform the required calculations and analyses can be trusted. The Methods R&D tasks are designed to ensure that the calculational envelope of the tools used to analyze the VHTR reactor systems encompasses, or is larger than, the operational and transient envelope of the VHTR itself. The Methods R&D focuses on the development of tools to assess the neutronic and thermal fluid behavior of the plant. The fuel behavior and fission product transport models are discussed in the Advanced Gas Reactor (AGR) program plan. Various stress analysis and mechanical design tools will also need to be developed and validated and will ultimately also be included in the Methods R&D Program Plan. The calculational envelope of the neutronics and thermal-fluids software tools intended to be used on the NGNP is defined by the scenarios and phenomena that these tools can calculate with confidence. The software tools can only be used confidently when the results they produce have been shown to be in reasonable agreement with first-principle results, thought-problems, and data that describe the “highly ranked” phenomena inherent in all operational conditions and important accident scenarios for the VHTR.
The capture and recreation of 3D auditory scenes
NASA Astrophysics Data System (ADS)
Li, Zhiyun
The main goal of this research is to develop the theory and implement practical tools (in both software and hardware) for the capture and recreation of 3D auditory scenes. Our research is expected to have applications in virtual reality, telepresence, film, music, video games, auditory user interfaces, and sound-based surveillance. The first part of our research is concerned with sound capture via a spherical microphone array. The advantage of this array is that it can be steered into any 3D directions digitally with the same beampattern. We develop design methodologies to achieve flexible microphone layouts, optimal beampattern approximation and robustness constraint. We also design novel hemispherical and circular microphone array layouts for more spatially constrained auditory scenes. Using the captured audio, we then propose a unified and simple approach for recreating them by exploring the reciprocity principle that is satisfied between the two processes. Our approach makes the system easy to build, and practical. Using this approach, we can capture the 3D sound field by a spherical microphone array and recreate it using a spherical loudspeaker array, and ensure that the recreated sound field matches the recorded field up to a high order of spherical harmonics. For some regular or semi-regular microphone layouts, we design an efficient parallel implementation of the multi-directional spherical beamformer by using the rotational symmetries of the beampattern and of the spherical microphone array. This can be implemented in either software or hardware and easily adapted for other regular or semi-regular layouts of microphones. In addition, we extend this approach for headphone-based system. Design examples and simulation results are presented to verify our algorithms. Prototypes are built and tested in real-world auditory scenes.
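The thesis works with spherical-harmonic beamformers and optimized microphone layouts; as a much simpler point of reference, the sketch below shows plain delay-and-sum steering weights for a far-field plane wave on a spherical array. The random 32-microphone layout, frequency, and sign convention are illustrative assumptions, not the designs developed in the thesis.

```python
import numpy as np

def steering_weights(mic_positions, look_dir, freq, c=343.0):
    """Delay-and-sum (phase-alignment) weights for a far-field plane wave.

    Assumes an exp(+j*omega*t) time convention, so a plane wave arriving from
    unit direction look_dir has phase exp(+j*k*look_dir.x) at position x.
    mic_positions: (M, 3) coordinates in meters; returns M complex weights.
    """
    k = 2.0 * np.pi * freq / c
    u = look_dir / np.linalg.norm(look_dir)
    return np.exp(-1j * k * (mic_positions @ u)) / len(mic_positions)

# 32 microphones spread over a 10 cm sphere (random layout, purely illustrative;
# the thesis designs optimized layouts and higher-order spherical-harmonic beamformers).
rng = np.random.default_rng(0)
pts = rng.normal(size=(32, 3))
mics = 0.10 * pts / np.linalg.norm(pts, axis=1, keepdims=True)
w = steering_weights(mics, np.array([1.0, 0.0, 0.0]), freq=2000.0)
# One frequency bin of beamformer output would be y = w @ X_bin, where X_bin holds
# the 32 microphone spectra at that bin (placeholder, not defined here).
```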
FMT (Flight Software Memory Tracker) For Cassini Spacecraft-Software Engineering Using JAVA
NASA Technical Reports Server (NTRS)
Kan, Edwin P.; Uffelman, Hal; Wax, Allan H.
1997-01-01
The software engineering design of the Flight Software Memory Tracker (FMT) Tool is discussed in this paper. FMT is a ground analysis software set, consisting of utilities and procedures, designed to track the flight software, i.e., images of memory load and updatable parameters of the computers on-board Cassini spacecraft. FMT is implemented in Java.
The Software Design Document: More than a User's Manual.
ERIC Educational Resources Information Center
Bowers, Dennis
1989-01-01
Discusses the value of creating design documentation for computer software so that it may serve as a model for similar design efforts. Components of the software design document are described, including program flowcharts, graphic representation of screen displays, storyboards, and evaluation procedures. An example is given using HyperCard. (three…
School Building Designs: Principles and Challenges of the 21st Century.
ERIC Educational Resources Information Center
Chan, T. C.
2002-01-01
Reviews school-facility challenges and design principles described in 2000 U.S. Department of Education report on school planning and design. Describes additional school-facility design challenges and planning principles. Describes five critical facility-planning issues for the 21st Century. (Contains 14 references.) (PKP)
Current And Future Directions Of Lens Design Software
NASA Astrophysics Data System (ADS)
Gustafson, Darryl E.
1983-10-01
The most effective environment for doing lens design continues to evolve as new computer hardware and software tools become available. Important recent hardware developments include: Low-cost but powerful interactive multi-user 32 bit computers with virtual memory that are totally software-compatible with prior larger and more expensive members of the family. A rapidly growing variety of graphics devices for both hard-copy and screen graphics, including many with color capability. In addition, with optical design software readily accessible in many forms, optical design has become a part-time activity for a large number of engineers instead of being restricted to a small number of full-time specialists. A designer interface that is friendly for the part-time user while remaining efficient for the full-time designer is thus becoming more important as well as more practical. Along with these developments, software tools in other scientific and engineering disciplines are proliferating. Thus, the optical designer is less and less unique in his use of computer-aided techniques and faces the challenge and opportunity of efficiently communicating his designs to other computer-aided-design (CAD), computer-aided-manufacturing (CAM), structural, thermal, and mechanical software tools. This paper will address the impact of these developments on the current and future directions of the CODE V™ optical design software package, its implementation, and the resulting lens design environment.
[Real-time detection and processing of medical signals under windows using Lcard analog interfaces].
Kuz'min, A A; Belozerov, A E; Pronin, T V
2008-01-01
Multipurpose modular software for an analog interface based on the Lcard 761 is considered. Algorithms for pipeline processing of medical signals under Windows with dynamic control of computational resources are suggested. The software consists of user-friendly, extensible, and modifiable modules. The module hierarchy is based on object-oriented inheritance principles, which makes it possible to construct various real-time systems for long-term detection, processing, and imaging of multichannel medical signals.
NASA Astrophysics Data System (ADS)
Vaňko, M.; Komžík, R.; Kollár, V.; Sekeráš, M.
2014-10-01
We present a continuation of Paper I describing the photoelectric photometry at the Astronomical Institute of the Slovak Academy of Sciences at Tatranská Lomnica. In this article we show the observation principles and the basic ideas and philosophy of the photometer control software — the code UNIV, written by R. Komžík and V. Kollár, and used for the data resulting from observations.
What an open source clinical trial community can learn from hackers
Dunn, Adam G.; Day, Richard O.; Mandl, Kenneth D.; Coiera, Enrico
2014-01-01
Open sharing of clinical trial data has been proposed as a way to address the gap between the production of clinical evidence and the decision-making of physicians. Since a similar gap has already been addressed in the software industry by the open source software movement, we examine how the social and technical principles of the movement can be used to guide the growth of an open source clinical trial community. PMID:22553248
New asphalt mix design system for Oklahoma department of transportation : final report.
DOT National Transportation Integrated Search
2013-03-01
Oklahoma Department of Transportation (ODOT) has been using the Superpave mix design software for several years. The original Superpave mix design software was built around Fox Database and did not meet ODOT requirements. The software currently being...
Software Requirements Engineering Methodology (Development)
1979-06-01
Higher Order Software [20]; and the Michael Jackson Design Methodology [21]. Although structured programming constructs have proven to be more useful...reviewed here. Similarly, the manual techniques for software design (e.g., HIPO Diagrams, Nassi-Shneiderman charts, Top-Down Design, the Michael Jackson Design Methodology, Yourdon's Structured Design) are not addressed. 6.1.3 Research Programs: There are a number of research programs underway...
ERIC Educational Resources Information Center
Carter, Curtis W.
2012-01-01
This article contends that instructional designers and developers should attend to four particular design principles when creating instructional audio. Support for this view is presented by referencing the limited research that has been done in this area, and by indicating how and why each of the four principles is important to the design process.…
Complementarity and Compensation: Bridging the Gap between Writing and Design.
ERIC Educational Resources Information Center
Killingsworth, M. Jimmie; Sanders, Scott P.
1990-01-01
Outlines two rhetorical principles for producing iconic-mosaic texts--the principle of complementarity and the principle of compensation. Shows how these principles can be applied to practical problems in coordinating the writing and design processes in student projects. (RS)
Jing, Lei; Wang, Yao; Zhao, Huifu; Ke, Hongliang; Wang, Xiaoxun; Gao, Qun
2017-06-10
To meet the uniform-illumination requirements of optical palm/fingerprint instruments and overcome the poor illumination uniformity on the working plane of the optical palm/fingerprint prism, a novel secondary optical lens with a free-form surface, compact structure, and high uniformity is presented in this paper. The design of the secondary optical lens is based on the emission properties of the near-infrared light-emitting diode (LED) and on basic principles of non-imaging optics, with particular attention to the impact of the prism thickness on the design. By numerically solving Snell's law of geometric optics, we obtain the profile of the free-form surface of the lens. Using the optical software TracePro, we trace rays and simulate the illumination system. The simulation shows a uniformity of 89.8% on the working plane of the prism, and the experimental tests show that the actual uniformity reaches 85.7%, which provides an effective way to realize a highly uniform illumination system with a high-power near-infrared LED.
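The free-form profile construction described above repeatedly applies Snell's law at candidate surface points. The sketch below implements the standard vector form of Snell's law, the basic operation such a numerical construction would iterate; the refractive index, surface tilt, and ray direction in the example are illustrative assumptions, not values from the paper.

```python
import numpy as np

def refract(incident, normal, n1, n2):
    """Vector form of Snell's law.

    incident: unit direction of the incoming ray; normal: unit surface normal
    pointing against the incident ray. Returns the unit refracted direction,
    or None in the case of total internal reflection.
    """
    i = incident / np.linalg.norm(incident)
    n = normal / np.linalg.norm(normal)
    eta = n1 / n2
    cos_i = -np.dot(i, n)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        return None                      # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * i + (eta * cos_i - cos_t) * n

# Ray from air (n1 = 1.0) into a lens material (n2 = 1.49, a typical PMMA value),
# hitting a surface tilted 20 degrees from the ray axis.
ray = np.array([0.0, 0.0, 1.0])
normal = np.array([np.sin(np.radians(20.0)), 0.0, -np.cos(np.radians(20.0))])
print(refract(ray, normal, 1.0, 1.49))
```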