Software Prototyping: Designing Systems for Users.
ERIC Educational Resources Information Center
Spies, Phyllis Bova
1983-01-01
Reports on major change in computer software development process--the prototype model, i.e., implementation of skeletal system that is enhanced during interaction with users. Expensive and unreliable software, software design errors, traditional development approach, resources required for prototyping, success stories, and systems designer's role…
Application of Real Options Theory to DoD Software Acquisitions
2009-08-01
The traditional real options valuation methodology, when enhanced and properly formulated around a proposed or existing software investment... founder and CEO of Real Options Valuation, Inc., a consulting, training, and software development firm specializing in strategic real options
Development and Application of Collaborative Optimization Software for Plate-fin Heat Exchanger
NASA Astrophysics Data System (ADS)
Chunzhen, Qiao; Ze, Zhang; Jiangfeng, Guo; Jian, Zhang
2017-12-01
This paper introduces the design ideas behind, and application examples of, calculation software for plate-fin heat exchangers. Because designing and optimizing heat exchangers involves a large amount of calculation, we used Visual Basic 6.0 as the development platform for a basic calculation program that reduces the computational burden. The design case is a plate-fin heat exchanger sized for boiler tail flue gas, and the software is based on the traditional design method for plate-fin heat exchangers. Using the software for the design and calculation of plate-fin heat exchangers effectively reduces the amount of computation while giving results comparable in quality to those of traditional methods.
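As a hedged illustration of the kind of repetitive hand calculation such a tool automates (not code from the paper; its plate-fin surface correlations are not reproduced here), a minimal effectiveness-NTU rating step in Python with hypothetical inputs might look like:

    import math

    def effectiveness_crossflow(ntu, c_ratio):
        # Crossflow exchanger, both fluids unmixed (Kays & London approximation).
        n = ntu ** 0.22
        return 1.0 - math.exp((n / c_ratio) * (math.exp(-c_ratio * ntu ** 0.78) - 1.0))

    def rate_exchanger(ua, c_hot, c_cold, t_hot_in, t_cold_in):
        # Effectiveness-NTU "rating" step: given UA (W/K) and inlet states,
        # find the heat duty and outlet temperatures.
        c_min, c_max = min(c_hot, c_cold), max(c_hot, c_cold)
        ntu = ua / c_min
        eff = effectiveness_crossflow(ntu, c_min / c_max)
        q = eff * c_min * (t_hot_in - t_cold_in)   # heat duty, W
        return q, t_hot_in - q / c_hot, t_cold_in + q / c_cold

    # Hypothetical flue-gas/air case: capacity rates in W/K, temperatures in C.
    q, t_hot_out, t_cold_out = rate_exchanger(
        ua=1500.0, c_hot=1320.0, c_cold=1005.0, t_hot_in=180.0, t_cold_in=25.0)
    print(f"duty = {q / 1000:.1f} kW, outlets = {t_hot_out:.1f} C, {t_cold_out:.1f} C")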
The need for a comprehensive expert system development methodology
NASA Technical Reports Server (NTRS)
Baumert, John; Critchfield, Anna; Leavitt, Karen
1988-01-01
In a traditional software development environment, the introduction of standardized approaches has led to higher quality, maintainable products on the technical side and greater visibility into the status of the effort on the management side. This study examined expert system development to determine whether it differed enough from traditional systems to warrant a reevaluation of current software development methodologies. Its purpose was to identify areas of similarity with traditional software development and areas requiring tailoring to the unique needs of expert systems. A second purpose was to determine whether existing expert system development methodologies meet the needs of expert system development, management, and maintenance personnel. The study consisted of a literature search and personal interviews. It was determined that existing methodologies and approaches to developing expert systems are neither comprehensive nor easily applied, especially for cradle-to-grave system development. As a result, requirements were derived for an expert system development methodology, along with an initial annotated outline for such a methodology.
Software Assurance Curriculum Project Volume 2: Undergraduate Course Outlines
2010-08-01
Contents: Acknowledgments; Abstract; An Undergraduate Curriculum Focus on Software Assurance; Computer Science I; Computer Science II... confidence that can be integrated into traditional software development and acquisition process models. Thus, in addition to a technology focus... testing throughout the software development life cycle (SDLC)... Security and complexity, system development challenges: security failures
A Roadmap for Using Agile Development in a Traditional Environment
NASA Technical Reports Server (NTRS)
Streiffert, Barbara; Starbird, Thomas; Grenander, Sven
2006-01-01
One of the newer classes of software engineering techniques is called 'Agile Development'. In Agile Development software engineers take small implementation steps and, in some cases, they program in pairs. In addition, they develop automatic tests prior to implementing each small functional piece. Agile Development focuses on rapid turnaround, incremental planning, customer involvement, and continuous integration. Agile Development is not the traditional waterfall method or even a rapid prototyping method (although the latter is closer to Agile Development). At the Jet Propulsion Laboratory (JPL) a few groups have begun Agile Development software implementations. The difficulty with this approach becomes apparent when Agile Development is used in an organization that has specific criteria and requirements handed down for how software development is to be performed. The work at JPL is performed for the National Aeronautics and Space Administration (NASA). Both organizations have specific requirements, rules, and processes for developing software. This paper discusses some of the initial uses of the Agile Development methodology, the spread of this method, and the current status of its successful incorporation into the current JPL development policies and processes.
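As a generic illustration of the test-first practice described in this abstract (hypothetical names, not code from the JPL projects), a developer writes the failing test before the small functional piece it exercises:

    import unittest

    def pass_duration(aos_s, los_s):
        # Hypothetical unit of functionality: duration of a tracking pass
        # from acquisition of signal (AOS) to loss of signal (LOS).
        if los_s < aos_s:
            raise ValueError("LOS precedes AOS")
        return los_s - aos_s

    class PassDurationTest(unittest.TestCase):
        # Written before pass_duration existed; it fails until the
        # small functional piece is implemented.
        def test_nominal_pass(self):
            self.assertEqual(pass_duration(100.0, 160.0), 60.0)

        def test_rejects_negative_duration(self):
            with self.assertRaises(ValueError):
                pass_duration(160.0, 100.0)

    if __name__ == "__main__":
        unittest.main()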
Lindoerfer, Doris; Mansmann, Ulrich
2017-07-01
Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet a wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches are needed both in the selection of software and in the construction of proprietary systems. We propose an evidence-based checklist summarizing essential items for patient registry software systems (CIPROS) to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates, and standards shows broad consensus but differences on registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions. It may be a first step toward standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Williams, Lawrence H., Jr.
2013-01-01
This qualitative study analyzed the experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches, others facilitate ambiguity and uncertainty by allowing software requirements and design to evolve…
Agile Development Methods for Space Operations
NASA Technical Reports Server (NTRS)
Trimble, Jay; Webster, Chris
2012-01-01
Mainstream industry software development practice has moved from a traditional waterfall process to agile iterative development, which allows fast response to customer inputs and produces higher quality software at lower cost. How can we, the space ops community, adopt state-of-the-art software development practice, achieve greater productivity at lower cost, and maintain safe and effective space flight operations? At NASA Ames, we are developing Mission Control Technologies software in collaboration with Johnson Space Center (JSC) and, more recently, the Jet Propulsion Laboratory (JPL).
Experiences in integrating auto-translated state-chart designs for model checking
NASA Technical Reports Server (NTRS)
Pingree, P. J.; Benowitz, E. G.
2003-01-01
In the complex environment of JPL's flight missions with increasing dependency on advanced software designs, traditional software validation methods of simulation and testing are being stretched to adequately cover the needs of software development.
An approach to developing user interfaces for space systems
NASA Astrophysics Data System (ADS)
Shackelford, Keith; McKinney, Karen
1993-08-01
Inherent weaknesses in the traditional waterfall model of software development have led to the definition of the spiral model. The spiral software development lifecycle model, however, had not been applied to NASA projects. This paper describes its use in developing real-time user interface software for an Environmental Control and Life Support System (ECLSS) Process Control Prototype at NASA's Marshall Space Flight Center.
Incorporating a Human-Computer Interaction Course into Software Development Curriculums
ERIC Educational Resources Information Center
Janicki, Thomas N.; Cummings, Jeffrey; Healy, R. Joseph
2015-01-01
Individuals have increasing options on retrieving information related to hardware and software. Specific hardware devices include desktops, tablets and smart devices. Also, the number of software applications has significantly increased the user's capability to access data. Software applications include the traditional web site, smart device…
Object oriented development of engineering software using CLIPS
NASA Technical Reports Server (NTRS)
Yoon, C. John
1991-01-01
Engineering applications involve numeric complexity and the manipulation of large amounts of data. Traditionally, numeric computation has been the main concern in developing engineering software. As engineering application software has become larger and more complex, management of resources such as data, rather than numeric complexity, has become the major software design problem. Object oriented design and implementation methodologies can improve the reliability, flexibility, and maintainability of the resulting software; however, some tasks are better solved with the traditional procedural paradigm. The C Language Integrated Production System (CLIPS), with its deffunction and defgeneric constructs, supports the procedural paradigm. The natural blending of object oriented and procedural paradigms has been cited as the reason for the popularity of the C++ language. The CLIPS Object Oriented Language's (COOL) object oriented features are more versatile than those of C++. A software design methodology based on object oriented and procedural approaches, appropriate for engineering software and implemented in CLIPS, is outlined. As a sample problem, a method for sensor placement for Space Station Freedom is being implemented in COOL.
A research on the application of software defined networking in satellite network architecture
NASA Astrophysics Data System (ADS)
Song, Huan; Chen, Jinqiang; Cao, Suzhi; Cui, Dandan; Li, Tong; Su, Yuxing
2017-10-01
Software defined networking (SDN) is a new type of network architecture that decouples the control plane from the data plane of traditional networks, offers flexible configuration, and is a direction for the development of the next-generation terrestrial Internet. Satellite networks are an important part of the space-ground integrated information network, but traditional satellite networks suffer from difficult topology maintenance and slow configuration. Applying SDN technology to satellite networks can solve these problems. At present, research on the application of SDN technology in satellite networks is still at a preliminary stage. In this paper, we start by introducing SDN technology and satellite network architecture. We then present software defined satellite network architectures, a comparison of different software defined satellite network architectures, and satellite network virtualization. Finally, the present research status and development trends of SDN technology in satellite networks are analyzed.
Starlink Software Developments
NASA Astrophysics Data System (ADS)
Bly, M. J.; Giaretta, D.; Currie, M. J.; Taylor, M.
Some current and upcoming software developments from Starlink were demonstrated. These included invoking traditional Starlink applications via web services, the current version of the ORAC-DR reduction pipeline, and some new Java-based tools including Treeview, an interactive explorer of hierarchical data structures.
Open Source software and social networks: disruptive alternatives for medical imaging.
Ratib, Osman; Rosset, Antoine; Heuberger, Joris
2011-05-01
In recent decades several major changes in computer and communication technology have pushed the limits of imaging informatics and PACS beyond the traditional system architecture, providing new perspectives and an innovative approach to a traditionally conservative medical community. Disruptive technologies such as the world-wide-web, wireless networking, Open Source software, and the recent emergence of cyber communities and social networks have imposed an accelerated pace and major quantum leaps in the progress of computer and technology infrastructure applicable to medical imaging applications. This paper reviews the impact and potential benefits of two major trends in consumer market software development and how they will influence the future of medical imaging informatics. Open Source software is emerging as an attractive and cost effective alternative to traditional commercial software development, and collaborative social networks provide a new model of communication that is better suited to the needs of the medical community. Evidence shows that successful Open Source software tools have penetrated the medical market and have proven to be more robust and cost effective than their commercial counterparts. Written by developers who are themselves part of the user community, and developed and tested by a large number of contributing users, these tools are usually better adapted to users' needs and more robust than traditional software programs. This context allows a much faster and more appropriate development and evolution of the software platforms. Similarly, communication technology has opened up to the general public in a way that has changed social behavior and habits, adding a new dimension to the way people communicate and interact with each other. The new paradigms have also slowly penetrated the professional market and ultimately the medical community. Secure social networks that allow groups of people to easily communicate and exchange information are a new model particularly suitable for specific groups of healthcare professionals and for physicians. They have also changed the expectations of how patients wish to communicate with their physicians. Emerging disruptive technologies and innovative paradigms such as Open Source software are leading the way to a new generation of information systems that will slowly change the way physicians, healthcare providers, and patients interact and communicate in the future. The impact of these new technologies is particularly evident in image communication, PACS, and teleradiology. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Software for improved field surveys of nesting marine turtles.
Anastácio, R; Gonzalez, J M; Slater, K; Pereira, M J
2017-09-07
Field data are still recorded on paper in many worldwide beach surveys of nesting marine turtles. The data must subsequently be transferred into an electronic database, which can introduce errors into the dataset. To minimize such errors, the "Turtles" software was developed and piloted to record field data, with one software user accompanying one Tortuguero on the Akumal beaches, Quintana Roo, Mexico, during night patrols from June 1st to July 31st. Data exported from the software were compared with paper forms entered into a database (henceforth, the traditional method). Preliminary assessment indicated that the software user tended to record more metrics (an average of 18.3 fields ± 5.4 sd vs. 8.6 fields ± 2.1 sd recorded by the traditional method). The traditional method introduced three types of errors into the dataset: missing values in relevant fields (40.1%), different answers for the same value (9.8%), and inconsistent data (0.9%). Only 5.8% of these errors (missing values) were found with the software methodology. Although only tested by a single user, the results suggest increased efficacy and warrant further examination to accurately assess the merit of replacing traditional methods of data recording in beach monitoring programmes.
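A minimal sketch (with invented field names, not the Turtles data model) of the entry-time validation by which electronic capture avoids the missing-value and inconsistency errors quantified above:

    # Invented field names; not the Turtles data model.
    REQUIRED_FIELDS = {"date", "beach", "species", "activity"}
    VALID_SPECIES = {"green", "loggerhead", "hawksbill"}

    def validate_record(record):
        # Refuse to save a record with missing or inconsistent values,
        # the two dominant error types in the paper-based workflow.
        errors = []
        missing = REQUIRED_FIELDS - record.keys()
        if missing:
            errors.append("missing fields: " + ", ".join(sorted(missing)))
        species = record.get("species")
        if species is not None and species not in VALID_SPECIES:
            errors.append("unrecognized species: " + species)
        return errors

    print(validate_record({"date": "2015-06-01", "beach": "Akumal",
                           "species": "green", "activity": "nesting"}))  # []
    print(validate_record({"date": "2015-06-01", "species": "gren"}))    # two errors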
Wang, Wen-ming; Yan, Zhen; Liu, Han-jiang; Zhang, Lei-hong; Xia, Li; Zhao, Zhen-dong
2013-12-01
To provide a reference for the government in formulating policies and for medical institutions in formulating development plans, the present situation of traditional Chinese preparations in medical institutions in Guangdong province was studied, together with countermeasures for developing research on such preparations. Development countermeasures were discussed in light of the situation of traditional Chinese preparations in the province's medical institutions in recent years. To promote the development of traditional Chinese preparations in medical institutions, suggestions and countermeasures for the government and related departments were proposed: government departments should formulate supportive policies, and medical institutions should make scientific development plans, strengthen the construction of hardware and software, and develop special traditional Chinese preparations to promote their healthy development.
Veal marketing could return more than traditional weaning
USDA-ARS?s Scientific Manuscript database
How profitable is a system of marketing early-weaned calves for veal production versus a system based on more traditional weaning and marketing of feeder calves? In an attempt to answer this question, decision support software (Decision Evaluator for the Cattle Industry, DECI) developed at...
Product-oriented Software Certification Process for Software Synthesis
NASA Technical Reports Server (NTRS)
Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil
2004-01-01
The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.
Development of a Traditional/Computer-aided Graphics Course for Engineering Technology.
ERIC Educational Resources Information Center
Anand, Vera B.
1985-01-01
Describes a two-semester-hour freshman course in engineering graphics which uses both traditional and computerized instruction. Includes course description, computer graphics topics, and recommendations. Indicates that combining interactive graphics software with development of simple programs gave students a better foundation for upper-division…
Bringing Legacy Visualization Software to Modern Computing Devices via Application Streaming
NASA Astrophysics Data System (ADS)
Fisher, Ward
2014-05-01
Planning software compatibility across forthcoming generations of computing platforms is a problem commonly encountered in software engineering and development. While this problem can affect any class of software, data analysis and visualization programs are particularly vulnerable. This is due in part to their inherent dependency on specialized hardware and computing environments. A number of strategies and tools have been designed to aid software engineers with this task. While generally embraced by developers at 'traditional' software companies, these methodologies are often dismissed by the scientific software community as unwieldy, inefficient and unnecessary. As a result, many important and storied scientific software packages can struggle to adapt to a new computing environment; for example, one in which much work is carried out on sub-laptop devices (such as tablets and smartphones). Rewriting these packages for a new platform often requires significant investment in terms of development time and developer expertise. In many cases, porting older software to modern devices is neither practical nor possible. As a result, replacement software must be developed from scratch, wasting resources better spent on other projects. Enabled largely by the rapid rise and adoption of cloud computing platforms, 'Application Streaming' technologies allow legacy visualization and analysis software to be operated wholly from a client device (be it laptop, tablet or smartphone) while retaining full functionality and interactivity. It mitigates much of the developer effort required by other more traditional methods while simultaneously reducing the time it takes to bring the software to a new platform. This work will provide an overview of Application Streaming and how it compares against other technologies which allow scientific visualization software to be executed from a remote computer. We will discuss the functionality and limitations of existing application streaming frameworks and how a developer might prepare their software for application streaming. We will also examine the secondary benefits realized by moving legacy software to the cloud. Finally, we will examine the process by which a legacy Java application, the Integrated Data Viewer (IDV), is to be adapted for tablet computing via Application Streaming.
Software Design Methodology Migration for a Distributed Ground System
NASA Technical Reports Server (NTRS)
Ritter, George; McNair, Ann R. (Technical Monitor)
2002-01-01
The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has been developed and has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes. The new software processes still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are accountable to initial requirements. This paper gives an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we mention the COTS tools that have been integrated into the processes and how the COTS tools have provided value to the project.
Relational Data Bases--Are You Ready?
ERIC Educational Resources Information Center
Marshall, Dorothy M.
1989-01-01
Migrating from a traditional to a relational database technology requires more than traditional project management techniques. An overview of what to consider before migrating to relational database technology is presented. Leadership, staffing, vendor support, hardware, software, and application development are discussed. (MLW)
Building quality into medical product software design.
Mallory, S R
1993-01-01
The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.
Software Development and Test Methodology for a Distributed Ground System
NASA Technical Reports Server (NTRS)
Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)
2002-01-01
The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.
ERIC Educational Resources Information Center
Wang, Ling
2008-01-01
This study developed an interactive multimedia-based software program for Optics instruction, which was expected to overcome the imperfection of traditional optical labs. The researcher evaluated the effectiveness of the program through an experimental study that compared the learning outcomes of the students who used and did not use the software.…
Parallel Worlds: Agile and Waterfall Differences and Similarities
2013-10-01
development model, and it is deliberately shorter than the Agile Overview as most readers are assumed to be from the Traditional World. For a more in... process of DODI 5000 does not forbid the iterative incremental software development model with frequent end-user interaction, it requires heroics on... added). Today, many of the DOD's large IT programs therefore continue to adopt program structures and software development models closely
Test Driven Development of a Parameterized Ice Sheet Component
NASA Astrophysics Data System (ADS)
Clune, T.
2011-12-01
Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches, including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, its suitability for scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need for simple, non-redundant closed-form expressions to compare against the results obtained from the implementation, as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to the far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
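As a hedged sketch of the technique described (not code from the ice sheet component), a numerical kernel can be tested against a simple closed-form solution with an explicit error tolerance derived from the scheme's truncation error:

    import math
    import unittest

    def step_heat_1d(u, dx, dt, kappa):
        # One explicit Euler step of du/dt = kappa * d2u/dx2, periodic domain.
        n = len(u)
        return [u[i] + kappa * dt / dx**2 *
                (u[(i - 1) % n] - 2 * u[i] + u[(i + 1) % n]) for i in range(n)]

    class HeatKernelTest(unittest.TestCase):
        def test_decay_of_single_fourier_mode(self):
            # A sine mode decays as exp(-kappa * k^2 * t); the discrete scheme
            # matches to O(dt, dx^2), which sets the tolerance below.
            n, kappa, dt = 64, 1.0, 1e-5
            dx = 2 * math.pi / n
            u = [math.sin(i * dx) for i in range(n)]
            for _ in range(100):
                u = step_heat_1d(u, dx, dt, kappa)
            expected = math.exp(-kappa * 100 * dt)
            self.assertAlmostEqual(max(u), expected, delta=1e-3)

    if __name__ == "__main__":
        unittest.main()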
NASA Technical Reports Server (NTRS)
Broderick, Ron
1997-01-01
The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.
Software Accelerates Computing Time for Complex Math
NASA Technical Reports Server (NTRS)
2014-01-01
Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology, traditionally used for computer video games, to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.
Standardized development of computer software. Part 1: Methods
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1976-01-01
This work is a two-volume set on standards for modern software engineering methodology. This volume presents a tutorial and practical guide to the efficient development of reliable computer software, a unified and coordinated discipline for design, coding, testing, documentation, and project organization and management. The aim of the monograph is to provide formal disciplines for increasing the probability of securing software that is characterized by high degrees of initial correctness, readability, and maintainability, and to promote practices which aid in the consistent and orderly development of a total software system within schedule and budgetary constraints. These disciplines are set forth as a set of rules to be applied during software development to drastically reduce the time traditionally spent in debugging, to increase documentation quality, to foster understandability among those who must come in contact with it, and to facilitate operations and alterations of the program as requirements on the program environment change.
Precise Documentation: The Key to Better Software
NASA Astrophysics Data System (ADS)
Parnas, David Lorge
The prime cause of the sorry "state of the art" in software development is our failure to produce good design documentation. Poor documentation is the cause of many errors and reduces efficiency in every phase of a software product's development and use. Most software developers believe that "documentation" refers to a collection of wordy, unstructured, introductory descriptions, thousands of pages that nobody wanted to write and nobody trusts. In contrast, engineers in more traditional disciplines think of precise blueprints, circuit diagrams, and mathematical specifications of component properties. Software developers do not know how to produce precise documents for software. Software developers also think that documentation is something written after the software has been developed. In other fields of engineering much of the documentation is written before and during development; it represents forethought, not afterthought. Among the benefits of better documentation would be: easier reuse of old designs, better communication about requirements, more useful design reviews, easier integration of separately written modules, more effective code inspection, more effective testing, and more efficient corrections and improvements. This paper explains how to produce and use precise software documentation and illustrates the methods with several examples.
Path generation algorithm for UML graphic modeling of aerospace test software
NASA Astrophysics Data System (ADS)
Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao
2018-03-01
Traditionally, aerospace software testing engineers have relied on their own experience and on communication with software developers to describe the software under test and to write test cases by hand, a process that is time-consuming, inefficient, and prone to gaps. With the high-reliability MBT (model-based testing) tool developed by our company, a single modeling pass can automatically generate test case documents, efficiently and accurately. For a UML model to describe a process accurately, the paths through the model must be expressed and reached. Existing path generation algorithms are either too simple, unable to combine branch paths with paths containing loops, or so cumbersome that they generate overly complicated path arrangements that are meaningless and, for aerospace software testing, superfluous. Drawing on our ten years of experience with aerospace payloads, we developed a tailored path generation algorithm for UML graphic models of aerospace test software.
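A generic sketch of bounded path enumeration of the kind argued for above (the paper's actual algorithm is not reproduced): a depth-first search that combines branches with loops but caps how often any edge may recur, keeping the path set finite and meaningful:

    MAX_EDGE_VISITS = 2  # each edge may appear at most this many times in a path

    def enumerate_paths(graph, start, end):
        # Depth-first enumeration; the per-edge visit budget bounds loop
        # traversals so the enumeration terminates.
        paths, stack = [], [(start, [start], {})]
        while stack:
            node, path, visits = stack.pop()
            if node == end:
                paths.append(path)
                continue
            for nxt in graph.get(node, []):
                edge = (node, nxt)
                if visits.get(edge, 0) >= MAX_EDGE_VISITS:
                    continue  # loop budget for this edge exhausted
                counts = dict(visits)
                counts[edge] = counts.get(edge, 0) + 1
                stack.append((nxt, path + [nxt], counts))
        return paths

    # A branch (B -> C or D) combined with a loop (C -> B):
    g = {"A": ["B"], "B": ["C", "D"], "C": ["B", "D"], "D": []}
    for p in enumerate_paths(g, "A", "D"):
        print(" -> ".join(p))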
Large-scale visualization projects for teaching software engineering.
Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel
2012-01-01
The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Astaf'ev, S. B., E-mail: bard@ns.crys.ras.ru; Shchedrin, B. M.; Yanusova, L. G.
2012-01-15
The main principles of developing the Basic Analysis of Reflectometry Data (BARD) software package, which is aimed at obtaining a unified (standardized) tool for analyzing the structure of thin multilayer films and nanostructures of different nature based on reflectometry data, are considered. This software package contains both traditionally used procedures for processing reflectometry data and the authors' original developments on the basis of new methods for carrying out and analyzing reflectometry experiments. The structure of the package, its functional possibilities, examples of application, and prospects of development are reviewed.
Addressing software security and mitigations in the life cycle
NASA Technical Reports Server (NTRS)
Gilliam, David; Powell, John; Haugh, Eric; Bishop, Matt
2004-01-01
Traditionally, security is viewed as an organizational and Information Technology (IT) systems function comprising of firewalls, intrusion detection systems (IDS), system security settings and patches to the operating system (OS) and applications running on it. Until recently, little thought has been given to the importance of security as a formal approach in the software life cycle. The Jet Propulsion Laboratory has approached the problem through the development of an integrated formal Software Security Assessment Instrument (SSAI) with six foci for the software life cycle.
Reconfigurable Software for Mission Operations
NASA Technical Reports Server (NTRS)
Trimble, Jay
2014-01-01
We developed software that provides flexibility to mission organizations through modularity and composability. Modularity enables removal and addition of functionality through the installation of plug-ins. Composability enables users to assemble software from pre-built reusable objects, thus reducing or eliminating the walls associated with traditional application architectures and enabling unique combinations of functionality. We have used composable objects to reduce display build time, create workflows, and build scenarios to test concepts for lunar roving operations. The software is open source, and may be downloaded from https://github.com/nasa/mct.
Lessons learned in transitioning to an open systems environment
NASA Technical Reports Server (NTRS)
Boland, Dillard E.; Green, David S.; Steger, Warren L.
1994-01-01
Software development organizations, both commercial and governmental, are undergoing rapid change spurred by developments in the computing industry. To stay competitive, these organizations must adopt new technologies, skills, and practices quickly. Yet even for an organization with a well-developed set of software engineering models and processes, transitioning to a new technology can be expensive and risky. Current industry trends are leading away from traditional mainframe environments and toward the workstation-based, open systems world. This paper presents the experiences of software engineers on three recent projects that pioneered open systems development for NASA's Flight Dynamics Division of the Goddard Space Flight Center (GSFC).
Information systems analysis approach in hospitals: a national survey.
Wong, B K; Sellaro, C L; Monaco, J A
1995-03-01
A survey of 216 hospitals reveals that some hospitals do not conduct cost-benefit analyses or analyze possible adverse effects in feasibility studies. In determining and analyzing system requirements, external factors that initiate the transaction are not examined, and computer-aided software engineering (CASE) tools are seldom used. Some hospitals do not investigate the advantages and disadvantages of using in-house-developed software versus purchased software packages in the evaluation of alternatives. The survey finds that, overall, most hospitals follow the traditional systems development life cycle (SDLC) approach in analyzing information systems.
Requirements for a multifunctional code architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiihonen, O.; Juslin, K.
1997-07-01
The present paper studies a set of requirements for a multifunctional simulation software architecture in the light of experience gained in developing and using the APROS simulation environment. The huge steps taken in the development of computer hardware and software during the last ten years are changing the status of traditional nuclear safety analysis software. The affordable computing power on the safety analyst's table by far exceeds the possibilities offered ten years ago. At the same time, the features of everyday office software tend to set standards for the way input data and calculational results are managed.
Combining Agile and Traditional: Customer Communication in Distributed Environment
NASA Astrophysics Data System (ADS)
Korkala, Mikko; Pikkarainen, Minna; Conboy, Kieran
Distributed development is a rapidly growing phenomenon in modern software development environments. At the same time, traditional and agile methodologies, and combinations of the two, are being used in industry. Agile approaches place a large emphasis on customer communication, yet existing knowledge on customer communication in distributed agile development is lacking. In order to shed light on this topic and provide practical guidelines for companies in distributed agile environments, a qualitative case study was conducted in a large, globally distributed software company. The key finding was that it can be difficult for an agile organization to get relevant information from a traditional type of customer organization, even when customer communication is active and carried out through multiple communication media. Several challenges discussed in this paper amount to an "information blackout," underlining the importance of an environment that fosters meaningful communication. A set of guidelines is proposed to help evaluate whether such an environment can be created.
The Effect of Interactive CD-ROM/Digitized Audio Courseware on Reading among Low-Literate Adults.
ERIC Educational Resources Information Center
Gretes, John A.; Green, Michael
1994-01-01
Compares a multimedia adult literacy instructional course, Reading to Educate and Develop Yourself (READY), to traditional classroom instruction by studying effects of replacing conventional learning tools with computer-assisted instruction (CD-ROMs and audio software). Results reveal that READY surpassed traditional instruction for virtually…
Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.
Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron
2018-02-01
New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.
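In outline (notation ours, not the CBID software's), the single-factor measurement model and the Bayesian update that distinguishes approaches (b) and (c) can be written as:

    x_j = \lambda_j \xi + \delta_j , \qquad j = 1, \dots, p ,
    \qquad
    p(\lambda \mid X) \propto p(X \mid \lambda)\, p(\lambda) ,

where the prior p(\lambda) is flat (uninformative) in approach (b) and is elicited from content-expert data in approach (c), which is what allows the smaller participant samples noted above.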
Zhou, Yu; Ren, Jie
2011-04-01
We put forward a new concept of a software oversampling mapping system for the electrocardiogram (ECG) to assist research on the ECG inverse problem, improving the generality of the mapping system and the quality of the mapped signals. We developed a conceptual system based on a traditional ECG detection circuit together with LabVIEW and a DAQ card produced by National Instruments, integrating the newly developed oversampling method into the system. The results indicated that the system could map ECG signals accurately and that the signal quality was good. The improvement of hardware and enhancement of software make the system suitable for mapping in different situations. The initial development of the software oversampling mapping system was thus successful, and further research and development can make the system a powerful tool for studying the ECG inverse problem.
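The oversampling idea itself can be sketched generically (this is not the authors' LabVIEW/DAQ implementation): sample well above the target rate, then average each block, attenuating uncorrelated noise by roughly the square root of the block size:

    import random

    def oversample_average(samples, n):
        # Decimate by n, replacing each complete block of n samples with its
        # mean; uncorrelated noise shrinks by about sqrt(n).
        return [sum(samples[i:i + n]) / n
                for i in range(0, len(samples) - n + 1, n)]

    random.seed(0)
    true_level = 1.0
    raw = [true_level + random.gauss(0.0, 0.1) for _ in range(4096)]  # noisy reads
    clean = oversample_average(raw, n=16)
    print(max(abs(s - true_level) for s in raw))    # raw noise spread
    print(max(abs(s - true_level) for s in clean))  # roughly 4x smaller after n=16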
The use of emulator-based simulators for on-board software maintenance
NASA Astrophysics Data System (ADS)
Irvine, M. M.; Dartnell, A.
2002-07-01
Traditionally, onboard software maintenance activities within the space sector are performed using hardware-based facilities. These facilities are developed around the use of hardware emulation or breadboards containing target processors. Some sort of environment is provided around the hardware to support the maintenance activities. However, these environments are not easy to use when setting up the required test scenarios, particularly when the onboard software executes in a dynamic I/O environment, e.g. attitude control software or data handling software. In addition, the hardware and/or environment may not support the test set-up required during investigations into software anomalies (e.g. raising a spurious interrupt, failing memory, etc.), and the overall "visibility" of the executing software may be limited. The Software Maintenance Simulator (SOMSIM) is a tool that can complement the traditional maintenance facilities. Some of the main benefits SOMSIM can provide are: a low-cost, flexible extension to an existing product, namely an operational simulator containing a software processor emulator; a system-level, high-fidelity test-bed in which the software "executes"; a high degree of control and configuration over the entire "system", including contingency conditions perhaps not possible with real hardware; and high visibility of, and control over, the execution of the emulated software. This paper describes the SOMSIM concept in more detail, and also describes the SOMSIM study being carried out for ESA/ESOC by VEGA IT GmbH.
Impact of office productivity cloud computing on energy consumption and greenhouse gas emissions.
Williams, Daniel R; Tang, Yinshan
2013-05-07
Cloud computing is usually regarded as being energy efficient and thus emitting less greenhouse gas (GHG) than traditional forms of computing. When the energy consumption of Microsoft's cloud-based Office 365 (O365) and traditional Office 2010 (O2010) software suites was tested and modeled, some cloud services were found to consume more energy than the traditional form. The model developed in this research took into consideration the energy consumption at the three main stages of data transmission: data center, network, and end-user device. Comparable products from each suite were selected, and activities were defined for each product to represent a different computing type. Microsoft provided highly confidential data for the data center stage, while the networking and user device stages were measured directly. A new measurement and software apportionment approach was defined and utilized, allowing the power consumption of cloud services to be directly measured at the user device stage. Results indicated that cloud computing is more energy efficient for Excel and Outlook, which consumed less energy and emitted less GHG than their standalone counterparts. The power consumption of the cloud-based Outlook (8%) and Excel (17%) was lower than that of their traditional counterparts. However, the power consumption of the cloud version of Word was 17% higher than its traditional equivalent. A third, mixed access method was also measured for Word, which emitted 5% more GHG than the traditional version. It is evident that cloud computing may not provide a unified way forward to reduce energy consumption and GHG. Direct conversion from a standalone package to a cloud provision platform can now consider energy and GHG emissions at the software development and cloud service design stage, using the methods described in this research.
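The three-stage accounting described above reduces to a simple additive model; the sketch below uses placeholder figures, not the study's confidential measurements:

    def task_energy_wh(device_w, duration_h, data_gb,
                       network_wh_per_gb, datacenter_wh_per_gb):
        # Additive three-stage model: end-user device + network + data center.
        device = device_w * duration_h
        network = network_wh_per_gb * data_gb
        datacenter = datacenter_wh_per_gb * data_gb
        return device + network + datacenter

    # Hypothetical one-hour editing session moving 0.05 GB of data:
    print(task_energy_wh(device_w=30.0, duration_h=1.0, data_gb=0.05,
                         network_wh_per_gb=5.0, datacenter_wh_per_gb=7.0))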
Application of Real Options Theory to DoD Software Acquisitions
2009-02-20
Future Combat Systems Program. Washington, DC: U.S. Government Printing Office. Damodaran, A. (2007). Investment Valuation: The Options to Expand... valuation methodology, when enhanced and properly formulated around a proposed or existing software investment employing the spiral development approach... The traditional real options valuation methodology, when enhanced and properly formulated
Natural language processing-based COTS software and related technologies survey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stickland, Michael G.; Conrad, Gregory N.; Eaton, Shelley M.
Natural language processing-based knowledge management software, traditionally developed for security organizations, is now becoming commercially available. An informal survey was conducted to discover and examine current NLP and related technologies and potential applications for information retrieval, information extraction, summarization, categorization, terminology management, link analysis, and visualization for possible implementation at Sandia National Laboratories. This report documents our current understanding of the technologies, lists software vendors and their products, and identifies potential applications of these technologies.
[Social network analysis of traditional Chinese medicine on treatment of constipation].
Du, Li-Dong; Tian, Jin-Hui; Wu, Guo-Tai; Niu, Ting-Hui; Chen, Zhen-He; Ren, Yuan
2017-01-01
Bibliometric and data mining methods were used to study the research topics and social networks of traditional Chinese medicine for constipation. Major Chinese databases were searched for studies of traditional Chinese medicine for constipation. BICOMS analysis software was used to extract and collect the main information and produce a co-occurrence matrix; gCLUTO software was used for cluster analysis. Data analysis was conducted using SPSS 19.0 software. The results showed that the number of studies on traditional Chinese medicine for constipation has constantly increased, with two publication peaks in 2003 and 2006. Related studies have been published in 31 provinces, autonomous regions, and municipalities, but studies from developed areas outnumber those from developing areas. There is little cooperation between research institutions and between authors, especially across different areas. At present, the research field of Chinese medicine for constipation is divided into five research topics; among specific traditional Chinese medicines, Angelica sinensis occupies the core position. The results showed a regional imbalance in the number of studies on Chinese medicine treatment for constipation, as well as little cooperation between researchers and research institutions. The research topics mainly focused on the evaluation of clinical efficacy, while research on optimizing prescriptions was still insufficient, so future researchers should pay more attention to studies of constipation prescriptions with Angelica sinensis as the core herb. Copyright© by the Chinese Pharmaceutical Association.
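The co-occurrence step can be sketched generically (BICOMS and gCLUTO internals are not reproduced; the keyword lists are invented): count how often keyword pairs appear in the same record:

    from collections import Counter
    from itertools import combinations

    records = [  # hypothetical per-study keyword lists
        ["constipation", "angelica sinensis", "clinical efficacy"],
        ["constipation", "angelica sinensis", "prescription"],
        ["constipation", "acupuncture"],
    ]
    cooccur = Counter()
    for keywords in records:
        # Count each unordered keyword pair once per record.
        for a, b in combinations(sorted(set(keywords)), 2):
            cooccur[(a, b)] += 1
    for pair, n in cooccur.most_common(3):
        print(pair, n)

The resulting matrix is what the clustering step partitions into research topics.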
Hoseinzadeh, Hamidreza; Taghipour, Ali; Yousefi, Mahdi
2018-01-01
Background: Development of a questionnaire based on the resources of Persian traditional medicine seems necessary. One of the problems faced by practitioners of traditional medicine is differing opinions regarding the diagnosis of general temperament or the temperament of an organ. One reason is the lack of validated tools, which has led to difficulties in training students of traditional medicine and in treating patients. The differences in detection methods have given rise to several treatment methods. Objective: The present study aimed to develop a questionnaire and standard software for the diagnosis of gastrointestinal dystemperaments. Methods: The present research is a tool-development study comprising 8 stages: developing the items, determining the statements based on the items, assessing the face validity, assessing the content validity, assessing the reliability, rating the items, developing software (named GDS v.1.1) to calculate the total score of the questionnaire, and evaluating the concurrent validity using statistical tests including Cronbach’s alpha and Cohen’s kappa coefficients. Results: Based on the results, 112 notes including 62 symptoms were extracted from the resources, and 58 items were obtained from in-person interview sessions with a panel of experts. A statement was selected for each item and, after merging a number of statements, a total of 49 statements was obtained. By calculating the statement impact score and determining the content validity, 6 and 10 further items, respectively, were removed from the list of statements. Standardized Cronbach’s alpha for this questionnaire was 0.795 and its concurrent validity was 0.8. Conclusion: A quantitative tool was developed for the diagnosis and examination of gastrointestinal dystemperaments. The developed questionnaire is adequately reliable and valid for this purpose. In addition, the software can be used for clinical diagnosis. PMID:29629060
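For reference, the standardized Cronbach's alpha reported above follows the standard definition (not restated in the paper) in terms of the number of statements k and the mean inter-item correlation \bar{r}:

    \alpha_{\mathrm{std}} = \frac{k\,\bar{r}}{1 + (k - 1)\,\bar{r}}

If, as we infer, the reported 0.795 was computed over the 49 - 6 - 10 = 33 retained statements, it would correspond to a mean inter-item correlation of roughly 0.11.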
Software support environment design knowledge capture
NASA Technical Reports Server (NTRS)
Dollman, Tom
1990-01-01
The objective of this task is to assess the potential for using the software support environment (SSE) workstations and associated software for design knowledge capture (DKC) tasks. This assessment will include the identification of required capabilities for DKC and hardware/software modifications needed to support DKC. Several approaches to achieving this objective are discussed and interim results are provided: (1) research into the problem of knowledge engineering in a traditional computer-aided software engineering (CASE) environment, like the SSE; (2) research into the problem of applying SSE CASE tools to develop knowledge based systems; and (3) direct utilization of SSE workstations to support a DKC activity.
Software-Reconfigurable Processors for Spacecraft
NASA Technical Reports Server (NTRS)
Farrington, Allen; Gray, Andrew; Bell, Bryan; Stanton, Valerie; Chong, Yong; Peters, Kenneth; Lee, Clement; Srinivasan, Jeffrey
2005-01-01
A report presents an overview of an architecture for a software-reconfigurable network data processor for a spacecraft engaged in scientific exploration. When executed on suitable electronic hardware, the software performs the functions of a physical layer (in effect, acts as a software radio in that it performs modulation, demodulation, pulse-shaping, error correction, coding, and decoding), a data-link layer, a network layer, a transport layer, and application-layer processing of scientific data. The software-reconfigurable network processor is undergoing development to enable rapid prototyping and rapid implementation of communication, navigation, and scientific signal-processing functions; to provide a long-lived communication infrastructure; and to provide greatly improved scientific-instrumentation and scientific-data-processing functions by enabling science-driven in-flight reconfiguration of computing resources devoted to these functions. This development is an extension of terrestrial radio and network developments (e.g., in the cellular-telephone industry) implemented in software running on such hardware as field-programmable gate arrays, digital signal processors, traditional digital circuits, and mixed-signal application-specific integrated circuits (ASICs).
Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka
2014-01-01
For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states (emotions and moods) deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint.
The Software Engineering Prototype.
1983-06-01
This only means that the 'claim', i.e., "accepted wisdom" in systems design, was set up as the alternative to the hypothesis, in accord with tradition... conflict and its resolution are likely to occur when users can exercise their influence in the development process. Conflict itself does not lead... the traditional method of software development often has poor results. Recently, a new approach to software development, the prototype approach
2003-03-01
private sector. Researchers have also identified software acquisitions as one of the major differences between the private sector and public sector MIS. This indicates that the elements for a successful software project in the public sector may be different from the private sector. Private sector project success depends on many elements. Three of them are user interaction with the project’s development, critical success factors, and how the project manager prioritizes the traditional success criteria.
Space station advanced automation
NASA Technical Reports Server (NTRS)
Woods, Donald
1990-01-01
In the development of a safe, productive and maintainable space station, Automation and Robotics (A&R) has been identified as an enabling technology that will allow efficient operation at a reasonable cost. The Space Station Freedom's (SSF) systems are very complex and interdependent. The use of Advanced Automation (AA) will help restructure and integrate system status so that station and ground personnel can operate more efficiently. Using AA technology to augment system management functions requires a development model consisting of well-defined phases: evaluation, development, integration, and maintenance. The evaluation phase will consider system management functions against traditional solutions, implementation techniques and requirements; the end result of this phase should be a well-developed concept along with a feasibility analysis. In the development phase the AA system will be developed in accordance with a traditional Life Cycle Model (LCM) modified for Knowledge Based System (KBS) applications. A way by which both knowledge bases and reasoning techniques can be reused to control costs is explained. During the integration phase the KBS software must be integrated with conventional software, and verified and validated. The Verification and Validation (V&V) techniques applicable to these KBS are based on the ideas of consistency, minimal competency, and graph theory. The maintenance phase will be aided by having well designed and documented KBS software.
Professional Ethics of Software Engineers: An Ethical Framework.
Lurie, Yotam; Mark, Shlomo
2016-04-01
The purpose of this article is to propose an ethical framework for software engineers that connects software developers' ethical responsibilities directly to their professional standards. The implementation of such an ethical framework can overcome the traditional dichotomy between professional skills and ethical skills, which plagues the engineering professions, by proposing an approach to the fundamental tasks of the practitioner, i.e., software development, in which the professional standards are intrinsically connected to the ethical responsibilities. In so doing, the ethical framework improves the practitioner's professionalism and ethics. We call this approach Ethical-Driven Software Development (EDSD). EDSD manifests the advantages of an ethical framework as an alternative to the all-too-familiar approach in professional ethics that advocates "stand-alone codes of ethics". We believe that one outcome of this synergy between professional and ethical skills is simply better engineers. Moreover, since there are often different software solutions that the engineer can provide to an issue at stake, the ethical framework provides a guiding principle, within the process of software development, that helps the engineer evaluate the advantages and disadvantages of different software solutions. It does not and cannot affect the end-product in and of itself. However, it can and should make the software engineer more conscious and aware of the ethical ramifications of certain engineering decisions within the process.
NASA Technical Reports Server (NTRS)
Allen, B. Danette
1998-01-01
In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.
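The abstract does not enumerate the specific techniques used on LITE; one widely used design technique consistent with its goal is table-driven (data-driven) design, sketched below in Python with invented telemetry-limit names. This is an illustrative pattern under those assumptions, not code from the LITE systems.

```python
# Hypothetical: command/telemetry limits kept in a data table rather than
# hard-coded, so a late requirements change becomes a table edit, not a
# code change. All names and values are invented for illustration.
LIMITS = {
    "laser_temp_C":  (-10.0, 45.0),
    "bus_voltage_V": (24.0, 32.0),
}

def check(measurements):
    """Return the subset of measurements that violate their configured limits."""
    violations = {}
    for name, value in measurements.items():
        lo, hi = LIMITS[name]
        if not lo <= value <= hi:
            violations[name] = value
    return violations

print(check({"laser_temp_C": 50.2, "bus_voltage_V": 28.1}))
```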
Systems Prototyping with Fourth Generation Tools.
ERIC Educational Resources Information Center
Sholtys, Phyllis
1983-01-01
The development of information systems using an engineering approach that combines traditional programming techniques with fourth-generation software tools is described. Fourth-generation application tools are used to quickly develop a prototype system that is revised as the user clarifies requirements. (MLW)
Whole earth modeling: developing and disseminating scientific software for computational geophysics.
NASA Astrophysics Data System (ADS)
Kellogg, L. H.
2016-12-01
Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and a need to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open-source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.
Validation and Verification of LADEE Models and Software
NASA Technical Reports Server (NTRS)
Gundy-Burlet, Karen
2013-01-01
The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the moon in order to measure the density, composition and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a Model-Based Software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight Software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational sciences models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection and code coverage analyses are utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.
The development and evaluation of a medical imaging training immersive environment
Bridge, Pete; Gunn, Therese; Kastanis, Lazaros; Pack, Darren; Rowntree, Pamela; Starkey, Debbie; Mahoney, Gaynor; Berry, Clare; Braithwaite, Vicki; Wilson-Stewart, Kelly
2014-01-01
Introduction A novel realistic 3D virtual reality (VR) application has been developed to allow medical imaging students at Queensland University of Technology to practice radiographic techniques independently outside the usual radiography laboratory. Methods A flexible agile development methodology was used to create the software rapidly and effectively. A 3D gaming environment and realistic models were used to engender presence in the software while tutor-determined gold standards enabled students to compare their performance and learn in a problem-based learning pedagogy. Results Students reported high levels of satisfaction and perceived value and the software enabled up to 40 concurrent users to prepare for clinical practice. Student feedback also indicated that they found 3D to be of limited value in the desktop version compared to the usual 2D approach. A randomised comparison between groups receiving software-based and traditional practice measured performance in a formative role play with real equipment. The results of this work indicated superior performance with the equipment for the VR trained students (P = 0.0366) and confirmed the value of VR for enhancing 3D equipment-based problem-solving skills. Conclusions Students practising projection techniques virtually performed better at role play assessments than students practising in a traditional radiography laboratory only. The application particularly helped with 3D equipment configuration, suggesting that teaching 3D problem solving is an ideal use of such medical equipment simulators. Ongoing development work aims to establish the role of VR software in preparing students for clinical practice with a range of medical imaging equipment. PMID:26229652
NASA Astrophysics Data System (ADS)
Qi, Yong; Lei, Kai; Zhang, Lizeqing; Xing, Ximing; Gou, Wenyue
2018-06-01
This paper introduced the development of self-service, data-assisted diagnosis software for cervical cancer on the basis of artificial neural networks and related classifiers (SVM, FNN, KNN). The system is developed around the idea of a self-service platform, supported by the application of neural network algorithms to medical data recognition. Furthermore, it combines advanced methods from various fields to address the complexity and inaccuracy of interpreting cervical canceration data in traditional manual diagnosis.
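The paper's dataset and models are not reproduced here, but the comparison pattern it describes, trying SVM, feedforward neural network and KNN classifiers on tabular medical data, can be sketched with scikit-learn as follows; the stand-in dataset is a public one and all settings are library defaults.

```python
from sklearn.datasets import load_breast_cancer  # stand-in tabular medical data
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
models = {
    "SVM": SVC(),
    "FNN": MLPClassifier(max_iter=2000),  # a simple feedforward network
    "KNN": KNeighborsClassifier(),
}
for name, model in models.items():
    clf = make_pipeline(StandardScaler(), model)      # scale, then classify
    scores = cross_val_score(clf, X, y, cv=5)         # 5-fold accuracy
    print(f"{name}: {scores.mean():.3f}")
```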
Protocol independent transmission method in software defined optical network
NASA Astrophysics Data System (ADS)
Liu, Yuze; Li, Hui; Hou, Yanfang; Qiu, Yajun; Ji, Yuefeng
2016-10-01
With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). Using a proprietary protocol or encoding format is one way to improve information security. However, flows carried by a proprietary protocol or encoding cannot traverse the traditional IP network. In addition, ultra-high-definition video transmission services have once again become a hot spot. Traditionally, in the IP network, the Serial Digital Interface (SDI) signal must be compressed. This approach offers some advantages but also brings disadvantages such as signal degradation and high latency. To some extent, HD-SDI can also be regarded as a proprietary protocol, which needs transparent transmission, such as an optical channel. However, traditional optical networks cannot support flexible traffic. In response to the aforementioned challenges for future networks, one immediate solution would be to use NFV technology to abstract the network infrastructure and provide an all-optical switching topology graph for the SDN control plane. This paper proposes a new service-based software defined optical network architecture, including an infrastructure layer, a virtualization layer, a service abstract layer and an application layer. We then dwell on the corresponding service providing method in order to implement protocol-independent transport. Finally, we experimentally demonstrate that the proposed service providing method can be applied to transmit the HD-SDI signal in the software-defined optical network.
AKM in Open Source Communities
NASA Astrophysics Data System (ADS)
Stamelos, Ioannis; Kakarontzas, George
Previous chapters in this book have dealt with Architecture Knowledge Management in traditional Closed Source Software (CSS) projects. This chapter will attempt to examine the ways that knowledge is shared among participants in Free/Libre Open Source Software (FLOSS) projects and how architectural knowledge is managed with respect to CSS. FLOSS projects are organized and developed in a fundamentally different way than CSS projects: they simply do not develop code as CSS projects do. As a consequence, their knowledge management mechanisms are also based on different concepts and tools.
Romanian traditional motif - element of modernity in clothing
NASA Astrophysics Data System (ADS)
Doble, L.; Stan, O.; Suteu, M. D.; Albu, A.; Bohm, G.; Tsatsarou-Michalaki, A.; Gialinou, E.
2017-10-01
In this paper the phases of improving a clothing item, a straight-cut women's jacket, from an aesthetic point of view are presented, using software design patterns, computerized graphics and various modern textile technologies including industrial embroidery, digital printing and sublimation. In the first phase, documentation was gathered at the Ethnographic Museum of Transylvania in Cluj-Napoca, where several traditional motifs specific to the Transylvania ethnographic region were selected, reinterpreted and stylized while preserving the symbolism and color range specific to the area. For the styling phase the CorelDraw vector graphics program was used, which allows changing the shape, size and color of the drawings without affecting the identity of the pattern. In the pattern design phase Gemini CAD software was used, and for the modeling and model development Optitex software was used. The embellishment of the model was performed using embroidery machine software, reproducing the stylized motif identically. In order to obtain a significantly improved aesthetic look and added artistic value, the pattern chosen for the jacket was realized using a combination of modern textile technologies. This allowed the realization of a particular texture on the surface of the designed product, demonstrating that traditional patterns can be reinterpreted in modern clothing.
From Bridges and Rockets, Lessons for Software Systems
NASA Technical Reports Server (NTRS)
Holloway, C. Michael
2004-01-01
Although differences exist between building software systems and building physical structures such as bridges and rockets, enough similarities exist that software engineers can learn lessons from failures in traditional engineering disciplines. This paper draws lessons from two well-known failures, the collapse of the Tacoma Narrows Bridge in 1940 and the destruction of the space shuttle Challenger in 1986, and applies these lessons to software system development. The following specific applications are made: (1) the verification and validation of a software system should not be based on a single method, or a single style of methods; (2) the tendency to embrace the latest fad should be overcome; and (3) the introduction of software control into safety-critical systems should be done cautiously.
ERIC Educational Resources Information Center
Karemaker, Arjette; Pitchford, Nicola J.; O'Malley, Claire
2010-01-01
The effectiveness of a reading intervention using the whole-word multimedia software "Oxford Reading Tree (ORT) for Clicker" was compared to a reading intervention using traditional ORT Big Books. Developing literacy skills and attitudes towards learning to read were assessed in a group of 17 struggling beginner readers aged 5-6 years. Each child…
A Roadmap for using Agile Development in a Traditional System
NASA Technical Reports Server (NTRS)
Streiffert, Barbara; Starbird, Thomas
2006-01-01
I. Ensemble Development Group: a) Produces activity planning software for spacecraft; b) Built on the Eclipse Rich Client Platform (open source development and runtime software); c) Funded by multiple sources including the Mars Technology Program; d) Incorporated the use of Agile Development. II. Next Generation Uplink Planning System: a) Researches the Activity Planning and Sequencing Subsystem (APSS) for the Mars Science Laboratory; b) APSS includes Ensemble, Activity Modeling, Constraint Checking, Command Editing and Sequencing tools plus other uplink generation utilities; c) Funded by the Mars Technology Program; d) Integrates all of the tools for APSS.
An Assessment of Educational Benefits from the OpenOrbiter Space Program
ERIC Educational Resources Information Center
Straub, Jeremy; Whalen, David
2013-01-01
This paper analyzes the educational impact of the OpenOrbiter Small Spacecraft Development Initiative, a CubeSat development program underway at the University of North Dakota. OpenOrbiter includes traditional STEM activities (e.g., spacecraft engineering, software development); it also incorporates students from non-STEM disciplines not generally…
Improving communication among nurses and patients.
Unluturk, Mehmet S; Ozcanhan, Mehmet H; Dalkilic, Gokhan
2015-07-01
Patients use nurse call systems to signal nurses for medical help. Traditional push-button/flashing-lamp call systems are not integrated with other hospital automation systems; therefore, nurse response time becomes a matter of personal discretion. The improvement obtained by integrating a pager system into the nurse call system does not increase care efficiency, because unnecessary visits are still not eliminated. To obtain an immediate response and a purposeful visit by a nurse, regardless of the nurse's location in the hospital, traditional systems have to be improved by intelligent telephone system integration. The results of the developed Nurse Call System Software (NCSS), the Wireless Phone System Software (WPSS) and the Location System Software (LSS), and the communication protocol, are provided, together with detailed XML message structures. The benefits of the proposed system are also discussed and the direction of future work is presented. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
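The paper's actual XML schemas are not reproduced in the abstract; the Python sketch below merely illustrates what a location-aware nurse-call message of the kind exchanged between the NCSS, WPSS and LSS components might look like, with all element names invented.

```python
import xml.etree.ElementTree as ET

# Hypothetical message shape; the paper's real XML structures are not shown here.
msg = ET.Element("NurseCall")
ET.SubElement(msg, "Room").text = "217B"
ET.SubElement(msg, "Priority").text = "high"
ET.SubElement(msg, "Location").text = "ward-2/east"  # as might come from the LSS

wire = ET.tostring(msg, encoding="unicode")
print(wire)  # <NurseCall><Room>217B</Room><Priority>high</Priority>...</NurseCall>
```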
Computer-assisted concept mapping: Visual aids for knowledge construction
Mammen, Jennifer R.
2016-01-01
Background Concept mapping is a visual representation of ideas that facilitates critical thinking and is applicable to many areas of nursing education. Computer-Assisted Concept Maps are more flexible and less constrained than traditional paper methods, allowing for analysis and synthesis of complex topics and larger amounts of data. Ability to iteratively revise and collaboratively create computerized maps can contribute to enhanced interpersonal learning. However, there is limited awareness of free software that can support these types of applications. Discussion This educational brief examines affordances and limitations of Computer-Assisted Concept Maps and reviews free software for development of complex, collaborative malleable maps. Free software such as VUE, Xmind, MindMaple, and others can substantially contribute to utility of concept-mapping for nursing education. Conclusions Computerized concept-mapping is an important tool for nursing and is likely to hold greater benefit for students and faculty than traditional pen and paper methods alone. PMID:27351610
Telescience Resource Kit (TReK)
NASA Technical Reports Server (NTRS)
Lippincott, Jeff
2015-01-01
Telescience Resource Kit (TReK) is one of the Huntsville Operations Support Center (HOSC) remote operations solutions. It can be used to monitor and control International Space Station (ISS) payloads from anywhere in the world. It comprises a suite of software applications and libraries that provide generic data system capabilities and access to HOSC services. The TReK software has been operational since 2000. A new cross-platform version of TReK is under development, with releases phased over the 2014-2016 timeframe. The TReK Release 3.x series is the original TReK software that has been operational since 2000. This software runs on Windows. It contains capabilities to support traditional telemetry and commanding using CCSDS (Consultative Committee for Space Data Systems) packets. The TReK Release 4.x series is the new cross-platform software. It runs on Windows and Linux, and will support communication using standard IP protocols as well as traditional telemetry and commanding. All the software listed above is compatible and can be installed and run together on Windows. The new TReK software contains a suite of software that can be used by payload developers on the ground and onboard (TReK Toolkit). TReK Toolkit is a suite of lightweight libraries and utility applications for use onboard and on the ground. TReK Desktop is the full suite of TReK software, most useful on the ground. When TReK Desktop is released, the TReK installation program will provide the option to install just the TReK Toolkit portion of the software or the full TReK Desktop suite. The ISS program is providing the TReK Toolkit software as a generic flight software capability offered as a standard service to payloads. TReK software verification was conducted during the April/May 2015 timeframe; payload teams using the TReK software onboard can reference this verification. TReK will be demonstrated on-orbit running on an ISS-provided T61p laptop (target timeframe: September 2015 - 2016). The on-orbit demonstration will collect benchmark metrics and will be used in the future to provide live demonstrations during ISS Payload Conferences. Benchmark metrics and demonstrations will address the protocols described in SSP 52050-0047 Ku Forward section 3.3.7 (associated term: CCSDS File Delivery Protocol (CFDP)).
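As background for the CCSDS packet support mentioned above, the following Python sketch parses the standard 6-byte CCSDS space packet primary header. This is the generic packet layout from the CCSDS specification, not TReK's internal API.

```python
import struct

def parse_ccsds_primary_header(packet: bytes):
    """Parse the 6-byte CCSDS Space Packet primary header."""
    word0, word1, length = struct.unpack(">HHH", packet[:6])
    return {
        "version":       word0 >> 13,          # 3 bits
        "type":         (word0 >> 12) & 0x1,   # 0 = telemetry, 1 = telecommand
        "sec_hdr_flag": (word0 >> 11) & 0x1,
        "apid":          word0 & 0x7FF,        # 11-bit application process ID
        "seq_flags":     word1 >> 14,
        "seq_count":     word1 & 0x3FFF,       # 14-bit sequence count
        "data_length":   length + 1,           # field stores (octet count - 1)
    }

# Example: telemetry packet for APID 0x123 with 10 data octets
hdr = struct.pack(">HHH",
                  (0 << 13) | (0 << 12) | (1 << 11) | 0x123,
                  (0b11 << 14) | 42,
                  10 - 1)
print(parse_ccsds_primary_header(hdr + bytes(10)))
```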
Service-oriented Software Defined Optical Networks for Cloud Computing
NASA Astrophysics Data System (ADS)
Liu, Yuze; Li, Hui; Ji, Yuefeng
2017-10-01
With the development of big data and cloud computing technology, the traditional software-defined network is facing new challenges (e.g., ubiquitous accessibility, higher bandwidth, more flexible management and greater security). This paper proposes a new service-oriented software defined optical network architecture, including a resource layer, a service abstract layer, a control layer and an application layer. We then dwell on the corresponding service providing method. Different service ID is used to identify the service a device can offer. Finally, we experimentally evaluate that proposed service providing method can be applied to transmit different services based on the service ID in the service-oriented software defined optical network.
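A minimal sketch of the service-ID idea follows, with invented IDs and handlers: the control plane looks up the service a device offers by its ID and dispatches provisioning accordingly. This illustrates the dispatch pattern only, not the paper's actual control-plane interface.

```python
# Hypothetical mapping from service IDs to provisioning handlers.
def provision_video(request):
    return f"allocating high-bandwidth lightpath for {request}"

def provision_bulk(request):
    return f"scheduling best-effort wavelength for {request}"

SERVICE_HANDLERS = {
    0x01: provision_video,  # e.g., ultra-high-definition video
    0x02: provision_bulk,   # e.g., cloud backup traffic
}

def handle(service_id, request):
    """Dispatch a transmission request based on the device's service ID."""
    return SERVICE_HANDLERS[service_id](request)

print(handle(0x01, "camera-7 feed"))
```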
NASA Technical Reports Server (NTRS)
Bordano, Aldo; Uhde-Lacovara, JO; Devall, Ray; Partin, Charles; Sugano, Jeff; Doane, Kent; Compton, Jim
1993-01-01
The Navigation, Control and Aeronautics Division (NCAD) at NASA-JSC is exploring ways of producing Guidance, Navigation and Control (GN&C) flight software faster, better, and cheaper. To achieve these goals NCAD established two hardware/software facilities that take an avionics design project from initial inception through high fidelity real-time hardware-in-the-loop testing. Commercially available software products are used to develop the GN&C algorithms in block diagram form and then automatically generate source code from these diagrams. A high fidelity real-time hardware-in-the-loop laboratory provides users with the capability to analyze mass memory usage within the targeted flight computer, verify hardware interfaces, conduct system level verification, performance, acceptance testing, as well as mission verification using reconfigurable and mission unique data. To evaluate these concepts and tools, NCAD embarked on a project to build a real-time 6 DOF simulation of the Soyuz Assured Crew Return Vehicle flight software. To date, a productivity increase of 185 percent has been seen over traditional NASA methods for developing flight software.
NASA Astrophysics Data System (ADS)
Xie, Songhua; Li, Dehua; Nie, Hui
2009-10-01
There are a large number of fuzzy concepts and fuzzy phenomena in traditional Chinese medicine, which have led to great difficulties for the study of traditional Chinese medicine. In this paper, mathematical methods are used to quantify fuzzy concepts of drugs and prescriptions. We put forward a process for innovating formulations and a selection method in Chinese medicine based on the Possibility Construction Space Theory (PCST) and fuzzy pattern recognition. Experimental results show that the method of selecting medicines from a number of characteristics of traditional Chinese medicine is consistent with the basic theory of traditional Chinese medicine. The results also reflect the integrated effects of the innovated compound. Through the use of the innovative formulations system, we expect to provide software tools for developing new traditional Chinese medicines and to inspire traditional Chinese medicine researchers to develop novel drugs.
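As a toy illustration of fuzzy pattern recognition in this setting (not the paper's PCST-based method), a candidate herb profile can be matched to prototype patterns by a fuzzy closeness measure; all feature names and numbers below are invented.

```python
import numpy as np

# Hypothetical feature vectors: degrees (in [0, 1]) to which a herb exhibits
# properties such as "warming", "moistening", "purgative".
prototypes = {
    "blood-nourishing": np.array([0.6, 0.8, 0.2]),
    "heat-clearing":    np.array([0.1, 0.4, 0.7]),
}

def membership(x, p):
    """A simple fuzzy closeness measure: 1 minus the mean absolute difference."""
    return 1.0 - np.abs(x - p).mean()

candidate = np.array([0.7, 0.9, 0.1])  # an invented herb profile
best = max(prototypes, key=lambda k: membership(candidate, prototypes[k]))
print(best, {k: round(membership(candidate, v), 3) for k, v in prototypes.items()})
```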
Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models
NASA Astrophysics Data System (ADS)
Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto
In this paper, we propose a set of diagrams to visualize software process reference models (PRMs). The diagrams, called dimods, are a combination of visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The results of the evaluation show that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.
Clinical software development for the Web: lessons learned from the BOADICEA project.
Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F
2012-04-10
In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.
NASA Astrophysics Data System (ADS)
Dulo, D. A.
Safety-critical software systems permeate spacecraft, and in a long-term venture like a starship they would be pervasive in every system of the spacecraft. Yet software failure today continues to plague both the systems and the organizations that develop them, resulting in the loss of life, time, money, and valuable system platforms. A starship cannot afford this type of software failure on long journeys away from home. A single software failure could have catastrophic results for the spaceship and the crew onboard. This paper will offer a new approach to developing safe, reliable software systems by focusing not on the traditional safety/reliability engineering paradigms but rather on a new paradigm: Resilience and Failure Obviation Engineering. The foremost objective of this approach is the obviation of failure, coupled with the ability of a software system to prevent or adapt to complex changing conditions in real time, as a safety valve should failure occur, to ensure safe system continuity. Through this approach, safety is ensured through foresight to anticipate failure and to adapt to risk in real time before failure occurs. In a starship, this type of software engineering is vital. Through software developed in a resilient manner, a starship would have reduced or eliminated software failure, and would have the ability to rapidly adapt should a software system become unstable or unsafe. As a result, long-term software safety, reliability, and resilience would be present for a successful long-term starship mission.
The application of domain-driven design in NMS
NASA Astrophysics Data System (ADS)
Zhang, Jinsong; Chen, Yan; Qin, Shengjun
2011-12-01
In the traditional data-model-driven design approach, the system analysis and design phases are often separated, which means requirements information cannot be expressed explicitly. The method also tends to lead developers toward process-oriented programming, leaving the code between modules or between layers disordered, so it is hard to meet requirements for system scalability. This paper proposes a software hierarchy based on a rich domain model according to domain-driven design, named FHRDM; the WebWork + Spring + Hibernate (WSH) framework is then determined. Domain-driven design aims to construct a domain model that meets both the demands of the field where the software exists and the needs of software development. In this way, problems in the development of a Navigational Maritime System (NMS), such as large business volumes, difficulty of requirements elicitation, high development costs and long development cycles, can be resolved successfully.
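A minimal sketch of the rich-domain-model idea at the heart of domain-driven design: behavior and invariants live on the entity itself rather than in procedural service code. The class below is illustrative, loosely themed on a maritime system, and is not the FHRDM code.

```python
from dataclasses import dataclass, field

@dataclass
class Voyage:
    """A rich domain entity: it enforces its own business rules."""
    vessel: str
    waypoints: list = field(default_factory=list)
    departed: bool = False

    def add_waypoint(self, wp):
        if self.departed:
            raise ValueError("route is frozen once the voyage departs")
        self.waypoints.append(wp)

    def depart(self):
        if not self.waypoints:
            raise ValueError("a voyage needs at least one waypoint")
        self.departed = True

v = Voyage("MV Example")
v.add_waypoint("Qingdao")
v.depart()  # invariants checked by the model, not by scattered service code
```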
3D virtual environment of Taman Mini Indonesia Indah in a web
NASA Astrophysics Data System (ADS)
Wardijono, B. A.; Wardhani, I. P.; Chandra, Y. I.; Pamungkas, B. U. G.
2018-05-01
Taman Mini Indonesia Indah (TMII) is the largest culture-based recreational park in Indonesia. The park covers 250 acres and contains traditional houses from the various provinces of Indonesia. The official TMII website describes the traditional houses, but the information available to the public is limited. To provide more detailed information about TMII to the public, this research aims to create and develop virtual traditional houses as 3D graphics models and show them via a website. Virtual Reality (VR) technology was used to display the visualization of TMII and the surrounding environment. This research used Blender software to create the 3D models and Unity3D software to make virtual reality models that can be shown on the web. This research successfully created 33 virtual traditional houses of the provinces of Indonesia. The textures of the traditional houses were taken from the originals to make the cultural houses realistic. The result of this research is the TMII website, including virtual cultural houses that can be displayed through a web browser. The website consists of virtual environment scenes, and internet users can walk through and navigate inside the scenes.
NASA Technical Reports Server (NTRS)
Tamayo, Tak Chai
1987-01-01
The quality of software is not only vital to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. The defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancy, making traditional statistical analysis unsuitable for evaluating the reliability of software. A statistical model was developed to provide a representation of the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
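The abstract does not give the model's functional form. As an illustration of the software reliability growth family it belongs to (not necessarily the paper's model), the Goel-Okumoto model expresses the expected cumulative number of failures found by test time t as:

```latex
% Goel-Okumoto NHPP model: a = total expected failures,
% b = per-fault detection rate. Illustrative of the model family only.
\mu(t) = a\left(1 - e^{-bt}\right), \qquad
\lambda(t) = \frac{d\mu}{dt} = a\,b\,e^{-bt}
```

Fitting a and b to the observed failure history then yields quantitative stopping criteria of the kind described above, e.g., continue testing until the estimated remaining failures a - mu(t) fall below a target.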
Image Processing Occupancy Sensor
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Image Processing Occupancy Sensor, or IPOS, is a novel sensor technology developed at the National Renewable Energy Laboratory (NREL). The sensor is based on low-cost embedded microprocessors widely used by the smartphone industry and leverages mature open-source computer vision software libraries. Compared to traditional passive infrared and ultrasonic-based motion sensors currently used for occupancy detection, IPOS has shown the potential for improved accuracy and a richer set of feedback signals for occupant-optimized lighting, daylighting, temperature setback, ventilation control, and other occupancy- and location-based uses. Unlike traditional passive infrared (PIR) or ultrasonic occupancy sensors, which infer occupancy based only on motion, IPOS uses digital image-based analysis to detect and classify various aspects of occupancy, including the presence of occupants regardless of motion, their number, location and activity levels, as well as the illuminance properties of the monitored space. The IPOS software leverages the recent availability of low-cost embedded computing platforms, computer vision software libraries, and camera elements.
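A minimal open-source sketch in the spirit of IPOS, using OpenCV background subtraction to flag presence from a camera stream. NREL's actual pipeline is richer (it classifies occupants even without motion), and the 1% threshold here is arbitrary.

```python
import cv2

cap = cv2.VideoCapture(0)                                  # first attached camera
bg = cv2.createBackgroundSubtractorMOG2(detectShadows=False)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = bg.apply(frame)                                 # foreground pixel mask
    occupied = cv2.countNonZero(mask) > 0.01 * mask.size   # arbitrary threshold
    print("occupied" if occupied else "vacant")
    if cv2.waitKey(30) == 27:                              # Esc to quit
        break
cap.release()
```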
A Teamwork-Oriented Air Traffic Control Simulator
2006-06-01
the software development methodology of this work; this chapter is viewed as the acquisition phase of this model. The end of the ... maintenance phase ... [life-cycle phase labels from a figure: Development, Maintenance, Verification, Retirement] ... because the different controllers working in these phases usually ... traditional operation such as scaling the airport and personalizing the working environment. 4. Pilot Specification. The ...
ERIC Educational Resources Information Center
Deek, Fadi; Espinosa, Idania
2005-01-01
Traditionally, novice programmers have had difficulties in three distinct areas: breaking down a given problem, designing a workable solution, and debugging the resulting program. Many programming environments, software applications, and teaching tools have been developed to address the difficulties faced by these novices. Along with advancements…
Agile Methods and Request for Change (RFC): Observations from DoD Acquisition Programs
2014-01-01
at the Software Development Plan, then it's worth having a conversation with the contractor that includes answering the above questions. ... [acquisition phase labels from a figure: MSA, TD, EMD] ... [Lapham 2010] ... those undertaken in more traditional waterfall-based developments. Some of the government PMO enabling ...
Future Field Programmable Gate Array (FPGA) Design Methodologies and Tool Flows
2008-07-01
(a) that the results are accepted by users, vendors, ... and (b) that they can quantitatively explain HPC rules of thumb such as: "OpenMP is easier ..." ... in productivity that were demonstrated by traditional software systems. Using advances in software productivity as a guide, we have identified three ... of this study we developed a productivity model to guide our investigation (14). Models have limitations and the model we propose is no exception.
Software-Based Safety Systems in Space - Learning from other Domains
NASA Astrophysics Data System (ADS)
Klicker, M.; Putzer, H.
2012-01-01
Increasing complexity and new emerging capabilities for manned and unmanned missions have been the hallmark of the past decades of space exploration. One of the drivers in this process was the ever-increasing use of software and software-intensive systems to implement the system functions necessary for the capabilities needed. The course of technological evolution suggests that this development will continue well into the future, with a number of challenges for the safety community, some of which are discussed in this paper. The current state of the art reveals a number of problems with developing and assessing safety-critical software, which explains the reluctance of the space community to rely on software-based safety measures to mitigate hazards. Among the cited reasons are the lack of trustworthy evidence of software integrity in all foreseeable situations and the difficulty of integrating software into the traditional safety analysis framework. Experience from other domains and recent developments in modern software development methodologies and verification techniques are analysed for their suitability for space systems, and an avionics architectural framework (see STANAG 4626) for the implementation of safety-critical software is proposed. This is shown to create, among other features, the possibility of numerous degradation modes enhancing overall system safety and the interoperability of computerized space systems. It also potentially simplifies international cooperation on a technical level by introducing a higher degree of compatibility. As software safety cannot be tested or argued into a system in hindsight, the development process, and especially the architecture chosen, are essential to establish safety properties for the software used to implement safety functions. The core of the safety argument revolves around the separation of different functions and software modules from each other, through minimal coupling of functions and credible separation mechanisms in the architecture, combined with rigorous development methodologies for the software itself.
NASA Astrophysics Data System (ADS)
Laracuente, Nicholas; Grossman, Carl
2013-03-01
We developed an algorithm and software to calculate autocorrelation functions from real-time photon-counting data using the fast, parallel capabilities of graphics processor units (GPUs). Recent developments in hardware and software have allowed for general-purpose computing with inexpensive GPU hardware. These devices are better suited to emulating hardware autocorrelators than traditional CPU-based software applications, because they emphasize parallel throughput over sequential speed. Incoming data are binned in a standard multi-tau scheme with configurable points-per-bin size and are mapped into a GPU memory pattern to reduce time-expensive memory access. Applications include dynamic light scattering (DLS) and fluorescence correlation spectroscopy (FCS) experiments. We ran the software on a 64-core graphics PCI card in a computer with a 3.2 GHz Intel i5 CPU running Linux. FCS measurements were made on Alexa-546 and Texas Red dyes in a standard buffer (PBS). Software correlations were compared to hardware correlator measurements on the same signals. Supported by HHMI and Swarthmore College.
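For reference, a compact CPU-side Python sketch of the multi-tau binning scheme described above; the GPU implementation parallelizes this, and parameters such as the points-per-bin value m are configurable, as in the paper.

```python
import numpy as np

def multi_tau_autocorr(counts, m=16, levels=8):
    """Multi-tau autocorrelation: m linear lags per level, with the
    trace re-binned by 2 between levels so lag spacing grows geometrically."""
    x = np.asarray(counts, dtype=float).copy()
    lags, g = [], []
    dt = 1  # lag spacing at the current level, in units of the base bin width
    for level in range(levels):
        start = 0 if level == 0 else m // 2  # skip lags already covered
        for k in range(start, m):
            n = x.size - k
            if n <= 1:
                break
            num = np.mean(x[:n] * x[k:k + n])
            den = np.mean(x[:n]) * np.mean(x[k:k + n])
            lags.append(k * dt)
            g.append(num / den)  # normalized g2 estimate at lag k*dt
        if x.size % 2:           # re-bin: sum adjacent samples,
            x = x[:-1]           # doubling the effective bin width
        x = x[::2] + x[1::2]
        dt *= 2
    return np.array(lags), np.array(g)

rng = np.random.default_rng(0)
taus, g2 = multi_tau_autocorr(rng.poisson(5, 100_000))
print(taus[:5], g2[:5])
```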
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
The Need for V&V in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
Pre-Mastering and CD-WO Evaluations
NASA Technical Reports Server (NTRS)
Hecox, D.; Hyon, J.; Martin, M.; Marski, K.; Shields, E.; Sorensen, S.; Teramae, S.
1993-01-01
This article reviews the features and functionality of five desktop pre-mastering software packages for the PC. Desktop pre-mastering packages are aimed primarily at end-users interested in bringing CD-ROM publishing tasks in-house, rather than traditional CD-ROM developers.
Experiences with Extreme Programming
ERIC Educational Resources Information Center
Sherrell, Linda; Krishna, Bhagavathy; Velaga, Natasha; Vejandla, Pavan; Satharla, Mahesh
2010-01-01
Agile methodologies have become increasingly popular among software developers as evidenced by industrial participation at related conferences. The popularity of agile practices over traditional techniques partly stems from the fact that these practices provide for more customer involvement and better accommodate rapidly changing requirements,…
A Comparison of Two Approaches to Safety Analysis Based on Use Cases
NASA Astrophysics Data System (ADS)
Stålhane, Tor; Sindre, Guttorm
Engineering has a long tradition of analyzing the safety of mechanical, electrical and electronic systems. Important methods like HazOp and FMEA have also been adopted by the software engineering community. The misuse case method, on the other hand, has been developed by the software community as an alternative to FMEA and preliminary HazOp for software development. To compare the two methods, misuse cases and FMEA, we ran a small experiment involving 42 third-year software engineering students. In the experiment, the students were asked to identify and analyze failure modes from one of the use cases for a commercial electronic patient journal system. The results of the experiment show that, on average, the group that used misuse cases identified and analyzed more user-related failure modes than the persons using FMEA. In addition, the persons who used misuse cases scored better on perceived ease of use and intention to use.
State of the art metrics for aspect oriented programming
NASA Astrophysics Data System (ADS)
Ghareb, Mazen Ismaeel; Allen, Gary
2018-04-01
The quality evaluation of software, e.g., defect measurement, gains significance with the growing use of software applications. Metric measurements are considered the primary indicators of defect prediction and software maintenance in various empirical studies of software products. However, there is no agreement on which metrics are compelling quality indicators for novel development approaches such as Aspect Oriented Programming (AOP). AOP intends to enhance programming quality by providing new and novel constructs for the development of systems, for example, pointcuts, advice and inter-type relationships. Hence, it is not evident whether quality indicators for AOP can be derived from direct expansions of traditional OO measurements. Then again, investigations of AOP do regularly depend on established coupling measurements. Notwithstanding the late adoption of AOP in empirical studies, coupling measurements have been adopted as useful markers of fault proneness in this context. In this paper we investigate the state of the art metrics for measurement of Aspect Oriented systems development.
Use of Docker for deployment and testing of astronomy software
NASA Astrophysics Data System (ADS)
Morris, D.; Voutsinas, S.; Hambly, N. C.; Mann, R. G.
2017-07-01
We describe preliminary investigations of using Docker for the deployment and testing of astronomy software. Docker is a relatively new containerization technology that is developing rapidly and being adopted across a range of domains. It is based upon virtualization at operating system level, which presents many advantages in comparison to the more traditional hardware virtualization that underpins most cloud computing infrastructure today. A particular strength of Docker is its simple format for describing and managing software containers, which has benefits for software developers, system administrators and end users. We report on our experiences from two projects - a simple activity to demonstrate how Docker works, and a more elaborate set of services that demonstrates more of its capabilities and what they can achieve within an astronomical context - and include an account of how we solved problems through interaction with Docker's very active open source development community, which is currently the key to the most effective use of this rapidly-changing technology.
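For readers unfamiliar with Docker's programmatic interfaces, a minimal sketch of the build-and-test workflow the paper describes, using the Docker SDK for Python; the image tag, Dockerfile location, and test command are placeholders, not details from the paper.

```python
# Build an image from a local Dockerfile and run a test suite in a
# throwaway container (assumes `pip install docker` and a running daemon).
import docker

client = docker.from_env()

# "astro-tool:test" is an illustrative tag, not from the paper
image, _ = client.images.build(path=".", tag="astro-tool:test")

# run the package's tests inside the container, removing it afterwards
logs = client.containers.run("astro-tool:test", "pytest -q", remove=True)
print(logs.decode())
```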
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
1997-01-01
The NASA Lewis Research Center is developing analytical methods and software tools to create a bridge between the controls and computational fluid dynamics (CFD) disciplines. Traditionally, control design engineers have used coarse nonlinear simulations to generate information for the design of new propulsion system controls. However, such traditional methods are not adequate for modeling the propulsion systems of complex, high-speed vehicles like the High Speed Civil Transport. To properly model the relevant flow physics of high-speed propulsion systems, one must use simulations based on CFD methods. Such CFD simulations have become useful tools for engineers that are designing propulsion system components. The analysis techniques and software being developed as part of this effort are an attempt to evolve CFD into a useful tool for control design as well. One major aspect of this research is the generation of linear models from steady-state CFD results. CFD simulations, often used during the design of high-speed inlets, yield high resolution operating point data. Under a NASA grant, the University of Akron has developed analytical techniques and software tools that use these data to generate linear models for control design. The resulting linear models have the same number of states as the original CFD simulation, so they are still very large and computationally cumbersome. Model reduction techniques have been successfully applied to reduce these large linear models by several orders of magnitude without significantly changing the dynamic response. The result is an accurate, easy to use, low-order linear model that takes less time to generate than those generated by traditional means. The development of methods for generating low-order linear models from steady-state CFD is most complete at the one-dimensional level, where software is available to generate models with different kinds of input and output variables. One-dimensional methods have been extended somewhat so that linear models can also be generated from two- and three-dimensional steady-state results. Standard techniques are adequate for reducing the order of one-dimensional CFD-based linear models. However, reduction of linear models based on two- and three-dimensional CFD results is complicated by very sparse, ill-conditioned matrices. Some novel approaches are being investigated to solve this problem.
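As a hedged sketch of the model-reduction step described above, the snippet below applies square-root balanced truncation to a toy stable state-space model with NumPy/SciPy. It is a generic stand-in, not the grant software, and the system matrices are random rather than CFD-derived.

```python
# Square-root balanced truncation of dx/dt = Ax + Bu, y = Cx to r states.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    # controllability Gramian: A P + P A^T + B B^T = 0
    P = solve_continuous_lyapunov(A, -B @ B.T)
    # observability Gramian: A^T Q + Q A + C^T C = 0
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)
    Lp = cholesky(P, lower=True)
    Lq = cholesky(Q, lower=True)
    U, s, Vt = svd(Lq.T @ Lp)            # s holds the Hankel singular values
    S = np.diag(s[:r] ** -0.5)
    T = Lp @ Vt[:r].T @ S                # truncated balancing transformation
    Ti = S @ U[:, :r].T @ Lq.T
    return Ti @ A @ T, Ti @ B, C @ T

# toy stable 10-state system reduced to 3 states
rng = np.random.default_rng(0)
n = 10
A = rng.standard_normal((n, n))
A -= (np.linalg.eigvals(A).real.max() + 1.0) * np.eye(n)   # shift to stability
Ar, Br, Cr = balanced_truncation(A, rng.standard_normal((n, 1)),
                                 rng.standard_normal((1, n)), 3)
```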
Engineering and Software Engineering
NASA Astrophysics Data System (ADS)
Jackson, Michael
The phrase ‘software engineering' has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on “the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering.” Yet after forty years of currency the phrase ‘software engineering' still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.
AI tools in computer based problem solving
NASA Technical Reports Server (NTRS)
Beane, Arthur J.
1988-01-01
The use of computers to solve value-oriented, deterministic, algorithmic problems has evolved a structured life cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low-cost, high-performance 32-bit workstations executing the same software as large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.
Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System
NASA Astrophysics Data System (ADS)
Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li
Ultrasonic flaw detection equipment with a remote control interface is researched and an automatic verification system is developed. By using extensible markup language (XML) to build the protocol instruction set and the data analysis method database, the system software achieves a controllable design and copes with the diversity of undisclosed device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that performs the role the fixed attenuator plays in traditional verification and improves the accuracy of verification results. Operating results of the automatic verification system confirm the feasibility of the system hardware and software architecture design and the correctness of the analysis method, while replacing the cumbersome operations of the traditional verification process and reducing the labor intensity for test personnel.
SU-E-P-43: A Knowledge Based Approach to Guidelines for Software Safety
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salomons, G; Kelly, D
Purpose: In the fall of 2012, a survey was distributed to medical physicists across Canada. The survey asked the respondents to comment on various aspects of software development and use in their clinic. The survey revealed that most centers employ locally produced (in-house) software of some kind. The respondents also indicated an interest in having software guidelines, but cautioned that the realities of cancer clinics include variations that preclude a simple solution. Traditional guidelines typically involve periodically repeating a set of prescribed tests with defined tolerance limits. However, applying a similar formula to software is problematic since it assumes that the users have a perfect knowledge of how and when to apply the software and that if the software operates correctly under one set of conditions it will operate correctly under all conditions. Methods: In the approach presented here the personnel involved with the software are included as an integral part of the system. Activities performed to improve the safety of the software are done with both software and people in mind. A learning-oriented approach is taken, following the premise that the best approach to safety is increasing the understanding of those associated with the use or development of the software. Results: The software guidance document is organized by areas of knowledge related to the use and development of software. The categories include: knowledge of the underlying algorithm and its limitations; knowledge of the operation of the software, such as input values, parameters, error messages, and interpretation of output; and knowledge of the environment for the software, including both data and users. Conclusion: We propose a new approach to developing guidelines which is based on acquiring knowledge rather than performing tests. The ultimate goal is to provide robust software guidelines which will be practical and effective.
A research on the security of wisdom campus based on geospatial big data
NASA Astrophysics Data System (ADS)
Wang, Haiying
2018-05-01
The wisdom campus faces difficulties such as geospatial big data sharing, function expansion, data management, and the analysis and mining of geospatial big data; in particular, the problem that data security cannot be guaranteed has drawn increasingly prominent attention. In this article we put forward a data-oriented software architecture, designed around the ideology of orienting to data with data as the kernel, to solve the problems of traditional software architecture, broaden campus spatial data research, and develop wisdom campus applications.
Software engineering capability for Ada (GRASP/Ada Tool)
NASA Technical Reports Server (NTRS)
Cross, James H., II
1995-01-01
The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. A new Motif-compliant graphical user interface has been developed for the GRASP/Ada prototype.
NASA Astrophysics Data System (ADS)
Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.
Today's software for aerospace systems typically is very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is rarely used as yet for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy-equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
NASA Technical Reports Server (NTRS)
Slafer, Loren I.
1989-01-01
Realtime simulation and hardware-in-the-loop testing are being used extensively in all phases of the design, development, and testing of the attitude control system (ACS) for the new Hughes HS601 satellite bus. Realtime, hardware-in-the-loop simulation, integrated with traditional analysis and pure simulation activities, is shown to provide a highly efficient and productive overall development program. Implementation of high fidelity simulations of the satellite dynamics and control system algorithms, capable of real-time execution (using Applied Dynamics International's System 100), provides a tool which is capable of being integrated with the critical flight microprocessor to create a mixed simulation test (MST). The MST creates a highly accurate, detailed simulated on-orbit test environment, capable of open and closed loop ACS testing, in which the ACS design can be validated. The MST is shown to provide a valuable extension of traditional test methods. A description of the MST configuration is presented, including the spacecraft dynamics simulation model, sensor and actuator emulators, and the test support system. Overall system performance parameters are presented. MST applications are discussed: supporting ACS design, developing on-orbit system performance predictions, flight software development and qualification testing (augmenting the traditional software-based testing), mission planning, and a cost-effective subsystem-level acceptance test. The MST is shown to provide an ideal tool with which the ACS designer can fly the spacecraft on the ground.
A Framework for Performing Verification and Validation in Reuse Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1997-01-01
Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
NASA Astrophysics Data System (ADS)
Yousif, Dilon
The purpose of this study was to improve the Quality Assurance (QA) System at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al Alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiBQ) developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of the 3XX Al Alloy(s) using IRC techniques. The impact of low-melting-point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).
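The control-limit recalculation mentioned above can be illustrated with a standard individuals chart; the composition data and element below are generic SPC placeholders, not WAP measurements.

```python
# Shewhart individuals-chart limits from moving ranges; data invented.
import numpy as np

si_wt_pct = np.array([7.21, 7.18, 7.25, 7.19, 7.22, 7.17, 7.24, 7.20])

center = si_wt_pct.mean()
mr_bar = np.abs(np.diff(si_wt_pct)).mean()   # mean moving range
sigma_hat = mr_bar / 1.128                   # d2 constant for subgroups of 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
print(f"CL={center:.3f}  UCL={ucl:.3f}  LCL={lcl:.3f}")
```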
QTL mapping of potato chip color and tuber traits within an autotetraploid family
USDA-ARS?s Scientific Manuscript database
Cultivated potato (Solanum tuberosum L.) is a highly heterozygous autotetraploid crop species, and this presents challenges for traditional line development and molecular breeding. Recent availability of a single nucleotide polymorphism (SNP) array with 8303 features and software packages for linkag...
GiNA, an efficient and high-throughput software for horticultural phenotyping
USDA-ARS?s Scientific Manuscript database
Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed,...
Adopting best practices: "Agility" moves from software development to healthcare project management.
Kitzmiller, Rebecca; Hunt, Eleanor; Sproat, Sara Breckenridge
2006-01-01
It is time for a change in mindset in how nurses operationalize system implementations and manage projects. Computers and systems have evolved over time from unwieldy mysterious machines of the past to ubiquitous computer use in every aspect of daily lives and work sites. Yet, disconcertingly, the process used to implement these systems has not evolved. Technology implementation does not need to be a struggle. It is time to adapt traditional plan-driven implementation methods to incorporate agile techniques. Agility is a concept borrowed from software development and is presented here because it encourages flexibility, adaptation, and continuous learning as part of the implementation process. Agility values communication and harnesses change to an advantage, which facilitates the natural evolution of an adaptable implementation process. Specific examples of agility in an implementation are described, and plan-driven implementation stages are adapted to incorporate relevant agile techniques. This comparison demonstrates how an agile approach enhances traditional implementation techniques to meet the demands of today's complex healthcare environments.
Expert system for skin problem consultation in Thai traditional medicine.
Nopparatkiat, Pornchai; na Nagara, Byaporn; Chansa-ngavej, Chuvej
2014-01-01
This paper aimed to demonstrate the research and development of a rule-based expert system for skin problem consulting in the areas of acne, melasma, freckle, wrinkle, and uneven skin tone, with recommended treatments from Thai traditional medicine knowledge. The tool selected for developing the expert system is a software program written in the PHP language. MySQL database is used to work together with PHP for building database of the expert system. The system is web-based and can be reached from anywhere with Internet access. The developed expert system gave recommendations on the skin problem treatment with Thai herbal recipes and Thai herbal cosmetics based on 416 rules derived from primary and secondary sources. The system had been tested by 50 users consisting of dermatologists, Thai traditional medicine doctors, and general users. The developed system was considered good for learning and consultation. The present work showed how such a scattered body of traditional knowledge as Thai traditional medicine and herbal recipes could be collected, organised and made accessible to users and interested parties. The expert system developed herein should contribute in a meaningful way towards preserving the knowledge and helping promote the use of Thai traditional medicine as a practical alternative medicine for the treatment of illnesses.
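A toy forward-chaining sketch conveys how such a rule base drives a consultation; the rules and remedy names below are invented placeholders, not content from the 416-rule system (which is implemented in PHP/MySQL rather than Python).

```python
# Minimal forward-chaining inference: fire rules until no new facts appear.
RULES = [
    ({"oily_skin", "clogged_pores"}, "acne"),
    ({"acne"}, "recommend: herbal remedy A"),
    ({"sun_exposure", "dark_patches"}, "melasma"),
    ({"melasma"}, "recommend: herbal remedy B"),
]

def consult(facts):
    facts = set(facts)
    changed = True
    while changed:                       # iterate to a fixed point
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return sorted(f for f in facts if f.startswith("recommend"))

print(consult({"oily_skin", "clogged_pores"}))  # -> remedy A
```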
Autonomy Software: V&V Challenges and Characteristics
NASA Technical Reports Server (NTRS)
Schumann, Johann; Visser, Willem
2006-01-01
The successful operation of unmanned air vehicles requires software with a high degree of autonomy. Only if high-level functions can be carried out without human control and intervention can complex missions in a changing and potentially unknown environment be carried out successfully. Autonomy software is highly mission and safety critical: failures caused by flaws in the software can not only jeopardize the mission, but could also endanger human life (e.g., a crash of a UAV in a densely populated area). Due to its large size, high complexity, and use of specialized algorithms (planner, constraint-solver, etc.), autonomy software poses specific challenges for its verification, validation, and certification. We have carried out a survey among researchers and scientists at NASA to study these issues. In this paper, we will present major results of this study, discussing the broad spectrum of notions and characteristics of autonomy software and its challenges for design and development. A main focus of this survey was to evaluate verification and validation (V&V) issues and challenges, compared to the development of "traditional" safety-critical software. We will discuss important issues in V&V of autonomous software and advanced V&V tools which can help to mitigate software risks. Results of this survey will help to identify and understand safety concerns in autonomy software and will lead to improved strategies for mitigation of these risks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kidd, M.E.C.
1997-02-01
The goal of our work is to provide a high level of confidence that critical software driven event sequences are maintained in the face of hardware failures, malevolent attacks and harsh or unstable operating environments. This will be accomplished by providing dynamic fault management measures directly to the software developer and to their varied development environments. The methodology employed here is inspired by previous work in path expressions. This paper discusses the perceived problems, a brief overview of path expressions, the proposed methods, and a discussion of the differences between the proposed methods and traditional path expression usage and implementation.
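One way to picture the path-expression idea is to encode the allowed event sequence as a regular expression and check observed traces against it at run time; the sketch below, with invented event names, illustrates the concept rather than the paper's method.

```python
# Allowed order: arm, then one or more checks, then fire, then safe.
import re

PATH = re.compile(r"(arm;(check;)+fire;safe;)+")

def sequence_ok(events):
    return bool(PATH.fullmatch(";".join(events) + ";"))

print(sequence_ok(["arm", "check", "check", "fire", "safe"]))  # True
print(sequence_ok(["arm", "fire", "safe"]))                    # False: no check
```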
Pedagogical Issues in Object Orientation.
ERIC Educational Resources Information Center
Nerur, Sridhar; Ramanujan, Sam; Kesh, Someswar
2002-01-01
Discusses the need for people with object-oriented (OO) skills, explains benefits of OO in software development, and addresses some of the difficulties in teaching OO. Topics include the evolution of programming languages; differences between OO and traditional approaches; differences from data modeling; and Unified Modeling Language (UML) and…
Test Driven Development: Lessons from a Simple Scientific Model
NASA Astrophysics Data System (ADS)
Clune, T. L.; Kuo, K.
2010-12-01
In the commercial software industry, unit testing frameworks have emerged as a disruptive technology that has permanently altered the process by which software is developed. Unit testing frameworks significantly reduce traditional barriers, both practical and psychological, to creating and executing tests that verify software implementations. A new development paradigm, known as test driven development (TDD), has emerged from unit testing practices, in which low-level tests (i.e. unit tests) are created by developers prior to implementing new pieces of code. Although somewhat counter-intuitive, this approach actually improves developer productivity. In addition to reducing the average time for detecting software defects (bugs), the requirement to provide procedure interfaces that enable testing frequently leads to superior design decisions. Although TDD is widely accepted in many software domains, its applicability to scientific modeling still warrants reasonable skepticism. While the technique is clearly relevant for infrastructure layers of scientific models such as the Earth System Modeling Framework (ESMF), numerical and scientific components pose a number of challenges to TDD that are not often encountered in commercial software. Nonetheless, our experience leads us to believe that the technique has great potential not only for developer productivity, but also as a tool for understanding and documenting the basic scientific assumptions upon which our models are implemented. We will provide a brief introduction to test driven development and then discuss our experience in using TDD to implement a relatively simple numerical model that simulates the growth of snowflakes. Many of the lessons learned are directly applicable to larger scientific models.
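A minimal flavor of the TDD workflow described above, in Python's unittest: the tests pin down the expected behavior of a hypothetical snowflake growth step before it is implemented.

```python
# Tests written first; grow() is then implemented to make them pass.
import unittest

def grow(mass, rate, dt):
    """One explicit-Euler growth step: dm/dt = rate * m."""
    return mass * (1.0 + rate * dt)

class TestGrowth(unittest.TestCase):
    def test_mass_increases(self):
        self.assertGreater(grow(1.0, 0.1, 1.0), 1.0)

    def test_zero_rate_conserves_mass(self):
        self.assertAlmostEqual(grow(2.5, 0.0, 1.0), 2.5)

if __name__ == "__main__":
    unittest.main()
```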
A Data-Driven Solution for Performance Improvement
NASA Technical Reports Server (NTRS)
2002-01-01
Marketed as the "Software of the Future," Optimal Engineering Systems P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user- friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.
Using Modern Methodologies with Maintenance Software
NASA Technical Reports Server (NTRS)
Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.
2014-01-01
Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance. The MPS software is classified as either class B (mission critical) or class C (mission important). The scheduling of tasks is difficult because mission needs must be addressed prior to performing any other tasks, and those needs often spring up unexpectedly. Keeping track of the tasks that everyone is working on is also difficult because each person is working on a different software component. Recently the group adopted the Scrum methodology for planning and scheduling tasks. Scrum is one of the newer methodologies typically used in agile development. In the Scrum development environment, teams pick the tasks that are to be completed within a sprint based on priority. The team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application. In the Scrum methodology there is a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology. MPS has many software applications in maintenance, team members who are working on disparate applications, and many users, and it is interruptible based on mission needs, issues, and requirements. In order to use Scrum, the methodology needed adaptation to MPS. Scrum was chosen because it is adaptable. This paper is about the development of the process for using Scrum, a new development methodology, with a team that works on disparate interruptible tasks on multiple software applications.
Develop Direct Geo-referencing System Based on Open Source Software and Hardware Platform
NASA Astrophysics Data System (ADS)
Liu, H. S.; Liao, H. M.
2015-08-01
A direct geo-referencing system uses remote sensing technology to quickly capture images, GPS tracks, and camera positions. These data allow the construction of large volumes of images with geographic coordinates, so that users can take measurements directly on the images. In order to properly calculate positioning, all the sensor signals must be synchronized. Traditional aerial photography uses a Position and Orientation System (POS) to integrate image, coordinates and camera position. However, it is very expensive, and users cannot use the result immediately because the position information is not embedded into the image. In consideration of economy and efficiency, this study aims to develop a direct geo-referencing system based on an open source software and hardware platform. After using an Arduino microcontroller board to integrate the signals, we can calculate positioning with the open source software OpenCV. In the end, we use the open source panorama browser Panini and integrate all of these into the open source GIS software Quantum GIS, so that a complete data collection and processing system can be constructed.
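As a sketch of the kind of positioning calculation involved, the snippet below recovers a camera pose from ground control points with OpenCV's solvePnP; the coordinates and camera intrinsics are placeholders, not the authors' pipeline.

```python
# Camera pose from known world points and their pixel locations.
import numpy as np
import cv2

object_pts = np.array([[0, 0, 0], [10, 0, 0], [10, 10, 0], [0, 10, 0]],
                      dtype=np.float64)                       # world (m)
image_pts = np.array([[100, 120], [400, 118], [405, 420], [98, 415]],
                     dtype=np.float64)                        # pixels
K = np.array([[800, 0, 320], [0, 800, 240], [0, 0, 1]],
             dtype=np.float64)                                # intrinsics

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
R, _ = cv2.Rodrigues(rvec)
camera_position = -R.T @ tvec   # camera center in world coordinates
print(camera_position.ravel())
```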
Long-Term Retention after Self-Instructional Methods.
ERIC Educational Resources Information Center
Puskas, Jane C.; And Others
1992-01-01
A study of the effectiveness of self-instructional booklets and computer software for teaching dental students endodontic diagnosis found that the self-teaching method may be as effective as traditional lectures in teaching concepts central to development of clinical decision-making skills. Sampling difficulties created problems in assessment of…
USDA-ARS?s Scientific Manuscript database
Recently, a variant of stochastic dominance called stochastic efficiency with respect to a function (SERF) has been developed and applied. Unlike traditional stochastic dominance approaches, SERF uses the concept of certainty equivalents (CEs) to rank a set of risk-efficient alternatives instead of...
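The certainty-equivalent ranking at the heart of SERF can be sketched as follows, assuming negative-exponential (CARA) utility swept over a range of risk-aversion coefficients; the payoff draws are invented.

```python
# CE = -(1/r) ln E[exp(-r x)] for r > 0; reduces to the mean as r -> 0.
import numpy as np

def certainty_equivalent(outcomes, r):
    if abs(r) < 1e-12:
        return outcomes.mean()
    return -np.log(np.exp(-r * outcomes).mean()) / r

alt_a = np.array([80.0, 100.0, 120.0])   # illustrative payoff draws
alt_b = np.array([60.0, 100.0, 150.0])
for r in (0.0, 0.01, 0.05):              # sweep risk aversion
    print(r, certainty_equivalent(alt_a, r), certainty_equivalent(alt_b, r))
```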
AI-Based Chatterbots and Spoken English Teaching: A Critical Analysis
ERIC Educational Resources Information Center
Sha, Guoquan
2009-01-01
The aim of various approaches implemented, whether the classical "three Ps" (presentation, practice, and production) or communicative language teaching (CLT), is to achieve communicative competence. Although a lot of software developed for teaching spoken English is dressed up to raise interaction, its methodology is largely rooted in tradition.…
CONTENTdm Digital Collection Management Software and End-User Efficacy
ERIC Educational Resources Information Center
Dickson, Maggie
2008-01-01
Digital libraries and collections are a growing facet of today's traditional library. Digital library technologies have become increasingly more sophisticated in the effort to provide more and better access to the collections they contain. The evaluation of the usability of these technologies has not kept pace with technological developments,…
Modeling and Hazard Analysis Using STPA
NASA Astrophysics Data System (ADS)
Ishimatsu, Takuto; Leveson, Nancy; Thomas, John; Katahira, Masa; Miyamoto, Yuko; Nakao, Haruka
2010-09-01
A joint research project between MIT and JAXA/JAMSS is investigating the application of a new hazard analysis to the system and software in the HTV. Traditional hazard analysis focuses on component failures, but software does not fail in this way. Software most often contributes to accidents by commanding the spacecraft into an unsafe state (e.g., turning off the descent engines prematurely) or by not issuing required commands. That makes the standard hazard analysis techniques of limited usefulness on software-intensive systems, which describes most spacecraft built today. STPA is a new hazard analysis technique based on systems theory rather than reliability theory. It treats safety as a control problem rather than a failure problem. The goal of STPA, which is to create a set of scenarios that can lead to a hazard, is the same as FTA, but STPA includes a broader set of potential scenarios, including those in which no failures occur but the problems arise due to unsafe and unintended interactions among the system components. STPA also provides more guidance to the analysts than traditional fault tree analysis. Functional control diagrams are used to guide the analysis. In addition, JAXA uses a model-based system engineering development environment (created originally by Leveson and called SpecTRM) which also assists in the hazard analysis. One of the advantages of STPA is that it can be applied early in the system engineering and development process in a safety-driven design process, where hazard analysis drives the design decisions rather than waiting until reviews identify problems that are then costly or difficult to fix. It can also be applied in an after-the-fact analysis and hazard assessment, which is what we did in this case study. This paper describes the experimental application of STPA to the JAXA HTV in order to determine the feasibility and usefulness of the new hazard analysis technique. Because the HTV was originally developed using fault tree analysis and following the NASA standards for safety-critical systems, the results of our experimental application of STPA can be compared with these more traditional safety engineering approaches in terms of the problems identified and the resources required to use it.
The discounting model selector: Statistical software for delay discounting applications.
Gilroy, Shawn P; Franck, Christopher T; Hantula, Donald A
2017-05-01
Original, open-source computer software was developed and validated against established delay discounting methods in the literature. The software executed approximate Bayesian model selection methods from user-supplied temporal discounting data and computed the effective delay 50 (ED50) from the best performing model. Software was custom-designed to enable behavior analysts to conveniently apply recent statistical methods to temporal discounting data with the aid of a graphical user interface (GUI). The results of independent validation of the approximate Bayesian model selection methods indicated that the program provided results identical to those of the original source paper and its methods. Monte Carlo simulation (n = 50,000) confirmed that the true model was selected most often in each setting. Simulation code and data for this study were posted to an online repository for use by other researchers. The model selection approach was applied to three existing delay discounting data sets from the literature in addition to the data from the source paper. Comparisons of model-selected ED50 were consistent with traditional indices of discounting. Conceptual issues related to the development and use of computer software by behavior analysts and the opportunities afforded by free and open-sourced software are discussed, and a review of possible expansions of this software is provided. © 2017 Society for the Experimental Analysis of Behavior.
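The ED50 idea can be sketched for the single-model case: fit Mazur's hyperbolic model V = A/(1 + kD) and read off ED50 = 1/k, the delay at which value halves. The data below are invented, and the paper's software additionally performs Bayesian selection among several candidate models.

```python
# Fit a hyperbolic discounting curve and report ED50 = 1/k.
import numpy as np
from scipy.optimize import curve_fit

delays = np.array([1, 7, 30, 90, 365], dtype=float)   # days
values = np.array([0.95, 0.80, 0.55, 0.35, 0.15])     # fraction of amount A

def hyperbolic(D, k):
    return 1.0 / (1.0 + k * D)

(k,), _ = curve_fit(hyperbolic, delays, values, p0=[0.01])
print(f"k = {k:.4f}, ED50 = {1 / k:.1f} days")
```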
Modern Computational Techniques for the HMMER Sequence Analysis
2013-01-01
This paper focuses on the latest research and critical reviews on modern computing architectures, software and hardware accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both the traditional software approach and innovative hardware acceleration technologies. PMID:25937944
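The dynamic program underlying HMM sequence analysis is the Viterbi algorithm; a minimal two-state sketch in NumPy follows, with illustrative probabilities. Profile HMMs as used by HMMER are much larger but rest on the same recurrence.

```python
# Viterbi decoding for a toy 2-state, 2-symbol HMM (log space).
import numpy as np

start = np.log([0.5, 0.5])
trans = np.log([[0.9, 0.1], [0.2, 0.8]])    # state -> state
emit = np.log([[0.7, 0.3], [0.1, 0.9]])     # state x symbol
obs = [0, 0, 1, 1, 1]

v = start + emit[:, obs[0]]
back = []
for o in obs[1:]:
    scores = v[:, None] + trans             # prev state x next state
    back.append(scores.argmax(axis=0))      # best predecessor per state
    v = scores.max(axis=0) + emit[:, o]

path = [int(v.argmax())]                    # backtrace from the best end state
for b in reversed(back):
    path.append(int(b[path[-1]]))
print(path[::-1])                           # most likely state sequence
```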
Digital Geological Mapping for Earth Science Students
NASA Astrophysics Data System (ADS)
England, Richard; Smith, Sally; Tate, Nick; Jordan, Colm
2010-05-01
This SPLINT (SPatial Literacy IN Teaching) supported project is developing pedagogies for introducing the teaching of digital geological mapping to Earth Science students. Traditionally students are taught to make geological maps on a paper basemap with a notebook to record their observations. Learning to use a tablet PC with GIS-based software for mapping and data recording requires emphasis on training staff and students in specific GIS and IT skills, and beneficial adjustments to the way in which geological data is recorded in the field. A set of learning and teaching materials is under development to support this learning process. Following the release of the British Geological Survey's Sigma software, we have been developing generic methodologies for the introduction of digital geological mapping to students who already have experience of mapping by traditional means. The teaching materials introduce the software to the students through a series of structured exercises. The students learn the operation of the software in the laboratory by entering existing observations, preferably data that they have collected. Through this the students benefit from being able to reflect on their previous work, consider how it might be improved and plan new work. Following this they begin fieldwork in small groups using both methods simultaneously. They are able to practise what they have learnt in the classroom and review the differences, advantages and disadvantages of the two methods, while adding to the work that has already been completed. Once the field exercises are completed, students use the data that they have collected in the production of high quality map products and are introduced to the use of integrated digital databases, which they learn to search and extract information from. The relatively recent development of the technologies which underpin digital mapping also means that many academic staff require training before they are able to deliver the course materials. Consequently, a set of staff training materials is being developed in parallel to those for the students. These focus on the operation of the software and an introduction to the structure of the exercises. The presentation will review the teaching exercises and student and staff responses to their introduction.
NASA Astrophysics Data System (ADS)
Filippov, A. V.; Tarasov, S. Yu; Podgornyh, O. A.; Shamarin, N. N.; Filippova, E. O.
2017-01-01
Automation of engineering processes requires the development of relevant mathematical support and computer software. Analysis of metal cutting kinematics and tool geometry is a necessary key task at the preproduction stage. This paper is focused on developing a procedure for determining the geometry of oblique peakless round-nose tool lathe machining with the use of vector/matrix transformations. Such an approach allows integration into modern mathematical software packages, in contrast to the traditional analytic description. This advantage is very promising for developing automated control of the preproduction process. A kinematic criterion for the applicable tool geometry has been developed from the results of this study. The effect of tool blade inclination and curvature on the geometry-dependent process parameters was evaluated.
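A small sketch of the vector/matrix style of analysis: compose elementary rotation matrices to carry a tool-edge direction from the tool frame into the machine frame. The angles and frame conventions here are placeholders, not the paper's derivation.

```python
# Rotate an edge direction by inclination and rake-like angles.
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

lam = np.radians(5.0)                       # inclination angle (placeholder)
gam = np.radians(-7.0)                      # rake-like rotation (placeholder)
edge = np.array([0.0, 1.0, 0.0])            # edge direction in the tool frame
edge_machine = rot_z(gam) @ rot_x(lam) @ edge
print(edge_machine)                         # direction in the machine frame
```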
Automated Diatom Analysis Applied to Traditional Light Microscopy: A Proof-of-Concept Study
NASA Astrophysics Data System (ADS)
Little, Z. H. L.; Bishop, I.; Spaulding, S. A.; Nelson, H.; Mahoney, C.
2017-12-01
Diatom identification and enumeration by high resolution light microscopy is required for many areas of research and water quality assessment. Such analyses, however, are both expertise and labor-intensive. These challenges motivate the need for an automated process to efficiently and accurately identify and enumerate diatoms. Improvements in particle analysis software have increased the likelihood that diatom enumeration can be automated. VisualSpreadsheet software provides a possible solution for automated particle analysis of high-resolution light microscope diatom images. We applied the software, independent of its complementary FlowCam hardware, to automated analysis of light microscope images containing diatoms. Through numerous trials, we arrived at threshold settings that correctly segmented 67% of the total possible diatom valves and fragments from broad fields of view: of the 255 diatom particles present in the 183 light microscope images examined, 216 valves and fragments were processed, with 170 properly analyzed and focused upon by the software. Manual analysis of the images yielded 255 particles in 400 seconds, whereas the software yielded a total of 216 particles in 68 seconds, highlighting that the software has an approximately five-fold efficiency advantage in particle analysis time. As in past efforts, incomplete or incorrect recognition was found for images with multiple valves in contact or valves with little contrast. The software has the potential to be an effective tool in assisting taxonomists with diatom enumeration by completing a large portion of analyses. Benefits and limitations of the approach are presented to allow for development of future work in image analysis and automated enumeration of traditional light microscope images containing diatoms.
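The thresholding and segmentation step that such particle analysis performs can be sketched with OpenCV; the file name, Otsu thresholding choice, and area cutoff below are assumptions for illustration, not VisualSpreadsheet's internals.

```python
# Global threshold plus contour extraction as a toy particle segmenter.
import cv2

gray = cv2.imread("diatoms.png", cv2.IMREAD_GRAYSCALE)   # placeholder path
_, mask = cv2.threshold(gray, 0, 255,
                        cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)

# keep regions large enough to be whole valves or fragments
particles = [c for c in contours if cv2.contourArea(c) > 200]
print(f"{len(particles)} candidate particles")
```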
ERIC Educational Resources Information Center
Santana-Paixao, Raquel C.
2017-01-01
Oral testing administration plays a significant role in foreign language programs aiming to foster the development of students' speaking abilities. With the development of language teaching software, the use of computer based recording tools are becoming increasingly used in language courses as an alternative to traditional face-to-face oral…
ERIC Educational Resources Information Center
Eftekhari, Maryam; Sotoudehnama, Elaheh; Marandi, S. Susan
2016-01-01
Developing higher-order critical thinking skills as one of the central objectives of education has been recently facilitated via software packages. Whereas one such technology as computer-aided argument mapping is reported to enhance levels of critical thinking (van Gelder 2001), its application as a pedagogical tool in English as a Foreign…
Nurturing reliable and robust open-source scientific software
NASA Astrophysics Data System (ADS)
Uieda, L.; Wessel, P.
2017-12-01
Scientific results are increasingly the product of software. The reproducibility and validity of published results cannot be ensured without access to the source code of the software used to produce them. Therefore, the code itself is a fundamental part of the methodology and must be published along with the results. With such a reliance on software, it is troubling that most scientists do not receive formal training in software development. Tools such as version control, continuous integration, and automated testing are routinely used in industry to ensure the correctness and robustness of software. However, many scientists do not even know of their existence (although efforts like Software Carpentry are having an impact on this issue; software-carpentry.org). Publishing the source code is only the first step in creating an open-source project. For a project to grow, it must provide documentation, participation guidelines, and a welcoming environment for new contributors. Expanding the project community is often more challenging than the technical aspects of software development. Maintainers must invest time to enforce the rules of the project and to onboard new members, which can be difficult to justify in the context of the "publish or perish" mentality. This problem will continue as long as software contributions are not recognized as valid scholarship by hiring and tenure committees. Furthermore, there are still unsolved problems in providing attribution for software contributions. Many journals and metrics of academic productivity do not recognize citations to sources other than traditional publications. Thus, some authors choose to publish an article about the software and use it as a citation marker. One issue with this approach is that updating the reference to include new contributors involves writing and publishing a new article. A better approach would be to cite a permanent archive of individual versions of the source code in services such as Zenodo (zenodo.org). However, citations to these sources are not always recognized when computing citation metrics. In summary, the widespread development of reliable and robust open-source software relies on the creation of formal training programs in software development best practices and the recognition of software as a valid form of scholarship.
Mining dynamic noteworthy functions in software execution sequences.
Zhang, Bing; Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong
2017-01-01
As the quality of crucial entities can directly affect that of software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which thus contribute to improving the software quality and its attack-defending ability. Most analyses and evaluations of important entities, such as code-based static structure analysis, are divorced from the actual running of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, by decompiling the software and tracking stack changes, execution traces composed of a series of function addresses were acquired. Then these traces were modeled as execution sequences and simplified so as to get simplified sequences (SFS), followed by the extraction of patterns from the SFS through a pattern extraction (PE) algorithm. After that, the evaluation indicators inner-importance and inter-importance were designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, these functions were sorted by their noteworthiness. Comparison and contrast were conducted against the experimental results of two traditional complex network-based node mining methods, namely PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely.
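A toy version of the pipeline conveys the flavor: treat traces as function sequences, count co-occurrence patterns, and rank functions by a simple score. The scoring below is an invented stand-in for the paper's inner-/inter-importance measures.

```python
# Rank functions by frequency plus pattern co-occurrence across traces.
from collections import Counter
from itertools import combinations

traces = [["main", "parse", "eval", "emit"],
          ["main", "parse", "eval", "eval", "emit"],
          ["main", "load", "eval", "emit"]]

func_freq = Counter()   # how many traces each function appears in
pair_freq = Counter()   # ordered co-occurrence patterns
for t in traces:
    func_freq.update(set(t))
    pair_freq.update({(a, b) for a, b in combinations(t, 2)})

score = {f: func_freq[f] + sum(c for (a, b), c in pair_freq.items()
                               if f in (a, b))
         for f in func_freq}
for f, s in sorted(score.items(), key=lambda kv: -kv[1]):
    print(f, s)
```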
Toward a user-driven approach to radiology software solutions: putting the wag back in the dog.
Morgan, Matthew; Mates, Jonathan; Chang, Paul
2006-09-01
The relationship between healthcare providers and the software industry is evolving. In many cases, industry's traditional, market-driven model is failing to meet the increasingly sophisticated and appropriately individualized needs of providers. Advances in both technology infrastructure and development methodologies have set the stage for the transition from a vendor-driven to a more user-driven process of solution engineering. To make this transition, providers must take an active role in the development process and vendors must provide flexible frameworks on which to build. Only then can the provider/vendor relationship mature from a purchaser/supplier to a codesigner/partner model, where true insight and innovation can occur.
Zhao, Yan-qing; Teng, Jing
2015-03-01
To analyze the composition and medication regularities of prescriptions treating hypochondriac pain in the Chinese journal full-text database (CNKI), based on the traditional Chinese medicine inheritance support system, in order to provide a reference for further research and development of new traditional Chinese medicines treating hypochondriac pain. The traditional Chinese medicine inheritance support platform software V2.0 was used to build a prescription database of Chinese medicines treating hypochondriac pain. The software's integrated data mining methods were used to classify prescriptions in the database according to "four odors", "five flavors" and "meridians", and to perform frequency statistics, syndrome distribution, prescription regularity and new prescription analysis. An analysis was made of 192 prescriptions treating hypochondriac pain to determine the frequencies of medicines in prescriptions and commonly used medicine pairs and combinations, and to summarize 15 new prescriptions. This study indicated that the prescriptions treating hypochondriac pain in the Chinese journal full-text database are mostly those for soothing liver-qi stagnation, promoting qi and activating blood, clearing heat and removing dampness, and invigorating the spleen and removing phlegm, with a cold property and bitter taste, and reflect the principle of "distinguishing deficiency and excess and relieving pain by smoothening meridians" in treating hypochondriac pain.
Dorofeeva, A A; Khrustalev, A V; Krylov, Iu V; Bocharov, D A; Negasheva, M A
2010-01-01
Digital images of the iris were obtained to study the peculiarities of iris color during the anthropological examination of 578 students aged 16-24 years. Simultaneously with the registration of the digital images, a visual assessment of eye color was carried out using the traditional scale of Bunak, based on 12 ocular prostheses. Original software for the automatic determination of iris color based on the 12-class scale of Bunak was designed, and a computer version of that scale was developed. The proposed software allows iris color to be determined with high validity based on numerical evaluation; its application may reduce the bias due to subjective assessment and the methodological divergences of different researchers. The software designed for the automatic determination of iris color may help develop both theoretical and applied anthropology; it may be used in forensic and emergency medicine, sports medicine, medico-genetic counseling and professional selection.
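The automatic classification step can be sketched as nearest-reference matching: map an image's mean color to the closest of 12 reference classes. The reference values below are invented placeholders; the actual calibration of the Bunak scale is the paper's contribution.

```python
# Classify an iris image by nearest mean-color reference class.
import numpy as np

# 12 invented RGB references running from dark to light
BUNAK_REFS = {i: rgb for i, rgb in enumerate(
    np.linspace([40, 25, 15], [160, 180, 200], 12), start=1)}

def classify_iris(pixels):
    mean_rgb = pixels.reshape(-1, 3).mean(axis=0)
    return min(BUNAK_REFS,
               key=lambda k: np.linalg.norm(BUNAK_REFS[k] - mean_rgb))

print(classify_iris(np.random.randint(0, 255, (64, 64, 3))))
```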
NASA Technical Reports Server (NTRS)
Ferguson, Roscoe C.
2011-01-01
As a result of a recommendation from the Augustine Panel, the direction for Human Space Flight has been altered from the original plan referred to as Constellation. NASA's Human Exploration Framework Team (HEFT) proposes the use of a Shuttle Derived Heavy Lift Launch Vehicle (SDLV) and an Orion-derived spacecraft (salvaged from Constellation) to support a new flexible direction for space exploration. The SDLV must be developed within an environment of a constrained budget and a preferred fast development schedule. Thus, it has been proposed to utilize existing assets from the Shuttle Program to speed development at a lower cost. These existing assets should not only include structures such as external tanks or solid rockets, but also the Flight Software, which has traditionally been a "long pole" in new development efforts. The avionics and software for the Space Shuttle were primarily developed in the 1970s and considered state of the art for that time. While one may argue that the existing avionics and flight software are too outdated to support the new SDLV effort, this is a fallacy if they can be evolved over time into a "modern avionics" platform. The technology may be outdated, but the avionics concepts and flight software algorithms are not. The reuse of existing avionics and software also allows for the reuse of development, verification, and operations facilities. The keyword is evolve, in that these assets can support the fast development of such a vehicle, but then be gradually evolved over time towards more modern platforms as budget and schedule permit. The "gold" of the flight software is the "control loop" algorithms of the vehicle: the Guidance, Navigation, and Control (GNC) software algorithms. This software is typically the most expensive to develop, test, and verify. Thus, the approach is to preserve the GNC flight software, while first evolving the supporting software (such as Command and Data Handling, Caution and Warning, Telemetry, etc.). This can be accomplished by gradually removing the "support software" from the legacy flight software, leaving only the GNC algorithms. The "support software" could be re-developed for modern platforms, while leaving the GNC algorithms to execute on technology compatible with the legacy system. It is also possible to package the GNC algorithms into an emulated version of the original computer (via Field Programmable Gate Arrays, or FPGAs), thus becoming a "GNC on a Chip" solution where they could live on to be embedded in modern avionics platforms.
Comparison of BrainTool to other UML modeling and model transformation tools
NASA Astrophysics Data System (ADS)
Nikiforova, Oksana; Gusarovs, Konstantins
2017-07-01
In the last 30 years, numerous model-generated software systems have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar arguments in regard to Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending with a code generator (possibly using a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. This paper presents the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.
XSEOS: An Open Software for Chemical Engineering Thermodynamics
ERIC Educational Resources Information Center
Castier, Marcelo
2008-01-01
An Excel add-in--XSEOS--that implements several excess Gibbs free energy models and equations of state has been developed for educational use. Several traditional and modern thermodynamic models are available in the package with a user-friendly interface. XSEOS has open code, is freely available, and should be useful for instructors and students…
ERIC Educational Resources Information Center
Dyer, Mark; Grey, Thomas; Kinnane, Oliver
2017-01-01
It has become increasingly common for tasks traditionally carried out by engineers to be undertaken by technicians and technologists with access to sophisticated computers and software that can often perform complex calculations that were previously the responsibility of engineers. Not surprisingly, this development raises serious questions about…
Text, Graphics, and Multimedia Materials Employed in Learning a Computer-Based Procedural Task
ERIC Educational Resources Information Center
Coffindaffer, Kari Christine Carlson
2010-01-01
The present research study investigated the interaction of graphic design students with different forms of software training materials. Four versions of the procedural task instructions were developed: (A) Traditional Textbook with Still Images, (B) Modified Text with Integrated Still Images, (C) Onscreen Modified Text with Silent Onscreen Video…
Stereo Orthogonal Axonometric Perspective for the Teaching of Descriptive Geometry
ERIC Educational Resources Information Center
Méxas, José Geraldo Franco; Guedes, Karla Bastos; Tavares, Ronaldo da Silva
2015-01-01
Purpose: The purpose of this paper is to present the development of a software for stereo visualization of geometric solids, applied to the teaching/learning of Descriptive Geometry. Design/methodology/approach: The paper presents the traditional method commonly used in computer graphic stereoscopic vision (implemented in C language) and the…
ERIC Educational Resources Information Center
Górski, Filip; Bun, Pawel; Wichniarek, Radoslaw; Zawadzki, Przemyslaw; Hamrol, Adam
2017-01-01
Effective medical and biomedical engineering education is an important problem. Traditional methods are difficult and costly. That is why Virtual Reality is often used for that purpose. Educational medical VR is a well-developed IT field, with many available hardware and software solutions. Current solutions are prepared without methodological…
Optimizing spacecraft design - optimization engine development: progress and plans
NASA Technical Reports Server (NTRS)
Cornford, Steven L.; Feather, Martin S.; Dunphy, Julia R; Salcedo, Jose; Menzies, Tim
2003-01-01
At JPL and NASA, a process has been developed to perform life cycle risk management. This process requires users to identify: goals and objectives to be achieved (and their relative priorities), the various risks to achieving those goals and objectives, and options for risk mitigation (prevention, detection ahead of time, and alleviation). Risks are broadly defined to include the risk of failing to design a system with adequate performance, compatibility and robustness in addition to more traditional implementation and operational risks. The options for mitigating these different kinds of risks can include architectural and design choices, technology plans and technology back-up options, test-bed and simulation options, engineering models and hardware/software development techniques and other more traditional risk reduction techniques.
Ross, David E; Ochs, Alfred L; Seabaugh, Jan M; Shrader, Carole R
2013-01-01
NeuroQuant® is a recently developed, FDA-approved software program for measuring brain MRI volume in clinical settings. The purpose of this study was to compare NeuroQuant with the radiologist's traditional approach, based on visual inspection, in 20 outpatients with mild or moderate traumatic brain injury (TBI). Each MRI was analyzed with NeuroQuant, and the resulting volumetric analyses were compared with the attending radiologist's interpretation. The radiologist's traditional approach found atrophy in 10.0% of patients; NeuroQuant found atrophy in 50.0% of patients. NeuroQuant was more sensitive for detecting brain atrophy than the radiologist's traditional approach.
Metadata-driven Delphi rating on the Internet.
Deshpande, Aniruddha M; Shiffman, Richard N; Nadkarni, Prakash M
2005-01-01
Paper-based data collection and analysis for consensus development are inefficient and error-prone. Computerized techniques that could improve efficiency, however, have been criticized as costly, inconvenient, and difficult to use. We designed and implemented a metadata-driven Web-based Delphi rating and analysis tool, employing the flexible entity-attribute-value schema to create generic, reusable software. The software can be applied to various domains by altering the metadata; the programming code remains intact. This approach greatly reduces the marginal cost of re-using the software. We implemented our software to prepare for the Conference on Guidelines Standardization. Twenty-three invited experts completed the first round of the Delphi rating on the Web. For each participant, the software generated individualized reports that described the median rating and the disagreement index (calculated from the Interpercentile Range Adjusted for Symmetry) as defined by the RAND/UCLA Appropriateness Method. We evaluated the software with a satisfaction survey using a five-level Likert scale. The panelists felt that Web data entry was convenient (median 4, interquartile range [IQR] 4.0-5.0), acceptable (median 4.5, IQR 4.0-5.0) and easily accessible (median 5, IQR 4.0-5.0). We conclude that Web-based Delphi rating for consensus development is a convenient and acceptable alternative to the traditional paper-based method.
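The disagreement index named above is straightforward to compute once the RAND/UCLA definitions are written out. A minimal sketch, assuming the published RAND/UCLA formulas (30th/70th percentiles, IPRAS = 2.35 + 1.5 * asymmetry); nothing here is taken from this paper's implementation:

    import statistics

    def percentile(sorted_vals, p):
        """Linear-interpolation percentile of an already-sorted list."""
        k = (len(sorted_vals) - 1) * p / 100.0
        lo = int(k)
        hi = min(lo + 1, len(sorted_vals) - 1)
        return sorted_vals[lo] + (sorted_vals[hi] - sorted_vals[lo]) * (k - lo)

    def delphi_summary(ratings):
        """Median and RAND/UCLA disagreement index for one Delphi item.

        ratings: 1-9 appropriateness ratings from the panel.
        A disagreement index >= 1 is conventionally read as disagreement.
        """
        r = sorted(ratings)
        ipr = percentile(r, 70) - percentile(r, 30)    # interpercentile range
        iprcp = (percentile(r, 70) + percentile(r, 30)) / 2.0
        ipras = 2.35 + 1.5 * abs(5.0 - iprcp)          # IPR adjusted for symmetry
        return statistics.median(r), ipr / ipras

    print(delphi_summary([7, 8, 8, 9, 9, 2, 3, 8, 9]))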
Scrum and Global Delivery: Pitfalls and Lessons Learned
NASA Astrophysics Data System (ADS)
Sadun, Cristiano
Two trends are becoming widespread in software development work: agile development processes and global delivery, both promising sizable benefits in productivity, capacity and so on. Combining the two is a highly attractive possibility, even more so in fast-paced and constrained commercial software engineering projects. However, a degree of conflict exists between the assumptions underlying the two ideas, leading to pitfalls and challenges in agile/distributed projects which are new, both with respect to traditional development and to agile or distributed efforts adopted separately. Succeeding in commercial agile/distributed projects implies recognizing these new challenges, proactively planning for them, and actively putting in place solutions and methods to overcome them. This chapter illustrates some of the typical challenges that were met during real-world commercial projects, and how they were solved.
Enabling Agile Testing through Continuous Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stolberg, Sean E.
2009-08-24
A Continuous Integration system is often considered one of the key elements involved in supporting an agile software development and testing environment. As a traditional software tester transitioning to an agile development environment, it became clear to me that I would need to put this essential infrastructure in place and promote improved development practices in order to make the transition to agile testing possible. This experience report discusses a continuous integration implementation I led last year. The initial motivations for implementing continuous integration are discussed, and a pre- and post-assessment using Martin Fowler's "Practices of Continuous Integration" is provided along with the technical specifics of the implementation. Finally, I wrap up with a retrospective of my experiences implementing and promoting continuous integration within the context of agile testing.
Browsing software of the Visible Korean data used for teaching sectional anatomy.
Shin, Dong Sun; Chung, Min Suk; Park, Hyo Seok; Park, Jin Seo; Hwang, Sung Bae
2011-01-01
The interpretation of computed tomographs (CTs) and magnetic resonance images (MRIs) to diagnose clinical conditions requires basic knowledge of sectional anatomy. Sectional anatomy has traditionally been taught using sectioned cadavers, atlases, and/or computer software. The computer software commonly used for this subject is practical and efficient for students but could be more advanced. The objective of this research was to present browsing software developed from the Visible Korean images that can be used for teaching sectional anatomy. One thousand seven hundred and two sets of MRIs, CTs, and sectioned images (intervals, one millimeter) of a whole male cadaver were prepared. Over 900 structures in the sectioned images were outlined and then filled with different colors to elaborate each structure. Software was developed where four corresponding images could be displayed simultaneously; in addition, the structures in the image data could be readily recognized with the aid of the color-filled outlines. The software, distributed free of charge, could be a valuable tool to teach medical students. For example, sectional anatomy could be taught by showing the sectioned images with real color and high resolution. Students could then review the lecture by using the sectioned and color-filled images on their own computers. Students could also be evaluated using the same software. Furthermore, other investigators would be able to replace the images for more comprehensive sectional anatomy. Copyright © 2011 Wiley-Liss, Inc.
A Knowledge-Based Representation Scheme for Environmental Science Models
NASA Technical Reports Server (NTRS)
Keller, Richard M.; Dungan, Jennifer L.; Lum, Henry, Jr. (Technical Monitor)
1994-01-01
One of the primary methods available for studying environmental phenomena is the construction and analysis of computational models. We have been studying how artificial intelligence techniques can be applied to assist in the development and use of environmental science models within the context of NASA-sponsored activities. We have identified several high-utility areas as potential targets for research and development: model development; data visualization, analysis, and interpretation; model publishing and reuse, training and education; and framing, posing, and answering questions. Central to progress on any of the above areas is a representation for environmental models that contains a great deal more information than is present in a traditional software implementation. In particular, a traditional software implementation is devoid of any semantic information that connects the code with the environmental context that forms the background for the modeling activity. Before we can build AI systems to assist in model development and usage, we must develop a representation for environmental models that adequately describes a model's semantics and explicitly represents the relationship between the code and the modeling task at hand. We have developed one such representation in conjunction with our work on the SIGMA (Scientists' Intelligent Graphical Modeling Assistant) environment. The key feature of the representation is that it provides a semantic grounding for the symbols in a set of modeling equations by linking those symbols to an explicit representation of the underlying environmental scenario.
Table-driven configuration and formatting of telemetry data in the Deep Space Network
NASA Technical Reports Server (NTRS)
Manning, Evan
1994-01-01
With a restructured software architecture for telemetry system control and data processing, the NASA/Deep Space Network (DSN) has substantially improved its ability to accommodate a wide variety of spacecraft in an era of 'better, faster, cheaper'. In the new architecture, the permanent software implements all capabilities needed by any system user, and text tables specify how these capabilities are to be used for each spacecraft. Most changes can now be made rapidly, outside of the traditional software development cycle. The system can be updated to support a new spacecraft through table changes rather than software changes, reducing the implementation, test, and delivery cycle for such a change from three months to three weeks. The mechanical separation of the text table files from the program software, with tables only loaded into memory when that mission is being supported, dramatically reduces the level of regression testing required. The format of each table is a different compromise between ease of human interpretation, efficiency of computer interpretation, and flexibility.
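The table format itself is not shown in the abstract; as a hedged sketch of the table-driven pattern described (permanent software implements every capability, a per-spacecraft text table selects among them), the example below uses an invented table syntax and stub capabilities:

    # Hypothetical per-spacecraft table; a new mission means a new table,
    # not new software. Syntax and field names are invented for illustration.
    TABLE_TEXT = """
    # channel   decoder       frame_len
    TLM-A       reed_solomon  1115
    TLM-B       convolutional 223
    """

    def reed_solomon(frame):       # permanent capability (stub)
        return f"RS-decoded {len(frame)} bytes"

    def convolutional(frame):      # permanent capability (stub)
        return f"Viterbi-decoded {len(frame)} bytes"

    DECODERS = {"reed_solomon": reed_solomon, "convolutional": convolutional}

    def load_table(text):
        """Parse the mission table; only this table changes per spacecraft."""
        config = {}
        for raw in text.splitlines():
            line = raw.strip()
            if not line or line.startswith("#"):
                continue
            channel, decoder, frame_len = line.split()
            config[channel] = (DECODERS[decoder], int(frame_len))
        return config

    telemetry = load_table(TABLE_TEXT)
    decode, frame_len = telemetry["TLM-A"]
    print(decode(bytes(frame_len)))   # same software, table-selected behavior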
ATLAS event display: Virtual Point-1 visualization software
NASA Astrophysics Data System (ADS)
Seeley, Kaelyn; Dimond, David; Bianchi, R. M.; Boudreau, Joseph; Hong, Tae Min; Atlas Collaboration
2017-01-01
Virtual Point-1 (VP1) is an event display visualization software for the ATLAS Experiment. VP1 is a software framework that makes use of ATHENA, the ATLAS software infrastructure, to access the complete detector geometry. This information is used to draw graphics representing the components of the detector at any scale. Two new features are added to VP1. The first is a traditional ``lego'' plot, displaying the calorimeter energy deposits in eta-phi space. The second is another lego plot focusing on the forward endcap region, displaying the energy deposits in r-phi space. Currently, these new additions display the energy deposits based on the granularity of the middle layer of the liquid-Argon electromagnetic calorimeter. Since VP1 accesses the complete detector geometry and all experimental data, future developments are outlined for a more detailed display involving multiple layers of the calorimeter along with their distinct granularities.
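VP1 itself is a C++ framework built on ATHENA; purely to illustrate what the new "lego" view plots, this toy Python sketch bins mock energy deposits in eta-phi space (the bin counts loosely stand in for calorimeter granularity and are coarser than the real middle-layer cells):

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    eta = rng.uniform(-2.5, 2.5, 5000)       # mock cell eta coordinates
    phi = rng.uniform(-np.pi, np.pi, 5000)   # mock cell phi coordinates
    energy = rng.exponential(1.0, 5000)      # mock energy deposits (GeV)

    # Sum deposits per (eta, phi) bin, as a lego plot does.
    H, xe, ye = np.histogram2d(eta, phi, bins=[40, 32], weights=energy)

    plt.imshow(H.T, origin="lower", aspect="auto",
               extent=[xe[0], xe[-1], ye[0], ye[-1]])
    plt.xlabel("eta")
    plt.ylabel("phi")
    plt.colorbar(label="summed energy [GeV]")
    plt.title("Toy eta-phi lego plot")
    plt.show()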
Precise and Scalable Static Program Analysis of NASA Flight Software
NASA Technical Reports Server (NTRS)
Brat, G.; Venet, A.
2005-01-01
Recent NASA mission failures (e.g., Mars Polar Lander and Mars Orbiter) illustrate the importance of having an efficient verification and validation process for such systems. One software error, as simple as it may be, can cause the loss of an expensive mission, or lead to budget overruns and crunched schedules. Unfortunately, traditional verification methods cannot guarantee the absence of errors in software systems. Therefore, we have developed the CGS static program analysis tool, which can exhaustively analyze large C programs. CGS analyzes the source code and identifies statements in which arrays are accessed out of bounds or pointers are used outside the memory region they should address. This paper gives a high-level description of CGS and its theoretical foundations. It also reports on the use of CGS on real NASA software systems used in Mars missions (from Mars PathFinder to Mars Exploration Rover) and on the International Space Station.
Orthographic Software Modelling: A Novel Approach to View-Based Software Engineering
NASA Astrophysics Data System (ADS)
Atkinson, Colin
The need to support multiple views of complex software architectures, each capturing a different aspect of the system under development, has been recognized for a long time. Even the very first object-oriented analysis/design methods such as the Booch method and OMT supported a number of different diagram types (e.g. structural, behavioral, operational), and subsequent methods such as Fusion, Kruchten's 4+1 views and the Rational Unified Process (RUP) have added many more views over time. Today's leading modeling languages, such as UML and SysML, are also oriented towards supporting different views (i.e. diagram types), each able to portray different facets of a system's architecture. More recently, so-called enterprise architecture frameworks such as the Zachman Framework, TOGAF and RM-ODP have become popular. These add a whole set of new non-functional views to the views typically emphasized in traditional software engineering environments.
Semantic Metrics for Analysis of Software
NASA Technical Reports Server (NTRS)
Etzkorn, Letha H.; Cox, Glenn W.; Farrington, Phil; Utley, Dawn R.; Ghalston, Sampson; Stein, Cara
2005-01-01
A recently conceived suite of object-oriented software metrics focuses on semantic aspects of software, in contradistinction to traditional software metrics, which focus on syntactic aspects. Semantic metrics represent a more human-oriented view of software than do syntactic metrics. The semantic metrics of a given computer program are calculated by use of the output of a knowledge-based analysis of the program, and are substantially more representative of software quality and more readily comprehensible from a human perspective than are the syntactic metrics.
Automatic Generation of Just-in-Time Online Assessments from Software Design Models
ERIC Educational Resources Information Center
Zualkernan, Imran A.; El-Naaj, Salim Abou; Papadopoulos, Maria; Al-Amoudi, Budoor K.; Matthews, Charles E.
2009-01-01
Computer software is pervasive in today's society. The rate at which new versions of computer software products are released is phenomenal when compared to the release rate of new products in traditional industries such as aircraft building. This rapid rate of change can partially explain why most certifications in the software industry are…
Selection and Integration of a Computer Simulation for Public Budgeting and Finance (PBS 116).
ERIC Educational Resources Information Center
Banas, Ed Jr.
1998-01-01
Describes the development of a course on public budgeting and finance, which integrated the use of SimCity Classic, a computer-simulation software, with traditional lecture, guest speakers, and collaborative-learning activities. Explains the rationale for the course design and discusses the results from the first semester of teaching the course.…
ERIC Educational Resources Information Center
Zanin, Mary K. B.
2015-01-01
Over the years, many of my students have reported that they enjoy lectures that include short, simple animations. To keep students engaged, I have developed a small set of teaching animations using PowerPoint and Camtasia Studio software packages. A survey of students who learned four difficult topics with traditional written lessons and with…
Planning and Scheduling of Software Manufacturing Projects
1991-03-01
based on the previous results in social analysis of computing, operations research in manufacturing, artificial intelligence in manufacturing planning and scheduling, and the traditional approaches to planning in artificial intelligence, and extends the techniques that have been developed by them…
ERIC Educational Resources Information Center
Srinivasan, Deepa
2013-01-01
Recent rapid malware growth has exposed the limitations of traditional in-host malware-defense systems and motivated the development of secure virtualization-based solutions. By running vulnerable systems as virtual machines (VMs) and moving security software from inside VMs to the outside, the out-of-VM solutions securely isolate the anti-malware…
Using an architectural approach to integrate heterogeneous, distributed software components
NASA Technical Reports Server (NTRS)
Callahan, John R.; Purtilo, James M.
1995-01-01
Many computer programs cannot be easily integrated because their components are distributed and heterogeneous, i.e., they are implemented in diverse programming languages, use different data representation formats, or their runtime environments are incompatible. In many cases, programs are integrated by modifying their components or interposing mechanisms that handle communication and conversion tasks. For example, remote procedure call (RPC) helps integrate heterogeneous, distributed programs. When configuring such programs, however, mechanisms like RPC must be used explicitly by software developers in order to integrate collections of diverse components. Each collection may require a unique integration solution. This paper describes improvements to the concepts of software packaging and some of our experiences in constructing complex software systems from a wide variety of components in different execution environments. Software packaging is a process that automatically determines how to integrate a diverse collection of computer programs based on the types of components involved and the capabilities of available translators and adapters in an environment. Software packaging provides a context that relates such mechanisms to software integration processes and reduces the cost of configuring applications whose components are distributed or implemented in different programming languages. Our software packaging tool subsumes traditional integration tools like UNIX make by providing a rule-based approach to software integration that is independent of execution environments.
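The packaging tool itself is not reproduced here; a minimal sketch of the rule-based idea it describes, with invented component types and integration mechanisms, shows how an integration step can be derived from the types involved rather than written by hand for each collection:

    # Each rule maps (producer_type, consumer_type) to an integration
    # mechanism. Types and mechanisms are invented for illustration.
    RULES = {
        ("fortran",  "c_binary"):   "insert data-format converter, then link",
        ("c_binary", "c_binary"):   "link object files directly",
        ("c_binary", "lisp_image"): "generate RPC stubs and wrap",
    }

    def plan_integration(components):
        """Choose an integration mechanism for each adjacent pair."""
        steps = []
        for (prod, ptype), (cons, ctype) in zip(components, components[1:]):
            mechanism = RULES.get((ptype, ctype))
            if mechanism is None:
                raise ValueError(f"no rule integrates {ptype} with {ctype}")
            steps.append(f"{prod} -> {cons}: {mechanism}")
        return steps

    pipeline = [("simulator", "fortran"), ("filter", "c_binary"),
                ("planner", "lisp_image")]
    print("\n".join(plan_integration(pipeline)))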
Mining dynamic noteworthy functions in software execution sequences
Huang, Guoyan; Wang, Yuqian; He, Haitao; Ren, Jiadong
2017-01-01
As the quality of crucial entities can directly affect that of software, their identification and protection become an important premise for effective software development, management, maintenance and testing, which thus contribute to improving the software quality and its attack-defending ability. Most existing approaches to analyzing and evaluating important entities, such as code-based static structure analysis, disregard the actual execution of the software. In this paper, from the perspective of the software execution process, we propose an approach to mine dynamic noteworthy functions (DNFM) in software execution sequences. First, by decompiling the software and tracking stack changes, execution traces composed of series of function addresses are acquired. These traces are modeled as execution sequences and then simplified to obtain simplified sequences (SFS), from which patterns are extracted by a pattern extraction (PE) algorithm. After that, the evaluating indicators inner-importance and inter-importance are designed to measure the noteworthiness of functions in the DNFM algorithm. Finally, the functions are sorted by their noteworthiness. Comparisons were conducted against two traditional complex-network-based node mining methods, PageRank and DegreeRank. The results show that the DNFM method can mine noteworthy functions in software effectively and precisely. PMID:28278276
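The PE algorithm and the exact inner-/inter-importance formulas are not given in the abstract; the sketch below is a loose analogue only, using within-trace call frequency and across-trace spread as stand-in scores, to show the shape of the pipeline (traces, then scores, then ranked functions):

    from collections import Counter

    # Mock execution traces: each is a sequence of function identifiers.
    traces = [
        ["main", "parse", "eval", "eval", "emit"],
        ["main", "parse", "eval", "emit"],
        ["main", "load", "eval", "emit"],
    ]

    def rank_functions(traces):
        """Rank functions by a toy noteworthiness score.

        inner ~ how often a function executes within traces;
        inter ~ how many distinct traces it appears in.
        Both are stand-ins for the paper's own indicators.
        """
        inner = Counter(f for t in traces for f in t)
        inter = Counter(f for t in traces for f in set(t))
        score = {f: inner[f] * inter[f] / len(traces) for f in inner}
        return sorted(score.items(), key=lambda kv: kv[1], reverse=True)

    for func, s in rank_functions(traces):
        print(f"{func}: {s:.2f}")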
Computational Infrastructure for Geodynamics (CIG)
NASA Astrophysics Data System (ADS)
Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.
2004-12-01
Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts and although this approach has proven successful, its strength for solving problems of interest is now starting to show its limitations as we try to share codes and algorithms or when we want to recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to leverage and develop long-term strategic partnerships with open source development efforts within the larger thrusts of scientific computing and geoinformatics. These strategic partnerships are essential as the frontier has moved into multi-scale and multi-physics problems in which many investigators now want to use simulation software for data interpretation, data assimilation, and hypothesis testing.
Flight dynamics software in a distributed network environment
NASA Technical Reports Server (NTRS)
Jeletic, J.; Weidow, D.; Boland, D.
1995-01-01
As with all NASA facilities, the announcement of reduced budgets, reduced staffing, and the desire to implement smaller/quicker/cheaper missions has required the Agency's organizations to become more efficient in what they do. To accomplish these objectives, the FDD has initiated the development of the Flight Dynamics Distributed System (FDDS). The underlying philosophy of FDDS is to build an integrated system that breaks down the traditional barriers of attitude, mission planning, and navigation support software to provide a uniform approach to flight dynamics applications. Through the application of open systems concepts and state-of-the-art technologies, including object-oriented specification concepts, object-oriented software, and common user interface, communications, data management, and executive services, the FDD will reengineer most of its six million lines of code.
NASA Technical Reports Server (NTRS)
Wallace, Dolores R.
2003-01-01
In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is: "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?" Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects applicable to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models, however, require much project data that may be difficult to gather and interpret. Projects at GSFC are often complex in both technology and schedule. Assessing and estimating the reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Parametric and distribution-free techniques may offer a new and accurate way of modeling failure times and other project data to provide earlier and more accurate estimates of system reliability.
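As one concrete example of the "familiar distributions" the passage refers to (the specific models under study at GSFC are not named here), the classic Goel-Okumoto parametric model expresses the expected cumulative number of failures observed by time t as

    \mu(t) = a \left( 1 - e^{-bt} \right), \qquad \lambda(t) = \frac{d\mu}{dt} = a b\, e^{-bt}

where a is the expected total number of faults and b the per-fault detection rate. Fitting a and b requires exactly the kind of project failure data the passage notes can be hard to collect and interpret, which is what motivates the distribution-free alternatives.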
Open high-level data formats and software for gamma-ray astronomy
NASA Astrophysics Data System (ADS)
Deil, Christoph; Boisson, Catherine; Kosack, Karl; Perkins, Jeremy; King, Johannes; Eger, Peter; Mayer, Michael; Wood, Matthew; Zabalza, Victor; Knödlseder, Jürgen; Hassan, Tarek; Mohrmann, Lars; Ziegler, Alexander; Khelifi, Bruno; Dorner, Daniela; Maier, Gernot; Pedaletti, Giovanna; Rosado, Jaime; Contreras, José Luis; Lefaucheur, Julien; Brügge, Kai; Servillat, Mathieu; Terrier, Régis; Walter, Roland; Lombardi, Saverio
2017-01-01
In gamma-ray astronomy, a variety of data formats and proprietary software have been traditionally used, often developed for one specific mission or experiment. Especially for ground-based imaging atmospheric Cherenkov telescopes (IACTs), data and software are mostly private to the collaborations operating the telescopes. However, there is a general movement in science towards the use of open data and software. In addition, the next-generation IACT instrument, the Cherenkov Telescope Array (CTA), will be operated as an open observatory. We have created a Github organisation at https://github.com/open-gamma-ray-astro where we are developing high-level data format specifications. A public mailing list was set up at https://lists.nasa.gov/mailman/listinfo/open-gamma-ray-astro and a first face-to-face meeting on the IACT high-level data model and formats took place in April 2016 in Meudon (France). This open multi-mission effort will help to accelerate the development of open data formats and open-source software for gamma-ray astronomy, leading to synergies in the development of analysis codes and eventually better scientific results (reproducible, multi-mission). This write-up presents this effort for the first time, explaining the motivation and context, the available resources and process we use, as well as the status and planned next steps for the data format specifications. We hope that it will stimulate feedback and future contributions from the gamma-ray astronomy community.
Yu, Xuefei; Lin, Liangzhuo; Shen, Jie; Chen, Zhi; Jian, Jun; Li, Bin; Xin, Sherman Xuegang
2018-01-01
The mean amplitude of glycemic excursions (MAGE) is an essential index for assessing glycemic variability and a key reference for clinical blood glucose control. However, the traditional "ruler and pencil" manual method for calculating MAGE is time-consuming and error-prone given the huge data size, making the development of a robust computer-aided program an urgent requirement. Although several software products are available in place of manual calculation, poor agreement among them has been reported, so more studies are required in this field. In this paper, we developed a mathematical algorithm based on integer nonlinear programming. Following the proposed mathematical method, an open-code computer program named MAGECAA v1.0 was developed and validated. Statistical analysis indicated that the developed program was robust compared with the manual method. The agreement between the developed program and currently available popular software is satisfactory, indicating that concern about disagreement among different software products is unnecessary. The open-code programmable algorithm is an additional resource for peers interested in related methodological studies in the future.
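The integer nonlinear programming formulation is not given in the abstract; for orientation, here is a simplified sketch of the classic manual definition that such programs automate: keep only excursions larger than one standard deviation of the series and average their amplitudes. (The textbook definition additionally fixes the counting direction from the first qualifying excursion, which this toy omits.)

    import statistics

    def mage(glucose):
        """Toy MAGE: mean amplitude of excursions exceeding 1 SD."""
        sd = statistics.pstdev(glucose)
        # Turning points: endpoints plus local minima/maxima of the profile.
        turns = [glucose[0]] + [
            g for prev, g, nxt in zip(glucose, glucose[1:], glucose[2:])
            if (g - prev) * (nxt - g) < 0
        ] + [glucose[-1]]
        # Amplitudes between successive turning points, kept if > 1 SD.
        amps = [abs(b - a) for a, b in zip(turns, turns[1:]) if abs(b - a) > sd]
        return statistics.mean(amps) if amps else 0.0

    profile = [90, 120, 180, 150, 100, 160, 210, 140, 95, 130]
    print(round(mage(profile), 1))   # 98.8 for this mock profile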
A Framework for Performing V&V within Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1996-01-01
Verification and validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In order to provide early detection of errors, V&V is conducted in parallel with system development, often beginning with the concept phase. In reuse-based software engineering, however, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In this case, V&V must be performed during domain engineering in order to have an impact on system development. This paper describes a framework for performing V&V within architecture-centric, reuse-based software engineering. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.
Designing and Using Software Tools for Educational Purposes: FLAT, a Case Study
ERIC Educational Resources Information Center
Castro-Schez, J. J.; del Castillo, E.; Hortolano, J.; Rodriguez, A.
2009-01-01
Educational software tools are considered to enrich teaching strategies, providing a more compelling means of exploration and feedback than traditional blackboard methods. Moreover, software simulators provide a more motivating link between theory and practice than pencil-paper methods, encouraging active and discovery learning in the students.…
Dave, Vivek S; Shahin, Hend I; Youngren-Ortiz, Susanne R; Chougule, Mahavir B; Haware, Rahul V
2017-10-30
The density, porosity, breaking force, viscoelastic properties, and the presence or absence of any structural defects or irregularities are important physical-mechanical quality attributes of popular solid dosage forms like tablets. The irregularities associated with these attributes may influence the drug product functionality. Thus, an accurate and efficient characterization of these properties is critical for successful development and manufacturing of robust tablets. These properties are mainly analyzed and monitored with traditional pharmacopeial and non-pharmacopeial methods. Such methods are associated with several challenges such as lack of spatial resolution, efficiency, or sample-sparing attributes. Recent advances in technology, design, instrumentation, and software have led to the emergence of newer techniques for non-invasive characterization of physical-mechanical properties of tablets. These techniques include near infrared spectroscopy, Raman spectroscopy, X-ray microtomography, nuclear magnetic resonance (NMR) imaging, terahertz pulsed imaging, laser-induced breakdown spectroscopy, and various acoustic- and thermal-based techniques. Such state-of-the-art techniques are currently applied at various stages of development and manufacturing of tablets at industrial scale. Each technique has specific advantages or challenges with respect to operational efficiency and cost, compared to traditional analytical methods. Currently, most of these techniques are used as secondary analytical tools to support the traditional methods in characterizing or monitoring tablet quality attributes. Therefore, further development in the instrumentation and software, and studies on the applications are necessary for their adoption in routine analysis and monitoring of tablet physical-mechanical properties. Copyright © 2017 Elsevier B.V. All rights reserved.
Yu, Ming; Cao, Qi-chen; Su, Yu-xi; Sui, Xin; Yang, Hong-jun; Huang, Lu-qi; Wang, Wen-ping
2015-08-01
Malignant tumors are among the main causes of death in the world today, seriously harming human health and life and restricting social and economic development. There are many reports on traditional Chinese medicine patent prescriptions, empirical prescriptions, and self-made prescriptions for treating cancer, and prescription rules have often been analyzed on the basis of medication frequency. Such methods can reveal dominant prescribing experience but rarely yield new discoveries or knowledge. In this paper, based on the traditional Chinese medicine inheritance assistance system, the data mining methods integrated in the software (an improved mutual information method, complex-system entropy clustering, and unsupervised hierarchical entropy clustering) were adopted to analyze the rules of traditional Chinese medicine prescriptions for cancer. In total, 114 prescriptions were selected, the frequency of each herb in the prescriptions was determined, and 85 core combinations and 13 new prescriptions were identified. The traditional Chinese medicine inheritance assistance system, as a valuable research-supporting tool, can be used to record, manage, query, and analyze prescription data.
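The improved mutual information method integrated in the system is not specified in the abstract; as a rough illustration of the underlying idea (scoring how strongly two herbs co-occur across prescriptions), here is a pointwise mutual information toy with invented herb names:

    from math import log2

    # Mock prescriptions: sets of herbs (names invented for illustration).
    prescriptions = [
        {"astragalus", "atractylodes", "poria"},
        {"astragalus", "poria", "licorice"},
        {"atractylodes", "poria", "licorice"},
        {"astragalus", "atractylodes", "poria", "licorice"},
    ]

    def herb_pmi(a, b, rxs):
        """Pointwise mutual information of two herbs co-occurring in rxs."""
        n = len(rxs)
        pa = sum(a in r for r in rxs) / n
        pb = sum(b in r for r in rxs) / n
        pab = sum(a in r and b in r for r in rxs) / n
        return log2(pab / (pa * pb)) if pab else float("-inf")

    print(herb_pmi("astragalus", "licorice", prescriptions))

Pairs with scores well above zero co-occur more often than chance and are candidates for core combinations; real analyses run such scoring over the full prescription set.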
Infusing Reliability Techniques into Software Safety Analysis
NASA Technical Reports Server (NTRS)
Shi, Ying
2015-01-01
Software safety analysis for a large software-intensive system is always a challenge. Software safety practitioners need to ensure that software-related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.
The Cloud-Based Integrated Data Viewer (IDV)
NASA Astrophysics Data System (ADS)
Fisher, Ward
2015-04-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While there is a suite of tools and methodologies used in traditional software engineering environments to mitigate this issue, they are typically ignored by developers lacking a background in software engineering. The result is a large body of software which is simultaneously critical and difficult to maintain. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software APIs. The advent of cloud computing has provided a solution to this problem which was not previously practical on a large scale: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. Through application streaming we are able to bring the same visualization to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved. We will also discuss the differences between local software and software-as-a-service.
NASA Astrophysics Data System (ADS)
Semushin, I. V.; Tsyganova, J. V.; Ugarov, V. V.; Afanasova, A. I.
2018-05-01
Russian higher education institutions' tradition of teaching large-enrollment classes impairs students' striving for individual prominence, one-upmanship, and hopes for originality. Intending to convert these drawbacks into benefits, a Project-Centred Education Model (PCEM) has been introduced to deliver Computational Mathematics and Information Science courses. The model combines a Frontal Competitive Approach and a Project-Driven Learning (PDL) framework. The PDL framework has been developed by stating and solving three design problems: (i) enhance the diversity of project assignments on the algorithmic approaches of specific computational methods, (ii) balance similarity and dissimilarity of the project assignments, and (iii) develop a software assessment tool suitable for evaluating the technological maturity of students' project deliverables, thus reducing the instructor's workload and possible oversights. The positive experience accumulated over 15 years shows that implementing the PCEM keeps students motivated to strive for success in rising to higher levels of their computational and software engineering skills.
ISAC's Gating-ML 2.0 data exchange standard for gating description.
Spidlen, Josef; Moore, Wayne; Brinkman, Ryan R
2015-07-01
The lack of software interoperability with respect to gating has traditionally been a bottleneck preventing the use of multiple analytical tools and reproducibility of flow cytometry data analysis by independent parties. To address this issue, ISAC developed Gating-ML, a computer file format to encode and interchange gates. Gating-ML 1.5 was adopted and published as an ISAC Candidate Recommendation in 2008. Feedback during the probationary period from implementors, including major commercial software companies, instrument vendors, and the wider community, has led to a streamlined Gating-ML 2.0. Gating-ML has been significantly simplified and is therefore easier for software tools to support. To aid developers, free, open source reference implementations, compliance tests, and detailed examples are provided to stimulate further commercial adoption. ISAC has approved Gating-ML as a standard ready for deployment in the public domain and encourages its support within the community as it is at a mature stage of development having undergone extensive review and testing, under both theoretical and practical conditions. © 2015 International Society for Advancement of Cytometry.
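Gating-ML is an XML specification whose schema is not reproduced here; as a conceptual illustration only, this sketch shows the kind of gate such a file encodes, an axis-aligned rectangle gate that any compliant tool should apply identically (dimension names and thresholds are invented):

    def rectangle_gate(events, dims, mins, maxs):
        """Keep events that fall inside an axis-aligned rectangular gate."""
        return [
            e for e in events
            if all(mins[i] <= e[d] < maxs[i] for i, d in enumerate(dims))
        ]

    events = [{"FSC-A": 4.2e4, "SSC-A": 1.1e4},
              {"FSC-A": 9.9e4, "SSC-A": 8.7e4}]
    gated = rectangle_gate(events, ["FSC-A", "SSC-A"],
                           mins=[2e4, 0.5e4], maxs=[8e4, 5e4])
    print(len(gated))   # 1 of the 2 mock events passes the gate

Encoding the gate's dimensions and bounds in a standard file, rather than in each tool's internal format, is what makes the result reproducible across software packages.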
ERIC Educational Resources Information Center
Iaeger, Paula Irene
2012-01-01
This research synthesizes data and presents it using mapping software to help to identify potential site locations for community-centered higher education alternatives and more traditional junior-level colleges in Uganda. What factors can be used to quantify one site over another for the location of such an institution and if these factors can be…
ERIC Educational Resources Information Center
Bazo, Plácido; Rodríguez, Romén; Fumero, Dácil
2016-01-01
In this paper, we will introduce an innovative software platform that can be especially useful in a Content and Language Integrated Learning (CLIL) context. This tool is called Vocabulary Notebook, and has been developed to solve all the problems that traditional (paper) vocabulary notebooks have. This tool keeps focus on the personalisation of…
Crowd-driven Ecosystem for Evolutionary Design
2012-07-28
also embeds social media connections to maximize crowd engagement. Within such an environment, experts and non-traditional contributors (crowd) can…process. When completed, the software developed under the…track a project of interest online through other social media (namely RSS, Facebook, and Twitter) as well as on the vehicleforge website itself
Software system design for the non-null digital Moiré interferometer
NASA Astrophysics Data System (ADS)
Chen, Meng; Hao, Qun; Hu, Yao; Wang, Shaopu; Li, Tengfei; Li, Lin
2016-11-01
Aspheric optical components are an indispensable part of modern optical systems. With the development of aspheric fabrication techniques, high-precision testing of the figure error of aspheric surfaces has become an urgent issue. We proposed a digital Moiré interferometry technique (DMIT) based on the partial compensation principle for measuring aspheric and freeform surfaces. Different from a traditional interferometer, DMIT consists of a real and a virtual interferometer. The virtual interferometer is simulated with the Zemax software to perform phase-shifting and alignment. Results are obtained by computation from the real interferogram and the computer-generated virtual interferograms. DMIT requires a dedicated, reliable software system to work properly. Image acquisition and data processing are two important parts of this system, and establishing the connection between the real and virtual interferometers is a further challenge. In this paper, we present a software system design for DMIT with a friendly user interface and robust data-processing features, enabling us to acquire the figure error of the measured asphere. We chose Visual C++ as the software development platform and control the virtual interferometer through hybrid programming with Zemax. After image acquisition and data transmission, the system calls image-processing algorithms written in Matlab to calculate the figure error of the measured asphere. We tested the software system experimentally: the measurement of an aspheric surface demonstrated the feasibility of the software system.
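The paper's own processing chain is only summarized above; as a hedged sketch of the digital phase-shifting step it describes, the standard four-step formula recovers the wrapped phase from four interferograms shifted by pi/2 each:

    import numpy as np

    def four_step_phase(i1, i2, i3, i4):
        """Wrapped phase from four frames I_k = A + B*cos(phi + k*pi/2).

        Standard four-step phase-shifting formula; the actual DMIT
        virtual/real interferogram pipeline may differ.
        """
        return np.arctan2(i4 - i2, i1 - i3)

    # Synthetic check: reconstruct a known phase map.
    phi = np.linspace(0, 4 * np.pi, 256).reshape(16, 16)
    frames = [1 + np.cos(phi + k * np.pi / 2) for k in range(4)]
    wrapped = four_step_phase(*frames)
    residual = np.angle(np.exp(1j * (wrapped - phi)))   # remove the 2*pi jumps
    print(np.allclose(residual, 0, atol=1e-6))          # True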
Müller, M L; Ganslandt, T; Eich, H P; Lang, K; Ohmann, C; Prokosch, H U
2001-12-01
Clinicians' acceptance of clinical decision support depends on its workflow-oriented, context-sensitive accessibility and availability at the point of care, integrated into the Electronic Patient Record (EPR). Commercially available Hospital Information Systems (HIS) often focus on administrative tasks and mostly do not provide additional knowledge based functionality. Their traditionally monolithic and closed software architecture encumbers integration of and interaction with external software modules. Our aim was to develop methods and interfaces to integrate knowledge sources into two different commercial hospital information systems to provide the best decision support possible within the context of available patient data. An existing, proven standalone scoring system for acute abdominal pain was supplemented by a communication interface. In both HIS we defined data entry forms and developed individual and reusable mechanisms for data exchange with external software modules. We designed an additional knowledge support frontend which controls data exchange between HIS and the knowledge modules. Finally, we added guidelines and algorithms to the knowledge library. Despite some major drawbacks which resulted mainly from the HIS' closed software architectures we showed exemplary, how external knowledge support can be integrated almost seamlessly into different commercial HIS. This paper describes the prototypical design and current implementation and discusses our experiences.
Research on axisymmetric aspheric surface numerical design and manufacturing technology
NASA Astrophysics Data System (ADS)
Wang, Zhen-zhong; Guo, Yin-biao; Lin, Zheng
2006-02-01
The key technology for aspheric machining is generating an exact machining path so that aspheric lenses can be machined with high accuracy and efficiency, even as traditional manual manufacturing has developed into today's numerical control (NC) machining. This paper presents a mathematical model relating a virtual cone to the aspheric surface equations, and discusses techniques for uniform grinding-wheel wear and error compensation in aspheric machining. Finally, a software system for high-precision aspheric surface manufacturing is designed and implemented based on the above. This software system works out the grinding wheel path from input parameters and generates NC machining programs for aspheric surfaces.
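The virtual-cone mapping itself is not reproduced in the abstract; for reference, the axisymmetric aspheric surfaces being machined are conventionally described by the standard sag equation

    z(r) = \frac{c\, r^{2}}{1 + \sqrt{1 - (1 + K)\, c^{2} r^{2}}} + \sum_{i} A_{2i}\, r^{2i}

where c is the vertex curvature, K the conic constant, and A_{2i} the higher-order aspheric coefficients; a grinding-wheel path can then be generated by sampling z(r) and its surface normals.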
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
Reconstruction of Cyber and Physical Software Using Novel Spread Method
NASA Astrophysics Data System (ADS)
Ma, Wubin; Deng, Su; Huang, Hongbin
2018-03-01
Cyber-physical software has received sustained attention since 2010. Many researchers would disagree with deploying the traditional Spread Method for the reconstruction of cyber-physical software, which embodies the key principles of cyber-physical system reconstruction. NSM (Novel Spread Method), our new methodology for the reconstruction of cyber-physical software, is our proposed solution to these challenges.
Rapid Prototyping: A Survey and Evaluation of Methodologies and Models
1990-03-01
possibility of program coding errors or design differences from the actual prototype the user validated. The methodology should result in a production…behavior within the problem domain to be defined. Each method has a different approach towards developing the set of symbols with which to define the…investigate prototyping as a viable alternative to the conventional method of software development. By the mid-1980s, it was evident that the traditional
2013-06-01
Communication Applet) UNIGE – D.I.M.E. Using a free application such as "MIT App Inventor" and the Android Software Development Kit for degraded C2 (ICCRTS 2013)…operate on an upgradable Android operating system, on which a simplified ACA (Android Communication Applet) will be developed that will call C24U…(Server) IP number…portable COTS devices; ACA - C24U (Android Communication Applet); sending/receiving SEFL (Simple Exchange
Leveraging Modeling Approaches: Reaction Networks and Rules
Blinov, Michael L.; Moraru, Ion I.
2012-01-01
We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high resolution and/or high throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interactions kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349
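A toy illustration of the combinatorial point (code from neither paper): a molecule with n independent modification sites needs 2^n explicitly enumerated species in the traditional representation, but only n rules in a rule-based one. The naming scheme loosely echoes rule-based languages and is invented here:

    from itertools import product

    n_sites = 10

    # Traditional approach: enumerate every molecular state explicitly.
    species = [
        "R(" + ",".join(f"s{i}{'~P' if p else ''}" for i, p in enumerate(state)) + ")"
        for state in product([0, 1], repeat=n_sites)
    ]
    print(len(species))   # 2**10 = 1024 distinct species

    # Rule-based approach: one phosphorylation rule per site, each silent
    # about the state of all the other sites.
    rules = [f"R(s{i}) -> R(s{i}~P)  k{i}" for i in range(n_sites)]
    print(len(rules))     # 10 rules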
Characteristics study of the gears by the CAD/CAE
NASA Astrophysics Data System (ADS)
Wang, P. Y.; Chang, S. L.; Lee, B. Y.; Nguyen, D. H.; Cao, C. W.
2017-09-01
Gears are the most important transmission components in machines. The rapid development of machines in industry requires a shorter analysis process. Traditionally, gears are analyzed by first setting up the complete mathematical model, considering the cutter profile and the coordinate-system relationship between the machine and the cutter, which is a complex and time-consuming process. Recently, CAD/CAE software has become well developed and useful in mechanical design. In this paper, the Autodesk Inventor® software is first introduced to model the spherical gears, and the models are then transferred into ANSYS Workbench for finite element analysis. The process proposed in this paper helps engineers speed up the analysis of gears in the design stage.
Quantitative analyses for elucidating mechanisms of cell fate commitment in the mouse blastocyst
NASA Astrophysics Data System (ADS)
Saiz, Néstor; Kang, Minjung; Puliafito, Alberto; Schrode, Nadine; Xenopoulos, Panagiotis; Lou, Xinghua; Di Talia, Stefano; Hadjantonakis, Anna-Katerina
2015-03-01
In recent years we have witnessed a shift from qualitative image analysis towards higher resolution, quantitative analyses of imaging data in developmental biology. This shift has been fueled by technological advances in both imaging and analysis software. We have recently developed a tool for accurate, semi-automated nuclear segmentation of imaging data from early mouse embryos and embryonic stem cells. We have applied this software to the study of the first lineage decisions that take place during mouse development and established analysis pipelines for both static and time-lapse imaging experiments. In this paper we summarize the conclusions from these studies to illustrate how quantitative, single-cell level analysis of imaging data can unveil biological processes that cannot be revealed by traditional qualitative studies.
ERIC Educational Resources Information Center
Incikabi, Lutfi; Sancar Tokmak, Hatice
2012-01-01
This case study examined the educational software evaluation processes of pre-service teachers who attended either expertise-based training (XBT) or traditional training in conjunction with a Software-Evaluation checklist. Forty-three mathematics teacher candidates and three experts participated in the study. All participants evaluated educational…
Software Application for Computer Aided Vocabulary Learning in a Blended Learning Environment
ERIC Educational Resources Information Center
Essam, Rasha
2010-01-01
This study focuses on the effect of computer-aided vocabulary learning software called "ArabCAVL" on students' vocabulary acquisition. It was hypothesized that students who use the ArabCAVL software in blended learning environment will surpass students who use traditional vocabulary learning strategies in face-to-face learning…
An Object-Oriented Network-Centric Software Architecture for Physical Computing
NASA Astrophysics Data System (ADS)
Palmer, Richard
1997-08-01
Recent developments in object-oriented computer languages and infrastructure such as the Internet, Web browsers, and the like provide an opportunity to define a more productive computational environment for scientific programming that is based more closely on the underlying mathematics describing physics than traditional programming languages such as FORTRAN or C++. In this talk I describe an object-oriented software architecture for representing physical problems that includes classes for such common mathematical objects as geometry, boundary conditions, partial differential and integral equations, discretization and numerical solution methods, etc. In practice, a scientific program written using this architecture looks remarkably like the mathematics used to understand the problem, is typically an order of magnitude smaller than traditional FORTRAN or C++ codes, and hence easier to understand, debug, describe, etc. All objects in this architecture are ``network-enabled,'' which means that components of a software solution to a physical problem can be transparently loaded from anywhere on the Internet or other global network. The architecture is expressed as an ``API,'' or application programmers interface specification, with reference embeddings in Java, Python, and C++. A C++ class library for an early version of this API has been implemented for machines ranging from PCs to the IBM SP2, meaning that identical codes run on all architectures.
The Impact of Software Culture on the Management of Community Data
NASA Astrophysics Data System (ADS)
Collins, J. A.; Pulsifer, P. L.; Sheffield, E.; Lewis, S.; Oldenburg, J.
2013-12-01
The Exchange for Local Observations and Knowledge of the Arctic (ELOKA), a program hosted at the National Snow and Ice Data Center (NSIDC), supports the collection, curation, and distribution of Local and Traditional Knowledge (LTK) data, as well as some quantitative data products. Investigations involving LTK data often involve community participation, and therefore require flexible and robust user interfaces to support a reliable process of data collection and management. Often, investigators focused on LTK and community-based monitoring choose to use ELOKA's data services based on our ability to provide rapid proofs of concept and economical delivery of a usable product. To satisfy these two overarching criteria, ELOKA is experimenting with modifications to its software development culture, both in how software applications are developed and in the kind of software applications (or components) being developed. Over the past several years, NSIDC has shifted its software development culture from one of assigning individual scientific programmers to support particular principal investigators or projects, to an Agile software methodology implementation using Scrum practices. ELOKA has participated in this process by working with other product owners to schedule and prioritize development work which is then implemented by a team of application developers. Scrum, along with practices such as Test Driven Development (TDD) and pair programming, improves the quality of the software product delivered to the user community. To meet the need for rapid prototyping and to maximize product development and support with limited developer input, our software development efforts are now focused on creating a platform of application modules that can be quickly customized to suit the needs of a variety of LTK projects. This approach is in contrast to the strategy of delivering custom applications for individual projects. To date, we have integrated components of the Nunaliit Atlas framework (a Java/JavaScript client-server web-based application) with an existing Ruby on Rails application. This approach requires transitioning individual applications to expose a service layer, thus allowing interapplication communication via RESTful services. In this presentation we will report on our experiences using Agile Scrum practices, our efforts to move from custom solutions to a platform of customizable modules, and the impact of each on our ability to support researchers and Arctic residents in the domain of community-based observations and knowledge.
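The service-layer pattern mentioned above can be sketched minimally: one application exposes its data as a JSON resource over HTTP so that another application (say, a Nunaliit atlas or a Rails app) can consume it. The endpoint path and payload fields below are hypothetical, not ELOKA's actual interface.

```python
# Minimal sketch of exposing a service layer over REST so two applications
# can interoperate. The endpoint path and JSON fields are hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

OBSERVATIONS = [  # stand-in for data managed by one application
    {"id": 1, "community": "Clyde River", "note": "early sea-ice breakup"},
]

class ServiceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Any other application can consume this resource over plain HTTP.
        if self.path == "/api/observations":
            body = json.dumps(OBSERVATIONS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), ServiceHandler).serve_forever()
```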
Ada and the rapid development lifecycle
NASA Technical Reports Server (NTRS)
Deforrest, Lloyd; Gref, Lynn
1991-01-01
JPL is under contract, through NASA, with the US Army to develop a state-of-the-art Command Center System for the US European Command (USEUCOM). The Command Center System will receive, process, and integrate force status information from various sources and provide this integrated information to staff officers and decision makers in a format designed to enhance user comprehension and utility. The system is based on distributed workstation-class microcomputers, VAX- and SUN-based data servers, and interfaces to existing military mainframe systems and communication networks. JPL is developing the Command Center System utilizing an incremental delivery methodology called the Rapid Development Methodology, with adherence to government and industry standards including the UNIX operating system, X Windows, OSF/Motif, and the Ada programming language. Through a combination of software engineering techniques specific to the Ada programming language and the rapid development approach, JPL was able to deliver capability to the military user incrementally, with quality comparable to, and economics improved over, projects developed under more traditional software-intensive system implementation methodologies.
NEXUS - Resilient Intelligent Middleware
NASA Astrophysics Data System (ADS)
Kaveh, N.; Hercock, R. Ghanea
Service-oriented computing, a composition of distributed-object, component-based, and Web-based computing concepts, is becoming the widespread choice for developing dynamic heterogeneous software assets available as services across a network. One of the major strengths of service-oriented technologies is the high abstraction layer and large granularity level at which software assets are viewed compared to traditional object-oriented technologies. Collaboration through encapsulated and separately defined service interfaces creates a service-oriented environment, whereby multiple services can be linked together through their interfaces to compose a functional system. This approach enables better integration of legacy and non-legacy services, via wrapper interfaces, and allows for service composition at a more abstract level, especially in cases such as vertical market stacks. The heterogeneous nature of service-oriented technologies and the granularity of their software components make them a suitable computing model in the pervasive domain.
NASA Technical Reports Server (NTRS)
Hamel, Gary P.; Wijesinghe, R.
1996-01-01
Groupware is a term describing an emerging computer software technology enhancing the ability of people to work together as a group (a software-driven 'group support system'). This project originated at the beginning of 1992, and reports were issued describing the activity through May 1995. These reports stressed the need for process as well as technology. That is, while the technology represented a computer-assisted method for groups to work together, the Group Support System (GSS) technology also required an understanding of the facilitation process that electronic meetings demand. Even people trained in traditional facilitation techniques did not necessarily adopt groupware techniques easily. The latest phase of this activity attempted (1) to improve the facilitation process by developing training support for a portable groupware computer system, and (2) to explore settings and uses for the portable groupware system using different software, such as Lotus Notes.
C-C1-04: Building a Health Services Information Technology Research Environment
Gehrum, David W; Jones, JB; Romania, Gregory J; Young, David L; Lerch, Virginia R; Bruce, Christa A; Donkochik, Diane; Stewart, Walter F
2010-01-01
Background: The electronic health record (EHR) has opened a new era for health services research (HSR) where information technology (IT) is used to re-engineer care processes. While the EHR provides one means of advancing novel solutions, a promising strategy is to develop tools (e.g., online questionnaires, visual display tools, decision support) distinct from, but which interact with, the EHR. Development of such software tools outside the EHR offers an advantage in flexibility, sophistication, and ultimately in portability to other settings. However, institutional IT departments have an imperative to protect patient data and to standardize IT processes to ensure system-level security and support traditional business needs. Such imperatives usually present formidable process barriers to testing novel software solutions. We describe how, in collaboration with our IT department, we are creating an environment and a process that allow for routine and rapid testing of novel software solutions. Methods: We convened a working group consisting of IT and research personnel with expertise in information security, database design/management, web design, EHR programming, and health services research. The working group was tasked with developing a research IT environment to accomplish two objectives: maintain network/data security and regulatory compliance, and allow researchers working with external vendors to rapidly prototype and, in a clinical setting, test web-based tools. Results: Two parallel solutions, one focused on hardware, the second on oversight and management, were developed. First, we concluded that three separate, staged development environments were required to allow external vendor access for testing software and for transitioning software to be used in a clinic. In parallel, the extant oversight process for approving/managing access for internal/external personnel had to be altered to reflect the scope and scale of discrete research projects, as opposed to an enterprise-level approach to IT management. Conclusions: Innovation in health services software development requires a flexible, scalable IT environment adapted to the unique objectives of a HSR software development model. In our experience, implementing the hardware solution is less challenging than the cultural change required to implement such a model and the modifications to administrative and oversight processes needed to sustain an environment for rapid product development and testing.
Reinventing User Applications for Mission Control
NASA Technical Reports Server (NTRS)
Trimble, Jay Phillip; Crocker, Alan R.
2010-01-01
In 2006, NASA Ames Research Center's (ARC) Intelligent Systems Division and NASA Johnson Space Center's (JSC) Mission Operations Directorate (MOD) began a collaboration to move user applications for JSC's mission control center to a new software architecture, intended to replace the existing user applications being used for the Space Shuttle and the International Space Station. It must also carry NASA/JSC mission operations forward to the future, meeting the needs of NASA's exploration programs beyond low Earth orbit. Key requirements for the new architecture, called Mission Control Technologies (MCT), are that end users must be able to compose and build their own software displays without the need for programming, or direct support and approval from a platform services organization. Developers must be able to build MCT components using industry standard languages and tools. Each component of MCT must be interoperable with other components, regardless of what organization develops them. For platform service providers and MOD management, MCT must be cost effective, maintainable, and evolvable. MCT software is built from components that are presented to users as composable user objects. A user object is an entity that represents a domain object such as a telemetry point, a command, a timeline, an activity, or a step in a procedure. User objects may be composed and reused; for example, a telemetry point may be used in a traditional monitoring display, and that same telemetry user object may be composed into a procedure step. In either display, that same telemetry point may be shown in different views, such as a plot, an alphanumeric, or a meta-data view, and those views may be changed live and in place. MCT presents users with a single unified user environment that contains all the objects required to perform applicable flight controller tasks. Users therefore do not have to use multiple applications; the traditional boundaries that exist between multiple heterogeneous applications disappear, leaving open the possibility of new operations concepts that are not constrained by the traditional applications paradigm.
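The composable-user-object idea lends itself to a small sketch: one telemetry object reused in a monitoring display and in a procedure step, rendered through interchangeable views. The class and method names below are illustrative only, not the actual MCT API.

```python
# Minimal sketch of the "composable user object" idea: one domain object
# (a telemetry point) reused in several views and compositions. Class and
# method names are illustrative, not the actual MCT API.
class TelemetryPoint:
    def __init__(self, name: str):
        self.name = name
        self.value = None

    def update(self, value: float):
        self.value = value

class AlphaNumericView:
    def render(self, obj: TelemetryPoint) -> str:
        return f"{obj.name}: {obj.value}"

class PlotView:
    def __init__(self):
        self.history = []
    def render(self, obj: TelemetryPoint) -> str:
        self.history.append(obj.value)
        return f"plot of {obj.name}, {len(self.history)} samples"

class ProcedureStep:
    """A composition: the same user object embedded in a procedure."""
    def __init__(self, text: str, telemetry: TelemetryPoint, view):
        self.text, self.telemetry, self.view = text, telemetry, view
    def render(self) -> str:
        return f"{self.text} [{self.view.render(self.telemetry)}]"

cabin_temp = TelemetryPoint("CABIN_TEMP")
cabin_temp.update(21.7)
# The same object appears in a monitoring display and in a procedure step,
# each time through a view that could be swapped live and in place.
print(AlphaNumericView().render(cabin_temp))
print(ProcedureStep("Verify cabin temperature", cabin_temp, PlotView()).render())
```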
SU-E-T-419: Workflow and FMEA in a New Proton Therapy (PT) Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, C; Wessels, B; Hamilton, H
2014-06-01
Purpose: Workflow is an important component in the operational planning of a new proton facility. By integrating the concept of failure mode and effect analysis (FMEA) and traditional QA requirements, a workflow for a proton therapy treatment course is set up. This workflow serves as the blueprint for the planning of computer hardware/software requirements and network flow. A slight modification of the workflow generates a process map (PM) for FMEA and the planning of a QA program in PT. Methods: A flowchart is first developed outlining the sequence of processes involved in a PT treatment course. Each process consists of a number of sub-processes to encompass a broad scope of treatment and QA procedures. For each sub-process, the personnel involved, the equipment needed, and the computer hardware/software as well as network requirements are defined by a team of clinical staff, administrators, and IT personnel. Results: Eleven intermediate processes with a total of 70 sub-processes involved in a PT treatment course are identified. The number of sub-processes varies, ranging from 2-12. The sub-processes within each process are used for the operational planning. For example, in the CT-Sim process, there are 12 sub-processes: three involve data entry/retrieval from a record-and-verify system, two are controlled by the CT computer, two require the department/hospital network, and the other five are setup procedures. IT then decides the number of computers needed and the software and network requirements. By removing the traditional QA procedures from the workflow, a PM is generated for FMEA analysis to design a QA program for PT. Conclusion: Significant effort is involved in the development of the workflow for a PT treatment course. Our hybrid model combining FMEA and a traditional QA program serves the dual purpose of efficient operational planning and design of a QA program in PT.
Next Generation Models for Storage and Representation of Microbial Biological Annotation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quest, Daniel J; Land, Miriam L; Brettin, Thomas S
2010-01-01
Background: Traditional genome annotation systems were developed in a very different computing era, one where the World Wide Web was just emerging. Consequently, these systems are built as centralized black boxes focused on generating high quality annotation submissions to GenBank/EMBL supported by expert manual curation. The exponential growth of sequence data drives a growing need for higher-quality, automatically generated annotation. Typical annotation pipelines utilize traditional database technologies, clustered computing resources, Perl, C, and UNIX file systems to process raw sequence data, identify genes, and predict and categorize gene function. These technologies tightly couple the annotation software system to hardware and third-party software (e.g. relational database systems and schemas). This makes annotation systems hard to reproduce, inflexible to modification over time, difficult to assess, difficult to partition across multiple geographic sites, and difficult to understand for those who are not domain experts. These systems are not readily open to scrutiny and therefore not scientifically tractable. The advent of Semantic Web standards such as Resource Description Framework (RDF) and OWL Web Ontology Language (OWL) enables us to construct systems that address these challenges in a new comprehensive way. Results: Here, we develop a framework for linking traditional data to OWL-based ontologies in genome annotation. We show how data standards can decouple hardware and third-party software tools from annotation pipelines, thereby making annotation pipelines easier to reproduce and assess. An illustrative example shows how TURTLE (Terse RDF Triple Language) can be used as a human readable, but also semantically aware, equivalent to GenBank/EMBL files. Conclusions: The power of this approach lies in its ability to assemble annotation data from multiple databases across multiple locations into a representation that is understandable to researchers. In this way, all researchers, experimental and computational, will more easily understand the informatics processes constructing genome annotation and ultimately be able to help improve the systems that produce them.
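To make the Turtle-as-flat-file analogy concrete, here is a minimal sketch using the rdflib Python library (a real library; the namespace, property names, and identifiers are invented) to express one gene annotation as triples and serialize it to Turtle.

```python
# Minimal sketch of representing one gene annotation as RDF and serializing
# it to Turtle, in the spirit described above. Requires rdflib
# (pip install rdflib); the namespace and property names are hypothetical.
from rdflib import Graph, Literal, Namespace, RDF

ANNO = Namespace("http://example.org/annotation/")

g = Graph()
gene = ANNO["gene/ECK120000001"]
g.add((gene, RDF.type, ANNO.Gene))
g.add((gene, ANNO.symbol, Literal("thrA")))
g.add((gene, ANNO.product, Literal("aspartokinase I")))
g.add((gene, ANNO.start, Literal(337)))
g.add((gene, ANNO.end, Literal(2799)))

# Turtle output is human readable, like a flat file, but every statement
# is also a machine-interpretable triple.
print(g.serialize(format="turtle"))
```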
Computer vs. Typewriter: Changes in Teaching Methods.
ERIC Educational Resources Information Center
Frankeberger, Lynda
1990-01-01
Factors to consider in making a decision whether to convert traditional typewriting classrooms to microcomputer classrooms include effects on oral instruction, ethical issues in file transfer, and use of keyboarding software and timed writing software. (JOW)
MATTS- A Step Towards Model Based Testing
NASA Astrophysics Data System (ADS)
Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.
2016-08-01
In this paper we describe a model-based approach to testing of on-board software and compare it with the traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity, driven by the need for autonomy, and serious demands on application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications? To solve these problems, the software engineering process has to be improved in at least two respects: (1) software design and (2) software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time- and resource-consuming activity in the software development process. Generating a short but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. state machines) and generate abstract test cases, which are then converted to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to on-board software. Results were collected and evaluated with respect to applicability, cost-efficiency, effectiveness at fault finding, and scalability.
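The model-to-test flow described here can be sketched in a few lines: derive one abstract test case per transition of a state machine (transition coverage), concretize each into an input/expected-output pair, and run it against the system under test. The toy mode automaton below is invented for illustration.

```python
# Minimal sketch of the model-based flow described above: a formal model
# (a state machine) is used to generate abstract test cases, which are
# then concretized into input/expected-output pairs. The machine itself
# is a toy stand-in for an on-board software mode automaton.
TRANSITIONS = {  # (state, event) -> next_state
    ("STANDBY", "arm"): "ARMED",
    ("ARMED", "fire"): "FIRING",
    ("ARMED", "disarm"): "STANDBY",
    ("FIRING", "complete"): "STANDBY",
}

def abstract_test_cases():
    """One abstract test case per transition (transition coverage)."""
    return [((s, e), t) for (s, e), t in TRANSITIONS.items()]

def concretize(case):
    """Turn an abstract case into an executable input/expected-output pair."""
    (state, event), expected = case
    return {"setup_state": state, "input_event": event,
            "expected_state": expected}

def run_against_sut(test):
    """Execute one concrete test. Here the model itself plays the system
    under test; in practice this would drive the real on-board software."""
    actual = TRANSITIONS.get((test["setup_state"], test["input_event"]))
    return actual == test["expected_state"]

suite = [concretize(c) for c in abstract_test_cases()]
print(f"{sum(run_against_sut(t) for t in suite)}/{len(suite)} tests pass")
```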
iSDS: a self-configurable software-defined storage system for enterprise
NASA Astrophysics Data System (ADS)
Chen, Wen-Shyen Eric; Huang, Chun-Fang; Huang, Ming-Jen
2018-01-01
Storage is one of the most important aspects of IT infrastructure for enterprises. But enterprises are interested in more than just data storage; they are interested in such things as more reliable data protection, higher performance, and reduced resource consumption. Traditional enterprise-grade storage satisfies these requirements at high cost, because it is usually designed and constructed around customised field-programmable gate arrays to achieve high-end functionality. However, in this ever-changing environment, enterprises request storage with more flexible deployment and at lower cost. Moreover, the rise of new application fields, such as social media, big data, video streaming services, etc., makes operational tasks for administrators more complex. In this article, a new storage system called intelligent software-defined storage (iSDS), based on software-defined storage, is described. More specifically, this approach advocates using software to replace features provided by traditional customised chips. To alleviate the management burden, it also advocates applying machine learning to automatically configure storage to meet the dynamic requirements of workloads running on the storage. This article focuses on the analysis feature of the iSDS cluster, detailing its architecture and design.
Energy-Saving Traffic Scheduling in Hybrid Software Defined Wireless Rechargeable Sensor Networks
Wei, Yunkai; Ma, Xiaohui; Yang, Ning; Chen, Yijin
2017-01-01
Software Defined Wireless Rechargeable Sensor Networks (SDWRSNs) are an inexorable trend for Wireless Sensor Networks (WSNs), including Wireless Rechargeable Sensor Networks (WRSNs). However, traditional network devices cannot be completely substituted in the short term, so hybrid SDWRSNs, where software-defined devices and traditional devices coexist, will last for a long time. Hybrid SDWRSNs bring new challenges as well as opportunities for energy saving, which is still a key problem considering that the wireless chargers are also exhaustible, especially in harsh environments away from mains power. Numerous energy saving schemes for WSNs, and even some for WRSNs, are no longer suitable for the new features of hybrid SDWRSNs. To solve this problem, this paper puts forward an Energy-saving Traffic Scheduling (ETS) algorithm. The ETS algorithm adequately considers the new characteristics of hybrid SDWRSNs, and takes advantage of the Software Defined Networking (SDN) controller's direct control over SDN nodes and indirect control over normal nodes. The simulation results show that, compared with the traditional Minimum Transmission Energy (MTE) protocol, ETS can substantially improve energy efficiency in hybrid SDWRSNs, by up to 20–40%, while ensuring feasible data delay. PMID:28914816
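The abstract does not spell out ETS itself, so the following is only a loose, invented illustration of the kind of decision an SDN controller in a hybrid network can make: preferring a route through directly controllable SDN nodes when its total transmission energy is lower. All node names and energy figures are made up.

```python
# Loose illustration (not the actual ETS algorithm): the controller picks
# the candidate route with the lowest total transmission energy. SDN nodes
# ("S*") accept schedules directly; normal nodes ("N*") only forward.
def path_energy(path, energy_per_hop):
    return sum(energy_per_hop[node] for node in path)

routes = {
    "via_sdn": ["S1", "S2", "sink"],
    "via_normal": ["N1", "N2", "N3", "sink"],
}
energy_per_hop = {"S1": 1.0, "S2": 1.1, "N1": 1.4, "N2": 1.5,
                  "N3": 1.3, "sink": 0.0}  # invented per-hop energy units

best = min(routes, key=lambda r: path_energy(routes[r], energy_per_hop))
print("controller schedules traffic", best,
      "at", path_energy(routes[best], energy_per_hop), "units")
```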
Weaving a Formal Methods Education with Problem-Based Learning
NASA Astrophysics Data System (ADS)
Gibson, J. Paul
The idea of weaving formal methods through computing (or software engineering) degrees is not a new one. However, there has been little success in developing and implementing such a curriculum. Formal methods continue to be taught as stand-alone modules and students, in general, fail to see how fundamental these methods are to the engineering of software. A major problem is one of motivation: how can the students be expected to enthusiastically embrace a challenging subject when the learning benefits, beyond passing an exam and achieving curriculum credits, are not clear? Problem-based learning has gradually moved from being an innovative pedagogical technique, commonly used to better motivate students, to being widely adopted in the teaching of many different disciplines, including computer science and software engineering. Our experience shows that a good problem can be re-used throughout a student's academic life. In fact, the best computing problems can be used with children (young and old), undergraduates and postgraduates. In this paper we present a process for weaving formal methods through a University curriculum that is founded on the application of problem-based learning and a library of good software engineering problems, where students learn about formal methods without sitting a traditional formal methods module. The process of constructing good problems and integrating them into the curriculum is shown to be analogous to the process of engineering software. This approach is not intended to replace more traditional formal methods modules: it will better prepare students for such specialised modules and ensure that all students have an understanding and appreciation for formal methods even if they do not go on to specialise in them.
NASA Technical Reports Server (NTRS)
Yau, M.; Guarro, S.; Apostolakis, G.
1993-01-01
Dynamic Flowgraph Methodology (DFM) is a new approach developed to integrate the modeling and analysis of the hardware and software components of an embedded system. The objective is to complement the traditional approaches which generally follow the philosophy of separating out the hardware and software portions of the assurance analysis. In this paper, the DFM approach is demonstrated using the Titan 2 Space Launch Vehicle Digital Flight Control System. The hardware and software portions of this embedded system are modeled in an integrated framework. In addition, the time dependent behavior and the switching logic can be captured by this DFM model. In the modeling process, it is found that constructing decision tables for software subroutines is very time consuming. A possible solution is suggested. This approach makes use of a well-known numerical method, the Newton-Raphson method, to solve the equations implemented in the subroutines in reverse. Convergence can be achieved in a few steps.
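The reverse-solving suggestion at the end of this abstract is easy to sketch: given a subroutine y = f(x), Newton-Raphson finds the input that produces a required output, in place of an enumerated decision table. Below is a minimal sketch with an invented stand-in subroutine; the derivative is taken by finite differences so only the subroutine itself is needed.

```python
# Minimal sketch of the suggestion above: instead of enumerating a decision
# table for a subroutine, solve the subroutine's equation "in reverse" with
# Newton-Raphson, i.e., find the input x that yields a given output y.
def subroutine(x: float) -> float:
    """Invented stand-in for an embedded control subroutine: y = f(x)."""
    return x ** 3 + 2.0 * x - 5.0

def solve_reverse(f, y_target, x0=1.0, tol=1e-10, max_iter=50):
    """Newton-Raphson on g(x) = f(x) - y_target, with the derivative by
    central finite differences so only the subroutine itself is needed."""
    x, h = x0, 1e-6
    for _ in range(max_iter):
        g = f(x) - y_target
        if abs(g) < tol:
            return x
        dg = (f(x + h) - f(x - h)) / (2.0 * h)
        x -= g / dg
    raise RuntimeError("did not converge")

x = solve_reverse(subroutine, y_target=10.0)
print(f"f({x:.6f}) = {subroutine(x):.6f}")  # converges in a few steps
```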
Optimizing the Use of Storage Systems Provided by Cloud Computing Environments
NASA Astrophysics Data System (ADS)
Gallagher, J. H.; Potter, N.; Byrne, D. A.; Ogata, J.; Relph, J.
2013-12-01
Cloud computing systems present a set of features that include familiar computing resources (albeit augmented to support dynamic scaling of processing power) bundled with a mix of conventional and unconventional storage systems. The Linux base on which many cloud environments (e.g., Amazon) are built makes it tempting to assume that any Unix software will run efficiently in this environment without change. OPeNDAP and NODC collaborated on a short project to explore how the S3 and Glacier storage systems provided by the Amazon cloud computing infrastructure could be used with a data server developed primarily to access data stored in a traditional Unix file system. Our work used the Amazon cloud system, but we strove for designs that could be adapted easily to other systems like OpenStack. Lastly, we evaluated different architectures from a computer security perspective. We found that there are considerable issues associated with treating S3 as if it is a traditional file system, even though doing so is conceptually simple. These issues include performance penalties: using a software tool that emulates a traditional file system to store data in S3 performs poorly when compared to storing data directly in S3. We also found there are important benefits beyond performance to ensuring that data written to S3 can be accessed directly, without relying on a specific software tool. To provide a hierarchical organization to the data stored in S3, we wrote 'catalog' files, using XML. These catalog files map discrete files to S3 access keys. Like a traditional file system's directories, the catalogs can also contain references to other catalogs, providing a simple but effective hierarchy overlaid on top of S3's flat storage space. An added benefit of these catalogs is that they can be viewed in a web browser; our storage scheme provides both efficient access for the data server and access via a web browser. We also looked at the Glacier storage system and found that the system's response characteristics are very different from a traditional file system or database; it behaves like a near-line storage system. To be used by a traditional data server, the underlying access protocol must support asynchronous accesses. This is because the Glacier system takes a minimum of four hours to deliver any data object, so systems built with the expectation of instant access (i.e., most web systems) must be fundamentally changed to use Glacier. Part of a related project has been to develop an asynchronous access mode for OPeNDAP, and we have developed a design using that new addition to the DAP protocol with Glacier as a near-line mass store. In summary, we found that both S3 and Glacier require special treatment to be used effectively by a data server. It is important to add (new) interfaces to data servers that enable them to use these storage devices through their native interfaces. We also found that our designs could easily map to a cloud environment based on OpenStack. Lastly, we noted that while these designs invite more liberal use of remote references for data objects, this can expose software to new security risks.
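The XML catalog scheme described here is straightforward to sketch. The example below builds a small catalog mapping logical dataset names (and one child catalog) to S3 access keys, then stores it as an ordinary S3 object with boto3 (a real AWS library; the bucket, keys, and element names are invented, and the upload assumes configured AWS credentials).

```python
# Minimal sketch of the catalog idea described above: an XML file overlays
# a hierarchy on S3's flat key space by mapping logical names to access
# keys (and to child catalogs). Bucket and key names are hypothetical.
import xml.etree.ElementTree as ET
import boto3

catalog = ET.Element("catalog", name="/ocean/sst")
ET.SubElement(catalog, "dataset", name="sst_2013_06.nc",
              s3key="a1b2c3-sst-2013-06")
ET.SubElement(catalog, "dataset", name="sst_2013_07.nc",
              s3key="d4e5f6-sst-2013-07")
# Like a directory, a catalog may reference other catalogs.
ET.SubElement(catalog, "catalogRef", name="climatology",
              s3key="catalog-ocean-sst-climatology")

body = ET.tostring(catalog, encoding="utf-8")
s3 = boto3.client("s3")
# The catalog itself is stored as an ordinary object, so it is reachable
# both by the data server and directly from a web browser.
s3.put_object(Bucket="example-data-bucket",
              Key="catalog-ocean-sst", Body=body,
              ContentType="application/xml")
```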
National policies for technical change: Where are the increasing returns to economic research?
Pavitt, Keith
1996-01-01
Improvements over the past 30 years in statistical data, analysis, and related theory have strengthened the basis for science and technology policy by confirming the importance of technical change in national economic performance. But two important features of scientific and technological activities in the Organization for Economic Cooperation and Development countries are still not addressed adequately in mainstream economics: (i) the justification of public funding for basic research and (ii) persistent international differences in investment in research and development and related activities. In addition, one major gap is now emerging in our systems of empirical measurement—the development of software technology, especially in the service sector. There are therefore dangers of diminishing returns to the usefulness of economic research, which continues to rely completely on established theory and established statistical sources. Alternative propositions that deserve serious consideration are: (i) the economic usefulness of basic research is in the provision of (mainly tacit) skills rather than codified and applicable information; (ii) in developing and exploiting technological opportunities, institutional competencies are just as important as the incentive structures that they face; and (iii) software technology developed in traditional service sectors may now be a more important locus of technical change than software technology developed in “high-tech” manufacturing. PMID:8917481
Commercialization of NESSUS: Status
NASA Technical Reports Server (NTRS)
Thacker, Ben H.; Millwater, Harry R.
1991-01-01
A plan was initiated in 1988 to commercialize the Numerical Evaluation of Stochastic Structures Under Stress (NESSUS) probabilistic structural analysis software. The goal of the ongoing commercialization effort is to begin the transfer of technology developed under the Probabilistic Structural Analysis Method (PSAM) program into industry and to develop additional funding resources in the general area of structural reliability. The commercialization effort is summarized. The SwRI NESSUS software system is a general-purpose probabilistic finite element computer program using state-of-the-art methods for predicting stochastic structural response due to random loads, material properties, part geometry, and boundary conditions. NESSUS can be used to assess structural reliability, to compute probability of failure, to rank the input random variables by importance, and to provide a more cost-effective design than traditional methods. The goal is to develop a general probabilistic structural analysis methodology to assist in the certification of critical components in the next-generation Space Shuttle Main Engine.
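As a rough illustration of the kind of question such software answers (NESSUS itself uses far more efficient probabilistic finite element methods than the plain Monte Carlo below, and all distributions here are invented), consider estimating a probability of failure from random load, geometry, and strength:

```python
# Plain Monte Carlo stand-in for a probability-of-failure estimate:
# failure occurs when random stress exceeds random strength. The
# distributions and parameters are invented for illustration.
import random

random.seed(0)
N = 100_000
failures = 0
for _ in range(N):
    load = random.gauss(400.0, 50.0)        # applied stress, MPa
    area_factor = random.gauss(1.0, 0.05)   # part geometry variation
    strength = random.gauss(600.0, 60.0)    # material strength, MPa
    if load / area_factor > strength:
        failures += 1

print(f"estimated probability of failure: {failures / N:.4f}")
```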
ERIC Educational Resources Information Center
Greenfield, Rich
The author argues that traditional library cataloging (MARC) and the online public access catalog (OPAC) are in collision with the world of the Internet because items in electronic formats undergo MARC cataloging only on a very selective basis. Also the library profession initially isolated itself from World Wide Web development by predicting no…
Data-driven traffic impact assessment tool for work zones.
DOT National Transportation Integrated Search
2017-03-01
Traditionally, traffic impacts of work zones have been assessed using planning software such as Quick Zone, custom spreadsheets, and others. These software programs generate delay, queuing, and other mobility measures but are difficult to validate du...
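The truncated abstract refers to delay and queuing measures; the deterministic input-output arithmetic behind such estimates can be sketched in a few lines (all demand and capacity numbers below are invented):

```python
# Hedged sketch of the deterministic queuing arithmetic such planning tools
# perform: the queue grows whenever hourly demand exceeds the reduced
# work-zone capacity, and drains when it does not.
demand = [1500, 1800, 1700, 1200]   # vehicles/hour for each hour of closure
capacity = 1400                     # vehicles/hour through the work zone

queue = 0
for hour, d in enumerate(demand, start=1):
    queue = max(0, queue + d - capacity)
    print(f"hour {hour}: queue = {queue} vehicles")
```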
ERIC Educational Resources Information Center
Kuyatt, Brian L.; Baker, Jason D.
2014-01-01
This study evaluates the effectiveness of human anatomy software in face-to-face and online anatomy laboratory classes. Cognitive, affective, and psychomotor perceived learning was measured for students using Pearson Education's Practice Anatomy Laboratory 2.0 software. This study determined that student-perceived learning was significantly…
Is There Such a Thing as Free Software? The Pros and Cons of Open-Source Software
ERIC Educational Resources Information Center
Trappler, Thomas J.
2009-01-01
Today's higher education environment is marked by heightened accountability and decreased budgets. In such an environment, no higher education institution can afford to ignore alternative approaches that could result in more effective and less costly solutions. Open-source software (OSS) can serve as a viable alternative to traditional proprietary…
DOE Office of Scientific and Technical Information (OSTI.GOV)
McIntyre, Dustin L.; Russo, Richard
Applied Spectra, as our industrial collaborator, is helping us develop our downhole LIBS sensor. Our part of the collaboration is the design, construction, and validation of the miniaturized fiber-coupled laser, whereas Applied Spectra will be providing technical guidance and control/analysis software. This will allow our system, which is traditionally operated by a person, to be automated in both data collection and analysis, significantly increasing its TRL.
QuantWorm: a comprehensive software package for Caenorhabditis elegans phenotypic assays.
Jung, Sang-Kyu; Aleman-Meza, Boanerges; Riepe, Celeste; Zhong, Weiwei
2014-01-01
Phenotypic assays are crucial in genetics; however, traditional methods that rely on human observation are unsuitable for quantitative, large-scale experiments. Furthermore, there is an increasing need for comprehensive analyses of multiple phenotypes to provide multidimensional information. Here we developed an automated, high-throughput computer imaging system for quantifying multiple Caenorhabditis elegans phenotypes. Our imaging system is composed of a microscope equipped with a digital camera and a motorized stage connected to a computer running the QuantWorm software package. Currently, the software package contains one data acquisition module and four image analysis programs: WormLifespan, WormLocomotion, WormLength, and WormEgg. The data acquisition module collects images and videos. The WormLifespan software counts the number of moving worms by using two time-lapse images; the WormLocomotion software computes the velocity of moving worms; the WormLength software measures worm body size; and the WormEgg software counts the number of eggs. To evaluate the performance of our software, we compared the results of our software with manual measurements. We then demonstrated the application of the QuantWorm software in a drug assay and a genetic assay. Overall, the QuantWorm software provided accurate measurements at a high speed. Software source code, executable programs, and sample images are available at www.quantworm.org. Our software package has several advantages over current imaging systems for C. elegans. It is an all-in-one package for quantifying multiple phenotypes. The QuantWorm software is written in Java and its source code is freely available, so it does not require use of commercial software or libraries. It can be run on multiple platforms and easily customized to cope with new methods and requirements.
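As a loose illustration of the movement-detection idea behind WormLifespan (counting moving worms from two time-lapse images), here is a small numpy/scipy sketch based on frame differencing and connected-component labeling; it is not the actual QuantWorm implementation, and the synthetic frames are invented.

```python
# Hedged sketch of detecting movement by differencing two time-lapse
# frames and counting the changed regions; not the actual QuantWorm code.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
frame1 = rng.integers(0, 30, size=(200, 200)).astype(float)  # background
frame2 = frame1.copy()
frame2[50:55, 50:60] += 120.0    # a "worm" that moved
frame2[120:124, 80:90] += 120.0  # another moving worm

diff = np.abs(frame2 - frame1) > 60.0   # pixels that changed between frames
labels, n_moving = ndimage.label(diff)  # connected changed regions
print(f"moving worms detected: {n_moving}")
```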
Rapid Processing of Radio Interferometer Data for Transient Surveys
NASA Astrophysics Data System (ADS)
Bourke, S.; Mooley, K.; Hallinan, G.
2014-05-01
We report on a software infrastructure and pipeline developed to process large radio interferometer datasets. The pipeline is implemented using a radical redesign of the AIPS processing model. An infrastructure we have named AIPSlite is used to spawn, at runtime, minimal AIPS environments across a cluster. The pipeline then distributes and processes its data in parallel. The system is entirely free of the traditional AIPS distribution and is self-configuring at runtime. This software has so far been used to process an EVLA Stripe 82 transient survey and the data for the JVLA-COSMOS project, and has been used to process most of the EVLA L-band data archive, imaging each integration to search for short-duration transients.
FPGA Based Reconfigurable ATM Switch Test Bed
NASA Technical Reports Server (NTRS)
Chu, Pong P.; Jones, Robert E.
1998-01-01
Various issues associated with "FPGA Based Reconfigurable ATM Switch Test Bed" are presented in viewgraph form. Specific topics include: 1) network performance evaluation; 2) traditional approaches; 3) software simulation; 4) hardware emulation; 5) test bed highlights; 6) design environment; 7) test bed architecture; 8) abstract shared-memory switch; 9) detailed switch diagram; 10) traffic generator; 11) data collection circuit and user interface; 12) initial results; and 13) the following conclusions: advances in FPGAs make hardware emulation feasible for performance evaluation; hardware emulation can provide several orders of magnitude speed-up over software simulation; and, due to the complexity of the hardware synthesis process, development in emulation is much more difficult than simulation and requires knowledge of both networks and digital design.
NASA Technical Reports Server (NTRS)
Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)
2001-01-01
In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory- and system-independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing between groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for SOFIA, the SIRTF planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA - both successes and failures - and offer some lessons learned that may promote further successes in collaboration and re-use.
ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra
NASA Astrophysics Data System (ADS)
Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.
2011-08-01
Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single spectra or a few at a time, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software tool, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to its capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing a Java Runtime Environment version 1.6 or newer; however, it has currently been tested only on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
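Batch integration of the kind described (integrating the same regions across many spectra and emitting CSV) can be sketched briefly in Python; this is only a stand-in for the idea, not ImatraNMR itself (which is a Java program), and the ppm regions and synthetic spectra are invented.

```python
# Minimal sketch of batch integration: integrate the same ppm regions
# across many 1D spectra and write one CSV row per spectrum.
import csv
import numpy as np

rng = np.random.default_rng(1)
ppm = np.linspace(10.0, 0.0, 2048)                  # shared ppm axis
spectra = {f"sample_{i}": rng.random(2048) for i in range(3)}
regions = [("signal_A", 7.2, 6.8), ("signal_B", 2.1, 1.9)]  # (name, hi, lo)

with open("integrals.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["spectrum"] + [name for name, _, _ in regions])
    for sample, intensity in spectra.items():
        row = [sample]
        for _, hi, lo in regions:
            mask = (ppm <= hi) & (ppm >= lo)
            # Trapezoidal integral over the region; the axis is descending,
            # so take the absolute value.
            row.append(abs(np.trapz(intensity[mask], ppm[mask])))
        writer.writerow(row)
```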
A Generic Software Safety Document Generator
NASA Technical Reports Server (NTRS)
Denney, Ewen; Venkatesan, Ram Prasad
2004-01-01
Formal certification is based on the idea that a mathematical proof of some property of a piece of software can be regarded as a certificate of correctness which, in principle, can be subjected to external scrutiny. In practice, however, proofs themselves are unlikely to be of much interest to engineers. Nevertheless, it is possible to use the information obtained from a mathematical analysis of software to produce a detailed textual justification of correctness. In this paper, we describe an approach to generating textual explanations from automatically generated proofs of program safety, where the proofs are of compliance with an explicit safety policy that can be varied. Key to this is tracing proof obligations back to the program, and we describe a tool which implements this to certify code auto-generated by AutoBayes and AutoFilter, program synthesis systems under development at the NASA Ames Research Center. Our approach is a step towards combining formal certification with traditional certification methods.
A Cognitive Approach to Teaching a Graduate-Level GEOBIA Course
NASA Astrophysics Data System (ADS)
Bianchetti, Raechel A.
2016-06-01
Remote sensing image analysis training occurs both in the classroom and in the research lab. Classroom education for traditional pixel-based image analysis has been standardized across college curriculums. However, with the increasing interest in Geographic Object-Based Image Analysis (GEOBIA), there is a need to develop classroom instruction for this method of image analysis. While traditional remote sensing courses emphasize the expansion of skills and knowledge related to the use of computer-based analysis, GEOBIA courses should examine the cognitive factors underlying visual interpretation. This paper provides an initial analysis of the development, implementation, and outcomes of a GEOBIA course that considers not only the computational methods of GEOBIA, but also the cognitive factors of expertise that such software attempts to replicate. Finally, a reflection on the first offering of the course is presented, along with plans for the development of an open-source repository for course materials.
An Incremental Life-cycle Assurance Strategy for Critical System Certification
2014-11-04
Embedded software systems introduce a new class of problems for safe aircraft operation not addressed by traditional system modeling and analysis. Data stream characteristics such as latency jitter affect control behavior. Why do system-level failures still occur despite fault tolerance techniques being deployed in systems? The embedded software system itself is a major source of such failures.
Testing in Service-Oriented Environments
2010-03-01
software releases (versions, service packs, vulnerability patches) for one common ESB during the 13-month period from January 1, 2008 through... impact on quality of service: Unlike traditional software components, a single instance of a web service can be used by multiple consumers. Since the... distributed, with heterogeneous hardware and software (SOA infrastructure, services, operating systems, and databases). Because of cost and security, it
Improving the strength of additively manufactured objects via modified interior structure
NASA Astrophysics Data System (ADS)
Al, Can Mert; Yaman, Ulas
2017-10-01
Additive manufacturing (AM), in other words 3D printing, is becoming more common because of its crucial advantages, such as geometric complexity and functional interior structures, over traditional manufacturing methods. Fused Filament Fabrication (FFF) 3D printing in particular is frequently used because desktop variants of these printers are highly suitable for different fields and are improving rapidly. Despite the significant advantages of AM, the strength of parts fabricated with AM is still a major problem, especially when plastic materials, such as Acrylonitrile butadiene styrene (ABS), Polylactic acid (PLA), Nylon, etc., are utilized. In this study, an alternative method is proposed in which the strength of AM-fabricated parts is improved employing a direct slicing approach. The traditional Computer Aided Manufacturing (CAM) software of 3D printers takes only the geometry as input, in triangular mesh form (stereolithography, STL file), generated by Computer Aided Design software. This file format includes data only about the outer boundaries of the geometry. The interiors of the artifacts are manufactured with homogeneous infill patterns, such as diagonal, honeycomb, linear, etc., according to the paths generated in the CAM software. The method developed in this study provides a way to fabricate parts with heterogeneous infill patterns by utilizing the stress field data obtained from Finite Element Analysis software, such as ABAQUS. According to the performed tensile tests, the strength of the test specimen is improved by about 45% compared to the conventional way of 3D printing.
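The stress-to-infill mapping at the heart of this approach can be sketched simply: per slicing region, convert an FEA stress value into an infill fraction. The thresholds, densities, and region names below are invented for illustration, not the paper's actual values.

```python
# Minimal sketch of the idea above: map a stress field from FEA onto
# per-region infill density, so highly stressed regions print denser.
# Thresholds and densities are invented, not the paper's actual values.
def infill_density(von_mises: float, yield_strength: float = 40.0) -> float:
    """Return an infill fraction for one region from its stress level."""
    utilization = von_mises / yield_strength
    if utilization > 0.6:
        return 0.9    # near-solid where stress concentrates
    if utilization > 0.3:
        return 0.5
    return 0.2        # sparse infill where the part is lightly loaded

# Stress values (MPa) per slicing region, as if exported from an FEA run.
region_stress = {"grip_left": 32.0, "gauge": 18.5, "grip_right": 31.0,
                 "shoulder": 9.0}
for region, stress in region_stress.items():
    print(f"{region}: {infill_density(stress):.0%} infill")
```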
Embedded Web Technology: Internet Technology Applied to Real-Time System Control
NASA Technical Reports Server (NTRS)
Daniele, Carl J.
1998-01-01
The NASA Lewis Research Center is developing software tools to bridge the gap between the traditionally non-real-time Internet technology and the real-time, embedded-controls environment for space applications. Internet technology has been expanding at a phenomenal rate. The simple World Wide Web browsers (such as earlier versions of Netscape, Mosaic, and Internet Explorer) that resided on personal computers just a few years ago only enabled users to log into and view a remote computer site. With current browsers, users not only view but also interact with remote sites. In addition, the technology now supports numerous computer platforms (PCs, Macs, and Unix platforms), thereby providing platform independence. In contrast, the development of software to interact with a microprocessor (embedded controller) that is used to monitor and control a space experiment has generally been a unique development effort. For each experiment, a specific graphical user interface (GUI) has been developed. This procedure works well for a single-user environment. However, the interface for the International Space Station (ISS) Fluids and Combustion Facility will have to enable scientists throughout the world and astronauts onboard the ISS, using different computer platforms, to interact with their experiments in the Fluids and Combustion Facility. Developing a specific GUI for all these users would be cost prohibitive. An innovative solution to this requirement, developed at Lewis, is to use Internet technology, where the general problem of platform independence has already been partially solved, and to leverage this expanding technology as new products are developed. This approach led to the development of the Embedded Web Technology (EWT) program at Lewis, which has the potential to significantly reduce software development costs for both flight and ground software.
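The EWT idea, in which the embedded controller serves its own browser-accessible interface so that any platform with a browser becomes the GUI, can be sketched with Python's standard library; the paths, telemetry names, and command set below are hypothetical.

```python
# Minimal sketch of an embedded controller exposing telemetry and commands
# over HTTP so any browser-equipped platform can act as the GUI. Paths,
# telemetry names, and the command set are hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

EXPERIMENT = {"heater_on": False, "chamber_temp_c": 20.4}

class ExperimentHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/telemetry":
            self._reply(200, EXPERIMENT)
        else:
            self.send_error(404)

    def do_POST(self):
        if self.path == "/command/heater":
            length = int(self.headers.get("Content-Length", 0))
            cmd = json.loads(self.rfile.read(length))
            EXPERIMENT["heater_on"] = bool(cmd.get("on", False))
            self._reply(200, {"accepted": True})
        else:
            self.send_error(404)

    def _reply(self, code, payload):
        body = json.dumps(payload).encode()
        self.send_response(code)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8000), ExperimentHandler).serve_forever()
```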
Embedded Web Technology: Applying World Wide Web Standards to Embedded Systems
NASA Technical Reports Server (NTRS)
Ponyik, Joseph G.; York, David W.
2002-01-01
Embedded systems have traditionally been developed in a highly customized manner. The user interface hardware and software, along with the interface to the embedded system, are typically unique to the system for which they are built, resulting in extra cost to the system in terms of development time and maintenance effort. World Wide Web standards have been developed over the past ten years with the goal of allowing servers and clients to interoperate seamlessly. The client and server systems can consist of differing hardware and software platforms, but the World Wide Web standards allow them to interface without knowing the details of the system at the other end of the interface. Embedded Web Technology is the merging of embedded systems with the World Wide Web. Embedded Web Technology decreases the cost of developing and maintaining the user interface by allowing the user to interface to the embedded system through a web browser running on a standard personal computer. Embedded Web Technology can also be used to simplify an embedded system's internal network.
Historical review of computer-assisted cognitive retraining.
Lynch, Bill
2002-10-01
This article details the introduction and development of the use of microcomputers as adjuncts to traditional cognitive rehabilitation of persons with acquired brain injury. The initial application of video games as therapeutic recreation in the late 1970s was soon followed in the early 1980s by the use of the first personal computers and available educational software. By the mid-1980s, both the IBM PC and Macintosh platforms were established, along with simplified programming languages that allowed individuals without extensive technical expertise to develop their own software. Several rehabilitation clinicians began to produce and market specially written cognitive retraining software for one or the other platform. Their work was detailed and reviewed, as was recently released software from commercial sources. The latter discussion included the latest developments in the rehabilitation applications of personal digital assistants and related organizing, reminding, and dictation devices. A summary of research on the general and specific efficacy of computer-assisted cognitive retraining illustrated the lingering controversy and skepticism that have been associated with this field since its inception. Computer-assisted cognitive retraining (CACR) can be an effective adjunct to a comprehensive program of cognitive rehabilitation. Training needs to be focused, structured, monitored, and as ecologically relevant as possible for optimum effect. Transfer of training, or generalizability of skills, remains a key issue in the field and should be considered the key criterion in evaluating whether to initiate or continue CACR.
NSLS-II HIGH LEVEL APPLICATION INFRASTRUCTURE AND CLIENT API DESIGN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, G.; Yang; L.
2011-03-28
The beam commissioning software framework of the NSLS-II project adopts a client/server-based architecture to replace the more traditional monolithic high level application approach. It is an open-structure platform, and we try to provide a narrow API set for client applications. With this narrow API, existing applications developed in different languages under different architectures can be ported to our platform with small modifications. This paper describes the system infrastructure design, the client API, system integration, and the latest progress. As a new 3rd-generation synchrotron light source with ultra-low emittance, there are new requirements and challenges in controlling and manipulating the beam. A use case study and a theoretical analysis have been performed to clarify the requirements and challenges for the high level application (HLA) software environment. To satisfy those requirements and challenges, an adequate system architecture for the software framework is critical for beam commissioning, study, and operation. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted the concept of a middle layer to separate low-level hardware processing from numerical algorithm computation, physics modelling, data manipulation, plotting, and error handling. However, none of the existing approaches can satisfy the requirements. A new design has been proposed by introducing service-oriented architecture technology. The HLA is a combination of tools for accelerator physicists and operators, as in the traditional approach. In NSLS-II, these include monitoring applications and control routines. A scripting environment is very important for the latter part of the HLA, and both parts are designed based on a common set of APIs. Physicists and operators are the users of these APIs, while control system engineers and a few accelerator physicists are the developers of these APIs. With our client/server-based approach, we leave how to retrieve information to the developers of the APIs, and how to use them to form a physics application to the users. For example, how channels are related to a magnet, and what the current real-time setting of a magnet is in physics units, are internals of the APIs; measuring chromaticity is a user of the APIs. All users of the APIs work with magnet and instrument names in physics units. Low-level communication in current or voltage units is minimized. In this paper, we discuss the recent progress of our infrastructure development and the client API.
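The narrow-API idea, in which scripts address magnets by name in physics units while the client library hides channel access and unit conversion, can be sketched as below; every class, function, and device name is hypothetical, not the actual NSLS-II API, and the measurement itself is stubbed out.

```python
# Sketch of the "narrow API" idea: physicists script against magnet names
# in physics units, while the client library hides channel access and unit
# conversion. All names here are hypothetical.
class HLAClient:
    def __init__(self):
        # The server side would map names to control-system channels and
        # convert physics units to currents; a dict stands in for that.
        self._magnets = {"QH1G2C30A": {"k1": 1.2}}  # quad strength, 1/m^2

    def get_magnet(self, name: str, field: str) -> float:
        """Read a magnet setting in physics units (not amps or volts)."""
        return self._magnets[name][field]

    def set_magnet(self, name: str, field: str, value: float) -> None:
        self._magnets[name][field] = value

def measure_chromaticity(client: HLAClient, quad: str, delta: float = 0.01):
    """A user of the API: vary a quadrupole and (in reality) record tune
    shifts versus momentum; the measurement itself is stubbed out here."""
    k1 = client.get_magnet(quad, "k1")
    client.set_magnet(quad, "k1", k1 + delta)
    # ... trigger tune measurement, fit d(nu)/d(dp/p) ...
    client.set_magnet(quad, "k1", k1)  # restore the original setting

client = HLAClient()
measure_chromaticity(client, "QH1G2C30A")
print("quad restored to", client.get_magnet("QH1G2C30A", "k1"))
```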
ERIC Educational Resources Information Center
Hussein, Nadhim Obaid; Elttayef, Ahmed Ibrahim
2016-01-01
The importance of this study comes from the fact that foreign language learners suffer from traditional ways and methods of teaching and learning. They are looking for new ways of teaching and learning, especially methods integrated with technology. What makes this study important is that it uses one of the most familiar software packages for learners…
Systems Architecture for Fully Autonomous Space Missions
NASA Technical Reports Server (NTRS)
Esper, Jamie; Schnurr, R.; VanSteenberg, M.; Brumfield, Mark (Technical Monitor)
2002-01-01
The NASA Goddard Space Flight Center is working to develop a revolutionary new system architecture concept in support of fully autonomous missions. As part of GSFC's contribution to the New Millennium Program (NMP) Space Technology 7 Autonomy and on-Board Processing (ST7-A) Concept Definition Study, the system incorporates the latest commercial Internet and software development ideas and extends them into NASA ground and space segment architectures. The unique challenges facing the exploration of remote and inaccessible locales and the need to incorporate corresponding autonomy technologies within reasonable cost necessitate the re-thinking of traditional mission architectures. A measure of the resiliency of this architecture in its application to a broad range of future autonomy missions will depend on its effectiveness in leveraging commercial tools developed for the personal computer and Internet markets. Specialized test stations and supporting software become a thing of the past as spacecraft take advantage of the extensive tools and research investments of billion-dollar commercial ventures. The projected improvements of the Internet and supporting infrastructure go hand-in-hand with market pressures that provide continuity in research. By taking advantage of consumer-oriented methods and processes, space-flight missions will continue to leverage investments tailored to provide better services at reduced cost. The application of ground and space segment architectures each based on Local Area Networks (LAN), the use of personal computer-based operating systems, and the execution of activities and operations through a Wide Area Network (Internet) enable a revolution in spacecraft mission formulation, implementation, and flight operations. Hardware and software design, development, integration, test, and flight operations are all tied-in closely to a common thread that enables the smooth transitioning between program phases. The application of commercial software development techniques lays the foundation for delivery of product-oriented flight software modules and models. Software can then be readily applied to support the on-board autonomy required for mission self-management. An on-board intelligent system, based on advanced scripting languages, facilitates the mission autonomy required to offload ground system resources, and enables the spacecraft to manage itself safely through an efficient and effective process of reactive planning, science data acquisition, synthesis, and transmission to the ground. Autonomous ground systems in turn coordinate and support schedule contact times with the spacecraft. Specific autonomy software modules on-board include mission and science planners, instrument and subsystem control, and fault tolerance response software, all residing within a distributed computing environment supported through the flight LAN. Autonomy also requires the minimization of human intervention between users on the ground and the spacecraft, and hence calls for the elimination of the traditional operations control center as a funnel for data manipulation. Basic goal-oriented commands are sent directly from the user to the spacecraft through a distributed internet-based payload operations "center". The ensuing architecture calls for the use of spacecraft as point extensions on the Internet. This paper will detail the system architecture implementation chosen to enable cost-effective autonomous missions with applicability to a broad range of conditions.
It will define the structure needed for implementation of such missions, including software and hardware infrastructures. The overall architecture is then laid out as a common thread in the mission life cycle from formulation through implementation and flight operations.
Rosenfeld, Alan L; Mandelaris, George A; Tardieu, Philippe B
2006-08-01
The purpose of this paper is to expand on part 1 of this series (published in the previous issue) regarding the emerging future of computer-guided implant dentistry. This article will introduce the concept of rapid-prototype medical modeling as well as describe the utilization and fabrication of computer-generated surgical drilling guides used during implant surgery. The placement of dental implants has traditionally been an intuitive process, whereby the surgeon relies on mental navigation to achieve optimal implant positioning. Through rapid-prototype medical modeling and the stereolithographic process, surgical drilling guides (eg, SurgiGuide) can be created. These guides are generated from a surgical implant plan created with a computer software system that incorporates all relevant prosthetic information from which the surgical plan is developed. The utilization of computer-generated planning and stereolithographically generated surgical drilling guides embraces the concept of collaborative accountability and supersedes traditional mental navigation on all levels of implant therapy.
A new computerized moving stage for optical microscopes
NASA Astrophysics Data System (ADS)
Hatiboglu, Can Ulas; Akin, Serhat
2004-06-01
Measurements of microscope stage movements in the x and y directions are of importance for some stereological methods. Traditionally, the length of stage movements is measured with differing precision and accuracy using a suitable motorized stage, a microscope and software. Such equipment is generally expensive and not readily available in many laboratories. Another challenging problem is adaptability to available microscope systems, which limits the possibility of using the equipment with any kind of light microscope. This paper describes a simple and cheap programmable moving stage that can be used with the microscopes available on the market. The movements of the stage are controlled by two servo-motors and a controller chip via Java-based image processing software. With the developed motorized stage and a microscope equipped with a CCD camera, the software allows complete coverage of the specimens with minimum overlap, eliminating the optical strain associated with counting hundreds of images through an eyepiece, in a quick and precise fashion. The uses and the accuracy of the developed stage are demonstrated using thin sections obtained from a limestone core plug.
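The complete-coverage behaviour described above amounts to tiling the specimen with slightly overlapping fields of view. The original software is Java-based; the fragment below is a Python sketch of just the tiling logic, with all names and the overlap value chosen for illustration.

```python
import math

def scan_positions(width_um, height_um, fov_w_um, fov_h_um, overlap=0.05):
    """Yield serpentine (boustrophedon) stage targets that cover a
    width x height specimen with the camera field of view, keeping a
    small overlap between neighbouring tiles."""
    step_x = fov_w_um * (1.0 - overlap)   # advance less than one field of view
    step_y = fov_h_um * (1.0 - overlap)
    nx = max(1, math.ceil(width_um / step_x))
    ny = max(1, math.ceil(height_um / step_y))
    for j in range(ny):
        cols = range(nx) if j % 2 == 0 else reversed(range(nx))
        for i in cols:                      # alternate direction row by row
            yield (i * step_x, j * step_y)  # x, y commands for the two servos
```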
Exploring User Acceptance of FOSS: The Role of the Age of the Users
NASA Astrophysics Data System (ADS)
Gallego, M. Dolores; Bueno, Salvador
The free and open source software (FOSS) movement arose essentially as an answer to the evolution of the software market, characterized by the closing of source code. Furthermore, some FOSS characteristics, such as (1) the advance of the movement and (2) the attractiveness of voluntary and cooperative work, have increased users' interest in free software. Traditionally, research in FOSS has focused on identifying individual personal motives for participating in the development of a FOSS project, analyzing specific FOSS solutions, or the FOSS movement itself. Nevertheless, the advantages of FOSS for users and the effect of demographic dimensions on user acceptance of FOSS have been two research topics with little attention. Specifically, this paper focuses on the influence of users' age on FOSS acceptance. Based on the literature, user age is an essential demographic dimension for explaining Information Systems acceptance. With this purpose, the authors have developed a research model based on the Technology Acceptance Model (TAM).
Bringing the Unidata IDV to the Cloud
NASA Astrophysics Data System (ADS)
Fisher, W. I.; Oxelson Ganter, J.
2015-12-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While traditional software engineering provides a suite of tools and methodologies which may mitigate this issue, they are typically ignored by developers lacking a background in software engineering. Causing further problems, these methodologies are best applied at the start of a project; trying to apply them to an existing, mature project can require an immense effort. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software APIs. As a result of these issues, there exists a large body of software which is simultaneously critical to the scientists who are dependent upon it, and yet increasingly difficult to maintain. The solution to this problem was partially provided with the advent of Cloud Computing: application streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. When coupled with containerization technology such as Docker, we are able to easily bring the same visualization software to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness application streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved.
Computerized assessment of placental calcification post-ultrasound: a novel software tool.
Moran, M; Higgins, M; Zombori, G; Ryan, J; McAuliffe, F M
2013-05-01
Placental calcification is associated with an increased risk of perinatal morbidity and mortality. The subjectivity of current ultrasound methods of assessment of placental calcification indicates that a more objective method is required. The aim of this study was to correlate the percentage of calcification defined by the clinician using a new software tool for calculating the extent of placental calcification with traditional ultrasound methods and with pregnancy outcome. Ninety placental images were individually assessed. An upper threshold was defined, based on high intensity, to quantify calcification within the placenta. Output metrics were then produced including the overall percentage of calcification with respect to the total number of pixels within the region of interest. The results were correlated with traditional ultrasound methods of assessment of placental calcification and with pregnancy outcome. The results demonstrate a significant correlation between placental calcification, as defined using the software, and traditional methods of Grannum grading of placental calcification. Whilst correlation with perinatal outcome and cord pH was not significant as a result of small numbers, patients with placental calcification assessed using the computerized software at the upper quartile had higher rates of poor perinatal outcome when compared with those at the lower quartile (8/22 (36%) vs 3/23 (13%); P = 0.069). These results suggest that this computerized software tool has the potential to become an alternative method of assessing placental calcification. Copyright © 2012 ISUOG. Published by John Wiley & Sons Ltd.
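As a rough illustration of the thresholding step described above, the following sketch computes the percentage of high-intensity pixels inside a region of interest. It is an assumption-laden toy (8-bit greyscale and an arbitrary threshold of 200), not the study's actual tool.

```python
import numpy as np

def calcification_percentage(image: np.ndarray, roi_mask: np.ndarray,
                             threshold: int = 200) -> float:
    """Percentage of ROI pixels whose grey-level intensity exceeds the
    upper threshold, used as a proxy for calcified tissue. The threshold
    value here is illustrative, not the study's calibrated setting."""
    roi_pixels = image[roi_mask.astype(bool)]   # keep only pixels inside the ROI
    if roi_pixels.size == 0:
        return 0.0
    return 100.0 * np.count_nonzero(roi_pixels > threshold) / roi_pixels.size
```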
Visible in camouflage of military engineering application
NASA Astrophysics Data System (ADS)
Pu, Huan; Kang, Qing; Chen, Shanjing; Wang, Zhenggang
2016-03-01
Traditional camouflage methods have shortcomings; using optical materials combined with traditional methods improves camouflage efficiency. Because an effective system for evaluating camouflage effect is presently lacking, Matlab software is used for optical phase camouflage effect evaluation.
Keyboarding, Language Arts, and the Elementary School Child.
ERIC Educational Resources Information Center
Balajthy, Ernest
1988-01-01
Discusses benefits of keyboarding instruction for elementary school students, emphasizing the integration of keyboarding with language arts instruction. Traditional typing and computer-assisted instruction are discussed, six software packages for adapting keyboarding instruction to the classroom are reviewed, and suggestions for software selection…
Nuclear forensics of a non-traditional sample: Neptunium
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doyle, Jamie L.; Schwartz, Daniel; Tandon, Lav
2016-05-16
Recent nuclear forensics cases have focused primarily on plutonium (Pu) and uranium (U) materials. By definition however, nuclear forensics can apply to any diverted nuclear material. This includes neptunium (Np), an internationally safeguarded material like Pu and U, that could offer a nuclear security concern if significant quantities were found outside of regulatory control. This case study couples scanning electron microscopy (SEM) with quantitative analysis using newly developed specialized software, to evaluate a non-traditional nuclear forensic sample of Np. Here, the results of the morphological analyses were compared with another Np sample of known pedigree, as well as other traditional actinide materials, in order to determine potential processing and point-of-origin.
NASA Technical Reports Server (NTRS)
Eichenlaub, Carl T.; Harper, C. Douglas; Hird, Geoffrey
1993-01-01
Life-critical applications warrant a higher level of software reliability than has yet been achieved. Since it is not certain that traditional methods alone can provide the required ultra reliability, new methods should be examined as supplements or replacements. This paper describes a mathematical counterpart to the traditional process of empirical testing. ORA's Penelope verification system is demonstrated as a tool for evaluating the correctness of Ada software. Grady Booch's Ada calendar utility package, obtained through NASA, was specified in the Larch/Ada language. Formal verification in the Penelope environment established that many of the package's subprograms met their specifications. In other subprograms, failed attempts at verification revealed several errors that had escaped detection by testing.
The Phenix Software for Automated Determination of Macromolecular Structures
Adams, Paul D.; Afonine, Pavel V.; Bunkóczi, Gábor; Chen, Vincent B.; Echols, Nathaniel; Headd, Jeffrey J.; Hung, Li-Wei; Jain, Swati; Kapral, Gary J.; Grosse Kunstleve, Ralf W.; McCoy, Airlie J.; Moriarty, Nigel W.; Oeffner, Robert D.; Read, Randy J.; Richardson, David C.; Richardson, Jane S.; Terwilliger, Thomas C.; Zwart, Peter H.
2011-01-01
X-ray crystallography is a critical tool in the study of biological systems. It is able to provide information that has been a prerequisite to understanding the fundamentals of life. It is also a method that is central to the development of new therapeutics for human disease. Significant time and effort are required to determine and optimize many macromolecular structures because of the need for manual interpretation of complex numerical data, often using many different software packages, and the repeated use of interactive three-dimensional graphics. The Phenix software package has been developed to provide a comprehensive system for macromolecular crystallographic structure solution with an emphasis on automation. This has required the development of new algorithms that minimize or eliminate subjective input in favour of built-in expert-systems knowledge, the automation of procedures that are traditionally performed by hand, and the development of a computational framework that allows a tight integration between the algorithms. The application of automated methods is particularly appropriate in the field of structural proteomics, where high throughput is desired. Features in Phenix for the automation of experimental phasing with subsequent model building, molecular replacement, structure refinement and validation are described and examples given of running Phenix from both the command line and graphical user interface. PMID:21821126
Zhou, Qin; Wang, Zhenzhen; Chen, Jun; Song, Jun; Chen, Lu; Lu, Yi
2016-01-01
For reasons of convenience and economy, attempts have been made to transform traditional dental gypsum casts into 3-dimensional (3D) digital casts. Different scanning devices have been developed to generate digital casts; however, each has its own limitations and disadvantages. The purpose of this study was to develop an advanced method for the 3D reproduction of dental casts by using a high-speed grating projection system and noncontact reverse engineering (RE) software and to evaluate the accuracy of the method. The methods consisted of 3 main steps: the scanning and acquisition of 3D dental cast data with a high-resolution grating projection system, the reconstruction and measurement of digital casts with RE software, and the evaluation of the accuracy of this method using 20 dental gypsum casts. The common anatomic landmarks were measured directly on the gypsum casts with a Vernier caliper and on the 3D digital casts with the Geomagic software measurement tool. Data were statistically assessed with the t test. The grating projection system had a rapid scanning speed, and smooth 3D dental casts were obtained. The mean differences between the gypsum and 3D measurements were approximately 0.05 mm, and no statistically significant differences were found between the 2 methods (P>.05), except for the measurements of the incisor tooth width and maxillary arch length. A method for the 3D reconstruction of dental casts was developed by using a grating projection system and RE software. The accuracy of the casts generated using the grating projection system was comparable with that of the gypsum casts. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
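The accuracy comparison described above is a paired design: the same landmark measured twice, once with the caliper on the gypsum cast and once in software on the digital cast. A minimal sketch of that analysis, on made-up numbers rather than the study's data, might look like this:

```python
import numpy as np
from scipy import stats

# Illustrative paired landmark measurements in mm; not the study's data.
caliper = np.array([8.42, 10.15, 35.60, 42.31, 7.98])  # gypsum cast, Vernier caliper
digital = np.array([8.47, 10.11, 35.66, 42.28, 8.03])  # digital cast, software tool

t_stat, p_value = stats.ttest_rel(caliper, digital)    # paired t test
mean_diff = np.mean(digital - caliper)                 # study reported ~0.05 mm
print(f"mean difference = {mean_diff:.3f} mm, p = {p_value:.3f}")
```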
A neural net-based approach to software metrics
NASA Technical Reports Server (NTRS)
Boetticher, G.; Srinivas, Kankanahalli; Eichmann, David A.
1992-01-01
Software metrics provide an effective method for characterizing software. Metrics have traditionally been composed through the definition of an equation. This approach is limited by the fact that all the interrelationships among all the parameters must be fully understood. This paper explores an alternative, neural network approach to modeling metrics. Experiments performed on two widely accepted metrics, McCabe and Halstead, indicate that the approach is sound, thus serving as the groundwork for further exploration into the analysis and design of software metrics.
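A hedged sketch of the idea: instead of a closed-form equation, a small neural network learns the mapping from raw module measurements to a metric value. Everything below (the feature set, the numbers, and the use of scikit-learn rather than the authors' 1992 network) is illustrative only.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical per-module features: operators, operands, branches, lines of code.
X = np.array([[12, 30, 4, 55],
              [40, 95, 11, 210],
              [7, 18, 2, 33],
              [25, 60, 8, 120]])
y = np.array([5.0, 12.0, 3.0, 9.0])  # illustrative complexity scores, not real data

# The network learns the parameter interrelationships from examples instead of
# requiring them to be written down in advance as an equation.
model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)
print(model.predict([[15, 35, 5, 60]]))  # predicted metric for an unseen module
```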
OsiriX: an open-source software for navigating in multidimensional DICOM images.
Rosset, Antoine; Spadola, Luca; Ratib, Osman
2004-09-01
A multidimensional image navigation and display software was designed for the display and interpretation of large sets of multidimensional and multimodality images such as combined PET-CT studies. The software is developed in Objective-C on a Macintosh platform under the MacOS X operating system using the GNUstep development environment. It also benefits from the extremely fast and optimized 3D graphics capabilities of the OpenGL standard, which is widely used for computer games and optimized to take advantage of any available hardware graphics accelerator boards. In the design of the software, special attention was given to adapting the user interface to the specific and complex tasks of navigating through large sets of image data. An interactive jog-wheel device, widely used in the video and movie industry, was implemented to allow users to navigate in the different dimensions of an image set much faster than with a traditional mouse or on-screen cursors and sliders. The program can easily be adapted for very specific tasks that require a limited number of functions, by adding and removing tools from the program's toolbar and avoiding an overwhelming number of unnecessary tools and functions. The processing and image rendering tools of the software are based on the open-source libraries ITK and VTK. This ensures that all new developments in image processing that could emerge from other academic institutions using these libraries can be directly ported to the OsiriX program. OsiriX is provided free of charge under the GNU open-source licensing agreement at http://homepage.mac.com/rossetantoine/osirix.
NASA Technical Reports Server (NTRS)
Guarro, Sergio B.
2010-01-01
This report validates and documents the detailed features and practical application of the framework for software intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.
The new meaning of quality in the information age.
Prahalad, C K; Krishnan, M S
1999-01-01
Software applications are now a mission-critical source of competitive advantage for most companies. They are also a source of great risk, as the Y2K bug has made clear. Yet many line managers still haven't confronted software issues--partly because they aren't sure how best to define the quality of the applications in their IT infrastructures. Some companies such as Wal-Mart and the Gap have successfully integrated the software in their networks, but most have accumulated an unwieldy number of incompatible applications--all designed to perform the same tasks. The authors provide a framework for measuring the performance of software in a company's IT portfolio. Quality traditionally has been measured according to a product's ability to meet certain specifications; other views of quality have emerged that measure a product's adaptability to customers' needs and a product's ability to encourage innovation. To judge software quality properly, argue the authors, managers must measure applications against all three approaches. Understanding the domain of a software application is an important part of that process. The domain is the body of knowledge about a user's needs and expectations for a product. Software domains change frequently based on how a consumer chooses to use, for example, Microsoft Word or a spreadsheet application. The domain can also be influenced by general changes in technology, such as the development of a new software platform. Thus, applications can't be judged only according to whether they conform to specifications. The authors discuss how to identify domain characteristics and software risks and suggest ways to reduce the variability of software domains.
Use of Field Programmable Gate Array Technology in Future Space Avionics
NASA Technical Reports Server (NTRS)
Ferguson, Roscoe C.; Tate, Robert
2005-01-01
Fulfilling NASA's new vision for space exploration requires the development of sustainable, flexible and fault tolerant spacecraft control systems. The traditional development paradigm consists of the purchase or fabrication of hardware boards with fixed processor and/or Digital Signal Processing (DSP) components interconnected via a standardized bus system. This is followed by the purchase and/or development of software. This paradigm has several disadvantages for the development of systems to support NASA's new vision. Building a system to be fault tolerant increases the complexity and decreases the performance of included software. Standard bus design and conventional implementation produces natural bottlenecks. Configuring hardware components in systems containing common processors and DSPs is difficult initially and expensive or impossible to change later. The existence of Hardware Description Languages (HDLs), the recent increase in performance, density and radiation tolerance of Field Programmable Gate Arrays (FPGAs), and Intellectual Property (IP) Cores provides the technology for reprogrammable Systems on a Chip (SOC). This technology supports a paradigm better suited for NASA's vision. Hardware and software production are melded for more effective development; they can both evolve together over time. Designers incorporating this technology into future avionics can benefit from its flexibility. Systems can be designed with improved fault isolation and tolerance using hardware instead of software. Also, these designs can be protected from obsolescence problems where maintenance is compromised via component and vendor availability. To investigate the flexibility of this technology, the core of the Central Processing Unit and Input/Output Processor of the Space Shuttle AP101S Computer were prototyped in Verilog HDL and synthesized into an Altera Stratix FPGA.
NASA Astrophysics Data System (ADS)
Segret, Boris; Semery, Alain; Vannitsen, Jordan; Mosser, Benoît.; Miau, Jiun-Jih; Juang, Jyh-Ching; Deleflie, Florent
2014-08-01
The AGILE principles of the software industry seem well adapted to the paradigm of CubeSat missions that involve students in the development of space missions. Some well-known engineering and program processes are revisited using the example of an interplanetary CubeSat mission profile that has been developed by several teams of students in various countries and at various educational levels since 02/2013. The lessons learned in adapting traditional space mission methods are emphasized; they produce a metaphoric image of paving stones.
Introduction to Message-Bus Architectures for Space Systems
NASA Technical Reports Server (NTRS)
Smith, Dan; Gregory, Brian
2005-01-01
This course presents technical and programmatic information on the development of message-based architectures for space mission ground and flight software systems. Message-based architecture approaches provide many significant advantages over the more traditional socket-based one-of-a-kind integrated system development approaches. The course provides an overview of publish/subscribe concepts, the use of common isolation layer API's, approaches to message standardization, and other technical topics. Several examples of currently operational systems are discussed and possible changes to the system development process are presented. Benefits and lessons learned will be discussed and time for questions and answers will be provided.
NASA Astrophysics Data System (ADS)
Xiao, Fengjun; Li, Chengzhi; Sun, Jiangman; Zhang, Lianjie
2017-09-01
To study the rapid growth of research on organic photovoltaic (OPV) technology, development trends in the relevant research are analyzed based on CiteSpace software of text mining and visualization in scientific literature. By this analytical method, the outputs and cooperation of authors, the hot research topics, the vital references and the development trend of OPV are identified and visualized. Different from the traditional review articles by the experts on OPV, this work provides a new method of visualizing information about the development of the OPV technology research over the past decade quantitatively.
[Application of animal models in gingival retraction experimental curriculum].
Cai, He; Yang, Shu-ying; Zeng, Yong-xiang; Qin, Han; Hu, Shan-shan; Wang, Jian
2016-02-01
To introduce a teaching method for gingival retraction, and evaluate its efficacy for implementation into experimental curricula. First, two kinds of animal models using pigs and cows (below 6 months of age) were established. Twenty-two experienced prosthodontists were then asked to apply gingival retraction on each animal model and evaluate the biofidelity of the 2 models' dento-gingival environment. The data were analyzed with the SPSS 19.0 software package using the paired t test. Then, eighty pre-internship students were randomly divided into 2 groups. Besides the traditional teaching (lecture-based teaching), the experimental group (group A) also had access to skill training (using animal models to practice gingival retraction), while the control group (group B) only used the traditional teaching modality. All students' performance in gingival retraction and impression taking was evaluated in their internship. The data were analyzed with the SPSS 19.0 software package using the Chi-square test. Both the pig's and the cow's dento-gingival environments were similar to that of human beings, and there was no significant difference between the two models' biofidelities (P>0.05). In addition, both the effect of gingival retraction and the quality of impression in group A were significantly better than those in group B (P<0.05). Compared with the traditional strategy, practising gingival retraction on animal models can offer greater opportunities for skill development, and be implemented for a wider range of applications.
Using SFOC to fly the Magellan Venus mapping mission
NASA Technical Reports Server (NTRS)
Bucher, Allen W.; Leonard, Robert E., Jr.; Short, Owen G.
1993-01-01
Traditionally, spacecraft flight operations at the Jet Propulsion Laboratory (JPL) have been performed by teams of spacecraft experts utilizing ground software designed specifically for the current mission. The Jet Propulsion Laboratory set out to reduce the cost of spacecraft mission operations by designing ground data processing software that could be used by multiple spacecraft missions, either sequentially or concurrently. The Space Flight Operations Center (SFOC) System was developed to provide the ground data system capabilities needed to monitor several spacecraft simultaneously and provide enough flexibility to meet the specific needs of individual projects. The Magellan Spacecraft Team utilizes the SFOC hardware and software designed for engineering telemetry analysis, both real-time and non-real-time. The flexibility of the SFOC System has allowed the spacecraft team to integrate their own tools with SFOC tools to perform the tasks required to operate a spacecraft mission. This paper describes how the Magellan Spacecraft Team is utilizing the SFOC System in conjunction with their own software tools to perform the required tasks of spacecraft event monitoring as well as engineering data analysis and trending.
Development of a Space Station Operations Management System
NASA Technical Reports Server (NTRS)
Brandli, A. E.; Mccandless, W. T.
1988-01-01
To enhance the productivity of operations aboard the Space Station, a means must be provided to augment, and frequently to supplant, human effort in support of mission operations and management, both on the ground and onboard. The Operations Management System (OMS), under development at the Johnson Space Center, is one such means. OMS comprises the tools and procedures to facilitate automation of station monitoring, control, and mission planning tasks. OMS mechanizes, and hence rationalizes, execution of tasks traditionally performed by mission planners, the mission control center team, onboard System Management software, and the flight crew.
Development of a Two-Wheel Contingency Mode for the MAP Spacecraft
NASA Technical Reports Server (NTRS)
Starin, Scott R.; ODonnell, James R., Jr.; Bauer, Frank (Technical Monitor)
2002-01-01
The Microwave Anisotropy Probe (MAP) is a follow-on mission to the Cosmic Background Explorer (COBE), and is currently collecting data from its orbit near the second Sun-Earth libration point. Due to limited mass, power, and financial resources, a traditional reliability concept including fully redundant components was not feasible for MAP. Instead, the MAP design employs selective hardware redundancy in tandem with contingency software modes and algorithms to improve the odds of mission success. One direction for such improvement has been the development of a two-wheel backup control strategy. This strategy would allow MAP to position itself for maneuvers and collect science data should one of its three reaction wheels fail. Along with operational considerations, the strategy includes three new control algorithms. These algorithms would use the remaining attitude control actuators (thrusters and two reaction wheels) in ways that achieve control goals while minimizing adverse impacts on the functionality of other subsystems and software.
A Language Translator for a Computer Aided Rapid Prototyping System.
1988-03-01
[Table-of-contents fragments: The Problem; The Traditional "Waterfall Life Cycle"; Rapid Prototyping.] Computers are a feature of everyday life for almost the entire industrialized world. Few governments or businesses function without the aid of computer systems. ... The traditional method of software engineering is the "waterfall life cycle".
Just Do It Yourself: Implementing 3D Printing in a Deployed Environment
2017-04-01
This 3D model data can be stored for future manufacturing or manipulated, using software, to improve the part's design. 3D manufactured parts can be... developed and tested in a virtual environment, very quickly, and before manufacturing has commenced. Additionally, these 3D designs can... capitalize on this innovative technology. Consequently, AM may offer the best hope for designing a reusable hypersonic weapon. Traditional manufacturing...
Legal ramifications of intellectual property
NASA Technical Reports Server (NTRS)
Kempf, Robert F.
1990-01-01
Recent government policy changes that have resulted in encouraging or requiring increased intellectual property rights of federally funded research and development activities are examined. The reasons for these changes are discussed, including considerations related to technology transfer, patent rights, copyrights, trade secrets, and computer software issues. The effect of these changes on traditional approaches to the dissemination of federally funded scientific and technical information is considered and predictions concerning future trends in intellectual property rights are given.
Legal ramifications of intellectual property
NASA Technical Reports Server (NTRS)
Kempf, Robert F.
1990-01-01
Recent government policy changes that have resulted in encouraging or requiring increased intellectual property rights of Federally funded research and development activities are examined. The reasons for these changes are discussed, including considerations related to technology transfer, patent rights, copyrights, trade secrets, and computer software issues. The effect of these changes on traditional approaches to the dissemination of Federally funded scientific and technical information is considered and predictions concerning future trends in intellectual property rights are given.
Hinnen, Deborah A.; Buskirk, Ann; Lyden, Maureen; Amstutz, Linda; Hunter, Tracy; Parkin, Christopher G.; Wagner, Robin
2014-01-01
Background: We assessed users’ proficiency and efficiency in identifying and interpreting self-monitored blood glucose (SMBG), insulin, and carbohydrate intake data using data management software reports compared with standard logbooks. Method: This prospective, self-controlled, randomized study enrolled insulin-treated patients with diabetes (PWDs) (continuous subcutaneous insulin infusion [CSII] and multiple daily insulin injection [MDI] therapy), patient caregivers [CGVs]) and health care providers (HCPs) who were naïve to diabetes data management computer software. Six paired clinical cases (3 CSII, 3 MDI) and associated multiple-choice questions/answers were reviewed by diabetes specialists and presented to participants via a web portal in both software report (SR) and traditional logbook (TL) formats. Participant response time and accuracy were documented and assessed. Participants completed a preference questionnaire at study completion. Results: All participants (54 PWDs, 24 CGVs, 33 HCPs) completed the cases. Participants achieved greater accuracy (assessed by percentage of accurate answers) using the SR versus TL formats: PWDs, 80.3 (13.2)% versus 63.7 (15.0)%, P < .0001; CGVs, 84.6 (8.9)% versus 63.6 (14.4)%, P < .0001; HCPs, 89.5 (8.0)% versus 66.4 (12.3)%, P < .0001. Participants spent less time (minutes) with each case using the SR versus TL formats: PWDs, 8.6 (4.3) versus 19.9 (12.2), P < .0001; CGVs, 7.0 (3.5) versus 15.5 (11.8), P = .0005; HCPs, 6.7 (2.9) versus 16.0 (12.0), P < .0001. The majority of participants preferred using the software reports versus logbook data. Conclusions: Use of the Accu-Chek Connect Online software reports enabled PWDs, CGVs, and HCPs, naïve to diabetes data management software, to identify and utilize key diabetes information with significantly greater accuracy and efficiency compared with traditional logbook information. Use of SRs was preferred over logbooks. PMID:25367012
Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package
ERIC Educational Resources Information Center
Ibrahim, Dogan
2009-01-01
The teaching of scientific subjects usually require laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB have been used to simulate scientific experiments. Recently, spreadsheet programs have…
Prototyping a Sensor Enabled 3d Citymodel on Geospatial Managed Objects
NASA Astrophysics Data System (ADS)
Kjems, E.; Kolář, J.
2013-09-01
One of the major development efforts within the GI Science domain points at sensor-based information and the usage of real-time information coming from geographically referenced features in general. At the same time, 3D city models are mostly justified as objects for visualization purposes rather than constituting the foundation of a geographic data representation of the world. The combination of 3D city models and real-time information based systems, though, can provide a whole new setup for data fusion within an urban environment and provide time-critical information, preserving our limited resources in the most sustainable way. Using 3D models with consistent object definitions gives us the possibility to avoid troublesome abstractions of reality and to design even complex urban systems fusing information from various sources of data. These systems are difficult to design with the traditional software development approach based on major software packages and traditional data exchange. The data stream varies from urban domain to urban domain and from system to system, which is why it is almost impossible to design a complete system that takes care of all thinkable instances, now and in the future, within one constrained software design. On several occasions we have been advocating for a new and advanced formulation of real-world features using the concept of Geospatial Managed Objects (GMO). This paper presents the outcome of the InfraWorld project, a 4 million Euro project financed primarily by the Norwegian Research Council, in which the concept of GMOs has been applied in various situations on various running platforms of an urban system. The paper focuses on user experiences and interfaces rather than core technical and developmental issues. The project primarily focused on prototyping rather than realistic implementations, although the results concerning applicability are quite clear.
A Stigmergy Collaboration Approach in the Open Source Software Developer Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Xiaohui; Pullum, Laura L; Treadwell, Jim N
2009-01-01
The communication model of some self-organized online communities is significantly different from that of the traditional social network based community. It is problematic to use social network analysis to analyze the collaboration structure and emergent behaviors in these communities because they lack peer-to-peer connections. Stigmergy theory provides an explanation of the collaboration model of these communities. In this research, we present a stigmergy approach for building an agent-based simulation to simulate the collaboration model in the open source software (OSS) developer community. We used a group of actors who collaborate on OSS projects through forums as our frame of reference and investigated how the choices actors make in contributing their work on the projects determine the global status of the whole OSS project. In our simulation, the forum posts serve as the digital pheromone, and the modified Pierre-Paul Grasse pheromone model is used for computing the developer agents' behavior selection probability.
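To illustrate the flavour of such a pheromone rule, here is a generic stigmergic sketch (not the paper's modified Grasse model; all names and constants are invented):

```python
import random

def choose_project(pheromone: dict[str, float]) -> str:
    """Pick a project with probability proportional to its forum-post
    'pheromone': active projects attract more contributions."""
    projects = list(pheromone)
    weights = [pheromone[p] for p in projects]
    return random.choices(projects, weights=weights, k=1)[0]

def evaporate_and_deposit(pheromone: dict[str, float], project: str,
                          deposit: float = 1.0, evaporation: float = 0.1) -> None:
    """Old posts lose influence over time (evaporation); a new contribution
    reinforces its project, making it more attractive to the next agent."""
    for p in pheromone:
        pheromone[p] *= (1.0 - evaporation)
    pheromone[project] += deposit
```

The feedback loop between these two functions is what produces the emergent, leaderless coordination the abstract describes: no agent communicates with another directly, yet activity concentrates on a few projects.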
Smartphones in ecology and evolution: a guide for the app-rehensive.
Teacher, Amber G F; Griffiths, David J; Hodgson, David J; Inger, Richard
2013-12-01
Smartphones and their apps (application software) are now used by millions of people worldwide and represent a powerful combination of sensors, information transfer, and computing power that deserves better exploitation by ecological and evolutionary researchers. We outline the development process for research apps, provide contrasting case studies for two new research apps, and scan the research horizon to suggest how apps can contribute to the rapid collection, interpretation, and dissemination of data in ecology and evolutionary biology. We emphasize that the usefulness of an app relies heavily on the development process, recommend that app developers are engaged with the process at the earliest possible stage, and commend efforts to create open-source software scaffolds on which customized apps can be built by nonexperts. We conclude that smartphones and their apps could replace many traditional handheld sensors, calculators, and data storage devices in ecological and evolutionary research. We identify their potential use in the high-throughput collection, analysis, and storage of complex ecological information.
Lin, Steve; Turgulov, Anuar; Taher, Ahmed; Buick, Jason E; Byers, Adam; Drennan, Ian R; Hu, Samantha; J Morrison, Laurie
2016-10-01
Cardiopulmonary resuscitation (CPR) process measures research and quality assurance has traditionally been limited to the first 5 minutes of resuscitation due to significant costs in time, resources, and personnel from manual data abstraction. CPR performance may change over time during prolonged resuscitations, which represents a significant knowledge gap. Moreover, the currently available commercial software output of CPR process measures is difficult to analyze. The objective was to develop and validate a software program to help automate the abstraction and transfer of CPR process measures data from electronic defibrillators for complete episodes of cardiac arrest resuscitation. We developed a software program to facilitate and help automate CPR data abstraction and transfer from electronic defibrillators for entire resuscitation episodes. Using an intermediary Extensible Markup Language export file, the automated software transfers CPR process measures data (electrocardiogram [ECG] number, CPR start time, number of ventilations, number of chest compressions, compression rate per minute, compression depth per minute, compression fraction, and end-tidal CO2 per minute). We performed an internal validation of the software program on 50 randomly selected cardiac arrest cases with resuscitation durations between 15 and 60 minutes. CPR process measures were manually abstracted and transferred independently by two trained data abstractors and by the automated software program, followed by manual interpretation of raw ECG tracings, treatment interventions, and patient events. Error rates and the time needed for data abstraction, transfer, and interpretation were measured for both manual and automated methods, compared to an additional independent reviewer. A total of 9,826 data points were each abstracted by the two abstractors and by the software program. Manual data abstraction resulted in a total of six errors (0.06%) compared to zero errors by the software program. The mean ± SD time measured per case for manual data abstraction was 20.3 ± 2.7 minutes compared to 5.3 ± 1.4 minutes using the software program (p = 0.003). We developed and validated an automated software program that efficiently abstracts and transfers CPR process measures data from electronic defibrillators for complete cardiac arrest episodes. This software will enable future cardiac arrest studies and quality assurance programs to evaluate the impact of CPR process measures during prolonged resuscitations. © 2016 by the Society for Academic Emergency Medicine.
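The intermediary XML export suggests a transfer step along these lines. The defibrillator's real schema is not given in the abstract, so every tag and field name below is a placeholder; this is only a shape sketch of the automated abstraction into an analyzable CSV.

```python
import csv
import xml.etree.ElementTree as ET

def xml_to_csv(xml_path: str, csv_path: str) -> None:
    """Transfer per-minute CPR process measures from a defibrillator XML
    export into a flat CSV for analysis. All tag names are hypothetical."""
    root = ET.parse(xml_path).getroot()
    with open(csv_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["minute", "compressions", "rate_per_min", "depth_mm",
                         "ventilations", "fraction", "etco2_mmHg"])
        for minute in root.iter("minute"):   # hypothetical per-minute element
            writer.writerow([minute.get("index")] +
                            [minute.findtext(tag) for tag in
                             ("compressions", "rate", "depth",
                              "ventilations", "fraction", "etco2")])
```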
Parametric Modelling of As-Built Beam Framed Structure in Bim Environment
NASA Astrophysics Data System (ADS)
Yang, X.; Koehl, M.; Grussenmeyer, P.
2017-02-01
A complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, attribute and dynamic information management, and results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform, making it possible to integrate traditional geometry modelling, parametric element management, and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements form a leaning and crossing beam frame. Since Autodesk Revit, as typical BIM software, provides the platform for parametric modelling and information management, an API plugin was developed that automatically creates the parametric beam elements and links them together with strict relationships. The plugin under development is introduced in the paper; it can obtain the parametric beam model via the Autodesk Revit API from total station points and terrestrial laser scanning data. The results show the potential of automating parametric modelling by interactive API development in a BIM environment. It also integrates the separate data processing steps and different platforms into the uniform Revit software.
Design and Implementation of Hybrid CORDIC Algorithm Based on Phase Rotation Estimation for NCO
Zhang, Chaozhu; Han, Jinan; Li, Ke
2014-01-01
The numerically controlled oscillator has wide application in radar, digital receivers, and software radio systems. This paper first introduces the traditional CORDIC algorithm. Then, in order to improve computing speed and save resources, it proposes a hybrid CORDIC algorithm based on phase rotation estimation, applied in a numerically controlled oscillator (NCO). By estimating the direction of part of the phase rotations, the algorithm eliminates part of the phase rotations and add-subtract units, so that it decreases delay. Furthermore, the paper simulates and implements the numerically controlled oscillator with the Quartus II and ModelSim software. Finally, simulation results indicate that an improvement over the traditional CORDIC algorithm is achieved in terms of ease of computation, resource utilization, and computing speed/delay while maintaining precision. It is suitable for high speed and high precision digital modulation and demodulation. PMID:25110750
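For readers unfamiliar with the baseline, here is a minimal sketch of the traditional rotation-mode CORDIC that the hybrid algorithm improves on. Floating point is used for clarity; a hardware NCO would use fixed-point shifts and adds, and input angles must lie within the roughly ±1.74 rad convergence range.

```python
import math

def cordic_sin_cos(angle: float, iterations: int = 16):
    """Traditional rotation-mode CORDIC: rotate the vector (1, 0) toward
    `angle` through a fixed sequence of micro-rotations by atan(2^-i)."""
    atans = [math.atan(2.0 ** -i) for i in range(iterations)]
    gain = 1.0
    for i in range(iterations):
        gain *= math.sqrt(1.0 + 2.0 ** (-2 * i))   # accumulated CORDIC gain
    x, y, z = 1.0, 0.0, angle
    for i in range(iterations):
        d = 1.0 if z >= 0.0 else -1.0              # rotation direction each stage
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atans[i]
    return y / gain, x / gain                      # (sin(angle), cos(angle))
```

For example, cordic_sin_cos(math.pi / 6) returns approximately (0.5, 0.866). The hybrid scheme in the paper estimates the rotation directions for part of these stages up front, so fewer sequential add-subtract stages are needed.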
ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.
Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S
2011-08-01
Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing a single spectrum or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.
Impact of agile methodologies on team capacity in automotive radio-navigation projects
NASA Astrophysics Data System (ADS)
Prostean, G.; Hutanu, A.; Volker, S.
2017-01-01
The development processes used in automotive radio-navigation projects are constantly under adaptation pressure. While the software development models are based on automotive production processes, the integration of peripheral components into an automotive system triggers a high number of requirement modifications. The use of traditional development models in the automotive industry pushes a team's development capacity to its limits. The root cause lies in the inflexibility of the current processes and their adaptation limits. This paper addresses a new project management approach for the development of radio-navigation projects. Understanding the weaknesses of the currently used models helped us develop and integrate agile methodologies into the traditional development model structure. In the first part we focus on change management methods to reduce the inflow of requests for change. Established change management risk analysis processes enable project management to judge the impact of a requirement change, and also give the project time to implement some changes. However, in big automotive radio-navigation projects the saved time is not enough to implement the large number of changes that are submitted to the project. In the second part of this paper we focus on increasing team capacity by integrating agile methodologies into the traditional model at critical project phases. The overall objective of this paper is to demonstrate the need for process adaptation in order to solve project team capacity bottlenecks.
A reprogrammable receiver architecture for wireless signal interception
NASA Astrophysics Data System (ADS)
Yao, Timothy S.
2003-09-01
In this paper, a re-programmable receiver architecture for wireless signal interception, based on the software-defined-radio concept, is presented. The radio-frequency (RF) signal that the receiver would like to intercept may come from a terrestrial cellular network or communication satellites, whose carrier frequencies are in the range from 800 MHz (civilian mobile) to 15 GHz (Ku band). To intercept signals from such a wide range of frequencies in these varied communication systems, the traditional way is to deploy multiple receivers to scan and detect the desired signal. This traditional approach is obviously unattractive due to cost, efficiency, and accuracy. Instead, we propose a universal receiver, which is software-driven and re-configurable, to intercept signals of interest. The software-defined-radio based receiver first intercepts RF energy of wide spectrum (25 MHz) through an antenna, performs zero-IF down conversion (homodyne architecture) to baseband, and digitally channelizes the baseband signal. The channelization module is a bank of high performance digital filters. The bandwidth of the filter bank is programmable according to the wireless communication protocol under watch. In the baseband processing, high-performance digital signal processors carry out the detection process and microprocessors handle the communication protocols. The baseband processing is also re-configurable for different wireless standards and protocols. The advantages of the software-defined-radio architecture over the traditional RF receiver make it a favorable technology for communication signal interception and surveillance.
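The channelization module described above can be prototyped offline with a polyphase filter bank. The sketch below is a simplified, critically sampled version: it ignores the branch-ordering and phase conventions of production channelizers, and the tap count and cutoff are arbitrary illustrative choices.

```python
import numpy as np
from scipy.signal import firwin, lfilter

def channelize(iq: np.ndarray, n_channels: int) -> np.ndarray:
    """Split a wideband complex-baseband capture into n_channels equally
    spaced sub-band streams, each decimated by n_channels."""
    taps = firwin(8 * n_channels, 1.0 / n_channels)   # prototype low-pass filter
    poly = taps.reshape(8, n_channels)                # column m = branch m taps
    n = (len(iq) // n_channels) * n_channels
    x = iq[:n].reshape(-1, n_channels)                # one row per output instant
    branches = np.stack(
        [lfilter(poly[:, m], 1.0, x[:, m]) for m in range(n_channels)], axis=1)
    return np.fft.ifft(branches, axis=1)              # rows: time, cols: channels
```

In a deployed receiver this structure would run in the DSPs; reprogramming the filter bandwidth for a different protocol amounts to regenerating the prototype filter taps.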
Rapid Prototyping Integrated With Nondestructive Evaluation and Finite Element Analysis
NASA Technical Reports Server (NTRS)
Abdul-Aziz, Ali; Baaklini, George Y.
2001-01-01
Most reverse engineering approaches involve imaging or digitizing an object and then creating a computerized reconstruction that can be integrated, in three dimensions, into a particular design environment. Rapid prototyping (RP) refers to the practical ability to build high-quality physical prototypes directly from computer-aided design (CAD) files. Using rapid prototyping, full-scale models or patterns can be built using a variety of materials in a fraction of the time required by more traditional prototyping techniques (refs. 1 and 2). Many software packages have been developed and are being designed to tackle the reverse engineering and rapid prototyping issues just mentioned. For example, image processing and three-dimensional reconstruction visualization software such as Velocity2 (ref. 3) is being used to carry out the construction of three-dimensional volume models and the subsequent generation of a stereolithography file suitable for CAD applications. Producing three-dimensional models of objects from computed tomography (CT) scans is becoming a valuable nondestructive evaluation methodology (ref. 4). Real components can be rendered and subjected to temperature and stress tests using structural engineering software codes. For this to be achieved, accurate high-resolution images have to be obtained via CT scans and then processed, converted into a traditional file format, and translated into finite element models. Prototyping a three-dimensional volume of a composite structure by reading in a series of two-dimensional images generated via CT and by using and integrating commercial software (e.g., Velocity2, MSC/PATRAN (ref. 5), and Hypermesh (ref. 6)) is being applied successfully at the NASA Glenn Research Center. The building process from structural modeling to the analysis level is outlined in reference 7. Subsequently, a stress analysis of a composite cooling panel under combined thermomechanical loading conditions was performed to validate this process.
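A minimal Java sketch of the first step of such a pipeline, stacking 2-D CT slices into a voxel volume and thresholding it before surface extraction and STL/finite-element conversion, is given below; the cutoff value and array layout are illustrative assumptions:

```java
// Sketch of CT-slice stacking and thresholding, the step that precedes
// surface extraction in a CT-to-finite-element workflow.
public class VoxelBuilder {

    /** Stack slices (each slice is [rows][cols]) into a binary volume. */
    static boolean[][][] threshold(int[][][] ctSlices, int cutoff) {
        int nz = ctSlices.length, ny = ctSlices[0].length, nx = ctSlices[0][0].length;
        boolean[][][] solid = new boolean[nz][ny][nx];
        for (int z = 0; z < nz; z++)
            for (int y = 0; y < ny; y++)
                for (int x = 0; x < nx; x++)
                    solid[z][y][x] = ctSlices[z][y][x] >= cutoff;  // material vs. void
        return solid;
    }
}
```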
Designing the Undesignable: Social Software and Control
ERIC Educational Resources Information Center
Dron, Jon
2007-01-01
Social software, such as blogs, wikis, tagging systems and collaborative filters, treats the group as a first-class object within the system. Drawing from theories of transactional distance and control, this paper proposes a model of e-learning that extends traditional concepts of learner-teacher-content interactions to include these emergent…
Bootstrapping Methods Applied for Simulating Laboratory Works
ERIC Educational Resources Information Center
Prodan, Augustin; Campean, Remus
2005-01-01
Purpose: The aim of this work is to implement bootstrapping methods into software tools, based on Java. Design/methodology/approach: This paper presents a category of software e-tools aimed at simulating laboratory works and experiments. Findings: Both students and teaching staff use traditional statistical methods to infer the truth from sample…
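For readers unfamiliar with the technique, the following Java sketch shows the core loop such bootstrapping e-tools would implement: resampling the observed data with replacement and collecting replicates of a statistic (the data values and seed are arbitrary):

```java
import java.util.Random;

// Minimal bootstrap sketch: resample with replacement and study the
// distribution of the sample mean across replicates.
public class Bootstrap {
    public static void main(String[] args) {
        double[] sample = {4.1, 5.0, 4.7, 5.3, 4.9, 5.1};
        Random rng = new Random(42);
        int B = 1000;
        double[] means = new double[B];
        for (int b = 0; b < B; b++) {
            double sum = 0.0;
            for (int i = 0; i < sample.length; i++) {
                sum += sample[rng.nextInt(sample.length)];  // draw with replacement
            }
            means[b] = sum / sample.length;                 // bootstrap replicate of the mean
        }
        // The spread of 'means' estimates the sampling variability of the mean.
        double mu = 0.0;
        for (double m : means) mu += m;
        System.out.println("bootstrap mean of means = " + mu / B);
    }
}
```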
VLTI auxiliary telescopes: a full object-oriented approach
NASA Astrophysics Data System (ADS)
Chiozzi, Gianluca; Duhoux, Philippe; Karban, Robert
2000-06-01
The Very Large Telescope (VLT) Telescope Control Software (TCS) is a portable system. It is now in use, or will be used, in a whole family of ESO telescopes (VLT Unit Telescopes, VLTI Auxiliary Telescopes, NTT, La Silla 3.6 m, VLT Survey Telescope, and the Astronomical Site Monitors in Paranal and La Silla). Although it was developed making extensive use of Object Oriented (OO) methodologies, the overall development process chosen at the beginning of the project used traditional methods. In order to guarantee a longer lifetime for the system (improving documentation and maintainability) and to prepare for future projects, we have introduced a full OO process. We have taken as a basis the Unified Software Development Process with the Unified Modeling Language (UML) and have adapted the process to our specific needs. This paper describes how the process has been applied to the VLTI Auxiliary Telescopes Control Software (ATCS). The ATCS is based on the portable VLT TCS, but some subsystems are new or have specific characteristics. The complete process has been applied to the new subsystems, while reused code has been integrated into the UML models. We have used the ATCS on one side to tune the process and train the team members, and on the other side to provide UML- and WWW-based documentation for the portable VLT TCS.
Atlas : A library for numerical weather prediction and climate modelling
NASA Astrophysics Data System (ADS)
Deconinck, Willem; Bauer, Peter; Diamantakis, Michail; Hamrud, Mats; Kühnlein, Christian; Maciel, Pedro; Mengaldo, Gianmarco; Quintino, Tiago; Raoult, Baudouin; Smolarkiewicz, Piotr K.; Wedi, Nils P.
2017-11-01
The algorithms underlying numerical weather prediction (NWP) and climate models that have been developed in the past few decades face an increasing challenge caused by the paradigm shift imposed by hardware vendors towards more energy-efficient devices. On the path to sustainable exascale High Performance Computing (HPC), applications are becoming increasingly constrained by energy consumption. As a result, the emerging diverse and complex hardware solutions have a large impact on the programming models traditionally used in NWP software, triggering a rethink of design choices for future massively parallel software frameworks. In this paper, we present Atlas, a new software library that is currently being developed at the European Centre for Medium-Range Weather Forecasts (ECMWF), with the aim of handling data structures required for NWP applications in a flexible and massively parallel way. Atlas provides a versatile framework for the future development of efficient NWP and climate applications on emerging HPC architectures. The applications range from full Earth system models to specific tools required for post-processing weather forecast products. The Atlas library thus constitutes a step towards affordable exascale high-performance simulations by providing the necessary abstractions that facilitate its application in heterogeneous HPC environments, and by promoting the co-design of NWP algorithms with the underlying hardware.
Positron lifetime setup based on DRS4 evaluation board
NASA Astrophysics Data System (ADS)
Petriska, M.; Sojak, S.; Slugeň, V.
2014-04-01
A digital positron lifetime setup based on the DRS4 evaluation board designed at the Paul Scherrer Institute has been constructed and tested in the positron annihilation laboratory of the Slovak University of Technology in Bratislava. The high bandwidth, low power consumption, and short readout time make the DRS4 chip attractive for positron annihilation lifetime spectroscopy (PALS) setups, replacing traditional ADCs and TDCs. Software for online and offline pulse analysis for the PALS setup was developed with the Qt, Qwt, and ALGLIB libraries.
Park, Sophie Elizabeth; Thomas, James
2018-06-07
It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways, and different review conditions call for different software solutions: for example, facilitating contemporaneous collaboration across time and geographical space; in-built bias assessment tools; and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer includes text mining automation technologies. Version 5 supports data sharing and re-use across the systematic review community. Open source software will soon be released, and the EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription, with a one-month (extendible) trial available and volume discounts for 'site licences'. It is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Arbitrary Shape Deformation in CFD Design
NASA Technical Reports Server (NTRS)
Landon, Mark; Perry, Ernest
2014-01-01
Sculptor(R) is a commercially available software tool, based on Arbitrary Shape Design (ASD), which allows the user to perform shape optimization for computational fluid dynamics (CFD) design. The software tool provides important advances in the state of the art of automatic CFD shape deformation and optimization software. CFD is an analysis tool used by engineering designers to gain a greater understanding of the fluid flow phenomena involved in the components being designed. The next step in the engineering design process is to modify the design to improve the components' performance. This step has traditionally been performed manually via trial and error. Two major problems that have, in the past, hindered the development of automated CFD shape optimization are (1) inadequate shape parameterization algorithms and (2) inadequate algorithms for CFD grid modification. The ASD developed as part of the Sculptor(R) software tool is a major advancement in solving these two issues. First, the ASD allows the CFD designer to freely create his or her own shape parameters, eliminating the restriction of only being able to use the CAD model parameters. Second, the software performs a smooth volumetric deformation, which eliminates the extremely costly process of having to remesh the grid for every shape change (which is how this process had previously been achieved). Sculptor(R) can be used to optimize shapes for the aerodynamic and structural design of spacecraft, aircraft, watercraft, ducts, and other objects that affect and are affected by flows of fluids and heat. Sculptor(R) makes it possible to perform, in real time, a design change that would manually take hours or days if remeshing were needed.
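The smooth volumetric deformation idea can be illustrated generically, for example with trilinear blending of control-point offsets over a unit cell; the Java sketch below shows only this general principle, not Sculptor's proprietary ASD algorithm:

```java
// Generic smooth volumetric deformation: every grid node moves by a trilinear
// blend of eight control-point offsets, so the mesh deforms without remeshing.
// Points are assumed to lie inside the unit cell [0,1]^3.
public class VolumeDeform {

    /** offsets[i][j][k] is the (dx,dy,dz) applied at lattice corner (i,j,k). */
    static double[] deform(double[] p, double[][][][] offsets) {
        double[] q = {p[0], p[1], p[2]};
        for (int i = 0; i <= 1; i++)
            for (int j = 0; j <= 1; j++)
                for (int k = 0; k <= 1; k++) {
                    double w = (i == 1 ? p[0] : 1 - p[0])
                             * (j == 1 ? p[1] : 1 - p[1])
                             * (k == 1 ? p[2] : 1 - p[2]);   // trilinear weight
                    for (int d = 0; d < 3; d++) q[d] += w * offsets[i][j][k][d];
                }
        return q;  // deformed grid-node position
    }
}
```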
Crawling The Web for Libre: Selecting, Integrating, Extending and Releasing Open Source Software
NASA Astrophysics Data System (ADS)
Truslove, I.; Duerr, R. E.; Wilcox, H.; Savoie, M.; Lopez, L.; Brandt, M.
2012-12-01
Libre is a project developed by the National Snow and Ice Data Center (NSIDC). Libre is devoted to liberating science data from its traditional constraints of publication, location, and findability. Libre embraces and builds on the notion of making knowledge freely available, and both Creative Commons licensed content and Open Source Software are crucial building blocks for, as well as required deliverable outcomes of the project. One important aspect of the Libre project is to discover cryospheric data published on the internet without prior knowledge of the location or even existence of that data. Inspired by well-known search engines and their underlying web crawling technologies, Libre has explored tools and technologies required to build a search engine tailored to allow users to easily discover geospatial data related to the polar regions. After careful consideration, the Libre team decided to base its web crawling work on the Apache Nutch project (http://nutch.apache.org). Nutch is "an open source web-search software project" written in Java, with good documentation, a significant user base, and an active development community. Nutch was installed and configured to search for the types of data of interest, and the team created plugins to customize the default Nutch behavior to better find and categorize these data feeds. This presentation recounts the Libre team's experiences selecting, using, and extending Nutch, and working with the Nutch user and developer community. We will outline the technical and organizational challenges faced in order to release the project's software as Open Source, and detail the steps actually taken. We distill these experiences into a set of heuristics and recommendations for using, contributing to, and releasing Open Source Software.
Dental students' evaluations of an interactive histology software.
Rosas, Cristian; Rubí, Rafael; Donoso, Manuel; Uribe, Sergio
2012-11-01
This study assessed dental students' evaluations of a new Interactive Histology Software (IHS) developed by the authors and compared students' assessment of the extent to which this new software, as well as other histology teaching methods, supported their learning. The IHS is a computer-based tool for histology learning that presents high-resolution images of histology basics as well as specific oral histologies at different magnifications and with text labels. Survey data were collected from 204 first-year dental students at the Universidad Austral de Chile. The survey consisted of questions for the respondents to evaluate the characteristics of the IHS and the contribution of various teaching methods to their histology learning. The response rate was 85 percent. Student evaluations were positive for the design, usability, and theoretical-practical integration of the IHS, and the students reported they would recommend the method to future students. The students continued to value traditional teaching methods for histological lab work and did not think this new technology would replace traditional methods. With respect to the contribution of each teaching method to students' learning, no statistically significant differences (p>0.05) were found for an evaluation of IHS, light microscopy, and slide presentations. However, these student assessments were significantly more positive than the evaluations of other digital or printed materials. Overall, the students evaluated the IHS very positively in terms of method quality and contribution to their learning; they also evaluated use of light microscopy and teacher slide presentations positively.
Salomon-Ferrer, Romelia; Götz, Andreas W; Poole, Duncan; Le Grand, Scott; Walker, Ross C
2013-09-10
We present an implementation of explicit solvent all atom classical molecular dynamics (MD) within the AMBER program package that runs entirely on CUDA-enabled GPUs. First released publicly in April 2010 as part of version 11 of the AMBER MD package and further improved and optimized over the last two years, this implementation supports the three most widely used statistical mechanical ensembles (NVE, NVT, and NPT), uses particle mesh Ewald (PME) for the long-range electrostatics, and runs entirely on CUDA-enabled NVIDIA graphics processing units (GPUs), providing results that are statistically indistinguishable from the traditional CPU version of the software and with performance that exceeds that achievable by the CPU version of AMBER software running on all conventional CPU-based clusters and supercomputers. We briefly discuss three different precision models developed specifically for this work (SPDP, SPFP, and DPDP) and highlight the technical details of the approach as it extends beyond previously reported work [Götz et al., J. Chem. Theory Comput. 2012, DOI: 10.1021/ct200909j; Le Grand et al., Comp. Phys. Comm. 2013, DOI: 10.1016/j.cpc.2012.09.022]. We highlight the substantial improvements in performance that are seen over traditional CPU-only machines and provide validation of our implementation and precision models. We also provide evidence supporting our decision to deprecate the previously described fully single precision (SPSP) model from the latest release of the AMBER software package.
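The idea behind an SPFP-style precision model, combining single-precision arithmetic with fixed-point accumulation, can be sketched as follows in Java; the scale factor is an illustrative assumption, and the real implementation runs on GPU hardware rather than in Java:

```java
// Sketch of fixed-point force accumulation: contributions are computed in
// single precision but summed into 64-bit integers, which makes the sum
// order-independent and deterministic (integer addition is associative).
public class FixedPointAccumulator {
    static final double SCALE = 1L << 40;          // fractional bits of the fixed-point format

    static long toFixed(float value)   { return Math.round((double) value * SCALE); }
    static double toDouble(long fixed) { return fixed / SCALE; }

    public static void main(String[] args) {
        float[] forceContributions = {1.0e-3f, -2.5e-4f, 7.75e-4f};
        long acc = 0L;
        for (float f : forceContributions) acc += toFixed(f);  // integer adds commute exactly
        System.out.println("accumulated force = " + toDouble(acc));
    }
}
```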
The Tetracorder user guide: version 4.4
Livo, Keith Eric; Clark, Roger N.
2014-01-01
Imaging spectroscopy mapping software assists in the identification and mapping of materials based on their chemical properties as expressed in spectral measurements of a planet including the solid or liquid surface or atmosphere. Such software can be used to analyze field, aircraft, or spacecraft data; remote sensing datasets; or laboratory spectra. Tetracorder is a set of software algorithms commanded through an expert system to identify materials based on their spectra (Clark and others, 2003). Tetracorder also can be used in traditional remote sensing analyses, because some of the algorithms are a version of a matched filter. Thus, depending on the instructions fed to the Tetracorder system, results can range from simple matched filter output, to spectral feature fitting, to full identification of surface materials (within the limits of the spectral signatures of materials over the spectral range and resolution of the imaging spectroscopy data). A basic understanding of spectroscopy by the user is required for developing an optimum mapping strategy and assessing the results.
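The spectral-feature-fitting building block can be illustrated by a simple least-squares fit of a continuum-removed reference feature to an observed spectrum; the Java sketch below shows this generic idea only, not Tetracorder's full expert-system logic:

```java
// Least-squares spectral feature fit: scale a continuum-removed reference
// absorption feature to the observation and score the match quality.
public class FeatureFit {

    /** Returns {depthScale, correlation} of reference vs. observation. */
    static double[] fit(double[] ref, double[] obs) {
        double rr = 0, ro = 0, oo = 0;
        for (int i = 0; i < ref.length; i++) {
            rr += ref[i] * ref[i];
            ro += ref[i] * obs[i];
            oo += obs[i] * obs[i];
        }
        double scale = ro / rr;                  // least-squares band-depth scaling
        double corr = ro / Math.sqrt(rr * oo);   // fit quality in [-1, 1]
        return new double[] {scale, corr};
    }
}
```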
Design and implementation of a cloud based lithography illumination pupil processing application
NASA Astrophysics Data System (ADS)
Zhang, Youbao; Ma, Xinghua; Zhu, Jing; Zhang, Fang; Huang, Huijie
2017-02-01
Pupil parameters are important for evaluating the quality of a lithography illumination system. In this paper, a cloud-based, full-featured pupil processing application is implemented. A web browser is used for the UI (user interface), the WebSocket protocol and the JSON format are used for communication between the client and the server, and the computing part is implemented on the server side, where the application integrates a variety of high-quality professional libraries, such as the image processing libraries libvips and ImageMagick and the automatic reporting system LaTeX. The cloud-based framework takes advantage of the server's superior computing power and rich software collection, and the program can run anywhere there is a modern browser thanks to its web UI design. Compared to the traditional software operation model (purchased, licensed, shipped, downloaded, installed, maintained, and upgraded), the new cloud-based approach, which requires no installation and is easy to use and maintain, opens up a new way of working. Cloud-based applications are probably the future of software development.
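The client-server exchange described above can be sketched with the standard Java WebSocket API (JSR 356); the endpoint path, message fields, and placeholder computation below are assumptions for illustration, not the paper's actual protocol:

```java
import javax.websocket.OnMessage;
import javax.websocket.Session;
import javax.websocket.server.ServerEndpoint;
import java.io.IOException;

// Sketch of a server-side WebSocket endpoint that receives a JSON request
// and replies with a computed pupil parameter as JSON.
@ServerEndpoint("/pupil")
public class PupilEndpoint {

    @OnMessage
    public void onMessage(String json, Session session) throws IOException {
        // In a real application the JSON request would select an image and
        // the pupil parameters to compute; here we echo a canned JSON reply.
        double ellipticity = 0.98;   // placeholder for the server-side computation
        session.getBasicRemote().sendText(
            "{\"parameter\":\"ellipticity\",\"value\":" + ellipticity + "}");
    }
}
```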
Model-Based Verification and Validation of Spacecraft Avionics
NASA Technical Reports Server (NTRS)
Khan, M. Omair; Sievers, Michael; Standley, Shaun
2012-01-01
Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.
Using XML and Java for Astronomical Instrumentation Control
NASA Technical Reports Server (NTRS)
Ames, Troy; Koons, Lisa; Sall, Ken; Warsaw, Craig
2000-01-01
Traditionally, instrument command and control systems have been highly specialized, consisting mostly of custom code that is difficult to develop, maintain, and extend. Such solutions are initially very costly and are inflexible to subsequent engineering change requests, increasing software maintenance costs. Instrument description is too tightly coupled with details of implementation. NASA Goddard Space Flight Center is developing a general and highly extensible framework that applies to any kind of instrument that can be controlled by a computer. The software architecture combines the platform-independent processing capabilities of Java with the power of the Extensible Markup Language (XML), a human-readable and machine-understandable way to describe structured data. A key aspect of the object-oriented architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, and communication mechanisms. Although the current effort is targeted for the High-resolution Airborne Wideband Camera, a first-light instrument of the Stratospheric Observatory for Infrared Astronomy, the framework is designed to be generic and extensible so that it can be applied to any instrument.
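The description-driven idea can be sketched as follows: a command set declared in XML is parsed generically at run time. The element and attribute names in this Java example are invented for illustration and are not the actual IML schema:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;

// Sketch: a generic framework reads an XML instrument description and can
// then build GUIs and command formatters from it without instrument-specific code.
public class ImlReader {
    public static void main(String[] args) throws Exception {
        String iml =
            "<instrument name='camera'>" +
            "  <command name='SET_EXPOSURE'><arg name='seconds' type='float'/></command>" +
            "</instrument>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(iml.getBytes("UTF-8")));
        NodeList commands = doc.getElementsByTagName("command");
        for (int i = 0; i < commands.getLength(); i++) {
            Element c = (Element) commands.item(i);
            System.out.println("command: " + c.getAttribute("name"));
        }
    }
}
```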
NADIR: A Flexible Archiving System Current Development
NASA Astrophysics Data System (ADS)
Knapic, C.; De Marco, M.; Smareglia, R.; Molinaro, M.
2014-05-01
The New Archiving Distributed InfrastructuRe (NADIR) is under development at the Italian center for Astronomical Archives (IA2) to increase the performance of the current archival software tools at the data center. Traditional software usually offers simple and robust solutions for data archiving and distribution but is awkward to adapt and reuse in projects with different purposes. Data evolution in terms of data model, format, publication policy, version, and metadata content is the main threat to re-use. NADIR, using stable and mature framework features, answers these very challenging issues. Its main characteristics are a configuration database; a multi-threading, multi-language environment (C++, Java, Python); special features to guarantee high scalability, modularity, robustness, and error tracking; and tools to monitor with confidence the status of each project at each archiving site. In this contribution, the development of the core components is presented, commenting also on some performance and innovative features (multi-cast and publisher-subscriber paradigms). NADIR is planned to be developed as simply as possible, with default configurations for every project, first of all for LBT and other IA2 projects.
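The publisher-subscriber paradigm mentioned above can be illustrated in a few lines of Java; this is a generic sketch of the pattern, not NADIR's actual implementation:

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;

// Tiny publisher-subscriber illustration: the archiving module publishes
// events, and any number of decoupled consumers react to them.
public class ArchiveBus {
    public interface Subscriber { void onNewFile(String path); }

    private final List<Subscriber> subscribers = new CopyOnWriteArrayList<>();

    public void subscribe(Subscriber s) { subscribers.add(s); }

    /** Called by the archiving thread whenever a new file has been ingested. */
    public void publish(String path) {
        for (Subscriber s : subscribers) s.onNewFile(path);  // fan-out to all listeners
    }

    public static void main(String[] args) {
        ArchiveBus bus = new ArchiveBus();
        bus.subscribe(p -> System.out.println("metadata extractor saw " + p));
        bus.subscribe(p -> System.out.println("replication service saw " + p));
        bus.publish("/archive/lbt/2014/img_0001.fits");
    }
}
```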
The Design and Implementation of Virtual Roaming in Yunnan Diqing Tibetan traditional Villages
NASA Astrophysics Data System (ADS)
Cao, Lucheng; Xu, Wu; Li, Ke; Jin, Chunjie; Su, Ying; He, Jin
2018-06-01
Traditional residences are a continuation of intangible cultural heritage and the primitive soil for its development. At present, the protection and inheritance of traditional villages is being impacted by the process of modernization, and the phenomenon of assimilation is very serious. This article takes these questions as its starting point, analyzes why and how virtual reality technology can better solve them, and takes the traditional Tibetan dwellings of Diqing, Yunnan as a specific example. First, using VR technology, with real images and sound, we simulate a near-real virtual world. Second, we collect a large amount of real image information and build a visualization model of the buildings using the 3DMAX software platform, UV mapping, and rendering optimization. Finally, the Vizard virtual reality development platform is used to build the roaming system and realize virtual interaction. The roaming system was published online, overcoming the disadvantages of unintuitive presentation and low interactivity; these new ideas can give a whole new meaning to protection projects for cultural relic buildings. At the same time, visitors can enjoy the "Dian-style" architectural style and cultural connotations of the dwelling houses of Diqing, Yunnan.
From Excavations to Web: a GIS for Archaeology
NASA Astrophysics Data System (ADS)
D'Urso, M. G.; Corsi, E.; Nemeti, S.; Germani, M.
2017-05-01
The study and protection of Cultural Heritage have in recent years undergone a revolution in terms of research tools and reference disciplines. The technological approach to the collection, organization, and publication of archaeological data using GIS software has completely changed the essence of traditional methods of investigation, paving the way for the development of several application areas, up to Cultural Resource Management. A relatively recent sector of archaeological GIS development is dedicated to intra-site analyses aimed at recording, processing, and displaying information obtained during excavations. The case study of the archaeological site located in the south-east of the San Pietro Vetere plateau in Aquino, in southern Lazio, illustrates a procedure describing the complete digital workflow of an intra-site analysis of an archaeological dig. The implementation of the GIS project and its publication on the web, thanks to several software packages, particularly the FOSS (Free and Open Source Software) Quantum GIS, are an opportunity to reflect on the strengths and critical aspects of this particular application of GIS technology. For future research developments, the identification of a digital protocol for processing excavation data (from acquisition and cataloguing to data insertion) is of fundamental importance, also in view of a possible future Open Project on medieval Aquino.
Chen, Shang-Liang; Chen, Yun-Yao; Hsu, Chiang
2014-01-01
Cloud computing is changing the ways software is developed and managed in enterprises, which is changing the way of doing business in that dynamically scalable and virtualized resources are regarded as services over the Internet. Traditional manufacturing systems such as supply chain management (SCM), customer relationship management (CRM), and enterprise resource planning (ERP) are often developed case by case. However, effective collaboration between different systems, platforms, programming languages, and interfaces has been suggested by researchers. In cloud-computing-based systems, distributed resources are encapsulated into cloud services and centrally managed, which allows high automation, flexibility, fast provision, and ease of integration at low cost. The integration between physical resources and cloud services can be improved by combining Internet of things (IoT) technology and Software-as-a-Service (SaaS) technology. This study proposes a new approach for developing cloud-based manufacturing systems based on a four-layer SaaS model. There are three main contributions of this paper: (1) enterprises can develop their own cloud-based logistic management information systems based on the approach proposed in this paper; (2) a case study based on literature reviews with experimental results is proposed to verify that the system performance is remarkable; (3) challenges encountered and feedback collected from T Company in the case study are discussed in this paper for the purpose of enterprise deployment. PMID:24686728
Software for integrated manufacturing systems, part 2
NASA Technical Reports Server (NTRS)
Volz, R. A.; Naylor, A. W.
1987-01-01
Part 1 presented an overview of the unified approach to manufacturing software. Here, the specific characteristics of the approach that allow it to realize the goals of reduced cost, increased reliability, and increased flexibility are considered. The paper examines why the blending of a components view, distributed languages, generics, and formal models is important, why each individual part of this approach is essential, and why each component will typically have each of these parts. An example of a specification for a real material handling system is presented using the approach and compared with the standard interface specification given by the manufacturer. Use of the component in a distributed manufacturing system is then compared with use of the traditional specification under a more traditional approach to designing the system. An overview is also provided of the underlying mechanisms used for implementing distributed manufacturing systems with the unified software/hardware component approach.
Combining Traditional and New Literacies in a 21st-Century Writing Workshop
ERIC Educational Resources Information Center
Bogard, Jennifer M.; McMackin, Mary C.
2012-01-01
This article describes how third graders combine traditional literacy practices, including writer's notebooks and graphic organizers, with new literacies, such as video editing software, to create digital personal narratives. The authors emphasize the role of planning in the recursive writing process and describe how technology-based audio…
Generic trending and analysis system
NASA Technical Reports Server (NTRS)
Keehan, Lori; Reese, Jay
1994-01-01
The Generic Trending and Analysis System (GTAS) is a generic spacecraft performance monitoring tool developed by NASA Code 511 and Loral Aerosys. It is designed to facilitate quick anomaly resolution and trend analysis. Traditionally, the job of off-line analysis has been performed using hardware and software systems developed for real-time spacecraft contacts, supplemented with a collection of tools developed by Flight Operations Team (FOT) members. Since the number of upcoming missions is increasing, NASA can no longer afford to operate in this manner. GTAS improves control center productivity and effectiveness because it provides a generic solution across multiple missions. Thus, GTAS eliminates the need for each individual mission to develop duplicate capabilities. It also allows for more sophisticated tools to be developed because it draws resources from several projects. In addition, the GTAS software system incorporates commercial off-the-shelf (COTS) software packages and reuses components of other NASA-developed systems wherever possible. GTAS has incorporated lessons learned from previous missions by involving the users early in the development process. GTAS users took a proactive role in requirements analysis, design, development, and testing. Because of user involvement, several special tools were designed and are now being developed. GTAS users expressed considerable interest in facilitating data collection for long-term trending and analysis. As a result, GTAS provides easy access to large volumes of processed telemetry data directly in the control center. The GTAS archival and retrieval capabilities are supported by the integration of optical disk technology and a COTS relational database management system.
Antiplagiarism Software Takes on the Honor Code
ERIC Educational Resources Information Center
Wasley, Paula
2008-01-01
Among the 100-odd colleges with academic honor codes, plagiarism-detection services raise a knotty problem: Is software compatible with a system based on trust? The answer frequently devolves to the size and culture of the university. Colleges with traditional student-run honor codes tend to "forefront" trust, emphasizing it above all else. This…
Software risk estimation and management techniques at JPL
NASA Technical Reports Server (NTRS)
Hihn, J.; Lum, K.
2002-01-01
In this talk we discuss how uncertainty has been incorporated into the JPL software model through probabilistic estimates, how risk is addressed, and how cost risk is currently being explored via a variety of approaches, from traditional risk lists to detailed WBS-based risk estimates to the Defect Detection and Prevention (DDP) tool.
After Losing Users in Catalogs, Libraries Find Better Search Software
ERIC Educational Resources Information Center
Parry, Marc
2010-01-01
Traditional online library catalogs do not tend to order search results by ranked relevance, and they can befuddle users with clunky interfaces. However, that's changing because of two technology trends. First, a growing number of universities are shelling out serious money for sophisticated software that makes exploring their collections more…
Using Information Technology in Teaching of Business Statistics in Nigeria Business School
ERIC Educational Resources Information Center
Hamadu, Dallah; Adeleke, Ismaila; Ehie, Ike
2011-01-01
This paper discusses the use of Microsoft Excel software in the teaching of statistics in the Faculty of Business Administration at the University of Lagos, Nigeria. Problems associated with existing traditional methods are identified and a novel pedagogy using Excel is proposed. The advantages of using this software over other specialized…
Learning Embedded Software Design in an Open 3A Multiuser Laboratory
ERIC Educational Resources Information Center
Shih, Chien-Chou; Hwang, Lain-Jinn
2011-01-01
The need for professional programmers in embedded applications has become critical for industry growth. This need has increased the popularity of embedded software design courses, which are resource-intensive and space-limited in traditional real lab-based instruction. To overcome geographic and time barriers in enhancing practical skills that…
2004-09-01
protection. Firewalls, Intrusion Detection Systems (IDSs), Anti-Virus (AV) software, and routers are such tools used. In recent years, computer security... associated with operating systems, application software, and computing hardware. When IDSs are utilized on a host computer or network, there are two... primary approaches to detecting and/or preventing attacks. Traditional IDSs, like most AV software, rely on known "signatures" to detect attacks
NASA Astrophysics Data System (ADS)
Handayani, Langlang; Prasetya Aji, Mahardika; Susilo; Marwoto, Putut
2016-08-01
An alternative arts-based instructional approach for a Basic Physics class has been developed through video analysis of a Javanese traditional dance, Bambangan Cakil. A particular movement of the dance, the weapon throw, was analyzed using the LoggerPro software package to exemplify projectile motion. The results of the analysis indicated that the movement of the thrown weapon in the Bambangan Cakil dance helps explain several physics concepts of projectile motion, namely the object's path, velocity, and acceleration, in the form of pictures, graphs, and tables. The weapon's path and velocity can be shown via a picture or graph, while a concept such as the changing velocity in the y direction (the weapon moving downward and upward) due to the acceleration g can be represented through the use of a table. It was concluded that a Javanese traditional dance contains many physics concepts that can be explored. The study recommends bringing traditional dance into the science class, which will enable students to gain more understanding of both physics concepts and Indonesian cultural heritage.
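For reference, the weapon-throw analysis rests on the standard projectile kinematics (uniform horizontal velocity, uniformly accelerated vertical motion):

```latex
% Standard projectile kinematics that the weapon-throwing analysis illustrates.
\begin{align}
  x(t) &= x_0 + v_{0x}\,t, &
  y(t) &= y_0 + v_{0y}\,t - \tfrac{1}{2} g t^{2},\\
  v_x(t) &= v_{0x}, &
  v_y(t) &= v_{0y} - g t.
\end{align}
```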
The relationships between software publications and software systems
NASA Astrophysics Data System (ADS)
Hogg, David W.
2017-01-01
When we build software systems or software tools for astronomy, we sometimes do and sometimes don't also write and publish standard scientific papers about those software systems. I will discuss the pros and cons of writing such publications. There are impacts of writing such papers immediately (they can affect the design and structure of the software project itself), in the short term (they can promote adoption and legitimize the software), in the medium term (they can provide a platform for all the literature's mechanisms for citation, criticism, and reuse), and in the long term (they can preserve ideas that are embodied in the software, possibly on timescales much longer than the lifetime of any software context). I will argue that as important as pure software contributions are to astronomy—and I am both a preacher and a practitioner—software contributions are even more valuable when they are associated with traditional scientific publications. There are exceptions and complexities of course, which I will discuss.
Reducing software mass through behavior control. [of planetary roving robots
NASA Technical Reports Server (NTRS)
Miller, David P.
1992-01-01
Attention is given to the tradeoff between communication and computation for a planetary rover (both subsystems are very power-intensive, and either can be the major driver of the rover's power subsystem and therefore of the minimum mass and size of the rover). Software techniques that can be used to reduce the requirements on both communication and computation, allowing the overall robot mass to be greatly reduced, are discussed. Novel approaches to autonomous control, called behavior control, employ an entirely different approach and, for many tasks, yield a similar or superior level of autonomy to traditional control techniques while greatly reducing the computational demand. Traditional systems have several expensive processes that operate serially, while behavior techniques employ robot capabilities that run in parallel. Traditional systems make extensive world models, while behavior control systems use minimal world models or none at all.
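A priority-based arbitration scheme of the kind used in behavior control can be sketched in a few lines of Java; the behaviors and sensor encoding below are illustrative assumptions, not the paper's actual design:

```java
// Sketch of behavior arbitration: simple behaviors run conceptually in
// parallel, and the highest-priority active behavior drives the rover,
// avoiding an expensive serial sense-model-plan-act pipeline.
public class BehaviorControl {
    interface Behavior { boolean active(double[] sensors); String command(); }

    public static void main(String[] args) {
        // Ordered from highest to lowest priority.
        Behavior[] behaviors = {
            new Behavior() {   // reflexive obstacle avoidance
                public boolean active(double[] s) { return s[0] < 0.5; }  // obstacle near
                public String command() { return "TURN_AWAY"; }
            },
            new Behavior() {   // default goal-seeking behavior
                public boolean active(double[] s) { return true; }
                public String command() { return "DRIVE_TOWARD_GOAL"; }
            }
        };
        double[] sensors = {0.3, 0.0};   // fake range reading: obstacle at 0.3 m
        for (Behavior b : behaviors) {
            if (b.active(sensors)) { System.out.println(b.command()); break; }
        }
    }
}
```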
An Exploration of Software-Based GNSS Signal Processing at Multiple Frequencies
NASA Astrophysics Data System (ADS)
Pasqual Paul, Manuel; Elosegui, Pedro; Lind, Frank; Vazquez, Antonio; Pankratius, Victor
2017-01-01
The Global Navigation Satellite System (GNSS; i.e., GPS, GLONASS, Galileo, and other constellations) has recently grown into numerous areas that go far beyond the traditional scope in navigation. In the geosciences, for example, high-precision GPS has become a powerful tool for a myriad of geophysical applications such as in geodynamics, seismology, paleoclimate, cryosphere, and remote sensing of the atmosphere. Positioning with millimeter-level accuracy can be achieved through carrier-phase-based, multi-frequency signal processing, which mitigates various biases and error sources such as those arising from ionospheric effects. Today, however, most receivers with multi-frequency capabilities are highly specialized hardware receiving systems with proprietary and closed designs, limited interfaces, and significant acquisition costs. This work explores alternatives that are entirely software-based, using Software-Defined Radio (SDR) receivers as a way to digitize the entire spectrum of interest. It presents an overview of existing open-source frameworks and outlines the next steps towards converting GPS software receivers from single-frequency to dual-frequency, geodetic-quality systems. In the future, this development will lead to a more flexible multi-constellation GNSS processing architecture that can be easily reused in different contexts, as well as to further miniaturization of receivers.
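The main reason dual-frequency capability matters for geodetic work is the standard ionosphere-free linear combination, which exploits the 1/f² scaling of the first-order ionospheric delay on observables P₁ and P₂ at carrier frequencies f₁ and f₂:

```latex
% Standard ionosphere-free linear combination: the first-order ionospheric
% delay scales as 1/f^2 and therefore cancels between the two observables.
\begin{equation}
  P_{\mathrm{IF}} \;=\; \frac{f_1^{2} P_1 \;-\; f_2^{2} P_2}{f_1^{2} - f_2^{2}}
\end{equation}
```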
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission's lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are kept consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
Lanza, Vincenzo
2002-12-01
The first part of this paper discussed the advantages of and communication tools needed to create a Distance Learning Center for continuing medical education using an Intranet or the Internet. This part continues with an explanation of the hardware, software (largely free), and human resources needed for videoconferencing, as well as the costs. Suitable even for small hospitals, Distance Learning Centers can be of higher quality than traditional methods of continuing medical education.
NASA Technical Reports Server (NTRS)
Britcher, Colin P.
1987-01-01
The technical background to the development of the digital control system of the NASA/Langley Research Center's 13-inch Magnetic Suspension and Balance System (MSBS) is reviewed. The implementation of traditional MSBS control algorithms in digital form is examined. Extensive details of the 13-inch MSBS digital controller and related hardware are given, together with introductory instructions for system operators. Full listings of the software are included in the appendices.
Titan Science Return Quantification
NASA Technical Reports Server (NTRS)
Weisbin, Charles R.; Lincoln, William
2014-01-01
Each proposal for a NASA mission concept includes a Science Traceability Matrix (STM), intended to show that what is being proposed would contribute to satisfying one or more of the agency's top-level science goals. But the information traditionally provided cannot be used directly to quantitatively compare anticipated science return. We added numerical elements to NASA's STM and developed a software tool to process the data. We then applied this methodology to evaluate a group of competing concepts for a proposed mission to Saturn's moon, Titan.
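As a hypothetical illustration of what adding numerical elements to an STM could look like (the authors' actual weighting scheme is not described here), mission concepts can be compared by a weighted sum of how well each satisfies each science objective:

```java
// Hypothetical STM scoring sketch: each objective gets an importance weight
// and an expected degree of satisfaction per concept; concepts are compared
// by the weighted sum. Illustrative only, not the authors' actual algorithm.
public class ScienceReturn {
    static double score(double[] weights, double[] satisfaction) {
        double total = 0.0;
        for (int i = 0; i < weights.length; i++) {
            total += weights[i] * satisfaction[i];   // contribution of objective i
        }
        return total;
    }

    public static void main(String[] args) {
        double[] weights  = {0.5, 0.3, 0.2};   // relative importance of objectives
        double[] conceptA = {0.9, 0.4, 0.7};   // expected satisfaction per objective
        double[] conceptB = {0.6, 0.8, 0.8};
        System.out.println("A = " + score(weights, conceptA));
        System.out.println("B = " + score(weights, conceptB));
    }
}
```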
Hinnen, Deborah A; Buskirk, Ann; Lyden, Maureen; Amstutz, Linda; Hunter, Tracy; Parkin, Christopher G; Wagner, Robin
2015-03-01
We assessed users' proficiency and efficiency in identifying and interpreting self-monitored blood glucose (SMBG), insulin, and carbohydrate intake data using data management software reports compared with standard logbooks. This prospective, self-controlled, randomized study enrolled insulin-treated patients with diabetes (PWDs) (continuous subcutaneous insulin infusion [CSII] and multiple daily insulin injection [MDI] therapy), patient caregivers [CGVs]) and health care providers (HCPs) who were naïve to diabetes data management computer software. Six paired clinical cases (3 CSII, 3 MDI) and associated multiple-choice questions/answers were reviewed by diabetes specialists and presented to participants via a web portal in both software report (SR) and traditional logbook (TL) formats. Participant response time and accuracy were documented and assessed. Participants completed a preference questionnaire at study completion. All participants (54 PWDs, 24 CGVs, 33 HCPs) completed the cases. Participants achieved greater accuracy (assessed by percentage of accurate answers) using the SR versus TL formats: PWDs, 80.3 (13.2)% versus 63.7 (15.0)%, P < .0001; CGVs, 84.6 (8.9)% versus 63.6 (14.4)%, P < .0001; HCPs, 89.5 (8.0)% versus 66.4 (12.3)%, P < .0001. Participants spent less time (minutes) with each case using the SR versus TL formats: PWDs, 8.6 (4.3) versus 19.9 (12.2), P < .0001; CGVs, 7.0 (3.5) versus 15.5 (11.8), P = .0005; HCPs, 6.7 (2.9) versus 16.0 (12.0), P < .0001. The majority of participants preferred using the software reports versus logbook data. Use of the Accu-Chek Connect Online software reports enabled PWDs, CGVs, and HCPs, naïve to diabetes data management software, to identify and utilize key diabetes information with significantly greater accuracy and efficiency compared with traditional logbook information. Use of SRs was preferred over logbooks. © 2014 Diabetes Technology Society.
SCOS 2: An object oriented software development approach
NASA Technical Reports Server (NTRS)
Symonds, Martin; Lynenskjold, Steen; Mueller, Christian
1994-01-01
The Spacecraft Control and Operations System 2 (SCOS 2) is intended to provide the generic mission control system infrastructure for future ESA missions. It represents a bold step forward to take advantage of state-of-the-art technology and current practices in the area of software engineering. Key features include: (1) use of object oriented analysis and design techniques; (2) use of UNIX, C++ and a distributed architecture as the enabling implementation technology; (3) the goal of re-use for development, maintenance and mission-specific software implementation; and (4) introduction of the concept of a spacecraft control model. This paper touches upon some of the traditional beliefs surrounding Object Oriented development and describes their relevance to SCOS 2. It gives the rationale for why particular approaches were adopted and others not, and describes the impact of these decisions. The development approach followed is discussed, highlighting the evolutionary nature of the overall process and the iterative nature of the various tasks carried out. The emphasis of this paper is on the process of the development, with the following being covered: (1) the three phases of the SCOS 2 project - prototyping & analysis, design & implementation, and configuration/delivery of mission-specific systems; (2) the close cooperation and continual interaction with the users during the development; (3) the management approach - the split between client staff and industry, and some of the required project management activities; (4) the lifecycle adopted, an enhancement of the ESA PSS-05 standard with SCOS 2 specific activities and approaches; and (5) an examination of some of the difficulties encountered and the solutions adopted. Finally, the lessons learned from the SCOS 2 experience are highlighted, identifying issues to be used as feedback for future developments of this nature. This paper does not intend to describe the finished product and its operation; it focuses on the journey to arrive there, concentrating on the process rather than the products of the SCOS 2 software development.
Enhanced chemical weapon warning via sensor fusion
NASA Astrophysics Data System (ADS)
Flaherty, Michael; Pritchett, Daniel; Cothren, Brian; Schwaiger, James
2011-05-01
Torch Technologies, Inc., is actively involved in chemical sensor networking and data fusion via multi-year efforts with Dugway Proving Ground (DPG) and the Defense Threat Reduction Agency (DTRA). The objective of these efforts is to develop innovative concepts and advanced algorithms that enhance our national Chemical Warfare (CW) test and warning capabilities via the fusion of traditional and non-traditional CW sensor data. Under Phase I, II, and III Small Business Innovative Research (SBIR) contracts with DPG, Torch developed the Advanced Chemical Release Evaluation System (ACRES) software to support non-real-time CW sensor data fusion. Under Phase I and II SBIRs with DTRA, in conjunction with the Edgewood Chemical Biological Center (ECBC), Torch is using the DPG ACRES CW sensor data fuser as a framework from which to develop the Cloud state Estimation in a Networked Sensor Environment (CENSE) data fusion system. Torch is currently developing CENSE to implement and test innovative real-time sensor-network-based data fusion concepts using CW and ancillary non-CW sensor data to improve CW warning and detection in tactical scenarios.
Teaching Programming via the Web: A Time-Tested Methodology
ERIC Educational Resources Information Center
Karsten, Rex; Kaparthi, Shashidhar; Roth, Roberta M.
2005-01-01
Advances in information and communication technologies give us the ability to reach out beyond the time and place limitations of the traditional classroom. However, effective online teaching is more than just transferring traditional courses to the World Wide Web (WWW). We describe how we have used "off the shelf" software and the infrastructure…
ERIC Educational Resources Information Center
Fulkerth, Robert
This paper discusses the processes and outcomes of translating a traditionally-taught business writing course into the online format, using bulletin board software. The paper covers creating, teaching, and managing the online business writing course at Golden Gate University (San Francisco, California). Pedagogical objectives are to emulate group…
A Comparison of Simplified-Visually Rich and Traditional Presentation Styles
ERIC Educational Resources Information Center
Johnson, Douglas A.; Christensen, Jack
2011-01-01
Microsoft PowerPoint and similar presentation tools have become commonplace in higher education, yet there is very little research on the effectiveness of different PowerPoint formats for implementing this software. This study compared two PowerPoint presentation techniques: a more traditional format employing heavy use of bullet points with text…
Discovering Theorems in Abstract Algebra Using the Software "GAP"
ERIC Educational Resources Information Center
Blyth, Russell D.; Rainbolt, Julianne G.
2010-01-01
A traditional abstract algebra course typically consists of the professor stating and then proving a sequence of theorems. As an alternative to this classical structure, the students could be expected to discover some of the theorems even before they are motivated by classroom examples. This can be done by using a software system to explore a…
Cabri-Geometre: Does Dynamic Geometry Software (DGS) Change Geometry and Its Teaching and Learning?
ERIC Educational Resources Information Center
Straesser, Rudolf
2001-01-01
Discusses geometry and Dynamical Geometry Software (DGS). Analyses the way DGS-use influences traditional geometry. Highlights changes in the interactions between geometry, computers, and DGS and human users, focusing on changes in the teaching and learning of geometry. Concludes that DGS deeply changes geometry if it is taken as a human activity…
Technology Evaluation Tools and Teacher Performance in Public Schools
ERIC Educational Resources Information Center
Stonehouse, Pauline; Keengwe, Jared
2013-01-01
The purpose of this study was, (a) to describe the introduction of mVAL software and Charlotte Danielson Rubrics (CDR) as teacher evaluation tools; (b) to compare the process and outcomes of the new initiative with traditional systems; and (c) to evaluate the software from the perspective of participants in the system. This study highlights the…
Soft Where? Licensing Struggles in a Virtual World
ERIC Educational Resources Information Center
Ramaswami, Rama
2011-01-01
As virtualization becomes commonplace in higher education, it is clear that the traditional licensing options for software are woefully inadequate. The definitions of who is licensed to use what--and where--are blurring, as users move from physical to virtual spaces and can access software from a variety of devices. In discussing the need for new…
Design and implementation of a general main axis controller for the ESO telescopes
NASA Astrophysics Data System (ADS)
Sandrock, Stefan; Di Lieto, Nicola; Pettazzi, Lorenzo; Erm, Toomas
2012-09-01
Most of the real-time control systems at the existing ESO telescopes were developed with "traditional" methods, using general-purpose VMEbus electronics and running applications that were coded by hand, mostly in the C programming language under VxWorks. As we move towards more modern design methods, we have explored a model-based design approach for real-time applications in the telescope area, using the control algorithm of a standard telescope main axis as a first example. We wanted a clear workflow that follows the "correct-by-construction" paradigm, where the implementation is testable in simulation on the development host and the time spent debugging on the target is minimized. It should respect the domains of control, electronics, and software engineers in the choice of tools. It should be a target-independent approach, so that the result can be deployed on various platforms. We selected the MathWorks tools Simulink, Stateflow, and Embedded Coder for design and implementation, and LabVIEW with NI hardware for hardware-in-the-loop testing, all of which are widely used in industry. We describe how these tools have been used to model, simulate, and test the application. We also evaluate the benefits of this approach compared to the traditional method with respect to testing effort and maintainability. For a specific axis controller application we have successfully integrated the result into the legacy platform of the existing VLT software, and we have demonstrated how to use the same design for a new development with a completely different environment.
NASA Technical Reports Server (NTRS)
Choudhary, Abdur Rahim
1994-01-01
The Science Operations Center (SOC) for the X-ray Timing Explorer (XTE) mission is an important component of the XTE ground system. Its mandate includes: (1) command and telemetry for the three XTE instruments, using CCSDS standards; (2) monitoring of the real-time science operations, reconfiguration of the experiment and the instruments, and real-time commanding to address targets of opportunity (TOO) and alternate observations; and (3) analysis, processing, and archival of the XTE telemetry, and the timely delivery of the data products to the principal investigator (PI) teams and the guest observers (GO). The SOC has two major components: the science operations facility (SOF), which addresses the first two objectives stated above, and the guest observer facility (GOF), which addresses the third. The SOF has subscribed to object oriented design and implementation, while the GOF uses the traditional approach in order to take advantage of existing software developed in support of previous missions. This paper details the SOF development using object oriented design (OOD) and its implementation using object oriented programming (OOP) in C++ under a Unix environment on a client-server architecture using Sun workstations. It also illustrates how the object oriented (OO) and traditional approaches coexist in the SOF and GOF, the lessons learned, and how OOD facilitated the distributed software development carried out collaboratively by four different teams. Details are presented for the SOF system, its major subsystems, its interfaces with the rest of the XTE ground data system, and its design and implementation approaches.
Neuhauser, Linda; Kreps, Gary L; Morrison, Kathleen; Athanasoulis, Marcos; Kirienko, Nikolai; Van Brunt, Deryk
2013-08-01
This paper describes how design science theory and methods and the use of artificial intelligence (AI) components can improve the effectiveness of health communication. We identified key weaknesses of traditional health communication and features of more successful eHealth/AI communication. We examined characteristics of the design science paradigm and the value of its user-centered methods to develop eHealth/AI communication. We analyzed a case example of the participatory design of AI components in the ChronologyMD project intended to improve management of Crohn's disease. eHealth/AI communication created with user-centered design shows improved relevance to users' needs for personalized, timely and interactive communication and is associated with better health outcomes than traditional approaches. Participatory design was essential to develop ChronologyMD system architecture and software applications that benefitted patients. AI components can greatly improve eHealth/AI communication, if designed with the intended audiences. Design science theory and its iterative, participatory methods linked with traditional health communication theory and methods can create effective AI health communication. eHealth/AI communication researchers, developers and practitioners can benefit from a holistic approach that draws from theory and methods in both the design sciences and the human and social sciences to create successful AI health communication.
Doughty, Teresa Taber; Bouck, Emily C; Bassette, Laura; Szwed, Kathryn; Flanagan, Sara
2013-01-01
The purpose of this study was to examine the effects of a pentop computer and accompanying spelling software on the spelling accuracy and academic engagement behavior in three elementary students with disabilities who were served in a resource room setting. Using a multiple baseline across students single subject research design, researchers determined student use of the pentop computer--the FLYPen--and its spelling software may serve as an equivalent intervention to traditional spelling instruction. While academic engagement performance increased considerably for students when using the FLYPen, results indicated little to no improvement over traditional instruction in spelling accuracy. Implications and suggestions for future research are presented.
Leveraging e-Science infrastructure for electrochemical research.
Peachey, Tom; Mashkina, Elena; Lee, Chong-Yong; Enticott, Colin; Abramson, David; Bond, Alan M; Elton, Darrell; Gavaghan, David J; Stevenson, Gareth P; Kennedy, Gareth F
2011-08-28
As in many scientific disciplines, modern chemistry involves a mix of experimentation and computer-supported theory. Historically, these skills have been provided by different groups, and range from traditional 'wet' laboratory science to advanced numerical simulation. Increasingly, progress is made by global collaborations, in which new theory may be developed in one part of the world and applied and tested in the laboratory elsewhere. e-Science, or cyber-infrastructure, underpins such collaborations by providing a unified platform for accessing scientific instruments, computers and data archives, and collaboration tools. In this paper we discuss the application of advanced e-Science software tools to electrochemistry research performed in three different laboratories--two at Monash University in Australia and one at the University of Oxford in the UK. We show that software tools that were originally developed for a range of application domains can be applied to electrochemical problems, in particular Fourier voltammetry. Moreover, we show that, by replacing ad-hoc manual processes with e-Science tools, we obtain more accurate solutions automatically.
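The Fourier voltammetry mentioned above rests on standard harmonic analysis. As a minimal sketch of that signal-processing step (not the authors' actual e-Science pipeline), the Python fragment below extracts the harmonic amplitudes of a simulated AC voltammetry current with an FFT; all signal parameters are invented for illustration.

```python
import numpy as np

# Simulated AC voltammetry: a sinusoidal perturbation of the electrode
# produces current harmonics (parameters below are illustrative).
fs = 1000.0                 # sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)
f0 = 9.0                    # AC perturbation frequency, Hz
# A nonlinear electrode response puts energy at f0, 2*f0, 3*f0 plus noise.
current = (1.0 * np.sin(2 * np.pi * f0 * t)
           + 0.3 * np.sin(2 * np.pi * 2 * f0 * t)
           + 0.1 * np.sin(2 * np.pi * 3 * f0 * t)
           + 0.05 * np.random.default_rng(0).standard_normal(t.size))

spectrum = np.fft.rfft(current)
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Report the amplitude of the first three harmonics.
for k in (1, 2, 3):
    idx = np.argmin(np.abs(freqs - k * f0))
    mag = 2 * np.abs(spectrum[idx]) / t.size
    print(f"harmonic {k} ({k * f0:.0f} Hz): amplitude ~ {mag:.3f}")
```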
NASA Astrophysics Data System (ADS)
Zhang, Xinyue; Zhang, Qisheng; Wang, Meng; Kong, Qiang; Zhang, Shengquan; He, Ruihao; Liu, Shenghui; Li, Shuhan; Yuan, Zhenzhong
2017-11-01
Due to the pressing demand for metallic ore exploration technology in China, several new technologies are being employed in the relevant exploration instruments. In addition to possessing the high resolution of the traditional transient electromagnetic method, high-efficiency measurements, and a short measurement time, the multichannel transient electromagnetic method (MTEM) technology can also sensitively determine the characteristics of a low-resistivity geologic body, without being affected by the terrain. Moreover, MTEM technology addresses the critical interference problem that exists in electrical exploration. This study develops a full-waveform voltage and current recording device for MTEM transmitters. After continuous acquisition and storage of the large, pseudo-random current signals emitted by the MTEM transmitter, these signals are then convolved with the signals collected by the receiver to obtain the earth's impulse response. In this paper, the overall design of the full-waveform recording apparatus, including the hardware and upper-computer software designs, the software interface display, and the results of field tests, is discussed in detail.
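For a white pseudo-random source current, cross-correlating the received voltage with the recorded source approximates the earth's impulse response. The sketch below illustrates this processing step on synthetic data; the sequence length, toy earth model, and normalization are assumptions for illustration, not the device's actual algorithm.

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(1)

# Pseudo-random binary source current, as a full-waveform recorder would
# capture it (length and earth model below are illustrative assumptions).
src = rng.choice([-1.0, 1.0], size=8192)

h_true = np.exp(-np.arange(128) / 20.0)          # toy earth impulse response
received = fftconvolve(src, h_true)[: src.size]  # simulated receiver record

# For a white +/-1 sequence the autocorrelation is ~N*delta, so
# cross-correlating the received record with the source recovers h.
xcorr = fftconvolve(received, src[::-1])
h_est = xcorr[src.size - 1 : src.size - 1 + h_true.size] / src.size

print("max abs error:", np.max(np.abs(h_est - h_true)))
```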
Research on AutoCAD secondary development and function expansion based on VBA technology
NASA Astrophysics Data System (ADS)
Zhang, Runmei; Gu, Yehuan
2017-06-01
AutoCAD is the most widely used tool among comparable design drawing products. Producing different types of design drawings for the same product involves a great deal of repetitive, monotonous work. Drawing these graphics manually in AutoCAD is inefficient, error-prone, and costly. To address these problems, this paper presents a parametric drawing system for hot-rolled I-beam (steel beam) cross-sections, built with the VBA secondary development tool and the Access database for large-capacity data storage, together with an analysis of the functional extension of plane drawing and parametric drawing design. This secondary development of AutoCAD functions simplifies drawing work and greatly improves efficiency. Introducing parametric design into AutoCAD drawing systems can promote mass production of standardized products such as hot-rolled I-beams and support economic growth in related industries.
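The system described is implemented in VBA inside AutoCAD; as a language-neutral illustration of the parametric idea (a sketch, not the authors' code), the following fragment computes the closed outline of an idealized I-beam cross-section from its nominal parameters.

```python
def i_beam_outline(h, b, tw, tf):
    """Return the closed outline of an idealized I-beam cross-section.

    h: overall height, b: flange width, tw: web thickness,
    tf: flange thickness. Corners are sharp; real hot-rolled sections
    have root radii, omitted here for brevity.
    """
    x0, x1 = -b / 2, b / 2
    w0, w1 = -tw / 2, tw / 2
    y_top, y_tf = h / 2, h / 2 - tf
    pts = [
        (x0, y_top), (x1, y_top), (x1, y_tf), (w1, y_tf),
        (w1, -y_tf), (x1, -y_tf), (x1, -y_top), (x0, -y_top),
        (x0, -y_tf), (w0, -y_tf), (w0, y_tf), (x0, y_tf),
    ]
    return pts + [pts[0]]  # close the polyline

# Example: a nominal 200x100 section with a 7 mm web and 11 mm flanges.
for x, y in i_beam_outline(200, 100, 7, 11):
    print(f"{x:8.1f} {y:8.1f}")
```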
NASA Astrophysics Data System (ADS)
Roccatello, E.; Nozzi, A.; Rumor, M.
2013-05-01
This paper illustrates the key concepts behind the design and development of a framework, based on OGC services, capable of visualizing large-scale 3D geospatial data streamed over the web. WebGISes are traditionally bound to a two-dimensional, simplified representation of reality, and though they successfully address the lack of flexibility and simplicity of traditional desktop clients, considerable effort is still needed to reach desktop GIS features such as 3D visualization. The motivations behind this work lie in the widespread availability of OGC Web Services inside government organizations and in web browsers' support for the HTML5 and WebGL standards. This delivers an improved user experience, similar to desktop applications, and thereby allows traditional WebGIS features to be augmented with a 3D visualization framework. This work can be seen as an extension of the Cityvu project, started in 2008 with the aim of building a plug-in-free OGC CityGML viewer. The resulting framework has also been integrated into existing 3D GIS software products and will be made available in the coming months.
Model for Simulating a Spiral Software-Development Process
NASA Technical Reports Server (NTRS)
Mizell, Carolyn; Curley, Charles; Nayak, Umanath
2010-01-01
A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code), productivity (number of lines of code per hour), and number of defects per source line of code. The user provides the number of resources, the overall percent of effort that should be allocated to each process step, and the number of desired staff members for each step. The output of PATT includes the size of the product, a measure of effort, a measure of rework effort, the duration of the entire process, and the numbers of injected, detected, and corrected defects as well as a number of other interesting features. In the development of the present model, steps were added to the IEEE 12207 waterfall process, and this model and its implementing software were made to run repeatedly through the sequence of steps, each repetition representing an iteration in a spiral process. Because the IEEE 12207 model is founded on a waterfall paradigm, it enables direct comparison of spiral and waterfall processes. The model can be used throughout a software-development project to analyze the project as more information becomes available. For instance, data from early iterations can be used as inputs to the model, and the model can be used to estimate the time and cost of carrying the project to completion.
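A toy version of such a simulation is sketched below; the step list, effort shares, and defect rates are invented placeholders rather than PATT's calibrated industry data, but the loop structure shows how repeating a waterfall step sequence accumulates effort and defects across spiral iterations.

```python
# Toy spiral-process simulation: repeat a waterfall step sequence per
# iteration and accumulate effort and defects. All rates are invented
# placeholders, not PATT's calibrated industry data.
STEPS = ["risk", "requirements", "design", "code", "test", "evaluate"]
EFFORT_SHARE = {"risk": 0.05, "requirements": 0.15, "design": 0.20,
                "code": 0.30, "test": 0.25, "evaluate": 0.05}

def simulate(iterations, loc_per_iter, productivity_loc_per_hr,
             defects_per_kloc=20.0, detect_rate=0.7):
    total_effort = 0.0
    latent_defects = 0.0
    for i in range(1, iterations + 1):
        effort = loc_per_iter / productivity_loc_per_hr   # hours this loop
        latent_defects += defects_per_kloc * loc_per_iter / 1000.0
        for step in STEPS:
            total_effort += effort * EFFORT_SHARE[step]
        # Testing finds a fraction of everything still latent.
        latent_defects -= detect_rate * latent_defects
        print(f"iter {i}: cumulative effort {total_effort:7.1f} h, "
              f"latent defects {latent_defects:5.1f}")
    return total_effort, latent_defects

simulate(iterations=4, loc_per_iter=5000, productivity_loc_per_hr=10)
```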
NASA Astrophysics Data System (ADS)
Lelièvre, Peter G.; Grey, Melissa
2017-08-01
Quantitative morphometric analyses of form are widely used in palaeontology, especially for taxonomic and evolutionary research. These analyses can involve several measurements performed on hundreds or even thousands of samples. Performing measurements of size and shape on large assemblages of macro- or microfossil samples is generally infeasible or impossible with traditional instruments such as vernier calipers. Instead, digital image processing software is required to perform measurements via suitable digital images of samples. Many software packages exist for morphometric analyses but there is not much available for the integral stage of data collection, particularly for the measurement of the outlines of samples. Some software exists to automatically detect the outline of a fossil sample from a digital image. However, automatic outline detection methods may perform inadequately when samples have incomplete outlines or images contain poor contrast between the sample and staging background. Hence, a manual digitization approach may be the only option. We are not aware of any software packages that are designed specifically for efficient digital measurement of fossil assemblages with numerous samples, especially for the purposes of manual outline analysis. Throughout several previous studies, we have developed a new software tool, JMorph, that is custom-built for that task. JMorph provides the means to perform many different types of measurements, which we describe in this manuscript. We focus on JMorph's ability to rapidly and accurately digitize the outlines of fossils. JMorph is freely available from the authors.
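JMorph's internals are not detailed here, but the basic outline measurements such a tool supports can be illustrated with a short sketch: given a manually digitized outline, area follows from the shoelace formula and perimeter from summed segment lengths.

```python
import numpy as np

def outline_metrics(xy):
    """Area (shoelace formula) and perimeter of a closed 2-D outline.

    xy: (N, 2) array of digitized outline points in order; the polygon
    is closed implicitly between the last and first point.
    """
    x, y = xy[:, 0], xy[:, 1]
    area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
    seg = np.diff(np.vstack([xy, xy[:1]]), axis=0)
    perimeter = np.hypot(seg[:, 0], seg[:, 1]).sum()
    return area, perimeter

# Example: points clicked around a roughly elliptical fossil outline.
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
outline = np.column_stack([3.0 * np.cos(theta), 2.0 * np.sin(theta)])
area, perim = outline_metrics(outline)
print(f"area ~ {area:.2f} (pi*a*b = {np.pi * 6:.2f}), perimeter ~ {perim:.2f}")
```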
NASA Astrophysics Data System (ADS)
Moritzer, Elmar; Müller, Ellen; Martin, Yannick; Kleeschulte, Rainer
2015-05-01
Today the global market poses great challenges for industrial product development. Complexity, diversity of variants, flexibility and individuality are just some of the features that products have to offer today. In addition, product series have shorter lifetimes. Because of their high capacity for adaptation, polymers are increasingly able to displace traditional materials such as wood, glass and metals from various fields of application. Polymers can only be used to substitute other materials, however, if they are optimally suited to the applications in question. Hence, product-specific material development is becoming increasingly important. Integrating the compounding step in the injection moulding process permits a more efficient and faster development process for a new polymer formulation, making it possible to create new product-specific materials. This process is called inline-compounding on an injection moulding machine. The entire process sequence is supported by software from Bayer Technology called Product Design Workbench (PDWB), which provides assistance in all the individual steps from data management, via analysis and model compilation, right through to the optimization of the formulation and the design of experiments. The software is based on artificial neural networks and can model the formulation-property correlations and thus enable different formulations to be optimized. In this study, the workflow and the modelling with the software are presented.
Closed-Loop Analysis of Soft Decisions for Serial Links
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Steele, Glen F.; Zucha, Joan P.; Schlesinger, Adam M.
2013-01-01
We describe the benefit of using closed-loop measurements for a radio receiver paired with a counterpart transmitter. We show that real-time analysis of the soft decision output of a receiver can provide rich and relevant insight far beyond the traditional hard-decision bit error rate (BER) test statistic. We describe a Soft Decision Analyzer (SDA) implementation for closed-loop measurements on single- or dual- (orthogonal) channel serial data communication links. The analyzer has been used to identify, quantify, and prioritize contributors to implementation loss in live-time during the development of software defined radios. This test technique gains importance as modern receivers are providing soft decision symbol synchronization as radio links are challenged to push more data and more protocol overhead through noisier channels, and software-defined radios (SDRs) use error-correction codes that approach Shannon's theoretical limit of performance.
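A minimal illustration of why soft decisions carry more information than a hard-decision BER (a sketch, not the SDA itself): the statistics of noisy BPSK soft symbols expose SNR margin loss even when few or no bits are actually flipped.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
bits = rng.integers(0, 2, n)
symbols = 2.0 * bits - 1.0               # BPSK mapping: 0 -> -1, 1 -> +1

for snr_db in (10.0, 6.0):
    sigma = 10 ** (-snr_db / 20)         # noise std for unit-energy symbols
    soft = symbols + sigma * rng.standard_normal(n)
    hard_ber = np.mean((soft > 0) != (bits == 1))
    # The soft-decision statistics quantify the remaining link margin,
    # which the single hard-decision BER number hides.
    cluster_std = soft[bits == 1].std()
    print(f"SNR {snr_db:4.1f} dB: BER = {hard_ber:.2e}, "
          f"mean |soft| = {np.abs(soft).mean():.3f}, "
          f"cluster std = {cluster_std:.3f}")
```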
Classroom Live: a software-assisted gamification tool
NASA Astrophysics Data System (ADS)
de Freitas, Adrian A.; de Freitas, Michelle M.
2013-06-01
Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an esthetically pleasing user interface that offers in-game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.
Field-based Information Technology in Geology Education: GeoPads
NASA Astrophysics Data System (ADS)
Knoop, P. A.; van der Pluijm, B.
2004-12-01
During the past two summers, we have successfully incorporated a field-based information technology component into our senior-level, field geology course (GS-440) at the University of Michigan's Camp Davis Geology Field Station, near Jackson, WY. Using GeoPads -- rugged TabletPCs equipped with electronic notebook software, GIS, GPS, and wireless networking -- we have significantly enhanced our field mapping exercises and field trips. While fully retaining the traditional approaches and advantages of field instruction, GeoPads offer important benefits in the development of students' spatial reasoning skills. GeoPads enable students to record observations and directly create geologic maps in the field, using a combination of an electronic field notebook (Microsoft OneNote) tightly integrated with pen-enabled GIS software (ArcGIS-ArcMap). Specifically, this arrangement permits students to analyze and manipulate their data in multiple contexts and representations -- while still in the field -- using both traditional 2-D map views, as well as richer 3-D contexts. Such enhancements provide students with powerful exploratory tools that aid the development of spatial reasoning skills, allowing more intuitive interactions with 2-D representations of our 3-D world. Additionally, field-based GIS mapping enables better error-detection, through immediate interaction with current observations in the context of both supporting data (e.g., topographic maps, aerial photos, magnetic surveys) and students' ongoing observations. The overall field-based IT approach also provides students with experience using tools that are increasingly relevant to their future academic or professional careers.
Welcome to the techno highway: development of a health assessment CD-ROM and website.
Bosco, Anna Maria; Ward, Catherine
2005-09-01
Traditionally, teaching nursing students psychomotor skills took place in a laboratory setting; however, recent developments in computer technology have revolutionised how educators can transfer knowledge. To meet the need for an efficient and interactive learning experience, a software product was required to educate nursing students about health assessment techniques. This paper presents how the existing 'old technology' of a video was given new life by embracing new technology, resulting in the development of an interactive CD-ROM with supporting WebCT. This innovation reflects a more flexible approach to learning as it is dynamic, portable, self-paced and more convenient for adult learners, especially those in remote areas.
The Role of Dynamic Geometry Software in High School Geometry Curricula: An Analysis of Proof Tasks
ERIC Educational Resources Information Center
Oner, Diler
2009-01-01
In this study, I examine the role of dynamic geometry software (DGS) in curricular proof tasks. I investigated seven US high school geometry textbooks that were categorised into three groups: technology-intensive, standards-based, and traditional curricula. I looked at the frequency and purpose of DGS use in these textbooks. In addition, I…
CrossTalk: The Journal of Defense Software Engineering. Volume 18, Number 10
2005-10-01
1: Disciplines Contributing to Software Assurance. When a natural disaster strikes, a corporation normally places a disaster recovery plan into effect...survivability, and contrasts survivability with the traditional disaster recovery and business continuity disciplines. A system survivability design...specialty engineering disciplines and their requirements. These disciplines include availability, reliability, maintainability, and accountability as
A Framework for Adopting LMS to Introduce e-Learning in a Traditional Course
ERIC Educational Resources Information Center
Georgouli, Katerina; Skalkidis, Ilias; Guerreiro, Pedro
2008-01-01
As more and more teachers in tertiary education experiment with technology, looking for new ways of enhancing their traditional ways of teaching, the need of flexible tools able to support well planned blended learning scenarios emerges. Learning Management Systems, especially those which are based on open source software, have shown to be very…
Utilizing Software Application Tools to Enhance Online Student Engagement and Achievement
ERIC Educational Resources Information Center
Andersson, David; Reimers, Karl
2010-01-01
The field of education is experiencing a rapid shift as internet-enabled distance learning becomes more widespread. Often, traditional classroom teaching pedagogical techniques can be ill-suited to the online environment. While a traditional entry-level class might see a student attrition rate of 5-10%, the same teaching pedagogy in an online…
Gaburro, Julie; Duchemin, Jean-Bernard; Paradkar, Prasad N; Nahavandi, Saeid; Bhatti, Asim
2016-11-18
Widespread in the tropics, the mosquito Aedes aegypti is an important vector of many viruses, posing a significant threat to human health. Vector monitoring often requires fecundity estimation by counting eggs laid by female mosquitoes. Traditionally, these data have been analyzed manually, but this requires considerable effort and the methods are prone to error. An easy tool to assess the number of eggs laid would facilitate experimentation and vector control operations. This study introduces a software tool called ICount that allows automatic egg counting for the mosquito vector Aedes aegypti. ICount egg estimates are statistically equivalent to manual counts, making the software effective for automatic and semi-automatic data analysis. This technique also allows rapid analysis compared to manual methods. Finally, the software has been used to assess p-cresol oviposition choices under laboratory conditions in order to test the system with different egg densities. ICount is a powerful tool for fast and precise egg count analysis, freeing experimenters from manual data processing. Software access is free and its user-friendly interface allows easy use by non-experts. Its efficiency has been tested in our laboratory with oviposition dual choices of Aedes aegypti females. The next step will be the development of a mobile application, based on the ICount platform, for vector monitoring surveys in the field.
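The abstract does not detail ICount's algorithm; a common baseline for this kind of task, sketched below on a synthetic image, is global thresholding followed by connected-component labeling, with small components rejected as noise.

```python
import numpy as np
from scipy import ndimage

# Synthetic "oviposition strip": bright egg blobs on a dark substrate,
# a stand-in for a real scanned image (all parameters illustrative).
rng = np.random.default_rng(3)
img = rng.normal(0.1, 0.02, size=(200, 200))
for _ in range(25):                       # paint 25 fake eggs
    r, c = rng.integers(10, 190, size=2)
    rr, cc = np.ogrid[-4:5, -4:5]
    img[r - 4:r + 5, c - 4:c + 5] += (rr**2 + cc**2 <= 16) * 0.8

mask = img > 0.5                          # global threshold (tunable)
labels, n_blobs = ndimage.label(mask)     # connected-component labeling
sizes = ndimage.sum(mask, labels, index=range(1, n_blobs + 1))
count = int(np.sum(sizes >= 10))          # reject tiny noise specks
print("estimated egg count:", count)      # touching eggs may merge and undercount
```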
Adding intelligent services to an object oriented system
NASA Technical Reports Server (NTRS)
Robideaux, Bret R.; Metzler, Theodore A.
1994-01-01
As today's software becomes increasingly complex, the need grows for intelligence of one sort or another to become part of the application, often an intelligence that does not readily fit the paradigm of one's software development. There are many methods of developing software, but at this time, the most promising is the object oriented (OO) method. This method involves an analysis to abstract the problem into separate 'objects' that are unique in the data that describe them and the behavior that they exhibit, and eventually to convert this analysis into computer code using a programming language that was designed (or retrofitted) for OO implementation. This paper discusses the creation of three different applications that are analyzed, designed, and programmed using the Shlaer/Mellor method of OO development and C++ as the programming language. All three, however, require the use of an expert system to provide an intelligence that C++ (or any other 'traditional' language) is not directly suited to supply. The flexibility of CLIPS permitted us to make modifications to it that allow seamless integration with any of our applications that require an expert system. We illustrate this integration with the following applications: (1) an after action review (AAR) station that assists a reviewer in watching a simulated tank battle and developing an AAR to critique the performance of the participants in the battle; (2) an embedded training system and over-the-shoulder coach for howitzer crewmen; and (3) a system to identify various chemical compounds from their infrared absorption spectra.
NASA Astrophysics Data System (ADS)
Grandi, C.; Italiano, A.; Salomoni, D.; Calabrese Melcarne, A. K.
2011-12-01
WNoDeS, an acronym for Worker Nodes on Demand Service, is software developed at CNAF-Tier1, the National Computing Centre of the Italian Institute for Nuclear Physics (INFN) located in Bologna. WNoDeS provides on demand, integrated access to both Grid and Cloud resources through virtualization technologies. Besides the traditional use of computing resources in batch mode, users need to have interactive and local access to a number of systems. WNoDeS can dynamically select these computers instantiating Virtual Machines, according to the requirements (computing, storage and network resources) of users through either the Open Cloud Computing Interface API, or through a web console. An interactive use is usually limited to activities in user space, i.e. where the machine configuration is not modified. In some other instances the activity concerns development and testing of services and thus implies the modification of the system configuration (and, therefore, root-access to the resource). The former use case is a simple extension of the WNoDeS approach, where the resource is provided in interactive mode. The latter implies saving the virtual image at the end of each user session so that it can be presented to the user at subsequent requests. This work describes how the LHC experiments at INFN-Bologna are testing and making use of these dynamically created ad-hoc machines via WNoDeS to support flexible, interactive analysis and software development at the INFN Tier-1 Computing Centre.
Advanced Numerical Methods and Software Approaches for Semiconductor Device Simulation
Carey, Graham F.; Pardhanani, A. L.; Bova, S. W.
2000-01-01
In this article we concisely present several modern strategies that are applicable to drift-dominated carrier transport in higher-order deterministic models such as the drift-diffusion, hydrodynamic, and quantum hydrodynamic systems. The approaches include extensions of “upwind” and artificial dissipation schemes, generalization of the traditional Scharfetter–Gummel approach, Petrov–Galerkin and streamline-upwind Petrov–Galerkin (SUPG), “entropy” variables, transformations, least-squares mixed methods and other stabilized Galerkin schemes such as Galerkin least squares and discontinuous Galerkin schemes. The treatment is representative rather than an exhaustive review and several schemes are mentioned only briefly with appropriate reference to the literature. Some of the methods have been applied to the semiconductor device problem while others are still in the early stages of development for this class of applications. We have included numerical examples from our recent research tests with some of the methods. A second aspect of the work deals with algorithms that employ unstructured grids in conjunction with adaptive refinement strategies. The full benefits of such approaches have not yet been developed in this application area and we emphasize the need for further work on analysis, data structures and software to support adaptivity. Finally, we briefly consider some aspects of software frameworks. These include dial-an-operator approaches such as that used in the industrial simulator PROPHET, and object-oriented software support such as those in the Sandia National Laboratories framework SIERRA.
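As an example of one listed scheme, the traditional Scharfetter–Gummel edge flux for the drift-diffusion model can be written with the Bernoulli function B(x) = x/(e^x - 1). The sketch below uses an illustrative normalization (sign conventions vary between texts) and evaluates B stably near x = 0.

```python
import numpy as np

def bernoulli(x):
    """B(x) = x / (exp(x) - 1), evaluated stably near x = 0."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    small = np.abs(x) < 1e-6
    out = np.empty_like(x)
    out[small] = 1.0 - x[small] / 2.0            # series expansion near 0
    out[~small] = x[~small] / np.expm1(x[~small])
    return out if out.size > 1 else out[0]

def sg_flux(n_i, n_j, v_i, v_j, h, D=1.0):
    """Scharfetter-Gummel flux on the edge between nodes i and j.

    n_*: carrier densities; v_*: potentials in units of the thermal
    voltage; h: edge length; D: diffusivity. Illustrative normalization;
    sign conventions vary between texts.
    """
    dv = v_j - v_i
    return (D / h) * (bernoulli(dv) * n_j - bernoulli(-dv) * n_i)

# Pure-diffusion limit (dv = 0) reduces to a centered difference:
print(sg_flux(1.0, 2.0, 0.0, 0.0, h=0.1))   # -> D * (n_j - n_i) / h = 10.0
```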
Remote access laboratories in Australia and Europe
NASA Astrophysics Data System (ADS)
Ku, H.; Ahfock, T.; Yusaf, T.
2011-06-01
Remote access laboratories (RALs) were first developed in 1994 in Australia and Switzerland. The main purposes of developing them are to enable students to do their experiments at their own pace, time and location, and to enable students and teaching staff to access facilities beyond their institutions. Currently, most of the experiments carried out through RALs in Australia are heavily biased towards the electrical, electronic and computer engineering disciplines. However, the experiments carried out through RALs in Europe had more variety; in addition to the traditional electrical, electronic and computer engineering disciplines, there were experiments in the mechanical and mechatronic disciplines. It was found that RALs are now being developed aggressively in Australia and Europe, and it can be argued that RALs will develop further and faster in the future with improving Internet technology. The rising costs of real experimental equipment will also speed up their development, because making the equipment remotely accessible allows the cost to be shared by more universities or institutions, improving cost-effectiveness. Their development would be particularly rapid in large countries with small populations, such as Australia, Canada and Russia, because of economies of scale. Reusability of software, interoperability in software implementation, computer supported collaborative learning and convergence with learning management systems are required developments for future RALs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tal, J.; Lopez, A.; Edwards, J.M.
1995-04-01
In this paper, an alternative solution to the traditional CNC machine tool controller has been introduced. Software and hardware modules have been described and their incorporation in a CNC control system has been outlined. This type of CNC machine tool controller demonstrates that the technology is accessible and can be readily implemented into an open architecture machine tool controller. The benefit to the user is greater controller flexibility, while being economically achievable. PC-based motion as well as non-motion features will provide flexibility through a Windows environment. Upgrading this type of controller system through software revisions will keep the machine tool in a competitive state with minimal effort. Software and hardware modules are mass produced, permitting competitive procurement and incorporation. Open architecture CNC systems provide diagnostics, thus enhancing maintainability and machine tool up-time. A major concern of traditional CNC systems has been operator training time. Training time can be greatly minimized by making use of Windows environment features.
Design and implementation of an online systemic human anatomy course with laboratory.
Attardi, Stefanie M; Rogers, Kem A
2015-01-01
Systemic Human Anatomy is a full credit, upper year undergraduate course with a (prosection) laboratory component at Western University Canada. To meet enrollment demands beyond the physical space of the laboratory facility, a fully online section was developed to run concurrently with the traditional face to face (F2F) course. Lectures given to F2F students are simultaneously broadcasted to online students using collaborative software (Blackboard Collaborate). The same collaborative software is used by a teaching assistant to deliver laboratory demonstrations in which three-dimensional (3D) virtual anatomical models are manipulated. Ten commercial software programs were reviewed to determine their suitability for demonstrating the virtual models, resulting in the selection of Netter's 3D Interactive Anatomy. Supplementary online materials for the central nervous system were developed by creating 360° images of plastinated prosected brain specimens and a website through which they could be accessed. This is the first description of a fully online undergraduate anatomy course with a live, interactive laboratory component. Preliminary data comparing the online and F2F student grades suggest that previous student academic performance, and not course delivery format, predicts performance in anatomy. Future qualitative studies will reveal student perceptions about their learning experiences in both of the course delivery formats.
Development of a computer-aided design software for dental splint in orthognathic surgery
NASA Astrophysics Data System (ADS)
Chen, Xiaojun; Li, Xing; Xu, Lu; Sun, Yi; Politis, Constantinus; Egger, Jan
2016-12-01
In orthognathic surgery, dental splints are important and necessary to help the surgeon reposition the maxilla or mandible. However, the traditional methods of manual design of dental splints are difficult and time-consuming. The research on computer-aided design software for dental splints is rarely reported. Our purpose is to develop a novel special software named EasySplint to design the dental splints conveniently and efficiently. The design can be divided into two steps, which are the generation of the initial splint base and the Boolean operation between it and the maxilla-mandibular model. The initial splint base is formed by ruled surfaces reconstructed using the manually picked points. Then, a method to accomplish the Boolean operation based on the distance field of two meshes is proposed. The interference elimination can be conducted on the basis of the marching cubes algorithm and Boolean operation. The accuracy of the dental splint can be guaranteed since the original mesh is utilized to form the result surface. Using EasySplint, the dental splints can be designed in about 10 minutes and saved as a stereolithography (STL) file for 3D printing in clinical applications. Three phantom experiments were conducted and the efficiency of our method was demonstrated.
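The essence of the distance-field Boolean step can be sketched on a voxel grid: sample signed distances for the two shapes, combine them with pointwise min/max, and extract the zero level set with marching cubes. The spheres below are stand-ins for the splint base and jaw model, not EasySplint's actual meshes.

```python
import numpy as np
from skimage import measure

# Sample signed distance fields (negative inside) on a voxel grid.
n = 64
ax = np.linspace(-2, 2, n)
X, Y, Z = np.meshgrid(ax, ax, ax, indexing="ij")

def sphere_sdf(cx, cy, cz, r):
    return np.sqrt((X - cx) ** 2 + (Y - cy) ** 2 + (Z - cz) ** 2) - r

a = sphere_sdf(-0.4, 0, 0, 1.0)     # stand-in for the splint base
b = sphere_sdf(+0.6, 0, 0, 0.8)     # stand-in for the jaw model

union = np.minimum(a, b)            # CSG on distance fields:
difference = np.maximum(a, -b)      # union = min, A minus B = max(A, -B)

# Extract the zero level set of the result as a triangle mesh.
verts, faces, normals, values = measure.marching_cubes(difference, level=0.0)
print(f"difference mesh: {len(verts)} vertices, {len(faces)} triangles")
```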
Not virtual, but a real, live, online, interactive reference service.
Jerant, Lisa Lott; Firestein, Kenneth
2003-01-01
In today's fast-paced environment, traditional medical reference services alone are not adequate to meet users' information needs. Efforts to find new ways to provide comprehensive service to users, where and when needed, have often included the use of new and developing technologies. This paper describes the experience of an academic health science library in developing and providing an online, real-time reference service. Issues discussed include selecting software, training librarians, staffing the service, and considering the future of the service. Use statistics, question type analysis, and feedback from users of the service and librarians who staff the service, are also presented.
NASA Astrophysics Data System (ADS)
Cheng, Po-Hsun; Chen, Sao-Jie; Lai, Jin-Shin; Lai, Feipei
This paper illustrates a feasible health informatics domain knowledge management process which helps gather useful technology information and reduce knowledge misunderstandings among engineers who have participated in the IBM mainframe rightsizing project at National Taiwan University (NTU) Hospital. We design an asynchronous sharing mechanism to facilitate knowledge transfer, and our health informatics domain knowledge management process can be used to publish and retrieve documents dynamically. It effectively creates an acceptable discussion environment and even lessens the traditional meeting burden among development engineers. An overall description of the current software development status is presented. Then, the knowledge management implementation of health information systems is proposed.
Aircraft integrated design and analysis: A classroom experience
NASA Technical Reports Server (NTRS)
Weisshaar, Terrence A.
1989-01-01
AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the availability of advanced computational capabilities and sophisticated electronic media availability at Purdue. This presentation will describe both the long range objectives and this year's experience using the High Speed Commercial Transport design, the AIAA Long Duration Aircraft design and RPV design proposal as project objectives. The central goal of these efforts is to provide a user-friendly, computer-software-based environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN) and stand-alone PC's are being used for this development. This year's accomplishments center primarily on aerodynamics software obtained from NASA/Langley and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of ten HSCT designs were generated, ranging from twin-fuselage aircraft, forward swept wing aircraft to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance.
Performance Characteristic Mems-Based IMUs for UAVs Navigation
NASA Astrophysics Data System (ADS)
Mohamed, H. A.; Hansen, J. M.; Elhabiby, M. M.; El-Sheimy, N.; Sesay, A. B.
2015-08-01
Accurate 3D reconstruction has become essential for non-traditional mapping applications such as urban planning, mining industry, environmental monitoring, navigation, surveillance, pipeline inspection, infrastructure monitoring, landslide hazard analysis, indoor localization, and military simulation. The needs of these applications cannot be satisfied by traditional mapping, which is based on dedicated data acquisition systems designed for mapping purposes. Recent advances in hardware and software development have made it possible to conduct accurate 3D mapping without using costly and high-end data acquisition systems. Low-cost digital cameras, laser scanners, and navigation systems can provide accurate mapping if they are properly integrated at the hardware and software levels. Unmanned Aerial Vehicles (UAVs) are emerging as a mobile mapping platform that can provide additional economical and practical advantages. However, such economical and practical requirements need navigation systems that can provide an uninterrupted navigation solution. Hence, testing the performance characteristics of Micro-Electro-Mechanical Systems (MEMS) or other low-cost navigation sensors for various UAV applications is an important research task. This work focuses on studying the performance characteristics under different manoeuvres using inertial measurements integrated with single-point positioning, Real-Time Kinematic (RTK) positioning, and additional navigational aiding sensors. Furthermore, the performance of the inertial sensors is tested during Global Positioning System (GPS) signal outage.
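The standard fusion mechanism behind such integration is a Kalman filter. A toy one-dimensional sketch is given below, propagating position and velocity with accelerometer input and correcting with intermittent GPS fixes; all noise figures and rates are invented for illustration, not the paper's test configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, steps = 0.01, 2000                    # 100 Hz IMU samples, 20 s run
true_acc = 0.5 * np.sin(0.5 * dt * np.arange(steps))   # toy manoeuvre

# State [position, velocity]; the accelerometer enters as a control input.
F = np.array([[1.0, dt], [0.0, 1.0]])
B = np.array([0.5 * dt**2, dt])
H = np.array([[1.0, 0.0]])                # GPS measures position only
Q = 1e-4 * np.eye(2)                      # process noise (IMU grade, assumed)
R = np.array([[4.0]])                     # GPS variance, (2 m)^2 assumed

x = np.zeros(2)
P = np.eye(2)
pos_true = vel_true = 0.0
for k in range(steps):
    vel_true += true_acc[k] * dt          # ground-truth integration
    pos_true += vel_true * dt
    acc_meas = true_acc[k] + 0.05 * rng.standard_normal()

    x = F @ x + B * acc_meas              # predict with the IMU
    P = F @ P @ F.T + Q

    if k % 100 == 0:                      # 1 Hz GPS fix
        z = pos_true + 2.0 * rng.standard_normal()
        y = z - H @ x                     # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P

print(f"final position error: {abs(x[0] - pos_true):.2f} m")
```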
NASA Astrophysics Data System (ADS)
Al, Can Mert; Yaman, Ulas
2018-05-01
In the scope of this study, an alternative automated method to the conventional design and fabrication pipeline of 3D printers is developed using an integrated CAD/CAE/CAM approach. It increases the load carrying capacity of parts by constructing heterogeneous infill structures. Traditional CAM software for Additive Manufacturing machinery starts with a design model in STL file format, which only includes data about the outer boundary in triangular mesh form. Depending on the given infill percentage, the algorithm running behind constructs the interior of the artifact using homogeneous infill structures. As opposed to current CAM software, the proposed method provides a way to construct heterogeneous infill structures with respect to the Von Mises stress field obtained from a finite element analysis. Throughout the work, Rhinoceros3D is used for the design of the parts along with Grasshopper3D, an algorithmic design tool for Rhinoceros3D. In addition, finite element analyses are performed using Karamba3D, a plug-in for Grasshopper3D. According to the results of the tensile tests, the method offers an improvement in load carrying capacity of about 50% compared to traditional slicing algorithms for 3D printing.
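The core mapping step, assigning a local infill density to each element from its Von Mises stress, can be sketched simply; the linear law and density bounds below are illustrative choices, not the authors' calibrated rule.

```python
import numpy as np

def stress_to_infill(von_mises, rho_min=0.15, rho_max=0.85):
    """Map element Von Mises stresses to local infill densities.

    Linear interpolation between rho_min and rho_max over the observed
    stress range; the bounds and the linear law are illustrative
    assumptions, not a calibrated design rule.
    """
    s = np.asarray(von_mises, dtype=float)
    lo, hi = s.min(), s.max()
    if hi == lo:
        return np.full_like(s, rho_min)
    return rho_min + (rho_max - rho_min) * (s - lo) / (hi - lo)

# Example: stresses (MPa) from a coarse FE mesh of a bracket.
stresses = np.array([2.0, 5.5, 11.0, 40.0, 8.0, 25.0])
print(np.round(stress_to_infill(stresses), 2))
```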
NASA Astrophysics Data System (ADS)
Ma, Xiaoli; Guo, Xiaoyu; Song, Yuelin; Qiao, Lirui; Wang, Wenguang; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong
2016-12-01
Clarification of the chemical composition of traditional Chinese medicine formulas (TCMFs) is a challenge due to the variety of structures and the complexity of plant matrices. Herein, an integrated strategy was developed by hyphenating ultra-performance liquid chromatography (UPLC), quadrupole time-of-flight (Q-TOF), hybrid triple quadrupole-linear ion trap mass spectrometry (Qtrap-MS), and the novel post-acquisition data processing software UNIFI to achieve automatic, rapid, accurate, and comprehensive qualitative and quantitative analysis of the chemical components in TCMFs. As a proof-of-concept, the chemical profiling of Baoyuan decoction (BYD), which is an ancient TCMF that is clinically used for the treatment of coronary heart disease that consists of Ginseng Radix et Rhizoma, Astragali Radix, Glycyrrhizae Radix et Rhizoma Praeparata Cum Melle, and Cinnamomi Cortex, was performed. As many as 236 compounds were plausibly or unambiguously identified, and 175 compounds were quantified or relatively quantified by the scheduled multiple reaction monitoring (sMRM) method. The findings demonstrate that the strategy integrating the rapidity of UNIFI software, the efficiency of UPLC, the accuracy of Q-TOF-MS, and the sensitivity and quantitation ability of Qtrap-MS provides a method for the efficient and comprehensive chemome characterization and quality control of complex TCMFs.
NASA Astrophysics Data System (ADS)
Frey, Jesse
In recent years there has been a growing interest in smaller satellites. Smaller satellites are cheaper to build and launch than larger satellites. One form factor, the CubeSat, is especially popular with universities and is a 10 cm cube. Being smaller means that the mass and power budgets are tighter and as such new ways must be developed to cope with these constraints. Traditional attitude control systems often use reaction wheels with gas thrusters which present challenges on a CubeSat. Many CubeSats use magnetic attitude control which uses the Earth's magnetic field to torque the satellite into the proper orientation. Magnetic attitude control systems fall into two main categories: active and passive. Active control is often achieved by running current through a coil to produce a dipole moment, while passive control uses the dipole moment from permanent magnets that consume no power. This thesis describes a system that uses twelve hard magnetic torquers along with a magnetometer. The torquers only consume current when their dipole moment is flipped, thereby significantly reducing power requirements compared with traditional active control. The main focus of this thesis is on the design, testing and fabrication of CubeSat hardware and software in preparation for launch.
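Physically, each torquer contributes a control torque tau = m x B. A simplified selection rule in the spirit of B-dot detumbling is sketched below: try each polarity pattern of three body-axis torquers and keep the one that extracts the most rotational energy. The geometry and all numbers are assumptions for illustration, not the thesis design.

```python
import numpy as np

def torque(m, B):
    """Magnetic control torque tau = m x B (N*m), SI units."""
    return np.cross(m, B)

# One flippable torquer per body axis, each holding +/- m0 until flipped.
m0 = 0.2                                  # dipole magnitude, A*m^2 (assumed)
B = np.array([2e-5, -1e-5, 4e-5])         # local field in body frame, T
omega = np.array([0.02, -0.01, 0.005])    # body rates, rad/s

best, best_power = None, np.inf
for signs in np.ndindex(2, 2, 2):         # try all 8 polarity patterns
    m = m0 * (2 * np.array(signs) - 1)    # each component is +/- m0
    power = np.dot(torque(m, B), omega)   # rate of change of rotational energy
    if power < best_power:                # most negative = best damping
        best, best_power = m, power

print("chosen dipole:", best, " torque:", torque(best, B))
```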
Phaser crystallographic software.
McCoy, Airlie J; Grosse-Kunstleve, Ralf W; Adams, Paul D; Winn, Martyn D; Storoni, Laurent C; Read, Randy J
2007-08-01
Phaser is a program for phasing macromolecular crystal structures by both molecular replacement and experimental phasing methods. The novel phasing algorithms implemented in Phaser have been developed using maximum likelihood and multivariate statistics. For molecular replacement, the new algorithms have proved to be significantly better than traditional methods in discriminating correct solutions from noise, and for single-wavelength anomalous dispersion experimental phasing, the new algorithms, which account for correlations between F(+) and F(-), give better phases (lower mean phase error with respect to the phases given by the refined structure) than those that use mean F and anomalous differences ΔF. One of the design concepts of Phaser was that it be capable of a high degree of automation. To this end, Phaser (written in C++) can be called directly from Python, although it can also be called using traditional CCP4 keyword-style input. Phaser is a platform for future development of improved phasing methods and their release, including source code, to the crystallographic community.
An automatic and effective parameter optimization method for model tuning
NASA Astrophysics Data System (ADS)
Zhang, T.; Li, L.; Lin, Y.; Xue, W.; Xie, F.; Xu, H.; Huang, X.
2015-05-01
Physical parameterizations in General Circulation Models (GCMs), having various uncertain parameters, greatly impact model performance and model climate sensitivity. Traditional manual and empirical tuning of these parameters is time consuming and ineffective. In this study, a "three-step" methodology is proposed to automatically and effectively obtain the optimum combination of some key parameters in cloud and convective parameterizations according to a comprehensive objective evaluation metric. Unlike traditional optimization methods, two extra steps, one determining parameter sensitivity and the other choosing the optimum initial values of the sensitive parameters, are introduced before the downhill simplex method to reduce the computational cost and improve the tuning performance. Atmospheric GCM simulation results show that the optimum combination of these parameters determined using this method is able to improve the model's overall performance by 9%. The proposed methodology and software framework can be easily applied to other GCMs to speed up the model development process, especially regarding the unavoidable comprehensive parameter tuning during the model development stage.
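A stand-in version of the three steps can be sketched with SciPy: screen parameter sensitivity one at a time, pick the best coarse starting value, then run the downhill simplex (Nelder-Mead) over the sensitive subset only. The objective and parameter names below are placeholders for an actual GCM evaluation metric, not the paper's setup.

```python
import numpy as np
from scipy.optimize import minimize

def metric(p):
    """Stand-in for the comprehensive evaluation metric (lower is better);
    a real application would run the GCM and score it against observations."""
    a, b, c = p
    return (a - 1.2) ** 2 + 4.0 * (b + 0.3) ** 2 + 1e-4 * c ** 2

names = ["entrainment", "autoconversion", "inert_param"]   # invented names
p0 = np.zeros(3)

# Step 1: one-at-a-time sensitivity screening (perturbation size assumed).
base = metric(p0)
sens = []
for i in range(p0.size):
    q = p0.copy()
    q[i] += 0.5
    sens.append(abs(metric(q) - base))
keep = [i for i, s in enumerate(sens) if s > 0.01 * max(sens)]
print("sensitive parameters:", [names[i] for i in keep])

# Step 2: choose the best starting point from a coarse scan, then
# Step 3: downhill simplex (Nelder-Mead) over the sensitive subset only.
def reduced(x):
    q = p0.copy()
    q[keep] = x
    return metric(q)

starts = [np.full(len(keep), v) for v in (-1.0, 0.0, 1.0)]
x0 = min(starts, key=reduced)
res = minimize(reduced, x0, method="Nelder-Mead")
print("optimum:", dict(zip([names[i] for i in keep], np.round(res.x, 3))))
```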
3D virtual character reconstruction from projections: a NURBS-based approach
NASA Astrophysics Data System (ADS)
Triki, Olfa; Zaharia, Titus B.; Preteux, Francoise J.
2004-05-01
This work has been carried out within the framework of the industrial project TOON, supported by the French government. TOON aims at developing tools for automating traditional 2D cartoon content production. This paper presents preliminary results of the TOON platform. The proposed methodology concerns the issues of 2D/3D reconstruction from a limited number of drawn projections, and 2D/3D manipulation/deformation/refinement of virtual characters. Specifically, we show that the NURBS-based modeling approach developed here offers a well-suited framework for generating deformable 3D virtual characters from incomplete 2D information. Furthermore, crucial functionalities such as animation and non-rigid deformation can also be efficiently handled and solved. Note that user interaction is enabled exclusively in 2D through a multiview constraint specification method. This is fully consistent and compliant with cartoon creators' traditional practice and makes it possible to avoid the use of 3D modeling software packages, which are generally complex to manipulate.
A Software Engineering Paradigm for Quick-turnaround Earth Science Data Projects
NASA Astrophysics Data System (ADS)
Moore, K.
2016-12-01
As is generally the case with applied sciences professional and educational programs, the participants of such programs can come from a variety of technical backgrounds. In the NASA DEVELOP National Program, the participants come from an interdisciplinary set of backgrounds, with varying levels of experience with computer programming. DEVELOP makes use of geographically explicit data sets, and it is necessary to use geographic information systems and geospatial image processing environments. As data sets cover longer time spans and include more complex sets of parameters, automation is becoming an increasingly prevalent feature. Though platforms such as ArcGIS, ERDAS Imagine, and ENVI facilitate the batch-processing of geospatial imagery, these environments naturally constrain the user to the tools that are available within them. Users must then turn to "homemade" scripting in more traditional programming languages such as Python, JavaScript, or R to automate workflows. However, in the context of quick-turnaround projects like those in DEVELOP, the programming learning curve may be prohibitively steep. In this work, we consider how to best design a software development paradigm that addresses two major constants: an arbitrarily experienced programmer and quick-turnaround project timelines.
Infrared small target detection technology based on OpenCV
NASA Astrophysics Data System (ADS)
Liu, Lei; Huang, Zhijian
2013-05-01
Accurate and fast detection of dim infrared (IR) targets is very important for infrared precision guidance, early warning, video surveillance, etc. In this paper, some basic principles and the implementation flow charts of a series of algorithms for target detection are described. These algorithms are the traditional two-frame difference method, an improved three-frame difference method, a background-estimate and frame-difference fusion method, and background construction with a neighborhood mean method. Building on this work, an infrared target detection software platform developed with OpenCV and MFC is introduced. Three kinds of tracking algorithms are integrated in this software. In order to explain the software clearly, its framework and functions are described in this paper. Finally, experiments are performed on some real-life IR images. The whole algorithm implementation process and results are analyzed, and the target detection algorithms are evaluated both subjectively and objectively. The results prove that the proposed method has satisfying detection effectiveness and robustness. Meanwhile, it has high detection efficiency and can be used for real-time detection.
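The two- and three-frame difference methods reduce to a few OpenCV calls; a minimal sketch on synthetic frames follows (the paper's platform itself is MFC-based, and the threshold and frame parameters here are illustrative).

```python
import numpy as np
import cv2

def three_frame_diff(f0, f1, f2, thresh=25):
    """Three-frame difference: AND of two consecutive two-frame
    differences, which suppresses the ghost a single difference
    leaves at the target's old position."""
    d01 = cv2.absdiff(f1, f0)
    d12 = cv2.absdiff(f2, f1)
    _, m01 = cv2.threshold(d01, thresh, 255, cv2.THRESH_BINARY)
    _, m12 = cv2.threshold(d12, thresh, 255, cv2.THRESH_BINARY)
    return cv2.bitwise_and(m01, m12)

# Synthetic IR frames: a small bright target drifting over a noisy background.
rng = np.random.default_rng(5)
frames = []
for k in range(3):
    f = rng.normal(60, 5, (128, 128)).astype(np.uint8)
    f[64:68, 30 + 10 * k : 34 + 10 * k] = 220     # 4x4 target, moving right
    frames.append(f)

mask = three_frame_diff(*frames)
ys, xs = np.nonzero(mask)
print(f"detected pixels: {xs.size}, centroid: ({xs.mean():.1f}, {ys.mean():.1f})")
```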
Rule groupings: An approach towards verification of expert systems
NASA Technical Reports Server (NTRS)
Mehrotra, Mala
1991-01-01
Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.
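A common first cut at such grouping is to link rules that interact through shared working-memory facts and take connected components of the resulting graph; the sketch below, with invented rules, is a simplification of that idea rather than the paper's algorithm.

```python
from itertools import combinations

# Each rule is recorded with the facts it tests (lhs) and asserts (rhs).
rules = {
    "r1": {"lhs": {"valve_open"}, "rhs": {"pressure_rising"}},
    "r2": {"lhs": {"pressure_rising"}, "rhs": {"alarm"}},
    "r3": {"lhs": {"temp_high"}, "rhs": {"fan_on"}},
    "r4": {"lhs": {"fan_on", "temp_high"}, "rhs": {"temp_falling"}},
}

# Connect rules that interact: one's rhs feeds the other's lhs, or they
# share lhs facts. Rule groups are then connected components (union-find).
parent = {r: r for r in rules}
def find(r):
    while parent[r] != r:
        parent[r] = parent[parent[r]]   # path halving
        r = parent[r]
    return r

for a, b in combinations(rules, 2):
    A, B = rules[a], rules[b]
    if (A["rhs"] & B["lhs"]) or (B["rhs"] & A["lhs"]) or (A["lhs"] & B["lhs"]):
        parent[find(a)] = find(b)

groups = {}
for r in rules:
    groups.setdefault(find(r), []).append(r)
print(list(groups.values()))   # -> [['r1', 'r2'], ['r3', 'r4']]
```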
Results of a Formal Methods Demonstration Project
NASA Technical Reports Server (NTRS)
Kelly, J.; Covington, R.; Hamilton, D.
1994-01-01
This paper describes the results of a cooperative study conducted by a team of researchers in formal methods at three NASA Centers to demonstrate FM techniques and to tailor them to critical NASA software systems. This pilot project applied FM to an existing critical software subsystem, the Shuttle's Jet Select subsystem (Phase I of an ongoing study). The present study shows that FM can be used successfully to uncover hidden issues in a highly critical and mature Functional Subsystem Software Requirements (FSSR) specification which are very difficult to discover by traditional means.
Enhancing GIS Capabilities for High Resolution Earth Science Grids
NASA Astrophysics Data System (ADS)
Koziol, B. W.; Oehmke, R.; Li, P.; O'Kuinghttons, R.; Theurich, G.; DeLuca, C.
2017-12-01
Applications for high performance GIS will continue to increase as Earth system models pursue more realistic representations of Earth system processes. Finer spatial resolution model input and output, unstructured or irregular modeling grids, data assimilation, and regional coordinate systems present novel challenges for GIS frameworks operating in the Earth system modeling domain. This presentation provides an overview of two GIS-driven applications that combine high performance software with big geospatial datasets to produce value-added tools for the modeling and geoscientific community. First, a large-scale interpolation experiment using National Hydrography Dataset (NHD) catchments, a high resolution rectilinear CONUS grid, and the Earth System Modeling Framework's (ESMF) conservative interpolation capability will be described. ESMF is a parallel, high-performance software toolkit that provides capabilities (e.g. interpolation) for building and coupling Earth science applications. ESMF is developed primarily by the NOAA Environmental Software Infrastructure and Interoperability (NESII) group. The purpose of this experiment was to test and demonstrate the utility of high performance scientific software in traditional GIS domains. Special attention will be paid to the nuanced requirements for dealing with high resolution, unstructured grids in scientific data formats. Second, a chunked interpolation application using ESMF and OpenClimateGIS (OCGIS) will demonstrate how spatial subsetting can virtually remove computing resource ceilings for very high spatial resolution interpolation operations. OCGIS is a NESII-developed Python software package designed for the geospatial manipulation of high-dimensional scientific datasets. An overview of the data processing workflow, why a chunked approach is required, and how the application could be adapted to meet operational requirements will be discussed here. In addition, we'll provide a general overview of OCGIS's parallel subsetting capabilities including challenges in the design and implementation of a scientific data subsetter.
Developing a protocol for creating microfluidic devices with a 3D printer, PDMS, and glass
NASA Astrophysics Data System (ADS)
Collette, Robyn; Novak, Eric; Shirk, Kathryn
2015-03-01
Microfluidics research requires the design and fabrication of devices that have the ability to manipulate small volumes of fluid, typically ranging from microliters to picoliters. These devices are used for a wide range of applications including the assembly of materials and testing of biological samples. Many methods have been previously developed to create microfluidic devices, including traditional nanolithography techniques. However, these traditional techniques are cost-prohibitive for many small-scale laboratories. This research explores a relatively low-cost technique using a 3D printed master, which is used as a template for the fabrication of polydimethylsiloxane (PDMS) microfluidic devices. The masters are designed using computer aided design (CAD) software and can be printed and modified relatively quickly. We have developed a protocol for creating simple microfluidic devices using a 3D printer and PDMS adhered to glass. This relatively simple and lower-cost technique can now be scaled to more complicated device designs and applications. Funding provided by the Undergraduate Research Grant Program at Shippensburg University and the Student/Faculty Research Engagement Grants from the College of Arts and Sciences at Shippensburg University.
NASA Astrophysics Data System (ADS)
Bowman, Christopher; Haith, Gary; Steinberg, Alan; Morefield, Charles; Morefield, Michael
2013-05-01
This paper describes methods to affordably improve the robustness of distributed fusion systems by opportunistically leveraging non-traditional data sources. Adaptive methods help find relevant data, create models, and characterize model quality. These methods can also measure the conformity of this non-traditional data with fusion system products, including situation modeling and mission impact prediction. Non-traditional data can improve the quantity, quality, availability, timeliness, and diversity of the baseline fusion system sources, and can therefore improve prediction and estimation accuracy and robustness at all levels of fusion. Techniques are described that automatically learn to characterize and search non-traditional contextual data, enabling operators to integrate the data with high-level fusion systems and ontologies. These techniques apply the extension of the Data Fusion & Resource Management Dual Node Network (DNN) technical architecture at Level 4. The DNN architecture supports effective assessment and management of the expanded portfolio of data sources, entities of interest, models, and algorithms, including data pattern discovery and context conformity. Affordable model-driven and data-driven data mining methods to discover unknown models from non-traditional and 'big data' sources are used to automatically learn entity behaviors and correlations with fusion products [14, 15]. This paper describes our context assessment software development and a demonstration in which context assessment of non-traditional data is compared to an intelligence, surveillance, and reconnaissance fusion product based upon an IED POI workflow.
He, Lan-Juan; Zhu, Xiang-Dong
2016-06-01
To analyze the regularities of prescriptions for diarrhoea in "a guide to clinical practice with medical record" (Ye Tianshi) based on the traditional Chinese medicine inheritance support system (V2.5), and to provide a reference for further research and development of new traditional Chinese medicines for treating diarrhoea. The traditional Chinese medicine inheritance support system was used to build a prescription database of Chinese medicines for diarrhoea. The software's integrated data mining methods were used to analyze the prescriptions according to "four natures", "five flavors" and "meridians" in the database and to produce frequency statistics, syndrome distribution, prescription regularities, and new prescription analysis. An analysis of 94 prescriptions for diarrhoea determined the frequencies of medicines in prescriptions and commonly used medicine pairs and combinations, and yielded 13 new prescriptions. This study indicated that the prescriptions for diarrhoea in "a guide to clinical practice with medical record" mostly aim at eliminating dampness and tonifying deficiency, with drugs neutral in property and sweet, bitter, or hot in flavor, reflecting the treatment principle of "activating spleen-energy and resolving dampness". Copyright© by the Chinese Pharmaceutical Association.
Virtual experiments in electronics: beyond logistics, budgets, and the art of the possible
NASA Astrophysics Data System (ADS)
Chapman, Brian
1999-09-01
It is common and correct to suppose that computers support flexible delivery of educational resources by offering virtual experiments that replicate and substitute for experiments traditionally offered in conventional teaching laboratories. However, traditional methods are limited by logistics, costs, and what is physically possible to accomplish on a laboratory bench. Virtual experiments allow experimental approaches to teaching and learning to transcend these limits. This paper analyses recent and current developments in educational software for 1st-year physics, 2nd-year electronics engineering and 3rd-year communication engineering, based on three criteria: (1) Is the virtual experiment possible in a real laboratory? (2) How direct is the link between the experimental manipulation and the reinforcement of theoretical learning? (3) What impact might the virtual experiment have on the learner's acquisition of practical measurement skills? Virtual experiments allow more flexibility in the directness of the link between experimental manipulation and the theoretical message. However, increasing the directness of this link may reduce or even abolish the measurement processes associated with traditional experiments. Virtual experiments thus pose educational challenges: (a) expanding the design of experimentally based curricula beyond traditional boundaries and (b) ensuring that the learner acquires sufficient experience in making practical measurements.
Jupp, Simon; Burdett, Tony; Welter, Danielle; Sarntivijai, Sirarat; Parkinson, Helen; Malone, James
2016-01-01
Authoring bio-ontologies is a task that has traditionally been undertaken by skilled experts trained in understanding complex languages such as the Web Ontology Language (OWL), in tools designed for such experts. As requests for new terms are made, the need for expert ontologists represents a bottleneck in the development process. Furthermore, the ability to rigorously enforce ontology design patterns in large, collaboratively developed ontologies is difficult with existing ontology authoring software. We present Webulous, an application suite for supporting ontology creation by design patterns. Webulous provides infrastructure to specify templates for populating ontology design patterns that get transformed into OWL assertions in a target ontology. Webulous provides programmatic access to the template server and a client application has been developed for Google Sheets that allows templates to be loaded, populated and resubmitted to the Webulous server for processing. The development and delivery of ontologies to the community requires software support that goes beyond the ontology editor. Building ontologies by design patterns and providing simple mechanisms for the addition of new content helps reduce the overall cost and effort required to develop an ontology. The Webulous system provides support for this process and is used as part of the development of several ontologies at the European Bioinformatics Institute.
ERIC Educational Resources Information Center
Hudson, Tina M.; Knight, Victoria; Collins, Belva C.
2012-01-01
This article provides an overview of the planning and instructional delivery of a course in Applied Behavior Analysis using Adobe Connect Pro™. A description of software features used by course instructors is provided along with how each feature compares to resources found to deliver instruction in a traditional classroom setting. In addition, the…
Expedited Systems Engineering for Rapid Capability and Urgent Needs
2012-12-31
rapid organizations start to differ from traditional ones, and there is a shift in energy, commitment, and knowledge. These findings are motivated by an analysis of effective…
Kamel Boulos, Maged N; Wheeler, Steve
2007-03-01
Web 2.0 sociable technologies and social software are presented as enablers in health and health care, for organizations, clinicians, patients and laypersons. They include social networking services, collaborative filtering, social bookmarking, folksonomies, social search engines, file sharing and tagging, mashups, instant messaging, and online multi-player games. The more popular Web 2.0 applications in education, namely wikis, blogs and podcasts, are but the tip of the social software iceberg. Web 2.0 technologies represent a quite revolutionary way of managing and repurposing/remixing online information and knowledge repositories, including clinical and research information, in comparison with the traditional Web 1.0 model. The paper also offers a glimpse of future software, touching on Web 3.0 (the Semantic Web) and how it could be combined with Web 2.0 to produce the ultimate architecture of participation. Although the tools presented in this review look very promising and potentially fit for purpose in many health care applications and scenarios, careful thinking, testing and evaluation research are still needed in order to establish 'best practice models' for leveraging these emerging technologies to boost our teaching and learning productivity, foster stronger 'communities of practice', and support continuing medical education/professional development (CME/CPD) and patient education.
Tokamak-independent software analysis suite for multi-spectral line-polarization MSE diagnostics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, S. D.; Mumgaard, R. T.
2016-07-20
A tokamak-independent analysis suite has been developed to process data from Motional Stark Effect (MSE) diagnostics. The software supports multi-spectral line-polarization MSE diagnostics which simultaneously measure emission at the MSE σ and π lines as well as at two "background" wavelengths that are displaced from the MSE spectrum by a few nanometers. This analysis accurately estimates the amplitude of partially polarized background light at the σ and π wavelengths even in situations where the background light changes rapidly in time and space, a distinct improvement over traditional "time-interpolation" background estimation. The signal amplitude at many frequencies is computed using a numerical-beat algorithm which allows the retardance of the MSE photo-elastic modulators (PEMs) to be monitored during routine operation. It also allows the use of summed intensities at multiple frequencies in the calculation of polarization direction, which increases the effective signal strength and reduces sensitivity to PEM retardance drift. The software allows the polarization angles to be corrected for calibration drift using a system that illuminates the MSE diagnostic with polarized light at four known polarization angles within ten seconds of a plasma discharge. The software suite is modular, parallelized, and portable to other facilities.
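The abstract does not spell out the numerical-beat algorithm; one standard way to recover the amplitude of a digitized signal at a known PEM harmonic, sketched here with numpy, is quadrature (lock-in style) demodulation. The function name is an assumption, not taken from the suite.

```python
import numpy as np

def amplitude_at(signal, fs, f):
    """Estimate the amplitude of the component of `signal` at frequency f,
    sampled at rate fs; most accurate over an integer number of periods."""
    t = np.arange(signal.size) / fs
    i = 2.0 * np.mean(signal * np.cos(2 * np.pi * f * t))  # in-phase part
    q = 2.0 * np.mean(signal * np.sin(2 * np.pi * f * t))  # quadrature part
    return np.hypot(i, q)
```

Evaluating this at the PEM harmonics and their sum/difference (beat) frequencies yields the kind of multi-frequency intensity components the abstract describes as inputs to the polarization-direction calculation.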
Integrating Formal Methods and Testing 2002
NASA Technical Reports Server (NTRS)
Cukic, Bojan
2002-01-01
Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or better). The coming years shall address methodologies to realistically estimate the impacts of various V&V techniques on system reliability, and shall include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.
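A back-of-the-envelope calculation (not from the paper) shows why purely statistical testing is impractical at these levels, and hence why crediting other V&V evidence matters: with n independent failure-free tests, the confidence C that the per-demand failure probability is below p satisfies C = 1 - (1 - p)^n.

```python
import math

def tests_needed(p, confidence):
    """Failure-free test cases needed to claim failure probability < p."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

print(tests_needed(1e-4, 0.99))  # -> 46050 failure-free executions
```

Roughly 46,000 failure-free executions are needed at the 10^-4 level with 99% confidence, which is the testing burden the proposed framework aims to reduce.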
A system for the real time, direct measurement of natural gas flow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sowell, T.
1995-12-31
PMI/Badger Meter, Inc., with partial sponsorship from the Gas Research Institute, has designed and developed direct-measurement total energy flow metering instrumentation. As industry demands for improved accuracy and speed of measurement have increased, so has the complexity of the overall hardware and software systems. Considering traditional system approaches, few companies have the in-house capability to maintain a complete system. This paper addresses efforts to implement a direct, total gas energy flow metering system which is simple to use and cost effective.
Architecture for Business Intelligence in the Healthcare Sector
NASA Astrophysics Data System (ADS)
Lee, Sang Young
2018-03-01
The healthcare environment is growing to include not only traditional information systems, but also business intelligence platforms. For executive leaders, consultants, and analysts, there is no longer a need to spend hours designing and developing typical reports or charts; an entire solution can be completed using business intelligence software. This paper highlights the advantages of big data analytics and business intelligence in the healthcare industry, and focuses its discussion on intelligent techniques and methodologies recently used for business intelligence in healthcare.
He, Xi-Ran; Li, Chun-Guang; Zhu, Xiao-Shu; Li, Yuan-Qing; Jarouche, Mariam; Bensoussan, Alan; Li, Ping-Ping
2017-01-01
There is a recognized challenge in analyzing traditional Chinese medicine formulas because of their complex chemical compositions. The application of modern analytical techniques such as high-performance liquid chromatography coupled with tandem mass spectrometry has significantly improved the characterization of various compounds from traditional Chinese medicine formulas. This study aims to conduct a bibliometric analysis to recognize the overall trend of high-performance liquid chromatography coupled with tandem mass spectrometry approaches in the analysis of traditional Chinese medicine formulas, their significance, and possible underlying interactions between individual herbs in these formulas. Electronic databases were searched systematically, and the identified studies were collected and analyzed using Microsoft Access 2010, GraphPad 5.0 software, and the Ucinet software package. 338 publications between 1997 and 2015 were identified and analyzed in terms of annual growth and accumulated publications, top journals, forms of traditional Chinese medicine preparations, highly studied formulas and single herbs, and social network analysis of single herbs. There has been a significantly increasing trend over the last 3 years in the use of high-performance liquid chromatography coupled with tandem mass spectrometry techniques in the analysis of commonly used forms of traditional Chinese medicine formulas. Stringent quality control is of great significance for the modernization and globalization of traditional Chinese medicine, and this bibliometric analysis provides the first comprehensive summary of this field. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Easy Leaf Area: Automated digital image analysis for rapid and accurate measurement of leaf area.
Easlon, Hsien Ming; Bloom, Arnold J
2014-07-01
Measurement of leaf areas from digital photographs has traditionally required significant user input unless backgrounds are carefully masked. Easy Leaf Area was developed to batch process hundreds of Arabidopsis rosette images in minutes, removing background artifacts and saving results to a spreadsheet-ready CSV file. Easy Leaf Area uses the color ratios of each pixel to distinguish leaves and calibration areas from their background and compares leaf pixel counts to a red calibration area, eliminating the need for camera distance calculations or the manual ruler-scale measurement that other software methods typically require. Leaf areas estimated by this software from images taken with a camera phone were more accurate than ImageJ estimates from flatbed scanner images. Easy Leaf Area provides an easy-to-use method for rapid measurement of leaf area and nondestructive estimation of canopy area from digital images.
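A minimal numpy sketch of the pixel-ratio idea described above, assuming an RGB image that contains a red calibration square of known area; the ratio threshold and default calibration area are illustrative, not Easy Leaf Area's actual values.

```python
import numpy as np

def leaf_area_cm2(rgb, red_area_cm2=4.0, ratio=1.2):
    """Scale green-dominant (leaf) pixel counts by a red area of known size."""
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    leaf = (g > ratio * r) & (g > ratio * b)    # green-dominant pixels
    calib = (r > ratio * g) & (r > ratio * b)   # red calibration pixels
    return leaf.sum() * red_area_cm2 / max(calib.sum(), 1)
```

Because leaf area is computed as a ratio of pixel counts against a physical reference in the same image, no camera-distance or ruler calibration is needed.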
A Visualization-Based Tutoring Tool for Engineering Education
NASA Astrophysics Data System (ADS)
Nguyen, Tang-Hung; Khoo, I.-Hung
2010-06-01
In engineering disciplines, students usually have a hard time visualizing different aspects of engineering analysis and design, which are inherently too complex or abstract to fully understand without the aid of visual explanations or visualizations. For example, when learning the materials and sequences of a construction process, students need to visualize how all components of a constructed facility are assembled. Such visualization cannot be achieved in a textbook or a traditional lecturing environment. In this paper, the authors present the development of computer tutoring software in which different visualization tools, including video clips, 3-dimensional models, drawings, and pictures/photos together with complementary texts, are used to assist students in deeply understanding and effectively mastering course materials. The paper also discusses the implementation and effectiveness evaluation of the proposed tutoring software, which was used to teach a construction engineering management course offered at California State University, Long Beach.
An Overview of the XGAM Code and Related Software for Gamma-ray Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Younes, W.
2014-11-13
The XGAM spectrum-fitting code and associated software were developed specifically to analyze the complex gamma-ray spectra that can result from neutron-induced reactions. The XGAM code is designed to fit a spectrum over the entire available gamma-ray energy range as a single entity, in contrast to the more traditional piecewise approaches. This global-fit philosophy enforces background continuity as well as consistency between local and global behavior throughout the spectrum, and in a natural way. This report presents XGAM and the suite of programs built around it with an emphasis on how they fit into an overall analysis methodology for complex gamma-ray data. An application to the analysis of time-dependent delayed gamma-ray yields from 235U fission is shown in order to showcase the codes and how they interact.
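To make the global-fit contrast with piecewise approaches concrete, here is a hedged Python sketch (not XGAM's actual model): every peak shares one continuous background, and the whole spectrum is fit as a single least-squares problem.

```python
import numpy as np
from scipy.optimize import curve_fit

def spectrum(e, b0, b1, *peaks):
    """Linear background plus Gaussian peaks given as (amp, mu, sigma) triples."""
    y = b0 + b1 * e
    for amp, mu, sig in zip(peaks[0::3], peaks[1::3], peaks[2::3]):
        y = y + amp * np.exp(-0.5 * ((e - mu) / sig) ** 2)
    return y

# With measured arrays e (energies) and counts, and an initial guess
# p0 = [b0, b1, amp1, mu1, sig1, ...], a single global fit would be:
# popt, pcov = curve_fit(spectrum, e, counts, p0=p0)
```

Fitting all peaks against one shared background is what enforces the background continuity and local/global consistency the report describes.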
Collaboration and decision making tools for mobile groups
NASA Astrophysics Data System (ADS)
Abrahamyan, Suren; Balyan, Serob; Ter-Minasyan, Harutyun; Degtyarev, Alexander
2017-12-01
Nowadays, distributed collaboration tools are widely used in many areas of human activity, but a lack of mobility and dependency on particular equipment create difficulties and slow the development and integration of such technologies. Mobile technologies allow individuals to interact with each other without the need for traditional office spaces and regardless of location. Hence, realizing special infrastructures on mobile platforms, with the help of ad-hoc wireless local networks, could eliminate hardware attachment and also be useful from a scientific standpoint. Implementations of tools built on mobile infrastructures range from basic internet messengers to complex software for online collaboration in large-scale workgroups. Despite the growth of mobile infrastructures, applied distributed solutions for group decision-making and e-collaboration are not common. In this article, we propose a software complex for real-time collaboration and decision-making based on mobile devices, describe its architecture, and evaluate its performance.
Fault Injection Validation of a Safety-Critical TMR System
NASA Astrophysics Data System (ADS)
Irrera, Ivano; Madeira, Henrique; Zentai, Andras; Hergovics, Beata
2016-08-01
Digital systems and their software are the core technology for controlling and monitoring industrial systems in practically all activity domains. Functional safety standards such as the European standard EN 50128 for railway applications define the procedures and technical requirements for the development of software for railway control and protection systems. The validation of such systems is a highly demanding task. In this paper we discuss the use of fault injection techniques, which have been used extensively in several domains, particularly the space domain, to complement the traditional procedures for validating a SIL (Safety Integrity Level) 4 system for railway signalling implementing a TMR (Triple Modular Redundancy) architecture. The fault injection tool is based on JTAG technology. The results of our injection campaign showed a high degree of tolerance to most of the injected faults, but several cases of unexpected behaviour were also observed, helping us understand worst-case scenarios.
Computer-aided implant design for the restoration of cranial defects.
Chen, Xiaojun; Xu, Lu; Li, Xing; Egger, Jan
2017-06-23
Patient-specific cranial implants are important and necessary in cranial defect restoration surgery. However, traditional methods of manually designing cranial implants are complicated and time-consuming. Our purpose is to develop novel software named EasyCrania to design cranial implants conveniently and efficiently. The process can be divided into five steps: mirroring the model, clipping the surface, fitting the surface, generating the initial implant, and generating the final implant. The main concept of our method is to use the geometry information of the mirrored model as the base for generating the final implant. Comparative studies demonstrated that EasyCrania can significantly improve the efficiency of cranial implant design. The intra- and inter-rater reliability of the software was stable, at 87.07 ± 1.6% and 87.73 ± 1.4%, respectively.
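A minimal sketch of the mirroring step, assuming the skull mesh has already been registered so the midsagittal plane is x = 0 (the abstract does not describe EasyCrania's registration details):

```python
import numpy as np

def mirror_vertices(vertices):
    """Reflect an (N, 3) vertex array across the x = 0 plane; triangle
    winding order should also be flipped so surface normals stay outward."""
    mirrored = vertices.copy()
    mirrored[:, 0] *= -1.0
    return mirrored
```

The healthy side of the skull, reflected this way, supplies the geometric base from which the implant surface is then fitted.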
PredGuid+A: Orion Entry Guidance Modified for Aerocapture
NASA Technical Reports Server (NTRS)
Lafleur, Jarret
2013-01-01
PredGuid+A software was developed to enable a unique numerical predictor-corrector aerocapture guidance capability that builds on heritage Orion entry guidance algorithms. The software can be used for both planetary entry and aerocapture applications. Furthermore, PredGuid+A implements a new Delta-V minimization guidance option that can take the place of traditional targeting guidance and can result in substantial propellant savings. PredGuid+A allows the user to set a mode flag and input a target orbit's apoapsis and periapsis. Using bank angle control, the guidance will then guide the vehicle to the appropriate post-aerocapture orbit using one of two algorithms: Apoapsis Targeting or Delta-V Minimization (as chosen by the user). Recently, the PredGuid guidance algorithm was adapted for use in skip-entry scenarios for NASA's Orion multi-purpose crew vehicle (MPCV). To leverage flight heritage, most of Orion's entry guidance routines are adapted from the Apollo program.
Conceptual Design of Environmentally Friendly Rotorcraft - A Comparison of NASA and ONERA Approaches
NASA Technical Reports Server (NTRS)
Russell, Carl; Basset, Pierre-Marie
2015-01-01
In 2011, a task was initiated under the US-French Project Agreement on rotorcraft studies to collaborate on design methodologies for environmentally friendly rotorcraft. This paper summarizes the efforts of that collaboration. The French and US aerospace agencies, ONERA and NASA, have their own software toolsets and approaches to rotorcraft design. The first step of this research effort was to understand how rotorcraft impact the environment, with the initial focus on air pollution. Second, similar baseline helicopters were developed for a passenger transport mission, using NASA and ONERA rotorcraft design software tools. Comparisons were made between the designs generated by the two tools. Finally, rotorcraft designs were generated targeting reduced environmental impact. The results show that a rotorcraft design that targets reduced environmental impact can be significantly different than one that targets traditional cost drivers, such as fuel burn and empty weight.
MER Surface Phase: Blurring the Line Between Fault Protection and What is Supposed to Happen
NASA Technical Reports Server (NTRS)
Reeves, Glenn E.
2008-01-01
An assessment of the limitations of communication with the MER rovers and how such constraints drove the system design, flight software, and fault protection architecture, blurring the line between traditional fault protection and expected nominal behavior, and requiring the most novel autonomous and semi-autonomous elements of the vehicle software, including communication, surface mobility, attitude knowledge acquisition, fault protection, and the activity arbitration service.
Models for Threat Assessment in Networks
2006-09-01
…a cost-benefits analysis is often performed to determine the list of mitigation procedures. Traditionally, risk assessment has been done in part with software…
Pastor, Dena A; Lazowski, Rory A
2018-01-01
The term "multilevel meta-analysis" is encountered not only in applied research studies, but in multilevel resources comparing traditional meta-analysis to multilevel meta-analysis. In this tutorial, we argue that the term "multilevel meta-analysis" is redundant since all meta-analysis can be formulated as a special kind of multilevel model. To clarify the multilevel nature of meta-analysis the four standard meta-analytic models are presented using multilevel equations and fit to an example data set using four software programs: two specific to meta-analysis (metafor in R and SPSS macros) and two specific to multilevel modeling (PROC MIXED in SAS and HLM). The same parameter estimates are obtained across programs underscoring that all meta-analyses are multilevel in nature. Despite the equivalent results, not all software programs are alike and differences are noted in the output provided and estimators available. This tutorial also recasts distinctions made in the literature between traditional and multilevel meta-analysis as differences between meta-analytic choices, not between meta-analytic models, and provides guidance to inform choices in estimators, significance tests, moderator analyses, and modeling sequence. The extent to which the software programs allow flexibility with respect to these decisions is noted, with metafor emerging as the most favorable program reviewed.
NASA Astrophysics Data System (ADS)
Tirupathi, S.; McKenna, S. A.; Fleming, K.; Wambua, M.; Waweru, P.; Ondula, E.
2016-12-01
Groundwater management has traditionally been viewed as a matter of long-term policy to ensure that the water resource remains sustainable. IBM Research, in association with the World Bank, extended this traditional analysis to include real-time groundwater management by building a context-aware water-rights management and permitting system. As part of this effort, one of the primary objectives was to develop a groundwater flow model that can give policy makers a visual overview of the current groundwater distribution. In addition, the system helps policy makers simulate a range of scenarios and check the sustainability of the groundwater resource in a given region. The system also enables a license provider to check the effect of introducing a new well on the existing wells in the domain, as well as on the groundwater resource in general. This process simplifies how an engineer determines whether a new well should be approved. Distances to the nearest neighboring wells and the maximum decreases in their water levels are continually assessed and presented as evidence for the engineer to make the final judgment on approving the permit. The system also provides updated insights into the amount of groundwater left in an area and advice on how water fees should be structured to balance conservation and economic development goals. In this talk, we will discuss the concept of the Digital Aquifer and the challenges in integrating modeling, technical, and software aspects to develop a management system that gives policy makers and license providers a robust decision-making tool. We will concentrate on the groundwater model, developed using the analytic element method, which plays a very important role in the decision-making aspects. Finally, the efficiency of this system and methodology is shown through a case study in Laguna Province, Philippines, conducted in collaboration with the National Water Resource Board of the Philippines and the World Bank.
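The well-permitting check can be illustrated with a far simpler stand-in than the analytic element model: the steady-state Thiem solution for the drawdown a proposed well would impose at each existing well. All names and parameter values below are made up for illustration.

```python
import numpy as np

def thiem_drawdown(q, transmissivity, r, r_influence=1000.0):
    """Drawdown (m) at distance r (m) from a well pumping q (m^3/day),
    given aquifer transmissivity (m^2/day) and a radius of influence."""
    return q / (2.0 * np.pi * transmissivity) * np.log(r_influence / r)

new_well = np.array([0.0, 0.0])
existing = np.array([[120.0, 40.0], [300.0, -250.0]])   # coordinates (m)
dist = np.linalg.norm(existing - new_well, axis=1)
print(thiem_drawdown(q=500.0, transmissivity=200.0, r=dist))
```

If any predicted decrease exceeds a policy threshold, the application would be flagged for engineer review, mirroring the evidence-based approval flow described above.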
Data collection and analysis software development for rotor dynamics testing in spin laboratory
NASA Astrophysics Data System (ADS)
Abdul-Aziz, Ali; Arble, Daniel; Woike, Mark
2017-04-01
Gas turbine engine components undergo high rotational loading and other complex environmental conditions. Such operating environments can lead these components to experience damage and cracks that can cause catastrophic failure during flight. The traditional crack detection and health monitoring methodologies currently in use rely on periodic routine maintenance and nondestructive inspections that oftentimes involve disassembling the engine and its components. These methods also do not offer adequate information about faults, especially if those faults are subsurface or not clearly evident. At the NASA Glenn Research Center, the rotor dynamics laboratory is presently developing newer techniques that rely heavily on sensor technology to enable health monitoring and prediction of damage and cracks in rotor disks. These approaches are noninvasive and relatively economical. Spin tests are performed using a subscale test article mimicking a turbine rotor disk undergoing rotational load. Non-contact instruments such as capacitive and microwave sensors are used to measure the blade tip gap displacement and blade vibration characteristics in an attempt to develop a physics-based model for assessing and predicting faults in the rotor disk. Data collection is a major component of this experimental-analytical procedure; accordingly, an upgrade to an older version of the LabVIEW-based data acquisition software has been implemented to support running tests efficiently and analyzing the results. Outcomes obtained from the test data, the related experimental and analytical rotor dynamics modeling, and key features of the updated software are presented and discussed.
Problems Discovery of Final Graduation Projects During the Software Development Processes
NASA Astrophysics Data System (ADS)
Al-Hagery, Mohammed Abdullah Hassan
2012-01-01
This study was prompted by the traditional techniques and methods used by students during systems development at the College of Computer, Qassim University. It identifies problems that hinder the construction and development of information systems, along with their causes. The most important stages of systems development are analysis and design, which represent the solid foundation needed to build strong systems that are free from errors. The motivation for this research is the existence of many problems that prevent university graduates in computer departments from obtaining correct outputs from the systems they develop. The research concentrates on discovering these problems during the development tasks. The required data were collected using a questionnaire, which was formulated, reviewed by judges, and distributed to the target population. The results were analyzed using three statistical methods.
Programmable bandwidth management in software-defined EPON architecture
NASA Astrophysics Data System (ADS)
Li, Chengjun; Guo, Wei; Wang, Wei; Hu, Weisheng; Xia, Ming
2016-07-01
This paper proposes a software-defined EPON architecture which replaces the hardware-implemented DBA module with a reprogrammable DBA module. The DBA module allows pluggable bandwidth allocation algorithms among multiple ONUs, adaptive to traffic profiles and network states. We also introduce a bandwidth management scheme, executed at the controller, that manages the customized DBA algorithms for all data queues of the ONUs. Our performance investigation verifies the effectiveness of this new EPON architecture, and numerical results show that software-defined EPONs can achieve lower traffic delay and better support for service differentiation than traditional EPONs.
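A minimal sketch of the "pluggable DBA" idea in Python; the controller class and the simple proportional policy are illustrative assumptions, not the paper's implementation.

```python
from typing import Callable, Dict

DbaAlgorithm = Callable[[Dict[str, int], int], Dict[str, int]]

def proportional_dba(requests: Dict[str, int], capacity: int) -> Dict[str, int]:
    """Grant bandwidth to each ONU in proportion to its request."""
    total = sum(requests.values()) or 1
    return {onu: min(req, capacity * req // total)
            for onu, req in requests.items()}

class OltController:
    def __init__(self, dba: DbaAlgorithm):
        self.dba = dba

    def reprogram(self, dba: DbaAlgorithm) -> None:
        self.dba = dba          # swap the allocation policy at runtime

    def grant(self, requests: Dict[str, int], capacity: int) -> Dict[str, int]:
        return self.dba(requests, capacity)
```

The point of the architecture is the `reprogram` step: the controller can install a different allocation policy as traffic profiles change, without touching OLT hardware.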
Assessment of MSFCs Process for the Development and Activation of Space Act Agreements
NASA Technical Reports Server (NTRS)
Daugherty, Rachel A.
2014-01-01
A Space Act Agreement (SAA) is a contractual vehicle that NASA utilizes to form partnerships with non-NASA entities to stimulate cutting-edge innovation within the science and technology communities while concurrently supporting the NASA missions. SAAs are similar to traditional contracts in that they involve the commitment of Agency resources but allow more flexibility and are more cost effective to implement than traditional contracts. Consequently, the use of SAAs to develop partnerships has greatly increased over the past several years. To facilitate this influx of SAAs, Marshall Space Flight Center (MSFC) developed a process during a kaizen event to streamline and improve the quality of SAAs developed at the Center level. This study assessed the current SAA process to determine if improvements could be implemented to increase productivity, decrease time to activation, and improve the quality of deliverables. Using a combination of direct procedural observation, personnel interviews, and statistical analysis, elements of the process in need of remediation were identified and potential solutions developed. The findings focus primarily on the difficulties surrounding tracking and enforcing process adherence and communication issues among stakeholders. Potential solutions include utilizing customer relationship management (CRM) software to facilitate process coordination and co-locating or potentially merging the two separate organizations involved in SAA development and activation at MSFC.
[Design and development of an online system of parasite's images for training and evaluation].
Yuan-Chun, Mao; Sui, Xu; Jie, Wang; Hua-Yun, Zhou; Jun, Cao
2017-08-08
To design and develop an online training and evaluation system for parasitic pathogen recognition. The system was built on a digitized parasitic disease specimen image database, using MySQL 5.0 as the database development software and PHP 5 as the interface development language. It is mainly used for online training and evaluation of parasitic pathology diagnostic techniques. The system interface is simple, flexible, and easy for medical staff to operate, and it provides access to online training and evaluation 24 hours a day, thus breaking the time and space constraints of traditional training models. The system provides a shared platform for professional training on parasitic diseases and a reference for other training tasks.
3DVEM Software Modules for Efficient Management of Point Clouds and Photorealistic 3d Models
NASA Astrophysics Data System (ADS)
Fabado, S.; Seguí, A. E.; Cabrelles, M.; Navarro, S.; García-De-San-Miguel, D.; Lerma, J. L.
2013-07-01
Cultural heritage managers in general, and information users in particular, are not usually used to dealing with high-technology hardware and software. On the contrary, information providers of metric surveys are most of the time applying the latest developments in real-life conservation and restoration projects. This paper addresses the software issue of handling and managing 3D point clouds and (photorealistic) 3D models, to bridge the gap between information users and information providers as regards the management of information which users and providers share as a tool for decision-making, analysis, visualization and management. There are not many viewers specifically designed to handle, manage and easily create animations of architectural and/or archaeological 3D objects, monuments and sites, among others. 3DVEM - 3D Viewer, Editor & Meter software will be introduced to the scientific community, as well as 3DVEM - Live and 3DVEM - Register. The advantages of managing projects with both sets of data, 3D point clouds and photorealistic 3D models, will be introduced. Different visualizations of real documentation projects in the fields of architecture, archaeology and industry will be presented. Emphasis will be placed on highlighting the features of new user-friendly software to manage virtual projects. Furthermore, the ease with which the user can create controlled interactive animations (both walk-through and fly-through), either on-the-fly or as a traditional movie file, will be demonstrated through 3DVEM - Live.
NASA Astrophysics Data System (ADS)
Price, Katie; Ballow, William
2015-04-01
Traditional high-precision survey methods for stream channel measurement are labor-intensive and require wadeability or boat access to streams. These conditions limit the number of sites researchers are able to study and generally prohibit repeat channel surveys to evaluate short-term fluctuations in channel morphology. In recent years, unmanned aerial vehicles (drones) equipped with photo and video capabilities have become widely available and affordable. Concurrently, developments in photogrammetric software offer unprecedented mapping and 3D rendering capabilities for drone-captured photography. In this study, we evaluate the potential use of drone-mounted cameras for detailed stream channel morphometric analysis. We used a relatively low-cost drone (DJI Phantom 2 Vision+) and commercially available, user-friendly software (Agisoft PhotoScan) for photogrammetric analysis of drone-captured stream channel photography. Our test study was conducted on Proctor Creek, a highly responsive urban stream in Atlanta, Georgia, within the crystalline Piedmont region of the southeastern United States. As a baseline, we performed traditional high-precision surveys to collect morphological measurements (e.g., bankfull and wetted width, bankfull and wetted thalweg depth) at 11 evenly spaced transects, following USGS protocols, along reaches of 20 times the average channel width. We additionally used the drone to capture 200+ photos along the same reaches, concurrent with the channel survey. Using the photogrammetry software, we generated georeferenced 3D models of the stream channel, from which morphological measurements were derived at the 11 transects and compared with measurements from the traditional survey method. We additionally explored possibilities for novel morphometric characterization available from the continuous 3D surface, as an improvement on the limited number of detailed cross-sections available from standard methods. These results showed great promise for the drone photogrammetry methods, which encouraged exploring repeat aerial surveys to evaluate channel response to high-flow events. Repeat drone surveys were performed following a sequence of high-flow events in Proctor Creek to evaluate the possibility of using these methods to assess stream channel response to flooding.
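As a hedged illustration of deriving transect measurements from the photogrammetric surface, one can sample a gridded elevation model exported from the 3D reconstruction along a survey transect; the function names and water-level logic are assumptions, not the study's exact workflow.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def sample_transect(dem, p0, p1, n=200):
    """Bilinearly sample elevations along the line p0 -> p1 (row, col)."""
    rows = np.linspace(p0[0], p1[0], n)
    cols = np.linspace(p0[1], p1[1], n)
    return map_coordinates(dem, [rows, cols], order=1)

# Illustrative wetted-width estimate from a known water-surface elevation:
# profile = sample_transect(dem, (10, 5), (10, 400))
# width_m = (profile < water_level).sum() * cell_size
```

Because the reconstructed surface is continuous, any number of such transects (or full longitudinal profiles) can be extracted, which is the advantage over a fixed set of eleven surveyed cross-sections.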
Cundy, K V; Willard, K E; Valeri, L J; Shanholtzer, C J; Singh, J; Peterson, L R
1991-01-01
Three gas chromatography (GC) methods were compared for the identification of 52 clinical Clostridium difficile isolates, as well as 17 non-C. difficile Clostridium isolates. Headspace GC and Microbial Identification System (MIS) GC, an automated system which utilizes a software library developed at the Virginia Polytechnic Institute to identify organisms based on the fatty acids extracted from the bacterial cell wall, were compared against the reference method of traditional GC. Headspace GC and MIS were of approximately equivalent accuracy in identifying the 52 C. difficile isolates (52 of 52 versus 51 of 52, respectively). However, 7 of 52 organisms required repeated sample preparation before an identification was achieved by the MIS method. Both systems effectively differentiated C. difficile from non-C. difficile clostridia, although the MIS method correctly identified only 9 of 17. We conclude that the headspace GC system is an accurate method of C. difficile identification, which requires only one-fifth of the sample preparation time of MIS GC and one-half of the sample preparation time of traditional GC. PMID:2007632
NASA Technical Reports Server (NTRS)
Reinhart, Richard C.; Kacpura, Thomas J.
2004-01-01
The NASA Glenn Research Center is investigating the development and suitability of a software-based open-architecture for space-based reconfigurable transceivers (RTs) and software-defined radios (SDRs). The main objectives of this project are to enable advanced operations and reduce mission costs. SDRs are becoming more common because of the capabilities of reconfigurable digital signal processing technologies such as field programmable gate arrays and digital signal processors, which place radio functions in firmware and software that were traditionally performed with analog hardware components. Features of interest of this communications architecture include nonproprietary open standards and application programming interfaces to enable software reuse and portability, independent hardware and software development, and hardware and software functional separation. The goals for RT and SDR technologies for NASA space missions include prelaunch and on-orbit frequency and waveform reconfigurability and programmability, high data rate capability, and overall communications and processing flexibility. These operational advances over current state-of-art transceivers will be provided to reduce the power, mass, and cost of RTs and SDRs for space communications. The open architecture for NASA communications will support existing (legacy) communications needs and capabilities while providing a path to more capable, advanced waveform development and mission concepts (e.g., ad hoc constellations with self-healing networks and high-rate science data return). A study was completed to assess the state of the art in RT architectures, implementations, and technologies. In-house researchers conducted literature searches and analysis, interviewed Government and industry contacts, and solicited information and white papers from industry on space-qualifiable RTs and SDRs and their associated technologies for space-based NASA applications. The white papers were evaluated, compiled, and used to assess RT and SDR system architectures and core technology elements to determine an appropriate investment strategy to advance these technologies to meet future mission needs. The use of these radios in the space environment represents a challenge because of the space radiation suitability of the components, which drastically reduces the processing capability. The radios available for space are considered to be RTs (as opposed to SDRs), which are digitally programmable radios with selectable changes from an architecture combining analog and digital components. The limited flexibility of this design contrasts against the desire to have a power-efficient solution and open architecture.
NASA Astrophysics Data System (ADS)
Mazzaracchio, Antonio; Marchetti, Mario
2010-03-01
Implicit ablation and thermal response software was developed to analyse and size charring ablative thermal protection systems for entry vehicles. A statistical monitor integrated into the tool, which uses the Monte Carlo technique, allows a simulation to run over stochastic series. This performs an uncertainty and sensitivity analysis, which estimates the probability of maintaining the temperature of the underlying material within specified requirements. This approach and the associated software are primarily helpful during the preliminary design phases of spacecraft thermal protection systems. They are proposed as an alternative to traditional approaches, such as the Root-Sum-Square method. The developed tool was verified by comparing the results with those from previous work on thermal protection system probabilistic sizing methodologies, which are based on an industry standard high-fidelity ablation and thermal response program. New case studies were analysed to establish thickness margins on sizing heat shields that are currently proposed for vehicles using rigid aeroshells for future aerocapture missions at Neptune, and identifying the major sources of uncertainty in the material response.
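A toy numpy sketch of the Monte Carlo sizing logic described above, with a made-up algebraic surrogate standing in for the ablation and thermal response solver: sample the uncertain inputs, evaluate the model, and estimate the probability that the bondline temperature requirement is met.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
thickness = 0.05                              # m, candidate heat-shield sizing
heat_flux = rng.normal(1.0, 0.10, n)          # normalized uncertain inputs
conductivity = rng.normal(1.0, 0.05, n)

def bondline_temp(th, qh, k):                 # toy surrogate, not a real model
    return 300.0 + 900.0 * qh * k / (100.0 * th)

temps = bondline_temp(thickness, heat_flux, conductivity)
print(f"P(bondline below limit) = {np.mean(temps < 500.0):.3f}")  # 500 K limit
```

The thickness is then adjusted until this probability meets the specified requirement, in contrast to the Root-Sum-Square practice of stacking fixed margins.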
Graphics processing units in bioinformatics, computational biology and systems biology.
Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela
2017-09-01
Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining an increasing attention by the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.
Change management methodologies trained for automotive infotainment projects
NASA Astrophysics Data System (ADS)
Prostean, G.; Volker, S.; Hutanu, A.
2017-01-01
An automotive Electronic Control Unit (ECU) development project embedded within a car environment is constantly under attack from a continuous flow of specification modifications throughout the life cycle. Root causes for those modifications include simple software or hardware implementation errors, as well as requirement changes made to satisfy forthcoming market demands and ensure later commercial success. It is unavoidable that, from the very beginning until the end of the project, "requirement changes" will "expose" the agreed objectives defined by contract specifications: product features, budget, schedule and quality. The key discussion focuses on an automotive radio-navigation (infotainment) unit, which is challenged by aftermarket devices such as smartphones. This competition especially stresses currently used automotive development processes, which fit a four-year car development (introduction) cycle, against the one-year update cycle of a smartphone. The research investigates the possible impacts of changes during all phases of the project: the concept-validation, development, and debugging phases. Building a thorough understanding of prospective threats is of paramount importance in order to establish an adequate project management process for handling requirement changes. Personal automotive development experience and a literature review of change- and configuration-management software development methodologies led the authors to new conceptual models, which integrate into the structure of traditional development models used in automotive projects, more concretely in radio-navigation projects.
Lo, Ming; Hue, Chih-Wei
2008-11-01
The Character-Component Analysis Toolkit (C-CAT) software was designed to assist researchers in constructing experimental materials using traditional Chinese characters. The software package contains two sets of character stocks: one suitable for research using literate adults as subjects and one suitable for research using schoolchildren as subjects. The software can identify linguistic properties, such as the number of strokes contained, the character-component pronunciation regularity, and the arrangement of character components within a character. Moreover, it can compute a character's linguistic frequency, neighborhood size, and phonetic validity with respect to a user-selected character stock. It can also search the selected character stock for similar characters or for character components with user-specified linguistic properties.
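A hedged sketch of two of the lookups described above, linguistic frequency and neighborhood size, here defining neighbors as characters sharing a phonetic component; the four-character stock is a toy stand-in for C-CAT's built-in character stocks.

```python
# Toy character stock: frequency counts and phonetic components.
stock_freq = {"媽": 120, "嗎": 300, "馬": 450, "螞": 15}
phonetic_of = {"媽": "馬", "嗎": "馬", "螞": "馬", "馬": "馬"}

def neighborhood_size(char):
    """Count other stock characters sharing this character's phonetic."""
    comp = phonetic_of[char]
    return sum(1 for c in stock_freq
               if phonetic_of.get(c) == comp and c != char)

print(stock_freq["媽"], neighborhood_size("媽"))   # -> 120 3
```

Computing such properties "with respect to a user-selected character stock", as the abstract puts it, amounts to swapping in the adult or schoolchild stock for the dictionaries used here.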
A framework for building real-time expert systems
NASA Technical Reports Server (NTRS)
Lee, S. Daniel
1991-01-01
The Space Station Freedom is an example of complex systems that require both traditional and artificial intelligence (AI) real-time methodologies. It was mandated that Ada should be used for all new software development projects. The station also requires distributed processing. Catastrophic failures on the station can cause the transmission system to malfunction for a long period of time, during which ground-based expert systems cannot provide any assistance to the crisis situation on the station. This is even more critical for other NASA projects that would have longer transmission delays (e.g., the lunar base, Mars missions, etc.). To address these issues, a distributed agent architecture (DAA) is proposed that can support a variety of paradigms based on both traditional real-time computing and AI. The proposed testbed for DAA is an autonomous power expert (APEX) which is a real-time monitoring and diagnosis expert system for the electrical power distribution system of the space station.
DoE Phase II SBIR: Spectrally-Assisted Vehicle Tracking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Villeneuve, Pierre V.
2013-02-28
The goal of this Phase II SBIR is to develop a prototype software package to demonstrate spectrally-aided vehicle tracking performance. The primary application is to demonstrate improved target vehicle tracking performance in complex environments where traditional spatial tracker systems may show reduced performance. Example scenarios in Figure 1 include a) the target vehicle being obscured by a large structure for an extended period of time, or b) the target engaging in extreme maneuvers amongst other civilian vehicles. Target information derived from spatial processing alone is unable to differentiate between the green and the red vehicle. Spectral signature exploitation enables comparison of new candidate targets with existing track signatures. The ambiguity in this confusing scenario is resolved by folding spectral analysis results into the target nomination and association processes. Figure 3 shows a number of example spectral signatures from a variety of natural and man-made materials. The work performed over the two-year effort was divided into three general areas: algorithm refinement, software prototype development, and prototype performance demonstration. The tasks performed under this Phase II to accomplish the program goals were as follows: 1. Acquire relevant vehicle target datasets to support the prototype. 2. Refine algorithms for target spectral feature exploitation. 3. Implement a prototype multi-hypothesis target tracking software package. 4. Demonstrate and quantify tracking performance using relevant data.
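One standard way to fold spectral evidence into track association, sketched below, is the spectral angle between a candidate detection's spectrum and the track's stored signature; this illustrates the general idea rather than the prototype's actual algorithms.

```python
import numpy as np

def spectral_angle(s1, s2):
    """Angle (radians) between two spectra; smaller means more similar."""
    cos = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def spectrally_consistent(track_sig, candidate, max_angle=0.1):
    """Gate a candidate-to-track association on spectral similarity."""
    return spectral_angle(track_sig, candidate) < max_angle
```

In the green-versus-red ambiguity above, spatial kinematics alone cannot pick a vehicle, but a gate like this rejects candidates whose spectra do not match the track history.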
Janus: Graphical Software for Analyzing In-Situ Measurements of Solar-Wind Ions
NASA Astrophysics Data System (ADS)
Maruca, B.; Stevens, M. L.; Kasper, J. C.; Korreck, K. E.
2016-12-01
In-situ observations of solar-wind ions provide tremendous insights into the physics of space plasmas. Instruments on spacecraft measure distributions of ion energies, which can be processed into scientifically useful data (e.g., values for ion densities and temperatures). This analysis requires a strong, technical understanding of the instrument, so it has traditionally been carried out by the instrument teams using automated software that they developed for that purpose. The automated routines are optimized for typical solar-wind conditions, so they can fail to capture the complex (and scientifically interesting) microphysics of transient solar-wind structures - such as coronal mass ejections (CMEs) and co-rotating interaction regions (CIRs) - which are often better analyzed manually. This presentation reports on the ongoing development of Janus, a new software package for processing in-situ measurements of solar-wind ions. Janus will provide users with an easy-to-use graphical user interface (GUI) for carrying out highly customized analyses. Transparent to the user, Janus will automatically handle the most technical tasks (e.g., the retrieval and calibration of measurements). For the first time, users with only limited knowledge of the instruments (e.g., non-instrumentalists and students) will be able to easily process measurements of solar-wind ions. Version 1 of Janus focuses specifically on such measurements from the Wind spacecraft's Faraday Cups and is slated for public release in time for this presentation.
NASA Astrophysics Data System (ADS)
Hua, H.; Manipon, G.; Starch, M.
2017-12-01
NASA's upcoming missions are expected to generate data volumes at least an order of magnitude larger than those of current missions. Significant increases in data processing, data rates, data volumes, and long-term data archive capabilities are needed. Consequently, new challenges are emerging that impact traditional data and software management approaches. At large scales, next-generation science data systems are exploring the move onto cloud computing paradigms to support these increased needs. New implications, such as costs, data movement, collocation of data systems and archives, and moving processing closer to the data, may result in changes to the stewardship, preservation, and provenance of science data and software. With more science data systems being on-boarded onto cloud computing facilities, we can expect more Earth science data records to be both generated and kept in the cloud. But at large scales, the cost of processing and storing global data may impact architectural and system designs. Data systems will trade the cost of keeping data in the cloud against data life-cycle approaches that move "colder" data back to traditional on-premise facilities. How will this impact data citation and processing software stewardship? What are the impacts of cloud-based on-demand processing, and what is its effect on reproducibility and provenance? Similarly, with more science processing software being moved onto cloud, virtual machine, and container-based approaches, more opportunities arise for improved stewardship and preservation. But will the science community trust data reprocessed years or decades later? We will also explore emerging questions about the stewardship of the science data system software that generates the science data records, both during and after the life of a mission.
User Interface Developed for Controls/CFD Interdisciplinary Research
NASA Technical Reports Server (NTRS)
1996-01-01
The NASA Lewis Research Center, in conjunction with the University of Akron, is developing analytical methods and software tools to create a cross-discipline "bridge" between controls and computational fluid dynamics (CFD) technologies. Traditionally, the controls analyst has used simulations based on large lumping techniques to generate low-order linear models convenient for designing propulsion system controls. For complex, high-speed vehicles such as the High Speed Civil Transport (HSCT), simulations based on CFD methods are required to capture the relevant flow physics. The use of CFD should also help reduce the development time and costs associated with experimentally tuning the control system. The initial application for this research is the High Speed Civil Transport inlet control problem. A major aspect of this research is the development of a controls/CFD interface for non-CFD experts, to facilitate the interactive operation of CFD simulations and the extraction of reduced-order, time-accurate models from CFD results. A distributed computing approach for implementing the interface is being explored. Software being developed as part of the Integrated CFD and Experiments (ICE) project provides the basis for the operating environment, including run-time displays and information (data base) management. Message-passing software is used to communicate between the ICE system and the CFD simulation, which can reside on distributed, parallel computing systems. Initially, the one-dimensional Large-Perturbation Inlet (LAPIN) code is being used to simulate a High Speed Civil Transport type inlet. LAPIN can model real supersonic inlet features, including bleeds, bypasses, and variable geometry, such as translating or variable-ramp-angle centerbodies. Work is in progress to use parallel versions of the multidimensional NPARC code.
Exploring the Cost and Functionality of MEDCOM Web Services
2005-10-24
Survey excerpt: "What backend database software supports your intranet/Internet content? (check all that apply): Oracle; Microsoft SQL Server; ..." The Department of Defense (DoD) service branches funded and deployed an Internet portal, TRICARE Online, to serve as an information conduit between the ... public website; the information contained on the intranet is traditionally limited to the members of the hosting command. The local information serves as
NASA Astrophysics Data System (ADS)
Wang, H. T.; Chen, T. T.; Yan, C.; Pan, H.
2018-05-01
For mobile app recommendation, an item-based collaborative filtering approach is combined with a weighted Slope One algorithm, improving on the traditional collaborative filtering algorithm's cold-start and data-matrix-sparsity problems. The recommendation algorithm is parallelized on the Spark platform, and a real-time stream-computing framework is introduced to improve the real-time performance of app recommendation.
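For reference, the weighted Slope One scheme named above has a compact serial form. The sketch below is a minimal single-machine illustration with a hypothetical data layout; the paper's Spark parallelization and streaming layer are not reproduced.

```python
from collections import defaultdict

def train_weighted_slope_one(ratings):
    """ratings: {user: {item: rating}}.
    Returns (dev, freq): dev[i][j] is the average of r_ui - r_uj over
    users who rated both items; freq[i][j] is that user count."""
    freq = defaultdict(lambda: defaultdict(int))
    diff = defaultdict(lambda: defaultdict(float))
    for user_ratings in ratings.values():
        for i, ri in user_ratings.items():
            for j, rj in user_ratings.items():
                if i != j:
                    freq[i][j] += 1
                    diff[i][j] += ri - rj
    dev = {i: {j: diff[i][j] / freq[i][j] for j in diff[i]} for i in diff}
    return dev, freq

def predict(user_ratings, item, dev, freq):
    """Weighted Slope One prediction of `item` for one user."""
    num = den = 0.0
    for j, rj in user_ratings.items():
        if j != item and item in dev and j in dev[item]:
            w = freq[item][j]
            num += (dev[item][j] + rj) * w
            den += w
    return num / den if den else None

ratings = {"u1": {"a": 5, "b": 3, "c": 2},
           "u2": {"a": 3, "b": 4},
           "u3": {"b": 2, "c": 5}}
dev, freq = train_weighted_slope_one(ratings)
print(predict(ratings["u2"], "c", dev, freq))  # predicted rating of "c"
```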
Sheikhaboumasoudi, Rouhollah; Bagheri, Maryam; Hosseini, Sayed Abbas; Ashouri, Elaheh; Elahi, Nasrin
2018-01-01
Background: The fundamentals of nursing course is a prerequisite to providing comprehensive nursing care. Despite the development of technology in nursing education, the effectiveness of e-learning methods in the fundamentals of nursing course remains unclear in the clinical skills laboratory for nursing students. The aim of this study was to compare the effect of blended learning (combining e-learning with traditional learning methods) with traditional learning alone on nursing students' scores. Materials and Methods: A two-group post-test experimental study was administered from February 2014 to February 2015. Two groups of nursing students who were taking the fundamentals of nursing course in Iran were compared. Sixty nursing students were assigned to a control group (traditional learning methods only) and an experimental group (combining e-learning with traditional learning methods) over two consecutive semesters. Both groups participated in an Objective Structured Clinical Examination (OSCE) and were evaluated in the same way using a prepared checklist and a satisfaction questionnaire. Statistical analysis was conducted with SPSS software version 16. Results: The findings of this study reflected that the mean midterm (t = 2.00, p = 0.04) and final scores (t = 2.50, p = 0.01) of the intervention group (combining e-learning with traditional learning methods) were significantly higher than those of the control group (traditional learning methods). The satisfaction of male students in the intervention group was higher than that of females (t = 2.60, p = 0.01). Conclusions: Based on the findings, this study suggests that combining traditional learning methods with e-learning methods, such as an educational website and interactive online resources, for fundamentals of nursing course instruction can be an effective supplement for improving nursing students' clinical skills. PMID:29861761
Space Flight Software Development Software for Intelligent System Health Management
NASA Technical Reports Server (NTRS)
Trevino, Luis C.; Crumbley, Tim
2004-01-01
The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.
Information System Engineering Supporting Observation, Orientation, Decision, and Compliant Action
NASA Astrophysics Data System (ADS)
Georgakopoulos, Dimitrios
The majority of today's software systems and organizational/business structures have been built on the foundation of solving problems via long-term data collection, analysis, and solution design. This traditional approach of solving problems and building the corresponding software systems and business processes falls short in providing the solutions needed for problems that require agility as the main ingredient of their solution. For example, such agility is needed in responding to an emergency, in military command and control, physical security, price-based competition in business, investing in the stock market, video gaming, network monitoring and self-healing, diagnosis in emergency health care, and many other areas too numerous to list here. The concept of Observe, Orient, Decide, and Act (OODA) loops is a guiding principle that captures the fundamental issues and approach for engineering information systems that deal with many of these problem areas. However, there are currently few software systems capable of supporting OODA. In this talk, we provide a tour of the research issues and state-of-the-art solutions for supporting OODA. In addition, we provide specific examples of OODA solutions we have developed for the video surveillance and emergency response domains.
Scalable Performance Environments for Parallel Systems
NASA Technical Reports Server (NTRS)
Reed, Daniel A.; Olson, Robert D.; Aydt, Ruth A.; Madhyastha, Tara M.; Birkett, Thomas; Jensen, David W.; Nazief, Bobby A. A.; Totty, Brian K.
1991-01-01
As parallel systems expand in size and complexity, the absence of performance tools for these parallel systems exacerbates the already difficult problems of application program and system software performance tuning. Moreover, given the pace of technological change, we can no longer afford to develop ad hoc, one-of-a-kind performance instrumentation software; we need scalable, portable performance analysis tools. We describe an environment prototype based on the lessons learned from two previous generations of performance data analysis software. Our environment prototype contains a set of performance data transformation modules that can be interconnected in user-specified ways. It is the responsibility of the environment infrastructure to hide details of module interconnection and data sharing. The environment is written in C++ with the graphical displays based on X windows and the Motif toolkit. It allows users to interconnect and configure modules graphically to form an acyclic, directed data analysis graph. Performance trace data are represented in a self-documenting stream format that includes internal definitions of data types, sizes, and names. The environment prototype supports the use of head-mounted displays and sonic data presentation in addition to the traditional use of visual techniques.
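As a rough illustration of the user-composed analysis graph described above, the sketch below chains generator-based transformation modules over self-describing trace records. Module names and record fields are hypothetical; the environment's actual C++/Motif implementation is not modeled.

```python
def source(trace):
    """Entry module: emit trace records one at a time."""
    yield from trace

def select(records, kind):
    """Filter module: pass through only records of one event type."""
    return (r for r in records if r["type"] == kind)

def mean(records, field):
    """Terminal module: average a numeric field over the stream."""
    vals = [r[field] for r in records]
    return sum(vals) / len(vals) if vals else 0.0

# Records name their own fields, in the spirit of the self-documenting
# stream format described above.
trace = [
    {"type": "io",  "duration_ms": 12.0},
    {"type": "cpu", "duration_ms": 3.0},
    {"type": "io",  "duration_ms": 20.0},
]
print(mean(select(source(trace), "io"), "duration_ms"))  # -> 16.0
```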
Glassman, Nancy R.; Habousha, Racheline G.; Minuti, Aurelia; Schwartz, Rachel; Sorensen, Karen
2009-01-01
Due to the proliferation of electronic resources, fewer users visit the library. Traditional classroom instruction and in-person consultations are no longer sufficient in assisting library users. Librarians are constantly seeking new ways to interact with patrons and facilitate efficient use of electronic resources. This article describes the development, implementation, and evaluation of a project in which desktop-sharing software was used to reach out to users at remote locations. Various ways of using this tool are described, and challenges and implications for future expansion are discussed. PMID:20183031
Operations management system advanced automation: Fault detection isolation and recovery prototyping
NASA Technical Reports Server (NTRS)
Hanson, Matt
1990-01-01
The purpose of this project is to address the global fault detection, isolation, and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies in addition to traditional software methods to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM) as discussed in OMS requirements documentation.
Virtual Planetary Analysis Environment for Remote Science
NASA Technical Reports Server (NTRS)
Keely, Leslie; Beyer, Ross; Edwards, Laurence; Lees, David
2009-01-01
All of the data for NASA's current planetary missions and most data for field experiments are collected via orbiting spacecraft, aircraft, and robotic explorers. Mission scientists are unable to employ traditional field methods when operating remotely. We have developed a virtual exploration tool for remote sites with data analysis capabilities that extend human perception quantitatively and qualitatively. Scientists and mission engineers can use it to explore a realistic representation of a remote site. It also provides software tools to "touch" and "measure" remote sites with an immediacy that boosts scientific productivity and is essential for mission operations.
Providing a complete online multimedia patient record.
Dayhoff, R. E.; Kuzmak, P. M.; Kirin, G.; Frank, S.
1999-01-01
Seamless integration of all types of patient data is a critical feature for clinical workstation software. The Dept. of Veterans Affairs has developed a multimedia online patient record that includes traditional medical chart information as well as a wide variety of medical images from specialties such as cardiology, pulmonary and gastrointestinal medicine, pathology, radiology, hematology, and nuclear medicine. This online patient record can present data in ways not possible with a paper chart or other physical media. Obtaining a critical mass of information online is essential to achieve the maximum benefits from an integrated patient record system. PMID:10566357
Medical color displays and their calibration
NASA Astrophysics Data System (ADS)
Fan, Jiahua; Roehrig, Hans; Dallas, W.; Krupinski, Elizabeth
2009-08-01
Color displays are increasingly used for medical imaging, replacing the traditional monochrome displays in radiology for multi-modality applications, 3D representation applications, etc. Color displays are also used increasingly because of the widespread application of Tele-Medicine, Tele-Dermatology, and Digital Pathology. At this time, there is no concerted effort to establish calibration procedures for this diverse range of color displays in telemedicine and in other areas of the medical field. Using a colorimeter to measure the display luminance and chrominance properties, together with processing software, we developed a first attempt at a color calibration protocol for the medical imaging field.
An Integrated Analysis-Test Approach
NASA Technical Reports Server (NTRS)
Kaufman, Daniel
2003-01-01
This viewgraph presentation provides an overview of a project to develop a computer program that integrates data analysis and test procedures. The software application aims to offer a new perspective on traditional mechanical analysis and test procedures and to integrate pre-test and test analysis calculation methods. The program should also be usable on portable devices and allow for 'quasi-real time' analysis of data sent by electronic means. Test methods reviewed during this presentation include: shaker swept-sine and random tests, shaker shock mode tests, shaker base-driven modal survey tests, and acoustic tests.
Workflow based framework for life science informatics.
Tiwari, Abhishek; Sekhar, Arvind K T
2007-10-01
Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications, and different services), facilitating knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry, and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Applications of workflow technology have been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and systems biology. In this article, we discuss the existing workflow systems and the trends in applications of workflow-based systems.
Digitizing the Facebow: A Clinician/Technician Communication Tool.
Kalman, Les; Chrapka, Julia; Joseph, Yasmin
2016-01-01
Communication between the clinician and the technician has been an ongoing problem in dentistry. To address the issue, a dental software application has been developed--the Virtual Facebow App. It is an alternative to the traditional analog facebow used to orient the maxillary cast in mounting. Comparison data for the two methods indicated that the digitized virtual facebow provided increased efficiency in mounting, increased accuracy in occlusion, and lower cost. Differences in occlusal accuracy, lab time, and total time were statistically significant (P<.05). The virtual facebow provides a novel alternative for cast mounting and another tool for clinician-technician communication.
Macintosh/LabVIEW based control and data acquisition system for a single photon counting fluorometer
NASA Astrophysics Data System (ADS)
Stryjewski, Wieslaw J.
1991-08-01
A flexible software system has been developed for controlling fluorescence decay measurements using the virtual instrument approach offered by LabVIEW. The time-correlated single photon counting instrument operates under computer control in both manual and automatic mode. Implementation time was short and the equipment is now easier to use, reducing the training time required for new investigators. It is not difficult to customize the front panel or adapt the program to a different instrument. We found LabVIEW much more convenient to use for this application than traditional, textual computer languages.
Collaborative Planetary GIS with JMARS
NASA Astrophysics Data System (ADS)
Dickenshied, S.; Christensen, P. R.; Edwards, C. S.; Prashad, L. C.; Anwar, S.; Engle, E.; Noss, D.; Jmars Development Team
2010-12-01
Traditional GIS tools have allowed users to work locally with their own datasets in their own computing environment. More recently, data providers have started offering online repositories of preprocessed data which helps minimize the learning curve required to access new datasets. The ideal collaborative GIS tool provides the functionality of a traditional GIS and easy access to preprocessed data repositories while also enabling users to contribute data, analysis, and ideas back into the very tools they're using. JMARS (Java Mission-planning and Analysis for Remote Sensing) is a suite of geospatial applications developed by the Mars Space Flight Facility at Arizona State University. This software is used for mission planning and scientific data analysis by several NASA missions, including Mars Odyssey, Mars Reconnaissance Orbiter, and the Lunar Reconnaissance Orbiter. It is used by scientists, researchers and students of all ages from more than 40 countries around the world. In addition to offering a rich set of global and regional maps and publicly released orbiter images, the JMARS software development team has been working on ways to encourage the creation of collaborative datasets. Bringing together users from diverse teams and backgrounds allows new features to be developed with an interest in making the application useful and accessible to as wide a potential audience as possible. Actively engaging the scientific community in development strategy and hands on tasks allows the creation of user driven data content that would not otherwise be possible. The first community generated dataset to result from this effort is a tool mapping peer-reviewed papers to the locations they relate to on Mars with links to ancillary data. This allows users of JMARS to browse to an area of interest and then quickly locate papers corresponding to that area. Alternately, users can search for published papers over a specified time interval and visually see what areas of Mars have received the most attention over the requested time span.
Imaging Beyond What Man Can See
NASA Technical Reports Server (NTRS)
May, George; Mitchell, Brian
2004-01-01
Three lightweight, portable hyperspectral sensor systems have been built that capture energy from 200 to 1700 nanometers (ultraviolet to shortwave infrared). The sensors incorporate a line scanning technique that requires no relative movement between the target and the sensor. This unique capability, combined with portability, opens up new uses of hyperspectral imaging for laboratory and field environments. Each system has a GUI-based software package that allows the user to communicate with the imaging device to set spatial resolution, spectral bands, and other parameters. NASA's Space Partnership Development has sponsored these innovative developments and their application to human problems on Earth and in space. Hyperspectral datasets have been captured and analyzed in numerous areas including precision agriculture, food safety, biomedical imaging, and forensics. Discussion of research results will include real-time detection of food contaminants, mold and toxin research on corn, identifying counterfeit documents, non-invasive wound monitoring, and aircraft applications. Future research will include development of a thermal infrared hyperspectral sensor that will support natural resource applications on Earth and thermal analyses during long-duration space flight. This paper incorporates a variety of disciplines and imaging technologies that have been linked together to allow the expansion of remote sensing across both traditional and non-traditional boundaries.
Static and Dynamic Verification of Critical Software for Space Applications
NASA Astrophysics Data System (ADS)
Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.
Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation, and remote sensing are only a few examples of space applications on which we rely daily. The European driven global navigation system Galileo and its associated applications, e.g. air traffic management, vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With a long operational lifetime and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring their correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. But, traditionally, these have been applied in isolation. One of the main reasons is the immaturity of this field as concerns its application to the growing number of software products within space systems. This paper presents an innovative way of combining both static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The methodology proposed is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study herein described is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, Onboard software, SFMEA, SFTA, Fault-injection. This work is being performed under the project STADY (Applied Static And Dynamic Verification Of Critical Software), ESA/ESTEC Contract Nr. 15751/02/NL/LvH.
Efficient Design and Analysis of Lightweight Reinforced Core Sandwich and PRSEUS Structures
NASA Technical Reports Server (NTRS)
Bednarcyk, Brett A.; Yarrington, Phillip W.; Lucking, Ryan C.; Collier, Craig S.; Ainsworth, James J.; Toubia, Elias A.
2012-01-01
Design, analysis, and sizing methods for two novel structural panel concepts have been developed and incorporated into the HyperSizer Structural Sizing Software. Reinforced Core Sandwich (RCS) panels consist of a foam core with reinforcing composite webs connecting composite facesheets. Boeing's Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) panels use a pultruded unidirectional composite rod to provide axial stiffness along with integrated transverse frames and stitching. Both of these structural concepts are oven-cured and have shown great promise for applications in lightweight structures, but have suffered from the lack of efficient sizing capabilities similar to those that exist for honeycomb sandwich, foam sandwich, hat-stiffened, and other, more traditional concepts. Now, with accurate design methods for RCS and PRSEUS panels available in HyperSizer, these concepts can be traded and used in designs as is done with the more traditional structural concepts. The methods developed to enable sizing of RCS and PRSEUS are outlined, as are results showing the validity and utility of the methods. Applications include several large NASA heavy-lift launch vehicle structures.
Using Cloud-based Storage Technologies for Earth Science Data
NASA Astrophysics Data System (ADS)
Michaelis, A.; Readey, J.; Votava, P.
2016-12-01
Cloud based infrastructure may offer several key benefits of scalability, built in redundancy, and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided through all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file system based storage to vend earth science data. The system described is not only cost effective, but shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
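As a concrete illustration of object storage access (not the authors' client libraries), the sketch below uses boto3 ranged GET requests to read one piece of a large granule from an S3-style store without downloading the whole file; the bucket and key names are hypothetical.

```python
import boto3  # AWS SDK for Python; assumes an S3-style object store

s3 = boto3.client("s3")

def read_range(bucket, key, start, end):
    """Fetch bytes [start, end] of an object, e.g. one chunk of a
    large HDF5/NetCDF file, without retrieving the entire object."""
    resp = s3.get_object(Bucket=bucket, Key=key,
                         Range=f"bytes={start}-{end}")
    return resp["Body"].read()

# Hypothetical names: read the first 8 KiB of a granule's header
header = read_range("earth-science-data", "mod09/granule_001.h5", 0, 8191)
print(len(header))
```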
An automatic and effective parameter optimization method for model tuning
NASA Astrophysics Data System (ADS)
Zhang, T.; Li, L.; Lin, Y.; Xue, W.; Xie, F.; Xu, H.; Huang, X.
2015-11-01
Physical parameterizations in general circulation models (GCMs), having various uncertain parameters, greatly impact model performance and model climate sensitivity. Traditional manual and empirical tuning of these parameters is time-consuming and ineffective. In this study, a "three-step" methodology is proposed to automatically and effectively obtain the optimum combination of some key parameters in cloud and convective parameterizations according to a comprehensive objective evaluation metric. Different from traditional optimization methods, two extra steps, one determining the model's sensitivity to the parameters and the other choosing the optimum initial values for the sensitive parameters, are introduced before the downhill simplex method. This new method reduces the number of parameters to be tuned and accelerates the convergence of the downhill simplex method. Atmospheric GCM simulation results show that the optimum combination of these parameters determined using this method is able to improve the model's overall performance by 9%. The proposed methodology and software framework can be easily applied to other GCMs to speed up the model development process, especially regarding unavoidable comprehensive parameter tuning during the model development stage.
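A minimal sketch of steps 1 and 3 of the methodology (sensitivity screening, then downhill simplex over only the sensitive parameters) follows. The evaluation metric is a toy stand-in for an actual GCM run, and step 2 (choosing the optimum initial values) is omitted.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in: in practice metric(x) would run the GCM with the
# parameter vector x and return the comprehensive evaluation score.
TARGET = np.array([0.8, 1.5, 0.3])
def metric(x):
    return float(np.sum((x - TARGET) ** 2))

def screen_sensitive(x0, rel_step=0.1, threshold=1e-3):
    """Step 1: keep parameters whose perturbation changes the metric."""
    base, keep = metric(x0), []
    for k in range(len(x0)):
        x = x0.copy()
        x[k] += rel_step * max(abs(x0[k]), 1.0)
        if abs(metric(x) - base) > threshold:
            keep.append(k)
    return keep

x0 = np.array([1.0, 1.0, 1.0])
keep = screen_sensitive(x0)

def reduced_metric(sub):
    x = x0.copy()
    x[keep] = sub          # tune only the sensitive subset
    return metric(x)

# Step 3: downhill simplex (Nelder-Mead) in the reduced space
res = minimize(reduced_metric, x0[keep], method="Nelder-Mead")
print(keep, res.x)
```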
Software ``Best'' Practices: Agile Deconstructed
NASA Astrophysics Data System (ADS)
Fraser, Steven
Software “best” practices depend entirely on context - in terms of the problem domain, the system constructed, the software designers, and the “customers” ultimately deriving value from the system. Agile practices no longer have the luxury of “choosing” small non-mission critical projects with co-located teams. Project stakeholders are selecting and adapting practices based on a combination of interest, need and staffing. For example, growing product portfolios through a merger or the acquisition of a company exposes legacy systems to new staff, new software integration challenges, and new ideas. Innovation in communications (tools and processes) to span the growth and contraction of both information and organizations, while managing the adoption of changing software practices, is imperative for success. Traditional web-based tools such as web pages, document libraries, and forums are not sufficient. A blend of tweeting, blogs, wikis, instant messaging, web-based conferencing, and telepresence creates a new dimension of communication “best” practices.
Python based high-level synthesis compiler
NASA Astrophysics Data System (ADS)
Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard
2014-11-01
This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. An FPGA combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation, and first results of the created Python-based compiler.
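To make the Python-to-VHDL mapping concrete, here is a heavily simplified sketch of the kind of translation such a compiler performs: Python's ast module turns a one-line boolean function into a VHDL entity. This illustrates the general idea only; it is not the compiler described in the paper.

```python
import ast

SRC = "def f(a, b):\n    return a & b\n"   # toy input program
OPS = {ast.BitAnd: "and", ast.BitOr: "or", ast.BitXor: "xor"}

def expr(node):
    """Translate a tiny expression subset into VHDL."""
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.BinOp) and type(node.op) in OPS:
        return f"({expr(node.left)} {OPS[type(node.op)]} {expr(node.right)})"
    raise NotImplementedError(ast.dump(node))

func = ast.parse(SRC).body[0]
args = [a.arg for a in func.args.args]
body = expr(func.body[0].value)           # body of the return statement
ports = ";\n    ".join(f"{a} : in  std_logic" for a in args)

print(f"""entity {func.name} is
  port (
    {ports};
    result : out std_logic
  );
end {func.name};

architecture rtl of {func.name} is
begin
  result <= {body};
end rtl;""")
```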
Software as a service approach to sensor simulation software deployment
NASA Astrophysics Data System (ADS)
Webster, Steven; Miller, Gordon; Mayott, Gregory
2012-05-01
Traditionally, military simulation has been problem-domain specific. Executing an exercise currently requires multiple simulation software providers to specialize, deploy, and configure their respective implementations, integrate the collection of software to achieve a specific system behavior, and then execute it for the purpose at hand. This approach leads to rigid system integrations that require simulation expertise for each deployment due to changes in location, hardware, and software. Our alternative is Software as a Service (SaaS), predicated on the virtualization of Night Vision Electronic Sensors (NVESD) sensor simulations as an exemplary case. Management middleware elements layer self-provisioning, configuration, and integration services onto the virtualized sensors to present a system of services at run time. Given an Infrastructure as a Service (IaaS) environment, the enabled and managed system of simulations yields durable SaaS delivery without requiring user simulation expertise. Persistent SaaS simulations would provide on-demand availability to connected users, decrease integration costs and timelines, and give the domain community the benefit of immediate deployment of lessons learned.
Applying Agile Methods to the Development of a Community-Based Sea Ice Observations Database
NASA Astrophysics Data System (ADS)
Pulsifer, P. L.; Collins, J. A.; Kaufman, M.; Eicken, H.; Parsons, M. A.; Gearheard, S.
2011-12-01
Local and traditional knowledge and community-based monitoring programs are increasingly being recognized as an important part of establishing an Arctic observing network, and understanding Arctic environmental change. The Seasonal Ice Zone Observing Network (SIZONet, http://www.sizonet.org) project has implemented an integrated program for observing seasonal ice in Alaska. Observation and analysis by local sea ice experts helps track seasonal and inter-annual variability of the ice cover and its use by coastal communities. The ELOKA project (http://eloka-arctic.org) is collaborating with SIZONet on the development of a community accessible, Web-based application for collecting and distributing local observations. The SIZONet project is dealing with complicated qualitative and quantitative data collected from a growing number of observers in different communities while concurrently working to design a system that will serve a wide range of different end users including Arctic residents, scientists, educators, and other stakeholders with a need for sea ice information. The benefits of linking and integrating knowledge from communities and university-based researchers are clear, however, development of an information system in this multidisciplinary, multi-participant context is challenging. Participants are geographically distributed, have different levels of technical expertise, and have varying goals for how the system will be used. As previously reported (Pulsifer et al. 2010), new technologies have been used to deal with some of the challenges presented in this complex development context. In this paper, we report on the challenges and innovations related to working as a multi-disciplinary software development team. Specifically, we discuss how Agile software development methods have been used in defining and refining user needs, developing prototypes, and releasing a production level application. We provide an overview of the production application that includes discussion of a hybrid architecture that combines a traditional relational database, schema-less database, advanced free text search, and the preliminary framework for Semantic Web support. The current version of the SIZONet web application is discussed in relation to the high-value features defined as part of the Agile approach. Preliminary feedback indicates a system that meets the needs of multiple user groups.
Computational science: shifting the focus from tools to models
Hinsen, Konrad
2014-01-01
Computational techniques have revolutionized many aspects of scientific research over the last few decades. Experimentalists use computation for data analysis, processing ever bigger data sets. Theoreticians compute predictions from ever more complex models. However, traditional articles do not permit the publication of big data sets or complex models. As a consequence, these crucial pieces of information no longer enter the scientific record. Moreover, they have become prisoners of scientific software: many models exist only as software implementations, and the data are often stored in proprietary formats defined by the software. In this article, I argue that this emphasis on software tools over models and data is detrimental to science in the long term, and I propose a means by which this can be reversed. PMID:25309728
Software Authority Transition through Multiple Distributors
Han, Kyusunk; Shon, Taeshik
2014-01-01
The rapid growth in the use of smartphones and tablets has changed the software distribution ecosystem. The trend today is to purchase software through application stores rather than from traditional offline markets. Smartphone and tablet users can install applications easily by purchasing from the online store deployed in their device. Several systems, such as Android or PC-based OS units, allow users to install software from multiple sources. Such openness, however, can promote serious threats, including malware and illegal usage. In order to prevent such threats, several stores use online authentication techniques. These methods can, however, also present a problem whereby even licensed users cannot use their purchased application. In this paper, we discuss these issues and provide an authentication method that will make purchased applications available to the registered user at all times. PMID:25143971
Intellectual Property, Digital Technology and the Developing World
NASA Astrophysics Data System (ADS)
Pupillo, Lorenzo Maria
This chapter provides an overview of how the converging ICTs are challenging the traditional off-line copyright doctrine and suggests how developing countries should approach issues such as copyright in the digital world, software (Protection, Open Source, Reverse Engineering), and data base protection. The balance of the chapter is organized into three sections. After the introduction, the second section explains how digital technology is dramatically changing the entertainment industry, what are the major challenges to the industry, and what are the approaches that the economic literature suggest to face the structural changes that the digital revolution is bringing forward. Starting from the assumption that IPRs frameworks need to be customized to the countries’ development needs, the third section makes recommendations on how developing countries should use copyright to support access to information and to creative industries.
Research on digital city geographic information common services platform
NASA Astrophysics Data System (ADS)
Chen, Dequan; Wu, Qunyong; Wang, Qinmin
2008-10-01
The traditional GIS (Geographic Information System) software development mode exposes many defects that greatly slow the progress of city informatization. There is an urgent need to build a common application infrastructure for informatization projects to speed up the development of the digital city. The advent of service-oriented architecture (SOA) has motivated the adoption of GIS functionality portals that can be executed in distributed computing environments. Following the SOA principle, we propose and design a digital city geographic information common services platform, which provides application development service interfaces for domain users and can be further extended into relevant business applications. Finally, a public-oriented Web GIS was developed based on the platform to help public users query geographic information in their daily life. It demonstrates that the platform can be conveniently integrated into other applications.
New and existing roadway inventory data acquisition methods
DOT National Transportation Integrated Search
2000-12-01
A number of agencies collect roadway inventory data using the traditional manual method. Representing an advancement in roadway inventory data collection, mobile mapping systems use state-of-the-art imaging, georeference, and software technologies to...
Software Replica of Minimal Living Processes
NASA Astrophysics Data System (ADS)
Bersini, Hugues
2010-04-01
There is a long tradition of software simulation in theoretical biology to complement pure analytical mathematics, which is often limited in reproducing and understanding the self-organization phenomena that result from the non-linear and spatially grounded interactions of a huge number of diverse biological objects. Since John von Neumann's and Alan Turing's pioneering work on self-replication and morphogenesis, proponents of artificial life have resolutely chosen to neglect a great deal of materialistic and quantitative information deemed not indispensable and have focused on the rule-based mechanisms making life possible, supposedly neutral with respect to their underlying material embodiment. Minimal life begins at the intersection of a series of processes which need to be isolated, differentiated, and duplicated as such in computers. Only software development and execution make it possible to understand how these processes are intimately interconnected in order for life to appear at the crossroads. In this paper, I attempt to set out the history of life as the disciples of artificial life understand it, by placing these different lessons on a temporal and causal axis, showing which one is indispensable to the appearance of the next and how it connects to the next. I discuss the task of artificial life as setting up experimental software platforms where these different lessons, whether taken in isolation or together, are tested, simulated, and, more systematically, analyzed. I sketch some of these existing software platforms: chemical reaction networks, Varela's autopoietic cellular automata, and Ganti's chemoton model, whose runs deliver interesting take-home messages to open-minded biologists.
Rosser, James C; Fleming, Jeffrey P; Legare, Timothy B; Choi, Katherine M; Nakagiri, Jamie; Griffith, Elliot
2017-12-22
To design and develop a distance learning (DL) system for the transference of laparoscopic surgery knowledge and skill constructed from off-the-shelf materials and commercially available software. Minimally invasive surgery offers significant benefits over traditional surgical procedures, but adoption rates for many procedures are low. Skill and confidence deficits are two of the culprits. DL combined with simulation training and telementoring may address these issues with scale. The system must be built to meet the instruction requirements of a proven laparoscopic skills course (Top Gun). Thus, the rapid sharing of multimedia educational materials, secure two-way audio/visual communications, and annotation and recording capabilities are requirements for success. These requirements are more in line with telementoring missions than standard distance learning efforts. A DL system with telementor, classroom, and laboratory stations was created. The telementor station consists of a desktop computer and headset with microphone. For the classroom station, a laptop is connected to a digital projector that displays the remote instructor and content. A tripod-mounted webcam provides classroom visualization and a Bluetooth® wireless speaker establishes audio. For the laboratory station, a laptop with universal serial bus (USB) expander is combined with a tabletop laparoscopic skills trainer, a headset with microphone, two webcams and a Bluetooth® speaker. The cameras are mounted on a standard tripod and an adjustable gooseneck camera mount clamp to provide an internal and external view of the training area. Internet meeting software provides audio/visual communications including transmission of educational materials. A DL system was created using off-the-shelf materials and commercially available software. It will allow investigations to evaluate the effectiveness of laparoscopic surgery knowledge and skill transfer utilizing DL techniques.
NASA Astrophysics Data System (ADS)
Eslinger, Eric Martin
Metacognitive skills are a crucial component of a successful learning career. We define metacognition as the ability to plan, monitor progress toward a goal, reflect on the quality of work and process, and revise the work or plan accordingly. By explicitly addressing certain metacognitive practices in classrooms, researchers have observed improved learning outcomes in both science and mathematical problem solving. Although these efforts were successful, they were also limited in the range of skills that could be addressed at one time and the methods used to address them, due to the static nature inherent in the traditional pencil-and-paper format. We wished to address these skills in a more dynamic, continuous representation such as that afforded by a computerized learning environment. This paper outlines such an environment and describes pedagogical activities afforded by the system. The ThinkerTools group developed and tested a software scaffold for inquiry projects in a middle-school classroom. By analyzing student use of the software tool, three forms of self-assessment activity were noted: integrated, task, and project self-assessment. Each assessment form was related to the degree of interleaving between assessment and work the students engaged in as they developed their inquiry products. I argue that the integrated forms of assessment are more beneficial to student learning, and show that there is a significant relationship between active self-assessment forms and measures of student achievement and product quality. Through the use of case studies including video analysis, I address specific student self-assessment activity that utilized the software as well as self-assessment that took place outside of the software. A model of student self-assessment activity was created, highlighting aspects of activity that afford more productive self-assessment episodes.
Bigdata Driven Cloud Security: A Survey
NASA Astrophysics Data System (ADS)
Raja, K.; Hanifa, Sabibullah Mohamed
2017-08-01
Cloud Computing (CC) is a fast-growing technology for performing massive-scale and complex computing. It eliminates the need to maintain expensive computing hardware, dedicated space, and software. Recently, massive growth has been observed in the scale of data, or big data, generated through cloud computing. CC consists of a front-end, which includes the users' computers and the software required to access the cloud network, and a back-end, which consists of the various computers, servers, and database systems that create the cloud. In SaaS (Software-as-a-Service: end users utilize outsourced software), PaaS (Platform-as-a-Service: a platform is provided), IaaS (Infrastructure-as-a-Service: the physical environment is outsourced), and DaaS (Database-as-a-Service: data can be housed within a cloud), the leading / traditional cloud ecosystem that delivers cloud services has become a powerful and popular architecture. Many challenges and issues arise in security and threats, the most vital barrier for the cloud computing environment. The main barrier to the adoption of CC in health care relates to data security: when placing and transmitting data over public networks, cyber attacks in any form are anticipated in CC. Hence, cloud service users need to understand the risk of data breaches and the adoption of the service delivery model during deployment. This survey covers CC security issues in depth (including data security in health care) so that researchers can develop robust security application models using Big Data (BD) on CC, which can then be created and deployed easily, since BD evolution is driven by fast-growing cloud-based applications developed using virtualized technologies. In this purview, MapReduce [12] is a good example of big data processing in a cloud environment and a model for cloud providers.
Developing a Treatment Planning Software Based on TG-43U1 Formalism for Cs-137 LDR Brachytherapy.
Sina, Sedigheh; Faghihi, Reza; Soleimani Meigooni, Ali; Siavashpour, Zahra; Mosleh-Shirazi, Mohammad Amin
2013-08-01
The old Treatment Planning Systems (TPSs) used for intracavitary brachytherapy with the Cs-137 Selectron source utilize traditional dose calculation methods, considering each source as a point source. Using such methods introduces significant errors in dose estimation. Since 1995, TG-43 has been used as the main dose calculation formalism in TPSs. The purpose of this study is to design and establish a treatment planning software for the Cs-137 Selectron brachytherapy source, based on the TG-43U1 formalism and accounting for the effects of the applicator and dummy spacers. The two softwares used for treatment planning of Cs-137 sources in Iran (STPS and PLATO) are based on the old formalisms. In the new planning system, the dosimetry parameters of each pellet at different positions inside the applicators were obtained with the MCNP4C code. The dose distribution around every combination of active and inactive pellets was then obtained by summing the single-pellet doses. The accuracy of this algorithm was checked by comparing its results for particular combinations of active and inactive pellets with MC simulations. Finally, the uncertainty of the old dose calculation formalism was investigated by comparing the results of the STPS and PLATO softwares with those obtained by the new algorithm. For a typical arrangement of 10 active pellets in the applicator, the percentage difference between doses obtained by the new algorithm at 1 cm distance from the tip of the applicator and those obtained by the old formalisms is about 30%, while the difference between the results of MCNP and the new algorithm is less than 5%. According to the results, the old dosimetry formalisms overestimate the dose, especially towards the applicator's tip, while the TG-43U1-based software performs the calculations more accurately.
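The superposition step described above (summing precomputed single-pellet dose distributions over whichever pellet slots are active) can be sketched as follows. The array shapes and the random stand-in for the Monte Carlo kernels are hypothetical; a real implementation would load the MCNP-derived per-slot dose grids.

```python
import numpy as np

# kernels[k]: dose-rate grid for a single active pellet in slot k,
# as precomputed by Monte Carlo (random stand-in data here).
n_slots, grid = 48, (64, 64, 64)
kernels = np.random.default_rng(0).random((n_slots, *grid))

def total_dose(active_slots, kernels, sk=1.0):
    """Superpose single-pellet kernels over the active pellet slots.
    sk: air-kerma strength scaling, assumed uniform across pellets."""
    return sk * kernels[list(active_slots)].sum(axis=0)

dose = total_dose([0, 3, 4, 7, 9], kernels)  # any active/inactive pattern
print(dose.shape)
```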
Using a web-based survey tool to undertake a Delphi study: application for nurse education research.
Gill, Fenella J; Leslie, Gavin D; Grech, Carol; Latour, Jos M
2013-11-01
The Internet is increasingly being used as a data collection medium to access research participants. This paper reports on the experience and value of using web-survey software to conduct an eDelphi study to develop Australian critical care course graduate practice standards. The eDelphi technique used involved the iterative process of administering three rounds of surveys to a national expert panel. The survey was developed online using SurveyMonkey. Panel members responded to statements using one rating scale for round one and two scales for rounds two and three. Text boxes for panel comments were provided. For each round, the SurveyMonkey's email tool was used to distribute an individualized email invitation containing the survey web link. The distribution of panel responses, individual responses and a summary of comments were emailed to panel members. Stacked bar charts representing the distribution of responses were generated using the SurveyMonkey software. Panel response rates remained greater than 85% over all rounds. An online survey provided numerous advantages over traditional survey approaches including high quality data collection, ease and speed of survey administration, direct communication with the panel and rapid collation of feedback allowing data collection to be undertaken in 12 weeks. Only minor challenges were experienced using the technology. Ethical issues, specific to using the Internet to conduct research and external hosting of web-based software, lacked formal guidance. High response rates and an increased level of data quality were achieved in this study using web-survey software and the process was efficient and user-friendly. However, when considering online survey software, it is important to match the research design with the computer capabilities of participants and recognize that ethical review guidelines and processes have not yet kept pace with online research practices.
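A minimal sketch of the per-round feedback step (rating distributions and a simple consensus flag per statement) is shown below; the agreement levels and the 75% threshold are illustrative assumptions, not the study's criteria.

```python
from collections import Counter

def round_summary(ratings, agree_levels=(4, 5), threshold=0.75):
    """ratings: {statement: [1-5 Likert scores from the panel]}."""
    summary = {}
    for stmt, scores in ratings.items():
        dist = Counter(scores)
        agreement = sum(dist[l] for l in agree_levels) / len(scores)
        summary[stmt] = {"distribution": dict(dist),
                         "agreement": round(agreement, 2),
                         "consensus": agreement >= threshold}
    return summary

print(round_summary({"S1": [5, 4, 4, 5, 3], "S2": [2, 3, 4, 2, 3]}))
```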
MDAS: an integrated system for metabonomic data analysis.
Liu, Juan; Li, Bo; Xiong, Jiang-Hui
2009-03-01
Metabonomics, the latest 'omics' research field, shows great promise as a tool in biomarker discovery, drug efficacy and toxicity analysis, and disease diagnosis and prognosis. One of the major challenges now facing researchers is how to process these data to yield useful information about a biological system, e.g., the mechanism of a disease. Traditional methods employed in metabonomic data analysis use multivariate analysis methods developed independently in chemometrics research. Additionally, with the development of machine learning approaches, methods such as SVMs also show promise for use in metabonomic data analysis. Aside from the application of general multivariate analysis and machine learning methods to this problem, there is also a need for an integrated tool customized for metabonomic data analysis that can be easily used by biologists to reveal interesting patterns in metabonomic data. In this paper, we present a novel software tool, MDAS (Metabonomic Data Analysis System), for metabonomic data analysis, which integrates traditional chemometrics methods and newly introduced machine learning approaches. MDAS contains a suite of functional models for metabonomic data analysis and optimizes the flow of data analysis. Several file formats can be accepted as input. The input data can be optionally preprocessed and can then be processed with operations such as feature analysis and dimensionality reduction. The data with reduced dimensionalities can be used for training or testing through machine learning models. The system supplies proper visualization for data preprocessing, feature analysis, and classification, which can be a powerful function for users to extract knowledge from the data. MDAS is an integrated platform for metabonomic data analysis, which transforms a complex analysis procedure into a more formalized and simplified one. The software package can be obtained from the authors.
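The preprocessing, dimensionality reduction, and machine-learning classification flow described above can be sketched with scikit-learn (a generic stand-in, not MDAS itself); the spectral data here are random placeholders.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))    # 60 samples x 200 spectral features
y = rng.integers(0, 2, size=60)   # case/control labels (placeholder)

# Preprocess -> reduce dimensionality -> classify
model = make_pipeline(StandardScaler(), PCA(n_components=10),
                      SVC(kernel="rbf"))
print(cross_val_score(model, X, y, cv=5).mean())
```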
Effects of telework and the virtual enterprise on the organization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moore, R.A.
1996-12-31
This paper provides information on the growing trend towards telework and using "virtual employees" as a fundamental component of the human resource requirements for the conduct of business. As the organization moves from a traditional approach of fixed plant and permanent employees toward a more dynamic model of mobile office arrangements and virtual workers, new challenges arise for workers, supervisors, and managers. These challenges pertain to both the individual and the organization and are rooted in both technology and human behavior. Notwithstanding the challenges, the opportunities created for increased productivity and cost-effective operations are propelling organizations globally to adopt the virtual enterprise model, to a greater or lesser extent. Management hierarchy is giving way to autonomous teams. Middle management is being replaced by better organizational communication systems, better information storage and retrieval systems, and a newly developing classification of software called groupware. In the midst of these changes, the business process of identifying and acquiring the services of the virtual team member seems to lie at the intersection of Human Resources, Information Systems, Contracts/Subcontracts, and the functional department requiring the services. Human Resources departments are slowly coming to grips with the virtual worker model but are largely uncomfortable in the role. Information Systems departments can implement networks, but dynamic links outside the traditional organization bring up a myriad of questions about compatibility and system security. The champion of the virtual worker is the functional department. This might be engineering, software development, the design department, the financial analysis group, or whichever department in the organization is faced with the responsibility of creating knowledge work product and has resource constraints and upper management support.
A decision support system for quality of life in head and neck oncology patients.
Gonçalves, Joaquim J; Rocha, Alvaro M
2012-02-16
The assessment of Quality of Life (QoL) is a medical goal; it is used in clinical research, medical practice, health-related economic studies and in planning health management measures and strategies. The objective of this project is to develop an informational platform for patient self-assessment with standardized QoL measuring instruments, through user-friendly software that is easy to adapt. The platform should aid the study of QoL by promoting the creation of databases, accelerating their statistical treatment, and generating useful results in graphical format for the physician to analyze in an appointment immediately after the answers are collected. First, a software platform was designed and developed in an action-research process with patients, physicians and nurses. The computerized patient self-assessment with standardized QoL measuring instruments was compared with the traditional one, to verify that its use did not influence the patients' answers. For that, the Wilcoxon and Student's t-tests were applied. Afterwards, we adopted and adapted the mathematical Rasch model to make it possible to use QoL measures in routine appointments. The results show that the computerized patient self-assessment does not influence the patients' answers and can be used as a suitable tool in the routine appointment, because it indicates problems that are more difficult to identify in a traditional appointment, thus improving the physician's decisions. The possibility of presenting graphically the useful results that the physician needs to analyze in the appointment, immediately after answer collection, makes this QoL assessment platform a diagnostic instrument ready to be used routinely in clinical practice.
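The statistical check described above pairs each patient's computerized and paper scores and tests whether the mode of administration shifts the answers. A hedged sketch with SciPy, using synthetic placeholder data rather than the study's:

```python
# Paired comparison of computerized vs traditional QoL self-assessment scores
# with the Wilcoxon signed-rank and paired t-tests. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
paper = rng.normal(50, 10, size=40)           # QoL scores, paper administration
computer = paper + rng.normal(0, 2, size=40)  # same patients, computerized

w_stat, w_p = stats.wilcoxon(paper, computer)
t_stat, t_p = stats.ttest_rel(paper, computer)
print(f"Wilcoxon p={w_p:.3f}, paired t-test p={t_p:.3f}")
# Large p-values are consistent with the mode of administration not
# influencing patients' answers, as the study reports.
```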
Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.
Demchak, Barry; Krüger, Ingolf
2012-07-01
The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus enabling policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost effectively evolve such applications over a long lifetime.
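A minimal sketch of the idea of runtime decision-point injection: workflows consult named hooks whose policies can be bound or replaced while the system runs. This is our illustration of the concept, not the authors' PDD API; all names here are invented.

```python
# Decision points as named hooks: policies are injected at runtime, so a
# stakeholder group can change access control without touching the workflow.
_policies = {}

def set_policy(point, func):
    """Inject or replace the policy bound to a decision point at runtime."""
    _policies[point] = func

def decide(point, context, default=True):
    """Evaluate the policy at a decision point; fall back to a default."""
    return _policies.get(point, lambda ctx: default)(context)

def download_workflow(user, resource):
    # The workflow itself encodes no policy; it only names the decision point.
    if not decide("access_control", {"user": user, "resource": resource}):
        raise PermissionError(f"{user} may not access {resource}")
    return f"{resource} delivered to {user}"

# Later, one stakeholder group injects its own policy, oblivious to others:
set_policy("access_control", lambda ctx: ctx["user"].endswith("@lab.org"))
print(download_workflow("alice@lab.org", "dataset-42"))
```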
Fiorelli, Alfonso; Raucci, Antonio; Cascone, Roberto; Reginelli, Alfonso; Di Natale, Davide; Santoriello, Carlo; Capuozzo, Antonio; Grassi, Roberto; Serra, Nicola; Polverino, Mario; Santini, Mario
2017-04-01
We proposed a new virtual bronchoscopy tool to improve the accuracy of traditional transbronchial needle aspiration for mediastinal staging. Chest computed tomographic images (1 mm thickness) were reconstructed with Osirix software to produce a virtual bronchoscopic simulation. The target adenopathy was identified by measuring its distance from the carina on multiplanar reconstruction images. The static images were uploaded into iMovie software, which produced a virtual bronchoscopic movie from the images; the movie was then transferred to a tablet computer to provide real-time guidance during a biopsy. To test the validity of our tool, we retrospectively divided all consecutive patients undergoing transbronchial needle aspiration into two groups based on whether the biopsy was guided by virtual bronchoscopy (virtual bronchoscopy group) or not (traditional group). The intergroup diagnostic yields were statistically compared. Our analysis included 53 patients in the traditional and 53 in the virtual bronchoscopy group. The sensitivity, specificity, positive predictive value, negative predictive value and diagnostic accuracy for the traditional group were 66.6%, 100%, 100%, 10.53% and 67.92%, respectively, and for the virtual bronchoscopy group were 84.31%, 100%, 100%, 20% and 84.91%, respectively. The sensitivity (P = 0.011) and diagnostic accuracy (P = 0.011) of sampling the paratracheal station were better for the virtual bronchoscopy group than for the traditional group; no significant differences were found for the subcarinal lymph node. Our tool is simple, economical and available in all centres. It guided the needle insertion in real time, thereby improving the accuracy of traditional transbronchial needle aspiration, especially when target lesions are located in a difficult site like the paratracheal station. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
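The reported yields follow from a standard 2x2 table of biopsy outcomes. A short worked sketch; the counts below are our reconstruction chosen to be consistent with the virtual bronchoscopy group's reported figures, not the study's published raw data:

```python
# Derive sensitivity, specificity, PPV, NPV and accuracy from a 2x2 table.
def diagnostic_yield(tp, fp, tn, fn):
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    acc = (tp + tn) / (tp + fp + tn + fn)
    return sens, spec, ppv, npv, acc

# Illustrative counts for 53 patients, consistent with the yields above.
sens, spec, ppv, npv, acc = diagnostic_yield(tp=43, fp=0, tn=2, fn=8)
print(f"sensitivity={sens:.2%} specificity={spec:.0%} "
      f"PPV={ppv:.0%} NPV={npv:.0%} accuracy={acc:.2%}")
# -> sensitivity=84.31% specificity=100% PPV=100% NPV=20% accuracy=84.91%
```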
Bone age maturity assessment using hand-held device
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Gilsanz, Vicente; Liu, Xiaodong; Boechat, M. I.
2004-04-01
Purpose: Assessment of bone maturity is traditionally performed through visual comparison of hand and wrist radiographs with existing reference images in textbooks. Our goal was to develop a digital index based on idealized hand X-ray images that can be incorporated in a hand-held computer and used for visual assessment of bone age for patients. Material and methods: Due to the large variability in bone maturation in normal subjects, we generated a set of "ideal" images obtained by computer combination of images from our normal reference data sets. Software for hand-held PDA devices was developed for easy navigation through the set of images and visual selection of matching images. A formula based on our statistical analysis provides the standard deviation from normal based on the chronological age of the patient. The accuracy of the program was compared to traditional interpretation by two radiologists in a double-blind reading of 200 normal Caucasian children (100 boys, 100 girls). Results: Strong correlations were present between chronological age and bone age (r > 0.9), with no statistical difference between the digital and traditional assessment methods. Determination of carpal bone maturity in adolescents was slightly more accurate using the digital system. Users praised the convenience and effectiveness of the digital Palm Index in clinical practice. Conclusion: An idealized digital Palm Bone Age Index provides a convenient and effective alternative to conventional atlases for the assessment of skeletal maturity.
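The formula described above reduces to a deviation score: the assessed bone age minus the normal mean for the patient's chronological age, in units of the reference standard deviation. A hedged sketch; the mean and SD functions below are hypothetical stand-ins for the authors' reference statistics, which are not given in the abstract:

```python
# Standard deviations from normal for a given chronological age.
def bone_age_z(bone_age_yr, chrono_age_yr, mean_fn, sd_fn):
    """(assessed bone age - reference mean) / reference SD at that age."""
    return (bone_age_yr - mean_fn(chrono_age_yr)) / sd_fn(chrono_age_yr)

# Hypothetical reference: mean bone age tracks chronological age, SD ~ 1 year.
z = bone_age_z(11.5, 10.0, mean_fn=lambda a: a, sd_fn=lambda a: 1.0)
print(f"{z:+.1f} SD from normal")  # +1.5 SD advanced in this example
```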
Farinango, Charic D; Benavides, Juan S; Cerón, Jesús D; López, Diego M; Álvarez, Rosa E
2018-01-01
Previous studies have demonstrated the effectiveness of information and communication technologies to support healthy lifestyle interventions. In particular, personal health record systems (PHR-Ss) empower self-care, essential to support lifestyle changes. Approaches such as user-centered design (UCD), which is already a standard within the software industry (ISO 9241-210:2010), provide specifications and guidelines to guarantee user acceptance and quality of eHealth systems. However, no single PHR-S for metabolic syndrome (MS) developed following the recommendations of the ISO 9241-210:2010 specification has been found in the literature. The aim of this study was to describe the development of a PHR-S for the management of MS according to the principles and recommendations of the ISO 9241-210 standard. The proposed PHR-S was developed using a formal software development process which, in addition to the traditional activities of any software process, included the principles and recommendations of the ISO 9241-210 standard. To gather user information, a survey of 1,187 individuals, eight interviews, and a focus group with seven people were conducted. Throughout five iterations, three prototypes were built. Potential users evaluated each prototype. The quality attributes of efficiency, effectiveness, and user satisfaction were assessed using metrics defined in the ISO/IEC 25022 standard. The following results were obtained: 1) a technology profile of 1,187 individuals at risk for MS from the city of Popayan, Colombia, identifying that 75.2% of the people use the Internet and 51% have a smartphone; 2) a PHR-S to manage MS with five main functionalities: recording the five MS risk factors, sharing these measures with health care professionals, and three educational modules on nutrition, stress management, and physical activity; and 3) usability tests on each prototype, obtaining the following results: 100% effectiveness, 100% efficiency, and 84.2 points on the System Usability Scale. The software development methodology used was based on the ISO 9241-210 standard, which allowed the development team to maintain a focus on users' needs and requirements throughout the project, resulting in increased satisfaction and acceptance of the system. Additionally, the establishment of a multidisciplinary team allowed the application of considerations not only from the disciplines of software engineering and health sciences but also from other disciplines such as graphical design and media communication. Finally, usability testing allowed the observation of flaws in the designs, which helped to improve the solution.
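The 84.2-point result above comes from the standard System Usability Scale scoring rule: ten items rated 1-5, odd items scored as (rating - 1), even items as (5 - rating), the sum multiplied by 2.5 to give 0-100. A short sketch with illustrative (not the study's) responses:

```python
# Standard SUS scoring: ten 1-5 ratings, item 1 first.
def sus_score(ratings):
    assert len(ratings) == 10
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # 0-indexed: even index = odd item
                for i, r in enumerate(ratings))
    return total * 2.5

print(sus_score([5, 2, 4, 1, 5, 2, 4, 2, 4, 1]))  # -> 85.0
```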
SERVER DEVELOPMENT FOR NSLS-II PHYSICS APPLICATIONS AND PERFORMANCE ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shen, G.; Kraimer, M.
2011-03-28
The beam commissioning software framework of the NSLS-II project adopts a client/server-based architecture to replace the more traditional monolithic high-level application approach. The server software under development is available via an open source SourceForge project named epics-pvdata, which consists of the modules pvData, pvAccess, pvIOC, and pvService. Examples of two services that already exist in the pvService module are itemFinder and gather. Each service uses pvData to store in-memory transient data, pvAccess to transfer data over the network, and pvIOC as the service engine. Performance benchmarking for pvAccess and for both the gather and itemFinder services is presented in this paper, along with a performance comparison between pvAccess and Channel Access. For an ultra-low-emittance synchrotron radiation light source like NSLS-II, the control system requirements, especially for beam control, are tight. To control and manipulate the beam effectively, a use case study and a theoretical evaluation have been performed. The analysis shows that model-based control is indispensable for beam commissioning and routine operation. However, there are many challenges, such as how to re-use a design model for on-line model-based control, and how to combine numerical methods for modeling a realistic lattice with analytical techniques for analyzing its properties. To satisfy these requirements and challenges, an adequate system architecture for the beam commissioning and operation software framework is critical. The existing traditional approaches are self-consistent and monolithic. Some of them have adopted the concept of a middle layer to separate low-level hardware processing from numerical algorithm computing, physics modelling, data manipulation and plotting, and error handling. However, none of the existing approaches satisfies the requirements. A new design has been proposed by introducing service-oriented architecture technology, and a client interface is under development. The design and implementation adopt a new EPICS implementation, namely epics-pvdata [9], which is under active development. The implementation of this project in Java is close to stable, and bindings to other languages such as C++ and/or Python are under development. In this paper, we focus on the performance benchmarking and comparison for pvAccess and Channel Access, and on the performance evaluation of the two services, gather and itemFinder.
Passman, Dina B.
2013-01-01
Objective The objective of this demonstration is to show conference attendees how they can integrate, analyze, and visualize diverse data types from across a variety of systems by leveraging an off-the-shelf enterprise business intelligence (EBI) solution to support decision-making in disasters. Introduction Fusion Analytics is the data integration system developed by the Fusion Cell at the U.S. Department of Health and Human Services (HHS), Office of the Assistant Secretary for Preparedness and Response (ASPR). Fusion Analytics meaningfully augments traditional public and population health surveillance reporting by providing web-based data analysis and visualization tools. Methods Fusion Analytics serves as a one-stop shop for web-based data visualizations of multiple real-time data sources within ASPR. Its 24-7 web availability makes it an ideal analytic tool for situational awareness and response, allowing stakeholders to access the portal from any internet-enabled device without installing any software. The Fusion Analytics data integration system was built using off-the-shelf EBI software. Fusion Analytics leverages the full power of statistical analysis software and delivers reports to users in a secure web-based environment. Fusion Analytics provides an example of how public health staff can develop and deploy a robust public health informatics solution using an off-the-shelf product and with limited development funding. It also provides the unique example of a public health information system that combines patient data for traditional disease surveillance with manpower and resource data to provide overall decision support for federal public health and medical disaster response operations. Conclusions We are currently in a unique position within public health. On the one hand, we have been gaining greater and greater access to electronic data of all kinds over the last few years. On the other, we are working in a time of reduced government spending to support leveraging this data for decision support with robust analytics and visualizations. Fusion Analytics provides an opportunity for attendees to see how various types of data are integrated into a single application for population health decision support. It also can provide them with ideas of how they can use their own staff to create analyses and reports that support their public health activities.
Networked Instructional Chemistry: Using Technology To Teach Chemistry
NASA Astrophysics Data System (ADS)
Smith, Stanley; Stovall, Iris
1996-10-01
Networked multimedia microcomputers provide new ways to help students learn chemistry and to help instructors manage the learning environment. This technology is used to replace some traditional laboratory work, collect on-line experimental data, enhance lectures and quiz sections with multimedia presentations, provide prelaboratory training for the beginning non-chemistry-major organic laboratory, provide electronic homework for organic chemistry students, give graduate students access to real NMR data for analysis, and provide access to molecular modeling tools. The integration of all of these activities into an active learning environment is made possible by a client-server network of hundreds of computers. This requires not only instructional software but also classroom and course management software, computers, networking, and room management. Combining computer-based work with traditional course material is made possible with software management tools that allow the instructor to monitor the progress of each student and make available an on-line gradebook so students can see their grades and class standing. This client-server based system extends the capabilities of the earlier mainframe-based PLATO system, which was used for instructional computing. This paper outlines the components of a technology center used to support over 5,000 students per semester.
Real-time Implementation of the Waveloc Technique for Monitoring Earthquake Swarms
NASA Astrophysics Data System (ADS)
Maggi, A.; Langet, N.; Michelini, A.
2013-12-01
Monitoring regions with high swarm-type seismicity (e.g. volcanoes, certain tectonic regions) is a challenge for the traditional pick-associate-locate type algorithms that form the basis of most seismicity monitoring software. Over the past few years, new approaches that avoid the association phase by direct migration of some characteristic function of the recorded seismograms have started to be implemented, and have shown great promise (see related abstract on the Waveloc method applied to Piton de la Fournaise volcano). Implementing such methods in real-time is an essential step in proving their usefulness and robustness in swarm-monitoring situations. Here we describe the work in progress on adapting the Waveloc migration technique to real-time operation. The resulting software package, RT-Waveloc, is currently in the prototype stage, and we hope to have a version that can be distributed to the scientific community for beta-testing within a year. The development of RT-Waveloc is financed by the EU NERA project.
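The core idea, migrating a characteristic function instead of picking and associating phases, can be sketched compactly. The illustration below is ours, not the RT-Waveloc code, and substitutes a simple energy-based STA/LTA for the characteristic functions the method actually uses; travel times and waveforms are synthetic placeholders:

```python
# Migration-based detection: stack per-station characteristic functions along
# precomputed travel times for each trial source point; maxima in the stack
# give origin times and locations with no pick-association step.
import numpy as np

def sta_lta(x, nsta, nlta):
    """A simple short-term/long-term average characteristic function."""
    e = x ** 2
    sta = np.convolve(e, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(e, np.ones(nlta) / nlta, mode="same")
    return sta / np.maximum(lta, 1e-12)

def migrate(cfs, travel_samples):
    """cfs: (n_stations, n_samples); travel_samples: (n_points, n_stations).
    np.roll wraps at the trace ends -- a simplification for brevity."""
    n_pts, n_sta = travel_samples.shape
    stack = np.zeros((n_pts, cfs.shape[1]))
    for p in range(n_pts):
        for s in range(n_sta):
            stack[p] += np.roll(cfs[s], -travel_samples[p, s])
    return stack

rng = np.random.default_rng(2)
cfs = np.stack([sta_lta(rng.normal(size=2000), 20, 200) for _ in range(4)])
tt = rng.integers(0, 50, size=(10, 4))          # travel times in samples
point, t0 = np.unravel_index(np.argmax(migrate(cfs, tt)), (10, 2000))
print("best trial point:", point, "origin sample:", t0)
```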
Research on distributed optical fiber sensing data processing method based on LabVIEW
NASA Astrophysics Data System (ADS)
Li, Zhonghu; Yang, Meifang; Wang, Luling; Wang, Jinming; Yan, Junhong; Zuo, Jing
2018-01-01
Pipeline leak detection and leak location have received extensive attention in industry. In this paper, a distributed optical fiber sensing system is designed for a heat supply pipeline, and the data processing method for distributed optical fiber sensing based on LabVIEW is studied in detail. The hardware system includes a laser, sensing optical fiber, wavelength division multiplexer, photoelectric detector, data acquisition card and computer. The software system is developed using LabVIEW and adopts a wavelet denoising method to process the temperature information, which improves the SNR. By extracting characteristic values from the fiber temperature information, the system realizes temperature measurement, leak location, and measurement signal storage and query. Compared with the traditional negative pressure wave or acoustic signal methods, the distributed optical fiber temperature measuring system can measure several temperatures in one measurement and locate the leak point accurately. It has broad application prospects.
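A hedged sketch of the wavelet-denoising-then-locate step, using PyWavelets in place of the LabVIEW implementation; the signal is a synthetic stand-in for the distributed temperature trace, and the thresholding rule (universal threshold with a robust noise estimate) is a common choice, not necessarily the authors':

```python
# Wavelet denoising of a distributed temperature trace, then leak location
# by finding the temperature anomaly peak. Synthetic data only.
import numpy as np
import pywt

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 1024)                           # normalized fiber position
clean = 20 + 5 * np.exp(-((t - 0.6) ** 2) / 0.001)    # local hot spot near a leak
noisy = clean + rng.normal(0, 0.8, t.size)

coeffs = pywt.wavedec(noisy, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise estimate
thresh = sigma * np.sqrt(2 * np.log(noisy.size))      # universal threshold
coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[:noisy.size]

leak_index = int(np.argmax(denoised))                 # feature: peak location
print("leak located near position", t[leak_index])
```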
Queue and stack sorting algorithm optimization and performance analysis
NASA Astrophysics Data System (ADS)
Qian, Mingzhu; Wang, Xiaobao
2018-04-01
Sorting is one of the basic operations in software development, and data structures courses cover many sorting algorithms in detail. The performance of the sorting algorithm is directly related to the efficiency of the software. Much research effort has gone into optimizing sorting algorithms for efficiency. Here, the authors further study sorting algorithms that combine queues with stacks. The algorithm mainly exploits the alternating use of queue and stack storage properties, thus avoiding the large number of exchange or move operations required by traditional sorts. Building on existing work, the algorithm is improved and optimized, with the focus on reducing time complexity; the time complexity, space complexity and stability of the algorithm are also analyzed. The experimental results show that the improvements are effective and that the improved and optimized algorithm is more practical.
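One way to make the alternating queue/stack idea concrete is the sketch below. This is our illustration, not necessarily the paper's exact algorithm: elements stream from a FIFO queue into an auxiliary stack kept in sorted order, and out-of-order stack tops recirculate through the queue, so elements are only pushed and popped, never exchanged in place:

```python
# Sort using queue and stack storage properties instead of in-place swaps.
from collections import deque

def queue_stack_sort(queue):
    stack = []                         # kept ascending from bottom to top
    while queue:
        item = queue.popleft()         # FIFO removal from the queue
        while stack and stack[-1] > item:
            queue.append(stack.pop())  # recirculate larger items via the queue
        stack.append(item)             # insert item in sorted position
    return stack

print(queue_stack_sort(deque([5, 1, 4, 2, 3])))  # -> [1, 2, 3, 4, 5]
```

The smallest unsettled element eventually reaches the stack bottom and never leaves, so the procedure terminates; its worst case is quadratic, like the traditional swap-based insertion sort it replaces.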
NASA Astrophysics Data System (ADS)
Yang, Eunice
2016-02-01
This paper discusses the use of a free mobile engineering application (app) called Autodesk® ForceEffect™ to provide students assistance with spatial visualization of forces and more practice in solving/visualizing statics problems compared to the traditional pencil-and-paper method. ForceEffect analyzes static rigid-body systems using free-body diagrams (FBDs) and provides solutions in real time. It is cost-free software available for download from the Internet, supported on the iOS™, Android™, and Google Chrome™ platforms. It is easy to use, and the learning curve is approximately two hours using the tutorial provided within the app. ForceEffect can provide students with different problem modalities (textbook, real-world, and design) to help them acquire and improve the skills needed to solve force equilibrium problems. Although this paper focuses on the engineering mechanics statics course, the technology discussed is also relevant to the introductory physics course.
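As a worked example of the force-equilibrium problems such a tool visualizes (our illustration, not ForceEffect output): the support reactions of a simply supported beam follow from the two planar equilibrium equations, sum of vertical forces and sum of moments about one support.

```python
# Reactions of a simply supported beam under a point load, solved from
# sum F_y = 0 and sum M_A = 0.
import numpy as np

L, P, a = 4.0, 1000.0, 1.0      # beam length (m), load (N), load position (m)
# Unknowns x = [R_A, R_B]:
#   R_A + R_B = P               (vertical force balance)
#   L * R_B   = P * a           (moment balance about support A)
A = np.array([[1.0, 1.0],
              [0.0, L]])
b = np.array([P, P * a])
R_A, R_B = np.linalg.solve(A, b)
print(f"R_A = {R_A:.0f} N, R_B = {R_B:.0f} N")  # -> 750 N and 250 N
```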
Gurdita, Akshay; Vovko, Heather; Ungrin, Mark
2016-01-01
Basic equipment such as incubation and refrigeration systems plays a critical role in nearly all aspects of the traditional biological research laboratory. Their proper functioning is therefore essential to ensure reliable and repeatable experimental results. Despite this fact, in many academic laboratories little attention is paid to validating and monitoring their function, primarily due to the cost and/or technical complexity of available commercial solutions. We have therefore developed a simple and low-cost monitoring system that combines a "Raspberry Pi" single-board computer with USB-connected sensor interfaces to track and log parameters such as temperature and pressure, and send email alert messages as appropriate. The system is controlled by open-source software, and we have also generated scripts to automate software setup so that no background in programming is required to install and use it. We have applied it to investigate the behaviour of our own equipment, and present here the results along with the details of the monitoring system used to obtain them.
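A minimal sketch of the logging-and-alerting loop such a system implements. The sensor-reading function, thresholds, and mail settings below are placeholders; the published system uses USB sensor interfaces and its own setup scripts, so this is an illustration of the pattern, not that code:

```python
# Log a temperature reading once a minute and email an alert when it leaves
# the configured range. Assumes a local SMTP relay; addresses are invented.
import csv, smtplib, time
from email.message import EmailMessage

def read_temperature():          # placeholder for the USB sensor interface
    return 37.1

def send_alert(value, to="lab-manager@example.org"):
    msg = EmailMessage()
    msg["Subject"] = f"Incubator alert: {value:.1f} C"
    msg["From"], msg["To"] = "monitor@example.org", to
    msg.set_content("Temperature outside the configured range.")
    with smtplib.SMTP("localhost") as server:
        server.send_message(msg)

LOW, HIGH = 36.5, 37.5
with open("incubator_log.csv", "a", newline="") as f:
    log = csv.writer(f)
    while True:
        t = read_temperature()
        log.writerow([time.time(), t])
        f.flush()
        if not LOW <= t <= HIGH:
            send_alert(t)
        time.sleep(60)           # one sample per minute
```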
Remote measurement methods for 3-D modeling purposes using BAE Systems' Software
NASA Astrophysics Data System (ADS)
Walker, Stewart; Pietrzak, Arleta
2015-06-01
Efficient, accurate data collection from imagery is the key to an economical generation of useful geospatial products. Incremental developments of traditional geospatial data collection and the arrival of new image data sources cause new software packages to be created and existing ones to be adjusted to enable such data to be processed. In the past, BAE Systems' digital photogrammetric workstation, SOCET SET®, met fin de siècle expectations in data processing and feature extraction. Its successor, SOCET GXP®, addresses today's photogrammetric requirements and new data sources. SOCET GXP is an advanced workstation for mapping and photogrammetric tasks, with automated functionality for triangulation, Digital Elevation Model (DEM) extraction, orthorectification and mosaicking, feature extraction and creation of 3-D models with texturing. BAE Systems continues to add sensor models to accommodate new image sources, in response to customer demand. New capabilities added in the latest version of SOCET GXP facilitate modeling, visualization and analysis of 3-D features.
MorphoGraphX: A platform for quantifying morphogenesis in 4D.
Barbier de Reuille, Pierre; Routier-Kierzkowska, Anne-Lise; Kierzkowski, Daniel; Bassel, George W; Schüpbach, Thierry; Tauriello, Gerardo; Bajpai, Namrata; Strauss, Sören; Weber, Alain; Kiss, Annamaria; Burian, Agata; Hofhuis, Hugo; Sapala, Aleksandra; Lipowczan, Marcin; Heimlicher, Maria B; Robinson, Sarah; Bayer, Emmanuelle M; Basler, Konrad; Koumoutsakos, Petros; Roeder, Adrienne H K; Aegerter-Wilmsen, Tinri; Nakayama, Naomi; Tsiantis, Miltos; Hay, Angela; Kwiatkowska, Dorota; Xenarios, Ioannis; Kuhlemeier, Cris; Smith, Richard S
2015-05-06
Morphogenesis emerges from complex multiscale interactions between genetic and mechanical processes. To understand these processes, the evolution of cell shape, proliferation and gene expression must be quantified. This quantification is usually performed either in full 3D, which is computationally expensive and technically challenging, or on 2D planar projections, which introduces geometrical artifacts on highly curved organs. Here we present MorphoGraphX (www.MorphoGraphX.org), software that bridges this gap by working directly with curved surface images extracted from 3D data. In addition to traditional 3D image analysis, we have developed algorithms to operate on curved surfaces, such as cell segmentation, lineage tracking and fluorescence signal quantification. The software's modular design makes it easy to include existing libraries, or to implement new algorithms. Cell geometries extracted with MorphoGraphX can be exported and used as templates for simulation models, providing a powerful platform to investigate the interactions between shape, genes and growth.
A Plug and Play GNC Architecture Using FPGA Components
NASA Technical Reports Server (NTRS)
KrishnaKumar, K.; Kaneshige, J.; Waterman, R.; Pires, C.; Ippoloito, C.
2005-01-01
The goal of Plug and Play, or PnP, is to allow hardware and software components to work together automatically, without requiring manual setup procedures. As a result, new or replacement hardware can be plugged into a system and automatically configured with the appropriate resource assignments. However, in many cases it may not be practical or even feasible to physically replace hardware components. One method for handling these types of situations is through the incorporation of reconfigurable hardware such as Field Programmable Gate Arrays, or FPGAs. This paper describes a phased approach to developing a Guidance, Navigation, and Control (GNC) architecture that expands on the traditional concepts of PnP, in order to accommodate hardware reconfiguration without requiring detailed knowledge of the hardware. This is achieved by establishing a functional based interface that defines how the hardware will operate, and allow the hardware to reconfigure itself. The resulting system combines the flexibility of manipulating software components with the speed and efficiency of hardware.
Diagnostic accuracy of tablet-based software for the detection of concussion.
Yang, Suosuo; Flores, Benjamin; Magal, Rotem; Harris, Kyrsti; Gross, Jonathan; Ewbank, Amy; Davenport, Sasha; Ormachea, Pablo; Nasser, Waleed; Le, Weidong; Peacock, W Frank; Katz, Yael; Eagleman, David M
2017-01-01
Despite the high prevalence of traumatic brain injuries (TBI), there are few rapid and straightforward tests to improve their assessment. To this end, we developed a tablet-based software battery ("BrainCheck") for concussion detection that is well suited to sports, emergency department, and clinical settings. This article is a study of the diagnostic accuracy of BrainCheck. We administered BrainCheck to 30 TBI patients and 30 pain-matched controls at a hospital Emergency Department (ED), and 538 healthy individuals at 10 control test sites. We compared the results of the tablet-based assessment against physician diagnoses derived from brain scans, clinical examination, and the SCAT3 test, a traditional measure of TBI. We found consistent distributions of normative data and high test-retest reliability. Based on these assessments, we defined a composite score that distinguishes TBI from non-TBI individuals with high sensitivity (83%) and specificity (87%). We conclude that our testing application provides a rapid, portable testing method for TBI.
INRstar: computerised decision support software for anticoagulation management in primary care.
Jones, Robert Treharne; Sullivan, Mark; Barrett, David
2005-01-01
Computerised decision support software (CDSS) for anticoagulation management has become established practice in the UK, offering significant advantages for patients and clinicians over traditional methods of dose calculation. The New GMS Contract has been partly responsible for this shift of management from secondary to primary care, in which INRstar has been the market leader for many years. In September 2004, INRstar received the John Perry Prize, awarded by the PHCSG for excellence and innovation in medical applications of information technology.
Epistemology, software engineering and formal methods
NASA Technical Reports Server (NTRS)
Holloway, C. Michael
1994-01-01
One of the most basic questions anyone can ask is, 'How do I know that what I think I know is true?' The study of this question is called epistemology. Traditionally, epistemology has been considered to be of legitimate interest only to philosophers, theologians, and three year old children who respond to every statement by asking, 'Why?' Software engineers need to be interested in the subject, however, because a lack of sufficient understanding of epistemology contributes to many of the current problems in the field.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoshii, Kazutomo; Llopis, Pablo; Zhang, Kaicheng
As CMOS scaling nears its end, parameter variations (process, temperature and voltage) are becoming a major concern. To overcome parameter variations and provide stability, modern processors are becoming dynamic, opportunistically adjusting voltage and frequency based on thermal and energy constraints, which negatively impacts traditional bulk-synchronous parallelism-minded hardware and software designs. As node-level architecture grows in complexity, implementing variation control mechanisms in hardware alone can be a challenging task. In this paper we investigate a software strategy to manage hardware-induced variations, leveraging low-level monitoring/controlling mechanisms.
Outcome modelling strategies in epidemiology: traditional methods and basic alternatives
Greenland, Sander; Daniel, Rhian; Pearce, Neil
2016-01-01
Controlling for too many potential confounders can lead to or aggravate problems of data sparsity or multicollinearity, particularly when the number of covariates is large in relation to the study size. As a result, methods to reduce the number of modelled covariates are often deployed. We review several traditional modelling strategies, including stepwise regression and the ‘change-in-estimate’ (CIE) approach to deciding which potential confounders to include in an outcome-regression model for estimating effects of a targeted exposure. We discuss their shortcomings, and then provide some basic alternatives and refinements that do not require special macros or programming. Throughout, we assume the main goal is to derive the most accurate effect estimates obtainable from the data and commercial software. Allowing that most users must stay within standard software packages, this goal can be roughly approximated using basic methods to assess, and thereby minimize, mean squared error (MSE). PMID:27097747
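To make the CIE strategy under discussion concrete, here is a minimal backward-deletion sketch using statsmodels on synthetic data. This illustrates the general procedure (drop a covariate only if removing it changes the exposure coefficient by less than a tolerance, commonly 10%), not the authors' code, and the paper's point is precisely that this traditional strategy has shortcomings:

```python
# Change-in-estimate confounder selection for a linear outcome model.
import numpy as np
import statsmodels.api as sm

def cie_select(y, exposure, covars, tol=0.10):
    """Backward deletion of covariates by change-in-estimate."""
    keep = list(covars.keys())
    def exposure_coef(names):
        X = sm.add_constant(np.column_stack([exposure] + [covars[n] for n in names]))
        return sm.OLS(y, X).fit().params[1]   # coefficient on the exposure
    current = exposure_coef(keep)
    for name in list(keep):
        reduced = exposure_coef([n for n in keep if n != name])
        if abs((reduced - current) / current) < tol:   # change under tolerance
            keep.remove(name)
            current = reduced
    return keep

rng = np.random.default_rng(4)
z1, z2 = rng.normal(size=500), rng.normal(size=500)
x = z1 + rng.normal(size=500)               # exposure confounded by z1
y = 0.5 * x + z1 + rng.normal(size=500)     # z2 is irrelevant
print(cie_select(y, x, {"z1": z1, "z2": z2}))  # expect ['z1'] retained
```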
Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk
2014-10-20
Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the Clinical Data Miner (CDM) software framework is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving the efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extendability. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries. This integration improves the efficiency of the clinical diagnostic model research workflow. Moreover, by simplifying the generation of learning curves, CDM enables study coordinators to assess more accurately when data collection can be terminated, resulting in better models or lower patient recruitment costs.
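The learning-curve monitoring idea described above can be sketched briefly: recompute validation performance as sample size grows and watch for the curve to flatten. Synthetic data and scikit-learn stand in for CDM's integrated libraries here:

```python
# Learning curve as a study-monitoring tool: when validation performance
# flattens, further patient inclusion adds little to the model.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import learning_curve

X, y = make_classification(n_samples=600, n_features=20, random_state=0)
sizes, train_scores, valid_scores = learning_curve(
    LogisticRegression(max_iter=1000), X, y,
    train_sizes=np.linspace(0.1, 1.0, 5), cv=5)
for n, v in zip(sizes, valid_scores.mean(axis=1)):
    print(f"n={n:4d}  validation accuracy={v:.3f}")
# A flattening curve suggests data collection can be terminated.
```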
Gamification as a tool for enhancing graduate medical education
Nevin, Christa R; Westfall, Andrew O; Rodriguez, J Martin; Dempsey, Donald M; Cherrington, Andrea; Roy, Brita; Patel, Mukesh; Willig, James H
2014-01-01
Introduction The last decade has seen many changes in graduate medical education training in the USA, most notably the implementation of duty hour standards for residents by the Accreditation Council of Graduate Medical Education. As educators are left to balance more limited time available between patient care and resident education, new methods to augment traditional graduate medical education are needed. Objectives To assess acceptance and use of a novel gamification-based medical knowledge software among internal medicine residents and to determine retention of information presented to participants by this medical knowledge software. Methods We designed and developed software using principles of gamification to deliver a web-based medical knowledge competition among internal medicine residents at the University of Alabama (UA) at Birmingham and UA at Huntsville in 2012–2013. Residents participated individually and in teams. Participants accessed daily questions and tracked their online leaderboard competition scores through any internet-enabled device. We completed focus groups to assess participant acceptance and analysed software use, retention of knowledge and factors associated with loss of participants (attrition). Results Acceptance: In focus groups, residents (n=17) reported leaderboards were the most important motivator of participation. Use: 16 427 questions were completed: 28.8% on Saturdays/Sundays, 53.1% between 17:00 and 08:00. Retention of knowledge: 1046 paired responses (for repeated questions) were collected. Correct responses increased by 11.9% (p<0.0001) on retest. Differences per time since question introduction, trainee level and style of play were observed. Attrition: In ordinal regression analyses, completing more questions (0.80 per 10% increase; 0.70 to 0.93) decreased, while postgraduate year 3 class (4.25; 1.44 to 12.55) and non-daily play (4.51; 1.50 to 13.58) increased odds of attrition. Conclusions Our software-enabled, gamification-based educational intervention was well accepted among our millennial learners. Coupling software with gamification and analysis of trainee use and engagement data can be used to develop strategies to augment learning in time-constrained educational settings. PMID:25352673
A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit
NASA Technical Reports Server (NTRS)
Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.
2016-01-01
Shoulder injury is one of the most severe risks that have the potential to impair crewmembers' performance and health in long duration space flight. Overall, 64% of crewmembers experience shoulder pain after extra-vehicular training in a space suit, and 14% of symptomatic crewmembers require surgical repair (Williams & Johnson, 2003). Suboptimal suit fit, in particular at the shoulder region, has been identified as one of the predominant risk factors. However, traditional suit fit assessments and laser scans represent only a single person's data, and thus may not be generalized across wide variations of body shapes and poses. The aim of this work is to develop a software tool based on a statistical analysis of a large dataset of crewmember body shapes. This tool can accurately predict the skin deformation and shape variations for any body size and shoulder pose for a target population, from which the geometry can be exported and evaluated against suit models in commercial CAD software. A preliminary software tool was developed by statistically analyzing 150 body shapes matched with body dimension ranges specified in the Human-Systems Integration Requirements of NASA ("baseline model"). Further, the baseline model was incorporated with shoulder joint articulation ("articulation model"), using additional subjects scanned in a variety of shoulder poses across a pre-specified range of motion. Scan data was cleaned and aligned using body landmarks. The skin deformation patterns were dimensionally reduced and the co-variation with shoulder angles was analyzed. A software tool is currently in development and will be presented in the final proceeding. This tool would allow suit engineers to parametrically generate body shapes in strategically targeted anthropometry dimensions and shoulder poses. This would also enable virtual fit assessments, with which the contact volume and clearance between the suit and body surface can be predictively quantified at reduced time and cost.
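The statistical core described above, reducing aligned scan geometry to a low-dimensional shape space and modeling its co-variation with shoulder angle, can be sketched as follows. This is a hedged illustration with random placeholder data standing in for the 150-subject scan set, not the authors' tool:

```python
# PCA shape space plus a regression on pose angle, so that a plausible body
# geometry can be synthesized for any requested shoulder pose.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
scans = rng.normal(size=(150, 3000))         # 150 subjects x flattened 3-D vertices
angles = rng.uniform(0, 120, size=(150, 1))  # shoulder elevation, degrees

pca = PCA(n_components=10).fit(scans)
coords = pca.transform(scans)                   # low-dimensional shape space
model = LinearRegression().fit(angles, coords)  # co-variation with pose

new_shape = pca.inverse_transform(model.predict([[90.0]]))  # 90-degree pose
print(new_shape.shape)    # (1, 3000): geometry exportable for CAD fit checks
```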
Parsons, Thomas D; McMahan, Timothy; Kane, Robert
2018-01-01
Clinical neuropsychologists have long underutilized computer technologies for neuropsychological assessment. Given the rapid advances in technology (e.g. virtual reality; tablets; iPhones) and the increased accessibility in the past decade, there is an on-going need to identify optimal specifications for advanced technologies while minimizing potential sources of error. Herein, we discuss concerns raised by a joint American Academy of Clinical Neuropsychology/National Academy of Neuropsychology position paper. Moreover, we proffer parameters for the development and use of advanced technologies in neuropsychological assessments. We aim to first describe software and hardware configurations that can impact a computerized neuropsychological assessment. This is followed by a description of best practices for developers and practicing neuropsychologists to minimize error in neuropsychological assessments using advanced technologies. We also discuss the relevance of weighing potential computer error in light of possible errors associated with traditional testing. Throughout there is an emphasis on the need for developers to provide bench test results for their software's performance on various devices and minimum specifications (documented in manuals) for the hardware (e.g. computer, monitor, input devices) in the neuropsychologist's practice. Advances in computerized assessment platforms offer both opportunities and challenges. The challenges can appear daunting but are manageable and require informed consumers who can appreciate the issues and ask pertinent questions in evaluating their options.
A stochastic optimal feedforward and feedback control methodology for superagility
NASA Technical Reports Server (NTRS)
Halyo, Nesim; Direskeneli, Haldun; Taylor, Deborah B.
1992-01-01
A new control design methodology is developed: Stochastic Optimal Feedforward and Feedback Technology (SOFFT). Traditional design techniques optimize a single cost function (which expresses the design objectives) to obtain both the feedforward and feedback control laws. This approach places conflicting demands on the control law, such as fast tracking versus noise attenuation/disturbance rejection. In the SOFFT approach, two cost functions are defined. The feedforward control law is designed to optimize one cost function, the feedback the other. By separating the design objectives and decoupling the feedforward and feedback design processes, both objectives can be achieved fully. A new measure of command tracking performance, Z-plots, is also developed. By analyzing these plots at off-nominal conditions, the sensitivity or robustness of the system in tracking commands can be predicted. Z-plots provide an important tool for designing robust control systems. The Variable-Gain SOFFT methodology was used to design a flight control system for the F/A-18 aircraft. It is shown that SOFFT can be used to expand the operating regime and provide greater performance (flying/handling qualities) throughout the extended flight regime. This work was performed under the NASA SBIR program. ICS plans to market the software developed as a new module in its commercial CACSD software package: ACET.
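The separation principle described above, designing feedback against one objective and feedforward against another, can be illustrated on a toy plant. The sketch below is not the SOFFT algorithm itself: it uses a standard LQR feedback for regulation/disturbance rejection and a separately computed steady-state feedforward for exact step-command tracking, with invented plant matrices:

```python
# Decoupled design: LQR feedback for one cost, feedforward solved separately
# for steady-state command tracking.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # toy plant dynamics
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])

# Feedback: optimize the regulation/disturbance-rejection cost.
Q, R = np.diag([10.0, 1.0]), np.array([[1.0]])
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)            # u_fb = -K x

# Feedforward: steady state (Nx*r, Nu*r) that tracks a step command r.
M = np.block([[A, B], [C, np.zeros((1, 1))]])
sol = np.linalg.solve(M, np.concatenate([np.zeros(2), [1.0]]))
Nx, Nu = sol[:2], sol[2]                   # u = Nu*r + K @ (Nx*r - x)
print("K =", K, " Nu =", Nu)
```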
Defining the questions: a research agenda for nontraditional authentication in arms control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hauck, Danielle K; Mac Arthur, Duncan W; Smith, Morag K
Many traditional authentication techniques have been based on hardware solutions. Thus authentication of measurement system hardware has been considered in terms of physical inspection and destructive analysis. Software authentication has implied hash function analysis or authentication tools such as Rose. Continuity of knowledge is maintained through TIDs and cameras. Although there is ongoing progress improving all of these authentication methods, there has been little discussion of the human factors involved in authentication. Issues of non-traditional authentication include sleight-of-hand substitutions, monitor perception vs. reality, and visual diversions. Since a monitor's confidence in a measurement system depends on the product of their confidences in each authentication element, it is important to investigate all authentication techniques, including the human factors. This paper will present an initial effort to identify the most important problems that traditional authentication approaches in safeguards have not addressed and that are especially relevant to arms control verification. This will include a survey of the literature and direct engagement with nontraditional experts in areas like psychology and human factors. Based on the identification of problem areas, potential research areas will be identified and a possible research agenda will be developed.
Foody, Mairéad; Samara, Muthanna; O'Higgins Norman, James
2017-12-01
Bullying research has gained a substantial amount of interest in recent years because of the implications for child and adolescent development. We conducted a meta-analysis of traditional and cyberbullying studies in the Republic and North of Ireland to gain an understanding of prevalence rates and associated issues (particularly psychological correlates and intervention strategies) among young people (primary and secondary school students). Four electronic databases were searched (PsychArticles, ERIC, PsychInfo and Education Research Complete) for studies of traditional bullying and cyberbullying behaviours (perpetrators, victims or both) published between January 1997 and April 2016. A final sample of 39 articles fit our selection criteria. CMA software was used to estimate a pooled prevalence rate for traditional/cyberbullying victimization and perpetration. A systematic review on the psychological impacts for all types of bullying and previously used interventions in an Irish setting is also provided. The results demonstrate the influence moderating factors (e.g., assessment tools, answer scale, time frame) have on reported prevalence rates. These results are discussed in light of current studies, and points for future research are considered. © 2017 The British Psychological Society.
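The pooled-prevalence computation the authors ran in CMA can be sketched with a standard DerSimonian-Laird random-effects pooling of logit-transformed prevalences. The study counts below are invented placeholders, not the review's data, and CMA's exact implementation details may differ:

```python
# DerSimonian-Laird random-effects pooling of prevalence estimates.
import numpy as np

events = np.array([120, 45, 300, 80])      # victims reported per study
totals = np.array([1000, 400, 2500, 900])  # students surveyed per study

p = events / totals
logit = np.log(p / (1 - p))
var = 1 / events + 1 / (totals - events)   # variance of the logit

w = 1 / var                                 # fixed-effect weights
mean_fe = np.sum(w * logit) / w.sum()
q = np.sum(w * (logit - mean_fe) ** 2)      # heterogeneity statistic Q
tau2 = max(0.0, (q - (len(p) - 1)) / (w.sum() - np.sum(w**2) / w.sum()))

w_re = 1 / (var + tau2)                     # random-effects weights
pooled_logit = np.sum(w_re * logit) / w_re.sum()
pooled = 1 / (1 + np.exp(-pooled_logit))    # back-transform to a prevalence
print(f"pooled prevalence = {pooled:.1%}")
```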
Building and maintaining a library Gopher: traditional skills applied to emerging resources.
Riley, R A; Shipman, B L
1995-04-01
Gopher software has emerged rapidly as a powerful tool for providing library users with organized access to Internet resources. Building and maintaining Gophers is one way in which librarians' traditional knowledge and skills are being applied in a nontraditional area. In March 1992, the University of Michigan's ULibrary Gopher was created, mainly as a means of providing access to U.S. Census data and the U.S. Department of Commerce's Economic Bulletin Board. In an effort to broaden the scope of the Gopher, librarians were asked to submit ideas for new resources to access. The result was the ULibrary Gopher Working Group, a team of eighteen librarians from six libraries on campus. The Alfred Taubman Medical Library staff drafted a menu design for life sciences resources and received basic UNIX training. In March 1993, the life sciences section of ULibrary became fully operational and now is maintained by the Taubman staff. This paper describes the history of the ULibrary Gopher and the working group, the creation and ongoing maintenance of the life sciences area of the Gopher, staffing issues, and the relationship of Gopher building to traditional collection development.
Subaperture metrology technologies extend capabilities in optics manufacturing
NASA Astrophysics Data System (ADS)
Tricard, Marc; Forbes, Greg; Murphy, Paul
2005-10-01
Subaperture polishing technologies have radically changed the landscape of precision optics manufacturing and enabled the production of higher precision optics with increasingly difficult figure requirements. However, metrology is a critical piece of the optics fabrication process, and the dependence on interferometry is especially acute for computer-controlled, deterministic finishing. Without accurate full-aperture metrology, figure correction using subaperture polishing technologies would not be possible. QED Technologies has developed the Subaperture Stitching Interferometer (SSI) that extends the effective aperture and dynamic range of a phase measuring interferometer. The SSI's novel developments in software and hardware improve the capacity and accuracy of traditional interferometers, overcoming many of the limitations previously faced. The SSI performs high-accuracy automated measurements of spheres, flats, and mild aspheres up to 200 mm in diameter by stitching subaperture data. The system combines a six-axis precision workstation, a commercial Fizeau interferometer of 4" or 6" aperture, and dedicated software. QED's software automates the measurement design, data acquisition, and mathematical reconstruction of the full-aperture phase map. The stitching algorithm incorporates a general framework for compensating several types of errors introduced by the interferometer and stage mechanics. These include positioning errors, viewing system distortion, the system reference wave error, etc. The SSI has been proven to deliver the accurate and flexible metrology that is vital to precision optics fabrication. This paper will briefly review the capabilities of the SSI as a production-ready metrology system that enables cost-effective manufacturing of precision optical surfaces.
Taveira-Gomes, Tiago; Ferreira, Patrícia; Taveira-Gomes, Isabel; Severo, Milton; Ferreira, Maria Amélia
2016-08-01
Computer-based learning (CBL) has been widely used in medical education, and reports regarding its usage and effectiveness have ranged broadly. Most work has been done on the effectiveness of CBL approaches versus traditional methods, and little has been done on the comparative effects of CBL versus CBL methodologies. These findings urged other authors to recommend such studies in hopes of improving knowledge about which CBL methods work best in which settings. In this systematic review, we aimed to characterize recent studies of the development of software platforms and interventions in medical education, search for common points among studies, and assess whether recommendations for CBL research are being taken into consideration. We conducted a systematic review of the literature published from 2003 through 2013. We included studies written in English, specifically in medical education, regarding either the development of instructional software or interventions using instructional software, during training or practice, that reported learner attitudes, satisfaction, knowledge, skills, or software usage. We conducted 2 latent class analyses to group articles according to platform features and intervention characteristics. In addition, we analyzed references and citations for abstracted articles. We analyzed 251 articles. The number of publications rose over time, and they encompassed most medical disciplines, learning settings, and training levels, totaling 25 different platforms specifically for medical education. We uncovered 4 latent classes for educational software, characteristically making use of multimedia (115/251, 45.8%), text (64/251, 25.5%), Web conferencing (54/251, 21.5%), and instructional design principles (18/251, 7.2%). We found 3 classes for intervention outcomes: knowledge and attitudes (175/212, 82.6%), knowledge, attitudes, and skills (11.8%), and online activity (12/212, 5.7%). About a quarter of the articles (58/227, 25.6%) did not hold references or citations in common with other articles. The number of common references and citations increased in articles reporting instructional design principles (P=.03), articles measuring online activities (P=.01), and articles citing a review by Cook and colleagues on CBL (P=.04). There was an association between number of citations and studies comparing CBL versus CBL, independent of publication date (P=.02). Studies in this field vary highly, and a high number of software systems are being developed. It seems that past recommendations regarding CBL interventions are being taken into consideration. A move into a more student-centered model, a focus on implementing reusable software platforms for specific learning contexts, and the analysis of online activity to track and predict outcomes are relevant areas for future research in this field.
NASA Technical Reports Server (NTRS)
Oluwole, Oluwayemisi O.; Wong, Hsi-Wu; Green, William
2012-01-01
AdapChem software enables high efficiency, low computational cost, and enhanced accuracy in computational fluid dynamics (CFD) numerical simulations used for combustion studies. The software dynamically allocates smaller, reduced chemical models instead of the larger, full-chemistry model to evolve the calculation, while ensuring that the same accuracy is obtained for steady-state CFD reacting-flow simulations. The software enables detailed chemical kinetic modeling in combustion CFD simulations. AdapChem adapts the reaction mechanism used in the CFD to the local reaction conditions. Instead of a single, comprehensive reaction mechanism throughout the computation, a dynamic distribution of smaller, reduced models is used to capture the chemical kinetics accurately at a fraction of the cost of the traditional single-mechanism approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andrews, Ross N.; Narayanan, Suresh; Zhang, Fan
2018-02-01
X-ray photon correlation spectroscopy (XPCS) and dynamic light scattering (DLS) reveal materials dynamics using coherent scattering, with XPCS permitting the investigation of dynamics in a more diverse array of materials than DLS. Heterogeneous dynamics occur in many material systems. The authors' recent work has shown how classic tools employed in the DLS analysis of heterogeneous dynamics can be extended to XPCS, revealing additional information that conventional Kohlrausch exponential fitting obscures. The present work describes the software implementation of inverse transform analysis of XPCS data. This software, called CONTIN XPCS, is an extension of traditional CONTIN analysis and accommodates the various dynamics encountered in equilibrium XPCS measurements.
Software resilience and the effectiveness of software mitigation in microcontrollers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinn, Heather; Baker, Zachary; Fairbanks, Tom
2015-12-01
Commercially available microprocessors could be useful to the space community for noncritical computations. There are many possible components that are smaller, lower-power, and less expensive than traditional radiation-hardened microprocessors. Many commercial microprocessors have issues with single-event effects (SEEs), such as single-event upsets (SEUs) and single-event transients (SETs), that can cause the microprocessor to calculate an incorrect result or crash. In this paper we present the Trikaya technique for masking SEUs and SETs through software mitigation techniques. Furthermore, test results show that this technique can be very effective at masking errors, making it possible to fly these microprocessors for a variety of missions.
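The abstract does not spell out how Trikaya masks SEUs and SETs. One common software-mitigation pattern is triple modular redundancy (TMR), in which a computation is executed three times and the results are majority-voted; the sketch below illustrates that general pattern only and is not claimed to be the Trikaya mechanism. The function names are hypothetical.

```python
# Illustrative software TMR, a common SEU/SET masking pattern (not
# necessarily the Trikaya implementation).
from collections import Counter

def tmr(compute, *args):
    """Run a computation three times and majority-vote the results,
    masking a transient fault in any single execution."""
    results = [compute(*args) for _ in range(3)]
    value, votes = Counter(results).most_common(1)[0]
    if votes < 2:
        # More than one execution disagreed: the fault cannot be masked.
        raise RuntimeError("No majority among redundant executions")
    return value

# Usage: a checksum computed under TMR survives one corrupted execution.
payload = bytes(range(32))
checksum = tmr(lambda b: sum(b) & 0xFFFF, payload)
```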
Real-Time Extended Interface Automata for Software Testing Cases Generation
Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin
2014-01-01
Testing and verification of the interfaces between software components are particularly important because of the large number of complex interactions, which requires traditional modeling languages to overcome their shortcomings in describing temporal information and controlling software testing inputs. This paper presents real-time extended interface automata (RTEIA), which add clearer and more detailed descriptions of temporal information through the use of time words. We also establish an input interface automaton for every input in order to handle input control and interface coverage flexibly in the software testing field. Detailed definitions of the RTEIA and of the test-case generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of a real aircraft braking system. PMID:24892080
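To make the "time words" idea concrete, here is a minimal, hypothetical sketch (not the paper's formalism) of an interface automaton whose transitions are enabled only within a time window measured from entry into the current state:

```python
# Hypothetical timed interface automaton sketch; the structures are
# illustrative, not the RTEIA definitions from the paper.
import time

class TimedInterfaceAutomaton:
    def __init__(self, transitions, initial):
        # transitions: {(state, action): (next_state, t_min, t_max)}
        # where [t_min, t_max] bounds, in seconds, the time since the
        # current state was entered -- a crude stand-in for a "time word".
        self.transitions = transitions
        self.state = initial
        self.entered = time.monotonic()

    def fire(self, action):
        key = (self.state, action)
        if key not in self.transitions:
            raise ValueError(f"Action {action!r} not enabled in {self.state!r}")
        next_state, t_min, t_max = self.transitions[key]
        elapsed = time.monotonic() - self.entered
        if not (t_min <= elapsed <= t_max):
            raise TimeoutError(f"{action!r} fired outside [{t_min}, {t_max}] s")
        self.state, self.entered = next_state, time.monotonic()

# Usage: braking must be commanded between 0.1 s and 2.0 s after touchdown.
auto = TimedInterfaceAutomaton(
    {("touchdown", "brake"): ("braking", 0.1, 2.0)}, initial="touchdown")
```

A test-case generator in this style would enumerate action sequences together with firing times inside and outside each window, covering both accepting and rejecting behaviors of the interface.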
Accelerating artificial intelligence with reconfigurable computing
NASA Astrophysics Data System (ADS)
Cieszewski, Radoslaw
Reconfigurable computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated by placing their computationally intense portions into reconfigurable hardware. Reconfigurable computing combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be changed over the lifetime of the system. Similar to an ASIC, reconfigurable systems provide a method to map circuits into hardware. Reconfigurable systems therefore have the potential to achieve far greater performance than software by bypassing the fetch-decode-execute cycle of traditional processors and possibly exploiting a greater level of parallelism. Artificial intelligence is one such field, with many different algorithms that can be accelerated. This paper presents example hardware implementations of Artificial Neural Networks, Genetic Algorithms and Expert Systems.
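As one concrete instance of the AI kernels discussed, a genetic algorithm's fitness evaluations and genetic operators are the independent, repetitive hot spots that reconfigurable logic can absorb. The following minimal software GA (a toy "one-max" example, not taken from the paper) marks out those hot spots:

```python
# Toy genetic algorithm; the fitness evaluations and per-individual genetic
# operators below are the parallel hot spots typically mapped to hardware.
import random

def evolve(fitness, n_bits=16, pop_size=32, generations=50, p_mut=0.02):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)   # fitness: hot spot
        parents = scored[: pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_bits)             # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Example: maximize the number of ones in the bit string.
best = evolve(fitness=sum)
print(best, sum(best))
```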
NASA Astrophysics Data System (ADS)
Babaali, Parisa; Gonzalez, Lidia
2015-07-01
Supporting student success in entry-level mathematics courses at the undergraduate level has been and continues to be a challenge. Recently we have seen an increased reliance on technological supports, including software, to supplement more traditional in-class instruction. In this paper, we explore the effects on student performance of the use of a computer software program to supplement instruction in an entry-level undergraduate mathematics course, specifically a pre-calculus course. Relying on data from multiple sections of the course over various semesters, we compare student performance in the classes that utilized the software against those in which it was not used. Quantitative analysis of the data then leads us to conclusions about the effectiveness of the software as well as recommendations for future iterations of the course and others like it.
Optics simulations: a Python workshop
NASA Astrophysics Data System (ADS)
Ghalila, H.; Ammar, A.; Varadharajan, S.; Majdi, Y.; Zghal, M.; Lahmar, S.; Lakshminarayanan, V.
2017-08-01
Numerical simulations allow teachers and students to perform, indirectly, sophisticated experiments that could not otherwise be realized because of cost and other constraints. During the past few decades there has been an explosion in the development of numerical tools, concurrent with the rise of open-source environments such as the Python software ecosystem. This availability of open-source software offers an incredible opportunity for advancing teaching methodologies as well as research. More specifically, it is possible to correlate theoretical knowledge with experimental measurements using "virtual" experiments. We have been working on the development of numerical simulation tools using the Python program package, concentrating on geometric and physical optics simulations. The advantage of hands-on numerical experiments is that they allow the student learner to be an active participant in the pedagogical/learning process rather than playing a passive role, as in the traditional lecture format. Even in laboratory classes, many students play a passive role because constraints of space, lack of equipment, and often-large enrollments force them to work in groups of 3 or more. Furthermore, these new tools help students get a handle on numerical methods as well as simulations and impart a "feel" for the physics under investigation.
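As an illustration of the kind of "virtual" geometric-optics experiment the authors describe, a few lines of Python with NumPy (an assumed dependency; the workshop's own package is not named in the abstract) can trace a paraxial ray through a thin lens using ABCD transfer matrices:

```python
# Paraxial ray tracing with 2x2 ABCD matrices; a generic teaching sketch,
# not code from the workshop itself.
import numpy as np

def free_space(d):
    return np.array([[1.0, d], [0.0, 1.0]])         # propagate a distance d

def thin_lens(f):
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])  # thin lens, focal length f

# Ray state: (height y, angle theta). An object 30 cm before a 10 cm lens
# images 15 cm behind it (1/30 + 1/15 = 1/10).
ray = np.array([1.0, -1.0 / 30.0])                  # from object tip toward the lens axis
system = free_space(15.0) @ thin_lens(10.0) @ free_space(30.0)
print(system @ ray)                                 # height ~ -0.5: inverted, half-size image
```

The printed ray height of about -0.5 reproduces the thin-lens prediction (magnification -v/u = -0.5), exactly the kind of theory-to-simulation correlation the workshop targets.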
Software for Automation of Real-Time Agents, Version 2
NASA Technical Reports Server (NTRS)
Fisher, Forest; Estlin, Tara; Gaines, Daniel; Schaffer, Steve; Chouinard, Caroline; Engelhardt, Barbara; Wilklow, Colette; Mutz, Darren; Knight, Russell; Rabideau, Gregg;
2005-01-01
Version 2 of Closed Loop Execution and Recovery (CLEaR) has been developed. CLEaR is an artificial-intelligence computer program for use in planning and executing the actions of autonomous agents, including, for example, Deep Space Network (DSN) antenna ground stations, robotic exploratory ground vehicles (rovers), robotic aircraft (UAVs), and robotic spacecraft. CLEaR automates the generation and execution of command sequences, monitors sequence execution, and modifies the command sequence in response to execution deviations and failures as well as to new goals for the agent to achieve. The development of CLEaR has focused on unifying planning and execution to increase the ability of an autonomous agent to perform under tight resource and time constraints, coupled with uncertainty about how many resources and how much time a task will require. This unification is realized by extending the traditional three-tier robotic control architecture to increase the interaction between the software components that perform the deliberative and reactive functions. The increased interaction reduces the need to replan, enables earlier detection of the need to replan, and enables replanning to occur before the agent enters a failure state.
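The unified planning/execution idea described above can be sketched as a single sense-act loop in which the monitor's deviation check, rather than a hard failure, triggers replanning. This is a hypothetical outline of the control flow, not the CLEaR implementation:

```python
# Hypothetical plan-execute-monitor loop illustrating deliberation/reaction
# interaction; component interfaces are illustrative, not the CLEaR API.

def run_agent(planner, executor, monitor, goals, state):
    plan = planner.plan(state, goals)
    while plan:
        step = plan.pop(0)
        outcome = executor.execute(step)           # reactive tier acts
        state = monitor.update(state, outcome)     # execution is monitored
        if monitor.deviates(state, plan):          # deviation detected early,
            plan = planner.plan(state, goals)      # so replan before failure
    return state
```

The key design point mirrored here is that the deliberative component is consulted inside the execution loop, so replanning can begin as soon as a deviation is detected instead of after the agent has already failed.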