Evolving software reengineering technology for the emerging innovative-competitive era
NASA Technical Reports Server (NTRS)
Hwang, Phillip Q.; Lock, Evan; Prywes, Noah
1994-01-01
This paper reports on a multi-tool commercial/military environment combining software Domain Analysis techniques with Reusable Software and Reengineering of Legacy Software. It is based on the development of a military version for the Department of Defense (DOD). The integrated tools in the military version are: Software Specification Assistant (SSA) and Software Reengineering Environment (SRE), developed by Computer Command and Control Company (CCCC) for Naval Surface Warfare Center (NSWC) and Joint Logistics Commanders (JLC), and the Advanced Research Project Agency (ARPA) STARS Software Engineering Environment (SEE) developed by Boeing for NAVAIR PMA 205. The paper describes transitioning these integrated tools to commercial use. There is a critical need for the transition for the following reasons: First, to date, 70 percent of programmers' time is applied to software maintenance. The work of these users has not been facilitated by existing tools. The addition of Software Reengineering will also facilitate software maintenance and upgrading. In fact, the integrated tools will support the entire software life cycle. Second, the integrated tools are essential to Business Process Reengineering, which seeks radical process innovations to achieve breakthrough results. Done well, process reengineering delivers extraordinary gains in process speed, productivity and profitability. Most importantly, it discovers new opportunities for products and services in collaboration with other organizations. Legacy computer software must be changed rapidly to support innovative business processes. The integrated tools will provide commercial organizations important competitive advantages. This, in turn, will increase employment by creating new business opportunities. Third, the integrated system will produce much higher quality software than use of the tools separately. The reason for this is that producing or upgrading software requires keen understanding of extremely complex applications which is facilitated by the integrated tools. The radical savings in the time and cost associated with software, due to use of CASE tools that support combined Reuse of Software and Reengineering of Legacy Code, will add an important impetus to improving the automation of enterprises. This will be reflected in continuing operations, as well as in innovating new business processes. The proposed multi-tool software development is based on state of the art technology, which will be further advanced through the use of open systems for adding new tools and experience in their use.
Reengineering legacy software to object-oriented systems
NASA Technical Reports Server (NTRS)
Pitman, C.; Braley, D.; Fridge, E.; Plumb, A.; Izygon, M.; Mears, B.
1994-01-01
NASA has a legacy of complex software systems that are becoming increasingly expensive to maintain. Reengineering is one approach to modemizing these systems. Object-oriented technology, other modem software engineering principles, and automated tools can be used to reengineer the systems and will help to keep maintenance costs of the modemized systems down. The Software Technology Branch at the NASA/Johnson Space Center has been developing and testing reengineering methods and tools for several years. The Software Technology Branch is currently providing training and consulting support to several large reengineering projects at JSC, including the Reusable Objects Software Environment (ROSE) project, which is reengineering the flight analysis and design system (over 2 million lines of FORTRAN code) into object-oriented C++. Many important lessons have been learned during the past years; one of these is that the design must never be allowed to diverge from the code during maintenance and enhancement. Future work on open, integrated environments to support reengineering is being actively planned.
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III
1991-01-01
Programs in use today generally have all of the function and information processing capabilities required to do their specified job. However, older programs usually use obsolete technology, are not integrated properly with other programs, and are difficult to maintain. Reengineering is becoming a prominent discipline as organizations try to move their systems to more modern and maintainable technologies. The Johnson Space Center (JSC) Software Technology Branch (STB) is researching and developing a system to support reengineering older FORTRAN programs into more maintainable forms that can also be more readily translated to a modern languages such as FORTRAN 8x, Ada, or C. This activity has led to the development of maintenance strategies for design recovery and reengineering. These strategies include a set of standards, methodologies, and the concepts for a software environment to support design recovery and reengineering. A brief description of the problem being addressed and the approach that is being taken by the STB toward providing an economic solution to the problem is provided. A statement of the maintenance problems, the benefits and drawbacks of three alternative solutions, and a brief history of the STB experience in software reengineering are followed by the STB new FORTRAN standards, methodology, and the concepts for a software environment.
Space and Missile Systems Center Standard: Software Development
2015-01-16
maintenance , or any other activity or combination of activities resulting in products . Within this standard, requirements to “develop,” “define...integration, reuse, reengineering, maintenance , or any other activity that results in products ). The term “developer” encompasses all software team...activities that results in software products . Software development includes new development, modification, reuse, reengineering, maintenance , and any other
Software Development Standard for Mission Critical Systems
2014-03-17
new development, modification, reuse, reengineering, maintenance , or any other activity or combination of activities resulting in products . Within...develops” includes new development, modification, integration, reuse, reengineering, maintenance , or any other activity that results in products ... Maintenance organization. The organization that is responsible for modifying and otherwise sustaining the software and other software products and
The software-cycle model for re-engineering and reuse
NASA Technical Reports Server (NTRS)
Bailey, John W.; Basili, Victor R.
1992-01-01
This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.
Development of a case tool to support decision based software development
NASA Technical Reports Server (NTRS)
Wild, Christian J.
1993-01-01
A summary of the accomplishments of the research over the past year are presented. Achievements include: made demonstrations with DHC, a prototype supporting decision based software development (DBSD) methodology, for Paramax personnel at ODU; met with Paramax personnel to discuss DBSD issues, the process of integrating DBSD and Refinery and the porting process model; completed and submitted a paper describing DBSD paradigm to IFIP '92; completed and presented a paper describing the approach for software reuse at the Software Reuse Workshop in April 1993; continued to extend DHC with a project agenda, facility necessary for a better project management; completed a primary draft of the re-engineering process model for porting; created a logging form to trace all the activities involved in the process of solving the reengineering problem, and developed a primary chart with the problems involved by the reengineering process.
Using a Foundational Ontology for Reengineering a Software Enterprise Ontology
NASA Astrophysics Data System (ADS)
Perini Barcellos, Monalessa; de Almeida Falbo, Ricardo
The knowledge about software organizations is considerably relevant to software engineers. The use of a common vocabulary for representing the useful knowledge about software organizations involved in software projects is important for several reasons, such as to support knowledge reuse and to allow communication and interoperability between tools. Domain ontologies can be used to define a common vocabulary for sharing and reuse of knowledge about some domain. Foundational ontologies can be used for evaluating and re-designing domain ontologies, giving to these real-world semantics. This paper presents an evaluating of a Software Enterprise Ontology that was reengineered using the Unified Foundation Ontology (UFO) as basis.
Computational Fluid Dynamics Technology for Hypersonic Applications
NASA Technical Reports Server (NTRS)
Gnoffo, Peter A.
2003-01-01
Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state-of-art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.
Myths and realities: Defining re-engineering for a large organization
NASA Technical Reports Server (NTRS)
Yin, Sandra; Mccreary, Julia
1992-01-01
This paper describes the background and results of three studies concerning software reverse engineering, re-engineering, and reuse (R3) hosted by the Internal Revenue Service in 1991 and 1992. The situation at the Internal Revenue--aging, piecemeal computer systems and outdated technology maintained by a large staff--is familiar to many institutions, especially among management information systems. The IRS is distinctive for the sheer magnitude and diversity of its problems; the country's tax records are processed using assembly language and COBOL and spread across tape and network DBMS files. How do we proceed with replacing legacy systems? The three software re-engineering studies looked at methods, CASE tool support, and performed a prototype project using re-engineering methods and tools. During the course of these projects, we discovered critical issues broader than the mechanical definitions of methods and tool technology.
Reengineering the JPL Spacecraft Design Process
NASA Technical Reports Server (NTRS)
Briggs, C.
1995-01-01
This presentation describes the factors that have emerged in the evolved process of reengineering the unmanned spacecraft design process at the Jet Propulsion Laboratory in Pasadena, California. Topics discussed include: New facilities, new design factors, new system-level tools, complex performance objectives, changing behaviors, design integration, leadership styles, and optimization.
Reengineering Activities in K-8 Classrooms: Focus on Formative Feedback
ERIC Educational Resources Information Center
Foster, Patrick N.
2006-01-01
Almost every K-8 technology activity includes feedback. Technology teachers generally view the input-process-output paradigm as being incomplete. A step toward completion would be the inclusion of a "feedback" component incorporating reengineering. This article provides an example of how one activity evolved through several stages to include…
Design Recovery for Software Library Population
1992-12-01
increase understandability, efficiency, and maintainability of the software and the design. A good representation choice will also aid in...required for a reengineering project. It details the analysis and planning phase and gives good criteria for determining the need for a reengineering...because it deals with all of these issues. With his complete description of the analysis and planning phase, Byrne has a good foundation for
Critical evaluation of reverse engineering tool Imagix 4D!
Yadav, Rashmi; Patel, Ravindra; Kothari, Abhay
2016-01-01
The comprehension of legacy codes is difficult to understand. Various commercial reengineering tools are available that have unique working styles, and are equipped with their inherent capabilities and shortcomings. The focus of the available tools is in visualizing static behavior not the dynamic one. Therefore, it is difficult for people who work in software product maintenance, code understanding reengineering/reverse engineering. Consequently, the need for a comprehensive reengineering/reverse engineering tool arises. We found the usage of Imagix 4D to be good as it generates the maximum pictorial representations in the form of flow charts, flow graphs, class diagrams, metrics and, to a partial extent, dynamic visualizations. We evaluated Imagix 4D with the help of a case study involving a few samples of source code. The behavior of the tool was analyzed on multiple small codes and a large code gcc C parser. Large code evaluation was performed to uncover dead code, unstructured code, and the effect of not including required files at preprocessing level. The utility of Imagix 4D to prepare decision density and complexity metrics for a large code was found to be useful in getting to know how much reengineering is required. At the outset, Imagix 4D offered limitations in dynamic visualizations, flow chart separation (large code) and parsing loops. The outcome of evaluation will eventually help in upgrading Imagix 4D and posed a need of full featured tools in the area of software reengineering/reverse engineering. It will also help the research community, especially those who are interested in the realm of software reengineering tool building.
System Re-engineering Project Executive Summary
1991-11-01
Management Information System (STAMIS) application. This project involved reverse engineering, evaluation of structured design and object-oriented design, and re- implementation of the system in Ada. This executive summary presents the approach to re-engineering the system, the lessons learned while going through the process, and issues to be considered in future tasks of this nature.... Computer-Aided Software Engineering (CASE), Distributed Software, Ada, COBOL, Systems Analysis, Systems Design, Life Cycle Development, Functional Decomposition, Object-Oriented
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III
1991-01-01
Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. JSC created a significant set of tools to develop and maintain FORTRAN and C code during development of the Space Shuttle. This tool set forms the basis for an integrated environment to re-engineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. A beta vision of the environment was released in Mar. 1991. The commercial potential for such re-engineering tools is very great. CASE TRENDS magazine reported it to be the primary concern of over four hundred of the top MIS executives.
Scheduling System Assessment, and Development and Enhancement of Re-engineered Version of GPSS
NASA Technical Reports Server (NTRS)
Loganantharaj, Rasiah; Thomas, Bushrod; Passonno, Nicole
1996-01-01
The objective of this project is two-fold. First to provide an evaluation of a commercially developed version of the ground processing scheduling system (GPSS) for its applicability to the Kennedy Space Center (KSC) ground processing problem. Second, to work with the KSC GPSS development team and provide enhancement to the existing software. Systems reengineering is required to provide a sustainable system for the users and the software maintenance group. Using the LISP profile prototype code developed by the GPSS reverse reengineering groups as a building block, we have implemented the resource deconfliction portion of GPSS in common LISP using its object oriented features. The prototype corrects and extends some of the deficiencies of the current production version, plus it uses and builds on the classes from the development team's profile prototype.
NASA Technical Reports Server (NTRS)
Fordyce, Jess
1996-01-01
Work carried out to re-engineer the mission analysis segment of JPL's mission planning ground system architecture is reported on. The aim is to transform the existing software tools, originally developed for specific missions on different support environments, into an integrated, general purpose, multi-mission tool set. The issues considered are: the development of a partnership between software developers and users; the definition of key mission analysis functions; the development of a consensus based architecture; the move towards evolutionary change instead of revolutionary replacement; software reusability, and the minimization of future maintenance costs. The current status and aims of new developments are discussed and specific examples of cost savings and improved productivity are presented.
NASA Technical Reports Server (NTRS)
Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.
1992-01-01
The United State's big space projects of the next decades, such as Space Station and the Human Exploration Initiative, will need the development of many millions of lines of mission critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and the productivity of large software development projects. New trends are outlined in CASE technology and how the Software Technology Branch (STB) at JSC is endeavoring to provide some of these CASE solutions for NASA is described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.
Paradigm shift: Can TQM save DOD's procurement process?
NASA Astrophysics Data System (ADS)
Romeo, Ross V.
1992-11-01
The Department of Defense's (DOD) ambitious introduction of total quality management (TQM) will fail, unless they change their paradigm and reengineer how they do business. TQM implementation in the defense department and possibilities for reengineering DOD's management structure were investigated. This paper uses a case study to investigate DOD's procurement efficiency and effectiveness with information technology. The findings show DOD is faced with its greatest challenge since WWII in meeting the rapidly evolving environment of the 1990s and the 21st century.
NASA Technical Reports Server (NTRS)
Leach, Ronald J.
1997-01-01
The purpose of this project was to study the feasibility of reusing major components of a software system that had been used to control the operations of a spacecraft launched in the 1980s. The study was done in the context of a ground data processing system that was to be rehosted from a large mainframe to an inexpensive workstation. The study concluded that a systematic approach using inexpensive tools could aid in the reengineering process by identifying a set of certified reusable components. The study also developed procedures for determining duplicate versions of software, which were created because of inadequate naming conventions. Such procedures reduced reengineering costs by approximately 19.4 percent.
ERIC Educational Resources Information Center
Pouyioutas, Philippos; Gjermundrod, Harald; Dionysiou, Ioanna
2012-01-01
Purpose: The purpose of this paper is to present ReProTool Version 2.0, a software tool that is used for the European Credit Transfer System (ECTS) and the Bologna Process re-engineering of academic programmes. The tool is the result of an 18 months project (February 2012-July 2013) project, co-financed by the European Regional Development Fund…
Code of Federal Regulations, 2014 CFR
2014-04-01
... inquiries and responses between SWAs. Major IT Modernization Project means conversion, re-engineering..., or upgrading software libraries, protocols, or hardware platform and infrastructure. These are...
Reengineering Real-Time Software Systems
1993-09-09
reengineering existing large-scale (or real-time) systems; systems designed prior to or during the advent of applied SE (Parnas 1979, Freeman 1980). Is... Advisor : Yutaka Kanayama Approved for public release; distribution is unlimited. 93-29769 93 12 6 098 Form Appmoved REPORT DOCUMENTATION PAGE 1o No. PI rep...trm b Idn 1o tl# caik t al wdornon s easnated to waere 1how per response. fr4ikcdm the time rem matnodons. siauide exetig da"a siuo a i and mami diqw
Future Software Sizing Metrics and Estimation Challenges
2011-07-01
systems 4. Ultrahigh software system assurance 5. Legacy maintenance and Brownfield development 6. Agile and Lean/ Kanban development. This paper...refined as the design of the maintenance modifications or Brownfield re-engineering is determined. VII. 6. AGILE AND LEAN/ KANBAN DEVELOPMENT The...difficulties of software maintenance estimation can often be mitigated by using lean workflow management techniques such as Kanban [25]. In Kanban
Agile Methods: Selected DoD Management and Acquisition Concerns
2011-10-01
SIDRE Software Intensive Innovative Development and Reengineering/Evolution SLIM Software Lifecycle Management -Estimate SLOC source lines of code...ISBN #0321502752 Coaching Agile Teams Lyssa Adkins ISBN #0321637704 Agile Project Management : Creating Innovative Products – Second Edition Jim...Accessed July 13, 2011. [Highsmith 2009] Highsmith, J. Agile Project Management : Creating Innovative Products, 2nd ed. Addison- Wesley, 2009
IDC Re-Engineering Phase 2 Iteration E2 Use Case Realizations Version 1.2.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamlet, Benjamin R.; Harris, James M.; Burns, John F.
2016-12-01
This document contains 4 use case realizations generated from the model contained in Rational Software Architect. These use case realizations are the current versions of the realizations originally delivered in Elaboration Iteration 2.
Careers Work in England's Schools: Politics, Practices and Prospects
ERIC Educational Resources Information Center
Hughes, Deirdre
2017-01-01
This article provides a five-year historical synopsis of how central government policies are impacting on careers work in England's secondary schools. It shows attempts to reshape and re-engineer careers provision for young people, through an evolving careers experiment. The extent to which such exogenous arrangements are facilitating and/or…
MFV-class: a multi-faceted visualization tool of object classes.
Zhang, Zhi-meng; Pan, Yun-he; Zhuang, Yue-ting
2004-11-01
Classes are key software components in an object-oriented software system. In many industrial OO software systems, there are some classes that have complicated structure and relationships. So in the processes of software maintenance, testing, software reengineering, software reuse and software restructure, it is a challenge for software engineers to understand these classes thoroughly. This paper proposes a class comprehension model based on constructivist learning theory, and implements a software visualization tool (MFV-Class) to help in the comprehension of a class. The tool provides multiple views of class to uncover manifold facets of class contents. It enables visualizing three object-oriented metrics of classes to help users focus on the understanding process. A case study was conducted to evaluate our approach and the toolkit.
ERIC Educational Resources Information Center
Pitter, Gita Wijesinghe
2007-01-01
Program reviews became widely used as quality assurance activities in the United States beginning in the 1970s. Since then, they have evolved as an essential component in demonstrating institutional effectiveness to accrediting bodies. The paper discusses various approaches to reviews with a focus on a recently reengineered institutional program…
2004-02-01
Publishing Company , Addison- Wesley Systems Programming Series, 1990. [5] E. Stroulia and T. Systa. Dynamic analysis for reverse engineering and program...understanding, Applied Computing Reviews, Spring 2002, ACM Press. [6] El- Ramly , Mohammad; Stroulia, Eleni; Sorenson, Paul. “Recovering software
Modern software approaches applied to a Hydrological model: the GEOtop Open-Source Software Project
NASA Astrophysics Data System (ADS)
Cozzini, Stefano; Endrizzi, Stefano; Cordano, Emanuele; Bertoldi, Giacomo; Dall'Amico, Matteo
2017-04-01
The GEOtop hydrological scientific package is an integrated hydrological model that simulates the heat and water budgets at and below the soil surface. It describes the three-dimensional water flow in the soil and the energy exchange with the atmosphere, considering the radiative and turbulent fluxes. Furthermore, it reproduces the highly non-linear interactions between the water and energy balance during soil freezing and thawing, and simulates the temporal evolution of snow cover, soil temperature and moisture. The core components of the package were presented in the 2.0 version (Endrizzi et al, 2014), which was released as Free Software Open-source project. However, despite the high scientific quality of the project, a modern software engineering approach was still missing. Such weakness hindered its scientific potential and its use both as a standalone package and, more importantly, in an integrate way with other hydrological software tools. In this contribution we present our recent software re-engineering efforts to create a robust and stable scientific software package open to the hydrological community, easily usable by researchers and experts, and interoperable with other packages. The activity takes as a starting point the 2.0 version, scientifically tested and published. This version, together with several test cases based on recent published or available GEOtop applications (Cordano and Rigon, 2013, WRR, Kollet et al, 2016, WRR) provides the baseline code and a certain number of referenced results as benchmark. Comparison and scientific validation can then be performed for each software re-engineering activity performed on the package. To keep track of any single change the package is published on its own github repository geotopmodel.github.io/geotop/ under GPL v3.0 license. A Continuous Integration mechanism by means of Travis-CI has been enabled on the github repository on master and main development branches. The usage of CMake configuration tool and the suite of tests (easily manageable by means of ctest tools) greatly reduces the burden of the installation and allows us to enhance portability on different compilers and Operating system platforms. The package was also complemented by several software tools which provide web-based visualization of results based on R plugins, in particular "shiny" (Chang at al, 2016), "geotopbricks" and "geotopOptim2" (Cordano et al, 2016) packages, which allow rapid and efficient scientific validation of new examples and tests. The software re-engineering activities are still under development. However, our first results are promising enough to eventually reach a robust and stable software project that manages in a flexible way a complex state-of-the-art hydrological model like GEOtop and integrates it into wider workflows.
Heterogeneous scalable framework for multiphase flows
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, Karla Vanessa
2013-09-01
Two categories of challenges confront the developer of computational spray models: those related to the computation and those related to the physics. Regarding the computation, the trend towards heterogeneous, multi- and many-core platforms will require considerable re-engineering of codes written for the current supercomputing platforms. Regarding the physics, accurate methods for transferring mass, momentum and energy from the dispersed phase onto the carrier fluid grid have so far eluded modelers. Significant challenges also lie at the intersection between these two categories. To be competitive, any physics model must be expressible in a parallel algorithm that performs well on evolving computermore » platforms. This work created an application based on a software architecture where the physics and software concerns are separated in a way that adds flexibility to both. The develop spray-tracking package includes an application programming interface (API) that abstracts away the platform-dependent parallelization concerns, enabling the scientific programmer to write serial code that the API resolves into parallel processes and threads of execution. The project also developed the infrastructure required to provide similar APIs to other application. The API allow object-oriented Fortran applications direct interaction with Trilinos to support memory management of distributed objects in central processing units (CPU) and graphic processing units (GPU) nodes for applications using C++.« less
NASA Technical Reports Server (NTRS)
Nieten, Joseph L.; Burke, Roger
1992-01-01
The System Diagnostic Builder (SDB) is an automated software verification and validation tool using state-of-the-art Artificial Intelligence (AI) technologies. The SDB is used extensively by project BURKE at NASA-JSC as one component of a software re-engineering toolkit. The SDB is applicable to any government or commercial organization which performs verification and validation tasks. The SDB has an X-window interface, which allows the user to 'train' a set of rules for use in a rule-based evaluator. The interface has a window that allows the user to plot up to five data parameters (attributes) at a time. Using these plots and a mouse, the user can identify and classify a particular behavior of the subject software. Once the user has identified the general behavior patterns of the software, he can train a set of rules to represent his knowledge of that behavior. The training process builds rules and fuzzy sets to use in the evaluator. The fuzzy sets classify those data points not clearly identified as a particular classification. Once an initial set of rules is trained, each additional data set given to the SDB will be used by a machine learning mechanism to refine the rules and fuzzy sets. This is a passive process and, therefore, it does not require any additional operator time. The evaluation component of the SDB can be used to validate a single software system using some number of different data sets, such as a simulator. Moreover, it can be used to validate software systems which have been re-engineered from one language and design methodology to a totally new implementation.
1994-09-01
report for the Properties of User Interface Software Architetures ", draft DISCUS Working Group, Programmers Tutorial, MITRE paper, SEI. Carnegie...execution that we have defined called asynchronous remote procedure call (ARPC) [15], which allows concurrency in amounts proportional to the amount of...demonstration project to use STARS DoD software budget and the proportion concepts. IBM is one of the prime is expected to be increased during the contractors
Riding the Crest of the E-Commerce Wave: Transforming MIT's Campus Computer Resale Operation.
ERIC Educational Resources Information Center
Hallisey, Joanne
1998-01-01
Reengineering efforts, vendor consolidation, and rising costs prompted the Massachusetts Institute of Technology to convert its computer resale store to an online catalog that allows students, faculty, and staff to purchase equipment and software through a World Wide Web interface. The transition has been greeted with a mixed reaction. The next…
Enhanced Software for Scheduling Space-Shuttle Processing
NASA Technical Reports Server (NTRS)
Barretta, Joseph A.; Johnson, Earl P.; Bierman, Rocky R.; Blanco, Juan; Boaz, Kathleen; Stotz, Lisa A.; Clark, Michael; Lebovitz, George; Lotti, Kenneth J.; Moody, James M.;
2004-01-01
The Ground Processing Scheduling System (GPSS) computer program is used to develop streamlined schedules for the inspection, repair, and refurbishment of space shuttles at Kennedy Space Center. A scheduling computer program is needed because space-shuttle processing is complex and it is frequently necessary to modify schedules to accommodate unanticipated events, unavailability of specialized personnel, unexpected delays, and the need to repair newly discovered defects. GPSS implements constraint-based scheduling algorithms and provides an interactive scheduling software environment. In response to inputs, GPSS can respond with schedules that are optimized in the sense that they contain minimal violations of constraints while supporting the most effective and efficient utilization of space-shuttle ground processing resources. The present version of GPSS is a product of re-engineering of a prototype version. While the prototype version proved to be valuable and versatile as a scheduling software tool during the first five years, it was characterized by design and algorithmic deficiencies that affected schedule revisions, query capability, task movement, report capability, and overall interface complexity. In addition, the lack of documentation gave rise to difficulties in maintenance and limited both enhanceability and portability. The goal of the GPSS re-engineering project was to upgrade the prototype into a flexible system that supports multiple- flow, multiple-site scheduling and that retains the strengths of the prototype while incorporating improvements in maintainability, enhanceability, and portability.
Generalized Operations Simulation Environment for Aircraft Maintenance Training
2004-04-01
Operations Simulation Environment ( GOSE ) project is a collaborative effort between AETC and AFRL to develop common, cost-effective, generalized VR training...maintenance training domain since it provided an opportunity to build on the VEST architecture. Development of GOSE involves re-engineering VEST as a scalable...modular, immersive VR training system comprised of PC-based hardware and software. GOSE initiatives include: (a) formalize training needs across
Transforming Our SMEX Organization by Way of Innovation, Standardization, and Automation
NASA Technical Reports Server (NTRS)
Madden, Maureen; Crouse, Pat; Carry, Everett; Esposito, timothy; Parker, Jeffrey; Bradley, David
2006-01-01
NASA's Small Explorer (SMEX) Flight Operations Team (FOT) is currently tackling the challenge of supporting ground operations for several satellites that have surpassed their designed lifetime and have a dwindling budget. At Goddard Space Flight Center (GSFC), these missions are presently being reengineered into a fleet-oriented ground system. When complete, this ground system will provide command and control of four SMEX missions, and will demonstrate fleet automation and control concepts as a pathfinder for additional mission integrations. A goal of this reengineering effort is to demonstrate new ground-system technologies that show promise of supporting longer mission lifecycles and simplifying component integration. In pursuit of this goal, the SMEX organization has had to examine standardization, innovation, and automation. A core technology being demonstrated in this effort is the GSFC Mission Services Evolution Center (GMSEC) architecture. The GMSEC architecture focuses on providing standard interfaces for ground system applications to promote application interoperability. Building around commercial Message Oriented Middleware and providing a common messaging standard allows GMSEC to provide the capabilities necessary to support integration of new software components into existing missions and increase the level of interaction within the system. For SMS, GMSEC has become the technology platform to transform flight operations with the innovation and automation necessary to reduce operational costs. The automation technologies supported in SMEX are built upon capabilities provided by the GMSEC architecture that allows the FOT to further reduce the involvement of the console, operator. Initially, SMEX is automating only routine operations, such as safety and health monitoring, basic commanding, and system recovery. The operational concepts being developed here will reduce the need for staffed passes and are a necessity for future fleet management. As this project continues to evolve, additional innovations beyond GMSEC and automation have, and will continue to be developed. The team developed techniques for migrating ground systems of existing on-orbit assets. The tools necessary to monitor and control software failures were integrated and tailored for operational environments. All this was done with a focus of extending fleet operations to mission beyond SMU. The result of this work is the foundation for a broader fleet-capable ground system that will include several missions supported by the Space Science Mission Operations Project.
Development of a testlet generator in re-engineering the Indonesian physics national-exams
NASA Astrophysics Data System (ADS)
Mindyarto, Budi Naini; Mardapi, Djemari; Bastari
2017-08-01
The Indonesian Physics national-exams are end-of-course summative assessments that could be utilized to support the assessment for learning in physics educations. This paper discusses the development and evaluation of a testlet generator based on a re-engineering of Indonesian physics national exams. The exam problems were dissected and decomposed into testlets revealing the deeper understanding of the underlying physical concepts by inserting a qualitative question and its scientific reasoning question. A template-based generator was built to facilitate teachers in generating testlet variants that would be more conform to students' scientific attitude development than their original simple multiple-choice formats. The testlet generator was built using open source software technologies and was evaluated focusing on the black-box testing by exploring the generator's execution, inputs and outputs. The results showed the correctly-performed functionalities of the developed testlet generator in validating inputs, generating testlet variants, and accommodating polytomous item characteristics.
Design implications for task-specific search utilities for retrieval and re-engineering of code
NASA Astrophysics Data System (ADS)
Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif
2017-05-01
The importance of information retrieval systems is unquestionable in the modern society and both individuals as well as enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context such as development language, technology framework, goal of the project, project complexity and developer's domain expertise. They also impose additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed in order to support their work-related activities. In order to investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded with the Google standard search engine and Microsoft IntelliSense for retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation has achieved promising initial results on the precision and recall performance of the system.
Developing a Cyberinfrastructure for integrated assessments of environmental contaminants.
Kaur, Taranjit; Singh, Jatinder; Goodale, Wing M; Kramar, David; Nelson, Peter
2005-03-01
The objective of this study was to design and implement prototype software for capturing field data and automating the process for reporting and analyzing the distribution of mercury. The four phase process used to design, develop, deploy and evaluate the prototype software is described. Two different development strategies were used: (1) design of a mobile data collection application intended to capture field data in a meaningful format and automate transfer into user databases, followed by (2) a re-engineering of the original software to develop an integrated database environment with improved methods for aggregating and sharing data. Results demonstrated that innovative use of commercially available hardware and software components can lead to the development of an end-to-end digital cyberinfrastructure that captures, records, stores, transmits, compiles and integrates multi-source data as it relates to mercury.
Using integration technology as a strategic advantage.
Fry, P A
1993-08-01
The underlying premise of the Managed Competition Act previously cited is that through managed competition providers will be forced to lower care costs while increasing the level of positive care outcomes. Because it may also be that tomorrow's hospitals will find a severe rationing of technology, what can they do to prepare? Most of the systems in place today already have built within them all the necessary potential to address this premise and technology requirement with no change, no conversion, no expense for new equipment and software, and no disruption in day-to-day operations, just a little re-engineering. Today, however, these systems are similar to a 20-mule team pulling in different directions: all the power is there, but the wagon remains motionless and totally unable to reach its objective. It takes a skilled wagonmaster to bring them together, to make the mules work as a cohesive unit, to make the power of 20 mules greater than the sum of 20 mules. So it is and will be for the hospital of tomorrow. System integration is no longer a question of whether but of when. Those hospitals that use it today as a strategic advantage will be in a better position tomorrow to use it as a competitive strategic advantage in an environment that will reward low cost and high positive care outcomes and will penalize those that cannot compete. The technology is already here and economically within reach of nearly every hospital, just waiting to be used. The question that must nag all of us who want to make the health care system of America better is, Why not make the when now? Rich Helppie, president of Superior Consultant Company, summarized the solution well: The old ways will not give way to the new overnight. The re-engineering process in healthcare must evolve. Compared to the last 20 years, however, such evolution may appear to be a massive, forthright, complete, comprehensive, drastic and rapid revolution. Survival is the name of the game, and for healthcare organizations to survive, they must join the re-engineering revolution. When that happens, the "Americanized" health care system might look like the community network shown in Figure 7, which is a networked integration of all the systems used in the community working together and exchanging information, with hospitals exchanging information with other hospitals and sharing it with physicians, clinics, laboratories, radiology centers, universities, employers, payors, and governmental agencies.(ABSTRACT TRUNCATED AT 400 WORDS)
Leadership processes for re-engineering changes to the health care industry.
Guo, Kristina L
2004-01-01
As health care organizations seek innovative ways to change financing and delivery mechanisms due to escalated health care costs and increased competition, drastic changes are being sought in the form of re-engineering. This study discusses the leader's role of re-engineering in health care. It specifically addresses the reasons for failures in re-engineering and argues that success depends on senior level leaders playing a critical role. Existing studies lack comprehensiveness in establishing models of re-engineering and management guidelines. This research focuses on integrating re-engineering and leadership processes in health care by creating a step-by-step model. Particularly, it illustrates the four Es: Examination, Establishment, Execution and Evaluation, as a comprehensive re-engineering process that combines managerial roles and activities to result in successfully changed and reengineered health care organizations.
Re-Engineering the Mission Operations System (MOS) for the Prime and Extended Mission
NASA Technical Reports Server (NTRS)
Hunt, Joseph C., Jr.; Cheng, Leo Y.
2012-01-01
One of the most challenging tasks in a space science mission is designing the Mission Operations System (MOS). Whereas the focus of the project is getting the spacecraft built and tested for launch, the mission operations engineers must build a system to carry out the science objectives. The completed MOS design is then formally assessed in the many reviews. Once a mission has completed the reviews, the Mission Operation System (MOS) design has been validated to the Functional Requirements and is ready for operations. The design was built based on heritage processes, new technology, and lessons learned from past experience. Furthermore, our operational concepts must be properly mapped to the mission design and science objectives. However, during the course of implementing the science objective in the operations phase after launch, the MOS experiences an evolutional change to adapt for actual performance characteristics. This drives the re-engineering of the MOS, because the MOS includes the flight and ground segments. Using the Spitzer mission as an example we demonstrate how the MOS design evolved for both the prime and extended mission to enhance the overall efficiency for science return. In our re-engineering process, we ensured that no requirements were violated or mission objectives compromised. In most cases, optimized performance across the MOS, including gains in science return as well as savings in the budget profile was achieved. Finally, we suggest a need to better categorize the Operations Phase (Phase E) in the NASA Life-Cycle Phases of Formulation and Implementation
Simulation software: engineer processes before reengineering.
Lepley, C J
2001-01-01
People make decisions all the time using intuition. But what happens when you are asked: "Are you sure your predictions are accurate? How much will a mistake cost? What are the risks associated with this change?" Once a new process is engineered, it is difficult to analyze what would have been different if other options had been chosen. Simulating a process can help senior clinical officers solve complex patient flow problems and avoid wasted efforts. Simulation software can give you the data you need to make decisions. The author introduces concepts, methodologies, and applications of computer aided simulation to illustrate their use in making decisions to improve workflow design.
NASA Astrophysics Data System (ADS)
Khripko, Elena
2017-10-01
In the present article we study the issues of organizational resistance to reengineering of business processes in construction of transport infrastructure. Reengineering in a company of transport sector is, first and foremost, an innovative component of business strategy. We analyze the choice of forward and reverse reengineering tools and terms of their application in connection with organizational resistance. Reengineering is defined taking into account four aspects: fundamentality, radicality, abruptness, business process. We describe the stages of reengineering and analyze key requirements to newly created business processes.
Flight dynamics software in a distributed network environment
NASA Technical Reports Server (NTRS)
Jeletic, J.; Weidow, D.; Boland, D.
1995-01-01
As with all NASA facilities, the announcement of reduced budgets, reduced staffing, and the desire to implement smaller/quicker/cheaper missions has required the Agency's organizations to become more efficient in what they do. To accomplish these objectives, the FDD has initiated the development of the Flight Dynamics Distributed System (FDDS). The underlying philosophy of FDDS is to build an integrated system that breaks down the traditional barriers of attitude, mission planning, and navigation support software to provide a uniform approach to flight dynamics applications. Through the application of open systems concepts and state-of-the-art technologies, including object-oriented specification concepts, object-oriented software, and common user interface, communications, data management, and executive services, the FDD will reengineer most of its six million lines of code.
IDC Reengineering Phase 2 Project Scope.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, James M.
This report provides a brief description of the scope of the IDC Reengineering Phase 2 project. It describes the goals and objectives of reengineering, the system definition, and the technical scope of the system. REVISIONS Version Date Author/Team Revision Description Authorized by 1.0 9/25/2014 SNL IDC Reengineering Team Unlimited Release for I2 M. Harris 1.1 28/01/2015 IDC Reengineering Team Align with previous IDC scope document E. Tomuta.
Re-engineering pre-employment check-up systems: a model for improving health services.
Rateb, Said Abdel Hakim; El Nouman, Azza Abdel Razek; Rateb, Moshira Abdel Hakim; Asar, Mohamed Naguib; El Amin, Ayman Mohammed; Gad, Saad abdel Aziz; Mohamed, Mohamed Salah Eldin
2011-01-01
The purpose of this paper is to develop a model for improving health services provided by the pre-employment medical fitness check-up system affiliated to Egypt's Health Insurance Organization (HIO). Operations research, notably system re-engineering, is used in six randomly selected centers and findings before and after re-engineering are compared. The re-engineering model follows a systems approach, focusing on three areas: structure, process and outcome. The model is based on six main components: electronic booking, standardized check-up processes, protected medical documents, advanced archiving through an electronic content management (ECM) system, infrastructure development, and capacity building. The model originates mainly from customer needs and expectations. The centers' monthly customer flow increased significantly after re-engineering. The mean time spent per customer cycle improved after re-engineering--18.3 +/- 5.5 minutes as compared to 48.8 +/- 14.5 minutes before. Appointment delay was also significantly decreased from an average 18 to 6.2 days. Both beneficiaries and service providers were significantly more satisfied with the services after re-engineering. The model proves that re-engineering program costs are exceeded by increased revenue. Re-engineering in this study involved multiple structure and process elements. The literature review did not reveal similar re-engineering healthcare packages. Therefore, each element was compared separately. This model is highly recommended for improving service effectiveness and efficiency. This research is the first in Egypt to apply the re-engineering approach to public health systems. Developing user-friendly models for service improvement is an added value.
Schwentner, Andreas; Feith, André; Münch, Eugenia; Busche, Tobias; Rückert, Christian; Kalinowski, Jörn; Takors, Ralf; Blombach, Bastian
2018-03-06
Evolutionary approaches are often undirected and mutagen-based yielding numerous mutations, which need elaborate screenings to identify relevant targets. We here apply Metabolic engineering to Guide Evolution (MGE), an evolutionary approach evolving and identifying new targets to improve microbial producer strains. MGE is based on the idea to impair the cell's metabolism by metabolic engineering, thereby generating guided evolutionary pressure. It consists of three distinct phases: (i) metabolic engineering to create the evolutionary pressure on the applied strain followed by (ii) a cultivation phase with growth as straightforward screening indicator for the evolutionary event, and (iii) comparative whole genome sequencing (WGS), to identify mutations in the evolved strains, which are eventually re-engineered for verification. Applying MGE, we evolved the PEP and pyruvate carboxylase-deficient strain C. glutamicum Δppc Δpyc to grow on glucose as substrate with rates up to 0.31 ± 0.02 h -1 which corresponds to 80% of the growth rate of the wildtype strain. The intersection of the mutations identified by WGS revealed isocitrate dehydrogenase (ICD) as consistent target in three independently evolved mutants. Upon re-engineering in C. glutamicum Δppc Δpyc, the identified mutations led to diminished ICD activities and activated the glyoxylate shunt replenishing oxaloacetate required for growth. Intracellular relative quantitative metabolome analysis showed that the pools of citrate, isocitrate, cis-aconitate, and L-valine were significantly higher compared to the WT control. As an alternative to existing L-valine producer strains based on inactivated or attenuated pyruvate dehydrogenase complex, we finally engineered the PEP and pyruvate carboxylase-deficient C. glutamicum strains with identified ICD mutations for L-valine production by overexpression of the L-valine biosynthesis genes. Among them, C. glutamicum Δppc Δpyc ICD G407S (pJC4ilvBNCE) produced up to 8.9 ± 0.4 g L-valine L -1 , with a product yield of 0.22 ± 0.01 g L-valine per g glucose. Copyright © 2018 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
2003-09-01
BLANK xv LIST OF ACRONYMS ABC Activity Based Costing ADO ActiveX Data Object ASP Application Server Page BPR Business Process Re...processes uses people and systems (hardware, software, machinery, etc.) and that these people and systems contain the “corporate” knowledge of the...server architecture was also a high maintenance item. Data was no longer contained on one mainframe but was distributed throughout the enterprise
2009-04-23
of Code Need for increased functionality will be a forcing function to bring the fields of software and systems engineering... of Software-Intensive Systems is Increasing 3 How Evolving Trends in Systems and Software Technologies Bode Well for Advancing the Precision of ...Engineering in Continued Partnership 4 How Evolving Trends in Systems and Software Technologies Bode Well for Advancing the
Business Reengineering - Government Viability
2000-04-01
BUSINESS REENGINEERING - GOVERNMENT VIABILITY BY LIEUTENANT COLONEL GRAY K. COYNER United States Air Force Reserve DISTRIBUTION STATEMENT A: Approved...PROJECT Business Reengineering - Government Viability by Lt Col Gray K. Coyner USAFR Col Harry E. LeBoeuf Jr. Project Advisor The views expressed in...release. Distribution is unlimited. ii ABSTRACT AUTHOR: Gray K. Coyner TITLE: Business Reengineering - Government Viability FORMAT: Strategy Research
A new software for prediction of femoral neck fractures.
Testi, Debora; Cappello, Angelo; Sgallari, Fiorella; Rumpf, Martin; Viceconti, Marco
2004-08-01
Femoral neck fractures are an important clinical, social and economic problem. Even if many different attempts have been carried out to improve the accuracy predicting the fracture risk, it was demonstrated in retrospective studies that the standard clinical protocol achieves an accuracy of about 65%. A new procedure was developed including for the prediction not only bone mineral density but also geometric and femoral strength information and achieving an accuracy of about 80% in a previous retrospective study. Aim of the present work was to re-engineer research-based procedures and develop a real-time software for the prediction of the risk for femoral fracture. The result was efficient, repeatable and easy to use software for the evaluation of the femoral neck fracture risk to be inserted in the daily clinical practice providing a useful tool for the improvement of fracture prediction.
NASA Technical Reports Server (NTRS)
Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan
1993-01-01
Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.
2001-09-01
replication) -- all from Visual Basic and VBA . In fact, we found that the SQL Server engine actually had a plethora of options, most formidable of...2002, the new SQL Server 2000 database engine, and Microsoft Visual Basic.NET. This thesis describes our use of the Spiral Development Model to...versions of Microsoft products? Specifically, the pending release of Microsoft Office 2002, the new SQL Server 2000 database engine, and Microsoft
Gumbo, B
2000-01-01
The Harare metropolis in Zimbabwe, extending upstream from Manyame Dam in the Upper Manyame River Basin, consists of the City of Harare and its satellite towns: Chitungwiza, Norton, Epworth and Ruwa. The existing urban drainage system is typically a single-use-mixing system: water is used and discharged to "waste", excreta are flushed to sewers and eventually, after "treatment", the effluent is discharged to a drinking water supply source. Polluted urban storm water is evacuated as fast as possible. This system not only ignores the substantial value in "waste" materials, but it also exports problems to downstream communities and to vulnerable fresh-water sources. The question is how can the harare metropolis urban drainage system, which is complex and has evolved over time, be rearranged to achieve sustainability (i.e. water conservation, pollution prevention at source, protection of the vulnerable drinking water sources and recovery of valuable materials)? This paper reviews current concepts regarding the future development of the urban drainage system in line with the new vision of "Sustainable Cities of the Future". The Harare Metropolis in Zimbabwe is taken as a case, and philosophical options for re-engineering the drainage system are discussed.
Theory and Practice Meets in Industrial Process Design -Educational Perspective-
NASA Astrophysics Data System (ADS)
Aramo-Immonen, Heli; Toikka, Tarja
Software engineer should see himself as a business process designer in enterprise resource planning system (ERP) re-engineering project. Software engineers and managers should have design dialogue. The objective of this paper is to discuss the motives to study the design research in connection of management education in order to envision and understand the soft human issues in the management context. Second goal is to develop means of practicing social skills between designers and managers. This article explores the affective components of design thinking in industrial management domain. In the conceptual part of this paper are discussed concepts of network and project economy, creativity, communication, use of metaphors, and design thinking. Finally is introduced empirical research plan and first empirical results from design method experiments among the multi-disciplined groups of the master-level students of industrial engineering and management and software engineering.
SEQ-POINTER: Next generation, planetary spacecraft remote sensing science observation design tool
NASA Technical Reports Server (NTRS)
Boyer, Jeffrey S.
1994-01-01
Since Mariner, NASA-JPL planetary missions have been supported by ground software to plan and design remote sensing science observations. The software used by the science and sequence designers to plan and design observations has evolved with mission and technological advances. The original program, PEGASIS (Mariners 4, 6, and 7), was re-engineered as POGASIS (Mariner 9, Viking, and Mariner 10), and again later as POINTER (Voyager and Galileo). Each of these programs were developed under technological, political, and fiscal constraints which limited their adaptability to other missions and spacecraft designs. Implementation of a multi-mission tool, SEQ POINTER, under the auspices of the JPL Multimission Operations Systems Office (MOSO) is in progress. This version has been designed to address the limitations experienced on previous versions as they were being adapted to a new mission and spacecraft. The tool has been modularly designed with subroutine interface structures to support interchangeable celestial body and spacecraft definition models. The computational and graphics modules have also been designed to interface with data collected from previous spacecraft, or on-going observations, which describe the surface of each target body. These enhancements make SEQ POINTER a candidate for low-cost mission usage, when a remote sensing science observation design capability is required. The current and planned capabilities of the tool will be discussed. The presentation will also include a 5-10 minute video presentation demonstrating the capabilities of a proto-Cassini Project version that was adapted to test the tool. The work described in this abstract was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.
SEQ-POINTER: Next generation, planetary spacecraft remote sensing science observation design tool
NASA Astrophysics Data System (ADS)
Boyer, Jeffrey S.
1994-11-01
Since Mariner, NASA-JPL planetary missions have been supported by ground software to plan and design remote sensing science observations. The software used by the science and sequence designers to plan and design observations has evolved with mission and technological advances. The original program, PEGASIS (Mariners 4, 6, and 7), was re-engineered as POGASIS (Mariner 9, Viking, and Mariner 10), and again later as POINTER (Voyager and Galileo). Each of these programs were developed under technological, political, and fiscal constraints which limited their adaptability to other missions and spacecraft designs. Implementation of a multi-mission tool, SEQ POINTER, under the auspices of the JPL Multimission Operations Systems Office (MOSO) is in progress. This version has been designed to address the limitations experienced on previous versions as they were being adapted to a new mission and spacecraft. The tool has been modularly designed with subroutine interface structures to support interchangeable celestial body and spacecraft definition models. The computational and graphics modules have also been designed to interface with data collected from previous spacecraft, or on-going observations, which describe the surface of each target body. These enhancements make SEQ POINTER a candidate for low-cost mission usage, when a remote sensing science observation design capability is required. The current and planned capabilities of the tool will be discussed. The presentation will also include a 5-10 minute video presentation demonstrating the capabilities of a proto-Cassini Project version that was adapted to test the tool. The work described in this abstract was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.
An open system approach to process reengineering in a healthcare operational environment.
Czuchry, A J; Yasin, M M; Norris, J
2000-01-01
The objective of this study is to examine the applicability of process reengineering in a healthcare operational environment. The intake process of a mental healthcare service delivery system is analyzed systematically to identify process-related problems. A methodology which utilizes an open system orientation coupled with process reengineering is utilized to overcome operational and patient related problems associated with the pre-reengineered intake process. The systematic redesign of the intake process resulted in performance improvements in terms of cost, quality, service and timing.
Innovative model of business process reengineering at machine building enterprises
NASA Astrophysics Data System (ADS)
Nekrasov, R. Yu; Tempel, Yu A.; Tempel, O. A.
2017-10-01
The paper provides consideration of business process reengineering viewed as amanagerial innovation accepted by present day machine building enterprises, as well as waysto improve its procedure. A developed innovative model of reengineering measures isdescribed and is based on the process approach and other principles of company management.
Reengineering outcomes management: an integrated approach to managing data, systems, and processes.
Neuman, K; Malloch, K; Ruetten, V
1999-01-01
The integration of outcomes management into organizational reengineering projects is often overlooked or marginalized in proportion to the entire project. Incorporation of an integrated outcomes management program strengthens the overall quality of reengineering projects and enhances their sustainability. This article presents a case study in which data, systems, and processes were reengineered to form an effective Outcomes Management program as a component of the organization's overall project. The authors describe eight steps to develop and monitor an integrated outcomes management program. An example of an integrated report format is included.
The Cloud-Based Integrated Data Viewer (IDV)
NASA Astrophysics Data System (ADS)
Fisher, Ward
2015-04-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While there are a suite of tools and methodologies used in traditional software engineering environments to mitigate this issue, they are typically ignored by developers lacking a background in software engineering. The result is a large body of software which is simultaneously critical and difficult to maintain. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software API's. The advent of cloud computing has provided a solution to this problem, which was not previously practical on a large scale; Application Streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. Through application streaming we are able to bring the same visualization to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be. Unidata has been able to harness Application Streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved. We will also discuss the differences between local software and software-as-a-service.
The role of reengineering in health care delivery.
Boland, P
1996-01-01
Health care reengineering is a powerful methodology that helps organizations reorder priorities, provide more cost-effective care, and increase value to customers. It should be driven by what the customer wants and what the market needs. Systemwide reengineering integrates three levels of activity: managing community and health plan partnerships; consolidating overlapping delivery system functions among participating providers and vendors; and redesigning administrative functions, clinical services, and caregiving programs to improve health status. Reengineering is not a panacea; it is a critical core competency and requisite skill for health care organizations if they are to succeed under managed care in the future.
Re-engineering: a prescription for hospitals.
Bolton, C; Gordon, J R
1994-01-01
Previously applied mostly in large, private sector corporations, "re-engineering" is fast becoming a tool that hospitals can use to break away from the old to find a new and better way of doing things. Re-engineering, however, first requires strong leadership which is committed to employee involvement and re-inventing the process design to meet the needs of the customers. Once the transition has been completed, the processes and the organization must continue to be managed differently. This article reviews the processes involved in re-engineering, and discusses the implementation of the initiative at the Sunnybrook Health Science Centre in Toronto.
DIRAC3 - the new generation of the LHCb grid software
NASA Astrophysics Data System (ADS)
Tsaregorodtsev, A.; Brook, N.; Casajus Ramo, A.; Charpentier, Ph; Closier, J.; Cowan, G.; Graciani Diaz, R.; Lanciotti, E.; Mathe, Z.; Nandakumar, R.; Paterson, S.; Romanovsky, V.; Santinelli, R.; Sapunov, M.; Smith, A. C.; Seco Miguelez, M.; Zhelezov, A.
2010-04-01
DIRAC, the LHCb community Grid solution, was considerably reengineered in order to meet all the requirements for processing the data coming from the LHCb experiment. It is covering all the tasks starting with raw data transportation from the experiment area to the grid storage, data processing up to the final user analysis. The reengineered DIRAC3 version of the system includes a fully grid security compliant framework for building service oriented distributed systems; complete Pilot Job framework for creating efficient workload management systems; several subsystems to manage high level operations like data production and distribution management. The user interfaces of the DIRAC3 system providing rich command line and scripting tools are complemented by a full-featured Web portal providing users with a secure access to all the details of the system status and ongoing activities. We will present an overview of the DIRAC3 architecture, new innovative features and the achieved performance. Extending DIRAC3 to manage computing resources beyond the WLCG grid will be discussed. Experience with using DIRAC3 by other user communities than LHCb and in other application domains than High Energy Physics will be shown to demonstrate the general-purpose nature of the system.
Kohlberg, Gavriel D; Mancuso, Dean M; Chari, Divya A; Lalwani, Anil K
2015-01-01
Enjoyment of music remains an elusive goal following cochlear implantation. We test the hypothesis that reengineering music to reduce its complexity can enhance the listening experience for the cochlear implant (CI) listener. Normal hearing (NH) adults (N = 16) and CI listeners (N = 9) evaluated a piece of country music on three enjoyment modalities: pleasantness, musicality, and naturalness. Participants listened to the original version along with 20 modified, less complex, versions created by including subsets of the musical instruments from the original song. NH participants listened to the segments both with and without CI simulation processing. Compared to the original song, modified versions containing only 1-3 instruments were less enjoyable to the NH listeners but more enjoyable to the CI listeners and the NH listeners with CI simulation. Excluding vocals and including rhythmic instruments improved enjoyment for NH listeners with CI simulation but made no difference for CI listeners. Reengineering a piece of music to reduce its complexity has the potential to enhance music enjoyment for the cochlear implantee. Thus, in addition to improvements in software and hardware, engineering music specifically for the CI listener may be an alternative means to enhance their listening experience.
2001-09-01
measurable benefit in terms of process efficiency and effectiveness, business process reengineering (BPR) is becoming increasingly important. BPR suggests...technology by businesses in hopes of achieving a measurable benefit in terms of process efficiency and effectiveness, business process...KOPER-LITE ........................................13 E. HOW MIGHT THE MILITARY BENEFIT FROM PROCESS REENGINEERING EFFORTS
IDC System Specification Document Version 1.1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, James M.; Lober, Randall R.
2015-02-01
This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Reengineering Phase 2 project. Revisions Version Date Author/Team Revision Description Authorized by V1.0 12/2014 IDC Reengineering Project Team Initial delivery M. Harris V1.1 2/2015 IDC Reengineering Project Team Iteration I2 Review Comments M. Harris
NASA Astrophysics Data System (ADS)
Menguy, Theotime
Because of its critical nature, avionic industry is bound with numerous constraints such as security standards and certifications while having to fulfill the clients' desires for personalization. In this context, variability management is a very important issue for re-engineering projects of avionic softwares. In this thesis, we propose a new approach, based on formal concept analysis and semantic web, to support variability management. The first goal of this research is to identify characteristic behaviors and interactions of configuration variables in a dynamically configured system. To identify such elements, we used formal concept analysis on different levels of abstractions in the system and defined new metrics. Then, we built a classification for the configuration variables and their relations in order to enable a quick identification of a variable's behavior in the system. This classification could help finding a systematic approach to process variables during a re-engineering operation, depending on their category. To have a better understanding of the system, we also studied the shared controls of code between configuration variables. A second objective of this research is to build a knowledge platform to gather the results of all the analysis performed, and to store any additional element relevant in the variability management context, for instance new results helping define re-engineering process for each of the categories. To address this goal, we built a solution based on a semantic web, defining a new ontology, very extensive and enabling to build inferences related to the evolution processes. The approach presented here is, to the best of our knowledge, the first classification of configuration variables of a dynamically configured software and an original use of documentation and variability management techniques using semantic web in the aeronautic field. The analysis performed and the final results show that formal concept analysis is a way to identify specific properties and behaviors and that semantic web is a good solution to store and explore the results. However, the use of formal concept analysis with new boolean relations, such as the link between configuration variables and files, and the definition of new inferences may be a way to draw better conclusions. The use of the same methodology with other systems would enable to validate the approach in other contexts.
In situ regeneration of bioactive coatings enabled by an evolved Staphylococcus aureus sortase A
NASA Astrophysics Data System (ADS)
Ham, Hyun Ok; Qu, Zheng; Haller, Carolyn A.; Dorr, Brent M.; Dai, Erbin; Kim, Wookhyun; Liu, David R.; Chaikof, Elliot L.
2016-04-01
Surface immobilization of bioactive molecules is a central paradigm in the design of implantable devices and biosensors with improved clinical performance capabilities. However, in vivo degradation or denaturation of surface constituents often limits the long-term performance of bioactive films. Here we demonstrate the capacity to repeatedly regenerate a covalently immobilized monomolecular thin film of bioactive molecules through a two-step stripping and recharging cycle. Reversible transpeptidation by a laboratory evolved Staphylococcus aureus sortase A (eSrtA) enabled the rapid immobilization of an anti-thrombogenic film in the presence of whole blood and permitted multiple cycles of film regeneration in vitro that preserved its biological activity. Moreover, eSrtA transpeptidation facilitated surface re-engineering of medical devices in situ after in vivo implantation through removal and restoration film constituents. These studies establish a rapid, orthogonal and reversible biochemical scheme to regenerate selective molecular constituents with the potential to extend the lifetime of bioactive films.
Software Development and Test Methodology for a Distributed Ground System
NASA Technical Reports Server (NTRS)
Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)
2002-01-01
The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The Software processes that have evolved still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Processes have evolved, highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project.
Bringing the Unidata IDV to the Cloud
NASA Astrophysics Data System (ADS)
Fisher, W. I.; Oxelson Ganter, J.
2015-12-01
Maintaining software compatibility across new computing environments and the associated underlying hardware is a common problem for software engineers and scientific programmers. While traditional software engineering provides a suite of tools and methodologies which may mitigate this issue, they are typically ignored by developers lacking a background in software engineering. Causing further problems, these methodologies are best applied at the start of project; trying to apply them to an existing, mature project can require an immense effort. Visualization software is particularly vulnerable to this problem, given the inherent dependency on particular graphics hardware and software API's. As a result of these issues, there exists a large body of software which is simultaneously critical to the scientists who are dependent upon it, and yet increasingly difficult to maintain.The solution to this problem was partially provided with the advent of Cloud Computing; Application Streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations, with little-to-no re-engineering required. When coupled with containerization technology such as Docker, we are able to easily bring the same visualization software to a desktop, a netbook, a smartphone, and the next generation of hardware, whatever it may be.Unidata has been able to harness Application Streaming to provide a tablet-compatible version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved.
1998-06-01
Public Policy Analysis and Management Vol. 5 (Connecticut: JAI Press Inc ., 1992) 20. 38 Thomas A Simcik, Reengineering the Navy Program Objective...Winston Inc .,1969. Olvey, Lee D. The Economics of National Security, Avery Publishing Group : 1984. Premchand, A., Government Budgeting And Expenditure... the current process is presented and analyzed against relevant theory on policy analysis , reengineering, and contemporary budgeting systems, in
Irma 5.2 multi-sensor signature prediction model
NASA Astrophysics Data System (ADS)
Savage, James; Coker, Charles; Thai, Bea; Aboutalib, Omar; Chow, Anthony; Yamaoka, Neil; Kim, Charles
2007-04-01
The Irma synthetic signature prediction code is being developed by the Munitions Directorate of the Air Force Research Laboratory (AFRL/MN) to facilitate the research and development of multi-sensor systems. There are over 130 users within the Department of Defense, NASA, Department of Transportation, academia, and industry. Irma began as a high-resolution, physics-based Infrared (IR) target and background signature model for tactical weapon applications and has grown to include: a laser (or active) channel (1990), improved scene generator to support correlated frame-to-frame imagery (1992), and passive IR/millimeter wave (MMW) channel for a co-registered active/passive IR/MMW model (1994). Irma version 5.0 was released in 2000 and encompassed several upgrades to both the physical models and software; host support was expanded to Windows, Linux, Solaris, and SGI Irix platforms. In 2005, version 5.1 was released after an extensive verification and validation of an upgraded and reengineered active channel. Since 2005, the reengineering effort has focused on the Irma passive channel. Field measurements for the validation effort include the unpolarized data collection. Irma 5.2 is scheduled for release in the summer of 2007. This paper will report the validation test results of the Irma passive models and discuss the new features in Irma 5.2.
Kohlberg, Gavriel D.; Mancuso, Dean M.; Chari, Divya A.; Lalwani, Anil K.
2015-01-01
Objective. Enjoyment of music remains an elusive goal following cochlear implantation. We test the hypothesis that reengineering music to reduce its complexity can enhance the listening experience for the cochlear implant (CI) listener. Methods. Normal hearing (NH) adults (N = 16) and CI listeners (N = 9) evaluated a piece of country music on three enjoyment modalities: pleasantness, musicality, and naturalness. Participants listened to the original version along with 20 modified, less complex, versions created by including subsets of the musical instruments from the original song. NH participants listened to the segments both with and without CI simulation processing. Results. Compared to the original song, modified versions containing only 1–3 instruments were less enjoyable to the NH listeners but more enjoyable to the CI listeners and the NH listeners with CI simulation. Excluding vocals and including rhythmic instruments improved enjoyment for NH listeners with CI simulation but made no difference for CI listeners. Conclusions. Reengineering a piece of music to reduce its complexity has the potential to enhance music enjoyment for the cochlear implantee. Thus, in addition to improvements in software and hardware, engineering music specifically for the CI listener may be an alternative means to enhance their listening experience. PMID:26543322
Re-engineering the Multimission Command System at the Jet Propulsion Laboratory
NASA Technical Reports Server (NTRS)
Alexander, Scott; Biesiadecki, Jeff; Cox, Nagin; Murphy, Susan C.; Reeve, Tim
1994-01-01
The Operations Engineering Lab (OEL) at JPL has developed the multimission command system as part of JPL's Advanced Multimission Operations System. The command system provides an advanced multimission environment for secure, concurrent commanding of multiple spacecraft. The command functions include real-time command generation, command translation and radiation, status reporting, some remote control of Deep Space Network antenna functions, and command file management. The mission-independent architecture has allowed easy adaptation to new flight projects and the system currently supports all JPL planetary missions (Voyager, Galileo, Magellan, Ulysses, Mars Pathfinder, and CASSINI). This paper will discuss the design and implementation of the command software, especially trade-offs and lessons learned from practical operational use. The lessons learned have resulted in a re-engineering of the command system, especially in its user interface and new automation capabilities. The redesign has allowed streamlining of command operations with significant improvements in productivity and ease of use. In addition, the new system has provided a command capability that works equally well for real-time operations and within a spacecraft testbed. This paper will also discuss new development work including a multimission command database toolkit, a universal command translator for sequencing and real-time commands, and incorporation of telecommand capabilities for new missions.
IDC Re-Engineering Phase 3 Development Plan.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harris, James M.; Burns, John F.; Pollock, David L.
2017-01-01
Sandia National Laboratories has prepared a project development plan that proposes how the parties interested in the IDC Re-Engineering system will coordinate its development, testing and transition to operations.
Technologies and problems of reengineering of the business processes of company
NASA Astrophysics Data System (ADS)
Silka, Dmitriy
2017-10-01
Management of the combination of business processes is a modern approach in the field of business management. Together with a lot of management approaches business processes allow us to identify all the resultant actions. Article reveals the modern view on the essence of business processes as well as the general approaches of their allocation. Principles of construction and business process re-engineering are proposed. Recommendations on how to perform re-engineering under high cyclic dynamics of business activity are provided.
Hubble Space Telescope: cost reduction by re-engineering telemetry processing and archiving
NASA Astrophysics Data System (ADS)
Miebach, Manfred P.
1998-05-01
The Hubble Space Telescope (HST), the first of NASA's Great Observatories, was launched on April 24, 1990. The HST was designed for a minimum fifteen-year mission with on-orbit servicing by the Space Shuttle System planned at approximately three-year intervals. Major changes to the HST ground system are planned to be in place for the third servicing mission in December 1999. The primary objectives of the ground system reengineering effort, a project called 'vision December 1999. The primary objectives of the ground system re-engineering effort, a project called 'vision 2000 control center systems (CCS)', are to reduce both development and operating costs significantly for the remaining years of HST's lifetime. Development costs will be reduced by providing a modern hardware and software architecture and utilizing commercial of f the shelf (COTS) products wherever possible. Operating costs will be reduced by eliminating redundant legacy systems and processes and by providing an integrated ground system geared toward autonomous operation. Part of CCS is a Space Telescope Engineering Data Store, the design of which is based on current Data Warehouse technology. The purpose of this data store is to provide a common data source of telemetry data for all HST subsystems. This data store will become the engineering data archive and will include a queryable database for the user to analyze HST telemetry. The access to the engineering data in the Data Warehouse is platform- independent from an office environment using commercial standards. Latest internet technology is used to reach the HST engineering community. A WEB-based user interface allows easy access to the data archives. This paper will provide a high level overview of the CCS system and will illustrate some of the CCS telemetry capabilities. Samples of CCS user interface pages will be given. Vision 2000 is an ambitious project, but one that is well under way. It will allow the HST program to realize reduced operations costs for the Third Servicing Mission and beyond.
A Practical Software Architecture for Virtual Universities
ERIC Educational Resources Information Center
Xiang, Peifeng; Shi, Yuanchun; Qin, Weijun
2006-01-01
This article introduces a practical software architecture called CUBES, which focuses on system integration and evolvement for online virtual universities. The key of CUBES is a supporting platform that helps to integrate and evolve heterogeneous educational applications developed by different organizations. Both standardized educational…
Curtis, Andrew J; Mills, Jacqueline W; Leitner, Michael
2006-10-10
Geographic Information Systems (GIS) can provide valuable insight into patterns of human activity. Online spatial display applications, such as Google Earth, can democratise this information by disseminating it to the general public. Although this is a generally positive advance for society, there is a legitimate concern involving the disclosure of confidential information through spatial display. Although guidelines exist for aggregated data, little has been written concerning the display of point level information. The concern is that a map containing points representing cases of cancer or an infectious disease, could be re-engineered back to identify an actual residence. This risk is investigated using point mortality locations from Hurricane Katrina re-engineered from a map published in the Baton Rouge Advocate newspaper, and a field team validating these residences using search and rescue building markings. We show that the residence of an individual, visualized as a generalized point covering approximately one and half city blocks on a map, can be re-engineered back to identify the actual house location, or at least a close neighbour, even if the map contains little spatial reference information. The degree of re-engineering success is also shown to depend on the urban characteristic of the neighborhood. The results in this paper suggest a need to re-evaluate current guidelines for the display of point (address level) data. Examples of other point maps displaying health data extracted from the academic literature are presented where a similar re-engineering approach might cause concern with respect to violating confidentiality. More research is also needed into the role urban structure plays in the accuracy of re-engineering. We suggest that health and spatial scientists should be proactive and suggest a series of point level spatial confidentiality guidelines before governmental decisions are made which may be reactionary toward the threat of revealing confidential information, thereby imposing draconian limits on research using a GIS.
Curtis, Andrew J; Mills, Jacqueline W; Leitner, Michael
2006-01-01
Background Geographic Information Systems (GIS) can provide valuable insight into patterns of human activity. Online spatial display applications, such as Google Earth, can democratise this information by disseminating it to the general public. Although this is a generally positive advance for society, there is a legitimate concern involving the disclosure of confidential information through spatial display. Although guidelines exist for aggregated data, little has been written concerning the display of point level information. The concern is that a map containing points representing cases of cancer or an infectious disease, could be re-engineered back to identify an actual residence. This risk is investigated using point mortality locations from Hurricane Katrina re-engineered from a map published in the Baton Rouge Advocate newspaper, and a field team validating these residences using search and rescue building markings. Results We show that the residence of an individual, visualized as a generalized point covering approximately one and half city blocks on a map, can be re-engineered back to identify the actual house location, or at least a close neighbour, even if the map contains little spatial reference information. The degree of re-engineering success is also shown to depend on the urban characteristic of the neighborhood. Conclusion The results in this paper suggest a need to re-evaluate current guidelines for the display of point (address level) data. Examples of other point maps displaying health data extracted from the academic literature are presented where a similar re-engineering approach might cause concern with respect to violating confidentiality. More research is also needed into the role urban structure plays in the accuracy of re-engineering. We suggest that health and spatial scientists should be proactive and suggest a series of point level spatial confidentiality guidelines before governmental decisions are made which may be reactionary toward the threat of revealing confidential information, thereby imposing draconian limits on research using a GIS. PMID:17032448
An ERP Implementation and Business Process Reengineering at a Small University.
ERIC Educational Resources Information Center
Yakovlev, Ilya V.
2002-01-01
Describes the reengineering of business practices that took place at the University of Wisconsin-Superior when they implemented an enterprise resource planning (ERP) system, PeopleSoft Student Administration (SA). Discusses lessons learned. (EV)
Walston, S L; Kimberly, J R; Burns, L R
2001-06-01
In the recent past, a number of managerial innovations--including product line management, total quality management, and reengineering--have swept through the hospital industry. Given their pervasiveness and their cost, understanding the mix of factors that influences their adoption is of theoretical interest and practical relevance. The research reported here focuses on this general question by examining influences on the adoption and extensiveness of a particular managerial innovation, hospital reengineering. The results suggest that while economic and institutional factors have influenced the adoption and extensiveness of hospital reengineering, institutional forces play a more important role. The greater influence of institutional forces may be attributed to the high degree of uncertainty in health care, the causal ambiguity of the innovation, and the anticipatory actions of hospitals attempting to position themselves in a rapidly changing environment.
Software framework for the upcoming MMT Observatory primary mirror re-aluminization
NASA Astrophysics Data System (ADS)
Gibson, J. Duane; Clark, Dusty; Porter, Dallan
2014-07-01
Details of the software framework for the upcoming in-situ re-aluminization of the 6.5m MMT Observatory (MMTO) primary mirror are presented. This framework includes: 1) a centralized key-value store and data structure server for data exchange between software modules, 2) a newly developed hardware-software interface for faster data sampling and better hardware control, 3) automated control algorithms that are based upon empirical testing, modeling, and simulation of the aluminization process, 4) re-engineered graphical user interfaces (GUI's) that use state-of-the-art web technologies, and 5) redundant relational databases for data logging. Redesign of the software framework has several objectives: 1) automated process control to provide more consistent and uniform mirror coatings, 2) optional manual control of the aluminization process, 3) modular design to allow flexibility in process control and software implementation, 4) faster data sampling and logging rates to better characterize the approximately 100-second aluminization event, and 5) synchronized "real-time" web application GUI's to provide all users with exactly the same data. The framework has been implemented as four modules interconnected by a data store/server. The four modules are integrated into two Linux system services that start automatically at boot-time and remain running at all times. Performance of the software framework is assessed through extensive testing within 2.0 meter and smaller coating chambers at the Sunnyside Test Facility. The redesigned software framework helps ensure that a better performing and longer lasting coating will be achieved during the re-aluminization of the MMTO primary mirror.
Investigating interoperability of the LSST data management software stack with Astropy
NASA Astrophysics Data System (ADS)
Jenness, Tim; Bosch, James; Owen, Russell; Parejko, John; Sick, Jonathan; Swinbank, John; de Val-Borro, Miguel; Dubois-Felsmann, Gregory; Lim, K.-T.; Lupton, Robert H.; Schellart, Pim; Krughoff, K. S.; Tollerud, Erik J.
2016-07-01
The Large Synoptic Survey Telescope (LSST) will be an 8.4m optical survey telescope sited in Chile and capable of imaging the entire sky twice a week. The data rate of approximately 15TB per night and the requirements to both issue alerts on transient sources within 60 seconds of observing and create annual data releases means that automated data management systems and data processing pipelines are a key deliverable of the LSST construction project. The LSST data management software has been in development since 2004 and is based on a C++ core with a Python control layer. The software consists of nearly a quarter of a million lines of code covering the system from fundamental WCS and table libraries to pipeline environments and distributed process execution. The Astropy project began in 2011 as an attempt to bring together disparate open source Python projects and build a core standard infrastructure that can be used and built upon by the astronomy community. This project has been phenomenally successful in the years since it has begun and has grown to be the de facto standard for Python software in astronomy. Astropy brings with it considerable expectations from the community on how astronomy Python software should be developed and it is clear that by the time LSST is fully operational in the 2020s many of the prospective users of the LSST software stack will expect it to be fully interoperable with Astropy. In this paper we describe the overlap between the LSST science pipeline software and Astropy software and investigate areas where the LSST software provides new functionality. We also discuss the possibilities of re-engineering the LSST science pipeline software to build upon Astropy, including the option of contributing affliated packages.
Department of Defense Enterprise Requirements and Acquisition Model
2011-06-01
Laboratories’ Center for Reengineering and Enabling Technology ( CRET ) to develop the APM. The APM is a compilation of policy, instructions, and...CDD Capabilities Development Document CBA Capabilities-Based Assessment CPD Capability Production Document CRET Center for Re-engineering
Viceconti, M; Testi, D; Gori, R; Zannoni, C
2000-01-01
The present work describes a technology transfer project called HIPCOM devoted to the re-engineering of the process used by a medical devices manufacturer to design custom-made hip prostheses. Although it started with insufficient support from the end-user management, a very tight scheduling and a moderate budget, the project developed into what is considered by all partners a success story. In particular, the development of the design software, called HIPCOM Interactive Design Environment (HIDE) was completed in a time shorter than any optimistic expectation. The software was quite stable since its first beta version, and once introduced at the user site it fully replaced the original procedure in less than two months. One year after the early adoption, more than 80 custom-made prostheses had been designed with HIDE and the user had reported only two bugs, both cosmetics. The scope of the present work was to report the development experience and to investigate the reasons for these positive results, with particular reference to the development procedure and the software architecture. The choice of TCL/TK as development language and the adoption of well-defined software architecture were found to be the success key factors. Other important determinants were found to be the adoption of an incremental software engineering strategy, well suited for small to medium projects and the presence in the development staff of a technology transfer expert.
IDC Reengineering Iteration I2 Architectural Prototype Reports
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamlet, Benjamin R.
To fulfill the inception phase deliverable “Demonstration of architectural prototype“ the SNL IDC Reengineering project team is providing seven reports describing system prototyping work completed between October 2012 and October 2014as part of the SNL US NDC Modernization project.
How TSPSM Implementation has Evolved at AV-8B
2008-05-01
How TSPSM Implementation Has Evolved at AV-8B Chris Rickets AV-8B Joint Systems/Software Support Activity (JSSA) Brad Hodgins NAVAIR Systems...members need to be involved) NAVAIR Software/Systems Support Center (NSSC) Slide 40 Contact Information • Chris Rickets (AV-8B Sr. Software Engr...phone: (760) 939-5838 – e-mail: chris rickets @navy mil . . • Brad Hodgins (NAVAIR TSP Coach supporting AV-8B ) – phone: (760) 939-0666/4446 – e
Improving Software Engineering on NASA Projects
NASA Technical Reports Server (NTRS)
Crumbley, Tim; Kelly, John C.
2010-01-01
Software Engineering Initiative: Reduces risk of software failure -Increases mission safety. More predictable software cost estimates and delivery schedules. Smarter buyer of contracted out software. More defects found and removed earlier. Reduces duplication of efforts between projects. Increases ability to meet the challenges of evolving software technology.
A business process modeling experience in a complex information system re-engineering.
Bernonville, Stéphanie; Vantourout, Corinne; Fendeler, Geneviève; Beuscart, Régis
2013-01-01
This article aims to share a business process modeling experience in a re-engineering project of a medical records department in a 2,965-bed hospital. It presents the modeling strategy, an extract of the results and the feedback experience.
IDC System Specification Document.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clifford, David J.
2014-12-01
This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Reengineering Phase 2 project. Revisions Version Date Author/Team Revision Description Authorized by V1.0 12/2014 IDC Reengineering Project Team Initial delivery M. Harris
The Business Practices Course: Self-Study Learning Reengineered.
ERIC Educational Resources Information Center
Acovelli, Marianne; Nowakowski, Alan
1994-01-01
Describes an interactive multimedia Business Practices Course, developed to help reengineer point-of-need training for professional development, in which the learner conducts a business review of a company's current operations and makes recommendations to improve productivity and increase customer satisfaction. (six references) (LRW)
Johnson, E
1995-01-01
The author describes reengineering efforts in the laboratory of a 550-bed hospital. Key benefits include reduced costs, improved turnaround time, and redirection of staff into new roles in information management and outreach.
Managing hospital supplies: process reengineering at Gujarat Cancer Research Institute, India.
Ramani, K V
2006-01-01
Aims to give an overview of the re-engineering of processes and structures at Gujarat Cancer Research Institute (GCRI), Ahmedabad. A general review of the design, development and implementation of reengineered systems in order to address concerns about the existing systems. Findings GCRI is a comprehensive cancer care center with 550 beds and well equipped with modern diagnostic and treatment facilities. It serves about 200,000 outpatients and 16,000 inpatients annually. The approach to a better management of hospital supplies led to the design, development, and implementation of an IT-based reengineered and integrated purchase and inventory management system. The new system has given GCRI a saving of about 8 percent of its annual costs of purchases, and improved the availability of materials to the user departments. Shows that the savings obtained are used not only for buying more hospital supplies, but also to buy better quality of hospital supplies, and thereby satisfactorily address the GCRI responsibility towards meeting its social obligations for cancer care.
The role of the hospital in a changing environment.
McKee, M.; Healy, J.
2000-01-01
Hospitals pose many challenges to those undertaking reform of health care systems. This paper examines the evolving role of the hospital within the health care system in industrialized countries and explores the evidence on which policy-makers might base their decisions. It begins by tracing the evolving concept of the hospital, concluding that hospitals must continue to evolve in response to factors such as changing health care needs and emerging technologies. The size and distribution of hospitals are matters for ongoing debate. This paper concludes that evidence in favour of concentrating hospital facilities, whether as a means of enhancing effectiveness or efficiency, is less robust than is often assumed. Noting that care provided in hospitals is often less than satisfactory, this paper summarizes the evidence underlying three reform strategies: (i) behavioural interventions such as quality assurance programmes; (ii) changing organizational culture; and (iii) the use of financial incentives. Isolated behavioural interventions have a limited impact, but are more effective when combined. Financial incentives are blunt instruments that must be monitored. Organizational culture, which has previously received relatively little attention, appears to be an important determinant of quality of care and is threatened by ill-considered policies intended to 're-engineer' hospital services. Overall, evidence on the effectiveness of policies relating to hospitals is limited and this paper indicates where such evidence can be found. PMID:10916917
Software Design Methodology Migration for a Distributed Ground System
NASA Technical Reports Server (NTRS)
Ritter, George; McNair, Ann R. (Technical Monitor)
2002-01-01
The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has been developed and has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes. The new Software processes still emphasize requirements capture, software configuration management, design documenting, and making sure the products that have been developed are accountable to initial requirements. This paper will give an overview of how the Software Process have evolved highlighting the positives as well as the negatives. In addition, we will mention the COTS tools that have been integrated into the processes and how the COTS have provided value to the project .
Irma 5.1 multisensor signature prediction model
NASA Astrophysics Data System (ADS)
Savage, James; Coker, Charles; Edwards, Dave; Thai, Bea; Aboutalib, Omar; Chow, Anthony; Yamaoka, Neil; Kim, Charles
2006-05-01
The Irma synthetic signature prediction code is being developed to facilitate the research and development of multi-sensor systems. Irma was one of the first high resolution, physics-based Infrared (IR) target and background signature models to be developed for tactical weapon applications. Originally developed in 1980 by the Munitions Directorate of the Air Force Research Laboratory (AFRL/MN), the Irma model was used exclusively to generate IR scenes. In 1988, a number of significant upgrades to Irma were initiated including the addition of a laser (or active) channel. This two-channel version was released to the user community in 1990. In 1992, an improved scene generator was incorporated into the Irma model, which supported correlated frame-to-frame imagery. A passive IR/millimeter wave (MMW) code was completed in 1994. This served as the cornerstone for the development of the co-registered active/passive IR/MMW model, Irma 4.0. In 2000, Irma version 5.0 was released which encompassed several upgrades to both the physical models and software. Circular polarization was added to the passive channel, and a Doppler capability was added to the active MMW channel. In 2002, the multibounce technique was added to the Irma passive channel. In the ladar channel, a user-friendly Ladar Sensor Assistant (LSA) was incorporated which provides capability and flexibility for sensor modeling. Irma 5.0 runs on several platforms including Windows, Linux, Solaris, and SGI Irix. Irma is currently used to support a number of civilian and military applications. The Irma user base includes over 130 agencies within the Air Force, Army, Navy, DARPA, NASA, Department of Transportation, academia, and industry. In 2005, Irma version 5.1 was released to the community. In addition to upgrading the Ladar channel code to an object oriented language (C++) and providing a new graphical user interface to construct scenes, this new release significantly improves the modeling of the ladar channel and includes polarization effects, time jittering, speckle effect, and atmospheric turbulence. More importantly, the Munitions Directorate has funded three field tests to verify and validate the re-engineered ladar channel. Each of the field tests was comprehensive and included one month of sensor characterization and a week of data collection. After each field test, the analysis included comparisons of Irma predicted signatures with measured signatures, and if necessary, refining the model to produce realistic imagery. This paper will focus on two areas of the Irma 5.1 development effort: report on the analysis results of the validation and verification of the Irma 5.1 ladar channel, and the software development plan and validation efforts of the Irma passive channel. As scheduled, the Irma passive code is being re-engineered using object oriented language (C++), and field data collection is being conducted to validate the re-engineered passive code. This software upgrade will remove many constraints and limitations of the legacy code including limits on image size and facet counts. The field test to validate the passive channel is expected to be complete in the second quarter of 2006.
ERIC Educational Resources Information Center
Mungaray-Lagarda, Alejandro
2002-01-01
Explaining the Mexico has faced deep economic and social changes over the last decade, explores the changes or re-engineering needed in the country's higher education institutions to improve institutional efficiency through greater linkages with local economic and social development. (EV)
IDC Re-Engineering Phase 2 Glossary Version 1.3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Young, Christopher J.; Harris, James M.
2017-01-01
This document contains the glossary of terms used for the IDC Re-Engineering Phase 2 project. This version was created for Iteration E3. The IDC applies automatic processing methods in order to produce, archive, and distribute standard IDC products on behalf of all States Parties.
Reengineering a cardiovascular surgery service.
Tunick, P A; Etkin, S; Horrocks, A; Jeglinski, G; Kelly, J; Sutton, P
1997-04-01
Reengineering, involving the radical redesign of business processes, has been used successfully in a variety of health care settings. In 1994 New York University (NYU) Medical Center (MC) launched its first reengineering team, whose purpose was to redesign the entire process of caring for patients-from referral to discharge-on the cardiovascular (CV) surgery service. REENIGINEERING TEAM: The multidisciplinary CV Surgery Reengineering Team was charged with two goals: improving customer (patient, family, and referring physician) satisfaction and improving profitability. The methodology to be used was based on a reengineering philosophy-discarding basic assumptions and designing the patient care process from the ground up. THE TRANSFER-IN INITIATIVE: A survey of NYU cardiologists, distributed in April 1994, suggested that the organization was considered a difficult place to transfer patients. The team's recommendations led to a new, streamlined transfer-in policy. The average waiting time from when a referring physician requested a patient transfer and the time when an NYUMC physician accepted the transfer decreased from an average of 9 hours under the old system to immediate acceptance. Three customer satisfaction task forces implemented multiple programs to make the service more user friendly. In addition, referrals increased and length of stay decreased, without an adverse impact on the mortality rate. For the first time at NYUMC, a multidisciplinary team was given the mandate to achieve major changes in an entire patient care process. Similar projects are now underway.
Reengineering the Acquisition/Procurement Process: A Methodology for Requirements Collection
NASA Technical Reports Server (NTRS)
Taylor, Randall; Vanek, Thomas
2011-01-01
This paper captures the systematic approach taken by JPL's Acquisition Reengineering Project team, the methodology used, challenges faced, and lessons learned. It provides pragmatic "how-to" techniques and tools for collecting requirements and for identifying areas of improvement in an acquisition/procurement process or other core process of interest.
ERIC Educational Resources Information Center
Jeyaraj, Anand
2010-01-01
The design of enterprise information systems requires students to master technical skills for elicitation, modeling, and reengineering business processes as well as soft skills for information gathering and communication. These tacit skills and behaviors cannot be effectively taught students but rather experienced and learned by students. This…
As Easy as ABC: Re-engineering the Cost Accounting System.
ERIC Educational Resources Information Center
Trussel, John M.; Bitner, Larry N.
1996-01-01
To be useful for management decision making, the college or university's cost accounting system must capture and measure improvements. Activity-based costing (ABC), which determines more accurately the full costs of services and products, tracks improvements and should proceed alongside reengineering of institutional accounting. Guidelines are…
Lessons Learned in Process Reengineering at a Community College.
ERIC Educational Resources Information Center
Jaacks, Gayle E.; Kurtz, Michael
1999-01-01
Summarizes the successful reengineering of business processes to take full advantage of new functionality in a vendor system upgrade at Western Iowa Tech Community College. Suggests that to truly benefit from implementing new systems or major system upgrades, an institution must streamline processes, eliminate duplication of effort, and examine…
Reshaping the Enterprise through an Information Architecture and Process Reengineering.
ERIC Educational Resources Information Center
Laudato, Nicholas C.; DeSantis, Dennis J.
1995-01-01
The approach used by the University of Pittsburgh (Pennsylvania) in designing a campus-wide information architecture and a framework for reengineering the business process included building consensus on a general philosophy for information systems, using pattern-based abstraction techniques, applying data modeling and application prototyping, and…
A Performance Evaluation of the Cray X1 for Scientific Applications
NASA Technical Reports Server (NTRS)
Oliker, Leonid; Biswas, Rupak; Borrill, Julian; Canning, Andrew; Carter, Jonathan; Djomehri, M. Jahed; Shan, Hongzhang; Skinner, David
2004-01-01
The last decade has witnessed a rapid proliferation of superscalar cache-based microprocessors to build high-end capability and cost effectiveness. However, the recent development of massively parallel vector systems is having a significant effect on the supercomputing landscape. In this paper, we compare the performance of the recently released Cray X1 vector system with that of the cacheless NEC SX-6 vector machine, and the superscalar cache-based IBM Power3 and Power4 architectures for scientific applications. Overall results demonstrate that the X1 is quite promising, but performance improvements are expected as the hardware, systems software, and numerical libraries mature. Code reengineering to effectively utilize the complex architecture may also lead to significant efficiency enhancements.
Monitoring software development through dynamic variables
NASA Technical Reports Server (NTRS)
Doerflinger, Carl W.; Basili, Victor R.
1983-01-01
Research conducted by the Software Engineering Laboratory (SEL) on the use of dynamic variables as a tool to monitor software development is described. Project independent measures which may be used in a management tool for monitoring software development are identified. Several FORTRAN projects with similar profiles are examined. The staff was experienced in developing these types of projects. The projects developed serve similar functions. Because these projects are similar some underlying relationships exist that are invariant between projects. These relationships, once well defined, may be used to compare the development of different projects to determine whether they are evolving the same way previous projects in this environment evolved.
A UML-based metamodel for software evolution process
NASA Astrophysics Data System (ADS)
Jiang, Zuo; Zhou, Wei-Hong; Fu, Zhi-Tao; Xiong, Shun-Qing
2014-04-01
A software evolution process is a set of interrelated software processes under which the corresponding software is evolving. An object-oriented software evolution process meta-model (OO-EPMM), abstract syntax and formal OCL constraint of meta-model are presented in this paper. OO-EPMM can not only represent software development process, but also represent software evolution.
Towards a high performance geometry library for particle-detector simulations
Apostolakis, J.; Bandieramonte, M.; Bitzes, G.; ...
2015-05-22
Thread-parallelization and single-instruction multiple data (SIMD) ”vectorisation” of software components in HEP computing has become a necessity to fully benefit from current and future computing hardware. In this context, the Geant-Vector/GPU simulation project aims to re-engineer current software for the simulation of the passage of particles through detectors in order to increase the overall event throughput. As one of the core modules in this area, the geometry library plays a central role and vectorising its algorithms will be one of the cornerstones towards achieving good CPU performance. Here, we report on the progress made in vectorising the shape primitives, asmore » well as in applying new C++ template based optimizations of existing code available in the Geant4, ROOT or USolids geometry libraries. We will focus on a presentation of our software development approach that aims to provide optimized code for all use cases of the library (e.g., single particle and many-particle APIs) and to support different architectures (CPU and GPU) while keeping the code base small, manageable and maintainable. We report on a generic and templated C++ geometry library as a continuation of the AIDA USolids project. As a result, the experience gained with these developments will be beneficial to other parts of the simulation software, such as for the optimization of the physics library, and possibly to other parts of the experiment software stack, such as reconstruction and analysis.« less
Towards a high performance geometry library for particle-detector simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Apostolakis, J.; Bandieramonte, M.; Bitzes, G.
Thread-parallelization and single-instruction multiple data (SIMD) ”vectorisation” of software components in HEP computing has become a necessity to fully benefit from current and future computing hardware. In this context, the Geant-Vector/GPU simulation project aims to re-engineer current software for the simulation of the passage of particles through detectors in order to increase the overall event throughput. As one of the core modules in this area, the geometry library plays a central role and vectorising its algorithms will be one of the cornerstones towards achieving good CPU performance. Here, we report on the progress made in vectorising the shape primitives, asmore » well as in applying new C++ template based optimizations of existing code available in the Geant4, ROOT or USolids geometry libraries. We will focus on a presentation of our software development approach that aims to provide optimized code for all use cases of the library (e.g., single particle and many-particle APIs) and to support different architectures (CPU and GPU) while keeping the code base small, manageable and maintainable. We report on a generic and templated C++ geometry library as a continuation of the AIDA USolids project. As a result, the experience gained with these developments will be beneficial to other parts of the simulation software, such as for the optimization of the physics library, and possibly to other parts of the experiment software stack, such as reconstruction and analysis.« less
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-04
... Assessment Monitoring Stations (PAMS) Network Re-engineering project. DATES: The public teleconference [email protected] . General information concerning the EPA CASAC can be found on the EPA Web site at http://www... Review of EPA's Photochemical Assessment Monitoring Stations (PAMS) Network Re-engineering project. The...
ERIC Educational Resources Information Center
Seabra, Antonio C.; Consonni, Denise
Brazilian engineering schools are under a strict program to reengineer their courses with the financial support of the federal agencies. At the electronic engineering department at the University of Sao Paulo, this process started by modifying the Basic Electricity and Electronic Laboratories. This paper describes the new structure of these labs…
ERIC Educational Resources Information Center
Freifeld, Lorri
2010-01-01
With technology changing every second of every day, it is no surprise a learning management system (LMS) quickly can become outdated. But it is no easy task to re-engineer a current LMS or find exactly the right new one to purchase. In this article, three 2010 Top Young Trainers share their experiences with implementing or re-engineering an…
Solving the Software Legacy Problem with RISA
NASA Astrophysics Data System (ADS)
Ibarra, A.; Gabriel, C.
2012-09-01
Nowadays hardware and system infrastructure evolve on time scales much shorter than the typical duration of space astronomy missions. Data processing software capabilities have to evolve to preserve the scientific return during the entire experiment life time. Software preservation is a key issue that has to be tackled before the end of the project to keep the data usable over many years. We present RISA (Remote Interface to Science Analysis) as a solution to decouple data processing software and infrastructure life-cycles, using JAVA applications and web-services wrappers to existing software. This architecture employs embedded SAS in virtual machines assuring a homogeneous job execution environment. We will also present the first studies to reactivate the data processing software of the EXOSAT mission, the first ESA X-ray astronomy mission launched in 1983, using the generic RISA approach.
ERIC Educational Resources Information Center
Pellerin, Robert; Hadaya, Pierre
2008-01-01
Recognizing the need to teach ERP implementation and business process reengineering (BPR) concepts simultaneously, as well as the pedagogical limitations of the case teaching method and simulation tools, the objective of this study is to propose a new framework and an innovative teaching approach to improve the ERP training experience for IS…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-05
... draft plans for Photochemical Assessment Monitoring Stations (PAMS) Network Re-engineering. DATES: A... information concerning the EPA CASAC can be found at the EPA CASAC Web site at http://www.epa.gov/casac . Any inquiry regarding EPA's draft plans for PAMS Network Re-engineering should be directed to Mr. Kevin...
Downsizing, reengineering and patient safety: numbers, newness and resultant risk.
Knox, G E; Kelley, M; Hodgson, S; Simpson, K R; Carrier, L; Berry, D
1999-01-01
Downsizing and reengineering are facts of life in contemporary healthcare organizations. In most instances, these organizational changes are undertaken in an attempt to increase productivity or cut operational costs with results measured in these terms. Less often considered are potential detrimental effects on patient safety or strategies, which might be used to minimize these risks.
Reforms in Education: The Need for Re-Engineering Teacher Education for Sustainable Development
ERIC Educational Resources Information Center
Ofoego, O. C.; Ebebe, I. E.
2016-01-01
The paper is concerned with reforms in Education and the need for re-engineering Teacher education in Nigeria for better professionalism and National Development. In the process, key concepts like Teacher Education and professionalism were explained. A brief review of the state of Teacher Education and Development in Nigeria revealed the…
ERIC Educational Resources Information Center
Abdous, M'hammed; He, Wu
2009-01-01
During the past three years, we have developed and implemented an enterprise information system (EIS) to reengineer and facilitate the administrative process for preparing and teaching distance learning courses in a midsized-to-large university (with 23,000 students). The outcome of the implementation has been a streamlined and efficient process…
Software metrics: Software quality metrics for distributed systems. [reliability engineering
NASA Technical Reports Server (NTRS)
Post, J. V.
1981-01-01
Software quality metrics was extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.
Modeling Student Software Testing Processes: Attitudes, Behaviors, Interventions, and Their Effects
ERIC Educational Resources Information Center
Buffardi, Kevin John
2014-01-01
Effective software testing identifies potential bugs and helps correct them, producing more reliable and maintainable software. As software development processes have evolved, incremental testing techniques have grown in popularity, particularly with introduction of test-driven development (TDD). However, many programmers struggle to adopt TDD's…
Re-Engineering of the Hubble Space Telescope (HST) to Reduce Operational Costs
NASA Technical Reports Server (NTRS)
Garvis, Michael; Dougherty, Andrew; Whittier, Wallace
1996-01-01
Satellite telemetry processing onboard the Hubble Space Telescope (HST) is carried out using dedicated software and hardware. The current ground system is expensive to operate and maintain. The mandate to reduce satellite ground system operations and maintenance costs by the year 2000 led NASA to upgrade the command and control systems in order to improve the data processing capabilities, reduce operator experience levels and increase system standardization. As a result, a command and control system product development team was formed to redesign and develop the HST ground system. The command and control system ground system development consists of six elements. The results of the prototyping phase carried out for the following of these elements are presented: the front end processor; middleware, and the graphical user interface.
A Performance Evaluation of the Cray X1 for Scientific Applications
NASA Technical Reports Server (NTRS)
Oliker, Leonid; Biswas, Rupak; Borrill, Julian; Canning, Andrew; Carter, Jonathan; Djomehri, M. Jahed; Shan, Hongzhang; Skinner, David
2003-01-01
The last decade has witnessed a rapid proliferation of superscalar cache-based microprocessors to build high-end capability and capacity computers because of their generality, scalability, and cost effectiveness. However, the recent development of massively parallel vector systems is having a significant effect on the supercomputing landscape. In this paper, we compare the performance of the recently-released Cray X1 vector system with that of the cacheless NEC SX-6 vector machine, and the superscalar cache-based IBM Power3 and Power4 architectures for scientific applications. Overall results demonstrate that the X1 is quite promising, but performance improvements are expected as the hardware, systems software, and numerical libraries mature. Code reengineering to effectively utilize the complex architecture may also lead to significant efficiency enhancements.
Mass Spectrometry Using Nanomechanical Systems: Beyond the Point-Mass Approximation.
Sader, John E; Hanay, M Selim; Neumann, Adam P; Roukes, Michael L
2018-03-14
The mass measurement of single molecules, in real time, is performed routinely using resonant nanomechanical devices. This approach models the molecules as point particles. A recent development now allows the spatial extent (and, indeed, image) of the adsorbate to be characterized using multimode measurements ( Hanay , M. S. , Nature Nanotechnol. , 10 , 2015 , pp 339 - 344 ). This "inertial imaging" capability is achieved through virtual re-engineering of the resonator's vibrating modes, by linear superposition of their measured frequency shifts. Here, we present a complementary and simplified methodology for the analysis of these inertial imaging measurements that exhibits similar performance while streamlining implementation. This development, together with the software that we provide, enables the broad implementation of inertial imaging that opens the door to a range of novel characterization studies of nanoscale adsorbates.
2008-01-01
130, etc), the option to re-engine or place winglets on the wings of TACAIR aircraft does not exist. Bio-fuel is not an option for aviation35 and...TACAIR aircraft can not use alternative fuels, re-engine their aircraft, install winglets , or adjust their sortie lengths in an effort to reduce jet
EOS Operations Systems: EDOS Implemented Changes to Reduce Operations Costs
NASA Technical Reports Server (NTRS)
Cordier, Guy R.; Gomez-Rosa, Carlos; McLemore, Bruce D.
2007-01-01
The authors describe in this paper the progress achieved to-date with the reengineering of the Earth Observing System (EOS) Data and Operations System (EDOS), the experience gained in the process and the ensuing reduction of ground systems operations costs. The reengineering effort included a major methodology change, applying to an existing schedule driven system, a data-driven system approach.
ERIC Educational Resources Information Center
Seidman, Robert
As part of an effort to reduce the costs of attending college, in 1996 New Hampshire College began developing a three-year bachelor's degree program in Business Administration with the goal of reengineering the existing four-year, eight-semester program into a competency- and outcomes-based, team-taught, cohort-based, interdisciplinary, and…
Reengineering of waste management at the Oak Ridge National Laboratory. Volume 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Myrick, T.E.
1997-08-01
A reengineering evaluation of the waste management program at the Oak Ridge National Laboratory (ORNL) was conducted during the months of February through July 1997. The goal of the reengineering was to identify ways in which the waste management process could be streamlined and improved to reduce costs while maintaining full compliance and customer satisfaction. A Core Team conducted preliminary evaluations and determined that eight particular aspects of the ORNL waste management program warranted focused investigations during the reengineering. The eight areas included Pollution Prevention, Waste Characterization, Waste Certification/Verification, Hazardous/Mixed Waste Stream, Generator/WM Teaming, Reporting/Records, Disposal End Points, and On-Sitemore » Treatment/Storage. The Core Team commissioned and assembled Process Teams to conduct in-depth evaluations of each of these eight areas. The Core Team then evaluated the Process Team results and consolidated the 80 process-specific recommendations into 15 overall recommendations. Benchmarking of a commercial nuclear facility, a commercial research facility, and a DOE research facility was conducted to both validate the efficacy of these findings and seek additional ideas for improvement. The outcome of this evaluation is represented by the 15 final recommendations that are described in this report.« less
Reinvention/reengineering of business and technical processes
NASA Technical Reports Server (NTRS)
Olsen, Eugene A.
1996-01-01
The changing marketplace as evidenced by global competition is requiring American organizations to rethink, regroup, and redesign their processes. The umbrella of total quality management (TQM) includes many quality methods, techniques, tools, and approaches. There is no right way for every situation or circumstance. Adaptability and experimentation of several tools is necessary. Process management when properly applied can lead to continuous quality improvements. But some processes simply need to be discarded and new ones developed. This reengineering often results in vertical compression and job redesign and restructuring. Work activities must be designed around processes, not processes around work activities. Reengineering and process management do not stand alone--they support each other. Senior executive leadership and empowerment of workers at all organizational levels is vital for both short-term and long-term success.
NASA Technical Reports Server (NTRS)
Ferguson, Roscoe C.
2011-01-01
As a result of recommendation from the Augustine Panel, the direction for Human Space Flight has been altered from the original plan referred to as Constellation. NASA s Human Exploration Framework Team (HEFT) proposes the use of a Shuttle Derived Heavy Lift Launch Vehicle (SDLV) and an Orion derived spacecraft (salvaged from Constellation) to support a new flexible direction for space exploration. The SDLV must be developed within an environment of a constrained budget and a preferred fast development schedule. Thus, it has been proposed to utilize existing assets from the Shuttle Program to speed development at a lower cost. These existing assets should not only include structures such as external tanks or solid rockets, but also the Flight Software which has traditionally been a "long pole" in new development efforts. The avionics and software for the Space Shuttle was primarily developed in the 70 s and considered state of the art for that time. As one may argue that the existing avionics and flight software may be too outdated to support the new SDLV effort, this is a fallacy if they can be evolved over time into a "modern avionics" platform. The technology may be outdated, but the avionics concepts and flight software algorithms are not. The reuse of existing avionics and software also allows for the reuse of development, verification, and operations facilities. The keyword is evolve in that these assets can support the fast development of such a vehicle, but then be gradually evolved over time towards more modern platforms as budget and schedule permits. The "gold" of the flight software is the "control loop" algorithms of the vehicle. This is the Guidance, Navigation, and Control (GNC) software algorithms. This software is typically the most expensive to develop, test, and verify. Thus, the approach is to preserve the GNC flight software, while first evolving the supporting software (such as Command and Data Handling, Caution and Warning, Telemetry, etc.). This can be accomplished by gradually removing the "support software" from the legacy flight software leaving only the GNC algorithms. The "support software" could be re-developed for modern platforms, while leaving the GNC algorithms to execute on technology compatible with the legacy system. It is also possible to package the GNC algorithms into an emulated version of the original computer (via Field Programmable Gate Arrays or FPGAs), thus becoming a "GNC on a Chip" solution where it could live forever to be embedded in modern avionics platforms.
Foreign Outsourcing of the U.S. Electronics Industry
1993-04-01
There is also great concern over the success of Japan and the "Asian Tigers" (Hong Kong, Korea, Malaysia , Singapore, and Taiwan) in capturing market share...Quality Management), JIT (Just In Time) inventory, MRP (Manufacturing Resources Planning), BPR (Business Process Reengineering), and EI ( Employee ...planning, business process re-engineering, employee empowering, just to name a few. All these techniques look to address the requirement to remain
ERIC Educational Resources Information Center
Sofoluwe, Abayomi Olumade
2013-01-01
The purpose of the study is to re-engineer vocational and technical education for sustainable development in the North Central Geo-Political Zone in Nigeria. The research design adopted was a survey inferential type. Stratified random was used to select 36 schools out of 98 schools while 920 students out of 3680 students were sampled. The data…
Designing a place for automation.
Bazzoli, F
1995-05-01
Re-engineering is a hot topic in health care as market forces increase pressure to cut costs. Providers and payers that are redesigning their business processes are counting on information systems to help achieve simplification and make large gains in efficiency. But these same organizations say they're reluctant to make large upfront investments in information systems until they know exactly what role technology will play in the re-engineered entity.
RE-ENGINEERING PRIMARY HEALTHCARE NURSING AS A FIRST CAREER CHOICE.
Wheeler, Emily; Govan, Linda
2016-08-01
In line with international models and critical to the primary healthcare nursing workforce, the Australian Primary Health Care Nursing Association (APNA) has been funded by the Commonwealth Department of Health to develop an Education and Career Framework and Toolkit for primary healthcare nurses. The aim of the project is to improve the recruitment and retention of nurses and to re-engineer primary healthcare as a first choice career option.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clifford, David J.; Harris, James M.
2014-12-01
This is the IDC Re-Engineering Phase 2 project Integrated Master Plan (IMP). The IMP presents the major accomplishments planned over time to re-engineer the IDC system. The IMP and the associate Integrated Master Schedule (IMS) are used for planning, scheduling, executing, and tracking the project technical work efforts. REVISIONS Version Date Author/Team Revision Description Authorized by V1.0 12/2014 IDC Re- engineering Project Team Initial delivery M. Harris
Business process re-engineering a cardiology department.
Bakshi, Syed Murtuza Hussain
2014-01-01
The health care sector is the world's third largest industry and is facing several problems such as excessive waiting times for patients, lack of access to information, high costs of delivery and medical errors. Health care managers seek the help of process re-engineering methods to discover the best processes and to re-engineer existing processes to optimize productivity without compromising on quality. Business process re-engineering refers to the fundamental rethinking and radical redesign of business processes to achieve dramatic improvements in critical, contemporary measures of performance, such as cost, quality and speed. The present study is carried out at a tertiary care corporate hospital with 1000-plus-bed facility. A descriptive study and case study method is used with intensive, careful and complete observation of patient flow, delays, short comings in patient movement and workflow. Data is collected through observations, informal interviews and analyzed by matrix analysis. Flowcharts were drawn for the various work activities of the cardiology department including workflow of the admission process, workflow in the ward and ICCU, workflow of the patient for catheterization laboratory procedure, and in the billing and discharge process. The problems of the existing system were studied and necessary suggestions were recommended to cardiology department module with an illustrated flowchart.
Virtual Immunology: Software for Teaching Basic Immunology
ERIC Educational Resources Information Center
Berçot, Filipe Faria; Fidalgo-Neto, Antônio Augusto; Lopes, Renato Matos; Faggioni, Thais; Alves, Luiz Anastácio
2013-01-01
As immunology continues to evolve, many educational methods have found difficulty in conveying the degree of complexity inherent in its basic principles. Today, the teaching-learning process in such areas has been improved with tools such as educational software. This article introduces "Virtual Immunology," a software program available…
Automatic Molecular Design using Evolutionary Techniques
NASA Technical Reports Server (NTRS)
Globus, Al; Lawton, John; Wipke, Todd; Saini, Subhash (Technical Monitor)
1998-01-01
Molecular nanotechnology is the precise, three-dimensional control of materials and devices at the atomic scale. An important part of nanotechnology is the design of molecules for specific purposes. This paper describes early results using genetic software techniques to automatically design molecules under the control of a fitness function. The fitness function must be capable of determining which of two arbitrary molecules is better for a specific task. The software begins by generating a population of random molecules. The population is then evolved towards greater fitness by randomly combining parts of the better individuals to create new molecules. These new molecules then replace some of the worst molecules in the population. The unique aspect of our approach is that we apply genetic crossover to molecules represented by graphs, i.e., sets of atoms and the bonds that connect them. We present evidence suggesting that crossover alone, operating on graphs, can evolve any possible molecule given an appropriate fitness function and a population containing both rings and chains. Prior work evolved strings or trees that were subsequently processed to generate molecular graphs. In principle, genetic graph software should be able to evolve other graph representable systems such as circuits, transportation networks, metabolic pathways, computer networks, etc.
Study on the E-commerce platform based on the agent
NASA Astrophysics Data System (ADS)
Fu, Ruixue; Qin, Lishuan; Gao, Yinmin
2011-10-01
To solve problem of dynamic integration in e-commerce, the Multi-Agent architecture of electronic commerce platform system based on Agent and Ontology has been introduced, which includes three major types of agent, Ontology and rule collection. In this architecture, service agent and rule are used to realize the business process reengineering, the reuse of software component, and agility of the electronic commerce platform. To illustrate the architecture, a simulation work has been done and the results imply that the architecture provides a very efficient method to design and implement the flexible, distributed, open and intelligent electronic commerce platform system to solve problem of dynamic integration in ecommerce. The objective of this paper is to illustrate the architecture of electronic commerce platform system, and the approach how Agent and Ontology support the electronic commerce platform system.
A network-oriented business modeling environment
NASA Astrophysics Data System (ADS)
Bisconti, Cristian; Storelli, Davide; Totaro, Salvatore; Arigliano, Francesco; Savarino, Vincenzo; Vicari, Claudia
The development of formal models related to the organizational aspects of an enterprise is fundamental when these aspects must be re-engineered and digitalized, especially when the enterprise is involved in the dynamics and value flows of a business network. Business modeling provides an opportunity to synthesize and make business processes, business rules and the structural aspects of an organization explicit, allowing business managers to control their complexity and guide an enterprise through effective decisional and strategic activities. This chapter discusses the main results of the TEKNE project in terms of software components that enable enterprises to configure, store, search and share models of any aspects of their business while leveraging standard and business-oriented technologies and languages to bridge the gap between the world of business people and IT experts and to foster effective business-to-business collaborations.
Borrego springs microgrid demonstration project
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
SDG&E has been developing and implementing the foundation for its Smart Grid platform for three decades – beginning with its innovations in automation and control technologies in the 1980s and 1990s, through its most recent Smart Meter deployment and re-engineering of operational processes enabled by new software applications in its OpEx 20/20 (Operational Excellence with a 20/20 Vision) program. SDG&E’s Smart Grid deployment efforts have been consistently acknowledged by industry observers. SDG&E’s commitment and progress has been recognized by IDC Energy Insights and Intelligent Utility Magazine as the nation’s “Most Intelligent Utility” for three consecutive years, winning this award eachmore » year since its inception. SDG&E also received the “Top Ten Utility” award for excellence in Smart Grid development from GreenTech Media.« less
Knowledge Sharing through Pair Programming in Learning Environments: An Empirical Study
ERIC Educational Resources Information Center
Kavitha, R. K.; Ahmed, M. S.
2015-01-01
Agile software development is an iterative and incremental methodology, where solutions evolve from self-organizing, cross-functional teams. Pair programming is a type of agile software development technique where two programmers work together with one computer for developing software. This paper reports the results of the pair programming…
2006-06-01
Headquarters ( MABES TNI) for priority analysis. After that, MABES TNI submits the proposals to the DOD for procurement processes. (Republic of Indonesia... James E., Ernst and Young, “The New Industrial Engineering: Information Technology and Business Process Redesign.” In Business Process Reengineering...The Art of Balancing, Harvard Business Review, November-December 1993. Grover, Varun, Teng, James T.C., and Fiedler, Kirk D., “Technological and
A conceptual framework for managing clinical processes.
Buffone, G J; Moreau, D
1997-01-01
Reengineering of the health care delivery system is underway, as is the transformation of the processes and methods used for recording information describing patient care (i.e., the development of a computer-based record). This report describes the use of object-oriented analysis and design to develop and implement clinical process reengineering as well as the organization of clinical data. In addition, the facility of the proposed framework for implementing workflow computing is discussed.
Reengineering the Project Design Process
NASA Technical Reports Server (NTRS)
Casani, E.; Metzger, R.
1994-01-01
In response to NASA's goal of working faster, better and cheaper, JPL has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center and the Flight System Testbed. Reengineering at JPL implies a cultural change whereby the character of its design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and cost estimating.
Lan, Hongzhi; Updegrove, Adam; Wilson, Nathan M; Maher, Gabriel D; Shadden, Shawn C; Marsden, Alison L
2018-02-01
Patient-specific simulation plays an important role in cardiovascular disease research, diagnosis, surgical planning and medical device design, as well as education in cardiovascular biomechanics. simvascular is an open-source software package encompassing an entire cardiovascular modeling and simulation pipeline from image segmentation, three-dimensional (3D) solid modeling, and mesh generation, to patient-specific simulation and analysis. SimVascular is widely used for cardiovascular basic science and clinical research as well as education, following increased adoption by users and development of a GATEWAY web portal to facilitate educational access. Initial efforts of the project focused on replacing commercial packages with open-source alternatives and adding increased functionality for multiscale modeling, fluid-structure interaction (FSI), and solid modeling operations. In this paper, we introduce a major SimVascular (SV) release that includes a new graphical user interface (GUI) designed to improve user experience. Additional improvements include enhanced data/project management, interactive tools to facilitate user interaction, new boundary condition (BC) functionality, plug-in mechanism to increase modularity, a new 3D segmentation tool, and new computer-aided design (CAD)-based solid modeling capabilities. Here, we focus on major changes to the software platform and outline features added in this new release. We also briefly describe our recent experiences using SimVascular in the classroom for bioengineering education.
The Planning and Scheduling of HST: Improvements and Enhancements since Launch
NASA Astrophysics Data System (ADS)
Taylor, D. K.; Chance, D. R.; Jordan, I. J. E.; Patterson, A. P.; Stanley, M.; Taylor, D. C.
2001-12-01
The planning and scheduling (P&S) systems used in operating the Hubble Space Telescope (HST) have undergone such substantial and pervasive re-engineering that today they dimly resemble those used when HST was launched. Processes (i.e., software, procedures, networking, etc.) which allow program implementation, the generation of a Long Range Plan (LRP), and the scheduling of science and mission activities have improved drastically in nearly 12 years, resulting in a consistently high observing efficiency, a stable LRP that principal investigators can use, exceptionally clean command loads uplinked to the spacecraft, and the capability of a very fast response time due to onboard anomalies or targets of opportunity. In this presentation we describe many of the systems which comprise the P&S ("front-end") system for HST, how and why they were improved, and what benefits have been realized by either the HST user community or the STScI staff. The systems include the Guide Star System, the Remote Proposal Submission System - 2 (RPS2), Artificial Intelligence (AI) planning tools such as Spike, and the science and mission scheduling software. We also describe how using modern software languages such as Python and better development practices allow STScI staff to do more with HST (e.g., to handle much more science data when ACS is installed) without increasing the cost to HST operations.
Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications.
Patou, François; AlZahra'a Alatraktchi, Fatima; Kjægaard, Claus; Dimaki, Maria; Madsen, Jan; Svendsen, Winnie E
2016-09-03
The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods.
Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications
Patou, François; AlZahra’a Alatraktchi, Fatima; Kjægaard, Claus; Dimaki, Maria; Madsen, Jan; Svendsen, Winnie E.
2016-01-01
The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods. PMID:27598208
NASA Technical Reports Server (NTRS)
1976-01-01
A software analysis was performed of known STS sortie payload elements and their associated experiments. This provided basic data for STS payload software characteristics and sizes. A set of technology drivers was identified based on a survey of future technology needs and an assessment of current software technology. The results will be used to evolve a planned approach to software technology development. The purpose of this plan is to ensure that software technology is advanced at a pace and a depth sufficient to fulfill the identified future needs.
Analytical Design of Evolvable Software for High-Assurance Computing
2001-02-14
Mathematical expression for the Total Sum of Squares which measures the variability that results when all values are treated as a combined sample coming from...primarily interested in background on software design and high-assurance computing, research in software architecture generation or evaluation...respectively. Those readers solely interested in the validation of a software design approach should at the minimum read Chapter 6 followed by Chapter
ERIC Educational Resources Information Center
Williams, Lawrence H., Jr.
2013-01-01
This qualitative study analyzed experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve; others facilitate ambiguity and uncertainty by…
2007-04-30
numerous reengineering projects and developed a new objective method for objectively measuring the value-added by reengineering. His last assignment...in the corporate world was as the Chief of Consumer Market Research for Telecom Italia in Venice, Italy, where he developed new methods for ...predicting the adoption rates for new interactive multimedia broadband applications. He is Managing Partner for Business Process Auditors, a firm that
MTMC’s Approach to Reengineering DoD’s Personal Property Business Process.
1996-01-01
Logistics Management Institute MTMG’s Approach to Reengineering DoD’s Personal Property Business Process MT502MR2 Alfred H.. Beyer James E...concerns, as expressed by business owners, senior executives, and managers during meetings, telephone conversations, and in correspondence, regarding... Business Process MT502MR2 Alfred H. Beyer James E. Cotterman Prepared pursuant to Department of Defense Contract DASW01-95-C-0019. The views
2013-05-23
This monograph borrows from multiple disciplines to argue for an organizational shift from process reengineering to system design to improve...government customer-service delivery. Specifically, the monograph proposes a transformation in claims processing within the Veterans Benefits Administration...required. The proposed system design is an attempt to place the disability claims process within a larger environment encompassing multiple dimensions of customers.
Department of Defense Energy Strategy: Teaching an Old Dog New Tricks
2007-04-01
extensively by terrorists outside and within the United States. Rebels have targeted one oil pipeline in Colombia over 600 times since 1995. In 1996 ...fourth look at re-engining the B-52H fleet since 1996 . The first three Air Force studies concluded re-engining was not economically justifiable... Peterson , to AMC OG/CCs, memorandum. 12 James, Bullet Background Paper on Hawaii ANG. 13 Muellner, USAF Energy Forum. KC-135 Fact Sheet, http
Impact of detector simulation in particle physics collider experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elvira, V. Daniel
Through the last three decades, precise simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detectormore » simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the accuracy of the physics results and publication turnaround, from data-taking to submission. It also presents the economic impact and cost of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data, taxing heavily the performance of simulation and reconstruction software for increasingly complex detectors. Consequently, it becomes urgent to find solutions to speed up simulation software in order to cope with the increased demand in a time of flat budgets. The study ends with a short discussion on the potential solutions that are being explored, by leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering of HEP code for concurrency and parallel computing.« less
Impact of detector simulation in particle physics collider experiments
Elvira, V. Daniel
2017-06-01
Through the last three decades, precise simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Center of Nuclear Research (CERN) Large Hadron Collider (LHC) was a determinant factor for these collaborations to deliver physics results of outstanding quality faster than any hadron collider experiment ever before. This review article highlights the impact of detectormore » simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the accuracy of the physics results and publication turnaround, from data-taking to submission. It also presents the economic impact and cost of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data, taxing heavily the performance of simulation and reconstruction software for increasingly complex detectors. Consequently, it becomes urgent to find solutions to speed up simulation software in order to cope with the increased demand in a time of flat budgets. The study ends with a short discussion on the potential solutions that are being explored, by leveraging core count growth in multicore machines, using new generation coprocessors, and re-engineering of HEP code for concurrency and parallel computing.« less
SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences
NASA Astrophysics Data System (ADS)
Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.
1994-11-01
A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Development (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that related product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.
SEQ-REVIEW: A tool for reviewing and checking spacecraft sequences
NASA Technical Reports Server (NTRS)
Maldague, Pierre F.; El-Boushi, Mekki; Starbird, Thomas J.; Zawacki, Steven J.
1994-01-01
A key component of JPL's strategy to make space missions faster, better and cheaper is the Advanced Multi-Mission Operations System (AMMOS), a ground software intensive system currently in use and in further development. AMMOS intends to eliminate the cost of re-engineering a ground system for each new JPL mission. This paper discusses SEQ-REVIEW, a component of AMMOS that was designed to facilitate and automate the task of reviewing and checking spacecraft sequences. SEQ-REVIEW is a smart browser for inspecting files created by other sequence generation tools in the AMMOS system. It can parse sequence-related files according to a computer-readable version of a 'Software Interface Specification' (SIS), which is a standard document for defining file formats. It lets users display one or several linked files and check simple constraints using a Basic-like 'Little Language'. SEQ-REVIEW represents the first application of the Quality Function Development (QFD) method to sequence software development at JPL. The paper will show how the requirements for SEQ-REVIEW were defined and converted into a design based on object-oriented principles. The process starts with interviews of potential users, a small but diverse group that spans multiple disciplines and 'cultures'. It continues with the development of QFD matrices that related product functions and characteristics to user-demanded qualities. These matrices are then turned into a formal Software Requirements Document (SRD). The process concludes with the design phase, in which the CRC (Class, Responsibility, Collaboration) approach was used to convert requirements into a blueprint for the final product.
Impact of detector simulation in particle physics collider experiments
NASA Astrophysics Data System (ADS)
Daniel Elvira, V.
2017-06-01
Over the last three decades, accurate simulation of the interactions of particles with matter and modeling of detector geometries has proven to be of critical importance to the success of the international high-energy physics (HEP) experimental programs. For example, the detailed detector modeling and accurate physics of the Geant4-based simulation software of the CMS and ATLAS particle physics experiments at the European Organization for Nuclear Research (CERN) Large Hadron Collider (LHC) was a determining factor in these collaborations' delivering physics results of outstanding quality faster than any hadron collider experiment before. This review article highlights the impact of detector simulation on particle physics collider experiments. It presents numerous examples of the use of simulation, from detector design and optimization, through software and computing development and testing, to cases where the use of simulation samples made a difference in the precision of the physics results and in publication turnaround, from data-taking to submission. It also presents estimates of the cost and economic impact of simulation in the CMS experiment. Future experiments will collect orders of magnitude more data with increasingly complex detectors, heavily taxing the performance of simulation and reconstruction software. Consequently, exploring solutions to speed up simulation and reconstruction software to satisfy the growing demand for computing resources in a time of flat budgets deserves immediate attention. The article ends with a short discussion of the potential solutions under consideration, based on leveraging core-count growth in multicore machines, using new-generation coprocessors, and re-engineering HEP code for concurrency and parallel computing.
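The closing point about re-engineering HEP code for concurrency can be made concrete with a minimal sketch. Detector simulation is naturally event-parallel: events are statistically independent, so they can be farmed out to worker processes. The simulate_event function below is a toy stand-in, not a Geant4 call, and the workload is invented.

```python
# Minimal sketch of event-level parallelism, the simplest concurrency model
# for detector simulation: independent events are distributed across cores.
import random
from multiprocessing import Pool

def simulate_event(seed):
    """Toy stand-in for one event's energy deposition in a detector."""
    rng = random.Random(seed)
    return sum(rng.gauss(1.0, 0.1) for _ in range(1000))  # arbitrary toy sum

if __name__ == "__main__":
    with Pool() as pool:                                    # one worker per core
        deposits = pool.map(simulate_event, range(10_000))  # 10k toy events
    print(f"mean deposit: {sum(deposits) / len(deposits):.2f}")
```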
Meeting patient expectations: healthcare professionals and service re-engineering.
Laing, Angus
2002-08-01
A central theme underpinning the reform of healthcare systems in western economies since the 1980s has been the emphasis on reorienting service provision around the patient. Healthcare organizations have been forced to re-appraise the design of the service delivery process, specifically the service encounter, to take account of these changing patient expectations. This reorientation of healthcare services around the patient has fundamental implications for healthcare professionals, specifically challenging the dominance of service professionals in the design and delivery of health services. Utilizing a qualitative methodological framework, this paper explores the responses of healthcare professionals to service redesign initiatives implemented in acute NHS hospitals in Scotland and considers the implications of such professional responses for the development of patient-focused service delivery. Within this, it specifically examines evolving professional perspectives on the place of a service user focus in a publicly funded healthcare system, professional attitudes towards private sector managerial practices, and the dynamics of changing professional behaviour.
Trusheim, M; Cobbs, E; Bala, M; Garner, S; Hartman, D; Isaacs, K; Lumpkin, M; Lim, R; Oye, K; Pezalla, E; Saltonstall, P; Selker, H
2016-01-01
The current system of biomedical innovation is unable to keep pace with scientific advancements. We propose to address this gap by reengineering innovation processes to accelerate reliable delivery of products that address unmet medical needs. Adaptive biomedical innovation (ABI) provides an integrative, strategic approach for process innovation. Although the term “ABI” is new, it encompasses fragmented “tools” that have been developed across the global pharmaceutical industry, and could accelerate the evolution of the system through more coordinated application. ABI involves bringing stakeholders together to set shared objectives, foster trust, structure decision‐making, and manage expectations through rapid‐cycle feedback loops that maximize product knowledge and reduce uncertainty in a continuous, adaptive, and sustainable learning healthcare system. Adaptive decision‐making, a core element of ABI, provides a framework for structuring decision‐making designed to manage two types of uncertainty – the maturity of scientific and clinical knowledge, and the behaviors of other critical stakeholders. PMID:27626610
NASA Astrophysics Data System (ADS)
Adinolfi, M.; Archilli, F.; Baldini, W.; Baranov, A.; Derkach, D.; Panin, A.; Pearce, A.; Ustyuzhanin, A.
2017-10-01
Data quality monitoring (DQM) is crucial in a high-energy physics experiment to ensure the correct functioning of the experimental apparatus during data taking. DQM at LHCb is carried out in two phases. The first is performed on-site, in real time, using unprocessed data directly from the LHCb detector, while the second, also performed on-site, requires the reconstruction of the data selected by the LHCb trigger system and occurs later. For the LHC Run II data taking, the LHCb collaboration re-engineered the DQM protocols and the DQM graphical interface, moving the latter to a web-based monitoring system called Monet, thus allowing researchers to perform the second phase off-site. To support the operator's task, Monet is also equipped with an automated, fully configurable alarm system, allowing its use not only for DQM but also to track and assess the quality of LHCb software and simulation over time.
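As a rough illustration of what a fully configurable alarm system can look like, the sketch below checks per-run metrics against configured bands. The metric names and thresholds are invented and do not reflect Monet's actual configuration.

```python
# Hedged sketch of a configurable data-quality alarm in the spirit of the
# automated alarm system described above; all names are illustrative.
ALARM_CONFIG = {
    "muon_hit_occupancy": {"min": 0.90, "max": 1.10},  # ratio to a reference run
    "track_chi2_mean":    {"min": 0.00, "max": 2.00},
}

def evaluate_alarms(run_metrics, config=ALARM_CONFIG):
    """Compare per-run metrics against configured bands; return alarms raised."""
    alarms = []
    for name, band in config.items():
        value = run_metrics.get(name)
        if value is None or not (band["min"] <= value <= band["max"]):
            alarms.append((name, value))
    return alarms

print(evaluate_alarms({"muon_hit_occupancy": 0.72, "track_chi2_mean": 1.4}))
# -> [('muon_hit_occupancy', 0.72)]
```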
3D Reconfigurable MPSoC for Unmanned Spacecraft Navigation
NASA Astrophysics Data System (ADS)
Dekoulis, George
2016-07-01
This paper describes the design of a new lightweight spacecraft navigation system for unmanned space missions. The system addresses the demand for more efficient autonomous navigation in the near-Earth environment or deep space. The proposed instrumentation is directly suitable for unmanned systems operation and for testing new airborne prototypes for remote sensing applications. The system features a new sensor technology and significant improvements over existing solutions. Fluxgate-type sensors have traditionally been used in unmanned defense systems such as target drones, guided missiles, rockets, and satellites; however, those guidance sensors' configurations exhibit lower specifications than the presented solution. The current implementation is based on a recently developed material in a re-engineered optimum sensor configuration for unprecedentedly low power consumption. The new sensor's performance characteristics qualify it for spacecraft navigation applications. A major advantage of the system is the efficient reduction of redundancy achieved in terms of both hardware and software requirements.
E-nursing documentation as a tool for quality assurance.
Rajkovic, Vladislav; Sustersic, Olga; Rajkovic, Uros
2006-01-01
The article presents the results of a project that reengineered nursing documentation. Documentation in nursing is an efficient tool for ensuring quality health care, and consequently quality patient treatment, along the whole clinical path. We took into account the nursing process and patient treatment based on Henderson's theoretical model of nursing, which consists of 14 basic living activities. The new documentation model enables tracing, transparency, selectivity, monitoring, and analysis. All these factors lead to improvements in the health system as well as to improved safety of patients and members of nursing teams. The documentation was developed for three health care segments: the secondary and tertiary level, dispensaries, and community health care. The new quality that information and communication technology introduces to the documentation process is presented through a database model and a software prototype for managing documentation.
Green Turning Brown - Domain Engineering for Social and Health Services in Finland.
Suomi, Reima; Nykänen, Pirkko; Vepsäläinen, Tapio; Hiltunen, Riina
2017-01-01
Being able to design information systems for an untouched domain, without the burden of existing information systems, especially legacy systems, is often seen as a dream of most information system professionals. Uncharted domains are scarce, however, and such greenfield projects often turn into brownfield projects, in which existing structures severely constrain the development of new systems. In this article we discuss the concepts of greenfield and brownfield domain engineering and software development, and reflect on their implications for the re-engineering of the Finnish health and social care ecosystem currently under way. In our fieldwork we identified considerable need and desire for greenfield domain engineering in Finnish health and social services delivery. We also found many brownfield elements inhibiting change. Our proposal for the future is an ecosystem approach, in which new and established elements can live together in a self-governed balance.
Proposing an Evidence-Based Strategy for Software Requirements Engineering.
Lindoerfer, Doris; Mansmann, Ulrich
2016-01-01
This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based because it uses publications on the specific problem as a surrogate for stakeholder interests, from which risks and testing experiences are formulated. This complements agile software development models, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries.
Modern Management Principles Come to the Dental School.
Wataha, John C; Mouradian, Wendy E; Slayton, Rebecca L; Sorensen, John A; Berg, Joel H
2016-04-01
The University of Washington School of Dentistry may be the first dental school in the nation to apply lean process management principles as a primary tool to re-engineer its operations and curriculum to produce the dentist of the future. The efficiencies realized through re-engineering will better enable the school to remain competitive and viable as a national leader of dental education. Several task forces conducted rigorous value stream analyses in a highly collaborative environment led by the dean of the school. The four areas undergoing evaluation and re-engineering were organizational infrastructure, organizational processes, curriculum, and clinic operations. The new educational model was derived by thoroughly analyzing the current state of dental education in order to design and achieve the closest possible ideal state. The school also aimed to create a lean, sustainable operational model. This model aims to ensure continued excellence in restorative dental instruction and to serve as a blueprint for other public dental schools seeking financial stability in this era of shrinking state support and rising costs.
NASA Astrophysics Data System (ADS)
Dachyar, M.; Christy, E.
2014-04-01
To maintain its position as a major milk producer, the Indonesian milk industry must pursue business development aimed at increasing its customer service level. One strategy is to create on-time release conditions for finished goods to be distributed to customers and distributors. To achieve this condition, the management information systems for on-time release of finished goods need to be improved. The focus of this research is business process improvement using Business Process Reengineering (BPR). The key deliverable of this study is a comprehensive business strategy that addresses the root problems. To achieve the goal, evaluation, reengineering, and improvement of the ERP system were conducted. To visualize the proposed implementation, a simulation model was built with Oracle BPM. The output of this simulation showed that the proposed solution could effectively reduce the process lead time and increase the number of quality releases.
Reengineering health care materials management.
Connor, L R
1998-01-01
Health care executives across the country, faced with intense competition, are being forced to consider drastic cost cutting measures as a matter of survival. The entire health care industry is under siege from boards of directors, management and others who encourage health care systems to take actions ranging from strategic acquisitions and mergers to simple "downsizing" or "rightsizing," to improve their perceived competitive positions in terms of costs, revenues and market share. In some cases, management is poorly prepared to work within this new competitive paradigm and turns to consultants who promise that following their methodologies can result in competitive advantage. One favored methodology is reengineering. Frequently, cost cutting attention is focused on the materials management budget because it is relatively large and is viewed as being comprised mostly of controllable expenses. Also, materials management is seldom considered a core competency for the health care system and the organization performing these activities does not occupy a strongly defensible position. This paper focuses on the application of a reengineering methodology to healthcare materials management.
Biocybrid systems and the re-engineering of life
NASA Astrophysics Data System (ADS)
Domingues, Diana; Ferreira da Rocha, Adson; Hamdan, Camila; Augusto, Leci; Miosso, Cristiano Jacques
2011-03-01
The re-engineering of life, expanded by perceptual experiences of the sense of presence in Virtual Reality and Augmented Reality, is the theme of our investigation into collaborative practices, confirming that artists' creativity is close to the inventiveness of scientists and that the two share a capacity for the generation of biocybrid systems. We consider the enactive bodily interfaces for human existence co-located in the continuum and symbiotic zone between body and flesh, cyberspace and data, and the hybrid properties of the physical world. That continuum generates a biocybrid zone (bio + cyber + hybrid), and life is reinvented. Results reaffirm the creative reality of the coupled body and its mutual influences with environmental information, extending James Gibson's ecological perception theory. The ecosystem of life, in its dynamic relations among humans, animals, plants, landscapes, urban life, and objects, raises questions and challenges for artworks and for the re-engineering of life discussed in our technoscience artworks. Finally, we describe an implementation in which the immersion experience is enhanced by the data visualization of biological audio signals and by wearable miniaturized devices for biofeedback.
Lin, Jianfei; Chen, He; Luo, Ling; Lai, Yongrong; Xie, Wei; Kee, Kehkooi
2015-01-01
To correct a DNA mutation in the human genome for gene therapy, homology-directed repair (HDR) needs to be specific and have the lowest off-target effects to protect the human genome from deleterious mutations. Zinc finger nucleases, transcription activator-like effector nucleases (TALEN), and CRISPR-Cas9 systems have been engineered and used extensively to recognize and modify specific DNA sequences. Although TALEN and CRISPR/Cas9 can induce high levels of HDR in human cells, their genotoxicity is significantly higher. Here, we report the creation of a monomeric endonuclease that can recognize at least 33 bp by fusing the DNA-recognizing domain of TALEN (TALE) to a re-engineered homing endonuclease I-SceI. After sequentially re-engineering I-SceI to recognize 18 bp of the human β-globin sequence, the re-engineered I-SceI induced HDR in human cells. When the re-engineered I-SceI was fused to TALE (TALE-ISVB2), the chimeric endonuclease induced the same HDR rate at the human β-globin gene locus as that induced by TALEN, but with significantly reduced genotoxicity. We further demonstrated that TALE-ISVB2 specifically targeted the β-globin sequence in human hematopoietic stem cells. Therefore, this monomeric endonuclease has the potential to be used in therapeutic gene targeting in human cells. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
A platform for evolving intelligently interactive adversaries.
Fogel, David B; Hays, Timothy J; Johnson, Douglas R
2006-07-01
Entertainment software developers face significant challenges in designing games with broad appeal. One of the challenges concerns creating nonplayer (computer-controlled) characters that can adapt their behavior in light of the current and prospective situation, possibly emulating human behaviors. This adaptation should be inherently novel, unrepeatable, yet within the bounds of realism. Evolutionary algorithms provide a suitable method for generating such behaviors. This paper provides background on the entertainment software industry, and details a prior and current effort to create a platform for evolving nonplayer characters with genetic and behavioral traits within a World War I combat flight simulator.
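The abstract describes the approach only in general terms; the following minimal sketch shows the shape of an evolutionary algorithm applied to a nonplayer character's behavioral traits. The two-parameter trait vector and the fitness function are invented placeholders, not the paper's actual encoding.

```python
# Minimal sketch of evolving NPC behavioral traits with truncation selection
# and Gaussian mutation; the traits and fitness are illustrative only.
import random

def fitness(traits):
    """Toy fitness: reward aggression balanced against caution (placeholder)."""
    aggression, caution = traits
    return aggression * (1.0 - abs(aggression - caution))

def evolve(pop_size=20, generations=50, sigma=0.1):
    population = [[random.random(), random.random()] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]           # keep the best half
        children = [[min(1.0, max(0.0, g + random.gauss(0, sigma)))  # mutate
                     for g in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        population = parents + children                 # elitism + offspring
    return max(population, key=fitness)

print(evolve())  # best-performing trait vector found
```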
Type Safe Extensible Programming
NASA Astrophysics Data System (ADS)
Chae, Wonseok
2009-10-01
Software products evolve over time. Sometimes they evolve by adding new features, and sometimes by either fixing bugs or replacing outdated implementations with new ones. When software engineers fail to anticipate such evolution during development, they will eventually be forced to re-architect or re-build from scratch. Therefore, it has been common practice to prepare for changes so that software products are extensible over their lifetimes. However, making software extensible is challenging because it is difficult to anticipate successive changes and to provide adequate abstraction mechanisms over potential changes. Such extensibility mechanisms, furthermore, should not compromise any existing functionality during extension. Software engineers would benefit from a tool that provides a way to add extensions in a reliable way. It is natural to expect programming languages to serve this role. Extensible programming is one effort to address these issues. In this thesis, we present type safe extensible programming using the MLPolyR language. MLPolyR is an ML-like functional language whose type system provides type-safe extensibility mechanisms at several levels. After presenting the language, we will show how these extensibility mechanisms can be put to good use in the context of product line engineering. Product line engineering is an emerging software engineering paradigm that aims to manage variations, which originate from successive changes in software.
The role of business process reengineering in health care.
Kohn, D
1994-02-01
Business process reengineering (BPR) is a management philosophy capturing attention in health care. It combines some new, old, and recycled management philosophies, and, more often than not, is yielding positive results. BPR's emphasis is on the streamlining of cross-functional processes to significantly reduce time and/or cost, increase revenue, improve quality and service, and reduce risk. Therefore, it has many applications in health care. This article provides an introduction to the concept of BPR, including the definition of BPR, its origin, its champions, and factors for its success.
Wind tunnel productivity status and improvement activities at NASA Langley Research Center
NASA Technical Reports Server (NTRS)
Putnam, Lawrence E.
1996-01-01
Over the last three years, a major effort has been underway to re-engineer the way wind tunnel testing is accomplished at the NASA Langley Research Center. This effort began with the reorganization of LaRC and the consolidation of the management of the wind tunnels in the Aerodynamics Division under one operations branch. This paper provides an overview of the re-engineering activities and gives the status of the improvements in wind tunnel productivity and customer satisfaction that have resulted from the new ways of working.
Usability study of clinical exome analysis software: top lessons learned and recommendations.
Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W
2014-10-01
New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ~10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) To ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) To assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Online Videoconferencing Products: Update
ERIC Educational Resources Information Center
Burton, Douglas; Kitchen, Tim
2011-01-01
Software allowing real-time online video connectivity is rapidly evolving. The ability to connect students, staff, and guest speakers instantaneously carries great benefits for the online distance education classroom. This evaluation report compares four software applications at opposite ends of the cost spectrum: "DimDim", "Elluminate VCS",…
Environmental models are products of the computer architecture and software tools available at the time of development. Scientifically sound algorithms may persist in their original state even as system architectures and software development approaches evolve and progress. Dating...
Intelligent Command and Control Systems for Satellite Ground Operations
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1999-01-01
This grant, Intelligent Command and Control Systems for Satellite Ground Operations, funded by NASA Goddard Space Flight Center, has spanned almost a decade. During this time, it has supported a broad range of research addressing the changing needs of NASA operations. It is important to note that many of NASA's evolving needs, for example, use of automation to drastically reduce (e.g., 70%) operations costs, are similar requirements in both government and private sectors. Initially the research addressed the appropriate use of emerging and inexpensive computational technologies, such as X Windows, graphics, and color, together with COTS (commercial-off-the-shelf) hardware and software such as standard Unix workstations to re-engineer satellite operations centers. The first phase of research supported by this grant explored the development of principled design methodologies to make effective use of emerging and inexpensive technologies. The ultimate performance measures for new designs were whether or not they increased system effectiveness while decreasing costs. GT-MOCA (The Georgia Tech Mission Operations Cooperative Associate) and GT-VITA (Georgia Tech Visual and Inspectable Tutor and Assistant), whose latter stages were supported by this research, explored model-based design of collaborative operations teams and the design of intelligent tutoring systems, respectively. Implemented in proof-of-concept form for satellite operations, empirical evaluations of both, using satellite operators for the former and personnel involved in satellite control operations for the latter, demonstrated unequivocally the feasibility and effectiveness of the proposed modeling and design strategy underlying both research efforts. The proof-of-concept implementation of GT-MOCA showed that the methodology could specify software requirements that enabled a human-computer operations team to perform without any significant performance differences from the standard two-person satellite operations team. GT-VITA, using the same underlying methodology, the operator function model (OFM), and its computational implementation, OFMspert, successfully taught satellite control knowledge required by flight operations team members. The tutor structured knowledge in three ways: declarative knowledge (e.g., What is this? What does it do?), procedural knowledge, and operational skill. Operational skill is essential in real-time operations. It combines the two former knowledge types, assisting a student to use them effectively in a dynamic, multi-tasking, real-time operations environment. A high-fidelity simulator of the operator interface to the ground control system, including an almost full replication of both the human-computer interface and human interaction with the dynamic system, was used in the GT-MOCA and GT-VITA evaluations. The GT-VITA empirical evaluation, conducted with a range of 'novices' that included GSFC operations management, GSFC operations software developers, and new flight operations team members, demonstrated that GT-VITA effectively taught a wide range of knowledge in a succinct and engaging manner.
Storage system software solutions for high-end user needs
NASA Technical Reports Server (NTRS)
Hogan, Carole B.
1992-01-01
Today's high-end storage user is one that requires rapid access to a reliable terabyte-capacity storage system running in a distributed environment. This paper discusses conventional storage system software and concludes that this software, designed for other purposes, cannot meet high-end storage requirements. The paper also reviews the philosophy and design of evolving storage system software. It concludes that this new software, designed with high-end requirements in mind, provides the potential for solving not only the storage needs of today but those of the foreseeable future as well.
Group Projects and the Computer Science Curriculum
ERIC Educational Resources Information Center
Joy, Mike
2005-01-01
Group projects in computer science are normally delivered with reference to good software engineering practice. The discipline of software engineering is rapidly evolving, and the application of the latest 'agile techniques' to group projects causes a potential conflict with constraints imposed by regulating bodies on the computer science…
Horton, M C; Lewis, T E; Kinsey, T V
1999-05-01
Prior to June 1997, military picture archiving and communications systems (PACS) were planned, procured, and installed with key decisions on the system, equipment, and even funding sources made through a research and development office called Medical Diagnostic Imaging Systems (MDIS). Beginning in June 1997, the Joint Imaging Technology Project Office (JITPO) initiated a collaborative and consultative process for planning and implementing PACS in military treatment facilities through a new Department of Defense (DoD) contract vehicle called Digital Imaging Networks (DIN)-PACS. The JITPO reengineered this process, incorporating multiple organizations and their politics. The reengineered PACS process administered through the JITPO transformed the decision process and accountability from a single office to a consultative method that increased end-user knowledge, responsibility, and ownership of PACS. The JITPO continues to provide information and services that assist multiple groups and users in making PACS planning and implementation decisions. Local site project managers are involved from the outset, and this end-user collaboration has made the sometimes difficult transition to PACS an easier and more acceptable process for all involved. Corporately, this process has saved DoD sites millions by having PACS plans developed first within the government and then having vendors respond specifically to those plans. The integrity and efficiency of the process have reduced the opportunity for implementing nonstandard systems while sharing resources and reducing wasted government dollars. This presentation describes the chronology of changes, encountered obstacles, and lessons learned in the reengineering of the PACS process for DIN-PACS.
Development of consistent hazard controls for DOE transuranic waste operations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woody, W.J.
2007-07-01
This paper describes the results of a re-engineering initiative undertaken with the Department of Energy's (DOE) Office of Environmental Management (EM) in order to standardize hazard analysis assumptions and methods and resulting safety controls applied to multiple transuranic (TRU) waste operations located across the United States. A wide range of safety controls are historically applied to transuranic waste operations, in spite of the fact that these operations have similar operational characteristics and hazard/accident potential. The re-engineering effort supported the development of a DOE technical standard with specific safety controls designated for accidents postulated during waste container retrieval, staging/storage, venting, onsite movements, and characterization activities. Controls cover preventive and mitigative measures; include both hardware and specific administrative controls; and provide protection to the facility worker, onsite co-located workers and the general public located outside of facility boundaries. The Standard development involved participation from all major DOE sites conducting TRU waste operations. Both safety analysts and operations personnel contributed to the re-engineering effort. Acknowledgment is given in particular to the following individuals who formed a core working group: Brenda Hawks (DOE Oak Ridge Office), Patrice McEahern (CWI-Idaho), Jofu Mishima (Consultant), Louis Restrepo (Omicron), Jay Mullis (DOE-ORO), Mike Hitchler (WSMS), John Menna (WSMS), Jackie East (WSMS), Terry Foppe (CTAC), Carla Mewhinney (WIPP-SNL), Stephie Jennings (WIPP-LANL), Michael Mikolanis (DOESRS), Kraig Wendt (BBWI-Idaho), Lee Roberts (Fluor Hanford), and Jim Blankenhorn (WSRC). Additional acknowledgment is given to Dae Chung (EM) and Ines Triay (EM) for leadership and management of the re-engineering effort. (authors)
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Rash, James L. (Inventor); Pena, Joaquin (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which an evolutionary system is managed and viewed as a software product line. In some embodiments, the core architecture is a relatively unchanging part of the system, and each version of the system is viewed as a product from the product line. Each software product is generated from the core architecture with some agent-based additions. The result may be a multi-agent system software product line.
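A minimal sketch of the idea, assuming (hypothetically) that each product-line member is the unchanging core plus a selected set of agent-based additions; the class and agent names are invented and do not come from the patent.

```python
# Hedged sketch: generate product-line members from a fixed core plus agents.
class CoreArchitecture:
    def run(self):
        print("core services running")

def generate_product(core_cls, agents):
    """Build one product-line member: the core plus a selected agent set."""
    class Product(core_cls):
        def run(self):
            super().run()
            for agent in agents:   # agent-based additions layered on the core
                agent()
    return Product()

nav_agent = lambda: print("navigation agent active")
fault_agent = lambda: print("fault-monitoring agent active")

# Two versions of the evolving system, generated from the same core.
generate_product(CoreArchitecture, [nav_agent]).run()
generate_product(CoreArchitecture, [nav_agent, fault_agent]).run()
```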
Holland, C A
1995-01-01
Today's laboratory managers are caught in the midst of a tumultuous environment as a result of managed care, mergers and acquisitions, and downsizing. We must prepare ourselves through continuous learning, recognize the marketable value of our skills outside of the laboratory, and seize opportunities to expand into new roles. At Arkansas Children's Hospital, the Chief Executive Officer selected the Administrative Director of Laboratories to reengineer the General Pediatric Center. Our goals were to improve quality of care, efficiency, teamwork, clinic visit times, and satisfaction of patients, staff, and physicians. We developed ideal objectives from surveys, brainstorming sessions, and interviews to serve as guidelines for reengineering teams. Teams met the goals and 12 of 15 ideal objectives. Patient flow redesign resulted in different processes for different patient populations and a 35% decrease in the average clinic visit time. Patient, staff, and physician satisfaction improved, as did the clinic's financial status. The project's success confirms that our leadership and analytical skills are transferable from the laboratory to carry us to new heights in other health-care arenas.
Reengineering the project design process
NASA Astrophysics Data System (ADS)
Kane Casani, E.; Metzger, Robert M.
1995-01-01
In response to the National Aeronautics and Space Administration's goal of working faster, better, and cheaper, the Jet Propulsion Laboratory (JPL) has developed extensive plans to minimize cost, maximize customer and employee satisfaction, and implement small- and moderate-size missions. These plans include improved management structures and processes, enhanced technical design processes, the incorporation of new technology, and the development of more economical space- and ground-system designs. The Laboratory's new Flight Projects Implementation Development Office has been chartered to oversee these innovations and the reengineering of JPL's project design process, including establishment of the Project Design Center (PDC) and the Flight System Testbed (FST). Reengineering at JPL implies a cultural change whereby the character of the Laboratory's design process will change from sequential to concurrent and from hierarchical to parallel. The Project Design Center will support missions offering high science return, design to cost, demonstrations of new technology, and rapid development. Its computer-supported environment will foster high-fidelity project life-cycle development and more accurate cost estimating. These improvements signal JPL's commitment to meeting the challenges of space exploration in the next century.
C-C1-04: Building a Health Services Information Technology Research Environment
Gehrum, David W; Jones, JB; Romania, Gregory J; Young, David L; Lerch, Virginia R; Bruce, Christa A; Donkochik, Diane; Stewart, Walter F
2010-01-01
Background: The electronic health record (EHR) has opened a new era for health services research (HSR) where information technology (IT) is used to re-engineer care processes. While the EHR provides one means of advancing novel solutions, a promising strategy is to develop tools (e.g., online questionnaires, visual display tools, decision support) distinct from, but which interact with, the EHR. Development of such software tools outside the EHR offers an advantage in flexibility, sophistication, and ultimately in portability to other settings. However, institutional IT departments have an imperative to protect patient data and to standardize IT processes to ensure system-level security and support traditional business needs. Such imperatives usually present formidable process barriers to testing novel software solutions. We describe how, in collaboration with our IT department, we are creating an environment and a process that allows for routine and rapid testing of novel software solutions. Methods: We convened a working group consisting of IT and research personnel with expertise in information security, database design/management, web design, EHR programming, and health services research. The working group was tasked with developing a research IT environment to accomplish two objectives: maintain network/data security and regulatory compliance; allow researchers working with external vendors to rapidly prototype and, in a clinical setting, test web-based tools. Results: Two parallel solutions, one focused on hardware, the second on oversight and management, were developed. First, we concluded that three separate, staged development environments were required to allow external vendor access for testing software and for transitioning software to be used in a clinic. In parallel, the extant oversight process for approving/managing access to internal/external personnel had to be altered to reflect the scope and scale of discrete research projects, as opposed to an enterprise-level approach to IT management. Conclusions: Innovation in health services software development requires a flexible, scalable IT environment adapted to the unique objectives of a HSR software development model. In our experience, implementing the hardware solution is less challenging than the cultural change required to implement such a model and the modifications to administrative and oversight processes to sustain an environment for rapid product development and testing.
A Case Study in CAD Design Automation
ERIC Educational Resources Information Center
Lowe, Andrew G.; Hartman, Nathan W.
2011-01-01
Computer-aided design (CAD) software and other product life-cycle management (PLM) tools have become ubiquitous in industry during the past 20 years. Over this time they have continuously evolved, becoming programs with enormous capabilities, but the companies that use them have not evolved their design practices at the same rate. Due to the…
Real Time Network Monitoring and Reporting System
ERIC Educational Resources Information Center
Massengale, Ricky L., Sr.
2009-01-01
With the ability of modern system developers to develop intelligent programs that allow machines to learn, modify, and evolve themselves, current reactionary methods of detecting and eradicating malicious software code from infected machines are proving too costly. Addressing malicious software after an attack is the current methodology…
Evolvable Neural Software System
NASA Technical Reports Server (NTRS)
Curtis, Steven A.
2009-01-01
The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.
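The hierarchical, self-similar topology described above, with sets of NBFs reporting to a ruler NBF that may itself answer to a higher node, can be sketched as a simple tree. The sketch below models only the connectivity, not the neural behavior, and every name in it is invented.

```python
# Speculative sketch of the NBF hierarchy's topology; names are illustrative.
class NBF:
    def __init__(self, name, ruler=None):
        self.name, self.ruler, self.subordinates = name, ruler, []
        if ruler:
            ruler.subordinates.append(self)  # self-similar: rulers are NBFs too

    def propagate(self, signal, depth=0):
        """Pass a signal down the hierarchy, mimicking ruler-to-node flow."""
        print("  " * depth + f"{self.name} handles {signal!r}")
        for sub in self.subordinates:
            sub.propagate(signal, depth + 1)

root = NBF("ruler-node")
sensing = NBF("sensing-set", ruler=root)
NBF("camera-NBF", ruler=sensing)
NBF("thermal-NBF", ruler=sensing)
root.propagate("recalibrate")
```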
le Roux, K W D P; Couper, I
2015-06-01
The re-engineering of primary healthcare (PHC) is regarded as an essential precursor to the implementation of National Health Insurance in South Africa, but improvements in the provision of PHC services have been patchy. The authors contend that the role of well-functioning rural district hospitals as a hub from which PHC services can be most efficiently managed has been underestimated, and that the management of district hospitals and PHC clinics needs to be co-located at the level of the rural district hospital to allow for proper integration of care and effective healthcare provision.
Constraints and Opportunities in GCM Model Development
NASA Technical Reports Server (NTRS)
Schmidt, Gavin; Clune, Thomas
2010-01-01
Over the past 30 years climate models have evolved from relatively simple representations of a few atmospheric processes to complex multi-disciplinary system models which incorporate physics from bottom of the ocean to the mesopause and are used for seasonal to multi-million year timescales. Computer infrastructure over that period has gone from punchcard mainframes to modern parallel clusters. Constraints of working within an ever evolving research code mean that most software changes must be incremental so as not to disrupt scientific throughput. Unfortunately, programming methodologies have generally not kept pace with these challenges, and existing implementations now present a heavy and growing burden on further model development as well as limiting flexibility and reliability. Opportunely, advances in software engineering from other disciplines (e.g. the commercial software industry) as well as new generations of powerful development tools can be incorporated by the model developers to incrementally and systematically improve underlying implementations and reverse the long term trend of increasing development overhead. However, these methodologies cannot be applied blindly, but rather must be carefully tailored to the unique characteristics of scientific software development. We will discuss the need for close integration of software engineers and climate scientists to find the optimal processes for climate modeling.
Space Station Mission Planning Study (MPS) development study. Volume 3: Software development plan
NASA Technical Reports Server (NTRS)
Klus, W. L.
1987-01-01
A software development plan is presented for the definition, design, and implementation of the Space Station (SS) Payload Mission Planning System (MPS). This plan is an evolving document and must be updated periodically as the SS design and operations concepts as well as the SS MPS concept evolve. The major segments of this plan are as follows: an overview of the SS MPS and a description of its required capabilities including the computer programs identified as configurable items with an explanation of the place and function of each within the system; an overview of the project plan and a detailed description of each development project activity breaking each into lower level tasks where applicable; identification of the resources required and recommendations for the manner in which they should be utilized including recommended schedules and estimated manpower requirements; and a description of the practices, standards, and techniques recommended for the SS MPS Software (SW) development.
Evolution and Reengineering of NASA's Flight Dynamics Facility (FDF)
NASA Technical Reports Server (NTRS)
Stengle, Thomas; Hoge, Susan
2008-01-01
The NASA Goddard Space Flight Center's Flight Dynamics Facility (FDF) is a multimission support facility that performs ground navigation and spacecraft trajectory design services for a wide range of scientific satellites. The FDF also supports the NASA Space Network by providing orbit determination and tracking data evaluation services for the Tracking Data Relay Satellite System (TDRSS). The FDF traces its history to early NASA missions in the 1960's, including navigation support to the Apollo lunar missions. Over its 40 year history, the FDF has undergone many changes in its architecture, services offered, missions supported, management approach, and business operation. As a fully reimbursable facility (users now pay 100% of all costs for FDF operations and sustaining engineering activities), the FDF has faced significant challenges in recent years in providing mission critical products and services at minimal cost while defining and implementing upgrades necessary to meet future mission demands. This paper traces the history of the FDF and discusses significant events in the past that impacted the FDF infrastructure and/or business model, and the events today that are shaping the plans for the FDF in the next decade. Today's drivers for change include new mission requirements, the availability of new technology for spacecraft navigation, and continued pressures for cost reduction from FDF users. Recently, the FDF completed an architecture study based on these drivers that defines significant changes planned for the facility. This paper discusses the results of this study and a proposed implementation plan. As a case study in how flight dynamics operations have evolved and will continue to evolve, this paper focuses on two periods of time (1992 and the present) in order to contrast the dramatic changes that have taken place in the FDF. This paper offers observations and plans for the evolution of the FDF over the next ten years. Finally, this paper defines the mission model of the future for the FDF based on NASA's current mission list and planning for the Constellation Program. As part of this discussion the following are addressed: the relevance and benefits of a multi-mission facility for NASA's navigation operations in the future; anticipated technologies affecting ground orbit determination; continued incorporation of Commercial Off-the-shelf (COTS) software into the FDF; challenges of a business model that relies entirely on user fees to fund facility upgrades; anticipated changes in flight dynamics services required; and considerations for defining architecture upgrades given a set of cost drivers.
PyEvolve: a toolkit for statistical modelling of molecular evolution.
Butterfield, Andrew; Vedagiri, Vivek; Lang, Edward; Lawrence, Cath; Wakefield, Matthew J; Isaev, Alexander; Huttley, Gavin A
2004-01-05
Examining the distribution of variation has proven an extremely profitable technique in the effort to identify sequences of biological significance. Most approaches in the field, however, evaluate only the conserved portions of sequences, ignoring the biological significance of sequence differences. A suite of sophisticated likelihood-based statistical models from the field of molecular evolution provides the basis for extracting the information from the full distribution of sequence variation. The number of different problems to which phylogeny-based maximum likelihood calculations can be applied is extensive. Available software packages that can perform likelihood calculations suffer from a lack of flexibility and scalability, or employ error-prone approaches to model parameterisation. Here we describe the implementation of PyEvolve, a toolkit for the application of existing, and development of new, statistical methods for molecular evolution. We present the object architecture and design schema of PyEvolve, which includes an adaptable multi-level parallelisation schema. The approach for defining new methods is illustrated by implementing a novel dinucleotide model of substitution that includes a parameter for mutation of methylated CpGs, which required 8 lines of standard Python code to define. Benchmarking was performed using either a dinucleotide or codon substitution model applied to an alignment of BRCA1 sequences from 20 mammals, or a 10 species subset. Up to five-fold parallel performance gains over serial were recorded. Compared to leading alternative software, PyEvolve exhibited significantly better real-world performance for parameter-rich models with a large data set, reducing the time required for optimisation from approximately 10 days to approximately 6 hours. PyEvolve provides flexible functionality that can be used either for statistical modelling of molecular evolution, or the development of new methods in the field. The toolkit can be used interactively or by writing and executing scripts. The toolkit uses efficient processes for specifying the parameterisation of statistical models, and implements numerous optimisations that make highly parameter-rich likelihood functions solvable within hours on multi-CPU hardware. PyEvolve can be readily adapted in response to changing computational demands and hardware configurations to maximise performance. PyEvolve is released under the GPL and can be downloaded from http://cbis.anu.edu.au/software.
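To illustrate the kind of calculation such a toolkit optimizes, here is the Jukes-Cantor transition probability, one of the simplest substitution models. This is a generic sketch of the underlying math only; it is not PyEvolve's API.

```python
# Jukes-Cantor (JC69) transition probability between nucleotides over branch
# length t (expected substitutions per site): equal rates among the four bases.
import math

def jc69_prob(i, j, t):
    """P(j | i, t) under JC69: 1/4 + 3/4*exp(-4t/3) if i == j, else the rest."""
    same = 0.25 + 0.75 * math.exp(-4.0 * t / 3.0)
    diff = 0.25 - 0.25 * math.exp(-4.0 * t / 3.0)
    return same if i == j else diff

# Likelihood of observing an A->G substitution over branch length 0.1,
# weighted by the uniform stationary probability of the ancestral base.
print(0.25 * jc69_prob("A", "G", 0.1))
```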
WalkThrough Example Procedures for MAMA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruggiero, Christy E.; Gaschen, Brian Keith; Bloch, Jeffrey Joseph
This documentation is a growing set of walk-through examples of analyses using the MAMA V2.0 software. It does not cover all the features or possibilities of the MAMA software, but addresses using many of the basic analysis tools to quantify particle size and shape in an image. This document will continue to evolve as additional procedures and examples are added. The starting assumption is that the MAMA software has been successfully installed.
Commonality and Variability Analysis for Xenon Family of Separation Virtual Machine Monitors (CVAX)
2017-07-18
The technical approach is a systematic application of Software Product Line Engineering (SPLE). A systematic application requires describing the family and... engineering the software family... by the evolving open-source Xen hypervisor.
Future Directions for Astronomical Image Display
NASA Technical Reports Server (NTRS)
Mandel, Eric
2000-01-01
In the "Future Directions for Astronomical Image Displav" project, the Smithsonian Astrophysical Observatory (SAO) and the National Optical Astronomy Observatories (NOAO) evolved our existing image display program into fully extensible. cross-platform image display software. We also devised messaging software to support integration of image display into astronomical analysis systems. Finally, we migrated our software from reliance on Unix and the X Window System to a platform-independent architecture that utilizes the cross-platform Tcl/Tk technology.
Automating the design of scientific computing software
NASA Technical Reports Server (NTRS)
Kant, Elaine
1992-01-01
SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.
Business process re-engineering--saviour or just another fad? One UK health care perspective.
Patwardhan, Anjali; Patwardhan, Dhruv
2008-01-01
Pressure to change is politically driven, owing to escalating healthcare costs and an emphasis on efficiency gains, value for money, and improved proof of performance in terms of productivity, and recently, to some extent, by demands from less satisfied patients and stakeholders. Against a background of newly emerging, expensive techniques and drugs, consumer expectations of quality services are increasing. At the same time, health system managers and practitioners are finding it difficult to cope with demand and quality expectations. Clinicians are frustrated because they are not recognised for their contribution; managers are frustrated because meaningful dialogue with clinicians is lacking. This has intensified the need for change to a more efficient system that satisfies all arguments about cost-effectiveness and sustainable quality services. Various strategies, originally developed by management quality "gurus" for engineering industries, have been applied to health industries with variable success, which largely depends on the type of health care system to which they are applied. Business process re-engineering is examined as a quality management tool using past and recent publications. The paper finds that applying business process re-engineering in the right circumstances and selected settings is critical for its success in quality improvement. It is certainly "not for everybody". The paper provides a critical appraisal of business process re-engineering experiences in UK healthcare. Lessons learned regarding selecting organisations and agreeing realistic expectations are addressed. Business process re-engineering has been evaluated and reviewed since 1987 in US managed health care, with no clear lessons learned, possibly because unit selection and simultaneous comparison between two units performing at virtually opposite ends of the spectrum had never been done before. Two UK pilot studies, however, add useful insights.
Software Carpentry: lessons learned
Wilson, Greg
2016-01-01
Since its start in 1998, Software Carpentry has evolved from a week-long training course at the US national laboratories into a worldwide volunteer effort to improve researchers' computing skills. This paper explains what we have learned along the way, the challenges we now face, and our plans for the future. PMID:24715981
Post-Modern Software Development
NASA Technical Reports Server (NTRS)
Filman, Robert E.
2005-01-01
The history of software development includes elements of art, science, engineering, and fashion (though very little manufacturing). In all domains, old ideas give way or evolve to new ones: in the fine arts, the baroque gave way to rococo, romanticism, modernism, postmodernism, and so forth. What is the postmodern programming equivalent? That is, what comes after object orientation?
Automatic Rock Detection and Mapping from HiRISE Imagery
NASA Technical Reports Server (NTRS)
Huertas, Andres; Adams, Douglas S.; Cheng, Yang
2008-01-01
This system includes a C-code software program and a set of MATLAB software tools for statistical analysis and rock distribution mapping. The major functions include rock detection and rock detection validation. The rock detection code has been evolved into a production tool that can be used by engineers and geologists with minor training.
DAME: planetary-prototype drilling automation.
Glass, B; Cannon, H; Branson, M; Hanagud, S; Paulsen, G
2008-06-01
We describe results from the Drilling Automation for Mars Exploration (DAME) project, including those of the summer 2006 tests from an Arctic analog site. The drill hardware is a hardened, evolved version of the Advanced Deep Drill by Honeybee Robotics. DAME has developed diagnostic and executive software for hands-off surface operations of the evolved version of this drill. The DAME drill automation tested from 2004 through 2006 included adaptively controlled drilling operations and the downhole diagnosis of drilling faults. It also included dynamic recovery capabilities when unexpected failures or drilling conditions were discovered. DAME has developed and tested drill automation software and hardware under stressful operating conditions during its Arctic field testing campaigns at a Mars analog site.
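A rule-based sketch of what downhole fault diagnosis with recovery actions might look like follows; the telemetry fields, thresholds, fault names, and recovery actions are all invented for illustration and are not DAME's actual logic.

```python
# Hedged sketch of rule-based drilling fault diagnosis and recovery selection.
FAULT_RULES = [
    # (predicate over telemetry, diagnosed fault, recovery action)
    (lambda t: t["torque"] > 40.0 and t["rate_of_penetration"] < 0.1,
     "bit choking", "retract and clear cuttings"),
    (lambda t: t["vibration"] > 5.0,
     "hard inclusion", "reduce weight on bit"),
]

def diagnose(telemetry):
    """Return (fault, recovery) for the first matching rule, else None."""
    for predicate, fault, recovery in FAULT_RULES:
        if predicate(telemetry):
            return fault, recovery
    return None

print(diagnose({"torque": 45.0, "rate_of_penetration": 0.05, "vibration": 1.2}))
# -> ('bit choking', 'retract and clear cuttings')
```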
DAME: Planetary-Prototype Drilling Automation
NASA Astrophysics Data System (ADS)
Glass, B.; Cannon, H.; Branson, M.; Hanagud, S.; Paulsen, G.
2008-06-01
We describe results from the Drilling Automation for Mars Exploration (DAME) project, including those of the summer 2006 tests from an Arctic analog site. The drill hardware is a hardened, evolved version of the Advanced Deep Drill by Honeybee Robotics. DAME has developed diagnostic and executive software for hands-off surface operations of the evolved version of this drill. The DAME drill automation tested from 2004 through 2006 included adaptively controlled drilling operations and the downhole diagnosis of drilling faults. It also included dynamic recovery capabilities when unexpected failures or drilling conditions were discovered. DAME has developed and tested drill automation software and hardware under stressful operating conditions during its Arctic field testing campaigns at a Mars analog site.
Architecting a Simulation Framework for Model Rehosting
NASA Technical Reports Server (NTRS)
Madden, Michael M.
2004-01-01
The utility of vehicle math models extends beyond human-in-the-loop simulation. It is desirable to deploy a given model across a multitude of applications that target design, analysis, and research. However, the vehicle model alone represents an incomplete simulation. One must also replicate the environment models (e.g., atmosphere, gravity, terrain) to achieve identical vehicle behavior across all applications. Environment models are increasing in complexity and represent a substantial investment to re-engineer for a new application. A software component that can be rehosted in each application is one solution to the deployment problem. The component must encapsulate both the vehicle and environment models. The component must have a well-defined interface that abstracts the bulk of the logic to operate the models. This paper examines the characteristics of a rehostable modeling component from the perspective of a human-in-the-loop simulation framework. The Langley Standard Real-Time Simulation in C++ (LaSRS++) is used as an example. LaSRS++ was recently redesigned to transform its modeling package into a rehostable component.
Designing Control System Application Software for Change
NASA Technical Reports Server (NTRS)
Boulanger, Richard
2001-01-01
The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner which transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change and effort of that change can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be before implementation, improving schedule and budget accuracy.
Hospital social work: contemporary roles and professional activities.
Judd, Rebecca G; Sheffield, Sherry
2010-01-01
Since its inception in the 1900s, hospital social work has been impacted by the ever changing hospital environment. The institution of Diagnostic Related Groups (DRGs), the era of reengineering, and the constant struggle toward health care reform make it necessary to evaluate and substantiate the value and efficacy of social workers in hospital settings. This study identifies current roles and activities carried out by social workers in acute hospital settings from across the nation in the aftermath of reengineering. Findings suggest the primary role of respondents in this study to be discharge planning with little to no involvement in practice research or income-generating activities.
NASA Technical Reports Server (NTRS)
Lehtonen, Ken
2004-01-01
Back in the early 1990s, reengineering was all the rage. All of the corporations and their CEOs got excited about the prospect of having to streamline and reorganize, reengineering their organizations in an effort to improve the bottom line. NASA Goddard Space Flight Center was no exception to that rule. Some folks in upper management wanted to take advantage of this new paradigm and they turned their attention to the Hubble Space Telescope ground system. The objective was to reduce the operating cost of the system by at least 50 percent. This was a noble objective, as Hubble would likely be around for another ten to fifteen years at least.
E-documentation as a process management tool for nursing care in hospitals.
Rajkovic, Uros; Sustersic, Olga; Rajkovic, Vladislav
2009-01-01
Appropriate documentation plays a key role in process management in nursing care. It includes holistic data management based on patient's data along the clinical path with regard to nursing care. We developed an e-documentation model that follows the process method of work in nursing care. It assesses the patient's status on the basis of Henderson's theoretical model of 14 basic living activities and is aligned with internationally recognized nursing classifications. E-documentation development requires reengineering of existing documentation and facilitates process reengineering. A prototype solution of an e-nursing documentation, already being in testing process at University medical centres in Ljubljana and Maribor, will be described.
Research on BIM-based building information value chain reengineering
NASA Astrophysics Data System (ADS)
Hui, Zhao; Weishuang, Xie
2017-04-01
The achievement of value and value-added factor to the building engineering information is accomplished through a chain-flow, that is, building the information value chain. Based on the deconstruction of the information chain on the construction information in the traditional information mode, this paper clarifies the value characteristics and requirements of each stage of the construction project. In order to achieve building information value-added, the paper deconstructs the traditional building information value chain, reengineer the information value chain model on the basis of the theory and techniques of BIM, to build value-added management model and analyse the value of the model.
Qumquad: a UML-based approach for remodeling of legacy systems in health care.
Garde, Sebastian; Knaup, Petra; Herold, Ralf
2003-07-01
Health care information systems still comprise legacy systems to a certain extent. For reengineering legacy systems a thorough remodeling is inalienable. Current modeling techniques like the Unified Modeling Language (UML) do not offer a systematic and comprehensive process-oriented method for remodeling activities. We developed a systematic method for remodeling legacy systems in health care called Qumquad. Qumquad consists of three major steps: (i) modeling the actual state of the application system, (ii) systematic identification of weak points in this model and (iii) development of a target concept for the reimplementation considering the identified weak points. We applied Qumquad for remodeling a documentation and therapy planning system for pediatric oncology (DOSPO). As a result of our remodeling activities we regained an abstract model of the system, an analysis of the current weak points of DOSPO and possible (partly alternative) solutions to overcome the weak points. Qumquad proved to be very helpful in the reengineering process of DOSPO since we now have at our disposal a comprehensive model for the reimplementation of DOSPO that current users of the system agree on. Qumquad can easily be applied to other reengineering projects in health care.
ERIC Educational Resources Information Center
Soderberg, Patti; Price, Frank
2003-01-01
Examines a lesson in which students are engaged in inquiry in evolutionary biology to develop better understanding of concepts and reasoning skills necessary to support knowledge claims about changes in the genetic structure of populations known as microevolution. Explains how a software simulation, EVOLVE, can be used to foster discussions about…
Standalone medical device software: The evolving regulatory framework.
McCarthy, Avril D; Lawford, Patricia V
2014-01-01
The paper provides an introduction to the regulatory landscape affecting a particular category of medical technology, namely standalone software-sometimes referred to as 'software as a medical device'. To aid the reader's comprehension of an often complex area, six case studies are outlined and discussed before the paper continues to provide detail of how software with a medical purpose in its own right can potentially be classified as a medical device. The reader is provided an appreciation of how to go about classifying such software and references to support the developer new to the field in locating detailed regulatory support documents and contact points for advice.
An Evolvable Multi-Agent Approach to Space Operations Engineering
NASA Technical Reports Server (NTRS)
Mandutianu, Sanda; Stoica, Adrian
1999-01-01
A complex system of spacecraft and ground tracking stations, as well as a constellation of satellites or spacecraft, has to be able to reliably withstand sudden environment changes, resource fluctuations, dynamic resource configuration, limited communication bandwidth, etc., while maintaining the consistency of the system as a whole. It is not known in advance when a change in the environment might occur or when a particular exchange will happen. A higher degree of sophistication for the communication mechanisms between different parts of the system is required. The actual behavior has to be determined while the system is performing and the course of action can be decided at the individual level. Under such circumstances, the solution will highly benefit from increased on-board and on the ground adaptability and autonomy. An evolvable architecture based on intelligent agents that communicate and cooperate with each other can offer advantages in this direction. This paper presents an architecture of an evolvable agent-based system (software and software/hardware hybrids) as well as some plans for further implementation.
Using CASE to Exploit Process Modeling in Technology Transfer
NASA Technical Reports Server (NTRS)
Renz-Olar, Cheryl
2003-01-01
A successful business will be one that has processes in place to run that business. Creating processes, reengineering processes, and continually improving processes can be accomplished through extensive modeling. Casewise(R) Corporate Modeler(TM) CASE is a computer aided software engineering tool that will enable the Technology Transfer Department (TT) at NASA Marshall Space Flight Center (MSFC) to capture these abilities. After successful implementation of CASE, it could then go on to be applied in other departments at MSFC and other centers at NASA. The success of a business process is dependent upon the players working as a team and continuously improving the process. A good process fosters customer satisfaction as well as internal satisfaction in the organizational infrastructure. CASE provides a method for business process success through functions consisting of systems and processes business models; specialized diagrams; matrix management; simulation; report generation and publishing; and, linking, importing, and exporting documents and files. The software has an underlying repository or database to support these functions. The Casewise. manual informs us that dynamics modeling is a technique used in business design and analysis. Feedback is used as a tool for the end users and generates different ways of dealing with the process. Feedback on this project resulted from collection of issues through a systems analyst interface approach of interviews with process coordinators and Technical Points of Contact (TPOCs).
NASA Astrophysics Data System (ADS)
Leu, Jun-Der; Lee, Larry Jung-Hsing
2017-09-01
Enterprise resource planning (ERP) is a software solution that integrates the operational processes of the business functions of an enterprise. However, implementing ERP systems is a complex process. In addition to the technical issues, companies must address problems associated with business process re-engineering, time and budget control, and organisational change. Numerous industrial studies have shown that the failure rate of ERP implementation is high, even for well-designed systems. Thus, ERP projects typically require a clear methodology to support the project execution and effectiveness. In this study, we propose a theoretical model for ERP implementation. The value engineering (VE) method forms the basis of the proposed framework, which integrates Six Sigma tools. The proposed framework encompasses five phases: knowledge generation, analysis, creation, development and execution. In the VE method, potential ERP problems related to software, hardware, consultation and organisation are analysed in a group-decision manner and in relation to value, and Six Sigma tools are applied to avoid any project defects. We validate the feasibility of the proposed model by applying it to an international manufacturing enterprise in Taiwan. The results show improvements in customer response time and operational efficiency in terms of work-in-process and turnover of materials. Based on the evidence from the case study, the theoretical framework is discussed together with the study's limitations and suggestions for future research.
Swarming Robot Design, Construction and Software Implementation
NASA Technical Reports Server (NTRS)
Stolleis, Karl A.
2014-01-01
In this paper is presented an overview of the hardware design, construction overview, software design and software implementation for a small, low-cost robot to be used for swarming robot development. In addition to the work done on the robot, a full simulation of the robotic system was developed using Robot Operating System (ROS) and its associated simulation. The eventual use of the robots will be exploration of evolving behaviors via genetic algorithms and builds on the work done at the University of New Mexico Biological Computation Lab.
A Brief Assessment of LC2IEDM, MIST and Web Services for use in Naval Tactical Data Management
2004-07-01
server software, messaging between the client and server, and a database. The MIST database is implemented in an open source DBMS named PostGreSQL ... PostGreSQL had its beginnings at the University of California, Berkley, in 1986 [11]. The development of PostGreSQL has since evolved into a...contact history from the database. DRDC Atlantic TM 2004-148 9 Request Software Request Software Server Side Response from service
NASA Technical Reports Server (NTRS)
1994-01-01
A software management system, originally developed for Goddard Space Flight Center (GSFC) by Century Computing, Inc. has evolved from a menu and command oriented system to a state-of-the art user interface development system supporting high resolution graphics workstations. Transportable Applications Environment (TAE) was initially distributed through COSMIC and backed by a TAE support office at GSFC. In 1993, Century Computing assumed the support and distribution functions and began marketing TAE Plus, the system's latest version. The software is easy to use and does not require programming experience.
Toward Evolvable Hardware Chips: Experiments with a Programmable Transistor Array
NASA Technical Reports Server (NTRS)
Stoica, Adrian
1998-01-01
Evolvable Hardware is reconfigurable hardware that self-configures under the control of an evolutionary algorithm. We search for a hardware configuration can be performed using software models or, faster and more accurate, directly in reconfigurable hardware. Several experiments have demonstrated the possibility to automatically synthesize both digital and analog circuits. The paper introduces an approach to automated synthesis of CMOS circuits, based on evolution on a Programmable Transistor Array (PTA). The approach is illustrated with a software experiment showing evolutionary synthesis of a circuit with a desired DC characteristic. A hardware implementation of a test PTA chip is then described, and the same evolutionary experiment is performed on the chip demonstrating circuit synthesis/self-configuration directly in hardware.
A New Generation of Telecommunications for Mars: The Reconfigurable Software Radio
NASA Technical Reports Server (NTRS)
Adams, J.; Horne, W.
2000-01-01
Telecommunications is a critical component for any mission at Mars as it is an enabling function that provides connectivity back to Earth and provides a means for conducting science. New developments in telecommunications, specifically in software - configurable radios, expand the possible approaches for science missions at Mars. These radios provide a flexible and re-configurable platform that can evolve with the mission and that provide an integrated approach to communications and science data processing. Deep space telecommunication faces challenges not normally faced by terrestrial and near-earth communications. Radiation, thermal, highly constrained mass, volume, packaging and reliability all are significant issues. Additionally, once the spacecraft leaves earth, there is no way to go out and upgrade or replace radio components. The reconfigurable software radio is an effort to provide not only a product that is immediately usable in the harsh space environment but also to develop a radio that will stay current as the years pass and technologies evolve.
The Exercise–Affect–Adherence Pathway: An Evolutionary Perspective
Lee, Harold H.; Emerson, Jessica A.; Williams, David M.
2016-01-01
The low rates of regular exercise and overall physical activity (PA) in the general population represent a significant public health challenge. Previous research suggests that, for many people, exercise leads to a negative affective response and, in turn, reduced likelihood of future exercise. The purpose of this paper is to examine this exercise–affect–adherence relationship from an evolutionary perspective. Specifically, we argue that low rates of physical exercise in the general population are a function of the evolved human tendency to avoid unnecessary physical exertion. This innate tendency evolved because it allowed our evolutionary ancestors to conserve energy for physical activities that had immediate adaptive utility such as pursuing prey, escaping predators, and engaging in social and reproductive behaviors. The commonly observed negative affective response to exercise is an evolved proximate psychological mechanism through which humans avoid unnecessary energy expenditure. The fact that the human tendencies toward negative affective response to and avoidance of unnecessary physical activities are innate does not mean that they are unchangeable. Indeed, it is only because of human-engineered changes in our environmental conditions (i.e., it is no longer necessary for us to work for our food) that our predisposition to avoid unnecessary physical exertion has become a liability. Thus, it is well within our capabilities to reengineer our environments to once again make PA necessary or, at least, to serve an immediate functional purpose. We propose a two-pronged approach to PA promotion based on this evolutionary functional perspective: first, to promote exercise and other physical activities that are perceived to have an immediate purpose, and second, to instill greater perceived purpose for a wider range of physical activities. We posit that these strategies are more likely to result in more positive (or less negative) affective responses to exercise, better adherence to exercise programs, and higher rates of overall PA. PMID:27610096
NASA Astrophysics Data System (ADS)
Wesley, Hema; Sheshadri, Geetha
2015-08-01
Science publishing and its procedures have evolved rapidly and incredibly. Technical advances in the production and promotion of science content have dramatically widened the visibility and reach, deepened the impact and intensified the thrust of our journals’ science content. Nevertheless, it is also true that with speed and ease of automation, quality is to be ferociously guarded. Could traditional methods when juxtaposed with the continuous onslaught of newer technologies, in our publishing procedures, be the answer. Are people still the quintessential element in technology?Why do we publish; the phrase “Publish or Perish” has itself, interestingly, evolved in its meaning. How have publishing procedures evolved through the years; what is the impact of these developments; what are the challenges and opportunities for scholarly publishers in this digital age. How do we cope with and preempt the abuse and distortion of internet data, and with unethical practices that threaten to cripple the peer review system, and jeopardize publishing ethics and academic integrity. How can publishers fulfill their responsibility to publish content that is explicitly original.With increasing demand from the astronomical community for e-journals and other e-resources and with the gradual loss of value for printed journals, librarians are developing and maintaining e-libraries and providing access to digital resources, and re-engineering their roles to provide advancing tools for e-pub content.How can we prepare for the future of publishing, or is it already here with the popularity of pre-print servers. How have publishers responded to the Open Access model while being aware that ‘the price of keeping something free comes with a cost’. Will potential authors be confused with identifying traditional peer reviewed content, the predatory journals, the fake reviewers, placing research in institutional repositories, copyright and licenses. Are libraries impacted by these developments in their journal pricing schemes.
Configuring the Orion Guidance, Navigation, and Control Flight Software for Automated Sequencing
NASA Technical Reports Server (NTRS)
Odegard, Ryan G.; Siliwinski, Tomasz K.; King, Ellis T.; Hart, Jeremy J.
2010-01-01
The Orion Crew Exploration Vehicle is being designed with greater automation capabilities than any other crewed spacecraft in NASA s history. The Guidance, Navigation, and Control (GN&C) flight software architecture is designed to provide a flexible and evolvable framework that accommodates increasing levels of automation over time. Within the GN&C flight software, a data-driven approach is used to configure software. This approach allows data reconfiguration and updates to automated sequences without requiring recompilation of the software. Because of the great dependency of the automation and the flight software on the configuration data, the data management is a vital component of the processes for software certification, mission design, and flight operations. To enable the automated sequencing and data configuration of the GN&C subsystem on Orion, a desktop database configuration tool has been developed. The database tool allows the specification of the GN&C activity sequences, the automated transitions in the software, and the corresponding parameter reconfigurations. These aspects of the GN&C automation on Orion are all coordinated via data management, and the database tool provides the ability to test the automation capabilities during the development of the GN&C software. In addition to providing the infrastructure to manage the GN&C automation, the database tool has been designed with capabilities to import and export artifacts for simulation analysis and documentation purposes. Furthermore, the database configuration tool, currently used to manage simulation data, is envisioned to evolve into a mission planning tool for generating and testing GN&C software sequences and configurations. A key enabler of the GN&C automation design, the database tool allows both the creation and maintenance of the data artifacts, as well as serving the critical role of helping to manage, visualize, and understand the data-driven parameters both during software development and throughout the life of the Orion project.
NASA Technical Reports Server (NTRS)
Bailey, Brandon
2015-01-01
Historically security within organizations was thought of as an IT function (web sites/servers, email, workstation patching, etc.) Threat landscape has evolved (Script Kiddies, Hackers, Advanced Persistent Threat (APT), Nation States, etc.) Attack surface has expanded -Networks interconnected!! Some security posture factors Network Layer (Routers, Firewalls, etc.) Computer Network Defense (IPS/IDS, Sensors, Continuous Monitoring, etc.) Industrial Control Systems (ICS) Software Security (COTS, FOSS, Custom, etc.)
ERIC Educational Resources Information Center
Kakar, Adarsh Kumar
2013-01-01
Feature selection is one of the most important decisions made by product managers. This three article study investigates the concepts, tools and techniques for making trade-off decisions of introducing new features in evolving Commercial-Off-The-Shelf (COTS) software products. The first article investigates the efficacy of various feature…
ERIC Educational Resources Information Center
Mirriahi, Negin; Liaqat, Daniyal; Dawson, Shane; Gaševic, Dragan
2016-01-01
This study explores the types of learning profiles that evolve from student use of video annotation software for reflective learning. The data traces from student use of the software were analysed across four undergraduate courses with differing instructional conditions. That is, the use of graded or non-graded self-reflective annotations. Using…
NASA Technical Reports Server (NTRS)
Rasmussen, Robert; Bennett, Matthew
2006-01-01
The State Analysis Database Tool software establishes a productive environment for collaboration among software and system engineers engaged in the development of complex interacting systems. The tool embodies State Analysis, a model-based system engineering methodology founded on a state-based control architecture (see figure). A state represents a momentary condition of an evolving system, and a model may describe how a state evolves and is affected by other states. The State Analysis methodology is a process for capturing system and software requirements in the form of explicit models and states, and defining goal-based operational plans consistent with the models. Requirements, models, and operational concerns have traditionally been documented in a variety of system engineering artifacts that address different aspects of a mission s lifecycle. In State Analysis, requirements, models, and operations information are State Analysis artifacts that are consistent and stored in a State Analysis Database. The tool includes a back-end database, a multi-platform front-end client, and Web-based administrative functions. The tool is structured to prompt an engineer to follow the State Analysis methodology, to encourage state discovery and model description, and to make software requirements and operations plans consistent with model descriptions.
Downsizing, reengineering, and restructuring: long-term implications for healthcare organizations.
Leatt, P; Baker, G R; Halverson, P K; Aird, C
1997-01-01
This article provides a framework for analyzing how downsizing and reengineering have affected healthcare organizations. These approaches are reviewed, and key tools that have been used, such as across-the-board cuts, reorganizing, and redesigning, are described. Examples are drawn from healthcare as well as other business sectors. The consequences of cost reduction strategies for an organizations's performance in terms of costs, quality of services, and satisfaction of consumers and employees are explored. The case is made that an organization's context--that is, its culture, level of trust, and leadership--is an important factor that influences the effect of cost-cutting strategies. Characteristics of organizations where downsizing has a better chance of succeeding also are described.
IDC Re-Engineering Phase 2 System Specification Document Version 1.5
DOE Office of Scientific and Technical Information (OSTI.GOV)
Satpathi, Meara Allena; Burns, John F.; Harris, James M.
This document contains the system specifications derived to satisfy the system requirements found in the IDC System Requirements Document for the IDC Re-Engineering Phase 2 project. This System Specification Document (SSD) defines waveform data processing requirements for the International Data Centre (IDC) of the Comprehensive Nuclear Test Ban Treaty Organization (CTBTO). The routine processing includes characterization of events with the objective of screening out events considered to be consistent with natural phenomena or non-nuclear, man-made phenomena. This document does not address requirements concerning acquisition, processing and analysis of radionuclide data but does include requirements for the dissemination of radionuclide datamore » and products.« less
Reengineering the corporate culture of hospitals.
Johnson, J; Billingsley, M
1997-01-01
As we move closer to the turn of the century, the American health care system continues to provide management experts with a wealth of case studies describing the winners and losers in today's business environment. Winners in this cost-conscious era of institutional reengineering, said Senge in his best-selling book The Fifth Discipline (1990), are those organizations that bind together around a common identity, a sense of destiny, and a shared commitment to reconciling the existing threats to their survival. Senge suggests that the organizations that are able to tap people's commitment and capacity to learn at all levels of the structure will overcome the type of maladaptation to business threats so commonly seen in corporate failures today.
Vabbilisetty, Pratima; Boron, Mallorie; Nie, Huan; Ozhegov, Evgeny; Sun, Xue-Long
2018-02-28
Introduction of selectively chemical reactive groups at the cell surface enables site-specific cell surface labeling and modification opportunity, thus facilitating the capability to study the cell surface molecular structure and function and the molecular mechanism it underlies. Further, it offers the opportunity to change or improve a cell's functionality for interest of choice. In this study, two chemical reactive anchor lipids, phosphatidylethanolamine-poly(ethylene glycol)-dibenzocyclooctyne (DSPE-PEG 2000 -DBCO) and cholesterol-PEG-dibenzocyclooctyne (CHOL-PEG 2000 -DBCO) were synthesized and their potential application for cell surface re-engineering via lipid fusion were assessed with RAW 264.7 cells as a model cell. Briefly, RAW 264.7 cells were incubated with anchor lipids under various concentrations and at different incubation times. The successful incorporation of the chemical reactive anchor lipids was confirmed by biotinylation via copper-free click chemistry, followed by streptavidin-fluorescein isothiocyanate binding. In comparison, the cholesterol-based anchor lipid afforded a higher cell membrane incorporation efficiency with less internalization than the phospholipid-based anchor lipid. Low cytotoxicity of both anchor lipids upon incorporation into the RAW 264.7 cells was observed. Further, the cell membrane residence time of the cholesterol-based anchor lipid was evaluated with confocal microscopy. This study suggests the potential cell surface re-engineering applications of the chemical reactive anchor lipids.
The Virtual Astronomical Observatory: Re-engineering access to astronomical data
NASA Astrophysics Data System (ADS)
Hanisch, R. J.; Berriman, G. B.; Lazio, T. J. W.; Emery Bunn, S.; Evans, J.; McGlynn, T. A.; Plante, R.
2015-06-01
The US Virtual Astronomical Observatory was a software infrastructure and development project designed both to begin the establishment of an operational Virtual Observatory (VO) and to provide the US coordination with the international VO effort. The concept of the VO is to provide the means by which an astronomer is able to discover, access, and process data seamlessly, regardless of its physical location. This paper describes the origins of the VAO, including the predecessor efforts within the US National Virtual Observatory, and summarizes its main accomplishments. These accomplishments include the development of both scripting toolkits that allow scientists to incorporate VO data directly into their reduction and analysis environments and high-level science applications for data discovery, integration, analysis, and catalog cross-comparison. Working with the international community, and based on the experience from the software development, the VAO was a major contributor to international standards within the International Virtual Observatory Alliance. The VAO also demonstrated how an operational virtual observatory could be deployed, providing a robust operational environment in which VO services worldwide were routinely checked for aliveness and compliance with international standards. Finally, the VAO engaged in community outreach, developing a comprehensive web site with on-line tutorials, announcements, links to both US and internationally developed tools and services, and exhibits and hands-on training at annual meetings of the American Astronomical Society and through summer schools and community days. All digital products of the VAO Project, including software, documentation, and tutorials, are stored in a repository for community access. The enduring legacy of the VAO is an increasing expectation that new telescopes and facilities incorporate VO capabilities during the design of their data management systems.
Agent-based re-engineering of ErbB signaling: a modeling pipeline for integrative systems biology.
Das, Arya A; Ajayakumar Darsana, T; Jacob, Elizabeth
2017-03-01
Experiments in systems biology are generally supported by a computational model which quantitatively estimates the parameters of the system by finding the best fit to the experiment. Mathematical models have proved to be successful in reverse engineering the system. The data generated is interpreted to understand the dynamics of the underlying phenomena. The question we have sought to answer is that - is it possible to use an agent-based approach to re-engineer a biological process, making use of the available knowledge from experimental and modelling efforts? Can the bottom-up approach benefit from the top-down exercise so as to create an integrated modelling formalism for systems biology? We propose a modelling pipeline that learns from the data given by reverse engineering, and uses it for re-engineering the system, to carry out in-silico experiments. A mathematical model that quantitatively predicts co-expression of EGFR-HER2 receptors in activation and trafficking has been taken for this study. The pipeline architecture takes cues from the population model that gives the rates of biochemical reactions, to formulate knowledge-based rules for the particle model. Agent-based simulations using these rules, support the existing facts on EGFR-HER2 dynamics. We conclude that, re-engineering models, built using the results of reverse engineering, opens up the possibility of harnessing the power pack of data which now lies scattered in literature. Virtual experiments could then become more realistic when empowered with the findings of empirical cell biology and modelling studies. Implemented on the Agent Modelling Framework developed in-house. C ++ code templates available in Supplementary material . liz.csir@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Frank Gilbreth and health care delivery method study driven learning.
Towill, Denis R
2009-01-01
The purpose of this article is to look at method study, as devised by the Gilbreths at the beginning of the twentieth century, which found early application in hospital quality assurance and surgical "best practice". It has since become a core activity in all modern methods, as applied to healthcare delivery improvement programmes. The article traces the origin of what is now currently and variously called "business process re-engineering", "business process improvement" and "lean healthcare" etc., by different management gurus back to the century-old pioneering work of Frank Gilbreth. The outcome is a consistent framework involving "width", "length" and "depth" dimensions within which healthcare delivery systems can be analysed, designed and successfully implemented to achieve better and more consistent performance. Healthcare method (saving time plus saving motion) study is best practised as co-joint action learning activity "owned" by all "players" involved in the re-engineering process. However, although process mapping is a key step forward, in itself it is no guarantee of effective re-engineering. It is not even the beginning of the end of the change challenge, although it should be the end of the beginning. What is needed is innovative exploitation of method study within a healthcare organisational learning culture accelerated via the Gilbreth Knowledge Flywheel. It is shown that effective healthcare delivery pipeline improvement is anchored into a team approach involving all "players" in the system especially physicians. A comprehensive process study, constructive dialogue, proper and highly professional re-engineering plus managed implementation are essential components. Experience suggests "learning" is thereby achieved via "natural groups" actively involved in healthcare processes. The article provides a proven method for exploiting Gilbreths' outputs and their many successors in enabling more productive evidence-based healthcare delivery as summarised in the "learn-do-learn-do" feedback loop in the Gilbreth Knowledge Flywheel.
NASA Technical Reports Server (NTRS)
Marius, Julio L.; Busch, Jim
2008-01-01
The Tropical Rainfall Measuring Mission (TRMM) spacecraft was launched in November of 1996 in order to obtain unique three dimensional radar cross sectional observations of cloud structures with particular interest in hurricanes. The TRMM mission life was recently extended with current estimates that operations will continue through the 2012-2013 timeframe. Faced with this extended mission profile, the project has embarked on a technology refresh and re-engineering effort. TRMM has recently implemented a re-engineering effort to expand a middleware based messaging architecture to enable fully redundant lights-out of flight operations activities. The middleware approach is based on the Goddard Mission Services Evolution Center (GMSEC) architecture, tools and associated open-source Applications Programming Interface (API). Middleware based messaging systems are useful in spacecraft operations and automation systems because private node based knowledge (such as that within a telemetry and command system) can be broadcast on the middleware messaging bus and hence enable collaborative decisions to be made by multiple subsystems. In this fashion, private data is made public and distributed within the local area network and multiple nodes can remain synchronized with other nodes. This concept is useful in a fully redundant architecture whereby one node is monitoring the processing of the 'prime' node so that in the event of a failure the backup node can assume operations of the prime, without loss of state knowledge. This paper will review and present the experiences, architecture, approach and lessons learned of the TRMM re-engineering effort centered on the GMSEC middleware architecture and tool suite. Relevant information will be presented that relates to the dual redundant parallel nature of the Telemetry and Command (T and C) and Front-End systems and how these systems can interact over a middleware bus to achieve autonomous operations including autonomous commanding to recover missing science data during the same spacecraft contact.
Automatic Nanodesign Using Evolutionary Techniques
NASA Technical Reports Server (NTRS)
Globus, Al; Saini, Subhash (Technical Monitor)
1998-01-01
Many problems associated with the development of nanotechnology require custom designed molecules. We use genetic graph software, a new development, to automatically evolve molecules of interest when only the requirements are known. Genetic graph software designs molecules, and potentially nanoelectronic circuits, given a fitness function that determines which of two molecules is better. A set of molecules, the first generation, is generated at random then tested with the fitness function, Subsequent generations are created by randomly choosing two parent molecules with a bias towards high scoring molecules, tearing each molecules in two at random, and mating parts from the mother and father to create two children. This procedure is repeated until a satisfactory molecule is found. An atom pair similarity test is currently used as the fitness function to evolve molecules similar to existing pharmaceuticals.
AI tools in computer based problem solving
NASA Technical Reports Server (NTRS)
Beane, Arthur J.
1988-01-01
The use of computers to solve value oriented, deterministic, algorithmic problems, has evolved a structured life cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different model, much less structured. Traditionally, the two approaches have been used completely independently. With the advent of low cost, high performance 32 bit workstations executing identical software with large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.
NASA Astrophysics Data System (ADS)
Pozna, E.; Ramirez, A.; Mérand, A.; Mueller, A.; Abuter, R.; Frahm, R.; Morel, S.; Schmid, C.; Duc, T. Phan; Delplancke-Ströbele, F.
2014-07-01
The quality of data obtained by VLTI instruments may be refined by analyzing the continuous data supplied by the Reflective Memory Network (RMN). Based on 5 years experience providing VLTI instruments (PACMAN, AMBER, MIDI) with RMN data, the procedure has been generalized to make the synchronization with observation trouble-free. The present software interface saves not only months of efforts for each instrument but also provides the benefits of software frameworks. Recent applications (GRAVITY, MATISSE) supply feedback for the software to evolve. The paper highlights the way common features been identified to be able to offer reusable code in due course.
Domain Drivers in the Modularization of FLOSS Systems
NASA Astrophysics Data System (ADS)
Capiluppi, Andrea
The classification of software systems into types has been achieved in the past by observing both their specifications and behavioral patterns: the SPE classification, for instance, and its further supplements and refinements, has identified the S-type (i.e., fully specified), the P-type (i.e., specified but dependent on the context) and the E-type (i.e., addressing evolving problems) among the software systems.
Hardware Evolution of Closed-Loop Controller Designs
NASA Technical Reports Server (NTRS)
Gwaltney, David; Ferguson, Ian
2002-01-01
Poster presentation will outline on-going efforts at NASA, MSFC to employ various Evolvable Hardware experimental platforms in the evolution of digital and analog circuitry for application to automatic control. Included will be information concerning the application of commercially available hardware and software along with the use of the JPL developed FPTA2 integrated circuit and supporting JPL developed software. Results to date will be presented.
NASA Astrophysics Data System (ADS)
Vabbilisetty, Pratima
For decades, lipid vesicular bodies such as liposomes have been widely used and explored as biomimetic models of cell membranes and as drug/gene delivery carrier systems. Similarly, micellar iron oxide nanoparticles have also been investigated as potential MRI agents as well as drug delivery carrier systems. Cell surface carbohydrate-protein interactions allow them to serve as markers for recognition of many molecular and cellular activities thereby, are exploited as attractive molecules for surface modification of nanocarrier systems with purpose for tissues specific targeting and biocompatibility. In addition, the cell lipid membrane serves as an important platform for occurrence of many biological processes that are governed and guided by cell surface receptors. Introduction of chemoselective functional groups, via bio-orthogonal conjugation strategies, at the cell surface facilitates many cellular modifications and paves path for novel and potential biomedical applications. Anchoring lipids are needed for liposome surface functionalization with ligands of interest and play important roles in ligand grafting density, liposomes stability and biological activity. On the other hand, anchoring lipids are also needed for cell surface re-engineering by lipid fusion approach and have high impact for ligand insertion efficiency and biological activity. Overall, in this dissertation study, functional anchoring lipids for glyco-functionalized carrier systems and for efficient cell surface re-engineering applications were systematically investigated, respectively. Firstly, investigation of the synthesis of glyco-functionalized liposome systems based on phosphatidylethonalamine (PE) and cholesterol (Chol) anchoring lipids, prepared by post chemically selective functionalization via Staudinger ligation were carried out. The effect of anchor lipids on the stability, encapsulation and releasing capacity of the glycosylated liposomes were investigated by dynamic light scattering (DLS) technique and by entrapping 5, 6-carboxyfluorescein (CF) dye and monitoring the fluorescence leakage, respectively. Overall, the Chol-anchored liposomes showed faster releasing rate than DSPE-anchored liposomes. This could be due to the increase in rigidity of the lipid membrane upon inclusion of Chol, thereby, leading to fast leakage of liposomes. Second, the potential effects of phospholipid (PE) and cholesterol (Chol)-based anchor lipids on cell surface re-engineering via copper free click chemistry were assessed with RAW 264.7 cells as model. The confocal microscopy and flow cytometry results indicated the successful incorporation of biotinylated Chol-based anchor lipids after specific streptavidin-FITC binding onto the cell surface. Higher fluorescence intensities from the cell membrane were observed for Chol-based anchor lipids when compared to DSPE as anchoring lipid. Furthermore, cytotoxicity of the synthesized biotinylated anchor lipids on the RAW 264.7 cells was assessed by MTT assay. The MTT assay results further confirmed that cell surface re-engineering via lipid anchoring approach strategy has very little or negligible amount of cytotoxicity on the cell viability. Thus, this study suggests the possible use of these lipids for potential cell surface re-engineering applications. In addition, synthesis of lipid coated iron oxide nanoparticles via dual solvent exchange approach and their glyco-functionalization via Staudinger ligation were investigated and characterized by FT-IR and TEM techniques. 
The stability of iron oxide nanoparticles with varying compositions of lipid anchors was evaluated by dynamic light scattering technique.
Application of Domain Knowledge to Software Quality Assurance
NASA Technical Reports Server (NTRS)
Wild, Christian W.
1997-01-01
This work focused on capturing, using, and evolving a qualitative decision support structure across the life cycle of a project. The particular application of this study was towards business process reengineering and the representation of the business process in a set of Business Rules (BR). In this work, we defined a decision model which captured the qualitative decision deliberation process. It represented arguments both for and against proposed alternatives to a problem. It was felt that the subjective nature of many critical business policy decisions required a qualitative modeling approach similar to that of Lee and Mylopoulos. While previous work was limited almost exclusively to the decision capture phase, which occurs early in the project life cycle, we investigated the use of such a model during the later stages as well. One of our significant developments was the use of the decision model during the operational phase of a project. By operational phase, we mean the phase in which the system or set of policies which were earlier decided are deployed and put into practice. By making the decision model available to operational decision makers, they would have access to the arguments pro and con for a variety of actions and can thus make a more informed decision which balances the often conflicting criteria by which the value of action is measured. We also developed the concept of a 'monitored decision' in which metrics of performance were identified during the decision making process and used to evaluate the quality of that decision. It is important to monitor those decision which seem at highest risk of not meeting their stated objectives. Operational decisions are also potentially high risk decisions. Finally, we investigated the use of performance metrics for monitored decisions and audit logs of operational decisions in order to feed an evolutionary phase of the the life cycle. During evolution, decisions are revisisted, assumptions verified or refuted, and possible reassessments resulting in new policy are made. In this regard we implemented a machine learning algorithm which automatically defined business rules based on expert assessment of the quality of operational decisions as recorded during deployment.
NASA Astrophysics Data System (ADS)
Teague, Kelly K.; Smith, G. Louis; Priestley, Kory; Lukashin, Constantine; Roithmayr, Carlos
2012-09-01
Five CERES scanning radiometers have been flown to date. The Proto-Flight Model flew aboard the Tropical Rainfall Measurement Mission spacecraft in November 1997. Two CERES instruments, Flight Models (FM) 1 and 2, are aboard the Terra spacecraft, which was launched in December 1999. Two more CERES instruments, FM-3 and FM-4, are on the Aqua spacecraft, which was placed in orbit in May 2002. These instruments continue to operate after providing over a decade of Earth Radiation Budget data. The CERES FM-5 instrument, onboard the Suomi-NPP spacecraft, launched in October 2011. The CERES FM- 6 instrument is manifested on the JPPS-1 spacecraft to be launched in December 2016. A successor to these instruments is presently in the definition stage. This paper describes the evolving role of flight software in the operation of these instruments to meet the Science objectives of the mission and also the ability to execute supplemental tasks as they evolve. In order to obtain and maintain high accuracy in the data products from these instruments, a number of operational activities have been developed and implemented since the instruments were originally designed and placed in orbit. These new activities are possible because of the ability to exploit and modify the flight software, which operates the instruments. The CERES Flight Software interface was designed to allow for on-orbit modification, and as such, constantly evolves to meet changing needs. The purpose of this paper is to provide a brief overview of modifications which have been developed to allow dedicated targeting of specific geographic locations as the CERES sensor flies overhead on its host spacecraft. This new observing strategy greatly increases the temporal and angular sampling for specific targets of high scientific interest.
Computing in high-energy physics
Mount, Richard P.
2016-05-31
I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.
Computing in high-energy physics
NASA Astrophysics Data System (ADS)
Mount, Richard P.
2016-04-01
I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Finally, I describe recent developments aimed at improving the overall coherence of high-energy physics software.
Computing in high-energy physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mount, Richard P.
I present a very personalized journey through more than three decades of computing for experimental high-energy physics, pointing out the enduring lessons that I learned. This is followed by a vision of how the computing environment will evolve in the coming ten years and the technical challenges that this will bring. I then address the scale and cost of high-energy physics software and examine the many current and future challenges, particularly those of management, funding and software-lifecycle management. Lastly, I describe recent developments aimed at improving the overall coherence of high-energy physics software.
NASA Technical Reports Server (NTRS)
Mckay, Charles W.; Feagin, Terry; Bishop, Peter C.; Hallum, Cecil R.; Freedman, Glenn B.
1987-01-01
The principle focus of one of the RICIS (Research Institute for Computing and Information Systems) components is computer systems and software engineering in-the-large of the lifecycle of large, complex, distributed systems which: (1) evolve incrementally over a long time; (2) contain non-stop components; and (3) must simultaneously satisfy a prioritized balance of mission and safety critical requirements at run time. This focus is extremely important because of the contribution of the scaling direction problem to the current software crisis. The Computer Systems and Software Engineering (CSSE) component addresses the lifestyle issues of three environments: host, integration, and target.
Evolving the ECSS Standards and their Use: Experience Based on Industrial Case Studies
NASA Astrophysics Data System (ADS)
Feldt, R.; Ahmad, E.; Raza, B.; Hult, E.; Nordebäck, T.
2009-05-01
This paper introduces two case studies conducted at two Swedish companies developing software for the space industry. The overall goal of the project is to evaluate if current use of ECSS is cost efficient and if there are ways to make the process leaner while maintaining quality. The case studies reported on here focused on how the ECSS standard was used by the companies and how that affected software development processes and software quality. This paper describes the results and recommendations based on identified challenges.
NASA Technical Reports Server (NTRS)
Lehtonen, Ken
1999-01-01
This is a report to the Third Annual International Virtual Company Conference, on The Development of a Virtual Company to Support the Reengineering of the NASA/Goddard Hubble Space Telescope (HST) Control Center System. It begins with a HST Science "Commercial": Brief Tour of Our Universe showing various pictures taken from the Hubble Space Telescope. The presentation then reviews the project background and goals. Evolution of the Control Center System ("CCS Inc.") is then reviewed. Topics of Interest to "virtual companies" are reviewed: (1) "How To Choose A Team" (2) "Organizational Model" (3) "The Human Component" (4) "'Virtual Trust' Among Teaming Companies" (5) "Unique Challenges to Working Horizontally" (6) "The Cultural Impact" (7) "Lessons Learned".
Construction of RNA nanocages by re-engineering the packaging RNA of Phi29 bacteriophage
NASA Astrophysics Data System (ADS)
Hao, Chenhui; Li, Xiang; Tian, Cheng; Jiang, Wen; Wang, Guansong; Mao, Chengde
2014-05-01
RNA nanotechnology promises rational design of RNA nanostructures with wide array of structural diversities and functionalities. Such nanostructures could be used in applications such as small interfering RNA delivery and organization of in vivo chemical reactions. Though having impressive development in recent years, RNA nanotechnology is still quite limited and its programmability and complexity could not rival the degree of its closely related cousin: DNA nanotechnology. Novel strategies are needed for programmed RNA self-assembly. Here, we have assembled RNA nanocages by re-engineering a natural, biological RNA motif: the packaging RNA of phi29 bacteriophage. The resulting RNA nanostructures have been thoroughly characterized by gel electrophoresis, cryogenic electron microscopy imaging and dynamic light scattering.
Medina-Ramírez, Max; Garces, Fernando; Escolano, Amelia; ...
2017-08-28
Induction of broadly neutralizing antibodies (bNAbs) by HIV-1 envelope glycoprotein immunogens would be a major advance toward an effective vaccine. A critical step in this process is the activation of naive B cells expressing germline (gl) antibody precursors that have the potential to evolve into bNAbs. Here, we reengineered the BG505 SOSIP.664 glycoprotein to engage gl precursors of bNAbs that target either the trimer apex or the CD4-binding site. The resulting BG505 SOSIP.v4.1-GT1 trimer binds multiple bNAb gl precursors in vitro. Immunization experiments in knock-in mice expressing gl-VRC01 or gl-PGT121 show that this trimer activates B cells in vivo, resultingmore » in the secretion of specific antibodies into the sera. A crystal structure of the gl-targeting trimer at 3.2-Å resolution in complex with neutralizing antibodies 35O22 and 9H+109L reveals a native-like conformation and the successful incorporation of design features associated with binding of multiple gl-bNAb precursors.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Medina-Ramírez, Max; Garces, Fernando; Escolano, Amelia
Induction of broadly neutralizing antibodies (bNAbs) by HIV-1 envelope glycoprotein immunogens would be a major advance toward an effective vaccine. A critical step in this process is the activation of naive B cells expressing germline (gl) antibody precursors that have the potential to evolve into bNAbs. Here, we reengineered the BG505 SOSIP.664 glycoprotein to engage gl precursors of bNAbs that target either the trimer apex or the CD4-binding site. The resulting BG505 SOSIP.v4.1-GT1 trimer binds multiple bNAb gl precursors in vitro. Immunization experiments in knock-in mice expressing gl-VRC01 or gl-PGT121 show that this trimer activates B cells in vivo, resultingmore » in the secretion of specific antibodies into the sera. A crystal structure of the gl-targeting trimer at 3.2-Å resolution in complex with neutralizing antibodies 35O22 and 9H+109L reveals a native-like conformation and the successful incorporation of design features associated with binding of multiple gl-bNAb precursors.« less
CrossTalk: The Journal of Defense Software Engineering. Volume 19, Number 7
2006-07-01
public key certificates and cryptographic keys for crypto - graphic devices. In an environment where enterprise protection relies on an array of IA-enabled...allowing for greater accuracy, currency , and relevance of the information con- tained in the satellite communications data- base will be evaluated...JWPO). The purpose of the JWPO is to define, develop, validate, and evolve the JTRS SCA; acquire wave- form software applications; acquire Crypto
An approach for assessing software prototypes
NASA Technical Reports Server (NTRS)
Church, V. E.; Card, D. N.; Agresti, W. W.; Jordan, Q. L.
1986-01-01
A procedure for evaluating a software prototype is presented. The need to assess the prototype itself arises from the use of prototyping to demonstrate the feasibility of a design or development stategy. The assessment procedure can also be of use in deciding whether to evolve a prototype into a complete system. The procedure consists of identifying evaluations criteria, defining alterative design approaches, and ranking the alternatives according to the criteria.
A Nursing Informatics Research Agenda for 2008–18: Contextual Influences and Key Components
Bakken, Suzanne; Stone, Patricia W.; Larson, Elaine L.
2008-01-01
The context for nursing informatics research has changed significantly since the National Institute of Nursing Research-funded Nursing Informatics Research Agenda was published in 1993 and the Delphi study of nursing informatics research priorities reported a decade ago. The authors focus on three specific aspects of context - genomic health care, shifting research paradigms, and social (Web 2.0) technologies - that must be considered in formulating a nursing informatics research agenda. These influences are illustrated using the significant issue of healthcare associated infections (HAI). A nursing informatics research agenda for 2008–18 must expand users of interest to include interdisciplinary researchers; build upon the knowledge gained in nursing concept representation to address genomic and environmental data; guide the reengineering of nursing practice; harness new technologies to empower patients and their caregivers for collaborative knowledge development; develop user-configurable software approaches that support complex data visualization, analysis, and predictive modeling; facilitate the development of middle-range nursing informatics theories; and encourage innovative evaluation methodologies that attend to human-computer interface factors and organizational context. PMID:18922269
2018-01-01
Introduction of selectively chemical reactive groups at the cell surface enables site-specific cell surface labeling and modification opportunity, thus facilitating the capability to study the cell surface molecular structure and function and the molecular mechanism it underlies. Further, it offers the opportunity to change or improve a cell’s functionality for interest of choice. In this study, two chemical reactive anchor lipids, phosphatidylethanolamine–poly(ethylene glycol)–dibenzocyclooctyne (DSPE–PEG2000–DBCO) and cholesterol–PEG–dibenzocyclooctyne (CHOL–PEG2000–DBCO) were synthesized and their potential application for cell surface re-engineering via lipid fusion were assessed with RAW 264.7 cells as a model cell. Briefly, RAW 264.7 cells were incubated with anchor lipids under various concentrations and at different incubation times. The successful incorporation of the chemical reactive anchor lipids was confirmed by biotinylation via copper-free click chemistry, followed by streptavidin-fluorescein isothiocyanate binding. In comparison, the cholesterol-based anchor lipid afforded a higher cell membrane incorporation efficiency with less internalization than the phospholipid-based anchor lipid. Low cytotoxicity of both anchor lipids upon incorporation into the RAW 264.7 cells was observed. Further, the cell membrane residence time of the cholesterol-based anchor lipid was evaluated with confocal microscopy. This study suggests the potential cell surface re-engineering applications of the chemical reactive anchor lipids. PMID:29503972
Evolving technologies for Space Station Freedom computer-based workstations
NASA Technical Reports Server (NTRS)
Jensen, Dean G.; Rudisill, Marianne
1990-01-01
Viewgraphs on evolving technologies for Space Station Freedom computer-based workstations are presented. The human-computer computer software environment modules are described. The following topics are addressed: command and control workstation concept; cupola workstation concept; Japanese experiment module RMS workstation concept; remote devices controlled from workstations; orbital maneuvering vehicle free flyer; remote manipulator system; Japanese experiment module exposed facility; Japanese experiment module small fine arm; flight telerobotic servicer; human-computer interaction; and workstation/robotics related activities.
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
A New Look at NASA: Strategic Research In Information Technology
NASA Technical Reports Server (NTRS)
Alfano, David; Tu, Eugene (Technical Monitor)
2002-01-01
This viewgraph presentation provides information on research undertaken by NASA to facilitate the development of information technologies. Specific ideas covered here include: 1) Bio/nano technologies: biomolecular and nanoscale systems and tools for assembly and computing; 2) Evolvable hardware: autonomous self-improving, self-repairing hardware and software for survivable space systems in extreme environments; 3) High Confidence Software Technologies: formal methods, high-assurance software design, and program synthesis; 4) Intelligent Controls and Diagnostics: Next generation machine learning, adaptive control, and health management technologies; 5) Revolutionary computing: New computational models to increase capability and robustness to enable future NASA space missions.
Dye-Sensitized Solar Cells (DSSCs) reengineering using TiO2 with natural dye (anthocyanin)
NASA Astrophysics Data System (ADS)
Subodro, Rohmat; Kristiawan, Budi; Ramelan, Ari Handono; Wahyuningsih, Sayekti; Munawaroh, Hanik; Hanif, Qonita Awliya; Saputri, Liya Nikmatul Maula Zulfa
2017-01-01
This research on Dye-Sensitized Solar Cells (DSSCs) reengineering was carried out using TiO2 with natural dye (anthocyanin). The fabrication of active carbon layer/TiO2 DSSC solar cell was based on natural dye containing anthocyanins such as mangosteen peel, red rose flower, black glutinous rice, and purple eggplant peel. DSSC was prepared with TiO2 thin layer doped with active carbon; Natural dye was analyzed using UV-Vis and TiO2 was analyzed using X-ray diffractometer (XRD), meanwhile scanning electron microscope (SEM) was used to obtain the size of the crystal. Keithley instrument test was carried out to find out I-V characteristics indicating that the highest efficiency occurred in DSSCs solar cell with 24-hour soaking with mangosteen peel 0.00047%.
Cicchetti, A
1998-01-01
The innovations introduced in the Italian Health Care System by the legislative decrees No. 502/92 and 517/93 are remarkable, drawing on managerial methods and organizational structure of hospitals. To face this kind of change, in 1995 the "Policlinico A. Gemelli" prepared a five-year Strategic Plan that gave strategical lines and targets to be pursued. This plan has isolated 3 main strategical projects to be implemented with the participation of all medical and managerial professionals operating in the "Policlinico". The most complex project is that of the reengineering of management processes, we expect to end during 1998. The main target is to check the fundamental factors involved with the aim of achieving patient satisfaction as well as a cost-effective management.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Pressburger, Thomas; Markosian, Lawrence; Feather, Martin S.
2006-01-01
New processes, methods and tools are constantly appearing in the field of software engineering. Many of these augur great potential in improving software development processes, resulting in higher quality software with greater levels of assurance. However, there are a number of obstacles that impede their infusion into software development practices. These are the recurring obstacles common to many forms of research. Practitioners cannot readily identify the emerging techniques that may most benefit them, and cannot afford to risk time and effort in evaluating and experimenting with them while there is still uncertainty about whether they will have payoff in this particular context. Similarly, researchers cannot readily identify those practitioners whose problems would be amenable to their techniques and lack the feedback from practical applications necessary to help them to evolve their techniques to make them more likely to be successful. This paper describes an ongoing effort conducted by a software engineering research infusion team, and the NASA Research Infusion Initiative, established by NASA s Software Engineering Initiative, to overcome these obstacles.
Impacts of object-oriented technologies: Seven years of SEL studies
NASA Technical Reports Server (NTRS)
Stark, Mike
1993-01-01
This paper examines the premise that object-oriented technology (OOT) is the most significant technology ever examined by the Software Engineering Laboratory. The evolution of the use of OOT in the Software Engineering Laboratory (SEL) 'Experience Factory' is described in terms of the SEL's original expectations, focusing on how successive generations of projects have used OOT. General conclusions are drawn on how the usage of the technology has evolved in this environment.
Infusing and selecting V&V activities
NASA Technical Reports Server (NTRS)
Feather, M. S.
2002-01-01
The evolving nature of software development poses a continuing series of challenges for V&V. In response, the V&V community selectively adapts the use of existing V&V activities, and introduces new and improved ones.
Maintaining the Health of Software Monitors
NASA Technical Reports Server (NTRS)
Person, Suzette; Rungta, Neha
2013-01-01
Software health management (SWHM) techniques complement the rigorous verification and validation processes that are applied to safety-critical systems prior to their deployment. These techniques are used to monitor deployed software in its execution environment, serving as the last line of defense against the effects of a critical fault. SWHM monitors use information from the specification and implementation of the monitored software to detect violations, predict possible failures, and help the system recover from faults. Changes to the monitored software, such as adding new functionality or fixing defects, therefore, have the potential to impact the correctness of both the monitored software and the SWHM monitor. In this work, we describe how the results of a software change impact analysis technique, Directed Incremental Symbolic Execution (DiSE), can be applied to monitored software to identify the potential impact of the changes on the SWHM monitor software. The results of DiSE can then be used by other analysis techniques, e.g., testing, debugging, to help preserve and improve the integrity of the SWHM monitor as the monitored software evolves.
Maintaining Quality and Confidence in Open-Source, Evolving Software: Lessons Learned with PFLOTRAN
NASA Astrophysics Data System (ADS)
Frederick, J. M.; Hammond, G. E.
2017-12-01
Software evolution in an open-source framework poses a major challenge to a geoscientific simulator, but when properly managed, the pay-off can be enormous for both the developers and the community at large. Developers must juggle implementing new scientific process models, adopting increasingly efficient numerical methods and programming paradigms, changing funding sources (or total lack of funding), while also ensuring that legacy code remains functional and reported bugs are fixed in a timely manner. With robust software engineering and a plan for long-term maintenance, a simulator can evolve over time incorporating and leveraging many advances in the computational and domain sciences. In this positive light, what practices in software engineering and code maintenance can be employed within open-source development to maximize the positive aspects of software evolution and community contributions while minimizing its negative side effects? This presentation will discusses steps taken in the development of PFLOTRAN (www.pflotran.org), an open source, massively parallel subsurface simulator for multiphase, multicomponent, and multiscale reactive flow and transport processes in porous media. As PFLOTRAN's user base and development team continues to grow, it has become increasingly important to implement strategies which ensure sustainable software development while maintaining software quality and community confidence. In this presentation, we will share our experiences and "lessons learned" within the context of our open-source development framework and community engagement efforts. Topics discussed will include how we've leveraged both standard software engineering principles, such as coding standards, version control, and automated testing, as well unique advantages of object-oriented design in process model coupling, to ensure software quality and confidence. We will also be prepared to discuss the major challenges faced by most open-source software teams, such as on-boarding new developers or one-time contributions, dealing with competitors or lookie-loos, and other downsides of complete transparency, as well as our approach to community engagement, including a user group email list, hosting short courses and workshops for new users, and maintaining a website. SAND2017-8174A
2009-01-01
interface, mechatronics, video games 1. INTRODUCTION Engineering methods have substantially and continuously evolved over the past 40 years. In the past...1970s, video games have pioneered interactive simulation and laid the groundwork for inexpensive computing that individuals, corporations, and...purposes. This has not gone unnoticed, and software technology and techniques evolved for video games are beginning to have extraordinary impact in
Department of Defense Travel Reengineering Pilot Report to Congress
1997-06-01
Electronic Commerce /Electronic Data Interchange (EC/EDI) capabilities to integrate functions. automate edit checks for internal controls, and create user-friendly management tools at all levels of the process.
Computer System Performance Measurement Techniques for ARTS III Computer Systems
DOT National Transportation Integrated Search
1973-12-01
The potential contribution of direct system measurement in the evolving ARTS 3 Program is discussed and software performance measurement techniques are comparatively assessed in terms of credibility of results, ease of implementation, volume of data,...
A Change Impact Analysis to Characterize Evolving Program Behaviors
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua
2012-01-01
Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case-study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing efforts used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression testing related tasks. We also discuss how DiSE and iDiSE can be configured for debugging finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks
A Genetic Representation for Evolutionary Fault Recovery in Virtex FPGAs
NASA Technical Reports Server (NTRS)
Lohn, Jason; Larchev, Greg; DeMara, Ronald; Korsmeyer, David (Technical Monitor)
2003-01-01
Most evolutionary approaches to fault recovery in FPGAs focus on evolving alternative logic configurations as opposed to evolving the intra-cell routing. Since the majority of transistors in a typical FPGA are dedicated to interconnect, nearly 80% according to one estimate, evolutionary fault-recovery systems should benefit hy accommodating routing. In this paper, we propose an evolutionary fault-recovery system employing a genetic representation that takes into account both logic and routing configurations. Experiments were run using a software model of the Xilinx Virtex FPGA. We report that using four Virtex combinational logic blocks, we were able to evolve a 100% accurate quadrature decoder finite state machine in the presence of a stuck-at-zero fault.
Lessons learned applying CASE methods/tools to Ada software development projects
NASA Technical Reports Server (NTRS)
Blumberg, Maurice H.; Randall, Richard L.
1993-01-01
This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.
Design and engineering of water-soluble light-harvesting protein maquettes
Kodali, Goutham; Mancini, Joshua A.; Solomon, Lee A.; ...
2017-01-01
Design of nanometer scale artificial light harvesting and charge separating proteins enables reengineering to overcome the limitations of natural selection for efficient systems that better meet human energetic needs.
Design and engineering of water-soluble light-harvesting protein maquettes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kodali, Goutham; Mancini, Joshua A.; Solomon, Lee A.
Design of nanometer scale artificial light harvesting and charge separating proteins enables reengineering to overcome the limitations of natural selection for efficient systems that better meet human energetic needs.
An evaluation of software tools for the design and development of cockpit displays
NASA Technical Reports Server (NTRS)
Ellis, Thomas D., Jr.
1993-01-01
The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.
Are Earth System model software engineering practices fit for purpose? A case study.
NASA Astrophysics Data System (ADS)
Easterbrook, S. M.; Johns, T. C.
2009-04-01
We present some analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques driven strongly by scientific research goals have evolved for verification and validation of such models. In a formal software engineering context these represents costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects - self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.
Agile: From Software to Mission Systems
NASA Technical Reports Server (NTRS)
Trimble, Jay; Shirley, Mark; Hobart, Sarah
2017-01-01
To maximize efficiency and flexibility in Mission Operations System (MOS) design, we are evolving principles from agile and lean methods for software, to the complete mission system. This allows for reduced operational risk at reduced cost, and achieves a more effective design through early integration of operations into mission system engineering and flight system design. The core principles are assessment of capability through demonstration, risk reduction through targeted experiments, early test and deployment, and maturation of processes and tools through use.
NASA Technical Reports Server (NTRS)
Lohn, Jason; Smith, David; Frank, Jeremy; Globus, Al; Crawford, James
2007-01-01
JavaGenes is a general-purpose, evolutionary software system written in Java. It implements several versions of a genetic algorithm, simulated annealing, stochastic hill climbing, and other search techniques. This software has been used to evolve molecules, atomic force field parameters, digital circuits, Earth Observing Satellite schedules, and antennas. This version differs from version 0.7.28 in that it includes the molecule evolution code and other improvements. Except for the antenna code, JaveGenes is available for NASA Open Source distribution.
Imaging has enormous untapped potential to improve cancer research through software to extract and process morphometric and functional biomarkers. In the era of non-cytotoxic treatment agents, multi- modality image-guided ablative therapies and rapidly evolving computational resources, quantitative imaging software can be transformative in enabling minimally invasive, objective and reproducible evaluation of cancer treatment response. Post-processing algorithms are integral to high-throughput analysis and fine- grained differentiation of multiple molecular targets.
Preliminary Design and Implementation of a Method for Validating Evolving ADA Compilers.
1983-03-01
Goodenough, John B. "The Ada Compiler Validation Capability," Computer. 14 (6): 57-64 (June 1981). 7. Pressman, Roger S. Software Engineering : A Practi...COMPILERS THESIS Presented to the faculty of the School of Engineering of the Air Force Institute of Technology Air University in Partial Fulfillment...support and encouragement they have given me. ii Contents Page 1. INTRODUCTION 1 1.1 Background -- DoDls Software Problem 1 1.1.1 The proliferation of
ScaffoldSeq: Software for characterization of directed evolution populations.
Woldring, Daniel R; Holec, Patrick V; Hackel, Benjamin J
2016-07-01
ScaffoldSeq is software designed for the numerous applications-including directed evolution analysis-in which a user generates a population of DNA sequences encoding for partially diverse proteins with related functions and would like to characterize the single site and pairwise amino acid frequencies across the population. A common scenario for enzyme maturation, antibody screening, and alternative scaffold engineering involves naïve and evolved populations that contain diversified regions, varying in both sequence and length, within a conserved framework. Analyzing the diversified regions of such populations is facilitated by high-throughput sequencing platforms; however, length variability within these regions (e.g., antibody CDRs) encumbers the alignment process. To overcome this challenge, the ScaffoldSeq algorithm takes advantage of conserved framework sequences to quickly identify diverse regions. Beyond this, unintended biases in sequence frequency are generated throughout the experimental workflow required to evolve and isolate clones of interest prior to DNA sequencing. ScaffoldSeq software uniquely handles this issue by providing tools to quantify and remove background sequences, cluster similar protein families, and dampen the impact of dominant clones. The software produces graphical and tabular summaries for each region of interest, allowing users to evaluate diversity in a site-specific manner as well as identify epistatic pairwise interactions. The code and detailed information are freely available at http://research.cems.umn.edu/hackel. Proteins 2016; 84:869-874. © 2016 Wiley Periodicals, Inc. © 2016 Wiley Periodicals, Inc.
Evolving the Reuse Process at the Flight Dynamics Division (FDD) Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Condon, S.; Seaman, C.; Basili, Victor; Kraft, S.; Kontio, J.; Kim, Y.
1996-01-01
This paper presents the interim results from the Software Engineering Laboratory's (SEL) Reuse Study. The team conducting this study has, over the past few months, been studying the Generalized Support Software (GSS) domain asset library and architecture, and the various processes associated with it. In particular, we have characterized the process used to configure GSS-based attitude ground support systems (AGSS) to support satellite missions at NASA's Goddard Space Flight Center. To do this, we built detailed models of the tasks involved, the people who perform these tasks, and the interdependencies and information flows among these people. These models were based on information gleaned from numerous interviews with people involved in this process at various levels. We also analyzed effort data in order to determine the cost savings in moving from actual development of AGSSs to support each mission (which was necessary before GSS was available) to configuring AGSS software from the domain asset library. While characterizing the GSS process, we became aware of several interesting factors which affect the successful continued use of GSS. Many of these issues fall under the subject of evolving technologies, which were not available at the inception of GSS, but are now. Some of these technologies could be incorporated into the GSS process, thus making the whole asset library more usable. Other technologies are being considered as an alternative to the GSS process altogether. In this paper, we outline some of issues we will be considering in our continued study of GSS and the impact of evolving technologies.
24 CFR 402.5 - Contract renewals under section 524(b) or (e) of MAHRA.
Code of Federal Regulations, 2011 CFR
2011-04-01
... Reengineering demonstration authority for which HUD made a determination that debt restructuring is... RESTRUCTURING, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT SECTION 8 PROJECT-BASED CONTRACT RENEWAL UNDER...
24 CFR 402.5 - Contract renewals under section 524(b) or (e) of MAHRA.
Code of Federal Regulations, 2014 CFR
2014-04-01
... Reengineering demonstration authority for which HUD made a determination that debt restructuring is... RESTRUCTURING, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT SECTION 8 PROJECT-BASED CONTRACT RENEWAL UNDER...
24 CFR 402.5 - Contract renewals under section 524(b) or (e) of MAHRA.
Code of Federal Regulations, 2010 CFR
2010-04-01
... Reengineering demonstration authority for which HUD made a determination that debt restructuring is... RESTRUCTURING, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT SECTION 8 PROJECT-BASED CONTRACT RENEWAL UNDER...
24 CFR 402.5 - Contract renewals under section 524(b) or (e) of MAHRA.
Code of Federal Regulations, 2013 CFR
2013-04-01
... Reengineering demonstration authority for which HUD made a determination that debt restructuring is... RESTRUCTURING, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT SECTION 8 PROJECT-BASED CONTRACT RENEWAL UNDER...
24 CFR 402.5 - Contract renewals under section 524(b) or (e) of MAHRA.
Code of Federal Regulations, 2012 CFR
2012-04-01
... Reengineering demonstration authority for which HUD made a determination that debt restructuring is... RESTRUCTURING, DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT SECTION 8 PROJECT-BASED CONTRACT RENEWAL UNDER...
Reengineering hemodialysis for the home environment.
Treu, Denny
2010-01-01
Denny Treu of NxStage Medical, Inc., who has led the development of six dialysis systems with various companies, reports here on a home hemodialysis system that his company successfully designed specifically for home use.
76 FR 68124 - Television Broadcasting Services; Fond du Lac, WI
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-03
...- engineered proposal that would result in the replacement translators covering the projected analog loss area. The document finds that the re-engineered translators sufficiently address any loss of service, and...
Reengineering NHS Hospitals in Greece: Redistribution Leads to Rational Mergers.
Nikolentzos, Athanasios; Kontodimopoulos, Nick; Polyzos, Nikolaos; Thireos, Eleftherios; Tountas, Yannis
2015-03-18
The purpose of this study was to record and evaluate existing public hospital infrastructure of the National Health System (NHS), in terms of clinics and laboratories, as well as the healthcare workforce in each of these units and in every health region in Greece, in an attempt to optimize the allocation of these resources. An extensive analysis of raw data according to supply and performance indicators was performed to serve as a solid and objective scientific baseline for the proposed reengineering of the Greek public hospitals. Suggestions for "reshuffling" clinics and diagnostic laboratories, and their personnel, were made by using a best versus worst outcome indicator approach at a regional and national level. This study is expected to contribute to the academic debate about the gap between theory and evidence based decision-making in health policy.
Using Human Factors Methods to Design a New Interface for an Electronic Medical Record
Saleem, Jason J.; Patterson, Emily S.; Militello, Laura; Asch, Steven M.; Doebbeling, Bradley N.; Render, Marta L.
2007-01-01
The Veterans Health Administration (VHA) is a leader in development and use of electronic patient records and clinical decision support. The VHA is currently reengineering a somewhat dated platform for its Computerized Patient Record System (CPRS). This process affords a unique opportunity to implement major changes to the current design and function of the system. We report on two human factors studies designed to provide input and guidance during this reengineering process. One study involved a card sort to better understand how providers tend to cognitively organize clinical data, and how that understanding can help guide interface design. The other involved a simulation to assess the impact of redesign modifications on computerized clinical reminders, a form of clinical decision support in the CPRS, on the learnability of the system for first-time users. PMID:18693914
Integrating Automation into a Multi-Mission Operations Center
NASA Technical Reports Server (NTRS)
Surka, Derek M.; Jones, Lori; Crouse, Patrick; Cary, Everett A, Jr.; Esposito, Timothy C.
2007-01-01
NASA Goddard Space Flight Center's Space Science Mission Operations (SSMO) Project is currently tackling the challenge of minimizing ground operations costs for multiple satellites that have surpassed their prime mission phase and are well into extended mission. These missions are being reengineered into a multi-mission operations center built around modern information technologies and a common ground system infrastructure. The effort began with the integration of four SMEX missions into a similar architecture that provides command and control capabilities and demonstrates fleet automation and control concepts as a pathfinder for additional mission integrations. The reengineered ground system, called the Multi-Mission Operations Center (MMOC), is now undergoing a transformation to support other SSMO missions, which include SOHO, Wind, and ACE. This paper presents the automation principles and lessons learned to date for integrating automation into an existing operations environment for multiple satellites.
Reengineering NHS Hospitals in Greece: Redistribution Leads to Rational Mergers
Nikolentzos, Athanasios; Kontodimopoulos, Nick; Polyzos, Nikolaos; Thireos, Eleftherios; Tountas, Yannis
2015-01-01
The purpose of this study was to record and evaluate existing public hospital infrastructure of the National Health System (NHS), in terms of clinics and laboratories, as well as the healthcare workforce in each of these units and in every health region in Greece, in an attempt to optimize the allocation of these resources. An extensive analysis of raw data according to supply and performance indicators was performed to serve as a solid and objective scientific baseline for the proposed reengineering of the Greek public hospitals. Suggestions for “reshuffling” clinics and diagnostic laboratories, and their personnel, were made by using a best versus worst outcome indicator approach at a regional and national level. This study is expected to contribute to the academic debate about the gap between theory and evidence based decision-making in health policy. PMID:26156925
Managing your practice's first impression: the process of front-desk reengineering.
Walsh, Alison L
2004-01-01
Patients must be regarded as consumers. As such, they are increasingly informed, questioning, cost-conscious, technologically savvy, and demanding. Just as health plans have developed defined contribution products that offer consumers more control over how and where their health-care dollars are spent, practice success is linked to reengineering office operations to offer consumers and patients greater choice, control, autonomy, and service. Patients and consumers want practices that deliver clinical and business services that meet the criteria of reliability, effciency, service offerings, patient focus, enthusiasm, customization, and trust. Physician practices must also take care to avoid destructive and disruptive behaviors and conditions such as noise, interference, excessive repetition, long waits, appointment delays, and staff rudeness. A successful patient-focused practice emerges when physicians and office staff begin to look at the clinical and service experience through the patient's eyes.
Re-engineering of CYP2C9 to probe acid-base substrate selectivity.
Tai, Guoying; Dickmann, Leslie J; Matovic, Nicholas; DeVoss, James J; Gillam, Elizabeth M J; Rettie, Allan E
2008-10-01
A common feature of many CYP2C9 ligands is their weak acidity. As revealed by crystallography, the structural basis for this behavior involves a charge-pairing interaction between an anionic moiety on the substrate and an active site R108 residue. In the present study we attempted to re-engineer CYP2C9 to better accept basic ligands by charge reversal at this key residue. We expressed and purified the R108E and R108E/D293N mutants and compared their ability with that of native CYP2C9 to interact with (S)-warfarin, diclofenac, pyrene, propranolol, and ibuprofen amine. As expected, the R108E mutant maintained all the native enzyme's pyrene 1-hydroxylation activity, but catalytic activity toward diclofenac and (S)-warfarin was abrogated. In contrast, the double mutant displayed much less selectivity in its behavior toward these control ligands. Neither of the mutants displayed significant enhancement of propranolol metabolism, and all three preparations exhibited a type II (inhibitor) rather than type I (substrate) spectrum with ibuprofen amine, although binding became progressively weaker with the single and double mutants. Collectively, these data underscore the importance of the amino acid at position 108 in the acid substrate selectivity of CYP2C9, highlight the accommodating nature of the CYP2C9 active site, and provide a cautionary note regarding facile re-engineering of these complex cytochrome P450 active sites.
Re-engineering of CYP2C9 to Probe Acid-Base Substrate Selectivity
Tai, Guoying; Dickmann, Leslie J.; Matovic, Nicholas; DeVoss, James J.; Gillam, Elizabeth M. J.; Rettie, Allan E.
2009-01-01
A common feature of many CYP2C9 ligands is their weak acidity. As revealed by crystallography, the structural basis for this behavior involves a charge-pairing interaction between an anionic moiety on the substrate and an active site R108 residue. In the present study we attempted to re-engineer CYP2C9 to better accept basic ligands by charge reversal at this key residue. We expressed and purified the R108E and R108E/D293N mutants and compared their ability with that of native CYP2C9 to interact with (S)-warfarin, diclofenac, pyrene, propranolol, and ibuprofen amine. As expected, the R108E mutant maintained all the native enzyme's pyrene 1-hydroxylation activity, but catalytic activity toward diclofenac and (S)-warfarin was abrogated. In contrast, the double mutant displayed much less selectivity in its behavior toward these control ligands. Neither of the mutants displayed significant enhancement of propranolol metabolism, and all three preparations exhibited a type II (inhibitor) rather than type I (substrate) spectrum with ibuprofen amine, although binding became progressively weaker with the single and double mutants. Collectively, these data underscore the importance of the amino acid at position 108 in the acid substrate selectivity of CYP2C9, highlight the accommodating nature of the CYP2C9 active site, and provide a cautionary note regarding facile re-engineering of these complex cytochrome P450 active sites. PMID:18606741
Teleoperated Modular Robots for Lunar Operations
NASA Technical Reports Server (NTRS)
Globus, Al; Hornby, Greg; Larchev, Greg; Hancher, Matt; Cannon, Howard; Lohn, Jason
2004-01-01
Solar system exploration is currently carried out by special purpose robots exquisitely designed for the anticipated tasks. However, all contingencies for in situ resource utilization (ISRU), human habitat preparation, and exploration will be difficult to anticipate. Furthermore, developing the necessary special purpose mechanisms for deployment and other capabilities is difficult and error prone. For example, the Galileo high gain antenna never opened, severely restricting the quantity of data returned by the spacecraft. Also, deployment hardware is used only once. To address these problems, we are developing teleoperated modular robots for lunar missions, including operations in transit from Earth. Teleoperation of lunar systems from Earth involves a three second speed-of-light delay, but experiment suggests that interactive operations are feasible.' Modular robots typically consist of many identical modules that pass power and data between them and can be reconfigured for different tasks providing great flexibility, inherent redundancy and graceful degradation as modules fail. Our design features a number of different hub, link, and joint modules to simplify the individual modules, lower structure cost, and provide specialized capabilities. Modular robots are well suited for space applications because of their extreme flexibility, inherent redundancy, high-density packing, and opportunities for mass production. Simple structural modules can be manufactured from lunar regolith in situ using molds or directed solar sintering. Software to direct and control modular robots is difficult to develop. We have used genetic algorithms to evolve both the morphology and control system for walking modular robots3 We are currently using evolvable system technology to evolve controllers for modular robots in the ISS glove box. Development of lunar modular robots will require software and physical simulators, including regolith simulation, to enable design and test of robot software and hardware, particularly automation software. Ready access to these simulators could provide opportunities for contest-driven development ala RoboCup (http://www.robocup.org/). Licensing of module designs could provide opportunities in the toy market and for spin-off applications.
The Road to Successful ITS Software Acquisition. Executive Summary
DOT National Transportation Integrated Search
2013-08-01
This report analyzes the merits and limits of active sensing technologies such as radar, LIDAR, and ultrasonic detectors and how the market for these technologies is evolving and being applied to vehicles and highway infrastructure to improve...
Web-Based Mapping Puts the World at Your Fingertips
NASA Technical Reports Server (NTRS)
2008-01-01
NASA's award-winning Earth Resources Laboratory Applications Software (ELAS) package was developed at Stennis Space Center. Since 1978, ELAS has been used worldwide for processing satellite and airborne sensor imagery data of the Earth's surface into readable and usable information. DATASTAR Inc., of Picayune, Mississippi, has used ELAS software in the DATASTAR Image Processing Exploitation (DIPEx) desktop and Internet image processing, analysis, and manipulation software. The new DIPEx Version III includes significant upgrades and improvements compared to its esteemed predecessor. A true World Wide Web application, this product evolved with worldwide geospatial dimensionality and numerous other improvements that seamlessly support the World Wide Web version.
PATHFINDER: Probing Atmospheric Flows in an Integrated and Distributed Environment
NASA Technical Reports Server (NTRS)
Wilhelmson, R. B.; Wojtowicz, D. P.; Shaw, C.; Hagedorn, J.; Koch, S.
1995-01-01
PATHFINDER is a software effort to create a flexible, modular, collaborative, and distributed environment for studying atmospheric, astrophysical, and other fluid flows in the evolving networked metacomputer environment of the 1990s. It uses existing software, such as HDF (Hierarchical Data Format), DTM (Data Transfer Mechanism), GEMPAK (General Meteorological Package), AVS, SGI Explorer, and Inventor to provide the researcher with the ability to harness the latest in desktop to teraflop computing. Software modules developed during the project are available in the public domain via anonymous FTP from the National Center for Supercomputing Applications (NCSA). The address is ftp.ncsa.uiuc.edu, and the directory is /SGI/PATHFINDER.
[Virtual Patients and Medical Teaching].
Gómez-Restrepo, Carlos; Narváez, Yamile Reveiz
2012-01-01
Biomedical advancements have evolved to the point where teaching software may be implemented to represent real-life scenarios. Virtual Patients or VPs are software programs that simulate clinical scenarios allowing students to generate a diagnosis and make treatment decisions. In this article, advantages and disadvantages regarding the use of this state-of-the-art technology are discussed. VP is a useful technique for psychiatry students. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
NASA Technical Reports Server (NTRS)
Garrocq, C. A.; Hurley, M. J.
1973-01-01
System requirements, software elements, and hardware equipment required for an IPAD system are defined. An IPAD conceptual design was evolved, a potential user survey was conducted, and work loads for various types of interactive terminals were projected. Various features of major host computing systems were compared, and target systems were selected in order to identify the various elements of software required.
Land and Hold Short Operations : A Primer
DOT National Transportation Integrated Search
1996-04-20
Michigan Department of Transportation (M-DOT) started its Systems Re-engineering process with a clear road map the PROSE initiative. PROSE, standing for PROject Support Environment, is an ambitious venture to develop strategic information systems aut...
On Wings: Aerodynamics of Eagles.
ERIC Educational Resources Information Center
Millson, David
2000-01-01
The Aerodynamics Wing Curriculum is a high school program that combines basic physics, aerodynamics, pre-engineering, 3D visualization, computer-assisted drafting, computer-assisted manufacturing, production, reengineering, and success in a 15-hour, 3-week classroom module. (JOW)
Bacterial production of methyl ketones
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beller, Harry R.; Goh, Ee-Been
The present invention relates to methods and compositions for increasing production of methyl ketones in a genetically modified host cell that overproduces .beta.-ketoacyl-CoAs through a re-engineered .beta.-oxidation pathway and overexpresses FadM.
Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...
Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...
The systematic evolution of a NASA software technology, Appendix C
NASA Technical Reports Server (NTRS)
Deregt, M. P.; Dulfer, J. E.
1972-01-01
A long range program is described whose ultimate purpose is to make possible the production of software in NASA within predictable schedule and budget constraints and with major characteristics such as size, run-time, and correctness predictable within reasonable tolerances. As part of the program a pilot NASA computer center will be chosen to apply software development and management techniques systematically and determine a set which is effective. The techniques will be developed by a Technology Group, which will guide the pilot project and be responsible for its success. The application of the technology will involve a sequence of NASA programming tasks graduated from simpler ones at first to complex systems in late phases of the project. The evaluation of the technology will be made by monitoring the operation of the software at the users' installations. In this way a coherent discipline for software design, production maintenance, and management will be evolved.
A Validation Framework for the Long Term Preservation of High Energy Physics Data
NASA Astrophysics Data System (ADS)
Ozerov, Dmitri; South, David M.
2014-06-01
The study group on data preservation in high energy physics, DPHEP, is moving to a new collaboration structure, which will focus on the implementation of preservation projects, such as those described in the group's large scale report published in 2012. One such project is the development of a validation framework, which checks the compatibility of evolving computing environments and technologies with the experiments software for as long as possible, with the aim of substantially extending the lifetime of the analysis software, and hence of the usability of the data. The framework is designed to automatically test and validate the software and data of an experiment against changes and upgrades to the computing environment, as well as changes to the experiment software itself. Technically, this is realised using a framework capable of hosting a number of virtual machine images, built with different configurations of operating systems and the relevant software, including any necessary external dependencies.
Infusing Software Assurance Research Techniques into Use
NASA Technical Reports Server (NTRS)
Pressburger, Thomas; DiVito, Ben; Feather, Martin S.; Hinchey, Michael; Markosian, Lawrence; Trevino, Luis C.
2006-01-01
Research in the software engineering community continues to lead to new development techniques that encompass processes, methods and tools. However, a number of obstacles impede their infusion into software development practices. These are the recurring obstacles common to many forms of research. Practitioners cannot readily identify the emerging techniques that may benefit them, and cannot afford to risk time and effort evaluating and trying one out while there remains uncertainty about whether it will work for them. Researchers cannot readily identify the practitioners whose problems would be amenable to their techniques, and, lacking feedback from practical applications, are hard-pressed to gauge the where and in what ways to evolve their techniques to make them more likely to be successful. This paper describes an ongoing effort conducted by a software engineering research infusion team established by NASA s Software Engineering Initiative to overcome these obstacles. .
The Evolution of Sonic Ecosystems
NASA Astrophysics Data System (ADS)
McCormack, Jon
This chapter describes a novel type of artistic artificial life software environment. Agents that have the ability to make and listen to sound populate a synthetic world. An evolvable, rule-based classifier system drives agent behavior. Agents compete for limited resources in a virtual environment that is influenced by the presence and movement of people observing the system. Electronic sensors create a link between the real and virtual spaces, virtual agents evolve implicitly to try to maintain the interest of the human audience, whose presence provides them with life-sustaining food.
Reengineering ribosome export.
Lo, Kai-Yin; Johnson, Arlen W
2009-03-01
Large cargoes require multiple receptors for efficient transport through the nuclear pore complex. The 60S ribosomal subunit is one of the bulkiest transport cargoes, and in yeast three different receptors, Crm1, Mex67/Mtr2, and Arx1, collaborate in its export. However, only Crm1, recruited by the adapter Nmd3, appears to be conserved for 60S export in higher eukaryotes. We asked if export of the large subunit requires specific receptors. We made protein fusions between mutant Nmd3 and various export receptors. Surprisingly, fusions of Mex67, the tRNA exportin Los1, Mtr2, Cse1, or Msn5 to Nmd3, lacking its Crm1-dependent nuclear export signal (NES), all functioned in export. Furthermore, these chimeric proteins supported 60S export even in the presence of the Crm1 inhibitor leptomycin B, indicating that export was now independent of Crm1. These results suggest that there is not a requirement for a specific export receptor for the large subunit, as recruitment of any receptor will suffice. Finally we show that the addition of an NES directly to the 60S ribosomal subunit protein Rpl3 promotes export. These results imply remarkable flexibility in the export pathway for the 60S subunit and help explain how different export receptors could have evolved in different eukaryotic lineages.
Lo, Kai-Yin
2009-01-01
Large cargoes require multiple receptors for efficient transport through the nuclear pore complex. The 60S ribosomal subunit is one of the bulkiest transport cargoes, and in yeast three different receptors, Crm1, Mex67/Mtr2, and Arx1, collaborate in its export. However, only Crm1, recruited by the adapter Nmd3, appears to be conserved for 60S export in higher eukaryotes. We asked if export of the large subunit requires specific receptors. We made protein fusions between mutant Nmd3 and various export receptors. Surprisingly, fusions of Mex67, the tRNA exportin Los1, Mtr2, Cse1, or Msn5 to Nmd3, lacking its Crm1-dependent nuclear export signal (NES), all functioned in export. Furthermore, these chimeric proteins supported 60S export even in the presence of the Crm1 inhibitor leptomycin B, indicating that export was now independent of Crm1. These results suggest that there is not a requirement for a specific export receptor for the large subunit, as recruitment of any receptor will suffice. Finally we show that the addition of an NES directly to the 60S ribosomal subunit protein Rpl3 promotes export. These results imply remarkable flexibility in the export pathway for the 60S subunit and help explain how different export receptors could have evolved in different eukaryotic lineages. PMID:19144820
Technologies to counter aviation security threats
NASA Astrophysics Data System (ADS)
Karoly, Steve
2017-11-01
The Aviation and Transportation Security Act (ATSA) makes TSA responsible for security in all modes of transportation, and requires that TSA assess threats to transportation, enforce security-related regulations and requirements, and ensure the adequacy of security measures at airports and other transportation facilities. Today, TSA faces a significant challenge and must address a wide range of commercial, military grade, and homemade explosives and these can be presented in an infinite number of configurations and from multiple vectors. TSA screens 2 million passengers and crew, and screens almost 5 million carry-on items and 1.2 million checked bags daily. As TSA explores new technologies for improving efficiency and security, those on the forefront of research and development can help identify unique and advanced methods to combat terrorism. Research and Development (R&D) drives the development of future technology investments that can address an evolving adversary and aviation threat. The goal is to rethink the aviation security regime in its entirety, and rather than focusing security at particular points in the enterprise, distribute security from the time a reservation is made to the time a passenger boards the aircraft. The ultimate objective is to reengineer aviation security from top to bottom with a continued focus on increasing security throughout the system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Roser, Robert
Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3)more » Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Roser, Robert; LeCompte, Tom
2015-10-29
Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3)more » Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.« less
Reengineering the Tumor Microenvironment to Alleviate Hypoxia and Overcome Cancer Heterogeneity
Martin, John D.; Fukumura, Dai; Duda, Dan G.; Boucher, Yves; Jain, Rakesh K.
2017-01-01
Solid tumors consist of cancer cells and stromal cells, including resident and transiting immune cells—all ensconced in an extracellular matrix (ECM)—nourished by blood vessels and drained by lymphatic vessels. The microenvironment constituents are abnormal and heterogeneous in morphology, phenotype, and physiology. Such irregularities include an inefficient tumor vascular network comprised of leaky and compressed vessels, which impair blood flow and oxygen delivery. Low oxygenation in certain tumor regions—or focal hypoxia—is a mediator of cancer progression, metastasis, immunosuppression, and treatment resistance. Thus, repairing an abnormal and heterogeneous microenvironment—and hypoxia in particular—can significantly improve treatments of solid tumors. Here, we summarize two strategies to reengineer the tumor microenvironment (TME)—vessel normalization and decompression—that can alleviate hypoxia. In addition, we discuss how these two strategies alone and in combination with each other—or other therapeutic strategies—may overcome the challenges posed by cancer heterogeneity. PMID:27663981
A total design and implementation of an intelligent mobile chemotherapy medication administration.
Kuo, Ming-Chuan; Chang, Polun
2014-01-01
The chemotherapy medication administration is a process involved many stakeholders and efforts. Therefore, the information support system cannot be well designed if the entire process was not carefully examined and reengineered first. We, from a 805-teaching medical center, did a process reengineering and involved physicians, pharmacists and IT engineers to work together to design a mobile support solution. System was implemented in March to July, 2013. A 6" android handheld device with 1D BCR was used as the main hardware. 18 nurses were invited to evaluate their perceived acceptance of system based on Technology Acceptance Model for Mobile Service Model. Time saved was also calculated to measure the effectiveness of system. The results showed positive support from nurses. The estimated time saved every year was about 288 nursing days. We believe our mobile chemotherapy medication administration support system is successful in terms of acceptance and real impacts.
Silverman, Steven M; Moses, John E; Sharpless, K Barry
2017-01-01
Vancomycin has long been considered a drug of last resort. Its efficiency in treating multiple drug-resistant bacterial infections, particularly methicillin-resistant Staphylococcus aureus (MRSA), has had a profound effect on the treatment of life-threatening infections. However, the emergence of resistance to vancomycin is a cause for significant worldwide concern, prompting the urgent development of new effective treatments for antibiotic resistant bacterial infections. Harnessing the benefits of multivalency and cooperativity against vancomycin-resistant strains, we report a Click Chemistry approach towards reengineered vancomycin derivatives and the synthesis of a number of dimers with increased potency against MRSA and vancomycin resistant Enterococci (VRE; VanB). These semi-synthetic dimeric ligands were linked together with great efficiency using the powerful CuAAC reaction, demonstrating high levels of selectivity and purity. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Constructive Engineering of Simulations
NASA Technical Reports Server (NTRS)
Snyder, Daniel R.; Barsness, Brendan
2011-01-01
Joint experimentation that investigates sensor optimization, re-tasking and management has far reaching implications for Department of Defense, Interagency and multinational partners. An adaption of traditional human in the loop (HITL) Modeling and Simulation (M&S) was one approach used to generate the findings necessary to derive and support these implications. Here an entity-based simulation was re-engineered to run on USJFCOM's High Performance Computer (HPC). The HPC was used to support the vast number of constructive runs necessary to produce statistically significant data in a timely manner. Then from the resulting sensitivity analysis, event designers blended the necessary visualization and decision making components into a synthetic environment for the HITL simulations trials. These trials focused on areas where human decision making had the greatest impact on the sensor investigations. Thus, this paper discusses how re-engineering existing M&S for constructive applications can positively influence the design of an associated HITL experiment.
Re-engineering the process of medical imaging physics and technology education and training.
Sprawls, Perry
2005-09-01
The extensive availability of digital technology provides an opportunity for enhancing both the effectiveness and efficiency of virtually all functions in the process of medical imaging physics and technology education and training. This includes degree granting academic programs within institutions and a wide spectrum of continuing education lifelong learning activities. Full achievement of the advantages of technology-enhanced education (e-learning, etc.) requires an analysis of specific educational activities with respect to desired outcomes and learning objectives. This is followed by the development of strategies and resources that are based on established educational principles. The impact of contemporary technology comes from its ability to place learners into enriched learning environments. The full advantage of a re-engineered and implemented educational process involves changing attitudes and functions of learning facilitators (teachers) and resource allocation and sharing both within and among institutions.
Beyond the computer-based patient record: re-engineering with a vision.
Genn, B; Geukers, L
1995-01-01
In order to achieve real benefit from the potential offered by a Computer-Based Patient Record, the capabilities of the technology must be applied along with true re-engineering of healthcare delivery processes. University Hospital recognizes this and is using systems implementation projects, such as the catalyst, for transforming the way we care for our patients. Integration is fundamental to the success of these initiatives and this must be explicitly planned against an organized systems architecture whose standards are market-driven. University Hospital also recognizes that Community Health Information Networks will offer improved quality of patient care at a reduced overall cost to the system. All of these implementation factors are considered up front as the hospital makes its initial decisions on to how to computerize its patient records. This improves our chances for success and will provide a consistent vision to guide the hospital's development of new and better patient care.
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly; Ganesan, Dharma; Stratton, William C.; Sibol, Deane E.
2008-01-01
Analyze, Visualize, and Evaluate structure and behavior using static and dynamic information, individual systems as well as systems of systems. Next steps: Refine software tool support; Apply to other systems; and Apply earlier in system life cycle.
Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...
The Evolution of Constructivist Learning Environments: Immersion in Distributed, Virtual Worlds.
ERIC Educational Resources Information Center
Dede, Chris
1995-01-01
Discusses the evolution of constructivist learning environments and examines the collaboration of simulated software models, virtual environments, and evolving mental models via immersion in artificial realities. A sidebar gives a realistic example of a student navigating through cyberspace. (JMV)
An Introduction to Flight Software Development: FSW Today, FSW 2010
NASA Technical Reports Server (NTRS)
Gouvela, John
2004-01-01
Experience and knowledge gained from ongoing maintenance of Space Shuttle Flight Software and new development projects including Cockpit Avionics Upgrade are applied to projected needs of the National Space Exploration Vision through Spiral 2. Lessons learned from these current activities are applied to create a sustainable, reliable model for development of critical software to support Project Constellation. This presentation introduces the technologies, methodologies, and infrastructure needed to produce and sustain high quality software. It will propose what is needed to support a Vision for Space Exploration that places demands on the innovation and productivity needed to support future space exploration. The technologies in use today within FSW development include tools that provide requirements tracking, integrated change management, modeling and simulation software. Specific challenges that have been met include the introduction and integration of Commercial Off the Shelf (COTS) Real Time Operating System for critical functions. Though technology prediction has proved to be imprecise, Project Constellation requirements will need continued integration of new technology with evolving methodologies and changing project infrastructure. Targets for continued technology investment are integrated health monitoring and management, self healing software, standard payload interfaces, autonomous operation, and improvements in training. Emulation of the target hardware will also allow significant streamlining of development and testing. The methodologies in use today for FSW development are object oriented UML design, iterative development using independent components, as well as rapid prototyping . In addition, Lean Six Sigma and CMMI play a critical role in the quality and efficiency of the workforce processes. Over the next six years, we expect these methodologies to merge with other improvements into a consolidated office culture with all processes being guided by automated office assistants. The infrastructure in use today includes strict software development and configuration management procedures, including strong control of resource management and critical skills coverage. This will evolve to a fully integrated staff organization with efficient and effective communication throughout all levels guided by a Mission-Systems Architecture framework with focus on risk management and attention toward inevitable product obsolescence. This infrastructure of computing equipment, software and processes will itself be subject to technological change and need for management of change and improvement,
NASA Technical Reports Server (NTRS)
Oluwole, Oluwayemisi O.; Wong, Hsi-Wu; Green, William
2012-01-01
AdapChem software enables high efficiency, low computational cost, and enhanced accuracy on computational fluid dynamics (CFD) numerical simulations used for combustion studies. The software dynamically allocates smaller, reduced chemical models instead of the larger, full chemistry models to evolve the calculation while ensuring the same accuracy to be obtained for steady-state CFD reacting flow simulations. The software enables detailed chemical kinetic modeling in combustion CFD simulations. AdapChem adapts the reaction mechanism used in the CFD to the local reaction conditions. Instead of a single, comprehensive reaction mechanism throughout the computation, a dynamic distribution of smaller, reduced models is used to capture accurately the chemical kinetics at a fraction of the cost of the traditional single-mechanism approach.
Open Source Next Generation Visualization Software for Interplanetary Missions
NASA Technical Reports Server (NTRS)
Trimble, Jay; Rinker, George
2016-01-01
Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases, are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).
V&V Plan for FPGA-based ESF-CCS Using System Engineering Approach.
NASA Astrophysics Data System (ADS)
Maerani, Restu; Mayaka, Joyce; El Akrat, Mohamed; Cheon, Jung Jae
2018-02-01
Instrumentation and Control (I&C) systems play an important role in maintaining the safety of Nuclear Power Plant (NPP) operation. However, most current I&C safety systems are based on Programmable Logic Controller (PLC) hardware, which is difficult to verify and validate, and is susceptible to software common cause failure. Therefore, a plan for the replacement of the PLC-based safety systems, such as the Engineered Safety Feature - Component Control System (ESF-CCS), with Field Programmable Gate Arrays (FPGA) is needed. By using a systems engineering approach, which ensures traceability in every phase of the life cycle, from system requirements, design implementation to verification and validation, the system development is guaranteed to be in line with the regulatory requirements. The Verification process will ensure that the customer and stakeholder’s needs are satisfied in a high quality, trustworthy, cost efficient and schedule compliant manner throughout a system’s entire life cycle. The benefit of the V&V plan is to ensure that the FPGA based ESF-CCS is correctly built, and to ensure that the measurement of performance indicators has positive feedback that “do we do the right thing” during the re-engineering process of the FPGA based ESF-CCS.
Modular Countermine Payload for Small Robots
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herman Herman; Doug Few; Roelof Versteeg
2010-04-01
Payloads for small robotic platforms have historically been designed and implemented as platform and task specific solutions. A consequence of this approach is that payloads cannot be deployed on different robotic platforms without substantial re-engineering efforts. To address this issue, we developed a modular countermine payload that is designed from the ground-up to be platform agnostic. The payload consists of the multi-mission payload controller unit (PCU) coupled with the configurable mission specific threat detection, navigation and marking payloads. The multi-mission PCU has all the common electronics to control and interface to all the payloads. It also contains the embedded processormore » that can be used to run the navigational and control software. The PCU has a very flexible robot interface which can be configured to interface to various robot platforms. The threat detection payload consists of a two axis sweeping arm and the detector. The navigation payload consists of several perception sensors that are used for terrain mapping, obstacle detection and navigation. Finally, the marking payload consists of a dual-color paint marking system. Through the multi-mission PCU, all these payloads are packaged in a platform agnostic way to allow deployment on multiple robotic platforms, including Talon and Packbot.« less
Modular countermine payload for small robots
NASA Astrophysics Data System (ADS)
Herman, Herman; Few, Doug; Versteeg, Roelof; Valois, Jean-Sebastien; McMahill, Jeff; Licitra, Michael; Henciak, Edward
2010-04-01
Payloads for small robotic platforms have historically been designed and implemented as platform and task specific solutions. A consequence of this approach is that payloads cannot be deployed on different robotic platforms without substantial re-engineering efforts. To address this issue, we developed a modular countermine payload that is designed from the ground-up to be platform agnostic. The payload consists of the multi-mission payload controller unit (PCU) coupled with the configurable mission specific threat detection, navigation and marking payloads. The multi-mission PCU has all the common electronics to control and interface to all the payloads. It also contains the embedded processor that can be used to run the navigational and control software. The PCU has a very flexible robot interface which can be configured to interface to various robot platforms. The threat detection payload consists of a two axis sweeping arm and the detector. The navigation payload consists of several perception sensors that are used for terrain mapping, obstacle detection and navigation. Finally, the marking payload consists of a dual-color paint marking system. Through the multimission PCU, all these payloads are packaged in a platform agnostic way to allow deployment on multiple robotic platforms, including Talon and Packbot.
Re-Engineering Complex Legacy Systems at NASA
NASA Technical Reports Server (NTRS)
Ruszkowski, James; Meshkat, Leila
2010-01-01
The Flight Production Process (FPP) Re-engineering project has established a Model-Based Systems Engineering (MBSE) methodology and the technological infrastructure for the design and development of a reference, product-line architecture as well as an integrated workflow model for the Mission Operations System (MOS) for human space exploration missions at NASA Johnson Space Center. The design and architectural artifacts have been developed based on the expertise and knowledge of numerous Subject Matter Experts (SMEs). The technological infrastructure developed by the FPP Re-engineering project has enabled the structured collection and integration of this knowledge and further provides simulation and analysis capabilities for optimization purposes. A key strength of this strategy has been the judicious combination of COTS products with custom coding. The lean management approach that has led to the success of this project is based on having a strong vision for the whole lifecycle of the project and its progress over time, a goal-based design and development approach, a small team of highly specialized people in areas that are critical to the project, and an interactive approach for infusing new technologies into existing processes. This project, which has had a relatively small amount of funding, is on the cutting edge with respect to the utilization of model-based design and systems engineering. An overarching challenge that was overcome by this project was to convince upper management of the needs and merits of giving up more conventional design methodologies (such as paper-based documents and unwieldy and unstructured flow diagrams and schedules) in favor of advanced model-based systems engineering approaches.
Kukafka, Rita; Allegrante, John P; Khan, Sharib; Bigger, J Thomas; Johnson, Stephen B
2013-09-01
Solutions are employed to support clinical research trial tasks in community-based practice settings. Using the IT Implementation Framework (ITIF), an integrative framework intended to guide the synthesis of theoretical perspectives for planning multi-level interventions to enhance IT use, we sought to understand the barriers and facilitators to clinical research in community-based practice settings preliminary to implementing new informatics solutions for improving clinical research infrastructure. The studies were conducted in practices within the Columbia University Clinical Trials Network. A mixed-method approach, including surveys, interviews, time-motion studies, and observations was used. The data collected, which incorporates predisposing, enabling, and reinforcing factors in IT use, were analyzed according to each phase of ITIF. Themes identified in the first phase of ITIF were 1) processes and tools to support clinical trial research and 2) clinical research peripheral to patient care processes. Not all of the problems under these themes were found to be amenable to IT solutions. Using the multi-level orientation of the ITIF, we set forth strategies beyond IT solutions that can have an impact on reengineering clinical research tasks in practice-based settings. Developing strategies to target enabling and reinforcing factors, which focus on organizational factors, and the motivation of the practice at large to use IT solutions to integrate clinical research tasks with patient care processes, is most challenging. The ITIF should be used to consider both IT and non-IT solutions concurrently for reengineering of clinical research in community-based practice settings. © 2013.
Reengineering Education at Anderson Consulting.
ERIC Educational Resources Information Center
Nowakowski, Alan
1994-01-01
Describes new educational approaches and classroom management procedures that are changing the professional education program at a business consulting firm. Highlights include a needs assessment study; skills development; specialists; adapting to change; computer-based self-study training programs; curriculum reassessment; and curriculum…
77 FR 45379 - Advisory Board Hearing
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-31
.... Place: U.S. Department of Justice, Main Conference Center, 7th Floor, 950 Pennsylvania Avenue NW...- based Budgeting and Public Safety; reengineering population management; cost-effective strategies for meeting policy requirements and legislative mandates; innovative cost-saving strategies; identifying and...
Business Process Reengineering for Quality Improvement.
1995-07-01
17 1.7.3. UNDERSTANDING BUSINESS VALUE...44 3.1.1. UNDERSTANDING BUSINESS PROCESS...targets which will promote the survival and growth of the business. 1.7.3. UNDERSTANDING BUSINESS VALUE Many businesses that have survived the test of time
Adopting best practices: "Agility" moves from software development to healthcare project management.
Kitzmiller, Rebecca; Hunt, Eleanor; Sproat, Sara Breckenridge
2006-01-01
It is time for a change in mindset in how nurses operationalize system implementations and manage projects. Computers and systems have evolved over time from unwieldy mysterious machines of the past to ubiquitous computer use in every aspect of daily lives and work sites. Yet, disconcertingly, the process used to implement these systems has not evolved. Technology implementation does not need to be a struggle. It is time to adapt traditional plan-driven implementation methods to incorporate agile techniques. Agility is a concept borrowed from software development and is presented here because it encourages flexibility, adaptation, and continuous learning as part of the implementation process. Agility values communication and harnesses change to an advantage, which facilitates the natural evolution of an adaptable implementation process. Specific examples of agility in an implementation are described, and plan-driven implementation stages are adapted to incorporate relevant agile techniques. This comparison demonstrates how an agile approach enhances traditional implementation techniques to meet the demands of today's complex healthcare environments.
NASA Astrophysics Data System (ADS)
Oliveira, Micael
The CECAM Electronic Structure Library (ESL) is a community-driven effort to segregate shared pieces of software as libraries that could be contributed and used by the community. Besides allowing to share the burden of developing and maintaining complex pieces of software, these can also become a target for re-coding by software engineers as hardware evolves, ensuring that electronic structure codes remain at the forefront of HPC trends. In a series of workshops hosted at the CECAM HQ in Lausanne, the tools and infrastructure for the project were prepared, and the first contributions were included and made available online (http://esl.cecam.org). In this talk I will present the different aspects and aims of the ESL and how these can be useful for the electronic structure community.
Production of a small-circulation medical journal using desktop publishing methods.
Peters, B A
1994-07-01
Since its inception in January 1988, the Baylor University Medical Center Proceedings, a quarterly medical journal, has been published by the few staff of the Scientific Publications Office (Baylor Research Institute, Dallas, Texas, USA) using microcomputers and page-makeup software in conjunction with a commercial printing company. This article outlines the establishment of the journal; the steps used in the publication process; the software and hardware used; and the changes in design, content, and circulation that have taken place as the journal and the technology used to create it have evolved.
Frequency Estimator Performance for a Software-Based Beacon Receiver
NASA Technical Reports Server (NTRS)
Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix
2014-01-01
As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with a much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for usage in a QV-band beacon receiver, analysis of six frequency estimators was conducted to characterize their effectiveness as they relate to beacon receiver design.
Re-engineering Ammunition Residue Management in IMCOM-SE
2008-06-01
Max Accoutability ) Non-Automatic Return Item Recycling (OBJ: Max Items, Max Profit) Regionalize Store Brass (OBJ: Min Time) Demilitarize... Accoutability ) Non-Automatic Return Item Recycling (OBJ: Max Items, Max Profit) Regionalize Store Brass (OBJ: Min Time) Demilitarize Brass (OBJ: Min
Toward High-Performance Organizations.
ERIC Educational Resources Information Center
Lawler, Edward E., III
2002-01-01
Reviews management changes that companies have made over time in adopting or adapting four approaches to organizational performance: employee involvement, total quality management, re-engineering, and knowledge management. Considers future possibilities and defines a new view of what constitutes effective organizational design in management.…
78 FR 17722 - Technological Upgrades to Registration and Recordation Functions
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-22
... 2000, the Copyright Office initiated a comprehensive business process reengineering initiative intended... outside consultants and business analysts, the Office identified opportunities for efficiency enhancements... business processes and the automated production of public copyright records. Funding available for the...
Code of Federal Regulations, 2013 CFR
2013-01-01
... 5 Administrative Personnel 1 2013-01-01 2013-01-01 false Definitions. 410.101 Section 410.101 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS TRAINING General Provisions... organizational needs in response to human resource plans and re-engineering, downsizing, restructuring, and/or...
Code of Federal Regulations, 2014 CFR
2014-01-01
... 5 Administrative Personnel 1 2014-01-01 2014-01-01 false Definitions. 410.101 Section 410.101 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS TRAINING General Provisions... organizational needs in response to human resource plans and re-engineering, downsizing, restructuring, and/or...
Code of Federal Regulations, 2012 CFR
2012-01-01
... 5 Administrative Personnel 1 2012-01-01 2012-01-01 false Definitions. 410.101 Section 410.101 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS TRAINING General Provisions... organizational needs in response to human resource plans and re-engineering, downsizing, restructuring, and/or...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Definitions. 410.101 Section 410.101 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS TRAINING General Provisions... organizational needs in response to human resource plans and re-engineering, downsizing, restructuring, and/or...
Organizational Paradigm Shifts.
ERIC Educational Resources Information Center
National Association of College and University Business Officers, Washington, DC.
This collection of essays explores a new paradigm of higher education. The first essay, "Beyond Re-engineering: Changing the Organizational Paradigm" (L. Edwin Coate), suggests a model of quality process management and a structure for managing organizational change. "Thinking About Consortia" (Mary Jo Maydew) discusses…
NASA Technical Reports Server (NTRS)
Mckay, C. W.; Bown, R. L.
1985-01-01
The space station data management system involves networks of computing resources that must work cooperatively and reliably over an indefinite life span. This program requires a long schedule of modular growth and an even longer period of maintenance and operation. The development and operation of space station computing resources will involve a spectrum of systems and software life cycle activities distributed across a variety of hosts, an integration, verification, and validation host with test bed, and distributed targets. The requirement for the early establishment and use of an apporopriate Computer Systems and Software Engineering Support Environment is identified. This environment will support the Research and Development Productivity challenges presented by the space station computing system.
Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1
NASA Technical Reports Server (NTRS)
1985-01-01
The primary objective of Task 3 is to provide additional analysis and insight necessary to support key design/programmatic decision for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) conduct tradeoff and sensitivity analysis; and (4) the review/verification of results within the context of evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating system, software configuration management, and the software development environment facility.
Studying the laws of software evolution in a long-lived FLOSS project.
Gonzalez-Barahona, Jesus M; Robles, Gregorio; Herraiz, Israel; Ortega, Felipe
2014-07-01
Some free, open-source software projects have been around for quite a long time, the longest living ones dating from the early 1980s. For some of them, detailed information about their evolution is available in source code management systems tracking all their code changes for periods of more than 15 years. This paper examines in detail the evolution of one of such projects, glibc, with the main aim of understanding how it evolved and how it matched Lehman's laws of software evolution. As a result, we have developed a methodology for studying the evolution of such long-lived projects based on the information in their source code management repository, described in detail several aspects of the history of glibc, including some activity and size metrics, and found how some of the laws of software evolution may not hold in this case. © 2013 The Authors. Journal of Software: Evolution and Process published by John Wiley & Sons Ltd.
Studying the laws of software evolution in a long-lived FLOSS project
Gonzalez-Barahona, Jesus M; Robles, Gregorio; Herraiz, Israel; Ortega, Felipe
2014-01-01
Some free, open-source software projects have been around for quite a long time, the longest living ones dating from the early 1980s. For some of them, detailed information about their evolution is available in source code management systems tracking all their code changes for periods of more than 15 years. This paper examines in detail the evolution of one of such projects, glibc, with the main aim of understanding how it evolved and how it matched Lehman's laws of software evolution. As a result, we have developed a methodology for studying the evolution of such long-lived projects based on the information in their source code management repository, described in detail several aspects of the history of glibc, including some activity and size metrics, and found how some of the laws of software evolution may not hold in this case. © 2013 The Authors. Journal of Software: Evolution and Process published by John Wiley & Sons Ltd. PMID:25893093
Career Resilience. ERIC Digest No. 178.
ERIC Educational Resources Information Center
Brown, Bettina Lankard
Changes in the workplace such as mergers, acquisitions, reengineering, and downsizing are forcing individuals to recognize the temporary nature of all jobs and develop what has been termed "career resilience.""Career resilience" differs from "career self-reliance" in that the former refers to individual career…
John Weisberg; Jay Beaman
2001-01-01
Progress in the options for survey data collection and its effective processing continues. This paper focuses on the rapidly evolving capabilities of handheld computers, and their effective exploitation including links to data captured from scanned questionnaires (OMR and barcodes). The paper describes events in Parks Canada that led to the creation of survey software...
A Method for Aligning Acquisition Strategies and Software Architectures
2014-09-01
system • Want to make sure the system can be readily evolved to use new technology Members of the HR staff ( super - visors and those who would use the...References URLs are valid as of the publication date of this document. [Barbacci 2003] Barbacci, Mario , Ellison, Robert, Lattanze, Anthony, Stafford
Evolution of Ada technology in the flight dynamics area: Design phase analysis
NASA Technical Reports Server (NTRS)
Quimby, Kelvin L.; Esker, Linda
1988-01-01
The software engineering issues related to the use of the Ada programming language during the design phase of an Ada project are analyzed. Discussion shows how an evolving understanding of these issues is reflected in the design processes of three generations of Ada projects.
Music, Technology, and an Evolving Curriculum.
ERIC Educational Resources Information Center
Moore, Brian
1992-01-01
Mechanical examples of musical technology, like the Steinway piano, are well known and accepted. Use of computers and electronic technology is the next logical step in developing art of music. MIDI (Musical Instrument Digital Interface) is explained, along with digital devices (such as synthesizers, sequencers, music notation software, multimedia,…
An Overview of Video Description: History, Benefits, and Guidelines
ERIC Educational Resources Information Center
Packer, Jaclyn; Vizenor, Katie; Miele, Joshua A.
2015-01-01
This article provides an overview of the historical context in which video description services have evolved in the United States, a summary of research demonstrating benefits to people with vision loss, an overview of current video description guidelines, and information about current software programs that are available to produce video…
Robotics for Computer Scientists: What's the Big Idea?
ERIC Educational Resources Information Center
Touretzky, David S.
2013-01-01
Modern robots, like today's smartphones, are complex devices with intricate software systems. Introductory robot programming courses must evolve to reflect this reality, by teaching students to make use of the sophisticated tools their robots provide rather than reimplementing basic algorithms. This paper focuses on teaching with Tekkotsu, an open…
Surveillance Jumps on the Network
ERIC Educational Resources Information Center
Raths, David
2011-01-01
Internet protocol (IP) network-based cameras and digital video management software are maturing, and many issues that have surrounded them, including bandwidth, data storage, ease of use, and integration are starting to become clearer as the technology continues to evolve. Prices are going down and the number of features is going up. Many school…
An efficient, modular and simple tape archiving solution for LHC Run-3
NASA Astrophysics Data System (ADS)
Murray, S.; Bahyl, V.; Cancio, G.; Cano, E.; Kotlyar, V.; Kruse, D. F.; Leduc, J.
2017-10-01
The IT Storage group at CERN develops the software responsible for archiving to tape the custodial copy of the physics data generated by the LHC experiments. Physics run 3 will start in 2021 and will introduce two major challenges for which the tape archive software must be evolved. Firstly the software will need to make more efficient use of tape drives in order to sustain the predicted data rate of 150 petabytes per year as opposed to the current 50 petabytes per year. Secondly the software will need to be seamlessly integrated with EOS, which has become the de facto disk storage system provided by the IT Storage group for physics data. The tape storage software for LHC physics run 3 is code named CTA (the CERN Tape Archive). This paper describes how CTA will introduce a pre-emptive drive scheduler to use tape drives more efficiently, will encapsulate all tape software into a single module that will sit behind one or more EOS systems, and will be simpler by dropping support for obsolete backwards compatibility.
NASA Astrophysics Data System (ADS)
Madaras, Gary S.
2002-05-01
The use of computer modeling as a marketing, diagnosis, design, and research tool in the practice of acoustical consulting is discussed. From the time it is obtained, the software can be used as an effective marketing tool. It is not until the software basics are learned and some amount of testing and verification occurs that the software can be used as a tool for diagnosing the acoustics of existing rooms. A greater understanding of the output types and formats as well as experience in interpreting the results is required before the software can be used as an efficient design tool. Lastly, it is only after repetitive use as a design tool that the software can be used as a cost-effective means of conducting research in practice. The discussion is supplemented with specific examples of actual projects provided by various consultants within multiple firms. Focus is placed on the use of CATT-Acoustic software and predicting the room acoustics of large performing arts halls as well as other public assembly spaces.
Flight simulation software at NASA Dryden Flight Research Center
NASA Technical Reports Server (NTRS)
Norlin, Ken A.
1995-01-01
The NASA Dryden Flight Research Center has developed a versatile simulation software package that is applicable to a broad range of fixed-wing aircraft. This package has evolved in support of a variety of flight research programs. The structure is designed to be flexible enough for use in batch-mode, real-time pilot-in-the-loop, and flight hardware-in-the-loop simulation. Current simulations operate on UNIX-based platforms and are coded with a FORTRAN shell and C support routines. This paper discusses the features of the simulation software design and some basic model development techniques. The key capabilities that have been included in the simulation are described. The NASA Dryden simulation software is in use at other NASA centers, within industry, and at several universities. The straightforward but flexible design of this well-validated package makes it especially useful in an engineering environment.
Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R
2009-01-01
We report on the real-time creation of an application for hands-on neurophysiology in an advanced undergraduate teaching laboratory. Enabled by the rapid software development tools included in the Matlab technical computing environment (The Mathworks, Natick, MA), a team, consisting of a neurophysiology educator and a biophysicist trained as an electrical engineer, interfaced to a course of approximately 15 students from engineering and biology backgrounds. The result is the powerful freeware data acquisition and analysis environment, "g-PRIME." The software was developed from week to week in response to curriculum demands, and student feedback. The program evolved from a simple software oscilloscope, enabling RC circuit analysis, to a suite of tools supporting analysis of neuronal excitability and synaptic transmission analysis in invertebrate model systems. The program has subsequently expanded in application to university courses, research, and high school projects in the US and abroad as free courseware.
Parallel software for lattice N = 4 supersymmetric Yang-Mills theory
NASA Astrophysics Data System (ADS)
Schaich, David; DeGrand, Thomas
2015-05-01
We present new parallel software, SUSY LATTICE, for lattice studies of four-dimensional N = 4 supersymmetric Yang-Mills theory with gauge group SU(N). The lattice action is constructed to exactly preserve a single supersymmetry charge at non-zero lattice spacing, up to additional potential terms included to stabilize numerical simulations. The software evolved from the MILC code for lattice QCD, and retains a similar large-scale framework despite the different target theory. Many routines are adapted from an existing serial code (Catterall and Joseph, 2012), which SUSY LATTICE supersedes. This paper provides an overview of the new parallel software, summarizing the lattice system, describing the applications that are currently provided and explaining their basic workflow for non-experts in lattice gauge theory. We discuss the parallel performance of the code, and highlight some notable aspects of the documentation for those interested in contributing to its future development.
Marshall Space Flight Center Ground Systems Development and Integration
NASA Technical Reports Server (NTRS)
Wade, Gina
2016-01-01
Ground Systems Development and Integration performs a variety of tasks in support of the Mission Operations Laboratory (MOL) and other Center and Agency projects. These tasks include various systems engineering processes such as performing system requirements development, system architecture design, integration, verification and validation, software development, and sustaining engineering of mission operations systems that has evolved the Huntsville Operations Support Center (HOSC) into a leader in remote operations for current and future NASA space projects. The group is also responsible for developing and managing telemetry and command configuration and calibration databases. Personnel are responsible for maintaining and enhancing their disciplinary skills in the areas of project management, software engineering, software development, software process improvement, telecommunications, networking, and systems management. Domain expertise in the ground systems area is also maintained and includes detailed proficiency in the areas of real-time telemetry systems, command systems, voice, video, data networks, and mission planning systems.
Galdzicki, Michal; Clancy, Kevin P; Oberortner, Ernst; Pocock, Matthew; Quinn, Jacqueline Y; Rodriguez, Cesar A; Roehner, Nicholas; Wilson, Mandy L; Adam, Laura; Anderson, J Christopher; Bartley, Bryan A; Beal, Jacob; Chandran, Deepak; Chen, Joanna; Densmore, Douglas; Endy, Drew; Grünberg, Raik; Hallinan, Jennifer; Hillson, Nathan J; Johnson, Jeffrey D; Kuchinsky, Allan; Lux, Matthew; Misirli, Goksel; Peccoud, Jean; Plahar, Hector A; Sirin, Evren; Stan, Guy-Bart; Villalobos, Alan; Wipat, Anil; Gennari, John H; Myers, Chris J; Sauro, Herbert M
2014-06-01
The re-use of previously validated designs is critical to the evolution of synthetic biology from a research discipline to an engineering practice. Here we describe the Synthetic Biology Open Language (SBOL), a proposed data standard for exchanging designs within the synthetic biology community. SBOL represents synthetic biology designs in a community-driven, formalized format for exchange between software tools, research groups and commercial service providers. The SBOL Developers Group has implemented SBOL as an XML/RDF serialization and provides software libraries and specification documentation to help developers implement SBOL in their own software. We describe early successes, including a demonstration of the utility of SBOL for information exchange between several different software tools and repositories from both academic and industrial partners. As a community-driven standard, SBOL will be updated as synthetic biology evolves to provide specific capabilities for different aspects of the synthetic biology workflow.
Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H
2004-06-01
Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today.
Phosphate Sink Containing Two-Component Signaling Systems as Tunable Threshold Devices
Amin, Munia; Kothamachu, Varun B.; Feliu, Elisenda; Scharf, Birgit E.; Porter, Steven L.; Soyer, Orkun S.
2014-01-01
Synthetic biology aims to design de novo biological systems and reengineer existing ones. These efforts have mostly focused on transcriptional circuits, with reengineering of signaling circuits hampered by limited understanding of their systems dynamics and experimental challenges. Bacterial two-component signaling systems offer a rich diversity of sensory systems that are built around a core phosphotransfer reaction between histidine kinases and their output response regulator proteins, and thus are a good target for reengineering through synthetic biology. Here, we explore the signal-response relationship arising from a specific motif found in two-component signaling. In this motif, a single histidine kinase (HK) phosphotransfers reversibly to two separate output response regulator (RR) proteins. We show that, under the experimentally observed parameters from bacteria and yeast, this motif not only allows rapid signal termination, whereby one of the RRs acts as a phosphate sink towards the other RR (i.e. the output RR), but also implements a sigmoidal signal-response relationship. We identify two mathematical conditions on system parameters that are necessary for sigmoidal signal-response relationships and define key parameters that control threshold levels and sensitivity of the signal-response curve. We confirm these findings experimentally, by in vitro reconstitution of the one HK-two RR motif found in the Sinorhizobium meliloti chemotaxis pathway and measuring the resulting signal-response curve. We find that the level of sigmoidality in this system can be experimentally controlled by the presence of the sink RR, and also through an auxiliary protein that is shown to bind to the HK (yielding Hill coefficients of above 7). These findings show that the one HK-two RR motif allows bacteria and yeast to implement tunable switch-like signal processing and provides an ideal basis for developing threshold devices for synthetic biology applications. PMID:25357192
Re-engineering the mission life cycle with ABC and IDEF
NASA Technical Reports Server (NTRS)
Mandl, Daniel; Rackley, Michael; Karlin, Jay
1994-01-01
The theory behind re-engineering a business process is to remove the non-value added activities thereby lowering the process cost. In order to achieve this, one must be able to identify where the non-value added elements are located which is not a trivial task. This is because the non-value added elements are often hidden in the form of overhead and/or pooled resources. In order to be able to isolate these non-value added processes from among the other processes, one must first decompose the overall top level process into lower layers of sub-processes. In addition, costing data must be assigned to each sub-process along with the value the sub-process adds towards the final product. IDEF0 is a Federal Information Processing Standard (FIPS) process-modeling tool that allows for this functional decomposition through structured analysis. In addition, it illustrates the relationship of the process and the value added to the product or service. The value added portion is further defined in IDEF1X which is an entity relationship diagramming tool. The entity relationship model is the blueprint of the product as it moves along the 'assembly line' and therefore relates all of the parts to each other and the final product. It also relates the parts to the tools that produce the product and all of the paper work that is used in their acquisition. The use of IDEF therefore facilitates the use of Activity Based Costing (ABC). ABC is an essential method in a high variety, product-customizing environment, to facilitate rapid response to externally caused change. This paper describes the work being done in the Mission Operations Division to re-engineer the development and operation life cycle of Mission Operations Centers using these tools.
Shafeghat, Hossein; Jafari, Mehdi; Monavarian, Abbas; Shafayi, Maryam; Dehnavieh, Reza
2014-02-01
Labor laws and regulations have inevitable effects on employees' work motivation as well as the overall efficiency and productivity of the organization. This study was conducted to assess the effects of the "Countrywide Services Management Law" on the work motivation level of the employees of the Iranian Ministry of Health. This cross-sectional study was done in 2011 in the Iran's Ministry of Health. Data was collected by a 51-item Likert scale questionnaire, in five domains including: organizational structure, information technology, training patterns, salary and bonus system and re-engineering process. The reliability and validity of the questionnaire was evaluated (Cronbach's alpha= 0.96). Data analysis was conducted using descriptive and inferential statistics (t-test). Out of 192 samples examined, 55.2% of the respondents were female, 88 (45.8%) had BS degree and 116 (60.4%) had less than 10 years' experience. The mean scores in the domains of organizational structure, information technology, training patterns, salary and bonus system and re-engineering patterns were: 3.11, 3.51, 3.05, 3.21 and 3.14, respectively. Relationship between the items related to manpower in the "Countrywide Services Management Law", with employees' work motivation was significant (P < 0.0001). The training patterns did not show a significant relation (P < 0.26) with any of five domains. According to our results and the views of the employees of the Iranian Ministry of Health, "Countrywide Services Management Law" positively affected the personnel's work motivation regarding all the factors associated with motivation including: organizational structure, information technology, training patterns, salary and bonus system and re-engineering pattern. Finally, to enhance the workforce motivation and satisfaction level, application and implementation of the rules and regulations should be based on the organizational needs.
Wong, Brian J F; Karimi, Koohyar; Devcic, Zlatko; McLaren, Christine E; Chen, Wen-Pin
2008-06-01
The objectives of this study were to: 1) determine if a genetic algorithm in combination with morphing software can be used to evolve more attractive faces; and 2) evaluate whether this approach can be used as a tool to define or identify the attributes of the ideal attractive face. Basic research study incorporating focus group evaluations. Digital images were acquired of 250 female volunteers (18-25 y). Randomly selected images were used to produce a parent generation (P) of 30 synthetic faces using morphing software. Then, a focus group of 17 trained volunteers (18-25 y) scored each face on an attractiveness scale ranging from 1 (unattractive) to 10 (attractive). A genetic algorithm was used to select 30 new pairs from the parent generation, and these were morphed using software to produce a new first generation (F1) of faces. The F1 faces were scored by the focus group, and the process was repeated for a total of four iterations of the algorithm. The algorithm mimics natural selection by using the attractiveness score as the selection pressure; the more attractive faces are more likely to morph. All five generations (P-F4) were then scored by three focus groups: a) surgeons (n = 12), b) cos-metology students (n = 44), and c) undergraduate students (n = 44). Morphometric measurements were made of 33 specific features on each of the 150 synthetic faces, and correlated with attractiveness scores using univariate and multivariate analysis. The average facial attractiveness scores increased with each generation and were 3.66 (+0.60), 4.59 (+/-0.73), 5.50 (+/-0.62), 6.23 (+/-0.31), and 6.39 (+/-0.24) for P and F1-F4 generations, respectively. Histograms of attractiveness score distributions show a significant shift in the skew of each curve toward more attractive faces with each generation. Univariate analysis identified nasal width, eyebrow arch height, and lip thickness as being significantly correlated with attractiveness scores. Multivariate analysis identified a similar collection of morphometric measures. No correlation with more commonly accepted measures such as the length facial thirds or fifths were identified. When images are examined as a montage (by generation), clear distinct trends are identified: oval shaped faces, distinct arched eyebrows, and full lips predominate. Faces evolve to approximate the guidelines suggested by classical canons. F3 and F4 generation faces look profoundly similar. The statistical and qualitative analysis indicates that the algorithm and methodology succeeds in generating successively more attractive faces. The use of genetic algorithms in combination with a morphing software and traditional focus-group derived attractiveness scores can be used to evolve attractive synthetic faces. We have demonstrated that the evolution of attractive faces can be mimicked in software. Genetic algorithms and morphing provide a robust alternative to traditional approaches rooted in comparing attractiveness scores with a series of morphometric measurements in human subjects.
Beuscart-Zéphir, Marie-Catherine; Pelayo, Sylvia; Bernonville, Stéphanie
2010-04-01
The objectives of this paper are: In this approach, the implementation of such a complex IT solution is considered a major redesign of the work system. The paper describes the Human Factor (HF) tasks embedded in the project lifecycle: (1) analysis and modelling of the current work system and usability assessment of the medication CPOE solution; (2) HF recommendations for work re-design and usability recommendations for IT system re-engineering both aiming at a safer and more efficient work situation. Standard ethnographic methods were used to support the analysis of the current work system and work situations, coupled with cognitive task analysis methods and documents review. Usability inspection (heuristic evaluation) and both in-lab (simulated tasks) and on-site (real tasks) usability tests were performed for the evaluation of the CPOE candidate. Adapted software engineering models were used in combination with usual textual descriptions, tasks models and mock-ups to support the recommendations for work and product re-design. The analysis of the work situations identified different work organisations and procedures across the hospital's departments. The most important differences concerned the doctor-nurse communications and cooperation modes and the procedures for preparing and administering the medications. The assessment of the medication CPOE functions uncovered a number of usability problems including severe ones leading to impossible to detect or to catch errors. Models of the actual and possible distribution of tasks and roles were used to support decision making in the work design process. The results of the usability assessment were translated into requirements to support the necessary re-engineering of the IT application. The HFE approach to medication CPOE efficiently identifies and distinguishes currently unsafe or uncomfortable work situations that could obviously benefit from an IT solution from other work situations incorporating efficient work procedures that might be impaired by the implementation of the CPOE. In this context, a careful redesign of the work situation and of the entire work system is necessary to actually benefit from the installation of the product in terms of patient safety and human performances. In parallel, a usability assessment of the product to be implemented is mandatory to identify potentially dangerous usability flaws and to fix them before the installation. (c) 2009 Elsevier Ireland Ltd. All rights reserved.
Network-Based Analysis of Software Change Propagation
Wang, Rongcun; Qu, Binbin
2014-01-01
The object-oriented software systems frequently evolve to meet new change requirements. Understanding the characteristics of changes aids testers and system designers to improve the quality of softwares. Identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at class level. And then, the number of times of cochanges among classes is minded from software repositories. According to the dependency relationships and the number of times of cochanges among classes, the scope of change propagation is calculated. Using Spearman rank correlation analyzes the correlation between centrality measures and the scope of change propagation. Three case studies on java open source software projects Findbugs, Hibernate, and Spring are conducted to research the characteristics of change propagation. Experimental results show that (i) change distribution is very uneven; (ii) PageRank, Degree, and CIRank are significantly correlated to the scope of change propagation. Particularly, CIRank shows higher correlation coefficient, which suggests it can be a more useful indicator for measuring the scope of change propagation of classes in object-oriented software system. PMID:24790557
Network-based analysis of software change propagation.
Wang, Rongcun; Huang, Rubing; Qu, Binbin
2014-01-01
The object-oriented software systems frequently evolve to meet new change requirements. Understanding the characteristics of changes aids testers and system designers to improve the quality of softwares. Identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at class level. And then, the number of times of cochanges among classes is minded from software repositories. According to the dependency relationships and the number of times of cochanges among classes, the scope of change propagation is calculated. Using Spearman rank correlation analyzes the correlation between centrality measures and the scope of change propagation. Three case studies on java open source software projects Findbugs, Hibernate, and Spring are conducted to research the characteristics of change propagation. Experimental results show that (i) change distribution is very uneven; (ii) PageRank, Degree, and CIRank are significantly correlated to the scope of change propagation. Particularly, CIRank shows higher correlation coefficient, which suggests it can be a more useful indicator for measuring the scope of change propagation of classes in object-oriented software system.
[The application of new technologies to hospital pharmacy in Spain].
Bermejo Vicedo, T; Pérez Menéndez Conde, C; Alvarez, Ana; Codina, Carlos; Delgado, Olga; Herranz, Ana; Hidalgo Correas, Francisco; Martín, Isabel; Martínez, Julio; Luis Poveda, José; Queralt Gorgas, María; Sanjurjo Sáez, María
2007-01-01
To describe the degree of introduction of new technologies in the medication use process in pharmacy services in Spain. A descriptive study via a survey into the degree of introduction of computer systems for: management, computerized physician order entry (CPOE), automated unit dose drug dispensing, preparation of parenteral nutrition solutions, recording drug administration, pharmaceutical care and foreseen improvements. The survey was sent by electronic mail to the heads of the pharmacy services of 207 hospitals throughout Spain. Response index: 82 hospitals (38.6%). 29 hospitals (36.7%) have a modular management system, 24 (30.4%) an integrated one and 34 (44.9%) a modular-integrated one. CPOE is utilised in 17 (22.4%). According to the size of the hospital, between 17.9 and 26.7% of unit dose dispensing is done online with a management software; between 5.1 and 33.3% of unit dose dispensing is automated. Automation of unit dose dispensing centred in the pharmacy service varies between 10 and 33.3%. Between 13.2 and 35.7% of automated in-ward dispensing systems are utilised. Administration records are kept manually on a computerised sheet at 23 (31.5%) of the hospitals; at 4 (5.4%) on CPOE and 7 (9.5%) online on the integral management programme and 4 (5.4%) on specific nursing softwares. Sixty-three per cent foresee the implementation of improvements in the short to medium term. The introduction of new technologies is being developed in Spain aiming to improve the safety and management of drugs, and there is a trend towards increasing their deployment in the near future. It is hoped that their fomentation could help to bring about process reengineering within pharmacy services in order to increase the time available for devotion to pharmaceutical care.
Information Technology Architectures. New Opportunities for Partnering, CAUSE94. Track VI.
ERIC Educational Resources Information Center
CAUSE, Boulder, CO.
Eight papers are presented from the 1994 CAUSE conference track on information technology architectures as applied to higher education institutions. The papers include: (1) "Reshaping the Enterprise: Building the Next Generation of Information Systems Through Information Architecture and Processing Reengineering," which notes…
Measuring Institutional Performance in Higher Education.
ERIC Educational Resources Information Center
Meyerson, Joel W., Ed.; Massy, William F., Ed.
This collection of seven essays from the Stanford Forum for Higher Education Futures focuses on how downsizing, quality management, and reengineering have are affecting higher education. An introductory paper, "Introduction: Change in Higher Education: Its Effect on Institutional Performance," (Joel W. Meyerson and Sandra L. Johnson)…
Re-Engineering Graduate Skills--A Case Study
ERIC Educational Resources Information Center
Nair, Chenicheri Sid; Patil, Arun; Mertova, Patricie
2009-01-01
Research on student-learning outcomes indicates that university graduates do not possess important skills required by employers, such as communication, decision-making, problem-solving, leadership, emotional intelligence, social ethics skills as well as the ability to work with people of different backgrounds. Today, engineering graduates are…
Thriving and Not Just Surviving: New Directions for Tomorrow's Performance-Improvement Managers.
ERIC Educational Resources Information Center
Kaufman, Roger
2000-01-01
Examines popular approaches to performance and organizational improvement to see what flaws they contain and how continued practice will impede progress. Discusses benchmarking; quality management/continuous improvement; needs assessment; training; downsizing; reengineering; system(s) approach; and strategic planning. Describes the Organizational…
The National Ambient Air Monitoring Stategy: Rethinking the Role of National Networks
A current re-engineering of the United States routine ambient monitoring networks intended to improve the balance in addressing both regulatory and scientific objectives is addressed in this paper. Key attributes of these network modifications include the addition of collocated ...
Integrating Thermal Tools Into the Mechanical Design Process
NASA Technical Reports Server (NTRS)
Tsuyuki, Glenn T.; Siebes, Georg; Novak, Keith S.; Kinsella, Gary M.
1999-01-01
The intent of mechanical design is to deliver a hardware product that meets or exceeds customer expectations, while reducing cycle time and cost. To this end, an integrated mechanical design process enables the idea of parallel development (concurrent engineering). This represents a shift from the traditional mechanical design process. With such a concurrent process, there are significant issues that have to be identified and addressed before re-engineering the mechanical design process to facilitate concurrent engineering. These issues also assist in the integration and re-engineering of the thermal design sub-process since it resides within the entire mechanical design process. With these issues in mind, a thermal design sub-process can be re-defined in a manner that has a higher probability of acceptance, thus enabling an integrated mechanical design process. However, the actual implementation is not always problem-free. Experience in applying the thermal design sub-process to actual situations provides the evidence for improvement, but more importantly, for judging the viability and feasibility of the sub-process.
NASA Astrophysics Data System (ADS)
Shen, Chien-wen; Chou, Ching-Chih
2010-02-01
As business process re-engineering (BPR) is an important foundation to ensure the success of enterprise systems, this study would like to investigate the relationships among BPR implementation, BPR success factors, and business performance for logistics companies. Our empirical findings show that BPR companies outperformed non-BPR companies, not only on information processing, technology applications, organisational structure, and co-ordination, but also on all of the major logistics operations. Comparing the different perceptions of the success factors for BPR, non-BPR companies place greater emphasis on the importance of employee involvement while BPR companies are more concerned about the influence of risk management. Our findings also suggest that management attitude towards BPR success factors could affect performance with regard to technology applications and logistics operations. Logistics companies which have not yet implemented the BPR approach could refer to our findings to evaluate the advantages of such an undertaking and to take care of those BPR success factors affecting performance before conducting BPR projects.
Survey of business process management: challenges and solutions
NASA Astrophysics Data System (ADS)
Alotaibi, Youseef; Liu, Fei
2017-09-01
The current literature shows that creating a good framework on business process model (PM) is not an easy task. A successful business PM should have the ability to ensure accurate alignment between business processes (BPs) and information technology (IT) designs, provide security protection, manage the rapidly changing business environment and BPs, manage customer power, be flexible for reengineering and ensure that IT goals can be easily derived from business goals and hence an information system (IS) can be easily implemented. This article presents an overview of research in the business PM domain. We have presented a review of the challenges facing business PMs, such as misalignment between business and IT, difficulty of deriving IT goals from business goals, creating secured business PM, reengineering BPs, managing the rapidly changing BP and business environment and managing customer power. Also, it presents the limitations of existing business PM frameworks. Finally, we outline several guidelines to create good business PM and the possible further research directions in the business PM domain.
Mapping modern software process engineering techniques onto an HEP development environment
NASA Astrophysics Data System (ADS)
Wellisch, J. P.
2003-04-01
One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment with the goal to facilitate doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504, and Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create of de facto software process standards within the CMS off line community as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned, and mistakes made. We will demonstrate the benefits gained, and the current status of the software processes established in CMS off-line software.
A digital algorithm for spectral deconvolution with noise filtering and peak picking: NOFIPP-DECON
NASA Technical Reports Server (NTRS)
Edwards, T. R.; Settle, G. L.; Knight, R. D.
1975-01-01
Noise-filtering, peak-picking deconvolution software incorporates multiple convoluted convolute integers and multiparameter optimization pattern search. The two theories are described and three aspects of the software package are discussed in detail. Noise-filtering deconvolution was applied to a number of experimental cases ranging from noisy, nondispersive X-ray analyzer data to very noisy photoelectric polarimeter data. Comparisons were made with published infrared data, and a man-machine interactive language has evolved for assisting in very difficult cases. A modified version of the program is being used for routine preprocessing of mass spectral and gas chromatographic data.
Frequency Estimator Performance for a Software-Based Beacon Receiver
NASA Technical Reports Server (NTRS)
Zemba, Michael J.; Morse, Jacquelynne R.; Nessel, James A.
2014-01-01
As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with a much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for usage in a Q/V-band beacon receiver, analysis of six frequency estimators was conducted to characterize their effectiveness as they relate to beacon receiver design.
GeoFramework: A Modeling Framework for Solid Earth Geophysics
NASA Astrophysics Data System (ADS)
Gurnis, M.; Aivazis, M.; Tromp, J.; Tan, E.; Thoutireddy, P.; Liu, Q.; Choi, E.; Dicaprio, C.; Chen, M.; Simons, M.; Quenette, S.; Appelbe, B.; Aagaard, B.; Williams, C.; Lavier, L.; Moresi, L.; Law, H.
2003-12-01
As data sets in geophysics become larger and of greater relevance to other earth science disciplines, and as earth science becomes more interdisciplinary in general, modeling tools are being driven in new directions. There is now a greater need to link modeling codes to one another, link modeling codes to multiple datasets, and to make modeling software available to non modeling specialists. Coupled with rapid progress in computer hardware (including the computational speed afforded by massively parallel computers), progress in numerical algorithms, and the introduction of software frameworks, these lofty goals of merging software in geophysics are now possible. The GeoFramework project, a collaboration between computer scientists and geoscientists, is a response to these needs and opportunities. GeoFramework is based on and extends Pyre, a Python-based modeling framework, recently developed to link solid (Lagrangian) and fluid (Eulerian) models, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. The utility and generality of Pyre as a general purpose framework in science is now being recognized. Besides its use in engineering and geophysics, it is also being used in particle physics and astronomy. Geology and geophysics impose their own unique requirements on software frameworks which are not generally available in existing frameworks and so there is a need for research in this area. One of the special requirements is the way Lagrangian and Eulerian codes will need to be linked in time and space within a plate tectonics context. GeoFramework has grown beyond its initial goal of linking a limited number of exiting codes together. The following codes are now being reengineered within the context of Pyre: Tecton, 3-D FE Visco-elastic code for lithospheric relaxation; CitComS, a code for spherical mantle convection; SpecFEM3D, a SEM code for global and regional seismic waves; eqsim, a FE code for dynamic earthquake rupture; SNAC, a developing 3-D coded based on the FLAC method for visco-elastoplastic deformation; SNARK, a 3-D FE-PIC method for viscoplastic deformation; and gPLATES an open source paleogeographic/plate tectonics modeling package. We will demonstrate how codes can be linked with themselves, such as a regional and global model of mantle convection and a visco-elastoplastic representation of the crust within viscous mantle flow. Finally, we will describe how http://GeoFramework.org has become a distribution site for a suite of modeling software in geophysics.
Advanced Lockouts: Reengineering Safety Programs for Efficiency.
Michalscheck, Jimi
2015-08-01
Remember one golden rule when engineering out lockout/tagout: No additional risk can be introduced to the employees by using alternative procedures. If you can design alternative procedures and an overall alternative program to ensure equivalent protection for specific tasks...the sky is the limit to enhancing productivity.
Communicating Reengineering at Naval Facilities Engineering Command, Southwest Division
2002-09-01
Systems, a California- based division of the Japanese company, implemented a communications messages built around Elvis Presley songs, which helped...people to realize how much change will be required. As many of the people within this organization were familiar with Elvis Presley’s music, the
Re-Engineering the Stomatopod Eye, Nature’s Most Comprehensive Visual Sensor
2013-08-22
degree of polarisation difference. Stomatopods (Right) also react to similar looming stonuli by retreating or showing eye movements . However, the...N Roberts, NJ Marshall 2013 "Varying degrees of polarization vision in octopus : interaction between degree and angle of polarization in contrast
NASA Technical Reports Server (NTRS)
Pourmal, Elena
2016-01-01
The HDF Group maintains and evolves HDF software used by NASA ESDIS program to manage remote sense data. In this talk we will discuss new features of HDF (Virtual Datasets, Single writerMultiple reader access, Community supported HDF5 compression filters) that address storage and IO performance requirements of the applications that work with the ESDIS data products.
ERIC Educational Resources Information Center
Honts, Jerry E.
2003-01-01
Recent advances in genomics and structural biology have resulted in an unprecedented increase in biological data available from Internet-accessible databases. In order to help students effectively use this vast repository of information, undergraduate biology students at Drake University were introduced to bioinformatics software and databases in…
ERIC Educational Resources Information Center
Deek, Fadi; Espinosa, Idania
2005-01-01
Traditionally, novice programmers have had difficulties in three distinct areas: breaking down a given problem, designing a workable solution, and debugging the resulting program. Many programming environments, software applications, and teaching tools have been developed to address the difficulties faced by these novices. Along with advancements…
ERIC Educational Resources Information Center
Ayers, David F.
2015-01-01
Objective: To examine the discursive strategies deployed by community colleges to sustain legitimacy in an evolving and contradictory institutional environment. Method: Using corpus linguistics software, I compared 1,009 mission statements from 2012-2013 with a reference corpus of 427 mission statements from 2004. Results: Keywords analysis,…
ERIC Educational Resources Information Center
Sexton, Randall; Hignite, Michael; Margavio, Thomas M.; Margavio, Geanie W.
2009-01-01
Information Literacy is a concept that evolved as a result of efforts to move technology-based instructional and research efforts beyond the concepts previously associated with "computer literacy." While computer literacy was largely a topic devoted to knowledge of hardware and software, information literacy is concerned with students' abilities…
ERIC Educational Resources Information Center
Goel, Sanjay
2010-01-01
Community and culture significantly influence value orientation, perceived needs, and motivation as well as provide the ground for creating shared understanding. All disciplines have their own cultures, and all cultures evolve through cross-cultural exchanges. The computing community has created and documented a sound body of knowledge of…
Provenance-Based Approaches to Semantic Web Service Discovery and Usage
ERIC Educational Resources Information Center
Narock, Thomas William
2012-01-01
The World Wide Web Consortium defines a Web Service as "a software system designed to support interoperable machine-to-machine interaction over a network." Web Services have become increasingly important both within and across organizational boundaries. With the recent advent of the Semantic Web, web services have evolved into semantic…
ERIC Educational Resources Information Center
Winer, Laura R.; Cooperstock, Jeremy
2002-01-01
Describes the development and use of the Intelligent Classroom collaborative project at McGill University that explored technology use to improve teaching and learning. Explains the hardware and software installation that allows for the automated capture of audio, video, slides, and handwritten annotations during a live lecture, with subsequent…
Search for supporting methodologies - Or how to support SEI for 35 years
NASA Technical Reports Server (NTRS)
Handley, Thomas H., Jr.; Masline, Richard C.
1991-01-01
Concepts relevant to the development of an evolvable information management system are examined in terms of support for the Space Exploration Initiative. The issues of interoperability within NASA and industry initiatives are studied including the Open Systems Interconnection standard and the operating system of the Open Software Foundation. The requirements of partitioning functionality into separate areas are determined with attention given to the infrastructure required to ensure system-wide compliance. The need for a decision-making context is a key to the distributed implementation of the program, and this environment is concluded to be next step in developing an evolvable, interoperable, and securable support network.
Solar-terrestrial data access distribution and archiving
NASA Technical Reports Server (NTRS)
1984-01-01
It is recommended that a central data catalog and data access network (CDC/DAN) for solar-terrestrial research be established, initially as a NASA pilot program. The system is envisioned to be flexible and to evolve as funds permit, starting from a catalog to an access network for high-resolution data. The report describes the various functional requirements for the CDC/DAN, but does not specify the hardware and software architectures as these are constantly evolving. The importance of a steering committee, working with the CDC/DAN organization, to provide scientific guidelines for the data catalog and for data storage, access, and distribution is also stressed.
Ruan, Jujun; Zhang, Chao; Li, Ya; Li, Peiyi; Yang, Zaizhi; Chen, Xiaohong; Huang, Mingzhi; Zhang, Tao
2017-02-01
This work proposes an on-line hybrid intelligent control system based on a genetic algorithm (GA) evolving fuzzy wavelet neural network software sensor to control dissolved oxygen (DO) in an anaerobic/anoxic/oxic process for treating papermaking wastewater. With the self-learning and memory abilities of neural network, handling the uncertainty capacity of fuzzy logic, analyzing local detail superiority of wavelet transform and global search of GA, this proposed control system can extract the dynamic behavior and complex interrelationships between various operation variables. The results indicate that the reasonable forecasting and control performances were achieved with optimal DO, and the effluent quality was stable at and below the desired values in real time. Our proposed hybrid approach proved to be a robust and effective DO control tool, attaining not only adequate effluent quality but also minimizing the demand for energy, and is easily integrated into a global monitoring system for purposes of cost management. Copyright © 2016 Elsevier Ltd. All rights reserved.
Tran, Simon D; Sugito, Takayuki; Dipasquale, Giovanni; Cotrim, Ana P; Bandyopadhyay, Bidhan C; Riddle, Kathryn; Mooney, David; Kok, Marc R; Chiorini, John A; Baum, Bruce J
2006-10-01
There is no satisfactory conventional treatment for patients who experience irreversible salivary gland damage after therapeutic radiation for head and neck cancer or because of Sjögren's syndrome. Additionally, if most parenchyma is lost, these patients also are not candidates for evolving gene transfer strategies. To help such patients, several years ago we began to develop an artificial salivary gland. In the present study, we used a non-human primate tissue source, parotid glands from rhesus monkeys, to obtain potential autologous graft cells for development of a prototype device for in situ testing. Herein, we present 3 major findings. First, we show that primary cultures of rhesus parotid gland (RPG) cells are capable of attaining a polarized orientation, with Na(+)/K(+)-adenosine triphosphatase, zonula occludens-1, and claudin-1 distributed in specific domains appropriate for epithelial cells. Second, we show that RPG cells exhibit 2 essential epithelial functions required for graft cells in an artificial salivary gland device (i.e., an effective barrier to paracellular water flow and the generation of a moderate transepithelial electrical resistance). Third, we show that RPG cells can express functional water channels, capable of mediating directional fluid movement, after transduction by adenoviral and adeno-associated virus type 2 vectors. Together these results demonstrate that it is feasible to individually prepare RPG cells for eventual use in a prototype artificial salivary gland.
Achieving success: assessing the role of and building a business case for technology in healthcare.
Neumann, C L; Blouin, A S; Byrne, E M
1999-01-01
As the healthcare market continues to evolve, technology will play an increasingly important role in an integrated delivery system's ability to provide high-quality, cost-effective care. Healthcare leaders must be proactive and forward thinking about their technology investments. The financial investment for technology innovation can be significant. Therefore, it is important that healthcare executives deliberately design the role of technology and develop a consistent method for evaluating, identifying, and prioritizing technology investments. The article begins by describing technology's role in a healthcare organization as a window to the organization, a key driver of business strategy, and a high-performance enabler, and it develops a seven-step process for building a business case to ensure that an organization's technology investments are wise, well-reasoned, and will provide value to its customers. In addition, the article discusses the importance of combining people and process reengineering with new technology to exponentially increase the value to an organization. Healthcare leaders must understand the multiple roles of technology and consistently develop a business case when making technology investment decisions. Organizations driven by such an understanding will have a robust infrastructure of enabling technology designed to integrate people and process elements with technology to achieve the goals and initiatives of the organization. These organizations will lead the healthcare industry into the next millennium.
Strasser, Jane E; Cola, Philip A; Rosenblum, Daniel
2013-08-01
Emphasis has been placed on assessing the efficiency of clinical and translational research as part of the National Institutes of Health (NIH) goal to "improve human health." Improvements identified and implemented by individual organizations cannot address the research infrastructure needs of all clinical and translational research conducted. NIH's National Center for Advancing Translational Sciences (NCATS) has brought together 61 Clinical and Translational Science Award (CTSA) sites creating a virtual national laboratory that reflects the diversity and breadth of academic medical centers to collectively improve clinical and translational science. The annual Clinical Research Management workshop is organized by the CTSA consortium with participation from CTSA awardees, NIH, and others with an interest in clinical research management. The primary objective of the workshop is to disseminate information that improves clinical research management although the specific objectives of each workshop evolve within the consortium. The fifth annual workshop entitled "Learning by doing; applying evidence-based tools to re-engineer clinical research management" took place in June 2012. The primary objective of the 2012 workshop was to utilize data to evaluate, modify, and improve clinical research management. This report provides a brief summary of the workshop proceedings and the major themes discussed among the participants. © 2013 Wiley Periodicals, Inc.
Weber, Tobias; Mavratzas, Athanasios; Kiesgen, Stefan; Haase, Stephanie; Bötticher, Benedikt; Exner, Evelyn; Mier, Walter; Grosse-Hovest, Ludger; Jäger, Dirk; Arndt, Michaela A E; Krauss, Jürgen
2015-01-01
Antibody-drug conjugates (ADCs) have evolved as a new class of potent cancer therapeutics. We here report on the development of ADCs with specificity for the B-cell lineage specific (surface) antigen CD22 being expressed in the majority of hematological malignancies. As targeting moiety a previously generated humanized anti-CD22 single-chain variable fragment (scFv) derivative from the monoclonal antibody RFB4 was reengineered into a humanized IgG1 antibody format (huRFB4). Onconase (ranpirnase), a clinically active pancreatic-type ribonuclease, was employed as cytotoxic payload moiety. Chemical conjugation via thiol-cleavable disulfide linkage retained full enzymatic activity and full binding affinity of the ADC. Development of sophisticated purification procedures using size exclusion and ion exchange chromatography allowed the separation of immunoconjugate species with stoichiometrically defined number of Onconase cargos. A minimum of two Onconase molecules per IgG was required for achieving significant in vitro cytotoxicity towards lymphoma and leukemia cell lines. Antibody-drug conjugates with an Onconase to antibody ratio of 3 : 1 exhibited an IC50 of 0.08 nM, corresponding to more than 18,400-fold increased cytotoxicity of the ADC when compared with unconjugated Onconase. These results justify further development of this ADC as a promising first-in-class compound for the treatment of CD22-positive malignancies.
Weber, Tobias; Mavratzas, Athanasios; Kiesgen, Stefan; Haase, Stephanie; Bötticher, Benedikt; Exner, Evelyn; Mier, Walter; Grosse-Hovest, Ludger; Jäger, Dirk; Arndt, Michaela A. E.; Krauss, Jürgen
2015-01-01
Antibody-drug conjugates (ADCs) have evolved as a new class of potent cancer therapeutics. We here report on the development of ADCs with specificity for the B-cell lineage specific (surface) antigen CD22 being expressed in the majority of hematological malignancies. As targeting moiety a previously generated humanized anti-CD22 single-chain variable fragment (scFv) derivative from the monoclonal antibody RFB4 was reengineered into a humanized IgG1 antibody format (huRFB4). Onconase (ranpirnase), a clinically active pancreatic-type ribonuclease, was employed as cytotoxic payload moiety. Chemical conjugation via thiol-cleavable disulfide linkage retained full enzymatic activity and full binding affinity of the ADC. Development of sophisticated purification procedures using size exclusion and ion exchange chromatography allowed the separation of immunoconjugate species with stoichiometrically defined number of Onconase cargos. A minimum of two Onconase molecules per IgG was required for achieving significant in vitro cytotoxicity towards lymphoma and leukemia cell lines. Antibody-drug conjugates with an Onconase to antibody ratio of 3 : 1 exhibited an IC50 of 0.08 nM, corresponding to more than 18,400-fold increased cytotoxicity of the ADC when compared with unconjugated Onconase. These results justify further development of this ADC as a promising first-in-class compound for the treatment of CD22-positive malignancies. PMID:26605343
Re-Engineering Alzheimer Clinical Trials: Global Alzheimer's Platform Network.
Cummings, J; Aisen, P; Barton, R; Bork, J; Doody, R; Dwyer, J; Egan, J C; Feldman, H; Lappin, D; Truyen, L; Salloway, S; Sperling, R; Vradenburg, G
2016-06-01
Alzheimer's disease (AD) drug development is costly, time-consuming, and inefficient. Trial site functions, trial design, and patient recruitment for trials all require improvement. The Global Alzheimer Platform (GAP) was initiated in response to these challenges. Four GAP work streams evolved in the US to address different trial challenges: 1) registry-to-cohort web-based recruitment; 2) clinical trial site activation and site network construction (GAP-NET); 3) adaptive proof-of-concept clinical trial design; and 4) finance and fund raising. GAP-NET proposes to establish a standardized network of continuously funded trial sites that are highly qualified to perform trials (with established clinical, biomarker, imaging capability; certified raters; sophisticated management system. GAP-NET will conduct trials for academic and biopharma industry partners using standardized instrument versions and administration. Collaboration with the Innovative Medicines Initiative (IMI) European Prevention of Alzheimer's Disease (EPAD) program, the Canadian Consortium on Neurodegeneration in Aging (CCNA) and other similar international initiatives will allow conduct of global trials. GAP-NET aims to increase trial efficiency and quality, decrease trial redundancy, accelerate cohort development and trial recruitment, and decrease trial costs. The value proposition for sites includes stable funding and uniform training and trial execution; the value to trial sponsors is decreased trial costs, reduced time to execute trials, and enhanced data quality. The value for patients and society is the more rapid availability of new treatments for AD.
A posteriori operation detection in evolving software models
Langer, Philip; Wimmer, Manuel; Brosch, Petra; Herrmannsdörfer, Markus; Seidl, Martina; Wieland, Konrad; Kappel, Gerti
2013-01-01
As every software artifact, also software models are subject to continuous evolution. The operations applied between two successive versions of a model are crucial for understanding its evolution. Generic approaches for detecting operations a posteriori identify atomic operations, but neglect composite operations, such as refactorings, which leads to cluttered difference reports. To tackle this limitation, we present an orthogonal extension of existing atomic operation detection approaches for detecting also composite operations. Our approach searches for occurrences of composite operations within a set of detected atomic operations in a post-processing manner. One major benefit is the reuse of specifications available for executing composite operations also for detecting applications of them. We evaluate the accuracy of the approach in a real-world case study and investigate the scalability of our implementation in an experiment. PMID:23471366
FAST: A multi-processed environment for visualization of computational fluid dynamics
NASA Technical Reports Server (NTRS)
Bancroft, Gordon V.; Merritt, Fergus J.; Plessel, Todd C.; Kelaita, Paul G.; Mccabe, R. Kevin
1991-01-01
Three-dimensional, unsteady, multi-zoned fluid dynamics simulations over full scale aircraft are typical of the problems being investigated at NASA Ames' Numerical Aerodynamic Simulation (NAS) facility on CRAY2 and CRAY-YMP supercomputers. With multiple processor workstations available in the 10-30 Mflop range, we feel that these new developments in scientific computing warrant a new approach to the design and implementation of analysis tools. These larger, more complex problems create a need for new visualization techniques not possible with the existing software or systems available as of this writing. The visualization techniques will change as the supercomputing environment, and hence the scientific methods employed, evolves even further. The Flow Analysis Software Toolkit (FAST), an implementation of a software system for fluid mechanics analysis, is discussed.
NASA Technical Reports Server (NTRS)
2001-01-01
DATASTAR, Inc., of Picayune, Miss., has taken NASA's award-winning Earth Resources Laboratory Applications (ELAS) software program and evolved it to the point that the company is now providing a unique, spatial imagery service over the Internet. ELAS was developed in the early 80's to process satellite and airborne sensor imagery data of the Earth's surface into readable and useable information. While there are several software packages on the market that allow the manipulation of spatial data into useable products, this is usually a laborious task. The new program, called the DATASTAR Image Processing Exploitation, or DIPX, Delivery Service, is a subscription service available over the Internet that takes the work out of the equation and provides normalized geo-spatial data in the form of decision products.
NASA Technical Reports Server (NTRS)
1989-01-01
At their March 1988 meeting, members of the National Aeronautics and Space Administration (NASA) Information Resources Management (IRM) Council expressed concern that NASA may not have the infrastructure necessary to support the use of Ada for major NASA software projects. Members also observed that the agency has no coordinated strategy for applying its experiences with Ada to subsequent projects (Hinners, 27 June 1988). To deal with these problems, the IRM Council chair appointed an intercenter Ada and Software Management Assessment Working Group (ASMAWG). They prepared a report (McGarry et al., March 1989) entitled, 'Ada and Software Management in NASA: Findings and Recommendations'. That report presented a series of recommendations intended to enable NASA to develop better software at lower cost through the use of Ada and other state-of-the-art software engineering technologies. The purpose here is to describe the steps (called objectives) by which this goal may be achieved, to identify the NASA officials or organizations responsible for carrying out the steps, and to define a schedule for doing so. This document sets forth four goals: adopt agency-wide software standards and policies; use Ada as the programming language for all mission software; establish an infrastructure to support software engineering, including the use of Ada, and to leverage the agency's software experience; and build the agency's knowledge base in Ada and software engineering. A schedule for achieving the objectives and goals is given.
The Legacy of Space Shuttle Flight Software
NASA Technical Reports Server (NTRS)
Hickey, Christopher J.; Loveall, James B.; Orr, James K.; Klausman, Andrew L.
2011-01-01
The initial goals of the Space Shuttle Program required that the avionics and software systems blaze new trails in advancing avionics system technology. Many of the requirements placed on avionics and software were accomplished for the first time on this program. Examples include comprehensive digital fly-by-wire technology, use of a digital databus for flight critical functions, fail operational/fail safe requirements, complex automated redundancy management, and the use of a high-order software language for flight software development. In order to meet the operational and safety goals of the program, the Space Shuttle software had to be extremely high quality, reliable, robust, reconfigurable and maintainable. To achieve this, the software development team evolved a software process focused on continuous process improvement and defect elimination that consistently produced highly predictable and top quality results, providing software managers the confidence needed to sign each Certificate of Flight Readiness (COFR). This process, which has been appraised at Capability Maturity Model (CMM)/Capability Maturity Model Integration (CMMI) Level 5, has resulted in one of the lowest software defect rates in the industry. This paper will present an overview of the evolution of the Primary Avionics Software System (PASS) project and processes over thirty years, an argument for strong statistical control of software processes with examples, an overview of the success story for identifying and driving out errors before flight, a case study of the few significant software issues and how they were either identified before flight or slipped through the process onto a flight vehicle, and identification of the valuable lessons learned over the life of the project.
MATTS- A Step Towards Model Based Testing
NASA Astrophysics Data System (ADS)
Herpel, H.-J.; Willich, G.; Li, J.; Xie, J.; Johansen, B.; Kvinnesland, K.; Krueger, S.; Barrios, P.
2016-08-01
In this paper we describe a Model Based approach to testing of on-board software and compare it with traditional validation strategy currently applied to satellite software. The major problems that software engineering will face over at least the next two decades are increasing application complexity driven by the need for autonomy and serious application robustness. In other words, how do we actually get to declare success when trying to build applications one or two orders of magnitude more complex than today's applications. To solve the problems addressed above the software engineering process has to be improved at least for two aspects: 1) Software design and 2) Software testing. The software design process has to evolve towards model-based approaches with extensive use of code generators. Today, testing is an essential, but time and resource consuming activity in the software development process. Generating a short, but effective test suite usually requires a lot of manual work and expert knowledge. In a model-based process, among other subtasks, test construction and test execution can also be partially automated. The basic idea behind the presented study was to start from a formal model (e.g. State Machines), generate abstract test cases which are then converted to concrete executable test cases (input and expected output pairs). The generated concrete test cases were applied to an on-board software. Results were collected and evaluated wrt. applicability, cost-efficiency, effectiveness at fault finding, and scalability.
Unidata's Vision for Transforming Geoscience by Moving Data Services and Software to the Cloud
NASA Astrophysics Data System (ADS)
Ramamurthy, M. K.; Fisher, W.; Yoksas, T.
2014-12-01
Universities are facing many challenges: shrinking budgets, rapidly evolving information technologies, exploding data volumes, multidisciplinary science requirements, and high student expectations. These changes are upending traditional approaches to accessing and using data and software. It is clear that Unidata's products and services must evolve to support new approaches to research and education. After years of hype and ambiguity, cloud computing is maturing in usability in many areas of science and education, bringing the benefits of virtualized and elastic remote services to infrastructure, software, computation, and data. Cloud environments reduce the amount of time and money spent to procure, install, and maintain new hardware and software, and reduce costs through resource pooling and shared infrastructure. Cloud services aimed at providing any resource, at any time, from any place, using any device are increasingly being embraced by all types of organizations. Given this trend and the enormous potential of cloud-based services, Unidata is taking moving to augment its products, services, data delivery mechanisms and applications to align with the cloud-computing paradigm. Specifically, Unidata is working toward establishing a community-based development environment that supports the creation and use of software services to build end-to-end data workflows. The design encourages the creation of services that can be broken into small, independent chunks that provide simple capabilities. Chunks could be used individually to perform a task, or chained into simple or elaborate workflows. The services will also be portable, allowing their use in researchers' own cloud-based computing environments. In this talk, we present a vision for Unidata's future in a cloud-enabled data services and discuss our initial efforts to deploy a subset of Unidata data services and tools in the Amazon EC2 and Microsoft Azure cloud environments, including the transfer of real-time meteorological data into its cloud instances, product generation using those data, and the deployment of TDS, McIDAS ADDE and AWIPS II data servers and the Integrated Data Server visualization tool.
Better Instructional Design Theory: Process Improvement or Reengineering?
ERIC Educational Resources Information Center
Dick, Walter
1997-01-01
Discusses three ways that instructional design theories can change over time: (1) revision via evolution of models to reflect the outcomes that are being achieved with its current use; (2) revision to reflect current understanding of technology; and (3) complete replacement of present theory with another more powerful theory. Describes the…
Supply-Chain Optimization Template
NASA Technical Reports Server (NTRS)
Quiett, William F.; Sealing, Scott L.
2009-01-01
The Supply-Chain Optimization Template (SCOT) is an instructional guide for identifying, evaluating, and optimizing (including re-engineering) aerospace- oriented supply chains. The SCOT was derived from the Supply Chain Council s Supply-Chain Operations Reference (SCC SCOR) Model, which is more generic and more oriented toward achieving a competitive advantage in business.
The Quality Movements in Higher Education in the United States.
ERIC Educational Resources Information Center
Miller, Richard I.
1996-01-01
Discussion of various quality control strategies in American higher education looks at and compares Total Quality Management (TQM), outcomes assessment, Deming's 14 points, the Malcolm Baldrige National Quality Award, the ISO 9000 series, restructuring, reengineering, and performance indicators. It is suggested that colleges and universities will…
Restructuring a Higher Education Institution: A Case Study from a Developing Country
ERIC Educational Resources Information Center
Sohail, M. Sadiq; Daud, Salina; Rajadurai, Jegatheesan
2006-01-01
Purpose: The competitive environment facing all organizations has forced many of them to choose strategies that enhance organizational effectiveness and efficiency. Re-engineering is one of the tools used in administering productivity improvements, cost control and asset management. Design/methodology/approach: This paper examines the…
Re-engineering Engineering Education
ERIC Educational Resources Information Center
Gordon, Bernard M.; Silevitch, Michael B.
2009-01-01
In 2005, leaders gathered by the National Association of Manufacturers declared yet another "STEM" emergency. In the face of global competition, they argued, the number of bachelor's degrees awarded annually to U.S. students in science, math and engineering must double by 2015. In fact, the need for STEM talent is even more critical…
Employee Empowerment in Manufacturing: A Study of Organisations in the UK.
ERIC Educational Resources Information Center
Psoinos, Anna; Smithson, Steve
2002-01-01
Employee empowerment in the United Kingdom manufacturing industry (through total quality management, delayering, and business process reengineering) was examined in a survey of 103 companies. Success factors included a solid business rationale, organizational culture, and staff decision making power. Culture was also the most influential…
Organisational Change and the Management of Training in Australian Enterprises.
ERIC Educational Resources Information Center
Smith, Andrew; Oczkowski, Edward; Macklin, Robert; Noble, Charles
2003-01-01
A survey of 584 Australian companies investigated the impact of 5 management practices (total quality management, teamwork, lean production, reengineering, learning organizations) in relation to 8 training practices. New management practices had significant but differing effects on the organization of training: total quality management had the…
In Memoriam: Charles Shive | Poster
Earlier this summer, NCI at Frederick lost a friend and colleague, Charles Shive, to cancer. Mr. Shive, better known to most as Charlie, was a systems architect and information technology manager for the Data Science and Information Technology Program, focused on the re-engineering initiative of the Clinical Trials Reporting System.
Enhancing Rubisco activity at higher temperatures by re-engineering of Rubisco activase
USDA-ARS?s Scientific Manuscript database
This report discusses the possibility of increasing plant performance under moderate heat stress by improving the thermal stability of Rubisco activase. • The research is driven by the observation that photosynthesis is acutely sensitive to inhibition by moderate heat stress and that this inhibitio...
Change@ucsc.edu: Managing a Comprehensive Change Effort.
ERIC Educational Resources Information Center
Coate, L. Edwin
This monograph describes how team- and process-oriented change techniques such as Total Quality Management (TQM) and Business Process Reengineering (BPR), were adapted to an academic environment to effect a comprehensive change program at the University of California Santa Cruz (UCSC). The $3 million program, begun in 1993, produced radical…
Foundations for a Team Oriented Curriculum
ERIC Educational Resources Information Center
Neal, Brandi; Martz, Ben
2016-01-01
The business world today values collaboration and team work skills such as those found in the area of project management, business process reengineering, quality circles, etc. In response, the use of group projects permeates many curricula today with varying consequences and levels of success. Technology claims to enhance collaboration in…
Access America for Students. Strategic Plan.
ERIC Educational Resources Information Center
National Partnership for Reinventing Government, Washington, DC.
This report provides an overview of a federal initiative, entitled Access America for Students, which is designed to re-engineer the way training and educational services are delivered to students. Part of the National Partnership for Reinventing Government, the initiative's major objectives are to implement privacy and security processes for…
77 FR 74166 - Report of Acreage, Noninsured Crop Disaster Assistance Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-13
..., Office of Management and Budget, Washington, DC 20503. FOR FURTHER INFORMATION CONTACT: Jantrice Williams... waterbeds); ornamental nursery (including the size and origin, that is container or field grown, of plants... benefits and services through the re-engineering of farm program business processes and the adoption of...
Support a Science of Performance Improvement
ERIC Educational Resources Information Center
Bryk, Anthony S.
2009-01-01
We must reengineer both how we carry out educational R&D and the schools in which this work occurs if we want to achieve more productive ends. Education needs a Design, Educational Engineering, and Development infrastructure, which includes a rapid prototyping process by which researchers and practitioners co-develop innovations, try them in…
Learning Stress Distribution in Soils Using a Digital Multimedia Tool.
ERIC Educational Resources Information Center
da Silva Ferreira, Ronaldo
Microcomputer technologies and international communication via the Internet are powerful resources for the teaching and learning process. Undergraduate courses can take advantage of these resources to help students and teachers in the classroom. With this in mind, the project REESC - Reengineering of Engineering Education in Santa…
Cancer Reporting: Timeliness Analysis and Process Reengineering
ERIC Educational Resources Information Center
Jabour, Abdulrahman M.
2016-01-01
Introduction: Cancer registries collect tumor-related data to monitor incident rates and support population-based research. A common concern with using population-based registry data for research is reporting timeliness. Data timeliness has been recognized as an important data characteristic by both the Centers for Disease Control and Prevention…
The Intellectual Assembly Line is Already Here
ERIC Educational Resources Information Center
Vanderburg, Willem H.
2004-01-01
The universal attempt to link computers by means of business process reengineering, enterprise integration, and the management of technology is creating large systems that structure and control the flows of information within institutions. Human work associated with these systems must be reorganized in the image of these technologies. The…
75 FR 57253 - Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-20
....S. Census Bureau. Title: 2011 Field Test of the Re-Engineered Survey of Income and Program...-engineered Survey of Income and Program Participation (SIPP). The Census Bureau's SIPP CAPI interview will use an event history calendar (EHC) interviewing method and a 12-month, calendar-year reference period...
Learning Organisations--Reengineering Schools for Life Long Learning.
ERIC Educational Resources Information Center
O'Sullivan, Fergus
1997-01-01
Examines some key ideas behind the learning organization and explains why the concept is so powerful in contemporary contexts. Identifies various types of learning organizations, and suggests an analytical technique for relating styles of organizational learning to the environmental context. The key to becoming a learning organization is…
Re-Engineering Female Friendly Science. Athene Series.
ERIC Educational Resources Information Center
Rosser, Sue V.
This monograph discusses the state of female-friendly science using phase theory to explain its evolution. Eight chapters provide insight and raise questions on topics such as curriculum integration, the consequences of ignoring gender and race in group work, the experiences of single-sex and coeducational institutions for women and science, gender…
The Moral Impotence of Contemporary Experts
ERIC Educational Resources Information Center
Filion, Yves R.
2004-01-01
Technological growth in developed and developing countries in the 20th century has lent a great deal of importance to scientific reasoning in the management of human affairs. An important outgrowth has been the development of systems thinking to organize the workplace. The business reengineering process and the enterprise resource planning system…
Photochemical Phenomenology Model for the New Millennium
NASA Technical Reports Server (NTRS)
Bishop, James; Evans, J. Scott
2001-01-01
The "Photochemical Phenomenology Model for the New Millennium" project tackles the issue of reengineering and extension of validated physics-based modeling capabilities ("legacy" computer codes) to application-oriented software for use in science and science-support activities. While the design and architecture layouts are in terms of general particle distributions involved in scattering, impact, and reactive interactions, initial Photochemical Phenomenology Modeling Tool (PPMT) implementations are aimed at construction and evaluation of photochemical transport models with rapid execution for use in remote sensing data analysis activities in distributed systems. Current focus is on the Composite Infrared Spectrometer (CIRS) data acquired during the CASSINI flyby of Jupiter. Overall, the project has stayed on the development track outlined in the Year 1 annual report and most Year 2 goals have been met. The issues that have required the most attention are: implementation of the core photochemistry algorithms; implementation of a functional Java Graphical User Interface; completion of a functional CORBA Component Model framework; and assessment of performance issues. Specific accomplishments and the difficulties encountered are summarized in this report. Work to be carried out in the next year center on: completion of testing of the initial operational implementation; its application to analysis of the CASSINI/CIRS Jovian flyby data; extension of the PPMT to incorporate additional phenomenology algorithms; and delivery of a mature operational implementation.
NASA Astrophysics Data System (ADS)
Candia, Sante; Lisio, Giovanni; Campolo, Giovanni; Pascucci, Dario
2010-08-01
The Avionics Software (ASW), in charge of controlling the Low Earth Orbit (LEO) Spacecraft PRIMA Platform (Piattaforma Ri-configurabile Italiana Multi-Applicativa), is evolving towards a highly modular and re-usable architecture based on an architectural framework allowing the effective integration of the software building blocks (SWBBs) providing the on-board control functions. During the recent years, the PRIMA ASW design and production processes have been improved to reach the following objectives: (a) at PUS Services level, separation of the mission-independent software mechanisms from the mission-dependent configuration information; (b) at Application level, identification of mission-independent recurrent functions for promoting abstraction and obtaining a more efficient and safe ASW production, with positive implications also on the software validation activities. This paper is dedicated to the characterisation activity which has been performed at Application level for a software component abstracting a set of functions for the generic On-Board Assembly (OBA), a set of hardware units used to deliver an on-board service. Moreover, the ASW production process is specified to show how it results after the introduction of the new design features.
XML Flight/Ground Data Dictionary Management
NASA Technical Reports Server (NTRS)
Wright, Jesse; Wiklow, Colette
2007-01-01
A computer program generates Extensible Markup Language (XML) files that effect coupling between the command- and telemetry-handling software running aboard a spacecraft and the corresponding software running in ground support systems. The XML files are produced by use of information from the flight software and from flight-system engineering. The XML files are converted to legacy ground-system data formats for command and telemetry, transformed into Web-based and printed documentation, and used in developing new ground-system data-handling software. Previously, the information about telemetry and command was scattered in various paper documents that were not synchronized. The process of searching and reading the documents was time-consuming and introduced errors. In contrast, the XML files contain all of the information in one place. XML structures can evolve in such a manner as to enable the addition, to the XML files, of the metadata necessary to track the changes and the associated documentation. The use of this software has reduced the extent of manual operations in developing a ground data system, thereby saving considerable time and removing errors that previously arose in the translation and transcription of software information from the flight to the ground system.
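The record above describes XML dictionary files being translated into legacy ground-system formats. As a rough illustration of that kind of translation step, here is a minimal Python sketch; the element names, attributes, and flat output layout are invented for the example and are not taken from the JPL system.

```python
# Illustrative sketch only: the real dictionary schema is not described in the
# abstract, so every tag and attribute name here is a made-up stand-in.
import xml.etree.ElementTree as ET

TELEMETRY_XML = """
<dictionary>
  <channel id="THRM-0042" type="float" units="degC">
    <description>Battery temperature</description>
  </channel>
</dictionary>
"""

def to_legacy_records(xml_text):
    """Flatten XML channel definitions into line-oriented legacy records."""
    root = ET.fromstring(xml_text)
    for chan in root.iter("channel"):
        desc = chan.findtext("description", default="")
        # Hypothetical legacy layout: id, type, units, description.
        yield ",".join([chan.get("id"), chan.get("type"), chan.get("units"), desc])

for record in to_legacy_records(TELEMETRY_XML):
    print(record)  # -> THRM-0042,float,degC,Battery temperature
```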
Dworkin, Ian; Wagner, Aaron P.
2014-01-01
Standing genetic variation and the historical environment in which that variation arises (evolutionary history) are both potentially significant determinants of a population's capacity for evolutionary response to a changing environment. Using the open-ended digital evolution software Avida, we evaluated the relative importance of these two factors in influencing evolutionary trajectories in the face of sudden environmental change. We examined how historical exposure to predation pressures, different levels of genetic variation, and combinations of the two, affected the evolvability of anti-predator strategies and competitive abilities in the presence or absence of threats from new, invasive predator populations. We show that while standing genetic variation plays some role in determining evolutionary responses, evolutionary history has the greater influence on a population's capacity to evolve anti-predator traits, i.e. traits effective against novel predators. This adaptability likely reflects the relative ease of repurposing existing, relevant genes and traits, and the broader potential value of the generation and maintenance of adaptively flexible traits in evolving populations. PMID:24955847
A proven approach for more effective software development and maintenance
NASA Technical Reports Server (NTRS)
Pajerski, Rose; Hall, Dana; Sinclair, Craig
1994-01-01
Modern space flight mission operations and associated ground data systems are increasingly dependent upon reliable, quality software. Critical functions such as command load preparation, health and status monitoring, communications link scheduling and conflict resolution, and transparent gateway protocol conversion are routinely performed by software. Given budget constraints and the ever-increasing capabilities of processor technology, the next generation of control centers and data systems will be even more dependent upon software across all aspects of performance. A key challenge now is to implement improved engineering, management, and assurance processes for the development and maintenance of that software: processes that cost less, yield higher quality products, and self-correct for continual improvement. The NASA Goddard Space Flight Center has a unique experience base that can be readily tapped to help solve the software challenge. Over the past eighteen years, the Software Engineering Laboratory within the Code 500 Flight Dynamics Division has evolved a software development and maintenance methodology that accommodates the unique characteristics of an organization while optimizing and continually improving the organization's software capabilities. This methodology relies upon measurement, analysis, and feedback, much like a control-loop system. It is an approach with a time-tested track record, proven through repeated application across a broad range of operational software development and maintenance projects. This paper describes the software improvement methodology employed by the Software Engineering Laboratory and how it has been exploited within the GSFC Code 500 Flight Dynamics Division. Examples of specific improvements in the software itself and its processes are presented to illustrate the effectiveness of the methodology. Finally, initial findings are given from applying this methodology across the mission operations and ground data systems software domains throughout Code 500.
Using LAMP Applications to Make Our Library Shine
ERIC Educational Resources Information Center
White, Andrew; Balsamo, Joseph
2005-01-01
There has been a move afoot in libraries of all kinds to incorporate open source software and applications to better support their rapidly evolving information access and delivery services. There has been particular interest in a specific configuration of open source components, known by the acronym LAMP, that can be used to build fairly complex Web…
ERIC Educational Resources Information Center
Ruthven, Kenneth; Hennessy, Sara; Deaney, Rosemary
2008-01-01
The idea of "interpretative flexibility" underpins new approaches to studying technological artefacts and curricular resources in use. This paper opens by reviewing, in this light, the evolving design of dynamic geometry, its pioneering use within classroom projects, and early sketches of its mainstream use in ordinary classrooms. After…
The Blended Learning Shift: New Report Shows Blended Learning Growing in U.S. Private Schools
ERIC Educational Resources Information Center
Warren, Travis
2015-01-01
The technology conversation in independent schools has evolved considerably over the last five years. In particular, it has moved beyond the question of how can schools augment traditional classroom practices with hardware (laptops, interactive whiteboards, etc.) to the question of how software can improve outcomes and enable new learning models,…
Wong, Brian J. F.; Karmi, Koohyar; Devcic, Zlatko; McLaren, Christine E.; Chen, Wen-Pin
2013-01-01
Objectives: The objectives of this study were to: 1) determine if a genetic algorithm in combination with morphing software can be used to evolve more attractive faces; and 2) evaluate whether this approach can be used as a tool to define or identify the attributes of the ideal attractive face. Study Design: Basic research study incorporating focus group evaluations. Methods: Digital images were acquired of 250 female volunteers (18–25 y). Randomly selected images were used to produce a parent generation (P) of 30 synthetic faces using morphing software. Then, a focus group of 17 trained volunteers (18–25 y) scored each face on an attractiveness scale ranging from 1 (unattractive) to 10 (attractive). A genetic algorithm was used to select 30 new pairs from the parent generation, and these were morphed using software to produce a new first generation (F1) of faces. The F1 faces were scored by the focus group, and the process was repeated for a total of four iterations of the algorithm. The algorithm mimics natural selection by using the attractiveness score as the selection pressure; the more attractive faces are more likely to morph. All five generations (P-F4) were then scored by three focus groups: a) surgeons (n = 12), b) cosmetology students (n = 44), and c) undergraduate students (n = 44). Morphometric measurements were made of 33 specific features on each of the 150 synthetic faces and correlated with attractiveness scores using univariate and multivariate analysis. Results: The average facial attractiveness scores increased with each generation and were 3.66 (±0.60), 4.59 (±0.73), 5.50 (±0.62), 6.23 (±0.31), and 6.39 (±0.24) for the P and F1–F4 generations, respectively. Histograms of attractiveness score distributions show a significant shift in the skew of each curve toward more attractive faces with each generation. Univariate analysis identified nasal width, eyebrow arch height, and lip thickness as being significantly correlated with attractiveness scores. Multivariate analysis identified a similar collection of morphometric measures. No correlation with more commonly accepted measures, such as the lengths of facial thirds or fifths, was identified. When the images are examined as a montage (by generation), clear distinct trends are identified: oval-shaped faces, distinct arched eyebrows, and full lips predominate. Faces evolve to approximate the guidelines suggested by the classical canon. F3 and F4 generation faces look profoundly similar. The statistical and qualitative analysis indicates that the algorithm and methodology succeed in generating successively more attractive faces. Conclusions: The use of genetic algorithms in combination with morphing software and traditional focus-group-derived attractiveness scores can be used to evolve attractive synthetic faces. We have demonstrated that the evolution of attractive faces can be mimicked in software. Genetic algorithms and morphing provide a robust alternative to traditional approaches rooted in comparing attractiveness scores with a series of morphometric measurements in human subjects. PMID:18401273
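The selection-and-morph loop described in the Methods section can be summarized in a few lines of code. The sketch below is a loose Python illustration, assuming faces are encoded as feature vectors and that morphing two parents approximates averaging their vectors; the rating function is a stand-in for the human focus group, and the population sizes merely echo the paper.

```python
import random

def rate(face):
    # Stand-in for the human focus-group score (1 = unattractive, 10 = attractive).
    return 1.0 + 9.0 * sum(face) / len(face)

def next_generation(faces, scores, n_children=30):
    # Fitness-proportional selection: higher-scoring faces are more likely to morph.
    parents = random.choices(faces, weights=scores, k=2 * n_children)
    # "Morphing" a pair is approximated here by averaging feature vectors.
    return [[(x + y) / 2 for x, y in zip(a, b)]
            for a, b in zip(parents[::2], parents[1::2])]

population = [[random.random() for _ in range(33)] for _ in range(30)]  # P generation
for _ in range(4):                                   # produce F1 through F4
    scores = [rate(face) for face in population]
    population = next_generation(population, scores)
```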
MDTraj: A Modern Open Library for the Analysis of Molecular Dynamics Trajectories.
McGibbon, Robert T; Beauchamp, Kyle A; Harrigan, Matthew P; Klein, Christoph; Swails, Jason M; Hernández, Carlos X; Schwantes, Christian R; Wang, Lee-Ping; Lane, Thomas J; Pande, Vijay S
2015-10-20
As molecular dynamics (MD) simulations continue to evolve into powerful computational tools for studying complex biomolecular systems, the necessity of flexible and easy-to-use software tools for the analysis of these simulations is growing. We have developed MDTraj, a modern, lightweight, and fast software package for analyzing MD simulations. MDTraj reads and writes trajectory data in a wide variety of commonly used formats. It provides a large number of trajectory analysis capabilities including minimal root-mean-square-deviation calculations, secondary structure assignment, and the extraction of common order parameters. The package has a strong focus on interoperability with the wider scientific Python ecosystem, bridging the gap between MD data and the rapidly growing collection of industry-standard statistical analysis and visualization tools in Python. MDTraj is a powerful and user-friendly software package that simplifies the analysis of MD data and connects these datasets with the modern interactive data science software ecosystem in Python.
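Since MDTraj is a widely used package, a typical session along the lines of the abstract looks like the following; the file names are placeholders for a user's own trajectory and topology files.

```python
import mdtraj as md

# Load a trajectory; MDTraj reads many common formats (XTC, DCD, HDF5, ...).
traj = md.load("trajectory.xtc", top="topology.pdb")
print(traj)  # number of frames, atoms, and the time span

# RMSD of every frame to the first frame (superposition is handled internally).
rmsd = md.rmsd(traj, traj, frame=0)

# Per-residue, per-frame secondary-structure assignment (DSSP).
dssp = md.compute_dssp(traj)
```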
AdaNET Dynamic Software Inventory (DSI) prototype component acquisition plan
NASA Technical Reports Server (NTRS)
Hanley, Lionel
1989-01-01
A component acquisition plan contains the information needed to evaluate, select, and acquire the software and hardware components necessary for successful completion of the AdaNET Dynamic Software Inventory (DSI) Management System Prototype. This plan will evolve and is applicable to all phases of the DSI prototype development. Resources, budgets, schedules, and organizations related to component acquisition activities are provided. The plan format calls for the purpose and description of each software or hardware component to be acquired; since this plan covers the acquisition of all components, that section is not applicable. The procurement activities and events conducted by the acquirer are described, identifying who is responsible, where each activity will be performed, and when it will occur for each planned procurement. Acquisition requirements describe the specific requirements and standards to be followed during component acquisition. The activities that will take place during component acquisition are described. A list of abbreviations and acronyms and a glossary are included.
Efficient, Multi-Scale Designs Take Flight
NASA Technical Reports Server (NTRS)
2003-01-01
Engineers can solve aerospace design problems faster and more efficiently with a versatile software product that performs automated structural analysis and sizing optimization. Collier Research Corporation's HyperSizer Structural Sizing Software is a design, analysis, and documentation tool that increases productivity and standardization for a design team. Based on established aerospace structural methods for strength, stability, and stiffness, HyperSizer can be used all the way from conceptual design to in-service support. The software originated from NASA's efforts to automate its capability to perform aircraft strength analyses, structural sizing, and weight prediction and reduction. With a strategy to combine finite element analysis with an automated design procedure, NASA's Langley Research Center led the development of a software code known as ST-SIZE from 1988 to 1995. Collier Research employees were principal developers of the code along with Langley researchers. The code evolved into one that could analyze the strength and stability of stiffened panels constructed of any material, including lightweight, fiber-reinforced composites.
Virtual immunology: software for teaching basic immunology.
Berçot, Filipe Faria; Fidalgo-Neto, Antônio Augusto; Lopes, Renato Matos; Faggioni, Thais; Alves, Luiz Anastácio
2013-01-01
As immunology continues to evolve, many educational methods have found difficulty in conveying the degree of complexity inherent in its basic principles. Today, the teaching-learning process in such areas has been improved with tools such as educational software. This article introduces "Virtual Immunology," a software program available free of charge in Portuguese and English, which can be used by teachers and students in physiology, immunology, and cellular biology classes. We discuss the development of the initial two modules, "Organs and Lymphoid Tissues" and "Inflammation," and the use of interactive activities to provide microscopic and macroscopic understanding in immunology. Students, both graduate and undergraduate, were questioned along with university-level professors about the quality of the software, intuitiveness of use, facility of navigation, and aesthetic organization using a Likert scale. An overwhelmingly satisfactory result was obtained with both students and immunology teachers. Programs such as "Virtual Immunology" offer more interactive, multimedia approaches to complex scientific principles that increase student motivation, interest, and comprehension.
A History of Space Shuttle Main Engine (SSME) Redline Limits Management
NASA Technical Reports Server (NTRS)
Arnold, Thomas M.
2011-01-01
The Space Shuttle Main Engine (SSME) has several "redlines", which are operational limits designated to preclude a catastrophic shutdown of the SSME. The Space Shuttle Orbiter utilizes a combination of hardware and software to enable or disable the automated redline shutdown capability. The Space Shuttle is launched with the automated SSME redline limits enabled, but there are many scenarios which may result in the manual disabling of the software by the onboard crew. The operational philosophy for manually enabling and disabling the redline limits software has evolved continuously throughout the history of the Space Shuttle Program, due to events such as SSME hardware changes and updates to Space Shuttle contingency abort software. In this paper, the evolution of SSME redline limits management will be fully reviewed, including the operational scenarios which call for manual intervention, and the events that triggered changes to the philosophy. Following this review, improvements to the management of redline limits for future spacecraft will be proposed.
Designing the modern pump: engineering aspects of continuous subcutaneous insulin infusion software.
Welsh, John B; Vargas, Steven; Williams, Gary; Moberg, Sheldon
2010-06-01
Insulin delivery systems attracted the efforts of biological, mechanical, electrical, and software engineers well before they were commercially viable. The introduction of the first commercial insulin pump in 1983 represents an enduring milestone in the history of diabetes management. Since then, pumps have become much more than motorized syringes and have assumed a central role in diabetes management by housing data on insulin delivery and glucose readings, assisting in bolus estimation, and interfacing smoothly with humans and compatible devices. Ensuring the integrity of the embedded software that controls these devices is critical to patient safety and regulatory compliance. As pumps and related devices evolve, software engineers will face challenges and opportunities in designing pumps that are safe, reliable, and feature-rich. The pumps and related systems must also satisfy end users, healthcare providers, and regulatory authorities. In particular, pumps that are combined with glucose sensors and appropriate algorithms will provide the basis for increasingly safe and precise automated insulin delivery: essential steps toward developing a fully closed-loop system.
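The abstract mentions pumps assisting in bolus estimation. A textbook form of that calculation is sketched below in Python; the parameter names and the exact formula used by any particular commercial pump are assumptions of this illustration, not claims from the source.

```python
def suggested_bolus(carbs_g, glucose, target, carb_ratio, sensitivity, iob=0.0):
    """Common textbook bolus estimate; NOT any vendor's actual algorithm.

    carbs_g:     grams of carbohydrate in the meal
    glucose:     current blood glucose (mg/dL)
    target:      target blood glucose (mg/dL)
    carb_ratio:  grams of carbohydrate covered by 1 U of insulin
    sensitivity: mg/dL drop in glucose per 1 U of insulin
    iob:         insulin still active from prior boluses (units)
    """
    meal_bolus = carbs_g / carb_ratio
    correction = (glucose - target) / sensitivity
    return max(0.0, meal_bolus + correction - iob)

print(suggested_bolus(carbs_g=60, glucose=180, target=110,
                      carb_ratio=10, sensitivity=50))
# -> 7.4 units (6.0 for the meal + 1.4 correction)
```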
Sensor Webs: Autonomous Rapid Response to Monitor Transient Science Events
NASA Technical Reports Server (NTRS)
Mandl, Dan; Grosvenor, Sandra; Frye, Stu; Sherwood, Robert; Chien, Steve; Davies, Ashley; Cichy, Ben; Ingram, Mary Ann; Langley, John; Miranda, Felix
2005-01-01
To better understand how physical phenomena, such as volcanic eruptions, evolve over time, multiple sensor observations over the duration of the event are required. Using sensor web approaches that integrate original detections by in-situ sensors and global-coverage, lower-resolution, on-orbit assets with automated rapid-response observations from high-resolution sensors, more observations of significant events can be made with increased temporal, spatial, and spectral resolution. This paper describes experiments using Earth Observing 1 (EO-1) along with other space and ground assets to implement progressive mission autonomy to identify, locate, and image with high-resolution instruments phenomena such as wildfires, volcanoes, floods, and ice breakup. The software that plans, schedules, and controls the various satellite assets is used to form ad hoc constellations that enable collaborative autonomous image collections triggered by transient phenomena. This software is both flight- and ground-based, works in concert to run all of the required assets cohesively, and includes model-based, artificial-intelligence software.
Profile of NASA software engineering: Lessons learned from building the baseline
NASA Technical Reports Server (NTRS)
Hall, Dana; Mcgarry, Frank
1993-01-01
It is critically important in any improvement activity to first understand the organization's current status, strengths, and weaknesses and, only after that understanding is achieved, examine and implement promising improvements. This fundamental rule is certainly true for an organization seeking to further its software viability and effectiveness. This paper addresses the role of the organizational process baseline in a software improvement effort and the lessons we learned assembling such an understanding for NASA overall and for the NASA Goddard Space Flight Center in particular. We discuss important, core data that must be captured and contrast that with our experience in actually finding such information. Our baselining efforts have evolved into a set of data gathering, analysis, and crosschecking techniques and information presentation formats that may prove useful to others seeking to establish similar baselines for their organization.
Basic principles of cone beam computed tomography.
Abramovitch, Kenneth; Rice, Dwight D
2014-07-01
At the end of the millennium, cone-beam computed tomography (CBCT) heralded a new dental technology for the next century. Owing to the dramatic and positive impact of CBCT on implant dentistry and orthognathic/orthodontic patient care, additional applications for this technology soon evolved. New software programs were developed to improve the applicability of, and access to, CBCT for dental patients. Improved, rapid, and cost-effective computer technology, combined with the ability of software engineers to develop multiple dental imaging applications for CBCT with broad diagnostic capability, has played a large part in the rapid incorporation of CBCT technology into dentistry.
Mission operations systems for planetary exploration
NASA Technical Reports Server (NTRS)
Mclaughlin, William I.; Wolff, Donna M.
1988-01-01
The purpose of the paper is twofold: (1) to present an overview of the processes comprising planetary mission operations as conducted at the Jet Propulsion Laboratory, and (2) to present a project-specific and historical context within which this evolving process functions. In order to accomplish these objectives, the generic uplink and downlink functions are described along with their specialization to current flight projects. Also, new multimission capabilities are outlined, including prototyping of advanced-capability software for subsequent incorporation into more automated future operations. Finally, a specific historical ground is provided by listing some major operations software plus a genealogy of planetary missions beginning with Mariner 2 in 1962.
Yaxx: Yet another X-ray extractor
NASA Astrophysics Data System (ADS)
Aldcroft, Tom
2013-06-01
Yaxx is a Perl script that facilitates batch data processing using Perl open source software and commonly available software such as CIAO/Sherpa, S-lang, SAS, and FTOOLS. For Chandra and XMM analysis it includes automated spectral extraction, fitting, and report generation. Yaxx can be run without climbing an extensive learning curve; even so, yaxx is highly configurable and can be customized to support complex analysis. Yaxx uses template files and takes full advantage of the unique Sherpa/S-lang environment to make much of the processing user configurable. Although originally developed with an emphasis on X-ray data analysis, yaxx evolved to be a general-purpose pipeline scripting package.
Using Sequence Diagrams to Detect Communication Problems Between Systems
NASA Technical Reports Server (NTRS)
Lindvall, Mikael; Ackermann, Chris; Stratton, William C.; Sibol, Deane E.; Ray, Arnab; Yonkwa, Lyly; Kresser, Jan; Godfrey, Sally H.; Knodel, Jens
2008-01-01
Many software systems are evolving into complex systems of systems (SoS) for which inter-system communication is both mission-critical and error-prone. Such communication problems ideally would be detected before deployment. In a NASA-supported Software Assurance Research Program (SARP) project, we are researching a new approach to addressing such problems. In this paper, we show that problems in the communication between two systems can be detected by using sequence diagrams to model the planned communication and by comparing the planned sequence to the actual sequence. We identify different kinds of problems that can be addressed by modeling the planned sequence at different levels of abstraction.
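The core comparison idea, checking an observed message sequence against the planned one, can be illustrated with a small Python sketch. The tuple encoding of messages and the example commands are invented for illustration; the actual SARP work operates on sequence diagrams.

```python
from itertools import zip_longest

def first_divergence(planned, actual):
    """Return (index, planned, actual) at the first mismatch, or None if they agree."""
    for i, (p, a) in enumerate(zip_longest(planned, actual)):
        if p != a:
            return i, p, a
    return None

# Messages modeled as (sender, receiver, message) triples; names are hypothetical.
planned = [("GroundSystem", "Spacecraft", "CMD_LOAD"),
           ("Spacecraft", "GroundSystem", "ACK")]
actual  = [("GroundSystem", "Spacecraft", "CMD_LOAD"),
           ("Spacecraft", "GroundSystem", "NAK")]

print(first_divergence(planned, actual))
# -> (1, ('Spacecraft', 'GroundSystem', 'ACK'), ('Spacecraft', 'GroundSystem', 'NAK'))
```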
A new practice-driven approach to develop software in a cyber-physical system environment
NASA Astrophysics Data System (ADS)
Jiang, Yiping; Chen, C. L. Philip; Duan, Junwei
2016-02-01
Cyber-physical systems (CPS) are an emerging area and cannot work efficiently without proper software handling of the data and business logic. Software and middleware are the soul of the CPS. The software development of CPS is a critical issue because of its complexity in a large-scale realistic system. Furthermore, the object-oriented approach (OOA) often used to develop CPS software needs some improvements to suit the characteristics of CPS. To develop software in a CPS environment, a new systematic approach is proposed in this paper. It has evolved from practice in software companies. It consists of (A) requirement analysis in an event-oriented way, (B) architecture design in a data-oriented way, (C) detailed design and coding in an object-oriented way, and (D) testing in an event-oriented way. It is a new approach based on OOA; the difference compared with OOA is that the proposed approach places different emphases and measures at every stage, which better fits the characteristics of event-driven CPS. In CPS software development, one should focus on the events more than on the functions or objects. A case study of a smart home system is designed to reveal the effectiveness of the approach. It shows that the approach is also easy to apply in practice owing to some simplifications. The running result illustrates the validity of this approach.
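As a loose illustration of the event-oriented emphasis (stages A and D) applied to the paper's smart home case study, here is a minimal Python event-dispatch skeleton; the event names and handlers are invented for the example and are not from the paper.

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe dispatcher for event-driven designs."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event, handler):
        self._handlers[event].append(handler)

    def publish(self, event, **data):
        for handler in self._handlers[event]:
            handler(**data)

bus = EventBus()
bus.subscribe("door_opened", lambda room: print(f"Turn on lights in {room}"))
bus.subscribe("door_opened", lambda room: print(f"Log entry event for {room}"))
bus.publish("door_opened", room="hallway")
```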
The medical matters wiki: building a library Web site 2.0.
Robertson, Justin; Burnham, Judy; Li, Jie; Sayed, Ellen
2008-01-01
New and innovative information technologies drive the ever-evolving library profession. From clay tablet to parchment scroll to manufactured paper to computer screen pixel, information storage, retrieval, and delivery methods continue to evolve, and each advance irrevocably affects the way libraries, and librarians, work. The Internet has forever altered information and library science, both in theory and practice, but even within this context the progression continues. Though ambiguously defined, Web 2.0 offers a new outlook and new software, presenting librarians with potentially invaluable new tools and methods. This paper discusses the creation, implementation, and maintenance of a Web 2.0 technology, the wiki, as a resource tool for an academic biomedical library.
ARIADNE: a Tracking System for Relationships in LHCb Metadata
NASA Astrophysics Data System (ADS)
Shapoval, I.; Clemencic, M.; Cattaneo, M.
2014-06-01
The data processing model of the LHCb experiment implies handling of an evolving set of heterogeneous metadata entities and relationships between them. The entities range from software and database states to architecture specifications and software/data deployment locations. For instance, there is an important relationship between the LHCb Conditions Database (CondDB), which provides versioned, time-dependent geometry and conditions data, and the LHCb software, that is, the data processing applications (used for simulation, high-level triggering, reconstruction, and analysis of physics data). The evolution of CondDB and of the LHCb applications is a weakly-homomorphic process. This means that relationships between a CondDB state and an LHCb application state may not be preserved across different database and application generations. These issues may lead to various kinds of problems in LHCb production, varying from unexpected application crashes to incorrect data processing results. In this paper we present Ariadne, a generic metadata relationship tracking system based on the novel NoSQL Neo4j graph database. Its aim is to track and analyze many thousands of evolving relationships for cases such as the one described above, and several others, which would otherwise remain unmanaged and potentially harmful. The highlights of the paper include the system's implementation and management details, the infrastructure needed for running it, security issues, first experience of usage in LHCb production, and the potential of the system to be applied to a wider set of LHCb tasks.
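To make the graph-database idea concrete, the sketch below records one such relationship (an application version depending on a CondDB state) using the official Neo4j Python driver. The node labels, property names, credentials, and example values are assumptions for illustration, not Ariadne's actual schema.

```python
from neo4j import GraphDatabase  # official Neo4j driver; >= 5 for execute_write

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "secret"))

def link_app_to_conddb(tx, app, version, conddb_tag):
    # MERGE creates the nodes and edge only if they do not already exist.
    tx.run(
        "MERGE (a:Application {name: $app, version: $version}) "
        "MERGE (c:CondDB {tag: $tag}) "
        "MERGE (a)-[:DEPENDS_ON]->(c)",
        app=app, version=version, tag=conddb_tag,
    )

with driver.session() as session:
    # Example values only; chosen to resemble LHCb naming conventions.
    session.execute_write(link_app_to_conddb, "Brunel", "v45r0", "cond-20140604")
driver.close()
```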
ERIC Educational Resources Information Center
Organisation for Economic Cooperation and Development, Paris (France). Programme on Educational Building.
This document summarizes themes developed and conclusions from the International Workshop on Educational Infrastructure. The opening topic was "Delivering Education and Training in the Knowledge Society." It was clear to participants that educational infrastructure must go hand-in-hand with reengineering processes to adjust to the needs…
ViDi-O (View-It and Do-It Online): An Approach to Internet Commerce in a Higher Education Setting.
ERIC Educational Resources Information Center
Kesner, Richard M.
1997-01-01
To improve services and reduce costs, Babson College (Massachusetts) has begun continuous quality improvement and reengineering, moving rapidly toward a client/server information technology environment and Internet/Intranet-enabled business processes. The evolution of the strategy and its products, underlying technologies, and development process…
75 FR 48726 - Notice of Lodging of Consent Decree Under the Resource Conservation and Recovery Act
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-11
... single production facility in Plant City, Florida that has been in operation at the current approximate 3... settlement, CFI has re-engineered its plant to cease generating hazardous wastewater previously commingled... reduction program; install synthetic protective barriers beneath its production plants; provide $163.5...
Spent Nuclear Fuel Project Configuration Management Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reilly, M.A.
This document is a rewrite of draft "C" that was agreed to "in principle" by SNF Project level 2 managers on EDT 609835, dated March 1995 (not released). The implementation process philosophy was changed in keeping with the ongoing reengineering of the WHC Controlled Manuals to achieve configuration management within the SNF Project.
Code of Federal Regulations, 2011 CFR
2011-01-01
... that basis must meet the following criteria: (1) Nature of emergency. Describe the military threat... particular position. Requests submitted on this basis must include a description of the length, breadth, and... limited to, unusual qualification requirements or working conditions, possibility of job reengineering or...
Code of Federal Regulations, 2013 CFR
2013-01-01
... that basis must meet the following criteria: (1) Nature of emergency. Describe the military threat... particular position. Requests submitted on this basis must include a description of the length, breadth, and... limited to, unusual qualification requirements or working conditions, possibility of job reengineering or...
Code of Federal Regulations, 2014 CFR
2014-01-01
... that basis must meet the following criteria: (1) Nature of emergency. Describe the military threat... particular position. Requests submitted on this basis must include a description of the length, breadth, and... limited to, unusual qualification requirements or working conditions, possibility of job reengineering or...
Code of Federal Regulations, 2012 CFR
2012-01-01
... that basis must meet the following criteria: (1) Nature of emergency. Describe the military threat... particular position. Requests submitted on this basis must include a description of the length, breadth, and... limited to, unusual qualification requirements or working conditions, possibility of job reengineering or...
The Impact of Organisational Change on the Nature and Extent of Training in Australian Enterprises
ERIC Educational Resources Information Center
Smith, Andrew; Oczkowski, Edward; Noble, Charles; Macklin, Robert
2004-01-01
This article reports on a study investigating the relationship between the introduction of new management practices and the training provided by Australian enterprises for their employees. The new management practices investigated include teamworking, total quality management, lean production, business process re-engineering and the learning…
Expanded Learning, Expansive Teacher Leadership
ERIC Educational Resources Information Center
Berry, Barnett; Hess, Frederick M.
2013-01-01
Extended learning opportunities have emerged as a useful way to rethink how schools are organized and funded. That includes reconsidering who teaches and how they go about their work. That effort should be guided by four key policy considerations and goals: 1. Reengineer the role of teacher. 2. Rethink K-12, higher education, and community-based…
Beyond "Commentaries of Despair:" Reengineering Pathways to Design in the Schooling of Black Men
ERIC Educational Resources Information Center
Matthews, Lou Edward; Williams, Brian A.
2007-01-01
Ensuring that young Black men are afforded equal educational opportunities, access, and outcomes in education remains the most pressing challenge of modern schooling. Critical to overcoming this challenge is a focus on engineering pathways to design for school-based initiatives which draw more comprehensively from research and the prevailing…
Re-Engineering Values Education in Nigerian Schools as Catalyst for National Development
ERIC Educational Resources Information Center
Enu, Donald Bette; Esu, A. E. O.
2011-01-01
The decay in the Nigerian society has affected the standard of values and moral character formation. This has manifested in so many anti-social vices. Anti-social behaviour exhibited include corruption, dishonesty, greed, violent crimes, lack of accountability, indiscipline, disrespect to constituted authorities, laziness, low moral courage among…
Energizing corporate culture and creating competitive advantage: a new look at workforce programs.
O'Daniell, E E
1999-01-01
Beliefs about the best ways to attract and retain employees--and keep them continuously motivated and productive--have shifted considerably in recent years. Structural changes wrought by the "3 Rs"--restructuring, rightsizing and reengineering--have given way to organizational changes caused by the "3 Cs"--culture, communication and competencies.
Using data mining to build a customer-focused organization.
Okasha, A
1999-08-01
Data mining is a new buzz word in managed care. More than simply a method of unlocking a vault of useful information in MCO data banks and warehouses, the author believes that it can help steer an organization through the reengineering process, leading to the health system's transformation toward a customer-focused organization.
Reengineering Aircraft Structural Life Prediction Using a Digital Twin
2011-01-01
… that exaflop-per-second computers will become available: "extrapolation of current hardware trends suggests that exascale systems could be available in…" [reference fragments omitted: H. Simon, T. Zacharia, and R. Stevens, Modeling and Simulation at the Exascale for Energy and the Environment].
Cataloguing Costed and Restructured at Curtin University of Technology.
ERIC Educational Resources Information Center
Wade, Rona; Williamson, Vicki
1998-01-01
Outlines the results of a review of the cataloging operations at the library at Curtin University of Technology and the three other publicly funded universities in WAGUL (the Western Australian Group of University Librarians). Identifies cost savings to be achieved by re-engineering cataloging and related operations, and separates them from the…
Power industry competition, reengineering, and globalization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hyman, L.S.
1994-07-01
This article presents the views of a vice-president of a financial company on the future of the US electric utility industry. The topics of the article include a historical perspective of the structure of the industry, government regulation, deregulation and competition, rate structures and pricing, effects on stockholders and bondholders, and strategies for growth.
Re-Engineering the United States Marine Corps Special Education Program (SEP).
1998-03-01
McDonald’s or Burger King based on personal preference. This simple decision has only two alternatives and relatively low consequences (the decision...In the example of restaurants seen above, a person would not be able to examine 2000 different franchises and select the best one in which to invest
ERIC Educational Resources Information Center
Bjoernshauge, Lars
The traditional mode of operation of academic libraries is in crisis due to a combination of zero growth funding, rapidly escalating pricing on information resources (especially scientific journals), necessary investments in technology and human resource development, and increasing customer expectations. These issues are addressed as they relate…
Re-Engineering Motherhood? Sure Start in the Community
ERIC Educational Resources Information Center
Hey, Valerie; Bradford, Simon
2006-01-01
This article draws on data collected through two focus groups and eight telephone interviews undertaken as part of an evaluation of a local Sure Start programme. Located in the context of New Labour's valorisation of parenting, and specifically mothering, the article explores the impact of Sure Start policy on discourses of motherhood in this…
Reengineering Framework for Systems in Education
ERIC Educational Resources Information Center
Choquet, Christophe; Corbiere, Alain
2006-01-01
Specifications recently proposed as standards in the domain of Technology Enhanced Learning (TEL), question the designers of TEL systems on how to put them into practice. Recent studies in Model Driven Engineering have highlighted the need for a framework which could formalize the use of these specifications as well as enhance the quality of the…
D-21B RBCC Modification Feasibility Study
NASA Technical Reports Server (NTRS)
1999-01-01
This report presents a feasibility study of the modifications required to re-engine the Lockheed D-21 drone for use as a NASA rocket-based combined cycle (RBCC) engine testbed. An introduction, background information, engine configuration and performance, propulsion system integration, loads/thermal analysis, avionics/systems, flight test results, costs and work schedule, and some conclusions are presented.
Tele-Learning and Distance Learning Re-Engineering Process.
ERIC Educational Resources Information Center
Cruz, Dulce Marcia; de Moraes, Marialice; Barcia, Ricardo Miranda
The adoption and use of new interactive technologies in Distance Education, especially Tele-learning, is a growing tendency in the most advanced countries. Nowadays, this tendency is so strong that being interactive is seen as a necessary pre-condition. Some reasons for this are the growing perceived value of group working; the popularization of…
Business Process Reengineering towards an Integrated Learning Management System
ERIC Educational Resources Information Center
Basal, Abdelraheem Mousa
2010-01-01
The task of managing an information technology (IT) system in a school environment poses unique challenges. For example, one of the greatest challenges facing individual schools is the lack of integration among various information systems. The present situation in many schools is that there are many disconnected systems managing many different…
ERIC Educational Resources Information Center
Sullivan, Linda; Bozeman, William
2010-01-01
Enterprise Resource Planning (ERP) systems can represent one of the largest investments of human and financial resources by a higher education institution. They also bring a significant process reengineering aspect to the institution and the associated implementation project through the integration of compiled industry best practices into the…
Drug transport across the blood–brain barrier
Pardridge, William M
2012-01-01
The blood–brain barrier (BBB) prevents the brain uptake of most pharmaceuticals. This property arises from the epithelial-like tight junctions within the brain capillary endothelium. The BBB is anatomically and functionally distinct from the blood–cerebrospinal fluid barrier at the choroid plexus. Certain small molecule drugs may cross the BBB via lipid-mediated free diffusion, providing the drug has a molecular weight <400 Da and forms <8 hydrogen bonds. These chemical properties are lacking in the majority of small molecule drugs, and all large molecule drugs. Nevertheless, drugs can be reengineered for BBB transport, based on the knowledge of the endogenous transport systems within the BBB. Small molecule drugs can be synthesized that access carrier-mediated transport (CMT) systems within the BBB. Large molecule drugs can be reengineered with molecular Trojan horse delivery systems to access receptor-mediated transport (RMT) systems within the BBB. Peptide and antisense radiopharmaceuticals are made brain-penetrating with the combined use of RMT-based delivery systems and avidin–biotin technology. Knowledge on the endogenous CMT and RMT systems expressed at the BBB enable new solutions to the problem of BBB drug transport. PMID:22929442
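The "molecular weight <400 Da and <8 hydrogen bonds" rule of thumb quoted above is easy to turn into a rough computational screen. The Python sketch below uses RDKit; approximating "hydrogen bonds formed" as the donor plus acceptor count is an assumption of this sketch, not of the source.

```python
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

def may_cross_bbb(smiles):
    """Rough screen for the MW < 400 Da and < 8 hydrogen bonds rule of thumb."""
    mol = Chem.MolFromSmiles(smiles)
    mw = Descriptors.MolWt(mol)
    # Approximation: count H-bond donors plus acceptors as "hydrogen bonds formed".
    hbonds = Lipinski.NumHDonors(mol) + Lipinski.NumHAcceptors(mol)
    return mw < 400 and hbonds < 8

print(may_cross_bbb("CN1CCC[C@H]1c1cccnc1"))  # nicotine, a known BBB-permeant: True
```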
An Application of Business Process Management to Health Care Facilities.
Hassan, Mohsen M D
The purpose of this article is to help health care facility managers and personnel identify significant elements of their facilities to address, and steps and actions to follow, when applying business process management to them. The ABPMP (Association of Business Process Management Professionals) life-cycle model of business process management is adopted, and steps from Lean, business process reengineering, and Six Sigma, along with actions from operations management, are presented to implement it. Managers of health care facilities can find in business process management a more comprehensive approach to improving their facilities than Lean, Six Sigma, business process reengineering, or ad hoc approaches, one that does not conflict with them because many of their elements can be included under its umbrella. Furthermore, the suggested application of business process management can guide managers and relieve them from selecting among these approaches, as well as provide them with specific steps and actions that they can follow. This article fills a gap in the literature by presenting a much-needed comprehensive application of business process management to health care facilities, with specific steps and actions for implementation.
Re-engineering of Bacterial Luciferase; For New Aspects of Bioluminescence.
Kim, Da-Som; Choi, Jeong-Ran; Ko, Jeong-Ae; Kim, Kangmin
2018-01-01
Bacterial luminescence is the end-product of biochemical reactions catalyzed by the luciferase enzyme. This fascinating phenomenon is now widely used in reporters and/or sensors to detect a variety of biological and environmental processes. Enhancing or diversifying luciferase activities will increase the versatility of bacterial luminescence. Here, to establish a strategy for luciferase engineering, we summarize the identity and relevant roles of the key amino acid residues modulating luciferase in Vibrio harveyi, a model luminous bacterium. Current opinions on the crystal structures and the critical amino acid residues involved in the substrate binding sites and the unstructured loop are delineated. Based on these, potential target residues and/or parameters for enzyme engineering are also suggested on a limited scale. In conclusion, even though accurate knowledge of the bacterial luciferase is yet to be reported, structure-guided site-directed mutagenesis approaches targeting the regulatory amino acids will provide a useful platform to re-engineer bacterial luciferase in the future.
Shafeghat, Hossein; Jafari, Mehdi; Monavarian, Abbas; Shafayi, Maryam; Dehnavieh, Reza
2014-01-01
Background: Labor laws and regulations have inevitable effects on employees' work motivation as well as the overall efficiency and productivity of the organization. Objectives: This study was conducted to assess the effects of the "Countrywide Services Management Law" on the work motivation level of the employees of the Iranian Ministry of Health. Patients and Methods: This cross-sectional study was done in 2011 in the Iranian Ministry of Health. Data were collected by a 51-item Likert-scale questionnaire covering five domains: organizational structure, information technology, training patterns, salary and bonus system, and re-engineering process. The reliability and validity of the questionnaire were evaluated (Cronbach's alpha = 0.96). Data analysis was conducted using descriptive and inferential statistics (t-test). Results: Of the 192 samples examined, 55.2% of the respondents were female, 88 (45.8%) had a BS degree, and 116 (60.4%) had less than 10 years' experience. The mean scores in the domains of organizational structure, information technology, training patterns, salary and bonus system, and re-engineering patterns were 3.11, 3.51, 3.05, 3.21, and 3.14, respectively. The relationship between the items related to manpower in the "Countrywide Services Management Law" and employees' work motivation was significant (P < 0.0001). The training patterns did not show a significant relation (P < 0.26) with any of the five domains. Conclusions: According to our results and the views of the employees of the Iranian Ministry of Health, the "Countrywide Services Management Law" positively affected the personnel's work motivation regarding all the factors associated with motivation, including organizational structure, information technology, training patterns, salary and bonus system, and re-engineering pattern. Finally, to enhance workforce motivation and satisfaction, the application and implementation of rules and regulations should be based on organizational needs. PMID:24719736
The Evolution of Big Data and Learning Analytics in American Higher Education
ERIC Educational Resources Information Center
Picciano, Anthony G.
2012-01-01
Data-driven decision making, popularized in the 1980s and 1990s, is evolving into a vastly more sophisticated concept known as big data that relies on software approaches generally referred to as analytics. Big data and analytics for instructional applications are in their infancy and will take a few years to mature, although their presence is…