A study of fault prediction and reliability assessment in the SEL environment
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Patnaik, Debabrata
1986-01-01
An empirical study on estimation and prediction of faults, prediction of fault detection and correction effort, and reliability assessment in the Software Engineering Laboratory (SEL) environment is presented. Fault estimation using empirical relationships and fault prediction using a curve-fitting method are investigated. Relationships between debugging efforts (fault detection and correction effort) in different test phases are provided, in order to make an early estimate of future debugging effort. The study concludes with fault analysis, application of a reliability model, and analysis of a normalized metric for reliability assessment and reliability monitoring during software development.
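As an illustration of the kind of empirical size-fault relationship such a study investigates, the sketch below fits a least-squares line relating project size to fault counts and extrapolates to a new project. All numbers are synthetic and the linear form is an assumption, not the SEL study's actual model.

    # Hedged sketch: estimate faults from an assumed empirical size relationship,
    # faults ~ a + b * KSLOC, fitted on synthetic past-project data.
    import numpy as np

    ksloc = np.array([12.0, 25.0, 40.0, 55.0, 80.0])   # past project sizes
    faults = np.array([70, 140, 210, 300, 420])        # faults observed

    b, a = np.polyfit(ksloc, faults, 1)                # least-squares line
    new_size = 60.0
    print(f"predicted faults for {new_size:.0f} KSLOC: {a + b * new_size:.0f}")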
Communication and Organization in Software Development: An Empirical Study
NASA Technical Reports Server (NTRS)
Seaman, Carolyn B.; Basili, Victor R.
1996-01-01
The empirical study described in this paper addresses the issue of communication among members of a software development organization. The independent variables are various attributes of organizational structure. The dependent variable is the effort spent on sharing information which is required by the software development process in use. The research questions upon which the study is based ask whether these attributes of organizational structure have an effect on the amount of communication effort expended. In addition, a number of blocking variables have been identified. These are used to account for factors other than organizational structure which may have an effect on communication effort. The study uses both quantitative and qualitative methods for data collection and analysis. These methods include participant observation, structured interviews, and graphical data presentation. The results of this study indicate that several attributes of organizational structure do affect communication effort, but not in a simple, straightforward way. In particular, the distances between communicators in the reporting structure of the organization, as well as in the physical layout of offices, affect how quickly they can share needed information, especially during meetings. These results provide a better understanding of how organizational structure helps or hinders communication in software development.
Learning from examples - Generation and evaluation of decision trees for software resource analysis
NASA Technical Reports Server (NTRS)
Selby, Richard W.; Porter, Adam A.
1988-01-01
A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
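A minimal sketch of the core idea, assuming synthetic data and invented metric names (sloc, complexity, changes): train a decision tree to separate high-effort modules from the rest, then print its rules. This is not the paper's generation method, which produced and evaluated thousands of trees over 74 attributes.

    # Illustrative sketch: flag high-effort modules from code metrics.
    from sklearn.tree import DecisionTreeClassifier, export_text

    # columns: source lines, cyclomatic complexity, number of changes (assumed)
    X = [[1200, 15, 4], [300, 3, 1], [2500, 40, 9], [450, 5, 2],
         [1800, 22, 6], [200, 2, 1], [3100, 55, 11], [600, 8, 3]]
    y = [1, 0, 1, 0, 1, 0, 1, 0]   # 1 = high development effort

    tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
    print(export_text(tree, feature_names=["sloc", "complexity", "changes"]))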
Semi-Empirical Modeling of SLD Physics
NASA Technical Reports Server (NTRS)
Wright, William B.; Potapczuk, Mark G.
2004-01-01
The effects of supercooled large droplets (SLD) in icing have been an area of much interest in recent years. As part of this effort, the assumptions used for ice accretion software have been reviewed. A literature search was performed to determine advances from other areas of research that could be readily incorporated. Experimental data in the SLD regime were also analyzed. A semi-empirical computational model is presented which incorporates first-order physical effects of large droplet phenomena into icing software. This model has been added to the LEWICE software. Comparisons are then made to the SLD experimental data that have been collected to date. Results are presented for the comparison of water collection efficiency, ice shape, and ice mass.
ERIC Educational Resources Information Center
Proffitt, Curtis K.
2012-01-01
Project failure remains a challenge within the software development field, especially during the early stages of IT project development. Despite the herculean efforts by project managers and organizations to identify and offset problems, projects remain plagued with issues. If these challenges are not mitigated to a successful degree,…
Operations analysis (study 2.1): Shuttle upper stage software requirements
NASA Technical Reports Server (NTRS)
Wolfe, R. R.
1974-01-01
An investigation of software costs related to space shuttle upper stage operations, with emphasis on the additional costs attributable to space servicing, was conducted. The questions and problem areas include the following: (1) the key parameters involved with software costs; (2) historical data for extrapolation of future costs; (3) elements of the basic software development effort that are applicable to servicing functions; (4) the effect of multiple servicing on the complexity of the operation; and (5) whether recurring software costs are significant. The results address these questions and provide a foundation for estimating software costs based on the costs of similar programs and a series of empirical factors.
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.
1992-01-01
Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e., dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) measure the software system to be considered and (2) build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at NASA/GSFC into two fault density classes: low and high. We also evaluate the accuracy of the model and the insights it provides into the software process.
Specification-based software sizing: An empirical investigation of function metrics
NASA Technical Reports Server (NTRS)
Jeffery, Ross; Stathis, John
1993-01-01
For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that: earlier research into inter-item correlation within the overall function count is partially supported; a priori function counts, in themselves, do not explain the majority of the effort variation in software development in the organization studied; documentation quality is critical to accurate function identification; and rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.
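For readers unfamiliar with function counting, the sketch below computes an unadjusted function-point total from standard IFPUG-style component weights. The component counts are invented, and the paper's exact counting rules may differ.

    # Hedged illustration: unadjusted function points from IFPUG-style weights.
    WEIGHTS = {  # (low, average, high) weights per component type
        "external_input": (3, 4, 6),
        "external_output": (4, 5, 7),
        "external_inquiry": (3, 4, 6),
        "internal_file": (7, 10, 15),
        "external_interface": (5, 7, 10),
    }

    counts = {  # assumed counts for a small system: {type: (low, avg, high)}
        "external_input": (4, 2, 1),
        "external_output": (3, 1, 0),
        "external_inquiry": (2, 2, 0),
        "internal_file": (1, 2, 0),
        "external_interface": (0, 1, 0),
    }

    ufp = sum(n * w for t, ns in counts.items() for n, w in zip(ns, WEIGHTS[t]))
    print("unadjusted function points:", ufp)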
NASA Technical Reports Server (NTRS)
Taber, William; Port, Dan
2014-01-01
At the Mission Design and Navigation Software Group at the Jet Propulsion Laboratory, we make use of finite-exponential defect models to aid in maintenance planning and management for our widely used critical systems. However, a number of pragmatic issues arise when applying defect models to a post-release system in continuous use. These include: how to utilize information from problem reports rather than testing to drive defect discovery and removal effort, practical model calibration, and alignment of model assumptions with our environment.
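A minimal sketch of a finite-exponential defect model driven by problem-report arrivals rather than test data, as the group describes. The form D(t) = N0(1 - e^(-bt)) is the standard finite-exponential shape; the data and calibration below are illustrative, not the group's.

    # Hedged sketch: calibrate a finite-exponential defect model against
    # cumulative problem reports (all numbers are synthetic).
    import numpy as np
    from scipy.optimize import curve_fit

    months = np.arange(1, 13, dtype=float)
    cum_reports = np.array([8, 15, 21, 26, 30, 33, 36, 38, 39, 41, 42, 43], float)

    def finite_exp(t, n0, b):
        # n0 = latent defect population, b = discovery rate
        return n0 * (1.0 - np.exp(-b * t))

    (n0, b), _ = curve_fit(finite_exp, months, cum_reports, p0=(50.0, 0.2))
    print(f"latent defects ~{n0:.0f}; expected remaining ~{n0 - cum_reports[-1]:.0f}")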
A research program in empirical computer science
NASA Technical Reports Server (NTRS)
Knight, J. C.
1991-01-01
During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.
Science and Technology Investment Strategy for Squadron Level Training
1993-05-01
be derived from empirically sound and theory-based instructional models. Comment: The automation of instructional design could favorably impact the...require a significant amount of time to develop and where the underlying theory and/or applications hardware and software is in flux. Long-term efforts...training or training courses. It does not refer to the initial evaluation of individuals entering Upgrade Training (UGT). It does refer to the evaluation of
Cost and schedule estimation study report
NASA Technical Reports Server (NTRS)
Condon, Steve; Regardie, Myrna; Stark, Mike; Waligora, Sharon
1993-01-01
This report describes the analysis performed and the findings of a study of the software development cost and schedule estimation models used by the Flight Dynamics Division (FDD), Goddard Space Flight Center. The study analyzes typical FDD projects, focusing primarily on those developed since 1982. The study reconfirms the standard SEL effort estimation model that is based on size adjusted for reuse; however, guidelines for the productivity and growth parameters in the baseline effort model have been updated. The study also produced a schedule prediction model based on empirical data that varies depending on application type. Models for the distribution of effort and schedule by life-cycle phase are also presented. Finally, this report explains how to use these models to plan SEL projects.
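A hedged sketch of a SEL-style effort model of the form effort = a * size^b, with size adjusted for reuse. The 20% reuse weighting is a commonly cited SEL convention, but the coefficients below are placeholders, not the values calibrated in this study.

    # Hedged sketch of a SEL-style effort estimate (placeholder coefficients).
    def developed_ksloc(new_ksloc, reused_ksloc, reuse_weight=0.2):
        # new code counts fully; reused code at a reduced weight (assumed 20%)
        return new_ksloc + reuse_weight * reused_ksloc

    def effort_staff_months(size_ksloc, a=1.5, b=0.98):
        # a, b are illustrative stand-ins for the calibrated SEL parameters
        return a * size_ksloc ** b

    size = developed_ksloc(new_ksloc=30.0, reused_ksloc=50.0)
    print(f"developed size: {size:.1f} KSLOC -> "
          f"{effort_staff_months(size):.1f} staff-months")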
Real-time sensor data validation
NASA Technical Reports Server (NTRS)
Bickmore, Timothy W.
1994-01-01
This report describes the status of an on-going effort to develop software capable of detecting sensor failures on rocket engines in real time. This software could be used in a rocket engine controller to prevent the erroneous shutdown of an engine due to sensor failures which would otherwise be interpreted as engine failures by the control software. The approach taken combines analytical redundancy with Bayesian belief networks to provide a solution which has well defined real-time characteristics and well-defined error rates. Analytical redundancy is a technique in which a sensor's value is predicted by using values from other sensors and known or empirically derived mathematical relations. A set of sensors and a set of relations among them form a network of cross-checks which can be used to periodically validate all of the sensors in the network. Bayesian belief networks provide a method of determining if each of the sensors in the network is valid, given the results of the cross-checks. This approach has been successfully demonstrated on the Technology Test Bed Engine at the NASA Marshall Space Flight Center. Current efforts are focused on extending the system to provide a validation capability for 100 sensors on the Space Shuttle Main Engine.
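A toy sketch of one analytical-redundancy cross-check: predict a sensor's value from other sensors through an assumed empirical relation and flag large residuals. The relation, threshold, and sensor names are invented; the reported system combines many such checks through a Bayesian belief network.

    # Hedged sketch of a single analytical-redundancy cross-check.
    def predicted_turbine_temp(fuel_flow, chamber_pressure):
        # assumed empirical relation calibrated offline (not the engine model)
        return 350.0 + 42.0 * fuel_flow + 1.8 * chamber_pressure

    def check(measured, predicted, tol=25.0):
        # flag the sensor pair as inconsistent if the residual exceeds tol
        residual = measured - predicted
        return abs(residual) <= tol, residual

    ok, r = check(measured=913.0,
                  predicted=predicted_turbine_temp(9.5, 85.0))
    print("sensor consistent" if ok else f"cross-check failed (residual {r:+.1f})")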
VBA: A Probabilistic Treatment of Nonlinear Models for Neurobiological and Behavioural Data
Daunizeau, Jean; Adam, Vincent; Rigoux, Lionel
2014-01-01
This work is part of an ongoing effort toward a computational (quantitative and refutable) understanding of human neuro-cognitive processes. Many sophisticated models for behavioural and neurobiological data have flourished during the past decade. Most of these models are partly unspecified (i.e. they have unknown parameters) and nonlinear. This makes them difficult to pair with a formal statistical data analysis framework. In turn, this compromises the reproducibility of model-based empirical studies. This work exposes a software toolbox that provides generic, efficient and robust probabilistic solutions to the three problems of model-based analysis of empirical data: (i) data simulation, (ii) parameter estimation/model selection, and (iii) experimental design optimization. PMID:24465198
Process correlation analysis model for process improvement identification.
Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong
2014-01-01
Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
Software Development Management: Empirical and Analytical Perspectives
ERIC Educational Resources Information Center
Kang, Keumseok
2011-01-01
Managing software development is a very complex activity because it must deal with people, organizations, technologies, and business processes. My dissertation consists of three studies that examine software development management from various perspectives. The first study empirically investigates the impacts of prior experience with similar…
NASA Technical Reports Server (NTRS)
Brown, David B.
1990-01-01
The results of research and development efforts are described for Task one, Phase two of a general project entitled The Development of a Program Analysis Environment for Ada. The scope of this task includes the design and development of a prototype system for testing Ada software modules at the unit level. The system is called Query Utility Environment for Software Testing of Ada (QUEST/Ada). The prototype for condition coverage provides a platform that implements expert system interaction with program testing. The expert system can modify data in the instrumented source code in order to achieve coverage goals. Given this initial prototype, it is possible to evaluate the rule base in order to develop improved rules for test case generation. The goals of Phase two are the following: (1) to continue to develop and improve the current user interface to support the other goals of this research effort (i.e., those related to improved testing efficiency and increased code reliability); (2) to develop and empirically evaluate a succession of alternative rule bases for the test case generator such that the expert system achieves coverage in a more efficient manner; and (3) to extend the concepts of the current test environment to address the issues of Ada concurrency.
User's Manual for LEWICE Version 3.2
NASA Technical Reports Server (NTRS)
Wright, William
2008-01-01
A research project is underway at NASA Glenn to produce a computer code which can accurately predict ice growth under a wide range of meteorological conditions for any aircraft surface. This report will present a description of the code inputs and outputs from version 3.2 of this software, which is called LEWICE. This version differs from release 2.0 due to the addition of advanced thermal analysis capabilities for de-icing and anti-icing applications using electrothermal heaters or bleed air applications, the addition of automated Navier-Stokes analysis, an empirical model for supercooled large droplets (SLD) and a pneumatic boot option. An extensive effort was also undertaken to compare the results against the database of electrothermal results which have been generated in the NASA Glenn Icing Research Tunnel (IRT) as was performed for the validation effort for version 2.0. This report will primarily describe the features of the software related to the use of the program. Appendix A has been included to list some of the inner workings of the software or the physical models used. This information is also available in the form of several unpublished documents internal to NASA. This report is intended as a replacement for all previous user manuals of LEWICE. In addition to describing the changes and improvements made for this version, information from previous manuals may be duplicated so that the user will not need to consult previous manuals to use this software.
NASA Astrophysics Data System (ADS)
Idaszak, R.; Lenhardt, W. C.; Jones, M. B.; Ahalt, S.; Schildhauer, M.; Hampton, S. E.
2014-12-01
The NSF, in an effort to support the creation of sustainable science software, funded 16 science software institute conceptualization efforts. The goal of these conceptualization efforts is to explore approaches to creating the institutional, sociological, and physical infrastructures to support sustainable science software. This paper will present the lessons learned from two of these conceptualization efforts, the Institute for Sustainable Earth and Environmental Software (ISEES - http://isees.nceas.ucsb.edu) and the Water Science Software Institute (WSSI - http://waters2i2.org). ISEES is a multi-partner effort led by the National Center for Ecological Analysis and Synthesis (NCEAS). WSSI, also a multi-partner effort, is led by the Renaissance Computing Institute (RENCI). The two conceptualization efforts have been collaborating because of the complementarity of their approaches and the potential synergies of their science focus. ISEES and WSSI have engaged in a number of activities to address the challenges of science software, such as workshops, hackathons, and coding efforts. More recently, the two institutes have also collaborated on joint activities including training, proposals, and papers. In addition to presenting lessons learned, this paper will synthesize across the two efforts to project a unified vision for a science software institute.
An empirical study of software design practices
NASA Technical Reports Server (NTRS)
Card, David N.; Church, Victor E.; Agresti, William W.
1986-01-01
Software engineers have developed a large body of software design theory and folklore, much of which was never validated. The results of an empirical study of software design practices in one specific environment are presented. The practices examined affect module size, module strength, data coupling, descendant span, unreferenced variables, and software reuse. Measures characteristic of these practices were extracted from 887 FORTRAN modules developed for five flight dynamics software projects monitored by the Software Engineering Laboratory (SEL). The relationship of these measures to cost and fault rate was analyzed using a contingency table procedure. The results show that some recommended design practices, despite their intuitive appeal, are ineffective in this environment, whereas others are very effective.
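A small sketch of the contingency-table style of analysis described, assuming invented counts: test whether a design practice (here, small versus large modules) is associated with fault-rate class using a chi-square test.

    # Hedged sketch of a contingency-table check (counts are invented).
    from scipy.stats import chi2_contingency

    #                 low fault rate, high fault rate
    table = [[62, 18],   # small modules
             [35, 41]]   # large modules
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")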
The Use of Empirical Studies in the Development of High End Computing Applications
2009-12-01
34, Proceedings of the 5th ACM-IEEE International Symposium on Empirical Software Engineering (ISESE), Rio de Janeiro, Brazil, September, 2006. 8. Jeffrey C...Symposium on Empirical Software Engineering (ISESE), Rio de Janeiro, September, 2006. [26] Zelkowitz M., V. Basili, S. Asgari, L. Hochstein, J...data is consistently collected across studies. 4. Sanitization of sensitive data. The framework provides external researchers with access to the
Recent Progress Towards Predicting Aircraft Ground Handling Performance
NASA Technical Reports Server (NTRS)
Yager, T. J.; White, E. J.
1981-01-01
The significant progress which has been achieved in the development of aircraft ground handling simulation capability is reviewed and additional improvements in software modeling are identified. The problem associated with providing the necessary simulator input data for adequate modeling of aircraft tire/runway friction behavior is discussed, and efforts to improve this complex model, and hence simulator fidelity, are described. Aircraft braking performance data obtained on several wet runway surfaces are compared to ground vehicle friction measurements and, by use of empirically derived methods, good agreement between actual and estimated aircraft braking friction from ground vehicle data is shown. The performance of a relatively new friction measuring device, the friction tester, showed great promise in providing data applicable to aircraft friction performance. Additional research efforts to improve methods of predicting tire friction performance are discussed, including use of an instrumented tire test vehicle to expand the tire friction data bank and a study of surface texture measurement techniques.
Estimating Software Effort Hours for Major Defense Acquisition Programs
ERIC Educational Resources Information Center
Wallshein, Corinne C.
2010-01-01
Software Cost Estimation (SCE) uses labor hours or effort required to conceptualize, develop, integrate, test, field, or maintain program components. Department of Defense (DoD) SCE can use initial software data parameters to project effort hours for large, software-intensive programs for contractors reporting the top levels of process maturity,…
A second generation experiment in fault-tolerant software
NASA Technical Reports Server (NTRS)
Knight, J. C.
1986-01-01
Information was collected on the efficacy of fault-tolerant software by conducting two large-scale controlled experiments. In the first, an empirical study of multi-version software (MVS) was conducted. The second experiment is an empirical evaluation of self testing as a method of error detection (STED). The purpose of the MVS experiment was to obtain empirical measurements of the performance of multi-version systems. Twenty versions of a program were prepared at four different sites under reasonably realistic development conditions from the same specifications. The purpose of the STED experiment was to obtain empirical measurements of the performance of assertions in error detection. Eight versions of a program were modified to include assertions at two different sites under controlled conditions. The overall structure of the testing environment for the MVS experiment and its status are described. Work to date in the STED experiment is also presented.
Mechanistic-empirical Pavement Design Guide Implementation
DOT National Transportation Integrated Search
2010-06-01
The recently introduced Mechanistic-Empirical Pavement Design Guide (MEPDG) and associated computer software provide a state-of-practice mechanistic-empirical highway pavement design methodology. The MEPDG methodology is based on pavement responses ...
Analyzing and Predicting Effort Associated with Finding and Fixing Software Faults
NASA Technical Reports Server (NTRS)
Hamill, Maggie; Goseva-Popstojanova, Katerina
2016-01-01
Context: Software developers spend a significant amount of time fixing faults. However, not many papers have addressed the actual effort needed to fix software faults. Objective: The objective of this paper is twofold: (1) analysis of the effort needed to fix software faults and how it was affected by several factors and (2) prediction of the level of fix implementation effort based on the information provided in software change requests. Method: The work is based on data related to 1200 failures, extracted from the change tracking system of a large NASA mission. The analysis includes descriptive and inferential statistics. Predictions are made using three supervised machine learning algorithms and three sampling techniques aimed at addressing the imbalanced data problem. Results: Our results show that (1) 83% of the total fix implementation effort was associated with only 20% of failures. (2) Both safety critical failures and post-release failures required three times more effort to fix compared to non-critical and pre-release counterparts, respectively. (3) Failures with fixes spread across multiple components or across multiple types of software artifacts required more effort. The spread across artifacts was more costly than spread across components. (4) Surprisingly, some types of faults associated with later life-cycle activities did not require significant effort. (5) The level of fix implementation effort was predicted with 73% overall accuracy using the original, imbalanced data. Using oversampling techniques improved the overall accuracy up to 77%. More importantly, oversampling significantly improved the prediction of the high level effort, from 31% to around 85%. Conclusions: This paper shows the importance of tying software failures to changes made to fix all associated faults, in one or more software components and/or in one or more software artifacts, and the benefit of studying how the spread of faults and other factors affect the fix implementation effort.
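A compact sketch of the prediction setup under stated assumptions: synthetic change-request features, a single learner standing in for the paper's three algorithms, and naive random oversampling of the high-effort minority class standing in for the paper's sampling techniques.

    # Hedged sketch: classify fix-effort level with naive random oversampling.
    import random
    from collections import Counter
    from sklearn.tree import DecisionTreeClassifier

    random.seed(0)
    # columns: components touched, artifact types touched, safety-critical flag
    X = [[1, 1, 0], [2, 1, 0], [1, 2, 0], [3, 2, 1], [4, 3, 1],
         [1, 1, 0], [2, 2, 0], [5, 3, 1], [1, 1, 0], [2, 1, 0]]
    y = [0, 0, 0, 1, 1, 0, 0, 1, 0, 0]   # 1 = high fix effort (minority class)

    # duplicate random minority rows until the two classes balance
    minority = [i for i, label in enumerate(y) if label == 1]
    extra = [random.choice(minority) for _ in range(len(y) - 2 * len(minority))]
    X_bal = X + [X[i] for i in extra]
    y_bal = y + [y[i] for i in extra]
    print("class counts after oversampling:", Counter(y_bal))

    clf = DecisionTreeClassifier(random_state=0).fit(X_bal, y_bal)
    print("predicted effort level:", clf.predict([[4, 2, 1]])[0])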
A Review of Discrete Element Method (DEM) Particle Shapes and Size Distributions for Lunar Soil
NASA Technical Reports Server (NTRS)
Lane, John E.; Metzger, Philip T.; Wilkinson, R. Allen
2010-01-01
As part of ongoing efforts to develop models of lunar soil mechanics, this report reviews two topics that are important to discrete element method (DEM) modeling of the behavior of soils (such as lunar soils): (1) methods of modeling particle shapes and (2) analytical representations of particle size distribution. The choice of particle shape complexity is driven primarily by opposing tradeoffs with total number of particles, computer memory, and total simulation computer processing time. The choice is also dependent on available DEM software capabilities. For example, PFC2D/PFC3D and EDEM support clustering of spheres; MIMES incorporates superquadric particle shapes; and BLOKS3D provides polyhedra shapes. Most commercial and custom DEM software supports some type of complex particle shape beyond the standard sphere. Convex polyhedra, clusters of spheres and single parametric particle shapes such as the ellipsoid, polyellipsoid, and superquadric, are all motivated by the desire to introduce asymmetry into the particle shape, as well as edges and corners, in order to better simulate actual granular particle shapes and behavior. An empirical particle size distribution (PSD) formula is shown to fit desert sand data from Bagnold. Particle size data of JSC-1a obtained from a fine particle analyzer at the NASA Kennedy Space Center are also fitted to a similar empirical PSD function.
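As a sketch of fitting an empirical PSD, the code below fits a Rosin-Rammler (Weibull) cumulative form to fraction-finer data. The functional form is a common choice assumed here for illustration, and the data points are invented, not Bagnold's or the JSC-1a measurements.

    # Hedged sketch: fit a Rosin-Rammler cumulative PSD to synthetic data.
    import numpy as np
    from scipy.optimize import curve_fit

    diam_um = np.array([10, 30, 60, 100, 200, 400, 800], dtype=float)
    frac_finer = np.array([0.05, 0.18, 0.35, 0.52, 0.75, 0.91, 0.98])

    def rosin_rammler(d, d0, n):
        # d0 = characteristic size, n = spread exponent
        return 1.0 - np.exp(-((d / d0) ** n))

    (d0, n), _ = curve_fit(rosin_rammler, diam_um, frac_finer, p0=(100.0, 1.0))
    print(f"characteristic size d0 = {d0:.0f} um, spread exponent n = {n:.2f}")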
Proceedings of the 14th Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1989-01-01
Several software related topics are presented. Topics covered include studies and experiments at the Software Engineering Laboratory at the Goddard Space Flight Center, predicting project success from the Software Project Management Process, software environments, testing in a reuse environment, domain directed reuse, and classification tree analysis using the Amadeus measurement and empirical analysis.
A Heuristic for Improving Legacy Software Quality during Maintenance: An Empirical Case Study
ERIC Educational Resources Information Center
Sale, Michael John
2017-01-01
Many organizations depend on the functionality of mission-critical legacy software and the continued maintenance of this software is vital. Legacy software is defined here as software that contains no testing suite, is often foreign to the developer performing the maintenance, lacks meaningful documentation, and over time, has become difficult to…
ERIC Educational Resources Information Center
Lin, Tin-Chun
2016-01-01
In this paper we explore and discuss an important research question in higher education--is there a trade-off relationship between in-class and out-of-class efforts for students? We used an empirical model to test the trade-off hypothesis between these two efforts. We identified a trade-off between in-class and out-of-class efforts, especially for…
Empirical studies of software design: Implications for SSEs
NASA Technical Reports Server (NTRS)
Krasner, Herb
1988-01-01
Implications for Software Engineering Environments (SEEs) are presented in viewgraph format for characteristics of projects studied; significant problems and crucial problem areas in software design for large systems; layered behavioral model of software processes; implications of field study results; software project as an ecological system; results of the LIFT study; information model of design exploration; software design strategies; results of the team design study; and a list of publications.
U.S. ENVIRONMENTAL PROTECTION AGENCY'S LANDFILL GAS EMISSION MODEL (LANDGEM)
The paper discusses EPA's available software for estimating landfill gas emissions. This software is based on a first-order decomposition rate equation using empirical data from U.S. landfills. The software provides a relatively simple approach to estimating landfill gas emissi...
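A simplified sketch of the first-order decay idea behind such software: each year's waste increment contributes k*L0*M*exp(-k*t) after age t. LANDGEM itself uses finer sub-annual increments and calibrated defaults; the k, L0, and tonnages below are assumptions.

    # Hedged sketch of a first-order-decay landfill gas estimate
    # (single annual increments; all parameter values are assumed).
    import math

    k = 0.05       # 1/yr, assumed decay rate
    L0 = 170.0     # m^3 CH4 per Mg waste, assumed generation potential
    waste_by_year = {2015: 50_000.0, 2016: 60_000.0, 2017: 55_000.0}  # Mg/yr

    def methane_rate(year):
        # m^3 CH4 / yr emitted in `year` from all prior annual increments
        return sum(k * L0 * m * math.exp(-k * (year - y))
                   for y, m in waste_by_year.items() if year >= y)

    print(f"estimated CH4 generation in 2020: {methane_rate(2020):,.0f} m^3/yr")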
Duffy, Fergal J; Verniere, Mélanie; Devocelle, Marc; Bernard, Elise; Shields, Denis C; Chubb, Anthony J
2011-04-25
We introduce CycloPs, software for the generation of virtual libraries of constrained peptides including natural and nonnatural commercially available amino acids. The software is written in the cross-platform Python programming language, and features include generating virtual libraries in one-dimensional SMILES and three-dimensional SDF formats, suitable for virtual screening. The stand-alone software is capable of filtering the virtual libraries using empirical measurements, including peptide synthesizability by standard peptide synthesis techniques, stability, and the druglike properties of the peptide. The software and accompanying Web interface are designed to enable the rapid generation of large, structurally diverse, synthesizable virtual libraries of constrained peptides quickly and conveniently, for use in virtual screening experiments. The stand-alone software, and the Web interface for evaluating these empirical properties of a single peptide, are available at http://bioware.ucd.ie.
Reaction Wheel Disturbance Model Extraction Software - RWDMES
NASA Technical Reports Server (NTRS)
Blaurock, Carl
2009-01-01
The RWDMES is a tool for modeling the disturbances imparted on spacecraft by spinning reaction wheels. Reaction wheels are usually the largest disturbance source on a precision pointing spacecraft, and can be the dominating source of pointing error. Accurate knowledge of the disturbance environment is critical to accurate prediction of the pointing performance. In the past, it has been difficult to extract an accurate wheel disturbance model since the forcing mechanisms are difficult to model physically, and the forcing amplitudes are filtered by the dynamics of the reaction wheel. RWDMES captures the wheel-induced disturbances using a hybrid physical/empirical model that is extracted directly from measured forcing data. The empirical models capture the tonal forces that occur at harmonics of the spin rate, and the broadband forces that arise from random effects. The empirical forcing functions are filtered by a physical model of the wheel structure that includes spin-rate-dependent moments (gyroscopic terms). The resulting hybrid model creates a highly accurate prediction of wheel-induced forces. It accounts for variation in disturbance frequency, as well as the shifts in structural amplification by the whirl modes, as the spin rate changes. This software provides a point-and-click environment for producing accurate models with minimal user effort. Where conventional approaches may take weeks to produce a model of variable quality, RWDMES can create a demonstrably high accuracy model in two hours. The software consists of a graphical user interface (GUI) that enables the user to specify all analysis parameters, to evaluate analysis results and to iteratively refine the model. Underlying algorithms automatically extract disturbance harmonics, initialize and tune harmonic models, and initialize and tune broadband noise models. The component steps are described in the RWDMES user's guide and include: converting time domain data to waterfall PSDs (power spectral densities); converting PSDs to order analysis data; extracting harmonics; initializing and simultaneously tuning a harmonic model and a wheel structural model; initializing and tuning a broadband model; and verifying the harmonic/broadband/structural model against the measurement data. Functional operation is through a MATLAB GUI that loads test data, performs the various analyses, plots evaluation data for assessment and refinement of analysis parameters, and exports the data to documentation or downstream analysis code. The harmonic models are defined as specified functions of frequency, typically speed-squared. The reaction wheel structural model is realized as mass, damping, and stiffness matrices (typically from a finite element analysis package) with the addition of a gyroscopic forcing matrix. The broadband noise model is realized as a set of speed-dependent filters. The tuning of the combined model is performed using nonlinear least squares techniques. RWDMES is implemented as a MATLAB toolbox comprising the Fit Manager for performing the model extraction, Data Manager for managing input data and output models, the Gyro Manager for modifying wheel structural models, and the Harmonic Editor for evaluating and tuning harmonic models. This software was validated using data from Goodrich E wheels, and from GSFC Lunar Reconnaissance Orbiter (LRO) wheels. The validation testing proved that RWDMES has the capability to extract accurate disturbance models from flight reaction wheels with minimal user effort.
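A tiny sketch of the tonal (harmonic) part of such a model, under stated assumptions: disturbance tones at assumed harmonics of the spin rate with speed-squared amplitudes. RWDMES extracts and tunes these coefficients from measured data and filters them through a structural wheel model, none of which is reproduced here.

    # Hedged sketch of a wheel harmonic-disturbance model: tonal forces at
    # harmonics h_i of the spin rate with speed-squared amplitudes
    # F_i = C_i * Omega^2 (harmonic numbers and coefficients are assumed).
    import numpy as np

    harmonics = np.array([1.0, 2.0, 4.42])      # assumed harmonic numbers
    coeffs = np.array([2e-7, 5e-8, 1e-8])       # N/(rad/s)^2, assumed amplitudes

    def tonal_forces(spin_rate_hz):
        omega = 2 * np.pi * spin_rate_hz
        freqs = harmonics * spin_rate_hz        # disturbance frequencies, Hz
        amps = coeffs * omega**2                # speed-squared amplitude law
        return freqs, amps

    freqs, amps = tonal_forces(40.0)            # wheel spinning at 40 rev/s
    for f, a in zip(freqs, amps):
        print(f"{f:7.1f} Hz : {a:.2e} N")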
Empirical Data Collection and Analysis Using Camtasia and Transana
ERIC Educational Resources Information Center
Thorsteinsson, Gisli; Page, Tom
2009-01-01
One of the possible techniques for collecting empirical data is video recordings of a computer screen with specific screen capture software. This method for collecting empirical data shows how students use the BSCWII (Be Smart Cooperate Worldwide--a web based collaboration/groupware environment) to coordinate their work and collaborate in…
ERIC Educational Resources Information Center
Tran, Kiet T.
2012-01-01
This study examined the relationship between information technology (IT) governance and software reuse success. Software reuse has been mostly an IT problem but rarely a business one. Studies in software reuse are abundant; however, to date, none has a deep appreciation of IT governance. This study demonstrated that IT governance had a positive…
Testing Software Development Project Productivity Model
NASA Astrophysics Data System (ADS)
Lipkin, Ilya
Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis grounded in Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains such as IT, Command and Control, and Simulation. This research validates findings from previous work concerning software project productivity and leverages those results in this study. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.
ERIC Educational Resources Information Center
Lin, Yu-Wei; Zini, Enrico
2008-01-01
This empirical paper shows how free/libre open source software (FLOSS) contributes to mutual and collaborative learning in an educational environment. Unlike proprietary software, FLOSS allows extensive customisation of software to support the needs of local users better. This also allows users to participate more proactively in the development…
Generic domain models in software engineering
NASA Technical Reports Server (NTRS)
Maiden, Neil
1992-01-01
This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.
Automated support for experience-based software management
NASA Technical Reports Server (NTRS)
Valett, Jon D.
1992-01-01
To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
Software archeology: a case study in software quality assurance and design
DOE Office of Scientific and Technical Information (OSTI.GOV)
Macdonald, John M; Lloyd, Jane A; Turner, Cameron J
2009-01-01
Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.
Students' different understandings of class diagrams
NASA Astrophysics Data System (ADS)
Boustedt, Jonas
2012-03-01
The software industry needs well-trained software designers and one important aspect of software design is the ability to model software designs visually and understand what visual models represent. However, previous research indicates that software design is a difficult task to many students. This article reports empirical findings from a phenomenographic investigation on how students understand class diagrams, Unified Modeling Language (UML) symbols, and relations to object-oriented (OO) concepts. The informants were 20 Computer Science students from four different universities in Sweden. The results show qualitatively different ways to understand and describe UML class diagrams and the "diamond symbols" representing aggregation and composition. The purpose of class diagrams was understood in a varied way, from describing it as a documentation to a more advanced view related to communication. The descriptions of class diagrams varied from seeing them as a specification of classes to a more advanced view, where they were described to show hierarchic structures of classes and relations. The diamond symbols were seen as "relations" and a more advanced way was seeing the white and the black diamonds as different symbols for aggregation and composition. As a consequence of the results, it is recommended that UML should be adopted in courses. It is briefly indicated how the phenomenographic results in combination with variation theory can be used by teachers to enhance students' possibilities to reach advanced understanding of phenomena related to UML class diagrams. Moreover, it is recommended that teachers should put more effort in assessing skills in proper usage of the basic symbols and models and students should be provided with opportunities to practise collaborative design, e.g. using whiteboards.
The Software Management Environment (SME)
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Decker, William; Buell, John
1988-01-01
The Software Management Environment (SME) is a research effort designed to utilize the past experiences and results of the Software Engineering Laboratory (SEL) and to incorporate this knowledge into a tool for managing projects. SME provides the software development manager with the ability to observe, compare, predict, analyze, and control key software development parameters such as effort, reliability, and resource utilization. The major components of the SME, the architecture of the system, and examples of the functionality of the tool are discussed.
ERIC Educational Resources Information Center
Muller, Eugene W.
1985-01-01
Develops generalizations for empirical evaluation of software based upon suitability of several research designs--pretest posttest control group, single-group pretest posttest, nonequivalent control group, time series, and regression discontinuity--to type of software being evaluated, and on circumstances under which evaluation is conducted. (MBR)
Knowledge Sharing through Pair Programming in Learning Environments: An Empirical Study
ERIC Educational Resources Information Center
Kavitha, R. K.; Ahmed, M. S.
2015-01-01
Agile software development is an iterative and incremental methodology, where solutions evolve from self-organizing, cross-functional teams. Pair programming is a type of agile software development technique where two programmers work together with one computer for developing software. This paper reports the results of the pair programming…
Prep-ME Software Implementation and Enhancement
DOT National Transportation Integrated Search
2017-09-01
Highway agencies across the United States are moving from empirical design procedures towards the mechanistic-empirical (ME) based pavement design. Even though the Pavement ME Design presents a new paradigm shift with several dramatic improvements, i...
NASA Technical Reports Server (NTRS)
McNeill, Justin
1995-01-01
The Multimission Image Processing Subsystem (MIPS) at the Jet Propulsion Laboratory (JPL) has managed transitions of application software sets from one operating system and hardware platform to multiple operating systems and hardware platforms. As a part of these transitions, cost estimates were generated from the personal experience of in-house developers and managers to calculate the total effort required for such projects. Productivity measures have been collected for two such transitions, one very large and the other relatively small in terms of source lines of code. These estimates used a cost estimation model similar to the Software Engineering Laboratory (SEL) Effort Estimation Model. Experience in transitioning software within JPL MIPS has uncovered a high incidence of interface complexity. Interfaces, both internal and external to individual software applications, have contributed to software transition project complexity, and thus to scheduling difficulties and larger than anticipated design work on software to be ported.
Earthquake Loss Estimates in Near Real-Time
NASA Astrophysics Data System (ADS)
Wyss, Max; Wang, Rongjiang; Zschau, Jochen; Xia, Ye
2006-10-01
The usefulness to rescue teams of near-real-time loss estimates after major earthquakes is advancing rapidly. The difference in the quality of data available in highly developed compared with developing countries dictates that different approaches be used to maximize mitigation efforts. In developed countries, extensive information from tax and insurance records, together with accurate census figures, furnishes detailed data on the fragility of buildings and on the number of people at risk. For example, these data are exploited by the loss-estimation method used in the Hazards U.S. Multi-Hazard (HAZUS-MH) software program (http://www.fema.gov/plan/prevent/hazus/). However, in developing countries, the population at risk is estimated from inferior data sources and the fragility of the building stock is often derived empirically, using past disastrous earthquakes for calibration [Wyss, 2004].
Microstructure Modeling of Third Generation Disk Alloys
NASA Technical Reports Server (NTRS)
Jou, Herng-Jeng
2010-01-01
The objective of this program was to model, validate, and predict the precipitation microstructure evolution, using PrecipiCalc (QuesTek Innovations LLC) software, for 3rd generation Ni-based gas turbine disc superalloys during processing and service, with a set of logical and consistent experiments and characterizations. Furthermore, within this program, the originally research-oriented microstructure simulation tool was to be further improved and implemented to be a useful and user-friendly engineering tool. In this report, the key accomplishments achieved during the third year (2009) of the program are summarized. The activities of this year included: further development of the multistep precipitation simulation framework for gamma prime microstructure evolution during heat treatment; calibration and validation of gamma prime microstructure modeling with supersolvus heat-treated LSHR; modeling of the microstructure evolution of the minor phases, particularly carbides, during isothermal aging, representing long-term microstructure stability during thermal exposure; and the implementation of software tools. During the research and development efforts to extend the precipitation microstructure modeling and prediction capability in this 3-year program, we identified a hurdle, related to the slow gamma prime coarsening rate, for which no satisfactory scientific explanation is currently available. It is desirable to raise this issue to the Ni-based superalloys research community, in the hope that a mechanistic understanding and physics-based treatment will emerge to overcome the hurdle. In the meantime, an empirical correction factor was developed in this modeling effort to capture the experimental observations.
Student Use of Scaffolding Software: Relationships with Motivation and Conceptual Understanding
ERIC Educational Resources Information Center
Butler, Kyle A.; Lumpe, Andrew
2008-01-01
This study was designed to theoretically articulate and empirically assess the role of computer scaffolds. In this project, several examples of educational software were developed to scaffold the learning of students performing high level cognitive activities. The software used in this study, Artemis, focused on scaffolding the learning of…
ERIC Educational Resources Information Center
Jolicoeur, Karen; Berger, Dale E.
1986-01-01
Examination of methods used by two software review services in evaluating microcomputer courseware--EPIE (Educational Products Information Exchange) and MicroSIFT (Microcomputer Software and Information for Teachers)--found low correlations between their recommendations for 82 programs. This lack of agreement casts doubts on the usefulness of…
Implementation of the AASHTO mechanistic-empirical pavement design guide for Colorado.
DOT National Transportation Integrated Search
2000-01-01
The objective of this project was to integrate the American Association of State Highway and Transportation Officials (AASHTO) Mechanistic-Empirical Pavement Design Guide, Interim Edition: A Manual of Practice and its accompanying software into the d...
NASA Technical Reports Server (NTRS)
Hops, J. M.; Sherif, J. S.
1994-01-01
A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on the average, the effort spent on software maintenance is as costly as the effort spent on all other software costs. Software design methods should be the starting point to aid in alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
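A hedged sketch of one way to form a relative complexity ranking, assuming invented module metrics: standardize each metric to a z-score and sum across metrics, then rank. The article's parsimonious models and exact metric definition are not reproduced here.

    # Hedged sketch: rank modules by a summed-z-score "relative complexity".
    import numpy as np

    modules = ["parse", "route", "log", "ui"]
    metrics = np.array([   # rows: modules; cols: sloc, cyclomatic, fan-out
        [1200, 35, 12],
        [800, 22, 7],
        [150, 4, 2],
        [2400, 48, 20],
    ], dtype=float)

    z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)
    relative_complexity = z.sum(axis=1)
    for name, score in sorted(zip(modules, relative_complexity),
                              key=lambda p: -p[1]):
        print(f"{name:6s} {score:+.2f}")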
Empirical Evaluation of Hunk Metrics as Bug Predictors
NASA Astrophysics Data System (ADS)
Ferzund, Javed; Ahsan, Syed Nadeem; Wotawa, Franz
Reducing the number of bugs is a crucial issue during software development and maintenance. Software process and product metrics are good indicators of software complexity. These metrics have been used to build bug predictor models to help developers maintain the quality of software. In this paper we empirically evaluate the use of hunk metrics as predictors of bugs. We present a technique for bug prediction that works at the smallest units of code change, called hunks. We build bug prediction models using random forests, an efficient machine learning classifier. Hunk metrics are used to train the classifier, and each hunk metric is evaluated for its bug prediction capabilities. Our classifier can classify individual hunks as buggy or bug-free with 86% accuracy, 83% buggy-hunk precision, and 77% buggy-hunk recall. We find that history-based and change-level hunk metrics are better predictors of bugs than code-level hunk metrics.
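A minimal sketch of the paper's classifier choice, with synthetic hunk metrics and labels: train a random forest and report buggy-hunk precision and recall. For brevity it evaluates on the training data; the paper's figures come from proper evaluation.

    # Hedged sketch: hunk-level bug prediction with a random forest.
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import precision_score, recall_score

    # columns: lines changed in hunk, operators added, touches-condition flag
    # (metric names are assumed for illustration)
    X = [[12, 3, 1], [2, 0, 0], [30, 7, 1], [5, 1, 0],
         [25, 5, 1], [3, 0, 0], [18, 4, 0], [40, 9, 1]]
    y = [1, 0, 1, 0, 1, 0, 0, 1]   # 1 = hunk later implicated in a bug fix

    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    pred = clf.predict(X)   # training-set evaluation, for brevity only
    print("buggy-hunk precision:", precision_score(y, pred))
    print("buggy-hunk recall:", recall_score(y, pred))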
The Need for Large-Scale, Longitudinal Empirical Studies in Middle Level Education Research
ERIC Educational Resources Information Center
Mertens, Steven B.; Caskey, Micki M.; Flowers, Nancy
2016-01-01
This essay describes and discusses the ongoing need for large-scale, longitudinal, empirical research studies focused on middle grades education. After a statement of the problem and concerns, the essay describes and critiques several prior middle grades efforts and research studies. Recommendations for future research efforts to inform policy…
Automated Estimation Of Software-Development Costs
NASA Technical Reports Server (NTRS)
Roush, George B.; Reini, William
1993-01-01
COSTMODL is automated software development-estimation tool. Yields significant reduction in risk of cost overruns and failed projects. Accepts description of software product developed and computes estimates of effort required to produce it, calendar schedule required, and distribution of effort and staffing as function of defined set of development life-cycle phases. Written for IBM PC(R)-compatible computers.
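COSTMODL's internal equations are not detailed in this abstract; as a representative sketch only, the basic COCOMO form (organic-mode constants) shows how estimators of this kind map product size to effort, schedule, and staffing.

```python
# Sketch of a COCOMO-style estimate: effort, schedule, and average staff
# from size in thousands of source lines. Organic-mode constants shown;
# COSTMODL's own calibration may differ.
def cocomo_basic(kloc, a=2.4, b=1.05, c=2.5, d=0.38):
    effort_pm = a * kloc ** b          # person-months
    schedule_mo = c * effort_pm ** d   # calendar months
    staff = effort_pm / schedule_mo    # average full-time staff
    return effort_pm, schedule_mo, staff

print(cocomo_basic(32))  # a hypothetical 32 KLOC product
```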
Development of a comprehensive software engineering environment
NASA Technical Reports Server (NTRS)
Hartrum, Thomas C.; Lamont, Gary B.
1987-01-01
The generation of a set of tools for the software lifecycle is a recurring theme in the software engineering literature. The development of such tools and their integration into a software development environment is a difficult task because of the magnitude (number of variables) and the complexity (combinatorics) of the software lifecycle process. Development of a global approach began in 1982 with the Software Development Workbench (SDW). Continuing efforts focus on tool development, tool integration, human interfacing, data dictionaries, and testing algorithms. Current efforts emphasize natural language interfaces, expert system software development associates, and distributed environments with Ada as the target language. The current implementation of the SDW is on a VAX-11/780. Other software development tools are being networked through engineering workstations.
The Role of Empirical Evidence for Transferring a New Technology to Industry
NASA Astrophysics Data System (ADS)
Baldassarre, Maria Teresa; Bruno, Giovanni; Caivano, Danilo; Visaggio, Giuseppe
Technology transfer and innovation diffusion are key success factors for an enterprise. The shift to a new software technology involves, on one hand, inevitable changes to ingrained and familiar processes and, on the other, requires training, changes in practices, and commitment from technical staff and management. Nevertheless, industry is often reluctant to innovate because of the changes innovation entails. The process of innovation diffusion is easier if the new technology is supported by empirical evidence. In this sense, our conjecture is that Empirical Software Engineering (ESE) serves as a means for validating and transferring a new technology within production processes. In this paper, the authors report their experience with a method, the Multiview Framework, defined in the SERLAB research laboratory to support designing and managing a goal-oriented measurement program, which was validated through various empirical studies before being transferred to an Italian SME. Our discussion points out the important role of empirical evidence in obtaining management commitment and buy-in from technical staff, and in making technology transfer possible.
Formal Analysis of the Remote Agent Before and After Flight
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Lowry, Mike; Park, SeungJoon; Pecheur, Charles; Penix, John; Visser, Willem; White, Jon L.
2000-01-01
This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal method tools and the development cycle used by software developers. The Java PathFinder tool which directly translates from Java to PROMELA was developed as part of this research, as well as automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one which was the focus of the first verification effort. A second quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper demonstrates that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that eventually will enable formal methods tools to be inserted directly into the aerospace software development cycle.
Towards an Early Software Effort Estimation Based on Functional and Non-Functional Requirements
NASA Astrophysics Data System (ADS)
Kassab, Mohamed; Daneva, Maya; Ormandjieva, Olga
The increased awareness of non-functional requirements as a key to software project and product success makes explicit the need to include them in any software project effort estimation activity. However, the existing approaches to defining size-based effort relationships still pay insufficient attention to this need. This paper presents a flexible, yet systematic, approach to early requirements-based effort estimation, based on a Non-Functional Requirements ontology. It complementarily uses one standard functional size measurement model and a linear regression technique. We report on a case study which illustrates the application of our solution approach in context and also helps evaluate our experiences in using it.
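A minimal sketch of the estimation step, under the assumption that effort is regressed on a functional size measure plus a count of non-functional requirements; the data points below are invented for illustration.

```python
# Sketch: least-squares fit of effort on functional size and NFR count.
# The paper combines a standard functional size measurement model with
# linear regression; values here are hypothetical.
import numpy as np

size = np.array([120, 300, 180, 450, 90])      # functional size units
nfr = np.array([4, 11, 6, 15, 2])              # non-functional requirements
effort = np.array([300, 910, 500, 1400, 200])  # person-hours

X = np.column_stack([np.ones_like(size), size, nfr])
coef, *_ = np.linalg.lstsq(X, effort, rcond=None)
print(coef)      # intercept, hours per size unit, hours per NFR
print(X @ coef)  # fitted effort for each project
```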
Calculation and use of an environment's characteristic software metric set
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.
1985-01-01
Since both cost/quality goals and production environments differ, this study presents an approach for customizing a characteristic set of software metrics to an environment. The approach is applied in the Software Engineering Laboratory (SEL), a NASA Goddard production environment, to 49 candidate process and product metrics of 652 modules from six projects of 51,000 to 112,000 lines each. For this particular environment, the method yielded the characteristic metric set (source lines, fault correction effort per executable statement, design effort, code effort, number of I/O parameters, number of versions). The uses examined for a characteristic metric set include forecasting the effort for development, modification, and fault correction of modules based on historical data.
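As an illustration of customizing a metric set to local data (not the SEL's exact procedure), the sketch below ranks candidate metrics by the absolute correlation of each with module effort, using hypothetical values.

```python
# Sketch: keep the candidate metrics most strongly correlated with
# development effort in local data. The SEL study screened 49 candidate
# metrics over 652 modules; names and values here are hypothetical.
import numpy as np

effort = np.array([10.0, 25.0, 14.0, 40.0, 8.0, 31.0])
candidates = {
    "source_lines": np.array([200, 600, 300, 900, 150, 700]),
    "io_params":    np.array([3, 9, 5, 14, 2, 11]),
    "versions":     np.array([2, 5, 2, 7, 1, 6]),
}

def corr(x, y):
    return float(np.corrcoef(x, y)[0, 1])

ranked = sorted(candidates, key=lambda m: abs(corr(candidates[m], effort)),
                reverse=True)
print([(m, round(corr(candidates[m], effort), 2)) for m in ranked])
```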
Announcing a Community Effort to Create an Information Model for Research Software Archives
NASA Astrophysics Data System (ADS)
Million, C.; Brazier, A.; King, T.; Hayes, A.
2018-04-01
An effort has started to create recommendations and standards for the archiving of planetary science research software. The primary goal is to define an information model that is consistent with OAIS standards.
Genetic Programming as Alternative for Predicting Development Effort of Individual Software Projects
Chavoya, Arturo; Lopez-Martin, Cuauhtemoc; Andalon-Garcia, Irma R.; Meda-Campaña, M. E.
2012-01-01
Statistical and genetic programming techniques have been used to predict the software development effort of large software projects. In this paper, a genetic programming model was used for predicting the effort required in individually developed projects. Accuracy obtained from a genetic programming model was compared against one generated from the application of a statistical regression model. A sample of 219 projects developed by 71 practitioners was used for generating the two models, whereas another sample of 130 projects developed by 38 practitioners was used for validating them. The models used two kinds of lines of code as well as programming language experience as independent variables. Accuracy results from the model obtained with genetic programming suggest that it could be used to predict the software development effort of individual projects when these projects have been developed in a disciplined manner within a development-controlled environment. PMID:23226305
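For context, a sketch of the statistical baseline side of such a comparison: an ordinary least-squares fit from lines of code to effort, scored with MMRE (mean magnitude of relative error), a common accuracy criterion in this literature. The data are invented; a genetic programming model would instead evolve the functional form itself.

```python
# Sketch: regression baseline for effort prediction, evaluated with MMRE.
# Data values are hypothetical.
import numpy as np

loc = np.array([120, 250, 90, 400, 310, 180])
effort = np.array([9.0, 21.0, 7.5, 35.0, 24.0, 13.0])  # person-hours

slope, intercept = np.polyfit(loc, effort, 1)
pred = slope * loc + intercept
mmre = np.mean(np.abs(effort - pred) / effort)
print(f"effort ~ {slope:.3f}*LOC + {intercept:.2f}, MMRE = {mmre:.2f}")
# A GP model would evolve candidate expressions and keep whichever one
# minimizes a criterion such as MMRE on the training projects.
```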
Reuse Metrics for Object Oriented Software
NASA Technical Reports Server (NTRS)
Bieman, James M.
1998-01-01
One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
1997-12-01
…Watts Humphrey and is described in his book A Discipline for Software Engineering [Humphrey 95]. Its intended use is to guide the planning and… Pat; Humphrey, Watts S.; Khajenoori, Soheil; Macke, Susan; & Matvya, Annette. "Introducing the Personal Software Process: Three Industry Case…" [Humphrey 95] Humphrey, Watts S. A Discipline for Software Engineering. Reading, MA: Addison-Wesley, 1995. [Mauchly 40] Mauchly, J.W. "Significance…
ERIC Educational Resources Information Center
Tan, Andrea; Ferreira, Aldónio
2012-01-01
This study investigates the influence of the use of accounting software in teaching activity-based costing (ABC) on the learning process. It draws upon the Theory of Planned Behaviour and uses the end-user computer satisfaction (EUCS) framework to examine students' satisfaction with the ABC software. The study examines students' satisfaction with…
Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)
NASA Technical Reports Server (NTRS)
Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.
1996-01-01
The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment, and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.
Rosen's (M,R) system as an X-machine.
Palmer, Michael L; Williams, Richard A; Gatherer, Derek
2016-11-07
Robert Rosen's (M,R) system is an abstract biological network architecture that is allegedly both irreducible to sub-models of its component states and non-computable on a Turing machine. (M,R) stands as an obstacle to both reductionist and mechanistic presentations of systems biology, principally due to its self-referential structure. If (M,R) has the properties claimed for it, computational systems biology will not be possible, or at best will be a science of approximate simulations rather than accurate models. Several attempts have been made, at both empirical and theoretical levels, to disprove this assertion by instantiating (M,R) in software architectures. So far, these efforts have been inconclusive. In this paper, we attempt to demonstrate why, by showing how both finite state machine and stream X-machine formal architectures fail to capture the self-referential requirements of (M,R). We then show that a solution may be found in communicating X-machines, which remove self-reference using parallel computation, and then synthesise such machine architectures with object-orientation to create a formal basis for future software instantiations of (M,R) systems.
Designing Control System Application Software for Change
NASA Technical Reports Server (NTRS)
Boulanger, Richard
2001-01-01
The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner that transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change, and the effort of that change, can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.
Organizational management practices for achieving software process improvement
NASA Technical Reports Server (NTRS)
Kandt, Ronald Kirk
2004-01-01
The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.
Development of Alabama traffic factors for use in mechanistic-empirical pavement design.
DOT National Transportation Integrated Search
2015-02-01
The pavement engineering community is moving toward design practices that use mechanistic-empirical (M-E) approaches to the design and analysis of pavement structures. This effort is : embodied in the Mechanistic-Empirical Pavement Design Guide (MEPD...
Improving Software Engineering on NASA Projects
NASA Technical Reports Server (NTRS)
Crumbley, Tim; Kelly, John C.
2010-01-01
Software Engineering Initiative: reduces risk of software failure and increases mission safety; yields more predictable software cost estimates and delivery schedules; makes NASA a smarter buyer of contracted-out software; finds and removes more defects earlier; reduces duplication of effort between projects; and increases the ability to meet the challenges of evolving software technology.
Linking customisation of ERP systems to support effort: an empirical study
NASA Astrophysics Data System (ADS)
Koch, Stefan; Mitteregger, Kurt
2016-01-01
The amount of customisation to an enterprise resource planning (ERP) system has always been a major concern in the context of the implementation. This article focuses on the phase of maintenance and presents an empirical study about the relationship between the amount of customising and the resulting support effort. We establish a structural equation modelling model that explains support effort using customisation effort, organisational characteristics and scope of implementation. The findings using data from an ERP provider show that there is a statistically significant effect: with an increasing amount of customisation, the quantity of telephone calls to support increases, as well as the duration of each call.
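The paper fits a structural equation model; as a deliberately simplified stand-in (plain linear regression, not SEM), the sketch below checks the headline effect of customization effort on support volume, on invented data.

```python
# Simplified stand-in for the paper's SEM: linear fit of support call
# volume against customization effort. Data values are hypothetical.
import numpy as np

custom_days = np.array([5, 40, 12, 80, 25, 60])       # customization effort
support_calls = np.array([14, 95, 30, 210, 58, 150])  # calls per year

slope, intercept = np.polyfit(custom_days, support_calls, 1)
r = np.corrcoef(custom_days, support_calls)[0, 1]
print(f"calls ~ {slope:.2f}*days + {intercept:.1f}, r = {r:.2f}")
```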
Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka
2014-01-01
For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states (emotions and moods) deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint. PMID:24688866
A Change Impact Analysis to Characterize Evolving Program Behaviors
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua
2012-01-01
Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing efforts used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression testing related tasks. We also discuss how DiSE and iDiSE can be configured for debugging: finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks.
Software errors and complexity: An empirical investigation
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Perricone, Berry T.
1983-01-01
The distributions and relationships derived from the change data collected during the development of a medium scale satellite software project show that meaningful results can be obtained which allow an insight into software traits and the environment in which it is developed. Modified and new modules were shown to behave similarly. An abstract classification scheme for errors which allows a better understanding of the overall traits of a software project is also shown. Finally, various size and complexity metrics are examined with respect to errors detected within the software yielding some interesting results.
Software errors and complexity: An empirical investigation
NASA Technical Reports Server (NTRS)
Basili, V. R.; Perricone, B. T.
1982-01-01
The distributions and relationships derived from the change data collected during the development of a medium scale satellite software project show that meaningful results can be obtained which allow an insight into software traits and the environment in which it is developed. Modified and new modules were shown to behave similarly. An abstract classification scheme for errors which allows a better understanding of the overall traits of a software project is also shown. Finally, various size and complexity metrics are examined with respect to errors detected within the software yielding some interesting results.
State of the art metrics for aspect oriented programming
NASA Astrophysics Data System (ADS)
Ghareb, Mazen Ismaeel; Allen, Gary
2018-04-01
The quality evaluation of software, e.g., defect measurement, gains significance with the increasing use of software applications. Metric measurements are treated as primary indicators for defect prediction and software maintenance in various empirical studies of software products. However, there is no agreement on which metrics are compelling quality indicators for novel development approaches such as Aspect Oriented Programming (AOP). AOP intends to enhance programming quality by providing new and novel constructs for the development of systems, for example point cuts, advice, and inter-type relationships. Hence, it is not evident whether quality indicators for AOP can be derived from direct extensions of traditional OO measurements. On the other hand, investigations of AOP do regularly depend on established coupling measurements. Notwithstanding the late adoption of AOP in empirical studies, coupling measurements have been adopted as useful markers of fault proneness in this context. In this paper we investigate the state-of-the-art metrics for measurement of Aspect Oriented systems development.
Proceedings of the Thirteenth Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1988-01-01
Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.
Language and Program for Documenting Software Design
NASA Technical Reports Server (NTRS)
Kleine, H.; Zepko, T. M.
1986-01-01
Software Design and Documentation Language (SDDL) provides effective communication medium to support design and documentation of complex software applications. SDDL supports communication among all members of software design team and provides for production of informative documentation on design effort. Use of SDDL-generated document to analyze design makes it possible to eliminate many errors not detected until coding and testing attempted. SDDL processor program translates designer's creative thinking into effective document for communication. Processor performs as many automatic functions as possible, freeing designer's energy for creative effort. SDDL processor program written in PASCAL.
ERIC Educational Resources Information Center
Mukala, Patrick; Cerone, Antonio; Turini, Franco
2017-01-01
Free/Libre Open Source Software (FLOSS) environments are increasingly dubbed learning environments where practical software engineering skills can be acquired. Numerous studies have extensively investigated how knowledge is acquired in these environments through a collaborative learning model that defines a learning process. Such a learning…
Criteria for software modularization
NASA Technical Reports Server (NTRS)
Card, David N.; Page, Gerald T.; Mcgarry, Frank E.
1985-01-01
A central issue in programming practice involves determining the appropriate size and information content of a software module. This study attempted to determine the effectiveness of two widely used criteria for software modularization, strength and size, in reducing fault rate and development cost. Data from 453 FORTRAN modules developed by professional programmers were analyzed. The results indicated that module strength is a good criterion with respect to fault rate, whereas arbitrary module size limitations inhibit programmer productivity. This analysis is a first step toward defining empirically based standards for software modularization.
The Future of Digital Working: Knowledge Migration and Learning
ERIC Educational Resources Information Center
Malcolm, Irene
2014-01-01
Against the backdrop of intensified migration linked to globalisation, this article considers the implications of knowledge migration for future digital workers. It draws empirically on a socio-material analysis of the international software localisation industry. Localisers' work requires linguistic, cultural and software engineering skills to…
P1198: software for tracing decision behavior in lending to small businesses.
Andersson, P
2001-05-01
This paper describes a process-tracing software program specially designed to capture decision behavior in lending to small businesses. The source code was written in Lotus Notes. The software runs in a Web browser and consists of two interacting systems: a database and a user interface. The database includes three realistic loan applications. The user interface consists of different but interacting screens that enable the participant to operate the software. Log files register the decision behavior of the participant. An empirical example is presented in order to show the software's potential in providing insights into judgment and decision making. The implications of the software are discussed.
GESTALT: A Framework for Redesign of Educational Software
ERIC Educational Resources Information Center
Puustinen, M.; Baker, M.; Lund, K.
2006-01-01
Design of educational multimedia rarely starts from scratch, but rather by attempting to reuse existing software. Although redesign has been an issue in research on evaluation and on learning objects, how it should be carried out in a principled way has remained relatively unexplored. Furthermore, understanding how empirical research on…
Analysis of Cross-Cultural Online Collaborative Learning with Social Software
ERIC Educational Resources Information Center
Law, Effie Lai-Chong; Nguyen-Ngoc, Anh Vu
2010-01-01
Purpose: The rising popularity of social software poses challenges to the design and evaluation of pedagogically sound cross-cultural online collaborative learning environments (OCLEs). In the literature of computer-mediated communications, there exist only a limited number of related empirical studies, indicating that it is still an emergent…
An Empirical Study on Students' Ability to Comprehend Design Patterns
ERIC Educational Resources Information Center
Chatzigeorgiou, Alexander; Tsantalis, Nikolaos; Deligiannis, Ignatios
2008-01-01
Design patterns have become a widely acknowledged software engineering practice and therefore have been incorporated in the curricula of most computer science departments. This paper presents an observational study on students' ability to understand and apply design patterns. Within the context of a postgraduate software engineering course,…
Top 10 metrics for life science software good practices.
Artaza, Haydee; Chue Hong, Neil; Corpas, Manuel; Corpuz, Angel; Hooft, Rob; Jimenez, Rafael C; Leskošek, Brane; Olivier, Brett G; Stourac, Jan; Svobodová Vařeková, Radka; Van Parys, Thomas; Vaughan, Daniel
2016-01-01
Metrics for assessing adoption of good development practices are a useful way to ensure that software is sustainable, reusable and functional. Sustainability means that the software used today will be available - and continue to be improved and supported - in the future. We report here an initial set of metrics that measure good practices in software development. This initiative differs from previously developed efforts in being a community-driven grassroots approach where experts from different organisations propose good software practices that have reasonable potential to be adopted by the communities they represent. We not only focus our efforts on understanding and prioritising good practices, we assess their feasibility for implementation and publish them here. PMID:27635232
Understanding and Predicting the Process of Software Maintenance Releases
NASA Technical Reports Server (NTRS)
Basili, Victor; Briand, Lionel; Condon, Steven; Kim, Yong-Mi; Melo, Walcelio L.; Valett, Jon D.
1996-01-01
One of the major concerns of any maintenance organization is to understand and estimate the cost of maintenance releases of software systems. Planning the next release so as to maximize the increase in functionality and the improvement in quality is vital to successful maintenance management. The objective of this paper is to present the results of a case study in which an incremental approach was used to better understand the effort distribution of releases and build a predictive effort model for software maintenance releases. This study was conducted in the Flight Dynamics Division (FDD) of NASA Goddard Space Flight Center (GSFC). This paper presents three main results: (1) a predictive effort model developed for the FDD's software maintenance release process; (2) measurement-based lessons learned about the maintenance process in the FDD; and (3) a set of lessons learned about the establishment of a measurement-based software maintenance improvement program. In addition, this study provides insights and guidelines for obtaining similar results in other maintenance organizations.
Infusing Software Engineering Technology into Practice at NASA
NASA Technical Reports Server (NTRS)
Pressburger, Thomas; Feather, Martin S.; Hinchey, Michael; Markosian, Lawrence
2006-01-01
We present an ongoing effort of the NASA Software Engineering Initiative to encourage the use of advanced software engineering technology on NASA projects. Technology infusion is in general a difficult process yet this effort seems to have found a modest approach that is successful for some types of technologies. We outline the process and describe the experience of the technology infusions that occurred over a two year period. We also present some lessons from the experiences.
Empirical Histograms in Item Response Theory with Ordinal Data
ERIC Educational Resources Information Center
Woods, Carol M.
2007-01-01
The purpose of this research is to describe, test, and illustrate a new implementation of the empirical histogram (EH) method for ordinal items. The EH method involves the estimation of item response model parameters simultaneously with the approximation of the distribution of the random latent variable (theta) as a histogram. Software for the EH…
University Software Ownership and Litigation: A First Examination*
Rai, Arti K.; Allison, John R.; Sampat, Bhaven N.
2013-01-01
Software patents and university-owned patents represent two of the most controversial intellectual property developments of the last twenty-five years. Despite this reality, and concerns that universities act as “patent trolls” when they assert software patents in litigation against successful commercializers, no scholar has systematically examined the ownership and litigation of university software patents. In this Article, we present the first such examination. Our empirical research reveals that software patents represent a significant and growing proportion of university patent holdings. Additionally, the most important determinant of the number of software patents a university owns is not its research and development (“R&D”) expenditures (whether computer science-related or otherwise) but, rather, its tendency to seek patents in other areas. In other words, universities appear to take a “one size fits all” approach to patenting their inventions. This one size fits all approach is problematic given the empirical evidence that software is likely to follow a different commercialization path than other types of invention. Thus, it is perhaps not surprising that we see a number of lawsuits in which university software patents have been used not for purposes of fostering commercialization, but instead, to extract rents in apparent holdup litigation. The Article concludes by examining whether this trend is likely to continue in the future, particularly given a 2006 Supreme Court decision that appears to diminish the holdup threat by recognizing the possibility of liability rules in patent suits, as well as recent case law that may call into question certain types of software patents. PMID:23750052
Empirical studies of design software: Implications for software engineering environments
NASA Technical Reports Server (NTRS)
Krasner, Herb
1988-01-01
The empirical studies team of MCC's Design Process Group conducted three studies in 1986-87 in order to gather data on professionals designing software systems in a range of situations. The first study (the Lift Experiment) used thinking-aloud protocols in a controlled laboratory setting to study the cognitive processes of individual designers. The second study (the Object Server Project) involved the observation, videotaping, and data collection of a design team of a medium-sized development project over several months in order to study team dynamics. The third study (the Field Study) involved interviews with the personnel from 19 large development projects at the MCC shareholder companies in order to study how the process of design is affected by organizational and project behavior. The focus of this report is on key observations of the design process (at several levels) and their implications for the design of environments.
Theory and Practice Meets in Industrial Process Design -Educational Perspective-
NASA Astrophysics Data System (ADS)
Aramo-Immonen, Heli; Toikka, Tarja
A software engineer should see himself as a business process designer in an enterprise resource planning (ERP) system re-engineering project, and software engineers and managers should engage in design dialogue. The objective of this paper is to discuss the motives for studying design research in connection with management education, in order to envision and understand the soft human issues in the management context. A second goal is to develop means of practicing social skills between designers and managers. This article explores the affective components of design thinking in the industrial management domain. The conceptual part of this paper discusses the concepts of network and project economy, creativity, communication, use of metaphors, and design thinking. Finally, the empirical research plan and first empirical results are introduced from design-method experiments among multi-disciplined groups of master-level students of industrial engineering and management and software engineering.
Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches
NASA Technical Reports Server (NTRS)
Farassat, Fereidoun; Casper, Jay H.
2006-01-01
In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on a recent high-quality acoustic database is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with penetrable data surface (FW-Hpds) has advanced considerably, and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need further development for this method to become useful. Nonetheless, the authors propose that the method based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be worked on. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.
Evolution of Secondary Software Businesses: Understanding Industry Dynamics
NASA Astrophysics Data System (ADS)
Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko
Primary software industry originates from IBM's decision to unbundle software-related computer system development activities to external partners. This kind of outsourcing from an enterprise internal software development activity is a common means to start a new software business serving a vertical software market. It combines knowledge of the vertical market process with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry, in order to quantify its interaction with the primary software industry during the period of 2000-2003. On the basis of the empirical data, we present a model for evolution of a secondary software business, which makes explicit the industry dynamics. It represents the shift from internal software developed for competitive advantage to development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.
Metric analysis and data validation across FORTRAN projects
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.; Phillips, Tsai-Yun
1983-01-01
The desire to predict the effort of developing software, or to explain its quality, has led to the proposal of several metrics. As a step toward validating these metrics, the Software Engineering Laboratory (SEL) has analyzed the software science metrics, cyclomatic complexity, and various standard program measures for their relation to effort (including design through acceptance testing), development errors (both discrete and weighted according to the amount of time to locate and fix), and one another. The data investigated were collected from a production FORTRAN environment and examined across several projects at once, within individual projects, and with reporting accuracy checks demonstrating the need to validate a database. When the data come from individual programmers or certain validated projects, the metrics' correlations with actual effort seem to be strongest. For modules developed entirely by individual programmers, the validity ratios induce a statistically significant ordering of several of the metrics' correlations. When comparing the strongest correlations, neither software science's E metric, cyclomatic complexity, nor source lines of code appears to relate convincingly better with effort than the others.
NASA Technical Reports Server (NTRS)
Avila, Edwin M. Martinez; Muniz, Ricardo; Szafran, Jamie; Dalton, Adam
2011-01-01
Lines of code (LOC) analysis is one of the methods used to measure programmer productivity and estimate schedules of programming projects. The Launch Control System (LCS) had previously used this method to estimate the amount of work and to plan development efforts. The disadvantage of using LOC as a measure of effort is that only 30% to 35% of the total effort of software projects involves coding [8]; in this application, function points are used instead of LOC for a better estimation of the hours required to develop each piece of software. Because of these disadvantages, Jamie Szafran of the System Software Branch of Control And Data Systems (NE-C3) at Kennedy Space Center developed a web application called Function Point Analysis (FPA) Depot. The objective of this web application is that the LCS software architecture team can use the data to more accurately estimate the effort required to implement customer requirements. This paper describes the evolution of the domain model used for function point analysis as project managers continually strive to generate more accurate estimates.
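A sketch of an unadjusted function point count using the standard IFPUG average-complexity weights; the FPA Depot's own counting rules and calibration are not described in the abstract, so the counts and the hours-per-FP rate below are hypothetical.

```python
# Sketch: unadjusted function point (UFP) count from the five IFPUG
# element types, using average-complexity weights. Counts hypothetical.
WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_files": 10,
    "interface_files": 7,
}

def unadjusted_fp(counts):
    return sum(WEIGHTS[k] * counts.get(k, 0) for k in WEIGHTS)

counts = {"external_inputs": 12, "external_outputs": 8,
          "external_inquiries": 5, "internal_files": 4,
          "interface_files": 2}
ufp = unadjusted_fp(counts)
print(ufp, "UFP ->", ufp * 8, "hours at a hypothetical 8 h/FP rate")
```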
The Business Case for Automated Software Engineering
NASA Technical Reports Server (NTRS)
Menzies, Tim; Elrawas, Oussama; Hihn, Jairus M.; Feather, Martin S.; Madachy, Ray; Boehm, Barry
2007-01-01
Adoption of advanced automated SE (ASE) tools would be more favored if a business case could be made that these tools are more valuable than alternate methods. In theory, software prediction models can be used to make that case. In practice, this is complicated by the 'local tuning' problem: normally, predictors for software effort, defects, and threats use local data to tune their predictions, and such local tuning data is often unavailable. This paper shows that assessing the relative merits of different SE methods need not require precise local tunings. STAR1 is a simulated annealer plus a Bayesian post-processor that explores the space of possible local tunings within software prediction models. STAR1 ranks project decisions by their effects on effort, defects, and threats. In experiments with NASA systems, STAR1 found one project where ASE tools were essential for minimizing effort, defects, and threats, and another project where ASE tools were merely optional.
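A toy version of the underlying idea (not STAR1's algorithm): sample the space of plausible local tunings and check how often one option beats another. If the ranking holds across essentially all sampled tunings, it does not depend on precise local calibration. The effort and defect models and all multipliers below are invented for illustration.

```python
# Sketch: rank two project options under uncertain local tunings by
# Monte Carlo sampling. All models and constants are hypothetical.
import random

random.seed(1)

def cost(effort_mult, defect_mult, a, b, w, kloc=50):
    effort = a * kloc ** b * effort_mult  # COCOMO-like effort model
    defects = 10 * kloc * defect_mult     # hypothetical defect model
    return effort + w * defects           # w: business weight per defect

wins = 0
TRIALS = 10_000
for _ in range(TRIALS):
    a, b = random.uniform(2, 3), random.uniform(0.9, 1.2)  # unknown tuning
    w = random.uniform(0.05, 0.5)                          # unknown weight
    with_ase = cost(1.10, 0.60, a, b, w)   # tools add effort overhead
    without = cost(1.00, 1.00, a, b, w)    # but leave more defects
    wins += with_ase < without
print(f"ASE option preferred in {wins / TRIALS:.0%} of sampled tunings")
```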
Scaffolding Executive Function Capabilities via Play-&-Learn Software for Preschoolers
ERIC Educational Resources Information Center
Axelsson, Anton; Andersson, Richard; Gulz, Agneta
2016-01-01
Educational software in the form of games or so called "computer assisted intervention" for young children has become increasingly common receiving a growing interest and support. Currently there are, for instance, more than 1,000 iPad apps tagged for preschool. Thus, it has become increasingly important to empirically investigate…
Examining the Influence of Educational Mobile Application Software on Students' Technology Literacy
ERIC Educational Resources Information Center
Twu, Ming-Lii
2017-01-01
The purpose of this mixed methods study was to employ the International Society for Technology in Education (ISTE) Standards for Students as taxonomy to classify educational mobile application (app) software into seven categories and empirically examine the influence on students' technology literacy. A purposeful sample of fifth grade core subject…
Bringing Science to Bear: An Empirical Assessment of the Comprehensive Soldier Fitness Program
ERIC Educational Resources Information Center
Lester, Paul B.; McBride, Sharon; Bliese, Paul D.; Adler, Amy B.
2011-01-01
This article outlines the U.S. Army's effort to empirically validate and assess the Comprehensive Soldier Fitness (CSF) program. The empirical assessment includes four major components. First, the CSF scientific staff is currently conducting a longitudinal study to determine if the Master Resilience Training program and the Comprehensive…
Analyzing qualitative data with computer software.
Weitzman, E A
1999-01-01
OBJECTIVE: To provide health services researchers with an overview of the qualitative data analysis process and the role of software within it; to provide a principled approach to choosing among software packages to support qualitative data analysis; to alert researchers to the potential benefits and limitations of such software; and to provide an overview of the developments to be expected in the field in the near future. DATA SOURCES, STUDY DESIGN, METHODS: This article does not include reports of empirical research. CONCLUSIONS: Software for qualitative data analysis can benefit the researcher in terms of speed, consistency, rigor, and access to analytic methods not available by hand. Software, however, is not a replacement for methodological training. PMID:10591282
Empirical tests of Zipf's law mechanism in open source Linux distribution.
Maillart, T; Sornette, D; Spaeth, S; von Krogh, G
2008-11-21
Zipf's power law is a ubiquitous empirical regularity found in many systems, thought to result from proportional growth. Here, we establish empirically the usually assumed ingredients of stochastic growth models that have been previously conjectured to be at the origin of Zipf's law. We use exceptionally detailed data on the evolution of open source software projects in Linux distributions, which offer a remarkable example of a growing complex self-organizing adaptive system, exhibiting Zipf's law over four full decades.
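A sketch of the kind of check involved: fit the slope of log(size) against log(rank); Zipf's law predicts a slope near -1. Synthetic heavy-tailed data stand in for the Linux package measurements used in the paper.

```python
# Sketch: rank-size test for Zipf's law on synthetic heavy-tailed data.
import numpy as np

rng = np.random.default_rng(42)
sizes = rng.pareto(1.0, 5000) + 1   # heavy-tailed synthetic "sizes"
sizes = np.sort(sizes)[::-1]        # rank-order, largest first
ranks = np.arange(1, sizes.size + 1)

slope, intercept = np.polyfit(np.log(ranks), np.log(sizes), 1)
print(f"rank-size slope = {slope:.2f}")  # Zipf's law predicts ~ -1
```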
The evolution of CMS software performance studies
NASA Astrophysics Data System (ADS)
Kortelainen, M. J.; Elmer, P.; Eulisse, G.; Innocente, V.; Jones, C. D.; Tuura, L.
2011-12-01
CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on cleaning up many issues arising from basic C++ errors, namely reducing dynamic memory churn and unnecessary copies/temporaries, and on tools to routinely monitor these things. Over the past 1.5 years, however, the transition to 64-bit, newer versions of the gcc compiler, newer tools, and the enabling of techniques like vectorization have made possible more sophisticated improvements to the software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.
ERIC Educational Resources Information Center
Radulescu, Iulian Ionut
2006-01-01
Software complexity is the most important software quality attribute and a very useful instrument in the study of software quality. It is one of the factors that affect most of the software quality characteristics, including maintainability. It is very important to quantify this influence and identify the means to keep it under control; by using…
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Pressburger, Thomas; Markosian, Lawrence; Feather, Martin S.
2006-01-01
New processes, methods, and tools are constantly appearing in the field of software engineering. Many of these show great potential for improving software development processes, resulting in higher quality software with greater levels of assurance. However, there are a number of obstacles that impede their infusion into software development practices. These are the recurring obstacles common to many forms of research. Practitioners cannot readily identify the emerging techniques that may most benefit them, and cannot afford to risk time and effort in evaluating and experimenting with them while there is still uncertainty about whether they will pay off in this particular context. Similarly, researchers cannot readily identify those practitioners whose problems would be amenable to their techniques, and they lack the feedback from practical applications necessary to help them evolve their techniques to make them more likely to be successful. This paper describes an ongoing effort conducted by a software engineering research infusion team, and the NASA Research Infusion Initiative, established by NASA's Software Engineering Initiative, to overcome these obstacles.
Second generation experiments in fault tolerant software
NASA Technical Reports Server (NTRS)
Knight, J. C.
1987-01-01
The purpose of the Multi-Version Software (MVS) experiment is to obtain empirical measurements of the performance of multi-version systems. Twenty versions of a program were prepared under reasonably realistic development conditions from the same specifications. The overall structure of the testing environment for the MVS experiment and its status are described. A preliminary version of the control system implemented for the MVS experiment is described, which allows the experimenter control over the details of the testing. The results of an empirical study of error detection using self-checks are also presented. The analysis of the checks revealed that there are great differences in the ability of individual programmers to design effective checks.
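For context, a minimal sketch of how a multi-version system adjudicates outputs at run time, by majority vote over independently developed versions; the three toy versions below (one with a seeded fault) are hypothetical stand-ins for the experiment's twenty programs.

```python
# Sketch: N-version majority voting with a unanimity flag for
# disagreement detection. Version functions are hypothetical.
from collections import Counter

def version_a(x): return x * x
def version_b(x): return x * x
def version_c(x): return x * x + (1 if x == 7 else 0)  # seeded fault

def majority_vote(inputs, versions):
    results = []
    for x in inputs:
        outputs = Counter(v(x) for v in versions)
        value, votes = outputs.most_common(1)[0]
        results.append((x, value, votes == len(versions)))
    return results

for x, value, unanimous in majority_vote([3, 7, 10],
                                         [version_a, version_b, version_c]):
    print(x, value, "unanimous" if unanimous else "disagreement detected")
```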
The Waterfall Model in Large-Scale Development
NASA Astrophysics Data System (ADS)
Petersen, Kai; Wohlin, Claes; Baca, Dejan
Waterfall development is still a widely used way of working in software development companies. Many problems have been reported related to the model. Commonly accepted problems are for example to cope with change and that defects all too often are detected too late in the software development process. However, many of the problems mentioned in literature are based on beliefs and experiences, and not on empirical evidence. To address this research gap, we compare the problems in literature with the results of a case study at Ericsson AB in Sweden, investigating issues in the waterfall model. The case study aims at validating or contradicting the beliefs of what the problems are in waterfall development through empirical research.
NASA Technical Reports Server (NTRS)
Picasso, G. O.; Basili, V. R.
1982-01-01
It is noted that previous investigations into the applicability of the Rayleigh curve model to medium-scale software development efforts have met with mixed results. The results of these investigations are confirmed by analyses of runs and smoothing. The reasons for the model's failure are found in the subcycle effort data. There are four contributing factors: the uniqueness of the environment studied, the influence of holidays, varying management techniques, and differences in the data studied.
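For reference, the Rayleigh curve model in its usual Norden/Putnam form: the staffing rate is m(t) = 2*K*a*t*exp(-a*t^2), with total effort K and shape parameter a = 1/(2*td^2) for a peak at time td. The sketch below evaluates it on invented values.

```python
# Sketch: Rayleigh staffing curve and cumulative effort. Values are
# hypothetical (120 person-months total, staffing peak at month 6).
import math

K, td = 120.0, 6.0
a = 1.0 / (2.0 * td * td)

def staffing(t):
    return 2.0 * K * a * t * math.exp(-a * t * t)   # person-months/month

def cumulative(t):
    return K * (1.0 - math.exp(-a * t * t))         # effort spent by t

for month in range(0, 19, 3):
    print(month, round(staffing(month), 2), round(cumulative(month), 1))
```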
An Uncertainty Structure Matrix for Models and Simulations
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Blattnig, Steve R.; Hemsch, Michael J.; Luckring, James M.; Tripathi, Ram K.
2008-01-01
Software that is used for aerospace flight control and to display information to pilots and crew is expected to be correct and credible at all times. This type of software is typically developed under strict management processes, which are intended to reduce defects in the software product. However, modeling and simulation (M&S) software may exhibit varying degrees of correctness and credibility, depending on a large and complex set of factors. These factors include its intended use, the known physics and numerical approximations within the M&S, and the referent data set against which the M&S correctness is compared. The correctness and credibility of an M&S effort is closely correlated to the uncertainty management (UM) practices that are applied to the M&S effort. This paper describes an uncertainty structure matrix for M&S, which provides a set of objective descriptions for the possible states of UM practices within a given M&S effort. The columns in the uncertainty structure matrix contain UM elements or practices that are common across most M&S efforts, and the rows describe the potential levels of achievement in each of the elements. A practitioner can quickly look at the matrix to determine where an M&S effort falls based on a common set of UM practices that are described in absolute terms that can be applied to virtually any M&S effort. The matrix can also be used to plan those steps and resources that would be needed to improve the UM practices for a given M&S effort.
SCaN Testbed Software Development and Lessons Learned
NASA Technical Reports Server (NTRS)
Kacpura, Thomas J.; Varga, Denise M.
2012-01-01
The National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR)/Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. SDRs are a new technology for NASA, and the support infrastructure they require is different from that of legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not previously an option. They are not stand-alone devices, but require a new approach to effectively control them and flow data, and extensive software had to be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration, and testing as related to the avionics processor system, and on the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also includes ground software, which is a key element both for commanding the payload and for displaying data created by the payload. The verification of the software was an extensive effort, and the challenges of specifying a suitable test matrix for reconfigurable systems that offer numerous configurations are highlighted. Since flight system testing requires methodical, controlled testing that limits risk, a ground system nearly identical to the on-orbit flight system was required to develop the software and write verification procedures before installation and testing on the flight system. The development of the SCaN Testbed was an accelerated effort to meet launch constraints, and this paper discusses tradeoffs made to balance needed software functionality while still maintaining the schedule. Future upgrades are discussed that optimize the avionics and allow experimenters to utilize the SCaN Testbed's full potential.
Photometric Modeling of Simulated Surface-Resolved Bennu Images
NASA Astrophysics Data System (ADS)
Golish, D.; DellaGiustina, D. N.; Clark, B.; Li, J. Y.; Zou, X. D.; Bennett, C. A.; Lauretta, D. S.
2017-12-01
The Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) is a NASA mission to study and return a sample of asteroid (101955) Bennu. Imaging data from the mission will be used to develop empirical surface-resolved photometric models of Bennu at a series of wavelengths. These models will be used to photometrically correct panchromatic and color base maps of Bennu, compensating for variations due to shadows and photometric angle differences, thereby minimizing seams in mosaicked images. Well-corrected mosaics are critical to the generation of a global hazard map and a global 1064-nm reflectance map which predicts LIDAR response. These data products directly feed into the selection of a site from which to safely acquire a sample. We also require photometric correction for the creation of color ratio maps of Bennu. Color ratio maps provide insight into the composition and geological history of the surface and allow for comparison to other Solar System small bodies. In advance of OSIRIS-REx's arrival at Bennu, we use simulated images to judge the efficacy of both the photometric modeling software and the mission observation plan. Our simulation software is based on USGS's Integrated Software for Imagers and Spectrometers (ISIS) and uses a synthetic shape model, a camera model, and an empirical photometric model to generate simulated images. This approach gives us the flexibility to create simulated images of Bennu based on analog surfaces from other small Solar System bodies and to test our modeling software under those conditions. Our photometric modeling software fits image data to several conventional empirical photometric models and produces the best fit model parameters. The process is largely automated, which is crucial to the efficient production of data products during proximity operations. The software also produces several metrics on the quality of the observations themselves, such as surface coverage and the completeness of the data set for evaluating the phase and disk functions of the surface. Application of this software to simulated mission data has revealed limitations in the initial mission design, which has fed back into the planning process. The entire photometric pipeline further serves as an exercise of planned activities for proximity operations.
Design and Empirical Evaluation of Search Software for Legal Professionals on the WWW.
ERIC Educational Resources Information Center
Dempsey, Bert J.; Vreeland, Robert C.; Sumner, Robert G., Jr.; Yang, Kiduk
2000-01-01
Discussion of effective search aids for legal researchers on the World Wide Web focuses on the design and evaluation of two software systems developed to explore models for browsing and searching across a user-selected set of Web sites. Describes crawler-enhanced search engines, filters, distributed full-text searching, and natural language…
Analysis of Learning Behavior in a Flipped Programing Classroom Adopting Problem-Solving Strategies
ERIC Educational Resources Information Center
Chiang, Tosti Hsu-Cheng
2017-01-01
Programing is difficult for beginners because they need to learn the new language of computers. Developing software, especially complex software, is bound to result in problems, frustration, and the need to think in new ways. Identifying the learning behavior behind programing by way of empirical studies can help beginners learn more easily. In…
ERIC Educational Resources Information Center
Chen, Liqiang; Keys, Anthony; Gaber, Donald
2015-01-01
It is a challenge for business students or even employees to understand business processes and enterprise software usage without involvement in real-world practices. Many business schools are using ERP software in their curriculum, aiming to expose students to real-world business practices. ERPsim is an Enterprise Resource Planning (ERP)…
Managing configuration software of ground software applications with glueware
NASA Technical Reports Server (NTRS)
Larsen, B.; Herrera, R.; Sesplaukis, T.; Cheng, L.; Sarrel, M.
2003-01-01
This paper reports on a simple, low-cost effort to streamline the configuration of the uplink software tools. Even though the existing ground system consisted of JPL and custom Cassini software rather than COTS, we chose a glueware approach--reintegrating with wrappers and bridges and adding minimal new functionality.
User's Guide for the MapImage Reprojection Software Package, Version 1.01
Finn, Michael P.; Trent, Jason R.
2004-01-01
Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets (such as 30-m data) for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Recently, Usery and others (2003a) expanded on the previously limited empirical work with real geographic data by compiling and tabulating the accuracy of categorical areas in projected raster datasets of global extent. Geographers and applications programmers at the U.S. Geological Survey's (USGS) Mid-Continent Mapping Center (MCMC) undertook an effort to expand and evolve an internal USGS software package, MapImage, or mapimg, for raster map projection transformation (Usery and others, 2003a). Daniel R. Steinwand of Science Applications International Corporation, Earth Resources Observation Systems Data Center in Sioux Falls, S. Dak., originally developed mapimg for the USGS, basing it on the USGS's General Cartographic Transformation Package (GCTP). It operated as a command line program on the Unix operating system. Through efforts at MCMC, and in coordination with Mr. Steinwand, this program has been transformed from an application based on a command line into a software package based on a graphical user interface for Windows, Linux, and Unix machines. Usery and others (2003b) pointed out that many commercial software packages do not use exact projection equations and that even when exact projection equations are used, the software often results in error and sometimes does not complete the transformation for specific projections, at specific resampling resolutions, and for specific singularities. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in these software packages, but implementation with data other than points requires specific adaptation of the equations or prior preparation of the data to allow the transformation to succeed. Additional constraints apply to global raster data. It appears that some packages use the USGS's GCTP or similar point transformations without adaptation to the specific characteristics of raster data (Usery and others, 2003b). It is most common for programs to compute transformations of raster data in an inverse fashion. Such mapping can result in an erroneous position and replicate data or create pixels not in the original space. As Usery and others (2003a) indicated, mapimg performs a corresponding forward transformation to ensure the same location results from both methods. The primary benefit of this function is to mask cells outside the domain. MapImage 1.01 is now on the Web. You can download the User's Guide, source, and binaries from the following site: http://mcmcweb.er.usgs.gov/carto_research/projection/acc_proj_data.html
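As a rough illustration of the inverse-plus-forward-check approach described above (a minimal sketch using a toy sinusoidal projection, not GCTP's actual equations): each output cell is inverse-projected to a source location, then forward-projected again, and the cell is masked when the round trip does not agree.

    import math

    def forward(lon, lat):                      # toy sinusoidal projection
        return lon * math.cos(lat), lat

    def inverse(x, y):
        c = math.cos(y)
        if abs(c) < 1e-9:                       # projection undefined at the poles
            return None
        return x / c, y

    def reproject(out_coords, tol=1e-6):
        kept, masked = [], []
        for x, y in out_coords:
            src = inverse(x, y)
            if src is None:
                masked.append((x, y))
                continue
            fx, fy = forward(*src)
            # forward check: keep the cell only if the round trip agrees
            (kept if abs(fx - x) < tol and abs(fy - y) < tol else masked).append((x, y))
        return kept, masked

    print(reproject([(0.5, 0.0), (0.5, math.pi / 2)]))   # second cell is masked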
ERIC Educational Resources Information Center
Birchard, Marcy; Dye, Charles; Gordon, John
With limits on both personnel and time available to conduct effective instruction, the decision is being made increasingly to enhance instructor-led courses with Computer-Based Training (CBT). The effectiveness of this conversion is often unknown and in many cases empirical evaluations are never conducted. This paper describes and discusses the…
The Ideologies of American Social Critics: An Empirical Test of Kadushin's Theory
ERIC Educational Resources Information Center
Simon, David R.
1977-01-01
Examines Kadushin's earlier empirical efforts to determine the leading social critics and organizations of social criticism in America and investigates his theory through content analysis of leading journals of social criticism. (MH)
Ground System Harmonization Efforts at NASA's Goddard Space Flight Center
NASA Technical Reports Server (NTRS)
Smith, Dan
2011-01-01
This slide presentation reviews the efforts made at Goddard Space Flight Center in harmonizing ground systems to assist in collaboration in space ventures. The key elements of this effort are: (1) moving to a common framework; (2) use of Consultative Committee for Space Data Systems (CCSDS) standards; (3) collaboration across NASA centers; and (4) collaboration across industry and other space organizations. These efforts work to bring the GSFC systems into harmony with CCSDS standards to allow for common software, use of Commercial Off-The-Shelf software, and low-risk development and operations, and also to work toward harmonization with other NASA centers.
Real-time LMR control parameter generation using advanced adaptive synthesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
King, R.W.; Mott, J.E.
1990-01-01
The "reactor delta T", the difference between the average core inlet and outlet temperatures for the liquid-sodium-cooled Experimental Breeder Reactor II (EBR-II), is empirically synthesized in real time from a multitude of examples of past reactor operation. The real-time empirical synthesis is based on system state analysis (SSA) technology embodied in software on the EBR-II data acquisition computer. Before the real-time system is put into operation, a selection of reactor plant measurements is made that are predictable over long periods encompassing plant shutdowns, core reconfigurations, core load changes, and plant startups. A serial data link to a personal computer containing SSA software allows the rapid verification of the predictability of these plant measurements via graphical means. After the selection is made, the real-time synthesis provides a fault-tolerant estimate of the reactor delta T accurate to ±1%. 5 refs., 7 figs.
WFF TOPEX Software Documentation Overview, May 1999. Volume 2
NASA Technical Reports Server (NTRS)
Brooks, Ronald L.; Lee, Jeffrey
2003-01-01
This document provides an overview of software development activities and the resulting products and procedures developed by the TOPEX Software Development Team (SWDT) at Wallops Flight Facility, in support of the WFF TOPEX Engineering Assessment and Verification efforts.
WISE: Automated support for software project management and measurement. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ramakrishnan, Sudhakar
1995-01-01
One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication among users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.
Public Education in the Russian Empire at the End of the 19th Century
ERIC Educational Resources Information Center
Shevchenko, Natal'ya A.; Kucherkov, Ivan A.; Shirev, Denis A.; Miku, Natal'ya V.
2018-01-01
The paper reviews primary education in the Russian Empire at the end of the 19th century. It focuses on describing the successes and shortcomings of the public education system, as well as identifying the causes of its poor efficiency. As a summary, the authors concluded that the government of the Russian Empire consolidated major efforts to…
Government Technology Acquisition Policy: The Case of Proprietary versus Open Source Software
ERIC Educational Resources Information Center
Hemphill, Thomas A.
2005-01-01
This article begins by explaining the concepts of proprietary and open source software technology, which are now competing in the marketplace. A review of recent individual and cooperative technology development and public policy advocacy efforts, by both proponents of open source software and advocates of proprietary software, subsequently…
Software Size Estimation Using Expert Estimation: A Fuzzy Logic Approach
ERIC Educational Resources Information Center
Stevenson, Glenn A.
2012-01-01
For decades software managers have been using formal methodologies such as the Constructive Cost Model and Function Points to estimate the effort of software projects during the early stages of project development. While some research shows these methodologies to be effective, many software managers feel that they are overly complicated to use and…
Estimating Software-Development Costs With Greater Accuracy
NASA Technical Reports Server (NTRS)
Baker, Dan; Hihn, Jairus; Lum, Karen
2008-01-01
COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their projects. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development-effort-estimation techniques is used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performance of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
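An exhaustive search over model parameters, in miniature (a hedged sketch with toy data and a COCOMO-style effort form; 2cee's actual models and search space are richer): every parameter combination is scored against historical projects, and the lowest-error combination is kept, so calibration and estimation happen together.

    import itertools

    history = [(10.0, 28.0), (50.0, 180.0), (120.0, 500.0)]  # (KSLOC, person-months)

    def estimate(ksloc, a, b):
        return a * ksloc ** b        # COCOMO-style form: effort = a * size^b

    best = None
    for a, b in itertools.product([2.0, 2.5, 3.0, 3.5], [0.9, 1.0, 1.1, 1.2]):
        # mean absolute percentage error of this (a, b) over the history
        mape = sum(abs(estimate(k, a, b) - e) / e for k, e in history) / len(history)
        if best is None or mape < best[0]:
            best = (mape, a, b)

    print("best (a, b) = (%.1f, %.1f), MAPE = %.2f" % (best[1], best[2], best[0]))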
NASA software specification and evaluation system design, part 1
NASA Technical Reports Server (NTRS)
1976-01-01
The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.
ERIC Educational Resources Information Center
Shittu, Ahmed Tajudeen; Basha, Kamal Madarsha; AbdulRahman, Nik Suryani Nik; Ahmad, Tunku Badariah Tunku
2011-01-01
Purpose: Social software usage is growing at an exponential rate among the present generation of students. Yet, there is a paucity of empirical study to understand the determinants of its use in the present setting of this study. This study, therefore, seeks to investigate factors that predict students' attitudes and intentions to use this…
A Test of the Design of a Video Tutorial for Software Training
ERIC Educational Resources Information Center
van der Meij, J.; van der Meij, H.
2015-01-01
The effectiveness of a video tutorial versus a paper-based tutorial for software training has yet to be established. Mixed outcomes from the empirical studies to date suggest that for a video tutorial to outperform its paper-based counterpart, the former should be crafted so that it addresses the strengths of both designs. This was attempted in…
ERIC Educational Resources Information Center
Davis, Wesley D.
This study evaluated Krell's 1981-82 Scholastic Aptitude Test (SAT) preparatory series software purported to raise students' scores substantially after only a short term of computer-assisted instruction (CAI). Forty-eight college-bound juniors from Escambia County (Florida) were assigned to experimental and control groups. A two-phased pre- and…
NASA Technical Reports Server (NTRS)
Goseva-Popstojanova, Katerina; Tyo, Jacob
2017-01-01
While some prior research work exists on characteristics of software faults (i.e., bugs) and failures, very little work has been published on analysis of software application vulnerabilities. This paper aims to contribute towards filling that gap by presenting an empirical investigation of application vulnerabilities. The results are based on data extracted from issue tracking systems of two NASA missions. These data were organized in three datasets: Ground mission IVV issues, Flight mission IVV issues, and Flight mission Developers issues. In each dataset, we identified security related software bugs and classified them in specific vulnerability classes. Then, we created the security vulnerability profiles, i.e., determined where and when the security vulnerabilities were introduced and what were the dominating vulnerability classes. Our main findings include: (1) In the IVV issues datasets, the majority of vulnerabilities were code related and were introduced in the Implementation phase. (2) For all datasets, around 90% of the vulnerabilities were located in two to four subsystems. (3) Out of 21 primary classes, five dominated: Exception Management, Memory Access, Other, Risky Values, and Unused Entities. Together, they contributed from 80% to 90% of the vulnerabilities in each dataset.
Grading System and Student Effort
ERIC Educational Resources Information Center
Paredes, Valentina
2017-01-01
Several papers have proposed that the grading system affects students' incentives to exert effort. In particular, the previous literature has compared student effort under relative and absolute grading systems, but the results are mixed and the implications of the models have not been empirically tested. In this paper, I build a model where…
Element Load Data Processor (ELDAP) Users Manual
NASA Technical Reports Server (NTRS)
Ramsey, John K., Jr.; Ramsey, John K., Sr.
2015-01-01
Often, the shear and tensile forces and moments are extracted from finite element analyses to be used in off-line calculations for evaluating the integrity of structural connections involving bolts, rivets, and welds. Usually the maximum forces and moments are desired for use in the calculations. In situations where there are numerous structural connections of interest for numerous load cases, the effort in finding the true maximum force and/or moment combinations among all fasteners and welds and load cases becomes difficult. The Element Load Data Processor (ELDAP) software described herein makes this effort manageable. This software eliminates the possibility of overlooking the worst-case forces and moments that could result in erroneous positive margins of safety and/or selecting inconsistent combinations of forces and moments resulting in false negative margins of safety. In addition to forces and moments, any scalar quantity output in a PATRAN report file may be evaluated with this software. This software was originally written to fill an urgent need during the structural analysis of the Ares I-X Interstage segment. As such, this software was coded in a straightforward manner with no effort made to optimize or minimize code or to develop a graphical user interface.
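A hedged sketch of the worst-case bookkeeping such a tool performs (the record layout and the severity() interaction index here are hypothetical; ELDAP works from PATRAN report files and the governing criteria of the actual analysis): keep, per connection, the single load case with the worst consistent force/moment combination, rather than mixing maxima from different cases.

    # Hypothetical rows: (connection, load_case, shear, tension, moment)
    rows = [
        ("bolt-A", "LC1", 1200.0,  800.0, 15.0),
        ("bolt-A", "LC2",  900.0, 1500.0, 22.0),
        ("weld-3", "LC1",  400.0,  300.0,  8.0),
    ]

    def severity(shear, tension, moment):
        # toy interaction index; a real analysis would use the governing criterion
        return shear + tension + 100.0 * moment

    worst = {}
    for name, case, s, t, m in rows:
        if name not in worst or severity(s, t, m) > severity(*worst[name][1:]):
            worst[name] = (case, s, t, m)   # one consistent set per connection

    for name, (case, s, t, m) in worst.items():
        print(name, "worst case:", case, (s, t, m))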
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartlett, Roscoe A; Heroux, Dr. Michael A; Willenbring, James
2012-01-01
Software lifecycles are becoming an increasingly important issue for computational science & engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process--respecting the competing needs of research vs. production--cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2002-09-30
Samson, Scott (Center for Ocean Technology). The project's objective is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images.
North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg
2007-10-11
Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.
Complexity Measure for the Prototype System Description Language (PSDL)
2002-06-01
An experimental investigation of fault tolerant software structures in an avionics application
NASA Technical Reports Server (NTRS)
Caglayan, Alper K.; Eckhardt, Dave E., Jr.
1989-01-01
The objective of this experimental investigation is to compare the functional performance and software reliability of competing fault tolerant software structures utilizing software diversity. In this experiment, three versions of the redundancy management software for a skewed sensor array have been developed using three diverse failure detection and isolation algorithms and incorporated into various N-version, recovery block and hybrid software structures. The empirical results show that, for maximum functional performance improvement in the selected application domain, the results of diverse algorithms should be voted before being processed by multiple versions without enforced diversity. Results also suggest that when the reliability gain with an N-version structure is modest, recovery block structures are more feasible since higher reliability can be obtained using an acceptance check with a modest reliability.
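A minimal sketch of the "vote the diverse algorithm results first" recommendation (the FDI outputs are hypothetical; the experiment's actual structures are N-version, recovery block, and hybrid arrangements): a strict-majority vote over diverse detection/isolation decisions, with a no-consensus result escalating to, e.g., an acceptance check.

    from collections import Counter

    def vote(decisions):
        """decisions: hashable FDI outputs, e.g. failed-sensor IDs or 'none'."""
        winner, count = Counter(decisions).most_common(1)[0]
        if count * 2 > len(decisions):   # strict majority required
            return winner
        return None                      # no consensus -> escalate / acceptance test

    print(vote(["sensor3", "sensor3", "none"]))    # -> sensor3
    print(vote(["sensor3", "none", "sensor1"]))    # -> None (no majority)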
The Computational Infrastructure for Geodynamics as a Community of Practice
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2016-12-01
Computational Infrastructure for Geodynamics (CIG), geodynamics.org, originated in 2005 out of community recognition that the efforts of individual or small groups of researchers to develop scientifically-sound software is impossible to sustain, duplicates effort, and makes it difficult for scientists to adopt state-of-the art computational methods that promote new discovery. As a community of practice, participants in CIG share an interest in computational modeling in geodynamics and work together on open source software to build the capacity to support complex, extensible, scalable, interoperable, reliable, and reusable software in an effort to increase the return on investment in scientific software development and increase the quality of the resulting software. The group interacts regularly to learn from each other and better their practices formally through webinar series, workshops, and tutorials and informally through listservs and hackathons. Over the past decade, we have learned that successful scientific software development requires at a minimum: collaboration between domain-expert researchers, software developers and computational scientists; clearly identified and committed lead developer(s); well-defined scientific and computational goals that are regularly evaluated and updated; well-defined benchmarks and testing throughout development; attention throughout development to usability and extensibility; understanding and evaluation of the complexity of dependent libraries; and managed user expectations through education, training, and support. CIG's code donation standards provide the basis for recently formalized best practices in software development (geodynamics.org/cig/dev/best-practices/). Best practices include use of version control; widely used, open source software libraries; extensive test suites; portable configuration and build systems; extensive documentation internal and external to the code; and structured, human readable input formats.
The Emergence of Open-Source Software in North America
ERIC Educational Resources Information Center
Pan, Guohua; Bonk, Curtis J.
2007-01-01
Unlike conventional models of software development, the open source model is based on the collaborative efforts of users who are also co-developers of the software. Interest in open source software has grown exponentially in recent years. A "Google" search for the phrase open source in early 2005 returned 28.8 million webpage hits, while…
Space Station communications and tracking systems modeling and RF link simulation
NASA Technical Reports Server (NTRS)
Tsang, Chit-Sang; Chie, Chak M.; Lindsey, William C.
1986-01-01
In this final report, the effort spent on Space Station Communications and Tracking System Modeling and RF Link Simulation is described in detail. The effort is mainly divided into three parts: frequency division multiple access (FDMA) system simulation modeling and software implementation; a study on design and evaluation of a functional computerized RF link simulation/analysis system for Space Station; and a study on design and evaluation of simulation system architecture. This report documents the results of these studies. In addition, a separate User's Manual on Space Communications Simulation System (SCSS) (Version 1) documents the software developed for the Space Station FDMA communications system simulation. The final report, SCSS user's manual, and the software located in the NASA JSC system analysis division's VAX 750 computer together serve as the deliverables from LinCom for this project effort.
Layer moduli of Nebraska pavements for the new Mechanistic-Empirical Pavement Design Guide (MEPDG).
DOT National Transportation Integrated Search
2010-12-01
As a step-wise implementation effort of the Mechanistic-Empirical Pavement Design Guide (MEPDG) for the design and analysis of Nebraska flexible pavement systems, this research developed a database of layer moduli: dynamic modulus, creep compl...
Diagnostic Algorithm Benchmarking
NASA Technical Reports Server (NTRS)
Poll, Scott
2011-01-01
A poster for the NASA Aviation Safety Program Annual Technical Meeting. It describes empirical benchmarking on diagnostic algorithms using data from the ADAPT Electrical Power System testbed and a diagnostic software framework.
Induced Innovation and Social Inequality: Evidence from Infant Medical Care.
Cutler, David M; Meara, Ellen; Richards-Shubik, Seth
2012-01-01
We develop a model of induced innovation that applies to medical research. Our model yields three empirical predictions. First, initial death rates and subsequent research effort should be positively correlated. Second, research effort should be associated with more rapid mortality declines. Third, as a byproduct of targeting the most common conditions in the population as a whole, induced innovation leads to growth in mortality disparities between minority and majority groups. Using information on infant deaths in the U.S. between 1983 and 1998, we find support for all three empirical predictions.
Integrated optomechanical analysis and testing software development at MIT Lincoln Laboratory
NASA Astrophysics Data System (ADS)
Stoeckel, Gerhard P.; Doyle, Keith B.
2013-09-01
Advanced analytical software capabilities are being developed to advance the design of prototypical hardware in the Engineering Division at MIT Lincoln Laboratory. The current effort is focused on the integration of analysis tools tailored to the work flow, organizational structure, and current technology demands. These tools are being designed to provide superior insight into the interdisciplinary behavior of optical systems and enable rapid assessment and execution of design trades to optimize the design of optomechanical systems. The custom software architecture is designed to exploit and enhance the functionality of existing industry standard commercial software, provide a framework for centralizing internally developed tools, and deliver greater efficiency, productivity, and accuracy through standardization, automation, and integration. Specific efforts have included the development of a feature-rich software package for Structural-Thermal-Optical Performance (STOP) modeling, advanced Line Of Sight (LOS) jitter simulations, and improved integration of dynamic testing and structural modeling.
Infusing Software Assurance Research Techniques into Use
NASA Technical Reports Server (NTRS)
Pressburger, Thomas; DiVito, Ben; Feather, Martin S.; Hinchey, Michael; Markosian, Lawrence; Trevino, Luis C.
2006-01-01
Research in the software engineering community continues to lead to new development techniques that encompass processes, methods and tools. However, a number of obstacles impede their infusion into software development practices. These are the recurring obstacles common to many forms of research. Practitioners cannot readily identify the emerging techniques that may benefit them, and cannot afford to risk time and effort evaluating and trying one out while there remains uncertainty about whether it will work for them. Researchers cannot readily identify the practitioners whose problems would be amenable to their techniques, and, lacking feedback from practical applications, are hard-pressed to gauge where and in what ways to evolve their techniques to make them more likely to be successful. This paper describes an ongoing effort conducted by a software engineering research infusion team established by NASA's Software Engineering Initiative to overcome these obstacles.
Automating Software Design Metrics.
1984-02-01
High quality software is of interest to both the software engineering community and its users. This work draws on the contributions of many other software engineering efforts, most notably [MCC 77] and [Boe 83b], which have defined and refined a framework for quantifying software quality. Software metrics can be useful within the context of an integrated software engineering environment.
ERIC Educational Resources Information Center
Rushinek, Avi; Rushinek, Sara
1984-01-01
Describes results of a system rating study in which users responded to WPS (word processing software) questions. Study objectives were data collection and evaluation of variables; statistical quantification of WPS's contribution (along with other variables) to user satisfaction; design of an expert system to evaluate WPS; and database update and…
ERIC Educational Resources Information Center
Sanchez, Pablo; Zorrilla, Marta; Duque, Rafael; Nieto-Reyes, Alicia
2011-01-01
Models in Software Engineering are considered as abstract representations of software systems. Models highlight relevant details for a certain purpose, whereas irrelevant ones are hidden. Models are supposed to make system comprehension easier by reducing complexity. Therefore, models should play a key role in education, since they would ease the…
Questioning the Role of Requirements Engineering in the Causes of Safety-Critical Software Failures
NASA Technical Reports Server (NTRS)
Johnson, C. W.; Holloway, C. M.
2006-01-01
Many software failures stem from inadequate requirements engineering. This view has been supported both by detailed accident investigations and by a number of empirical studies; however, such investigations can be misleading. It is often difficult to distinguish between failures in requirements engineering and problems elsewhere in the software development lifecycle. Further pitfalls arise from the assumption that inadequate requirements engineering is a cause of all software related accidents for which the system fails to meet its requirements. This paper identifies some of the problems that have arisen from an undue focus on the role of requirements engineering in the causes of major accidents. The intention is to provoke further debate within the emerging field of forensic software engineering.
An empirical study of flight control software reliability
NASA Technical Reports Server (NTRS)
Dunham, J. R.; Pierce, J. L.
1986-01-01
The results of a laboratory experiment in flight control software reliability are reported. The experiment tests a small sample of implementations of a pitch axis control law for a PA28 aircraft with over 14 million pitch commands with varying levels of additive input and feedback noise. The testing, which used the method of n-version programming for error detection, surfaced four software faults in one implementation of the control law. The small number of detected faults precluded the conduct of the error burst analyses. The pitch axis problem provides data for use in constructing a model for predicting the reliability of software in systems with feedback. The study was undertaken to find means to perform reliability evaluations of flight control software.
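In sketch form, the n-version error detection idea used in such testing (a toy control law with a deliberately seeded fault in one version; the experiment's actual law, noise injection, and fault data are its own): run the versions on the same inputs and flag any output that disagrees with the consensus, here taken as the median.

    def version_a(pitch_err): return 0.8 * pitch_err
    def version_b(pitch_err): return 0.8 * pitch_err
    def version_c(pitch_err):                       # seeded fault for inputs > 1.0
        return 0.8 * pitch_err + (0.05 if pitch_err > 1.0 else 0.0)

    def test_versions(inputs, tol=1e-3):
        faults = []
        for x in inputs:
            outs = [v(x) for v in (version_a, version_b, version_c)]
            ref = sorted(outs)[1]                   # median of 3 as consensus
            for i, o in enumerate(outs):
                if abs(o - ref) > tol:
                    faults.append((x, i))           # (input, disagreeing version)
        return faults

    print(test_versions([0.5, 1.2, 2.0]))           # version_c flagged for x > 1.0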
NASA Astrophysics Data System (ADS)
Ahmad, Sabrina; Jalil, Intan Ermahani A.; Ahmad, Sharifah Sakinah Syed
2016-08-01
It is seldom technical issues that impede the process of eliciting software requirements. The involvement of multiple stakeholders usually leads to conflicts, and therefore conflict detection and resolution effort is crucial. This paper forwards an improved conceptual model to assist the conflict detection and resolution effort, extending the abilities of current models and improving overall performance. The significance of the new model is that it enables automated detection of conflicts, and of their severity levels, through rule-based reasoning.
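A minimal sketch of rule-based conflict detection with severity levels (the requirement fields, the single resource-contention rule, and the severity scale are all hypothetical illustrations, not the paper's model):

    requirements = [
        {"id": "R1", "stakeholder": "ops",     "resource": "downlink", "priority": "high"},
        {"id": "R2", "stakeholder": "science", "resource": "downlink", "priority": "high"},
        {"id": "R3", "stakeholder": "ops",     "resource": "power",    "priority": "low"},
    ]

    def detect_conflicts(reqs):
        conflicts = []
        for i, a in enumerate(reqs):
            for b in reqs[i + 1:]:
                if a["resource"] == b["resource"] and a["stakeholder"] != b["stakeholder"]:
                    # rule: different stakeholders competing for one resource;
                    # severity rises with how many of the pair are high priority
                    level = 1 + [a["priority"], b["priority"]].count("high")
                    conflicts.append((a["id"], b["id"], level))
        return conflicts

    print(detect_conflicts(requirements))   # [('R1', 'R2', 3)]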
Maintaining Situation Awareness with Autonomous Airborne Observation Platforms
NASA Technical Reports Server (NTRS)
Freed, Michael; Fitzgerald, Will
2005-01-01
Unmanned Aerial Vehicles (UAVs) offer tremendous potential as intelligence, surveillance and reconnaissance (ISR) platforms for early detection of security threats and for acquisition and maintenance of situation awareness in crisis conditions. However, using their capabilities effectively requires addressing a range of practical and theoretical problems. The paper will describe progress by the "Autonomous Rotorcraft Project," a collaborative effort between NASA and the U.S. Army to develop a practical, flexible capability for UAV-based ISR. Important facets of the project include optimization methods for allocating scarce aircraft resources to observe numerous, distinct sites of interest; intelligent flight automation software that integrates high-level plan generation capabilities with executive control, failure response and flight control functions; a system architecture supporting reconfiguration of onboard sensors to address different kinds of threats; and an advanced prototype vehicle designed to allow large-scale production at low cost. The paper will also address human interaction issues, including an empirical method for determining how to allocate roles and responsibilities between flight automation and human operators.
Proceedings of the Workshop on Change of Representation and Problem Reformulation
NASA Technical Reports Server (NTRS)
Lowry, Michael R.
1992-01-01
The proceedings of the third Workshop on Change of Representation and Problem Reformulation are presented. In contrast to the first two workshops, this workshop focused on analytic or knowledge-based approaches, as opposed to statistical or empirical approaches called 'constructive induction'. The organizing committee believes that there is a potential for combining analytic and inductive approaches at a future date. However, it became apparent at the previous two workshops that the communities pursuing these different approaches are currently interested in largely non-overlapping issues. The constructive induction community has been holding its own workshops, principally in conjunction with the machine learning conference. While this workshop is more focused on analytic approaches, the organizing committee has made an effort to include more application domains. We have greatly expanded from the origins in the machine learning community. Participants in this workshop come from the full spectrum of AI application domains, including planning, qualitative physics, software engineering, knowledge representation, and machine learning.
Open Source Molecular Modeling
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-01-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126
ERIC Educational Resources Information Center
Lafferty, Mark T.
2010-01-01
The number of project failures and those projects completed over cost and over schedule has been a significant issue for software project managers. Among the many reasons for failure, inaccuracy in software estimation--the basis for project bidding, budgeting, planning, and probability estimates--has been identified as a root cause of a high…
NASA Software Cost Estimation Model: An Analogy Based Estimation Model
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James
2015-01-01
The cost estimation of software development activities is increasingly critical for large scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of the software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model performance is evaluated by comparing it to COCOMO II, linear regression, and K-nearest neighbor prediction model performance on the same data set.
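A hedged sketch of the analogy (K-nearest neighbor) idea on toy data (the feature set, normalization, and k are illustrative; the NASA model adds clustering and careful feature selection): predict a new project's effort as the average of its k most similar historical projects.

    import math

    past = [  # (size_ksloc, complexity_1to5, effort_person_months)
        (20.0, 2, 60.0), (45.0, 3, 160.0), (120.0, 5, 700.0), (15.0, 1, 35.0),
    ]

    def knn_effort(size, cx, k=2):
        span = max(p[0] for p in past) - min(p[0] for p in past)
        def dist(p):
            # normalize each feature so neither dominates the distance
            return math.hypot((p[0] - size) / span, (p[1] - cx) / 4.0)
        nearest = sorted(past, key=dist)[:k]
        return sum(p[2] for p in nearest) / k   # mean effort of the analogies

    print(round(knn_effort(50.0, 3), 1))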
NASA Technical Reports Server (NTRS)
Rodriguez, Juan Jared
2014-01-01
The purpose of this report is to detail the tasks accomplished as a NASA NIFS intern for the summer 2014 session. This internship opportunity was to develop an issue tracker Ruby on Rails web application to improve the communication of developmental anomalies between the Support Software Computer Software Configuration Item (CSCI) teams, System Build and Information Architecture. As many may know, software development is an arduous, time consuming, collaborative effort. It involves nearly as much work designing, planning, collaborating, discussing, and resolving issues as effort expended in actual development. This internship opportunity was put in place to help alleviate the amount of time spent discussing issues such as bugs, missing tests, new requirements, and usability concerns that arise during development and throughout the life cycle of software applications once in production.
Software reliability through fault-avoidance and fault-tolerance
NASA Technical Reports Server (NTRS)
Vouk, Mladen A.; Mcallister, David F.
1993-01-01
Strategies and tools for the testing, risk assessment and risk control of dependable software-based systems were developed. Part of this project consists of studies to enable the transfer of technology to industry, for example the risk management techniques for safety-conscious systems. Theoretical investigations of the Boolean and Relational Operator (BRO) testing strategy were conducted for condition-based testing. The Basic Graph Generation and Analysis tool (BGG) was extended to fully incorporate several variants of the BRO metric. Single- and multi-phase risk, coverage and time-based models are being developed to provide additional theoretical and empirical basis for estimation of the reliability and availability of large, highly dependable software. A model for software process and risk management was developed. The use of cause-effect graphing for software specification and validation was investigated. Lastly, advanced software fault-tolerance models were studied to provide alternatives and improvements in situations where simple software fault-tolerance strategies break down.
Continuation of research into software for space operations support, volume 1
NASA Technical Reports Server (NTRS)
Collier, Mark D.; Killough, Ronnie; Martin, Nancy L.
1990-01-01
A prototype workstation executive called the Hardware Independent Software Development Environment (HISDE) was developed. Software technologies relevant to workstation executives were researched and evaluated, and HISDE was used as a test bed for prototyping efforts. New X Windows software concepts and technology were introduced into workstation executives and related applications. The four research efforts performed included: (1) research into the usability and efficiency of Motif (an X Windows based graphical user interface), which consisted of converting the existing Athena widget based HISDE user interface to Motif, demonstrating the usability of Motif and providing insight into the level of effort required to translate an application from one widget set to another; (2) prototyping a real-time data display widget, which consisted of researching methods for, and prototyping the selected method of, displaying textual values in an efficient manner; (3) an X Windows performance evaluation, which consisted of a series of performance measurements that demonstrated the ability of low-level X Windows to display textual information; (4) converting the Display Manager, the application used by NASA for data display during operational mode, to X Windows/Motif.
Getting started on metrics - Jet Propulsion Laboratory productivity and quality
NASA Technical Reports Server (NTRS)
Bush, M. W.
1990-01-01
A review is presented to describe the effort and difficulties of reconstructing fifteen years of JPL software history. In 1987 the collection and analysis of project data were started with the objective of creating laboratory-wide measures of quality and productivity for software development. As a result of this two-year Software Product Assurance metrics study, a rough measurement foundation for software productivity and software quality, and an order-of-magnitude quantitative baseline for software systems and subsystems are now available.
NASA Technical Reports Server (NTRS)
Ebeling, Charles
1993-01-01
This report documents the work accomplished during the first two years of research to provide support to NASA in predicting operational and support parameters and costs of proposed space systems. The first year's research developed a methodology for deriving reliability and maintainability (R & M) parameters based upon the use of regression analysis to establish empirical relationships between performance and design specifications and corresponding mean times of failure and repair. The second year focused on enhancements to the methodology, increased scope of the model, and software improvements. This follow-on effort expands the prediction of R & M parameters and their effect on the operations and support of space transportation vehicles to include other system components such as booster rockets and external fuel tanks. It also increases the scope of the methodology and the capabilities of the model as implemented by the software. The focus is on the failure and repair of major subsystems and their impact on vehicle reliability, turn times, maintenance manpower, and repairable spares requirements. The report documents the data utilized in this study, outlines the general methodology for estimating and relating R&M parameters, presents the analyses and results of application to the initial data base, and describes the implementation of the methodology through the use of a computer model. The report concludes with a discussion on validation and a summary of the research findings and results.
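In miniature, the regression idea behind such R&M prediction (toy numbers and a single hypothetical predictor; the study regresses mean times of failure and repair on multiple performance and design specifications): fit ordinary least squares and read predictions off the fitted line.

    xs = [120.0, 300.0, 450.0, 800.0]   # design spec, e.g. component count
    ys = [900.0, 520.0, 400.0, 210.0]   # observed mean time between failures, hours

    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx

    # predict MTBF for a proposed design point
    print("predicted MTBF at 600 components: %.0f h" % (intercept + slope * 600.0))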
Enabling joined-up decision making with geotemporal information
NASA Astrophysics Data System (ADS)
Smith, M. J.; Ahmed, S. E.; Purves, D. W.; Emmott, S.; Joppa, L. N.; Caldararu, S.; Visconti, P.; Newbold, T.; Formica, A. F.
2015-12-01
While the use of geospatial data to assist in decision making is becoming increasingly common, the use of geotemporal information--information that can be indexed by geographical space AND time--is much rarer. I will describe our scientific research and software development efforts intended to advance the availability and use of geotemporal information in general. I will show two recent examples of "stacking" geotemporal information to support land use decision making in the Brazilian Amazon and Kenya, involving data-constrained predictive models and empirically derived datasets of road development, deforestation, carbon, agricultural yields, water purification and poverty alleviation services, and will show how we use trade-off analyses and constraint reasoning algorithms to explore the costs and benefits of different decisions. For the Brazilian Amazon we explore tradeoffs involved in different deforestation scenarios, while for Kenya we explore the impacts of conserving forest to support international carbon conservation initiatives (REDD+). I will also illustrate the cloud-based software tools we have developed to enable anyone to access geotemporal information, gridded (e.g. climate) or non-gridded (e.g. protected areas), for the past, present or future and incorporate such information into their analyses (e.g. www.fetchclimate.org), including how we train new predictive models to such data using Bayesian techniques: on this latter point I will show how we combine satellite and ground measured data with predictive models to forecast how crops might respond to climate change.
The Software Design Document: More than a User's Manual.
ERIC Educational Resources Information Center
Bowers, Dennis
1989-01-01
Discusses the value of creating design documentation for computer software so that it may serve as a model for similar design efforts. Components of the software design document are described, including program flowcharts, graphic representation of screen displays, storyboards, and evaluation procedures. An example is given using HyperCard. (three…
Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)
NASA Technical Reports Server (NTRS)
Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)
1999-01-01
This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project experimented with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.
Collaborative Software and Focused Distraction in the Classroom
ERIC Educational Resources Information Center
Rhine, Steve; Bailey, Mark
2011-01-01
In search of strategies for increasing their pre-service teachers' thoughtful engagement with content and in an effort to model connection between choice of technology and pedagogical goals, the authors utilized collaborative software during class time. Collaborative software allows all students to write simultaneously on a single collective…
Deriving the Cost of Software Maintenance for Software Intensive Systems
2011-08-29
Discusses SEER-SEM maintenance effort by year reports (Reifer, Allen, Fersch, Hitchings, Judy, & Rosa, 2010). The simple Pearson product-moment correlation is used to understand the linear relationship between two variables. Standardization is required across the software maintenance community in order to ensure that the data being recorded can be employed beyond the agency or…
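For reference, the Pearson product-moment correlation mentioned above in a minimal, self-contained form (the data are toy values; any real use would run over the recorded maintenance variables):

    import math

    def pearson_r(xs, ys):
        # r = sum((x-mx)(y-my)) / sqrt(sum((x-mx)^2) * sum((y-my)^2))
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
        sy = math.sqrt(sum((y - my) ** 2 for y in ys))
        return cov / (sx * sy)

    print(round(pearson_r([1, 2, 3, 4], [2.1, 3.9, 6.2, 8.0]), 3))  # ~0.999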
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2003-09-30
Samson, Scott (Center for Ocean Technology). The objective is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images. Automated…
Contributes to the research efforts for commercial buildings. This effort is dedicated to studying commercial sector whole-building energy simulation, scientific computing, and software configuration and…
NASA Technical Reports Server (NTRS)
Stinnett, W. G.
1980-01-01
The modifications, additions, and testing results for a version of the Deep Space Station command software, generated for support of the Voyager Saturn encounter, are discussed. The software update requirements included efforts to: (1) recode portions of the software to permit recovery of approximately 2000 words of memory; (2) correct five Voyager Ground Data System liens; (3) provide the capability to automatically turn off the command processor assembly local printer during periods of low activity; and (4) correct anomalies existing in the software.
NASA Technical Reports Server (NTRS)
Moran, Susanne I.
2004-01-01
The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: discovery and verification of software program properties and dependencies; detection and isolation of software defects across different versions of software; and compilation of historical data and technical expertise for future applications.
Preparing Current and Future Practitioners to Integrate Research in Real Practice Settings
ERIC Educational Resources Information Center
Thyer, Bruce A.
2015-01-01
Past efforts aimed at promoting a better integration between research and practice are reviewed. These include the empirical clinical practice movement (ECP), originating within social work; the empirically supported treatment (EST) initiative of clinical psychology; and the evidence-based practice (EBP) model developed within medicine. The…
NASA Astrophysics Data System (ADS)
Brachet, N.; Mialle, P.; Brown, D.; Coyne, J.; Drob, D.; Virieux, J.; Garcés, M.
2009-04-01
The International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty (CTBTO) Preparatory Commission in Vienna is pursuing its effort to return automatic infrasound data processing to operations in 2009. Concurrently, work is underway to further improve this process by enhancing the modeling of infrasound propagation in the atmosphere and by labeling the phases in order to improve event categorization and location. In 2008, the IDC acquired WASP-3D Sph (Windy Atmospheric Sonic Propagation) (Virieux et al., 2004), a 3-D ray-tracing-based long-range propagation software package that accounts for the heterogeneity of the atmosphere. Once adapted to the IDC environment, WASP-3D Sph was used to improve the understanding of infrasound wave propagation and was compared with the 1-D ray-tracing Taupc software (Garcés and Drob, 2007) at the IDC. In addition to the infrasound propagation simulations, different atmospheric models are available at the IDC, either real-time--ECMWF (European Centre for Medium-Range Weather Forecasts)--or empirical--HWM93 (Horizontal Wind Model) and HWM07 (Drob, 2008)--used in their initial format or interpolated into a G2S (Ground to Space) model. The IDC infrasound reference database is used for testing, comparing, and validating the various propagation software packages and atmospheric specifications. Moreover, all the simulations performed give feedback on the quality of the infrasound reference events and provide useful information to improve their location by refining infrasonic wave propagation characteristics. The results of this study are presented for a selection of reference events; they will help the IDC design and define short- and mid-term enhancements of the infrasound automatic and interactive processing to take into account the spatial and temporal heterogeneities of the atmosphere.
A collaborative institutional model for integrating computer applications in the medical curriculum.
Friedman, C. P.; Oxford, G. S.; Juliano, E. L.
1991-01-01
The introduction and promotion of information technology in an established medical curriculum with existing academic and technical support structures poses a number of challenges. The UNC School of Medicine has developed the Taskforce on Educational Applications in Medicine (TEAM) to coordinate this effort. TEAM works as a confederation of existing research and support units with interests in computers and education, along with a core of interested faculty with curricular responsibilities. Constituent units of the TEAM confederation include the medical center library, medical television studios, basic science teaching laboratories, educational development office, microcomputer and network support groups, academic affairs administration, and a subset of course directors and teaching faculty. Among our efforts have been the establishment of (1) a mini-grant program to support faculty-initiated development and implementation of computer applications in the curriculum, (2) a symposium series with visiting speakers to acquaint faculty with current developments in medical informatics and related curricular efforts at other institutions, (3) 20 computer workstations located in the multipurpose teaching labs where first and second year students do much of their academic work, and (4) a demonstration center for evaluation of courseware and technologically advanced delivery systems. The student workstations provide convenient access to electronic mail, University schedules and calendars, the CoSy computer conferencing system, and several software applications integral to their courses in pathology, histology, microbiology, biochemistry, and neurobiology. The progress achieved toward the primary goal has modestly exceeded our initial expectations, while the collegiality and interest expressed toward TEAM activities in the local environment stand as empirical measures of the success of the concept. PMID:1807705
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2017-12-01
Curation of software promotes discoverability and accessibility and works hand in hand with scholarly citation to ascribe value to, and provide recognition for, software development. To meet this challenge, the Computational Infrastructure for Geodynamics (CIG) maintains a community repository built on custom and open tools to promote discovery, access, identification, credit, and provenance of research software for the geodynamics community. CIG (geodynamics.org) originated from recognition of the tremendous effort required to develop sound software and of the need to reduce duplication of effort and to sustain community codes. CIG curates software across six domains and has developed and follows software best practices that include establishing test cases, documentation, and a citable publication for each software package. CIG software landing web pages provide access to current and past releases; many packages are also accessible through the CIG community repository on GitHub. CIG has now developed abc (attribution builder for citation) to enable software users to give credit to software developers. abc uses Zenodo as an archive and as the mechanism to obtain a unique identifier (DOI) for scientific software. To assemble the metadata, we searched the software's documentation and research publications and then asked the primary developers to verify it. In this process, we have learned that each development community approaches software attribution differently. The metadata gathered are based on guidelines established by groups such as FORCE11 and OntoSoft. The rollout of abc is gradual, as developers are forward-looking and rarely willing to go back and archive prior releases in Zenodo. Going forward, all actively developed packages will use the Zenodo-GitHub integration to automate the archival process when a new release is issued. How to handle legacy software, multi-authored libraries, and the assignment of roles to software remain open issues.
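As a concrete illustration of the kind of record such a workflow gathers, the sketch below assembles a minimal citation-metadata block for a hypothetical package. The field names are modeled loosely on Zenodo's deposition metadata; this is not the actual schema or behavior of the abc tool.

```python
import json

# Hypothetical sketch of assembling software citation metadata before archiving
# a release (field names modeled loosely on Zenodo's deposition metadata; the
# actual abc tool and CIG workflow may differ).
release = {
    "title": "ExampleGeodynamicsCode v2.1.0",   # hypothetical package
    "upload_type": "software",
    "version": "2.1.0",
    "creators": [
        {"name": "Developer, Alice", "affiliation": "Example University"},
        {"name": "Developer, Bob"},
    ],
    "description": "Release archived to obtain a citable DOI.",
    "related_identifiers": [
        # Link the archived deposit back to the source repository tag.
        {"relation": "isSupplementTo",
         "identifier": "https://github.com/example/examplecode/tree/v2.1.0"},
    ],
}

print(json.dumps({"metadata": release}, indent=2))
```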
Requirement Metrics for Risk Identification
NASA Technical Reports Server (NTRS)
Hammer, Theodore; Huffman, Lenore; Wilson, William; Rosenberg, Linda; Hyatt, Lawrence
1996-01-01
The Software Assurance Technology Center (SATC) is part of the Office of Mission Assurance of the Goddard Space Flight Center (GSFC). The SATC's mission is to assist National Aeronautics and Space Administration (NASA) projects to improve the quality of software which they acquire or develop. The SATC's efforts are currently focused on the development and use of metric methodologies and tools that identify and assess risks associated with software performance and scheduled delivery. This starts at the requirements phase, where the SATC, in conjunction with software projects at GSFC and other NASA centers, is working to identify tools and metric methodologies to assist project managers in identifying and mitigating risks. This paper discusses requirement metrics currently being used at NASA in a collaborative effort between the SATC and the Quality Assurance Office at GSFC to utilize the information available through the application of requirements management tools.
Training Software Developers and Designers to Conduct Usability Evaluations
ERIC Educational Resources Information Center
Skov, Mikael Brasholt; Stage, Jan
2012-01-01
Many efforts to improve the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both of these approaches depend on a complete division of work between…
Ask Pete, software planning and estimation through project characterization
NASA Technical Reports Server (NTRS)
Kurtz, T.
2001-01-01
Ask Pete was developed by NASA to provide a tool for integrating the estimation and planning activities of a software development effort. It incorporates COCOMO II estimation with NASA's software development practices and IV&V criteria to characterize a project. This characterization is then used to generate estimates and tailored planning documents.
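Ask Pete's internals are not published in this abstract, but the COCOMO II post-architecture model it builds on is. The sketch below applies the published COCOMO II.2000 calibration constants to a hypothetical project; the scale-factor values shown are the nominal COCOMO II ratings recalled from the literature and should be treated as illustrative, not as NASA's calibration.

```python
import math

# Simplified COCOMO II post-architecture model (not Ask Pete itself):
#   PM   = A * KSLOC**E * product(effort multipliers),  E = B + 0.01 * sum(SF)
#   TDEV = C * PM**F,                                   F = D + 0.2 * 0.01 * sum(SF)
# A, B, C, D are the published COCOMO II.2000 calibration constants.
A, B, C, D = 2.94, 0.91, 3.67, 0.28

def cocomo_ii(ksloc, scale_factors, multipliers):
    e = B + 0.01 * sum(scale_factors)
    pm = A * ksloc ** e * math.prod(multipliers)   # effort [person-months]
    f = D + 0.2 * 0.01 * sum(scale_factors)
    tdev = C * pm ** f                             # schedule [months]
    return pm, tdev, pm / tdev                     # effort, schedule, avg staff

# Hypothetical 40 KSLOC project, nominal ratings (nominal multipliers are 1.0).
pm, tdev, staff = cocomo_ii(40.0,
                            scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],
                            multipliers=[1.0] * 17)
print(f"effort {pm:.0f} PM, schedule {tdev:.1f} months, ~{staff:.1f} staff")
```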
Software measurement guidebook
NASA Technical Reports Server (NTRS)
Bassman, Mitchell J.; Mcgarry, Frank; Pajerski, Rose
1994-01-01
This Software Measurement Guidebook presents information on the purpose and importance of measurement. It discusses the specific procedures and activities of a measurement program and the roles of the people involved. The guidebook also clarifies the roles that measurement can and must play in the goal of continual, sustained improvement for all software production and maintenance efforts.
Software architecture standard for simulation virtual machine, version 2.0
NASA Technical Reports Server (NTRS)
Sturtevant, Robert; Wessale, William
1994-01-01
The Simulation Virtual Machine (SVM) is an Ada architecture which eases the effort involved in real-time software maintenance and sustaining engineering. The Software Architecture Standard defines the infrastructure from which all simulation models are built. SVM was developed for and used in the Space Station Verification and Training Facility.
Chahrour, Marcel
2007-12-01
During the 1840s, physicians from the Habsburg Empire played a decisive role in the reform of medical structures in the Ottoman Empire. This paper discusses different aspects of this scientific and cultural encounter. It emphasizes the importance of Austrian health care structures as a model for the work of these physicians in the Ottoman Empire and studies the role of the medical school run by the Austrians as a means of representing, on the one hand, the reformatory efforts of the Ottoman Empire and, on the other hand, the motivations of the Habsburg monarchy for involvement in Ottoman health care affairs, strongly bound up with its own quarantine politics towards the Ottoman Empire.
NASA Technical Reports Server (NTRS)
Wallace, Dolores R.
2003-01-01
In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is: 'What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?' Two approaches to software reliability engineering appear somewhat promising. The first study, begun in FY01, is based on hardware reliability, a very well established science that has many aspects that can be applied to software. This research effort has investigated the mathematical aspects of hardware reliability and has identified those applicable to software. Currently the research effort is applying and testing these approaches to software reliability measurement. These parametric models require much project data and may be difficult to apply and interpret. Projects at GSFC are often complex in both technology and schedule. Assessing and estimating the reliability of the final system is extremely difficult when some subsystems are tested and completed long before others. Non-parametric and distribution-free techniques may offer a new and accurate way of modeling failure times and other project data to provide earlier and more accurate estimates of system reliability.
NASA Technical Reports Server (NTRS)
Simmons, D. B.; Marchbanks, M. P., Jr.; Quick, M. J.
1982-01-01
The results of an effort to thoroughly and objectively analyze the statistical and historical information gathered during the development of the Shuttle Orbiter Primary Flight Software are given. The particular areas of interest include the cost of the software, the reliability of the software, the requirements for the software, and how the requirements changed during development of the system. Data related to the current version of the software system produced some interesting results. Suggestions are made for saving additional data to allow further investigation.
SCA Waveform Development for Space Telemetry
NASA Technical Reports Server (NTRS)
Mortensen, Dale J.; Kifle, Muli; Hall, C. Steve; Quinn, Todd M.
2004-01-01
The NASA Glenn Research Center is investigating and developing suitable reconfigurable radio architectures for future NASA missions. This effort is examining software-based open architectures for space-based transceivers, as well as common hardware platform architectures. The Joint Tactical Radio System's (JTRS) Software Communications Architecture (SCA) is a candidate for the software approach, but may need modifications or adaptations for use in space. An in-house SCA-compliant waveform development effort focuses on increasing understanding of software-defined radio architectures and, more specifically, the JTRS SCA. Space requirements put a premium on size, mass, and power. This waveform development effort is key to evaluating tradeoffs with the SCA for space applications. Existing NASA telemetry links, as well as Space Exploration Initiative scenarios, are the basis for defining the waveform requirements. Modeling and simulations are being developed to determine the signal processing requirements associated with a waveform and a mission-specific computational burden. Implementation of the waveform on a laboratory software-defined radio platform is proceeding in an iterative fashion. Parallel top-down and bottom-up design approaches are employed.
We empirically examined the sampling effort required to adequately represent species richness and proportionate abundance when backpack electrofishing western Oregon streams. When sampling, we separately recorded data for each habitat unit. In data analyses, we repositioned each...
Management Aspects of Software Maintenance.
1984-09-01
educated in the complex nature of software maintenance to be able to properly evaluate and manage the software maintenance effort. In this...maintenance and improvement may be called "software evolution". The software manager must be educated in the complex nature of software maintenance to be...complaint of error or request for modification is also studied in order to determine what action needs to be taken. 2. Define Objective and Approach:
CHIME: A Metadata-Based Distributed Software Development Environment
2005-01-01
structures by using typography, graphics, and animation. The Software Immersion in our conceptual model for CHIME can be seen as a form of Software...Even small- to medium-sized development efforts may involve hundreds of artifacts -- design documents, change requests, test cases and results, code...for managing and organizing information from all phases of the software lifecycle. CHIME is designed around an XML-based metadata architecture, in
Mapping CMMI Level 2 to Scrum Practices: An Experience Report
NASA Astrophysics Data System (ADS)
Diaz, Jessica; Garbajosa, Juan; Calvo-Manzano, Jose A.
CMMI has been adopted advantageously in large companies, yielding improvements in software quality, budget compliance, and customer satisfaction. However, SPI strategies based on CMMI-DEV require heavyweight software development processes and large investments in cost and time that small and medium-sized companies cannot afford. The so-called lightweight software development processes, such as Agile Software Development (ASD), address these challenges. ASD welcomes changing requirements and stresses the importance of adaptive planning, simplicity, and continuous delivery of valuable software in short, time-boxed iterations. ASD is becoming attractive in an increasingly global and changing software market. It would therefore be greatly useful to be able to introduce agile methods such as Scrum in compliance with the CMMI process model. This paper aims to increase the understanding of the relationship between ASD and CMMI-DEV by reporting empirical results that confirm theoretical comparisons between ASD practices and CMMI level 2.
Using ERP and WfM Systems for Implementing Business Processes: An Empirical Study
NASA Astrophysics Data System (ADS)
Aversano, Lerina; Tortorella, Maria
The software systems that enterprises mainly consider for business process automation belong to two categories: Workflow Management Systems (WfMS) and Enterprise Resource Planning (ERP) systems. The wider diffusion of ERP systems tends to favour that solution, but most ERP systems have several limitations for automating business processes. This paper reports an empirical study comparing the ability of ERP systems and WfMSs to implement business processes. Two case studies were considered in the empirical study, which evaluates and analyses the correctness and completeness of the process models implemented using ERP and WfM systems.
An Empirical Approach to Analysis of Similarities between Software Failure Regions
1991-09-01
cycle costs after the software has been marketed (Alberts, 1976). Unfortunately, extensive software testing is frequently necessary in spite of...incidence is primarily syntactic. This mixing of semantic and syntactic forms in the same analysis could lead to some distortion, especially since the...of formulae to improve readability or to indicate precedence of operations. All definitions within 'Condition I' of a failure region are assumed to
An Empirical Approach to Logical Clustering of Software Failure Regions
1994-03-01
this is a coincidence or normal behavior of failure regions. Software faults were numbered in order as they were discovered by the various testing...locations of the associated faults. The goal of this research will be an improved testing technique that incorporates failure region behavior. To do this...clustering behavior. This, however, does not correlate with the structural clustering of failure regions observed by Ginn (1991) on the same set of data
2011-06-01
USING SPECTRAL CORRELATION FUNCTION. Thesis by Mujun Song, Captain, ROKA (AFIT/GCE/ENG/11-09, Air Force Institute of Technology, Air University)....generator, Agilent E4438C ESG Vector Signal Generator. Universal Software Radio Peripheral 2 (USRP2), which is a Software Defined Radio (SDR), is used
2015-03-01
Wireless Sensor Network Using Unreliable GPS Signals. Daniel R. Fuhrmann, Joshua Stomberg, Saeid Nooshabadi, Dustin McIntire, William Merill...wireless sensor network, when the timing jitter is subject to an empirically determined bimodal non-Gaussian distribution. Specifically, we 1) estimate the...over a nominal 19.2 MHz frequency with an adjustment made every four hours. Index Terms: clock synchronization, GPS, wireless sensor networks, Kalman
1978-09-01
This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a
Requirements: Towards an understanding on why software projects fail
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.
2016-08-01
Requirements engineering is at the foundation of every successful software project. There are many reasons for software project failures; however, a poorly engineered requirements process contributes immensely to why software projects fail. Software project failure is usually costly and risky, and could even be life threatening. Projects that undermine requirements engineering suffer, or are likely to suffer, from failures, challenges, and other attendant risks. The estimated cost of project failures and overruns is enormous. Furthermore, software project failures and overruns pose a challenge in today's competitive market environment: they affect a company's image, goodwill, and revenue drive, and decrease the perceived satisfaction of customers and clients. In this paper, requirements engineering is discussed and its role in software project success is elaborated. The place of the software requirements process in relation to software project failure is explored and examined. Project success and failure factors are also discussed, with emphasis placed on requirements factors, as they play a major role in software projects' challenges, successes, and failures. The paper relies on secondary data and empirical statistics to explore and examine the factors responsible for the successes, challenges, and failures of software projects in large, medium, and small software companies.
DOT National Transportation Integrated Search
2018-05-01
This project implemented additional features into MnPAVE-Rigid, leading to a new version of MnDOT's rigid pavement design software. The database of American Association of State Highway and Transportation Officials (AASHTO) mechanistic-empirical (M-E) p...
Interoperability of Neuroscience Modeling Software
Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik
2009-01-01
Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible, including the development of portable model description standards, the adoption of common simulation languages, or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However, more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20, 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374
Software handlers for process interfaces
NASA Technical Reports Server (NTRS)
Bercaw, R. W.
1976-01-01
Process interfaces are developed in an effort to reduce the time, effort, and money required to install computer systems. Probably the chief obstacle to the achievement of these goals lies in the problem of developing software handlers having the same degree of generality and modularity as the hardware. The problem of combining the advantages of modular instrumentation with those of modern multitask operating systems has not been completely solved, but there are a number of promising developments. The essential principles involved are considered.
Java PathFinder: A Translator From Java to Promela
NASA Technical Reports Server (NTRS)
Havelund, Klaus
1999-01-01
JAVA PATHFINDER, JPF, is a prototype translator from JAVA to PROMELA, the modeling language of the SPIN model checker. JPF is a product of a major effort by the Automated Software Engineering group at NASA Ames to make model checking technology part of the software process. Experience has shown that severe bugs can be found in final code using this technique, and that automated translation from a programming language to a modeling language like PROMELA can help reduce the effort required.
An Empirical Case Study of a Child Sexual Abuse Prevention Initiative in Georgia
ERIC Educational Resources Information Center
Schober, Daniel J.; Fawcett, Stephen B.; Thigpen, Sally; Curtis, Anna; Wright, Renee
2012-01-01
Objective: This empirical case study describes Prevent Child Abuse Georgia's effort to prevent child sexual abuse (CSA) by educating communities throughout the state on supporting preventive behaviour. The initiative consisted of three major components: (1) dissemination of CSA prevention messages and materials; (2) a statewide helpline that…
Online Business Simulations: A Sustainable or Disruptive Innovation in Management Education?
ERIC Educational Resources Information Center
Earl, Jason Scott
2012-01-01
The focal goal of this research was to extend the empirical effort on business simulations as a form of experiential learning by providing the first empirical analysis of business acumen and knowledge application skills. Disruptions in technology are providing more opportunities to improve the simulation gaming learning experience and a number of…
Administration and Organizational Influences on AFDC Case Decision Errors: An Empirical Analysis.
ERIC Educational Resources Information Center
Piliavin, Irving; And Others
The quality of effort among public assistance personnel has been criticized virtually since the inception of welfare programs for the poor. However, until recently, empirical information on the performance of these workers has been nonexistent. The present study, concerned with Aid to Families with Dependent Children (AFDC) case decision errors,…
The U.S. EPA Atlantic Ecology Division (AED) has initiated a multi-year research program to develop empirical nitrogen load-response models. Our research on embayments in southern New England is part of a multi-regional effort to develop cause-effect models for the Gulf of Mexic...
The U.S. EPA Atlantic Ecology Division (AED) has initiated a multi-year research program to develop empirical nitrogen load-response models for embayments in southern New England. This is part of a multi-regional effort to develop nutrient load-response models for the Gulf of Mex...
Intervening in Infancy: Implications for Autism Spectrum Disorders
ERIC Educational Resources Information Center
Wallace, Katherine S.; Rogers, Sally J.
2010-01-01
There is a scarcity of empirically validated treatments for infants and toddlers under age 3 years with autism spectrum disorders (ASD), as well as a scarcity of empirical investigation into successful intervention characteristics for this population. Yet early screening efforts are focused on identifying autism risk in children under age 3 years.…
A generic open-source software framework supporting scenario simulations in bioterrorist crises.
Falenski, Alexander; Filter, Matthias; Thöns, Christian; Weiser, Armin A; Wigger, Jan-Frederik; Davis, Matthew; Douglas, Judith V; Edlund, Stefan; Hu, Kun; Kaufman, James H; Appel, Bernd; Käsbohrer, Annemarie
2013-09-01
Since the 2001 anthrax attack in the United States, awareness of threats originating from bioterrorism has grown. This led to increased international research efforts to improve knowledge of and approaches to protecting human and animal populations against the threat from such attacks. A collaborative effort in this context is the extension of the open-source Spatiotemporal Epidemiological Modeler (STEM) simulation and modeling software for agro- or bioterrorist crisis scenarios. STEM, originally designed to enable community-driven public health disease models and simulations, was extended with new features that enable integration of proprietary data as well as visualization of agent spread along supply and production chains. STEM now provides a fully developed open-source software infrastructure supporting critical modeling tasks such as ad hoc model generation, parameter estimation, simulation of scenario evolution, estimation of effects of mitigation or management measures, and documentation. This open-source software resource can be used free of charge. Additionally, STEM provides critical features like built-in worldwide data on administrative boundaries, transportation networks, or environmental conditions (e.g., rainfall, temperature, elevation, vegetation). Users can easily combine their own confidential data with built-in public data to create customized models of desired resolution. STEM also supports collaborative and joint efforts in crisis situations by extended import and export functionalities. In this article we demonstrate specifically those new software features implemented to accomplish STEM application in agro- or bioterrorist crisis scenarios.
The Dispassionate Discourse of Children's Adjustment to Divorce.
ERIC Educational Resources Information Center
Allen, Katherine R.
1993-01-01
Responds to previous article by Amato on children's adjustment to divorce. Applauds Amato's efforts, but sees efforts hindered by insufficient reporting and inconsistent use of empirical literature, unsupported speculations about inconsistencies found in some hypotheses, and unacknowledged bias toward traditional family structure. Discusses many…
Some Methods of Applied Numerical Analysis to 3d Facial Reconstruction Software
NASA Astrophysics Data System (ADS)
Roşu, Şerban; Ianeş, Emilia; Roşu, Doina
2010-09-01
This paper deals with the joint work performed by medical doctors from the University of Medicine and Pharmacy Timisoara and engineers from the Politechnical Institute Timisoara in the effort to create the first Romanian 3D reconstruction software based on CT or MRI scans and to test the created software in clinical practice.
Parallelization of Rocket Engine Simulator Software (PRESS)
NASA Technical Reports Server (NTRS)
Cezzar, Ruknet
1997-01-01
The Parallelization of Rocket Engine System Software (PRESS) project is part of a collaborative effort with Southern University at Baton Rouge (SUBR), University of West Florida (UWF), and Jackson State University (JSU). The second-year funding, which supports two graduate students enrolled in our new Master's program in Computer Science at Hampton University and the principal investigator, has been obtained for the period from October 19, 1996 through October 18, 1997. The key part of the interim report was new directions for the second-year funding. This came about from discussions during the Rocket Engine Numeric Simulator (RENS) project meeting in Pensacola on January 17-18, 1997. At that time, a software agreement between Hampton University and NASA Lewis Research Center had already been concluded. That agreement concerns off-NASA-site experimentation with the PUMPDES/TURBDES software. Before this agreement, during the first year of the project, another large-scale FORTRAN-based software package, Two-Dimensional Kinetics (TDK), was being used for translation to an object-oriented language and parallelization experiments. However, that package proved to be too complex and lacking in sufficient documentation for an effective translation effort to object-oriented C++ source code. The focus, this time with the better documented and more manageable PUMPDES/TURBDES package, was still on translation to C++ with design improvements. At the RENS meeting, however, the new impetus for the RENS projects in general, and PRESS in particular, shifted in two important ways. One was closer alignment with the work on the Numerical Propulsion System Simulator (NPSS) through cooperation and collaboration with the LERC ACLU organization. The other was to see whether and how NASA's various rocket design software packages can be run over local networks and intranets without any radical effort at redesign and translation into object-oriented source code. There were also suggestions that the Fortran-based code be encapsulated in C++ code, thereby facilitating reuse without undue development effort. The details are covered in the aforementioned section of the interim report filed on April 28, 1997.
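The report discusses encapsulating the Fortran code in C++ wrappers; the same reuse-without-translation idea can be sketched from Python via ctypes. The subroutine name, argument list, and library file below are hypothetical, not the actual PUMPDES/TURBDES interface.

```python
import ctypes

# Minimal illustration of reusing legacy Fortran without translating it:
# wrap a compiled routine behind a thin foreign-function interface. Assumes a
# hypothetical subroutine
#     subroutine pump_head(flow, speed, head)
# compiled with:  gfortran -shared -fPIC pumpdes.f -o libpumpdes.so
# gfortran lower-cases the name and appends an underscore, and Fortran passes
# all arguments by reference.
lib = ctypes.CDLL("./libpumpdes.so")
lib.pump_head_.argtypes = [ctypes.POINTER(ctypes.c_double)] * 3
lib.pump_head_.restype = None

def pump_head(flow: float, speed: float) -> float:
    """Python wrapper hiding the by-reference calling convention."""
    f = ctypes.c_double(flow)
    s = ctypes.c_double(speed)
    h = ctypes.c_double(0.0)
    lib.pump_head_(ctypes.byref(f), ctypes.byref(s), ctypes.byref(h))
    return h.value

print(pump_head(120.0, 3600.0))  # hypothetical operating point
```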
Soil Carbon Data: long tail recovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
2017-07-25
The software is intended to be part of an open source effort regarding soils data. The software provides customized data ingestion scripts for soil carbon related data sets and scripts for output databases that conform to common templates.
Failure-Modes-And-Effects Analysis Of Software Logic
NASA Technical Reports Server (NTRS)
Garcia, Danny; Hartline, Thomas; Minor, Terry; Statum, David; Vice, David
1996-01-01
Rigorous analysis applied early in the design effort. A method of identifying potential inadequacies and the modes and effects of failures caused by those inadequacies (failure-modes-and-effects analysis, or "FMEA" for short) has been devised for application to software logic.
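The brief does not publish its scoring scheme; a common way to prioritize the failure modes such an analysis identifies is the conventional risk priority number, RPN = severity × occurrence × detection. The sketch below ranks a few hypothetical software failure modes this way.

```python
# Conventional FMEA bookkeeping: rank failure modes by the risk priority
# number RPN = severity * occurrence * detection (each rated 1-10).
# The failure modes below are hypothetical, not from the cited analysis.
failure_modes = [
    ("stale sensor value used in control law", 9, 3, 6),
    ("unhandled exception in telemetry parser", 7, 4, 3),
    ("counter overflow after 2**16 frames",     8, 2, 8),
]

ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for mode, sev, occ, det in ranked:
    print(f"RPN {sev * occ * det:4d}  {mode}")
```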
Comparison of Aircraft Icing Growth Assessment Software
NASA Technical Reports Server (NTRS)
Wright, William; Potapczuk, Mark G.; Levinson, Laurie H.
2011-01-01
A research project is underway to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. An extensive comparison of the results in a quantifiable manner against the database of ice shapes that have been generated in the NASA Glenn Icing Research Tunnel (IRT) has been performed, including additional data taken to extend the database in the Super-cooled Large Drop (SLD) regime. The project shows the differences in ice shape between LEWICE 3.2.2, GlennICE, and experimental data. The project addresses the validation of the software against a recent set of ice-shape data in the SLD regime. This validation effort mirrors a similar effort undertaken for previous validations of LEWICE. Those reports quantified the ice accretion prediction capabilities of the LEWICE software. Several ice geometry features were proposed for comparing ice shapes in a quantitative manner. The resulting analysis showed that LEWICE compared well to the available experimental data.
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.
1993-01-01
Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault frequency components so that testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents the Optimized Set Reduction approach for constructing such models, intended to fulfill specific software engineering needs. Our approach to classification is to measure the software system and build multivariate stochastic models for predicting high risk system components. We present experimental results obtained by classifying Ada components into two classes: is or is not likely to generate faults during system and acceptance test. Also, we evaluate the accuracy of the model and the insights it provides into the error making process.
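Optimized Set Reduction itself builds pattern-based multivariate stochastic models; as a simpler stand-in that shows the shape of the classification task, the sketch below trains a small decision tree to separate fault-prone from low-risk components. The module metrics and labels are made up for illustration.

```python
from sklearn.tree import DecisionTreeClassifier

# Stand-in for the paper's Optimized Set Reduction technique (not OSR itself):
# a small decision tree separating fault-prone from low-risk components using
# hypothetical module metrics [cyclomatic complexity, size in SLOC, changes].
X = [[12, 300, 9], [4, 80, 1], [25, 900, 14], [7, 150, 2],
     [18, 600, 11], [3, 60, 0], [22, 700, 8], [5, 120, 3]]
y = [1, 0, 1, 0, 1, 0, 1, 0]   # 1 = faults found during system/acceptance test

model = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(model.predict([[20, 500, 10], [6, 100, 1]]))  # expect roughly [1, 0]
```

A model like this supports the strategy the paper describes: concentrate testing and verification effort on the components the classifier flags as likely to generate faults.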
Sebok, Angelia; Wickens, Christopher D
2017-03-01
The objectives were to (a) implement theoretical perspectives regarding human-automation interaction (HAI) into model-based tools to assist designers in developing systems that support effective performance and (b) conduct validations to assess the ability of the models to predict operator performance. Two key concepts in HAI, the lumberjack analogy and black swan events, have been studied extensively. The lumberjack analogy describes the effects of imperfect automation on operator performance. In routine operations, an increased degree of automation supports performance, but in failure conditions, increased automation results in more significantly impaired performance. Black swans are the rare and unexpected failures of imperfect automation. The lumberjack analogy and black swan concepts have been implemented into three model-based tools that predict operator performance in different systems. These tools include a flight management system, a remotely controlled robotic arm, and an environmental process control system. Each modeling effort included a corresponding validation. In one validation, the software tool was used to compare three flight management system designs, which were ranked in the same order as predicted by subject matter experts. The second validation compared model-predicted operator complacency with empirical performance in the same conditions. The third validation compared model-predicted and empirically determined time to detect and repair faults in four automation conditions. The three model-based tools offer useful ways to predict operator performance in complex systems. The three tools offer ways to predict the effects of different automation designs on operator performance.
A research review of quality assessment for software
NASA Technical Reports Server (NTRS)
1991-01-01
Measures were recommended to assess the quality of software submitted to the AdaNet program. The quality factors that are important to software reuse are explored and methods of evaluating those factors are discussed. Quality factors important to software reuse are: correctness, reliability, verifiability, understandability, modifiability, and certifiability. Certifiability is included because the documentation of many factors about a software component, such as its efficiency, portability, and development history, constitutes a class of factors important to some users, not important at all to others, and impossible for AdaNet to distinguish between a priori. The quality factors may be assessed in different ways. There are a few quantitative measures which have been shown to indicate software quality. However, it is believed that there exist many factors that indicate quality but have not been empirically validated due to their subjective nature. These subjective factors are characterized by the way in which they support the software engineering principles of abstraction, information hiding, modularity, localization, confirmability, uniformity, and completeness.
2016-04-30
software (OSS) and proprietary (CSS) software elements or remote services (Scacchi, 2002, 2010), eventually including recent efforts to support Web...specific platforms, including those operating on secured Web/mobile devices. Common Development Technology provides AC development tools and common...transition to OA systems and OSS software elements, specifically for Web and mobile devices within the realm of C3CB. OA, Open APIs, OSS, and CSS OA
Using neural networks in software repositories
NASA Technical Reports Server (NTRS)
Eichmann, David (Editor); Srinivas, Kankanahalli; Boetticher, G.
1992-01-01
The first topic is an exploration of the use of neural network techniques to improve the effectiveness of retrieval in software repositories. The second topic relates to a series of experiments conducted to evaluate the feasibility of using adaptive neural networks as a means of deriving (or more specifically, learning) measures on software. Taken together, these two efforts illuminate a very promising mechanism supporting software infrastructures - one based upon a flexible and responsive technology.
Software IV and V Research Priorities and Applied Program Accomplishments Within NASA
NASA Technical Reports Server (NTRS)
Blazy, Louis J.
2000-01-01
The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance and early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure customer software risks are identified and that requirements are met or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: ensuring the safety of NASA missions, ensuring requirements are met, minimizing the programmatic and technological risks of software development and operations, improving software quality, reducing costs and time to delivery, and improving the science of software engineering.
NASA Astrophysics Data System (ADS)
Vaucouleur, Sebastien
2011-02-01
We introduce code query by example for the customisation of evolvable software products in general and of enterprise resource planning (ERP) systems in particular. The concept is based on an initial empirical study of practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with the infamous upgrade problem: the conflict between the need for customisation and the need to upgrade ERP systems. We further show how code query by example can be used as a form of lightweight static analysis to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but rather domain experts. Hence, they require a simple language in which to express custom rules.
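The paper's query language is not reproduced here; the sketch below only illustrates the flavor of lightweight static analysis it enables, using Python's ast module to flag calls to a hypothetical deprecated API in customization code.

```python
import ast

# Minimal flavor of code-query-style lightweight static analysis (not the
# paper's query language): flag every call to a hypothetical deprecated API.
SOURCE = """
def post_invoice(inv):
    legacy_round(inv.total)        # should be flagged
    return save(inv)

def audit(inv):
    return legacy_round(inv.tax)   # should be flagged
"""

def find_calls(source, name):
    """Yield the line number of each call to `name` in `source`."""
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and == name):
            yield node.lineno

for line in find_calls(SOURCE, "legacy_round"):
    print(f"deprecated call at line {line}")
```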
Software reliability: Additional investigations into modeling with replicated experiments
NASA Technical Reports Server (NTRS)
Nagel, P. M.; Schotz, F. M.; Skirvan, J. A.
1984-01-01
The effects of programmer experience level, different program usage distributions, and programming languages are explored. All these factors affect performance, and some tentative relational hypotheses are presented. An analytic framework for replicated and non-replicated (traditional) software experiments is presented. A method of obtaining an upper bound on the error rate of the next error is proposed. The method was validated empirically by comparing forecasts with actual data. In all 14 cases the bound exceeded the observed parameter, albeit somewhat conservatively. Two other forecasting methods are proposed and compared to observed results. Although it is demonstrated within this framework that stages are neither independent nor exponentially distributed, empirical estimates show that the exponential assumption is nearly valid for all but the extreme tails of the distribution. Except for the dependence in the stage probabilities, Cox's model approximates to a degree what is being observed.
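The paper's own bounding method is not reproduced here; for illustration, the sketch below computes the standard conservative chi-square upper confidence bound on a constant failure rate under the exponential assumption the paper examines, using made-up inter-failure times.

```python
from scipy.stats import chi2

# Standard (not the paper's) conservative upper confidence bound on a constant
# failure rate: with n failures observed in total test time T under an
# exponential model, a one-sided 1-alpha bound is
#     lambda_U = chi2.ppf(1 - alpha, 2n + 2) / (2T).
times = [3.1, 7.4, 1.9, 12.6, 5.0, 9.8]   # made-up inter-failure times [hours]
n, T = len(times), sum(times)
alpha = 0.05

rate_mle = n / T                                        # maximum-likelihood estimate
rate_upper = chi2.ppf(1 - alpha, 2 * n + 2) / (2 * T)   # conservative 95% bound
print(f"MLE {rate_mle:.3f}/h, 95% upper bound {rate_upper:.3f}/h")
```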
A taxonomy and discussion of software attack technologies
NASA Astrophysics Data System (ADS)
Banks, Sheila B.; Stytz, Martin R.
2005-03-01
Software is a complex thing. It is not an engineering artifact that springs forth from a design by simply following software coding rules; creativity and the human element are at the heart of the process. Software development is part science, part art, and part craft. Design, architecture, and coding are equally important activities, and in each of these activities errors may be introduced that lead to security vulnerabilities. Therefore, inevitably, errors enter into the code. Some of these errors are discovered during testing; however, some are not. The best way to find security errors, whether they are introduced as part of the architecture development effort or the coding effort, is to automate the security testing process to the maximum extent possible and to add this class of tools to the tools available that aid in the compilation process, testing, test analysis, and software distribution. Recent technological advances, improvements in computer-generated forces (CGFs), and research results in information assurance and software protection indicate that we can build a semi-intelligent software security testing tool. However, before we can undertake the security testing automation effort, we must understand the scope of the required testing, the security failures that need to be uncovered during testing, and the characteristics of those failures. Therefore, we undertook the research reported in this paper: the development of a taxonomy and a discussion of software attacks, generated from the point of view of the security tester, with the goal of using the taxonomy to guide the development of the knowledge base for the automated security testing tool. The representation for attacks and threat cases yielded by this research captures the strategies, tactics, and other considerations that come into play during the planning and execution of attacks upon application software. The paper is organized as follows. Section one contains an introduction to our research and a discussion of the motivation for our work. Section two presents our taxonomy of software attacks and a discussion of the strategies employed and general weaknesses exploited for each attack. Section three contains a summary and suggestions for further research.
Data collection procedures for the Software Engineering Laboratory (SEL) database
NASA Technical Reports Server (NTRS)
Heller, Gerard; Valett, Jon; Wild, Mary
1992-01-01
This document is a guidebook to collecting software engineering data on software development and maintenance efforts, as practiced in the Software Engineering Laboratory (SEL). It supersedes the document entitled Data Collection Procedures for the Rehosted SEL Database, number SEL-87-008 in the SEL series, which was published in October 1987. It presents procedures to be followed on software development and maintenance projects in the Flight Dynamics Division (FDD) of Goddard Space Flight Center (GSFC) for collecting data in support of SEL software engineering research activities. These procedures include detailed instructions for the completion and submission of SEL data collection forms.
Changes and challenges in the Software Engineering Laboratory
NASA Technical Reports Server (NTRS)
Pajerski, Rose
1994-01-01
Since 1976, the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD), develops, maintains, and manages complex flight dynamics systems. The SEL is composed of three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation. During the past 18 years, the SEL's overall goal has remained the same: to improve the FDD's software products and processes in a measured manner. This requires that each development and maintenance effort be viewed, in part, as a SEL experiment which examines a specific technology or builds a model of interest for use on subsequent efforts. The SEL has undertaken many technology studies while developing operational support systems for numerous NASA spacecraft missions.
NASA Technical Reports Server (NTRS)
Chaput, Armand; Johns, Zachary; Hodges, Todd; Selfridge, Justin; Bevirt, Joeben; Ahuja, Vivek
2015-01-01
Advanced Concepts Modeling software validation, analysis, and design. This was a National Institute of Aerospace contract with many components. Efforts ranged from software development and validation for structures and aerodynamics, through flight control development and aeropropulsive analysis, to UAV piloting services.
Computerizing the Accounting Curriculum.
ERIC Educational Resources Information Center
Nash, John F.; England, Thomas G.
1986-01-01
Discusses the use of computers in college accounting courses. Argues that the success of new efforts in using computers in teaching accounting is dependent upon increasing instructors' computer skills, and choosing appropriate hardware and software, including commercially available business software packages. (TW)
Globus Quick Start Guide. Globus Software Version 1.1
NASA Technical Reports Server (NTRS)
1999-01-01
The Globus Project is a community effort, led by Argonne National Laboratory and the University of Southern California's Information Sciences Institute. Globus is developing the basic software infrastructure for computations that integrate geographically distributed computational and information resources.
SOFTCOST - DEEP SPACE NETWORK SOFTWARE COST MODEL
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1994-01-01
The early estimation of required resources and a schedule for the development and maintenance of software is usually the least precise aspect of the software life cycle. However, it is desirable to make some sort of an orderly and rational attempt at estimation in order to plan and organize an implementation effort. The Software Cost Estimation Model program, SOFTCOST, was developed to provide a consistent automated resource and schedule model which is more formalized than the often-used guesswork model based on experience, intuition, and luck. SOFTCOST was developed after the evaluation of a number of existing cost estimation programs indicated that there was a need for a cost estimation program with a wide range of application and adaptability to diverse kinds of software. SOFTCOST combines several software cost models found in the open literature into one comprehensive set of algorithms that compensate for nearly fifty implementation factors relative to size of the task, inherited baseline, organizational and system environment, and difficulty of the task. SOFTCOST produces mean and variance estimates of software size, implementation productivity, recommended staff level, probable duration, amount of computer resources required, and amount and cost of software documentation. Since the confidence level for a project using mean estimates is small, the user is given the opportunity to enter risk-biased values for effort, duration, and staffing, to achieve higher confidence levels. SOFTCOST then produces a PERT/CPM file with subtask efforts, durations, and precedences defined so as to produce the Work Breakdown Structure (WBS) and schedule having the requested overall effort and duration. The SOFTCOST program operates in an interactive environment, prompting the user for all of the required input. The program builds the supporting PERT database in a file for later report generation or revision. The PERT schedule and the WBS schedule may be printed and stored in a file for later use. The SOFTCOST program is written in Microsoft BASIC for interactive execution and has been implemented on an IBM PC-XT/AT operating MS-DOS 2.1 or higher with 256K bytes of memory. SOFTCOST was originally developed for the Zilog Z80 system running under CP/M in 1981. It was converted to run on the IBM PC XT/AT in 1986. SOFTCOST is a copyrighted work with all copyright vested in NASA.
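As an illustration of the PERT/CPM output side, the sketch below runs a critical-path forward pass over a tiny hypothetical subtask network of the kind SOFTCOST emits; the tasks, durations, and precedences are invented.

```python
# Forward pass of the critical-path method over a tiny hypothetical WBS,
# the kind of subtask network SOFTCOST emits (durations in weeks).
tasks = {                        # task: (duration, predecessors)
    "design":    (4, []),
    "code":      (6, ["design"]),
    "test_plan": (2, ["design"]),
    "test":      (3, ["code", "test_plan"]),
    "document":  (2, ["code"]),
}

finish = {}
for name in tasks:               # dict preserves order; predecessors listed first
    dur, preds = tasks[name]
    start = max((finish[p] for p in preds), default=0.0)
    finish[name] = start + dur

print(f"project duration: {max(finish.values())} weeks")   # -> 13.0
```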
Design of a nickel-hydrogen battery simulator for the NASA EOS testbed
NASA Technical Reports Server (NTRS)
Gur, Zvi; Mang, Xuesi; Patil, Ashok R.; Sable, Dan M.; Cho, Bo H.; Lee, Fred C.
1992-01-01
The hardware and software design of a nickel-hydrogen (Ni-H2) battery simulator (BS) with application to the NASA Earth Observation System (EOS) satellite is presented. The battery simulator is developed as a part of a complete testbed for the EOS satellite power system. The battery simulator involves both hardware and software components. The hardware component includes the capability of sourcing and sinking current at a constant programmable voltage. The software component includes the capability of monitoring the battery's ampere-hours (Ah) and programming the battery voltage according to an empirical model of the nickel-hydrogen battery stored in a computer.
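A minimal sketch of the two software duties the abstract names, ampere-hour bookkeeping and voltage programming from an empirical model, is given below; the capacity and per-cell voltage curve are made-up numbers, not the EOS testbed's Ni-H2 model.

```python
import numpy as np

# Sketch of the two software duties the abstract describes (not the actual
# EOS model): integrate current to track ampere-hours, then command a
# terminal voltage from an empirical per-cell V(state-of-charge) curve.
CAPACITY_AH = 50.0
soc_pts = np.array([0.0, 0.2, 0.5, 0.8, 1.0])            # made-up Ni-H2 curve
v_cell_pts = np.array([1.15, 1.22, 1.27, 1.32, 1.40])    # volts per cell

def step(ah, current_a, dt_s):
    """Advance one step: positive current = charge. Returns (ah, cell volts)."""
    ah = min(max(ah + current_a * dt_s / 3600.0, 0.0), CAPACITY_AH)
    v = float(np.interp(ah / CAPACITY_AH, soc_pts, v_cell_pts))
    return ah, v

ah = 25.0
for current in (10.0, 10.0, -15.0):      # two charge steps, one discharge
    ah, v = step(ah, current, dt_s=600.0)
    print(f"{ah:5.2f} Ah  {v:.3f} V/cell")
```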
Classification software technique assessment
NASA Technical Reports Server (NTRS)
Jayroe, R. R., Jr.; Atkinson, R.; Dasarathy, B. V.; Lybanon, M.; Ramapryian, H. K.
1976-01-01
A catalog of software options is presented for the use of local user communities to obtain software for analyzing remotely sensed multispectral imagery. The resources required to utilize a particular software program are described. Descriptions of how a particular program analyzes data and the performance of that program for an application and data set provided by the user are shown. An effort is made to establish a statistical performance base for various software programs with regard to different data sets and analysis applications, to determine the status of the state-of-the-art.
Gaining Control and Predictability of Software-Intensive Systems Development and Sustainment
2015-02-04
implementation of the baselines, audits, and technical reviews within an overarching systems engineering process (SEP; Defense Acquisition University...warfighters' needs. This management and metrics effort supplements and supports the system's technical development through the baselines, audits and...other areas that could be researched and added into the nine-tier model. Areas including software metrics, quality assurance, software-oriented
Open Source Software in Teaching Physics: A Case Study on Vector Algebra and Visual Representations
ERIC Educational Resources Information Center
Cataloglu, Erdat
2006-01-01
This study aims to report the effort on teaching vector algebra using free open source software (FOSS). Recent studies showed that students have difficulties in learning basic physics concepts. Constructivist learning theories suggest the use of visual and hands-on activities in learning. We will report on the software used for this purpose. The…
2011-02-01
written in C and assembly languages. 2) executable code for the low-power wakeup controller in the tag. This software is responsible for the VHF...used in the tag software. The multi-rate processing in the new tag necessitated a more complex task-scheduling software architecture. The effort of
ERIC Educational Resources Information Center
Farzaneh, Mandana; Vanani, Iman Raeesi; Sohrabi, Babak
2012-01-01
E-learning is one of the most important learning approaches within which intelligent software agents can be efficiently used so as to automate and facilitate the process of learning. The aim of this paper is to illustrate a comprehensive categorization of intelligent software agent features, which is valuable for being deployed in the virtual…
An Analysis of Botnet Vulnerabilities
2007-06-01
Currently, the primary defense against botnets is prompt patching of vulnerable systems and antivirus software. Network monitoring can identify...IRCd software, none were identified during this effort....are software agents designed to automatically perform tasks. Examples include web-spiders that catalog the Internet and bots found in popular online
ERIC Educational Resources Information Center
Woody, Jane D.; D'Souza, Henry J.; Dartman, Rebecca
2006-01-01
Objective: A questionnaire to examine efforts toward the teaching of empirically supported interventions (ESI) was mailed to the 165 deans and directors of Council on Social Work Education-accredited Master's in social work (MSW) programs; 66 (40%) responded. Method: Questions included program characteristics and items assessing both faculty and…
Management of Inclusive Education in Oman: A Framework for Action
ERIC Educational Resources Information Center
Mohamed Emam, Mahmoud
2016-01-01
Inclusive education (IE) and the special education services related to it are relatively new in Oman. Efforts to manage special/inclusive education face many challenges due to a number of culturally rooted factors. Further, empirical research on IE in Oman is scarce and there is a need to advance IE discourse based on empirically validated…
NASA Technical Reports Server (NTRS)
Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David
1990-01-01
This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.
A Human Reliability Based Usability Evaluation Method for Safety-Critical Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Philippe Palanque; Regina Bernhaupt; Ronald Boring
2006-04-01
Recent years have seen an increasing use of sophisticated interaction techniques, including in the field of safety-critical interactive software [8]. The use of such techniques has been required in order to increase the bandwidth between users and systems and thus to help them deal efficiently with increasingly complex systems. These techniques come from research and innovation done in the field of human-computer interaction (HCI). A significant effort is currently being undertaken by the HCI community in order to apply and extend current usability evaluation techniques to these new kinds of interaction techniques. However, very little has been done to improve the reliability of software offering these kinds of interaction techniques. Even testing basic graphical user interfaces remains a challenge that has rarely been addressed in the field of software engineering [9]. However, the non-reliability of interactive software can jeopardize usability evaluation by showing unexpected or undesired behaviors. The aim of this SIG is to provide a forum for both researchers and practitioners interested in testing interactive software. Our goal is to define a roadmap of activities to cross-fertilize usability and reliability testing of these kinds of systems and to minimize duplicate efforts in both communities.
NASA Astrophysics Data System (ADS)
He, Xin
2017-03-01
The ideal observer is widely used in imaging system optimization. One practical question remains open: do the ideal and human observers have the same preferences in system optimization and evaluation? Based on the ideal observer's mathematical properties proposed by Barrett et al. and the empirical properties of human observers investigated by Myers et al., I attempt to pursue general rules regarding the applicability of the ideal observer in system optimization. In particular, in software optimization, the ideal observer pursues data conservation while humans pursue data presentation or perception. In hardware optimization, the ideal observer pursues a system with the maximum total information, while humans pursue a system with the maximum selected (e.g., certain frequency bands) information. These different objectives may result in different system optimizations for human and ideal observers. Thus, an ideal-observer-optimized system is not necessarily optimal for humans. I cite empirical evidence in search and detection tasks, in hardware and software evaluation, in X-ray CT, pinhole imaging, and emission computed tomography to corroborate these claims. (Disclaimer: the views expressed in this work do not necessarily represent those of the FDA)
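The data-conservation point can be made concrete for the Gaussian case, where the ideal observer reduces to a prewhitening matched filter with detectability SNR² = sᵀK⁻¹s: any invertible reprocessing of the data leaves this figure of merit unchanged, even if it changes how the images look to a human. The signal and covariance below are made up for illustration.

```python
import numpy as np

# For a known signal s in Gaussian noise with covariance K, the ideal observer
# reduces to the prewhitening matched filter with detectability
#     SNR^2 = s^T K^{-1} s.
rng = np.random.default_rng(0)
n = 16
s = np.exp(-0.5 * ((np.arange(n) - 8) / 2.0) ** 2)   # small Gaussian blob
A = rng.normal(size=(n, n))
K = A @ A.T / n + np.eye(n)                          # SPD noise covariance

snr2 = float(s @ np.linalg.solve(K, s))
print(f"ideal-observer SNR^2 = {snr2:.2f}")

# Invertible reprocessing H maps s -> Hs and K -> H K H^T, leaving SNR^2
# unchanged: the ideal observer is indifferent to reversible changes in data
# presentation, which is exactly where it can diverge from human observers.
H = np.eye(n) + 0.2 * np.diag(np.ones(n - 1), 1)
snr2_h = float((H @ s) @ np.linalg.solve(H @ K @ H.T, H @ s))
print(f"after invertible reprocessing  = {snr2_h:.2f}")  # same value
```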
Improving a data-acquisition software system with abstract data type components
NASA Technical Reports Server (NTRS)
Howard, S. D.
1990-01-01
Abstract data types and object-oriented design are active research areas in computer science and software engineering. Much of the interest is aimed at new software development. Abstract data type packages developed for a discontinued software project were used to improve a real-time data-acquisition system under maintenance. The result saved effort and contributed to a significant improvement in the performance, maintainability, and reliability of the Goldstone Solar System Radar Data Acquisition System.
Software Engineering Laboratory (SEL) cleanroom process model
NASA Technical Reports Server (NTRS)
Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon
1991-01-01
The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where ICs are assembled in dust-free 'clean rooms' to prevent the destructive effects of dust. When applying the cleanroom methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.
A Model Independent S/W Framework for Search-Based Software Testing
Baik, Jongmoon
2014-01-01
In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT, and if the model type changes, every function of a search technique must be reimplemented, even when the same search technique is applied, because the model types differ. It takes too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when changing the type of a model. PMID:25302314
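As a rough sketch of this design (Python; all class and function names below are illustrative assumptions, not the framework's actual API), the search technique can be written once against an abstract model adapter, so changing the model type means writing a new adapter rather than reimplementing the search:

    # Minimal sketch of a model-independent SBST interface. The search algorithm
    # sees only an abstract candidate/fitness interface, so swapping the model
    # type does not require reimplementing the search technique.
    import random
    from abc import ABC, abstractmethod

    class ModelAdapter(ABC):
        """Adapts a model type (state machine, activity diagram, ...) to the search."""
        @abstractmethod
        def random_candidate(self): ...
        @abstractmethod
        def mutate(self, candidate): ...
        @abstractmethod
        def fitness(self, candidate) -> float:
            """Higher is better, e.g. transition coverage achieved by a test case."""
            ...

    def hill_climb(model: ModelAdapter, iterations: int = 1000):
        # Works unchanged for any ModelAdapter implementation.
        best = model.random_candidate()
        best_fit = model.fitness(best)
        for _ in range(iterations):
            cand = model.mutate(best)
            fit = model.fitness(cand)
            if fit > best_fit:
                best, best_fit = cand, fit
        return best, best_fit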
Sharing Research Models: Using Software Engineering Practices for Facilitation
Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.
2011-01-01
Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices— the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780
NASA Technical Reports Server (NTRS)
Denny, Barbara A.; McKenney, Paul E., Sr.; Lee, Danny
1994-01-01
This document is Volume 3 of the final technical report on the work performed by SRI International (SRI) on SRI Project 8600. The document includes source listings for all software developed by SRI under this effort. Since some of our work involved the use of ST-II and the Sun Microsystems, Inc. (Sun) High-Speed Serial Interface (HSI/S) driver, we have included some of the source developed by LBL and BBN as well. In most cases, our decision to include source developed by other contractors depended on whether it was necessary to modify the original code. If we have modified the software in any way, it is included in this document. In the case of the Traffic Generator (TG), however, we have included all the ST-II software, even though BBN performed the integration, because the ST-II software is part of the standard TG release. It is important to note that all the code developed by other contractors is in the public domain, so that all software developed under this effort can be re-created from the source included here.
Computational Models of Anterior Cingulate Cortex: At the Crossroads between Prediction and Effort.
Vassena, Eliana; Holroyd, Clay B; Alexander, William H
2017-01-01
In the last two decades the anterior cingulate cortex (ACC) has become one of the most investigated areas of the brain. Extensive neuroimaging evidence suggests countless functions for this region, ranging from conflict and error coding, to social cognition, pain and effortful control. In response to this burgeoning amount of data, a proliferation of computational models has tried to characterize the neurocognitive architecture of ACC. Early seminal models provided a computational explanation for a relatively circumscribed set of empirical findings, mainly accounting for EEG and fMRI evidence. More recent models have focused on ACC's contribution to effortful control. In parallel to these developments, several proposals attempted to explain within a single computational framework a wider variety of empirical findings that span different cognitive processes and experimental modalities. Here we critically evaluate these modeling attempts, highlighting the continued need to reconcile the array of disparate ACC observations within a coherent, unifying framework.
Chen, Ya-Chen; Hsiao, Tzu-Chien
2018-07-01
The respiratory inductance plethysmography (RIP) sensor is an inexpensive, non-invasive, easy-to-use transducer for collecting respiratory movement data. Studies have reported that the RIP signal's amplitude and frequency can be used to discriminate respiratory diseases. However, with the conventional approach to RIP data analysis, respiratory muscle effort cannot be estimated. In this paper, estimation of respiratory muscle effort from the RIP signal is proposed. A complementary ensemble empirical mode decomposition method was used to extract hidden signals from the RIP signals based on the frequency bands of the activities of different respiratory muscles. To validate the proposed method, an experiment was conducted to collect subjects' RIP signals under thoracic breathing (TB) and abdominal breathing (AB). The experimental results for both TB and AB indicate that the proposed method can be used to loosely estimate the activities of the thoracic muscles, abdominal muscles, and diaphragm.
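A minimal sketch of the decomposition step, assuming the third-party PyEMD package and using its CEEMDAN class as a stand-in for the paper's complementary ensemble EMD; the sampling rate, synthetic signal, and band-to-muscle mapping are all illustrative assumptions:

    # Decompose an RIP trace into intrinsic mode functions (IMFs) and inspect
    # each IMF's dominant frequency band.
    import numpy as np
    from PyEMD import CEEMDAN

    fs = 25.0                                   # sampling rate in Hz (assumed)
    t = np.arange(0, 30, 1 / fs)
    # Synthetic RIP: slow breathing component plus a faster low-amplitude component.
    rip = np.sin(2 * np.pi * 0.25 * t) + 0.3 * np.sin(2 * np.pi * 1.5 * t)

    imfs = CEEMDAN()(rip)                       # ensemble EMD of the signal

    for i, imf in enumerate(imfs):
        # Estimate dominant frequency from the zero-crossing rate.
        zc = np.count_nonzero(np.diff(np.signbit(imf)))
        dom_freq = zc / 2 / (len(imf) / fs)
        # Low bands would be attributed to diaphragm/abdominal activity, higher
        # bands to thoracic activity (an assumed mapping, not the paper's).
        print(f"IMF {i}: ~{dom_freq:.2f} Hz")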
DenInv3D: a geophysical software for three-dimensional density inversion of gravity field data
NASA Astrophysics Data System (ADS)
Tian, Yu; Ke, Xiaoping; Wang, Yong
2018-04-01
This paper presents a three-dimensional density inversion software package called DenInv3D that operates on gravity and gravity gradient data. The software performs inversion modelling, kernel function calculation, and inversion calculations using an improved preconditioned conjugate gradient (PCG) algorithm. Because empirical parameters such as the Lagrange multiplier are uncertain in the PCG algorithm, we use the inflection point of the L-curve as the regularisation parameter. The software can construct unequally spaced grids and perform inversions using such grids, which enables changing the resolution of the inversion results at different depths. Through inversion of airborne gradiometry data from the Australian Kauring test site, we discovered that anomalous blocks of different sizes are present within the study area in addition to the central anomalies. The DenInv3D software can be downloaded from http://159.226.162.30.
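The L-curve selection rule mentioned above can be illustrated with a small Tikhonov-regularized least-squares problem; this is a sketch with a random stand-in kernel, not DenInv3D's gravity kernel or its PCG solver:

    # Pick the regularization parameter at the L-curve corner (maximum curvature
    # of the log-residual vs. log-solution-norm curve).
    import numpy as np

    rng = np.random.default_rng(0)
    G = rng.normal(size=(80, 40))               # stand-in kernel matrix
    d = G @ rng.normal(size=40) + 0.05 * rng.normal(size=80)

    lams = np.logspace(-4, 2, 60)
    res_norm, sol_norm = [], []
    for lam in lams:
        A = G.T @ G + lam * np.eye(40)
        m = np.linalg.solve(A, G.T @ d)         # normal equations; PCG in the real code
        res_norm.append(np.log(np.linalg.norm(G @ m - d)))
        sol_norm.append(np.log(np.linalg.norm(m)))

    # Discrete curvature of the L-curve; the corner maximizes it.
    x, y = np.array(res_norm), np.array(sol_norm)
    dx, dy = np.gradient(x), np.gradient(y)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    kappa = (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5
    print("chosen lambda:", lams[np.nanargmax(np.abs(kappa))])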
A theoretical basis for the analysis of multiversion software subject to coincident errors
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.; Lee, L. D.
1985-01-01
Fundamental to the development of redundant software techniques (known as fault-tolerant software) is an understanding of the impact of multiple joint occurrences of errors, referred to here as coincident errors. A theoretical basis for the study of redundant software is developed which: (1) provides a probabilistic framework for empirically evaluating the effectiveness of a general multiversion strategy when component versions are subject to coincident errors, and (2) permits an analytical study of the effects of these errors. An intensity function, called the intensity of coincident errors, has a central role in this analysis. This function describes the propensity of programmers to introduce design faults in such a way that software components fail together when executing in the application environment. A condition under which a multiversion system is a better strategy than relying on a single version is given.
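A small Monte Carlo sketch can make the paper's point concrete: if the per-input failure probability (playing the role of the intensity function) varies across inputs, versions tend to fail together, and majority voting gains less than the independent-failure idealization suggests. All parameters below are invented for illustration:

    # Three-version majority voting under constant vs. input-varying failure intensity.
    import random

    random.seed(1)

    def trial(intensity):
        """One input: 3 versions fail independently given the input's intensity."""
        fails = [random.random() < intensity for _ in range(3)]
        return fails[0], sum(fails) >= 2        # (single version wrong, majority wrong)

    def run(intensities, n=200_000):
        single = majority = 0
        for _ in range(n):
            s, m = trial(random.choice(intensities))
            single += s
            majority += m
        return single / n, majority / n

    # Both cases have the same mean failure probability (0.05), but in the
    # second case the intensity varies by input, inducing coincident errors.
    print("constant intensity:", run([0.05]))
    print("varying intensity: ", run([0.0, 0.0, 0.0, 0.2]))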
Consolidated View on Space Software Engineering Problems - An Empirical Study
NASA Astrophysics Data System (ADS)
Silva, N.; Vieira, M.; Ricci, D.; Cotroneo, D.
2015-09-01
Independent software verification and validation (ISVV) has been a key process for engineering quality assessment for decades, and is considered in several international standards. The "European Space Agency (ESA) ISVV Guide" is used in the European Space market to drive ISVV tasks and plans and to select applicable tasks and techniques. Software artefacts still have room for improvement, as evidenced by the number of issues found during ISVV tasks. This article presents an analysis of a large set of ISVV issues originating from three different ESA missions, amounting to more than 1000 issues. The study presents the main types, triggers, and impacts of the ISVV issues found and sets the path for a global software engineering improvement based on the most common deficiencies identified in space projects.
Profile of software engineering within the National Aeronautics and Space Administration (NASA)
NASA Technical Reports Server (NTRS)
Sinclair, Craig C.; Jeletic, Kellyann F.
1994-01-01
This paper presents findings of baselining activities being performed to characterize software practices within the National Aeronautics and Space Administration. It describes how such baseline findings might be used to focus software process improvement activities. Finally, based on the findings to date, it presents specific recommendations in focusing future NASA software process improvement efforts. The findings presented in this paper are based on data gathered and analyzed to date. As such, the quantitative data presented in this paper are preliminary in nature.
Open source molecular modeling.
Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan
2016-09-01
The success of molecular modeling and computational chemistry efforts is, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io.
Earth System Modeling Framework (ESMF) Software and Application Development
Deluca, Cecelia (NESII/CIRES/NOAA Earth System Research Laboratory, Boulder, CO)
2015-09-30
The components of the Earth System Prediction Suite (ESPS) originate from NASA, NOAA, and community modeling efforts, and support for creation of the suite was shared by sponsors from other agencies. The National Unified Operational Prediction Capability (NUOPC) was established between NOAA and the Navy to develop a common software architecture for easy and efficient interoperability.
ERIC Educational Resources Information Center
Patel, Sunil S.
2013-01-01
Social software technology has gained considerable popularity over the last decade and has had a great impact on hundreds of millions of people across the globe. Businesses have also expressed their interest in leveraging its use in business contexts. As a result, software vendors and business consumers have invested billions of dollars to use…
Power subsystem automation study
NASA Technical Reports Server (NTRS)
Tietz, J. C.; Sewy, D.; Pickering, C.; Sauers, R.
1984-01-01
The purpose of phase 2 of the power subsystem automation study was to demonstrate the feasibility of using computer software to manage an aspect of the electrical power subsystem on a space station. The state of the art in expert systems software was investigated in this study. This effort resulted in the demonstration of prototype expert system software for managing one aspect of a simulated space station power subsystem.
Improving software quality - The use of formal inspections at the Jet Propulsion Laboratory
NASA Technical Reports Server (NTRS)
Bush, Marilyn
1990-01-01
The introduction of software formal inspections (Fagan inspections) at JPL for finding and fixing defects early in the software development life cycle is reviewed. It is estimated that, by the year 2000, software will account for as much as 80 percent of the total effort on some projects. Software problems are especially important at NASA, as critical flight software must be error-free. It is shown that formal inspections are particularly effective at finding and removing defects having to do with clarity, correctness, consistency, and completeness. A very significant discovery was that code audits were not as effective at finding defects as code inspections.
Software Engineering Improvement Plan
NASA Technical Reports Server (NTRS)
2006-01-01
In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.
Induced Innovation and Social Inequality: Evidence from Infant Medical Care
ERIC Educational Resources Information Center
Cutler, David M.; Meara, Ellen; Richards-Shubik, Seth
2012-01-01
We develop a model of induced innovation that applies to medical research. Our model yields three empirical predictions. First, initial death rates and subsequent research effort should be positively correlated. Second, research effort should be associated with more rapid mortality declines. Third, as a byproduct of targeting the most common…
ERIC Educational Resources Information Center
Arens, A. Katrin; Hasselhorn, Marcus
2015-01-01
This study aimed to address two underexplored research questions regarding support for the separation between competence and affect self-perceptions due to differential relations to outcome criteria. First, it is tested whether higher relations between affect self-perceptions and effort than between competence self-perceptions and effort can also…
ERIC Educational Resources Information Center
Tempelaar, Dirk T.; Rienties, Bart; Giesbers, Bas; Gijselaers, Wim H.
2015-01-01
Empirical studies into meaning systems surrounding implicit theories of intelligence typically entail two stringent assumptions: that different implicit theories and different effort beliefs represent opposite poles on a single scale, and that implicit theories directly impact constructs such as achievement goals and academic motivations. Through…
Effect of Learning Activity on Students' Motivation, Physical Activity Levels and Effort/Persistence
ERIC Educational Resources Information Center
Gao, Zan; Lee, Amelia M.; Xiang, Ping; Kosma, Maria
2011-01-01
The type of learning activity offered in physical education may influence students' motivational beliefs, physical activity participation and effort/persistence in class. However, most empirical studies have focused on the individual level rather than on the learner-content interactions. Accordingly, the potential effects of learning activities on…
ERIC Educational Resources Information Center
Gaudino, James L.; Steele, Michael E.
To investigate whether researchers are developing empirically-based public relations research efforts, and whether such efforts could be considered useful to public relations practitioners, a study conducted a content analysis of all articles published in "Public Relations Review" from 1977 through 1987. Articles (196 were coded in all)…
Shah, Hemant; Allard, Raymond D; Enberg, Robert; Krishnan, Ganesh; Williams, Patricia; Nadkarni, Prakash M
2012-03-09
A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable. Further, additional requirements emerge as a result of such analysis. During such an analysis, study of examples of existing, software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated additional requirements related to production-system robustness and functionality from publications in the business workflow domain, in addition to drawing on our own experience in the development of the Proteus guideline system (http://proteme.org). The sub-requirements are discussed by conveniently grouping them into the categories used by the review of Isern and Moreno 2008. We cite previous work under each category and then provide sub-requirements under each category, and provide example of similar work in software-engineering efforts that have addressed a similar problem in a non-biomedical context. When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies.
Securing Ground Data System Applications for Space Operations
NASA Technical Reports Server (NTRS)
Pajevski, Michael J.; Tso, Kam S.; Johnson, Bryan
2014-01-01
The increasing prevalence and sophistication of cyber attacks has prompted the Multimission Ground Systems and Services (MGSS) Program Office at Jet Propulsion Laboratory (JPL) to initiate the Common Access Manager (CAM) effort to protect software applications used in Ground Data Systems (GDSs) at JPL and other NASA Centers. The CAM software provides centralized services and software components used by GDS subsystems to meet access control requirements and ensure data integrity, confidentiality, and availability. In this paper we describe the CAM software; examples of its integration with spacecraft commanding software applications and an information management service; and measurements of its performance and reliability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen
2012-01-01
Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.
Data Centric Development Methodology
ERIC Educational Resources Information Center
Khoury, Fadi E.
2012-01-01
Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as waterfall or the Rational Unified Process, as the framework for development. These methodologies can work for structural, procedural, or object-oriented applications, but fail to capture…
NASA Technical Reports Server (NTRS)
Rosenberg, Linda H.; Arthur, James D.; Stapko, Ruth K.; Davani, Darush
1999-01-01
The Software Assurance Technology Center (SATC) at NASA Goddard Space Flight Center has been investigating how projects can determine when sufficient testing has been completed. For most projects, schedules are underestimated, and the last phase of the software development, testing, must be decreased. Two questions are frequently asked: "To what extent is the software error-free?" and "How much time and effort is required to detect and remove the remaining errors?" Clearly, neither question can be answered with absolute certainty. Nonetheless, the ability to answer these questions with some acceptable level of confidence is highly desirable. First, knowing the extent to which a product is error-free, we can judge when it is time to terminate testing. Secondly, if errors are judged to be present, we can perform a cost/benefit trade-off analysis to estimate when the software will be ready for use and at what cost. This paper explains the efforts of the SATC to help projects determine what constitutes sufficient testing and when the most cost-effective time to stop testing is.
NASA Astrophysics Data System (ADS)
Yetman, G.; Downs, R. R.
2011-12-01
Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and addresses internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.
Key Questions in Building Defect Prediction Models in Practice
NASA Astrophysics Data System (ADS)
Ramler, Rudolf; Wolfmaier, Klaus; Stauder, Erwin; Kossak, Felix; Natschläger, Thomas
The information about which modules of a future version of a software system are defect-prone is a valuable planning aid for quality managers and testers. Defect prediction promises to indicate these defect-prone modules. However, constructing effective defect prediction models in an industrial setting involves a number of key questions. In this paper we discuss ten key questions identified in the context of establishing defect prediction in a large software development project. Seven consecutive versions of the software system have been used to construct and validate defect prediction models for system test planning. Furthermore, the paper presents initial empirical results from the studied project and, by this means, contributes answers to the identified questions.
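The cross-version validation setup described above can be sketched as follows, assuming scikit-learn is available; the module metrics and synthetic data are invented stand-ins for the project's real measurements:

    # Fit a defect classifier on module metrics from version N, then evaluate it
    # on version N+1, mirroring the consecutive-version study design.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import precision_score, recall_score

    rng = np.random.default_rng(7)

    def fake_version(n=500):
        X = rng.normal(size=(n, 3))                      # e.g. size, churn, complexity
        y = (X @ np.array([0.8, 0.6, 0.4]) + rng.normal(size=n)) > 1.2
        return X, y.astype(int)                          # 1 = defect-prone module

    X_train, y_train = fake_version()                    # version N
    X_test, y_test = fake_version()                      # version N+1

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train)
    pred = clf.predict(X_test)
    print("precision:", precision_score(y_test, pred),
          "recall:", recall_score(y_test, pred))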
NASA Technical Reports Server (NTRS)
Tijidjian, Raffi P.
2010-01-01
The TEAMS model analyzer is a supporting tool developed to work with models created with TEAMS (Testability, Engineering, and Maintenance System), which was developed by QSI. In an effort to reduce the time each TEAMS modeler spends on the manual preparation of reports for model reviews, a new tool has been developed as an aid for models developed in TEAMS. The software allows for the viewing, reporting, and checking of TEAMS models that are checked into the TEAMS model database. The software lets the user selectively view the model in a hierarchical tree outline that displays the components, failure modes, and ports. The reporting features allow the user to quickly gather statistics about the model and generate an input/output report covering all of the components. Rules can be automatically validated against the model, with a report generated containing any resulting inconsistencies. In addition to reducing manual effort, this software also provides an automated process framework for the Verification and Validation (V&V) effort that will follow development of these models. The aid of such an automated tool would have a significant impact on the V&V process.
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
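The cause-effect structure hypothesized above can be illustrated with a toy network; the probabilities below are invented for the sketch and are not the paper's calibrated conditional probability tables:

    # Toy belief network: team skill, process maturity, and problem complexity
    # drive product suitability. Query by enumeration over the parent variables.
    from itertools import product

    p_skill = {True: 0.6, False: 0.4}            # P(high team skill)
    p_mature = {True: 0.5, False: 0.5}           # P(mature process)

    def p_suitable(skill, mature, cmplx):
        # Invented conditional probability of a suitable end product.
        base = 0.25 + 0.35 * skill + 0.25 * mature - 0.20 * cmplx
        return min(max(base, 0.0), 1.0)

    # P(suitable | high problem complexity), marginalizing over skill and maturity.
    num = den = 0.0
    for s, m in product([True, False], repeat=2):
        w = p_skill[s] * p_mature[m]
        num += w * p_suitable(s, m, True)
        den += w
    print("P(suitable | high complexity) =", num / den)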
Open-source software: not quite endsville.
Stahl, Matthew T
2005-02-01
Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.
Survivability as a Tool for Evaluating Open Source Software
2015-06-01
…the thesis limited the program development, so it is only able to process project issues (bugs or feature requests), which is an important metric for… Ideally, these insights may provide an analytic framework to generate guidance for decision makers that may support the inclusion of OSS to more… refine their efforts to build quality software and to strengthen their software development communities.
Development of a Communications Front End Processor (FEP) for the VAX-11/780 Using an LSI-11/23.
1983-12-01
…proven to be useful [25] during the Software Development Life Cycle of a project. Development tools and documentation aids used throughout this effort… include "Structure Charts" (ref Appendix B), a "Data Dictionary" (ref Appendix C), and a Program Design Language (PDL).
Evaluation and Validation (E&V) Team Public Report. Volume 5
1990-10-31
…aspects, software engineering practices, etc. The E&V requirements which are developed will be used to guide the E&V technical effort. The currently… interoperability of Ada software engineering environment tools and data. The scope of the CAIS-A includes the functionality affecting transportability that is… requirement that they be CAIS-conforming tools or data. That is, for example, numerous CIVC data exist on special-purpose software currently available…
Warfighting Concepts to Future Weapon System Designs (WARCON)
2003-09-12
Deliverables may include software design documents; a material list; cost information that may support, or may give rise to, litigation; and final engineering process maps… The document may include the design of the system as derived from the SRD (engineering design, software development)… It is important to establish a standard, formal document early in the development phase, as software engineers produce the vision of the design effort…
An investigation of error characteristics and coding performance
NASA Technical Reports Server (NTRS)
Ebel, William J.; Ingels, Frank M.
1993-01-01
The first year's effort on NASA Grant NAG5-2006 was an investigation to characterize typical errors resulting from the EOS downlink. The analysis methods developed for this effort were used on test data from a March 1992 White Sands Terminal Test. The effectiveness of a concatenated coding scheme with a Reed-Solomon outer code and a convolutional inner code, versus a Reed-Solomon-only coding scheme, has been investigated, as well as the effectiveness of a periodic convolutional interleaver in dispersing errors of certain types. The work effort consisted of development of software that allows simulation studies with the appropriate coding schemes plus either simulated data with errors or actual data with errors. The software program is entitled Communication Link Error Analysis (CLEAN) and models downlink errors, forward error correcting schemes, and interleavers.
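The role of the interleaver studied in this effort can be sketched in a few lines: a depth-D block interleaver spreads a channel burst across D codewords, so each Reed-Solomon word sees only a few symbol errors. The sizes and burst below are illustrative, not the grant's actual parameters:

    # Block interleaver demonstration: a 12-symbol burst is dispersed so that no
    # codeword receives more than a couple of symbol errors.
    import numpy as np

    depth, width = 8, 32                          # 8 codewords of 32 symbols each
    data = np.arange(depth * width)               # row i of the matrix = codeword i

    # Interleave: write codewords row-by-row, transmit column-by-column.
    tx = data.reshape(depth, width).T.reshape(-1).copy()

    tx[100:112] = -1                              # a 12-symbol channel burst

    # De-interleave and count corrupted symbols per codeword.
    rx = tx.reshape(width, depth).T
    for i, word in enumerate(rx):
        print(f"codeword {i}: {np.count_nonzero(word == -1)} symbol errors")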
Data analysis and software support for the Earth radiation budget experiment
NASA Technical Reports Server (NTRS)
Edmonds, W.; Natarajan, S.
1987-01-01
Computer programming and data analysis efforts were performed in support of the Earth Radiation Budget Experiment (ERBE) at NASA/Langley. A brief description of the ERBE followed by sections describing software development and data analysis for both prelaunch and postlaunch instrument data are presented.
The IEEE Software Engineering Standards Process
Buckley, Fletcher J.
1984-01-01
Software Engineering has emerged as a field in recent years, and those involved increasingly recognize the need for standards. As a result, members of the Institute of Electrical and Electronics Engineers (IEEE) formed a subcommittee to develop these standards. This paper discusses the ongoing standards development, and associated efforts.
Using Combined SFTA and SFMECA Techniques for Space Critical Software
NASA Astrophysics Data System (ADS)
Nicodemos, F. G.; Lahoz, C. H. N.; Abdala, M. A. D.; Saotome, O.
2012-01-01
This work addresses the combined Software Fault Tree Analysis (SFTA) and Software Failure Modes, Effects and Criticality Analysis (SFMECA) techniques applied to space-critical software of satellite launch vehicles. The combined approach is under research as part of the Verification and Validation (V&V) efforts to increase software dependability, with future application foreseen in other projects under development at Instituto de Aeronáutica e Espaço (IAE). The applicability of such an approach was evaluated on a system software specification and applied to a case study based on the Brazilian Satellite Launcher (VLS). The main goal is to identify possible failure causes and obtain compensating provisions that lead to the inclusion of new functional and non-functional system software requirements.
Software development predictors, error analysis, reliability models and software metric analysis
NASA Technical Reports Server (NTRS)
Basili, Victor
1983-01-01
The use of dynamic characteristics as predictors for software development was studied. It was found that there are some significant factors that could be useful as predictors. From a study on software errors and complexity, it was shown that meaningful results can be obtained which allow insight into software traits and the environment in which it is developed. Reliability models were studied. The research included the field of program testing because the validity of some reliability models depends on the answers to some unanswered questions about testing. In studying software metrics, data collected from seven Software Engineering Laboratory (FORTRAN) projects were examined, and three effort reporting accuracy checks were applied to demonstrate the need to validate a data base. Results are discussed.
NASA Technical Reports Server (NTRS)
1992-01-01
This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.
Maureen V. Duane; Warren B. Cohen; John L. Campbell; Tara Hudiburg; David P. Turner; Dale Weyermann
2010-01-01
Empirical models relating forest attributes to remotely sensed metrics are widespread in the literature and underpin many of our efforts to map forest structure across complex landscapes. In this study we compared empirical models relating Landsat reflectance to forest age across Oregon using two alternate sets of ground data: one from a large (n ~ 1500) systematic...
IETM Usability: Using Empirical Studies to Improve Performance Aiding
2001-05-14
…is highly graphical. An example screen is shown in Figure 3. Following is a description of the design solutions incorporated in Interface B to… in enabling lesser-skilled U.S. Navy maintainers to perform their jobs. This empirical study about the design and effectiveness of high-level IETMs… novice preferences and performance were documented to inform future adaptive interface design efforts. The maintainers who participated were…
Empirical Knowledge Transfer and Collaboration with Self-Regenerative Systems
2007-06-01
Raytheon Company; sponsored by the Defense Advanced Research Projects Agency (DARPA Order No. T120, contract FA8750-04-C-0286); approved for public release. The work was performed under the DARPA Self-Regenerative Systems program to develop new technologies supporting granular scalable redundancy. The key focus of Raytheon's effort was to…
Space shuttle onboard navigation console expert/trainer system
NASA Technical Reports Server (NTRS)
Wang, Lui; Bochsler, Dan
1987-01-01
A software system for use in enhancing operational performance as well as training ground controllers in monitoring onboard Space Shuttle navigation sensors is described. The Onboard Navigation (ONAV) development reflects a trend toward following a structured and methodical approach to development. The ONAV system must deal with integrated conventional and expert system software, complex interfaces, and implementation limitations due to the target operational environment. An overview of the onboard navigation sensor monitoring function is presented, along with a description of guidelines driving the development effort, requirements that the system must meet, current progress, and future efforts.
An approach to software cost estimation
NASA Technical Reports Server (NTRS)
Mcgarry, F.; Page, J.; Card, D.; Rohleder, M.; Church, V.
1984-01-01
A general procedure for software cost estimation in any environment is outlined. The basic concepts of work and effort estimation are explained, some popular resource estimation models are reviewed, and the accuracy of source estimates is discussed. A software cost prediction procedure based on the experiences of the Software Engineering Laboratory in the flight dynamics area and incorporating management expertise, cost models, and historical data is described. The sources of information and relevant parameters available during each phase of the software life cycle are identified. The methodology suggested incorporates these elements into a customized management tool for software cost prediction. Detailed guidelines for estimation in the flight dynamics environment developed using this methodology are presented.
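The customized-model idea can be sketched by calibrating a generic COCOMO-style effort equation (effort = a * KLOC^b) against local historical data; the project data below are invented for illustration and are not SEL measurements:

    # Calibrate a power-law effort model to local project history, then estimate
    # a new project. log(effort) = log(a) + b * log(size) is fit by least squares.
    import numpy as np

    # (size in KLOC, actual effort in staff-months) from past projects (fake data)
    hist = np.array([(10, 24), (32, 92), (55, 180), (80, 290), (120, 470)])

    logs = np.log(hist)
    b, log_a = np.polyfit(logs[:, 0], logs[:, 1], 1)
    a = np.exp(log_a)
    print(f"calibrated model: effort = {a:.2f} * KLOC^{b:.2f}")

    new_size = 60.0
    print("estimate for 60 KLOC:", a * new_size**b, "staff-months")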
SDDL- SOFTWARE DESIGN AND DOCUMENTATION LANGUAGE
NASA Technical Reports Server (NTRS)
Kleine, H.
1994-01-01
Effective, efficient communication is an essential element of the software development process. The Software Design and Documentation Language (SDDL) provides an effective communication medium to support the design and documentation of complex software applications. SDDL supports communication between all the members of a software design team and provides for the production of informative documentation on the design effort. Even when an entire development task is performed by a single individual, it is important to explicitly express and document communication between the various aspects of the design effort including concept development, program specification, program development, and program maintenance. SDDL ensures that accurate documentation will be available throughout the entire software life cycle. SDDL offers an extremely valuable capability for the design and documentation of complex programming efforts ranging from scientific and engineering applications to data management and business systems. Throughout the development of a software design, the SDDL generated Software Design Document always represents the definitive word on the current status of the ongoing, dynamic design development process. The document is easily updated and readily accessible in a familiar, informative form to all members of the development team. This makes the Software Design Document an effective instrument for reconciling misunderstandings and disagreements in the development of design specifications, engineering support concepts, and the software design itself. Using the SDDL generated document to analyze the design makes it possible to eliminate many errors that might not be detected until coding and testing is attempted. As a project management aid, the Software Design Document is useful for monitoring progress and for recording task responsibilities. SDDL is a combination of language, processor, and methodology. The SDDL syntax consists of keywords to invoke design structures and a collection of directives which control processor actions. The designer has complete control over the choice of keywords, commanding the capabilities of the processor in a way which is best suited to communicating the intent of the design. The SDDL processor translates the designer's creative thinking into an effective document for communication. The processor performs as many automatic functions as possible, thereby freeing the designer's energy for the creative effort. Document formatting includes graphical highlighting of structure logic, accentuation of structure escapes and module invocations, logic error detection, and special handling of title pages and text segments. The SDDL generated document contains software design summary information including module invocation hierarchy, module cross reference, and cross reference tables of user selected words or phrases appearing in the document. The basic forms of the methodology are module and block structures and the module invocation statement. A design is stated in terms of modules that represent problem abstractions which are complete and independent enough to be treated as separate problem entities. Blocks are lower-level structures used to build the modules. Both kinds of structures may have an initiator part, a terminator part, an escape segment, or a substructure. The SDDL processor is written in PASCAL for batch execution on a DEC VAX series computer under VMS. SDDL was developed in 1981 and last updated in 1984.
Using a Novel Spatial Tool to Inform Invasive Species Early Detection and Rapid Response Efforts
NASA Astrophysics Data System (ADS)
Davidson, Alisha D.; Fusaro, Abigail J.; Kashian, Donna R.
2015-07-01
Management of invasive species has increasingly emphasized the importance of early detection and rapid response (EDRR) programs in limiting introductions, establishment, and impacts. These programs require an understanding of vector and species spatial dynamics to prioritize monitoring sites and efficiently allocate resources. Yet managers often lack the empirical data necessary to make these decisions. We developed an empirical mapping tool that can facilitate development of EDRR programs through identifying high-risk locations, particularly within the recreational boating vector. We demonstrated the utility of this tool in the Great Lakes watershed. We surveyed boaters to identify trips among water bodies and to quantify behaviors associated with high likelihood of species transfer (e.g., not removing organic materials from boat trailers) during that trip. We mapped water bodies with high-risk inbound and outbound boater movements using ArcGIS. We also tested for differences in high-risk behaviors based on demographic variables to understand risk differences among boater groups. Incorporation of boater behavior led to identification of additional high-risk water bodies compared to using the number of trips alone. Therefore, the number of trips itself may not fully reflect the likelihood of invasion. This tool can be broadly applied in other geographic contexts and with different taxa, and can be adjusted according to varying levels of information concerning the vector or species of interest. The methodology is straightforward and can be followed after a basic introduction to ArcGIS software. The visual nature of the mapping tool will facilitate site prioritization by managers and stakeholders from diverse backgrounds.
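The risk-weighting step can be sketched as follows; the trips, lake names, and per-trip transfer probabilities are invented assumptions, not the paper's survey data:

    # Weight each reported trip by the likelihood of a high-risk behavior, so a
    # destination's score reflects risky traffic rather than raw trip counts.
    from collections import defaultdict

    # (origin lake, destination lake, boater cleaned trailer?)
    trips = [
        ("Lake A", "Lake B", False),
        ("Lake A", "Lake B", True),
        ("Lake C", "Lake B", False),
        ("Lake B", "Lake D", True),
    ]

    P_TRANSFER_IF_UNCLEANED = 0.30   # assumed per-trip transfer likelihood
    P_TRANSFER_IF_CLEANED = 0.02

    risk_in = defaultdict(float)
    for origin, dest, cleaned in trips:
        risk_in[dest] += P_TRANSFER_IF_CLEANED if cleaned else P_TRANSFER_IF_UNCLEANED

    # Highest-scoring lakes are candidates for EDRR monitoring.
    for lake, score in sorted(risk_in.items(), key=lambda kv: -kv[1]):
        print(f"{lake}: inbound risk score {score:.2f}")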
Profile of NASA software engineering: Lessons learned from building the baseline
NASA Technical Reports Server (NTRS)
Hall, Dana; Mcgarry, Frank
1993-01-01
It is critically important in any improvement activity to first understand the organization's current status, strengths, and weaknesses and, only after that understanding is achieved, examine and implement promising improvements. This fundamental rule is certainly true for an organization seeking to further its software viability and effectiveness. This paper addresses the role of the organizational process baseline in a software improvement effort and the lessons we learned assembling such an understanding for NASA overall and for the NASA Goddard Space Flight Center in particular. We discuss important, core data that must be captured and contrast that with our experience in actually finding such information. Our baselining efforts have evolved into a set of data gathering, analysis, and crosschecking techniques and information presentation formats that may prove useful to others seeking to establish similar baselines for their organization.
Dalmaijer, Edwin S; Mathôt, Sebastiaan; Van der Stigchel, Stefan
2014-12-01
The PyGaze toolbox is an open-source software package for Python, a high-level programming language. It is designed for creating eyetracking experiments in Python syntax with the least possible effort, and it offers programming ease and script readability without constraining functionality and flexibility. PyGaze can be used for visual and auditory stimulus presentation; for response collection via keyboard, mouse, joystick, and other external hardware; and for the online detection of eye movements using a custom algorithm. A wide range of eyetrackers of different brands (EyeLink, SMI, and Tobii systems) are supported. The novelty of PyGaze lies in providing an easy-to-use layer on top of the many different software libraries that are required for implementing eyetracking experiments. Essentially, PyGaze is a software bridge for eyetracking research.
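A minimal script in the style of the toolbox's documented Display/Screen/EyeTracker pattern is sketched below; treat the details as assumptions, since the actual behavior depends on the tracker brand and the experiment's constants configuration:

    # Calibrate, show a fixation cross, record two seconds of gaze data, and
    # read back the most recent gaze sample.
    from pygaze.display import Display
    from pygaze.screen import Screen
    from pygaze.eyetracker import EyeTracker
    from pygaze import libtime

    disp = Display()                 # opens the experiment window
    scr = Screen()
    tracker = EyeTracker(disp)       # backend chosen via the experiment constants

    tracker.calibrate()
    scr.draw_fixation(fixtype='cross')
    disp.fill(scr)
    disp.show()

    tracker.start_recording()
    libtime.pause(2000)              # record for 2000 ms
    x, y = tracker.sample()          # most recent gaze position in pixels
    tracker.stop_recording()

    disp.close()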
Concept Development for Software Health Management
NASA Technical Reports Server (NTRS)
Riecks, Jung; Storm, Walter; Hollingsworth, Mark
2011-01-01
This report documents the work performed by Lockheed Martin Aeronautics (LM Aero) under NASA contract NNL06AA08B, delivery order NNL07AB06T. The Concept Development for Software Health Management (CDSHM) program was a NASA funded effort sponsored by the Integrated Vehicle Health Management Project, one of the four pillars of the NASA Aviation Safety Program. The CDSHM program focused on defining a structured approach to software health management (SHM) through the development of a comprehensive failure taxonomy that is used to characterize the fundamental failure modes of safety-critical software.
Managing Complexity in Next Generation Robotic Spacecraft: From a Software Perspective
NASA Technical Reports Server (NTRS)
Reinholtz, Kirk
2008-01-01
This presentation highlights the challenges in the design of software to support robotic spacecraft. Robotic spacecraft offer a higher degree of autonomy; however, more capabilities are now required, primarily in the software, while providing the same or a higher degree of reliability. The complexity of designing such an autonomous system is great, particularly when attempting to address the need for increased capabilities and high reliability without additional time or money. The efforts to develop programming models for the new hardware and the integration of the software architecture are highlighted.
NASA Technical Reports Server (NTRS)
Barnes, Jeffrey M.
2011-01-01
All software systems of significant size and longevity eventually undergo changes to their basic architectural structure. Such changes may be prompted by evolving requirements, changing technology, or other reasons. Whatever the cause, software architecture evolution is commonplace in real world software projects. Recently, software architecture researchers have begun to study this phenomenon in depth. However, this work has suffered from problems of validation; research in this area has tended to make heavy use of toy examples and hypothetical scenarios and has not been well supported by real world examples. To help address this problem, I describe an ongoing effort at the Jet Propulsion Laboratory to re-architect the Advanced Multimission Operations System (AMMOS), which is used to operate NASA's deep-space and astrophysics missions. Based on examination of project documents and interviews with project personnel, I describe the goals and approach of this evolution effort and then present models that capture some of the key architectural changes. Finally, I demonstrate how approaches and formal methods from my previous research in architecture evolution may be applied to this evolution, while using languages and tools already in place at the Jet Propulsion Laboratory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, S.A.
In a computing landscape which has a plethora of different hardware architectures and supporting software systems ranging from compilers to operating systems, there is an obvious and strong need for a philosophy of software development that lends itself to the design and construction of portable code systems. The current efforts to standardize software bear witness to this need. SABrE is an effort to implement a software development environment which is itself portable and promotes the design and construction of portable applications. SABrE does not include such important tools as editors and compilers. Well-built tools of that kind are readily available across virtually all computer platforms. The areas that SABrE addresses are at a higher level, involving issues such as data portability, portable inter-process communication, and graphics. These blocks of functionality have particular significance to the kind of code development done at LLNL. That is partly why the general computing community has not supplied us with these tools already. This is another key feature of the software development environments which we must recognize. The general computing community cannot and should not be expected to produce all of the tools which we require.
NASA Astrophysics Data System (ADS)
Le Bras, R. J.; Arora, N. S.; Kushida, N.; Kebede, F.; Feitio, P.; Tomuta, E.
2017-12-01
The International Monitoring System of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has reached out to the broader scientific community through a series of conferences, the latest of which took place in June 2017 in Vienna, Austria. Stemming from this outreach effort, and building on research and development begun in 2009, the NET-VISA software, which follows a Bayesian modelling approach, has been developed to improve the key step of automatic association of joint seismic, hydro-acoustic, and infrasound detections. When compared with the current operational system, it has consistently been shown in off-line tests to improve the overlap with the analyst-reviewed Reviewed Event Bulletin (REB) by ten percent, for an average of 85% overlap, while the inconsistency rate is essentially the same at about 50%. Testing by analysts in realistic conditions on a few days of data has also demonstrated the software's ability to find additional events which qualify for publication in the REB. Starting in August 2017, the automatic events produced by the software will be reviewed by analysts at the CTBTO, and we report on the initial evaluation of this introduction into operations.
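The heart of such Bayesian association can be sketched as a likelihood-ratio test between an event hypothesis and a noise hypothesis for a set of detections; the Gaussian and uniform densities below are illustrative assumptions, not NET-VISA's actual generative model:

    # Score a candidate event: detections whose travel-time residuals cluster
    # near zero favor forming an event over treating them as noise.
    import math

    def log_lik_event(residuals, sigma=1.5):
        """Log-likelihood of travel-time residuals (s) under an event hypothesis."""
        return sum(-0.5 * (r / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))
                   for r in residuals)

    def log_lik_noise(n, window=3600.0):
        """Same detections modeled as uniform noise over an analysis window."""
        return n * math.log(1.0 / window)

    residuals = [0.4, -1.1, 0.8, 2.0]       # predicted-minus-observed arrival times
    score = log_lik_event(residuals) - log_lik_noise(len(residuals))
    print("log-likelihood ratio:", score)   # positive favors forming the event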
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dinda, Peter August
2015-03-17
This report describes the activities, findings, and products of the Northwestern University component of the "Enabling Exascale Hardware and Software Design through Scalable System Virtualization" project. The purpose of this project has been to extend the state of the art of systems software for high-end computing (HEC) platforms, and to use systems software to better enable the evaluation of potential future HEC platforms, for example exascale platforms. Such platforms, and their systems software, have the goal of providing scientific computation at new scales, thus enabling new research in the physical sciences and engineering. Over time, the innovations in systems software for such platforms also become applicable to more widely used computing clusters, data centers, and clouds. This was a five-institution project, centered on the Palacios virtual machine monitor (VMM) systems software, a project begun at Northwestern, and originally developed in a previous collaboration between Northwestern University and the University of New Mexico. In this project, Northwestern (including via our subcontract to the University of Pittsburgh) contributed to the continued development of Palacios, along with other team members. We took the leadership role in (1) continued extension of support for emerging Intel and AMD hardware, (2) integration and performance enhancement of overlay networking, (3) connectivity with architectural simulation, (4) binary translation, and (5) support for modern Non-Uniform Memory Access (NUMA) hosts and guests. We also took a supporting role in support for specialized hardware for I/O virtualization, profiling, configurability, and integration with configuration tools. The efforts we led (1-5) were largely successful and executed as expected, with code and papers resulting from them. The project demonstrated the feasibility of a virtualization layer for HEC computing, similar to such layers for cloud or datacenter computing. For effort (3), although a prototype connecting Palacios with the GEM5 architectural simulator was demonstrated, our conclusion was that such a platform was less useful for design space exploration than anticipated due to the inherent complexity of the connection between the instruction set architecture level and the microarchitectural level. For effort (4), we found that a code injection approach proved to be more fruitful. The results of our efforts are publicly available in the open source Palacios codebase and published papers, all of which are available from the project web site, v3vee.org. Palacios is currently one of the two codebases (the other being Sandia's Kitten lightweight kernel) that underlies the node operating system for the DOE Hobbes Project, one of two projects tasked with building a systems software prototype for the national exascale computing effort.
Modeling percent tree canopy cover: a pilot study
John W. Coulston; Gretchen G. Moisen; Barry T. Wilson; Mark V. Finco; Warren B. Cohen; C. Kenneth Brewer
2012-01-01
Tree canopy cover is a fundamental component of the landscape, and the amount of cover influences fire behavior, air pollution mitigation, and carbon storage. As such, efforts to empirically model percent tree canopy cover across the United States are a critical area of research. The 2001 national-scale canopy cover modeling and mapping effort was completed in 2006,...
Computational Infrastructure for Geodynamics (CIG)
NASA Astrophysics Data System (ADS)
Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.
2004-12-01
Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts and although this approach has proven successful, its strength for solving problems of interest is now starting to show its limitations as we try to share codes and algorithms or when we want to recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to leverage and develop long-term strategic partnerships with open source development efforts within the larger thrusts of scientific computing and geoinformatics. These strategic partnerships are essential as the frontier has moved into multi-scale and multi-physics problems in which many investigators now want to use simulation software for data interpretation, data assimilation, and hypothesis testing.
ERIC Educational Resources Information Center
Haans, Antal
2018-01-01
Contrast analysis is a relatively simple but effective statistical method for testing theoretical predictions about differences between group means against the empirical data. Despite its advantages, contrast analysis is hardly used to date, perhaps because it is not implemented in a convenient manner in many statistical software packages. This…
Usability evaluation of mobile applications using ISO 9241 and ISO 25062 standards.
Moumane, Karima; Idri, Ali; Abran, Alain
2016-01-01
This paper presents an empirical study based on a set of measures to evaluate the usability of mobile applications running on different mobile operating systems, including Android, iOS and Symbian. The aim is to empirically evaluate a framework that we have developed on the use of the Software Quality Standard ISO 9126 in mobile environments, especially its usability characteristic. To do that, 32 users participated in the experiment, and we used the ISO 25062 and ISO 9241 standards for objective measures, working with two widely used mobile applications: Google Apps and Google Maps. The QUIS 7.0 questionnaire was used to collect measures assessing the users' level of satisfaction when using these two mobile applications. By analyzing the results, we highlight a set of mobile usability issues, related to the hardware as well as to the software, that need to be taken into account by designers and developers in order to improve the usability of mobile applications.
Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.
1997-09-30
set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has
Measuring the Impact of Agile Coaching on Students' Performance
ERIC Educational Resources Information Center
Rodríguez, Guillermo; Soria, Álvaro; Campo, Marcelo
2016-01-01
Nowadays, considerable attention is paid to agile methods as a means to improve management of software development processes. The widespread use of such methods in professional contexts has encouraged their integration into software engineering training and undergraduate courses. Although several research efforts have focused on teaching Scrum…
Early-Stage Software Design for Usability
ERIC Educational Resources Information Center
Golden, Elspeth
2010-01-01
In spite of the goodwill and best efforts of software engineers and usability professionals, systems continue to be built and released with glaring usability flaws that are costly and difficult to fix after the system has been built. Although user interface (UI) designers, be they usability or design experts, communicate usability requirements to…
Pilot-in-the-Loop CFD Method Development
2014-06-16
CFD analysis. Coupled simulations will be run at PSU on the COCOA-4 cluster, a high performance computing cluster. The CRUNCH CFD software has...been installed on the COCOA-4 servers and initial software tests are being conducted. Initial efforts will use the Generic Frigate Shape SFS-2 to
Advanced technologies for Mission Control Centers
NASA Technical Reports Server (NTRS)
Dalton, John T.; Hughes, Peter M.
1991-01-01
Advanced technologies for Mission Control Centers are presented in the form of viewgraphs. The following subject areas are covered: technology needs; current technology efforts at GSFC (human-machine interface development, object-oriented software development, expert systems, knowledge-based software engineering environments, and high-performance VLSI telemetry systems); and test beds.
The Stabilization, Exploration, and Expression of Computer Game History
ERIC Educational Resources Information Center
Kaltman, Eric
2017-01-01
Computer games are now a significant cultural phenomenon, and a significant artistic output of humanity. However, little effort and attention have been paid to how the medium of games and interactive software developed, and even less to the historical storage of software development documentation. This thesis borrows methodologies and practices…
Learning Teamwork Skills in University Programming Courses
ERIC Educational Resources Information Center
Sancho-Thomas, Pilar; Fuentes-Fernandez, Ruben; Fernandez-Manjon, Baltasar
2009-01-01
University courses about computer programming usually seek to provide students not only with technical knowledge, but also with the skills required to work in real-life software projects. Nowadays, the development of software applications requires the coordinated efforts of the members of one or more teams. Therefore, it is important for software…
Open source approaches to health information systems in Kenya.
Drury, Peter; Dahlman, Bruce
2005-01-01
This paper focuses on the experience to date of an installation of a Free Open Source Software (FOSS) product, Care2X, at a church hospital in Kenya. The FOSS movement has been maturing rapidly. In developed countries, its benefits relative to proprietary software have been extensively discussed, and ways of quantifying the total costs of development have been devised. Nevertheless, empirical data on the use and impact of FOSS, particularly in the developing world, is still quite limited, although the possibilities of FOSS are becoming increasingly attractive.
2009-09-01
ASD(NII)/CIO: Assistant Secretary of Defense for Networks and Information Integration/Chief Information Officer; CMMI: Capability Maturity Model Integration. ...a Web-based portal to share knowledge about software process-related methodologies, such as the SEI's Capability Maturity Model Integration (CMMI), SEI's IDEAL model, and Lean Six Sigma. For example, the portal features content areas such as software acquisition management, the SEI CMMI
Maintenance Metrics for Jovial (J73) Software
1988-12-01
pacing technology in advanced fighters, just as it has in most other weapon systems and information systems" (Canan, 1986:49). Another reason for...the magnitude of the software inside an aircraft may represent only a fraction of that aircraft's total software requirement." (Canan, 1986:49) One more..."art than a science" marks program development as a largely labor-intensive, human endeavor (Canan, 1986:50). Individual effort and creativity therefore
Penn State University ground software support for X-ray missions.
NASA Astrophysics Data System (ADS)
Townsley, L. K.; Nousek, J. A.; Corbet, R. H. D.
1995-03-01
The X-ray group at Penn State is charged with two software development efforts in support of X-ray satellite missions. As part of the ACIS instrument team for AXAF, the authors are developing part of the ground software to support the instrument's calibration. They are also designing a translation program for Ginga data, to change it from the non-standard FRF format, which closely parallels the original telemetry format, to FITS.
CrossTalk. The Journal of Defense Software Engineering. Volume 23, Number 6, Nov/Dec 2010
2010-11-01
Model of architectural design. It guides developers to apply effort to their software architecture commensurate with the risks faced by...Driven Model is the promotion of risk to prominence. It is possible to apply the Risk-Driven Model to essentially any software development process...succeed without any planned architecture work, while many high-risk projects would fail without it. The Risk-Driven Model walks a middle path
Technology Transfer Challenges for High-Assurance Software Engineering Tools
NASA Technical Reports Server (NTRS)
Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.
2003-01-01
In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.
Reviewing the effort-reward imbalance model: drawing up the balance of 45 empirical studies.
van Vegchel, Natasja; de Jonge, Jan; Bosma, Hans; Schaufeli, Wilmar
2005-03-01
The present paper provides a review of 45 studies on the Effort-Reward Imbalance (ERI) Model published from 1986 to 2003 (inclusive). In 1986, the ERI Model was introduced by Siegrist et al. (Biological and Psychological Factors in Cardiovascular Disease, Springer, Berlin, 1986, pp. 104-126; Social Science & Medicine 22 (1986) 247). The central tenet of the ERI Model is that an imbalance between (high) efforts and (low) rewards leads to (sustained) strain reactions. Besides efforts and rewards, overcommitment (i.e., a personality characteristic) is a crucial aspect of the model. Essentially, the ERI Model contains three main assumptions, which could be labeled as (1) the extrinsic ERI hypothesis: high efforts in combination with low rewards increase the risk of poor health, (2) the intrinsic overcommitment hypothesis: a high level of overcommitment may increase the risk of poor health, and (3) the interaction hypothesis: employees reporting an extrinsic ERI and a high level of overcommitment have an even higher risk of poor health. The review showed that the extrinsic ERI hypothesis has gained considerable empirical support. Results for overcommitment remain inconsistent and the moderating effect of overcommitment on the relation between ERI and employee health has been scarcely examined. Based on these review results suggestions for future research are proposed.
NASA Technical Reports Server (NTRS)
Barghouty, A. F.; Falconer, D. A.; Adams, J. H., Jr.
2010-01-01
This presentation describes a new forecasting tool developed for, and currently being tested by, NASA's Space Radiation Analysis Group (SRAG) at JSC, which is responsible for monitoring and forecasting the radiation exposure levels of astronauts. The new software tool is designed for the empirical forecasting of M- and X-class flares, coronal mass ejections, and solar energetic particle events. Its algorithm is based on an empirical relationship between the rates of the various types of events and a proxy of the active region's free magnetic energy, determined from a data set of approx. 40,000 active-region magnetograms from approx. 1,300 active regions observed by SOHO/MDI that have known histories of flare, coronal mass ejection, and solar energetic particle event production. The new tool automatically extracts each strong-field magnetic area from an MDI full-disk magnetogram, identifies each as an NOAA active region, and measures a proxy of the active region's free magnetic energy from the extracted magnetogram. For each active region, the empirical relationship is then used to convert the free-magnetic-energy proxy into an expected event rate. The expected event rate in turn can readily be converted into the probability that the active region will produce such an event in a given forward time window. Descriptions of the datasets, algorithm, and software, in addition to sample applications and a validation test, are presented. Further development and transition of the new tool in anticipation of SDO/HMI is briefly discussed.
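The last step the abstract describes, converting an expected event rate into a probability over a forward time window, is commonly done under a Poisson assumption: P = 1 - exp(-R*T). A minimal sketch follows; the rate value is hypothetical, not SRAG's calibration.

```python
import math

def event_probability(rate_per_day: float, window_days: float) -> float:
    """P(at least one event in the window) under a Poisson assumption."""
    return 1.0 - math.exp(-rate_per_day * window_days)

# Hypothetical example: a free-energy proxy mapping to an expected
# M-class flare rate of 0.5 events/day gives a 24-hour probability of
print(round(event_probability(0.5, 1.0), 2))  # 0.39
```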
Knowledge-based approach for generating target system specifications from a domain model
NASA Technical Reports Server (NTRS)
Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan
1992-01-01
Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.
Toward using alpha and theta brain waves to quantify programmer expertise.
Crk, Igor; Kluthe, Timothy
2014-01-01
Empirical studies of programming language learnability and usability have thus far depended on indirect measures of human cognitive performance, attempting to capture what is at its essence a purely cognitive exercise through various indicators of comprehension, such as the correctness of coding tasks or the time spent working out the meaning of code and producing acceptable solutions. Understanding program comprehension is essential to understanding the inherent complexity of programming languages, and ultimately, having a measure of mental effort based on direct observation of the brain at work will illuminate the nature of the work of programming. We provide evidence of direct observation of the cognitive effort associated with programming tasks, through a carefully constructed empirical study using a cross-section of undergraduate computer science students and an inexpensive, off-the-shelf brain-computer interface device. This study presents a link between expertise and programming language comprehension, draws conclusions about the observed indicators of cognitive effort using recent cognitive theories, and proposes directions for future work that is now possible.
Collected Software Engineering Papers, Volume 10
NASA Technical Reports Server (NTRS)
1992-01-01
This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from Oct. 1991 - Nov. 1992. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document. For the convenience of this presentation, the 11 papers contained here are grouped into 5 major sections: (1) the Software Engineering Laboratory; (2) software tools studies; (3) software models studies; (4) software measurement studies; and (5) Ada technology studies.
A Database for Propagation Models and Conversion to C++ Programming Language
NASA Technical Reports Server (NTRS)
Kantak, Anil V.; Angkasa, Krisjani; Rucker, James
1996-01-01
The telecommunications system design engineer generally needs to quantify the effects of the propagation medium (i.e., to define the propagation channel) in order to design an optimal communications system. To obtain the definition of the channel, the systems engineer generally has a few choices. A search of the relevant publications, such as the IEEE Transactions, CCIR documents, the NASA propagation handbook, etc., may be conducted to find the desired channel values. This method may require excessive time and effort on the systems engineer's part, and the search may not even yield the needed results. To help researchers and systems engineers, the conference participants of NASA Propagation Experimenters (NAPEX) XV (London, Ontario, Canada, June 28 and 29, 1991) recommended that software be produced containing propagation models and the necessary prediction methods for most propagation phenomena. Moreover, the software should be flexible enough for the user to make slight changes to the models without expending substantial programming effort. In the past few years, software was produced to fit these requirements as best as could be done. The software was distributed to all NAPEX participants for evaluation and use; participant reactions and suggestions were gathered and used to improve subsequent releases. The existing database program is implemented in the Microsoft Excel application software and works well within the guidelines of that environment; however, there have recently been some questions about the robustness and survivability of the Excel software in the ever-changing (hopefully improving) world of software packages.
Cultural and Technological Issues and Solutions for Geodynamics Software Citation
NASA Astrophysics Data System (ADS)
Heien, E. M.; Hwang, L.; Fish, A. E.; Smith, M.; Dumit, J.; Kellogg, L. H.
2014-12-01
Computational software and custom-written codes play a key role in scientific research and teaching, providing tools to perform data analysis and forward modeling through numerical computation. However, development of these codes is often hampered by the fact that there is no well-defined way for the authors to receive credit or professional recognition for their work through the standard methods of scientific publication and subsequent citation of the work. This in turn may discourage researchers from publishing their codes or making them easier for other scientists to use. We investigate the issues involved in citing software in a scientific context, and introduce features that should be components of a citation infrastructure, particularly oriented towards the codes and scientific culture in the area of geodynamics research. The codes used in geodynamics are primarily specialized numerical modeling codes for continuum mechanics problems; they may be developed by individual researchers, teams of researchers, geophysicists in collaboration with computational scientists and applied mathematicians, or by coordinated community efforts such as the Computational Infrastructure for Geodynamics. Some but not all geodynamics codes are open-source. These characteristics are common to many areas of geophysical software development and use. We provide background on the problem of software citation and discuss some of the barriers preventing adoption of such citations, including social/cultural barriers, insufficient technological support infrastructure, and an overall lack of agreement about what a software citation should consist of. We suggest solutions in an initial effort to create a system to support citation of software and promotion of scientific software development.
Promoting Entrepreneurship among Informatics Engineering Students: Insights from a Case Study
ERIC Educational Resources Information Center
Fernandes, João M.; Afonso, Paulo; Fonte, Victor; Alves, Victor; Ribeiro, António Nestor
2017-01-01
Universities seek to promote entrepreneurship through effective education approaches, which need to be in permanent evolution. Nevertheless, the literature in entrepreneurship education lacks empirical evidence. This article discusses relevant issues related to promoting entrepreneurship in the software field, based on the experience of a…
ERIC Educational Resources Information Center
Dalgarno, Barney; Kennedy, Gregor; Bennett, Sue
2014-01-01
Discovery-based learning designs incorporating active exploration are common within instructional software. However, researchers have highlighted empirical evidence showing that "pure" discovery learning is of limited value and strategies which reduce complexity and provide guidance to learners are important if potential learning…
A Comparison and Evaluation of Real-Time Software Systems Modeling Languages
NASA Technical Reports Server (NTRS)
Evensen, Kenneth D.; Weiss, Kathryn Anne
2010-01-01
A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architectural Analysis and Design Language (AADL), the Unified Modeling Language (UML), Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.
Modeling and managing risk early in software development
NASA Technical Reports Server (NTRS)
Briand, Lionel C.; Thomas, William M.; Hetmanski, Christopher J.
1993-01-01
In order to improve the quality of the software development process, we need to be able to build empirical multivariate models based on data collectable early in the software process. These models need to be both useful for prediction and easy to interpret, so that remedial actions may be taken in order to control and optimize the development process. We present an automated modeling technique which can be used as an alternative to regression techniques. We show how it can be used to facilitate the identification and aid the interpretation of the significant trends which characterize 'high risk' components in several Ada systems. Finally, we evaluate the effectiveness of our technique based on a comparison with logistic regression based models.
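As a present-day illustration of the comparison the abstract describes, the sketch below contrasts a shallow decision tree with logistic regression on invented module metrics using scikit-learn; it is a stand-in for, not a reproduction of, the paper's technique and data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))   # invented metrics: size, fan-out, changes
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=200) > 1.0).astype(int)

# Shallow tree: keeps the learned "high risk" trends readable
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
logit = LogisticRegression()

print("tree  acc:", cross_val_score(tree, X, y, cv=5).mean())
print("logit acc:", cross_val_score(logit, X, y, cv=5).mean())
```

Capping the tree depth is what makes the trends interpretable, which is the advantage over regression that the abstract emphasizes.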
Political Economy, the Internet and FL/OSS Development
NASA Astrophysics Data System (ADS)
Mansell, Robin; Berdou, Evangelia
Despite the growing amount of research on Free/Libre/Open Source Software (FL/OSS) development, there is little insight into how structural factors associated with institutions influence the patterns of software developer activity in this area. This article examines some of the dynamics of the development of this type of software and the extent to which these dynamics are associated with features of the gift economy as is frequently suggested in the literature. Drawing on an empirical analysis of contributors to the GNOME FL/OSS project, we suggest that greater attention should be given to the emergence of a mixed economy in which features of the exchange economy come to the fore with implications for the power relationships among those contributing to FL/OSS.
Toward Intelligent Software Defect Detection
NASA Technical Reports Server (NTRS)
Benson, Markland J.
2011-01-01
Source code level software defect detection has gone from state of the art to a software engineering best practice. Automated code analysis tools streamline many of the aspects of formal code inspections but have the drawback of being difficult to construct and either prone to false positives or severely limited in the set of defects that can be detected. Machine learning technology provides the promise of learning software defects by example, easing construction of detectors and broadening the range of defects that can be found. Pinpointing software defects with the same level of granularity as prominent source code analysis tools distinguishes this research from past efforts, which focused on analyzing software engineering metrics data with granularity limited to that of a particular function rather than a line of code.
From Exotic to Mainstream: A 10-year Odyssey from Internet Speed to Boundary Spanning with Scrum
NASA Astrophysics Data System (ADS)
Baskerville, Richard; Pries-Heje, Jan; Madsen, Sabine
Based on four empirical studies conducted over a 10-year period from 1999 to 2008, we investigate how local software processes interact with global changes in the software development context. In 1999 companies were developing software at high speed in a desperate rush to be first-to-market. In 2001 a new high-speed/quick-results development process had become established practice. In 2003 changes in the market created the need for a more balanced view on speed and quality, and in 2008 companies were successfully combining agile and plan-driven approaches to achieve the benefits of both. The studies reveal a two-stage pattern in which dramatic changes in the market cause disruption of established practices, experimentation, and process adaptations, followed by consolidation of lessons learnt into a new (and once again mature) software development process. Limitations, implications, and areas for future research are discussed.
The role of the ADS in software discovery and citation
NASA Astrophysics Data System (ADS)
Accomazzi, Alberto
2018-01-01
As the primary index of scholarly content in astronomy and physics, the NASA Astrophysics Data System (ADS) is collaborating with the AAS journals and the Zenodo repository in an effort to promote the preservation of scientific software used in astronomy research and its citation in scholarly publications. In this talk I will discuss how ADS is updating its service infrastructure to allow for the publication, indexing, and citation of software records in scientific articles.
ASSIP Study of Real-Time Safety-Critical Embedded Software-Intensive System Engineering Practices
2008-02-01
and assessment; 2. product engineering processes; 3. tooling processes. Process standards: IEC/ISO 12207 Software...and technical effort to align with 12207; IEC/ISO 15026 System & Software Integrity Levels; generic safety: SAE ARP 4754 Certification Considerations...Process frameworks in revision: ISO 9001, ISO 9004; ISO 15288/ISO 12207 harmonization; RTCA DO-178B, MOD Standard UK 00-56/3, ...; methods & tools
NASA Technical Reports Server (NTRS)
1981-01-01
Guidelines and recommendations are presented for the collection of software development data. Motivation and planning for, and implementation and management of, a data collection effort are discussed. Topics covered include types, sources, and availability of data; methods and costs of data collection; types of analyses supported; and warnings and suggestions based on software engineering laboratory (SEL) experiences. This document is intended as a practical guide for software managers and engineers, abstracted and generalized from 5 years of SEL data collection.
ERIC Educational Resources Information Center
English, Leona M.
2012-01-01
This article uses the lens of critical discourse analysis to examine the religious education efforts of the Newfoundland School Society (NSS), the main provider of religious education in Newfoundland in the 19th century. Although its focus was initially this colony, the NSS quickly broadened its reach to the whole British empire, making it one of…
Studies in Software Cost Model Behavior: Do We Really Understand Cost Model Performance?
NASA Technical Reports Server (NTRS)
Lum, Karen; Hihn, Jairus; Menzies, Tim
2006-01-01
While there exists extensive literature on software cost estimation techniques, industry practice continues to rely upon standard regression-based algorithms. These software effort models are typically calibrated or tuned to local conditions using local data. This paper cautions that current approaches to model calibration often produce sub-optimal models, both because of the large variance problem inherent in cost data and because they include far more effort multipliers than the data supports. Building optimal models requires that a wider range of models be considered, while correctly calibrating these models requires rejection rules that prune variables and records and use multiple criteria for evaluating model performance. The main contribution of this paper is to document a standard method that integrates formal model identification, estimation, and validation. It also documents what we call the large variance problem, a leading cause of cost model brittleness or instability.
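As an illustration of the calibration the abstract refers to, a standard regression-based effort model of the form effort = a * size^b can be tuned to local data by least squares in log space. The sketch below uses invented project data and is not the paper's method or dataset.

```python
import numpy as np

# Invented local calibration data: size in KSLOC, effort in person-months
ksloc  = np.array([12.0, 33.0, 7.5, 90.0, 21.0])
effort = np.array([35.0, 120.0, 18.0, 410.0, 70.0])

# Fit log(effort) = b*log(size) + log(a) by ordinary least squares
b, log_a = np.polyfit(np.log(ksloc), np.log(effort), 1)
print(f"effort ~= {np.exp(log_a):.2f} * KSLOC^{b:.2f}")
```

On data sets this small, perturbing a single project can move a and b substantially between refits, which is one face of the large variance problem the paper documents.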
Experience Transitioning Models and Data at the NOAA Space Weather Prediction Center
NASA Astrophysics Data System (ADS)
Berger, Thomas
2016-07-01
The NOAA Space Weather Prediction Center has a long history of transitioning research data and models into operations and with the validation activities required. The first stage in this process involves demonstrating that the capability has sufficient value to customers to justify the cost needed to transition it and to run it continuously and reliably in operations. Once the overall value is demonstrated, a substantial effort is then required to develop the operational software from the research codes. The next stage is to implement and test the software and product generation on the operational computers. Finally, effort must be devoted to establishing long-term measures of performance, maintaining the software, and working with forecasters, customers, and researchers to improve over time the operational capabilities. This multi-stage process of identifying, transitioning, and improving operational space weather capabilities will be discussed using recent examples. Plans for future activities will also be described.
Software Carpentry: lessons learned
Wilson, Greg
2016-01-01
Since its start in 1998, Software Carpentry has evolved from a week-long training course at the US national laboratories into a worldwide volunteer effort to improve researchers' computing skills. This paper explains what we have learned along the way, the challenges we now face, and our plans for the future. PMID:24715981
The impact of organizational structure on flight software cost risk
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Lum, Karen; Monson, Erik
2004-01-01
This paper summarizes the final results of a follow-up study that updates the estimated software effort growth for projects that were still under development, and that includes an evaluation of organizational roles versus observed cost risk for the missions in the original study, expanding the data set to thirteen missions.
2005-12-01
courses in which students learn programming techniques using gaming software models. As part of the effort, teaching modules and entire courses will...assets. The United States Navy recently implemented new policies to restrict service personnel's use of commercial, web-based email applications in an
IDEA: Identifying Design Principles in Educational Applets
ERIC Educational Resources Information Center
Underwood, Jody S.; Hoadley, Christopher; Lee, Hollylynne Stohl; Hollebrands, Karen; DiGiano, Chris; Renninger, K. Ann
2005-01-01
The Internet is increasingly being used as a medium for educational software in the form of miniature applications (e.g., applets) to explore concepts in a domain. One such effort in mathematics education, the Educational Software Components of Tomorrow (ESCOT) project, created 42 miniature applications each consisting of a context, a set of…
ERIC Educational Resources Information Center
Oliver, Astrid; Dahlquist, Janet; Tankersley, Jan; Emrich, Beth
2010-01-01
This article discusses the processes that occurred when the Library, Controller's Office, and Information Technology Department agreed to create an interface between the Library's Innovative Interfaces patron database and campus administrative software, Banner, using file transfer protocol, in an effort to streamline the Library's accounts…
Beyond the Evident Content Goals Part I. Tapping the Depth and Flow of the Educational Undercurrent.
ERIC Educational Resources Information Center
Dugdale, Sharon; Kibbey, David
1990-01-01
In the first of a series of three articles, successful instructional materials from a 15-year software development effort are analyzed and characterized, with special attention given to educational experiences intended to shape students' perceptions of the fundamental nature, interconnectedness, and usefulness of mathematics. The software programs…
Enhancement of Spatial Thinking with Virtual Spaces 1.0
ERIC Educational Resources Information Center
Hauptman, Hanoch
2010-01-01
Developing a software environment to enhance 3D geometric proficiency demands the consideration of theoretical views of the learning process. Simultaneously, this effort requires taking into account the range of tools that technology offers, as well as their limitations. In this paper, we report on the design of Virtual Spaces 1.0 software, a…
The Use of the Software MATLAB To Improve Chemical Engineering Education.
ERIC Educational Resources Information Center
Damatto, T.; Maegava, L. M.; Filho, R. Maciel
In all the Brazilian universities involved with the project "Prodenge-Reenge," the main objective is to improve teaching and learning procedures for the engineering disciplines. The Chemical Engineering College of Campinas State University focused its effort on the use of engineering software. The work developed by this project has…
Man-computer Interactive Data Access System (McIDAS). [Design, development, fabrication, and testing]
NASA Technical Reports Server (NTRS)
1973-01-01
A technical description is given of the effort to design, develop, fabricate, and test the two dimensional data processing system, McIDAS. The system has three basic sections: an access and data archive section, a control section, and a display section. Areas reported include hardware, system software, and applications software.
Combining analysis with optimization at Langley Research Center. An evolutionary process
NASA Technical Reports Server (NTRS)
Rogers, J. L., Jr.
1982-01-01
The evolutionary process of combining analysis and optimization codes was traced with a view toward providing insight into the long-term goal of developing the methodology for an integrated, multidisciplinary software system for the concurrent analysis and optimization of aerospace structures. It was traced along the lines of strength sizing, concurrent strength and flutter sizing, and general optimization to define a near-term goal for combining analysis and optimization codes. Development of a modular software system combining general-purpose, state-of-the-art, production-level analysis computer programs for structures, aerodynamics, and aeroelasticity with a state-of-the-art optimization program is required. Incorporation of a modular and flexible structural optimization software system into a state-of-the-art finite element analysis computer program will facilitate this effort. The effort resulted in a software system that is controlled with a special-purpose language, communicates with a data management system, and is easily modified to add new programs and capabilities. A 337 degree-of-freedom finite element model is used to verify the accuracy of this system.
Case study of open-source enterprise resource planning implementation in a small business
NASA Astrophysics Data System (ADS)
Olson, David L.; Staley, Jesse
2012-02-01
Enterprise resource planning (ERP) systems have been recognised as offering great benefit to some organisations, although they are expensive and problematic to implement. The cost and risk make well-developed proprietorial systems unaffordable to small businesses. Open-source software (OSS) has become a viable means of producing ERP system products. The question this paper addresses is the feasibility of OSS ERP systems for small businesses. A case is reported involving two efforts to implement freely distributed ERP software products in a small US make-to-order engineering firm. The case emphasises the potential of freely distributed ERP systems, as well as some of the hurdles involved in their implementation. The paper briefly reviews highlights of OSS ERP systems, with the primary focus on reporting the case experiences for efforts to implement ERPLite software and xTuple software. While both systems worked from a technical perspective, both failed due to economic factors. While these economic conditions led to imperfect results, the case demonstrates the feasibility of OSS ERP for small businesses. Both experiences are evaluated in terms of risk dimension.
Design and implementation of a Windows NT network to support CNC activities
NASA Technical Reports Server (NTRS)
Shearrow, C. A.
1996-01-01
The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer Automated Design and Computer Automated Manufacturing (CAD/CAM) capabilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980s with respect to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study of a CIM application capable of completing the changes necessary to bring the manufacturing practices into the 1990s. The documentation provided in this qualitative research effort includes a survey of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research includes a network design, the computers and software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.
MAVEN Data Analysis and Visualization Toolkits
NASA Astrophysics Data System (ADS)
Harter, B., Jr.; DeWolfe, A. W.; Brain, D.; Chaffin, M.
2017-12-01
The Mars Atmosphere and Volatile Evolution (MAVEN) mission has been collecting data at Mars since September 2014. The MAVEN Science Data Center has developed software toolkits for analyzing and visualizing the science data. Our Data Intercomparison and Visualization Development Effort (DIVIDE) toolkit is written in IDL and utilizes the widely used "tplot" IDL libraries. Recently, we have converted DIVIDE into Python in an effort to increase the accessibility of the MAVEN data. This conversion also necessitated the development of a Python version of the tplot libraries, which we have dubbed "PyTplot". PyTplot is generalized to work with missions beyond MAVEN, and our software is available on GitHub.
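A minimal usage sketch follows, assuming the open-source pytplot package's store-and-plot interface; the variable name and values are invented, not MAVEN data.

```python
import numpy as np
import pytplot

t = np.arange(0.0, 3600.0, 60.0)         # one hour of 1-minute samples
density = 10.0 + np.sin(t / 600.0)       # fake ion density values

pytplot.store_data('ion_density', data={'x': t, 'y': density})
pytplot.options('ion_density', 'ytitle', 'n_i')
pytplot.tplot('ion_density')             # render the time-series panel
```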
Kendler, K S
2012-04-01
Our tendency to see the world of psychiatric illness in dichotomous and opposing terms has three major sources: the philosophy of Descartes, the state of neuropathology in late nineteenth century Europe (when disorders were divided into those with and without demonstrable pathology and labeled, respectively, organic and functional), and the influential concept of computer functionalism wherein the computer is viewed as a model for the human mind-brain system (brain=hardware, mind=software). These mutually re-enforcing dichotomies, which have had a pernicious influence on our field, make a clear prediction about how 'difference-makers' (aka causal risk factors) for psychiatric disorders should be distributed in nature. In particular, are psychiatric disorders like our laptops, which when they dysfunction, can be cleanly divided into those with software versus hardware problems? I propose 11 categories of difference-makers for psychiatric illness from molecular genetics through culture and review their distribution in schizophrenia, major depression and alcohol dependence. In no case do these distributions resemble that predicted by the organic-functional/hardware-software dichotomy. Instead, the causes of psychiatric illness are dappled, distributed widely across multiple categories. We should abandon Cartesian and computer-functionalism-based dichotomies as scientifically inadequate and an impediment to our ability to integrate the diverse information about psychiatric illness our research has produced. Empirically based pluralism provides a rigorous but dappled view of the etiology of psychiatric illness. Critically, it is based not on how we wish the world to be but how the difference-makers for psychiatric illness are in fact distributed.
An empirical comparison of a dynamic software testability metric to static cyclomatic complexity
NASA Technical Reports Server (NTRS)
Voas, Jeffrey M.; Miller, Keith W.; Payne, Jeffrey E.
1993-01-01
This paper compares the dynamic testability prediction technique termed 'sensitivity analysis' to the static testability technique termed cyclomatic complexity. The application that we chose in this empirical study is a CASE generated version of a B-737 autoland system. For the B-737 system we analyzed, we isolated those functions that we predict are more prone to hide errors during system/reliability testing. We also analyzed the code with several other well-known static metrics. This paper compares and contrasts the results of sensitivity analysis to the results of the static metrics.
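For reference, cyclomatic complexity counts linearly independent paths through a routine and is commonly approximated as the number of decision points plus one. The rough counter below is a simplification for illustration, not the measurement tooling used in the study.

```python
import ast

# Node types that add a decision point (simplified McCabe counting)
DECISIONS = (ast.If, ast.For, ast.While, ast.IfExp,
             ast.ExceptHandler, ast.And, ast.Or)

def cyclomatic_complexity(source: str) -> int:
    return 1 + sum(isinstance(n, DECISIONS)
                   for n in ast.walk(ast.parse(source)))

print(cyclomatic_complexity(
    "def clamp(x, lo, hi):\n"
    "    if x < lo:\n        return lo\n"
    "    if x > hi:\n        return hi\n"
    "    return x\n"))  # -> 3 (two decisions plus one)
```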
Lessons Learned from Autonomous Sciencecraft Experiment
NASA Technical Reports Server (NTRS)
Chien, Steve A.; Sherwood, Rob; Tran, Daniel; Cichy, Benjamin; Rabideau, Gregg; Castano, Rebecca; Davies, Ashley; Mandl, Dan; Frye, Stuart; Trout, Bruce;
2005-01-01
An Autonomous Science Agent has been flying onboard the Earth Observing One spacecraft since 2003. This software enables the spacecraft to autonomously detect and respond to science events occurring on the Earth, such as volcanoes, flooding, and snow melt. The package includes AI-based software systems that perform science data analysis, deliberative planning, and robust run-time execution. This software is in routine use to fly the EO-1 mission. In this paper we briefly review the agent architecture and discuss lessons learned from this multi-year flight effort pertinent to the deployment of software agents to critical applications.
Development of a Unix/VME data acquisition system
NASA Astrophysics Data System (ADS)
Miller, M. C.; Ahern, S.; Clark, S. M.
1992-01-01
The current status of a Unix-based VME data acquisition development project is described. It is planned to use existing Fortran data collection software to drive the existing CAMAC electronics via a VME CAMAC branch driver card and associated Daresbury Unix driving software. The first usable Unix driver has been written and produces single-action CAMAC cycles from test software. The data acquisition code has been implemented in test mode under Unix with few problems and effort is now being directed toward finalizing calls to the CAMAC-driving software and ultimate evaluation of the complete system.
NASA Astrophysics Data System (ADS)
Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan
A relevant initiative from the software engineering community called Model Driven Engineering (MDE) is being developed in parallel with the Semantic Web (Mellor et al. 2003a). The MDE approach to software development suggests that one should first develop a model of the system under study, which is then transformed into the real thing (i.e., an executable software entity). The most important research initiative in this area is the Model Driven Architecture (MDA), which is being developed under the umbrella of the Object Management Group (OMG). This chapter describes the basic concepts of this software engineering effort.
Effective organizational solutions for implementation of DBMS software packages
NASA Technical Reports Server (NTRS)
Jones, D.
1984-01-01
The Space Telescope management information system development effort serves as a guideline for discussing effective organizational solutions used in implementing DBMS software. Focus is on the importance of strategic planning. The value of constructing an information system architecture to conform to the organization's managerial needs, the need for a senior decision maker, dealing with shifting user requirements, and the establishment of a reliable working relationship with the DBMS vendor are examined. Requirements for a schedule to demonstrate progress against a defined timeline and the importance of continued monitoring for production software control, production data control, and software enhancements are also discussed.
Making Information Available to Partially Sighted and Blind Clients.
ERIC Educational Resources Information Center
Long, C. A.
1993-01-01
Provides an empirical review of problems facing library users with visual impairments using computers, and reviews some of the technology that can help alleviate these problems. Highlights include software; GUI (Graphical User Interfaces); advising and training; library automation; and appendices that list further sources of relevant information.…
Assistive Technology and Literacy Partnerships
ERIC Educational Resources Information Center
Gillette, Yvonne
2006-01-01
Assistive technology (AT) has the potential to support the literacy skills of students with disabilities as they participate in the general education curriculum. Empirical evidence is presented to support the use of AT, at least for some students. A case study interwoven within the article illustrates team decision-making regarding software and…
Krueger, Robert F; Tackett, Jennifer L; MacDonald, Angus
2016-11-01
Traditionally, psychopathology has been conceptualized in terms of polythetic categories derived from committee deliberations and enshrined in authoritative psychiatric nosologies-most notably the Diagnostic and Statistical Manual of Mental Disorders (DSM; American Psychiatric Association [APA], 2013). As the limitations of this form of classification have become evident, empirical data have been increasingly relied upon to investigate the structure of psychopathology. These efforts have borne fruit in terms of an increasingly consistent set of psychopathological constructs closely connected with similar personality constructs. However, the work of validating these constructs using convergent sources of data is an ongoing enterprise. This special section collects several new efforts to use structural approaches to study the validity of this empirically based organizational scheme for psychopathology. Inasmuch as a structural approach reflects the natural organization of psychopathology, it has great potential to facilitate comprehensive organization of information on the correlates of psychopathology, providing evidence for the convergent and discriminant validity of an empirical approach to classification. Here, we highlight several themes that emerge from this burgeoning literature. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Study of fault-tolerant software technology
NASA Technical Reports Server (NTRS)
Slivinski, T.; Broglio, C.; Wild, C.; Goldberg, J.; Levitt, K.; Hitt, E.; Webb, J.
1984-01-01
Presented is an overview of the current state of the art of fault-tolerant software and an analysis of quantitative techniques and models developed to assess its impact. It examines research efforts as well as experience gained from commercial application of these techniques. The paper also addresses the computer architecture and design implications on hardware, operating systems and programming languages (including Ada) of using fault-tolerant software in real-time aerospace applications. It concludes that fault-tolerant software has progressed beyond the pure research state. The paper also finds that, although not perfectly matched, newer architectural and language capabilities provide many of the notations and functions needed to effectively and efficiently implement software fault-tolerance.
Software Management for the NOνA Experiment
NASA Astrophysics Data System (ADS)
Davies, G. S.; Davies, J. P.; C Group; Rebel, B.; Sachdev, K.; Zirnstein, J.
2015-12-01
The NOνA software (NOνASoft) is written in C++ and built on the Fermilab Computing Division's art framework, which uses the ROOT analysis software. NOνASoft makes use of more than 50 external software packages, is developed by more than 50 developers, and is used by more than 100 physicists from over 30 universities and laboratories on 3 continents. The software builds are handled by Fermilab's custom version of Software Release Tools (SRT), a UNIX-based software management system for large, collaborative projects that is used by several experiments at Fermilab. The system provides software version control with SVN configured in a client-server mode and is based on the code originally developed by the BaBar collaboration. In this paper, we present efforts towards distributing the NOνA software via the CernVM File System distributed file system. We also describe our recent work to use a CMake build system and Jenkins, the open source continuous integration system, for NOνASoft.
Price, Charles A.; Symonova, Olga; Mileyko, Yuriy; Hilley, Troy; Weitz, Joshua S.
2011-01-01
Interest in the structure and function of physical biological networks has spurred the development of a number of theoretical models that predict optimal network structures across a broad array of taxonomic groups, from mammals to plants. In many cases, direct tests of predicted network structure are impossible given the lack of suitable empirical methods to quantify physical network geometry with sufficient scope and resolution. There is a long history of empirical methods to quantify the network structure of plants, from roots, to xylem networks in shoots and within leaves. However, with few exceptions, current methods emphasize the analysis of portions of, rather than entire networks. Here, we introduce the Leaf Extraction and Analysis Framework Graphical User Interface (LEAF GUI), a user-assisted software tool that facilitates improved empirical understanding of leaf network structure. LEAF GUI takes images of leaves where veins have been enhanced relative to the background, and following a series of interactive thresholding and cleaning steps, returns a suite of statistics and information on the structure of leaf venation networks and areoles. Metrics include the dimensions, position, and connectivity of all network veins, and the dimensions, shape, and position of the areoles they surround. Available for free download, the LEAF GUI software promises to facilitate improved understanding of the adaptive and ecological significance of leaf vein network structure. PMID:21057114
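The thresholding-and-skeletonizing pipeline the abstract describes can be approximated in a few lines with scikit-image; the sketch below is a hedged stand-in (LEAF GUI itself is an interactive tool, and the filenames here are placeholders).

```python
from skimage import io, filters, morphology

img = io.imread('leaf.png', as_gray=True)            # vein-enhanced image
binary = img > filters.threshold_otsu(img)           # global threshold
binary = morphology.remove_small_objects(binary, min_size=64)  # de-speckle
skeleton = morphology.skeletonize(binary)            # 1-px vein network
io.imsave('veins_skeleton.png', (skeleton * 255).astype('uint8'))
```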
A study of software standards used in the avionics industry
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.
1994-01-01
Within the past decade, software has become an increasingly common element in computing systems. In particular, the role of software used in the aerospace industry, especially in life- or safety-critical applications, is rapidly expanding. This intensifies the need to use effective techniques for achieving and verifying the reliability of avionics software. Although certain software development processes and techniques are mandated by government regulatory agencies, no one methodology has been shown to consistently produce reliable software. The knowledge base for designing reliable software simply has not reached the maturity of its hardware counterpart. In an effort to increase our understanding of software, the Langley Research Center conducted a series of experiments over 15 years with the goal of understanding why and how software fails. As part of this program, the effectiveness of current industry standards for the development of avionics software is being investigated. This study involves the generation of a controlled environment to conduct scientific experiments on software processes.
NASA Technical Reports Server (NTRS)
1976-01-01
Only a few efforts are currently underway to develop an adequate technology base for the various themes. Particular attention must be given to software commonality and evolutionary capability, to increased system integrity and autonomy, and to improved communications among the program users, the program developers, and the programs themselves. There is a need for quantum improvement in software development methods and for increased awareness of software by all concerned. Major thrusts identified include: (1) data and systems management; (2) software technology for autonomous systems; (3) technology and methods for improving the software development process; (4) advances related to systems of software elements, including their architecture, their attributes as systems, and their interfaces with users and other systems; and (5) applications of software, including both the basic algorithms used in a number of applications and the software specific to a particular theme or discipline area. The impact of each theme on software is assessed.
An opportunity cost model of subjective effort and task performance
Kurzban, Robert; Duckworth, Angela; Kable, Joseph W.; Myers, Justus
2013-01-01
Why does performing certain tasks cause the aversive experience of mental effort and concomitant deterioration in task performance? One explanation posits a physical resource that is depleted over time. We propose an alternate explanation that centers on mental representations of the costs and benefits associated with task performance. Specifically, certain computational mechanisms, especially those associated with executive function, can be deployed for only a limited number of simultaneous tasks at any given moment. Consequently, the deployment of these computational mechanisms carries an opportunity cost – that is, the next-best use to which these systems might be put. We argue that the phenomenology of effort can be understood as the felt output of these cost/benefit computations. In turn, the subjective experience of effort motivates reduced deployment of these computational mechanisms in the service of the present task. These opportunity cost representations, then, together with other cost/benefit calculations, determine effort expended and, everything else equal, result in performance reductions. In making our case for this position, we review alternate explanations both for the phenomenology of effort associated with these tasks and for performance reductions over time. Likewise, we review the broad range of relevant empirical results from across subdisciplines, especially psychology and neuroscience. We hope that our proposal will help to build links among the diverse fields that have been addressing similar questions from different perspectives, and we emphasize ways in which alternate models might be empirically distinguished. PMID:24304775
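The cost/benefit computation at the heart of this proposal can be made concrete with a toy model. The sketch below is one reading of the opportunity-cost account, with invented utility values; it is not the authors' formalization.

```python
def allocated_effort(task_benefit, alternative_benefits, max_effort=1.0):
    """Effort given to the focal task after discounting the next-best alternative."""
    opportunity_cost = max(alternative_benefits, default=0.0)
    net_value = task_benefit - opportunity_cost
    # Felt effort tracks opportunity cost; negative net value yields disengagement.
    return max(0.0, min(max_effort, net_value))

print(allocated_effort(0.8, [0.3, 0.5]))  # 0.3 -> partial engagement
print(allocated_effort(0.4, [0.7]))       # 0.0 -> task abandoned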
A CMMI-based approach for medical software project life cycle study.
Chen, Jui-Jen; Su, Wu-Chen; Wang, Pei-Wen; Yen, Hung-Chi
2013-01-01
In terms of medical techniques, Taiwan has gained international recognition in recent years. However, the medical information system industry in Taiwan is still at a developing stage compared with the software industries of other nations. In addition, systematic development processes are indispensable elements of software development: they help developers increase their productivity and efficiency and avoid unnecessary risks arising during the development process. This paper therefore presents an application of Light-Weight Capability Maturity Model Integration (LW-CMMI) to the Chang Gung Medical Research Project (CMRP) in the nuclear medicine field. The application integrates the user requirements, system design, and testing of software development processes into a three-layer (Domain, Concept and Instance) model, expresses the model in structural System Modeling Language (SysML) diagrams, and converts part of the manual effort necessary for project management maintenance into computational effort, for example, (semi-)automatic delivery of traceability management. The application supports establishing the artifacts "requirement specification document", "project execution plan document", "system design document" and "system test document", and delivered a prototype of a lightweight project management tool for the nuclear medicine software project. The results of this application can serve as a reference for other medical institutions developing medical information systems and supporting project management to achieve the aim of patient safety.
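A minimal sketch of the (semi-)automatic traceability management the abstract mentions, using hypothetical artifact identifiers; the actual LW-CMMI tooling is SysML-based and richer than this.

```python
# Links from requirements through design elements to test cases, checked for gaps.
requirements = {"REQ-1": ["DES-1"], "REQ-2": ["DES-2", "DES-3"], "REQ-3": []}
design_to_tests = {"DES-1": ["TC-1"], "DES-2": ["TC-2"], "DES-3": []}

def trace_gaps(req_to_design, des_to_tests):
    gaps = []
    for req, designs in req_to_design.items():
        if not designs:
            gaps.append((req, "no design element"))
        for des in designs:
            if not des_to_tests.get(des):
                gaps.append((req, f"{des} has no test case"))
    return gaps

print(trace_gaps(requirements, design_to_tests))
# [('REQ-2', 'DES-3 has no test case'), ('REQ-3', 'no design element')]
```

Automating this kind of gap report is exactly the manual-to-computational-effort conversion the paper describes.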
Fault Tolerant Software Technology for Distributed Computer Systems
1989-03-01
RADC-TR-88-296, Final Technical Report, March 1989. This report documents "Fault Tolerant Software Technology for Distributed Computing Systems," a two-year effort performed at Georgia Institute of Technology as part of the Clouds Project. The Clouds...
NASA Technical Reports Server (NTRS)
1989-01-01
Loredan Biomedical, Inc.'s LIDO, a computerized physical therapy system, was purchased by NASA in 1985 for evaluation as a Space Station Freedom exercise program. In 1986, while involved in an ARC muscle conditioning project, Malcom Bond, Loredan's chairman, designed an advanced software package for NASA which became the basis for LIDOSOFT software used in the commercially available system. The system employs a "proprioceptive" software program which perceives internal body conditions, induces perturbations to muscular effort and evaluates the response. Biofeedback on a screen allows a patient to observe his own performance.
1983-05-01
...observed end-of-course scores for tasks trained to criterion. The MGA software was calibrated to provide retention estimates at two levels of... exceed the MGA estimates. Thirty-five out of forty, or 87.5%, of the tasks met this expectation. For these first trial data, MGA software predicts... Objective: The objective of this effort was to perform an operational test of the capability of MGA Skill Training and Retention (STAR©) software to...
A Survey On Management Of Software Engineering In Japan
NASA Astrophysics Data System (ADS)
Kadono, Yasuo; Tsubaki, Hiroe; Tsuruho, Seishiro
2008-05-01
The purpose of this study is to clarify the mechanism of how software engineering capabilities relate to the business performance of IT vendors in Japan. To do this, we developed a structural model using factors related to software engineering, business performance, and the competitive environment. By analyzing data collected from 78 major IT vendors in Japan, we found that superior deliverables and business performance were correlated with the effort expended particularly on human resource development, quality assurance, research and development, and process improvement.
NASA Technical Reports Server (NTRS)
1979-01-01
Program elements of the power module (PM) system are identified, structured, and defined according to the planned work breakdown structure. Efforts required to design, develop, manufacture, test, check out, launch, and operate a protoflight assembled 25 kW, 50 kW, and 100 kW PM include the preparation and delivery of related software, government furnished equipment, space support equipment, ground support equipment, launch site verification software, orbital verification software, and all related data items.
Collected software engineering papers, volume 8
NASA Technical Reports Server (NTRS)
1990-01-01
A collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period November 1989 through October 1990 is presented. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography. The seven presented papers are grouped into four major categories: (1) experimental research and evaluation of software measurement; (2) studies on models for software reuse; (3) a software tool evaluation; and (4) Ada technology and studies in the areas of reuse and specification.
Space Shuttle Software Development and Certification
NASA Technical Reports Server (NTRS)
Orr, James K.; Henderson, Johnnie A
2000-01-01
Man-rated software, "software which is in control of systems and environments upon which human life is critically dependent," must be highly reliable. The Space Shuttle Primary Avionics Software System is an excellent example of such a software system. Lessons learned from more than 20 years of effort have identified basic elements that must be present to achieve this high degree of reliability. The elements include rigorous application of appropriate software development processes, use of trusted tools to support those processes, quantitative process management, and defect elimination and prevention. This presentation highlights methods used within the Space Shuttle project and raises questions that must be addressed to provide similar success in a cost-effective manner on future long-term projects where key application development tools are COTS rather than internally developed custom application development tools.
Mining Software Usage with the Automatic Library Tracking Database (ALTD)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hadri, Bilel; Fahey, Mark R
2013-01-01
Tracking software usage is important for HPC centers, computer vendors, code developers and funding agencies to provide more efficient and targeted software support, and to forecast needs and guide HPC software effort towards the Exascale era. However, accurately tracking software usage on HPC systems has been a challenging task. In this paper, we present a tool called the Automatic Library Tracking Database (ALTD) that has been developed and put into production on several Cray systems. The ALTD infrastructure prototype automatically and transparently stores information about the libraries linked into an application at compilation time, as well as the executables launched in a batch job. We illustrate the usage of libraries, compilers and third-party software applications on a system managed by the National Institute for Computational Sciences.
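A simplified sketch of the ALTD idea: intercept the link step, record which libraries each executable was built against, and aggregate usage later. The table schema and the -l flag parsing below are illustrative assumptions, not ALTD's actual implementation.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE link_record (executable TEXT, library TEXT)")

def record_link(executable, linker_args):
    # ALTD wraps the real linker; here we just scan -l flags from its argv.
    libs = [a[2:] for a in linker_args if a.startswith("-l")]
    conn.executemany("INSERT INTO link_record VALUES (?, ?)",
                     [(executable, lib) for lib in libs])

record_link("a.out", ["-o", "a.out", "main.o", "-lfftw3", "-lhdf5", "-lm"])
for row in conn.execute(
        "SELECT library, COUNT(*) FROM link_record GROUP BY library"):
    print(row)  # per-library usage counts, the raw material for ALTD reports
```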
In My Own Time: Tuition Fees, Class Time and Student Effort in Non-Formal (Or Continuing) Education
ERIC Educational Resources Information Center
Bolli, Thomas; Johnes, Geraint
2015-01-01
We develop and empirically test a model which examines the impact of changes in class time and tuition fees on student effort in the form of private study. The data come from the European Union's Adult Education Survey, conducted over the period 2005-2008. We find, in line with theoretical predictions, that the time students devote to private…
ERIC Educational Resources Information Center
Colletta, Nat J.
In the fall of 1974, I was invited to serve as a consultant to the Indonesian effort to develop a National Strategy for Non-Formal Education. The brunt of my effort concerned action research for developing and testing an empirical "Community Learning System" designed to link local learning needs with the management-resource-learning…
Application of software technology to automatic test data analysis
NASA Technical Reports Server (NTRS)
Stagner, J. R.
1991-01-01
The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.
ERIC Educational Resources Information Center
Wright, Gerald P.
2013-01-01
Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…
HALOE test and evaluation software
NASA Technical Reports Server (NTRS)
Edmonds, W.; Natarajan, S.
1987-01-01
Computer programming, system development and analysis efforts during this contract were carried out in support of the Halogen Occultation Experiment (HALOE) at NASA/Langley. Support in the major areas of data acquisition and monitoring, data reduction and system development are described along with a brief explanation of the HALOE project. Documented listings of major software are located in the appendix.
ERIC Educational Resources Information Center
Zhan, Wei; Goulart, Ana; Morgan, Joseph A.; Porter, Jay R.
2011-01-01
This paper discusses the details of the curricular development effort with a focus on the vertical and horizontal integration of laboratory curricula and course projects within the Electronic Engineering Technology (EET) program at Texas A&M University. Both software and hardware aspects are addressed. A common set of software tools are…
ERIC Educational Resources Information Center
Pribela, Ivan; Ivanovic, Mirjana; Budimac, Zoran
2009-01-01
This paper discusses Svetovid, cross-platform software that helps instructors to assess the amount of effort put into practical exercises and exams in courses related to computer programming. The software was developed as an attempt at solving problems associated with practical exercises and exams. This paper discusses the design and use of…
Development of a Coordinate Transformation method for direct georeferencing in map projection frames
NASA Astrophysics Data System (ADS)
Zhao, Haitao; Zhang, Bing; Wu, Changshan; Zuo, Zhengli; Chen, Zhengchao
2013-03-01
This paper develops a novel Coordinate Transformation method (CT-method), with which the orientation angles (roll, pitch, heading) of the local tangent frame of the GPS/INS system are transformed into those (omega, phi, kappa) of the map projection frame for direct georeferencing (DG). In particular, the orientation angles in the map projection frame are derived from a sequence of coordinate transformations. The effectiveness of the orientation angle transformation was verified by comparison with DG results obtained from conventional methods (the Legat method and the POSPac method) using empirical data. The CT-method was also validated with simulated data. One advantage of the proposed method is that the orientation angles can be acquired simultaneously while calculating the position elements of the exterior orientation (EO) parameters and auxiliary point coordinates by coordinate transformation. The three methods were demonstrated and compared using empirical data. Empirical results show that the CT-method is as sound and effective as the Legat method. Compared with the POSPac method, the CT-method is more suitable for calculating EO parameters for DG in map projection frames. The DG accuracy of the CT-method and the Legat method is at the same level. The DG results of all three methods contain systematic errors in height, due to inconsistent length projection distortion between the vertical and horizontal components; these errors can be significantly reduced using the EO height correction technique in Legat's approach. As with the empirical data, the effectiveness of the CT-method was also demonstrated with simulated data. (POSPac method: presented in an Applanix POSPac software technical note (Hutton and Savina, 1997) and implemented in the POSEO module of the POSPac software.)
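A hedged numerical sketch of the core idea, that mapping-frame angles fall out of a chain of rotation matrices rather than being measured directly. It assumes a simple R = Rz(kappa) Ry(phi) Rx(omega) convention and an invented meridian-convergence value; the paper's actual frame chain is more involved.

```python
import numpy as np

def rot_x(a): c, s = np.cos(a), np.sin(a); return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])
def rot_y(a): c, s = np.cos(a), np.sin(a); return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])
def rot_z(a): c, s = np.cos(a), np.sin(a); return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def angles_from_matrix(R):
    """Recover (omega, phi, kappa) from R = Rz(kappa) @ Ry(phi) @ Rx(omega)."""
    omega = np.arctan2(R[2, 1], R[2, 2])
    phi = np.arcsin(-R[2, 0])
    kappa = np.arctan2(R[1, 0], R[0, 0])
    return omega, phi, kappa

# Attitude of the body in the local tangent frame (roll=1, pitch=2, heading=45 deg),
# then one further rotation into the map projection frame (assumed -0.3 deg
# meridian convergence). Mapping-frame angles fall out of the matrix product.
R_body_to_local = rot_z(np.radians(45)) @ rot_y(np.radians(2)) @ rot_x(np.radians(1))
R_local_to_map = rot_z(np.radians(-0.3))
omega, phi, kappa = angles_from_matrix(R_local_to_map @ R_body_to_local)
print(np.degrees([omega, phi, kappa]))  # ~[1.0, 2.0, 44.7]
```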
Pattern of state coal taxation. [Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gulley, D.A.
1981-01-01
This paper reviews the recent history of state coal taxation and reports an empirically-based effort at defining the key determinants of state and local coal taxation. A pattern emerges, but the analysis is complicated by the empirical and conceptual difficulties typical of such studies. Perhaps as important a result as the detection of a pattern is the recognition that many seemingly important variables do not appear to have consistently influenced tax levels. For policy makers and for industry, it appears that the present concern over a coal-states cartel is excessive. One can speculate that draconian tax adjustments on the basis of a crude indicator like reserve base will ultimately transfer less wealth than would skillful preemption of rent. It is also noteworthy that the sign of the tax effort variable is positive, indicating that coal tax rates are consistent with other tax efforts, not a substitute for them. Accepting impacts and general tax effort variables as the best explanations of interstate variations in tax effort is a somewhat different matter than determining what any given state's tax rate ought to be; such a question lies beyond the scope of this paper. This tax-determinant study cannot define the right level of coal taxation, but it can suggest that no trend is yet evident toward entrepreneurial tax rates. 20 references, 4 figures.
ERIC Educational Resources Information Center
O'Connor, Robert
1992-01-01
If market economies are to emerge from the ruins of the Soviet empire, people need training in almost everything. Efforts are being made in the areas of management, service, accounting, and police work. (Author/JOW)
Patmanidis, Ilias
2018-01-01
In bionanotechnology, the field of creating functional materials consisting of bio-inspired molecules, the function and shape of a nanostructure only appear through the assembly of many small molecules together. The large number of building blocks required to define a nanostructure combined with the many degrees of freedom in packing small molecules has long precluded molecular simulations, but recent advances in computational hardware as well as software have made classical simulations available to this strongly expanding field. Here, we review the state of the art in simulations of self-assembling bio-inspired supramolecular systems. We will first discuss progress in force fields, simulation protocols and enhanced sampling techniques using recent examples. Secondly, we will focus on efforts to enable the comparison of experimentally accessible observables and computational results. Experimental quantities that can be measured by microscopy, spectroscopy and scattering can be linked to simulation output either directly or indirectly, via quantum mechanical or semi-empirical techniques. Overall, we aim to provide an overview of the various computational approaches to understand not only the molecular architecture of nanostructures, but also the mechanism of their formation. PMID:29688238
A model for evaluating the social performance of construction waste management.
Yuan, Hongping
2012-06-01
Existing literature shows that considerable research effort has been devoted to the economic performance of construction waste management (CWM), but less attention has been paid to its social performance. This study therefore attempts to develop a model for quantitatively evaluating the social performance of CWM by using a system dynamics (SD) approach. Firstly, major variables affecting the social performance of CWM are identified, and a holistic system for assessing the social performance of CWM is formulated in line with the feedback relationships underlying these variables. The developed system is then converted into an SD model using the software iThink. An empirical case study is finally conducted to demonstrate application of the model. Results of model validation indicate that the model is robust and reasonable in reflecting the situation of the real system under study. Findings of the case study offer helpful insights into effectively promoting the social performance of CWM for the project investigated. Furthermore, the model exhibits great potential to function as an experimental platform for dynamically evaluating the effects of management measures on improving the social performance of CWM in construction projects. Copyright © 2012 Elsevier Ltd. All rights reserved.
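A bare-bones stock-and-flow sketch, in plain Python rather than iThink and with invented parameter values, of the kind of feedback structure such an SD model contains: a satisfaction stock fed by waste-handling effectiveness and drained by nuisance from remaining on-site waste.

```python
# Stocks: public satisfaction and on-site construction waste (arbitrary units).
satisfaction, waste = 50.0, 100.0
dt, recycling_rate = 0.25, 0.08
for step in range(40):                    # simulate 10 time units
    removed = recycling_rate * waste      # flow: waste handled per unit time
    nuisance = 0.02 * waste               # flow: dissatisfaction from remaining waste
    waste += -removed * dt
    satisfaction += (0.5 * removed - nuisance) * dt
print(round(waste, 1), round(satisfaction, 1))  # final stock levels
```

Changing a policy lever such as recycling_rate and re-running the loop is the "experimental platform" use the abstract describes, in miniature.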
A formative evaluation of CU-SeeMe
NASA Astrophysics Data System (ADS)
Bibeau, Michael
1995-02-01
CU-SeeMe is a video conferencing software package that was designed and programmed at Cornell University. The program works with the TCP/IP network protocol and allows two or more parties to conduct a real-time video conference with full audio support. In this paper we evaluate CU-SeeMe through the process of formative evaluation. We first perform a critical review of the software using a subset of the Smith and Mosier guidelines for human-computer interaction. Next, we empirically review the software interface through a series of benchmark tests that are derived directly from a set of scenarios. The scenarios attempt to model real-world situations that might be encountered by an individual in the target user class. Designing benchmark tasks becomes a natural and straightforward process when they are derived from the scenario set. Empirical measures are taken for each task, including completion times and error counts. These measures are accompanied by critical incident analysis [2, 7, 13], which serves to identify problems with the interface and the cognitive roots of those problems. The critical incidents reported by participants are accompanied by explanations of what caused the problem and why. This helps in the process of formulating solutions for observed usability problems. All the testing results are combined in the Appendix in an illustrated partial redesign of the CU-SeeMe interface.
Stability of cosmetic emulsion containing different amount of hemp oil.
Kowalska, M; Ziomek, M; Żbikowska, A
2015-08-01
The aim of the study was to determine the optimal conditions, that is, the content of hemp oil and the time of homogenization, to obtain stable dispersion systems. For this purpose, six emulsions were prepared, their stability was examined empirically, and the most correctly formulated emulsion composition was determined using a computer simulation. Variable parameters (oil content and homogenization time) were indicated by the optimization software based on Kleeman's method. Physical properties of the synthesized emulsions were studied by numerous techniques involving particle size analysis, optical microscopy, the Turbiscan test and viscosity measurements. The emulsion containing 50 g of oil and homogenized for 6 min had the highest stability. Empirically determined parameters proved to be consistent with the results obtained using the computer software. The computer simulation showed that the most stable emulsion should contain from 30 to 50 g of oil and should be homogenized for 2.5-6 min. The computer software based on Kleeman's method proved to be useful for quick optimization of the composition and production parameters of stable emulsion systems. Moreover, obtaining an emulsion system with proper stability justifies further research extended with sensory analysis, which will allow the application of such systems (containing hemp oil, beneficial for skin) in the cosmetic industry. © 2015 Society of Cosmetic Scientists and the Société Française de Cosmétologie.
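The optimization step can be sketched as a small search over the two variable parameters. The stability score below is an invented placeholder response surface (peaking near the values the abstract reports), not Kleeman's method, which we do not reproduce here.

```python
def stability_score(oil_g, homog_min):
    # Placeholder response surface peaking near 50 g oil / 6 min, per the abstract.
    return -((oil_g - 50) / 20) ** 2 - ((homog_min - 6) / 3.5) ** 2

candidates = [(oil, t) for oil in range(30, 55, 5) for t in (2.5, 4.0, 6.0)]
best = max(candidates, key=lambda c: stability_score(*c))
print(best)  # (50, 6.0) with this toy surface
```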
Sustainable Software Decisions for Long-term Projects (Invited)
NASA Astrophysics Data System (ADS)
Shepherd, A.; Groman, R. C.; Chandler, C. L.; Gaylord, D.; Sun, M.
2013-12-01
Adopting new, emerging technologies can be difficult for established projects that are positioned to exist for years to come. In some cases the challenge lies in the pre-existing software architecture. In others, the challenge lies in the fluctuation of resources like people, time and funding. The Biological and Chemical Oceanography Data Management Office (BCO-DMO) was created in late 2006 by combining the data management offices for the U.S. GLOBEC and U.S. JGOFS programs to publish data for researchers funded by the National Science Foundation (NSF). Since its inception, BCO-DMO has been supporting access and discovery of these data through web-accessible software systems, and the office has worked through many of the challenges of incorporating new technologies into its software systems. From migrating human readable, flat file metadata storage into a relational database, and now, into a content management system (Drupal) to incorporating controlled vocabularies, new technologies can radically affect the existing software architecture. However, through the use of science-driven use cases, effective resource management, and loosely coupled software components, BCO-DMO has been able to adapt its existing software architecture to adopt new technologies. One of the latest efforts at BCO-DMO revolves around applying metadata semantics for publishing linked data in support of data discovery. This effort primarily affects the metadata web interface software at http://bco-dmo.org and the geospatial interface software at http://mapservice.bco-dmo.org/. With guidance from science-driven use cases and consideration of our resources, implementation decisions are made using a strategy to loosely couple the existing software systems to the new technologies. The results of this process led to the use of REST web services and a combination of contributed and custom Drupal modules for publishing BCO-DMO's content using the Resource Description Framework (RDF) via an instance of the Virtuoso Open-Source triplestore.
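A small sketch of what publishing metadata as RDF buys a data office like this: the same triples that drive a landing page also answer ad hoc discovery queries. The namespace, predicates, and dataset URIs are placeholders, not BCO-DMO's actual vocabulary; the sketch assumes the third-party rdflib package.

```python
from rdflib import Graph, Literal, Namespace

BCO = Namespace("http://example.org/bco-dmo/")   # placeholder namespace
g = Graph()
dataset = BCO["dataset/1"]
g.add((dataset, BCO.title, Literal("CTD casts, cruise KN207")))
g.add((dataset, BCO.parameter, Literal("temperature")))
g.add((dataset, BCO.parameter, Literal("salinity")))

# Discovery query: which datasets measured salinity?
q = 'SELECT ?d WHERE { ?d <http://example.org/bco-dmo/parameter> "salinity" }'
for row in g.query(q):
    print(row.d)
```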
NASA Technical Reports Server (NTRS)
Eichenlaub, Carl T.; Harper, C. Douglas; Hird, Geoffrey
1993-01-01
Life-critical applications warrant a higher level of software reliability than has yet been achieved. Since it is not certain that traditional methods alone can provide the required ultra reliability, new methods should be examined as supplements or replacements. This paper describes a mathematical counterpart to the traditional process of empirical testing. ORA's Penelope verification system is demonstrated as a tool for evaluating the correctness of Ada software. Grady Booch's Ada calendar utility package, obtained through NASA, was specified in the Larch/Ada language. Formal verification in the Penelope environment established that many of the package's subprograms met their specifications. In other subprograms, failed attempts at verification revealed several errors that had escaped detection by testing.
Kapur, Tina; Pieper, Steve; Fedorov, Andriy; Fillion-Robin, J-C; Halle, Michael; O'Donnell, Lauren; Lasso, Andras; Ungi, Tamas; Pinter, Csaba; Finet, Julien; Pujol, Sonia; Jagadeesan, Jayender; Tokuda, Junichi; Norton, Isaiah; Estepar, Raul San Jose; Gering, David; Aerts, Hugo J W L; Jakab, Marianna; Hata, Nobuhiko; Ibanez, Luiz; Blezek, Daniel; Miller, Jim; Aylward, Stephen; Grimson, W Eric L; Fichtinger, Gabor; Wells, William M; Lorensen, William E; Schroeder, Will; Kikinis, Ron
2016-10-01
The National Alliance for Medical Image Computing (NA-MIC) was launched in 2004 with the goal of investigating and developing an open source software infrastructure for the extraction of information and knowledge from medical images using computational methods. Several leading research and engineering groups participated in this effort that was funded by the US National Institutes of Health through a variety of infrastructure grants. This effort transformed 3D Slicer from an internal, Boston-based, academic research software application into a professionally maintained, robust, open source platform with an international leadership and developer and user communities. Critical improvements to the widely used underlying open source libraries and tools (VTK, ITK, CMake, CDash, DCMTK) were an additional consequence of this effort. This project has contributed to close to a thousand peer-reviewed publications and a growing portfolio of US and international funded efforts expanding the use of these tools in new medical computing applications every year. In this editorial, we discuss what we believe are gaps in the way medical image computing is pursued today; how a well-executed research platform can enable discovery, innovation and reproducible science ("Open Science"); and how our quest to build such a software platform has evolved into a productive and rewarding social engineering exercise in building an open-access community with a shared vision. Copyright © 2016 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Patrick
The primary challenge motivating this project is the widening gap between the ability to compute information and the ability to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis on only a small fraction of the data they calculate, resulting in a substantial likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community aimed at fostering production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnership with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.
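A schematic illustration (not the project's software) of the in situ idea: reduce each step's field to a few statistics while it is still in memory, rather than writing full snapshots to disk for post hoc analysis. The "physics" below is a stand-in.

```python
import numpy as np

def simulate_step(state, rng):
    return state + 0.01 * rng.standard_normal(state.shape)  # stand-in physics

rng = np.random.default_rng(0)
state = np.zeros((512, 512))
series = []
for step in range(100):
    state = simulate_step(state, rng)
    # In situ reduction: three numbers retained per step, instead of a full
    # 512x512 snapshot written to disk for post hoc analysis.
    series.append((step, float(state.mean()), float(state.max())))
print(series[-1])
```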
ERIC Educational Resources Information Center
Sexton, Randall; Hignite, Michael; Margavio, Thomas M.; Margavio, Geanie W.
2009-01-01
Information Literacy is a concept that evolved as a result of efforts to move technology-based instructional and research efforts beyond the concepts previously associated with "computer literacy." While computer literacy was largely a topic devoted to knowledge of hardware and software, information literacy is concerned with students' abilities…
Software Development Processes Applied to Computational Icing Simulation
NASA Technical Reports Server (NTRS)
Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.
1999-01-01
The development of computational icing simulation methods is making the transition from research use to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.
Artificial Intelligence In Computational Fluid Dynamics
NASA Technical Reports Server (NTRS)
Vogel, Alison Andrews
1991-01-01
Paper compares four first-generation artificial-intelligence (AI) software systems for computational fluid dynamics. Includes: Expert Cooling Fan Design System (EXFAN), PAN AIR Knowledge System (PAKS), grid-adaptation program MITOSIS, and Expert Zonal Grid Generation (EZGrid). Focuses on knowledge-based ("expert") software systems. Analyzes intended tasks, kinds of knowledge possessed, magnitude of effort required to codify knowledge, how quickly constructed, performances, and return on investment. On basis of comparison, concludes AI most successful when applied to well-formulated problems solved by classifying or selecting preenumerated solutions. In contrast, application of AI to poorly understood or poorly formulated problems generally results in long development time and large investment of effort, with no guarantee of success.
Intelligent Command and Control Systems for Satellite Ground Operations
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1999-01-01
This grant, Intelligent Command and Control Systems for Satellite Ground Operations, funded by NASA Goddard Space Flight Center, has spanned almost a decade. During this time, it has supported a broad range of research addressing the changing needs of NASA operations. It is important to note that many of NASA's evolving needs, for example, the use of automation to drastically reduce (e.g., by 70%) operations costs, mirror requirements in both the government and private sectors. Initially the research addressed the appropriate use of emerging and inexpensive computational technologies, such as X Windows, graphics, and color, together with COTS (commercial-off-the-shelf) hardware and software such as standard Unix workstations, to re-engineer satellite operations centers. The first phase of research supported by this grant explored the development of principled design methodologies to make effective use of emerging and inexpensive technologies. The ultimate performance measures for new designs were whether or not they increased system effectiveness while decreasing costs. GT-MOCA (the Georgia Tech Mission Operations Cooperative Associate) and GT-VITA (the Georgia Tech Visual and Inspectable Tutor and Assistant), whose latter stages were supported by this research, explored model-based design of collaborative operations teams and the design of intelligent tutoring systems, respectively. Implemented in proof-of-concept form for satellite operations, empirical evaluations of both, using satellite operators for the former and personnel involved in satellite control operations for the latter, demonstrated unequivocally the feasibility and effectiveness of the proposed modeling and design strategy underlying both research efforts. The proof-of-concept implementation of GT-MOCA showed that the methodology could specify software requirements that enabled a human-computer operations team to perform without any significant performance differences from the standard two-person satellite operations team. GT-VITA, using the same underlying methodology, the operator function model (OFM), and its computational implementation, OFMspert, successfully taught satellite control knowledge required by flight operations team members. The tutor structured knowledge in three ways: declarative knowledge (e.g., What is this? What does it do?), procedural knowledge, and operational skill. Operational skill is essential in real-time operations. It combines the two former knowledge types, assisting a student to use them effectively in a dynamic, multi-tasking, real-time operations environment. A high-fidelity simulator of the operator interface to the ground control system, including an almost full replication of both the human-computer interface and human interaction with the dynamic system, was used in the GT-MOCA and GT-VITA evaluations. The GT-VITA empirical evaluation, conducted with a range of 'novices' that included GSFC operations management, GSFC operations software developers, and new flight operations team members, demonstrated that GT-VITA effectively taught a wide range of knowledge in a succinct and engaging manner.
NASA Astrophysics Data System (ADS)
Mandache, C.; Khan, M.; Fahr, A.; Yanishevsky, M.
2011-03-01
Probability of detection (PoD) studies are broadly used to determine the reliability of specific nondestructive inspection procedures, as well as to provide data for damage tolerance life estimations and calculation of inspection intervals for critical components. They require inspections on a large set of samples, a fact that makes these statistical assessments time- and cost-consuming. Physics-based numerical simulations of nondestructive testing inspections could be used as a cost-effective alternative to empirical investigations. They realistically predict the inspection outputs as functions of the input characteristics related to the test piece, transducer and instrument settings, which are subsequently used to partially substitute and/or complement inspection data in PoD analysis. This work focuses on the numerical modelling aspects of eddy current testing for the bolt hole inspections of wing box structures typical of the Lockheed Martin C-130 Hercules and P-3 Orion aircraft, found in the air force inventory of many countries. Boundary element-based numerical modelling software was employed to predict the eddy current signal responses when varying inspection parameters related to probe characteristics, crack geometry and test piece properties. Two demonstrator exercises were used for eddy current signal prediction when lowering the driver probe frequency and changing the material's electrical conductivity, followed by subsequent discussions and examination of the implications on using simulated data in the PoD analysis. Despite some simplifying assumptions, the modelled eddy current signals were found to provide similar results to the actual inspections. It is concluded that physics-based numerical simulations have the potential to partially substitute or complement inspection data required for PoD studies, reducing the cost, time, effort and resources necessary for a full empirical PoD assessment.
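To make the PoD link concrete: simulated (or measured) signal amplitudes become hit/miss data, to which a PoD(a) curve is fitted. The sketch below uses invented crack sizes, noise level, and detection threshold, and scikit-learn's logistic regression rather than the full MIL-HDBK-1823 style procedure.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
sizes = np.linspace(0.2, 3.0, 200)                      # crack lengths, mm
signal = 0.8 * sizes + rng.normal(0, 0.4, sizes.size)   # simulated signal amplitude
hits = (signal > 1.2).astype(int)                       # detection threshold

# Fit PoD(a) as a logistic curve and read off the 90% detectable size.
model = LogisticRegression(C=1e6).fit(sizes.reshape(-1, 1), hits)
b0, b1 = model.intercept_[0], model.coef_[0, 0]
a90 = (np.log(9) - b0) / b1
print(f"a90 = {a90:.2f} mm")
```

Substituting modelled eddy current responses for the synthetic signal array is precisely how simulation can substitute for, or complement, empirical PoD inspections.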
How do feelings influence effort? An empirical study of entrepreneurs' affect and venture effort.
Foo, Maw-Der; Uy, Marilyn A; Baron, Robert A
2009-07-01
How do feelings influence the effort of entrepreneurs? To obtain data on this issue, the authors implemented experience sampling methodology in which 46 entrepreneurs used cell phones to provide reports on their affect, future temporal focus, and venture effort twice daily for 24 days. Drawing on the affect-as-information theory, the study found that entrepreneurs' negative affect directly predicts entrepreneurs' effort toward tasks that are required immediately. Results were consistent for within-day and next-day time lags. Extending the theory, the study found that positive affect predicts venture effort beyond what is immediately required and that this relationship is mediated by future temporal focus. The mediating effects were significant only for next-day outcomes. Implications of findings on the nature of the affect-effort relationship for different time lags are discussed.
ERIC Educational Resources Information Center
Orfanou, Konstantina; Tselios, Nikolaos; Katsanos, Christos
2015-01-01
Perceived usability affects greatly student's learning effectiveness and overall learning experience, and thus is an important requirement of educational software. The System Usability Scale (SUS) is a well-researched and widely used questionnaire for perceived usability evaluation. However, surprisingly few studies have used SUS to evaluate the…
Beyond the Quantitative and Qualitative Divide: Research in Art Education as Border Skirmish.
ERIC Educational Resources Information Center
Sullivan, Graeme
1996-01-01
Analyzes a research project that utilizes a coherent conceptual model of art education research incorporating the demand for empirical rigor and providing for diverse interpretive frameworks. Briefly profiles the NUD*IST (Non-numerical Unstructured Data Indexing Searching and Theorizing) software system that can organize and retrieve complex…
ERIC Educational Resources Information Center
Perry, James L.; Kraemer, Kenneth L.
1978-01-01
Argues that innovation attributes, together with policies associated with the diffusion on an innovation, account for significant differences in diffusion patterns. An empirical analysis of this thesis focuses on the diffusion of computer applications software in local government. Available from Elsevier Scientific Publishing Co., Box 211,…
Students' Understanding of the Concept of Interface in a Situated Context
ERIC Educational Resources Information Center
Boustedt, Jonas
2009-01-01
The current paper describes an empirical study with the aim of producing insights about how students experience programming and software engineering. The research aims to investigate the students' world, and hence, we have chosen a phenomenographic approach. Our questions focus on the students' experiences of concepts related to a realistic…
ERIC Educational Resources Information Center
Eteokleous, Nikleia; Pavlou, Victoria; Tsolakidis, Simos
2015-01-01
As a way to respond to the contemporary challenges for promoting multiliteracies and multimodality in education, the current study proposes a theoretical framework--the multiliteracies model--in identifying, developing and evaluating multimodal material. The article examines, first theoretically and then empirically, the promotion of…
Using Empirical Evidence in the Process of Proving: The Case of Dynamic Geometry
ERIC Educational Resources Information Center
Guven, Bulent; Cekmez, Erdem; Karatas, Ilhan
2010-01-01
With the emergence of Dynamic Geometry Software (DGS), a theoretical gap between the acquisition (inductive) and the justification (deductive) of a mathematical statement has started a debate. Some educators believe that deductive proof in geometry should be abandoned in favour of an experimental approach to mathematical justification. This…
An Empirical Research Study of the Efficacy of Two Plagiarism-Detection Applications
ERIC Educational Resources Information Center
Hill, Jacob D.; Page, Elaine Fetyko
2009-01-01
This article describes a study of the two most popular plagiarism-detection software platforms available on today's market--Turnitin (http://www.turnitin.com/static/index.html) and SafeAssign (http://www.safeassign.com/). After a brief discussion of plagiarism's relevance to librarians, the authors examine plagiarism-detection methodology and…
Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis
ERIC Educational Resources Information Center
Young, Cristobal; Holsteen, Katherine
2017-01-01
Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
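The two-step approach can be sketched compactly: estimate the same coefficient of interest under every subset of control variables, then summarize the distribution of estimates. Data and variable names below are synthetic; this is not the authors' software.

```python
from itertools import combinations
import numpy as np

rng = np.random.default_rng(0)
n = 500
z1, z2, z3 = rng.normal(size=(3, n))      # candidate control variables
x = z1 + rng.normal(size=n)               # focal regressor (confounded with z1)
y = 2.0 * x + 1.5 * z1 - 0.5 * z2 + rng.normal(size=n)

controls = {"z1": z1, "z2": z2, "z3": z3}
estimates = []
for k in range(len(controls) + 1):
    for subset in combinations(controls, k):
        X = np.column_stack([np.ones(n), x] + [controls[c] for c in subset])
        beta = np.linalg.lstsq(X, y, rcond=None)[0]
        estimates.append(beta[1])         # coefficient on x under this model
print(f"coef on x ranges {min(estimates):.2f}..{max(estimates):.2f} "
      f"over {len(estimates)} models")
```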
Using ITS to Create an Insurance Industry Application: A Joint Case Study.
ERIC Educational Resources Information Center
Boies, Stephen J.; And Others
1993-01-01
Presents an empirical case study of the use of ITS, a software development environment designed by IBM, by Continental Insurance for underwriting applications. Use of a rule-based user interface style that made electronic forms look like standard insurance industry paper forms and worked according to Continental's guidelines is described.…
ERIC Educational Resources Information Center
Brasiel, Sarah; Martin, Taylor; Jeong, Soojeong; Yuan, Min
2016-01-01
An extensive body of research has demonstrated that the use in a K-12 classroom of technology, such as the Internet, computers, and software programs, enhances the learning of mathematics (Cheung & Slavin, 2013; Cohen & Hollebrands, 2011). In particular, growing empirical evidence supports that certain types of technology, such as…
Supporting C2 Research and Evaluation: An Infrastructure and its Potential Impact
2011-06-01
Reference fragments recovered from the report: "...Potential Impact," Empirical Software Engineering, Vol. 10, No. 4, pp. 405-435, 2005 (http://sir.unl.edu); [16] J. O. Engene, Terrorism in Western Europe; "...Evaluation and Conference: Proceedings of the 3rd-6th DARPA Workshops," Morgan Kaufman Publishers, 1996.
Adapting to the Pedagogy of Technology in Educational Administration
ERIC Educational Resources Information Center
Berry, James E.; Marx, Gary
2010-01-01
The field of educational administration is in a pedagogical transition. Though empirical evidence may be lacking about the efficacy of online teaching and learning, programs in educational administration are part of the greater movement to Internet delivery by virtue of market forces and advances in software and hardware tools for teaching in a…
Proposal for a CLIPS software library
NASA Technical Reports Server (NTRS)
Porter, Ken
1991-01-01
This paper is a proposal to create a software library for the C Language Integrated Production System (CLIPS) expert system shell developed by NASA. Many innovative ideas for extending CLIPS were presented at the First CLIPS Users Conference, including useful user and database interfaces. CLIPS developers would benefit from a software library of reusable code. The CLIPS Users Group should establish a software library; a course of action to make that happen is proposed. Open discussion to revise this library concept is essential, since only a group effort is likely to succeed. A response form intended to solicit opinions and support from the CLIPS community is included.
Object-oriented productivity metrics
NASA Technical Reports Server (NTRS)
Connell, John L.; Eller, Nancy
1992-01-01
Software productivity metrics are useful for sizing and costing proposed software and for measuring development productivity. Estimating and measuring source lines of code (SLOC) has proven to be a bad idea because it encourages writing more lines of code and using lower level languages. Function Point Analysis is an improved software metric system, but it is not compatible with newer rapid prototyping and object-oriented approaches to software development. A process is presented here for counting object-oriented effort points, based on a preliminary object-oriented analysis. It is proposed that this approach is compatible with object-oriented analysis, design, programming, and rapid prototyping. Statistics gathered on actual projects are presented to validate the approach.
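A schematic sketch of effort-point counting from a preliminary object-oriented analysis; the artifact kinds and weights are invented placeholders, not the calibrated values the authors validated against project statistics.

```python
# Hypothetical weights per artifact kind identified in a preliminary OO analysis.
WEIGHTS = {"class": 4, "method": 1, "external_interface": 3}

def effort_points(analysis):
    """analysis: list of (artifact_kind, count) pairs from the OO analysis."""
    return sum(WEIGHTS[kind] * count for kind, count in analysis)

print(effort_points([("class", 12), ("method", 90), ("external_interface", 4)]))
# 12*4 + 90*1 + 4*3 = 150 effort points
```

Because the counts come from analysis artifacts rather than source text, the metric does not reward padding the code with extra lines, which is the failure mode of SLOC the paper criticizes.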
NASA Astrophysics Data System (ADS)
Pozna, E.; Ramirez, A.; Mérand, A.; Mueller, A.; Abuter, R.; Frahm, R.; Morel, S.; Schmid, C.; Duc, T. Phan; Delplancke-Ströbele, F.
2014-07-01
The quality of data obtained by VLTI instruments may be refined by analyzing the continuous data supplied by the Reflective Memory Network (RMN). Based on 5 years' experience providing VLTI instruments (PACMAN, AMBER, MIDI) with RMN data, the procedure has been generalized to make synchronization with observation trouble-free. The present software interface saves not only months of effort for each instrument but also provides the benefits of software frameworks. Recent applications (GRAVITY, MATISSE) supply feedback for the software to evolve. The paper highlights the way common features have been identified so as to offer reusable code in due course.
Introduction: Cybersecurity and Software Assurance Minitrack
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burns, Luanne; George, Richard; Linger, Richard C
Modern society is dependent on software systems of remarkable scope and complexity. Yet methods for assuring their security and functionality have not kept pace. The result is persistent compromises and failures despite best efforts. Cybersecurity methods must work together for situational awareness, attack prevention and detection, threat attribution, minimization of consequences, and attack recovery. Because defective software cannot be secure, assurance technologies must play a central role in cybersecurity approaches. There is increasing recognition of the need for rigorous methods for cybersecurity and software assurance. The goal of this minitrack is to develop science foundations, technologies, and practices that can improve the security and dependability of complex systems.
NASA Technical Reports Server (NTRS)
Ensey, Tyler S.
2013-01-01
During my internship at NASA, I was a model developer for Ground Support Equipment (GSE). The purpose of a model developer is to develop and unit test model component libraries (fluid, electrical, gas, etc.). The models are designed to simulate software for GSE (Ground Special Power, Crew Access Arm, Cryo, Fire and Leak Detection System, Environmental Control System (ECS), etc.) before it is implemented in hardware. These models support verifying local control and remote software for End-Item Software Under Test (SUT). Each model simulates the physical behavior (function, state, limits and I/O) of its end-item and its dependencies as defined in the Subsystem Interface Table, Software Requirements & Design Specification (SRDS), Ground Integrated Schematic (GIS), and System Mechanical Schematic (SMS). The software of each specific model component is simulated through MATLAB's Simulink program. The intensive model development life cycle is as follows: identify source documents; identify model scope; update schedule; preliminary design review; develop model requirements; update model scope; update schedule; detailed design review; create/modify library components; implement library component references; implement subsystem components; develop a test script; run the test script; develop a user's guide; send the model out for peer review; the model is sent out for verification/validation; if there is empirical data, a validation data package is generated; if there is not empirical data, a verification package is generated; the test results are then reviewed; and finally, the user requests accreditation, and a statement of accreditation is prepared. Once each component model is reviewed and approved, the components are integrated into one model. This integrated model is then itself tested, through a test script and autotest, to confirm that all models work conjointly for a single purpose. The component I was assigned was a fluid component, a discrete pressure switch. The switch takes a fluid pressure input, and if the pressure is greater than a designated cutoff pressure, the switch stops fluid flow.
Software Development and Test Methodology for a Distributed Ground System
NASA Technical Reports Server (NTRS)
Ritter, George; Guillebeau, Pat; McNair, Ann R. (Technical Monitor)
2002-01-01
The Marshall Space Flight Center's (MSFC) Payload Operations Center (POC) ground system has evolved over a period of about 10 years. During this time the software processes have migrated from more traditional to more contemporary development processes in an effort to minimize unnecessary overhead while maximizing process benefits. The software processes that have evolved still emphasize requirements capture, software configuration management, design documentation, and making sure the products that have been developed are accountable to initial requirements. This paper gives an overview of how the software processes have evolved, highlighting the positives as well as the negatives. In addition, we mention the COTS tools that have been integrated into the processes and how they have provided value to the project.
Deindividuation and Internet software piracy.
Hinduja, Sameer
2008-08-01
Computer crime has increased exponentially in recent years as hardware, software, and network resources become more affordable and available to individuals from all walks of life. Software piracy is one prevalent type of cybercrime and has detrimentally affected the economic health of the software industry. Moreover, piracy arguably represents a tear in the moral fabric associated with the respect of intellectual property and reduces the financial incentive for product creation and innovation. Deindividuation theory, originating from the field of social psychology, argues that individuals are extricated from responsibility for their actions simply because they no longer have an acute awareness of the identity of self and of others. That is, external and internal constraints that would typically regulate questionable behavior are rendered less effective via certain anonymizing and disinhibiting conditions of the social and environmental context. This exploratory piece seeks to establish the role of deindividuation in liberating individuals to commit software piracy by testing the hypothesis that persons who prefer the anonymity and pseudonymity associated with interaction on the Internet are more likely to pirate software. Through this research, it is hoped that the empirical identification of such a social psychological determinant will help further illuminate the phenomenon.
On the need and use of models to explore the role of economic confidence: a survey.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprigg, James A.; Paez, Paul J.; Hand, Michael S.
2005-04-01
Empirical studies suggest that consumption is more sensitive to current income than suggested under the permanent income hypothesis, which raises questions regarding expectations for future income, risk aversion, and the role of economic confidence measures. This report surveys a body of fundamental economic literature as well as burgeoning computational modeling methods to support efforts to better anticipate cascading economic responses to terrorist threats and attacks. This is a three-part survey to support the incorporation of models of economic confidence into agent-based microeconomic simulations. We first review broad underlying economic principles related to this topic. We then review the economic principle of confidence and related empirical studies. Finally, we provide a brief survey of efforts and publications related to agent-based economic simulation.
A theoretical basis for the analysis of redundant software subject to coincident errors
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.; Lee, L. D.
1985-01-01
Fundamental to the development of redundant software techniques (fault-tolerant software) is an understanding of the impact of multiple joint occurrences of coincident errors. A theoretical basis for the study of redundant software is developed which provides a probabilistic framework for empirically evaluating the effectiveness of the general (N-Version) strategy when component versions are subject to coincident errors, and permits an analytical study of the effects of these errors. The basic assumptions of the model are: (1) independently designed software components are chosen in a random sample; and (2) in the user environment, the system is required to execute on a stationary input series. The intensity of coincident errors has a central role in the model. This function describes the propensity to introduce design faults in such a way that software components fail together when executing in the user environment. The model is used to give conditions under which an N-Version system is a better strategy for reducing system failure probability than relying on a single version of software. A condition which limits the effectiveness of a fault-tolerant strategy is studied, and the question is posed whether system failure probability varies monotonically with increasing N or whether an optimal choice of N exists.
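To make the framework concrete, the following is a hedged reconstruction in the abstract's own terms; the notation is assumed rather than taken from the paper. Let theta(x) denote the intensity of coincident errors: the probability that a randomly sampled component version fails on input x. Conditional on the input, versions fail independently, so for a stationary input X the probability that exactly k of N versions fail is

$$ P_k(N) \;=\; \binom{N}{k}\,\mathbb{E}\!\left[\,\theta(X)^{k}\,\bigl(1-\theta(X)\bigr)^{N-k}\,\right], $$

and a majority-voting N-Version system fails with probability $\sum_{k>N/2} P_k(N)$. Variability of $\theta(X)$ across inputs induces positive correlation between versions, which is precisely the effect that can keep system failure probability from decreasing monotonically in N.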
Security Force Assistance in Afghanistan: Identifying Lessons for Future Efforts
2011-01-01
...waged by U.S. and coalition forces in Afghanistan. The outcome of the campaign hinges, in large measure, on the effectiveness of the assistance... effectiveness of SFA in Afghanistan, and few empirically rigorous assessments exist to help answer these questions. This monograph analyzes SFA efforts in
Software Cost Estimation Using a Decision Graph Process: A Knowledge Engineering Approach
NASA Technical Reports Server (NTRS)
Stukes, Sherry; Spagnuolo, John, Jr.
2011-01-01
This paper is not a description per se of the efforts by two software cost analysts. Rather, it is an outline of the methodology used for flight software (FSW) cost analysis, presented in a form that can serve as a foundation upon which others may gain insight into how to perform FSW cost analyses for their own problems at hand.
1992-07-01
methodologies; software performance analysis; software testing; and concurrent languages. Finally, efforts in algorithms, which are primarily designed to upgrade... These codes provide a powerful research tool for testing new concepts and designs prior to experimental implementation. DoE's laser program has also... development, and specially designed production facilities. World leadership in both non-fluorinated and fluorinated materials resides in the U.S., but Japan
An Investigation of Techniques for Detecting Data Anomalies in Earned Value Management Data
2011-12-01
...Management Studio; Harte Hanks Trillium Software (Trillium Software System); IBM InfoSphere Foundation Tools; Informatica (Data Explorer, Analyst, Developer, Administrator); Pitney Bowes Business Insight (Spectrum); SAP BusinessObjects Data Quality Management; DataFlux... ...menting quality monitoring efforts and tracking data quality improvements. Informatica: http://www.informatica.com/products_services/Pages/index.aspx
Computer software documentation
NASA Technical Reports Server (NTRS)
Comella, P. A.
1973-01-01
A tutorial on the documentation of computer software is presented. It offers a methodology for achieving an adequate level of documentation as a natural outgrowth of the total programming effort, commencing with the initial problem statement and definition and terminating with the final verification of code. It discusses the content of adequate documentation, the necessity for such documentation, and the problems impeding its achievement.
Modular Rocket Engine Control Software (MRECS)
NASA Technical Reports Server (NTRS)
Tarrant, Charlie; Crook, Jerry
1997-01-01
The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state-of-the-art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for a generic, advanced engine control system that will result in lower software maintenance (operations) costs. It effectively accommodates software requirements changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives and benefits of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software reverification time related to software changes. Currently, the program is focused on supporting MSFC in accomplishing a Space Shuttle Main Engine (SSME) hot-fire test at Stennis Space Center and the Low Cost Boost Technology (LCBT) Program.
Usability analysis of 2D graphics software for designing technical clothing.
Teodoroski, Rita de Cassia Clark; Espíndola, Edilene Zilma; Silva, Enéias; Moro, Antônio Renato Pereira; Pereira, Vera Lucia D V
2012-01-01
With the advent of technology, the computer has become a working tool increasingly present in companies. Its purpose is to increase production and reduce the errors inherent in manual production. The aim of this study was to analyze the usability of 2D graphics software in creating clothing designs by a professional during his work. The movements of the mouse, keyboard and graphical tools were monitored in real time by the software Camtasia 7® installed on the user's computer. To register the use of the mouse and keyboard we used auxiliary software called MouseMeter®, which quantifies the number of times the right, middle and left mouse buttons and the keyboard were pressed, as well as the distance traveled in meters by the cursor on the screen. Data were collected in periods of 15 minutes, 1 hour and 8 hours, consecutively. The results showed that the job is repetitive and physically demanding, which can lead to the appearance of repetitive strain injuries. Thus, to minimize operator effort and thereby enhance the usability of the examined tool, it becomes imperative to replace the mouse with a device called a tablet, which offers an electronic pen and a drawing platform for design development.
Development and Application of New Quality Model for Software Projects
Karnavel, K.; Dillibabu, R.
2014-01-01
The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594
Impact of Growing Business on Software Processes
NASA Astrophysics Data System (ADS)
Nikitina, Natalja; Kajko-Mattsson, Mira
When growing their businesses, software organizations should not only put effort into developing and executing their business strategies, but also into managing and improving their internal software development processes and aligning them with business growth strategies. Only in this way can they ensure that their businesses grow in a healthy and sustainable way. In this paper, we map out one software company's business growth over the course of its historical events and identify its impact on the company's software production processes and capabilities. The impact concerns benefits, challenges, problems and lessons learned. The most important lesson learned is that although business growth became a stimulus for thinking about and improving software processes, the organization lacked guidelines for aligning process improvement with business growth. Finally, the paper generates research questions providing a platform for future research.
Space Station Mission Planning System (MPS) development study. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
Klus, W. J.
1987-01-01
The basic objective of the Space Station (SS) Mission Planning System (MPS) Development Study was to define a baseline Space Station mission plan and the associated hardware and software requirements for the system. A detailed definition of the Spacelab (SL) payload mission planning process and SL Mission Integration Planning System (MIPS) software was derived. A baseline concept was developed for performing SS manned base payload mission planning that is consistent with current Space Station design/operations concepts and philosophies. The SS MPS software requirements were defined. Requirements for new software include candidate programs for the application of artificial intelligence techniques to capture and make more effective use of mission planning expertise. An SS MPS Software Development Plan was also developed which phases efforts for developing the software to implement the SS mission planning concept.
Smith, Katherine E; Savell, Emily; Gilmore, Anna B
2013-03-01
To systematically review studies of tobacco industry efforts to influence tobacco tax policies. Searches were conducted between 1 October 2009 and 31 March 2010 in 14 databases/websites, in relevant bibliographies and via experts. Studies were included if they focused on industry efforts to influence tobacco tax policies, drew on empirical evidence, were in English and concerned the period 1985-2010. In total, 36 studies met these criteria. Two reviewers undertook data extraction and critical appraisal. A random selection of 15 studies (42%) was subject to second review. Evidence was assessed thematically to identify distinct tobacco industry aims, arguments and tactics. A total of 34 studies examined industry efforts to influence tax levels. They suggest the tobacco industry works hard to prevent significant increases and particularly dislikes taxes 'earmarked' for tobacco control. Key arguments to counter increases are that tobacco taxes are socially regressive, unfair and lead to increased levels of illicit trade and negative economic impacts. For earmarked taxes, the industry also frequently tries to raise concerns about revenue allocation. Assessing industry arguments against established evidence demonstrates most are unsupported. Key industry tactics include: establishing 'front groups', securing credible allies, direct lobbying and publicity campaigns. Only seven studies examined efforts to influence tax structures. They suggest company preferences vary and tactics centre on direct lobbying. The tobacco industry has historically tried to keep tobacco taxes low using consistent tactics and misleading arguments. Further research is required to explore efforts to influence tax structures, excise policies beyond the USA and recent policies.
Smith, K.E.; Savell, E.; Gilmore, A.B.
2013-01-01
Objective To systematically review studies of tobacco industry efforts to influence tax policies. Data sources We conducted searches between 1st October 2009 and 31st March 2010 on 14 databases/websites, in relevant bibliographies and via experts. Study selection We included studies if they: focused on industry efforts to influence tobacco tax policies; drew on empirical evidence; were in English; concerned the period 1985–2010. 36 studies met these criteria. Data extraction Two reviewers undertook data extraction and critical appraisal. A random selection of 15 studies (42%) was subject to second review. Data synthesis We assessed evidence thematically to identify distinct tobacco industry aims, arguments and tactics. 34 studies examined industry efforts to influence tax levels. They suggest industry works hard to prevent significant increases and particularly dislikes taxes ‘earmarked’ for tobacco control. Key arguments to counter increases are that tobacco taxes are socially regressive, unfair and lead to increased levels of illicit trade and negative economic impacts. For earmarked taxes, the industry also frequently tries to raise concerns about revenue allocation. Assessing industry arguments against established evidence demonstrates most are unsupported. Key industry tactics include: establishing ‘front groups’; securing credible allies, direct lobbying; and publicity campaigns. Only seven studies examined efforts to influence tax structures. They suggest company preferences vary and tactics centre on direct lobbying. Conclusions The tobacco industry has historically tried to keep tobacco taxes low using consistent tactics and misleading arguments. Further research is required to explore efforts to influence: tax structures; excise policies beyond the US; recent policies. PMID:22887175
Using Pilots to Assess the Value and Approach of CMMI Implementation
NASA Technical Reports Server (NTRS)
Godfrey, Sara; Andary, James; Rosenberg, Linda
2002-01-01
At Goddard Space Flight Center (GSFC), we have chosen to use the Capability Maturity Model Integrated (CMMI) to guide our process improvement program. Projects at GSFC consist of complex systems of software and hardware that control satellites, operate ground systems, run instruments, manage databases and data, and support scientific research. It is a challenge to launch a process improvement program that encompasses our diverse systems, yet is manageable in terms of cost effectiveness. In order to establish the best approach for improvement, our process improvement effort was divided into three phases: 1) pilot projects; 2) staged implementation; and 3) sustainment and continual improvement. During Phase 1 the focus of the activities was on baselining, using pre-appraisals to obtain a basis for better cost and effort estimates for the improvement effort. Pilot pre-appraisals were conducted from different perspectives so that different approaches for process implementation could be evaluated. Phase 1 also concentrated on establishing an improvement infrastructure and training the improvement teams. At the time of this paper, three pilot appraisals had been completed. Our initial appraisal was performed in a flight software area, considering the flight software organization as the organization. The second appraisal was done from a project perspective, focusing on systems engineering and acquisition, and using GSFC as the organization. The final appraisal was in a ground support software area, again using GSFC as the organization. This paper will present our initial approach, lessons learned from all three pilots, and the changes in our approach based on the lessons learned.
Onyx-Advanced Aeropropulsion Simulation Framework Created
NASA Technical Reports Server (NTRS)
Reed, John A.
2001-01-01
The Numerical Propulsion System Simulation (NPSS) project at the NASA Glenn Research Center is developing a new software environment for analyzing and designing aircraft engines and, eventually, space transportation systems. Its purpose is to dramatically reduce the time, effort, and expense necessary to design and test jet engines by creating sophisticated computer simulations of an aerospace object or system (refs. 1 and 2). Through a university grant as part of that effort, researchers at the University of Toledo have developed Onyx, an extensible Java-based (Sun Microsystems, Inc.), object-oriented simulation framework, to investigate how advanced software design techniques can be successfully applied to aeropropulsion system simulation (refs. 3 and 4). The design of Onyx's architecture enables users to customize and extend the framework to add new functionality or adapt simulation behavior as required. It exploits object-oriented technologies, such as design patterns, domain frameworks, and software components, to develop a modular system in which users can dynamically replace components with others having different functionality.
Automating the parallel processing of fluid and structural dynamics calculations
NASA Technical Reports Server (NTRS)
Arpasi, Dale J.; Cole, Gary L.
1987-01-01
The NASA Lewis Research Center is actively involved in the development of expert system technology to assist users in applying parallel processing to computational fluid and structural dynamic analysis. The goal of this effort is to eliminate the necessity for the physical scientist to become a computer scientist in order to effectively use the computer as a research tool. Programming and operating software utilities have previously been developed to solve systems of ordinary nonlinear differential equations on parallel scalar processors. Current efforts are aimed at extending these capabilities to systems of partial differential equations that describe the complex behavior of fluids and structures within aerospace propulsion systems. This paper presents some important considerations in the redesign, in particular, the need for algorithms and software utilities that can automatically identify data flow patterns in the application program and partition and allocate calculations to the parallel processors. A library-oriented multiprocessing concept for integrating the hardware and software functions is described.
A Novel Rules Based Approach for Estimating Software Birthmark
Binti Alias, Norma; Anwar, Sajid
2015-01-01
Software birthmark is a unique quality of software used to detect software theft. Comparing birthmarks can tell us whether a program or software is a copy of another. Software theft and piracy are rapidly increasing problems of copying, stealing, and misusing software without the permission specified in the license agreement. The estimation of a birthmark can play a key role in understanding its effectiveness. In this paper, a new technique is presented to evaluate and estimate a software birthmark based on the two most sought-after properties of birthmarks, credibility and resilience. For this purpose, concepts from soft computing such as probabilistic and fuzzy computing have been taken into account, and fuzzy logic is used to estimate the properties of the birthmark. The proposed fuzzy rule based technique is validated through a case study, and the results show that the technique is successful in assessing the specified properties of the birthmark, its resilience and credibility. This, in turn, shows how much effort will be required to detect the originality of the software based on its birthmark. PMID:25945363
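The abstract does not give the rule base, so the following Python sketch is purely illustrative of a fuzzy rule based estimate: triangular memberships fuzzify credibility and resilience scores in [0, 1], a small assumed rule base combines them, and a weighted average defuzzifies the result. The membership shapes, rule outputs, and the function birthmark_strength are all hypothetical, not the paper's technique.

```python
# Hypothetical sketch: fuzzy estimation of birthmark strength from
# credibility and resilience scores in [0, 1].

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def labels(x):
    """Fuzzify a crisp score into low/medium/high memberships."""
    return {"low": tri(x, -0.5, 0.0, 0.5),
            "medium": tri(x, 0.0, 0.5, 1.0),
            "high": tri(x, 0.5, 1.0, 1.5)}

# Assumed rule base: (credibility label, resilience label) -> strength
RULES = {("high", "high"): 1.0, ("high", "medium"): 0.75,
         ("medium", "high"): 0.75, ("medium", "medium"): 0.5,
         ("low", "high"): 0.25, ("high", "low"): 0.25,
         ("medium", "low"): 0.25, ("low", "medium"): 0.25,
         ("low", "low"): 0.0}

def birthmark_strength(credibility, resilience):
    """Weighted-average (Sugeno-style) defuzzification over all rules."""
    cred, res = labels(credibility), labels(resilience)
    num = den = 0.0
    for (cl, rl), out in RULES.items():
        w = min(cred[cl], res[rl])   # rule firing strength (AND = min)
        num += w * out
        den += w
    return num / den if den else 0.0

print(birthmark_strength(0.9, 0.7))  # a strong birthmark is harder to disguise
```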
Salvy, Sarah-Jeanne; Bowker, Julie C.
2015-01-01
Obesity during childhood and adolescence is a growing problem in the United States, Canada, and around the world that leads to significant physical, psychological, and social impairment. In recent years, empirical research on factors that contribute to the development and maintenance of obesity has begun to consider peer experiences, such as peer rejection, peer victimization, and friendship. Peer experiences have been theoretically and empirically related to the “Big Two” contributors to the obesity epidemic, eating and physical activity, but there has not been a comprehensive review of the extant empirical literature. In this article, we review and synthesize the emerging theoretical and empirical literatures on peer experiences in relation to: (a) eating (food consumption and food selection); and (b) physical activity, during childhood and adolescence. A number of limitations and issues in the theoretical and empirical literatures are also discussed, along with future research directions. In conclusion, we argue that the involvement of children and adolescents’ peer networks in prevention and intervention efforts may be critical for promoting and maintaining positive behavioral health trajectories. PMID:28090396
High Speed Jet Noise Prediction Using Large Eddy Simulation
NASA Technical Reports Server (NTRS)
Lele, Sanjiva K.
2002-01-01
Current methods for predicting the noise of high speed jets are largely empirical. These empirical methods are based on jet noise data gathered by varying primarily the jet flow speed and jet temperature for a fixed nozzle geometry. Efforts have been made to correlate the noise data of co-annular (multi-stream) jets, and the changes associated with forward flight, within these empirical correlations. But ultimately these empirical methods fail to provide suitable guidance in the selection of new, low-noise nozzle designs. This motivates the development of a new class of prediction methods based on computational simulations, in an attempt to remove the empiricism of present-day noise predictions.
Development of Autonomous Aerobraking - Phase 2
NASA Technical Reports Server (NTRS)
Murri, Daniel G.
2013-01-01
Phase 1 of the Development of Autonomous Aerobraking (AA) Assessment investigated the technical capability of transferring the processes of aerobraking maneuver (ABM) decision-making (currently performed on the ground by an extensive workforce and communicated to the spacecraft via the deep space network) to an efficient flight software algorithm onboard the spacecraft. This document describes Phase 2 of this study, which was a 12-month effort to improve and rigorously test the AA Development Software developed in Phase 1. Keywords: aerobraking maneuver; autonomous aerobraking; Autonomous Aerobraking Development Software; Deep Space Network; NASA Engineering and Safety Center
The software product assurance metrics study: JPL's software systems quality and productivity
NASA Technical Reports Server (NTRS)
Bush, Marilyn W.
1989-01-01
The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.
NASA integrated vehicle health management technology experiment for X-37
NASA Astrophysics Data System (ADS)
Schwabacher, Mark; Samuels, Jeff; Brownston, Lee
2002-07-01
The NASA Integrated Vehicle Health Management (IVHM) Technology Experiment for X-37 was intended to run IVHM software on board the X-37 spacecraft. The X-37 is an unpiloted vehicle designed to orbit the Earth for up to 21 days before landing on a runway. The objectives of the experiment were to demonstrate the benefits of in-flight IVHM to the operation of a Reusable Launch Vehicle, to advance the Technology Readiness Level of this IVHM technology within a flight environment, and to demonstrate that the IVHM software could operate on the Vehicle Management Computer. The scope of the experiment was to perform real-time fault detection and isolation for X-37's electrical power system and electro-mechanical actuators. The experiment used Livingstone, a software system that performs diagnosis using a qualitative, model-based reasoning approach that searches system-wide interactions to detect and isolate failures. Two of the challenges we faced were to make this research software more efficient so that it would fit within the limited computational resources that were available to us on the X-37 spacecraft, and to modify it so that it satisfied the X-37's software safety requirements. Although the experiment is currently unfunded, the development effort resulted in major improvements in Livingstone's efficiency and safety. This paper reviews some of the details of the modeling and integration efforts, and some of the lessons that were learned.
System Software Framework for System of Systems Avionics
NASA Technical Reports Server (NTRS)
Ferguson, Roscoe C.; Peterson, Benjamin L; Thompson, Hiram C.
2005-01-01
Project Constellation implements NASA's vision for space exploration to expand human presence in our solar system. The engineering focus of this project is developing a system of systems architecture. This architecture allows for the incremental development of the overall program. Systems can be built and connected in a "Lego style" manner to generate configurations supporting various mission objectives. The development of the avionics or control systems of such a massive project will result in concurrent engineering. Also, each system will have software and the need to communicate with other (possibly heterogeneous) systems. Fortunately, this design problem has already been solved during the creation and evolution of systems such as the Internet and the Department of Defense's successful effort to standardize distributed simulation (now IEEE 1516). The solution relies on the use of a standard layered software framework and a communication protocol. A standard framework and communication protocol is suggested for the development and maintenance of Project Constellation systems. The ARINC 653 standard is a great start for such a common software framework. This paper proposes a common system software framework that uses the Real Time Publish/Subscribe protocol for framework-to-framework communication to extend ARINC 653. It is highly recommended that such a framework be established before development. This is important for the success of concurrent engineering. The framework provides an infrastructure for general system services and is designed for flexibility to support a spiral development effort.
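To illustrate the communication pattern being proposed (not the framework itself, and not the Real Time Publish/Subscribe wire protocol), here is a toy topic-based publish/subscribe bus in Python; the topic names and payloads are invented.

```python
# Toy publish/subscribe dispatcher showing framework-to-framework messaging.
from collections import defaultdict
from typing import Any, Callable

class Bus:
    """Minimal topic-based publish/subscribe dispatcher."""
    def __init__(self):
        self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, payload: Any) -> None:
        for handler in self._subs[topic]:
            handler(payload)

bus = Bus()
# A guidance partition consumes navigation state published by another system
bus.subscribe("nav/state", lambda msg: print("guidance received:", msg))
bus.publish("nav/state", {"t": 12.5, "alt_km": 401.3})
```

A pattern like this decouples producers from consumers, which is what lets heterogeneous, concurrently developed systems be connected incrementally.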
ITS logical architecture: volume 3, data dictionary.
DOT National Transportation Integrated Search
1981-01-01
The objective of the research effort was to develop an empirically and experientially based model pedestrian safety program which cities can use as guidelines for pedestrian safety program planning, implementation, and evaluation. The basis of these ...
Towards a balanced software team formation based on Belbin team role using fuzzy technique
NASA Astrophysics Data System (ADS)
Omar, Mazni; Hasan, Bikhtiyar; Ahmad, Mazida; Yasin, Azman; Baharom, Fauziah; Mohd, Haslina; Darus, Norida Muhd
2016-08-01
In software engineering (SE), team roles have a significant impact on project success. To ensure the optimal outcome of the project the team is working on, it is essential that team members are assigned to the right roles with the right characteristics. One prevalent team role model is the Belbin team role model. A successful team must have a balance of team roles. Thus, this study demonstrates steps taken to determine the balance of a software team formation based on Belbin team roles using a fuzzy technique. The fuzzy technique was chosen because it allows analysis of imprecise data and classification of selected criteria. In this study, two Belbin team roles, Shaper (Sh) and Plant (Pl), were chosen for assigning specific roles in the software team. Results show that the technique can be used to determine the balance of team roles. Future work will focus on validating the proposed method using empirical data in an industrial setting.
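As an illustration of the kind of fuzzy classification the study describes, the Python sketch below assigns Shaper and Plant memberships from assumed questionnaire traits and checks whether a team covers both roles. The trait names, thresholds, and the 0.5 balance cutoff are assumptions, not the paper's instrument.

```python
# Illustrative only: fuzzy fit of members to Belbin Shaper/Plant roles.

def grade(x, lo, hi):
    """Linear S-shaped membership: 0 below lo, 1 above hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def role_memberships(scores):
    """scores: dict of trait -> 0..100 questionnaire score (traits assumed)."""
    sh = min(grade(scores["drive"], 40, 80), grade(scores["challenge"], 40, 80))
    pl = min(grade(scores["creativity"], 40, 80), grade(scores["independence"], 40, 80))
    return {"Shaper": sh, "Plant": pl}

def team_balance(members):
    """Call a team 'balanced' if both roles are covered to degree >= 0.5."""
    cover = {"Shaper": 0.0, "Plant": 0.0}
    for m in members:
        for role, mu in role_memberships(m).items():
            cover[role] = max(cover[role], mu)  # fuzzy OR across members
    return cover, all(mu >= 0.5 for mu in cover.values())

team = [{"drive": 85, "challenge": 70, "creativity": 30, "independence": 40},
        {"drive": 35, "challenge": 45, "creativity": 90, "independence": 75}]
print(team_balance(team))
```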
OntoSoft: A Software Registry for Geosciences
NASA Astrophysics Data System (ADS)
Garijo, D.; Gil, Y.
2017-12-01
The goal of the EarthCube OntoSoft project is to enable the creation of an ecosystem for software stewardship in geosciences that will empower scientists to manage their software as valuable scientific assets. By sharing software metadata in OntoSoft, scientists enable broader access to that software by other scientists, software professionals, students, and decision makers. Our work to date includes: 1) an ontology for describing scientific software metadata, 2) a distributed scientific software repository that contains more than 750 entries that can be searched and compared across metadata fields, 3) an intelligent user interface that guides scientists to publish software and allows them to crowdsource its corresponding metadata. We have also developed a training program where scientists learn to describe and cite software in their papers in addition to data and provenance, and we are using OntoSoft to show them the benefits of publishing their software metadata. This training program is part of a Geoscience Papers of the Future Initiative, where scientists are reflecting on their current practices, benefits and effort for sharing software and data. This journal paper can be submitted to a Special Section of the AGU Earth and Space Science Journal.
Parallelization of Rocket Engine System Software (Press)
NASA Technical Reports Server (NTRS)
Cezzar, Ruknet
1996-01-01
The main goal is to assess parallelization requirements for the Rocket Engine Numeric Simulator (RENS) project which, aside from gathering information on liquid-propelled rocket engines and setting forth requirements, involves a large FORTRAN-based package at NASA Lewis Research Center and TDK software developed by SUBR/UWF. The ultimate aim is to develop, test, integrate, and suitably deploy a family of software packages on various aspects and facets of rocket engines using liquid propellants. At present, all project efforts by the funding agency, NASA Lewis Research Center, and the HBCU participants are disseminated over the Internet using World Wide Web home pages. Considering the obviously expensive methods of actual field trials, the benefits of software simulators are potentially enormous. When realized, these benefits will be analogous to those provided by numerous CAD/CAM packages and flight-training simulators. According to the overall task assignments, Hampton University's role is to collect all available software, place it in a common format, assess and evaluate it, define interfaces, and provide integration. Most importantly, HU's mission is to see to it that real-time performance is assured. This involves source code translation, porting, and distribution. The porting will be done in two phases: first, place all software on the Cray X-MP platform using FORTRAN. After testing and evaluation on the Cray X-MP, the code will be translated to C++ and ported to the parallel nCUBE platform. At present, we are evaluating another option of distributed processing over local area networks using Sun NFS, Ethernet, and TCP/IP. Considering the heterogeneous nature of the present software (e.g., it first started as an expert system using LISP machines) which now involves FORTRAN code, the effort is expected to be quite challenging.
STEAM: a software tool based on empirical analysis for micro electro mechanical systems
NASA Astrophysics Data System (ADS)
Devasia, Archana; Pasupuleti, Ajay; Sahin, Ferat
2006-03-01
In this research, a generalized software framework that enables accurate computer-aided design of MEMS devices is developed. The proposed simulation engine utilizes a novel material property estimation technique that generates effective material properties at the microscopic level. The material property models were developed based on empirical analysis and the behavior extraction of standard test structures. A literature review is provided on the physical phenomena that govern the mechanical behavior of thin-film materials. This survey indicates that present-day models operate under a wide range of assumptions that may not be applicable to the micro-world. Thus, this methodology is foreseen to be an essential tool for MEMS designers, as it develops empirical models that relate the loading parameters, material properties, and geometry of the microstructures to their performance characteristics. This process involves learning the relationship between the above parameters using non-parametric learning algorithms such as radial basis function networks and genetic algorithms. The proposed simulation engine has a graphical user interface (GUI) which is very adaptable, flexible, and transparent. The GUI is able to encompass all parameters associated with the determination of the desired material property so as to create models that provide an accurate estimation of the desired property. This technique was verified by fabricating and simulating bilayer cantilevers consisting of aluminum and glass (TEOS oxide) in our previous work. The results obtained were found to be very encouraging.
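As a sketch of the estimation machinery the abstract names (a radial basis function network; the genetic-algorithm part is omitted), the following Python fits an effective material property to toy test-structure data. The feature names and all numbers are invented, not STEAM's.

```python
# Minimal RBF-network regression for effective-property estimation.
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian RBF design matrix."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

def rbf_fit(X, y, centers, width, ridge=1e-8):
    """Least-squares weights with a small ridge for numerical stability."""
    Phi = rbf_design(X, centers, width)
    A = Phi.T @ Phi + ridge * np.eye(len(centers))
    return np.linalg.solve(A, Phi.T @ y)

# Toy data: normalized [film thickness, deposition temperature] -> modulus
rng = np.random.default_rng(0)
X = rng.uniform(size=(40, 2))
y = 70.0 + 5.0 * X[:, 0] - 4.0 * X[:, 1] + rng.normal(0, 0.2, 40)

centers = X[::4]                       # every 4th sample as an RBF center
w = rbf_fit(X, y, centers, width=0.5)
pred = rbf_design(X[:3], centers, 0.5) @ w
print(pred, y[:3])                     # predictions vs. measurements
```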
Application of Open-Source Enterprise Information System Modules: An Empirical Study
ERIC Educational Resources Information Center
Lee, Sang-Heui
2010-01-01
Although there have been a number of studies on large scale implementation of proprietary enterprise information systems (EIS), open-source software (OSS) for EIS has received limited attention in spite of its potential as a disruptive innovation. Cost saving is the main driver for adopting OSS among the other possible benefits including security…
ERIC Educational Resources Information Center
Wallgren, Lillemor; Dahlgren, Lars Owe
2005-01-01
This article reports on an empirical study of industry PhD students in the Swedish Graduate School for Applied IT and Software Engineering. The students were questioned in semi-structured interviews about their experiences of sharing their postgraduate studies between industrial and academic environments. The results from the first analysis…
ERIC Educational Resources Information Center
Carrington, Michal; Chen, Richard; Davies, Martin; Kaur, Jagjit; Neville, Benjamin
2011-01-01
An argument map visually represents the structure of an argument, outlining its informal logical connections and informing judgments as to its worthiness. Argument mapping can be augmented with dedicated software that aids the mapping process. Empirical evidence suggests that semester-length subjects using argument mapping along with dedicated…
High-Speed Video Analysis of Damped Harmonic Motion
ERIC Educational Resources Information Center
Poonyawatpornkul, J.; Wattanakasiwich, P.
2013-01-01
In this paper, we acquire and analyse high-speed videos of a spring-mass system oscillating in glycerin at different temperatures. Three cases of damped harmonic oscillation are investigated and analysed by using high-speed video at a rate of 120 frames per second and Tracker Video Analysis (Tracker) software. We present empirical data for…
Dexter Time: The Space, Time, and Matterings of School Absence Registration
ERIC Educational Resources Information Center
Bodén, Linnea
2016-01-01
Working with a posthumanist approach, this article explores how the computer software Dexter, used for the registration of students' absences and presences, is part of the production of different practices of time, place, space, and matter in Swedish schools. The empirical material engaged with comes from two schools, and the students involved are…
Children's Practice of Computer-Based Composition
ERIC Educational Resources Information Center
Nilsson, Bo; Folkestad, Goran
2005-01-01
Today's children live in a world where music in all its different forms has become a significant factor in their everyday life. This article describes a 2-year empirical study of nine 8-year-old Swedish children creating music with synthesiser and computer software. The aim of the study is to describe and clarify the creative processes of…
Effects of Real-Time Visual Feedback on Pre-Service Teachers' Singing
ERIC Educational Resources Information Center
Leong, S.; Cheng, L.
2014-01-01
This pilot study focuses on the use real-time visual feedback technology (VFT) in vocal training. The empirical research has two aims: to ascertain the effectiveness of the real-time visual feedback software "Sing & See" in the vocal training of pre-service music teachers and the teachers' perspective on their experience with…
Student Plagiarism in Higher Education in Vietnam: An Empirical Study
ERIC Educational Resources Information Center
Do Ba, Khang; Do Ba, Khai; Lam, Quoc Dung; Le, Dao Thanh Binh An; Nguyen, Phuong Lien; Nguyen, Phuong Quynh; Pham, Quoc Loc
2017-01-01
This paper assesses and compares the prevalence of plagiarism across different student and assignment characteristics at a university in Vietnam, using the similarity index reported by the text-matching software Turnitin as a proxy measure of plagiarism on a sample of 681 student papers. The findings present a level of match higher than reported…
An Empirical Consideration of the Use of R in Actively Constructing Sampling Distributions
ERIC Educational Resources Information Center
Vaughn, Brandon K.
2009-01-01
In this paper, an interactive teaching approach to introduce the concept of sampling distributions using the statistical software program, R, is shown. One advantage of this approach is that the program R is freely available via the internet. Instructors can easily demonstrate concepts in class, outfit entire computer labs, and/or assign the…
Rain Rate Statistics in Southern New Mexico
NASA Technical Reports Server (NTRS)
Paulic, Frank J., Jr.; Horan, Stephen
1997-01-01
The methodology used in determining empirical rain-rate distributions for Southern New Mexico in the vicinity of White Sands APT site is discussed. The hardware and the software developed to extract rain rate from the rain accumulation data collected at White Sands APT site are described. The accuracy of Crane's Global Model for rain rate predictions is analyzed.
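The report's extraction software is not reproduced here; the Python sketch below only illustrates one common way such an analysis proceeds, converting fixed-interval rain accumulations into rates and tabulating an empirical exceedance distribution. The sample values and the 1-minute interval are assumptions.

```python
# Empirical rain-rate exceedance distribution from accumulation samples.
import numpy as np

def exceedance(rates_mm_per_hr, thresholds):
    """Percentage of time each rain-rate threshold is exceeded."""
    rates = np.asarray(rates_mm_per_hr)
    return [(rates > t).mean() * 100.0 for t in thresholds]

# Accumulation samples (mm per 1-minute interval) -> instantaneous rates
accum_mm = np.array([0.0, 0.0, 0.2, 0.5, 1.1, 0.3, 0.0, 0.0, 2.0, 0.1])
rates = accum_mm * 60.0               # mm/min -> mm/hr

for t, p in zip([1, 10, 50], exceedance(rates, [1, 10, 50])):
    print(f"rate > {t:3d} mm/hr exceeded {p:.1f}% of the time")
```

Curves of this kind are what a model such as Crane's Global Model predicts, so comparing the two at matched exceedance probabilities is a natural accuracy check.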
The Improvement Cycle: Analyzing Our Experience
NASA Technical Reports Server (NTRS)
Pajerski, Rose; Waligora, Sharon
1996-01-01
NASA's Software Engineering Laboratory (SEL), one of the earliest pioneers in the areas of software process improvement and measurement, has had a significant impact on the software business at NASA Goddard. At the heart of the SEL's improvement program is a belief that software products can be improved by optimizing the software engineering process used to develop them and a long-term improvement strategy that facilitates small incremental improvements that accumulate into significant gains. As a result of its efforts, the SEL has incrementally reduced development costs by 60%, decreased error rates by 85%, and reduced cycle time by 25%. In this paper, we analyze the SEL's experiences on three major improvement initiatives to better understand the cyclic nature of the improvement process and to understand why some improvements take much longer than others.
Effort Drivers Estimation for Brazilian Geographically Distributed Software Development
NASA Astrophysics Data System (ADS)
Almeida, Ana Carina M.; Souza, Renata; Aquino, Gibeon; Meira, Silvio
To meet the requirements of today's fast-paced markets, it is important to develop projects on time and with the minimum use of resources. A good estimate is the key to achieving this goal. Several companies have started to work with geographically distributed teams due to cost reduction and time-to-market pressures. Some researchers indicate that this approach introduces new challenges, because the teams work in different time zones and have possible differences in culture and language. It is already known that multisite development increases the software cycle time. Data from 15 DSD projects from 10 distinct companies were collected. The analysis shows drivers that significantly impact the total effort planned to develop systems using the DSD approach in Brazil.
Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, Wes
2016-07-24
The primary challenge motivating this team's work is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis only on a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, an approach that is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by DOE science projects. By and large, our objective was to produce and enable use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers consisting of representatives from DOE national laboratories, academia, and industry, and engaged in software technology R&D, as well as close partnerships with DOE science code teams, to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.
The Federal Aviation Administration Plan for Research, Engineering and Development, 1994
1994-05-01
Aeronautical Data Link Communications and Applications, and 051-130 Airport Safety: (COTS) runway incursion system software will be demonstrated as a... airport departure and arrival scheduling plans that optimize daily traffic flows for long-range flights between major city-... OTFP System to... Expanded HARS planning capabilities to include enhanced communications software for aviation dispatchers to develop optimized high-altitude flight
Hardware Evolution of Closed-Loop Controller Designs
NASA Technical Reports Server (NTRS)
Gwaltney, David; Ferguson, Ian
2002-01-01
Poster presentation will outline ongoing efforts at NASA MSFC to employ various evolvable hardware experimental platforms in the evolution of digital and analog circuitry for application to automatic control. Included will be information concerning the application of commercially available hardware and software, along with the use of the JPL-developed FPTA2 integrated circuit and supporting JPL-developed software. Results to date will be presented.
Virtual Exercise Training Software System
NASA Technical Reports Server (NTRS)
Vu, L.; Kim, H.; Benson, E.; Amonette, W. E.; Barrera, J.; Perera, J.; Rajulu, S.; Hanson, A.
2018-01-01
The purpose of this study was to develop and evaluate a virtual exercise training software system (VETSS) capable of providing real-time instruction and exercise feedback during exploration missions. A resistive exercise instructional system was developed using a Microsoft Kinect depth-camera device, which provides markerless 3-D whole-body motion capture in a small form factor and with minimal setup effort. It was hypothesized that subjects using the newly developed instructional software tool would perform the deadlift exercise with more optimal kinematics and more consistent technique than those without it. Following a comprehensive evaluation in the laboratory, the system was deployed for testing and refinement in the NASA Extreme Environment Mission Operations (NEEMO) analog.
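The following is not the VETSS code, only a minimal Python sketch of the kind of kinematic feedback such a system can compute: a knee angle from three Kinect-tracked 3-D joint positions, compared against an assumed target range.

```python
# Joint angle from three tracked 3-D points (coordinates in meters).
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by segments b->a and b->c."""
    u, v = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

hip, knee, ankle = [0.0, 1.0, 0.0], [0.0, 0.5, 0.25], [0.0, 0.0, 0.0]
angle = joint_angle(hip, knee, ankle)
print(f"knee angle: {angle:.1f} deg")
if not 70.0 <= angle <= 160.0:          # target range is an assumption
    print("cue: adjust knee flexion")   # real-time coaching feedback
```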
A Roadmap for HEP Software and Computing R&D for the 2020s
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alves, Antonio Augusto, Jr; et al.
Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.
General-Purpose Front End for Real-Time Data Processing
NASA Technical Reports Server (NTRS)
James, Mark
2007-01-01
FRONTIER is a computer program that functions as a front end for any of a variety of other software of both the artificial intelligence (AI) and conventional data-processing types. As used here, front end signifies interface software needed for acquiring and preprocessing data and making the data available for analysis by the other software. FRONTIER is reusable in that it can be rapidly tailored to any such other software with minimum effort. Each component of FRONTIER is programmable and is executed in an embedded virtual machine, and each component can be reconfigured during execution. The virtual-machine implementation makes FRONTIER independent of the type of computing hardware on which it is executed.
A communication channel model of the software process
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1988-01-01
Reported here is beginning research into a noisy communication channel analogy of software development process productivity, undertaken to establish quantifiable behavior and theoretical bounds. The analogy leads to a fundamental mathematical relationship between human productivity and the amount of information supplied by the developers, the capacity of the human channel for processing and transmitting information, the software product yield (object size), the work effort, requirements efficiency, tool and process efficiency, and programming environment advantage. Also derived is an upper bound to productivity that shows that software reuse is the only means that can lead to unbounded productivity growth; practical considerations of size and cost of reusable components may reduce this to a finite bound.
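The abstract names the quantities in the relationship but not its form. One plausible rendering consistent with a channel analogy, offered here only as an illustrative sketch and not as the paper's formula, is

$$ E \;=\; \frac{I}{C\,\eta_r\,\eta_t\,\alpha}, \qquad P \;=\; \frac{Y}{E} \;=\; \frac{Y\,C\,\eta_r\,\eta_t\,\alpha}{I}, $$

where $E$ is work effort, $I$ the information the developers must supply, $C$ the capacity of the human channel, $\eta_r$ requirements efficiency, $\eta_t$ tool and process efficiency, $\alpha$ the programming environment advantage, $Y$ the product yield (object size), and $P$ productivity. Under any such form, reuse raises $Y$ without raising $I$, which is why reuse is the only route to unbounded productivity growth.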
Verification Testing: Meet User Needs Figure of Merit
NASA Technical Reports Server (NTRS)
Kelly, Bryan W.; Welch, Bryan W.
2017-01-01
Verification is the process through which Modeling and Simulation (M&S) software goes to ensure that it has been rigorously tested and debugged for its intended use. Validation confirms that said software accurately models and represents the real-world system. Credibility gives an assessment of the development and testing effort that the software has gone through, as well as how accurate and reliable test results are. Together, these three components form Verification, Validation, and Credibility (VV&C), the process by which all NASA modeling software is to be tested to ensure that it is ready for implementation. NASA created this process following the CAIB (Columbia Accident Investigation Board) report, which sought to understand the reasons the Columbia space shuttle failed during reentry. The report's conclusion was that the accident was fully avoidable; however, among other issues, the data necessary to make an informed decision was not there, and the result was complete loss of the shuttle and crew. In an effort to mitigate this problem, NASA put out its Standard for Models and Simulations, currently in version NASA-STD-7009A, in which it detailed its recommendations, requirements and rationale for the different components of VV&C. It did this with the intention that it would allow people receiving M&S software to clearly understand and have data from the past development effort. This in turn would allow people who had not worked with the M&S software before to move forward with greater confidence and efficiency in their work. This particular project looks to perform verification on several MATLAB (Registered Trademark) (The MathWorks, Inc.) scripts that will later be implemented in a website interface. It seeks to note and define the limits of operation, the units and significance, and the expected datatype and format of the inputs and outputs of each of the scripts. This is intended to prevent the code from attempting to make incorrect or impossible calculations. Additionally, this project will look at the code generally and note inconsistencies, redundancies, and other aspects that may become problematic or slow down the code's run time. Certain scripts lacking in documentation will also be commented and cataloged.
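As an illustration of the kind of input contract such a verification effort catalogs, the Python sketch below guards one scalar input with its documented type, range, and units; the parameter name and limits are invented, not taken from the project's scripts.

```python
# Illustrative input guard: enforce a documented type/range/unit contract
# so a web interface cannot trigger impossible calculations.

def check_input(name, value, *, typ, lo, hi, unit):
    """Validate one scalar input against its documented contract."""
    if not isinstance(value, typ):
        raise TypeError(f"{name} must be {typ.__name__}, got {type(value).__name__}")
    if not lo <= value <= hi:
        raise ValueError(f"{name} = {value} {unit} outside [{lo}, {hi}] {unit}")
    return value

# e.g. a hypothetical orbital-altitude input documented as 160..36000 km
alt = check_input("altitude", 420.0, typ=float, lo=160.0, hi=36000.0, unit="km")
print("ok:", alt)
```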
Study on the leakage flow through a clearance gap between two stationary walls
NASA Astrophysics Data System (ADS)
Zhao, W.; Billdal, J. T.; Nielsen, T. K.; Brekke, H.
2012-11-01
In the present paper, the leakage flow in the clearance gap between stationary walls was studied experimentally, theoretically and numerically by computational fluid dynamics (CFD) in order to find the relationship between leakage flow, pressure difference and clearance gap. The experimental set-up of the clearance gap between two stationary walls is a simplification of the gap between the guide vane faces and facing plates in Francis turbines. The model was built in the Waterpower Laboratory at the Norwegian University of Science and Technology (NTNU). An empirical formula for calculating the leakage flow rate between the two stationary walls was derived from the experimental study. The experimental model was also simulated with the ANSYS CFX commercial software in order to study the flow structure. Both the numerical simulation results and the empirical formula results are in good agreement with the experimental results; the correctness of the empirical formula is thus verified by experimental data, and it has proven very useful for quickly predicting the leakage flow rate in the guide vanes of hydraulic turbines.
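The paper's empirical formula is not reproduced in the abstract, so the Python sketch below instead uses the classical plane-Poiseuille relation for laminar flow through a narrow rectangular gap as a rough first estimate of the same quantity; the dimensions and fluid properties are invented, and real clearance-gap flows may well be turbulent, which is presumably what the empirical correction addresses.

```python
# Plane-Poiseuille estimate of leakage through a narrow rectangular gap.

def gap_leakage(dp, h, w, L, mu):
    """Volumetric leakage (m^3/s) through a rectangular clearance gap.

    dp -- pressure difference across the gap (Pa)
    h  -- gap height (m), assumed small compared with width w
    w  -- gap width (m)
    L  -- flow-path length through the gap (m)
    mu -- dynamic viscosity (Pa*s)
    """
    return w * h**3 * dp / (12.0 * mu * L)

# Example: 0.5 mm gap, guide-vane-like dimensions, water at 20 C
q = gap_leakage(dp=5.0e5, h=0.5e-3, w=0.1, L=0.05, mu=1.0e-3)
print(f"leakage: {q * 1000:.2f} L/s")
```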
2010-01-01
These include: Afghanistan is the graveyard of empires; efforts to centralize power in Afghanistan provoke local resistance; and Afghanistan is an...ethnically fragmented and decentralized country incapable of forming a unified state. The realities
Meredith, Pamela Joy
2013-04-01
Theoretical and empirical evidence suggests that adult attachment and pain-related variables are predictably and consistently linked, and that understanding these links may guide pain intervention and prevention efforts. In general, insecure attachment has been portrayed as a risk factor, and secure attachment as a protective factor, for people with chronic pain conditions. In an effort to better understand the relationships among attachment and pain variables, these links have been investigated in pain-free samples using induced-pain techniques. The present paper reviews the available research linking adult attachment and laboratory-induced pain. While the diverse nature of the studies precludes definitive conclusions, together these papers offer support for associations between insecure attachment and a more negative pain experience. The evidence presented in this review highlights areas for further empirical attention, as well as providing some guidance for clinicians who may wish to employ preventive approaches and other interventions informed by attachment theory.
Software for Optimizing Quality Assurance of Other Software
NASA Technical Reports Server (NTRS)
Feather, Martin; Cornford, Steven; Menzies, Tim
2004-01-01
Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
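As an illustration of the optimization framing, the Python sketch below greedily selects assurance activities by risk reduction per unit cost under a staff-day budget. The activity list, all numbers, and the independence assumption are invented, and the tool described above presumably uses a more principled solver.

```python
# Greedy cost-effectiveness heuristic for planning assurance activities.

activities = [  # (name, cost in staff-days, assumed expected risk reduction)
    ("code inspections", 20, 0.30),
    ("unit tests", 15, 0.25),
    ("design reviews", 10, 0.15),
    ("performance analyses", 12, 0.10),
    ("traceability matrices", 8, 0.05),
]

def plan(budget):
    """Greedily select activities by risk reduction per unit cost."""
    chosen, risk_left = [], 1.0
    for name, cost, dr in sorted(activities, key=lambda a: a[2] / a[1], reverse=True):
        if cost <= budget:
            budget -= cost
            risk_left *= (1.0 - dr)   # assume independent, multiplicative effects
            chosen.append(name)
    return chosen, risk_left

print(plan(40))   # activities chosen within 40 staff-days, residual risk
```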
Jusot, Florence; Tubeuf, Sandy; Trannoy, Alain
2013-12-01
The way to treat the correlation between circumstances and effort is a central, yet largely neglected issue in the applied literature on inequality of opportunity. This paper adopts three alternative normative ways of treating this correlation, championed by Roemer, Barry and Swift, and assesses their empirical relevance using survey data. We combine regression analysis with the natural decomposition of the variance to compare the relative contributions of circumstances and efforts to overall health inequality according to the different normative principles. Our results suggest that, in practice, the normative principle chosen for treating the correlation between circumstances and effort makes little difference to the relative contributions of circumstances and efforts to explained health inequality. Copyright © 2013 John Wiley & Sons, Ltd.
'Nobody tosses a dwarf!' The relation between the empirical and the normative reexamined.
Leget, Carlo; Borry, Pascal; de Vries, Raymond
2009-05-01
This article discusses the relation between empirical and normative approaches in bioethics. The issue of dwarf tossing, while admittedly unusual, is chosen as a point of departure because it challenges the reader to look with fresh eyes upon several central bioethical themes, including human dignity, autonomy, and the protection of vulnerable people. After an overview of current approaches to the integration of empirical and normative ethics, we consider five ways that the empirical and normative can be brought together to speak to the problem of dwarf tossing: prescriptive applied ethics, theoretical ethics, critical applied ethics, particularist ethics and integrated empirical ethics. We defend a position of critical applied ethics that allows for a two-way relation between empirical and normative theories. Against efforts fully to integrate the normative and the empirical into one synthesis, we propose that the two should stand in tension and relation to one another. The approach we endorse acknowledges that a social practice can and should be judged both by the gathering of empirical data and by normative ethics. Critical applied ethics uses a five-stage process that includes: (a) determination of the problem, (b) description of the problem, (c) empirical study of effects and alternatives, (d) normative weighing and (e) evaluation of the effects of a decision. In each stage, we explore the perspective from both the empirical (sociological) and the normative ethical point of view. We conclude by applying our five-stage critical applied ethics to the example of dwarf tossing.
Measuring 'virtue' in medicine.
Kotzee, Ben; Ignatowicz, Agnieszka
2016-06-01
Virtue-approaches to medical ethics are becoming ever more influential. Virtue theorists advocate redefining right or good action in medicine in terms of the character of the doctor performing the action (rather than adherence to rules or principles). In medical education, too, calls are growing to reconceive medical education as a form of character formation (rather than instruction in rules or principles). Empirical studies of doctors' ethics from a virtue-perspective, however, are few and far between. In this respect, theoretical and empirical studies of medical ethics are out of alignment. In this paper, we survey the empirical study of medical ethics and find that most studies of doctors' ethics are rules- or principles-based and not virtue-based. We outline the challenges that exist for studying medical ethics empirically from a virtue-based perspective and canvass the runners and riders in the effort to find virtue-based assessments of medical ethics.
Influence of Peers and Friends on Children’s and Adolescents’ Eating and Activity Behaviors
Salvy, Sarah-Jeanne; de la Haye, Kayla; Bowker, Julie C.; Hermans, Roel C.J.
2012-01-01
Obesity during childhood and adolescence is a growing problem in the United States, Canada, and around the world that leads to significant physical, psychological, and social consequences. Peer experiences have been theoretically and empirically related to the “Big Two” contributors to the obesity epidemic, unhealthy eating and physical inactivity [1]. In this article, we synthesize the empirical literature on the influence of peers and friends on youth’s eating and physical activity. Limitations and issues in the theoretical and empirical literatures are also discussed, along with future research directions. In conclusion, we argue that the involvement of children’s and adolescents’ peer networks in prevention and intervention efforts may be critical for promoting and maintaining positive behavioral health trajectories. However, further theoretical and empirical work is needed to better understand the specific mechanisms underlying the effects of peers on youth’s eating and physical activity. PMID:22480733
Modular Rocket Engine Control Software (MRECS)
NASA Technical Reports Server (NTRS)
Tarrant, C.; Crook, J.
1998-01-01
The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state-of-the-art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for advanced engine control systems that will result in lower software maintenance (operations) costs. It effectively accommodates software requirement changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and to implement the software in the Ada programming language. The MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives, benefits, and status of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software reverification time following software changes. MRECS was recently modified to support a Space Shuttle Main Engine (SSME) hot-fire test. Cold Flow and Flight Readiness Testing were completed before the test was cancelled. Currently, the program is focused on supporting NASA MSFC in accomplishing development testing of the Fastrac Engine, part of NASA's Low Cost Technologies (LCT) Program. MRECS will be used for all engine development testing.
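The abstract does not detail the architecture, but the modularity goal can be illustrated with a short sketch: control logic depends only on a fixed module interface, so a hardware upgrade replaces one module without touching the control law. All names and numbers below are invented for illustration (MRECS itself is written in Ada).

```python
# Hypothetical illustration of the modular goal: control logic depends
# only on a stable module interface, so swapping sensor hardware means
# swapping one module. (MRECS itself is written in Ada; all names and
# numbers here are invented.)
from abc import ABC, abstractmethod

class SensorModule(ABC):
    @abstractmethod
    def read_chamber_pressure(self) -> float: ...

class MarkISensor(SensorModule):
    def read_chamber_pressure(self) -> float:
        return 6.5   # stub; a real module samples the transducer

class ControlLoop:
    """Engine control law, written once against SensorModule."""
    def __init__(self, sensor: SensorModule, setpoint: float) -> None:
        self.sensor, self.setpoint = sensor, setpoint
    def step(self) -> float:
        error = self.setpoint - self.sensor.read_chamber_pressure()
        return 0.8 * error   # proportional valve command

loop = ControlLoop(MarkISensor(), setpoint=7.0)
print(f"valve command: {loop.step():+.2f}")
```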
NASA Technical Reports Server (NTRS)
Smith, H. E.
1990-01-01
Present software development accomplishments are indicative of the emerging interest in, and increasing efforts to provide, risk assessment backbone tools in the manned spacecraft engineering community. There are indications that similar efforts are underway in the chemical process industry and are probably being planned for other high-risk ground-based environments. It appears that complex flight systems intended for extended manned planetary exploration will drive this technology.
Federal Register 2010, 2011, 2012, 2013, 2014
2002-05-03
... (web-serving software), Linux, Perl, and those who are building a compatible & free version of MS's..., Argument from Design Argument from Design-Web & Multimedia [email protected] http://www.ardes.com MTC-00003464... organization could be a good target for this effort. Their web address is http://www.gnu.org/. This effort...
Has the DOTS Strategy Improved Case Finding or Treatment Success? An Empirical Assessment
Obermeyer, Ziad; Abbott-Klafter, Jesse; Murray, Christopher J. L.
2008-01-01
Background: Nearly fifteen years after the start of WHO's DOTS strategy, tuberculosis remains a major global health problem. Given the lack of empirical evidence that DOTS reduces tuberculosis burden, considerable debate has arisen about its place in the future of global tuberculosis control efforts. An independent evaluation of DOTS, one of the most widely-implemented and longest-running interventions in global health, is a prerequisite for meaningful improvements to tuberculosis control efforts, including WHO's new Stop TB Strategy. We investigate the impact of the expansion of the DOTS strategy on tuberculosis case finding and treatment success, using only empirical data. Methods and Findings: We study the effect of DOTS using time-series cross-sectional methods. We first estimate the impact of DOTS expansion on case detection, using reported case notification data and controlling for other determinants of change in notifications, including HIV prevalence, GDP, and country-specific effects. We then estimate the effect of DOTS expansion on treatment success. DOTS programme variables had no statistically significant impact on case detection in a wide range of models and specifications. DOTS population coverage had a significant effect on overall treatment success rates, such that countries with full DOTS coverage benefit from at least an 18% increase in treatment success (95% CI: 5–31%). Conclusions: The DOTS technical package improved overall treatment success. By contrast, DOTS expansion had no effect on case detection. This finding is less optimistic than previous analyses. Better epidemiological and programme data would facilitate future monitoring and evaluation efforts. PMID:18320042
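A hedged sketch of the kind of time-series cross-sectional model described, run on synthetic data; the variable names and specification below are our assumptions, not the authors' exact model.

```python
# Synthetic time-series cross-sectional illustration; variable names and
# the specification are assumptions, not the paper's actual model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = [{"country": f"c{i}", "year": y,
         "dots_coverage": rng.uniform(0, 1),   # share of population covered
         "hiv_prev": rng.uniform(0, 0.2),
         "log_gdp": rng.normal(8, 1)}
        for i in range(20) for y in range(1995, 2005)]
df = pd.DataFrame(rows)
df["treatment_success"] = (0.6 + 0.18 * df["dots_coverage"]
                           - 0.3 * df["hiv_prev"]
                           + rng.normal(0, 0.05, len(df)))

# C(country) adds country fixed effects, absorbing time-invariant
# country characteristics, as in the paper's description.
fit = smf.ols("treatment_success ~ dots_coverage + hiv_prev + log_gdp"
              " + C(country)", data=df).fit()
print(f"estimated DOTS effect: {fit.params['dots_coverage']:.3f}")
```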
Experimental research control software system
NASA Astrophysics Data System (ADS)
Cohn, I. A.; Kovalenko, A. G.; Vystavkin, A. N.
2014-05-01
A software system intended for the automation of small-scale research has been developed. The software allows one to control equipment and to acquire and process data by means of simple scripts. The main purpose of the development is to make experiment automation easier, significantly reducing the effort of automating an experimental setup: minimal programming skills are required, and scripts are short enough for supervisors to review without difficulty. Interactions between scripts and equipment are managed automatically, allowing multiple scripts to run simultaneously. Unlike well-known commercial data acquisition software systems, control is performed through an imperative scripting language, which eases the implementation of complex control and data acquisition algorithms. A modular interface library handles interaction with external interfaces; the most widely used interfaces are already implemented, and a simple framework allows fast implementation of new software and hardware interfaces. While the software is under continuous development, with new features being added, it is already used in our laboratory to automate control and data acquisition for a helium-3 cryostat. The software is open source and distributed under the GNU General Public License.
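The system's API is not given in the abstract, so the following is a hypothetical sketch of the imperative-script style it describes: a short script a supervisor can read at a glance, driving a stand-in device driver.

```python
# Hypothetical sketch of the imperative-script style described above;
# the system's real API is not public, so all names here are invented.
import time

class Cryostat:
    """Stand-in device driver; a real one would talk to hardware."""
    def __init__(self) -> None:
        self.heater = 0.0
    def temperature(self) -> float:
        return 4.2 - 0.5 * self.heater   # toy thermal model
    def set_heater(self, level: float) -> None:
        self.heater = level

def experiment_script(dev: Cryostat, target_k: float, log: list) -> None:
    # A short feedback loop a supervisor can review at a glance.
    for _ in range(10):
        t = dev.temperature()
        log.append(t)
        dev.set_heater(dev.heater + 0.2 * (t - target_k))
        time.sleep(0.01)   # pacing; a real run would poll the instrument

log = []
experiment_script(Cryostat(), target_k=3.0, log=log)
print(f"final temperature: {log[-1]:.2f} K")
```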
Lessons learned in deploying software estimation technology and tools
NASA Technical Reports Server (NTRS)
Panlilio-Yap, Nikki; Ho, Danny
1994-01-01
Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project, when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools, in conjunction with the best bottom-up estimates of software-development experts, enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying them to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.
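The paper's tools are commercial (SLIM implements the Putnam model; COSTAR implements COCOMO), so as a stand-in this sketch applies the published basic COCOMO formula, Effort = a * KLOC^b person-months, with organic-mode coefficients.

```python
# Stand-in for the commercial estimators discussed above: the basic
# COCOMO model (the published model family that COSTAR implements),
# with Boehm's organic-mode coefficients a = 2.4, b = 1.05.
def cocomo_basic(kloc: float, a: float = 2.4, b: float = 1.05) -> float:
    """Estimated development effort in person-months for `kloc` KSLOC."""
    return a * kloc ** b

for size in (10, 50, 100):
    print(f"{size:>4} KLOC -> {cocomo_basic(size):6.1f} person-months")
```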
Empirical observations of red light running at arterial signalized intersection.
DOT National Transportation Integrated Search
2008-12-01
Red Light Running (RLR) has become an increasingly important national safety issue at signalized intersections. Significant efforts have been made to understand RLR-related driver behaviors and to develop countermeasures to reduce RLR and its related crashes...
Ardal, Christine; Alstadsæter, Annette; Røttingen, John-Arne
2011-09-28
Innovation through an open source model has proven successful for software development. This success has led many to speculate whether open source can be applied to other industries with similar success. We attempt to provide an understanding of the characteristics of open source software development for researchers, business leaders, and government officials who may be interested in utilizing open source innovation in other contexts, with an emphasis on drug discovery. A systematic review was performed by searching relevant, multidisciplinary databases to extract empirical research regarding the common characteristics of, and barriers to, initiating and maintaining an open source software development project. Characteristics of open source software development pertinent to open source drug discovery were extracted and grouped into the areas of participant attraction, management of volunteers, control mechanisms, legal framework, and physical constraints; their applicability to drug discovery was then examined. We believe that the open source model is viable for drug discovery, although it is unlikely to follow exactly the form used in software development; hybrids will likely develop that suit the unique characteristics of drug discovery. We suggest potential motivations for organizations to join an open source drug discovery project. We also examine specific differences between software and medicines, in particular how the need for laboratories and physical goods, as well as the effect of patents, will impact the model.
Automated Operations Development for Advanced Exploration Systems
NASA Technical Reports Server (NTRS)
Haddock, Angie; Stetson, Howard K.
2012-01-01
Automated space operations command and control software development and its implementation must be an integral part of the vehicle design effort. The software design must encompass autonomous fault detection, isolation, and recovery capabilities, and also provide single-button intelligent functions for the crew. Development, operations, and safety approval experience with the Timeliner system on-board the International Space Station (ISS), which provided autonomous monitoring with response and single-command functionality for payload systems, can be built upon for future automated operations: the ISS Payload effort was the first and only autonomous command and control system in continuous execution (6 years, 24 hours a day, 7 days a week) within a crewed spacecraft environment. Utilizing proven capabilities from the ISS Higher Active Logic (HAL) System [1], along with the execution component design from within the HAL 9000 Space Operating System [2], this design paper details the initial HAL System software architecture and interfaces as applied to NASA's Habitat Demonstration Unit (HDU) in support of the Advanced Exploration Systems, Autonomous Mission Operations project. The development and implementation of integrated simulators within this development effort is also detailed and is the first step in verifying the effectiveness of the HAL 9000 Integrated Test-Bed Component [2] designs. The paper concludes with a summary of the current development status and future development goals as they pertain to automated command and control for the HDU.
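As an illustration only (the Timeliner sequencing language and actual telemetry are not reproduced here), the autonomous monitor-and-respond pattern the paper builds on can be sketched as rules evaluated continuously against telemetry:

```python
# Illustrative monitor-and-respond rules; telemetry names and thresholds
# are invented, not ISS/HDU values.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    triggered: Callable[[dict], bool]   # predicate over a telemetry frame
    respond: Callable[[dict], str]      # corrective action to command

rules = [
    Rule("cabin_pressure_low",
         lambda t: t["cabin_pressure_kpa"] < 95.0,
         lambda t: "close vent valve; alert crew"),
    Rule("power_bus_overcurrent",
         lambda t: t["bus_current_a"] > 40.0,
         lambda t: "shed non-critical loads"),
]

def monitor_step(telemetry: dict) -> list[str]:
    # One pass of a continuously executing monitor loop.
    return [f"{r.name}: {r.respond(telemetry)}"
            for r in rules if r.triggered(telemetry)]

print(monitor_step({"cabin_pressure_kpa": 92.3, "bus_current_a": 12.0}))
```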
NASA Astrophysics Data System (ADS)
Shameoni Niaei, M.; Kilic, Y.; Yildiran, B. E.; Yüzlükoglu, F.; Yesilyaprak, C.
2016-12-01
We describe new software (MIPS) for the analysis and image processing of meteorological satellite (Meteosat) data at an astronomical observatory. The software helps produce atmospheric forecasts (cloud, humidity, rain) from Meteosat data for robotic telescopes. MIPS uses a Python library for EUMETSAT data, aims to be completely open source, and is licensed under the GNU General Public License (GPL). MIPS is platform independent and uses h5py, numpy, and PIL with the general-purpose, high-level programming language Python and the Qt framework.
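A minimal sketch of the toolchain the abstract names (h5py, numpy, PIL); the file path and dataset name are placeholders, not MIPS internals.

```python
# Sketch using the libraries the abstract names (h5py, numpy, PIL);
# "meteosat_scene.h5" and "IR_108" are placeholders, not MIPS internals.
import h5py
import numpy as np
from PIL import Image

with h5py.File("meteosat_scene.h5", "r") as f:
    band = np.asarray(f["IR_108"], dtype=np.float64)

# Normalize the band to 8 bits and save a quick-look image,
# e.g. for cloud-cover inspection before opening the telescope.
span = np.ptp(band) or 1.0
scaled = (255 * (band - band.min()) / span).astype(np.uint8)
Image.fromarray(scaled).save("quicklook.png")
```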
Near-infrared face recognition utilizing OpenCV software
NASA Astrophysics Data System (ADS)
Sellami, Louiza; Ngo, Hau; Fowler, Chris J.; Kearney, Liam M.
2014-06-01
Commercially available hardware, freely available algorithms, and software developed by the authors are combined to detect and recognize subjects in an environment without visible light. The project integrates three major components: an illumination device operating in the near-infrared (NIR) spectrum, a NIR-capable camera, and a software algorithm capable of performing image manipulation, facial detection, and recognition. Focusing on the near-infrared spectrum allows the low-budget system to operate covertly while still allowing accurate face recognition. The result is a capability with potential benefits for future civilian and military security and surveillance operations.
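The abstract does not name the algorithms used, so this sketch uses OpenCV's stock Haar-cascade detector and LBPH recognizer as plausible stand-ins; LBPH operates on grayscale intensity patterns, which suits NIR imagery. The image file is a placeholder, and the recognizer requires the opencv-contrib-python package.

```python
# Plausible stand-in pipeline: Haar-cascade detection plus LBPH
# recognition. "nir_frame.png" is a placeholder; cv2.face needs the
# opencv-contrib-python package.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("nir_frame.png", cv2.IMREAD_GRAYSCALE)
faces = cascade.detectMultiScale(img, scaleFactor=1.1, minNeighbors=5)

recognizer = cv2.face.LBPHFaceRecognizer_create()
# recognizer.train(face_crops, labels)   # enroll known subjects first
for (x, y, w, h) in faces:
    crop = img[y:y + h, x:x + w]
    # label, confidence = recognizer.predict(crop)   # after training
    print(f"face at ({x},{y}), size {w}x{h}")
```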
A Unique Software System For Simulation-to-Flight Research
NASA Technical Reports Server (NTRS)
Chung, Victoria I.; Hutchinson, Brian K.
2001-01-01
"Simulation-to-Flight" is a research development concept to reduce costs and increase testing efficiency of future major aeronautical research efforts at NASA. The simulation-to-flight concept is achieved by using common software and hardware, procedures, and processes for both piloted-simulation and flight testing. This concept was applied to the design and development of two full-size transport simulators, a research system installed on a NASA B-757 airplane, and two supporting laboratories. This paper describes the software system that supports the simulation-to-flight facilities. Examples of various simulation-to-flight experimental applications were also provided.
Formal specification and verification of Ada software
NASA Technical Reports Server (NTRS)
Hird, Geoffrey R.
1991-01-01
The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described: the Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques that assist the reuse of a verification effort on modified code.
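As a rough analogue of the verification example mentioned, here is a binary search annotated with the precondition, loop invariant, and postconditions a tool like Penelope would prove statically; in this Python sketch they are merely checked at runtime.

```python
# Illustrative analogue of the verification example above: a binary
# search with its contract written out as runtime assertions.
def binary_search(xs: list, key: int) -> int:
    # Precondition: xs is sorted in nondecreasing order.
    assert all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1))
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        # Loop invariant: if key is in xs, its index lies in [lo, hi].
        mid = (lo + hi) // 2
        if xs[mid] < key:
            lo = mid + 1
        elif xs[mid] > key:
            hi = mid - 1
        else:
            # Postcondition (found): xs[result] == key.
            return mid
    return -1   # Postcondition (not found): key does not occur in xs.

assert binary_search([1, 3, 5, 7, 9], 7) == 3
assert binary_search([1, 3, 5, 7, 9], 4) == -1
```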
A Platform-Independent Plugin for Navigating Online Radiology Cases.
Balkman, Jason D; Awan, Omer A
2016-06-01
Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.
Holt, Marla M; Noren, Dawn P; Dunkin, Robin C; Williams, Terrie M
2015-06-01
Many animals produce louder, longer or more repetitious vocalizations to compensate for increases in environmental noise. Biological costs of increased vocal effort in response to noise, including energetic costs, remain empirically undefined in many taxa, particularly in marine mammals that rely on sound for fundamental biological functions in increasingly noisy habitats. For this investigation, we tested the hypothesis that an increase in vocal effort would result in an energetic cost to the signaler by experimentally measuring oxygen consumption during rest and a 2 min vocal period in dolphins that were trained to vary vocal loudness across trials. Vocal effort was quantified as the total acoustic energy of sounds produced. Metabolic rates during the vocal period were, on average, 1.2 and 1.5 times resting metabolic rate (RMR) in dolphin A and B, respectively. As vocal effort increased, we found that there was a significant increase in metabolic rate over RMR during the 2 min following sound production in both dolphins, and in total oxygen consumption (metabolic cost of sound production plus recovery costs) in the dolphin that showed a wider range of vocal effort across trials. Increases in vocal effort, as a consequence of increases in vocal amplitude, repetition rate and/or duration, are consistent with behavioral responses to noise in free-ranging animals. Here, we empirically demonstrate for the first time in a marine mammal, that these vocal modifications can have an energetic impact at the individual level and, importantly, these data provide a mechanistic foundation for evaluating biological consequences of vocal modification in noise-polluted habitats. © 2015. Published by The Company of Biologists Ltd.
1993-09-30
...months of effort. The product was important for demonstrating to IBM management the potential of the Cleanroom methodology. 3.2.4 Software Architecture for Oscilloscopes Using Z (Tektronix): Tektronix in Beaverton, Oregon, used Z to develop a reusable software architecture to be shared among a number...
Irregular Applications: Architectures & Algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feo, John T.; Villa, Oreste; Tumeo, Antonino
Irregular applications are characterized by irregular data structures and irregular control and communication patterns. Novel irregular high-performance applications that deal with large data sets have recently appeared. Unfortunately, current high-performance systems and software infrastructures execute irregular algorithms poorly. Only coordinated efforts by end users, domain specialists, and computer scientists that consider both the architecture and the software stack are likely to provide solutions to the challenges of modern irregular applications.
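A small sketch of what makes such applications irregular: traversing a sparse graph in compressed sparse row (CSR) form produces data-dependent, non-contiguous memory accesses that defeat caching and prefetching. The graph below is illustrative only.

```python
# A CSR graph traversal: each neighbor-list lookup is a data-dependent,
# non-contiguous read, the access pattern that stresses caches and
# prefetchers on conventional systems. The graph is illustrative.
from collections import deque
import numpy as np

# Neighbors of vertex v are adj[offsets[v]:offsets[v + 1]].
offsets = np.array([0, 2, 4, 5, 7])
adj     = np.array([1, 3, 0, 2, 3, 0, 1])

def bfs(source: int) -> list:
    seen, frontier, order = {source}, deque([source]), []
    while frontier:
        v = frontier.popleft()
        order.append(v)
        for u in adj[offsets[v]:offsets[v + 1]]:   # irregular reads
            u = int(u)
            if u not in seen:
                seen.add(u)
                frontier.append(u)
    return order

print(bfs(0))   # -> [0, 1, 3, 2]
```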
2010-04-30
...previous and current complex SW development efforts, the program offices will have a source of objective lessons learned and metrics that can be applied...