Software Models Impact Stresses
NASA Technical Reports Server (NTRS)
Hanshaw, Timothy C.; Roy, Dipankar; Toyooka, Mark
1991-01-01
Generalized Impact Stress Software is designed to assist engineers in predicting stresses caused by a variety of impacts. The program is straightforward, simple to implement on personal computers, "user friendly", and handles a variety of boundary conditions applied to the struck body being analyzed. Applications include mathematical modeling of the motions and transient stresses of spacecraft, analysis of piston slamming and fast valve shutoffs, and play in rotating bearing assemblies. It provides a fast and inexpensive analytical tool for stress analysis and reduces dependency on expensive impact tests. Written in FORTRAN 77; requires the commercial software package PLOT88.
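The abstract does not disclose the program's underlying models, but the flavor of such predictions can be illustrated with the classic one-dimensional result for a bar struck axially, where the peak elastic stress is the impact velocity times the material's acoustic impedance. A minimal sketch with illustrative values, not the program's actual method:

```python
import math

def axial_impact_stress(v, E, rho):
    """Peak elastic stress (Pa) in a bar struck axially at speed v (m/s).

    Classic one-dimensional wave result: sigma = v * sqrt(E * rho),
    i.e. impact velocity times the bar's acoustic impedance.
    """
    return v * math.sqrt(E * rho)

# Steel bar struck at 2 m/s: E = 200 GPa, rho = 7850 kg/m^3
print(axial_impact_stress(2.0, 200e9, 7850) / 1e6, "MPa")  # ~79 MPa
```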
Maintaining the Health of Software Monitors
NASA Technical Reports Server (NTRS)
Person, Suzette; Rungta, Neha
2013-01-01
Software health management (SWHM) techniques complement the rigorous verification and validation processes that are applied to safety-critical systems prior to their deployment. These techniques are used to monitor deployed software in its execution environment, serving as the last line of defense against the effects of a critical fault. SWHM monitors use information from the specification and implementation of the monitored software to detect violations, predict possible failures, and help the system recover from faults. Changes to the monitored software, such as adding new functionality or fixing defects, therefore, have the potential to impact the correctness of both the monitored software and the SWHM monitor. In this work, we describe how the results of a software change impact analysis technique, Directed Incremental Symbolic Execution (DiSE), can be applied to monitored software to identify the potential impact of the changes on the SWHM monitor software. The results of DiSE can then be used by other analysis techniques, e.g., testing, debugging, to help preserve and improve the integrity of the SWHM monitor as the monitored software evolves.
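DiSE itself combines program slicing with symbolic execution, which is beyond the scope of an abstract. As a rough, hypothetical sketch of the first step such techniques share, forward impact propagation over a static dependency graph looks like this (all names illustrative):

```python
from collections import deque

def impact_set(dependents, changed):
    """Transitively propagate changes through a static dependency graph.

    dependents: dict mapping a program element to the elements that depend
    on it; changed: iterable of directly modified elements. Returns every
    element whose behavior may be affected by the change.
    """
    impacted, work = set(changed), deque(changed)
    while work:
        elem = work.popleft()
        for dep in dependents.get(elem, ()):
            if dep not in impacted:
                impacted.add(dep)
                work.append(dep)
    return impacted

deps = {"parse": {"validate"}, "validate": {"monitor_check"}, "log": set()}
print(impact_set(deps, {"parse"}))  # {'parse', 'validate', 'monitor_check'}
```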
Corridor-based forecasts of work-zone impacts for freeways.
DOT National Transportation Integrated Search
2011-08-09
This project developed an analysis methodology and associated software implementation for the evaluation of significant work zone impacts on freeways in North Carolina. The FREEVAL-WZ software tool allows the analyst to predict the operational im...
Calibration of work zone impact analysis software for Missouri.
DOT National Transportation Integrated Search
2013-12-01
This project calibrated two software programs used for estimating the traffic impacts of work zones. The WZ Spreadsheet and VISSIM programs were recommended in a previous study by the authors. The two programs were calibrated using field data fro...
Research in nonlinear structural and solid mechanics
NASA Technical Reports Server (NTRS)
Mccomb, H. G., Jr. (Compiler); Noor, A. K. (Compiler)
1981-01-01
Recent and projected advances in applied mechanics, numerical analysis, computer hardware and engineering software, and their impact on modeling and solution techniques in nonlinear structural and solid mechanics are discussed. The fields covered are rapidly changing and are strongly impacted by current and projected advances in computer hardware. To foster effective development of the technology, perceptions on computing systems and nonlinear analysis software systems are presented.
NASA Astrophysics Data System (ADS)
Poluyan, L. V.; Syutkina, E. V.; Guryev, E. S.
2017-11-01
The comparative analysis of key features of the software systems TOXI+Risk and ALOHA is presented. The authors compared the domestic (TOXI+Risk) and foreign (ALOHA) software systems, both of which quantitatively assess impact areas (pressure, thermal, toxic) for hypothetical emergencies at potentially hazardous facilities in the oil, gas, chemical, petrochemical and oil-processing industries. The two systems use different mathematical models to estimate the release rate of a chemically hazardous substance from a storage tank and its evaporation. A comparison of the impact areas computed by both systems on verification examples shows good agreement between the two products. The analysis results showed that the ALOHA software can be actively used for forecasting and immediate assessment of emergency situations and for assessing damage from emergencies on the territories of municipalities.
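Neither tool's source-term models are given here, but the neutral-gas dispersion step that such impact-area estimates build on is the textbook Gaussian plume. A minimal sketch with illustrative numbers (the actual systems add heavy-gas and source-term modeling):

```python
import numpy as np

def plume_concentration(Q, u, y, z, H, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration (kg/m^3) at (y, z).

    Q: release rate (kg/s); u: wind speed (m/s); H: effective release
    height (m); sigma_y, sigma_z: dispersion widths (m) evaluated at the
    downwind distance of interest via stability-class correlations.
    The second exponential is the ground-reflection term.
    """
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2 * sigma_z**2)))
    return Q * lateral * vertical / (2 * np.pi * u * sigma_y * sigma_z)

# 2 kg/s release, 5 m/s wind, plume centerline at ground, 500 m downwind
print(plume_concentration(2.0, 5.0, y=0.0, z=0.0, H=10.0,
                          sigma_y=36.0, sigma_z=18.0))
```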
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
Acoustic Emission Analysis Applet (AEAA) Software
NASA Technical Reports Server (NTRS)
Nichols, Charles T.; Roth, Don J.
2013-01-01
NASA Glenn Research and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure. AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will otherwise have little impact on missions. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.
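AEAA's own filter set is specific to the tool, but the "basic AE statistics as a function of time and pressure" it mentions can be sketched generically; hit counts and energy binned over the pressure ramp are the usual starting point (field names hypothetical):

```python
import numpy as np

def ae_statistics(times, pressures, energies, n_bins=20):
    """Bin acoustic-emission hits by vessel pressure.

    times: hit timestamps (s); pressures: pressure at each hit;
    energies: per-hit AE energy. Returns bin edges, hit counts, summed
    energy per bin, and the mean hit rate over the whole test.
    """
    edges = np.linspace(pressures.min(), pressures.max(), n_bins + 1)
    counts, _ = np.histogram(pressures, bins=edges)
    energy, _ = np.histogram(pressures, bins=edges, weights=energies)
    mean_rate = len(times) / (times.max() - times.min())
    return edges, counts, energy, mean_rate
```

A rising hit rate or energy concentration near proof pressure is the kind of signature an analyst would then inspect further.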
Impact of new computing systems on computational mechanics and flight-vehicle structures technology
NASA Technical Reports Server (NTRS)
Noor, A. K.; Storaasli, O. O.; Fulton, R. E.
1984-01-01
Advances in computer technology which may have an impact on computational mechanics and flight vehicle structures technology were reviewed. The characteristics of supersystems, highly parallel systems, and small systems are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario for future hardware/software environment and engineering analysis systems is presented. Research areas with potential for improving the effectiveness of analysis methods in the new environment are identified.
The impact of software quality characteristics on healthcare outcome: a literature review.
Aghazadeh, Sakineh; Pirnejad, Habibollah; Moradkhani, Alireza; Aliev, Alvosat
2014-01-01
The aim of this study was to discover the effect of software quality characteristics on healthcare quality and efficiency indicators. Through a systematic literature review, we selected and analyzed 37 original research papers to investigate the impact of the software indicators (coming from the standard ISO 9126 quality characteristics and sub-characteristics) on some important healthcare outcome indicators, and finally ranked these software indicators. The results showed that the software characteristics usability, reliability and efficiency were mostly favored in the studies, indicating their importance. On the other hand, user satisfaction, quality of patient care, clinical workflow efficiency, providers' communication and information exchange, patient satisfaction and care costs were among the healthcare outcome indicators frequently evaluated in relation to the mentioned software characteristics. Logistic regression was the most common assessment methodology, and Confirmatory Factor Analysis and Structural Equation Modeling were performed to test the structural model's fit. The software characteristics were considered to impact the healthcare outcome indicators through other intermediate factors (variables).
Mining Bug Databases for Unidentified Software Vulnerabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dumidu Wijayasekara; Milos Manic; Jason Wright
2012-06-01
Identifying software vulnerabilities is becoming more important as critical and sensitive systems increasingly rely on complex software systems. It has been suggested in previous work that some bugs are only identified as vulnerabilities long after the bug has been made public. These vulnerabilities are known as hidden impact vulnerabilities. This paper discusses the feasibility and necessity to mine common publicly available bug databases for vulnerabilities that are yet to be identified. We present bug database analysis of two well known and frequently used software packages, namely Linux kernel and MySQL. It is shown that for both Linux and MySQL, a significant portion of vulnerabilities that were discovered for the time period from January 2006 to April 2011 were hidden impact vulnerabilities. It is also shown that the percentage of hidden impact vulnerabilities has increased in the last two years, for both software packages. We then propose an improved hidden impact vulnerability identification methodology based on text mining bug databases, and conclude by discussing a few potential problems faced by such a classifier.
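The paper's classifier details are not given in the abstract; as a hedged sketch of the general idea, a text-mining classifier over bug-report text might look like this (toy corpus and labels are invented for illustration):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Toy corpus: bug-report text labeled 1 if later assigned a CVE, else 0.
reports = ["buffer overflow in packet parser crashes kernel",
           "typo in man page for mysqldump",
           "use-after-free when closing socket under load",
           "progress bar renders incorrectly on resize"]
labels = [1, 0, 1, 0]

# TF-IDF features over unigrams and bigrams, then a Naive Bayes classifier
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), MultinomialNB())
clf.fit(reports, labels)
print(clf.predict(["heap corruption in query cache"]))  # likely flags as 1
```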
High-fidelity modeling and impact footprint prediction for vehicle breakup analysis
NASA Astrophysics Data System (ADS)
Ling, Lisa
For decades, vehicle breakup analysis had been performed for space missions that used nuclear heater or power units in order to assess aerospace nuclear safety for potential launch failures leading to inadvertent atmospheric reentry. Such pre-launch risk analysis is imperative to assess possible environmental impacts, obtain launch approval, and for launch contingency planning. In order to accurately perform a vehicle breakup analysis, the analysis tool should include a trajectory propagation algorithm coupled with thermal and structural analyses and their influences. Since such a software tool was not available commercially or in the public domain, a basic analysis tool was developed by Dr. Angus McRonald prior to this study. This legacy software consisted of low-fidelity modeling and had the capability to predict vehicle breakup, but did not predict the surface impact point of the nuclear component. Thus the main thrust of this study was to develop and verify the additional dynamics modeling and capabilities for the analysis tool with the objectives to (1) have the capability to predict impact point and footprint, (2) increase the fidelity in the prediction of vehicle breakup, and (3) reduce the effort and time required to complete an analysis. The new functions developed for predicting the impact point and footprint included 3-degree-of-freedom trajectory propagation, the generation of non-arbitrary entry conditions, sensitivity analysis, and the calculation of impact footprint. The functions to increase the fidelity in the prediction of vehicle breakup included a panel code to calculate the hypersonic aerodynamic coefficients for an arbitrary-shaped body and the modeling of local winds. The function to reduce the effort and time required to complete an analysis included the calculation of node failure criteria. The derivation and development of these new functions are presented in this dissertation, and examples are given to demonstrate the new capabilities and the improvements made, with comparisons between the results obtained from the upgraded analysis tool and the legacy software wherever applicable.
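The dissertation's propagator couples trajectory, thermal, and structural models; as a much-reduced illustration of the 3-degree-of-freedom piece alone, a point-mass entry with an exponential atmosphere and drag can be propagated to ground impact (entry state and ballistic coefficient are toy values):

```python
import numpy as np
from scipy.integrate import solve_ivp

MU, RE, HSCALE, RHO0 = 3.986e14, 6.371e6, 7200.0, 1.225  # Earth, SI units

def reentry_rhs(t, s, beta=500.0):
    """3-DOF point-mass entry; s = [x, y, z, vx, vy, vz], inertial frame.
    beta = m / (Cd * A) is the ballistic coefficient (kg/m^2)."""
    r, v = s[:3], s[3:]
    rn, vn = np.linalg.norm(r), np.linalg.norm(v)
    rho = RHO0 * np.exp(-(rn - RE) / HSCALE)   # exponential atmosphere
    acc = -MU * r / rn**3 - 0.5 * rho * vn / beta * v
    return np.concatenate([v, acc])

def hit_ground(t, s):
    return np.linalg.norm(s[:3]) - RE
hit_ground.terminal = True

# Toy entry interface: 120 km altitude, 7.8 km/s, -2 deg flight path angle
gamma = np.radians(-2.0)
s0 = [RE + 120e3, 0.0, 0.0, 7800 * np.sin(gamma), 7800 * np.cos(gamma), 0.0]
sol = solve_ivp(reentry_rhs, (0.0, 5000.0), s0, events=hit_ground, max_step=1.0)
print(f"ground impact after {sol.t[-1]:.0f} s")
```

A footprint prediction would then repeat such propagations over dispersed entry states and aerodynamic uncertainties, which is where the dissertation's sensitivity analysis comes in.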
MIDAS: Software for the detection and analysis of lunar impact flashes
NASA Astrophysics Data System (ADS)
Madiedo, José M.; Ortiz, José L.; Morales, Nicolás; Cabrera-Caño, Jesús
2015-06-01
Since 2009 we have been running a project to identify flashes produced by the impact of meteoroids on the surface of the Moon. For this purpose we employ small telescopes and high-sensitivity CCD video cameras. To automatically identify these events, a software package called MIDAS was developed and tested. This package can also perform photometric analysis of these flashes and estimate the value of the luminous efficiency. In addition, we have implemented in MIDAS a new method to establish the likely source of the meteoroids (known meteoroid stream or sporadic background). The main features of this computer program are analyzed here, and some examples of lunar impact events are presented.
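MIDAS's detection pipeline is its own; a generic transient detector over co-registered video frames, which is the kind of step such software automates, can be sketched as follows (threshold and synthetic data are illustrative):

```python
import numpy as np

def detect_flashes(frames, k=5.0):
    """Flag transient brightenings in a stack of co-registered frames.

    frames: (T, H, W) array. A pixel is a candidate flash when it exceeds
    the per-pixel temporal median by k robust standard deviations
    (1.4826 * MAD). Returns (frame_index, row, col) candidates.
    """
    frames = frames.astype(float)
    med = np.median(frames, axis=0)
    mad = np.median(np.abs(frames - med), axis=0)
    sigma = 1.4826 * mad + 1e-6
    hits = np.argwhere(frames > med + k * sigma)
    return [tuple(map(int, h)) for h in hits]

rng = np.random.default_rng(0)
stack = rng.normal(100, 2, size=(50, 8, 8))
stack[20, 3, 4] += 40.0                       # inject one synthetic flash
print(detect_flashes(stack))                  # -> [(20, 3, 4)]
```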
Study of application of space telescope science operations software for SIRTF use
NASA Technical Reports Server (NTRS)
Dignam, F.; Stetson, E.; Allendoerfer, W.
1985-01-01
The design and development of the Space Telescope Science Operations Ground System (ST SOGS) was evaluated to compile a history of lessons learned that would benefit NASA's Space Infrared Telescope Facility (SIRTF). Forty-nine specific recommendations resulted and were categorized as follows: (1) requirements: a discussion of the content, timeliness and proper allocation of the system and segment requirements and the resulting impact on SOGS development; (2) science instruments: a consideration of the impact of the Science Instrument design and data streams on SOGS software; and (3) contract phasing: an analysis of the impact of beginning the various ST program segments at different times. Approximately half of the software design and source code might be useable for SIRTF. Transportability of this software requires, at minimum, a compatible DEC VAX-based architecture and VMS operating system, system support software similar to that developed for SOGS, and continued evolution of the SIRTF operations concept and requirements such that they remain compatible with ST SOGS operation.
The ImageJ ecosystem: an open platform for biomedical image analysis
Schindelin, Johannes; Rueden, Curtis T.; Hiner, Mark C.; Eliceiri, Kevin W.
2015-01-01
Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available – from commercial to academic, special-purpose to Swiss army knife, small to large–but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts life science, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. PMID:26153368
Building Energy Simulation Test for Existing Homes (BESTEST-EX) (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judkoff, R.; Neymark, J.; Polly, B.
2011-12-01
This presentation discusses the goals of NREL Analysis Accuracy R&D; BESTEST-EX goals; what BESTEST-EX is; how it works; 'Building Physics' cases; 'Building Physics' reference results; 'utility bill calibration' cases; limitations and potential future work. Goals of NREL Analysis Accuracy R&D are: (1) Provide industry with the tools and technical information needed to improve the accuracy and consistency of analysis methods; (2) Reduce the risks associated with purchasing, financing, and selling energy efficiency upgrades; and (3) Enhance software and input collection methods considering impacts on accuracy, cost, and time of energy assessments. BESTEST-EX goals are: (1) Test software predictions of retrofit energy savings in existing homes; (2) Ensure building physics calculations and utility bill calibration procedures perform up to a minimum standard; and (3) Quantify the impact of uncertainties in input audit data and occupant behavior. BESTEST-EX is a repeatable procedure that tests how well audit software predictions compare to the current state of the art in building energy simulation. There is no direct truth standard; however, the reference software programs have been subjected to validation testing, including comparisons with empirical data.
Mason F. Patterson; P. Eric Wiseman; Matthew F. Winn; Sang-mook Lee; Philip A. Araman
2011-01-01
UrbanCrowns is a software program developed by the USDA Forest Service that computes crown attributes using a side-view digital photograph and a few basic field measurements. From an operational standpoint, it is not known how well the software performs under varying photographic conditions for trees of diverse size, which could impact measurement reproducibility and...
Performing Verification and Validation in Reuse-Based Software Engineering
NASA Technical Reports Server (NTRS)
Addy, Edward A.
1999-01-01
The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.
RAD-ADAPT: Software for modelling clonogenic assay data in radiation biology.
Zhang, Yaping; Hu, Kaiqiang; Beumer, Jan H; Bakkenist, Christopher J; D'Argenio, David Z
2017-04-01
We present a comprehensive software program, RAD-ADAPT, for the quantitative analysis of clonogenic assays in radiation biology. Two commonly used models for clonogenic assay analysis, the linear-quadratic model and the single-hit multi-target model, are included in the software. RAD-ADAPT uses the maximum likelihood estimation method to obtain parameter estimates, under the assumption that cell colony count data follow a Poisson distribution. The program has an intuitive interface, generates model prediction plots, tabulates model parameter estimates, and allows automatic statistical comparison of parameters between different groups. The RAD-ADAPT interface is written using the statistical software R, and the underlying computations are accomplished by the ADAPT software system for pharmacokinetic/pharmacodynamic systems analysis. The use of RAD-ADAPT is demonstrated using an example that examines the impact of pharmacologic ATM and ATR kinase inhibition on the human lung cancer cell line A549 after ionizing radiation.
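RAD-ADAPT itself wraps ADAPT behind an R interface; the statistical core the abstract describes, Poisson maximum likelihood for the linear-quadratic model S(D) = exp(-alpha*D - beta*D^2), can be sketched independently (colony counts below are hypothetical):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

dose = np.array([0.0, 2.0, 4.0, 6.0, 8.0])        # Gy
plated = np.array([100, 200, 500, 1000, 2000.0])  # cells seeded per dish
colonies = np.array([55, 60, 70, 40, 15.0])       # hypothetical counts

def negloglik(theta):
    """Poisson NLL: colonies ~ Poisson(plated * PE * exp(-a*D - b*D^2))."""
    pe, alpha, beta = theta
    mu = plated * pe * np.exp(-alpha * dose - beta * dose**2)
    return np.sum(mu - colonies * np.log(mu) + gammaln(colonies + 1))

fit = minimize(negloglik, x0=[0.5, 0.3, 0.03], method="L-BFGS-B",
               bounds=[(1e-3, 1.0), (0.0, 2.0), (0.0, 0.5)])
pe, alpha, beta = fit.x
print(f"PE={pe:.2f}  alpha={alpha:.3f}/Gy  beta={beta:.4f}/Gy^2")
```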
A Change Impact Analysis to Characterize Evolving Program Behaviors
NASA Technical Reports Server (NTRS)
Rungta, Neha Shyam; Person, Suzette; Branchaud, Joshua
2012-01-01
Change impact analysis techniques estimate the potential effects of changes made to software. Directed Incremental Symbolic Execution (DiSE) is an intraprocedural technique for characterizing the impact of software changes on program behaviors. DiSE first estimates the impact of the changes on the source code using program slicing techniques, and then uses the impact sets to guide symbolic execution to generate path conditions that characterize impacted program behaviors. DiSE, however, cannot reason about the flow of impact between methods and will fail to generate path conditions for certain impacted program behaviors. In this work, we present iDiSE, an extension to DiSE that performs an interprocedural analysis. iDiSE combines static and dynamic calling context information to efficiently generate impacted program behaviors across calling contexts. Information about impacted program behaviors is useful for testing, verification, and debugging of evolving programs. We present a case-study of our implementation of the iDiSE algorithm to demonstrate its efficiency at computing impacted program behaviors. Traditional notions of coverage are insufficient for characterizing the testing efforts used to validate evolving program behaviors because they do not take into account the impact of changes to the code. In this work we present novel definitions of impacted coverage metrics that are useful for evaluating the testing effort required to test evolving programs. We then describe how the notions of impacted coverage can be used to configure techniques such as DiSE and iDiSE in order to support regression testing related tasks. We also discuss how DiSE and iDiSE can be configured for debugging, i.e., finding the root cause of errors introduced by changes made to the code. In our empirical evaluation we demonstrate that the configurations of DiSE and iDiSE can be used to support various software maintenance tasks.
Wind Data and Tools | Wind | NREL
NREL provides wind data and integrated system design and analysis tools; all software is available for download. The Wind-Wildlife Impacts database contains a collection of articles, reports, studies, and more that focus on the impacts that...
Data processing has major impact on the outcome of quantitative label-free LC-MS analysis.
Chawade, Aakash; Sandin, Marianne; Teleman, Johan; Malmström, Johan; Levander, Fredrik
2015-02-06
High-throughput multiplexed protein quantification using mass spectrometry is steadily increasing in popularity, with the two major techniques being data-dependent acquisition (DDA) and targeted acquisition using selected reaction monitoring (SRM). However, both techniques involve extensive data processing, which can be performed by a multitude of different software solutions. Analysis of quantitative LC-MS/MS data is mainly performed in three major steps: processing of raw data, normalization, and statistical analysis. To evaluate the impact of data processing steps, we developed two new benchmark data sets, one each for DDA and SRM, with samples consisting of a long-range dilution series of synthetic peptides spiked in a total cell protein digest. The generated data were processed by eight different software workflows and three postprocessing steps. The results show that the choice of the raw data processing software and the postprocessing steps play an important role in the final outcome. Also, the linear dynamic range of the DDA data could be extended by an order of magnitude through feature alignment and a charge state merging algorithm proposed here. Furthermore, the benchmark data sets are made publicly available for further benchmarking and software developments.
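The eight benchmarked workflows are not reproduced here, but one representative postprocessing step of the kind the study evaluates, per-sample normalization of the quantification matrix, can be sketched as:

```python
import numpy as np

def median_normalize(log_intensity):
    """Column-wise median normalization of a peptides x samples matrix.

    Shifts each sample (column) of log-intensities so its median matches
    the global median, removing sample-wide loading/ionization offsets
    before statistical analysis. NaNs (missing peptides) are ignored.
    """
    col_medians = np.nanmedian(log_intensity, axis=0)
    return log_intensity - col_medians + np.nanmedian(log_intensity)
```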
NASA Technical Reports Server (NTRS)
Sridhar, Banavar; Chen, Neil; Ng, Hok K.
2010-01-01
There is increased awareness of anthropogenic factors affecting climate change and urgency to slow the negative impact. Greenhouse gases, oxides of nitrogen and contrails resulting from aviation affect the climate in different and uncertain ways. This paper develops a flexible simulation and optimization software architecture to study the trade-offs involved in reducing emissions. The software environment is used to conduct analysis of two approaches for avoiding contrails using the concepts of contrail frequency index and optimal avoidance trajectories.
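The paper's exact contrail frequency index definition is not given in the abstract; a simplified proxy counts grid cells that are cold and ice-supersaturated (real analyses apply the Schmidt-Appleman criterion, and the critical temperature below is only a rough stand-in):

```python
import numpy as np

def contrail_frequency_index(temp_k, rhi, t_crit=233.0):
    """Fraction of grid cells favorable to persistent contrails.

    temp_k: ambient temperatures (K); rhi: relative humidity over ice (%).
    A cell counts when it is colder than an approximate critical
    temperature and ice-supersaturated.
    """
    favorable = (temp_k < t_crit) & (rhi >= 100.0)
    return favorable.mean()

temp = np.array([[220.0, 240.0], [225.0, 230.0]])
rhi = np.array([[110.0, 105.0], [90.0, 120.0]])
print(contrail_frequency_index(temp, rhi))  # 0.5
```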
NASA Astrophysics Data System (ADS)
Silva, N.; Esper, A.
2012-01-01
The work presented in this article represents the results of applying RAMS analysis to a critical space control system, both at system and software levels. The system-level RAMS analysis allowed the assignment of criticalities to the high-level components, which was further refined by a tailored software-level RAMS analysis. The importance of the software-level RAMS analysis in the identification of new failure modes, and its impact on the system-level RAMS analysis, is discussed. Recommendations of changes in the software architecture have also been proposed in order to reduce the criticality of the SW components to an acceptable minimum. The dependability analysis was performed in accordance with ECSS-Q-ST-80, which had to be tailored and complemented in some aspects. This tailoring is also detailed in the article, and lessons learned from its application are shared, underlining their importance to space system safety evaluations. The paper presents the applied techniques, the relevant results obtained, the effort required for performing the tasks and the planned strategy for ROI estimation, as well as the soft skills required and acquired during these activities.
Livingood, Wiliiam C; Coughlin, Susan; Bowman, Walter; Bryant, Thomas; Goldhagen, Jeffrey
2007-01-01
Public health systems are stressed by increasing demands and inadequate resources. This study was designed to demonstrate how economic impact analysis can estimate the economic value of a local public health system's infrastructure as well as the economic assets of an "Academic Health Department" model. This study involved the secondary analysis of publicly available data on health department finances and employment using proprietary software specifically designed to assess economic impacts. The health department's impact on the local community was estimated at over 100 million dollars, exceeding the economic impact of other recently studied local industries, with no additional costs to local taxpayers.
ERIC Educational Resources Information Center
Wright, Gerald P.
2013-01-01
Despite over half a century of Project Management research, project success rates are still too low. Organizations spend a tremendous amount of valuable resources on Information Technology projects and seek to maximize the utility gained from their efforts. The author investigated the impact of software development methodology choice on ten…
High-performance equation solvers and their impact on finite element analysis
NASA Technical Reports Server (NTRS)
Poole, Eugene L.; Knight, Norman F., Jr.; Davis, D. Dale, Jr.
1990-01-01
The role of equation solvers in modern structural analysis software is described. Direct and iterative equation solvers which exploit vectorization on modern high-performance computer systems are described and compared. The direct solvers are two Cholesky factorization methods. The first method utilizes a novel variable-band data storage format to achieve very high computation rates and the second method uses a sparse data storage format designed to reduce the number of operations. The iterative solvers are preconditioned conjugate gradient methods. Two different preconditioners are included; the first uses a diagonal matrix storage scheme to achieve high computation rates and the second requires a sparse data storage scheme and converges to the solution in fewer iterations than the first. The impact of using all of the equation solvers in a common structural analysis software system is demonstrated by solving several representative structural analysis problems.
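The abstract names a diagonal preconditioner precisely because it vectorizes well; as a generic illustration of that solver class (not the paper's implementation), here is a Jacobi-preconditioned conjugate gradient sketch:

```python
import numpy as np

def jacobi_pcg(A, b, tol=1e-8, max_iter=1000):
    """Conjugate gradients with a diagonal (Jacobi) preconditioner.

    The diagonal preconditioner is trivially vectorizable, trading
    per-iteration cost against iteration count, mirroring the paper's
    comparison of preconditioning strategies. A must be SPD.
    """
    M_inv = 1.0 / np.diag(A)
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

A = np.array([[4.0, 1, 0], [1, 3, 1], [0, 1, 2]])
b = np.array([1.0, 2.0, 3.0])
print(jacobi_pcg(A, b))
```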
High-performance equation solvers and their impact on finite element analysis
NASA Technical Reports Server (NTRS)
Poole, Eugene L.; Knight, Norman F., Jr.; Davis, D. D., Jr.
1992-01-01
The role of equation solvers in modern structural analysis software is described. Direct and iterative equation solvers which exploit vectorization on modern high-performance computer systems are described and compared. The direct solvers are two Cholesky factorization methods. The first method utilizes a novel variable-band data storage format to achieve very high computation rates and the second method uses a sparse data storage format designed to reduce the number of operations. The iterative solvers are preconditioned conjugate gradient methods. Two different preconditioners are included; the first uses a diagonal matrix storage scheme to achieve high computation rates and the second requires a sparse data storage scheme and converges to the solution in fewer iterations than the first. The impact of using all of the equation solvers in a common structural analysis software system is demonstrated by solving several representative structural analysis problems.
A theoretical basis for the analysis of multiversion software subject to coincident errors
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.; Lee, L. D.
1985-01-01
Fundamental to the development of redundant software techniques (known as fault-tolerant software) is an understanding of the impact of multiple joint occurrences of errors, referred to here as coincident errors. A theoretical basis for the study of redundant software is developed which: (1) provides a probabilistic framework for empirically evaluating the effectiveness of a general multiversion strategy when component versions are subject to coincident errors, and (2) permits an analytical study of the effects of these errors. An intensity function, called the intensity of coincident errors, has a central role in this analysis. This function describes the propensity of programmers to introduce design faults in such a way that software components fail together when executing in the application environment. A condition under which a multiversion system is a better strategy than relying on a single version is given.
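Consistent with the abstract's description (and hedged as a reconstruction rather than the paper's exact notation), the framework can be summarized as follows: let theta(x) be the probability that a randomly selected version fails on input x, with X drawn from the usage distribution; an N-version majority-voting system then fails with probability

```latex
\theta(x) = \Pr\{\text{a randomly chosen version fails on input } x\},
\qquad
P_{\mathrm{maj}}(N) = \mathbb{E}\!\left[\sum_{k=\lceil (N+1)/2 \rceil}^{N}
  \binom{N}{k}\,\theta(X)^{k}\,\bigl(1-\theta(X)\bigr)^{N-k}\right].
```

When Var[theta(X)] = 0 the versions fail independently; positive variance is exactly the coincident-error effect, and whether N versions beat a single version turns on the distribution of theta(X).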
Fowler, K. R.; Jenkins, E.W.; Parno, M.; Chrispell, J.C.; Colón, A. I.; Hanson, Randall T.
2016-01-01
The development of appropriate water management strategies requires, in part, a methodology for quantifying and evaluating the impact of water policy decisions on regional stakeholders. In this work, we describe the framework we are developing to enhance the body of resources available to policy makers, farmers, and other community members in their efforts to understand, quantify, and assess the often competing objectives water consumers have with respect to usage. The foundation for the framework is the construction of a simulation-based optimization software tool using two existing software packages. In particular, we couple a robust optimization software suite (DAKOTA) with the USGS MF-OWHM water management simulation tool to provide a flexible software environment that will enable the evaluation of one or multiple (possibly competing) user-defined (or stakeholder) objectives. We introduce the individual software components and outline the communication strategy we defined for the coupled development. We present numerical results for case studies related to crop portfolio management with several defined objectives. The objectives are not optimally satisfied for any single user class, demonstrating the capability of the software tool to aid in the evaluation of a variety of competing interests.
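The coupling pattern, an optimizer repeatedly invoking a simulator through a thin objective wrapper, is generic even though DAKOTA and MF-OWHM have their own interfaces. A toy sketch in which all coefficients, the water allocation, and the scalarization are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize

def run_simulator(acreage):
    """Stand-in for one water-management simulation run (e.g., MF-OWHM).
    Returns hypothetical (profit in $, water use in acre-ft) for two crops."""
    profit = 900.0 * acreage[0] + 1400.0 * acreage[1]
    water = 2.1 * acreage[0] + 3.8 * acreage[1]
    return profit, water

def objective(acreage):
    """Scalarized competing objectives: maximize profit while penalizing
    water use beyond a hypothetical 300 acre-ft allocation."""
    profit, water = run_simulator(acreage)
    return -(profit - 1000.0 * max(0.0, water - 300.0))

res = minimize(objective, x0=np.array([50.0, 50.0]), method="Nelder-Mead")
print("acreage:", res.x, "-> profit/water:", run_simulator(res.x))
```

Changing the penalty weight shifts the balance between stakeholder objectives, which is the kind of trade-off exploration the framework is meant to support.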
NASA Technical Reports Server (NTRS)
Allen, B. Danette
1998-01-01
In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.
NASA Astrophysics Data System (ADS)
Gan, Chenquan; Yang, Xiaofan
2015-05-01
In this paper, a new computer virus propagation model, which incorporates the effects of removable storage media and antivirus software, is proposed and analyzed. The global stability of the unique equilibrium of the model is independent of system parameters. Numerical simulations not only verify this result, but also illustrate the influences of removable storage media and antivirus software on viral spread. On this basis, some applicable measures for suppressing virus prevalence are suggested.
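The paper's exact equations are not reproduced in the abstract, so the following is a generic compartmental sketch of the two mechanisms it names: removable media contribute an infection pressure even without network contact, and antivirus software contributes a cure rate (all rates hypothetical):

```python
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, beta=0.3, eps=0.02, gamma=0.25):
    """S = susceptible fraction, I = infected fraction (S + I = 1).
    beta: network infection rate; eps: infection pressure from removable
    media; gamma: cure rate contributed by antivirus software."""
    S, I = y
    new_infections = beta * S * I + eps * S
    return [-new_infections + gamma * I, new_infections - gamma * I]

sol = solve_ivp(rhs, (0.0, 400.0), [0.99, 0.01], max_step=0.5)
print(f"infected fraction settles near {sol.y[1, -1]:.3f}")  # ~0.31 here
```

With these rates the trajectory approaches a single endemic equilibrium regardless of the initial split, echoing the global-stability result the abstract claims for the actual model.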
Software dependability in the Tandem GUARDIAN system
NASA Technical Reports Server (NTRS)
Lee, Inhwan; Iyer, Ravishankar K.
1995-01-01
Based on extensive field failure data for Tandem's GUARDIAN operating system this paper discusses evaluation of the dependability of operational software. Software faults considered are major defects that result in processor failures and invoke backup processes to take over. The paper categorizes the underlying causes of software failures and evaluates the effectiveness of the process pair technique in tolerating software faults. A model to describe the impact of software faults on the reliability of an overall system is proposed. The model is used to evaluate the significance of key factors that determine software dependability and to identify areas for improvement. An analysis of the data shows that about 77% of processor failures that are initially considered due to software are confirmed as software problems. The analysis shows that the use of process pairs to provide checkpointing and restart (originally intended for tolerating hardware faults) allows the system to tolerate about 75% of reported software faults that result in processor failures. The loose coupling between processors, which results in the backup execution (the processor state and the sequence of events) being different from the original execution, is a major reason for the measured software fault tolerance. Over two-thirds (72%) of measured software failures are recurrences of previously reported faults. Modeling, based on the data, shows that, in addition to reducing the number of software faults, software dependability can be enhanced by reducing the recurrence rate.
Simplified formulae for the estimation of offshore wind turbines clutter on marine radars.
Grande, Olatz; Cañizo, Josune; Angulo, Itziar; Jenn, David; Danoon, Laith R; Guerra, David; de la Vega, David
2014-01-01
The potential impact that offshore wind farms may cause on nearby marine radars should be considered before the wind farm is installed. Strong radar echoes from the turbines may degrade radars' detection capability in the area around the wind farm. Although conventional computational methods provide accurate results of scattering by wind turbines, they are not directly implementable in software tools that can be used to conduct the impact studies. This paper proposes a simple model to assess the clutter that wind turbines may generate on marine radars. This method can be easily implemented in the system modeling software tools for the impact analysis of a wind farm in a real scenario.
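The paper's simplified formulae themselves are not given in the abstract; the quantity they feed, echo power from a turbine at a given range, comes from the standard monostatic radar range equation, sketched here with illustrative numbers:

```python
import math

def clutter_power(pt, gain, wavelength, rcs, rng):
    """Monostatic radar range equation: received clutter power (W).

    pt: transmit power (W); gain: antenna gain (linear); wavelength (m);
    rcs: turbine radar cross-section (m^2); rng: range to turbine (m).
    """
    return pt * gain**2 * wavelength**2 * rcs / ((4 * math.pi)**3 * rng**4)

# X-band marine radar: 25 kW, 30 dB gain, 3.2 cm wavelength,
# assumed turbine RCS of 1e4 m^2 at 5 km range
print(clutter_power(25e3, 10**3.0, 0.032, 1e4, 5e3), "W")
```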
Scoring of medical publications with SIGAPS software: Application to orthopedics.
Rouvillain, J-L; Derancourt, C; Moore, N; Devos, P
2014-11-01
SIGAPS is a bibliometric software tool developed in France to identify and analyze Medline-indexed publications that are produced by a researcher or research group. This measurement takes into account the author's ranking on the paper along with the journal's prestige according to its impact factor within the research field. However, use of this impact factor is the primary limitation of SIGAPS. SIGAPS analysis results are used to assign a financial value to hospital facilities. The impact of the journal Revue de Chirurgie Orthopédique and its successor, Orthopaedics & Traumatology: Surgery & Research, was compared using the Medline-based ISI (SIGAPS) and SCOPUS-based SCImago journal rankings.
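The scoring scheme combines journal category points with an author-position weight. The point values below follow the commonly cited SIGAPS scheme but should be verified against current SIGAPS documentation before being relied upon:

```python
# Hedged sketch of a SIGAPS-style score: journal-category points weighted
# by author position. Values are the commonly cited ones, not authoritative.
CATEGORY_POINTS = {"A": 8, "B": 6, "C": 4, "D": 3, "E": 2, "NC": 1}
POSITION_WEIGHT = {"first": 4, "last": 4, "second": 3, "third": 2, "other": 1}

def sigaps_score(category, position):
    """Score one publication: category points times position weight."""
    return CATEGORY_POINTS[category] * POSITION_WEIGHT[position]

print(sigaps_score("A", "first"))  # 32 points for first author, category A
```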
Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun
2018-01-01
To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis information in real time. Descriptive statistics, time-series analysis, and multivariate regression analysis were implemented online on top of the database software, using SQL and visual tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each data component online, with interactive connections to the database; and generates export tables that can be loaded directly into R, SAS, and SPSS. The information system for air pollution and health impact monitoring implements the online statistical analysis function and can provide real-time analysis results to its users.
Study of fault-tolerant software technology
NASA Technical Reports Server (NTRS)
Slivinski, T.; Broglio, C.; Wild, C.; Goldberg, J.; Levitt, K.; Hitt, E.; Webb, J.
1984-01-01
Presented is an overview of the current state of the art of fault-tolerant software and an analysis of quantitative techniques and models developed to assess its impact. It examines research efforts as well as experience gained from commercial application of these techniques. The paper also addresses the computer architecture and design implications on hardware, operating systems and programming languages (including Ada) of using fault-tolerant software in real-time aerospace applications. It concludes that fault-tolerant software has progressed beyond the pure research state. The paper also finds that, although not perfectly matched, newer architectural and language capabilities provide many of the notations and functions needed to effectively and efficiently implement software fault-tolerance.
[Finite Element Analysis of Intravascular Stent Based on ANSYS Software].
Shi, Gengqiang; Song, Xiaobing
2015-10-01
This paper adopted UG 8.0 to build the stent and blood vessel models. The models were then imported into the finite element analysis software ANSYS. The simulation results of the ANSYS software showed that after endothelial stent implantation, the velocity of the blood was slow and the fluctuation of velocity was small, which meant the flow was relatively stable. When blood flowed through the endothelial stent, the pressure gradually decreased, and the range of the pressure was not wide. The endothelial shear stress remained basically unchanged. In general, it can be concluded that endothelial stents have little impact on the flow of blood and can fully realize their function.
Software reliability experiments data analysis and investigation
NASA Technical Reports Server (NTRS)
Walker, J. Leslie; Caglayan, Alper K.
1991-01-01
The objectives are to investigate the fundamental reasons which cause independently developed software programs to fail dependently, and to examine fault tolerant software structures which maximize reliability gain in the presence of such dependent failure behavior. The authors used 20 redundant programs from a software reliability experiment to analyze the software errors causing coincident failures, to compare the reliability of N-version and recovery block structures composed of these programs, and to examine the impact of diversity on software reliability using subpopulations of these programs. The results indicate that both conceptually related and unrelated errors can cause coincident failures and that recovery block structures offer more reliability gain than N-version structures if acceptance checks that fail independently from the software components are available. The authors present a theory of general program checkers that have potential application for acceptance tests.
Computing in Qualitative Analysis: A Healthy Development?
ERIC Educational Resources Information Center
Richards, Lyn; Richards, Tom
1991-01-01
Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative data analysis software package. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…
Towards understanding software: 15 years in the SEL
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Pajerski, Rose
1990-01-01
For 15 years, the Software Engineering Laboratory (SEL) at GSFC has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software, and software processes within a production software environment. The SEL comprises three major organizations: (1) the GSFC Flight Dynamics Division; (2) the University of Maryland Computer Science Department; and (3) the Computer Sciences Corporation Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents: all describing some aspect of the software engineering technology that has undergone analysis in the flight dynamics environment. The studies range from small controlled experiments (such as analyzing the effectiveness of code reading versus functional testing) to large, multiple-project studies (such as assessing the impacts of Ada on a production environment). The key findings that NASA feels have laid the foundation for ongoing and future software development and research activities are summarized.
A computer-guided minimally-invasive technique for orthodontic forced eruption of impacted canines.
Bertelè, Matteo; Minniti, Paola P; Dalessandri, Domenico; Bonetti, Stefano; Visconti, Luca; Paganelli, Corrado
2016-06-01
The aim of this study was to develop a computer-guided minimally-invasive protocol for the surgical application of an orthodontic traction during the forced eruption of an impacted canine. 3Diagnosys® software was used to evaluate the position of impacted canines and to plan the surgical access, taking into account soft and hard tissue thickness, the orthodontic traction path and the presence of possible obstacles. Geomagic® software was used for reverse engineering, and RhinocerosTM software was employed as a three-dimensional modeller in preparing individualized surgical guides. Surgical access was gained flapless through the use of a mucosal punch for soft tissues, followed by a trephine bur with a pre-adjusted stop for bone path creation. A diamond bur mounted on a SONICflex® 2003/L handpiece was used to prepare a 2-mm-deep calibrated hole in the canine enamel, into which a titanium screw connected with a stainless steel ligature was screwed. In vitro pull-out tests and radiological and SEM analyses were performed in order to investigate screw stability and position. In two out of ten samples the screw was removed after the application of a 1-kg pull-out force. Radiological and SEM analyses demonstrated that all the screws were inserted into the enamel without affecting dentine integrity. This computer-guided minimally-invasive technique allowed a precise and reliable positioning of the screws utilized during the orthodontic traction of impacted canines.
NASA Astrophysics Data System (ADS)
Yousif, Dilon
The purpose of this study was to improve the Quality Assurance (QA) system at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of the in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiBQ) developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of the 3XX Al alloy(s) using IRC techniques. The impact of low melting point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).
NASA Technical Reports Server (NTRS)
Flanders, J. H.; Helmers, C. T.; Stanten, S. F.
1973-01-01
The relationship is examined between the space shuttle onboard avionics and the ground test computer language GOAL when used in the onboard computers. The study is aimed at providing system analysis support to the feasibility analysis of a GOAL to HAL translator, where HAL is the language used to program the onboard computers for flight. The subject is dealt with in three aspects. First, the system configuration at checkout, the general checkout and launch sequences, and the inventory of subsystems are described. Secondly, the hierarchic organization of onboard software and different ways of introducing GOAL-derived software onboard are described. Also the flow of commands and test data during checkout is diagrammed. Finally, possible impact of error detection and redundancy management on the GOAL language is discussed.
Impact of Computer Software on Advertising Media Courses: A Study of 71 Colleges and Universities.
ERIC Educational Resources Information Center
Martin, Dennis G.; Vaccaro-Lloyd, Carla
A study evaluated the impact of microcomputers and software in advertising media courses by surveying media planning instructors in 71 (out of 111) colleges and universities. The survey asked instructors to evaluate textbooks, workbooks, and computer software; to assess the impact of software on the media course; and to compare software users with…
Impacts of Technological Changes in the Cyber Environment on Software/Systems Engineering
2010-04-01
Slide excerpt; only the repeated slide footer and citations (Barry Boehm; Pressman, R.S., Software Engineering: A Practitioner's Approach) are recoverable.
Design and Analysis of Tooth Impact Test Rig for Spur Gear
NASA Astrophysics Data System (ADS)
Ghazali, Wafiuddin Bin Md; Aziz, Ismail Ali Bin Abdul; Daing Idris, Daing Mohamad Nafiz Bin; Ismail, Nurazima Binti; Sofian, Azizul Helmi Bin
2016-02-01
This paper is about the design and analysis of a prototype tooth impact test rig for spur gears. The test rig was fabricated and analysis was conducted to study its limitations and capabilities. The design of the rig is analysed to ensure that no problem will occur during the test and reliable data can be obtained. From the results of the analysis, the maximum load that can be applied, the factor of safety of the machine, and the stresses on the test rig parts were determined. This is important in the design consideration of the test rig. The materials used for the fabrication of the test rig were also discussed and analysed. MSC Nastran/Patran software was used to analyse the model, which was designed using SolidWorks 2014 software.
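The paper's own stress numbers come from FEA, but the classic hand check for sizing a spur-gear tooth load test is the Lewis bending formula, sketched here with illustrative values:

```python
def lewis_bending_stress(w_t, face_width, module, form_factor):
    """Lewis bending stress at a spur gear tooth root (static estimate).

    w_t: tangential tooth load (N); face_width and module in mm;
    form_factor: dimensionless Lewis form factor for the tooth count.
    Returns stress in MPa (N/mm^2).
    """
    return w_t / (face_width * module * form_factor)

# 1 kN tangential load, 25 mm face width, module 3 mm, Y = 0.32
print(lewis_bending_stress(1000.0, 25.0, 3.0, 0.32), "MPa")  # ~41.7 MPa
```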
NASA Astrophysics Data System (ADS)
Fabianová, Jana; Kačmáry, Peter; Molnár, Vieroslav; Michalik, Peter
2016-10-01
Forecasting is one of the logistics activities, and a sales forecast is the starting point for the elaboration of business plans. Forecast accuracy affects the business outcomes and ultimately may significantly affect the economic stability of the company. The accuracy of the prediction depends on the suitability of the forecasting methods used, experience, quality of input data, time period and other factors. The input data are usually not deterministic but are often of a random nature. They are affected by uncertainties of the market environment and many other factors. By taking the input data uncertainty into account, the forecast error can be reduced. This article deals with the use of a software tool for incorporating data uncertainty into forecasting. A forecasting approach is proposed, and the impact of uncertain input parameters on the target forecast value is simulated in a case study model. Statistical analysis and risk analysis of the forecast results are carried out, including sensitivity analysis and variables impact analysis.
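The case study's tool and distributions are not specified in the abstract; the underlying idea, propagating uncertain inputs through the forecast by Monte Carlo sampling and reporting percentile bands, can be sketched with hypothetical inputs:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Hypothetical uncertain inputs: demand growth rate and seasonal factor.
growth = rng.normal(loc=0.04, scale=0.015, size=n)       # yearly growth
season = rng.triangular(0.9, 1.0, 1.15, size=n)          # seasonal multiplier
base_sales = 12_000                                      # units last year

forecast = base_sales * (1 + growth) * season
lo, med, hi = np.percentile(forecast, [5, 50, 95])
print(f"P5={lo:.0f}, median={med:.0f}, P95={hi:.0f} units")
```

Sensitivity analysis then follows by varying one input distribution at a time and observing the change in the output percentiles.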
Simulation of Attacks for Security in Wireless Sensor Network.
Diaz, Alvaro; Sanchez, Pablo
2016-11-18
The increasing complexity and low-power constraints of current Wireless Sensor Networks (WSN) require efficient methodologies for network simulation and embedded software performance analysis of nodes. In addition, security is also a very important feature that has to be addressed in most WSNs, since they may work with sensitive data and operate in hostile unattended environments. In this paper, a methodology for security analysis of Wireless Sensor Networks is presented. The methodology allows designing attack-aware embedded software/firmware or attack countermeasures to provide security in WSNs. The proposed methodology includes attacker modeling and attack simulation with performance analysis (node's software execution time and power consumption estimation). After an analysis of different WSN attack types, an attacker model is proposed. This model defines three different types of attackers that can emulate most WSN attacks. In addition, this paper presents a virtual platform that is able to model the node hardware, embedded software and basic wireless channel features. This virtual simulation analyzes the embedded software behavior and node power consumption while it takes into account the network deployment and topology. Additionally, this simulator integrates the previously mentioned attacker model. Thus, the impact of attacks on power consumption and software behavior/execution-time can be analyzed. This provides developers with essential information about the effects that one or multiple attacks could have on the network, helping them to develop more secure WSN systems. This WSN attack simulator is an essential element of the attack-aware embedded software development methodology that is also introduced in this work.
William F. Lazarus; Douglas G. Tiffany; Ronald S. Zalesny Jr.; Don E. Riemenschneider
2011-01-01
Short-rotation woody crops (SRWC) such as hybrid poplars are becoming increasingly competitive with agriculture on marginal land. The trees can be grown for energy and for traditional uses such as oriented strandboard. Using IMPLAN (Impact Analysis for Planning) software, we modeled the impacts of shifting land use from hay and pasture for cow-calf beef operations to...
Modeling and analysis of visual digital impact model for a Chinese human thorax.
Zhu, Jin; Wang, Kai-Ming; Li, Shu; Liu, Hai-Yan; Jing, Xiao; Li, Xiao-Fang; Liu, Yi-He
2017-01-01
To establish a three-dimensional finite element model of the human chest for engineering research on individual protection. Computed tomography (CT) scanning data were used for three-dimensional reconstruction with the medical image reconstruction software Mimics. The finite element method (FEM) preprocessing software ANSYS ICEM CFD was used for cell mesh generation, and the relevant material behavior parameters of all of the model's parts were specified. The finite element model was constructed with the FEM software, and the model availability was verified based on previous cadaver experimental data. A finite element model approximating the anatomical structure of the human chest was established, and the model's simulation results conformed to the results of the cadaver experiment overall. Segment data of the human body and specialized software can be utilized for FEM model reconstruction to satisfy the need for numerical analysis of shocks to the human chest in engineering research on body mechanics.
A coverage and slicing dependencies analysis for seeking software security defects.
He, Hui; Zhang, Dongyan; Liu, Min; Zhang, Weizhe; Gao, Dongmin
2014-01-01
Software security defects have a serious impact on software quality and reliability. A software system with security flaws is a major hidden danger to the operation of any system. As the scale of software increases, its vulnerabilities become much more difficult to find, and once they are exploited they may lead to great losses. In this situation, the concept of Software Assurance was put forward by experts, and automated fault localization is one part of Software Assurance research. Current automated fault localization methods include coverage-based fault localization (CBFL) and program slicing. Each method has its own advantages and shortcomings in locating faults. In this paper, we put forward a new method, named the Reverse Data Dependence Analysis Model, which integrates the two methods by analyzing the program structure. On this basis, we propose a new automated fault localization method that is not only fully automated and lossless but also narrows the basic localization unit to a single statement, which makes localization more accurate. Several experiments show that our method is more effective, and we further analyze the effectiveness of existing methods on different faults.
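For illustration only, here is a minimal sketch of the integration idea: a coverage-based suspiciousness metric (Ochiai, a standard CBFL formula, not necessarily the one used by the authors) restricted to statements in a data-dependence set. All coverage counts and the dependence set are invented:

```python
import math

# Toy hybrid fault localization: Ochiai coverage ranking restricted to a
# data-dependence slice. Coverage rows and the dependence set are invented.
coverage = {   # statement -> (failing runs that executed it, passing runs)
    "s1": (4, 10), "s2": (4, 2), "s3": (1, 9), "s4": (4, 0),
}
total_fail = 4
reverse_data_deps = {"s2", "s4"}   # statements the faulty output depends on

def ochiai(ef, ep):
    denom = math.sqrt(total_fail * (ef + ep))
    return ef / denom if denom else 0.0

ranking = sorted(
    ((stmt, ochiai(ef, ep)) for stmt, (ef, ep) in coverage.items()
     if stmt in reverse_data_deps),   # slice filter narrows the candidates
    key=lambda kv: kv[1], reverse=True)
for stmt, score in ranking:
    print(stmt, round(score, 3))
```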
ERIC Educational Resources Information Center
Mobray, Deborah, Ed.
Papers on local area networks (LANs), modelling techniques, software improvement, capacity planning, software engineering, microcomputers and end user computing, cost accounting and chargeback, configuration and performance management, and benchmarking presented at this conference include: (1) "Theoretical Performance Analysis of Virtual…
Open cyberGIS software for geospatial research and education in the big data era
NASA Astrophysics Data System (ADS)
Wang, Shaowen; Liu, Yan; Padmanabhan, Anand
CyberGIS represents an interdisciplinary field combining advanced cyberinfrastructure, geographic information science and systems (GIS), spatial analysis and modeling, and a number of geospatial domains to improve research productivity and enable scientific breakthroughs. It has emerged as a new-generation GIS that enables unprecedented advances in data-driven knowledge discovery, visualization and visual analytics, and collaborative problem solving and decision-making. This paper describes three open software strategies (open access, open source, and open integration) that serve the research and education purposes of diverse geospatial communities. These strategies have been implemented in a leading-edge cyberGIS software environment through three corresponding software modalities: CyberGIS Gateway, Toolkit, and Middleware, and have achieved broad and significant impacts.
High Performance Computing Software Applications for Space Situational Awareness
NASA Astrophysics Data System (ADS)
Giuliano, C.; Schumacher, P.; Matson, C.; Chun, F.; Duncan, B.; Borelli, K.; Desonia, R.; Gusciora, G.; Roe, K.
The High Performance Computing Software Applications Institute for Space Situational Awareness (HSAI-SSA) has completed its first full year of applications development. The emphasis of our work in this first year was on improving space surveillance sensor models and image enhancement software. These applications are the Space Surveillance Network Analysis Model (SSNAM), the Air Force Space Fence simulation (SimFence), and the physically constrained iterative deconvolution (PCID) image enhancement software tool. Specifically, we have demonstrated an order-of-magnitude speed-up in those codes running on the latest Cray XD-1 Linux supercomputer (Hoku) at the Maui High Performance Computing Center. The software application improvements that HSAI-SSA has made have had a significant impact on the warfighter and have fundamentally changed the role of high performance computing in SSA.
A Ballistic Limit Analysis Program for Shielding Against Micrometeoroids and Orbital Debris
NASA Technical Reports Server (NTRS)
Ryan, Shannon; Christiansen, Erie
2010-01-01
A software program has been developed that enables the user to quickly and simply perform ballistic limit calculations for common spacecraft structures that are subject to hypervelocity impact of micrometeoroid and orbital debris (MMOD) projectiles. This analysis program consists of two core modules: design and performance. The design module enables a user to calculate preliminary dimensions of a shield configuration (e.g., thicknesses/areal densities, spacing, etc.) for a "design" particle (diameter, density, impact velocity, incidence). The performance module enables a more detailed shielding analysis, providing the performance of a user-defined shielding configuration over the range of relevant in-orbit impact conditions.
DOT National Transportation Integrated Search
1987-11-01
Performance requirements are developed which define the kinematic and kinetic response of the head for a seated subject exposed to frontal, lateral or oblique impact. Response is expressed in terms of variables which are readily measured in an anthro...
"Library Quarterly," 1956-2004: An Exploratory Bibliometric Analysis
ERIC Educational Resources Information Center
Young, Arthur P.
2006-01-01
"Library Quarterly's" seventy-fifth anniversary invites an analysis of the journal's bibliometric dimension, including contributor attributes, various author rankings, and citation impact. Eugene Garfield's HistCite software, linked to Thomson Scientific's Web of Science, as made available by Garfield, for the period 1956-2004, was used as the…
Large Smoke Plumes, Alberta Canada
Atmospheric Science Data Center
2016-12-30
... has adverse impacts on human health. These data were acquired during Terra orbit 87148. The stereoscopic analysis was ... software tool, which is publicly available through the Open Channel Foundation at: https://www.openchannelsoftware.com/projects/MINX ...
OAO-C Copernicus Operations Report
NASA Technical Reports Server (NTRS)
1981-01-01
An analysis of the operation of OAO 3, Copernicus, the orbiting astronomical observatory, is given with particular emphasis upon the Princeton Experiment Package. Malfunctions and their impact are discussed, as are orbital observations and operations. Software is described.
Impacts of software and its engineering on the carbon footprint of ICT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kern, Eva, E-mail: e.kern@umwelt-campus.de; Dick, Markus, E-mail: sustainablesoftwareblog@gmail.com; Naumann, Stefan, E-mail: s.naumann@umwelt-campus.de
2015-04-15
The energy consumption of information and communication technology (ICT) is still increasing. Even though several solutions regarding the hardware side of Green IT exist, the software contribution to Green IT is not well investigated. The carbon footprint is one way to rate the environmental impacts of ICT. In order to get an impression of the induced CO2 emissions of software, we will present a calculation method for the carbon footprint of a software product over its life cycle. We also offer an approach on how to integrate some aspects of carbon footprint calculation into software development processes and discuss impacts and tools regarding this calculation method. We thus show the relevance of energy measurements and the attention to impacts on the carbon footprint by software within Green Software Engineering.
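As a purely illustrative sketch of the life-cycle idea (the paper's actual calculation method is not reproduced here; the phase energy figures and grid emission factor are invented):

```python
# Back-of-envelope life-cycle CO2 sketch for a software product.
# Phase energy figures and the grid emission factor are illustrative only.
EMISSION_FACTOR_KG_PER_KWH = 0.4   # assumed grid mix

phases_kwh = {
    "development (build servers, CI)": 1200.0,
    "distribution (downloads, media)":  150.0,
    "usage (10k users x 3 years)":    45000.0,
    "end of life (data migration)":      30.0,
}

total = 0.0
for phase, kwh in phases_kwh.items():
    co2 = kwh * EMISSION_FACTOR_KG_PER_KWH
    total += co2
    print(f"{phase:35s} {co2:10.1f} kg CO2")
print(f"{'life-cycle total':35s} {total:10.1f} kg CO2")
```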
Sensitivity of surface meteorological analyses to observation networks
NASA Astrophysics Data System (ADS)
Tyndall, Daniel Paul
A computationally efficient variational analysis system for two-dimensional meteorological fields is developed and described. This analysis approach is most efficient when the number of analysis grid points is much larger than the number of available observations, such as for large-domain mesoscale analyses. The analysis system is developed using MATLAB software and can take advantage of multiple processors or processor cores. A version of the analysis system has been exported as a platform-independent application (i.e., it can be run on Windows, Linux, or Macintosh OS X desktop computers without a MATLAB license) with input/output operations handled by commonly available internet software combined with data archives at the University of Utah. The impact of observation networks on the meteorological analyses is assessed by utilizing a percentile ranking of individual observation sensitivity and impact, which is computed by using the adjoint of the variational surface assimilation system. This methodology is demonstrated using a case study of the analysis from 1400 UTC 27 October 2010 over the entire contiguous United States domain. The sensitivity of this approach to the dependence of the background error covariance on observation density is examined. Observation sensitivity and impact provide insight into the influence of observations from heterogeneous observing networks, and serve as objective metrics for quality control procedures that may help to identify stations with significant siting, reporting, or representativeness issues.
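A minimal sketch of the underlying idea, assuming a toy one-dimensional optimal-interpolation analysis with synthetic covariances and observations (not the author's MATLAB system): the adjoint of the analysis operator maps a functional's gradient back onto individual observations, giving per-observation sensitivity and impact.

```python
import numpy as np

# Minimal 1-D optimal-interpolation analysis and its adjoint, illustrating how
# observation impact can be measured; grid, covariances and obs are synthetic.
n, p = 50, 5
rng = np.random.default_rng(1)
xb = np.zeros(n)                                   # background state
H = np.zeros((p, n)); H[np.arange(p), rng.choice(n, p, False)] = 1.0
dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
B = np.exp(-(dist / 5.0) ** 2)                     # background error covariance
R = 0.25 * np.eye(p)                               # observation error covariance
y = rng.normal(1.0, 0.5, p)                        # observations

K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)       # gain matrix
xa = xb + K @ (y - H @ xb)                         # analysis update

# Functional J = mean of analysis; adjoint sensitivity dJ/dy = K^T dJ/dxa.
dJ_dxa = np.full(n, 1.0 / n)
dJ_dy = K.T @ dJ_dxa                               # per-observation sensitivity
impact = dJ_dy * (y - H @ xb)                      # sensitivity times innovation
for i, (s, im) in enumerate(zip(dJ_dy, impact)):
    print(f"obs {i}: sensitivity={s:+.4f} impact={im:+.4f}")
```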
Mei, Lin; He, Lin; Song, Yuhua; Lv, Yang; Zhang, Lijiu; Hao, Fengxi; Xu, Mengmeng
2018-05-01
To investigate the relationship between obesity and disease-free survival (DFS) and overall survival (OS) in triple-negative breast cancer. Citations were searched in PubMed, the Cochrane Library, and Web of Science. A random-effects meta-analysis was conducted using RevMan software version 5.0, and publication bias was evaluated with Egger's regression in Stata software version 12. Nine studies (4412 patients) were included in the DFS meta-analysis, and 8 studies (4392 patients) were included in the OS meta-analysis. There was no statistically significant association between obesity and either DFS (P = .60) or OS (P = .71) in triple-negative breast cancer (TNBC) patients. Obesity has no impact on DFS and OS in patients with TNBC.
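For illustration, a minimal random-effects pooling sketch using the standard DerSimonian-Laird estimator on invented study effects (not the data analyzed in this paper):

```python
import math

# DerSimonian-Laird random-effects pooling on invented per-study
# log hazard ratios and standard errors (illustrative only).
studies = [(0.10, 0.20), (-0.05, 0.15), (0.20, 0.30), (0.02, 0.10)]

yi = [y for y, _ in studies]
wi = [1 / se**2 for _, se in studies]                       # fixed-effect weights
ybar = sum(w * y for w, y in zip(wi, yi)) / sum(wi)
Q = sum(w * (y - ybar)**2 for w, y in zip(wi, yi))          # heterogeneity statistic
df = len(studies) - 1
C = sum(wi) - sum(w**2 for w in wi) / sum(wi)
tau2 = max(0.0, (Q - df) / C)                               # between-study variance

wstar = [1 / (se**2 + tau2) for _, se in studies]           # random-effects weights
pooled = sum(w * y for w, y in zip(wstar, yi)) / sum(wstar)
se_pooled = math.sqrt(1 / sum(wstar))
print(f"tau^2={tau2:.4f} pooled={pooled:.3f} "
      f"95% CI=({pooled - 1.96*se_pooled:.3f}, {pooled + 1.96*se_pooled:.3f})")
```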
NASA Technical Reports Server (NTRS)
Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David
1990-01-01
This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.
Improving Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-02-01
New test procedure evaluates quality and accuracy of energy analysis tools for the residential building retrofit market. Reducing the energy use of existing homes in the United States offers significant energy-saving opportunities, which can be identified through building simulation software tools that calculate optimal packages of efficiency measures. To improve the accuracy of energy analysis for residential buildings, the National Renewable Energy Laboratory's (NREL) Buildings Research team developed the Building Energy Simulation Test for Existing Homes (BESTEST-EX), a method for diagnosing and correcting errors in building energy audit software and calibration procedures. BESTEST-EX consists of building physics and utility bill calibration test cases, which software developers can use to compare their tools' simulation findings to reference results generated with state-of-the-art simulation tools. Overall, the BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX is helping software developers identify and correct bugs in their software, as well as develop and test utility bill calibration procedures.
Selecting information technology for physicians' practices: a cross-sectional study.
Eden, Karen Beekman
2002-04-05
Many physicians are transitioning from paper to electronic formats for billing, scheduling, medical charts, communications, etc. The primary objective of this research was to identify the relationship (if any) between the software selection process and the office staff's perceptions of the software's impact on practice activities. A telephone survey was conducted with office representatives of 407 physician practices in Oregon who had purchased information technology. The respondents, usually office managers, answered scripted questions about their selection process and their perceptions of the software after implementation. Multiple logistic regression revealed that software type, selection steps, and certain factors influencing the purchase were related to whether the respondents felt the software improved the scheduling and financial analysis practice activities. Specifically, practices that selected electronic medical record or practice management software, that made software comparisons, or that considered prior user testimony as important were more likely to have perceived improvements in the scheduling process than were other practices. Practices that considered value important, that did not consider compatibility important, that selected managed care software, that spent less than 10,000 dollars, or that provided learning time (most dramatic increase in odds ratio, 8.2) during implementation were more likely to perceive that the software had improved the financial analysis process than were other practices. Perhaps one of the most important predictors of improvement was providing learning time during implementation, particularly when the software involves several practice activities. Despite this importance, less than half of the practices reported performing this step.
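A hedged sketch of the kind of analysis reported here: fitting a multiple logistic regression on survey predictors and reading off odds ratios. The variable names, coefficients, and data below are all synthetic inventions, not the study's data:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic survey: binary predictors of perceived improvement.
rng = np.random.default_rng(0)
n = 400
learning_time = rng.integers(0, 2, n)      # provided learning time?
compared_sw   = rng.integers(0, 2, n)      # made software comparisons?
spend_low     = rng.integers(0, 2, n)      # spent under $10,000?

# Generate an outcome with a strong learning-time effect (cf. OR ~ 8 above).
logit = -1.0 + 2.1*learning_time + 0.6*compared_sw + 0.4*spend_low
improved = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(np.column_stack([learning_time, compared_sw, spend_low]))
res = sm.Logit(improved.astype(float), X).fit(disp=False)
for name, beta in zip(["const", "learning_time", "compared_sw", "spend_low"],
                      res.params):
    print(f"{name:14s} OR={np.exp(beta):5.2f}")
```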
Seizing the Digital High Ground: Military Operations and Politics in the Social Media Era
2015-04-13
divided on the relative threats and opportunities. Through the analysis of social media's technological evolution, its impact on crowd behaviour, and using case studies of the Arab Spring and Islamic... arguments and recommendations proposed. Web 2.0 is a term used to describe the way in which software developers and end-users utilize the World Wide Web as
NASA Technical Reports Server (NTRS)
Mizell, Carolyn Barrett; Malone, Linda
2007-01-01
The development process for a large software development project is very complex and dependent on many variables that are dynamic and interrelated. Factors such as size, productivity, and defect injection rates will have a substantial impact on the project in terms of cost and schedule. These factors can be affected by the intricacies of the process itself as well as by human behavior, because the process is very labor intensive. The complex nature of the development process can be investigated with software development process models that utilize discrete event simulation to analyze the effects of process changes. The organizational environment and its effects on the workforce can be analyzed with system dynamics, which utilizes continuous simulation. Each has unique strengths, and the benefits of both types can be exploited by combining a system dynamics model and a discrete event process model. This paper demonstrates how the two types of models can be combined to investigate the impacts of human resource interactions on productivity and ultimately on cost and schedule.
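As a rough illustration of coupling a continuous (system-dynamics-style) state with discrete events (all rates and task sizes invented; a real study would use dedicated simulation packages):

```python
# Toy hybrid simulation: discrete task completions whose durations depend on a
# continuously evolving productivity level (a crude system-dynamics stock).
tasks = [40.0] * 10          # work units per task (invented)
productivity = 1.0           # work units per day (continuous state)
fatigue_rate = 0.002         # productivity decay per day under pressure
t, dt = 0.0, 0.1             # simulation clock (days) and time step

for i, work in enumerate(tasks):
    remaining = work
    while remaining > 0:
        # Continuous part: productivity erodes, floored at a minimum level.
        productivity = max(0.3, productivity - fatigue_rate * dt)
        # Discrete part: burn down the current task at the current rate.
        remaining -= productivity * dt
        t += dt
    print(f"task {i} done at day {t:6.1f} (productivity {productivity:.2f})")
```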
Reliability analysis of different structure parameters of PCBA under drop impact
NASA Astrophysics Data System (ADS)
Liu, P. S.; Fan, G. M.; Liu, Y. H.
2018-03-01
The drop-impact behavior of a printed circuit board assembly (PCBA) is modelled with the finite element analysis software ABAQUS. First, the Input-G method and fatigue life under drop impact are introduced, and the mechanism of solder joint failure during a drop is analysed. The main cause of solder joint failure is that the PCB component undergoes repeated tension and compression stress during the drop impact. Finally, the equivalent stress and peel stress of different solder joints and board-level components under different impact accelerations are analysed. The results show that the reliability of tin-silver-copper joints is better than that of tin-lead solder joints, and that the expected fatigue life of a solder joint decreases as the impact pulse amplitude increases.
Kobayashi, Masanao; Asada, Yasuki; Matsubara, Kosuke; Suzuki, Shouichi; Matsunaga, Yuta; Haba, Tomonobu; Kawaguchi, Ai; Daioku, Tomihiko; Toyama, Hiroshi; Kato, Ryoichi
2017-05-01
Adequate dose management during computed tomography is important. In the present study, the dosimetric application software ImPACT was extended with a calculator for the size-specific dose estimate and with scan settings for the auto exposure control (AEC) technique. This study aimed to assess the practicality and accuracy of the modified ImPACT software for dose estimation. We compared the conversion factors identified by the software with the values reported by the American Association of Physicists in Medicine Task Group 204, and we noted similar results. Moreover, doses were calculated with the AEC technique and a fixed tube current of 200 mA for the chest-pelvis region. The modified ImPACT software could estimate each organ dose based on the modulated tube current. The ability to perform beneficial modifications indicates the flexibility of the ImPACT software, which can be further modified for estimation of other doses. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
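For illustration, a size-specific dose estimate (SSDE) calculation in the spirit of AAPM TG-204: CTDIvol scaled by a conversion factor that depends on the patient's effective diameter. The exponential-fit coefficients below are the ones commonly quoted for the 32 cm phantom, but they should be verified against the report before any real use:

```python
import math

# SSDE sketch: CTDIvol times a size-dependent conversion factor.
# Coefficients are the commonly quoted TG-204 fit for the 32 cm phantom;
# verify against the report before relying on the numbers.
def ssde(ctdi_vol_mgy, effective_diameter_cm):
    f = 3.704369 * math.exp(-0.03671937 * effective_diameter_cm)
    return ctdi_vol_mgy * f

for d in (20, 25, 30, 35):
    print(f"diameter {d} cm: SSDE = {ssde(10.0, d):.1f} mGy (CTDIvol 10 mGy)")
```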
Project IMPACT Software Documentation: Overview of the Computer-Administered Instruction Subsystem.
ERIC Educational Resources Information Center
Stelzer, John; Garneau, Jean
Research in Project IMPACT, prototypes of computerized training for Army personnel, is documented in an overview of the IMPACT computer software system for computer-administered instruction, exclusive of instructional software. The overview description provides a basis for an understanding of the rationale and motivation for the development of the…
Impact of Domain Analysis on Reuse Methods
1989-11-06
return on the investment. The potential negative effects a "bad" domain analysis has on developing systems in the domain also increases the risks of a... importance of domain analysis as part of a software reuse program. A particular goal is to assist in avoiding the potential negative effects of ad hoc or... are specification objects discovered by performing object-oriented analysis. Object-based analysis approaches thus serve to capture a model of reality
Predictive Software Cost Model Study. Volume II. Software Package Detailed Data.
1980-06-01
will not be limited to: a. ASN-91 NWDS Computer; b. Armament System Control Unit (ASCU); c. AN/ASN-90 IMS. 6. CONFIGURATION CONTROL. OFP/OTP... planned approach. 3. Detailed analysis and study; impacts on hardware, manuals, data, AGE, etc.; alternatives with pros and cons; cost estimates; ECP...
Impact of Requirements Quality on Project Success or Failure
NASA Astrophysics Data System (ADS)
Tamai, Tetsuo; Kamata, Mayumi Itakura
We are interested in the relationship between the quality of the requirements specifications for software projects and the subsequent outcome of the projects. To examine this relationship, we investigated 32 projects started and completed between 2003 and 2005 by the software development division of a large company in Tokyo. The company has collected reliable data on requirements specification quality, as evaluated by software quality assurance teams, and overall project performance data relating to cost and time overruns. The data for requirements specification quality were first converted into a multi-dimensional space, with each dimension corresponding to an item of the recommended structure for software requirements specifications (SRS) defined in IEEE Std. 830-1998. We then applied various statistical analysis methods to the SRS quality data and project outcomes.
Consolidated View on Space Software Engineering Problems - An Empirical Study
NASA Astrophysics Data System (ADS)
Silva, N.; Vieira, M.; Ricci, D.; Cotroneo, D.
2015-09-01
Independent software verification and validation (ISVV) has been a key process for engineering quality assessment for decades and is considered in several international standards. The "European Space Agency (ESA) ISVV Guide" is used in the European space market to drive the ISVV tasks and plans, and to select applicable tasks and techniques. Software artefacts have room for improvement, as shown by the number of issues found during ISVV tasks. This article presents an analysis of a large set of ISVV issues originating from three different ESA missions, amounting to more than 1000 issues. The study presents the main types, triggers and impacts of the ISVV issues found and sets the path for a global software engineering improvement based on the most common deficiencies identified in space projects.
Data Analysis Tools and Methods for Improving the Interaction Design in E-Learning
ERIC Educational Resources Information Center
Popescu, Paul Stefan
2015-01-01
In this digital era, learning from data gathered from different software systems may have a great impact on the quality of the interaction experience. There are two main directions that come to enhance this emerging research domain, Intelligent Data Analysis (IDA) and Human Computer Interaction (HCI). HCI specific research methodologies can be…
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Leary, Patrick
The primary challenge motivating this project is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who can perform analysis only on a small fraction of the data they calculate, resulting in the substantial likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, which is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by Department of Energy (DOE) science projects. Our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE high-performance computing (HPC) facilities, though we expected to have an impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve this objective, we engaged in software technology research and development (R&D), in close partnerships with DOE science code teams, to produce software technologies that were shown to run efficiently at scale on DOE HPC platforms.
Impact of Truck Loading on Design and Analysis of Asphaltic Pavement Structures : Phase II
DOT National Transportation Integrated Search
2011-02-01
In this study, Schapery's nonlinear viscoelastic constitutive model is implemented into the commercial finite element (FE) software ABAQUS via a user-defined subroutine (user material, or UMAT) to analyze asphalt pavement subjected to heavy truck loa...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prowell, Stacy J; Symons, Christopher T
2015-01-01
Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.
Aquatic Acoustic Metrics Interface Utility for Underwater Sound Monitoring and Analysis
Ren, Huiying; Halvorsen, Michele B.; Deng, Zhiqun Daniel; Carlson, Thomas J.
2012-01-01
Fishes and marine mammals may suffer a range of potential effects from exposure to intense underwater sound generated by anthropogenic activities such as pile driving, shipping, sonars, and underwater blasting. Several underwater sound recording (USR) devices have been built to acquire samples of the underwater sound generated by anthropogenic activities. Software becomes indispensable for processing and analyzing the audio files recorded by these USRs. In this paper, we provide a detailed description of a new software package, the Aquatic Acoustic Metrics Interface (AAMI), specifically designed for analysis of underwater sound recordings to provide data in metrics that facilitate evaluation of the potential impacts of the sound on aquatic animals. In addition to the basic functions, such as loading and editing audio files recorded by USRs and batch processing of sound files, the software utilizes recording system calibration data to compute important parameters in physical units. The software also facilitates comparison of the noise sound sample metrics with biological measures such as audiograms of the sensitivity of aquatic animals to the sound, integrating various components into a single analytical frame. The features of the AAMI software are discussed, and several case studies are presented to illustrate its functionality. PMID:22969353
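As an illustration of the kind of metrics such software computes (this is not the AAMI code; the signal and calibration below are synthetic), root-mean-square sound pressure level, sound exposure level, and peak level can be derived from a calibrated pressure time series:

```python
import numpy as np

# Underwater sound metrics from a calibrated pressure series (synthetic tone).
fs = 48_000                                   # sample rate (Hz)
t = np.arange(0, 0.1, 1 / fs)
pressure_pa = 50.0 * np.sin(2*np.pi*500*t)    # pretend pile-strike tone, 50 Pa

p_ref = 1e-6                                  # 1 uPa reference (underwater)
rms = np.sqrt(np.mean(pressure_pa**2))
spl_rms = 20 * np.log10(rms / p_ref)          # dB re 1 uPa
# SEL: time-integrated squared pressure, referenced to 1 uPa^2 * 1 s.
sel = 10 * np.log10(np.sum(pressure_pa**2) / fs / p_ref**2)
peak = 20 * np.log10(np.max(np.abs(pressure_pa)) / p_ref)
print(f"SPL_rms={spl_rms:.1f} dB  SEL={sel:.1f} dB  peak={peak:.1f} dB")
```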
Finite Element Analysis of Folded Airbag in Frontal Impact of Adapted Vehicles for Disabled Drivers
NASA Astrophysics Data System (ADS)
Masiá, J.; Eixerés, B.; Dols, J. F.; Esquerdo, T. V.
2009-11-01
Car control adaptations are used in vehicles to facilitate driving for persons with physical handicaps. This must not reduce the passive safety that is required of the vehicles. To analyze this relationship, the different control adaptations in use are characterized, together with the different passive safety devices that can be mounted in vehicles, for diverse impact cases in order to generate simulation models. The methodology used to generate this simulation consists of a first phase in which the three-dimensional model of the driving position is developed using commercial three-dimensional design software. Once this part is complete, the model is imported into the finite element software in which the mesh is generated. Finally, dynamic simulation software is used to assign the most important characteristics, such as material properties, contact interfaces, gas expansion models, airbag fold types, etc.
Mapping analysis and planning system for the John F. Kennedy Space Center
NASA Technical Reports Server (NTRS)
Hall, C. R.; Barkaszi, M. J.; Provancha, M. J.; Reddick, N. A.; Hinkle, C. R.; Engel, B. A.; Summerfield, B. R.
1994-01-01
Environmental management, impact assessment, research and monitoring are multidisciplinary activities which are ideally suited to incorporate a multi-media approach to environmental problem solving. Geographic information systems (GIS), simulation models, neural networks and expert-system software are some of the advancing technologies being used for data management, query, analysis and display. At the 140,000 acre John F. Kennedy Space Center, the Advanced Software Technology group has been supporting development and implementation of a program that integrates these and other rapidly evolving hardware and software capabilities into a comprehensive Mapping, Analysis and Planning System (MAPS) based in a workstation/local area network environment. An expert-system shell is being developed to link the various databases and to guide users through the numerous stages of a facility siting and environmental assessment. The expert-system shell approach is appealing for its ease of data access by management-level decision makers while maintaining the involvement of the data specialists. This, as well as increased efficiency and accuracy in data analysis and report preparation, can benefit any organization involved in natural resources management.
Ronquillo, Jay G; Zuckerman, Diana M
2017-09-01
Policy Points: Medical software has become an increasingly critical component of health care, yet the regulation of these devices is inconsistent and controversial. No studies of medical devices and software assess the impact on patient safety of the FDA's current regulatory safeguards and new legislative changes to those standards. Our analysis quantifies the impact of software problems in regulated medical devices and indicates that current regulations are necessary but not sufficient for ensuring patient safety by identifying and eliminating dangerous defects in software currently on the market. New legislative changes will further deregulate health IT, reducing safeguards that facilitate the reporting and timely recall of flawed medical software that could harm patients. Medical software has become an increasingly critical component of health care, yet the regulatory landscape for digital health is inconsistent and controversial. To understand which policies might best protect patients, we examined the impact of the US Food and Drug Administration's (FDA's) regulatory safeguards on software-related technologies in recent years and the implications for newly passed legislative changes in regulatory policy. Using FDA databases, we identified all medical devices that were recalled from 2011 through 2015 primarily because of software defects. We counted all software-related recalls for each FDA risk category and evaluated each high-risk and moderate-risk recall of electronic medical records to determine the manufacturer, device classification, submission type, number of units, and product details. A total of 627 software devices (1.4 million units) were subject to recalls, with 12 of these devices (190,596 units) subject to the highest-risk recalls. Eleven of the devices recalled as high risk had entered the market through the FDA review process that does not require evidence of safety or effectiveness, and one device was completely exempt from regulatory review. The largest high-risk recall categories were anesthesiology and general hospital, with one each in cardiovascular and neurology. Five electronic medical record systems (9,347 units) were recalled for software defects classified as posing a moderate risk to patient safety. Software problems in medical devices are not rare and have the potential to negatively influence medical care. Premarket regulation has not captured all the software issues that could harm patients, evidenced by the potentially large number of patients exposed to software products later subject to high-risk and moderate-risk recalls. Provisions of the 21st Century Cures Act that became law in late 2016 will reduce safeguards further. Absent stronger regulations and implementation to create robust risk assessment and adverse event reporting, physicians and their patients are likely to be at risk from medical errors caused by software-related problems in medical devices. © 2017 Milbank Memorial Fund.
Analysis of capacity and traffic operations impacts of the World Trade Bridge in Laredo
DOT National Transportation Integrated Search
2001-07-01
Project 0-1800 pioneered the use of modern micro-simulation software to analyze the complex procedures involved in international border crossings. The animated models simulate the entire southbound commercial traffic flow, starting with U.S. Customs ...
Consuelo Brandeis; Donald G. Hodges
2015-01-01
The analysis in this article provides an update on the southern forest sector economic activity after the downturn experienced in 2008-2009. The analysis was conducted using Impact Analysis for Planning (IMPLAN) software and data sets for 2009 and 2011 and results from the USDA Forest Service Timber Products Output latest survey of primary wood processing mills....
2008-09-01
re-considered for future use in the PCRs. Its reintroduction should be accompanied with more adequate support for selecting appropriate quality... cnr.it/Papers/ODBASE-CONTEXT.pdf. [Goethert 2007] Goethert, Wolf & Goldenson, Dennis. "Implementing CMMI® Measurement & Analysis Using Goal-Driven... 9th Annual Practical Software and Systems Measurement Users' Group Conference. Keystone, Colorado, July 2005. [Monarch 1995] Ira A. Monarch. An
Hierarchical Simulation to Assess Hardware and Software Dependability
NASA Technical Reports Server (NTRS)
Ries, Gregory Lawrence
1997-01-01
This thesis presents a method for conducting hierarchical simulations to assess system hardware and software dependability. The method is intended to model embedded microprocessor systems. A key contribution of the thesis is the idea of using fault dictionaries to propagate fault effects upward from the level of abstraction where a fault model is assumed to the system level where the ultimate impact of the fault is observed. A second important contribution is the analysis of the software behavior under faults as well as the hardware behavior. The simulation method is demonstrated and validated in four case studies analyzing Myrinet, a commercial, high-speed networking system. One key result from the case studies shows that the simulation method predicts the same fault impact 87.5% of the time as is obtained by similar fault injections into a real Myrinet system. Reasons for the remaining discrepancy are examined in the thesis. A second key result shows the reduction in the number of simulations needed due to the fault dictionary method. In one case study, 500 faults were injected at the chip level, but only 255 propagated to the system level. Of these 255 faults, 110 shared identical fault dictionary entries at the system level and so did not need to be resimulated. The necessary number of system-level simulations was therefore reduced from 500 to 145. Finally, the case studies show how the simulation method can be used to improve the dependability of the target system. The simulation analysis was used to add recovery to the target software for the most common fault propagation mechanisms that would cause the software to hang. After the modification, the number of hangs was reduced by 60% for fault injections into the real system.
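A toy sketch of the fault-dictionary idea described here (the fault list and signatures are invented): faults whose system-level dictionary entries coincide need only one representative simulation, mirroring the 500-to-145 reduction reported above.

```python
# Toy fault-dictionary propagation: chip-level faults are classified once, and
# faults sharing a system-level signature need no re-simulation.
chip_faults = {f"f{i}": ("hang" if i % 5 == 0 else
                         "bad_crc" if i % 3 == 0 else "masked")
               for i in range(20)}

# Keep only faults that propagate past the chip level.
propagated = {f: sig for f, sig in chip_faults.items() if sig != "masked"}

# Group by system-level signature: one representative simulation per entry.
dictionary = {}
for fault, sig in propagated.items():
    dictionary.setdefault(sig, []).append(fault)

print(f"{len(chip_faults)} injected, {len(propagated)} propagated, "
      f"{len(dictionary)} system-level simulations needed")
for sig, faults in dictionary.items():
    print(f"  {sig}: {faults}")
```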
Extended Testability Analysis Tool
NASA Technical Reports Server (NTRS)
Melcher, Kevin; Maul, William A.; Fulton, Christopher
2012-01-01
The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.
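Purely as an illustration of the analyses behind such reports (not the ETA Tool's implementation; the dependency matrix below is invented), detectability, failure-mode ambiguity, and sensor-loss sensitivity can all be read off a failure-mode-by-test matrix:

```python
import itertools

# Rows are failure modes, columns are tests (1 = test detects the mode).
tests = ["t1", "t2", "t3"]
D = {"fm1": (1, 0, 0),
     "fm2": (1, 1, 0),
     "fm3": (0, 1, 1),
     "fm4": (0, 1, 1)}   # fm3/fm4 share a signature -> not isolable

# Detectability: at least one test fires for the mode.
undetected = [fm for fm, row in D.items() if not any(row)]
print("undetected failure modes:", undetected or "none")

# Isolation: failure modes with identical signatures are ambiguous.
for a, b in itertools.combinations(D, 2):
    if D[a] == D[b]:
        print(f"ambiguity group: {a} vs {b}")

# Sensor-loss sensitivity: drop one test column and recheck detectability.
for j, t in enumerate(tests):
    lost = [fm for fm, row in D.items()
            if any(row) and not any(v for k, v in enumerate(row) if k != j)]
    print(f"losing {t} hides: {lost or 'none'}")
```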
ERIC Educational Resources Information Center
Smith, Garth Spencer; Hardman, Joanne
2014-01-01
In this study the impact of computer immersion on performance of school leavers Senior Certificate mathematics scores was investigated across 31 schools in the EMDC East education district of Cape Town, South Africa by comparing performance between two groups: a control and an experimental group. The experimental group (14 high schools) had access…
Analysis of worldwide research in the field of cybernetics during 1997-2011.
Singh, Virender; Perdigones, Alicia; García, José Luis; Cañas-Guerrero, Ignacio; Mazarrón, Fernando R
2014-12-01
The study provides an overview of the research activity carried out in the field of cybernetics. To do so, all research papers from 1997 to 2011 (16,445 research papers) under the category of "Computer Science, Cybernetics" of Web of Science have been processed using our in-house software which is developed specifically for this purpose. Among its multiple capabilities, this software analyses individual and compound keywords, quantifies productivity taking into account the work distribution, estimates the impact of each article and determines the collaborations established at different scales. Keywords analysis identifies the evolution of the most important research topics in the field of cybernetics and their specificity in biological aspects, as well as the research topics with lesser interest. The analysis of productivity, impact and collaborations provides a framework to assess research activity in a specific and realistic context. The geographical and institutional distribution of publications reveals the leading countries and research centres, analysing their relation to main research journals. Moreover, collaborations analysis reveals great differences in terms of internationalization and complexity of research networks. The results of this study may be very useful for the characterization and the decisions made by research in the field of cybernetics.
Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program
NASA Technical Reports Server (NTRS)
Ryan, Shannon
2013-01-01
This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of no penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, such software is unsuitable for use in shield design and preliminary analysis studies. This software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions, and on accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that requires user inputs and provides solutions directly in Microsoft Excel workbooks.
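As a hedged illustration of a penetration limit equation of this kind, the sketch below implements the widely published "new non-optimum" Whipple shield equation (hypervelocity regime) attributed to Christiansen. It is not necessarily the equation set in this software, and the coefficients should be checked against the NASA MMOD handbooks before use:

```python
import math

# Whipple shield ballistic limit, high-velocity regime (V >= ~7 km/s),
# per the commonly cited "new non-optimum" equation. Verify before real use.
def critical_diameter_cm(tw_cm, rho_p, rho_b, v_kms, theta_deg,
                         S_cm, sigma_ksi=70.0):
    """Failure-threshold projectile diameter (cm).
    tw_cm: rear wall thickness; rho_p/rho_b: projectile/bumper density (g/cc);
    theta_deg: impact angle from normal; S_cm: shield spacing;
    sigma_ksi: rear wall yield stress (ksi)."""
    vn = v_kms * math.cos(math.radians(theta_deg))   # normal velocity component
    return (3.918 * tw_cm**(2/3) * rho_p**(-1/3) * rho_b**(-1/9)
            * vn**(-2/3) * S_cm**(1/3) * (sigma_ksi / 70.0)**(1/3))

# Example: aluminum projectile (2.8 g/cc) on an aluminum bumper/wall pair.
d = critical_diameter_cm(tw_cm=0.32, rho_p=2.8, rho_b=2.8,
                         v_kms=9.0, theta_deg=0.0, S_cm=10.0)
print(f"critical diameter ~ {d:.2f} cm")
```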
Tra, Yolande V; Evans, Irene M
2010-01-01
BIO2010 put forth the goal of improving the mathematical educational background of biology students. The analysis and interpretation of microarray high-dimensional data can be very challenging and is best done by a statistician and a biologist working and teaching in a collaborative manner. We set up such a collaboration and designed a course on microarray data analysis. We started using Genome Consortium for Active Teaching (GCAT) materials and Microarray Genome and Clustering Tool software and added R statistical software along with Bioconductor packages. In response to student feedback, one microarray data set was fully analyzed in class, starting from preprocessing to gene discovery to pathway analysis using the latter software. A class project was to conduct a similar analysis where students analyzed their own data or data from a published journal paper. This exercise showed the impact that filtering, preprocessing, and different normalization methods had on gene inclusion in the final data set. We conclude that this course achieved its goals to equip students with skills to analyze data from a microarray experiment. We offer our insight about collaborative teaching as well as how other faculty might design and implement a similar interdisciplinary course.
Rosetta CONSERT operations and data analysis preparation: simulation software tools.
NASA Astrophysics Data System (ADS)
Rogez, Yves; Hérique, Alain; Cardiet, Maël; Zine, Sonia; Westphal, Mathieu; Micallef, Mickael; Berquin, Yann; Kofman, Wlodek
2014-05-01
The CONSERT experiment onboard Rosetta and Philae will perform the tomography of the 67P/CG comet nucleus by measuring radio wave transmission from the Rosetta S/C to the Philae lander. The accurate analysis of travel time measurements will deliver unique knowledge of the dielectric properties of the nucleus interior. The challenging complexity of the CONSERT operations requirements, combining both Rosetta and Philae, allows only a small set of opportunities to acquire data. Thus, we need a fine analysis of the impact of the Rosetta trajectory, the Philae position and the comet shape on CONSERT measurements, in order to take optimal decisions in a short time. The integration of simulation results and mission parameters provides synthetic information to evaluate performance and risks for each opportunity. The preparation of CONSERT measurements before space operations is key to achieving the best science return of the experiment. In addition, during Rosetta space operations, these software tools will allow a "real-time" first analysis of the latest measurements to improve the next acquisition sequences. The software tools themselves are built around a 3D electromagnetic radio wave simulation that takes the signal polarization into account. It is based on ray-tracing algorithms specifically designed for quick orbit analysis and radar signal generation, which allows computation on domains that are large relative to the wavelength. The extensive use of 3D visualization tools provides comprehensive and synthetic views of the results. The software suite is designed to be extended, after Rosetta operations, to the full 3D measurement data analysis using inversion methods.
Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy
NASA Astrophysics Data System (ADS)
Bucht, Curry; Söderberg, Per; Manneberg, Göran
2009-02-01
The corneal endothelium serves as the posterior barrier of the cornea. Factors such as the clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor. Morphometry of the corneal endothelium is presently done by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development of fully automated analysis of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software, which automatically performed digital enhancement of the images. The digitally enhanced images were transformed using the fast Fourier transform (FFT), and tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image were used to calculate the mean cell density of its corresponding corneal endothelium, based on well-known diffraction theory. Estimates of the cell density of the corneal endothelium were thus obtained using fully automated analysis software on images captured by CSM. The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis, and a relatively high correlation was found.
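For illustration of the Fourier approach (this is not the authors' Matlab code; the image and pixel calibration are synthetic), the dominant peak in the radially averaged 2-D power spectrum of a regular cell mosaic gives the cell spacing frequency, and density scales with its square:

```python
import numpy as np

# Synthetic "endothelium": a regular mosaic at ~55 cells/mm in each axis,
# i.e. roughly 3000 cells/mm^2. px_per_mm is an assumed calibration.
px, px_per_mm = 256, 512.0
cells_per_mm = 55.0
x = np.arange(px) / px_per_mm
xx, yy = np.meshgrid(x, x)
img = np.sin(2*np.pi*cells_per_mm*xx) + np.sin(2*np.pi*cells_per_mm*yy)

# 2-D power spectrum, then radial average around the DC component.
spec = np.abs(np.fft.fftshift(np.fft.fft2(img - img.mean())))**2
cy = cx = px // 2
ky, kx = np.indices(spec.shape)
r = np.hypot(ky - cy, kx - cx).astype(int)       # radius in frequency bins
radial = np.bincount(r.ravel(), weights=spec.ravel())
radial[0] = 0                                    # ignore residual DC power
peak_bin = int(np.argmax(radial[1:px // 2])) + 1
freq_per_mm = peak_bin * px_per_mm / px          # cycles per mm
print(f"dominant frequency ~ {freq_per_mm:.1f} cells/mm "
      f"-> ~ {freq_per_mm**2:.0f} cells/mm^2")
```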
Conte, Gian Marco; Castellano, Antonella; Altabella, Luisa; Iadanza, Antonella; Cadioli, Marcello; Falini, Andrea; Anzalone, Nicoletta
2017-04-01
Dynamic susceptibility contrast MRI (DSC) and dynamic contrast-enhanced MRI (DCE) are useful tools in the diagnosis and follow-up of brain gliomas; nevertheless, both techniques leave the open issue of data reproducibility. We evaluated the reproducibility of data obtained using two different commercial software packages for perfusion map calculation and analysis, as one potential source of variability is the software itself. DSC and DCE analyses from 20 patients with gliomas were tested for both the intrasoftware (intraobserver and interobserver) reproducibility and the intersoftware reproducibility, as well as the impact of different postprocessing choices [vascular input function (VIF) selection and deconvolution algorithms] on the quantification of the perfusion biomarkers plasma volume (Vp), volume transfer constant (Ktrans) and rCBV. Data reproducibility was evaluated with the intraclass correlation coefficient (ICC) and Bland-Altman analysis. For all the biomarkers, the intra- and interobserver reproducibility resulted in almost perfect agreement within each software package, whereas for the intersoftware reproducibility the values ranged from 0.311 to 0.577, suggesting fair to moderate agreement; Bland-Altman analysis showed high dispersion of the data, confirming these findings. Comparisons of different VIF estimation methods for DCE biomarkers resulted in ICCs of 0.636 for Ktrans and 0.662 for Vp; comparison of two deconvolution algorithms in DSC resulted in an ICC of 0.999. The use of a single software package ensures very good intraobserver and interobserver reproducibility. Caution should be taken when comparing data obtained using different software or different postprocessing within the same software, as reproducibility is no longer guaranteed.
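A minimal sketch of the agreement statistic used here: a two-way random-effects ICC(2,1) (the standard Shrout-Fleiss formulation) for two software packages measuring the same subjects. The measurements below are synthetic:

```python
import numpy as np

# Two-way random-effects ICC(2,1) for agreement between two "software
# packages" rating the same subjects; all measurements are synthetic.
rng = np.random.default_rng(7)
true = rng.normal(1.5, 0.5, 20)               # 20 subjects' "true" values
A = true + rng.normal(0, 0.10, 20)            # software A
B = 1.1 * true + rng.normal(0, 0.10, 20)      # software B (slight bias)
data = np.column_stack([A, B])                # n subjects x k raters

n, k = data.shape
grand = data.mean()
MSR = k * np.sum((data.mean(1) - grand)**2) / (n - 1)   # subjects (rows)
MSC = n * np.sum((data.mean(0) - grand)**2) / (k - 1)   # raters (columns)
SSE = np.sum((data - data.mean(1, keepdims=True)
                   - data.mean(0) + grand)**2)
MSE = SSE / ((n - 1) * (k - 1))                          # residual
icc21 = (MSR - MSE) / (MSR + (k - 1)*MSE + k*(MSC - MSE)/n)
print(f"ICC(2,1) = {icc21:.3f}")
```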
Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, Wes
2016-07-24
The primary challenge motivating this team's work is the widening gap between the ability to compute information and to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis only on a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science, when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data while it is still resident in memory, an approach that is known as in situ processing. The idea of in situ processing was not new at the time of the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community that aimed to foster production-quality software tools suitable for use by DOE science projects. By and large, our objective was to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers consisting of representatives from DOE national laboratories, academia, and industry, and engaged in software technology R&D, as well as in close partnerships with DOE science code teams, to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.
Impact of Growing Business on Software Processes
NASA Astrophysics Data System (ADS)
Nikitina, Natalja; Kajko-Mattsson, Mira
When growing their businesses, software organizations should not only put effort into developing and executing their business strategies, but also into managing and improving their internal software development processes and aligning them with business growth strategies. Only in this way can they ensure that their businesses grow in a healthy and sustainable way. In this paper, we map out one software company's business growth over the course of its history and identify its impact on the company's software production processes and capabilities. The impact concerns benefits, challenges, problems and lessons learned. The most important lesson learned is that although business growth became a stimulus for starting to think about and improve software processes, the organization lacked guidelines to aid this improvement and align it with business growth. Finally, the paper generates research questions providing a platform for future research.
Scalability Assessments for the Malicious Activity Simulation Tool (MAST)
2012-09-01
the scalability characteristics of MAST. Specifically, we show that an exponential increase in clients using the MAST software does not impact network and system resources significantly.
NASA Technical Reports Server (NTRS)
Greenspan, Sol; Feblowitz, Mark
1992-01-01
ACME is an experimental environment for investigating new approaches to modeling and analysis of system requirements and designs. ACME is built on and extends object-oriented conceptual modeling techniques and knowledge representation and reasoning (KRR) tools. The most immediate intended use for ACME is to help represent, understand, and communicate system designs during the early stages of system planning and requirements engineering. While our research is ostensibly aimed at software systems in general, we are particularly motivated to make an impact in the telecommunications domain, especially in the area referred to as Intelligent Networks (IN's). IN systems contain the software to provide services to users of a telecommunications network (e.g., call processing services, information services, etc.) as well as the software that provides the internal infrastructure for providing the services (e.g., resource management, billing, etc.). The software includes not only systems developed by the network proprietors but also by a growing group of independent service software providers.
NASA Technical Reports Server (NTRS)
Orr, James K.
2010-01-01
This presentation focuses on the Space Shuttle Primary Avionics Software System (PASS) and the people who developed and maintained this system. One theme is to provide quantitative data on software quality and reliability over a 30-year period. Consistent data relate to code break discrepancies. Requirements were supplied from external sources; requirement inspections and measurements were not implemented until later, beginning in 1985. A second theme is to focus on the people and organization of PASS. Many individuals have supported the PASS project over the entire period while transitioning from company to company and contract to contract. Major events and transitions have impacted morale (both positively and negatively) across the life of the project.
An Analysis of the Use of Social Software and Its Impact on Organizational Processes
NASA Astrophysics Data System (ADS)
Pascual-Miguel, Félix; Chaparro-Peláez, Julián; Hernández-García, Ángel
This article proposes a study on the implementation rate of the most relevant Web 2.0 tools and technologies in Spanish enterprises, and their impact on 12 important aspects of business processes. In order to characterize the degree of implementation and the perceived improvements to these processes, two indexes, the Implementation Index and the Impact Rate, have been created and displayed in a matrix called the "2.0 Success Matrix". Data were analyzed from a survey of directors and executives of large companies and of small and medium businesses.
NASA Technical Reports Server (NTRS)
Melis, Matthew E.; Revilock, Duane M.; Pereira, Michael J.; Lyle, Karen H.
2009-01-01
Following the tragedy of the Orbiter Columbia (STS-107) on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the space shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize reinforced carbon-carbon (RCC) along with ice and foam debris materials, which could shed on ascent and impact the orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS-DYNA (Livermore Software Technology Corp.) to predict damage by potential and actual impact events on the orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: Level 1--fundamental tests to obtain independent static and dynamic constitutive model properties of materials of interest, Level 2--subcomponent impact tests to provide highly controlled impact test data for the correlation and validation of the models, and Level 3--full-scale orbiter leading-edge impact tests to establish the final level of confidence for the analysis methodology. This report discusses the Level 2 test program conducted in the NASA Glenn Research Center (GRC) Ballistic Impact Laboratory with ice projectile impact tests on flat RCC panels, and presents the data observed. The Level 2 testing consisted of 54 impact tests in the NASA GRC Ballistic Impact Laboratory on 6- by 6-in. and 6- by 12-in. flat plates of RCC and evaluated three types of debris projectiles: Single-crystal, polycrystal, and "soft" ice. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile and validated the use of the ice and RCC models for use in LS-DYNA.
NASA Technical Reports Server (NTRS)
Melis, Matthew E.; Revilock, Duane M.; Pereira, Michael J.; Lyle, Karen H.
2009-01-01
Following the tragedy of the Orbiter Columbia (STS-107) on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the space shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize reinforced carbon-carbon (RCC) along with ice and foam debris materials, which could shed on ascent and impact the orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS-DYNA (Livermore Software Technology Corp.) to predict damage by potential and actual impact events on the orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: Level 1-fundamental tests to obtain independent static and dynamic constitutive model properties of materials of interest, Level 2-subcomponent impact tests to provide highly controlled impact test data for the correlation and validation of the models, and Level 3-full-scale orbiter leading-edge impact tests to establish the final level of confidence for the analysis methodology. This report discusses the Level 2 test program conducted in the NASA Glenn Research Center (GRC) Ballistic Impact Laboratory with external tank foam impact tests on flat RCC panels, and presents the data observed. The Level 2 testing consisted of 54 impact tests in the NASA GRC Ballistic Impact Laboratory on 6- by 6-in. and 6- by 12-in. flat plates of RCC and evaluated two types of debris projectiles: BX-265 and PDL-1034 external tank foam. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile and validated the use of the foam and RCC models for use in LS-DYNA.
Data Standards for Flow Cytometry
SPIDLEN, JOSEF; GENTLEMAN, ROBERT C.; HAALAND, PERRY D.; LANGILLE, MORGAN; MEUR, NOLWENN LE; OCHS, MICHAEL F.; SCHMITT, CHARLES; SMITH, CLAYTON A.; TREISTER, ADAM S.; BRINKMAN, RYAN R.
2009-01-01
Flow cytometry (FCM) is an analytical tool widely used for cancer and HIV/AIDS research and treatment, stem cell manipulation, and the detection of microorganisms in environmental samples. Current data standards do not capture the full scope of FCM experiments, and there is a demand for software tools that can assist in the exploration and analysis of large FCM datasets. We are implementing a standardized approach to capturing, analyzing, and disseminating FCM data that will facilitate both more complex analyses and analysis of datasets that could not previously be efficiently studied. Initial work has focused on developing a community-based guideline for recording and reporting the details of FCM experiments. Open source software tools that implement this standard are being created, with an emphasis on facilitating reproducible and extensible data analyses. As well, tools for electronic collaboration will assist the integrated access and comprehension of experiments to empower users to collaborate on FCM analyses. This coordinated, joint development of bioinformatics standards and software tools for FCM data analysis has the potential to greatly facilitate both basic and clinical research, impacting a notably diverse range of medical and environmental research areas. PMID:16901228
AN/UPX-41(C) Test Data Analysis of Impacts to Secondary Surveillance Radars.
DOT National Transportation Integrated Search
2015-02-01
In 2012, the Navy requested spectrum certification for the shipboard AN/UPX-41(C) Digital Interrogator System, Software Version 5.5 with Mode 5. Current operating conditions for the Navy's AN/UPX-41(C) are the same as restrictions imposed on the AN...
Life Cycle Assessment Software for Product and Process Sustainability Analysis
ERIC Educational Resources Information Center
Vervaeke, Marina
2012-01-01
In recent years, life cycle assessment (LCA), a methodology for assessment of environmental impacts of products and services, has become increasingly important. This methodology is applied by decision makers in industry and policy, product developers, environmental managers, and other non-LCA specialists working on environmental issues in a wide…
The Changing Business Environment: Implications for Vocational Curricula. State-of-the-Art Paper.
ERIC Educational Resources Information Center
Smith, E. Ray; Stallard, John J.
The widespread use of the micro/personal computer and related technological advancements are having important impacts on information management in the modern electronic office. Some of the most common software applications include word processing, spread sheet analysis, data management, graphics, and communications. Ancillary hardware/software…
Idri, Ali; Bachiri, Mariam; Fernández-Alemán, José Luis
2016-03-01
Stakeholders' needs and expectations are identified by means of software quality requirements, which have an impact on software product quality. In this paper, we present a set of requirements for mobile personal health records (mPHRs) for pregnancy monitoring, which have been extracted from literature and existing mobile apps on the market. We also use the ISO/IEC 25030 standard to suggest the requirements that should be considered during the quality evaluation of these mPHRs. We then go on to design a checklist in which we contrast the mPHRs for pregnancy monitoring requirements with software product quality characteristics and sub-characteristics in order to calculate the impact of these requirements on software product quality, using the ISO/IEC 25010 software product quality standard. The results obtained show that the requirements related to the user's actions and the app's features have the most impact on the external sub-characteristics of the software product quality model. The only sub-characteristic affected by all the requirements is Appropriateness of Functional suitability. The characteristic Operability is affected by 95% of the requirements while the lowest degrees of impact were identified for Compatibility (15%) and Transferability (6%). Lastly, the degrees of the impact of the mPHRs for pregnancy monitoring requirements are discussed in order to provide appropriate recommendations for the developers and stakeholders of mPHRs for pregnancy monitoring.
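A minimal sketch of the impact computation described above (the requirement and sub-characteristic names below are placeholders, not the paper's actual checklist): the degree of impact of the requirements on a quality sub-characteristic can be expressed as the fraction of requirements that affect it.

```python
# Checklist as a mapping: requirement -> set of affected sub-characteristics.
# All entries are illustrative placeholders.
checklist = {
    "record weight gain":       {"Appropriateness", "Operability"},
    "remind appointments":      {"Appropriateness", "Operability", "Availability"},
    "export data to clinician": {"Appropriateness", "Compatibility"},
}

def impact_degrees(checklist):
    """Percentage of requirements affecting each sub-characteristic."""
    n = len(checklist)
    counts = {}
    for subchars in checklist.values():
        for s in subchars:
            counts[s] = counts.get(s, 0) + 1
    return {s: 100.0 * c / n for s, c in counts.items()}

print(impact_degrees(checklist))
# Here "Appropriateness" is affected by 100% of the placeholder requirements,
# mirroring the paper's observation that one sub-characteristic was affected by all.
```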
Blagec, Kathrin; Jungwirth, David; Haluza, Daniela; Samwald, Matthias
2018-01-01
Medical device regulations, which aim to ensure safety standards, apply not only to hardware devices but also to standalone medical software, e.g., mobile apps. This study explores the effects of these regulations on the development and distribution of medical standalone software. We invited a convenience sample of 130 domain experts to participate in an online survey about the impact of current regulations on the development and distribution of medical standalone software. 21 respondents completed the questionnaire. Participants reported slight positive effects on usability, reliability, and data security of their products, whereas the ability to modify already deployed software and customization by end users were negatively impacted. The additional time and costs needed to go through the regulatory process were perceived as the greatest obstacles to developing and distributing medical software. Further research is needed to compare positive effects on software quality with negative impacts on market access and innovation. Strategies for avoiding over-regulation while still ensuring safety standards need to be devised.
Real Time Metrology Using Heterodyne Interferometry
NASA Astrophysics Data System (ADS)
Evans, Joseph T., Jr.
1983-11-01
The Air Force Weapons Laboratory (AFWL) located at Albuquerque, NM has developed a digital heterodyne interferometer capable of real-time, closed loop analysis and control of adaptive optics. The device uses independent phase modulation of two orthogonal polarizations of an argon ion laser to produce a temporally phase modulated interferogram of the test object in a Twyman-Green interferometer. Differential phase detection under the control of a Data General minicomputer helps reconstruct the phase front without noise effects from amplitude modulation in the optical train. The system consists of the interferometer optics, phase detection circuitry, and the minicomputer, allowing for complete software control of the process. The software has been unified into a powerful package that performs automatic data acquisition, OPD reconstruction, and Zernike analysis of the resulting wavefront. The minicomputer has the capability to control external devices so that closed loop analysis and control is possible. New software under development will provide a framework of data acquisition, display, and storage packages which can be integrated with analysis and control packages customized to the user's needs. Preliminary measurements with the system show that it is noise limited by laser beam phase quality and vibration of the optics. Active measures are necessary to reduce the impact of these noise sources.
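The Zernike analysis mentioned above reduces a measured wavefront to a handful of polynomial coefficients. A minimal sketch of that step follows (a generic least-squares fit in Python with numpy, assuming unit-disk coordinates and only the lowest-order terms; this is not the AFWL package itself):

```python
import numpy as np

def zernike_fit(x, y, opd):
    """Least-squares coefficients of a few low-order Zernike terms.

    x, y: unit-disk coordinates of measurement points; opd: measured optical
    path difference at those points. Terms: piston, tilt x, tilt y, defocus.
    """
    r2 = x**2 + y**2
    design = np.column_stack([
        np.ones_like(x),     # piston
        x,                   # tilt
        y,                   # tilt
        2.0 * r2 - 1.0,      # defocus (standard radial polynomial)
    ])
    coeffs, *_ = np.linalg.lstsq(design, opd, rcond=None)
    return coeffs

# Example: recover a pure-defocus wavefront from 200 synthetic samples.
rng = np.random.default_rng(0)
x, y = rng.uniform(-0.7, 0.7, (2, 200))
print(zernike_fit(x, y, 0.5 * (2 * (x**2 + y**2) - 1)))   # ~[0, 0, 0, 0.5]
```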
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
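As a hedged illustration of what a software reliability estimate looks like in its simplest form (a generic constant-failure-rate model; the specific JSC tools are not described at this level of detail, and the failure times below are invented):

```python
import math

def mle_failure_rate(inter_failure_times):
    """Maximum-likelihood constant failure rate from observed inter-failure times."""
    return len(inter_failure_times) / sum(inter_failure_times)

def reliability(t, lam):
    """Probability of failure-free operation for duration t at constant rate lam."""
    return math.exp(-lam * t)

# Example: five failures observed over cumulative test exposure (hours).
lam = mle_failure_rate([120.0, 340.0, 95.0, 410.0, 250.0])
print(lam, reliability(100.0, lam))   # rate per hour; P(no failure in 100 h)
```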
NASA Astrophysics Data System (ADS)
Králik, Juraj
2017-07-01
The paper presents a probabilistic and sensitivity analysis of the efficiency of the damping devices in the cover of a nuclear power plant under the impact of a dropped TK C30 nuclear fuel container. A three-dimensional finite element idealization of the nuclear power plant structure is used. A steel pipe damper system is proposed to dissipate the kinetic energy of the container's free fall. Experimental results on the behaviour of the shock damper's basic element under impact loads are presented. The Newmark integration method is used to solve the dynamic equations. The sensitivity and probabilistic analyses of the damping devices were realized in the AntHILL and ANSYS software.
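A minimal sketch of the Newmark integration method named above, applied to a single-degree-of-freedom system m·ü + c·u̇ + k·u = f(t) (the coefficients and loading are placeholders, not the paper's finite element model):

```python
import numpy as np

def newmark_sdof(m, c, k, f, dt, u0=0.0, v0=0.0, beta=0.25, gamma=0.5):
    """Newmark-beta integration of m*u'' + c*u' + k*u = f(t) for one DOF.

    f: array of force samples at spacing dt. beta=1/4, gamma=1/2 is the
    unconditionally stable average-acceleration variant.
    """
    n = len(f)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    u[0], v[0] = u0, v0
    a[0] = (f[0] - c * v0 - k * u0) / m      # initial acceleration from equilibrium
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)   # effective stiffness
    for i in range(n - 1):
        # Effective load built from the known state at step i.
        p = (f[i + 1]
             + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                    + (1 / (2 * beta) - 1) * a[i])
             + c * (gamma * u[i] / (beta * dt) + (gamma / beta - 1) * v[i]
                    + dt * (gamma / (2 * beta) - 1) * a[i]))
        u[i + 1] = p / keff
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                    - v[i] / (beta * dt) - (1 / (2 * beta) - 1) * a[i])
        v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])
    return u, v, a

# Example: a short impact pulse on a hypothetical 1000 kg / 1e6 N/m system.
force = np.zeros(2000); force[:10] = 5.0e4
u, v, a = newmark_sdof(1000.0, 500.0, 1.0e6, force, dt=1e-3)
print(u.max())   # peak displacement under the pulse
```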
An automated digital data collection and analysis system for the Charpy Impact Tester
NASA Technical Reports Server (NTRS)
Kohne, Glenn S.; Spiegel, F. Xavier
1994-01-01
The standard Charpy Impact Tester has been modified by the addition of a system of hardware and software to improve the accuracy and consistency of measurements made during specimen fracturing experiments. An optical disc, light source, and detector generate signals that indicate the pendulum position as a function of time. These signals are used by a computer to calculate the velocity and kinetic energy of the pendulum as a function of its position.
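A minimal sketch of the velocity and energy computation such a system performs (the slot spacing and moment of inertia below are hypothetical, not values from the report): given the times at which successive encoder slots pass the detector, angular velocity follows from finite differences and kinetic energy from the pendulum's rotational inertia.

```python
import numpy as np

# Hypothetical instrument constants.
SLOT_ANGLE = np.deg2rad(1.0)   # angular spacing between optical-disc slots (rad)
I_PENDULUM = 1.5               # pendulum moment of inertia about the pivot (kg*m^2)

def velocity_and_energy(slot_times):
    """Angular velocity (rad/s) and kinetic energy (J) at each slot crossing.

    slot_times: array of times (s) at which successive slots pass the detector.
    """
    dt = np.diff(slot_times)                 # time to sweep one slot spacing
    omega = SLOT_ANGLE / dt                  # finite-difference angular velocity
    kinetic = 0.5 * I_PENDULUM * omega**2    # rotational kinetic energy
    return omega, kinetic

# Example: slots crossing at a steadily accelerating rate during the swing.
times = np.cumsum(np.linspace(4e-3, 3e-3, 50))
omega, ke = velocity_and_energy(times)
print(omega[-1], ke[-1])   # velocity and energy just before impact
```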
A Framework of the Use of Information in Software Testing
ERIC Educational Resources Information Center
Kaveh, Payman
2010-01-01
With the increasing role that software systems play in our daily lives, software quality has become extremely important. Software quality is impacted by the efficiency of the software testing process. There are a growing number of software testing methodologies, models, and initiatives to satisfy the need to improve software quality. The main…
Impact of Spacecraft Shielding on Direct Ionization Soft Error Rates for Sub-130 nm Technologies
NASA Technical Reports Server (NTRS)
Pellish, Jonathan A.; Xapsos, Michael A.; Stauffer, Craig A.; Jordan, Thomas M.; Sanders, Anthony B.; Ladbury, Raymond L.; Oldham, Timothy R.; Marshall, Paul W.; Heidel, David F.; Rodbell, Kenneth P.
2010-01-01
We use ray tracing software to model various levels of spacecraft shielding complexity and energy deposition pulse height analysis to study how it affects the direct ionization soft error rate of microelectronic components in space. The analysis incorporates the galactic cosmic ray background, trapped proton, and solar heavy ion environments as well as the October 1989 and July 2000 solar particle events.
Impact of Agile Software Development Model on Software Maintainability
ERIC Educational Resources Information Center
Gawali, Ajay R.
2012-01-01
Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burden tightly budgeted information technology (IT) organizations. The agile software development approach delivers business value early, but its implications for software maintainability are still unknown. The purpose of this quantitative study…
Effect of Software Version on the Accuracy of an Intraoral Scanning Device.
Haddadi, Yasser; Bahrami, Golnosh; Isidor, Flemming
2018-04-06
To investigate the impact of software version on the accuracy of an intraoral scanning device. A master tooth was scanned with a high-precision optical scanner and then 10 times with a CEREC Omnicam scanner with software versions 4.4.0 and 4.4.4. Discrepancies were measured using quality control software. Mean deviation for 4.4.0 was 36.2 ± 35 μm and for 4.4.4 was 20.7 ± 14.2 μm (P ≤ .001). Software version has a significant impact on the accuracy of an intraoral scanner. It is important that researchers also publish the software version of scanners when publishing their findings.
ERIC Educational Resources Information Center
Hansen, John; Barnett, Michael; MaKinster, James; Keating, Thomas
2004-01-01
The increased availability of computational modeling software has created opportunities for students to engage in scientific inquiry through constructing computer-based models of scientific phenomena. However, despite the growing trend of integrating technology into science curricula, educators need to understand what aspects of these technologies…
ERIC Educational Resources Information Center
Su, C. Y.; Wang, T. I.
2010-01-01
The rapid advance of information and communication technologies (ICT) has important impacts on teaching and learning, as well as on the educational assessment. Teachers may create assessments utilizing some developed assessment software or test authoring tools. However, problems could occur, such as neglecting key concepts in the curriculum or…
Metro Navigation for the Blind
ERIC Educational Resources Information Center
Sanchez, Jaime; Saenz, Mauricio
2010-01-01
This study evaluates the impact of using the software program AudioMetro, a tool that supports the orientation and mobility of people who are blind in the Metro system of Santiago de Chile. A quasi-experimental study considering experimental and control groups and using the paired Student's t in a two sample test analysis (pretest-posttest) was…
Measuring the impact of computer resource quality on the software development process and product
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Valett, Jon; Hall, Dana
1985-01-01
The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kohout, E.F.; Folga, S.; Mueller, C.
1996-03-01
This paper describes the Waste Management Facility Accident Analysis (WASTE_ACC) software, which was developed at Argonne National Laboratory (ANL) to support the US Department of Energy's (DOE's) Waste Management (WM) Programmatic Environmental Impact Statement (PEIS). WASTE_ACC is a decision support and database system that is compatible with Microsoft® Windows™. It assesses potential atmospheric releases from accidents at waste management facilities. The software provides the user with an easy-to-use tool to determine the risk-dominant accident sequences for the many possible combinations of process technologies, waste and facility types, and alternative cases described in the WM PEIS. In addition, its structure will allow additional alternative cases and assumptions to be tested as part of the future DOE programmatic decision-making process. The WASTE_ACC system demonstrates one approach to performing a generic, systemwide evaluation of accident risks at waste management facilities. The advantages of WASTE_ACC are threefold. First, the software gets waste volume and radiological profile data that were used to perform other WM PEIS-related analyses directly from the WASTE_MGMT system. Second, the system allows for a consistent analysis across all sites and waste streams, which enables decision makers to understand more fully the trade-offs among various policy options and scenarios. Third, the system is easy to operate; even complex scenario runs are completed within minutes.
Model-Based Fault Diagnosis: Performing Root Cause and Impact Analyses in Real Time
NASA Technical Reports Server (NTRS)
Figueroa, Jorge F.; Walker, Mark G.; Kapadia, Ravi; Morris, Jonathan
2012-01-01
Generic, object-oriented fault models, built according to causal-directed graph theory, have been integrated into an overall software architecture dedicated to monitoring and predicting the health of mission- critical systems. Processing over the generic fault models is triggered by event detection logic that is defined according to the specific functional requirements of the system and its components. Once triggered, the fault models provide an automated way for performing both upstream root cause analysis (RCA), and for predicting downstream effects or impact analysis. The methodology has been applied to integrated system health management (ISHM) implementations at NASA SSC's Rocket Engine Test Stands (RETS).
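A minimal sketch of the upstream/downstream traversal such causal-directed fault graphs support (node names and edges are invented for illustration): ancestors of a detected anomaly are candidate root causes, descendants are its predicted impacts.

```python
from collections import defaultdict

# Illustrative causal edges: (cause, effect).
edges = [("valve_stuck", "low_flow"), ("sensor_bias", "low_flow"),
         ("low_flow", "pump_cavitation"), ("pump_cavitation", "vibration_high")]

fwd, rev = defaultdict(set), defaultdict(set)
for cause, effect in edges:
    fwd[cause].add(effect)   # downstream direction (impact analysis)
    rev[effect].add(cause)   # upstream direction (root cause analysis)

def reach(graph, start):
    """All nodes reachable from start by depth-first traversal."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for nxt in graph[node] - seen:
            seen.add(nxt)
            stack.append(nxt)
    return seen

event = "low_flow"                       # anomaly flagged by detection logic
print("root causes:", reach(rev, event))  # upstream RCA
print("impacts:", reach(fwd, event))      # downstream impact prediction
```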
Wang, Wei-Wei; Cai, Yue-Yin; Sun, Yong-Guang; Ma, Hong-Wei
2015-07-01
Using the spatial analysis functions of ArcGIS software, the present study constructed an environmental impact evaluation index system for coastal development in Liaoning Province. Its factors comprised the current state of environmental quality, the environmental impact of marine development, and marine environmental disasters. Weighted factor analysis and a comprehensive index method were utilized. On this basis, the comprehensive environmental effect of coastal development in Liaoning Province was evaluated. The results showed that the environmental effects of development activity were most serious along the Zhao Jiatun coast in the north of Zhimao Bay and the coast of Mianhua Island in Dalian Bay.
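A minimal sketch of the weighted comprehensive index method named above (the weights and factor scores are hypothetical, not the study's calibrated values): each coastal unit's composite score is the weighted sum of its normalized factor scores.

```python
import numpy as np

# Hypothetical weights for the three factor groups named in the study:
# current environmental quality, impact of marine development, marine disasters.
weights = np.array([0.4, 0.35, 0.25])

def comprehensive_index(factor_scores):
    """Composite environmental-effect score for one coastal unit.

    factor_scores: normalized scores in [0, 1] per factor; higher = worse.
    """
    return float(np.dot(weights, factor_scores))

# Example: a unit with poor current quality and moderate development impact.
print(comprehensive_index(np.array([0.8, 0.5, 0.3])))
```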
A Bibliometric History of the Journal of Psychology Between 1936 and 2015.
Tur-Porcar, Ana; Mas-Tur, Alicia; Merigó, José M; Roig-Tierno, Norat; Watt, John
2018-05-19
The Journal of Psychology: Interdisciplinary and Applied is a leading international journal in psychology dating back to 1935. This study examines its publications since its creation utilizing a bibliometric analysis. The primary objective is to provide a complete overview of the key factors affecting the journal. This analysis includes such key issues as the publication and citation structure of the journal, its most cited articles, and the leading authors, institutions, and countries referenced in the journal. The work uses the Scopus database to classify the bibliographic material. Additionally, the analysis provides a graphical mapping of the bibliographic data by using visualization of similarities viewer software. This software uses several bibliometric techniques including co-citation, bibliographic coupling and co-occurrence of keywords. The Journal of Psychology is strongly connected to most of the current leading journals in psychology, and currently has a 5-year impact factor of 1.77 (Thomson Reuters, 2015 Journal Citation Reports).
Castel, Anne Laure; Toledano, Manuel; Tribouilloy, Christophe; Delelis, François; Mailliet, Amandine; Marotte, Nathalie; Guerbaai, Raphaëlle A; Levy, Franck; Graux, Pierre; Ennezat, Pierre-Vladimir; Maréchaux, Sylvestre
2018-05-27
Whether echocardiography platform and analysis software impact left ventricular (LV) volumes, ejection fraction (EF), and stroke volume (SV) by transthoracic tridimensional echocardiography (3DE) has not yet been assessed. Hence, our aim was to compare 3DE LV end-diastolic and end-systolic volumes (EDV and ESV), LVEF, and SV obtained with echocardiography platforms from 2 different manufacturers. 3DE was performed in 84 patients (65% of screened consecutive patients), with equipment from 2 different manufacturers, with subsequent off-line postprocessing to obtain parameters of LV function and size (Philips QLAB 3DQ and General Electric EchoPAC 4D autoLVQ). Twenty-five patients with clinical indication for cardiac magnetic resonance imaging served as a validation subgroup. LVEDV and LVESV from the 2 vendors were highly correlated (r = 0.93), but compared with 4D autoLVQ, the use of QLAB 3DQ resulted in lower LVEDV and LVESV (bias: 11 mL, limits of agreement: -25 to +47 and bias: 6 mL, limits of agreement: -22 to +34, respectively). The agreement between LVEF values of each software was poor (intraclass correlation coefficient 0.62) despite no or minimal bias. SVs were also lower with QLAB 3DQ advanced compared with 4D autoLVQ, and both were poorly correlated (r = 0.66). Consistently, the underestimation of LVEDV, LVESV, and SV by 3DE compared with cardiac magnetic resonance imaging was more pronounced with Philips QLAB 3DQ advanced than with 4D autoLVQ. The echocardiography platform and analysis software significantly affect the values of LV parameters obtained by 3DE. Intervendor standardization and improvements in 3DE modalities are needed to broaden the use of LV parameters obtained by 3DE in clinical practice.
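The bias and limits-of-agreement figures quoted above are the output of a standard Bland-Altman comparison. A minimal sketch of that computation (variable names and the example volumes are illustrative, not study data):

```python
import numpy as np

def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements a and b.

    a, b: the same quantity (e.g., LVEDV in mL) measured by two analysis
    packages on the same patients.
    """
    diff = np.asarray(a) - np.asarray(b)
    bias = diff.mean()                     # systematic offset between methods
    half_width = 1.96 * diff.std(ddof=1)   # half-width of the 95% limits
    return bias, (bias - half_width, bias + half_width)

# Example with made-up paired volumes (mL) from two vendors:
autolvq = np.array([131.0, 102.0, 160.0, 118.0])
qlab = np.array([120.0, 95.0, 150.0, 110.0])
print(bland_altman(autolvq, qlab))   # bias and limits of agreement
```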
NASA Technical Reports Server (NTRS)
Lunsford, Myrtis Leigh
1998-01-01
The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing Human Engineering and operations analysis for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for use by the ANVIL for the evaluation of the X34 Engine Changeout procedure. These simulations were developed with the software tool dVISE 4.0.0 produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with the simulations, other summer activities, and possible work for the future. We first begin with a brief description of virtual reality systems.
Analyzing the costs to deliver medication therapy management services.
Rupp, Michael T
2011-01-01
To provide pharmacy managers and consultant pharmacists with a step-by-step approach for analyzing the costs of delivering medication therapy management (MTM) services and to describe the use of a free online software application for determining the costs of delivering MTM. The process described is applicable to community pharmacies and to consultant pharmacists who provide MTM services from nonpharmacy settings. The PharmAccount Service Cost Calculator is an Internet-based software application that uses a guided online interview to collect the information needed to conduct a comprehensive cost analysis of any specialized pharmacy service. In addition to direct variable and fixed costs, the software automatically allocates indirect and overhead costs to the service and generates an itemized report that details the components of service delivery costs. The service cost calculator is sufficiently flexible to support the analysis of virtually any specialized pharmacy service, irrespective of whether the service is being delivered from a physical pharmacy. The software application allows users to perform sensitivity analysis to quickly determine the potential impact that alternate scenarios would have on service delivery cost. It is therefore particularly well suited to assist in the design and planning of a new pharmacy service. Good management requires that the cost implications of service delivery decisions are known and considered. Analyzing the cost of an MTM service is an important step in developing a sustainable business model.
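A minimal sketch of the cost roll-up such a calculator performs (the cost categories follow the abstract; the numbers and the overhead-allocation rule are hypothetical):

```python
def service_cost(direct_variable, direct_fixed, overhead_pool, service_share):
    """Total annual cost of delivering a specialized pharmacy service.

    direct_variable: costs that scale with encounters (e.g., pharmacist time)
    direct_fixed:    costs dedicated to the service (e.g., software license)
    overhead_pool:   pharmacy-wide indirect costs (rent, utilities, admin)
    service_share:   fraction of overhead allocated to this service, e.g. by
                     its share of total labor hours (an assumed rule)
    """
    return direct_variable + direct_fixed + overhead_pool * service_share

# Example: 500 MTM encounters at $40 of pharmacist time each, $2,000 of
# dedicated fixed costs, and 5% of a $120,000 overhead pool.
print(service_cost(500 * 40.0, 2000.0, 120000.0, 0.05))
```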
New Directions in Software Quality Assurance Automation
2009-06-01
generation process. 4.1 Parameterized Safety Analysis: a qualitative analysis can also be performed, asking questions such as what has contributed to a given hazardous event. Given parameters such as the probability of interception p1 in the previous example, one can determine what impact those parameters have on the probability of hazardous states. It is assumed that the AEG is traversed top-down and left-to-right, and only once, to produce a particular event trace, with randomized decisions about what...
Benchmarking the ATLAS software through the Kit Validation engine
NASA Astrophysics Data System (ADS)
De Salvo, Alessandro; Brasolin, Franco
2010-04-01
The measurement of experiment software performance is a very important metric for choosing the most effective resources to use and for discovering the bottlenecks of the code implementation. In this work we present the benchmark techniques used to measure ATLAS software performance through the ATLAS offline testing engine Kit Validation and the online portal Global Kit Validation. The performance measurements, the data collection, and the online analysis and display of the results will be presented. The results of the measurements on different platforms and architectures will be shown, giving a full report on the CPU power and memory consumption of the Monte Carlo generation, simulation, digitization, and reconstruction of the most CPU-intensive channels. The impact of multi-core computing on ATLAS software performance will also be presented, comparing the behavior of different architectures when increasing the number of concurrent processes. The benchmark techniques described in this paper have been used in the HEPiX group since the beginning of 2008 to help define the performance metrics for High Energy Physics applications, based on the real experiment software.
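A minimal sketch of the kind of measurement loop such benchmarking implies (the command and process counts are placeholders, this is not the Kit Validation engine itself, and resource.getrusage is Unix-only):

```python
import resource
import subprocess
import time

def measure(cmd, n_procs):
    """Run n_procs concurrent copies of cmd; report wall time and peak child RSS."""
    start = time.monotonic()
    procs = [subprocess.Popen(cmd) for _ in range(n_procs)]
    for p in procs:
        p.wait()
    wall = time.monotonic() - start
    # Peak resident set size over all waited-for children (KB on Linux).
    peak_kb = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss
    return wall, peak_kb

# Placeholder workload, scaled from 1 to 8 concurrent processes to probe
# multi-core behavior the way the text describes.
for n in (1, 2, 4, 8):
    print(n, measure(["/usr/bin/true"], n))
```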
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.; Housner, Jerrold M.
1993-01-01
Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.
[Genetic diversity analysis of Andrographis paniculata in China based on SRAP and SNP].
Chen, Rong; Wang, Xiao-Yun; Song, Yu-Ning; Zhu, Yun-feng; Wang, Peng-liang; Li, Min; Zhong, Guo-Yue
2014-12-01
In order to reveal the genetic diversity of domestic Andrographis paniculata and its impact on quality, the genetic backgrounds of 103 samples from 7 provinces in China were analyzed using SRAP and SNP markers. Genetic structures of the A. paniculata populations were estimated with Powermarker V 3.25 and Mega 6.0 software, and polymorphic SNPs were identified with CodonCode Aligner software. The results showed that the genetic distances of domestic A. paniculata germplasm ranged from 0.01 to 0.09, and no polymorphic SNPs were discovered in coding sequence fragments of ent-copalyl diphosphate synthase. A. paniculata germplasm from various regions in China thus has poor genetic diversity. This phenomenon is closely related to strict self-fertilization and earlier introduction from the same origin. Therefore, genetic background has little impact on the variable quality of A. paniculata on the domestic market. Mutation breeding, polyploid breeding, and molecular breeding are proposed as promising strategies for germplasm innovation.
A Legal Perspective on Business: Modeling the Impact of Law
NASA Astrophysics Data System (ADS)
Ghanavati, Sepideh; Siena, Alberto; Perini, Anna; Amyot, Daniel; Peyton, Liam; Susi, Angelo
Modern goal-oriented requirements engineering frameworks use modeling as a means of better understanding a domain, leading to an overall improvement in the quality of the requirements. Regulations and laws impose additional context and constraints on software goals and can limit the satisfaction of stakeholder needs. Organizations and software developers need modeling tools that can properly address the potential deep impact legal issues can have on the effectiveness of business strategies. In this paper, we perform a preliminary study into the development of a modeling framework able to support the analysis of legal prescriptions alongside business strategies. We demonstrate, via an example drawn from a case study of the Health Insurance Portability and Accountability Act (HIPAA), how models of this law can be built with the GRL modeling language and how they can be evaluated as part of the business goal models.
ISEES: an institute for sustainable software to accelerate environmental science
NASA Astrophysics Data System (ADS)
Jones, M. B.; Schildhauer, M.; Fox, P. A.
2013-12-01
Software is essential to the full science lifecycle, spanning data acquisition, processing, quality assessment, data integration, analysis, modeling, and visualization. Software runs our meteorological sensor systems, our data loggers, and our ocean gliders. Every aspect of science is impacted by, and improved by, software. Scientific advances ranging from modeling climate change to the sequencing of the human genome have been rendered possible in the last few decades due to the massive improvements in the capabilities of computers to process data through software. This pivotal role of software in science is broadly acknowledged, while simultaneously being systematically undervalued through minimal investments in maintenance and innovation. As a community, we need to embrace the creation, use, and maintenance of software within science, and address problems such as code complexity, openness, reproducibility, and accessibility. We also need to fully develop new skills and practices in software engineering as a core competency in our earth science disciplines, starting with undergraduate and graduate education and extending into university and agency professional positions. The Institute for Sustainable Earth and Environmental Software (ISEES) is being envisioned as a community-driven activity that can facilitate and galvanize activities around scientific software in a way analogous to synthesis centers such as NCEAS and NESCent, which have stimulated massive advances in ecology and evolution. We will describe the results of six workshops (Science Drivers, Software Lifecycles, Software Components, Workforce Development and Training, Sustainability and Governance, and Community Engagement) that were held in 2013 to envision such an institute. We will present community recommendations from these workshops and our strategic vision for how ISEES will address the technical issues in the software lifecycle, the sustainability of the whole software ecosystem, and the critical issue of computational training for the scientific community.
A theoretical basis for the analysis of redundant software subject to coincident errors
NASA Technical Reports Server (NTRS)
Eckhardt, D. E., Jr.; Lee, L. D.
1985-01-01
Fundamental to the development of redundant software techniques, such as fault-tolerant software, is an understanding of the impact of multiple joint occurrences of coincident errors. A theoretical basis for the study of redundant software is developed which provides a probabilistic framework for empirically evaluating the effectiveness of the general (N-version) strategy when component versions are subject to coincident errors, and permits an analytical study of the effects of these errors. The basic assumptions of the model are: (1) independently designed software components are chosen in a random sample; and (2) in the user environment, the system is required to execute on a stationary input series. The intensity of coincident errors has a central role in the model. This function describes the propensity to introduce design faults in such a way that software components fail together when executing in the user environment. The model is used to give conditions under which an N-version system is a better strategy for reducing system failure probability than relying on a single version of software. A condition which limits the effectiveness of a fault-tolerant strategy is studied, and the question is posed whether system failure probability varies monotonically with increasing N or whether an optimal choice of N exists.
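A minimal numerical sketch of the majority-voting failure probability this framework studies (the two-class intensity profile below is invented for illustration): if θ(x) is the probability that a randomly chosen version fails on input x, an N-version majority voter fails on x when more than N/2 versions fail, and the system failure probability averages this over the input distribution.

```python
from math import comb

def majority_failure(theta, n):
    """P(more than n/2 of n versions fail) for per-version failure prob theta,
    treating version failures on this input as conditionally independent."""
    k_min = n // 2 + 1
    return sum(comb(n, k) * theta**k * (1 - theta)**(n - k)
               for k in range(k_min, n + 1))

# Invented intensity profile: most inputs are easy, a few are "hard" inputs
# on which versions tend to fail together (coincident errors).
inputs = [(0.95, 0.001), (0.05, 0.3)]   # (probability of input class, theta)

for n in (1, 3, 5, 7):
    p_sys = sum(p_x * majority_failure(theta, n) for p_x, theta in inputs)
    print(n, p_sys)   # shows how system failure probability moves with N
```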
Distributed and Collaborative Software Analysis
NASA Astrophysics Data System (ADS)
Ghezzi, Giacomo; Gall, Harald C.
Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of
Data-driven traffic impact assessment tool for work zones.
DOT National Transportation Integrated Search
2017-03-01
Traditionally, traffic impacts of work zones have been assessed using planning software such as Quick Zone, custom spreadsheets, and others. These software programs generate delay, queuing, and other mobility measures but are difficult to validate du...
Software cost/resource modeling: Software quality tradeoff measurement
NASA Technical Reports Server (NTRS)
Lawler, R. W.
1980-01-01
A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.
ERIC Educational Resources Information Center
Biju, Soly Mathew
2008-01-01
Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of requirement changes at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…
Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data
NASA Astrophysics Data System (ADS)
Okladnikov, I.; Gordov, E. P.; Titov, A. G.
2011-12-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation, and forecasting of climatic and ecosystem changes at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which can currently reach tens of terabytes for a single dataset, studies in the area of climate and environmental change require special software support. A dedicated software framework for rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of 3 basic parts: a computational kernel developed using the ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for development of typical components of a web mapping application graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library aimed at graphical user interface development is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on this software framework, an information-computational system for complex analysis of large georeferenced data archives was developed. Structured environmental datasets available for processing now include two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA-Interim Reanalysis, the MRI/JMA APHRODITE Water Resources Project Reanalysis, meteorological observational data for the territory of the former USSR for the 20th century, and others. The current version of the system is already in use in scientific research; in particular, it was recently used successfully for analysis of Siberian climate changes and their regional impacts. The software framework presented allows rapid development of Web-GIS systems for geophysical data analysis, thus providing specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. This work is partially supported by RFBR grants #10-07-00547, #11-05-01190, and SB RAS projects 4.31.1.5, 4.31.2.7, 4, 8, 9, 50 and 66.
The impact of the maternal experience with a jaundiced newborn on the breastfeeding relationship.
Willis, Sharla K; Hannon, Patricia R; Scrimshaw, Susan C
2002-05-01
To examine the process by which mothers' experiences with neonatal jaundice affect breastfeeding. We used ethnographic interviews with grounded theory methodology. Audiotaped data were transcribed and analyzed for themes using ATLAS/ti qualitative data analysis software (Scientific Software Development, Berlin, Germany). We studied a total of 47 Spanish- and English-speaking breastfeeding mothers of otherwise healthy infants diagnosed with neonatal jaundice. Our outcomes were the qualitative descriptions of maternal experiences with neonatal jaundice. Interactions with medical professionals emerged as the most important factor mediating the impact of neonatal jaundice on breastfeeding. Breastfeeding orders and the level of encouragement from medical professionals toward breastfeeding had the strongest effect on feeding decisions. Maternal reaction to and understanding of information from their physicians also played an important role. Guilt was common, as many mothers felt they had caused the jaundice by breastfeeding. By providing accurate information and encouragement to breastfeed, medical professionals have great impact on whether a mother continues breastfeeding after her experience with neonatal jaundice. Health care providers must be aware of how mothers receive and interpret information related to jaundice to minimize maternal reactions, such as guilt, that have a negative impact on breastfeeding.
NASA Astrophysics Data System (ADS)
Arif Shah, Muhammad; Hashim, Rathiah; Shah, Adil Ali; Farooq Khattak, Umar
2016-11-01
Developing software through Global Software Development (GSD) has become very common in the software industry. Pakistan is one of the countries where projects are taken on and designed for clients from different countries, including Afghanistan. The purpose of this paper is to identify and analyze several communication barriers that can have a negative impact on such projects, and to provide management guidelines that help medium-size software organizations working in Pakistan with clients from Afghanistan overcome the communication barriers and challenges they face when coordinating with clients. We first performed a literature review to identify different communication barriers and to check whether any standardized communication management guidelines for medium-size software houses had been provided in the past. The second stage of the research develops guidelines from the vendor's perspective, based on interviews and focus group discussions with different stakeholders and employees of software houses with clients from Afghanistan. Based on those interviews and discussions, we established communication management guidelines to overcome the communication problems encountered when working with clients from Afghanistan. The literature review identified barriers such as cultural and language barriers as among the main reasons behind project failure, and we suggest that software organizations working in Pakistan should follow certain defined communication guidelines in order to overcome communication barriers that affect their projects directly.
Aquatic Acoustic Metrics Interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-12-18
Fishes and marine mammals may suffer a range of potential effects from exposure to intense underwater sound generated by anthropogenic activities such as pile driving, shipping, sonars, and underwater blasting. Several underwater sound recording (USR) devices have been built to acquire samples of the underwater sound generated by anthropogenic activities. Software becomes indispensable for processing and analyzing the audio files recorded by these USRs. The new Aquatic Acoustic Metrics Interface Utility Software (AAMI) is specifically designed for analysis of underwater sound recordings to provide data in metrics that facilitate evaluation of the potential impacts of the sound on aquatic animals. In addition to basic functions, such as loading and editing audio files recorded by USRs and batch processing of sound files, the software utilizes recording system calibration data to compute important parameters in physical units. The software also facilitates comparison of the noise sound sample metrics with biological measures such as audiograms of the sensitivity of aquatic animals to the sound, integrating various components into a single analytical frame.
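A minimal sketch of the calibration step such software applies (the sensitivity value below is hypothetical): a recording in digital counts is converted to pascals using the end-to-end system sensitivity, then summarized as an rms sound pressure level re 1 μPa.

```python
import numpy as np

# Hypothetical end-to-end sensitivity of the recording chain: counts per pascal.
COUNTS_PER_PA = 5.0e4

def rms_spl(samples_counts):
    """RMS sound pressure level (dB re 1 uPa) of a calibrated recording."""
    pressure_pa = np.asarray(samples_counts, dtype=float) / COUNTS_PER_PA
    rms_pa = np.sqrt(np.mean(pressure_pa**2))
    return 20.0 * np.log10(rms_pa / 1e-6)   # reference pressure: 1 uPa

# Example: a 1 kHz tone of ~10 Pa amplitude sampled at 48 kHz (~137 dB re 1 uPa).
t = np.arange(0, 1.0, 1 / 48000)
counts = 10.0 * COUNTS_PER_PA * np.sin(2 * np.pi * 1000 * t)
print(rms_spl(counts))
```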
NASA Astrophysics Data System (ADS)
Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Yusof, Muhammad Mat
2016-08-01
This paper reports the effect of proposed software product features on the satisfaction and dissatisfaction of potential customers. The Kano model's functional and dysfunctional technique was used along with Berger et al.'s customer satisfaction coefficients. The results show that only two feature categories dominated in influencing the satisfaction and dissatisfaction of would-be customers of the proposed software product: attractive and one-dimensional features had the highest impact. This result will benefit requirements analysts, developers, designers, and project and sales managers in preparing proposed products. Additional analysis showed that the Kano model's satisfaction and dissatisfaction scores were highly related to Park et al.'s average satisfaction coefficient (r = 96%), implying that these variables can be used interchangeably to elicit customer satisfaction. Furthermore, the average satisfaction coefficients and the satisfaction and dissatisfaction indexes were all positively and linearly correlated.
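A minimal sketch of the Berger et al. coefficients used above (the survey counts are made up): for each feature, the satisfaction coefficient is (A+O)/(A+O+M+I) and the dissatisfaction coefficient is -(O+M)/(A+O+M+I), where A, O, M, and I are the numbers of respondents classifying the feature as attractive, one-dimensional, must-be, or indifferent.

```python
def kano_coefficients(attractive, one_dimensional, must_be, indifferent):
    """Berger et al. customer satisfaction (CS) and dissatisfaction (DS)
    coefficients from Kano category counts for one feature."""
    total = attractive + one_dimensional + must_be + indifferent
    cs = (attractive + one_dimensional) / total    # 0..1, higher = more satisfying
    ds = -(one_dimensional + must_be) / total      # -1..0, lower = more dissatisfying
    return cs, ds

# Example with made-up survey counts for a single proposed feature:
print(kano_coefficients(attractive=18, one_dimensional=10, must_be=5, indifferent=7))
```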
Capturing district nursing through a knowledge-based electronic caseload analysis tool (eCAT).
Kane, Kay
2014-03-01
The Electronic Caseload Analysis Tool (eCAT) is a knowledge-based software tool to assist the caseload analysis process. The tool provides a wide range of graphical reports, along with an integrated clinical advisor, to assist district nurses, team leaders, operational and strategic managers with caseload analysis by describing, comparing and benchmarking district nursing practice in the context of population need, staff resources, and service structure. District nurses and clinical lead nurses in Northern Ireland developed the tool, along with academic colleagues from the University of Ulster, working in partnership with a leading software company. The aim was to use the eCAT tool to identify the nursing need of local populations, along with the variances in district nursing practice, and match the workforce accordingly. This article reviews the literature, describes the eCAT solution and discusses the impact of eCAT on nursing practice, staff allocation, service delivery and workforce planning, using fictitious exemplars and a post-implementation evaluation from the trusts.
ERIC Educational Resources Information Center
Stanford, Roger John
2012-01-01
Web-conferencing software was chosen for course delivery to provide flexible options for students at a two-year technical college. Students used technology to access a live, synchronous microeconomics course over the internet instead of a traditional face-to-face lecture. This investigation studied the impact of implementing web-conferencing…
NASA Technical Reports Server (NTRS)
Badler, N. I.; Korein, J. D.; Meyer, C.; Manoochehri, K.; Rovins, J.; Beale, J.; Barr, B.
1985-01-01
Several areas that arise from the system integration issue were examined. Intersystem analysis is discussed as it relates to software development, shared data bases and interfaces between TEMPUS and PLAID, shaded graphics rendering systems, object design (BUILD), the TEMPUS animation system, anthropometric lab integration, ongoing TEMPUS support and maintenance, and the impact of UNIX and local workstations on the OSDS environment.
ERIC Educational Resources Information Center
Morueta, Ramon Tirado; Igado, Manuel Fandos; Gomez, J. Ignacio Aguaded
2010-01-01
This work, part of the Spanish government's National I + D Plan 2004/07, entitled "Observatics: the implementation of free software in ICT centres in Andalusia: an analysis of its effect on the teaching-learning process", aims to describe the most recent impact of online communication technologies on education in Andalusia (Spain),…
The Role of the Mass Media in Parenting Education.
ERIC Educational Resources Information Center
Simpson, A. Rae
Although there has been an explosion of information and advice about child rearing in the mass media, little attention has been given to the nature or extent of the media's impact on parents or to ways in which media could be used more effectively. Based on an analysis of books, magazines, newspapers, radio, television, film, videotapes, software,…
A mathematical model of salmonid spawning habitat
Robert N. Havis; Carlos V. Alonzo; Keith E Woeste; Russell F. Thurow
1993-01-01
A simulation model [Salmonid Spawning Analysis Model (SSAM)] was developed as a management tool to evaluate the relative impacts of stream sediment load and water temperature on salmonid egg survival. The model is useful for estimating acceptable sediment loads to spawning habitat that may result from upland development, such as logging and agriculture. Software in...
Software Requirements Analysis as Fault Predictor
NASA Technical Reports Server (NTRS)
Wallace, Dolores
2003-01-01
Waiting until the integration and system test phase to discover errors leads to more costly rework than resolving those same errors earlier in the lifecycle. Costs increase even more significantly once a software system has become operational. We can assess the quality of system requirements, but do little to correlate this information either to system assurance activities or to long-term reliability projections, both of which remain unclear and anecdotal. Extending earlier work on requirements accomplished by the ARM tool, measuring requirements quality information against code complexity and test data for the same system may be used to predict specific software modules containing high-impact or deeply embedded faults now escaping into operational systems. Such knowledge would lead to more effective and efficient test programs. It may also enable insight into whether a program should be maintained or started over.
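A toy illustration of the kind of module-level prediction proposed above (the scoring rule, module names, and numbers are entirely invented): combine a requirements-quality score with a code-complexity measure and rank modules by the resulting risk.

```python
# name: (requirements_quality in 0..1, where 1 = good; cyclomatic complexity)
modules = {
    "guidance":  (0.4, 35),
    "telemetry": (0.9, 12),
    "display":   (0.7, 28),
}

def risk(req_quality, complexity, c_max=50.0):
    """Higher score = more likely to hide embedded faults (invented heuristic)."""
    return (1.0 - req_quality) * (complexity / c_max)

# Rank modules to focus test effort where both signals are bad.
for name, (q, c) in sorted(modules.items(), key=lambda kv: -risk(*kv[1])):
    print(f"{name}: risk={risk(q, c):.2f}")
```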
A guide to the visual analysis and communication of biomolecular structural data.
Johnson, Graham T; Hertig, Samuel
2014-10-01
Biologists regularly face an increasingly difficult task - to effectively communicate bigger and more complex structural data using an ever-expanding suite of visualization tools. Whether presenting results to peers or educating an outreach audience, a scientist can achieve maximal impact with minimal production time by systematically identifying an audience's needs, planning solutions from a variety of visual communication techniques and then applying the most appropriate software tools. A guide to available resources that range from software tools to professional illustrators can help researchers to generate better figures and presentations tailored to any audience's needs, and enable artistically inclined scientists to create captivating outreach imagery.
Self-stress control of real civil engineering tensegrity structures
NASA Astrophysics Data System (ADS)
Kłosowska, Joanna; Obara, Paulina; Gilewski, Wojciech
2018-01-01
The paper introduces the impact of the self-stress level on the behaviour of tensegrity truss structures. Displacements of real civil engineering tensegrity structures are analysed. The full-scale tensegrity tower Warnow Tower, which consists of six Simplex trusses, is considered in this paper. Three models consisting of one, two, and six modules are analysed. The analysis is performed by second- and third-order theory. Mathematica software and the Sofistik programme are applied in the analysis.
Identifying impact of software dependencies on replicability of biomedical workflows.
Miksa, Tomasz; Rauber, Andreas; Mina, Eleni
2016-12-01
Complex data-driven experiments form the basis of biomedical research. Recent findings warn that the context in which the software is run, that is, the infrastructure and the third-party dependencies, can have a crucial impact on the final results delivered by a computational experiment. This implies that in order to replicate the same result, not only must the same data be used, but it must also be run on an equivalent software stack. In this paper we present the VFramework, which enables assessing the replicability of workflows. It identifies whether any differences in software dependencies exist between two executions of the same workflow and whether they have an impact on the produced results. We also conduct a case study in which we investigate the impact of software dependencies on the replicability of Taverna workflows used in biomedical research on Huntington's disease. We re-execute the analysed workflows in environments differing in operating system distribution and configuration. The results show that the VFramework can be used to identify the impact of software dependencies on the replicability of biomedical workflows. Furthermore, we observe that despite the fact that the workflows are executed in a controlled environment, they still depend on specific tools installed in the environment. The context model used by the VFramework improves on the deficiencies of provenance traces and documents such tools as well. Based on our findings we define guidelines that enable workflow owners to improve the replicability of their workflows.
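A minimal sketch of the dependency comparison at the heart of such replicability checks (the snapshot format, package names, and versions are assumptions, not the VFramework's actual context model):

```python
def diff_dependencies(run_a, run_b):
    """Compare {package: version} snapshots captured for two workflow executions.

    Returns packages missing from one environment or pinned at different
    versions -- candidate explanations for diverging results.
    """
    report = {}
    for pkg in sorted(set(run_a) | set(run_b)):
        va, vb = run_a.get(pkg), run_b.get(pkg)
        if va != vb:
            report[pkg] = (va, vb)
    return report

# Example: the same workflow re-executed on two differently configured machines.
env_2014 = {"python": "2.7.6", "numpy": "1.8.0", "samtools": "0.1.19"}
env_2016 = {"python": "2.7.12", "numpy": "1.11.0", "bcftools": "1.3"}
print(diff_dependencies(env_2014, env_2016))
```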
SureTrak Probability of Impact Display
NASA Technical Reports Server (NTRS)
Elliott, John
2012-01-01
The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.
NASA Technical Reports Server (NTRS)
Wilmington, R. P.; Klute, Glenn K. (Editor); Carroll, Amy E. (Editor); Stuart, Mark A. (Editor); Poliner, Jeff (Editor); Rajulu, Sudhakar (Editor); Stanush, Julie (Editor)
1992-01-01
Kinematics, the study of motion exclusive of the influences of mass and force, is one of the primary methods used for the analysis of human biomechanical systems as well as other types of mechanical systems. The Anthropometry and Biomechanics Laboratory (ABL) in the Crew Interface Analysis section of the Man-Systems Division performs both human body kinematics and mechanical system kinematics using the Ariel Performance Analysis System (APAS). The APAS supports analysis of analog signals (e.g., force plate data collection) as well as digitization and analysis of video data. The current evaluations address several methodology issues concerning the accuracy of the kinematic data collection and analysis used in the ABL. This document describes a series of evaluations performed to gain quantitative data pertaining to position and constant angular velocity movements under several operating conditions. Two-dimensional as well as three-dimensional data collection and analyses were completed in a controlled laboratory environment using typical hardware setups. In addition, an evaluation was performed to assess the accuracy impact due to a single-axis camera offset. Segment length and positional data exhibited errors within 3 percent when using three-dimensional analysis and errors within 8 percent with two-dimensional analysis (Direct Linear Software). Peak angular velocities displayed errors within 6 percent through three-dimensional analyses and errors of 12 percent when using two-dimensional analysis (Direct Linear Software). The specific results from this series of evaluations, their impacts on the methodology issues of kinematic data collection and analyses, and the accuracy levels observed are presented in detail.
NASA Astrophysics Data System (ADS)
Pérez Peña, José Vicente; Baldó, Mane; Acosta, Yarci; Verschueren, Laurent; Thibaud, Kenmognie; Bilivogui, Pépé; Jean-Paul Ngandu, Alain; Beavogui, Maoro
2017-04-01
In the last decade, increasing interest in public health has promoted specific regulations for the transport, storage, transformation and/or elimination of potentially toxic waste. Special concern should focus on the effective management of biomedical waste, due to the environmental and health risks associated with it. The first stage in the effective management of this waste is the selection of the best sites for facilities for its storage and/or elimination. Best-site selection is accomplished by means of multi-criteria decision analysis (MCDA), which aims to minimize the social and environmental impact and to maximize management efficiency. In this work we present a methodology that uses open-source software and data to analyse the best location for the implementation of a centralized waste management system in a developing country (Guinea, Conakry). We applied an analytic hierarchy process (AHP) using thematic layers such as land use, soil type, distance to and type of roads, hydrography, and distance to densely populated areas. Land-use data were derived from up-to-date Sentinel 2 remote sensing images, whereas roads and hydrography were obtained from the OpenStreetMap database and later validated against administrative data. We performed the AHP analysis with the aid of the open-source QGIS Geographic Information System. This methodology is well suited to developing countries as it uses open-source software and data for the MCDA, thus reducing costs in these first stages of the integrated analysis.
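The AHP weighting step can be sketched concisely. The following Python fragment, an illustration rather than the authors' implementation, derives criterion weights from a pairwise comparison matrix via its principal eigenvector; the matrix values are invented.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Criterion weights = normalized principal eigenvector."""
    eigvals, eigvecs = np.linalg.eig(pairwise)
    principal = eigvecs[:, np.argmax(eigvals.real)].real
    return principal / principal.sum()

# Hypothetical judgments: land use vs. roads vs. hydrography.
# a_ij = how much more important criterion i is than criterion j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(ahp_weights(A))  # approximately [0.65, 0.23, 0.12]
```

The resulting weights are then applied to the rasterized thematic layers to produce the suitability map.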
A dynamic wheel-rail impact analysis of railway track under wheel flat by finite element analysis
NASA Astrophysics Data System (ADS)
Bian, Jian; Gu, Yuantong; Murray, Martin Howard
2013-06-01
Wheel-rail interaction is one of the most important research topics in railway engineering. It involves track impact response, track vibration and track safety. Track structure failures caused by wheel-rail impact forces can lead to significant economic loss for track owners through damage to rails and to the sleepers beneath. Wheel-rail impact forces occur because of imperfections in the wheels or rails, such as wheel flats, irregular wheel profiles, rail corrugations and differences in the heights of rails connected at a welded joint. A wheel flat can cause a large dynamic impact force as well as a forced vibration with a high frequency, which can damage the track structure. In the present work, a three-dimensional finite element (FE) model for the impact analysis induced by a wheel flat is developed using the FE analysis (FEA) software package ANSYS and validated against a previously validated simulation. The effect of wheel flats on impact forces is thoroughly investigated. It is found that the presence of a wheel flat significantly increases the dynamic impact force on both rail and sleeper, and that the impact force increases monotonically with the size of the wheel flat. The relationships between the impact force and the wheel flat size are derived from this FEA; they are important for track engineers seeking to improve the design and maintenance of the track system.
Analysis of e-beam impact on the resist stack in e-beam lithography process
NASA Astrophysics Data System (ADS)
Indykeiwicz, K.; Paszkiewicz, B.
2013-07-01
The paper presents research on the fabrication of sub-micron-gate AlGaN/GaN HEMT transistors by e-beam lithography and the lift-off technique. The impact of the electron beam on the resist layers and the substrate was analyzed by the Monte Carlo method in the CASINO v3.2 software. The influence of technological process parameters on the resolution and quality of metal structures was studied for paths 100 nm, 300 nm and 500 nm wide and 20 μm long. Qualitative correspondence between the simulations and the conducted experiments was obtained.
The Effect of Software Features on Software Adoption and Training in the Audit Profession
ERIC Educational Resources Information Center
Kim, Hyo-Jeong
2012-01-01
Although software has been studied with technology adoption and training research, the study of specific software features for professional groups has been limited. To address this gap, I researched the impact of software features of varying complexity on internal audit (IA) professionals. Two studies along with the development of training…
Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy
NASA Astrophysics Data System (ADS)
Bucht, Curry; Söderberg, Per; Manneberg, Göran
2010-02-01
The corneal endothelium serves as the posterior barrier of the cornea. Factors such as the clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor of the corneal endothelium. Pathological conditions and physical trauma may threaten the endothelial cell density to such an extent that the optical properties of the cornea, and thus clear eyesight, are threatened. Diagnosis of the corneal endothelium through morphometry is an important part of several clinical applications. Morphometry of the corneal endothelium is presently carried out by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sample size. This study was dedicated to the development and use of fully automated analysis of a very large range of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images, normalizing lights and contrasts. The digitally enhanced images of the corneal endothelium were Fourier transformed, using the fast Fourier transform (FFT), and stored as new images. Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier-transformed images. The data obtained from each Fourier-transformed image were used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Estimates of the cell density of the corneal endothelium were obtained by running the fully automated analysis software on 292 images captured by CSM. The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis, and a strong correlation was found.
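The density estimate from the Fourier spectrum can be illustrated with a short sketch. The Python fragment below (the paper's implementation is in Matlab) assumes a square image and a hexagonally packed endothelial mosaic; the spectrum-peak-to-spacing conversion is an illustrative assumption, not the authors' exact diffraction-theory calculation.

```python
import numpy as np

def cell_density_fft(img: np.ndarray, mm_per_px: float) -> float:
    """Estimate cell density (cells/mm^2) from the dominant ring in
    the 2-D power spectrum of a specular-microscopy image."""
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    cy, cx = np.array(power.shape) // 2
    yy, xx = np.indices(power.shape)
    r = np.hypot(yy - cy, xx - cx).astype(int)
    radial = np.bincount(r.ravel(), weights=power.ravel())
    r_peak = max(range(3, cy), key=lambda k: radial[k])  # skip DC region
    period_mm = img.shape[0] * mm_per_px / r_peak        # cell spacing
    return 2.0 / (np.sqrt(3.0) * period_mm ** 2)         # hexagonal packing
```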
ERIC Educational Resources Information Center
Urdegar, Steven M.
2014-01-01
My Virtual Reading Coach (MVRC) is an online program for students who have been identified as struggling readers. It is used as an intervention within the Response to Intervention (RtI) framework, as well as for students with disabilities. The software addresses reading sub-skills (i.e., comprehension, fluency, phonemic awareness, phonics, and…
Impact of Machine-Translated Text on Entity and Relationship Extraction
2014-12-01
1. Introduction: Using social network analysis tools is an important asset in... semantic modeling software to automatically build detailed network models from unstructured text. Contour imports unstructured text and then maps the text... onto an existing ontology of frames at the sentence level, using FrameNet, a structured language model, and through Semantic Role Labeling (SRL)...
ERIC Educational Resources Information Center
Okongo, James
2014-01-01
The failure rate of information technology (IT) development projects is a significant concern for today's organizations. Perceptions of IT project risk and project performance have been identified as important factors by scholars studying the topic, and Wallace, Keil, and Rai (2004a) developed a survey instrument to measure how dimensions of…
Dacia M. Meneguzzo; Mark H. Hansen
2009-01-01
Fragmentation metrics provide a means of quantifying and describing forest fragmentation. The most common method of calculating these metrics is through the use of Geographic Information System software to analyze raster data, such as a satellite or aerial image of the study area; however, the spatial resolution of the imagery has a significant impact on the results....
NASA Astrophysics Data System (ADS)
Dioguardi, Fabio; Mele, Daniela
2018-03-01
This paper presents PYFLOW_2.0, a hazard tool for the calculation of the impact parameters of dilute pyroclastic density currents (DPDCs). DPDCs represent the dilute turbulent type of gravity flows that occur during explosive volcanic eruptions; their hazard is the result of their mobility and the capability to laterally impact buildings and infrastructures and to transport variable amounts of volcanic ash along the path. Starting from data coming from the analysis of deposits formed by DPDCs, PYFLOW_2.0 calculates the flow properties (e.g., velocity, bulk density, thickness) and impact parameters (dynamic pressure, deposition time) at the location of the sampled outcrop. Given the inherent uncertainties related to sampling, laboratory analyses, and modeling assumptions, the program provides ranges of variations and probability density functions of the impact parameters rather than single specific values; from these functions, the user can interrogate the program to obtain the value of the computed impact parameter at any specified exceedance probability. In this paper, the sedimentological models implemented in PYFLOW_2.0 are presented, program functionalities are briefly introduced, and two application examples are discussed so as to show the capabilities of the software in quantifying the impact of the analyzed DPDCs in terms of dynamic pressure, volcanic ash concentration, and residence time in the atmosphere. The software and user's manual are made available as a downloadable electronic supplement.
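The exceedance-probability query central to the program's output can be illustrated in a few lines. This Python sketch assumes a fitted lognormal distribution of dynamic pressure, with invented parameters; PYFLOW_2.0's actual probability density functions are derived from its sedimentological models.

```python
from scipy import stats

# Assumed fit of dynamic pressure (kPa) at a sampled outcrop.
dyn_pressure_kpa = stats.lognorm(s=0.4, scale=2.0)

def value_at_exceedance(dist, p_exceed: float) -> float:
    """Return x such that P(X > x) = p_exceed."""
    return dist.isf(p_exceed)  # inverse survival function

# Dynamic pressure exceeded with only 5% probability:
print(f"{value_at_exceedance(dyn_pressure_kpa, 0.05):.2f} kPa")
```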
Hemsley, Bronwyn; Palmer, Stuart; Dann, Stephen; Balandin, Susan
2018-02-01
Articles 19, 26 and 27 of the Universal Declaration of Human Rights and Articles 4, 9 and 21 of the Convention on the Rights of Persons with Disabilities promote the human rights of communication, education, use of technology and access to information. Social media is an important form of online communication, and Twitter increases users' visibility, influence and reach online. The aim of this sociotechnical research was to determine the impact of teaching three people who use Augmentative and Alternative Communication (AAC) to use Twitter. Three participants were trained in ways of using Twitter strategically. Data collected from participants' Twitter profiles were examined to determine the impact of training on Twitter follower count, frequency of tweeting, tweet content and the development of social networks. Data were also examined using (1) KH Coder software analysis and visualisation of co-occurring networks in the text data, based on word frequencies; and (2) Gephi software analysis to show the Twitter network for each participant. Two participants showed an improvement in Twitter skills and strategies. Twitter can be used to improve social connectedness of people who use AAC, and should not be overlooked in relation to communication rights.
2017-12-04
...gap spacing. By running current through an EBL-fabricated gap array, it has been shown to be possible to impact atomic positions within a... Spectra were collected and the instrument was run using Wire 2.0 software operating on a dedicated computer. Data analysis was... accomplished using the Unaxis VLR 700 Etch PM3-Dielectric etch. For this step it is important to first run the process on a dummy wafer to...
NASA Technical Reports Server (NTRS)
Langland, R. A.; Stephens, P. L.; Pihos, G. G.
1980-01-01
The techniques used for ingesting SEASAT-A SASS wind retrievals into the existing operational software are described. The intent is to assess the impact of SEASAT data on the marine wind fields produced by the global marine wind/sea-level pressure analysis. This analysis is performed on a 2.5 deg latitude/longitude global grid and executes at three-hourly time increments. Wind fields with and without SASS winds are being compared. The problems of data volume reduction and aliased wind retrieval ambiguity are treated.
The Software Architecture of the Upgraded ESA DRAMA Software Suite
NASA Astrophysics Data System (ADS)
Kebschull, Christopher; Flegel, Sven; Gelhaus, Johannes; Mockel, Marek; Braun, Vitali; Radtke, Jonas; Wiedemann, Carsten; Vorsmann, Peter; Sanchez-Ortiz, Noelia; Krag, Holger
2013-08-01
In the beginnings of man's space flight activities there was the belief that space is so big that everybody could use it without any repercussions. However, during the last six decades the increasing use of Earth's orbits has led to rapid growth in the space debris environment, which has a big influence on current and future space missions. For this reason ESA issued the "Requirements on Space Debris Mitigation for ESA Projects" [1] in 2008, which apply to all ESA missions henceforth. The DRAMA (Debris Risk Assessment and Mitigation Analysis) software suite had been developed to support the planning of space missions to comply with these requirements. During the last year the DRAMA software suite has been upgraded under ESA contract by TUBS and DEIMOS to include additional tools and increase the performance of existing ones. This paper describes the overall software architecture of the ESA DRAMA software suite. Specifically, the new graphical user interface, which manages the five main tools ARES (Assessment of Risk Event Statistics), MIDAS (MASTER-based Impact Flux and Damage Assessment Software), OSCAR (Orbital Spacecraft Active Removal), CROC (Cross Section of Complex Bodies) and SARA (Re-entry Survival and Risk Analysis), is discussed. The advancements are highlighted, as well as the challenges that arise from the integration of the five tool interfaces. A framework had been developed at the ILR and was used for MASTER-2009 and PROOF-2009. The Java-based GUI framework enables cross-platform deployment, and its underlying model-view-presenter (MVP) software pattern meets the strict design requirements necessary to ensure a robust and reliable method of operation in an environment where the GUI is separated from the processing back-end. While the GUI framework evolved with each project, allowing an increasing degree of integration of services like validators for input fields, it has also increased in complexity. The paper concludes with an outlook on the future development of the GUI framework, showing where the potential for advancements lies.
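The MVP separation mentioned above can be shown in miniature. This is a generic Python sketch of the pattern, not the DRAMA suite's Java code; the tool name is taken from the abstract, everything else is invented.

```python
class Model:
    """Stand-in for the processing back-end."""
    def run_tool(self, name: str) -> str:
        return f"results of {name}"

class View:
    """Stand-in for the GUI; knows nothing about the back-end."""
    def __init__(self):
        self.presenter = None
    def button_clicked(self, tool: str):
        self.presenter.on_run_requested(tool)   # forward the event
    def show(self, text: str):
        print(text)

class Presenter:
    """Mediates between view and model, keeping them decoupled."""
    def __init__(self, model: Model, view: View):
        self.model, self.view = model, view
        view.presenter = self
    def on_run_requested(self, tool: str):
        self.view.show(self.model.run_tool(tool))

view = View()
Presenter(Model(), view)
view.button_clicked("ARES")  # prints: results of ARES
```

The point of the pattern is that the view can be replaced (or the back-end run headless) without touching the other layers.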
Utility of coupling nonlinear optimization methods with numerical modeling software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Murphy, M.J.
1996-08-05
Results of using GLO (Global Local Optimizer), a general-purpose nonlinear optimization software package for investigating multi-parameter problems in science and engineering, are discussed. The package consists of the modular optimization control system (GLO), a graphical user interface (GLO-GUI), a pre-processor (GLO-PUT), a post-processor (GLO-GET), and the nonlinear optimization software modules GLOBAL and LOCAL. GLO is designed for controlling, and easily coupling to, any scientific software application. GLO runs the optimization module and the scientific software application in an iterative loop. At each iteration, the optimization module defines new values for the set of parameters being optimized. GLO-PUT inserts the new parameter values into the input file of the scientific application. GLO runs the application with the new parameter values. GLO-GET determines the value of the objective function by extracting the results of the analysis and comparing them to the desired result. GLO continues to run the scientific application over and over until it finds the "best" set of parameters by minimizing (or maximizing) the objective function. An example problem showing the optimization of a material model is presented (Taylor cylinder impact test).
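The iterative couple-and-score loop is easy to picture in code. Below is a hedged Python sketch of such a loop using a generic optimizer; the file names, the ./impact_sim executable, and the output parsing are placeholders, not GLO's actual formats.

```python
import subprocess
from scipy.optimize import minimize

TARGET_LENGTH = 32.1  # desired final Taylor-cylinder length (assumed units)

def objective(params):
    # "GLO-PUT" step: write the trial parameter into the input file.
    with open("model.in", "w") as f:
        f.write(f"yield_stress={params[0]}\n")
    # Run the scientific application with the new parameter value.
    subprocess.run(["./impact_sim", "model.in"], check=True)
    # "GLO-GET" step: extract the result and score it against the target.
    with open("model.out") as f:
        final_length = float(f.read().split("final_length=")[1])
    return (final_length - TARGET_LENGTH) ** 2

best = minimize(objective, x0=[300.0], method="Nelder-Mead")
print(best.x)  # "best" material-model parameter found
```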
Toward improved peptide feature detection in quantitative proteomics using stable isotope labeling.
Nilse, Lars; Sigloch, Florian Christoph; Biniossek, Martin L; Schilling, Oliver
2015-08-01
Reliable detection of peptides in LC-MS data is a key algorithmic step in the analysis of quantitative proteomics experiments. While highly abundant peptides can be detected reliably by most modern software tools, there is much less agreement on medium- and low-intensity peptides in a sample. The choice of software tools can have a big impact on the quantification of proteins, especially for proteins that appear in lower concentrations. However, in many experiments, it is precisely this region of less abundant but substantially regulated proteins that holds the biggest potential for discoveries. This is particularly true for discovery proteomics in the pharmacological sector with a specific interest in key regulatory proteins. In this viewpoint article, we discuss how the development of novel software algorithms allows us to study this region of the proteome with increased confidence. Reliable results are one of many aspects to be considered when deciding on a bioinformatics software platform. Deployment into existing IT infrastructures, compatibility with other software packages, scalability, automation, flexibility, and support need to be considered and are briefly addressed in this viewpoint article.
Software Component Technologies and Space Applications
NASA Technical Reports Server (NTRS)
Batory, Don
1995-01-01
In the near future, software systems will be more reconfigurable than hardware. This will be possible through the advent of software component technologies which have been prototyped in universities and research labs. In this paper, we outline the foundations for those technologies and suggest how they might impact software for space applications.
Ethics in computer software design and development
Alan J. Thomson; Daniel L. Schmoldt
2001-01-01
Over the past 20 years, computer software has become integral and commonplace for operational and management tasks throughout agricultural and natural resource disciplines. During this software infusion, however, little thought has been afforded human impacts, both good and bad. This paper examines current ethical issues of software system design and development in...
NASA Technical Reports Server (NTRS)
Becker, D. D.
1980-01-01
The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.
Regression Verification Using Impact Summaries
NASA Technical Reports Server (NTRS)
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve the scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve the scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs to scale regression verification. We partition the program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound.
In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction- and decomposition-based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the klee Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
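The final equivalence check reduces to a satisfiability query. The fragment below is an illustrative Python sketch using Z3's bindings, one of the solvers cited above; the two expressions stand in for impact summaries of corresponding behaviors in two program versions.

```python
from z3 import Int, Solver, unsat

x = Int("x")
summary_v1 = x + x   # impacted behavior, version 1 (toy example)
summary_v2 = 2 * x   # impacted behavior, version 2 (toy example)

s = Solver()
s.add(summary_v1 != summary_v2)  # look for a witness of inequivalence
print("equivalent" if s.check() == unsat else "not equivalent")
```

If the solver finds no input on which the summaries differ, the impacted behaviors, and hence the two versions up to the depth bound, are equivalent.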
Failure analysis and modeling of a multicomputer system. M.S. Thesis
NASA Technical Reports Server (NTRS)
Subramani, Sujatha Srinivasan
1990-01-01
This thesis describes the results of an extensive measurement-based analysis of real error data collected from a 7-machine DEC VaxCluster multicomputer system. In addition to evaluating basic system error and failure characteristics, we develop reward models to analyze the impact of failures and errors on the system. The results show that, although 98 percent of errors in the shared resources recover, they result in 48 percent of all system failures. The analysis of rewards shows that the expected reward rate for the VaxCluster decreases to 0.5 in 100 days for a 3-out-of-7 model, which is well over 100 times that for a 7-out-of-7 model. A comparison of the reward rates for a range of k-out-of-n models indicates that the maximum increase in reward rate (0.25) occurs in going from the 6-out-of-7 model to the 5-out-of-7 model. The analysis also shows that software errors have the lowest reward (0.2 vs. 0.91 for network errors). The large loss in reward rate for software errors is due to the fact that a large proportion (94 percent) of software errors lead to failure. In comparison, the high reward rate for network errors is due to fast recovery from a majority of these errors (the median recovery duration is 0 seconds).
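As a back-of-the-envelope illustration of the k-out-of-n comparison (not the thesis's measurement-based reward model), the availability of a k-out-of-n system with independent, identically available machines can be computed directly:

```python
from math import comb

def k_out_of_n_up(k: int, n: int, avail: float) -> float:
    """P(at least k of n machines are up), i.i.d. availability."""
    return sum(comb(n, m) * avail**m * (1 - avail)**(n - m)
               for m in range(k, n + 1))

a = 0.90  # assumed per-machine availability
for k in (7, 6, 5, 3):
    print(f"{k}-out-of-7: {k_out_of_n_up(k, 7, a):.4f}")
```

This reproduces the qualitative pattern reported above: relaxing the requirement from 7-out-of-7 toward 3-out-of-7 raises availability steeply at first and then saturates.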
A controlled experiment on the impact of software structure on maintainability
NASA Technical Reports Server (NTRS)
Rombach, Dieter H.
1987-01-01
The impact of software structure on maintainability aspects including comprehensibility, locality, modifiability, and reusability in a distributed system environment is studied in a controlled maintenance experiment involving six medium-size distributed software systems implemented in LADY (language for distributed systems) and six in an extended version of sequential PASCAL. For all maintenance aspects except reusability, the results were quantitatively given in terms of complexity metrics which could be automated. The results showed LADY to be better suited to the development of maintainable software than the extension of sequential PASCAL. The strong typing combined with high parametrization of units is suggested to improve the reusability of units in LADY.
NASA Astrophysics Data System (ADS)
Funke, H. H.-W.; Keinz, J.; Börner, S.; Hendrick, P.; Elsing, R.
2016-07-01
The paper highlights the modification of the engine control software of the hydrogen (H2)-converted gas turbine Auxiliary Power Unit (APU) GTCP 36-300, allowing safe and accurate methane (CH4) operation without mechanical changes to the metering unit. The acceleration and deceleration characteristics of the engine controller from idle to maximum load are analyzed, comparing H2 and CH4. The paper also presents the influence of the different fuels on the thermodynamic cycle of the gas turbine, supported by a gas turbine cycle simulation of H2 and CH4 using the software GasTurb.
Ristov, Strahil; Brajkovic, Vladimir; Cubric-Curik, Vlatka; Michieli, Ivan; Curik, Ino
2016-09-10
Identification of genes or even nucleotides that are responsible for quantitative and adaptive trait variation is a difficult task due to the complex interdependence between a large number of genetic and environmental factors. The polymorphism of the mitogenome is one of the factors that can contribute to quantitative trait variation. However, the effects of the mitogenome have not been comprehensively studied, since large numbers of mitogenome sequences and recorded phenotypes are required to reach adequate power of analysis. Current research in our group focuses on acquiring the necessary mitochondrial sequence information and analysing its influence on the phenotype of a quantitative trait. To facilitate these tasks we have produced software for processing pedigrees that is optimised for maternal lineage analysis. We present MaGelLAn 1.0 (maternal genealogy lineage analyser), a suite of four Python scripts (modules) designed to facilitate the analysis of the impact of mitogenome polymorphism on quantitative trait variation by combining molecular and pedigree information. MaGelLAn 1.0 is primarily used to: (1) optimise the sampling strategy for molecular analyses; (2) identify and correct pedigree inconsistencies; and (3) identify maternal lineages and assign the corresponding mitogenome sequences to all individuals in the pedigree, this information being used as input to any of the standard software for quantitative genetic (association) analysis. In addition, MaGelLAn 1.0 allows computing the mitogenome (maternal) effective population sizes and the probability of mitogenome (maternal) identity, which are useful for the conservation management of small populations. MaGelLAn is the first tool for pedigree analysis that focuses on quantitative genetic analyses of mitogenome data. It is conceived to significantly reduce the effort of handling and preparing large pedigrees for processing the information linked to maternal lines. The software source code, along with the manual and the example files, can be downloaded at http://lissp.irb.hr/software/magellan-1-0/ and https://github.com/sristov/magellan .
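The core lineage-tracing step can be sketched in a few lines of Python (MaGelLAn itself is a suite of Python scripts, though the code below is an illustration, not its implementation): follow dam links to the female founder, then assign that founder's mitogenome sequence to every descendant.

```python
def maternal_founder(animal: str, dam_of: dict) -> str:
    """Walk up the dam links until a founder (dam unknown) is reached."""
    while dam_of.get(animal) is not None:
        animal = dam_of[animal]
    return animal

# Toy pedigree: child -> dam (None marks a founder dam).
dam_of = {"calf1": "cow_A", "cow_A": "cow_F", "cow_F": None,
          "calf2": "cow_B", "cow_B": None}
# Mitogenome haplotypes sequenced only for the founders.
mitogenome = {"cow_F": "haplotype_T3", "cow_B": "haplotype_T2"}

for animal in ("calf1", "calf2"):
    print(animal, mitogenome[maternal_founder(animal, dam_of)])
```

Sequencing one animal per maternal line then suffices to annotate the whole pedigree, which is why lineage identification drives the sampling strategy.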
The Integrated Hazard Analysis Integrator
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Massie, Michael J.
2009-01-01
Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is the tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As seen in both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and sufficient requirements of one of the significant contributors to mission success: the IHA integrator. Discussion will describe both the mindset required and the deleterious assumptions/behaviors to avoid when integrating within a large-scale system.
Simulation of cooling efficiency via miniaturised channels in multilayer LTCC for power electronics
NASA Astrophysics Data System (ADS)
Pietrikova, Alena; Girasek, Tomas; Lukacs, Peter; Welker, Tilo; Müller, Jens
2017-03-01
The aim of this paper is a detailed investigation, via simulation software, of the thermal resistance, flow behaviour and distribution of coolant, as well as the thermal distribution inside multilayer LTCC substrates with embedded channels for power electronic devices. For this purpose, four different structures of internal channels in the multilayer LTCC substrates were designed and simulated. The impact of the volume flow, the channel structure, and the power loss of the chip was simulated, calculated and analyzed using the simulation software Mentor Graphics FloEFDTM. The structure, size and location of the channels have a significant impact on the thermal resistance, the coolant pressure, and the effectiveness of cooling the power components (chips) placed on top of the LTCC substrate. The main contribution of this paper is the thermal analysis and optimization of four cooling-channel variants embedded in a multilayer LTCC structure. The paper investigates the effect of the volume flow in the cooling channels on achieving the lowest thermal resistance of an LTCC substrate loaded by power chips, and also shows the impact of the first chip's thermal load on the second chip. If practically realized, this new technology could ensure effective cooling and increase the reliability of high-power modules.
Integrated analysis of engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1981-01-01
The need for light, durable, fuel-efficient, cost-effective aircraft requires the development of engine structures which are flexible, made from advanced materials (including composites), resist higher temperatures, maintain tighter clearances and have lower maintenance costs. The formal quantification of any or several of these requires integrated computer programs (multilevel and/or interdisciplinary analysis programs interconnected) for engine structural analysis/design. Several integrated analysis computer programs are under development at Lewis Research Center. These programs include: (1) COBSTRAN-Composite Blade Structural Analysis, (2) CODSTRAN-Composite Durability Structural Analysis, (3) CISTRAN-Composite Impact Structural Analysis, (4) STAEBL-Structural Tailoring of Engine Blades, and (5) ESMOSS-Engine Structures Modeling Software System. Three other related programs, developed under Lewis sponsorship, are also described.
A Validation Metrics Framework for Safety-Critical Software-Intensive Systems
2009-03-01
...so does its definition, tools, and techniques, including means for measuring the validation activity, its outputs, and impact on development... independent of the SDLP. When considering the above SDLPs from the safety engineering team's perspective, there are also large impacts on the way... impact. Interpretation of any actionable metric data will need to be undertaken in the context of the SDLP. Safety Input: The software safety...
Four applications of a software data collection and analysis methodology
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.
1985-01-01
The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.
Hay, L.; Knapp, L.
1996-01-01
Investigating natural, potential, and man-induced impacts on hydrological systems commonly requires complex modelling with overlapping data requirements, and massive amounts of one- to four-dimensional data at multiple scales and formats. Given the complexity of most hydrological studies, the requisite software infrastructure must incorporate many components including simulation modelling, spatial analysis and flexible, intuitive displays. There is a general requirement for a set of capabilities to support scientific analysis which, at this time, can only come from an integration of several software components. Integration of geographic information systems (GISs) and scientific visualization systems (SVSs) is a powerful technique for developing and analysing complex models. This paper describes the integration of an orographic precipitation model, a GIS and a SVS. The combination of these individual components provides a robust infrastructure which allows the scientist to work with the full dimensionality of the data and to examine the data in a more intuitive manner.
Laptop Use, Interactive Science Software, and Science Learning Among At-Risk Students
NASA Astrophysics Data System (ADS)
Zheng, Binbin; Warschauer, Mark; Hwang, Jin Kyoung; Collins, Penelope
2014-08-01
This year-long, quasi-experimental study investigated the impact of the use of netbook computers and interactive science software on fifth-grade students' science learning processes, academic achievement, and interest in further science, technology, engineering, and mathematics (STEM) study within a linguistically diverse school district in California. Analysis of students' state standardized science test scores indicated that the program helped close gaps in scientific achievement between at-risk learners (i.e., English learners, Hispanics, and free/reduced-lunch recipients) and their counterparts. Teacher and student interviews and classroom observations suggested that computer-supported visual representations and interactions supported diverse learners' scientific understanding and inquiry and enabled more individualized and differentiated instruction. Finally, interviews revealed that the program had a positive impact on students' motivation in science and on their interest in pursuing science-related careers. This study suggests that technology-facilitated science instruction is beneficial for improving at-risk students' science achievement, scaffolding students' scientific understanding, and strengthening students' motivation to pursue STEM-related careers.
Biomechanical Analysis of Locust Jumping in a Physically Realistic Virtual Environment
NASA Astrophysics Data System (ADS)
Cofer, David; Cymbalyuk, Gennady; Heitler, William; Edwards, Donald
2008-03-01
The biomechanical and neural components that underlie locust jumping have been extensively studied. Previous research suggested that jump energy is stored primarily in the extensor apodeme, and in a band of cuticle called the semi-lunar process (SLP). As it has thus far proven impossible to experimentally alter the SLP without rendering a locust unable to jump, it has not been possible to test whether the energy stored in the SLP has a significant impact on the jump. To address problems such as this we have developed a software toolkit, AnimatLab, which allows researchers to build and test virtual organisms. We used this software to build a virtual locust, and then asked how the SLP is utilized during jumping. The results show that without the SLP the jump distance was reduced by almost half. Further, the simulations were also able to show that loss of the SLP had a significant impact on the final phase of the jump. We are currently working on postural control mechanisms for targeted jumping in locust.
GIS-Based Noise Simulation Open Source Software: N-GNOIS
NASA Astrophysics Data System (ADS)
Vijay, Ritesh; Sharma, A.; Kumar, M.; Shende, V.; Chakrabarti, T.; Gupta, Rajesh
2015-12-01
Geographical information system (GIS)-based noise simulation software (N-GNOIS) has been developed to simulate the noise scenario due to point and mobile sources, considering the impact of geographical features and meteorological parameters. These are addressed in the software through attenuation modules for atmosphere, vegetation and barriers. N-GNOIS is a user-friendly, platform-independent and Open Geospatial Consortium (OGC)-compliant software package. It has been developed using open-source technology (QGIS) and an open-source language (Python). N-GNOIS has unique features such as the cumulative impact of point and mobile sources, building structures and honking due to traffic. Honking is a common phenomenon in developing countries and is frequently observed on all types of roads. N-GNOIS also helps in designing physical barriers and vegetation cover to check the propagation of noise, and acts as a decision-making tool for planning and managing the noise component of environmental impact assessment (EIA) studies.
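The kind of calculation an attenuation module performs can be illustrated for a single point source. The Python sketch below uses free-field spherical spreading from a source sound power level with lumped attenuation terms; the numbers are illustrative and not N-GNOIS's actual models.

```python
import math

def receiver_level(lw_db: float, r_m: float, a_atm: float = 0.0,
                   a_veg: float = 0.0, a_barrier: float = 0.0) -> float:
    """Sound pressure level at distance r_m from a point source of
    sound power level lw_db, minus lumped attenuation terms (dB)."""
    spreading = 20 * math.log10(r_m) + 11  # free-field spherical spreading
    return lw_db - spreading - a_atm - a_veg - a_barrier

# Hypothetical horn source heard through a 3 dB vegetation belt at 50 m.
print(f"{receiver_level(lw_db=100, r_m=50, a_veg=3):.1f} dB")  # ~52.0 dB
```

Cumulative impact from several sources would then be obtained by energy summation of the individual receiver levels.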
Realizing software longevity over a system's lifetime
NASA Astrophysics Data System (ADS)
Lanclos, Kyle; Deich, William T. S.; Kibrick, Robert I.; Allen, Steven L.; Gates, John
2010-07-01
A successful instrument or telescope will measure its productive lifetime in decades; over that period, the technology behind the control hardware and software will evolve, and be replaced on a per-component basis. These new components must successfully integrate with the old, and the difficulty of that integration depends strongly on the design decisions made over the course of the facility's history. The same decisions impact the ultimate success of each upgrade, as measured in terms of observing efficiency and maintenance cost. We offer a case study of these critical design decisions, analyzing the layers of software deployed for instruments under the care of UCO/Lick Observatory, including recent upgrades to the Low Resolution Imaging Spectrometer (LRIS) at Keck Observatory in Hawaii, as well as the Kast spectrograph, Lick Adaptive Optics system, and Hamilton spectrograph, all at Lick Observatory's Shane 3-meter Telescope at Mt. Hamilton. These issues play directly into design considerations for the software intended for use at the next generation of telescopes, such as the Thirty Meter Telescope. We conduct our analysis with the future of observational astronomy infrastructure firmly in mind.
Geographic Information Systems and Web Page Development
NASA Technical Reports Server (NTRS)
Reynolds, Justin
2004-01-01
The Facilities Engineering and Architectural Branch is responsible for the design and maintenance of buildings, laboratories, and civil structures. In order to improve efficiency and quality, the FEAB has dedicated itself to establishing a data infrastructure based on Geographic Information Systems, GIS. The value of GIS was explained in an article dating back to 1980 entitled "Need for a Multipurpose Cadastre," which stated, "There is a critical need for a better land-information system in the United States to improve land-conveyance procedures, furnish a basis for equitable taxation, and provide much-needed information for resource management and environmental planning." Scientists and engineers both point to GIS as the solution. What is GIS? According to most textbooks, a Geographic Information System is a class of software that stores, manages, and analyzes mappable features on, above, or below the surface of the earth. GIS software is essentially database management software applied to the management of spatial data and information. Simply put, Geographic Information Systems manage, analyze, chart, graph, and map spatial information. GIS can be broken down into two main categories, urban GIS and natural resource GIS. Further still, natural resource GIS can be broken down into six sub-categories: agriculture, forestry, wildlife, catchment management, archaeology, and geology/mining. Agricultural GIS has several applications, such as agricultural capability analysis, land conservation, market analysis, or whole-farm planning. Forestry GIS can be used for timber assessment and management, harvest scheduling and planning, environmental impact assessment, and pest management. GIS, when used in wildlife applications, enables the user to assess and manage habitats, identify and track endangered and rare species, and monitor impact assessment.
Impact of electric vehicles on the IEEE 34 node distribution infrastructure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, Zeming; Shalalfel, Laith; Beshir, Mohammed J.
2014-10-01
With the growing penetration of electric vehicles into our daily life owing to their economic and environmental benefits, there will be both opportunities and challenges for utilities adopting plug-in electric vehicles (PEV) in the distribution network. In this study, a thorough analysis based on a real-world project is conducted to evaluate the impacts of electric vehicle infrastructure on the grid with respect to system load flow, load factor, and voltage stability. The IEEE 34 node test feeder was selected and tested with different case scenarios, utilizing the electrical distribution design (EDD) software to find the potential impacts on the grid.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Groth, Katrina M.; Zumwalt, Hannah Ruth; Clark, Andrew Jordan
2016-03-01
Hydrogen Risk Assessment Models (HyRAM) is a prototype software toolkit that integrates data and methods relevant to assessing the safety of hydrogen fueling and storage infrastructure. The HyRAM toolkit integrates deterministic and probabilistic models for quantifying accident scenarios, predicting physical effects, and characterizing the impact of hydrogen hazards, including thermal effects from jet fires and thermal pressure effects from deflagration. HyRAM version 1.0 incorporates generic probabilities for equipment failures for nine types of components, and probabilistic models for the impact of heat flux on humans and structures, with computationally and experimentally validated models of various aspects of gaseous hydrogen release and flame physics. This document provides an example of how to use HyRAM to conduct analysis of a fueling facility. This document will guide users through the software and how to enter and edit certain inputs that are specific to the user-defined facility. Description of the methodology and models contained in HyRAM is provided in [1]. This User's Guide is intended to capture the main features of HyRAM version 1.0 (any HyRAM version numbered as 1.0.X.XXX). This user guide was created with HyRAM 1.0.1.798. Due to ongoing software development activities, newer versions of HyRAM may have differences from this guide.
NASA Astrophysics Data System (ADS)
Mohan, C.
In this paper, I survey briefly some of the recent and emerging trends in hardware and software features which impact high performance transaction processing and data analytics applications. These features include multicore processor chips, ultra large main memories, flash storage, storage class memories, database appliances, field programmable gate arrays, transactional memory, key-value stores, and cloud computing. While some applications, e.g., Web 2.0 ones, were initially built without traditional transaction processing functionality in mind, slowly system architects and designers are beginning to address such previously ignored issues. The availability, analytics and response time requirements of these applications were initially given more importance than ACID transaction semantics and resource consumption characteristics. A project at IBM Almaden is studying the implications of phase change memory on transaction processing, in the context of a key-value store. Bitemporal data management has also become an important requirement, especially for financial applications. Power consumption and heat dissipation properties are also major considerations in the emergence of modern software and hardware architectural features. Considerations relating to ease of configuration, installation, maintenance and monitoring, and improvement of total cost of ownership have resulted in database appliances becoming very popular. The MapReduce paradigm is now quite popular for large scale data analysis, in spite of the major inefficiencies associated with it.
ERIC Educational Resources Information Center
Wood, Clare; Pillinger, Claire; Jackson, Emma
2010-01-01
This paper reports an extended analysis of the study reported in [Wood, C. (2005). "Beginning readers' use of 'talking books' software can affect their reading strategies." "Journal of Research in Reading, 28," 170-182.], in which five and six-year-old children received either six sessions using specially designed talking books or six sessions of…
Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad
2015-01-01
Cephalometric norms of the McNamara analysis have been studied in various populations due to their optimal efficiency. Dolphin cephalometric software greatly facilitates this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English would greatly help Farsi-speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this diagnostic, descriptive study, 150 lateral cephalograms of individuals with normal occlusion were selected in Mashhad and Qazvin, two major cities of Iran mainly populated by Fars, the main Iranian ethnic group. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and with the new software. The cephalometric software was designed using Microsoft Visual C++ in Windows XP. Measurements made with the new software were compared with those of Dolphin software on both series of cephalograms. Validity and reliability were tested using the intra-class correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin (ICC 0.570-1.0), confirming the validity and efficacy of the newly designed software. According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning and assessment of treatment outcome.
NASA Astrophysics Data System (ADS)
Hisan Farjana, Shahjadi; Huda, Nazmul; Parvez Mahmud, M. A.
2018-05-01
European mining industries form a vast industrial sector that contributes substantially to the European economy and comprises ferrous and non-ferrous metal and mineral industries. The non-ferrous metal extraction and processing industries require particular attention due to sustainability concerns, as their manufacturing processes are highly energy-intensive and have global environmental impacts. This paper analyses the major environmental effects caused by European metal industries using life-cycle impact analysis. It is the first work to consider a comparative environmental impact analysis of European non-ferrous metal industries, revealing their technological similarities and dissimilarities in order to assess their environmental loads. The life-cycle inventory datasets are collected from the EcoInvent database, while the analysis is performed with the CML baseline and ReCiPe endpoint methods using SimaPro software version 8.4. The CML and ReCiPe methods were chosen because they are impact assessment methods specialized for the European continent. The impact categories discussed here are human health, global warming and ecotoxicity. The results reveal that the gold industry is harmful to the environment due to waste emissions, with silver mining showing a similar but smaller effect, whereas copper, lead, manganese and zinc mining processes and industries are comparatively environmentally friendly in terms of metal extraction technologies and waste emissions.
Weather Augmented Risk Determination (WARD) System
NASA Astrophysics Data System (ADS)
Niknejad, M.; Mazdiyasni, O.; Momtaz, F.; AghaKouchak, A.
2017-12-01
Extreme climatic events have direct and indirect impacts on society, the economy and the environment. Based on United States Bureau of Economic Analysis (BEA) data, over one third of U.S. GDP can be considered weather-sensitive, involving some degree of weather risk. This expands from local-scale concrete foundation construction to large-scale transportation systems. Extreme and unexpected weather conditions have always been considered one of the probable risks to human health, productivity and activities. The construction industry is a large sector of the economy and is greatly influenced by weather-related risks, including work stoppage and low labor productivity. Identification and quantification of these risks, and mitigation of their effects, are constant concerns of construction project managers. In addition to the destructive effects of severe weather, seasonal changes in weather conditions can also have negative impacts on human health. Work stoppage and reduced labor productivity can be caused by precipitation, wind, temperature, relative humidity and other weather conditions. Historical and project-specific weather information can support better project management and mitigation planning, and ultimately reduce the risk of weather-related conditions. This paper proposes new software for project-specific, user-defined data analysis that offers (a) the probability of work stoppage and the estimated project length considering weather conditions; (b) information on reduced labor productivity and its impacts on project duration; and (c) probabilistic information on the project timeline based on both weather-related work stoppage and labor productivity. The software (WARD System) is designed such that it can be integrated into already available project management tools. While the system and the presented application focus on the construction industry, the developed software is general and can be used for any application that involves labor productivity (e.g., farming) or work stoppage due to weather conditions (e.g., transportation, the agriculture industry).
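As a sketch of measure (a), the following illustrates how a stoppage probability and an inflated schedule could be derived from historical daily weather; the column names and thresholds are hypothetical and not the WARD System's actual inputs.

```python
import pandas as pd

def stoppage_probability(df, rain_mm=10.0, wind_ms=15.0, t_low=-5.0, t_high=38.0):
    """P(workday lost) per calendar month, from a daily weather frame
    indexed by date with precip_mm, wind_ms and temp_c columns."""
    stopped = (
        (df["precip_mm"] > rain_mm)
        | (df["wind_ms"] > wind_ms)
        | (df["temp_c"] < t_low)
        | (df["temp_c"] > t_high)
    )
    return stopped.groupby(df.index.month).mean()

def expected_duration(workdays_planned, p_stop):
    # each calendar day yields (1 - p_stop) productive days on average
    return workdays_planned / (1.0 - p_stop)

# usage: p = stoppage_probability(weather); print(expected_duration(120, p.mean()))
```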
Learning from Software Localization.
ERIC Educational Resources Information Center
Guo, She-Sen
2003-01-01
Localization is the process of adapting a product to meet the language, cultural and other requirements of a specific target environment or market. This article describes ways in which software localization impacts upon curriculum, and discusses what students will learn from software localization. (AEF)
Automatic thermographic image defect detection of composites
NASA Astrophysics Data System (ADS)
Luo, Bin; Liebenberg, Bjorn; Raymont, Jeff; Santospirito, SP
2011-05-01
Detecting defects, and especially reliably measuring defect sizes, are critical objectives in automatic NDT defect detection applications. In this work, the Sentence software is proposed for the analysis of pulsed thermography and near-IR images of composite materials. Furthermore, the Sentence software delivers an end-to-end, user-friendly platform for engineers to perform complete manual inspections, as well as tools that allow senior engineers to develop inspection templates and profiles, reducing the requisite thermographic skill level of the operating engineer. Finally, the Sentence software can also offer complete independence from operator decisions through the fully automated "Beep on Defect" detection functionality. The end-to-end automatic inspection system includes sub-systems for defining a panel profile, generating an inspection plan, controlling a robot arm and capturing thermographic images to detect defects. A statistical model has been built to analyze the entire image, evaluate grey-scale ranges, import sentencing criteria and automatically detect impact damage defects. A full width at half maximum algorithm has been used to quantify the flaw sizes. The identified defects are imported into the sentencing engine, which then sentences the inspection (automatically compares analysis results against acceptance criteria) by comparing the most significant defect or group of defects against the inspection standards.
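As an illustration of the flaw-sizing step, here is a minimal full-width-at-half-maximum sketch over a 1-D grey-level profile across an indication; it is not the Sentence implementation, and sub-pixel interpolation at the half-maximum crossings is omitted.

```python
import numpy as np

def fwhm_extent(profile, px_size=1.0):
    """Flaw extent from a 1-D grey-level profile via full width at half
    maximum; px_size converts pixels to physical units."""
    profile = np.asarray(profile, dtype=float)
    base = profile.min()
    half = base + 0.5 * (profile.max() - base)
    above = np.where(profile >= half)[0]   # indices at or above half maximum
    if above.size == 0:
        return 0.0
    return (above[-1] - above[0]) * px_size

print(fwhm_extent([0, 1, 4, 9, 10, 8, 3, 1, 0], px_size=0.5))  # extent in mm
```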
Mendez Astudillo, Jorge; Lau, Lawrence; Tang, Yu-Ting; Moore, Terry
2018-02-14
As Global Navigation Satellite System (GNSS) signals travel through the troposphere, a tropospheric delay occurs due to a change in the refractive index of the medium. The Precise Point Positioning (PPP) technique can achieve centimeter/millimeter positioning accuracy with only one GNSS receiver. The Zenith Tropospheric Delay (ZTD) is estimated alongside the position unknowns in PPP. Estimated ZTD can be very useful for meteorological applications; one example is the estimation of water vapor content in the atmosphere from the estimated ZTD. PPP is implemented with different algorithms and models in online services and software packages. In this study, a performance assessment with analysis of ZTD estimates from three PPP online services and three software packages is presented. The main contribution of this paper is to show the accuracy of ZTD estimation achievable in PPP. The analysis also gives GNSS users and researchers insight into how the processing algorithms affect PPP ZTD estimation. Observation data from eight whole days at a total of nine International GNSS Service (IGS) tracking stations spread across the northern hemisphere, the equatorial region and the southern hemisphere are used in this analysis. The PPP ZTD estimates are compared with the ZTD obtained from the IGS tropospheric product for the same days. The estimates of two of the three online PPP services show good agreement (<1 cm) with the IGS ZTD values at the northern and southern hemisphere stations. The results also show that the online PPP services perform better than the selected PPP software packages at all stations.
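The agreement figures quoted above reduce to simple bias/RMS statistics against the IGS product; a minimal sketch (epoch matching and outlier screening assumed done upstream):

```python
import numpy as np

def ztd_agreement(ztd_ppp, ztd_igs):
    """Bias and RMS of PPP-derived ZTD against the IGS tropospheric product.
    Both inputs are matched time series in metres."""
    d = np.asarray(ztd_ppp, float) - np.asarray(ztd_igs, float)
    return d.mean(), np.sqrt(np.mean(d ** 2))

# e.g. bias, rms = ztd_agreement(service_ztd, igs_ztd)
# rms below 0.01 m corresponds to the "<1 cm" agreement reported above
```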
Corny, Lame, and Useful: How Secondary School Faculty Use the Learning Software Odyssey
ERIC Educational Resources Information Center
Silberman, Peter
2013-01-01
This study investigated how and why secondary school faculty use the learning software, Odyssey. Faculty were asked about their pedagogical beliefs and instructional practices, the perceived impact of using the software on student outcomes, and how, if at all, Odyssey-using faculty would change the software to increase utility. Data collection…
The Value of Open Source Software Tools in Qualitative Research
ERIC Educational Resources Information Center
Greenberg, Gary
2011-01-01
In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…
Impact detection and analysis/health monitoring system for composites
NASA Astrophysics Data System (ADS)
Child, James E.; Kumar, Amrita; Beard, Shawn; Qing, Peter; Paslay, Don G.
2006-05-01
This manuscript includes information from test evaluations and development of a smart event detection system for use in monitoring composite rocket motor cases for damaging impacts. The primary purpose of the system, as a sentry for case impact event logging, is accomplished through implementation of a passive network of miniaturized piezoelectric sensors, a logger with pre-determined force threshold levels, and analysis software. Empirical approaches to structural characterization and network calibration, along with implementation techniques, were successfully evaluated; testing was performed on both unloaded (without propellant) and loaded rocket motors, with the cylindrical areas being the primary focus. The logged test impact data with known physical network parameters provided for impact location as well as force determination, typically within 3 inches of the actual impact location using a 4-foot network grid, and with force accuracy within 25% of the actual impact force. The simple empirical characterization approach, along with the robust, flexible sensor grids and battery-operated portable logger, shows promise of a system that can increase confidence in composite integrity both for new assets progressing through manufacturing processes and for existing assets in storage or transportation.
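For intuition on how a passive sensor grid can localize an impact from arrival times, here is a minimal time-difference-of-arrival grid search; the fielded system relies on empirical network calibration rather than the constant wave speed assumed here, and all names and values are hypothetical.

```python
import numpy as np

def locate_impact(sensors, t_arrival, wave_speed, grid_step=0.01, extent=1.2):
    """Grid-search impact location from sensor first-arrival times.

    sensors: (n, 2) sensor x/y positions in metres
    t_arrival: (n,) threshold-crossing times in seconds
    """
    sensors = np.asarray(sensors, float)
    t = np.asarray(t_arrival, float)
    xs = np.arange(0.0, extent, grid_step)
    best, best_err = None, np.inf
    for x in xs:
        for y in xs:
            d = np.hypot(sensors[:, 0] - x, sensors[:, 1] - y)
            dt = d / wave_speed
            # compare arrival-time *differences* so the unknown impact time cancels
            resid = (t - t.min()) - (dt - dt.min())
            err = np.sum(resid ** 2)
            if err < best_err:
                best, best_err = (x, y), err
    return best

# e.g. locate_impact([[0, 0], [1, 0], [0, 1], [1, 1]], [2.1e-4, 3.4e-4, 3.4e-4, 4.3e-4], 3000.0)
```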
Debugging and Performance Analysis Software Tools for Peregrine System
High-Performance Computing | NREL
Learn about debugging and performance analysis software tools available to use with the Peregrine system. Allinea…
Baroni, L; Cenci, L; Tettamanti, M; Berati, M
2007-02-01
Recent studies support the hypothesis that plant-based diets are environmentally better than meat-based diets. This study aims to explore this topic further and to compare the environmental impacts resulting from different dietary patterns (omnivorous, vegetarian, vegan) and methods of production (conventional farming and organic agriculture). Three weekly balanced diets, equivalent to one another in energy and nutrient content, were planned: an omnivorous one, a vegetarian one and a vegan one. For each, the Life Cycle Assessment (LCA) method was applied in order to calculate the environmental impact, expressed in 'points'. The software selected to carry out the Inventory Analysis and the Impact Assessment is SimaPro5. The Assessment phase was conducted using Ecoindicator 99, a damage-oriented method, which analyses the impact according to three large damage categories, each of them subsuming various impact categories.
Chen, Zhi-fang
2011-02-01
To investigate the effects of tissue-engineered bone scaffold material in the restoration of the alveolar socket after extraction of the lower impacted third molar. Thirteen patients were immediately implanted with Bio-oss or PerioGlas® in the alveolar cavity after impacted mandibular third molar extraction. Clinical observation and X-rays were performed 1 to 12 weeks after operation. A paired t test was used for statistical analysis with the SPSS 10.0 software package. Thirteen patients did not experience postoperative complications. The distal alveolar height of the second molar and the gingival attachment did decrease significantly 1 to 12 weeks after operation (P < 0.05). Tissue-engineered bone scaffold material is helpful in the restoration of the alveolar socket after impacted third molar extraction.
Continuous Risk Management: An Overview
NASA Technical Reports Server (NTRS)
Rosenberg, Linda; Hammer, Theodore F.
1999-01-01
Software risk management is important because it helps avoid disasters, rework, and overkill, but more importantly because it stimulates win-win situations. The objectives of software risk management are to identify, address, and eliminate software risk items before they become threats to success or major sources of rework. In general, good project managers are also good managers of risk. It makes good business sense for all software development projects to incorporate risk management as part of project management. The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems-level course for risk management that provides information on how to implement risk management. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This is an introductory tutorial on continuous risk management based on this course. The rationale for continuous risk management and how it is incorporated into project management are discussed. The risk management structure of six functions is discussed in sufficient depth for managers to understand what is involved in risk management and how it is implemented. These functions include: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions.
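As a sketch of how the analyze and track functions fit together in practice, the record below pairs probability, impact and timeframe and ranks by exposure; the field names and scoring are illustrative, not the SATC course's exact format.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    statement: str          # identify: condition -> consequence
    probability: float      # analyze: likelihood, 0..1
    impact: float           # analyze: severity, e.g. a 1-5 score
    timeframe: str          # analyze: near/mid/far term
    approach: str = "watch" # plan: research/accept/watch/mitigate

    @property
    def exposure(self) -> float:
        # common ranking metric: probability x impact
        return self.probability * self.impact

risks = [
    Risk("Vendor library slips -> schedule slip", 0.3, 5.0, "near", "mitigate"),
    Risk("Key engineer leaves -> rework", 0.1, 4.0, "mid"),
]
risks.sort(key=lambda r: r.exposure, reverse=True)  # track the top-N list
print(risks[0].statement)
```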
Planetary Protection Bioburden Analysis Program
NASA Technical Reports Server (NTRS)
Beaudet, Robert A.
2013-01-01
This program is a Microsoft Access program that performed statistical analysis of the colony counts from assays performed on the Mars Science Laboratory (MSL) spacecraft to determine the bioburden density, 3-sigma biodensity, and the total bioburdens required for the MSL prelaunch reports. It also contains numerous tools that report the data in various ways to simplify the reports required. The program performs all the calculations directly in the MS Access program. Prior to this development, the data was exported to large Excel files that had to be cut and pasted to provide the desired results. The program contains a main menu and a number of submenus. Analyses can be performed by using either all the assays, or only the accountable assays that will be used in the final analysis. There are three options on the first menu: calculate using (1) the old MER (Mars Exploration Rover) statistics, (2) the MSL statistics for all the assays, or (3) the MSL statistics for only the accountable assays. Other options on the main menu include a data editing form, utility programs that produce various reports requested by the microbiologists and the project, and tools to generate the groupings for the final analyses. The analyses can be carried out in three ways: each assay can be treated separately, the assays can be treated collectively for the whole zone as a group, or the assays can be collected in groups designated by the JPL Planetary Protection Manager. The latter approach was used to generate the final report because assays on the same or similar equipment can be assumed to have been exposed to the same environment and cleaning; the statistics are thus improved by having a larger population, reducing the standard deviation by the square root of N. For each method mentioned above, three reports are available. The first is a detailed report including all the data; this version was very useful in verifying the calculations. The second is a brief report that is similar to the full detailed report but does not print out the data. The third is a grand total and summary report in which each assay requires only one line. For the first and second reports, most of the calculations are performed in the report section itself; for the third, all the calculations are performed directly in the query bound to the report. All the numerical results were verified by comparing them with Excel templates, then exporting the data from the Planetary Protection Analysis program to Excel.
Micrometeoroid and Orbital Debris (MMOD) Shield Ballistic Limit Analysis Program
Lyndon B. Johnson Space Center, Houston, Texas
This software implements penetration limit equations for common micrometeoroid and orbital debris (MMOD) shield configurations, windows, and thermal protection systems. Allowable MMOD risk is formulated in terms of the probability of penetration (PNP) of the spacecraft pressure hull. For calculating the risk, spacecraft geometry models, mission profiles, debris environment models, and penetration limit equations for installed shielding configurations are required. Risk assessment software such as NASA's BUMPERII is used to calculate mission PNP; however, such tools are unsuitable for use in shield design and preliminary analysis studies. This software defines a single equation for the design and performance evaluation of common MMOD shielding configurations, windows, and thermal protection systems, along with a description of their validity range and guidelines for their application. Recommendations are based on preliminary reviews of fundamental assumptions and on accuracy in predicting experimental impact test results. The software is programmed in Visual Basic for Applications for installation as a simple add-in for Microsoft Excel. The user is directed to a graphical user interface (GUI) that requires user inputs and provides solutions directly in Microsoft Excel workbooks. This work was done by Shannon Ryan of the USRA Lunar and Planetary Institute for Johnson Space Center. MSC-24582-1
[Fragment of a record on the Enigma visualization software:] Commercially, because it is so generic, Enigma can be used for almost any project that requires engineering visualization, model building, or animation. Models in Enigma can be exported to many other formats for use in other applications as well. Educationally, Enigma is being used to allow university students to visualize robotic algorithms in a simulation mode before using them with actual hardware. This work was done by David Shores and Sharon P. Goza of Johnson Space Center; Cheyenne McKeegan, Rick Easley, Janet Way, and Shonn Everett of MEI Technologies; Mark Manning of PTI; and Mark Guerra, Ray Kraesig, and William Leu of Tietronix Software, Inc. For further information, contact the JSC Innovation Partnerships Office at (281) 483-3809. MSC-24211-1
Spitzer Telemetry Processing System
NASA's Jet Propulsion Laboratory, Pasadena, California
The Spitzer Telemetry Processing System (SirtfTlmProc) was designed to address objectives of JPL's Multi-mission Image Processing Lab (MIPL) in processing spacecraft telemetry and distributing the resulting data to the science community. To minimize costs and maximize operability, the software design focused on automated error recovery, performance, and information management. The system processes telemetry from the Spitzer spacecraft and delivers Level 0 products to the Spitzer Science Center. SirtfTlmProc is a unique system with automated error notification and recovery, with a real-time continuous service that can go quiescent after periods of inactivity. The software can process 2 GB of telemetry and deliver Level 0 science products to the end user in four hours. It provides analysis tools so the operator can manage the system and troubleshoot problems. It automates telemetry processing in order to reduce staffing costs. This work was done by Alice Stanboli, Elmain M. Martinez, and James M. McAuley of Caltech for NASA's Jet Propulsion Laboratory. For more information, contact iaoffice@jpl.nasa.gov. This software is available for commercial licensing. Please contact Dan Broderick at Daniel.F.Broderick@jpl.nasa.gov. Refer to NPO-47803.
Wing Leading Edge RCC Rapid Response Damage Prediction Tool (IMPACT2)
Lyndon B. Johnson Space Center, Houston, Texas
This rapid response computer program predicts Orbiter Wing Leading Edge (WLE) damage caused by ice or foam impact during a Space Shuttle launch (program "IMPACT2"). The program was developed after the Columbia accident in order to quickly assess WLE damage due to ice, foam, or metal impact (if any) during a Shuttle launch. IMPACT2 simulates an impact event in a few minutes for foam impactors, and in seconds for ice and metal impactors. The damage criterion is derived from results obtained from a sophisticated commercial program, which requires hours to carry out simulations of the same impact events. The program was designed to run much faster than the commercial program, with prediction of projectile threshold velocities within 10 to 15% of commercial-program values. The mathematical model involves coupling of Orbiter wing normal modes of vibration to nonlinear or linear spring-mass models. IMPACT2 solves nonlinear or linear impact problems using classical normal modes of vibration of a target, and nonlinear/linear time-domain equations for the projectile. Impact loads and stresses developed in the target are computed as functions of time. This model is novel because of its speed of execution: a typical model of foam, or another projectile characterized by material nonlinearities, impacting an RCC panel is executed in minutes instead of the hours needed by commercial programs. Target damage due to impact can be assessed quickly, provided that target vibration modes and allowable stress are known. This work was done by Robert Clark, Jr., Paul Cotter, and Constantine Michalopoulos of The Boeing Company for Johnson Space Center. MSC-24988-1
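To make the modal/spring-mass coupling idea behind IMPACT2 concrete, here is a minimal sketch assuming a Hertz-type contact spring and a handful of mass-normalised target modes; it is an illustration of the technique, not the Boeing code, and every parameter value is hypothetical.

```python
import numpy as np

def impact_history(m_p, v0, omegas, phis, k_c, dt=1e-7, t_end=2e-3):
    """Projectile on a nonlinear contact spring striking a target modeled by
    mass-normalised normal modes (omegas: rad/s, phis: mode shapes evaluated
    at the impact point). Returns the contact force history."""
    omegas = np.asarray(omegas, float)
    phis = np.asarray(phis, float)
    q = np.zeros_like(omegas)   # modal displacements
    qd = np.zeros_like(omegas)  # modal velocities
    u_p, v_p = 0.0, v0          # projectile displacement/velocity into target
    forces = []
    for _ in range(int(t_end / dt)):
        w = float(phis @ q)                     # target deflection at impact point
        delta = u_p - w                         # indentation while in contact
        f = k_c * max(delta, 0.0) ** 1.5        # Hertz-type contact law
        v_p -= (f / m_p) * dt                   # contact force decelerates projectile
        u_p += v_p * dt
        qd += (-omegas**2 * q + phis * f) * dt  # modal equations of motion
        q += qd * dt
        forces.append(f)
    return np.array(forces)

# e.g. force = impact_history(m_p=0.1, v0=200.0, omegas=[500.0, 2500.0],
#                             phis=[0.8, 0.3], k_c=1e8)
```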
The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)
2003-01-01
We consider the impact of Aspect-Oriented Programming on Software Engineering, and, in particular, analyze two AOP systems, one of which does component wrapping and the other, quantification over events, for their software engineering effects.
Analysis of Contraction Joint Width Influence on Load Stress of Pavement Panels
NASA Astrophysics Data System (ADS)
Gao, Wei; Cui, Wei; Sun, Wei
2018-05-01
The width of the transverse contraction joints of cement roads varies with temperature, which changes load transmission among the plates of the road surface and affects the load stress of the road plates. The three-dimensional finite element analysis software EverFE is used to address the relation between contraction joint width and road surface load stress, revealing the impact of reducing the contraction joint width. The results could be of critical value in maintaining road functions and extending the service life of cement road surfaces.
The Impact of Social Software in Product Design Higher Education
ERIC Educational Resources Information Center
Hurn, Karl
2012-01-01
It is difficult to ignore the impact that Web 2.0 and the subsequent social software revolution has had on society in general, and young people in particular. Information is exchanged and interpreted extremely quickly and in ways that were not imagined 10 years ago. Universities are struggling to keep up with this new technology, with outdated…
Effort Drivers Estimation for Brazilian Geographically Distributed Software Development
NASA Astrophysics Data System (ADS)
Almeida, Ana Carina M.; Souza, Renata; Aquino, Gibeon; Meira, Silvio
To meet the requirements of today's fast-paced markets, it is important to develop projects on time and with the minimum use of resources. A good estimate is the key to achieving this goal. Several companies have started to work with geographically distributed teams due to cost reduction and time-to-market. Some researchers indicate that this approach introduces new challenges, because the teams work in different time zones and have possible differences in culture and language. It is already known that multisite development increases the software cycle time. Data from 15 DSD projects from 10 distinct companies were collected. The analysis shows drivers that significantly impact the total effort planned to develop systems using the DSD approach in Brazil.
Real-Time GNSS Positioning with JPL's new GIPSYx Software
NASA Astrophysics Data System (ADS)
Bar-Sever, Y. E.
2016-12-01
The JPL Global Differential GPS (GDGPS) System is now producing real-time orbit and clock solutions for GPS, GLONASS, BeiDou, and Galileo. The operations are based on JPL's next-generation geodetic analysis and data processing software, GIPSYx (also known as RTGx). We will examine the impact of the nascent GNSS constellations on real-time kinematic positioning for earthquake monitoring, and assess the marginal benefits from each constellation. We will discuss the options for signal selection, inter-signal bias modeling, and estimation strategies in the context of real-time point positioning. We will provide a brief overview of the key features and attributes of GIPSYx. Finally, we will describe the current natural hazard monitoring services from the GDGPS System.
NASA Technical Reports Server (NTRS)
Singh, S. P.
1979-01-01
The computer software developed to implement a method for Wiener spectrum analysis of photographic films is presented. This method is used for the quantitative analysis of the autoradiographic enhancement process. The software requirements and design for the autoradiographic enhancement process are given along with the program listings and the user's manual. A software description and program listings for the modified data analysis software are included.
Bourgarit, A; Mallet, H-P; Keshtmand, H; De Castro, N; Rambeloarisoa, J; Fain, O; Antoun, F; Picard, C; Rocher, G; Che, D; Farge, D
2009-10-01
The impact of the TB-info software on the care of patients treated with an antituberculosis regimen (ATT) was assessed. Cohort study of patients with tuberculosis who received an ATT in 2004 in two hospitals and five medical centres in Paris. Follow-up was implemented with the TB-info software. Data were compared to those of the 1999-2003 cohort. Two hundred and nine ATT were initiated in 2004, with a mean duration of 7.2 months. Demographic and clinical data reflected this population's precariousness: 79% were foreign-born, 25% lived in institutions and half of them had no or unusual health insurance. Compared to the previous cohort, viral co-infections were tested in more than 80% of cases and showed association with HIV, HBV or HCV in 11, 10 and 5% of the patients, respectively. Twenty-one patients were lost to follow-up (11%), and 76% of the smear-positive pulmonary tuberculosis therapies were declared successful, but only 34% were declared cured under the WHO criteria. Analysis of the data obtained with the TB-info software showed an improvement in the care of tuberculosis patients, with more co-infections tested and fewer patients lost to follow-up. These results confirm the usefulness of this software for patient care and for the assessment of physicians' practice in France.
Software Development Management: Empirical and Analytical Perspectives
ERIC Educational Resources Information Center
Kang, Keumseok
2011-01-01
Managing software development is a very complex activity because it must deal with people, organizations, technologies, and business processes. My dissertation consists of three studies that examine software development management from various perspectives. The first study empirically investigates the impacts of prior experience with similar…
Transducer Analysis and ATILA++ Model Development
2016-10-10
…the ATILA finite element software package. This will greatly enhance the state of the art in transducer performance prediction and provide a tool… IMPACT/APPLICATIONS: This work is helping to enable the expansion of the functionality of the ATILA++ finite element software. (Grant N00014-13-1-0196; authors: Richard J. Meyer, Jr.; Douglas C. Markley.)
Rough mill simulator version 3.0: an analysis tool for refining rough mill operations
Edward Thomas; Joel Weiss
2006-01-01
ROMI-3 is a rough mill computer simulation package designed to be used by both rip-first and chop-first rough mill operators and researchers. ROMI-3 allows users to model and examine the complex relationships among cutting bill, lumber grade mix, processing options, and their impact on rough mill yield and efficiency. Integrated into the ROMI-3 software is a new least-...
Infusing Reliability Techniques into Software Safety Analysis
NASA Technical Reports Server (NTRS)
Shi, Ying
2015-01-01
Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.
Health Monitoring System Technology Assessments: Cost Benefits Analysis
NASA Technical Reports Server (NTRS)
Kent, Renee M.; Murphy, Dennis A.
2000-01-01
The subject of sensor-based structural health monitoring is very diverse and encompasses a wide range of activities, including initiatives and innovations involving the development of advanced sensor, signal processing, data analysis, and actuation and control technologies. In addition, it embraces the consideration of the availability of low-cost, high-quality contributing technologies, computational utilities, and hardware and software resources that enable the operational realization of robust health monitoring technologies. This report presents a detailed analysis of the cost benefit and other logistics and operational considerations associated with the implementation and utilization of sensor-based technologies for use in aerospace structure health monitoring. The scope of this volume is to assess the economic impact, from an end-user perspective, of implementing health monitoring technologies on three structures. It specifically focuses on evaluating the impact on maintaining and supporting these structures with and without health monitoring capability.
A Survey of Automatic Code Generating Software
1988-09-01
NASA Astrophysics Data System (ADS)
Fryling, Meg
2010-11-01
Organisations often make implementation decisions with little consideration for the maintenance phase of an enterprise resource planning (ERP) system, resulting in significant recurring maintenance costs. Poor cost estimations are likely related to the lack of an appropriate framework for enterprise-wide pre-packaged software maintenance, which requires an ongoing relationship with the software vendor (Markus, M.L., Tanis, C., and Fenema, P.C., 2000. Multisite ERP implementation. CACM, 43 (4), 42-46). The end result is that critical project decisions are made with little empirical data, resulting in substantial long-term cost impacts. The product of this research is a formal dynamic simulation model that enables theory testing, scenario exploration and policy analysis. The simulation model ERPMAINT1 was developed by combining and extending existing frameworks in several research domains, and by incorporating quantitative and qualitative case study data. The ERPMAINT1 model evaluates tradeoffs between different ERP project management decisions and their impact on post-implementation total cost of ownership (TCO). Through model simulations a variety of dynamic insights were revealed that could assist ERP project managers. Major findings from the simulation show that upfront investments in mentoring and system exposure translate to long-term cost savings. The findings also indicate that in addition to customisations, add-ons have a significant impact on TCO.
ERIC Educational Resources Information Center
Drummond, Kathryn; Chinen, Marjorie; Duncan, Teresa Garcia; Miller, H. Ray; Fryer, Lindsay; Zmach, Courtney; Culp, Katherine
2011-01-01
"Thinking Reader" is a software program for students in Grades 5-8 that incorporates elements commonly identified in policy reports as being key components of effective adolescent literacy instruction. This evaluation of the impact of "Thinking Reader" use by Grade 6 students focused on two confirmatory research questions about…
Factors that Impact Software Project Success in Offshore Information Technology (IT) Companies
ERIC Educational Resources Information Center
Edara, Venkatarao
2011-01-01
Information technology (IT) projects are unsuccessful at a rate of 65% to 75% per year, in spite of employing the latest technologies and training employees. Although many studies have been conducted on project successes in U.S. companies, there is a lack of research studying the impact of various factors on software project success in offshore IT…
ERIC Educational Resources Information Center
Coleman, Mari Beth; Killdare, Laura K.; Bell, Sherry Mee; Carter, Amanda M.
2014-01-01
The purpose of this study was to determine the impact of text-to-speech software on reading fluency and comprehension for four postsecondary students with below average reading fluency and comprehension including three students diagnosed with learning disabilities and concomitant conditions (e.g., attention deficit hyperactivity disorder, seizure…
ERIC Educational Resources Information Center
Schochet, Peter Z.
2015-01-01
This report presents the statistical theory underlying the "RCT-YES" software that estimates and reports impacts for RCTs for a wide range of designs used in social policy research. The report discusses a unified, non-parametric design-based approach for impact estimation using the building blocks of the Neyman-Rubin-Holland causal…
ERIC Educational Resources Information Center
Mellati, Morteza; Khademi, Marzieh
2015-01-01
The expansion of technological applications such as computers and mobile phones in the past three decades has impacted our lives from different perspectives. Language teaching is no exception; like other fields of study, language teaching has also been influenced by new language teaching sources and software. More recently, there has been a…
From Product- to Service-Oriented Strategies in the Enterprise Software Market
ERIC Educational Resources Information Center
Xin, Mingdi
2009-01-01
The enterprise software market is seeing the rise of a new business model--selling Software-as-a-Service (SaaS), in which a standard piece of software is owned and managed remotely by the vendor and delivered as a service over the Internet. Despite the hype, questions remain regarding the rise of this new service model and how it would impact the…
Boland, Mary Regina; Rusanov, Alexander; So, Yat; Lopez-Jimenez, Carlos; Busacca, Linda; Steinman, Richard C; Bakken, Suzanne; Bigger, J Thomas; Weng, Chunhua
2014-12-01
Underspecified user needs and frequent lack of a gold standard reference are typical barriers to technology evaluation. To address this problem, this paper presents a two-phase evaluation framework involving usability experts (phase 1) and end-users (phase 2). In phase 1, a cross-system functionality alignment between expert-derived user needs and system functions was performed to inform the choice of "the best available" comparison system to enable a cognitive walkthrough in phase 1 and a comparative effectiveness evaluation in phase 2. During phase 2, five quantitative and qualitative evaluation methods are mixed to assess usability: time-motion analysis, software log, questionnaires - System Usability Scale and the Unified Theory of Acceptance of Use of Technology, think-aloud protocols, and unstructured interviews. Each method contributes data for a unique measure (e.g., time motion analysis contributes task-completion-time; software log contributes action transition frequency). The measures are triangulated to yield complementary insights regarding user-perceived ease-of-use, functionality integration, anxiety during use, and workflow impact. To illustrate its use, we applied this framework in a formative evaluation of a software called Integrated Model for Patient Care and Clinical Trials (IMPACT). We conclude that this mixed-methods evaluation framework enables an integrated assessment of user needs satisfaction and user-perceived usefulness and usability of a novel design. This evaluation framework effectively bridges the gap between co-evolving user needs and technology designs during iterative prototyping and is particularly useful when it is difficult for users to articulate their needs for technology support due to the lack of a baseline.
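Two of the measures named above fall out of a simple UI event log; a minimal sketch of task completion time and action transition frequency, with hypothetical event names (requires Python 3.10+ for itertools.pairwise):

```python
from collections import Counter
from itertools import pairwise

def task_completion_time(log):
    """log: ordered (timestamp_seconds, action) pairs for one task."""
    return log[-1][0] - log[0][0]

def transition_frequencies(log):
    """Count (from_action, to_action) transitions, as mined from a software log."""
    actions = [a for _, a in log]
    return Counter(pairwise(actions))

log = [(0.0, "open_form"), (4.2, "search"), (9.8, "search"), (15.0, "save")]
print(task_completion_time(log))    # 15.0 seconds
print(transition_frequencies(log))  # e.g. ('search', 'search'): 1
```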
Inertial Upper Stage (IUS) software analysis
NASA Technical Reports Server (NTRS)
Grayson, W. L.; Nickel, C. E.; Rose, P. L.; Singh, R. P.
1979-01-01
The Inertial Upper Stage (IUS) System, an extension of the Space Transportation System (STS) operating regime to include higher orbits, orbital plane changes, geosynchronous orbits, and interplanetary trajectories is presented. The IUS software design, the IUS software interfaces with other systems, and the cost effectiveness in software verification are described. Tasks of the IUS discussed include: (1) design analysis; (2) validation requirements analysis; (3) interface analysis; and (4) requirements analysis.
Low velocity impact analysis of composite laminated plates
NASA Astrophysics Data System (ADS)
Zheng, Daihua
2007-12-01
In the past few decades polymer composites have been utilized more in structures where high strength and light weight are major concerns, e.g., aircraft, high-speed boats and sporting goods. It is well known that they are susceptible to damage resulting from lateral impact by foreign objects, such as dropped tools, hail and debris thrown up from the runway. The impact response of a structure depends not only on the material properties but also on the dynamic behavior of the impacted structure. Although commercial software is capable of analyzing such impact processes, it often requires extensive expertise and rigorous training for design and analysis. Analytical models are useful as they allow parametric studies and provide a foundation for validating the numerical results from large-scale commercial software. Therefore, it is necessary to develop analytical or semi-analytical models to better understand the behavior of composite structures under impact and the associated failure process. In this study, several analytical models are proposed in order to analyze the impact response of composite laminated plates. Based on Meyer's power law, a semi-analytical model is obtained for the small-mass impact response of infinite composite laminates by the method of asymptotic expansion. The original nonlinear second-order ordinary differential equation is transformed into two linear ordinary differential equations by neglecting higher-order terms in the asymptotic expansion. As a result, the semi-analytical solution of the overall impact response can be applied to contact laws with varying coefficients. Then an analytical model accounting for permanent deformation based on an elasto-plastic contact law is proposed to obtain closed-form solutions of the wave-controlled impact responses of composite laminates. The analytical model is also used to predict the threshold velocity for delamination onset by combining it with an existing quasi-static delamination criterion. The predictions are compared with experimental data and explicit finite element LS-DYNA simulation, and the comparisons show reasonable agreement. Furthermore, an analytical model is developed to evaluate the combined effects of prestresses and permanent deformation based on the linearized elasto-plastic contact law and the Laplace transform technique. It is demonstrated that prestresses do not have noticeable effects on the time history of contact force and strains, but they have significant consequences for the plate central displacement. For an impacted composite laminate with prestresses present, the contact force increases with increasing impactor mass, laminate thickness and interlaminar shear strength. The combined analytical and numerical investigations provide validated models for elastic and elasto-plastic impact analysis of composite structures and shed light on the design of impact-resistant composite systems.
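The nonlinear impact equation underlying such models can be integrated numerically in a few lines; a minimal sketch of a rigid impactor under a Hertzian contact law, m·x'' = -k·x^(3/2), with purely illustrative parameter values:

```python
import numpy as np
from scipy.integrate import solve_ivp

m, k, v0 = 0.02, 5e8, 3.0     # impactor mass (kg), contact stiffness (N/m^1.5), velocity (m/s)

def rhs(t, y):
    x, v = y                  # indentation and its rate
    return [v, -(k / m) * max(x, 0.0) ** 1.5]

sol = solve_ivp(rhs, (0.0, 2e-4), [0.0, v0], max_step=1e-7)
force = k * np.clip(sol.y[0], 0.0, None) ** 1.5   # contact force history F = k x^1.5
print(force.max())            # peak contact force for this parameter set
```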
NASA Astrophysics Data System (ADS)
Haris, H.; Chow, M. F.; Usman, F.; Sidek, L. M.; Roseli, Z. A.; Norlida, M. D.
2016-03-01
Urbanization is growing rapidly in Malaysia. Rapid urbanization is known to have several negative impacts on the hydrological cycle due to the decrease of pervious area and the deterioration of stormwater runoff quality. One of the negative impacts of urbanization is congestion of the stormwater drainage system, leading to flash flood problems and water quality degradation. There are many urban stormwater management software packages available in the market, such as the Storm Water Drainage System design and analysis program (DRAINS), Urban Drainage and Sewer Model (MOUSE), InfoWorks River Simulation (InfoWorks RS), Hydrological Simulation Program-Fortran (HSPF), Distributed Routing Rainfall-Runoff Model (DR3M), Storm Water Management Model (SWMM), XP Storm Water Management Model (XPSWMM), MIKE-SWMM, Quality-Quantity Simulators (QQS), Storage, Treatment, Overflow, Runoff Model (STORM), and Hydrologic Engineering Centre-Hydrologic Modelling System (HEC-HMS). In this paper, we briefly discuss several of these packages and their functionality, accessibility, characteristics and components in the quantity analysis of hydrological design software, and compare them with MSMA Design Aid and Database. Green Infrastructure (GI) is one of the main topics being widely discussed all over the world, and every development in the urban area is related to it. GI can be defined as green areas built within developed areas, such as forests, parks, wetlands or floodways, and its role is to improve living standards through services such as water filtration and flood control. Among the twenty models compared to MSMA SME, ten models widely accepted by water resource researchers were selected for a comprehensive review in this study. These ten tools are further classified into three major categories: models that address the stormwater management ability of GI in terms of quantity and quality, models that can conduct economic analysis of GI, and models that address both stormwater management and economic aspects together.
A Guide to Software Evaluation.
ERIC Educational Resources Information Center
Leonard, Rex; LeCroy, Barbara
Arguing that software evaluation is crucial to the quality of courseware available in a school, this paper begins by discussing reasons why microcomputers are making such a tremendous impact on education, and notes that, although the quality of software has improved over the years, the challenge for teachers to integrate computing into the…
Requirements Engineering in Building Climate Science Software
NASA Astrophysics Data System (ADS)
Batcheller, Archer L.
Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks to both build a software system according to product requirements but also to conduct their work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure impacts whether the software team or users have control and responsibility for making changes in response to new scientific ideas. Thick infrastructure provides more functionality for users, but gives them less control of it. The stability of infrastructure trades off against the responsiveness that the infrastructure can have to user needs.
Software Safety Progress in NASA
NASA Technical Reports Server (NTRS)
Radley, Charles F.
1995-01-01
NASA has developed guidelines for development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach, to assist software developers and safety analysts in cost effective methods for software safety. They provide guidance in the implementation of the recent NASA Software Safety Standard NSS-1740.13 which was released as 'Interim' version in June 1994, scheduled for formal adoption late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety critical software development and analysis.
Simulation of hypervelocity impacts using a contact charge
NASA Astrophysics Data System (ADS)
Giblin, I.; Martelli, G.; Smith, P. N.; di Martino, M.
1994-12-01
Two sets of hypervelocity impact experiments have been performed in the open using a contact charge technique and recorded using fast-framing cameras. It has been possible to record the uninterrupted ballistic trajectories of fragments from the catastrophically disrupted targets, together with their velocity and rotational properties directly after the impact, as well as their size. By performing these experiments in the open and on fairly soft ground, secondary fragmentation normally caused by impact onto the walls or floor of a test chamber has been minimized. A total of 10 experiments have been performed using targets of artificial rock which were either homogeneous, cored or carefully pre-fractured. We report here on the analysis of some of these data using a computer and special software written and developed by our group, with an indication of the results obtained.
RiskScape: a new tool for comparing risk from natural hazards (Invited)
NASA Astrophysics Data System (ADS)
Stirling, M. W.; King, A.
2010-12-01
RiskScape is a joint venture between New Zealand's GNS Science and NIWA, and represents a comprehensive and easy-to-use tool for multi-hazard risk and impact analysis. It has basic GIS functionality, with import/export functions for use with GIS software. Five natural hazards have been implemented in RiskScape to date: flood (river), earthquake, volcano (ash), tsunami and wind storm. The software converts hazard exposure information into the likely impacts for a region, for example, damage and replacement costs, casualties, economic losses, disruption, and the number of people affected. It therefore can be used to assist with risk management, land use planning, building codes and design, risk identification, prioritization of risk reduction/mitigation, determination of "best use" risk-reduction investment, evacuation and contingency planning, awareness raising, public information, realistic scenarios for exercises, and hazard event response. Three geographically disparate pilot regions, each exposed to a different mix of natural hazards, have been used to develop and trial RiskScape in New Zealand. Future (phase II) development of RiskScape will include the following hazards: landslides (both rainfall- and earthquake-triggered), storm surges, pyroclastic flows and lahars, and climate change effects. While RiskScape development has thus far focused on scenario-based risk, future development will advance the software into providing probabilistic solutions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klise, Katherine A.; Murray, Regan; Bynum, Michael
Water utilities are vulnerable to a wide variety of human-caused and natural disasters. These disruptive events can result in loss of water service, contaminated water, pipe breaks, and failed equipment. Furthermore, long term changes in water supply and customer demand can have a large impact on the operating conditions of the network. The ability to maintain drinking water service during and following these types of events is critical. Simulation and analysis tools can help water utilities explore how their network will respond to disruptive events and plan effective mitigation strategies. The U.S. Environmental Protection Agency and Sandia National Laboratories are developing new software tools to meet this need. The Water Network Tool for Resilience (WNTR, pronounced winter) is a Python package designed to help water utilities investigate resilience of water distribution systems over a wide range of hazardous scenarios and to evaluate resilience-enhancing actions. The following documentation includes installation instructions and examples, description of software features, and software license. It is assumed that the reader is familiar with the Python Programming Language. References are included for additional background on software components. Online documentation, hosted at http://wntr.readthedocs.io/, will be updated as new features are added. The online version includes API documentation and information for developers.
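A minimal usage sketch following the package's documented basic workflow; the pressure threshold is hypothetical, and 'Net3.inp' is a standard EPANET example file assumed to be on disk.

```python
import wntr

# load an EPANET input file as a water network model
wn = wntr.network.WaterNetworkModel('Net3.inp')

# run a hydraulic simulation and inspect pressures
sim = wntr.sim.EpanetSimulator(wn)
results = sim.run_sim()
pressure = results.node['pressure']   # pandas DataFrame: time x node

# flag nodes that ever drop below an assumed service threshold of 20
# (units follow the .inp file)
low = (pressure < 20.0).any()
print(low[low].index.tolist())
```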
Digital radiography: optimization of image quality and dose using multi-frequency software.
Precht, H; Gerke, O; Rosendahl, K; Tingberg, A; Waaler, D
2012-09-01
New developments in the processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children, as they are believed to be more sensitive to ionizing radiation than adults. To examine whether the use of MFP software reduces the radiation dose without compromising quality at DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. A total of 110 images of an anthropomorphic phantom were acquired on a DR system (Canon DR with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images taken of a technical contrast-detail phantom (CDRAD 2.0) provided an objective image-quality assessment. Optimal image quality was maintained at a dose reduction of 61% with MLT(S)-optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% as compared to the reference image. The impact of the software on image quality was significant for dose (mAs), dynamic range dark region and frequency band. By optimizing image processing parameters, a significant dose reduction is possible without significant loss of image quality.
Numerical investigations on axial and radial blade rubs in turbo-machinery
NASA Astrophysics Data System (ADS)
Abdelrhman, Ahmed M.; Tang, Eric Sang Sung; Salman Leong, M.; Al-Qrimli, Haidar F.; Rajamohan, G.
2017-07-01
In recent years, the clearance between the rotor blades and the stator/casing has been made smaller and smaller to improve the aerodynamic efficiency of turbomachines, as demanded in the engineering field. Due to the clearance reduction between the blade tip and the rotor casing, and between rotor blades and stator blades, axial and radial blade rubbing can occur, especially at high speed, resulting in complex nonlinear vibrations. The primary aim of this study is to address the blade axial rubbing phenomenon using numerical analysis of a rotor system. Comparisons of the rubbing-caused impacts and rubbing forces of axial and radial blade rubbing are also aims of this study. Two rotor models (rotor-stator and rotor-casing models) were designed and sketched using SOLIDWORKS software, and ANSYS software was used for the simulation and numerical analysis. The rubbing conditions were simulated at speeds of 1000 rpm, 1500 rpm and 2000 rpm. Analysis results for axial blade rubbing showed the appearance of the blade passing frequency and its multiples (1x, 2x, 3x, etc.), and these frequencies become more strongly excited as the rotational speed increases. It was also observed that as the rotating speed increases, the rubbing force and the harmonic frequencies in the x, y and z directions become higher and more severe. The comparison study showed that axial blade rub is more dangerous and generates higher vibration impacts and higher blade rubbing forces than radial blade rub.
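The spectral signature described above is easy to predict: the blade passing frequency is the blade count times the shaft rotation rate. A minimal sketch (blade count and speeds are illustrative, not the paper's rotor):

```python
import numpy as np

def blade_passing_harmonics(rpm, n_blades, n_harmonics=3):
    """Blade passing frequency (BPF) and its first harmonics, in Hz."""
    bpf = n_blades * rpm / 60.0
    return [k * bpf for k in range(1, n_harmonics + 1)]

print(blade_passing_harmonics(2000, 24))  # [800.0, 1600.0, 2400.0]

# locating these peaks in a measured casing-vibration signal y sampled at fs:
# spec = np.abs(np.fft.rfft(y)); freqs = np.fft.rfftfreq(len(y), 1.0 / fs)
```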
NREL Improves Building Energy Simulation Programs Through Diagnostic Testing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2012-01-01
This technical highlight describes NREL research to develop Building Energy Simulation Test for Existing Homes (BESTEST-EX) to increase the quality and accuracy of energy analysis tools for the building retrofit market. Researchers at the National Renewable Energy Laboratory (NREL) have developed a new test procedure to increase the quality and accuracy of energy analysis tools for the building retrofit market. The Building Energy Simulation Test for Existing Homes (BESTEST-EX) is a test procedure that enables software developers to evaluate the performance of their audit tools in modeling energy use and savings in existing homes when utility bills are available for model calibration. Similar to NREL's previous energy analysis tests, such as HERS BESTEST and other BESTEST suites included in ANSI/ASHRAE Standard 140, BESTEST-EX compares software simulation findings to reference results generated with state-of-the-art simulation tools such as EnergyPlus, SUNREL, and DOE-2.1E. The BESTEST-EX methodology: (1) Tests software predictions of retrofit energy savings in existing homes; (2) Ensures building physics calculations and utility bill calibration procedures perform to a minimum standard; and (3) Quantifies impacts of uncertainties in input audit data and occupant behavior. BESTEST-EX includes building physics and utility bill calibration test cases. The diagram illustrates the utility bill calibration test cases. Participants are given input ranges and synthetic utility bills. Software tools use the utility bills to calibrate key model inputs and predict energy savings for the retrofit cases. Participant energy savings predictions using calibrated models are compared to NREL predictions using state-of-the-art building energy simulation programs.
Impact Study of Metal Fasteners in Roofing Assemblies using Three-Dimensional Heat Transfer Analysis
Singh, Manan; Gulati, Rupesh; Ravi, Srinivasan; ...
2016-11-29
Heat transfer analysis was performed on typical roofing assemblies using HEAT3, a three-dimensional heat transfer analysis software. The difference in heat transferred through the roofing assemblies considered is compared between two cases - without any steel fasteners and with steel fasteners. In the latter case, the metal roofing fasteners were arranged as per Factory Mutual Global (FMG) approvals, in the field, perimeter, and corner zones of the roof. The temperature conditions used for the analysis represented summer and winter conditions for three separate Climate Zones (CZ), namely Climate Zone 2 or CZ2 represented by Orlando, FL; CZ3 represented by Atlanta, GA; and CZ6 represented by St. Paul, MN. In all the climatic conditions, higher energy transfer was observed with an increase in the number of metal fasteners, attributed to the high thermal conductivity of metals as compared to the insulation and other materials used in the roofing assembly. This difference in heat loss was also quantified in the form of the percentage change in the overall or effective insulation of the roofing assembly for a better understanding of the practical aspects. In addition, a comparison of 2D heat transfer analysis (using THERM software) and 3D analysis using HEAT3 is also discussed.
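For intuition on why a tiny area of steel matters, here is a minimal parallel-path estimate of the effective R-value of an insulation layer penetrated by fasteners; the values are illustrative only and are not the HEAT3 model inputs (a 3-D tool captures lateral spreading that this 1-D estimate ignores).

```python
def effective_r(r_insul, r_fastener, fastener_area_fraction):
    """Area-weighted parallel heat-flow paths: U_eff = sum(a_i * U_i)."""
    u_eff = (1 - fastener_area_fraction) / r_insul + fastener_area_fraction / r_fastener
    return 1.0 / u_eff

r_eff = effective_r(r_insul=25.0, r_fastener=0.5, fastener_area_fraction=0.001)
print(r_eff, 100 * (1 - r_eff / 25.0))  # ~23.8 effective R, ~4.7% loss vs. no fasteners
```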
Quantification of myocardial fibrosis by digital image analysis and interactive stereology.
Daunoravicius, Dainius; Besusparis, Justinas; Zurauskas, Edvardas; Laurinaviciene, Aida; Bironaite, Daiva; Pankuweit, Sabine; Plancoulaine, Benoit; Herlin, Paulette; Bogomolovas, Julius; Grabauskiene, Virginija; Laurinavicius, Arvydas
2014-06-09
Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist's visual score is still widely considered ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist's visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson's trichrome-stained tissue specimens using the automated Colocalization and Genie software, by stereology grid count, and manually by the pathologist's visual score. A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and the pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie software (r > 0.9, p < 0.001) as well as the pathologist's visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained using Genie versus RVS and the pathologist score versus RVS, with mean difference values of -1.61% and 2.24%, respectively. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: the Colocalization software overestimated the area fraction of fibrosis at the lower end and underestimated it at the higher end of the RVS values. Meanwhile, the Genie software as well as the pathologist score showed more uniform results throughout the range, with a slight underestimation in the mid-range for both. Both applied digital image analysis methods revealed almost perfect correlation with the criterion standard obtained by stereology grid count and, in terms of accuracy, outperformed the pathologist's visual score. The Genie algorithm proved to be the method of choice, with the only drawback being a slight underestimation bias, which is considered acceptable for both clinical and research evaluations. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/9857909611227193.
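The core quantities in such comparisons are simple to state: an area fraction from a segmentation mask, and a Bland-Altman bias against the stereology reference. A minimal sketch with synthetic numbers chosen only to echo the scale of the reported values, not the study's data:

```python
import numpy as np

# Fibrosis area fraction from a binary segmentation mask (illustrative; the
# study's Colocalization/Genie tools are commercial and not reproduced here).
rng = np.random.default_rng(1)
mask = rng.random((512, 512)) < 0.14           # synthetic "fibrosis" pixels
area_fraction = 100.0 * mask.mean()            # percent of tissue area

# Bland-Altman bias of a method against the stereology reference (RVS).
rvs = rng.normal(14.0, 12.0, 116).clip(0)      # synthetic reference values, %
method = rvs + rng.normal(-1.6, 2.0, 116)      # synthetic method with bias
diff = method - rvs
bias, loa = diff.mean(), 1.96 * diff.std(ddof=1)
print(f"area fraction = {area_fraction:.1f}%, bias = {bias:.2f}%, LoA = +/-{loa:.2f}%")
```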
Using software security analysis to verify the secure socket layer (SSL) protocol
NASA Technical Reports Server (NTRS)
Powell, John D.
2004-01-01
The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative "Reducing Software Security Risk (RSSR) Through an Integrated Approach" offers, among its capabilities, formal verification of software security properties through the use of model-based verification (MBV) to address software security risks. [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper discusses: the need for formal analysis to assure software systems with respect to security, and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol, as a demonstration.
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2003-09-30
Development of Automated Image Analysis Software for Suspended Marine Particle Classification. Scott Samson, Center for Ocean Technology. Dates covered: 2003. The objective is to develop automated image analysis software to reduce the effort and time required for manual identification of plankton images.
A tool to include gamma analysis software into a quality assurance program.
Agnew, Christina E; McGarry, Conor K
2016-03-01
To provide a tool to enable gamma analysis software algorithms to be included in a quality assurance (QA) program. Four image sets were created, comprising two geometric images to independently test the distance-to-agreement (DTA) and dose-difference (DD) elements of the gamma algorithm, a clinical step-and-shoot IMRT field, and a clinical VMAT arc. The images were analysed using global and local gamma analysis with two in-house and eight commercially available software packages, encompassing 15 software versions. The effect of image resolution on gamma pass rates was also investigated. All but one software package accurately calculated the gamma passing rate for the geometric images. Variation in global gamma passing rates of 1% at 3%/3mm and over 2% at 1%/1mm was measured between software packages and software versions with analysis of appropriately sampled images. This study provides a suite of test images and the gamma pass rates achieved for a selection of commercially available software. This image suite will enable validation of gamma analysis software within a QA program and provide a frame of reference by which to compare results reported in the literature from various manufacturers and software versions. Copyright © 2015. Published by Elsevier Ireland Ltd.
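The gamma index underlying these comparisons combines the DD and DTA criteria into a single dimensionless quantity. Below is a minimal 1D global-gamma sketch on synthetic profiles; the grid, profiles, and tolerances are assumptions, and commercial tools operate on interpolated 2D/3D dose distributions:

```python
import numpy as np

# Minimal 1D global gamma index for two dose profiles on a common grid.
def gamma_pass_rate(dose_ref, dose_eval, x, dd=0.03, dta=3.0):
    norm = dose_ref.max()                              # global normalization
    gammas = np.empty(dose_eval.size)
    for i, (xi, di) in enumerate(zip(x, dose_eval)):
        dist2 = ((x - xi) / dta) ** 2                  # distance term
        dose2 = ((dose_ref - di) / (dd * norm)) ** 2   # dose-difference term
        gammas[i] = np.sqrt((dist2 + dose2).min())     # minimize over reference
    return 100.0 * (gammas <= 1.0).mean()

x = np.linspace(-50, 50, 201)                          # mm, 0.5 mm grid
ref = np.exp(-(x / 25.0) ** 2)                         # synthetic profile
ev = 1.01 * np.exp(-((x - 0.4) / 25.0) ** 2)           # slightly shifted/scaled
print(f"pass rate: {gamma_pass_rate(ref, ev, x):.1f}% at 3%/3mm")
```

Because the minimum is searched over discrete grid points, coarse sampling shifts pass rates, which is why the study separately examined the effect of image resolution.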
The Role of Data Analysis Software in Graduate Programs in Education and Post-Graduate Research
ERIC Educational Resources Information Center
Harwell, Michael
2018-01-01
The importance of data analysis software in graduate programs in education and post-graduate educational research is self-evident. However, the role of this software in facilitating principled statistical practice versus "cookbookery" is unclear. The need to rigorously document the role of data analysis software in students' graduate…
Gis-Based Spatial Statistical Analysis of College Graduates Employment
NASA Astrophysics Data System (ADS)
Tang, R.
2012-07-01
It is urgently necessary to understand the distribution and employment status of college graduates for proper allocation of human resources and overall arrangement of strategic industry. This study provides empirical evidence regarding the use of geocoding and spatial analysis to study the distribution and employment status of college graduates, based on data from the 2004-2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method via SPSS software was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the enterprises in the Wuhan East Lake High and New Technology Development Zone increased dramatically from 2004 to 2008 and tended to be distributed southeastward. Furthermore, the models built by statistical analysis suggest that the specialty graduates major in has an important impact on the number employed and the number of graduates engaging in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.
NASA Astrophysics Data System (ADS)
Hua, H.; Manipon, G.; Starch, M.
2017-12-01
NASA's upcoming missions are expected to generate data volumes at least an order of magnitude larger than those of current missions. A significant increase in data processing, data rates, data volumes, and long-term data archive capabilities is needed. Consequently, new challenges are emerging that impact traditional data and software management approaches. At large scales, next-generation science data systems are exploring the move onto cloud computing paradigms to support these increased needs. New implications, such as costs, data movement, collocation of data systems and archives, and moving processing closer to the data, may result in changes to the stewardship, preservation, and provenance of science data and software. With more science data systems being on-boarded onto cloud computing facilities, we can expect more Earth science data records to be both generated and kept in the cloud. But at large scales, the cost of processing and storing global data may impact architectural and system designs. Data systems will trade the cost of keeping data in the cloud against data life-cycle approaches that move "colder" data back to traditional on-premise facilities. How will this impact data citation and processing software stewardship? What are the impacts of cloud-based on-demand processing on reproducibility and provenance? Similarly, with more science processing software being moved onto cloud, virtual machine, and container-based approaches, more opportunities arise for improved stewardship and preservation. But will the science community trust data reprocessed years or decades later? We will also explore emerging questions of the stewardship of the science data system software that generates the science data records, both during and after the life of the mission.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shevitz, Daniel Wolf; Key, Brian P.; Garcia, Daniel B.
2017-09-05
The Fragment Impact Toolkit (FIT) is a software package used for probabilistic consequence evaluation of fragmenting sources. The typical use case for FIT is to simulate an exploding shell and evaluate the consequence on nearby objects. FIT is written in the programming language Python and is designed as a collection of interacting software modules. Each module has a function that interacts with the other modules to produce desired results.
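In that spirit, a toy probabilistic consequence calculation might look as follows; the distributions and target geometry are invented for illustration, and FIT itself integrates fragment trajectories rather than sampling range directly:

```python
import numpy as np

# Toy Monte Carlo in the spirit of fragment-impact consequence codes: sample
# fragment azimuths and ranges from a bursting source and estimate the chance
# that a nearby object is struck. Not FIT's actual API or physics.
rng = np.random.default_rng(42)
n = 100_000
azimuth = rng.uniform(0.0, 360.0, n)                # degrees, isotropic burst
range_m = rng.lognormal(mean=5.0, sigma=0.5, size=n)  # m, assumed range model

# Object of concern: occupies 10-20 degrees azimuth at 150-200 m stand-off.
hit = (10.0 < azimuth) & (azimuth < 20.0) & (150.0 < range_m) & (range_m < 200.0)
print(f"estimated strike probability: {hit.mean():.4f}")
```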
Educational interactive multimedia software: The impact of interactivity on learning
NASA Astrophysics Data System (ADS)
Reamon, Derek Trent
This dissertation discusses the design, development, deployment, and testing of two versions of educational interactive multimedia software. Both versions of the software are focused on teaching mechanical engineering undergraduates the fundamentals of direct-current (DC) motor physics and selection. The two versions of the Motor Workshop software cover the same basic material on motors but differ in the level of interactivity between the students and the software. Here, the level of interactivity refers to the particular role of the computer in the interaction between the user and the software. In one version, the students navigate through information that is organized by topic, reading text and viewing embedded video clips; this is referred to as "low-level interactivity" software because the computer simply presents the content. In the other version, the students are given a task to accomplish: they must design a small motor-driven 'virtual' vehicle that competes against computer-generated opponents. The interaction is guided by the software, which offers advice from 'experts' and provides contextual information; we refer to this as "high-level interactivity" software because the computer is actively participating in the interaction. The software was used in two sets of experiments, where students using the low-level interactivity software served as the 'control group' and students using the highly interactive software were the 'treatment group.' Data, including pre- and post-performance tests, questionnaire responses, learning style characterizations, activity tracking logs, and videotapes, were collected for analysis. Statistical and observational research methods were applied to the various data to test the hypothesis that the level of interactivity affects the learning situation, with higher levels of interactivity being more effective for learning. The results show that both the low-level and high-level interactive versions of the software were effective in promoting learning about the subject of motors. The focus of learning varied between users of the two versions, however. The low-level version was more effective for teaching concepts and terminology, while the high-level version seemed to be more effective for teaching engineering applications.
Usability study of clinical exome analysis software: top lessons learned and recommendations.
Shyr, Casper; Kushniruk, Andre; Wasserman, Wyeth W
2014-10-01
New DNA sequencing technologies have revolutionized the search for genetic disruptions. Targeted sequencing of all protein coding regions of the genome, called exome analysis, is actively used in research-oriented genetics clinics, with the transition to exomes as a standard procedure underway. This transition is challenging; identification of potentially causal mutation(s) amongst ∼10^6 variants requires specialized computation in combination with expert assessment. This study analyzes the usability of user interfaces for clinical exome analysis software. There are two study objectives: (1) to ascertain the key features of successful user interfaces for clinical exome analysis software based on the perspective of expert clinical geneticists, (2) to assess user-system interactions in order to reveal strengths and weaknesses of existing software, inform future design, and accelerate the clinical uptake of exome analysis. Surveys, interviews, and cognitive task analysis were performed for the assessment of two next-generation exome sequence analysis software packages. The subjects included ten clinical geneticists who interacted with the software packages using the "think aloud" method. Subjects' interactions with the software were recorded in their clinical office within an urban research and teaching hospital. All major user interface events (from the user interactions with the packages) were time-stamped and annotated with coding categories to identify usability issues in order to characterize desired features and deficiencies in the user experience. We detected 193 usability issues, the majority of which concern interface layout and navigation, and the resolution of reports. Our study highlights gaps in specific software features typical within exome analysis. The clinicians perform best when the flow of the system is structured into well-defined yet customizable layers for incorporation within the clinical workflow. The results highlight opportunities to dramatically accelerate clinician analysis and interpretation of patient genomic data. We present the first application of usability methods to evaluate software interfaces in the context of exome analysis. Our results highlight how the study of user responses can lead to identification of usability issues and challenges and reveal software reengineering opportunities for improving clinical next-generation sequencing analysis. While the evaluation focused on two distinctive software tools, the results are general and should inform active and future software development for genome analysis software. As large-scale genome analysis becomes increasingly common in healthcare, it is critical that efficient and effective software interfaces are provided to accelerate clinical adoption of the technology. Implications for improved design of such applications are discussed. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Time-dependent inertia analysis of vehicle mechanisms
NASA Astrophysics Data System (ADS)
Salmon, James Lee
Two methods for performing transient inertia analysis of vehicle hardware systems are developed in this dissertation. The analysis techniques can be used to predict the response of vehicle mechanism systems to the accelerations associated with vehicle impacts. General analytical methods for evaluating translational and rotational system dynamics are generated and evaluated for various system characteristics. The utility of the derived techniques is demonstrated by applying the generalized methods to two vehicle systems. Time-dependent accelerations measured during a vehicle-to-vehicle impact are used as input to a dynamic analysis of an automobile liftgate latch and outside door handle. Generalized Lagrange equations for a non-conservative system are used to formulate a second-order nonlinear differential equation defining the response of the components to the transient input. The differential equation is solved by employing the fourth-order Runge-Kutta method. The events are then analyzed using commercially available two-dimensional rigid body dynamic analysis software. The results of the two analytical techniques are compared to experimental data generated by high-speed film analysis of tests of the two components performed on a high-G acceleration sled at Ford Motor Company.
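The numerical core of the first method is standard: rewrite the second-order equation as a first-order system and march it with RK4. A minimal sketch, with an invented crash-pulse forcing and invented latch parameters standing in for the measured sled data:

```python
import numpy as np

# RK4 integration of theta'' = f(t, theta, theta'), written as y = [theta, omega].
def a_pulse(t):                       # deceleration pulse, m/s^2 (assumed shape)
    return 300.0 * np.exp(-((t - 0.03) / 0.01) ** 2)

def f(t, y, c=0.5, k=80.0, r=0.05):   # invented damping, stiffness, lever arm
    theta, omega = y
    # pendulum-like latch lever: inertial forcing from a(t), restoring + damping
    return np.array([omega,
                     r * a_pulse(t) * np.cos(theta) - c * omega - k * np.sin(theta)])

def rk4(f, y0, t):
    y = np.empty((t.size, y0.size)); y[0] = y0
    for i in range(t.size - 1):
        h = t[i + 1] - t[i]
        k1 = f(t[i], y[i])
        k2 = f(t[i] + h / 2, y[i] + h / 2 * k1)
        k3 = f(t[i] + h / 2, y[i] + h / 2 * k2)
        k4 = f(t[i] + h, y[i] + h * k3)
        y[i + 1] = y[i] + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

t = np.linspace(0.0, 0.2, 2001)
y = rk4(f, np.array([0.0, 0.0]), t)
print(f"peak rotation: {np.degrees(np.abs(y[:, 0]).max()):.1f} deg")
```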
The Impact of Software on Associate Degree Programs in Electronic Engineering Technology.
ERIC Educational Resources Information Center
Hata, David M.
1986-01-01
Assesses the range and extent of computer assisted instruction software available in electronic engineering technology education. Examines the need for software skills in four areas: (1) high-level languages; (2) assembly language; (3) computer-aided engineering; and (4) computer-aided instruction. Outlines strategies for the future in three…
Optimal Software Strategies in the Presence of Network Externalities
ERIC Educational Resources Information Center
Liu, Yipeng
2009-01-01
Network externalities or alternatively termed network effects are pervasive in computer software markets. While software vendors consider pricing strategies, they must also take into account the impact of network externalities on their sales. My main interest in this research is to describe a firm's strategies and behaviors in the presence of…
Fault Tree Analysis Application for Safety and Reliability
NASA Technical Reports Server (NTRS)
Wallace, Dolores R.
2003-01-01
Many commercial software tools exist for fault tree analysis (FTA), an accepted method for mitigating risk in systems. The method embedded in the tools identifies root causes in system components, but when software is identified as a root cause, the tools do not build trees into the software component. No commercial software tools have been built specifically for the development and analysis of software fault trees. Research indicates that the methods of FTA could be applied to software, but the method is not practical without automated tool support. With appropriate automated tool support, software fault tree analysis (SFTA) may be a practical technique for identifying the underlying causes of software faults that may lead to critical system failures. We strive to demonstrate that existing commercial tools for FTA can be adapted for use with SFTA, and that, applied to a safety-critical system, SFTA can be used to identify serious potential problems long before integration and system testing.
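Under the usual independence assumptions, the gate arithmetic that such tools automate is straightforward; a sketch with an invented software-fault subtree, not the output of any specific commercial tool:

```python
from math import prod

# Minimal AND/OR gate evaluation over independent basic events.
def p_and(*ps):                 # output fails only if all inputs fail
    return prod(ps)

def p_or(*ps):                  # output fails if at least one input fails
    return 1.0 - prod(1.0 - p for p in ps)

# Invented subtree: a defect is present (design OR coding), escapes both
# inspection AND testing, and is triggered in operation.
p_defect = p_or(0.02, 0.05)              # design or coding defect present
p_escape = p_and(0.4, 0.1)               # missed by inspection and by testing
p_top = p_and(p_defect, p_escape, 1e-3)  # triggered in operation
print(f"top-event probability: {p_top:.2e}")
```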
RELAP-7 Software Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process, a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.
JAMS - a software platform for modular hydrological modelling
NASA Astrophysics Data System (ADS)
Kralisch, Sven; Fischer, Christian
2015-04-01
Current challenges in understanding and assessing the impacts of climate and land use changes on environmental systems demand an ever-increasing integration of data and process knowledge in the corresponding simulation models. Software frameworks that allow for the seamless creation of integrated models from less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an open-source software framework that has been designed especially to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach to representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models using domain-specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models have been implemented and successfully applied during the last years. We will present the JAMS core concepts and give an overview of models, simulation components, and support tools available for the framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.
DEPEND - A design environment for prediction and evaluation of system dependability
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.; Iyer, Ravishankar K.
1990-01-01
The development of DEPEND, an integrated simulation environment for the design and dependability analysis of fault-tolerant systems, is described. DEPEND models both hardware and software components at a functional level, and allows automatic failure injection to assess system performance and reliability. It relieves the user of the work needed to inject failures, maintain statistics, and output reports. The automatic failure injection scheme is geared toward evaluating a system under high stress (workload) conditions. The failures that are injected can affect both hardware and software components. To illustrate the capability of the simulator, a distributed system which employs a prediction-based, dynamic load-balancing heuristic is evaluated. Experiments were conducted to determine the impact of failures on system performance and to identify the failures to which the system is especially susceptible.
NDE scanning and imaging of aircraft structure
NASA Astrophysics Data System (ADS)
Bailey, Donald; Kepler, Carl; Le, Cuong
1995-07-01
The Science and Engineering Lab at McClellan Air Force Base, Sacramento, Calif., has been involved in the development and use of computer-based scanning systems for NDE (nondestructive evaluation) since 1985. This paper describes the history leading up to our current applications, which employ eddy current and ultrasonic scanning of aircraft structures that contain both metallics and advanced composites. The scanning is performed using industrialized computers interfaced to proprietary acquisition equipment and software. Examples are shown that image several types of damage, such as exfoliation and fuselage lap joint corrosion in aluminum, and impact damage, embedded foreign material, and porosity in Kevlar and graphite epoxy composites. Image analysis techniques are reported that use consumer-oriented computer hardware and software that are neither NDE-specific nor expensive.
NASA Technical Reports Server (NTRS)
Brunelle, J. E.; Eckhardt, D. E., Jr.
1985-01-01
Results are presented of an experiment conducted in the NASA Avionics Integrated Research Laboratory (AIRLAB) to investigate the implementation of fault-tolerant software techniques on fault-tolerant computer architectures, in particular the Software Implemented Fault Tolerance (SIFT) computer. The N-version programming and recovery block techniques were implemented on a portion of the SIFT operating system. The results indicate that, to effectively implement fault-tolerant software design techniques, system requirements will be impacted and suggest that retrofitting fault-tolerant software on existing designs will be inefficient and may require system modification.
Proceedings of the 14th Annual Software Engineering Workshop
NASA Technical Reports Server (NTRS)
1989-01-01
Several software-related topics are presented. Topics covered include studies and experiments at the Software Engineering Laboratory at the Goddard Space Flight Center, predicting project success from the Software Project Management Process, software environments, testing in a reuse environment, domain-directed reuse, and classification tree analysis using the Amadeus measurement and empirical analysis.
Environmental impact of PV cell waste scenario.
Bogacka, M; Pikoń, K; Landrat, M
2017-12-01
Rapid growth in the volume of waste from PV cells is expected in the coming years, and its utilization appears to be among the most important issues for future waste management systems. The environmental impacts of a PV recycling scenario are presented in the manuscript. The analysis is based on the LCA approach, using average data available in specialized databases for a standard silicon PV cell. The functional unit includes parameters such as efficiency, composition, and surface area. The change in environmental impact due to the location of the PV production and waste processing plants is discussed, as is the environmental effect of substituting different energy resources with PV cells. The analysis of the PV cell life-cycle scenario presented in the article was performed using the SimaPro software and data from the Ecoinvent 3.0 database, together with additional data obtained from other sources. Copyright © 2017 Elsevier Ltd. All rights reserved.
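The LCIA arithmetic behind such studies reduces to multiplying an inventory vector by characterization factors per impact category. A generic sketch with placeholder numbers, not Ecoinvent data or SimaPro output:

```python
import numpy as np

# Generic life-cycle impact calculation: impacts = characterization matrix
# times inventory vector. All numbers are invented placeholders.
inventory = np.array([1.2e2, 3.4e0, 5.0e-1])  # kg CO2, kg SO2, kg PM per functional unit
cf = np.array([
    [1.0, 0.0, 0.0],   # climate change, kg CO2-eq per kg emission
    [0.0, 1.2, 0.0],   # acidification, kg SO2-eq per kg emission
])
impacts = cf @ inventory
print(dict(zip(["climate change", "acidification"], impacts.round(2))))
```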
Design and validation of Segment--freely available software for cardiovascular image analysis.
Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan
2010-01-11
Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. Segment is a well-validated comprehensive software package for cardiovascular image analysis. It is freely available for research purposes provided that relevant original research publications related to the software are cited.
User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh
NASA Astrophysics Data System (ADS)
Jones, Craig H.
2002-12-01
"PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.
Visual Environments for CFD Research
NASA Technical Reports Server (NTRS)
Watson, Val; George, Michael W. (Technical Monitor)
1994-01-01
This viewgraph presentation gives an overview of the visual environments for computational fluid dynamics (CFD) research. It includes details on critical needs from the future computer environment, features needed to attain this environment, prospects for changes in and the impact of the visualization revolution on the human-computer interface, human processing capabilities, limits of personal environment and the extension of that environment with computers. Information is given on the need for more 'visual' thinking (including instances of visual thinking), an evaluation of the alternate approaches for and levels of interactive computer graphics, a visual analysis of computational fluid dynamics, and an analysis of visualization software.
Multi-viewpoint clustering analysis
NASA Technical Reports Server (NTRS)
Mehrotra, Mala; Wild, Chris
1993-01-01
In this paper, we address the feasibility of partitioning rule-based systems into a number of meaningful units to enhance the comprehensibility, maintainability and reliability of expert systems software. Preliminary results have shown that no single structuring principle or abstraction hierarchy is sufficient to understand complex knowledge bases. We therefore propose the Multi View Point - Clustering Analysis (MVP-CA) methodology to provide multiple views of the same expert system. We present the results of using this approach to partition a deployed knowledge-based system that navigates the Space Shuttle's entry. We also discuss the impact of this approach on verification and validation of knowledge-based systems.
Parameters sensitivity on mooring loads of ship-shaped FPSOs
NASA Astrophysics Data System (ADS)
Hasan, Mohammad Saidee
2017-12-01
This paper focuses on the assessment and evaluation of the mooring system of a ship-shaped FPSO unit. In particular, the purpose of the study is to find the impact on mooring loads of variations in different parameters, using the MIMOSA software. First, a base case was designed for an intact mooring system in a typical ultimate limit state (ULS) condition; then the sensitivity of mooring loads to parameters such as turret location, analysis method (quasi-static vs. dynamic), low-frequency damping level in surge, pretension, and drag coefficients on chain and steel wire was investigated. It is found that mooring loads change with these parameters. In particular, pretension has a large impact on the maximum tension of the mooring lines, and low-frequency damping can change the surge offset significantly.
GNSS software receiver sampling noise and clock jitter performance and impact analysis
NASA Astrophysics Data System (ADS)
Chen, Jian Yun; Feng, XuZhe; Li, XianBin; Wu, GuangYao
2015-02-01
The design of multi-frequency, multi-constellation GNSS software-defined radio receivers is becoming more and more popular due to its simple architecture, flexible configuration, and good coherence in multi-frequency signal processing. It plays an important role in navigation signal processing and signal quality monitoring. In particular, GNSS software-defined radio receivers driving the sampling clock of the analogue-to-digital converter (ADC) by FPGA imply that a more flexible radio transceiver design is possible. According to the concept of software-defined radio (SDR), the ideal is to digitize as close to the antenna as possible. However, since the carrier frequency of a GNSS signal is on the order of GHz, converting at this frequency is expensive and consumes more power. The band sampling method is a cheaper, more effective alternative; it makes it possible to sample an RF signal at twice the bandwidth of the signal. Unfortunately, as the other side of the coin, the introduction of the SDR concept and the band sampling method negatively influences the performance of GNSS receivers: ADCs suffer larger sampling clock jitter generated by the FPGA, and the low sampling frequency introduces more noise into the receiver. The influence of sampling noise therefore cannot be neglected. The paper analyzes the sampling noise, presents its influence on the carrier-to-noise ratio, and derives the ranging error by calculating the synchronization error of the delay-locked loop. Simulations addressing each factor of the sampling-noise-induced ranging error are performed. Simulation and experiment results show that if the target ranging accuracy is at the level of centimeters, the quantization length should be no less than 8 bits and the sampling clock jitter should not exceed 30 ps.
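The jitter requirement can be sanity-checked with the standard aperture-jitter bound for ADC sampling (a textbook relation, not taken from the paper): jitter-limited SNR = -20·log10(2π·f_in·t_j). A minimal sketch, assuming a 50 MHz intermediate frequency after band sampling:

```python
import numpy as np

# Textbook aperture-jitter bound on ADC SNR for an input at f_in_hz.
def jitter_snr_db(f_in_hz, t_jitter_s):
    return -20.0 * np.log10(2.0 * np.pi * f_in_hz * t_jitter_s)

f_if = 50e6  # Hz, assumed IF after band sampling (not stated in the abstract)
for tj in (5e-12, 30e-12, 100e-12):
    print(f"t_j = {tj*1e12:5.0f} ps -> jitter-limited SNR = {jitter_snr_db(f_if, tj):5.1f} dB")
```

At 30 ps jitter this gives roughly 40 dB, comparable to the ~50 dB quantization SNR of an ideal 8-bit converter (6.02N + 1.76 dB), which suggests why the two limits are quoted together.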
NASA Astrophysics Data System (ADS)
Cork, Chris; Lugg, Robert; Chacko, Manoj; Levi, Shimon
2005-06-01
With the exponential increase in output database size due to the aggressive optical proximity correction (OPC) and resolution enhancement techniques (RET) required for deep sub-wavelength process nodes, the CPU time required for mask tape-out continues to increase significantly. For integrated device manufacturers (IDMs), this can impact the time-to-market for their products, where even a few days' delay could have a huge commercial impact and loss of market window opportunity. For foundries, a shorter turnaround time provides a competitive advantage in their demanding market; too slow could mean customers looking elsewhere for these services, while a fast turnaround may even command a higher price. With FAB turnaround of a mature, plain-vanilla CMOS process at around 20-30 days, a delay of several days in mask tape-out would contribute a significant fraction of the total time to deliver prototypes. Unlike silicon processing, mask tape-out time can be decreased by simply purchasing extra computing resources and software licenses. Mask tape-out groups are taking advantage of the ever-decreasing hardware cost and increasing power of commodity processors. The significant distributability inherent in some commercial mask synthesis software can be leveraged to address this critical business issue. Different implementations have different fractions of code that cannot be parallelized, and this affects the efficiency with which they scale, as described by Amdahl's law. Very few are efficient enough to allow the effective use of thousands of processors, enabling run times to drop from days to only minutes. What follows is a cost-aware methodology to quantify the scalability of this class of software and thus act as a guide to estimating the optimal investment in terms of hardware and software licenses.
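Amdahl's law makes the scalability claim concrete: with parallel fraction p, the speedup on n processors is 1/((1-p) + p/n). A minimal sketch with invented serial fractions and a three-day baseline run, not measurements from any mask synthesis product:

```python
# Amdahl's-law turnaround for a hypothetical 3-day tape-out job.
def speedup(n, parallel_frac):
    return 1.0 / ((1.0 - parallel_frac) + parallel_frac / n)

base_days = 3.0  # assumed single-processor run time
for p in (0.95, 0.99, 0.999):
    for n in (10, 100, 1000):
        hours = base_days * 24.0 / speedup(n, p)
        print(f"p={p:.3f}  n={n:5d} -> {hours:7.1f} h")
```

Only at p = 0.999 does a 1000-processor farm bring the three-day run down to minutes, which is the point the abstract makes about implementation efficiency.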
1997-12-01
Watts Humphrey and is described in his book A Discipline for Software Engineering [Humphrey 95]. Its intended use is to guide the planning and... Ferguson, Pat; Humphrey, Watts S.; Khajenoori, Soheil; Macke, Susan; & Matvya, Annette. "Introducing the Personal Software Process: Three Industry Case..." [Humphrey 95] Humphrey, Watts S. A Discipline for Software Engineering. Reading, MA: Addison-Wesley, 1995.
LITERATURE REVIEW ON IMPACT OF GLYCOLATE ON THE 2H EVAPORATOR AND THE EFFLUENT TREATMENT FACILITY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adu-Wusu, K.
2012-05-10
Glycolic acid (GA) is being studied as an alternate reductant in the Defense Waste Processing Facility (DWPF) feed preparation process. It will either totally or partially replace the formic acid that is currently used. A literature review has been conducted on the impact of glycolate on two post-DWPF downstream systems: the 2H Evaporator system and the Effluent Treatment Facility (ETF). The DWPF recycle stream serves as a portion of the feed to the 2H Evaporator; glycolate enters the evaporator system from the glycolate in the recycle stream. The overhead (i.e., condensed phase) from the 2H Evaporator serves as a portion of the feed to the ETF. The literature search revealed that virtually no impact is anticipated for the 2H Evaporator. Glycolate may help reduce scale formation in the evaporator due to its high complexing ability. The drawback of this solubilizing ability is the potential impact on the criticality analysis of the 2H Evaporator system. It is recommended that at least a theoretical evaluation be performed to confirm the finding that no self-propagating violent reactions with nitrate/nitrites will occur. Similarly, identification of sources of ignition relevant to glycolate and/or an update of the composite flammability analysis to reflect the effects of the glycolate additions for the 2H Evaporator system are in order. An evaluation of the 2H Evaporator criticality analysis is also needed. A determination of the amount or fraction of the glycolate in the evaporator overhead is critical to more accurately assess its impact on the ETF. Hence, use of predictive models like the OLI Environmental Simulation Package software (OLI/ESP) and/or testing is recommended for the determination of the glycolate concentration in the overhead. The impact on the ETF depends on the concentration of glycolate in the ETF feed. The impact is classified as minor for feed glycolate concentrations ≤ 33 mg/L (0.44 mM). The ETF unit operations that will have minor/major impacts are chlorination, pH adjustment, 1st mercury removal, organics removal, 2nd mercury removal, and ion exchange. For minor impacts, the general approach is to use historical process operations data, modeling software like OLI/ESP, and/or monitoring and compiled process operations data to resolve any uncertainties, with testing as a last resort. For major impacts (i.e., glycolate concentrations > 33 mg/L or 0.44 mM), testing is recommended. No impact is envisaged for the following ETF unit operations regardless of the glycolate concentration: filtration, reverse osmosis, ion exchange resin regeneration, and evaporation.
The Chandra X-ray Center data system: supporting the mission of the Chandra X-ray Observatory
NASA Astrophysics Data System (ADS)
Evans, Janet D.; Cresitello-Dittmar, Mark; Doe, Stephen; Evans, Ian; Fabbiano, Giuseppina; Germain, Gregg; Glotfelty, Kenny; Hall, Diane; Plummer, David; Zografou, Panagoula
2006-06-01
The Chandra X-ray Center Data System provides end-to-end scientific software support for Chandra X-ray Observatory mission operations. The data system includes the following components: (1) observers' science proposal planning tools; (2) science mission planning tools; (3) science data processing, monitoring, and trending pipelines and tools; and (4) data archive and database management. A subset of the science data processing component is ported to multiple platforms and distributed to end-users as a portable data analysis package. Web-based user tools are also available for data archive search and retrieval. We describe the overall architecture of the data system and its component pieces, and consider the design choices and their impacts on maintainability. We discuss the many challenges involved in maintaining a large, mission-critical software system with limited resources. These challenges include managing continually changing software requirements and ensuring the integrity of the data system and resulting data products while being highly responsive to the needs of the project. We describe our use of COTS and OTS software at the subsystem and component levels, our methods for managing multiple release builds, and adapting a large code base to new hardware and software platforms. We review our experiences during the life of the mission so-far, and our approaches for keeping a small, but highly talented, development team engaged during the maintenance phase of a mission.
NASA Technical Reports Server (NTRS)
Shull, Forrest; Feldmann, Raimund; Haingaertner, Ralf; Regardie, Myrna; Seaman, Carolyn
2007-01-01
It is often the case in software projects that when schedule and budget resources are limited, the verification and validation (V&V) activities suffer. Fewer V&V activities can be afforded, and moreover, short-term challenges can result in V&V activities being scaled back or dropped altogether. As a result, too often the default solution is to postpone activities for improving software quality until too late in the life-cycle, relying on late-term code inspections followed by thorough testing activities to reduce defect counts to acceptable levels. As many project managers realize, however, this is a resource-intensive way of achieving the required quality for software. The Full Life-cycle Defect Management Assessment Initiative, funded by NASA's Office of Safety and Mission Assurance under the Software Assurance Research Program, aims to address these problems by: improving the effectiveness of early life-cycle V&V activities to make their benefits more attractive to team leads (specifically, we focus on software inspection, a proven method that can be applied to any software work product, long before executable code has been developed); better communicating this effectiveness to software development teams, along with suggestions for parameters to improve in the future to increase effectiveness; and analyzing the impact of early life-cycle V&V on the effectiveness and cost required for late life-cycle V&V activities, such as testing, in order to make the tradeoffs more apparent. This white paper reports on an initial milestone in this work, the development of a preliminary model of inspection effectiveness across multiple NASA Centers. This model contributes toward reaching our project goals by: allowing an examination of inspection parameters, across different types of projects and different work products, for an analysis of factors that impact defect detection effectiveness; allowing a comparison of this NASA-specific model to existing recommendations in the literature regarding how to plan effective inspections; and forming a baseline model which can be extended to incorporate factors describing the numbers and types of defects that are missed by inspections, how such defects flow downstream through software development phases, and how effectively they can be caught by testing activities in the late stages of development. The model has been implemented in a prototype web-enabled decision-support tool which allows developers to enter their inspection data and receive feedback based on a comparison against the model. The tool also allows users to access reusable materials (such as checklists) from projects included in the baseline. Both the tool itself and the model underlying it will continue to be extended throughout the remainder of this initiative. As results of analyzing inspection effectiveness for defect containment are determined, they can be shared via the tool and also via updates to existing training courses on metrics and software inspections. Moreover, the tool will help satisfy key CMMI requirements for the NASA Centers, as it will enable NASA to take a global view across peer review results for various types of projects to identify systemic problems. This analysis can result in continuous improvements to the approach to verification.
Integrating Formal Methods and Testing 2002
NASA Technical Reports Server (NTRS)
Cukic, Bojan
2002-01-01
Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting for not only formal verification and program testing, but also the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or higher). The coming years shall address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The goals are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a given confidence level; D) quantify and justify the reliability estimate for systems developed using various methods.
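The scale of the testing burden the framework aims to reduce follows from the classical zero-failure demonstration argument: n ≥ ln(1-C)/ln(R) failure-free tests are needed to claim per-demand failure probability 1-R at confidence C. A short illustration of why testing alone is impractical at ultra-high levels:

```python
import math

# Zero-failure (success-run) reliability demonstration test count.
def tests_needed(per_demand_failure_prob, confidence):
    r = 1.0 - per_demand_failure_prob
    return math.ceil(math.log(1.0 - confidence) / math.log(r))

for p in (1e-3, 1e-4, 1e-6):
    n = tests_needed(p, 0.99)
    print(f"p_f = {p:.0e}: {n:,} failure-free tests at 99% confidence")
```

Demonstrating 10^-4 takes about 46,000 failure-free tests, and 10^-6 takes millions, which is the gap that credit for formal verification and other V&V evidence is meant to close.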
Semantic Metrics for Analysis of Software
NASA Technical Reports Server (NTRS)
Etzkorn, Letha H.; Cox, Glenn W.; Farrington, Phil; Utley, Dawn R.; Ghalston, Sampson; Stein, Cara
2005-01-01
A recently conceived suite of object-oriented software metrics focuses on semantic aspects of software, in contradistinction to traditional software metrics, which focus on syntactic aspects of software. Semantic metrics represent a more human-oriented view of software than do syntactic metrics. The semantic metrics of a given computer program are calculated by use of the output of a knowledge-based analysis of the program, and are substantially more representative of software quality and more readily comprehensible from a human perspective than are the syntactic metrics.
NASA Astrophysics Data System (ADS)
Zelt, C. A.
2017-12-01
Earth science attempts to understand how the earth works. This research often depends on software for modeling, processing, inverting, or imaging. Freely sharing open-source software is essential to prevent reinventing the wheel and allows software to be improved and applied in ways the original author may never have envisioned. For young scientists, releasing software can increase their name recognition when applying for jobs and funding, and create opportunities for collaborations when scientists who collect data want the software's creator to be involved in their project. However, we frequently hear scientists say software is a tool, not science. Creating software that implements a new or better way of earth modeling or geophysical processing, inverting, or imaging should be viewed as earth science. Creating software for things like data visualization, format conversion, storage, or transmission, or programming to enhance computational performance, may be viewed as computer science. The former, ideally with an application to real data, can be published in earth science journals; the latter possibly in computer science journals. Citations in either case should accurately reflect the impact of the software on the community. Funding agencies need to support more software development and open-source releases, and the community should give more high-profile awards for developing impactful open-source software. Funding support and community recognition for software development can have far-reaching benefits when the software is used in foreseen and unforeseen ways, potentially for years after the original investment in the software development. For funding, a well-documented open-source release should be required, with example input and output files. Appropriate funding will provide the incentive and time to release user-friendly software and minimize the need for others to duplicate the effort. All funded software should be available through a single web site, ideally maintained by someone in a funded position. Perhaps the biggest challenge is the reality that researchers who use software, as opposed to developing it, are more attractive university hires because they are more likely to be "big picture" scientists who publish in the highest-profile journals, although sometimes the two go together.
Ethics and Morality in Software Development: A Developer's Perspective
ERIC Educational Resources Information Center
Stephenson, James H.
2010-01-01
Computers and other digital devices have become ubiquitous in our lives. Almost all aspects of our lives are in part or wholly impacted by computers and the software that runs on them. Unknowingly, we are placing our livelihoods and even our lives in the hands of unknown software developers. Ethical and moral decisions made during software…
The Effect of Firm Strategy and Corporate Performance on Software Market Growth in Emerging Regions
ERIC Educational Resources Information Center
Mertz, Sharon A.
2013-01-01
The purpose of this research is to evaluate the impact of firm strategies and corporate performance on enterprise software market growth in emerging regions. The emerging regions of Asia Pacific, Eastern Europe, the Middle East and Africa, and Latin America, currently represent smaller overall markets for software vendors, but exhibit high growth…
NASA Technical Reports Server (NTRS)
French, Scott W.
1991-01-01
The goals are to show that verifying and validating a software system is a required part of software development and has a direct impact on the software's design and structure. Workshop tasks are given in the areas of statistics, integration/system test, unit and architectural testing, and a traffic controller problem.
ERIC Educational Resources Information Center
Chung, Gregory K. W. K.
This study examined a civil engineering capstone course that embedded a sophisticated simulation-based task within instruction. Students (n=28) were required to conduct a hazardous waste site investigation using simulation software designed specifically for the course (Interactive Site Investigation Software) (ISIS). The software simulated…
Differential item functioning magnitude and impact measures from item response theory models.
Kleinman, Marjorie; Teresi, Jeanne A
2016-01-01
Measures of magnitude and impact of differential item functioning (DIF) at the item and scale level, respectively, are presented and reviewed in this paper. Most measures are based on item response theory models. Magnitude refers to item-level effect sizes, whereas impact refers to differences between groups at the scale score level. Reviewed are magnitude measures based on group differences in the expected item scores and impact measures based on differences in the expected scale scores. The similarities among these indices are demonstrated. Various software packages are described that provide magnitude and impact measures, and new software is presented that computes all of the available statistics conveniently in one program, with explanations of their relationships to one another.
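A common IRT-based magnitude measure is the signed or unsigned difference in expected item scores between groups, integrated over an ability distribution. A minimal sketch under an assumed 2PL model with invented parameters, illustrative of the class of indices reviewed rather than any specific package:

```python
import numpy as np

# Expected item score difference under a 2PL IRT model as a DIF magnitude index.
def p_2pl(theta, a, b):
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

theta = np.linspace(-4, 4, 801)
w = np.exp(-0.5 * theta ** 2)
w /= w.sum()                               # standard-normal ability weights

a_ref, b_ref = 1.2, 0.0                    # reference-group item parameters
a_foc, b_foc = 1.2, 0.4                    # focal group: uniform DIF in difficulty

d = p_2pl(theta, a_ref, b_ref) - p_2pl(theta, a_foc, b_foc)
signed, unsigned = np.sum(w * d), np.sum(w * np.abs(d))
print(f"signed DIF = {signed:.3f}, unsigned DIF = {unsigned:.3f}")
```

Summing such item-level differences over a scale gives the scale-level impact the paper distinguishes from item-level magnitude.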
The dynamics of software development project management: An integrative systems dynamic perspective
NASA Technical Reports Server (NTRS)
Vandervelde, W. E.; Abdel-Hamid, T.
1984-01-01
Rather than continuing to focus on software development projects per se, the system dynamics modeling approach outlined is extended to investigate a broader set of issues pertaining to the software development organization. Rather than trace the life cycle(s) of one or more software projects, the focus is on the operations of a software development department as a continuous stream of software products is developed, placed into operation, and maintained. A number of research questions are "ripe" for investigation, including: (1) the efficacy of different organizational structures in different software development environments, (2) personnel turnover, (3) the impact of management approaches such as management by objectives, and (4) the organizational/environmental determinants of productivity.
A statistical model of operational impacts on the framework of the bridge crane
NASA Astrophysics Data System (ADS)
Antsev, V. Yu; Tolokonnikov, A. S.; Gorynin, A. D.; Reutov, A. A.
2017-02-01
The technical regulations of the Customs Union demand implementation of risk analysis of bridge crane operation at the design stage. A statistical model has been developed for performing random calculations of risks, allowing possible operational influences on the bridge crane metal structure to be modeled in their various combinations. The statistical model is implemented in a software product for automated calculation of the risk of failure occurrence of bridge cranes.
Analysis of Department of Defense Social Media Policy and Its Impact on Operational Security
2015-06-01
1976, Generation Y was born between 1977 and 1994, and Generation Z was born after 1995 (Schroer, n.d.). With over 1.3 billion users, Facebook...social media platforms, such as Twitter, Instagram, LinkedIn, and Pinterest, have shown significant user growth over the last six years (Duggan et al...generated content, social software, Web-based email, instant messaging, and discussion forum. Some examples of IbC include YouTube, Facebook, Flickr
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
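As a toy illustration of the kind of cause-effect model described here, the following sketch hand-rolls a minimal discrete Bayesian network in which the three binary driving factors (team skill, process maturity, problem complexity) determine the probability that the product is suitable. All probability values are invented for illustration; the paper's actual network structure and quantification are not reproduced.

    import itertools

    # Hypothetical priors: each driving factor is binary (0 = low, 1 = high).
    p_skill = {0: 0.3, 1: 0.7}
    p_maturity = {0: 0.4, 1: 0.6}
    p_complexity = {0: 0.5, 1: 0.5}

    # P(suitable | skill, maturity, complexity) -- illustrative numbers only.
    p_suitable = {
        (0, 0, 0): 0.40, (0, 0, 1): 0.15, (0, 1, 0): 0.55, (0, 1, 1): 0.25,
        (1, 0, 0): 0.65, (1, 0, 1): 0.35, (1, 1, 0): 0.90, (1, 1, 1): 0.60,
    }

    def prob_suitable(evidence=None):
        """Marginal P(suitable) given partial evidence on the parent nodes."""
        evidence = evidence or {}
        total = 0.0
        for s, m, c in itertools.product((0, 1), repeat=3):
            # Skip assignments inconsistent with the observed evidence.
            if any(evidence.get(k, v) != v for k, v in
                   (("skill", s), ("maturity", m), ("complexity", c))):
                continue
            weight = 1.0
            if "skill" not in evidence:      weight *= p_skill[s]
            if "maturity" not in evidence:   weight *= p_maturity[m]
            if "complexity" not in evidence: weight *= p_complexity[c]
            total += weight * p_suitable[(s, m, c)]
        return total

    print(prob_suitable())                               # prior forecast
    print(prob_suitable({"skill": 1, "complexity": 1}))  # updated forecast

Observing evidence on the parent nodes (e.g., a highly skilled team facing a complex problem) updates the suitability forecast, mirroring how such a network refines predictions as the development effort progresses.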
Smart energy management system
NASA Astrophysics Data System (ADS)
Desai, Aniruddha; Singh, Jugdutt
2010-04-01
Peak and average energy usage in domestic and industrial environments is growing rapidly, and the absence of detailed energy consumption metrics is making systematic reduction of energy usage very difficult. The smart energy management system aims at providing a cost-effective solution for managing soaring energy consumption and its impact on greenhouse gas emissions and climate change. The solution is based on seamless integration of existing wired and wireless communication technologies, combined with smart context-aware software, which offers a complete solution for automation of energy measurement and device control. The persuasive software presents users with easy-to-assimilate visual cues identifying problem areas and time periods, and encourages a behavioural change to conserve energy. The system allows analysis of real-time/statistical consumption data with the ability to drill down into detailed analysis of power consumption, CO2 emissions, and cost. The system generates intelligent projections and suggests potential methods (e.g. reducing standby, tuning heating/cooling temperature, etc.) of reducing energy consumption. The user interface is accessible using web-enabled devices such as PDAs and PCs, or using SMS, email, and instant messaging. A successful real-world trial of the system has demonstrated the potential to save 20 to 30% of energy consumption on average. Low cost of deployment and the ability to easily manage consumption from various web-enabled devices give this system high penetration and impact capability, offering a sustainable solution to act on climate change today.
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac
2017-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning across all software life-cycle phases. We also find that the software releases were developed and tested for four different hardware platforms, spanning from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes for the same. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Also such releases seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. Such releases exhibit poor reliability growth, and hence exhibit high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to the management to improve the software development process. As NASA has moved towards a product line engineering for its flight software development, software for future space missions will be developed in a similar manner and hence the analysis results for this mission can be considered as a baseline for future flight software missions. PMID:29278255
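For intuition about the SRGM-fitting step, the following minimal sketch fits the delayed S-shaped NHPP mean value function m(t) = a(1 - (1 + bt)exp(-bt)) to weekly cumulative defect counts by nonlinear least squares. The defect data and starting values are invented; the paper's actual release data and its Log-Logistic fits are not reproduced.

    import numpy as np
    from scipy.optimize import curve_fit

    def m_s_shaped(t, a, b):
        """Mean value function of the delayed S-shaped NHPP SRGM."""
        return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

    # Hypothetical weekly cumulative defect counts for one release.
    t = np.arange(1, 13, dtype=float)
    defects = np.array([2, 6, 13, 22, 31, 38, 44, 48, 51, 53, 54, 55], dtype=float)

    (a_hat, b_hat), _ = curve_fit(m_s_shaped, t, defects, p0=(60.0, 0.3))

    # Failure intensity is the derivative of m: lambda(t) = a * b**2 * t * exp(-b*t)
    lam = a_hat * b_hat**2 * t[-1] * np.exp(-b_hat * t[-1])
    print(f"estimated total defects a = {a_hat:.1f}, shape b = {b_hat:.2f}")
    print(f"predicted failure intensity at week {t[-1]:.0f}: {lam:.2f}/week")

A release exhibiting reliability decay would show cumulative counts that bend upward rather than saturate, which is why a distribution function rather than an SRGM is fit in those cases.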
Wayland, Matthew T; Defaye, Arnaud; Rocha, Joao; Jayaram, Satish Arcot; Royet, Julien; Miguel-Aliaga, Irene; Leulier, François; Cognigni, Paola
2014-10-01
The intestinal physiology of Drosophila melanogaster can be monitored in an integrative, non-invasive manner by analysing graphical features of the excreta produced by flies fed on a dye-supplemented diet. This assay has been used by various labs to explore gut function and its regulation. To facilitate its use, we present here a free, stand-alone dedicated software tool for the analysis of fly excreta. The Ultimate Reader of Dung (T.U.R.D.) is designed to offer a flexible environment for a wide range of experimental designs, with special attention to automation and high-throughput processing. This software detects the distinctive changes in acid-base and water balance previously reported to occur in response to dietary challenges and mating. We have used T.U.R.D. to test the contribution of the bacterial environment of the flies to various intestinal parameters including the established diet- and mating-triggered responses. To this end, we have analysed the faecal patterns of flies reared in germ-free conditions, upon re-association with controlled microbiota and subjected to food-borne or systemic, non-lethal bacterial infections. We find that the tested faecal outputs are unchanged in all these conditions, suggesting that the impact of the bacterial environment on the intestinal features highlighted by faecal deposit analysis is minimal. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages
ERIC Educational Resources Information Center
Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro
2017-01-01
Computer Assisted Qualitative Data Analysis Software (CAQDAS) are tools that help researchers to develop qualitative research projects. These software packages help the users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…
Kubios HRV--heart rate variability analysis software.
Tarvainen, Mika P; Niskanen, Juha-Pekka; Lipponen, Jukka A; Ranta-Aho, Perttu O; Karjalainen, Pasi A
2014-01-01
Kubios HRV is an advanced, easy-to-use software package for heart rate variability (HRV) analysis. The software supports several input data formats for electrocardiogram (ECG) data and beat-to-beat RR interval data. It includes an adaptive QRS detection algorithm and tools for artifact correction, trend removal, and analysis sample selection. The software computes all the commonly used time-domain and frequency-domain HRV parameters and several nonlinear parameters. There are several adjustable analysis settings through which the analysis methods can be optimized for different data. The ECG-derived respiratory frequency is also computed, which is important for reliable interpretation of the analysis results. The analysis results can be saved as an ASCII text file (easy to import into MS Excel or SPSS), a Matlab MAT-file, or a PDF report. The software is easy to use through its compact graphical user interface. The software is available free of charge for Windows and Linux operating systems at http://kubios.uef.fi. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
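To make the time-domain parameters concrete, the sketch below computes a few of the standard ones (SDNN, RMSSD, pNN50) from a beat-to-beat RR interval series. The synthetic RR series is an assumption for illustration; Kubios's own algorithms (QRS detection, artifact correction, frequency-domain and nonlinear measures) are not reproduced.

    import numpy as np

    def time_domain_hrv(rr_ms):
        """Common time-domain HRV parameters from beat-to-beat RR intervals (ms)."""
        rr = np.asarray(rr_ms, dtype=float)
        diff = np.diff(rr)
        return {
            "mean RR (ms)": rr.mean(),
            "mean HR (bpm)": 60000.0 / rr.mean(),
            "SDNN (ms)": rr.std(ddof=1),               # overall variability
            "RMSSD (ms)": np.sqrt(np.mean(diff**2)),   # short-term variability
            "pNN50 (%)": 100.0 * np.mean(np.abs(diff) > 50.0),
        }

    # Hypothetical RR series (ms); real input would come from QRS detection on ECG.
    rng = np.random.default_rng(1)
    rr = 850 + 40 * rng.standard_normal(300)
    for name, value in time_domain_hrv(rr).items():
        print(f"{name:14s} {value:7.1f}")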
An online database for plant image analysis software tools.
Lobet, Guillaume; Draye, Xavier; Périlleux, Claire
2013-10-09
Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software best suited to their research. We present an online, manually curated database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software solution in a uniform and concise manner, enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations, and new software submissions. The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users find solutions, and to provide developers a way to exchange and communicate about their work.
Stein, Mart Lambertus; Rudge, James W; Coker, Richard; van der Weijden, Charlie; Krumkamp, Ralf; Hanvoravongchai, Piya; Chavez, Irwin; Putthasri, Weerasak; Phommasack, Bounlay; Adisasmito, Wiku; Touch, Sok; Sat, Le Minh; Hsu, Yu-Chen; Kretzschmar, Mirjam; Timen, Aura
2012-10-12
Health care planning for pandemic influenza is a challenging task which requires predictive models by which the impact of different response strategies can be evaluated. However, current preparedness plans and simulation exercises, as well as freely available simulation models previously made for policy makers, do not explicitly address the availability of health care resources or determine the impact of shortages on public health. Nevertheless, the feasibility of health systems implementing response measures or interventions described in plans and trained in exercises depends on the available resource capacity. As part of the AsiaFluCap project, we developed a comprehensive and flexible resource modelling tool to support public health officials in understanding and preparing for surges in resource demand during future pandemics. The AsiaFluCap Simulator is a combination of a resource model containing 28 health care resources and an epidemiological model. The tool was built in MS Excel© and contains a user-friendly interface which allows users to select mild or severe pandemic scenarios, change resource parameters and run simulations for one or multiple regions. Besides epidemiological estimations, the simulator provides indications on resource gaps or surpluses, and the impact of shortages on public health for each selected region. It allows for a comparative analysis of the effects of resource availability and consequences of different strategies of resource use, which can provide guidance on resource prioritising and/or mobilisation. Simulation results are displayed in various tables and graphs, and can also be easily exported to GIS software to create maps for geographical analysis of the distribution of resources. The AsiaFluCap Simulator is freely available software (http://www.cdprg.org), which can be used by policy makers, policy advisors, donors and other stakeholders involved in preparedness for providing evidence-based and illustrative information on health care resource capacities during future pandemics. The tool can inform both preparedness plans and simulation exercises and can help increase the general understanding of dynamics in resource capacities during a pandemic. The combination of a mathematical model with multiple resources and the linkage to GIS for creating maps makes the tool unique compared to other available software.
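A drastically simplified sketch of the simulator's core idea — coupling an epidemiological model to a resource capacity — is given below: a deterministic SIR epidemic drives concurrent demand for one resource (ICU beds) in one region, and the daily gap against a fixed capacity indicates shortage. All parameters are invented; the AsiaFluCap Simulator's 28-resource model and MS Excel implementation are not reproduced.

    import numpy as np

    # Hypothetical region: SIR epidemic driving demand for one resource (ICU beds).
    N, beta, gamma = 1_000_000, 0.35, 0.20   # population, transmission, recovery
    icu_rate, icu_capacity = 0.002, 180      # fraction of infections needing ICU

    S, I, R = N - 10.0, 10.0, 0.0
    gap = []
    for day in range(300):
        new_inf = beta * S * I / N           # daily new infections
        S, I, R = S - new_inf, I + new_inf - gamma * I, R + gamma * I
        demand = icu_rate * I                # concurrent ICU demand
        gap.append(demand - icu_capacity)    # positive value = shortage

    gap = np.array(gap)
    print(f"peak shortage of {gap.max():.0f} ICU beds on day {gap.argmax()}")
    print(f"days with unmet demand: {(gap > 0).sum()}")

Running the same loop for multiple regions and resources, and comparing mild versus severe scenario parameters, reproduces the kind of gap/surplus comparison the simulator presents.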
Reconstruction of head impacts in FIS World Cup alpine skiing.
Steenstrup, Sophie Elspeth; Mok, Kam-Ming; McIntosh, Andrew S; Bahr, Roald; Krosshaug, Tron
2018-06-01
Prior to the 2013/2014 season, the International Ski Federation (FIS) increased the helmet testing speed from 5.4 to 6.8 m/s for alpine downhill, super-G and giant slalom. Whether this increased testing speed reflects head impact velocities in real head injury situations on snow is unclear. We therefore investigated the injury mechanisms and gross head impact biomechanics in seven real head injury situations among World Cup (WC) alpine skiers. We analysed nine head impacts from seven head injury videos from the FIS Injury Surveillance System, throughout nine WC seasons (2006-2015) in detail. We used commercial video-based motion analysis software to estimate head impact kinematics in two dimensions, including directly preimpact and postimpact, from broadcast video. The sagittal plane angular movement of the head was also measured using angle measurement software. In seven of nine head impacts, the estimated normal to slope preimpact velocity was higher than the current FIS helmet rule of 6.8 m/s (mean 8.1 (±SD 0.6) m/s, range 1.9±0.8 to 12.1±0.4 m/s). The nine head impacts had a mean normal to slope velocity change of 9.3±1.0 m/s, range 5.2±1.1 to 13.5±1.3 m/s. There was a large change in sagittal plane angular velocity (mean 43.3±2.9 rad/s (range 21.2±1.5 to 64.2±3.0 rad/s)) during impact. The estimated normal to slope preimpact velocity was higher than the current FIS helmet rule of 6.8 m/s in seven of nine head impacts. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Finite element modelling of aluminum alloy 2024-T3 under transverse impact loading
NASA Astrophysics Data System (ADS)
Abdullah, Ahmad Sufian; Kuntjoro, Wahyu; Yamin, A. F. M.
2017-12-01
The fiber metal laminate GLARE is a new aerospace material with great potential to be widely used in future lightweight aircraft. It consists of aluminum alloy 2024-T3 and glass-fiber reinforced laminate. In order to produce a reliable finite element model of the impact response or crashworthiness of a structure made of GLARE, one can initially model and validate the finite element model of the impact response of its constituents separately. The objective of this study was to develop a reliable finite element model of aluminum alloy 2024-T3 under low-velocity transverse impact loading using the commercial software ABAQUS. Johnson-Cook plasticity and damage models were used to predict the alloy's material properties and impact behavior. The results of the finite element analysis were compared to an experiment with similar material and impact conditions. Results showed good correlation in terms of impact forces, deformation, and failure progression, supporting the conclusion that the finite element model of 2024-T3 aluminum alloy under low-velocity transverse impact using the Johnson-Cook plasticity and damage models is reliable.
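For reference, the general textbook forms of the Johnson-Cook flow stress and failure strain used in such simulations are given below; the specific 2024-T3 constants calibrated in this study are not reproduced. Here sigma* denotes stress triaxiality and D is the accumulated damage, with element failure at D = 1.

    \sigma_y = \left(A + B\,\varepsilon_p^{\,n}\right)
               \left(1 + C \ln \dot{\varepsilon}^{*}\right)
               \left(1 - T^{*m}\right),
    \qquad
    T^{*} = \frac{T - T_{\mathrm{room}}}{T_{\mathrm{melt}} - T_{\mathrm{room}}},
    \qquad
    \dot{\varepsilon}^{*} = \dot{\varepsilon}_p / \dot{\varepsilon}_0

    \varepsilon_f = \left(d_1 + d_2\, e^{\,d_3 \sigma^{*}}\right)
                    \left(1 + d_4 \ln \dot{\varepsilon}^{*}\right)
                    \left(1 + d_5 T^{*}\right),
    \qquad
    D = \sum \frac{\Delta \varepsilon_p}{\varepsilon_f}

The three bracketed factors separate strain hardening, strain-rate sensitivity, and thermal softening, which is why the model suits high-rate impact problems.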
GWAMA: software for genome-wide association meta-analysis.
Mägi, Reedik; Morris, Andrew P
2010-05-28
Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way to improve power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical software analysis packages incorporate routines for meta-analysis, they are ill-equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
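The core computation in such meta-analysis software is fixed-effect inverse-variance weighting of per-study effect estimates. The sketch below shows that calculation for a single variant across three hypothetical studies; GWAMA's file handling, error trapping, and random-effects options are not reproduced.

    import numpy as np

    def fixed_effect_meta(betas, ses):
        """Fixed-effect inverse-variance meta-analysis of per-study effects."""
        b = np.asarray(betas, float)
        w = 1.0 / np.asarray(ses, float) ** 2    # weight = 1 / SE^2
        beta = np.sum(w * b) / np.sum(w)         # pooled effect
        se = np.sqrt(1.0 / np.sum(w))            # pooled standard error
        z = beta / se                            # test statistic
        q = np.sum(w * (b - beta) ** 2)          # Cochran's Q (heterogeneity)
        return beta, se, z, q

    # Hypothetical per-study effects for one SNP from three GWAS.
    beta, se, z, q = fixed_effect_meta([0.11, 0.08, 0.15], [0.03, 0.04, 0.05])
    print(f"beta = {beta:.3f}, SE = {se:.3f}, Z = {z:.2f}, Q = {q:.2f}")

Because only summary statistics (effect sizes and standard errors) are needed, studies never have to share individual-level genotype data.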
Impact of the 3-D model strategy on science learning of the solar system
NASA Astrophysics Data System (ADS)
Alharbi, Mohammed
The purpose of this mixed-method study, quantitative and descriptive, was to determine whether first-middle grade (seventh grade) students at Saudi schools are able to learn and use the Autodesk Maya software to interact with and create their own 3-D models and animations, and whether their use of the software influences their study habits and their understanding of the school subject matter. The study revealed that there is value to science students in using 3-D software to create 3-D models to complete science assignments. This study also aimed to address middle-school students' ability to learn 3-D software in art class and then ultimately use it in their science class. The success of this study may open the way to considering the impact of 3-D modeling on other school subjects, such as mathematics, art, and geography. When students start using graphic design, including 3-D software, at a young age, they tend to develop personal creativity and skills. The success of this study, if applied in schools, will provide the community with skillful young designers and increase awareness of graphic design and the new 3-D technology. An experimental method was used to answer the quantitative research question: are there significant differences among learning methods using 3-D models (no 3-D, premade 3-D, and create 3-D) in a science class being taught about the solar system, as measured by the students' science achievement scores? A descriptive method was used to answer the qualitative research questions concerning the difficulty of learning and using the Autodesk Maya software, the time students take to use the basic levels of the Polygon and Animation parts of the software, and the quality of the students' work.
Software Engineering Improvement Activities/Plan
NASA Technical Reports Server (NTRS)
2003-01-01
bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing a literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.
An analysis of satellite state vector observability using SST tracking data
NASA Technical Reports Server (NTRS)
Englar, T. S., Jr.; Hammond, C. L.
1976-01-01
Observability of satellite state vectors, using only SST tracking data, was investigated by covariance analysis under a variety of satellite and station configurations. These results indicate very precarious observability in most short-arc cases. The consequences of this are large variances on many state components, such as the downrange component of the relay satellite position. To illustrate the impact of observability problems, an example is given of two distinct satellite orbit pairs generating essentially the same data arc. The physical bases for unobservability are outlined and related to proposed TDRSS configurations. Results are relevant to any mission depending upon TDRSS to determine satellite state. The required mathematical analysis and the software used are described.
Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes
NASA Technical Reports Server (NTRS)
Pai, Shantaram S.; Nagpal, Vinod K.
2007-01-01
An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
Imai, Shungo; Yamada, Takehiro; Ishiguro, Nobuhisa; Miyamoto, Takenori; Kagami, Keisuke; Tomiyama, Naoki; Niinuma, Yusuke; Nagasaki, Daisuke; Suzuki, Koji; Yamagami, Akira; Kasashi, Kumiko; Kobayashi, Masaki; Iseki, Ken
2017-01-01
Based on the predictive performance in our previous study, we switched the therapeutic drug monitoring (TDM) analysis software for dose setting of vancomycin (VCM) from "Vancomycin MEEK TDM analysis software Ver2.0" (MEEK) to "SHIONOGI-VCM-TDM ver.2009" (VCM-TDM) in January 2015. In the present study, our aim was to validate the effectiveness of changing the VCM TDM analysis software for initial dose setting of VCM. The enrolled patients were divided into two groups of 162 patients each, who received VCM with the initial dose set using MEEK (MEEK group) or VCM-TDM (VCM-TDM group). We compared the rates of attaining the therapeutic range (trough value; 10-20 μg/mL) of serum VCM concentration between the groups. Multivariate logistic regression analysis was performed to confirm that changing the VCM TDM analysis software was an independent factor related to attaining the therapeutic range. Switching the VCM TDM analysis software from MEEK to VCM-TDM improved the rate of attaining the therapeutic range by 21.6% (MEEK group: 42.6% vs. VCM-TDM group: 64.2%, p<0.01). Patient age ≥65 years, concomitant medication (furosemide), and use of VCM-TDM as the TDM analysis software were considered to be independent factors for attaining the therapeutic range. These results demonstrated the effectiveness of switching the VCM TDM analysis software from MEEK to VCM-TDM for initial dose setting of VCM.
SEADS 3.0 Sectoral Energy/Employment Analysis and Data System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roop, Joseph M.; Anderson, David A.; Schultz, Robert W.
2007-12-17
SEADS 3.0, the Sectoral Energy/Employment Analysis and Data System, is a revision and upgrade of SEADS-PC, a software package designed for the analysis of policy that can be described by modifying the final demands of consumers, businesses, or governments (Roop et al., 1995). If a question can be formulated so that its implications can be translated into changes in final demands for goods and services, then SEADS 3.0 provides a quick and easy tool to assess preliminary impacts. And SEADS 3.0 should be considered just that: a quick and easy way to get preliminary results. Often a thorough answer, even to such a simple question as, "What would be the effect on U.S. energy use and employment if the Federal Government doubled R&D expenditures?" requires a more sophisticated analytical framework than the input-output structure embedded in SEADS 3.0. The tool uses a static input-output model to assess the impacts of changes in final demands first on industry output, then on employment and energy use. The employment and energy impacts are derived by multiplying the industry outputs (derived from the changed final demands) by industry-specific energy and employment coefficients. The tool also allows for the specification of regional or state employment impacts, though this option is not available for energy impacts.
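The static input-output calculation described above amounts to the Leontief relation delta_x = (I - A)^{-1} delta_d followed by coefficient multiplication. The sketch below shows it for a hypothetical three-sector economy; the matrix, demand change, and coefficients are invented and do not come from SEADS.

    import numpy as np

    # Hypothetical 3-sector technical coefficients matrix A (inter-industry use
    # per unit of output) and a change in final demand (e.g., increased R&D buys).
    A = np.array([[0.10, 0.04, 0.02],
                  [0.15, 0.20, 0.10],
                  [0.05, 0.10, 0.05]])
    delta_d = np.array([0.0, 5.0, 1.0])             # change in final demand ($B)

    # Static open input-output model: delta_x = (I - A)^-1 @ delta_d
    delta_x = np.linalg.solve(np.eye(3) - A, delta_d)

    # Industry-specific coefficients translate output into energy and jobs.
    energy_per_output = np.array([3.1, 0.8, 1.5])   # PJ per $B of output
    jobs_per_output = np.array([4.2, 7.9, 6.1])     # thousand jobs per $B

    print("output change ($B):", np.round(delta_x, 2))
    print("energy impact (PJ):", round(energy_per_output @ delta_x, 1))
    print("employment (k jobs):", round(jobs_per_output @ delta_x, 1))

The "quick and easy but preliminary" caveat above follows directly from the model's form: A is fixed, so it ignores price responses, substitution, and dynamics.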
Hellrung, Lydia; Hollmann, Maurice; Zscheyge, Oliver; Schlumm, Torsten; Kalberlah, Christian; Roggenhofer, Elisabeth; Okon-Singer, Hadas; Villringer, Arno; Horstmann, Annette
2015-01-01
In this work we present a new open source software package offering a unified framework for the real-time adaptation of fMRI stimulation procedures. The software provides a straightforward setup and highly flexible approach to adapt fMRI paradigms while the experiment is running. The general framework comprises the inclusion of parameters reflecting the subject's compliance, such as directing gaze to visually presented stimuli, and physiological fluctuations like blood pressure or pulse. Additionally, this approach yields possibilities to investigate complex scientific questions, for example the influence of EEG rhythms or of the fMRI signals themselves. To prove the concept of this approach, we used our software in a usability example for an fMRI experiment where the presentation of emotional pictures was dependent on the subject's gaze position. This can have a significant impact on the results. So far, if this is taken into account during fMRI data analysis, it is commonly done by the post-hoc removal of erroneous trials. Here, we propose an a priori adaptation of the paradigm during the experiment's runtime. Our fMRI findings clearly show the benefits of an adapted paradigm in terms of statistical power and higher effect sizes in emotion-related brain regions. This can be of special interest for all experiments with low statistical power due to a limited number of subjects, a limited amount of time, costs or available data to analyze, as is the case with real-time fMRI. PMID:25837719
Development of a New VLBI Data Analysis Software
NASA Technical Reports Server (NTRS)
Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.
2010-01-01
We present an overview of a new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system as well as incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software. We formulate the main goals of the software. The software should be flexible and modular to implement models and estimation techniques that currently exist or will appear in future. On the other hand it should be reliable and possess production quality for processing standard VLBI sessions. Also, it needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.
Development of Automated Image Analysis Software for Suspended Marine Particle Classification
2002-09-30
Scott Samson, Center for Ocean Technology ...and global water column. OBJECTIVES: The project's objective is to develop automated image analysis software to reduce the effort and time
Improved Measurement of Ejection Velocities From Craters Formed in Sand
NASA Technical Reports Server (NTRS)
Cintala, Mark J.; Byers, Terry; Cardenas, Francisco; Montes, Roland; Potter, Elliot E.
2014-01-01
A typical impact crater is formed by two major processes: compression of the target (essentially equivalent to a footprint in soil) and ejection of material. The Ejection-Velocity Measurement System (EVMS) in the Experimental Impact Laboratory has been used to study ejection velocities from impact craters formed in sand since the late 1990s. The original system used an early-generation Charge-Coupled Device (CCD) camera; custom-written software; and a complex, multicomponent optical system to direct laser light for illumination. Unfortunately, the electronic equipment was overtaken by age, and the software became obsolete in light of improved computer hardware.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yixing; Zhang, Jianshun; Pelken, Michael
Executive Summary The objective of this study was to develop a "Virtual Design Studio (VDS)": a software platform for integrated, coordinated and optimized design of green building systems with low energy consumption, high indoor environmental quality (IEQ), and high level of sustainability. This VDS is intended to assist collaborating architects, engineers and project management team members throughout, from the early phases to the detailed building design stages. It can be used to plan design tasks and workflow, and evaluate the potential impacts of various green building strategies on the building performance by using state-of-the-art simulation tools as well as industrial/professional standards and guidelines for green building system design. Engaged in the development of VDS was a multi-disciplinary research team that included architects, engineers, and software developers. Based on the review and analysis of how existing professional practices in building systems design operate, particularly those used in the U.S., Germany and UK, a generic process for performance-based building design, construction and operation was proposed. It divides the whole process into five distinct stages: Assess, Define, Design, Apply, and Monitoring (ADDAM). The current VDS is focused on the first three stages. The VDS considers building design as a multi-dimensional process, involving multiple design teams, design factors, and design stages. The intersection among these three dimensions defines a specific design task in terms of "who", "what" and "when". It also considers building design as a multi-objective process that aims to enhance the five aspects of performance for green building systems: site sustainability, materials and resource efficiency, water utilization efficiency, energy efficiency and impacts to the atmospheric environment, and IEQ. The current VDS development has been limited to energy efficiency and IEQ performance, with particular focus on evaluating thermal performance, air quality and lighting environmental quality because of their strong interaction with the energy performance of buildings. The VDS software framework contains four major functions: 1) Design coordination: It enables users to define tasks using the Input-Process-Output flow approach, which specifies the anticipated activities (i.e., the process), required input and output information, and anticipated interactions with other tasks. It also allows task scheduling to define the work flow, and sharing of the design data and information via the internet. 2) Modeling and simulation: It enables users to perform building simulations to predict the energy consumption and IEQ conditions at any of the design stages by using EnergyPlus and a combined heat, air, moisture and pollutant simulation (CHAMPS) model. A method for co-simulation was developed to allow the use of both models at the same time step for the combined energy and indoor air quality analysis. 3) Results visualization: It enables users to display a 3-D geometric design of the building by reading a BIM (building information model) file generated by design software such as SketchUp, and the predicted results of heat, air, moisture, pollutant and light distributions in the building.
4) Performance evaluation: It enables the users to compare the performance of a proposed building design against a reference building that is defined for the same type of buildings under the same climate condition, and predicts the percent of improvement over the minimum requirements specified in ASHRAE Standards 55-2010, 62.1-2010 and 90.1-2010. An approach was developed to estimate the potential impact of a design factor on the whole building performance, which can assist the user in identifying areas that have the most payback for investment. The VDS software was developed by using C++ with the conventional Model, View and Control (MVC) software architecture. The software has been verified by using a simple 3-zone case building. The application of the VDS concepts and framework for building design and performance analysis has been illustrated by using a medium-sized, five-story office building that received LEED Platinum Certification from USGBC.
Online Analysis of Wind and Solar Part II: Transmission Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Etingov, Pavel V.; Ma, Jian
2012-01-31
To facilitate wider penetration of renewable resources without compromising system reliability concerns arising from the lack of predictability of intermittent renewable resources, a tool for use by California Independent System Operator (CAISO) power grid operators was developed by Pacific Northwest National Laboratory (PNNL) in conjunction with CAISO with funding from the California Energy Commission. The tool analyzes and displays the impacts of uncertainties in forecasts of loads and renewable generation on: (1) congestion, (2) voltage and transient stability margins, and (3) voltage reductions and reactive power margins. The impacts are analyzed in the base case and under user-specified contingencies. A prototype of the tool has been developed and implemented in software.
Research on Control Strategy of the Micro Grid’s Hybrid Energy System
NASA Astrophysics Data System (ADS)
Gao, Zi-jun; Li, Yang; Wang, Yan-ping; Zong, Ke-yong; Zhang, Jing
2018-03-01
This paper studies the structure and operating characteristics of a hybrid energy system composed of a super-capacitor and a battery. The system is controlled by a bus-voltage-following strategy. With this approach, the bus voltage changes quickly from swinging to stable when a load mutation occurs in the micro grid, and the transient impact is also reduced. The paper sets up a model of the energy system and analyzes it in MATLAB/Simulink. Finally, the paper verifies the correctness and effectiveness of the control strategy and concludes that the transient impact occurring on the bus voltage of the energy system can be inhibited.
Testing in Service-Oriented Environments
2010-03-01
software releases (versions, service packs, vulnerability patches) for one common ESB during the 13-month period from January 1, 2008 through...impact on quality of service: Unlike traditional software components, a single instance of a web service can be used by multiple consumers. Since the...distributed, with heterogeneous hardware and software (SOA infrastructure, services, operating systems, and databases). Because of cost and security, it
ERIC Educational Resources Information Center
Miyazaki, Mikio; Kimiho, Chino; Katoh, Ryuhei; Arai, Hitoshi; Ogihara, Fumihiro; Oguchi, Yuichi; Morozumi, Tatsuo; Kon, Mayuko; Komatsu, Kotaro
2012-01-01
Three-dimensional dynamic geometry software has the power to enhance students' learning of spatial geometry. The purpose of this research is to clarify what potential using three-dimensional dynamic geometry software can offer us in terms of how to develop the spatial geometry curriculum in lower secondary schools. By focusing on the impacts the…
Computer-assisted qualitative data analysis software.
Cope, Diane G
2014-05-01
Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually, and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.
Orbiter subsystem hardware/software interaction analysis. Volume 8: Forward reaction control system
NASA Technical Reports Server (NTRS)
Becker, D. D.
1980-01-01
The results of the orbiter hardware/software interaction analysis for the AFT reaction control system are presented. The interaction between hardware failure modes and software are examined in order to identify associated issues and risks. All orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are discussed.
Sneak Analysis Application Guidelines
1982-06-01
Front-matter list-of-figures residue; recoverable titles: Hardware Program Change Cost Trend (Airborne Environment); Relative Software Program Change Costs; Derived Software Program Change Cost by Phase (Airborne Environment; Ground/Water Environment); Total Software Program Change Costs; Sneak Analysis.
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
Erraguntla, Madhav; Zapletal, Josef; Lawley, Mark
2017-12-01
The impact of infectious disease on human populations is a function of many factors including environmental conditions, vector dynamics, transmission mechanics, social and cultural behaviors, and public policy. A comprehensive framework for disease management must fully connect the complete disease lifecycle, including emergence from reservoir populations, zoonotic vector transmission, and impact on human societies. The Framework for Infectious Disease Analysis is a software environment and conceptual architecture for data integration, situational awareness, visualization, prediction, and intervention assessment. The framework automatically collects biosurveillance data using natural language processing, integrates structured and unstructured data from multiple sources, applies advanced machine learning, and uses multi-modeling for analyzing disease dynamics and testing interventions in complex, heterogeneous populations. In the illustrative case studies, natural language processing of social media, news feeds, and websites was used for information extraction, biosurveillance, and situation awareness. Classification machine learning algorithms (support vector machines, random forests, and boosting) were used for disease prediction.
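As a schematic of the classification step, the sketch below trains one of the named classifier families (a random forest) on synthetic features standing in for NLP-derived biosurveillance signals. The feature set, labels, and model settings are all invented; the framework's actual pipeline is not reproduced.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import classification_report
    from sklearn.model_selection import train_test_split

    # Synthetic features standing in for NLP-derived signals, e.g., counts of
    # symptom mentions, vector sightings, and climate indicators per region-week.
    rng = np.random.default_rng(0)
    X = rng.random((400, 6))
    y = (X[:, 0] + 0.5 * X[:, 3] + 0.2 * rng.standard_normal(400) > 0.9).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print(classification_report(y_te, clf.predict(X_te)))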
Risk and Vulnerability Analysis of Satellites Due to MM/SD with PIRAT
NASA Astrophysics Data System (ADS)
Kempf, Scott; Schäfer, Frank; Rudolph, Martin; Welty, Nathan; Donath, Therese; Destefanis, Roberto; Grassi, Lilith; Janovsky, Rolf; Evans, Leanne; Winterboer, Arne
2013-08-01
Until recently, the state-of-the-art assessment of the threat posed to spacecraft by micrometeoroids and space debris was limited to the application of ballistic limit equations to the outer hull of a spacecraft. The probability of no penetration (PNP) is acceptable for assessing the risk and vulnerability of manned space missions; however, for unmanned missions, where penetrations of the spacecraft exterior do not necessarily constitute satellite or mission failure, these values are overly conservative. The newly developed software tool PIRAT (Particle Impact Risk and Vulnerability Analysis Tool) is based on the Schäfer-Ryan-Lambert (SRL) triple-wall ballistic limit equation (BLE), applicable to various satellite components. As a result, it has become possible to assess the individual failure rates of satellite components. This paper demonstrates the modeling of an example satellite, the performance of a PIRAT analysis, and the potential for subsequent design optimization with respect to micrometeoroid and space debris (MM/SD) impact risk.
Huber, Timothy C; Krishnaraj, Arun; Monaghan, Dayna; Gaskin, Cree M
2018-05-18
Due to mandates from recent legislation, clinical decision support (CDS) software is being adopted by radiology practices across the country. This software provides imaging study decision support for referring providers at the point of order entry. CDS systems produce a large volume of data, providing opportunities for research and quality improvement. In order to better visualize and analyze trends in this data, an interactive data visualization dashboard was created using a commercially available data visualization platform. Following the integration of a commercially available clinical decision support product into the electronic health record, a dashboard was created using a commercially available data visualization platform (Tableau, Seattle, WA). Data generated by the CDS were exported from the data warehouse, where they were stored, into the platform. This allowed for real-time visualization of the data generated by the decision support software. The creation of the dashboard allowed the output from the CDS platform to be more easily analyzed and facilitated hypothesis generation. Integrating data visualization tools into clinical decision support tools allows for easier data analysis and can streamline research and quality improvement efforts.
Towards automated traceability maintenance
Mäder, Patrick; Gotel, Orlena
2012-01-01
Traceability relations support stakeholders in understanding the dependencies between artifacts created during the development of a software system and thus enable many development-related tasks. To ensure that the anticipated benefits of these tasks can be realized, it is necessary to have an up-to-date set of traceability relations between the established artifacts. This goal requires the creation of traceability relations during the initial development process. Furthermore, the goal also requires the maintenance of traceability relations over time as the software system evolves in order to prevent their decay. In this paper, an approach is discussed that supports the (semi-) automated update of traceability relations between requirements, analysis and design models of software systems expressed in the UML. This is made possible by analyzing change events that have been captured while working within a third-party UML modeling tool. Within the captured flow of events, development activities comprised of several events are recognized. These are matched with predefined rules that direct the update of impacted traceability relations. The overall approach is supported by a prototype tool and empirical results on the effectiveness of tool-supported traceability maintenance are provided. PMID:23471308
NASA Technical Reports Server (NTRS)
Butler, Roy
2013-01-01
The growth in computer hardware performance, coupled with reduced energy requirements, has led to a rapid expansion of the resources available to software systems, driving them towards greater logical abstraction, flexibility, and complexity. This shift in focus from compacting functionality into a limited field towards developing layered, multi-state architectures in a grand field has both driven and been driven by the history of embedded processor design in the robotic spacecraft industry. The combinatorial growth of interprocess conditions is accompanied by benefits (concurrent development, situational autonomy, and evolution of goals) and drawbacks (late integration, non-deterministic interactions, and multifaceted anomalies) in achieving mission success, as illustrated by the case of the Mars Reconnaissance Orbiter. Approaches to optimizing the benefits while mitigating the drawbacks have taken the form of the formalization of requirements, modular design practices, extensive system simulation, and spacecraft data trend analysis. The growth of hardware capability and software complexity can be expected to continue, with future directions including stackable commodity subsystems, computer-generated algorithms, runtime reconfigurable processors, and greater autonomy.
Zhang, Lanlan; Hub, Martina; Mang, Sarah; Thieke, Christian; Nix, Oliver; Karger, Christian P; Floca, Ralf O
2013-06-01
Radiotherapy is a fast-developing discipline which plays a major role in cancer care. Quantitative analysis of radiotherapy data can improve the success of the treatment and support the prediction of outcome. In this paper, we first identify functional, conceptional and general requirements on a software system for quantitative analysis of radiotherapy. Further we present an overview of existing radiotherapy analysis software tools and check them against the stated requirements. As none of them could meet all of the demands presented herein, we analyzed possible conceptional problems and present software design solutions and recommendations to meet the stated requirements (e.g. algorithmic decoupling via dose iterator pattern; analysis database design). As a proof of concept we developed a software library "RTToolbox" following the presented design principles. The RTToolbox is available as open source library and has already been tested in a larger-scale software system for different use cases. These examples demonstrate the benefit of the presented design principles. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Continuous Risk Management: A NASA Program Initiative
NASA Technical Reports Server (NTRS)
Hammer, Theodore F.; Rosenberg, Linda
1999-01-01
NPG 7120.5A, "NASA Program and Project Management Processes and Requirements" enacted in April, 1998, requires that "The program or project manager shall apply risk management principles..." The Software Assurance Technology Center (SATC) at NASA GSFC has been tasked with the responsibility for developing and teaching a systems level course for risk management that provides information on how to comply with this edict. The course was developed in conjunction with the Software Engineering Institute at Carnegie Mellon University, then tailored to the NASA systems community. This presentation will briefly discuss the six functions for risk management: (1) Identify the risks in a specific format; (2) Analyze the risk probability, impact/severity, and timeframe; (3) Plan the approach; (4) Track the risk through data compilation and analysis; (5) Control and monitor the risk; (6) Communicate and document the process and decisions.
Theoretical and software considerations for nonlinear dynamic analysis
NASA Technical Reports Server (NTRS)
Schmidt, R. J.; Dodds, R. H., Jr.
1983-01-01
In the finite element method for structural analysis, it is generally necessary to discretize the structural model into a very large number of elements to accurately evaluate displacements, strains, and stresses. As the complexity of the model increases, the number of degrees of freedom can easily exceed the capacity of present-day software systems. Improvements to structural analysis software, including more efficient use of existing hardware and improved structural modeling techniques, are discussed. One modeling technique used successfully in static linear and nonlinear analysis is multilevel substructuring. This research extends the use of multilevel substructure modeling to include dynamic analysis and defines the requirements for a general-purpose software system capable of efficient nonlinear dynamic analysis. The multilevel substructuring technique is presented, the analytical formulations and computational procedures for dynamic analysis and nonlinear mechanics are reviewed, and an approach to the design and implementation of a general-purpose structural software system is presented.
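For context, the basic substructuring step (static condensation, extended to dynamics as Guyan reduction) eliminates interior degrees of freedom i in favor of boundary degrees of freedom b. In standard textbook notation, not taken from this report:

    K = \begin{bmatrix} K_{bb} & K_{bi} \\ K_{ib} & K_{ii} \end{bmatrix},
    \qquad
    T = \begin{bmatrix} I \\ -K_{ii}^{-1} K_{ib} \end{bmatrix},
    \qquad
    \tilde{K} = T^{\mathsf{T}} K T,
    \quad
    \tilde{M} = T^{\mathsf{T}} M T

The condensed matrices couple each substructure to the next level through its boundary degrees of freedom only, which is what makes the multilevel scheme scale; for dynamics the reduction is approximate, since interior inertia is carried through the static constraint modes.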
Using recurrence plot analysis for software execution interpretation and fault detection
NASA Astrophysics Data System (ADS)
Mosdorf, M.
2015-09-01
This paper presents a method for software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. The results of this analysis are further processed with PCA (Principal Component Analysis), which reduces the number of coefficients used for software execution classification. The method was applied to the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, and SHA-1. Results show that some of the collected traces can be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms), while others are more difficult to distinguish.
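A minimal sketch of this pipeline, with every detail invented for illustration (the paper's distance measure, threshold, and feature definitions are not reproduced): build a recurrence plot from a 1-D trace of executed-instruction codes, summarize it with a few recurrence-quantification features, then reduce the feature coefficients with PCA via SVD.

    import numpy as np

    def recurrence_matrix(trace, eps):
        """Binary recurrence plot of a 1-D execution trace (e.g., opcode IDs)."""
        x = np.asarray(trace, float)
        d = np.abs(x[:, None] - x[None, :])   # pairwise distances
        return (d <= eps).astype(float)

    def rqa_features(R):
        """A few simple recurrence-quantification features."""
        n = R.shape[0]
        rr = R.mean()                         # recurrence rate
        diag = np.array([np.diag(R, k).mean() for k in range(1, n // 2)])
        return np.concatenate(([rr], diag))   # diagonal densities capture periodicity

    # Hypothetical traces: each is a sequence of executed-instruction codes.
    rng = np.random.default_rng(2)
    traces = [rng.integers(0, 30, 200) for _ in range(8)]
    F = np.array([rqa_features(recurrence_matrix(t, eps=2)) for t in traces])

    # PCA via SVD of the centered feature matrix reduces the coefficients
    # used for classifying which algorithm produced each trace.
    Fc = F - F.mean(axis=0)
    U, S, Vt = np.linalg.svd(Fc, full_matrices=False)
    scores = Fc @ Vt[:2].T                    # first two principal components
    print(scores.round(3))

Traces from algorithms with strongly periodic control flow (such as a sorting loop) produce visibly banded recurrence plots, which is why their projected scores separate cleanly.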
Assessing economic impact of storm surge under projected sea level rise scenarios
NASA Astrophysics Data System (ADS)
Del Angel, D. C.; Yoskowitz, D.
2017-12-01
Global sea level is expected to rise 0.2-2m by the year 2100. Rising sea level is expected to have a number of impacts such as erosion, saltwater intrusion, and decline in coastal wetlands, all of which have direct and indirect socio-economic impacts on coastal communities. By 2050, 25% of the world's population will reside within flood-prone areas. These statistics raise a concern for the economic costs that sea level and flooding have on the growing coastal communities. Economic costs of storm surge inundation and rising seas may include loss or damage to public facilities and infrastructure that may become temporarily inaccessible, as well as disruptions to business and services. The goal of this project is to assess economic impacts of storms under four SLR scenarios including low, intermediate-low, intermediate-high, and high (0.2m, 0.5m, 1.2m and 2m, respectively) in the Northern Gulf of Mexico region. To assess flooding impact on communities from storm surge, this project utilizes HAZUS-MH software - a Geographic Information System (GIS)-based modeling tool developed by the Federal Emergency Management Agency - to estimate physical, economic, and social impacts of natural disasters such as floods, earthquakes and hurricanes. The HAZUS database comes integrated with aggregate and site-specific inventory, which includes: demographic data, general building stock, agricultural statistics, vehicle inventory, essential facilities, transportation systems, and utility systems (among other sensitive facilities). User-defined inundation scenarios will serve to identify assets at risk, and damage estimates will be generated using the Depth Damage Function included in the HAZUS software. Results will focus on three communities in the Gulf and highlight changes in storm flood impact. This approach not only provides a method for economic impact assessment but also begins to create a link between ecosystem services and natural and nature-based features such as wetlands, beaches and dunes. Results from this analysis can provide actionable information needed for policy development and planning for coastal communities.
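To illustrate the depth-damage step named above, the sketch below interpolates a made-up HAZUS-style residential depth-damage curve and scales it by a building's replacement value under two SLR offsets. The curve, depths, and dollar values are assumptions for illustration, not HAZUS inventory data.

    import numpy as np

    # Hypothetical residential depth-damage curve: flood depth (m) above first
    # floor vs. percent of building replacement value damaged.
    depths = np.array([0.0, 0.3, 0.6, 1.0, 1.5, 2.0, 3.0])
    damage_pct = np.array([0.0, 12.0, 20.0, 29.0, 38.0, 45.0, 55.0])

    def building_loss(surge_depth_m, replacement_value):
        """Interpolate the depth-damage function and scale by building value."""
        pct = np.interp(surge_depth_m, depths, damage_pct)
        return pct / 100.0 * replacement_value

    # Same storm surge under two illustrative sea level rise scenarios.
    for slr in (0.2, 2.0):                    # low vs. high SLR (m)
        depth = 0.8 + slr                     # surge depth plus SLR offset
        print(f"SLR {slr:.1f} m -> loss ${building_loss(depth, 250_000):,.0f}")

Summing such losses over an inventory of buildings per inundation scenario yields the community-level damage estimates the study reports.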
Dynamic Impact Deformation Analysis Using High-speed Cameras and ARAMIS Photogrammetry Software
2010-06-01
Multi-threaded Event Processing with DANA
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Lawrence; Elliott Wolin
2007-05-14
The C++ data analysis framework DANA has been written to support the next generation of Nuclear Physics experiments at Jefferson Lab commensurate with the anticipated 12GeV upgrade. The DANA framework was designed to allow multi-threaded event processing with a minimal impact on developers of reconstruction software. This document describes how DANA implements multi-threaded event processing and compares it to simply running multiple instances of a program. Also presented are relative reconstruction rates for Pentium4, Xeon, and Opteron based machines.
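The worker-pool pattern DANA describes can be sketched in a few lines. This Python threading version is only an analogy for the C++ framework, with a trivial doubling operation standing in for event reconstruction; each worker would own its own processor instance, so reconstruction code needs no locking.

```python
import queue
import threading

events = queue.Queue()
results = queue.Queue()

def worker():
    # Pull events until a sentinel arrives; no shared mutable state.
    while True:
        evt = events.get()
        if evt is None:          # sentinel: shut this thread down
            break
        results.put(evt * 2)     # stand-in for real reconstruction work

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for evt in range(1000):
    events.put(evt)
for _ in threads:
    events.put(None)
for t in threads:
    t.join()
```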
Shenoy, Shailesh M
2016-07-01
A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. One must also consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how one may distribute one's methods to the community using the software, and the potential for achieving automation to improve productivity.
Analyzing qualitative data with computer software.
Weitzman, E A
1999-01-01
OBJECTIVE: To provide health services researchers with an overview of the qualitative data analysis process and the role of software within it; to provide a principled approach to choosing among software packages to support qualitative data analysis; to alert researchers to the potential benefits and limitations of such software; and to provide an overview of the developments to be expected in the field in the near future. DATA SOURCES, STUDY DESIGN, METHODS: This article does not include reports of empirical research. CONCLUSIONS: Software for qualitative data analysis can benefit the researcher in terms of speed, consistency, rigor, and access to analytic methods not available by hand. Software, however, is not a replacement for methodological training. PMID:10591282
An Analysis of Mission Critical Computer Software in Naval Aviation
1991-03-01
This research examined whether original software development schedules for mission critical computer software in naval aviation were sustained, and whether software released to the fleet contained any major defects. The research revealed that only about half of the original software development schedules were sustained without a milestone change being made, and that software released to the fleet had no major defects.
Simulation Tools and Techniques for Analyzing the Impacts of Photovoltaic System Integration
NASA Astrophysics Data System (ADS)
Hariri, Ali
Solar photovoltaic (PV) energy integration in distribution networks is one of the fastest growing sectors of distributed energy integration. The growth in solar PV integration is incentivized by various clean power policies, global interest in solar energy, and reductions in the manufacturing and installation costs of solar energy systems. The increase in solar PV integration has raised a number of concerns regarding the potential impacts that might arise as a result of high PV penetration. Some impacts have already been recorded in networks with high PV penetration, such as in China, Germany, and the USA (Hawaii and California). Therefore, network planning is becoming more intricate as new technologies are integrated into the existing electric grid. The newly integrated technologies pose certain compatibility concerns regarding the existing electric grid infrastructure. PV integration impact studies are therefore becoming essential for a better understanding of how to advance solar PV integration efforts without introducing adverse impacts into the network. PV impact studies are important for understanding the nature of the newly introduced phenomena, and understanding that nature is a key factor in mitigating and accommodating the impacts. Traditionally, electric power utilities have relied on phasor-based power flow simulations for planning their electric networks. However, conventional, commercially available, phasor-based simulation tools do not provide proper visibility across a wide spectrum of electric phenomena. Moreover, different types of simulation approaches are suitable for specific types of studies. For instance, power flow software cannot be used to study time-varying phenomena, and at the same time it is not practical to use electromagnetic transient (EMT) tools to perform power flow solutions. Therefore, some electric phenomena caused by the variability of PV generation are not visible using conventional utility simulation software. On the other hand, EMT simulation tools provide high accuracy and visibility over a wide bandwidth of frequencies at the expense of larger processing and memory requirements, limited network size, and long simulation times. There is thus a gap in simulation tools and techniques that can efficiently and effectively identify potential PV impacts, and new planning simulation tools are needed to accommodate the simulation requirements of new technologies integrated into the electric grid. This dissertation starts by identifying some of the potential impacts caused by high PV penetration. A phasor-based quasi-static time series (QSTS) analysis tool is developed in order to study the slow dynamics caused by variations in PV generation that lead to voltage fluctuations. Moreover, some EMT simulations are performed in order to study the impacts of PV systems on network harmonic levels. These studies provide insights into the type and duration of certain impacts, as well as the conditions that may lead to adverse phenomena, and they indicate the type of simulation tool that is sufficient for each type of study. After identifying some of the potential impacts, planning tools and techniques are proposed. The potential PV impacts may cause certain utilities to refrain from integrating PV systems into their networks.
However, each electric network has a certain limit beyond which the impacts become substantial and may adversely interfere with system operation and the equipment along the feeder; this limit is referred to as the hosting limit (or hosting capacity). Therefore, it is important for utilities to identify the PV hosting limit of a specific electric network in order to safely and confidently integrate the maximum possible PV capacity. In this dissertation, two approaches are proposed for identifying the hosting limit: 1. Analytical approach: a theoretical mathematical approach built on the fundamentals of electric power system operation. It provides an easy way to estimate the maximum amount of PV power that can be injected at each node in the network. This approach has been tested and validated. 2. Stochastic simulation software approach: this approach provides comprehensive simulation software that can be used to identify the PV hosting limit. The software performs a large number of stochastic simulations while varying the PV system size and location. The collected data are then analyzed for violations in voltage levels, voltage fluctuations, and reverse power flow. (Abstract shortened by ProQuest.)
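A minimal sketch of the stochastic screening idea under stated assumptions: run_power_flow is a hypothetical solver returning per-unit feeder voltages, and the limits are generic ANSI-style voltage bounds rather than the dissertation's full criteria (which also include fluctuation and reverse-power checks).

```python
import random

def violates(voltages, vmin=0.95, vmax=1.05):
    """Flag per-unit voltage-limit violations along the feeder."""
    return any(v < vmin or v > vmax for v in voltages)

def hosting_limit(run_power_flow, nodes, sizes_kw, trials=1000):
    """Monte Carlo screening: randomly place and size PV and record, for
    each node, the smallest PV size observed to cause a violation. The
    hosting limit at a node is then estimated as just below that size."""
    smallest_violation = {n: float("inf") for n in nodes}
    for _ in range(trials):
        node = random.choice(nodes)
        size = random.choice(sizes_kw)
        if violates(run_power_flow(node, size)):
            smallest_violation[node] = min(smallest_violation[node], size)
    return smallest_violation
```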
InaSAFE applications in disaster preparedness
NASA Astrophysics Data System (ADS)
Pranantyo, Ignatius Ryan; Fadmastuti, Mahardika; Chandra, Fredy
2015-04-01
Disaster preparedness activities aim to reduce the impact of disasters by being better prepared to respond when a disaster occurs. In order to better anticipate requirements during a disaster, contingency planning activities can be undertaken prior to a disaster based on a realistic disaster scenario. InaSAFE is a tool that can inform this process. InaSAFE is free and open source software that estimates the impact to people and infrastructure from potential hazard scenarios. By using InaSAFE, disaster managers can develop scenarios of disaster impacts (people and infrastructures affected) to inform their contingency plan and emergency response operation plan. While InaSAFE provides the software framework, exposure data and hazard data are needed as inputs to run the software; InaSAFE can then be used to forecast the impact of the hazard scenario on the exposure data. InaSAFE outputs include estimates of the number of people, buildings and roads affected, a list of minimum needs (rice and clean water), and a response checklist. InaSAFE is developed by Indonesia's National Disaster Management Agency (BNPB) and the Australian Government, through the Australia-Indonesia Facility for Disaster Reduction (AIFDR), in partnership with the World Bank - Global Facility for Disaster Reduction and Recovery (GFDRR). This software has been used in many parts of Indonesia, including Padang, Maumere, Jakarta, and Slamet Mountain, for emergency response and contingency planning.
LV software support for supersonic flow analysis
NASA Technical Reports Server (NTRS)
Bell, W. A.; Lepicovsky, J.
1992-01-01
The software for configuring an LV counter processor system has been developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system has been developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.
LV software support for supersonic flow analysis
NASA Technical Reports Server (NTRS)
Bell, William A.
1992-01-01
The software for configuring a Laser Velocimeter (LV) counter processor system was developed using structured design. The LV system includes up to three counter processors and a rotary encoder. The software for configuring and testing the LV system was developed, tested, and included in an overall software package for data acquisition, analysis, and reduction. Error handling routines respond to both operator and instrument errors which often arise in the course of measuring complex, high-speed flows. The use of networking capabilities greatly facilitates the software development process by allowing software development and testing from a remote site. In addition, high-speed transfers allow graphics files or commands to provide viewing of the data from a remote site. Further advances in data analysis require corresponding advances in procedures for statistical and time series analysis of nonuniformly sampled data.
NASA Technical Reports Server (NTRS)
Gaffney, J. E., Jr.; Judge, R. W.
1981-01-01
A model of a software development process is described. The software development process is seen to consist of a sequence of activities, such as 'program design' and 'module development' (or coding). A manpower estimate is made by multiplying code size by the rates (man months per thousand lines of code) for each of the activities relevant to the particular case of interest and summing up the results. The effect of four objectively determinable factors (organization, software product type, computer type, and code type) on productivity values for each of nine principal software development activities was assessed. Four factors were identified which account for 39% of the observed productivity variation.
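The estimation rule reduces to a one-line computation; the activity names and rates below are invented placeholders, not the paper's calibrated productivity values.

```python
# Activity rates in staff-months per thousand lines of code (KLOC).
# Illustrative numbers only, not the paper's calibrated values.
RATES = {"program design": 1.2,
         "module development": 2.5,
         "integration test": 1.8}

def manpower(kloc, activities=RATES):
    """Estimated staff-months = code size x sum of per-activity rates."""
    return kloc * sum(activities.values())

print(manpower(50))   # a 50 KLOC project -> 275 staff-months
```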
Experimenting Maintenance of Flight Software in an Integrated Modular Avionics for Space
NASA Astrophysics Data System (ADS)
Hardy, Johan; Laroche, Thomas; Creten, Philippe; Parisis, Paul; Hiller, Martin
2014-08-01
This paper presents an experiment in Flight Software partitioning in an Integrated Modular Avionics for Space (IMA-SP) system. The experiment also tackles the maintenance aspects of IMA-SP systems. The presented case study is the PROBA-2 Flight Software. The paper addresses and discusses the following subjects: On-Board Software Maintenance in IMA-SP, boot strategy for Time and Space Partitioning, considerations about the ground segment related to On-Board Software Maintenance in IMA-SP, and architectural impacts of Time and Space Partitioning for the PROBA software. Finally, this paper presents the results and achievements of the study and points to further perspectives for IMA-SP and Time and Space Partitioning.
NASA Technical Reports Server (NTRS)
1976-01-01
The engineering analyses and evaluation studies conducted for the Software Requirements Analysis are discussed. Included are the development of the study data base, synthesis of implementation approaches for software required by both mandatory onboard computer services and command/control functions, and identification and implementation of software for ground processing activities.
NASA Astrophysics Data System (ADS)
Jiang, X. T.; Wang, Y. D.; Dai, C. H.; Ding, M.
2017-08-01
A finite element model of a concrete-filled steel tubular member was established in numerical analysis software, considering material nonlinearity, to analyze the effect of concrete creep on the dynamic responses of the member under axial compression and lateral impact. In the model, the constitutive model of the core concrete is the plastic damage model, that of the steel is the Von Mises yield criterion with a kinematic hardening model, and the creep effect at different ages is represented by an equivalent change in the concrete elastic modulus. The dynamic responses of the member considering creep effects were then simulated, and the effects of creep on contact time, impact load, deflection, stress and strain were discussed. The results provide a scientific basis for the impact-resistant design of concrete-filled steel tubular members.
Micrometeoroid and Orbital Debris Threat Assessment: Mars Sample Return Earth Entry Vehicle
NASA Technical Reports Server (NTRS)
Christiansen, Eric L.; Hyde, James L.; Bjorkman, Michael D.; Hoffman, Kevin D.; Lear, Dana M.; Prior, Thomas G.
2011-01-01
This report provides results of a Micrometeoroid and Orbital Debris (MMOD) risk assessment of the Mars Sample Return Earth Entry Vehicle (MSR EEV). The assessment was performed using standard risk assessment methodology illustrated in Figure 1-1. Central to the process is the Bumper risk assessment code (Figure 1-2), which calculates the critical penetration risk based on geometry, shielding configurations and flight parameters. The assessment process begins by building a finite element model (FEM) of the spacecraft, which defines the size and shape of the spacecraft as well as the locations of the various shielding configurations. This model is built using the NX I-deas software package from Siemens PLM Software. The FEM is constructed using triangular and quadrilateral elements that define the outer shell of the spacecraft. Bumper-II uses the model file to determine the geometry of the spacecraft for the analysis. The next step of the process is to identify the ballistic limit characteristics for the various shield types. These ballistic limits define the critical size particle that will penetrate a shield at a given impact angle and impact velocity. When the finite element model is built, each individual element is assigned a property identifier (PID) to act as an index for its shielding properties. Using the ballistic limit equations (BLEs) built into the Bumper-II code, the shield characteristics are defined for each and every PID in the model. The final stage of the analysis is to determine the probability of no penetration (PNP) on the spacecraft. This is done using the micrometeoroid and orbital debris environment definitions that are built into the Bumper-II code. These engineering models take into account orbit inclination, altitude, attitude and analysis date in order to predict an impacting particle flux on the spacecraft. Using the geometry and shielding characteristics previously defined for the spacecraft and combining that information with the environment model calculations, the Bumper-II code calculates a probability of no penetration for the spacecraft.
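Once the environment flux has been integrated against the ballistic limits, the final PNP step follows the standard Poisson assumption; a sketch with invented flux numbers (not the MSR EEV's actual values):

```python
import math

def pnp(flux_per_m2_yr, area_m2, exposure_yr):
    """Probability of no penetration under the standard Poisson assumption:
    PNP = exp(-N), where N is the expected number of penetrating impacts."""
    n_expected = flux_per_m2_yr * area_m2 * exposure_yr
    return math.exp(-n_expected)

# Example: penetrating-particle flux 1e-5 /m^2/yr over 10 m^2 for 1.5 years
print(pnp(1e-5, 10.0, 1.5))   # ~0.99985
```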
Govender, Sagaren; Gerwel Proches, Cecile N; Kader, Abdulla
2018-01-01
Background Four public hospitals in South Africa, which render both specialized and nonspecialized services to thousands of patients, were examined to determine the impact of leadership on health care service delivery. These hospitals were inundated by various problems that were impacting negatively on health care service delivery. Purpose This research study aimed to gain a comprehensive understanding of the challenges, complexities and constraints facing public health care in KwaZulu-Natal (KZN) and to examine leadership as a strategy to enhance healthcare service delivery with a particular focus on four regional hospitals in the KZN Province. Methods The mixed-method research approach was utilized. Purposive sampling and stratified random sampling were employed in the research setting, and in-depth, semistructured interviews and questionnaires were used to collect data. Data were analyzed using the Nvivo computer software package for in-depth interviews and the Statistical Package for the Social Sciences (SPSS) software for the quantitative analysis. Results The research findings showed that the current leadership framework adopted by the health care leaders in regional hospitals in KZN is weak and is contributing to poor health care service delivery. Conclusion This study, therefore, aimed to address the current challenges and weaknesses that are impacting negatively on health care service delivery in regional hospitals in the KZN Province and made recommendations for improvement. PMID:29535529
Govender, Sagaren; Gerwel Proches, Cecile N; Kader, Abdulla
2018-01-01
Four public hospitals in South Africa, which render both specialized and nonspecialized services to thousands of patients, were examined to determine the impact of leadership on health care service delivery. These hospitals were inundated by various problems that were impacting negatively on health care service delivery. This research study aimed to gain a comprehensive understanding of the challenges, complexities and constraints facing public health care in KwaZulu-Natal (KZN) and to examine leadership as a strategy to enhance healthcare service delivery with a particular focus on four regional hospitals in the KZN Province. The mixed-method research approach was utilized. Purposive sampling and stratified random sampling were employed in the research setting, and in-depth, semistructured interviews and questionnaires were used to collect data. Data were analyzed using the Nvivo computer software package for in-depth interviews and the Statistical Package for the Social Sciences (SPSS) software for the quantitative analysis. The research findings showed that the current leadership framework adopted by the health care leaders in regional hospitals in KZN is weak and is contributing to poor health care service delivery. This study, therefore, aimed to address the current challenges and weaknesses that are impacting negatively on health care service delivery in regional hospitals in the KZN Province and made recommendations for improvement.
Fuchs, Tobias A; Fiechter, Michael; Gebhard, Cathérine; Stehli, Julia; Ghadri, Jelena R; Kazakauskaite, Egle; Herzog, Bernhard A; Husmann, Lars; Gaemperli, Oliver; Kaufmann, Philipp A
2013-03-01
To assess the impact of adaptive statistical iterative reconstruction (ASIR) on coronary plaque volume and composition analysis as well as on stenosis quantification in high definition coronary computed tomography angiography (CCTA). We included 50 plaques in 29 consecutive patients who were referred for the assessment of known or suspected coronary artery disease (CAD) with contrast-enhanced CCTA on a 64-slice high definition CT scanner (Discovery HD 750, GE Healthcare). CCTA scans were reconstructed with standard filtered back projection (FBP) with no ASIR (0 %) or with increasing contributions of ASIR, i.e. 20, 40, 60, 80 and 100 % (no FBP). Plaque analysis (volume, components and stenosis degree) was performed using previously validated automated software. Mean values for minimal diameter and minimal area as well as degree of stenosis did not change significantly across the different ASIR reconstructions. There was virtually no impact of the reconstruction algorithm on mean plaque volume or plaque composition (e.g. soft, intermediate and calcified components). However, with increasing ASIR contribution, the percentage of the plaque volume component between 401 and 500 HU decreased significantly (p < 0.05). Modern image reconstruction algorithms such as ASIR, which was developed for noise reduction in the latest high resolution CCTA scans, can be used reliably without interfering with plaque analysis and stenosis severity assessment.
The relationships between software publications and software systems
NASA Astrophysics Data System (ADS)
Hogg, David W.
2017-01-01
When we build software systems or software tools for astronomy, we sometimes do and sometimes don't also write and publish standard scientific papers about those software systems. I will discuss the pros and cons of writing such publications. There are impacts of writing such papers immediately (they can affect the design and structure of the software project itself), in the short term (they can promote adoption and legitimize the software), in the medium term (they can provide a platform for all the literature's mechanisms for citation, criticism, and reuse), and in the long term (they can preserve ideas that are embodied in the software, possibly on timescales much longer than the lifetime of any software context). I will argue that as important as pure software contributions are to astronomy—and I am both a preacher and a practitioner—software contributions are even more valuable when they are associated with traditional scientific publications. There are exceptions and complexities of course, which I will discuss.
ERIC Educational Resources Information Center
Patel, Sunil S.
2013-01-01
Social software technology has gained considerable popularity over the last decade and has had a great impact on hundreds of millions of people across the globe. Businesses have also expressed their interest in leveraging its use in business contexts. As a result, software vendors and business consumers have invested billions of dollars to use…
NASA/CARES dual-use ceramic technology spinoff applications
NASA Technical Reports Server (NTRS)
Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.; Nemeth, Noel N.
1994-01-01
NASA has developed software that enables American industry to establish the reliability and life of ceramic structures in a wide variety of 21st Century applications. Designing ceramic components to survive at higher temperatures than the capability of most metals and in severe loading environments involves the disciplines of statistics and fracture mechanics. Successful application of advanced ceramics requires knowledge of material properties and the use of a probabilistic brittle material design methodology. The NASA program, known as CARES (Ceramics Analysis and Reliability Evaluation of Structures), is a comprehensive general purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. The latest version of this software, CARES/LIFE, is coupled to several commercially available finite element analysis programs (ANSYS, MSC/NASTRAN, ABAQUS, COSMOS/M, MARC), resulting in an advanced integrated design tool which is adapted to the computing environment of the user. The NASA-developed CARES software has been successfully used by industrial, government, and academic organizations to design and optimize ceramic components for many demanding applications. Industrial sectors impacted by this program include aerospace, automotive, electronic, medical, and energy applications. Dual-use applications include engine components, graphite and ceramic high temperature valves, TV picture tubes, ceramic bearings, electronic chips, glass building panels, infrared windows, radiant heater tubes, heat exchangers, and artificial hips, knee caps, and teeth.
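The probabilistic design core of such tools rests on Weibull statistics for brittle fracture. A minimal sketch of the two-parameter volume-flaw form follows, with illustrative stresses, volumes, and Weibull parameters; the full CARES treatment of multiaxial stress states and time dependence is far more elaborate.

```python
import math

def failure_probability(stresses, volumes, sigma0, m):
    """Two-parameter Weibull model for a brittle component discretized into
    elements: Pf = 1 - exp(-sum_i V_i * (sigma_i / sigma0)^m)."""
    risk = sum(v * (s / sigma0) ** m for s, v in zip(stresses, volumes))
    return 1.0 - math.exp(-risk)

# Three finite elements: peak principal stresses (MPa) and volumes (mm^3)
print(failure_probability([180.0, 220.0, 150.0], [2.0, 1.5, 3.0],
                          sigma0=400.0, m=10.0))   # ~0.005
```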
Assessment of replicate bias in 454 pyrosequencing and a multi-purpose read-filtering tool.
Jérôme, Mariette; Noirot, Céline; Klopp, Christophe
2011-05-26
The Roche 454 pyrosequencing platform is often considered the most versatile of the Next Generation Sequencing technology platforms, permitting the sequencing of large genomes, the analysis of variations, and the study of transcriptomes. A recently reported bias leads to the production of multiple reads for a unique DNA fragment in a random manner within a run. This bias has a direct impact on the quality of the measurement of the representation of the fragments using the reads. Other cleaning steps are usually performed on the reads before assembly or alignment. PyroCleaner is a software module intended to clean 454 pyrosequencing reads in order to ease the assembly process. The program is free software distributed under the terms of the GNU General Public License as published by the Free Software Foundation. It implements several filters using criteria such as read duplication, length, complexity, base-pair quality and number of undetermined bases. It also permits the cleaning of flowgram files (.sff) of paired-end sequences, generating on the one hand a validated paired-end file and on the other a single-read file. Read cleaning has always been an important step in sequence analysis. The PyroCleaner python module is a Swiss Army knife dedicated to 454 read cleaning. It includes commonly used filters as well as specialized ones such as duplicated-read removal and paired-end read verification.
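The filtering criteria listed above are easy to picture in code. This sketch uses invented thresholds and a deliberately crude prefix-based duplicate test; it is not PyroCleaner's actual defaults or algorithm.

```python
def keep_read(seq, quals, min_len=50, max_len=600, max_n_frac=0.02,
              min_mean_q=20):
    """Apply length, undetermined-base, and quality filters to one read.
    Thresholds are illustrative only."""
    if not (min_len <= len(seq) <= max_len):
        return False
    if seq.upper().count("N") / len(seq) > max_n_frac:
        return False
    return sum(quals) / len(quals) >= min_mean_q

def drop_duplicates(reads, prefix=20):
    """Crude duplicate removal: keep the first read seen for each prefix
    (a stand-in for the replicate-bias filtering the paper describes)."""
    seen, kept = set(), []
    for seq, quals in reads:
        key = seq[:prefix]
        if key not in seen:
            seen.add(key)
            kept.append((seq, quals))
    return kept
```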
Filament winding technique, experiment and simulation analysis on tubular structure
NASA Astrophysics Data System (ADS)
Quanjin, Ma; Rejab, M. R. M.; Kaige, Jiang; Idris, M. S.; Harith, M. N.
2018-04-01
Filament winding has emerged as one of the lower-cost composite fabrication processes. Filament-wound products include classic axisymmetric parts (pipes, rings, driveshafts, high-pressure vessels and storage tanks) and non-axisymmetric parts (prismatic non-round sections and pipe fittings). Because the 3-axis filament winding machine was designed with an inexpensive control system, a direct comparison between experiment and simulation on a tubular structure is necessary. The aim of this paper is to perform a dry winding experiment using the 3-axis filament winding machine and to simulate the winding process on the tubular structure using CADWIND software at 30°, 45° and 60° winding angles. The main result indicates that the 3-axis filament winding machine can produce tubular structures with good winding-pattern performance at different winding angles. The developed 3-axis winding machine still has weaknesses compared to the CADWIND simulation results for higher-axis winding machines with respect to winding pattern, turnaround impact, process error, thickness and friction impact. In conclusion, the comparison between the machine and the simulation highlights the machine's limitations and characteristics and motivates improvements and recommendations.
Sweidan, Michelle; Williamson, Margaret; Reeve, James F; Harvey, Ken; O'Neill, Jennifer A; Schattner, Peter; Snowdon, Teri
2010-04-15
Electronic prescribing is increasingly being used in primary care and in hospitals. Studies on the effects of e-prescribing systems have found evidence for both benefit and harm. The aim of this study was to identify features of e-prescribing software systems that support patient safety and quality of care and that are useful to the clinician and the patient, with a focus on improving the quality use of medicines. Software features were identified by a literature review, key informants and an expert group. A modified Delphi process was used with a 12-member multidisciplinary expert group to reach consensus on the expected impact of the features in four domains: patient safety, quality of care, usefulness to the clinician and usefulness to the patient. The setting was electronic prescribing in general practice in Australia. A list of 114 software features was developed. Most of the features relate to the recording and use of patient data, the medication selection process, prescribing decision support, monitoring drug therapy and clinical reports. The expert group rated 78 of the features (68%) as likely to have a high positive impact in at least one domain, 36 features (32%) as medium impact, and none as low or negative impact. Twenty seven features were rated as high positive impact across 3 or 4 domains including patient safety and quality of care. Ten features were considered "aspirational" because of a lack of agreed standards and/or suitable knowledge bases. This study defines features of e-prescribing software systems that are expected to support safety and quality, especially in relation to prescribing and use of medicines in general practice. The features could be used to develop software standards, and could be adapted if necessary for use in other settings and countries.
2010-01-01
Background Electronic prescribing is increasingly being used in primary care and in hospitals. Studies on the effects of e-prescribing systems have found evidence for both benefit and harm. The aim of this study was to identify features of e-prescribing software systems that support patient safety and quality of care and that are useful to the clinician and the patient, with a focus on improving the quality use of medicines. Methods Software features were identified by a literature review, key informants and an expert group. A modified Delphi process was used with a 12-member multidisciplinary expert group to reach consensus on the expected impact of the features in four domains: patient safety, quality of care, usefulness to the clinician and usefulness to the patient. The setting was electronic prescribing in general practice in Australia. Results A list of 114 software features was developed. Most of the features relate to the recording and use of patient data, the medication selection process, prescribing decision support, monitoring drug therapy and clinical reports. The expert group rated 78 of the features (68%) as likely to have a high positive impact in at least one domain, 36 features (32%) as medium impact, and none as low or negative impact. Twenty seven features were rated as high positive impact across 3 or 4 domains including patient safety and quality of care. Ten features were considered "aspirational" because of a lack of agreed standards and/or suitable knowledge bases. Conclusions This study defines features of e-prescribing software systems that are expected to support safety and quality, especially in relation to prescribing and use of medicines in general practice. The features could be used to develop software standards, and could be adapted if necessary for use in other settings and countries. PMID:20398294
GammaLib and ctools. A software framework for the analysis of astronomical gamma-ray data
NASA Astrophysics Data System (ADS)
Knödlseder, J.; Mayer, M.; Deil, C.; Cayrou, J.-B.; Owen, E.; Kelley-Hoskins, N.; Lu, C.-C.; Buehler, R.; Forest, F.; Louge, T.; Siejkowski, H.; Kosack, K.; Gerard, L.; Schulz, A.; Martin, P.; Sanchez, D.; Ohm, S.; Hassan, T.; Brau-Nogué, S.
2016-08-01
The field of gamma-ray astronomy has seen important progress during the last decade, yet to date no common software framework has been developed for the scientific analysis of gamma-ray telescope data. We propose to fill this gap by means of the GammaLib software, a generic library that we have developed to support the analysis of gamma-ray event data. GammaLib was written in C++ and all functionality is available in Python through an extension module. Based on this framework we have developed the ctools software package, a suite of software tools that enables flexible workflows to be built for the analysis of Imaging Air Cherenkov Telescope event data. The ctools are inspired by science analysis software available for existing high-energy astronomy instruments, and they follow the modular ftools model developed by the High Energy Astrophysics Science Archive Research Center. The ctools were written in Python and C++, and can be either used from the command line via shell scripts or directly from Python. In this paper we present the GammaLib and ctools software versions 1.0 that were released at the end of 2015. GammaLib and ctools are ready for the science analysis of Imaging Air Cherenkov Telescope event data, and also support the analysis of Fermi-LAT data and the exploitation of the COMPTEL legacy data archive. We propose using ctools as the science tools software for the Cherenkov Telescope Array Observatory.
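A typical ctools session chains a simulation into a likelihood fit from Python. The parameter names below follow the ctools 1.x documentation as best recalled and should be checked against the installed version; the model file name is a placeholder.

```python
# A ctobssim -> ctlike chain; verify parameter names against your
# installed ctools version.
import ctools

sim = ctools.ctobssim()
sim["inmodel"] = "crab.xml"              # XML source + background model
sim["caldb"], sim["irf"] = "prod2", "South_5h"
sim["ra"], sim["dec"], sim["rad"] = 83.63, 22.01, 5.0
sim["tmin"], sim["tmax"] = 0.0, 1800.0   # seconds
sim["emin"], sim["emax"] = 0.1, 100.0    # TeV
sim.run()                                # keep simulated events in memory

like = ctools.ctlike(sim.obs())          # maximum-likelihood fit
like.run()
print(like.obs().models())               # fitted model parameters
```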
Role of IAC in large space systems thermal analysis
NASA Technical Reports Server (NTRS)
Jones, G. K.; Skladany, J. T.; Young, J. P.
1982-01-01
Computer analysis programs to evaluate critical coupling effects that can significantly influence spacecraft system performance are described. These coupling effects arise from the varied parameters of the spacecraft systems, environments, and forcing functions associated with disciplines such as thermal, structures, and controls. Adverse effects can be expected to significantly impact system design aspects such as structural integrity, controllability, and mission performance. One such needed design analysis capability is a software system that can integrate individual discipline computer codes into a highly user-oriented/interactive-graphics-based analysis capability. The integrated analysis capability (IAC) system can be viewed as: a core framework system which serves as an integrating base whereby users can readily add desired analysis modules and as a self-contained interdisciplinary system analysis capability having a specific set of fully integrated multidisciplinary analysis programs that deal with the coupling of thermal, structures, controls, antenna radiation performance, and instrument optical performance disciplines.
Elements of strategic capability for software outsourcing enterprises based on the resource
NASA Astrophysics Data System (ADS)
Shi, Wengeng
2011-10-01
Software outsourcing enterprises are emerging high-tech enterprises that have grown at a remarkable speed and in remarkable numbers. Beyond the preferential policies that China grants to software outsourcing, the software outsourcing business has a capacity for upgrading that software companies in general have lacked. Viewed from the resource-based theory of the firm, we analyze whether software outsourcing companies possess capabilities and resources that are rare, valuable, and difficult to imitate, and we attempt to give an initial framework for theoretical analysis on this basis.
NASA Technical Reports Server (NTRS)
Lala, Jaynarayan H.; Harper, Richard E.; Jaskowiak, Kenneth R.; Rosch, Gene; Alger, Linda S.; Schor, Andrei L.
1990-01-01
An avionics architecture for the advanced launch system (ALS) that uses validated hardware and software building blocks developed under the advanced information processing system program is presented. The AIPS for ALS architecture defined is preliminary, and reliability requirements can be met by the AIPS hardware and software building blocks that are built using the state-of-the-art technology available in the 1992-93 time frame. The level of detail in the architecture definition reflects the level of detail available in the ALS requirements. As the avionics requirements are refined, the architecture can also be refined and defined in greater detail with the help of analysis and simulation tools. A useful methodology is demonstrated for investigating the impact of the avionics suite on the recurring cost of the ALS. It is shown that allowing the vehicle to launch with selected detected failures can potentially reduce the recurring launch costs. A comparative analysis shows that validated fault-tolerant avionics built out of Class B parts can result in lower life-cycle cost in comparison to simplex avionics built out of Class S parts or other redundant architectures.
Analysis of patient CT dose data using virtualdose
NASA Astrophysics Data System (ADS)
Bennett, Richard
X-ray computed tomography has many benefits in medical and research applications. Over the last decade CT has seen a large increase in usage in hospitals and medical diagnosis. In pediatric care, from 2000 to 2006, abdominal CT scans increased by 49 % and chest CT by 425 % in the emergency room (Broder 2007). Enormous effort has been expended across multiple academic and government groups to determine an accurate measure of organ dose to patients who undergo a CT scan, owing to the inherent risks of ionizing radiation. Considering these intrinsic risks, CT dose-estimating software becomes a necessary tool that health care providers and radiologists must use to weigh the risks against the rewards of an x-ray CT scan. This thesis models the resultant organ dose as body mass increases for patients with all other scan parameters fixed. In addition, this thesis compares a modern dose-estimating software package, VirtualDose CT, to two other programs, CT-Expo and ImPACT CT. The comparison shows how a package's theoretical basis and the phantom it uses to represent the human body affect the range of results in organ dose. CT-Expo and ImPACT CT use different models for the anatomical representation of the organs in the human body, and the results show how that approach dramatically changes the outcome. The results categorize four datasets compared across the three software types where the appropriate phantom was available. Modeling was done to simulate chest-abdomen-pelvis scans and whole-body scans. Organ dose difference versus body mass index shows that as body mass index (BMI) ranges from 23.5 kg/m2 to 45 kg/m2, organ dose trends a percent change from -4.58 to -176.19 %. Comparing organ dose difference with increasing x-ray tube potential from 120 kVp to 140 kVp, the percent change in organ dose increases from 55 % to 65 % across all phantoms. In comparing VirtualDose to CT-Expo for organ dose difference versus age, male phantoms show percent differences of -19 % to 25 % for various organs, excluding bone surface and breast tissue results. Finally, for organ dose difference across all software for the average adult phantom, the results range from -45 % to 6 % when comparing ImPACT CT to VirtualDose and -27 % to 66 % when comparing CT-Expo to VirtualDose. In the comparison for increased BMI (done only in VirtualDose), results show that with all other parameters fixed, organ dose goes down as BMI increases, which is due to the increase in adipose tissue and bulk of the patient model. Because the results of comparing the three software packages vary widely, in some cases by more than 150 %, it is evident that different anatomical bases for the human phantom and different theoretical bases for the dose estimation cause fluctuations in the results. Therefore, choosing the software with the most accurate human phantom will provide a range closer to the true organ dose.
The Comparison of VLBI Data Analysis Using Software Globl and Globk
NASA Astrophysics Data System (ADS)
Guangli, W.; Xiaoya, W.; Jinling, L.; Wenyao, Z.
The comparison of different geodetic data analysis software packages is one of the most often mentioned topics. In this paper we try to find out the differences between the software packages GLOBL and GLOBK when they are used to process the same set of VLBI data. GLOBL is a software package developed by the VLBI team, geodesy branch, GSFC/NASA to process geodetic VLBI data using an arc-parameter-elimination algorithm, while GLOBK, which uses a Kalman filtering algorithm, is mainly used in GPS data analysis but is also used in VLBI data analysis. Our work focuses on whether there are significant differences when the two packages are used to analyze the same VLBI data set, and investigates the reasons for any differences found.
ACES: Space shuttle flight software analysis expert system
NASA Technical Reports Server (NTRS)
Satterwhite, R. Scott
1990-01-01
The Analysis Criteria Evaluation System (ACES) is a knowledge based expert system that automates the final certification of the Space Shuttle onboard flight software. Guidance, navigation and control of the Space Shuttle through all its flight phases are accomplished by a complex onboard flight software system. This software is reconfigured for each flight to allow thousands of mission-specific parameters to be introduced and must therefore be thoroughly certified prior to each flight. This certification is performed in ground simulations by executing the software in the flight computers. Flight trajectories from liftoff to landing, including abort scenarios, are simulated and the results are stored for analysis. The current methodology of performing this analysis is repetitive and requires many man-hours. The ultimate goals of ACES are to capture the knowledge of the current experts and improve the quality and reduce the manpower required to certify the Space Shuttle onboard flight software.
The Effects of Development Team Skill on Software Product Quality
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.
2006-01-01
This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
1983-07-01
Distributed Computing Systems Impact on Software Quality. Topics addressed include "C3I Application", "Space Systems Network", "Need for Distributed Database Management", and "Adaptive Routing"; the last of these discusses data streams requiring data reduction, buffering, encryption, and error detection and correction functions, with imagery and video data given as examples of such streams.
Understanding Enterprise Systems' Impact(s) on Business Relationships
NASA Astrophysics Data System (ADS)
Ekman, Peter; Thilenius, Peter
Enterprise systems (ESs), i.e. standardized applications supplied by software vendors such as SAP or Oracle, have been extensively employed by companies during the last decade. Today all Fortune 500 companies have, or are in the process of installing, this kind of information system (Seddon et al. 2003). A widespread denotation for these applications is enterprise resource planning (ERP) systems, but the broad utilization of these software packages in business is rendering this labelling too narrow (Davenport 2000).
Kokhanok Renewable Energy Retrofit Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baring-Gould, Edward I.; Haase, Scott G.; Jimenez, Antonio
In 2010, the community of Kokhanok, Alaska, installed two 90-kW wind turbines, battery storage, a converter, and equipment for integration. Researchers at the National Renewable Energy Laboratory performed an analysis and modeling using the HOMER and REopt software modeling packages. The analysis was designed to answer the following questions: 1) What is required to achieve a 50 percent reduction in power plant diesel fuel consumption in a diesel microgrid? 2) What is required to achieve a 50 percent reduction in 'total' (diesel and heating oil) consumption in a remote community? 3) What is the impact and role of energy efficiency? This presentation provides an introduction to the community of Kokhanok, Alaska; a summary of energy data; and an overview of analysis results and conceptual design.
SAO mission support software and data standards, version 1.0
NASA Technical Reports Server (NTRS)
Hsieh, P.
1993-01-01
This document defines the software developed by the SAO AXAF Mission Support (MS) Program and defines standards for the software development process and control of data products generated by the software. The SAO MS is tasked to develop and use software to perform a variety of functions in support of the AXAF mission. Software is developed by software engineers and scientists, and commercial off-the-shelf (COTS) software is used either directly or customized through the use of scripts to implement analysis procedures. Software controls real-time laboratory instruments, performs data archiving, displays data, and generates model predictions. Much software is used in the analysis of data to generate data products that are required by the AXAF project, for example, on-orbit mirror performance predictions or detailed characterization of the mirror reflection performance with energy.
Long-term Preservation of Data Analysis Capabilities
NASA Astrophysics Data System (ADS)
Gabriel, C.; Arviset, C.; Ibarra, A.; Pollock, A.
2015-09-01
While the long-term preservation of scientific data obtained by large astrophysics missions is ensured through science archives, the issue of data analysis software preservation has hardly been addressed. Efforts by large data centres have contributed so far to maintain some instrument or mission-specific data reduction packages on top of high-level general purpose data analysis software. However, it is always difficult to keep software alive without support and maintenance once the active phase of a mission is over. This is especially difficult in the budgetary model followed by space agencies. We discuss the importance of extending the lifetime of dedicated data analysis packages and review diverse strategies under development at ESA using new paradigms such as Virtual Machines, Cloud Computing, and Software as a Service for making possible full availability of data analysis and calibration software for decades at minimal cost.
NASA Technical Reports Server (NTRS)
Waligora, Sharon; Bailey, John; Stark, Mike
1995-01-01
The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are (1) to understand the software development process in the GSFC environment; (2) to measure the effects of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.
Teaching meta-analysis using MetaLight.
Thomas, James; Graziosi, Sergio; Higgins, Steve; Coe, Robert; Torgerson, Carole; Newman, Mark
2012-10-18
Meta-analysis is a statistical method for combining the results of primary studies. It is often used in systematic reviews and is increasingly a method and topic that appears in student dissertations. MetaLight is a freely available software application that runs simple meta-analyses and contains specific functionality to facilitate the teaching and learning of meta-analysis. While there are many courses and resources for meta-analysis available and numerous software applications to run meta-analyses, there are few pieces of software which are aimed specifically at helping those teaching and learning meta-analysis. Valuable teaching time can be spent learning the mechanics of a new software application, rather than on the principles and practices of meta-analysis. We discuss ways in which the MetaLight tool can be used to present some of the main issues involved in undertaking and interpreting a meta-analysis. While there are many software tools available for conducting meta-analysis, in the context of a teaching programme such software can require expenditure both in terms of money and in terms of the time it takes to learn how to use it. MetaLight was developed specifically as a tool to facilitate the teaching and learning of meta-analysis and we have presented here some of the ways it might be used in a training situation.
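The statistical core such teaching aims at is inverse-variance pooling. The sketch below shows the textbook fixed-effect method with a 95% confidence interval; it is not MetaLight's API, and the study effects and variances are invented.

```python
import math

def fixed_effect(effects, variances):
    """Inverse-variance fixed-effect pooling: weight each study by 1/v,
    then report the pooled effect and its 95% confidence interval."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three primary studies: standardized mean differences and their variances
print(fixed_effect([0.30, 0.12, 0.45], [0.04, 0.02, 0.09]))
```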
Wallops Ship Surveillance System
NASA Technical Reports Server (NTRS)
Smith, Donna C.
2011-01-01
Approved as a Wallops control center backup system, the Wallops Ship Surveillance Software is a day-of-launch risk analysis tool for spaceport activities. The system calculates impact probabilities and displays ship locations relative to boundary lines. It enables rapid analysis of possible flight paths to preclude the need to cancel launches and allow execution of launches in a timely manner. Its design is based on low-cost, large-customer-base elements including personal computers, the Windows operating system, C/C++ object-oriented software, and network interfaces. In conformance with the NASA software safety standard, the system is designed to ensure that it does not falsely report a safe-for-launch condition. To improve the current ship surveillance method, the system is designed to prevent delay of launch under a safe-for-launch condition. A single workstation is designated the controller of the official ship information and the official risk analysis. Copies of this information are shared with other networked workstations. The program design is divided into five subsystems areas: 1. Communication Link -- threads that control the networking of workstations; 2. Contact List -- a thread that controls a list of protected item (ocean vessel) information; 3. Hazard List -- threads that control a list of hazardous item (debris) information and associated risk calculation information; 4. Display -- threads that control operator inputs and screen display outputs; and 5. Archive -- a thread that controls archive file read and write access. Currently, most of the hazard list thread and parts of other threads are being reused as part of a new ship surveillance system, under the SureTrak project.
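The geometric heart of such a tool, testing a ship position against a hazard boundary, can be sketched with the classic ray-casting point-in-polygon test. The coordinates below are invented, and this is not the SureTrak implementation.

```python
def ship_inside(ship, polygon):
    """Ray casting: does a ship's (lon, lat) position fall inside a
    hazard boundary given as a list of (lon, lat) vertices?"""
    x, y = ship
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):            # edge crosses the ray's latitude
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

hazard = [(-75.5, 37.8), (-75.0, 37.8), (-75.0, 38.2), (-75.5, 38.2)]
print(ship_inside((-75.2, 38.0), hazard))   # True
```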
Spacecraft Trajectory Analysis and Mission Planning Simulation (STAMPS) Software
NASA Technical Reports Server (NTRS)
Puckett, Nancy; Pettinger, Kris; Hallstrom,John; Brownfield, Dana; Blinn, Eric; Williams, Frank; Wiuff, Kelli; McCarty, Steve; Ramirez, Daniel; Lamotte, Nicole;
2014-01-01
STAMPS simulates either three- or six-degree-of-freedom cases for all spacecraft flight phases using translated HAL flight software or generic GN&C models. Single or multiple trajectories can be simulated for use in optimization and dispersion analysis. It includes math models for the vehicle and environment, and currently features a "C" version of shuttle onboard flight software. The STAMPS software is used for mission planning and analysis within ascent/descent, rendezvous, proximity operations, and navigation flight design areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patwe, P; Mhatre, V; Dandekar, P
Purpose: 3DVH software is a patient specific quality assurance tool which estimates the 3D dose to the patient specific geometry with the help of the Planned Dose Perturbation algorithm. The purpose of this study is to evaluate the impact of the HU value of the ArcCHECK phantom entered in the Eclipse TPS on 3D dose & DVH QA analysis. Methods: The manufacturer of the ArcCHECK phantom provides a CT data set of the phantom & recommends considering it a homogeneous phantom with electron density (1.19 gm/cc or 282 HU) close to PMMA. We performed this study on the Eclipse TPS (V13, VMS), a trueBEAM STx VMS Linac & an ArcCHECK phantom (SNC). Plans were generated for a 6MV photon beam, 20cm×20cm field size at isocentre & SPD (Source to phantom distance) of 86.7 cm to deliver 100cGy at isocentre. The 3DVH software requires the patient DICOM data generated by the TPS & the plan delivered on the ArcCHECK phantom. Plans were generated in the TPS by assigning different HU values to the phantom. We analyzed the gamma index & the dose profile for all plans along the vertical down direction of the beam's central axis for entry, exit & isocentre dose. Results: The global gamma passing rate (2% & 2mm) for the manufacturer recommended HU value 282 was 96.3%. Detector entry, isocentre & detector exit doses were 1.9048 (1.9270), 1.00 (1.0199) & 0.5078 (0.527) Gy for TPS (measured) respectively. The global gamma passing rate for electron density 1.1302 gm/cc was 98.6%. Detector entry, isocentre & detector exit doses were 1.8714 (1.8873), 1.00 (0.9988) & 0.5211 (0.516) Gy for TPS (measured) respectively. Conclusion: The electron density value assigned by the manufacturer does not hold true for every user. Proper modeling of the electron density of the ArcCHECK in the TPS is essential to avoid systematic error in the dose calculation of patient specific QA.
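The quoted global gamma passing rates come from the standard gamma analysis. Below is a simplified 1-D version with global dose normalization (2%/2mm by default); clinical tools operate on full 3-D dose grids.

```python
import numpy as np

def gamma_pass_rate(ref, meas, coords, dose_tol=0.02, dta_mm=2.0):
    """Global 1-D gamma analysis: for each measured point, minimize
    sqrt((dr/DTA)^2 + (dD/tol)^2) over all reference points; a point
    passes when that minimum is <= 1."""
    ref, meas, coords = map(np.asarray, (ref, meas, coords))
    norm = dose_tol * ref.max()            # global dose normalization
    gammas = []
    for x, d in zip(coords, meas):
        dr = (coords - x) / dta_mm
        dd = (ref - d) / norm
        gammas.append(np.sqrt(dr**2 + dd**2).min())
    return 100.0 * np.mean(np.array(gammas) <= 1.0)

x = np.arange(0, 50, 1.0)                  # positions in mm
ref = np.exp(-((x - 25) / 10) ** 2)        # reference dose profile
print(gamma_pass_rate(ref, ref * 1.01, x)) # 1% hot measurement -> 100.0
```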
NASA Astrophysics Data System (ADS)
Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.
2017-12-01
Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications. In this poster, we summarize the key components of the UCVM framework and describe the impact it has had in various computational geoscientific applications.
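The framework's central abstraction, a single query interface over heterogeneous models with a 1-D background fallback, can be sketched as follows. The classes and layer values are hypothetical stand-ins, not the UCVM API.

```python
# Toy version of the core idea: answer (vp, vs, density) queries by trying
# a regional 3-D model first, then falling back to a 1-D background model.
class Background1D:
    """Layered background model: properties depend on depth only."""
    LAYERS = [(0.0, (1500.0, 500.0, 2000.0)),     # depth (m): (vp, vs, rho)
              (1000.0, (3500.0, 2000.0, 2400.0)),
              (5000.0, (6000.0, 3500.0, 2700.0))]

    def get(self, depth):
        props = self.LAYERS[0][1]
        for top, p in self.LAYERS:
            if depth >= top:
                props = p
        return props

class EmptyRegional:
    """Regional model stub: every point lies outside its defined volume."""
    def get(self, lon, lat, depth):
        return None

def query(points, regional, background):
    """Return material properties for each (lon, lat, depth) point."""
    return [regional.get(lon, lat, depth) or background.get(depth)
            for lon, lat, depth in points]

print(query([(-118.2, 34.1, 250.0)], EmptyRegional(), Background1D()))
```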
NASA Technical Reports Server (NTRS)
Moran, Susanne I.
2004-01-01
The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: discovery and verification of software program properties and dependencies; detection and isolation of software defects across different versions of software; and compilation of historical data and technical expertise for future applications.
Disseminating Metaproteomic Informatics Capabilities and Knowledge Using the Galaxy-P Framework
Easterly, Caleb; Gruening, Bjoern; Johnson, James; Kolmeder, Carolin A.; Kumar, Praveen; May, Damon; Mehta, Subina; Mesuere, Bart; Brown, Zachary; Elias, Joshua E.; Hervey, W. Judson; McGowan, Thomas; Muth, Thilo; Rudney, Joel; Griffin, Timothy J.
2018-01-01
The impact of microbial communities, also known as the microbiome, on human health and the environment is receiving increased attention. Studying translated gene products (proteins) and comparing metaproteomic profiles may elucidate how microbiomes respond to specific environmental stimuli, and interact with host organisms. Characterizing proteins expressed by a complex microbiome and interpreting their functional signature requires sophisticated informatics tools and workflows tailored to metaproteomics. Additionally, there is a need to disseminate these informatics resources to researchers undertaking metaproteomic studies, who could use them to make new and important discoveries in microbiome research. The Galaxy for proteomics platform (Galaxy-P) offers an open source, web-based bioinformatics platform for disseminating metaproteomics software and workflows. Within this platform, we have developed easily-accessible and documented metaproteomic software tools and workflows aimed at training researchers in their operation and disseminating the tools for more widespread use. The modular workflows encompass the core requirements of metaproteomic informatics: (a) database generation; (b) peptide spectral matching; (c) taxonomic analysis and (d) functional analysis. Much of the software available via the Galaxy-P platform was selected, packaged and deployed through an online metaproteomics “Contribution Fest“ undertaken by a unique consortium of expert software developers and users from the metaproteomics research community, who have co-authored this manuscript. These resources are documented on GitHub and freely available through the Galaxy Toolshed, as well as a publicly accessible metaproteomics gateway Galaxy instance. These documented workflows are well suited for the training of novice metaproteomics researchers, through online resources such as the Galaxy Training Network, as well as hands-on training workshops. Here, we describe the metaproteomics tools available within these Galaxy-based resources, as well as the process by which they were selected and implemented in our community-based work. We hope this description will increase access to and utilization of metaproteomics tools, as well as offer a framework for continued community-based development and dissemination of cutting edge metaproteomics software. PMID:29385081
Disseminating Metaproteomic Informatics Capabilities and Knowledge Using the Galaxy-P Framework.
Blank, Clemens; Easterly, Caleb; Gruening, Bjoern; Johnson, James; Kolmeder, Carolin A; Kumar, Praveen; May, Damon; Mehta, Subina; Mesuere, Bart; Brown, Zachary; Elias, Joshua E; Hervey, W Judson; McGowan, Thomas; Muth, Thilo; Nunn, Brook; Rudney, Joel; Tanca, Alessandro; Griffin, Timothy J; Jagtap, Pratik D
2018-01-31
The impact of microbial communities, also known as the microbiome, on human health and the environment is receiving increased attention. Studying translated gene products (proteins) and comparing metaproteomic profiles may elucidate how microbiomes respond to specific environmental stimuli, and interact with host organisms. Characterizing proteins expressed by a complex microbiome and interpreting their functional signature requires sophisticated informatics tools and workflows tailored to metaproteomics. Additionally, there is a need to disseminate these informatics resources to researchers undertaking metaproteomic studies, who could use them to make new and important discoveries in microbiome research. The Galaxy for proteomics platform (Galaxy-P) offers an open source, web-based bioinformatics platform for disseminating metaproteomics software and workflows. Within this platform, we have developed easily-accessible and documented metaproteomic software tools and workflows aimed at training researchers in their operation and disseminating the tools for more widespread use. The modular workflows encompass the core requirements of metaproteomic informatics: (a) database generation; (b) peptide spectral matching; (c) taxonomic analysis and (d) functional analysis. Much of the software available via the Galaxy-P platform was selected, packaged and deployed through an online metaproteomics "Contribution Fest" undertaken by a unique consortium of expert software developers and users from the metaproteomics research community, who have co-authored this manuscript. These resources are documented on GitHub and freely available through the Galaxy Toolshed, as well as a publicly accessible metaproteomics gateway Galaxy instance. These documented workflows are well suited for the training of novice metaproteomics researchers, through online resources such as the Galaxy Training Network, as well as hands-on training workshops. Here, we describe the metaproteomics tools available within these Galaxy-based resources, as well as the process by which they were selected and implemented in our community-based work. We hope this description will increase access to and utilization of metaproteomics tools, as well as offer a framework for continued community-based development and dissemination of cutting edge metaproteomics software.
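To make the modular structure concrete, the sketch below chains toy stand-ins for the four stages (a)-(d). Every function here is a hypothetical illustration of the data flow only; in Galaxy-P these stages are Galaxy workflow steps built from real search engines and annotation tools, not Python calls.

```python
# Toy illustration of the four metaproteomic workflow stages named above.
# The stage logic is deliberately trivial (string matching and lookups).

def generate_database(reference_proteins):
    # (a) database generation: index protein sequences by accession
    return dict(reference_proteins)

def match_spectra(observed_peptides, database):
    # (b) peptide-spectral matching stand-in: keep peptides found in any protein
    return [pep for pep in observed_peptides
            if any(pep in seq for seq in database.values())]

def assign_taxonomy(peptides, peptide_taxa):
    # (c) taxonomic analysis stand-in: precomputed peptide -> taxon map
    return {pep: peptide_taxa.get(pep, "unclassified") for pep in peptides}

def assign_function(peptides, peptide_funcs):
    # (d) functional analysis stand-in: peptide -> functional category
    return {pep: peptide_funcs.get(pep, "unknown") for pep in peptides}

db = generate_database([("P1", "MKTAYIAKQR"), ("P2", "GVLKEYGV")])
matched = match_spectra(["TAYIAK", "QQQQ"], db)
print(assign_taxonomy(matched, {"TAYIAK": "Bacteroides"}))
print(assign_function(matched, {"TAYIAK": "glycolysis"}))
```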
NASA Technical Reports Server (NTRS)
1976-01-01
Only a few efforts are currently underway to develop an adequate technology base for the various themes. Particular attention must be given to software commonality and evolutionary capability, to increased system integrity and autonomy, and to improved communications among the program users, the program developers, and the programs themselves. There is a need for quantum improvement in software development methods and for increased awareness of software by all concerned. Major thrusts identified include: (1) data and systems management; (2) software technology for autonomous systems; (3) technology and methods for improving the software development process; (4) advances related to systems of software elements, including their architecture, their attributes as systems, and their interfaces with users and other systems; and (5) applications of software, including both the basic algorithms used in a number of applications and the software specific to a particular theme or discipline area. The impact of each theme on software is assessed.
Health impact assessment of liquid biofuel production.
Fink, Rok; Medved, Sašo
2013-01-01
Bioethanol and biodiesel, as potential substitutes for fossil fuels in the transportation sector, have been analyzed for environmental suitability. However, there could be impacts on human health during production; therefore, adverse health effects have to be analyzed. The aim of this study is to analyze which health risk factors humans are exposed to in the production of biofuels and how large the health effects are. A health impact assessment expressed as disability-adjusted life years (DALYs) was conducted in SimaPro 7.1 software. The results show a statistically significantly lower carcinogenic impact of biofuels (p < 0.05) than fossil fuels. Meanwhile, the impact of organic respirable compounds is smaller for fossil fuels (p < 0.05) than for biofuels. Analysis of inorganic compounds like PM₁₀, PM₂.₅, SO₂ or NOx shows some advantages of sugar beet bioethanol and soybean biodiesel production (p < 0.05), although production of sugarcane bioethanol shows larger impacts of respirable inorganic compounds than fossil fuels (p < 0.001). Although liquid biofuels are made from renewable energy sources, this does not necessarily mean that they do not represent any health hazards.
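For reference, the standard decomposition behind the DALY metric used in the study (the textbook definition, not SimaPro's specific implementation):

```latex
\mathrm{DALY} = \mathrm{YLL} + \mathrm{YLD}, \qquad
\mathrm{YLL} = N \cdot L, \qquad
\mathrm{YLD} = I \cdot DW \cdot L_d
```

where N is the number of deaths, L the standard life expectancy at the age of death, I the number of incident cases, DW the disability weight of the condition, and L_d the average duration of the disability in years.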
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
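One common formalization of items (1), (2), and (4) is a nonhomogeneous Poisson process reliability-growth model such as the Goel-Okumoto model; the abstract names no specific model, so this is only an illustration:

```latex
\mu(t) = a\,\bigl(1 - e^{-bt}\bigr), \qquad
\lambda(t) = \mu'(t) = a\,b\,e^{-bt}
```

Here μ(t) is the expected cumulative number of failures observed by test time t, a the expected total number of failures, and b the per-fault detection rate. A stop-testing rule of the kind mentioned in item (4) might require the expected number of remaining failures, a − μ(T), to fall below a target before release.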
Testing of Hand-Held Mine Detection Systems
2015-01-08
ITOP 04-2-5208 for guidance on software testing. Testing software is necessary to ensure that safety is designed into the software algorithm, and that ... sensor verification areas or target lanes. F.2. TESTING OBJECTIVES. a. Testing objectives will impact the test design. Some examples of ... overall safety, performance, and reliability of the system. It describes activities necessary to ensure safety is designed into the system under test
2008-03-01
executables. The current roadblock to detecting Type I malware consistently is the practice of legitimate software, such as antivirus programs, using this ... Software Security Systems (p. 31); 3.2.2 Advantages of Hardware (p. 32); 3.2.3 Trustworthiness of Information (p. 33); ... Towards a Hardware Security Backplane (p. 42); IV. Review of State of the Art Computer Security Solutions (p. 46); 4.1 Software
NASA Technical Reports Server (NTRS)
Noor, A. K.
1983-01-01
Advances in continuum modeling, progress in reduction methods, and analysis and modeling needs for large space structures are covered, with specific attention given to repetitive lattice trusses. As far as continuum modeling is concerned, an effective and verified analysis capability exists for linear thermoelastic stress, bifurcation buckling, and free vibration problems of repetitive lattices. However, application of continuum modeling to nonlinear analysis needs more development. Reduction methods are very effective for bifurcation buckling and static (steady-state) nonlinear analysis. However, more work is needed to realize their full potential for nonlinear dynamic and time-dependent problems. As far as analysis and modeling needs are concerned, three areas are identified: loads determination, modeling of nonclassical behavior characteristics, and computational algorithms. The impact of new advances in computer hardware, software, integrated analysis, CAD/CAM systems, and materials technology is also discussed.
Development of GENOA Progressive Failure Parallel Processing Software Systems
NASA Technical Reports Server (NTRS)
Abdi, Frank; Minnetyan, Levon
1999-01-01
A capability consisting of software development and experimental techniques has been developed and is described. The capability is integrated into GENOA-PFA to model polymer matrix composite (PMC) structures. The capability considers the physics and mechanics of composite materials and structures by integration of hierarchical multilevel macro-scale (lamina, laminate, and structure) and micro-scale (fiber, matrix, and interface) simulation analyses. The modeling involves (1) ply layering methodology utilizing FEM elements with through-the-thickness representation, (2) simulation of effects of material defects and conditions (e.g., voids, fiber waviness, and residual stress) on global static and cyclic fatigue strengths, (3) inclusion of material nonlinearities (by updating properties periodically) and geometrical nonlinearities (by Lagrangian updating), (4) simulation of crack initiation and growth to failure under static, cyclic, creep, and impact loads, (5) progressive fracture analysis to determine durability and damage tolerance, (6) identification of the percent contribution of various possible composite failure modes involved in critical damage events, and (7) determination of sensitivities of failure modes to design parameters (e.g., fiber volume fraction, ply thickness, fiber orientation, and adhesive-bond thickness). GENOA-PFA progressive failure analysis is now ready for use to investigate the effects on structural responses of PMC material degradation from damage induced by static, cyclic (fatigue), creep, and impact loading in 2D/3D PMC structures subjected to hygrothermal environments. Its use will significantly facilitate targeting design parameter changes that will be most effective in reducing the probability of a given failure mode occurring.
NASA Technical Reports Server (NTRS)
Dunn, William R.; Corliss, Lloyd D.
1991-01-01
Paper examines issue of software safety. Presents four case histories of software-safety analysis. Concludes that, to be safe, software, for all practical purposes, must be free of errors. Backup systems still needed to prevent catastrophic software failures.
Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G
2013-01-16
Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
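The automated unit tests the authors single out as generally useful can be illustrated with a small pytest-style sketch; the QC rule and function names below are hypothetical, not the paper's actual code:

```python
# Hypothetical QC helper plus unit tests for it, illustrating the kind of
# automated testing the authors advocate. Run with: pytest test_qc.py

def percent_deviation(measured, target):
    """Deviation of a QC sample from its target concentration, in percent."""
    if target == 0:
        raise ValueError("target concentration must be nonzero")
    return 100.0 * (measured - target) / target

def qc_passes(measured, target, tolerance_pct=20.0):
    """Common acceptance rule: QC must fall within +/- tolerance of target."""
    return abs(percent_deviation(measured, target)) <= tolerance_pct

def test_qc_within_tolerance():
    assert qc_passes(110.0, 100.0)          # +10% deviation passes

def test_qc_outside_tolerance():
    assert not qc_passes(125.0, 100.0)      # +25% deviation fails

def test_zero_target_rejected():
    import pytest
    with pytest.raises(ValueError):
        percent_deviation(50.0, 0.0)
```

Keeping such tests under version control, as the paper describes, couples every procedural or software change to evidence that the QC calculations still behave as intended.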
NASA Technical Reports Server (NTRS)
Uber, James G.
1988-01-01
Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.
NASA Technical Reports Server (NTRS)
Tamayo, Tak Chai
1987-01-01
The quality of software is not only vital to the successful operation of the space station, it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launching schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancies, making traditional statistical analysis unsuitable for evaluating the reliability of software. A statistical model was developed to provide a representation of the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
Analysis of a hardware and software fault tolerant processor for critical applications
NASA Technical Reports Server (NTRS)
Dugan, Joanne B.
1993-01-01
Computer systems for critical applications must be designed to tolerate software faults as well as hardware faults. A unified approach to tolerating hardware and software faults is characterized by classifying faults in terms of duration (transient or permanent) rather than source (hardware or software). Errors arising from transient faults can be handled through masking or voting, but errors arising from permanent faults require system reconfiguration to bypass the failed component. Most errors which are caused by software faults can be considered transient, in that they are input-dependent. Software faults are triggered by a particular set of inputs. Quantitative dependability analysis of systems which exhibit a unified approach to fault tolerance can be performed by a hierarchical combination of fault tree and Markov models. A methodology for analyzing hardware and software fault tolerant systems is applied to the analysis of a hypothetical system, loosely based on the Fault Tolerant Parallel Processor. The models consider both transient and permanent faults, hardware and software faults, independent and related software faults, automatic recovery, and reconfiguration.
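A minimal numeric sketch of the Markov half of the fault-tree/Markov hierarchy described above; the states, rates, and coverage factor are invented for illustration and are far simpler than the paper's FTPP-based models:

```python
import numpy as np
from scipy.linalg import expm

# States: 0 = operational, 1 = reconfiguring after a permanent fault, 2 = failed.
# Transient (input-dependent) faults are assumed masked by voting, so only
# permanent faults force reconfiguration. All rates are per hour and invented.
lam_perm = 1e-4      # permanent fault arrival rate
mu_reconf = 2.0      # reconfiguration completion rate
c = 0.999            # coverage: probability reconfiguration succeeds

# Generator matrix: row i holds transition rates out of state i (rows sum to 0).
Q = np.array([
    [-lam_perm,       lam_perm,    0.0                ],
    [ c * mu_reconf, -mu_reconf,  (1 - c) * mu_reconf ],
    [ 0.0,            0.0,         0.0                ],  # failed is absorbing
])

p0 = np.array([1.0, 0.0, 0.0])             # start fully operational
for t in (10.0, 100.0, 1000.0):            # mission times in hours
    p_t = p0 @ expm(Q * t)                 # state probabilities at time t
    print(f"t={t:7.1f} h  P(failed) = {p_t[2]:.3e}")
```

In the hierarchical approach the paper describes, fault trees would supply the failure logic feeding such a chain, and coverage would itself come from finer-grained recovery models.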
NASA Technical Reports Server (NTRS)
Melis, Matthew E.; Brand, Jeremy H.; Pereira, J. Michael; Revilock, Duane M.
2007-01-01
Following the tragedy of the Space Shuttle Columbia on February 1, 2003, a major effort commenced to develop a better understanding of debris impacts and their effect on the Space Shuttle subsystems. An initiative to develop and validate physics-based computer models to predict damage from such impacts was a fundamental component of this effort. To develop the models it was necessary to physically characterize Reinforced Carbon-Carbon (RCC) and various debris materials which could potentially shed on ascent and impact the Orbiter RCC leading edges. The validated models enabled the launch system community to use the impact analysis software LS DYNA to predict damage by potential and actual impact events on the Orbiter leading edge and nose cap thermal protection systems. Validation of the material models was done through a three-level approach: fundamental tests to obtain independent static and dynamic material model properties of materials of interest, sub-component impact tests to provide highly controlled impact test data for the correlation and validation of the models, and full-scale impact tests to establish the final level of confidence for the analysis methodology. This paper discusses the second level subcomponent test program in detail and its application to the LS DYNA model validation process. The level two testing consisted of over one hundred impact tests in the NASA Glenn Research Center Ballistic Impact Lab on 6 by 6 in. and 6 by 12 in. flat plates of RCC and evaluated three types of debris projectiles: BX 265 External Tank foam, ice, and PDL 1034 External Tank foam. These impact tests helped determine the level of damage generated in the RCC flat plates by each projectile. The information obtained from this testing validated the LS DYNA damage prediction models and provided a certain level of confidence to begin performing analysis for full-size RCC test articles for returning NASA to flight with STS 114 and beyond.
The Impact of Computer Science on the Development of Oulu ICT during 1985-1990
NASA Astrophysics Data System (ADS)
Oinas-Kukkonen, Henry; Similä, Jouni; Pulli, Petri; Oinas-Kukkonen, Harri; Kerola, Pentti
The region of Oulu has been emphasizing the importance of the electronics industry for its business growth since the 1960s. After a pitch-dark recession, the region developed in the 1990s into a new, well-established hub of information and communication technology (ICT) in Finland. The city, with its 100,000 inhabitants, employed nearly 10,000 ICT professionals in 1995. This article will contribute to the body of research knowledge by analyzing the role of computer science, in particular information systems and software engineering, in the development of the ICT industry in Oulu in the latter half of the 1980s. This analysis is based on a variety of both primary and secondary sources. This article suggests that system-theoretical and software-oriented research expertise played a key role in the rapid and successful ICT business development of the Oulu region.
Using Noldus Observer XT for research on deaf signers learning to read: an innovative methodology.
Ducharme, Daphne A; Arcand, Isabelle
2009-08-01
Despite years of research on the reading problems of deaf students, we still do not know how deaf signers who read well actually crack the code of print. How connections are made between sign language and written language is still an open question. In this article, we show how the Noldus Observer XT software can be used to conduct an in-depth analysis of the online behavior of deaf readers. First, we examine factors that may have an impact on reading behavior. Then, we describe how we videotaped teachers with their deaf student signers of langue des signes québécoise during a reading task, how we conducted a recall activity to better understand the students' reading behavior, and how we used this innovative software to analyze the taped footage. Finally, we discuss the contribution this type of research can have on the future reading behavior of deaf students.
NASA Technical Reports Server (NTRS)
Clanton, Stephen E.; Holt, James M.; Turner, Larry D. (Technical Monitor)
2001-01-01
A challenging part of International Space Station (ISS) thermal control design is the ability to incorporate design changes into an integrated system without negatively impacting performance. The challenge presents itself in that the typical ISS Internal Active Thermal Control System (IATCS) consists of an integrated hardware/software system that provides active coolant resources to a variety of users. Software algorithms control the IATCS to specific temperatures, flow rates, and pressure differentials in order to meet the user-defined requirements. What may seem to be small design changes imposed on the system may in fact result in system instability or the temporary inability to meet user requirements. The purpose of this paper is to provide a brief description of the solution process and analyses used to implement one such design change that required the incorporation of an automatic coolant bypass in the ISS Node 2 element.
NASA Astrophysics Data System (ADS)
Regnell, Björn; Höst, Martin; Nilsson, Fredrik; Bengtsson, Henrik
When developing software-intensive products for a market-place it is important for a development organisation to create innovative features for coming releases in order to achieve advantage over competitors. This paper focuses on assessment of innovation capability at team level in relation to the requirements engineering that is taking place before the actual product development projects are decided, when new business models, technology opportunities and intellectual property rights are created and investigated through e.g. prototyping and concept development. The result is a measurement framework focusing on four areas: innovation elicitation, selection, impact and ways-of-working. For each area, candidate measurements were derived from interviews to be used as inspiration in the development of a tailored measurement program. The framework is based on interviews with participants of a software team with specific innovation responsibilities and validated through cross-case analysis and feedback from practitioners.
Statistical evaluation of manual segmentation of a diffuse low-grade glioma MRI dataset.
Ben Abdallah, Meriem; Blonski, Marie; Wantz-Mezieres, Sophie; Gaudeau, Yann; Taillandier, Luc; Moureaux, Jean-Marie
2016-08-01
Software-based manual segmentation is critical to the supervision of diffuse low-grade glioma patients and to the optimal treatment's choice. However, manual segmentation being time-consuming, it is difficult to include it in the clinical routine. An alternative to circumvent the time cost of manual segmentation could be to share the task among different practitioners, providing it can be reproduced. The goal of our work is to assess diffuse low-grade gliomas' manual segmentation's reproducibility on MRI scans, with regard to practitioners, their experience and field of expertise. A panel of 13 experts manually segmented 12 diffuse low-grade glioma clinical MRI datasets using the OSIRIX software. A statistical analysis gave promising results, as the practitioner factor, the medical specialty and the years of experience seem to have no significant impact on the average values of the tumor volume variable.
NASA Astrophysics Data System (ADS)
Jalasabri, J.; Romli, F. I.; Harmin, M. Y.
2017-12-01
In developing successful airship designs, it is important to fully understand the effect of the design on the performance of the airship. The aim of this research work is to establish the trend for effects of design fineness ratio of an airship towards its aerodynamic performance. An approximate computer-aided design (CAD) model of the Atlant-100 airship is constructed using CATIA software and it is applied in the computational fluid dynamics (CFD) simulation analysis using Star-CCM+ software. In total, 36 simulation runs are executed with different combinations of values for design fineness ratio, altitude and velocity. The obtained simulation results are analyzed using MINITAB to capture the effects relationship on lift and drag coefficients. Based on the results, it is concluded that the design fineness ratio does have a significant impact on the generated aerodynamic lift and drag forces on the airship.
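For reference, the standard definitions of the quantities swept and measured in these simulations (generic aerodynamics, not specific to the Star-CCM+ setup):

```latex
C_L = \frac{L}{\tfrac{1}{2}\,\rho V^{2} S}, \qquad
C_D = \frac{D}{\tfrac{1}{2}\,\rho V^{2} S}, \qquad
\mathrm{FR} = \frac{\ell}{d}
```

with L and D the lift and drag forces, ρ the air density at the simulated altitude, V the freestream velocity, S a reference area (for airship hulls often taken as hull volume to the 2/3 power), and the fineness ratio FR the hull length ℓ divided by the maximum diameter d.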
Validation of X1 motorcycle model in industrial plant layout by using WITNESSTM simulation software
NASA Astrophysics Data System (ADS)
Hamzas, M. F. M. A.; Bareduan, S. A.; Zakaria, M. Z.; Tan, W. J.; Zairi, S.
2017-09-01
This paper demonstrates a case study on simulation, modelling and analysis for the X1 motorcycle model. In this research, a motorcycle assembly plant was selected as the main place of the research study. Simulation techniques using the Witness software were applied to evaluate the performance of the existing manufacturing system. The main objective is to validate the data and find their significant impact on the overall performance of the system for future improvement. The process of validation starts when the layout of the assembly line is identified. All components are evaluated to validate whether the data are significant for future improvement. Machine and labor statistics are among the parameters that were evaluated for process improvement. The average total cycle time for given workstations is used as the criterion for comparison of possible variants. From the simulation process, the data used are appropriate and meet the criteria for two-sided assembly line problems.
Rengarajan, A; Drapekin, J; Patel, A; Gyawali, C P
2016-12-01
High-resolution manometry (HRM) utilizes software tools to diagnose esophageal motor disorders. Performance of these software metrics could be affected by averaging and by software characteristics of different manufacturers. High-resolution manometry studies on 86 patients referred for antireflux surgery (61.6 ± 1.4 years, 70% F) and 20 healthy controls (27.9 ± 0.7 years, 45% F) were first subject to standard analysis (Medtronic, Duluth, GA, USA). Coordinates for each of 10 test swallows were exported and averaged to generate a composite swallow. The swallows and averaged composites were imported in ASCII file format into Manoview (Medtronic) and the Medical Measurement Systems database reporter (MMS, Dover, NH, USA), and the analyses repeated. Comparisons were made between standard and composite swallow interpretations. Correlation between the two systems was high for mean distal contractile integral (DCI, r² ≥ 0.9) but lower for integrated relaxation pressure (IRP, r² = 0.7). Excluding achalasia, six patients with outflow obstruction (mean IRP 23.2 ± 2.1 with 10-swallow average) were identified by both systems. An additional nine patients (10.5%) were identified as outflow obstruction (15 mmHg threshold) with MMS 10-swallow evaluation and four with MMS composite swallow evaluation; only one was confirmed. Ineffective esophageal motility was diagnosed by 10-swallow evaluation in 19 (22.1%) with Manoview and 20 (23.3%) with MMS. On the Manoview composite, 17 had DCI <450 mmHg/cm/s, and on the MMS composite, 21 (p ≥ 0.85 for each comparison), but these did not impact diagnostic conclusions. Comparison of 10-swallow and composite swallows demonstrates variability in software metrics between manometry systems. Our data support use of manufacturer-specific software metrics on 10-swallow sequences. © 2016 John Wiley & Sons Ltd.
A novel real-time health monitoring system for unmanned vehicles
NASA Astrophysics Data System (ADS)
Zhang, David C.; Ouyang, Lien; Qing, Peter; Li, Irene
2008-04-01
Real-time monitoring of the status of in-service structures such as unmanned vehicles can provide invaluable information for detecting damage to the structures in time. The unmanned vehicles can then be maintained and repaired promptly if such damage is found. One typical cause of damage to unmanned vehicles is impact from bumping into obstacles or being hit by objects such as hostile fire. This paper introduces a novel impact event sensing system that can detect the location of impact events and their force-time history. The system consists of a piezoelectric sensor network, the hardware platform, and the analysis software. The new customized battery-powered impact event sensing system supports up to 64-channel parallel data acquisition. It features an innovative low-power hardware trigger circuit that monitors 64 channels simultaneously. The system is in sleep mode most of the time. When an impact event happens, the system wakes up in microseconds and detects the impact location and corresponding force-time history. The system can be combined with the SMART sensing system to further evaluate the impact damage severity.
NASA Astrophysics Data System (ADS)
Fraser, Ryan; Gross, Lutz; Wyborn, Lesley; Evans, Ben; Klump, Jens
2015-04-01
Recent investments in HPC, cloud and petascale data stores have dramatically increased the scale and resolution at which earth science challenges can now be tackled. These new infrastructures are highly parallelised, and to fully utilise them and access the large volumes of earth science data now available, a new approach to software stack engineering needs to be developed. The size, complexity and cost of the new infrastructures mean any software deployed has to be reliable, trusted and reusable. Increasingly software is available via open source repositories, but these usually only enable code to be discovered and downloaded. As a user, it is hard for a scientist to judge the suitability and quality of individual codes: rarely is there information on how and where codes can be run, what the critical dependencies are, and in particular, on the version requirements and licensing of the underlying software stack. A trusted software framework is proposed to enable reliable software to be discovered, accessed and then deployed on multiple hardware environments. More specifically, this framework will enable those who generate the software, and those who fund the development of software, to gain credit for the effort, IP, time and dollars spent, and facilitate quantification of the impact of individual codes. For scientific users, the framework delivers reviewed and benchmarked scientific software with mechanisms to reproduce results. The trusted framework will have five separate, but connected components: Register, Review, Reference, Run, and Repeat. 1) The Register component will facilitate discovery of relevant software from multiple open source code repositories. The registration process for a code should include information about licensing and the hardware environments it can be run on, define appropriate validation (testing) procedures, and list the critical dependencies. 2) The Review component targets verification of the software, typically against a set of benchmark cases. This will be achieved by linking the code in the software framework to peer review forums such as Mozilla Science or appropriate journals (e.g., the Geoscientific Model Development journal) to assist users in knowing which codes to trust. 3) Referencing will be accomplished by linking the software framework to groups such as Figshare or ImpactStory that help disseminate and measure the impact of scientific research, including program code. 4) The Run component will draw on information supplied in the registration process, benchmark cases described in the review, and relevant information to instantiate the scientific code on the selected environment. 5) The Repeat component will tap into existing provenance workflow engines that automatically capture information relating to a particular run of the software, including identification of all input and output artefacts and all elements and transactions within the workflow. The proposed trusted software framework will enable users to rapidly discover and access reliable code, reduce the time to deploy it, and greatly facilitate sharing, reuse and reinstallation of code. Properly designed, it could scale out to massively parallel systems and be accessed nationally and internationally for multiple use cases, including supercomputer centres, cloud facilities, and local computers.
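A sketch of what a Register-component entry might capture, following the metadata items listed for component 1; the schema, field names, and URL are invented, since the abstract does not define one:

```python
# Hypothetical registration record for the proposed Register component.
# All field names, values, and the URL are illustrative placeholders.
registration = {
    "name": "example-solver",
    "repository": "https://example.org/example-solver.git",
    "license": "Apache-2.0",
    "hardware_environments": ["x86_64 cluster", "GPU node", "laptop"],
    "validation": {
        "benchmarks": ["benchmark_case_1", "benchmark_case_2"],
        "procedure": "run `make test`; compare against published reference output",
    },
    "critical_dependencies": [
        {"package": "python", "version": ">=3.8"},
        {"package": "openmpi", "version": ">=4.0"},
    ],
}

def is_registrable(record):
    """Reject entries missing the metadata the Register step requires."""
    required = {"license", "hardware_environments", "validation",
                "critical_dependencies"}
    return required <= record.keys()

print(is_registrable(registration))   # True
```

The Run component could then consume exactly these fields to select a hardware environment and check dependencies before instantiating the code.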
State-of-the-Art Resources (SOAR) for Software Vulnerability Detection, Test, and Evaluation
2014-07-01
preclude in-depth analysis, and widespread use of a Software-as-a-Service (SaaS) model that limits data availability and application to DoD systems ... provide mobile application analysis using a Software-as-a-Service (SaaS) model. In this case, any software to be analyzed must be sent to the ... tools are only available through a SaaS model. The widespread use of a Software-as-a-Service (SaaS) model as a sole evaluation model limits data
A Method for Populating the Knowledge Base of AFIT’s Domain-Oriented Application Composition System
1993-12-01
Analysis (FODA). The approach identifies prominent features (similarities) and distinctive features (differences) of software systems within an ... analysis approaches we have summarized, the researchers described FODA in sufficient detail to use on large domain analysis projects (ones with ... Software Technology Center, July 1991. 18. Kang, Kyo C. and others. Feature-Oriented Domain Analysis (FODA) Feasibility Study. Technical Report, Software
Selection of software for mechanical engineering undergraduates
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheah, C. T.; Yin, C. S.; Halim, T.
A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages across the entire mechanical engineering curriculum to ensure students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g., simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that 49-70% of 154 hazardous conditions could be caused by software or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Knowledge and utilization of computer-software for statistics among Nigerian dentists.
Chukwuneke, F N; Anyanechi, C E; Obiakor, A O; Amobi, O; Onyejiaka, N; Alamba, I
2013-01-01
The use of computer software for the generation of statistical analysis has transformed health information and data into their simplest form in the areas of access, storage, retrieval and analysis in the field of research. This survey was therefore carried out to assess the level of knowledge and utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. Questionnaires on the use of computer software for statistical analysis were randomly distributed to 65 practicing dental surgeons with above 5 years of experience in the tertiary academic hospitals in eastern Nigeria. The focus was on: years of clinical experience; research work experience; and knowledge and application of computer-generated software for data processing and statistical analysis. Sixty-two (62/65; 95.4%) of these questionnaires were returned anonymously and were used in our data analysis. Twenty-nine (29/62; 46.8%) respondents fall within those with 5-10 years of clinical experience, out of which none has completed the specialist training programme. Practitioners with above 10 years of clinical experience were 33 (33/62; 53.2%), out of which 15 (15/33; 45.5%) are specialists, representing 24.2% (15/62) of the total number of respondents. All 15 specialists are actively involved in research activities, and only five (5/15; 33.3%) can utilize software statistical analysis unaided. This study has identified poor utilization of computer software for statistical analysis among dental researchers in eastern Nigeria. This is strongly associated with lack of exposure to the use of this software early enough, especially during undergraduate training. This calls for the introduction of a computer training programme in the dental curriculum to enable practitioners to develop the attitude of using computer software for their research.
Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F
1997-12-01
Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as XWindows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome era.
FunRich proteomics software analysis, let the fun begin!
Benito-Martin, Alberto; Peinado, Héctor
2015-08-01
Protein MS analysis is the preferred method for unbiased protein identification. It is normally applied to a large number of both small-scale and high-throughput studies. However, user-friendly computational tools for protein analysis are still needed. In this issue, Mathivanan and colleagues (Proteomics 2015, 15, 2597-2601) report the development of FunRich software, an open-access software that facilitates the analysis of proteomics data, providing tools for functional enrichment and interaction network analysis of genes and proteins. FunRich is a reinterpretation of proteomic software, a standalone tool combining ease of use with customizable databases, free access, and graphical representations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The Birth, Death, and Resurrection of an SPI Project
NASA Astrophysics Data System (ADS)
Carlsson, Sven; Schönström, Mikael
Commentators on contemporary themes of strategic management and firm competitiveness stress that a firm's competitive advantage flows from its unique knowledge and how it manages knowledge, and for many firms the ability to create, share, exchange, and use knowledge has a major impact on their competitiveness (Nonaka & Teece 2001). In software development, knowledge management (KM) plays an increasingly important role. It has been argued that the KM field is an important source of new perspectives on the software development process (Iivari 2000). Several Software Process Improvement (SPI) approaches stress the importance of managing knowledge and experiences as a way of improving software processes (Ahern et al. 2001). Another SPI trend is the use of ideas from process management, as in the Capability Maturity Model (CMM). Unfortunately, little research on the effects of the use of process management ideas in SPI exists. Given the influx of process management ideas into SPI, the impact of these ideas should be addressed.
A Reference Model for Software and System Inspections. White Paper
NASA Technical Reports Server (NTRS)
He, Lulu; Shull, Forrest
2009-01-01
Software Quality Assurance (SQA) is an important component of the software development process. SQA processes provide assurance that the software products and processes in the project life cycle conform to their specified requirements by planning, enacting, and performing a set of activities to provide adequate confidence that quality is being built into the software. Typical techniques include: (1) testing, (2) simulation, (3) model checking, (4) symbolic execution, (5) management reviews, (6) technical reviews, (7) inspections, (8) walk-throughs, (9) audits, (10) analysis (complexity analysis, control flow analysis, algorithmic analysis), and (11) formal methods. Our work over the last few years has resulted in substantial knowledge about SQA techniques, especially in the areas of technical reviews and inspections. But can we apply the same QA techniques to the system development process? If yes, what kind of tailoring do we need before applying them in the system engineering context? If not, what types of QA techniques are actually used at the system level? And is there any room for improvement? After a brief examination of the system engineering literature (especially focused on NASA and DoD guidance) we found that: (1) the system and software development processes interact with each other at different phases through the development life cycle; (2) reviews are emphasized in both system and software development (Fig. 1.3), and for some reviews (e.g., SRR, PDR, CDR) there are both system versions and software versions; (3) analysis techniques are emphasized (e.g., Fault Tree Analysis, Preliminary Hazard Analysis) and some details are given about how to apply them; and (4) reviews are expected to use the outputs of the analysis techniques; in other words, these particular analyses are usually conducted in preparation for (before) reviews. The goal of our work is to explore the interaction between the Quality Assurance (QA) techniques at the system level and the software level.
Standardizing Activation Analysis: New Software for Photon Activation Analysis
NASA Astrophysics Data System (ADS)
Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.
2011-06-01
Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. For the purpose of saving time and manpower and minimizing error, a computer program was designed, built and implemented using SQL, Access 2007 and ASP.NET technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it possible to use it anywhere the internet is accessible. By switching the underlying nuclide library and the related formulas, the new software can be easily expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementation of this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.
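Concentration calculations in comparator-based activation analysis typically follow the relative method sketched below; this is the generic textbook form, as the abstract does not spell out the program's exact equations:

```latex
c_{\mathrm{sam}} = c_{\mathrm{std}}\,
\frac{A_{\mathrm{sam}} / \left(m_{\mathrm{sam}}\, f_{\mathrm{sam}}\right)}
     {A_{\mathrm{std}} / \left(m_{\mathrm{std}}\, f_{\mathrm{std}}\right)},
\qquad
f = e^{-\lambda t_d}\,\frac{1 - e^{-\lambda t_c}}{\lambda t_c}
```

where A is the net peak area of the element's activation line, m the sample or standard mass, t_d the decay time, t_c the counting time, and λ the decay constant of the activation product; differences in irradiation flux between sample and standard are normally corrected with a flux monitor. Uncertainties propagate from the peak areas, masses, and timing factors.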
Second Generation Product Line Engineering Takes Hold in the DoD
2014-01-01
Feature-Oriented Domain Analysis (FODA) Feasibility Study” (CMU/SEI-90-TR-021, ADA235785). Pittsburgh, PA: Software Engineering Institute ... software product line engineering and software architecture documentation and analysis. Clements is co-author of three practitioner-oriented books about
2008-09-01
the use of compendium software to facilitate targeting problem understanding, and the network analysis tool Palantir as an efficient and tailored semi-automated means to ... OBJECTIVES USING COMPENDIUM SOFTWARE (p. 63); E. HOT TARGET PRIORITIZATION AND DEVELOPMENT USING PALANTIR SOFTWARE (p. 69)
Software Defined Network Monitoring Scheme Using Spectral Graph Theory and Phantom Nodes
2014-09-01
networks is the emergence of software-defined networking (SDN) [1]. SDN has existed for the ... Chapter III for network monitoring. A. SOFTWARE DEFINED NETWORKS: SDNs provide a new and innovative method to simplify network hardware by logically ... and R. Giladi, “Performance analysis of software-defined networking (SDN),” in Proc. of IEEE 21st International Symposium on Modeling, Analysis
ASPASIA: A toolkit for evaluating the effects of biological interventions on SBML model behaviour.
Evans, Stephanie; Alden, Kieran; Cucurull-Sanchez, Lourdes; Larminie, Christopher; Coles, Mark C; Kullberg, Marika C; Timmis, Jon
2017-02-01
A calibrated computational model reflects behaviours that are expected or observed in a complex system, providing a baseline upon which sensitivity analysis techniques can be used to analyse pathways that may impact model responses. However, calibration of a model where a behaviour depends on an intervention introduced after a defined time point is difficult, as model responses may be dependent on the conditions at the time the intervention is applied. We present ASPASIA (Automated Simulation Parameter Alteration and SensItivity Analysis), a cross-platform, open-source Java toolkit that addresses a key deficiency in software tools for understanding the impact an intervention has on system behaviour for models specified in Systems Biology Markup Language (SBML). ASPASIA can generate and modify models using SBML solver output as an initial parameter set, allowing interventions to be applied once a steady state has been reached. Additionally, multiple SBML models can be generated where a subset of parameter values are perturbed using local and global sensitivity analysis techniques, revealing the model's sensitivity to the intervention. To illustrate the capabilities of ASPASIA, we demonstrate how this tool has generated novel hypotheses regarding the mechanisms by which Th17-cell plasticity may be controlled in vivo. By using ASPASIA in conjunction with an SBML model of Th17-cell polarisation, we predict that promotion of the Th1-associated transcription factor T-bet, rather than inhibition of the Th17-associated transcription factor RORγt, is sufficient to drive switching of Th17 cells towards an IFN-γ-producing phenotype. Our approach can be applied to all SBML-encoded models to predict the effect that intervention strategies have on system behaviour. ASPASIA, released under the Artistic License (2.0), can be downloaded from http://www.york.ac.uk/ycil/software.
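ASPASIA itself is a Java toolkit; as a minimal illustration of the parameter-perturbation step it automates, here is a sketch using python-libsbml (the file names and parameter ID are placeholders, and ASPASIA additionally seeds models with solver output so interventions apply at steady state):

```python
import libsbml

def perturb_parameter(in_file, out_file, param_id, scale):
    """Read an SBML model, scale one global parameter, write a new variant."""
    doc = libsbml.readSBML(in_file)
    if doc.getNumErrors() > 0:
        raise RuntimeError("SBML read error: " + doc.getError(0).getMessage())
    model = doc.getModel()
    param = model.getParameter(param_id)
    if param is None:
        raise KeyError(f"no global parameter with id {param_id!r}")
    param.setValue(param.getValue() * scale)   # apply the perturbation
    libsbml.writeSBMLToFile(doc, out_file)

# Generate variants at 50%, 100%, and 150% of baseline for one parameter.
for i, s in enumerate((0.5, 1.0, 1.5)):
    perturb_parameter("th17_model.xml", f"th17_variant_{i}.xml", "k_tbet", s)
```

Running a solver over such a family of variants and comparing responses is the essence of the local sensitivity analysis the toolkit performs at scale.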
Development of new vibration energy flow analysis software and its applications to vehicle systems
NASA Astrophysics Data System (ADS)
Kim, D.-J.; Hong, S.-Y.; Park, Y.-H.
2005-09-01
Energy flow analysis (EFA) offers very promising results in predicting the noise and vibration responses of system structures in medium-to-high frequency ranges. We have developed energy flow finite element method (EFFEM)-based software, EFADSC++ R4, for vibration analysis. The software can analyze system structures composed of beam, plate, spring-damper, rigid body and many other elements, and has many useful analysis functions. For convenient use of the software, its main functions are modularized into a translator, a model-converter, and a solver. The translator module makes it possible to use a finite element (FE) model for the vibration analysis. The model-converter module changes the FE model into an energy flow finite element (EFFE) model, generates joint elements to cover the vibrational attenuation in complex structures composed of various elements, and can solve the joint element equations very quickly by using the wave transmission approach. The solver module supports various direct and iterative solvers for multi-DOF structures. Predictions of vibration for real vehicles using the developed software were performed successfully.
The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.
Zamawe, F C
2015-03-01
For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) are increasingly being developed. Although the CAQDAS has been there for decades, very few qualitative health researchers report using it. This may be due to the difficulties that one has to go through to master the software and the misconceptions that are associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address the misconceptions associated with CAQDAS. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS (NVivo) in order to provide evidence-based implications of using the software. The key message is that unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, which the researcher must always remain in control of. In other words, researchers must equally know that no software can analyse qualitative data. CAQDAS are basically data management packages, which support the researcher during analysis.
Yates, John R
2015-11-01
Advances in computer technology and software have driven developments in mass spectrometry over the last 50 years. Computers and software have been impactful in three areas: the automation of difficult calculations to aid interpretation, the collection of data and control of instruments, and data interpretation. As the power of computers has grown, so too has the utility and impact on mass spectrometers and their capabilities. This has been particularly evident in the use of tandem mass spectrometry data to search protein and nucleotide sequence databases to identify peptide and protein sequences. This capability has driven the development of many new approaches to study biological systems, including the use of "bottom-up shotgun proteomics" to directly analyze protein mixtures.
Description of the GMAO OSSE for Weather Analysis Software Package: Version 3
NASA Technical Reports Server (NTRS)
Koster, Randal D. (Editor); Errico, Ronald M.; Prive, Nikki C.; Carvalho, David; Sienkiewicz, Meta; El Akkraoui, Amal; Guo, Jing; Todling, Ricardo; McCarty, Will; Putman, William M.;
2017-01-01
The Global Modeling and Assimilation Office (GMAO) at the NASA Goddard Space Flight Center has developed software and products for conducting observing system simulation experiments (OSSEs) for weather analysis applications. Such applications include estimations of potential effects of new observing instruments or data assimilation techniques on improving weather analysis and forecasts. The GMAO software creates simulated observations from nature run (NR) data sets and adds simulated errors to those observations. The algorithms employed are much more sophisticated, adding a much greater degree of realism, compared with OSSE systems currently available elsewhere. The algorithms employed, software designs, and validation procedures are described in this document. Instructions for using the software are also provided.
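Schematically, an OSSE's simulated observations are constructed from the nature run plus synthetic error (a generic formulation, not the GMAO algorithms in detail):

```latex
y^{o} = H\!\left(x^{\mathrm{nr}}\right) + \varepsilon,
\qquad \varepsilon \sim \mathcal{N}\!\left(0,\,\mathbf{R}\right)
```

with x^nr the nature-run state, H the observation operator for the instrument being simulated, and ε the added synthetic error with covariance R. Much of the added realism the document describes lies in how ε is constructed (for example, correlated and situation-dependent components) rather than in the schematic form above.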
ERIC Educational Resources Information Center
Margerum-Leys, Jon; Kupperman, Jeff; Boyle-Heimann, Kristen
This paper presents perspectives on the use of data analysis software in the process of qualitative research. These perspectives were gained in the conduct of three qualitative research studies that differed in theoretical frames, areas of interests, and scope. Their common use of a particular data analysis software package allows the exploration…
ElectroMagnetoEncephalography Software: Overview and Integration with Other EEG/MEG Toolboxes
Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus
2011-01-01
EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section. PMID:21577273
Applications of Earth Observations for Fisheries Management: An analysis of socioeconomic benefits
NASA Astrophysics Data System (ADS)
Friedl, L.; Kiefer, D. A.; Turner, W.
2013-12-01
This paper will discuss the socioeconomic impacts of a project applying Earth observations and models to support management and conservation of tuna and other marine resources in the eastern Pacific Ocean. A project team created a software package that produces statistical analyses and dynamic maps of habitat for pelagic ocean biota. The tool integrates sea surface temperature and chlorophyll imagery from MODIS, ocean circulation models, and other data products. The project worked with the Inter-American Tropical Tuna Commission, which issues fishery management information, such as stock assessments, for the eastern Pacific region. The Commission uses the tool and broader habitat information to produce better estimates of stock and thus improve their ability to identify species that could be at risk of overfishing. The socioeconomic analysis quantified the relative value that Earth observations contributed to accurate stock size assessments through improvements in calculating population size. The analysis team calculated the first-order economic costs of a fishery collapse (or shutdown), and they calculated the benefits of improved estimates that reduce the uncertainty of stock size and thus reduce the risk of fishery collapse. The team estimated that the project reduced the probability of collapse of different fisheries, and the analysis generated net present values of risk mitigation. USC led the project with sponsorship from the NASA Earth Science Division's Applied Sciences Program, which conducted the socioeconomic impact analysis. The paper will discuss the project and focus primarily on the analytic methods, impact metrics, and the results of the socioeconomic benefits analysis.
ERIC Educational Resources Information Center
Borman, Stuart A.
1985-01-01
Discusses various aspects of scientific software, including evaluation and selection of commercial software products; program exchanges, catalogs, and other information sources; major data analysis packages; statistics and chemometrics software; and artificial intelligence. (JN)
ScaffoldSeq: Software for characterization of directed evolution populations.
Woldring, Daniel R; Holec, Patrick V; Hackel, Benjamin J
2016-07-01
ScaffoldSeq is software designed for the numerous applications, including directed evolution analysis, in which a user generates a population of DNA sequences encoding for partially diverse proteins with related functions and would like to characterize the single-site and pairwise amino acid frequencies across the population. A common scenario for enzyme maturation, antibody screening, and alternative scaffold engineering involves naïve and evolved populations that contain diversified regions, varying in both sequence and length, within a conserved framework. Analyzing the diversified regions of such populations is facilitated by high-throughput sequencing platforms; however, length variability within these regions (e.g., antibody CDRs) encumbers the alignment process. To overcome this challenge, the ScaffoldSeq algorithm takes advantage of conserved framework sequences to quickly identify diverse regions. Beyond this, unintended biases in sequence frequency are generated throughout the experimental workflow required to evolve and isolate clones of interest prior to DNA sequencing. ScaffoldSeq software uniquely handles this issue by providing tools to quantify and remove background sequences, cluster similar protein families, and dampen the impact of dominant clones. The software produces graphical and tabular summaries for each region of interest, allowing users to evaluate diversity in a site-specific manner as well as identify epistatic pairwise interactions. The code and detailed information are freely available at http://research.cems.umn.edu/hackel. Proteins 2016; 84:869-874. © 2016 Wiley Periodicals, Inc.
Custom software development for use in a clinical laboratory
Sinard, John H.; Gershkovich, Peter
2012-01-01
In-house software development for use in a clinical laboratory is a controversial issue. Many of the objections raised are based on outdated software development practices, an exaggeration of the risks involved, and an underestimation of the benefits that can be realized. Buy versus build analyses typically do not consider total costs of ownership, and unfortunately decisions are often made by people who are not directly affected by the workflow obstacles or benefits that result from those decisions. We have been developing custom software for clinical use for over a decade, and this article presents our perspective on this practice. A complete analysis of the decision to develop or purchase must ultimately examine how the end result will mesh with the departmental workflow, and custom-developed solutions typically can have the greater positive impact on efficiency and productivity, substantially altering the decision balance sheet. Involving the end-users in preparation of the functional specifications is crucial to the success of the process. A large development team is not needed, and even a single programmer can develop significant solutions. Many of the risks associated with custom development can be mitigated by a well-structured development process, use of open-source tools, and embracing an agile development philosophy. In-house solutions have the significant advantage of being adaptable to changing departmental needs, contributing to efficient and higher quality patient care. PMID:23372985
Crepaldi, Nathalia Yukie; de Lima, Inacia Bezerra; Vicentine, Fernanda Bergamini; Rodrigues, Lídia Maria Lourençon; Sanches, Tiago Lara Michelin; Ruffino-Netto, Antonio; Alves, Domingos; Rijo, Rui Pedro Charters Lopes
2018-05-08
Assessments of health information systems consider different aspects of the system itself: they focus on the professionals who will use the software, on its usability, on software engineering metrics, or on financial and managerial issues. The existing approaches are resource-intensive, disconnected, and not standardized. As software becomes more critical to health organizations and to patients, coming to be used much like a medical device or a medicine, there is an urgent need to identify tools and methods that can be applied during the development process. The present work is one step of a broader study to identify standardized protocols for evaluating health information systems the way medicines and medical devices are evaluated by clinical trials. The goal of the present work was to evaluate the effect of the introduction of an information system for monitoring tuberculosis treatment (SISTB) in a Brazilian municipality from the patients' perspective. The Patient Satisfaction Questionnaire and the Hospital Consumer Assessment of Healthcare Providers and Systems were answered by the patients before and after the SISTB introduction, for comparison. Patients from an outpatient clinic where the SISTB was not implemented formed the control group. Descriptive statistics and mixed-effects models were used for data analysis. Eighty-eight interviews were conducted in the study. The questionnaires' results showed better averages after the system introduction, but the differences were not statistically significant; therefore, it was not possible to associate system implementation with improved patient satisfaction. HIS evaluation needs to be complete: technical and managerial evaluation, safety, the impact on professionals, and the direct and/or indirect impact on patients are all important. Tools and methods that can evaluate software in its entirety, from the beginning of the development cycle, on a normalized scale, are needed.
Manned Mars Mission program concepts
NASA Technical Reports Server (NTRS)
Hamilton, E. C.; Johnson, P.; Pearson, J.; Tucker, W.
1988-01-01
This paper describes the SRS Manned Mars Mission and Program Analysis study designed to support a manned expedition to Mars contemplated by NASA for the purposes of initiating human exploration and eventual habitation of the planet. The capabilities of the interactive software package presently being developed by SRS for mission/program analysis are described, and it is shown that the interactive package can be used to investigate the impact of various mission concepts on the sensitivity of mass required in LEO, schedules, relative costs, and risk. The results to date indicate the need for an earth-to-orbit transportation system much larger than the present STS, reliable long-life support systems, and either advanced propulsion or aerobraking technology.
Software selection based on analysis and forecasting methods, practised in 1C
NASA Astrophysics Data System (ADS)
Vazhdaev, A. N.; Chernysheva, T. Y.; Lisacheva, E. I.
2015-09-01
The research focuses on the built-in mechanisms of the “1C: Enterprise 8” platform for data analysis and forecasting. It is important to evaluate and select proper software to develop effective strategies for customer relationship management in terms of sales, as well as for implementation and further maintenance of software. The research data allow creating new forecast models to schedule further software distribution.
Software for Real-Time Analysis of Subsonic Test Shot Accuracy
2014-03-01
...used the C++ programming language, the Open Source Computer Vision (OpenCV®) software library, and the Microsoft Windows® Application Programming... video for comparison through OpenCV image analysis tools. Based on the comparison, the software then computed the coordinates of each shot relative to... DWB researchers wanted to use the Open Source Computer Vision (OpenCV) software library for capturing and analyzing frames of video. OpenCV contains...
Software ion scan functions in analysis of glycomic and lipidomic MS/MS datasets.
Haramija, Marko
2018-03-01
Hardware ion scan functions unique to tandem mass spectrometry (MS/MS) mode of data acquisition, such as precursor ion scan (PIS) and neutral loss scan (NLS), are important for selective extraction of key structural data from complex MS/MS spectra. However, their software counterparts, software ion scan (SIS) functions, are still not regularly available. Software ion scan functions can be easily coded for additional functionalities, such as software multiple precursor ion scan, software no ion scan, and software variable ion scan functions. These are often necessary, since they allow more efficient analysis of complex MS/MS datasets, often encountered in glycomics and lipidomics. Software ion scan functions can be easily coded by using modern script languages and can be independent of instrument manufacturer. Here we demonstrate the utility of SIS functions on a medium-size glycomic MS/MS dataset. Knowledge of sample properties, as well as of diagnostic and conditional diagnostic ions crucial for data analysis, was needed. Based on the tables constructed with the output data from the SIS functions performed, a detailed analysis of a complex MS/MS glycomic dataset could be carried out in a quick, accurate, and efficient manner. Glycomic research is progressing slowly, and with respect to the MS experiments, one of the key obstacles for moving forward is the lack of appropriate bioinformatic tools necessary for fast analysis of glycomic MS/MS datasets. Adding novel SIS functionalities to the glycomic MS/MS toolbox has a potential to significantly speed up the glycomic data analysis process. Similar tools are useful for analysis of lipidomic MS/MS datasets as well, as will be discussed briefly. Copyright © 2017 John Wiley & Sons, Ltd.
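As a rough illustration of the idea described above, a software neutral loss scan is essentially a filter over already-acquired MS/MS spectra. The Python sketch below assumes a simplified, hypothetical data structure (precursor m/z paired with a fragment list); the tolerance and the 162.05 Da anhydrohexose loss are illustrative values, not parameters from the paper.

    def neutral_loss_scan(spectra, loss_mass, tol=0.02):
        """Return spectra containing a fragment at (precursor m/z - loss_mass)."""
        hits = []
        for precursor_mz, fragments in spectra:
            target = precursor_mz - loss_mass
            if any(abs(mz - target) <= tol for mz in fragments):
                hits.append((precursor_mz, fragments))
        return hits

    # Example: flag spectra showing a 162.05 Da neutral loss (anhydrohexose,
    # a loss commonly watched for in glycomics).
    spectra = [(933.40, [771.35, 569.20]), (812.30, [650.40, 495.10])]
    print(neutral_loss_scan(spectra, 162.05))

A software precursor ion scan follows the same pattern with the test inverted: keep spectra whose fragment lists contain the diagnostic ion itself rather than a fixed offset from the precursor.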
A User’s Guide for the Software Technology Economic Impact Model
1991-10-01
IDA Document D-971 (AD-A248 023): A User's Guide for the Software Technology Economic Impact Model. IDA documents are reviewed by the senior individuals responsible for the project and others selected by IDA to ensure their high quality and relevance to the problems studied, and they are released by the President of IDA.
NASA Astrophysics Data System (ADS)
Blanco, Matthew; Cruz, Raul M.; Izdebski, Krzysztof; Yan, Yuling
2017-02-01
Amyloidosis is an unknown pathogenic process in which abnormally folded proteins are deposited in the extracellular space as macroscopic aggregates. Laryngeal deposits of these proteins are extremely rare, but primarily cause dysphonia in patients. High Speed Digital Phonoscopy (HSDP) was used to capture the kinematics of the vocal folds in a patient with laryngeal amyloidosis. Acoustic data were also recorded, and both HSDP and acoustics were processed using custom Vocalizer® software to help elucidate the physiological impact of amyloids in the larynx, especially with regard to effects on the voice.
1989-05-01
Rao, Varsha P.; Roberts, Donna. May 1989. Approved for public release; distribution unlimited. Prepared under contract DAKF 15-87-0-0144 (Subcontract Sub-Hi 88-12, D.O. No. 88-007) for the Research and Studies Division, Program Analysis and Evaluation Directorate, U.S. Army.
New software for statistical analysis of Cambridge Structural Database data
Sykes, Richard A.; McCabe, Patrick; Allen, Frank H.; Battle, Gary M.; Bruno, Ian J.; Wood, Peter A.
2011-01-01
A collection of new software tools is presented for the analysis of geometrical, chemical and crystallographic data from the Cambridge Structural Database (CSD). This software supersedes the program Vista. The new functionality is integrated into the program Mercury in order to provide statistical, charting and plotting options alongside three-dimensional structural visualization and analysis. The integration also permits immediate access to other information about specific CSD entries through the Mercury framework, a common requirement in CSD data analyses. In addition, the new software includes a range of more advanced features focused towards structural analysis such as principal components analysis, cone-angle correction in hydrogen-bond analyses and the ability to deal with topological symmetry that may be exhibited in molecular search fragments. PMID:22477784
Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.
Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko
2017-11-01
To develop analysis software for cultured human corneal endothelial cells (HCECs). Software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinctly outlined after phosphate-buffered saline treatment and were recognized by cell analysis software. The cell density value provided by software was similar to that obtained using manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables subjective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.
2012-01-01
Background Health care planning for pandemic influenza is a challenging task which requires predictive models by which the impact of different response strategies can be evaluated. However, current preparedness plans and simulation exercises, as well as freely available simulation models previously made for policy makers, do not explicitly address the availability of health care resources or determine the impact of shortages on public health. Nevertheless, the feasibility of health systems to implement response measures or interventions described in plans and trained in exercises depends on the available resource capacity. As part of the AsiaFluCap project, we developed a comprehensive and flexible resource modelling tool to support public health officials in understanding and preparing for surges in resource demand during future pandemics. Results The AsiaFluCap Simulator is a combination of a resource model containing 28 health care resources and an epidemiological model. The tool was built in MS Excel© and contains a user-friendly interface which allows users to select mild or severe pandemic scenarios, change resource parameters and run simulations for one or multiple regions. Besides epidemiological estimations, the simulator provides indications on resource gaps or surpluses, and the impact of shortages on public health for each selected region. It allows for a comparative analysis of the effects of resource availability and consequences of different strategies of resource use, which can provide guidance on resource prioritising and/or mobilisation. Simulation results are displayed in various tables and graphs, and can also be easily exported to GIS software to create maps for geographical analysis of the distribution of resources. Conclusions The AsiaFluCap Simulator is freely available software (http://www.cdprg.org) which can be used by policy makers, policy advisors, donors and other stakeholders involved in preparedness to provide evidence-based and illustrative information on health care resource capacities during future pandemics. The tool can inform both preparedness plans and simulation exercises and can help increase the general understanding of dynamics in resource capacities during a pandemic. The combination of a mathematical model with multiple resources and the linkage to GIS for creating maps makes the tool unique compared to other available software. PMID:23061807
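As a rough sketch of the resource-gap logic such a simulator reports, the Python fragment below compares epidemic-driven peak demand for each resource against a region's capacity. All resource names, per-case needs, and case numbers are illustrative placeholders, not values from the AsiaFluCap model.

    capacity = {"ICU beds": 120, "ventilators": 80, "antiviral courses": 50_000}
    per_case_need = {"ICU beds": 0.002, "ventilators": 0.001, "antiviral courses": 0.8}

    def resource_gaps(weekly_cases, capacity, per_case_need):
        gaps = {}
        for resource, cap in capacity.items():
            peak_demand = max(cases * per_case_need[resource] for cases in weekly_cases)
            gaps[resource] = peak_demand - cap  # positive value = shortage at the peak
        return gaps

    # Weekly case counts from an assumed epidemic curve for one region
    print(resource_gaps([1_000, 20_000, 60_000, 15_000], capacity, per_case_need))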
Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design
NASA Technical Reports Server (NTRS)
Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.
2003-01-01
A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.
Using Combined SFTA and SFMECA Techniques for Space Critical Software
NASA Astrophysics Data System (ADS)
Nicodemos, F. G.; Lahoz, C. H. N.; Abdala, M. A. D.; Saotome, O.
2012-01-01
This work addresses the combined Software Fault Tree Analysis (SFTA) and Software Failure Modes, Effects and Criticality Analysis (SFMECA) techniques applied to space-critical software of satellite launch vehicles. The combined approach is under research as part of the Verification and Validation (V&V) efforts to increase software dependability, and for future application in other projects under development at Instituto de Aeronáutica e Espaço (IAE). The applicability of the approach was assessed on a system software specification and applied to a case study based on the Brazilian Satellite Launcher (VLS). The main goal is to identify possible failure causes and obtain compensating provisions that lead to the inclusion of new functional and non-functional system software requirements.
Software development predictors, error analysis, reliability models and software metric analysis
NASA Technical Reports Server (NTRS)
Basili, Victor
1983-01-01
The use of dynamic characteristics as predictors for software development was studied. It was found that there are some significant factors that could be useful as predictors. From a study on software errors and complexity, it was shown that meaningful results can be obtained which allow insight into software traits and the environment in which it is developed. Reliability models were studied. The research included the field of program testing because the validity of some reliability models depends on the answers to some unanswered questions about testing. In studying software metrics, data collected from seven software engineering laboratory (FORTRAN) projects were examined and three effort reporting accuracy checks were applied to demonstrate the need to validate a data base. Results are discussed.
CRAX/Cassandra Reliability Analysis Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, D.
1999-02-10
Over the past few years Sandia National Laboratories has been moving toward an increased dependence on model- or physics-based analyses as a means to assess the impact of long-term storage on the nuclear weapons stockpile. These deterministic models have also been used to evaluate replacements for aging systems, often involving commercial off-the-shelf components (COTS). In addition, the models have been used to assess the performance of replacement components manufactured via unique, small-lot production runs. In either case, the limited amount of available test data dictates that the only logical course of action to characterize the reliability of these components is to specifically consider the uncertainties in material properties, operating environment, etc. within the physics-based (deterministic) model. This not only provides the ability to statistically characterize the expected performance of the component or system, but also provides direction regarding the benefits of additional testing on specific components within the system. An effort was therefore initiated to evaluate the capabilities of existing probabilistic methods and, if required, to develop new analysis methods to support the inclusion of uncertainty in the classical design tools used by analysts and design engineers at Sandia. The primary result of this effort is the CMX (Cassandra Exoskeleton) reliability analysis software.
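A minimal sketch of the underlying idea, propagating uncertainty in material properties and operating environment through a deterministic model by Monte-Carlo sampling, is given below in Python. The placeholder model, distributions, and failure threshold are illustrative assumptions, not Sandia's CMX code.

    import numpy as np

    rng = np.random.default_rng(0)

    def deterministic_model(youngs_modulus, load):
        # placeholder physics-based model: deflection of a component under load
        return load / youngs_modulus

    E = rng.normal(200e9, 10e9, size=100_000)   # uncertain material property (Pa)
    load = rng.normal(1e6, 5e4, size=100_000)   # uncertain operating load (N)
    deflection = deterministic_model(E, load)

    limit = 6e-6  # hypothetical failure threshold
    print("estimated P(failure) ~", np.mean(deflection > limit))

Beyond a failure probability, the same sampled outputs can be correlated against each input to indicate which component's uncertainty most merits additional testing.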
An Architecture, System Engineering, and Acquisition Approach for Space System Software Resiliency
NASA Astrophysics Data System (ADS)
Phillips, Dewanne Marie
Software intensive space systems can harbor defects and vulnerabilities that may enable external adversaries or malicious insiders to disrupt or disable system functions, risking mission compromise or loss. Mitigating this risk demands a sustained focus on the security and resiliency of the system architecture including software, hardware, and other components. Robust software engineering practices contribute to the foundation of a resilient system so that the system "can take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time". Software resiliency must be a priority and addressed early in the life cycle development to contribute to a secure and dependable space system. Those who develop, implement, and operate software intensive space systems must determine the factors and systems engineering practices to address when investing in software resiliency. This dissertation offers methodical approaches for improving space system resiliency through software architecture design, system engineering, and increased software security, thereby reducing the risk of latent software defects and vulnerabilities. By providing greater attention to the early life cycle phases of development, we can alter the engineering process to help detect, eliminate, and avoid vulnerabilities before space systems are delivered. To achieve this objective, this dissertation identifies knowledge, techniques, and tools that engineers and managers can utilize to help them recognize how vulnerabilities are produced and discovered so that they can learn to circumvent them in future efforts. We conducted a systematic review of existing architectural practices, standards, security and coding practices, various threats, defects, and vulnerabilities that impact space systems from hundreds of relevant publications and interviews of subject matter experts. We expanded on the system-level body of knowledge for resiliency and identified a new software architecture framework and acquisition methodology to improve the resiliency of space systems from a software perspective, with an emphasis on the early phases of the systems engineering life cycle. This methodology involves seven steps: 1) Define technical resiliency requirements, 1a) Identify standards/policy for software resiliency, 2) Develop a request for proposal (RFP)/statement of work (SOW) for resilient space systems software, 3) Define software resiliency goals for space systems, 4) Establish software resiliency quality attributes, 5) Perform architectural tradeoffs and identify risks, 6) Conduct architecture assessments as part of the procurement process, and 7) Ascertain space system software architecture resiliency metrics. Data illustrate that software vulnerabilities can lead to opportunities for malicious cyber activities, which could degrade the space mission capability for the user community. Reducing the number of vulnerabilities by improving architecture and software system engineering practices can contribute to making space systems more resilient. Since cyber-attacks are enabled by shortfalls in software, robust software engineering practices and an architectural design are foundational to resiliency, which is a quality that allows the system to "take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time".
To achieve software resiliency for space systems, acquirers and suppliers must identify relevant factors and systems engineering practices to apply across the lifecycle, in software requirements analysis, architecture development, design, implementation, verification and validation, and maintenance phases.
Automated daily quality control analysis for mammography in a multi-unit imaging center.
Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli
2018-01-01
Background The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose To develop an automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
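As a hedged illustration of the discrete-wavelet/multiresolution approach mentioned above, the Python sketch below (using the PyWavelets package, an assumption; the paper does not name its implementation) decomposes a synthetic phantom image and reports detail-band energies, which rise when small structures such as fibers or microcalcifications are present. The actual ACR-phantom scoring is considerably more involved.

    import numpy as np
    import pywt

    image = np.zeros((128, 128))
    image[60:62, 40:80] = 1.0  # synthetic "fiber"
    image += np.random.default_rng(3).normal(0.0, 0.05, image.shape)

    # 3-level multiresolution decomposition; detail bands come back coarsest first
    coeffs = pywt.wavedec2(image, "db2", level=3)
    for band, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
        energy = float(np.sum(cH**2) + np.sum(cV**2) + np.sum(cD**2))
        print(f"detail band {band} (coarsest = 1) energy: {energy:.2f}")

Comparing such band energies against baseline values from accepted phantom images gives a simple, repeatable pass/fail signal for daily quality control.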
Flight Software for the LADEE Mission
NASA Technical Reports Server (NTRS)
Cannon, Howard N.
2015-01-01
The Lunar Atmosphere and Dust Environment Explorer (LADEE) spacecraft was launched on September 6, 2013, and completed its mission on April 17, 2014 with a directed impact to the lunar surface. Its primary goals were to examine the lunar atmosphere, measure lunar dust, and demonstrate high-rate laser communications. The LADEE mission was a resounding success, achieving all mission objectives, much of which can be attributed to careful planning and preparation. This paper discusses some of the highlights from the mission, and then discusses the techniques used for developing the onboard Flight Software. A large emphasis for the Flight Software was to develop it within tight schedule and cost constraints. To accomplish this, the Flight Software team leveraged heritage software, used model-based development techniques, and utilized an automated test infrastructure. This resulted in the software being delivered on time and within budget. The resulting software met all system requirements and had very few problems in flight.
Design and analysis of sustainable computer mouse using design for disassembly methodology
NASA Astrophysics Data System (ADS)
Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia
2017-12-01
This paper presents the design and analysis of a computer mouse using the Design for Disassembly (DFD) methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the DFD methodology. The main methodology proceeded from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new design of computer mouse using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were evaluated to determine the environmental impact category. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest environmental impact while retaining a high maximum stress value.
Empirical Model for Predicting Rockfall Trajectory Direction
NASA Astrophysics Data System (ADS)
Asteriou, Pavlos; Tsiambaos, George
2016-03-01
A methodology for the experimental investigation of rockfall in three-dimensional space is presented in this paper, aiming to assist on-going research of the complexity of a block's response to impact during a rockfall. An extended laboratory investigation was conducted, consisting of 590 tests with cubical and spherical blocks made of an artificial material. The effects of shape, slope angle and the deviation of the post-impact trajectory are examined as a function of the pre-impact trajectory direction. Additionally, an empirical model is proposed that estimates the deviation of the post-impact trajectory as a function of the pre-impact trajectory with respect to the slope surface and the slope angle. This empirical model is validated by 192 small-scale field tests, which are also presented in this paper. Some important aspects of the three-dimensional nature of rockfall phenomena are highlighted that have been hitherto neglected. The 3D space data provided in this study are suitable for the calibration and verification of rockfall analysis software that has become increasingly popular in design practice.
Exploration of COTS Ultrasonic NDE Methods for ISS MMOD Impact Analysis
NASA Technical Reports Server (NTRS)
Violette, Daniel P.; Koshti, Ajay; Stanley, David
2012-01-01
The high orbital speed of the International Space Station (ISS) has created a concern about Micro-Meteorite and Orbital Debris (MMOD). The possibility exists that such an impact could cause significant damage to the ISS pressure wall, and possibly lead to a pressure leak. This paper explores the potential of using commercial off-the-shelf (COTS) Ultrasonic Non-Destructive Evaluation (NDE) techniques to inspect and analyze MMOD impact damage if such an event were to occur. Different types of intra-vehicular activity (IVA) ultrasonic NDE equipment were evaluated, including the Olympus Omniscan MX and the General Electric Phasor XS. The equipment was tested by inspecting various aluminum standards and impact damage test plates to determine the technological limitations of the equipment as well as the ease of use and availability of features. This study allowed for the design of scanning procedures to evaluate the extent of damage caused by an MMOD impact. Lastly, comparisons were drawn between the different pieces of COTS equipment, and a recommendation is made based on each device's capability.
Let's Scrum! Learning Digital Media Collaboratively
ERIC Educational Resources Information Center
Wilson, Daniel G.; Brown, Joshua; Burke, Adam A.
2013-01-01
The changing landscape of digital media and software development has immense impact on society, not only through consumer use of the products, but also in the way these technologies are developed. Modern software and media-development companies are using collaborative methods to develop innovative and useful products. Technology and engineering…
Measuring the Impact of Agile Coaching on Students' Performance
ERIC Educational Resources Information Center
Rodríguez, Guillermo; Soria, Álvaro; Campo, Marcelo
2016-01-01
Nowadays, considerable attention is paid to agile methods as a means to improve management of software development processes. The widespread use of such methods in professional contexts has encouraged their integration into software engineering training and undergraduate courses. Although several research efforts have focused on teaching Scrum…
Laptop Use, Interactive Science Software, and Science Learning among At-Risk Students
ERIC Educational Resources Information Center
Zheng, Binbin; Warschauer, Mark; Hwang, Jin Kyoung; Collins, Penelope
2014-01-01
This year-long, quasi-experimental study investigated the impact of the use of netbook computers and interactive science software on fifth-grade students' science learning processes, academic achievement, and interest in further science, technology, engineering, and mathematics (STEM) study within a linguistically diverse school district in…
CHEMFLO-2000: INTERACTIVE SOFTWARE FOR SIMULATING WATER AND CHEMICAL MOVEMENT IN UNSATURATED SOILS
The movement of water and chemicals into and through soils has a large impact upon our environment and the entire ecosystem. Understanding these processes is of great importance in managing, utilizing, and protecting our natural resources. This software was written to enhance our...
Integrated situational awareness for cyber attack detection, analysis, and mitigation
NASA Astrophysics Data System (ADS)
Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng
2012-06-01
Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.
Analysis of Software Systems for Specialized Computers,
...computer) with given computer hardware and software. The object of study is the software system of a computer, designed for solving a fixed complex of... The purpose of the analysis is to find parameters that characterize the system and its elements during operation, i.e., when servicing the given requirement flow. (Author)
Flexible Software Architecture for Visualization and Seismic Data Analysis
NASA Astrophysics Data System (ADS)
Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.
2007-12-01
Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: the tuning of commands and windows to the specific waveforms and hot-key combinations so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities, such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can run the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
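The plug-in principle described above can be sketched with a simple registry: analysis routines announce themselves to the host, and the host applies whatever is registered without modification to its own code. The example below is a minimal Python illustration (the actual Geotool port targets Microsoft .NET 2.0); all names are hypothetical.

    PLUGINS = {}

    def plugin(name):
        """Decorator that registers a waveform-processing routine with the host."""
        def register(func):
            PLUGINS[name] = func
            return func
        return register

    @plugin("demean")
    def demean(waveform):
        mean = sum(waveform) / len(waveform)
        return [x - mean for x in waveform]

    @plugin("rectify")
    def rectify(waveform):
        return [abs(x) for x in waveform]

    # The host iterates over whatever plug-ins happen to be registered
    waveform = [1.0, -2.0, 3.0, -4.0]
    for name, process in PLUGINS.items():
        print(name, process(waveform))

New modules, including experimental ones, then ship as separate files that only need the one-line registration, which is what keeps the platform's fundamental structure unchanged.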
Influence analysis of Github repositories.
Hu, Yan; Zhang, Jun; Bai, Xiaomei; Yu, Shuo; Yang, Zhuo
2016-01-01
With the support of cloud computing techniques, social coding platforms have changed the style of software development. Github is now the most popular social coding platform and project hosting service. Software developers of various levels keep entering Github and use it to host their public and private software projects. The large numbers of software developers and software repositories on Github are posing new challenges to the world of software engineering. This paper tackles one of the important problems: analyzing the importance and influence of Github repositories. We propose a HITS-based influence analysis on graphs that represent the star relationships between Github users and repositories. A weighted version of HITS is applied to the overall star graph and generates a different set of top influential repositories than the standard version of the HITS algorithm. We also conduct the influence analysis on per-month star graphs and study the monthly influence rankings of top repositories.
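A minimal sketch of (weighted) HITS on a bipartite user-to-repository star graph is shown below; the weight matrix and the normalization are illustrative choices, not necessarily those of the paper. Users act as hubs, repositories as authorities.

    import numpy as np

    # W[i, j] = weight of user i starring repo j (1.0 = plain star; other
    # values could encode user activity or star recency)
    W = np.array([[1.0, 2.0, 0.0],
                  [1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0]])

    hubs = np.ones(W.shape[0])
    for _ in range(50):                        # power iteration until convergence
        authorities = W.T @ hubs               # repo influence from its stargazers
        authorities /= np.linalg.norm(authorities)
        hubs = W @ authorities                 # user score from the repos they star
        hubs /= np.linalg.norm(hubs)

    print("repository influence:", authorities)

Running the same iteration on one star graph per month, as the paper does, yields a time series of influence rankings rather than a single snapshot.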
Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J
2014-02-01
A Visual Basic simulation software (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly and menu-driven software is based on the dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results corroborate very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization and exhibits performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.
The Flight Optimization System Weights Estimation Method
NASA Technical Reports Server (NTRS)
Wells, Douglas P.; Horvath, Bryce L.; McCullers, Linwood A.
2017-01-01
FLOPS has been the primary aircraft synthesis software used by the Aeronautics Systems Analysis Branch at NASA Langley Research Center. It was created for rapid conceptual aircraft design and advanced technology impact assessments. FLOPS is a single computer program that includes weights estimation, aerodynamics estimation, engine cycle analysis, propulsion data scaling and interpolation, detailed mission performance analysis, takeoff and landing performance analysis, noise footprint estimation, and cost analysis. It is well known as a baseline and common denominator for aircraft design studies. FLOPS is capable of calibrating a model to known aircraft data, making it useful for new aircraft and modifications to existing aircraft. The weight estimation method in FLOPS is known to be of high fidelity for conventional tube with wing aircraft and a substantial amount of effort went into its development. This report serves as a comprehensive documentation of the FLOPS weight estimation method. The development process is presented with the weight estimation process.
A techno-economic assessment of grid connected photovoltaic system for hospital building in Malaysia
NASA Astrophysics Data System (ADS)
Mat Isa, Normazlina; Tan, Chee Wei; Yatim, AHM
2017-07-01
Conventionally, electricity in hospital buildings is supplied by the utility grid, which uses a fuel mix including coal and gas. With advances in renewable technology, many buildings are moving toward installing their own PV panels alongside the grid to exploit the advantages of renewable energy. This paper presents an analysis of a grid-connected photovoltaic (GCPV) system for a hospital building in Malaysia. The discussion emphasizes economic analysis based on the Levelized Cost of Energy (LCOE) and the Total Net Present Cost (TNPC) with respect to the annual interest rate. The analysis is performed using the Hybrid Optimization Model for Electric Renewables (HOMER) software, which provides optimization and sensitivity analysis results. The optimization results, followed by the sensitivity analysis, are discussed, and the impact of the grid-connected PV system is evaluated. In addition, the benefit of the Net Metering (NeM) mechanism is also discussed.
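For orientation, the two economic quantities HOMER reports can be sketched in a few lines of Python: TNPC discounts all lifetime costs to the present, and LCOE divides the annualized TNPC (via the capital recovery factor) by the energy served. All input figures below are illustrative assumptions, not values from the study.

    def npv(cashflows, rate):
        # cashflows[0] occurs at the end of year 1
        return sum(c / (1 + rate) ** t for t, c in enumerate(cashflows, start=1))

    capital = 250_000          # year-0 PV system cost
    annual_om = 4_000          # yearly operation & maintenance cost
    energy_kwh = 180_000       # energy served per year
    rate = 0.06                # annual interest (discount) rate
    years = 25                 # project lifetime

    tnpc = capital + npv([annual_om] * years, rate)
    # capital recovery factor annualizes the TNPC over the project lifetime
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    lcoe = tnpc * crf / energy_kwh
    print(f"TNPC = {tnpc:,.0f}, LCOE = {lcoe:.3f} per kWh")

Sweeping the interest rate through this calculation reproduces the kind of sensitivity analysis the paper discusses: a higher rate shrinks the discounted O&M stream but raises the annualization factor, shifting the LCOE.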
Investigation on low velocity impact resistance of SMA composite material
NASA Astrophysics Data System (ADS)
Hu, Dianyin; Zhang, Long; Wang, Rongqiao; Zhang, Xiaoyong
2016-04-01
A method to improve the low velocity impact resistance of an aeroengine composite casing using the shape memory alloy's properties of shape memory (SM) and super-elasticity (SE) is proposed in this study. Firstly, a numerical model of an SMA-reinforced composite laminate under low velocity impact load, with an impact velocity of 10 m/s, is established based on its constitutive model implemented by the VUMAT subroutine of the commercial software ABAQUS. Secondly, the responses of the SMA composite laminate, including stress and deflection distributions, were obtained through transient analysis under low velocity impact load. Numerical results show that both peak stress and deflection values of the SMA composite laminate are less than those without SMA, which proves that embedding SMA into the composite structure can effectively improve the low velocity impact performance of the composite structure. Finally, the influence of SM and SE on low velocity impact resistance is quantitatively investigated. The values of peak stress and deflection of the SMA composite based on the SM property decrease by 18.28% and 9.43% respectively, compared with those without SMA, as opposed to 12.87% and 5.19% based on SE. In conclusion, the proposed model describes the impact damage of SMA composite structures, and utilizing the SM effect proved to be the more beneficial way to enhance impact resistance.
Experience report: Using formal methods for requirements analysis of critical spacecraft software
NASA Technical Reports Server (NTRS)
Lutz, Robyn R.; Ampo, Yoko
1994-01-01
Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.
Software design for analysis of multichannel intracardial and body surface electrocardiograms.
Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A
2002-11-01
Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.
Learning from examples - Generation and evaluation of decision trees for software resource analysis
NASA Technical Reports Server (NTRS)
Selby, Richard W.; Porter, Adam A.
1988-01-01
A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
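As a hedged, modern-day sketch of the technique, the Python fragment below (using scikit-learn rather than the automatic generator the paper evaluates) learns a small classification tree that flags modules with high development effort from module metrics; the feature names and data are synthetic placeholders.

    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(1)
    # synthetic module metrics: e.g. [size (lines/100), change count, fan-out]
    X = rng.integers(0, 100, size=(200, 3))
    # synthetic "high effort" label loosely driven by size and change count
    y = (0.5 * X[:, 0] + X[:, 1] + 20 * rng.random(200) > 80).astype(int)

    tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
    print(export_text(tree, feature_names=["size", "changes", "fan_out"]))

The printed tree makes the appeal of the approach visible: each root-to-leaf path is a human-readable rule ("if changes > 43 and size > 61, expect high effort") that analysts can inspect, unlike an opaque regression fit.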
What makes computational open source software libraries successful?
NASA Astrophysics Data System (ADS)
Bangerth, Wolfgang; Heister, Timo
2013-01-01
Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.
Song, Ruiguang; Hall, H Irene; Harrison, Kathleen McDavid; Sharpe, Tanya Telfair; Lin, Lillian S; Dean, Hazel D
2011-01-01
We developed a statistical tool that brings together standard, accessible, and well-understood analytic approaches and uses area-based information and other publicly available data to identify social determinants of health (SDH) that significantly affect the morbidity of a specific disease. We specified AIDS as the disease of interest and used data from the American Community Survey and the National HIV Surveillance System. Morbidity and socioeconomic variables in the two data systems were linked through geographic areas that can be identified in both systems. Correlation and partial correlation coefficients were used to measure the impact of socioeconomic factors on AIDS diagnosis rates in certain geographic areas. We developed an easily explained approach that can be used by a data analyst with access to publicly available datasets and standard statistical software to identify the impact of SDH. We found that the AIDS diagnosis rate was highly correlated with the distribution of race/ethnicity, population density, and marital status in an area. The impact of poverty, education level, and unemployment depended on other SDH variables. Area-based measures of socioeconomic variables can be used to identify risk factors associated with a disease of interest. When correlation analysis is used to identify risk factors, potential confounding from other variables must be taken into account.
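A minimal Python sketch of the analytic style described, a plain correlation plus a partial correlation that controls for a third area-level variable by residualization, is given below on synthetic data; the variable names are illustrative, not the study's actual covariates.

    import numpy as np

    rng = np.random.default_rng(2)
    density = rng.random(500)                       # area population density (scaled)
    poverty = 0.4 * density + 0.6 * rng.random(500) # poverty partly tracks density
    rate = 2.0 * density + 1.0 * poverty + rng.normal(0.0, 0.3, 500)

    def corr(a, b):
        return np.corrcoef(a, b)[0, 1]

    def partial_corr(a, b, control):
        # residualize both variables on the control, then correlate the residuals
        ra = a - np.polyval(np.polyfit(control, a, 1), control)
        rb = b - np.polyval(np.polyfit(control, b, 1), control)
        return corr(ra, rb)

    print("corr(rate, poverty)              =", corr(rate, poverty))
    print("partial corr, density controlled =", partial_corr(rate, poverty, density))

The gap between the two coefficients illustrates the paper's point that the apparent impact of one social determinant can depend heavily on which other determinants are held fixed.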
78 FR 1162 - Cardiovascular Devices; Reclassification of External Cardiac Compressor
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-08
... safety and electromagnetic compatibility; For devices containing software, software verification... electromagnetic compatibility; For devices containing software, software verification, validation, and hazard... electrical components, appropriate analysis and testing must validate electrical safety and electromagnetic...
FASEA: A FPGA Acquisition System and Software Event Analysis for liquid scintillation counting
NASA Astrophysics Data System (ADS)
Steele, T.; Mo, L.; Bignell, L.; Smith, M.; Alexiev, D.
2009-10-01
The FASEA (FPGA based Acquisition and Software Event Analysis) system has been developed to replace the MAC3 for coincidence pulse processing. The system uses a National Instruments Virtex 5 FPGA card (PXI-7842R) for data acquisition and purpose-developed software for event analysis. Initial comparisons to the MAC3 unit, based on measurements of 89Sr and 3H, are included, confirming that the system is able to accurately emulate the behaviour of the MAC3 unit.
ERIC Educational Resources Information Center
Rudner, Lawrence M.; Glass Gene V.; Evartt, David L.; Emery, Patrick J.
This manual and the accompanying software are intended to provide a step-by-step guide to conducting a meta-analytic study along with references for further reading and free high-quality software, "Meta-Stat.""Meta-Stat" is a comprehensive package designed to help in the meta-analysis of research studies in the social and behavioral sciences.…
Development of an automated asbestos counting software based on fluorescence microscopy.
Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio
2015-01-01
An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations, the automated counts were less accurate, leading us to implement a correction mode for automated counts. While full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software can already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.
IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.
Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M
2016-04-01
Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
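IFDOTMETER itself is written in Java and its analysis parameters are not reproduced in the abstract; the sketch below only illustrates the described workflow (unsupervised batch analysis of an image folder with results stored in a spreadsheet) in Python, with a hypothetical folder name and a naive puncta count.

```python
# Batch-analysis-to-spreadsheet workflow sketched in Python; IFDOTMETER is
# a Java application and its actual puncta rules are not public here.
# The "images" folder, file pattern, and min_area are hypothetical.
import csv
from pathlib import Path

from skimage import io, filters, measure

def count_puncta(path, min_area=4):
    img = io.imread(path, as_gray=True)
    mask = img > filters.threshold_otsu(img)
    regions = measure.regionprops(measure.label(mask))
    return sum(1 for r in regions if r.area >= min_area)

with open("ifdot_results.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["image", "puncta"])
    for path in sorted(Path("images").glob("*.tif")):   # hypothetical folder
        writer.writerow([path.name, count_puncta(path)])
```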
FluxPyt: a Python-based free and open-source software for 13C-metabolic flux analyses.
Desai, Trunil S; Srivastava, Shireesh
2018-01-01
13C-Metabolic flux analysis (MFA) is a powerful approach to estimate intracellular reaction rates which could be used in strain analysis and design. Processing and analysis of labeling data for calculation of fluxes and associated statistics is an essential part of MFA. However, various software currently available for data analysis employ proprietary platforms and thus limit accessibility. We developed FluxPyt, a Python-based truly open-source software package for conducting stationary 13C-MFA data analysis. The software is based on the efficient elementary metabolite unit framework. The standard deviations in the calculated fluxes are estimated using Monte-Carlo analysis. FluxPyt also automatically creates flux maps based on a template for visualization of the MFA results. The flux distributions calculated by FluxPyt for two separate models, a small tricarboxylic acid cycle model and a larger Corynebacterium glutamicum model, were found to be in good agreement with those calculated by a previously published software. FluxPyt was tested in Microsoft™ Windows 7 and 10, as well as in Linux Mint 18.2. The availability of a free and open 13C-MFA software that works in various operating systems will enable more researchers to perform 13C-MFA and to further modify and develop the package.
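The abstract names the uncertainty method (Monte-Carlo analysis of the fitted fluxes) without detailing it, so here is a schematic version: refit the fluxes to noise-perturbed measurements and report the spread of the fits. A linear surrogate model stands in for the nonlinear EMU model purely to keep the sketch short; it is not FluxPyt code.

```python
# Monte-Carlo uncertainty propagation in the spirit the abstract describes:
# perturb measurements with noise, refit fluxes, take the std of the fits.
import numpy as np

rng = np.random.default_rng(42)
A = rng.random((8, 3))              # surrogate measurement model (8 MIDs, 3 fluxes)
v_true = np.array([1.0, 0.4, 0.6])  # "true" fluxes for the demo
y_meas = A @ v_true
sigma = 0.01                        # assumed measurement standard deviation

fits = []
for _ in range(1000):               # Monte-Carlo iterations
    y = y_meas + rng.normal(0.0, sigma, size=y_meas.shape)
    v_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
    fits.append(v_hat)

fits = np.array(fits)
print("flux estimates:", fits.mean(axis=0))
print("flux std devs: ", fits.std(axis=0))
```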
Padalino, Saverio; Sfondrini, Maria Francesca; Chenuil, Laura; Scudeller, Luigia; Gandini, Paola
2014-12-01
The aim of this study was to assess the feasibility of skeletal maturation analysis using the Cervical Vertebrae Maturation (CVM) method by means of dedicated software, developed in collaboration with Outside Format (Paullo-Milan), as compared with manual analysis. From a sample of patients aged 7-21 years, we gathered 100 lateral cephalograms, 20 for each of the five CVM stages. For each cephalogram, we traced cervical vertebrae C2, C3 and C4 both by hand, using a lead pencil and an acetate sheet, and with the dedicated software. All tracings were made by an experienced operator (a dentofacial orthopedics resident) and by an inexperienced operator (a student in dental surgery), and each operator recorded the time needed for each tracing so that differences in tracing time could be assessed. Concordance between the manual analysis and the analysis performed with the dedicated software was 94% for the resident and 93% for the student; interobserver concordance was 99%. Hand-tracing was quicker than software-assisted tracing (by 28 seconds on average). The cervical vertebrae analysis software offers excellent clinical performance, even though the method takes longer than the manual technique. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
New software for 3D fracture network analysis and visualization
NASA Astrophysics Data System (ADS)
Song, J.; Noh, Y.; Choi, Y.; Um, J.; Hwang, S.
2013-12-01
This study presents new software to perform analysis and visualization of fracture network systems in 3D. The software modules for analysis and visualization, namely BOUNDARY, DISK3D, FNTWK3D, CSECT and BDM, were developed using Microsoft Visual Basic.NET and the Visualization Toolkit (VTK) open-source library. Two case studies revealed that the modules handle, respectively, construction of the analysis domain, visualization of fracture geometry in 3D, calculation of equivalent pipes, production of cross-section maps, and management of borehole data. The developed software for analysis and visualization of 3D fractured rock masses can be used to tackle geomechanical problems related to the strength, deformability and hydraulic behavior of fractured rock masses.
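As a hint of what a DISK3D-like module does, the sketch below renders two disk-shaped fractures with the VTK Python bindings (the paper's modules use VB.NET with the same library); positions, dips, and radii are invented.

```python
# Minimal sketch of disk-based fracture rendering with VTK's Python bindings.
# Orientation values are invented; DISK3D's actual pipeline is more elaborate.
import vtk

renderer = vtk.vtkRenderer()
for x, y, z, dip, dip_dir, radius in [(0, 0, 0, 45, 90, 2.0),
                                      (1, 2, -1, 70, 180, 1.5)]:
    disk = vtk.vtkDiskSource()
    disk.SetInnerRadius(0.0)
    disk.SetOuterRadius(radius)
    disk.SetCircumferentialResolution(48)

    transform = vtk.vtkTransform()        # orient the disk by dip / dip direction
    transform.Translate(x, y, z)
    transform.RotateZ(-dip_dir)
    transform.RotateX(dip)

    tf = vtk.vtkTransformPolyDataFilter()
    tf.SetTransform(transform)
    tf.SetInputConnection(disk.GetOutputPort())

    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(tf.GetOutputPort())
    actor = vtk.vtkActor()
    actor.SetMapper(mapper)
    renderer.AddActor(actor)

window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
```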
Byrska-Bishop, Marta; Wallace, John; Frase, Alexander T; Ritchie, Marylyn D
2018-01-01
Motivation: BioBin is an automated bioinformatics tool for the multi-level biological binning of sequence variants. Herein, we present a significant update to BioBin which expands the software to facilitate a comprehensive rare variant analysis and incorporates novel features and analysis enhancements. Results: In BioBin 2.3, we extend our software tool by implementing statistical association testing, updating the binning algorithm, as well as incorporating novel analysis features, providing for a robust, highly customizable, and unified rare variant analysis tool. Availability and implementation: The BioBin software package is open source and freely available to users at http://www.ritchielab.com/software/biobin-download. Contact: mdritchie@geisinger.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28968757
NASA Astrophysics Data System (ADS)
Saroglou, Charalampos; Asteriou, Pavlos; Zekkos, Dimitrios; Tsiambaos, George; Clark, Marin; Manousakis, John
2018-01-01
We present field evidence and a kinematic study of a rock block mobilized in the Ponti area by a Mw = 6.5 earthquake near the island of Lefkada on 17 November 2015. A detailed survey was conducted using an unmanned aerial vehicle (UAV) with an ultrahigh definition (UHD) camera, which produced a high-resolution orthophoto and a digital terrain model (DTM). The sequence of impact marks from the rock trajectory on the ground surface was identified from the orthophoto and field verified. Earthquake characteristics were used to estimate the acceleration of the rock slope and the initial condition of the detached block. Using the impact points from the measured rockfall trajectory, an analytical reconstruction of the trajectory was undertaken, which led to insights on the coefficients of restitution (CORs). The measured trajectory was compared with modeled rockfall trajectories using recommended parameters. However, the actual trajectory could not be accurately predicted, revealing limitations of existing rockfall analysis software used in engineering practice.
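Given incoming and outgoing velocity vectors reconstructed from successive impact marks, the coefficients of restitution follow directly. A minimal sketch, with illustrative vectors rather than the Ponti survey data:

```python
# Normal/tangential coefficients of restitution recovered from one impact,
# given incoming and outgoing velocity vectors and the local surface normal.
# All vectors below are invented for illustration.
import numpy as np

def restitution(v_in, v_out, normal):
    n = normal / np.linalg.norm(normal)
    vn_in, vn_out = v_in @ n, v_out @ n                  # normal components
    vt_in, vt_out = v_in - vn_in * n, v_out - vn_out * n # tangential parts
    Rn = abs(vn_out) / abs(vn_in)                        # normal COR
    Rt = np.linalg.norm(vt_out) / np.linalg.norm(vt_in)  # tangential COR
    return Rn, Rt

v_in = np.array([6.0, 0.0, -9.0])    # m/s just before impact (assumed)
v_out = np.array([4.5, 0.0, 3.5])    # m/s just after impact (assumed)
normal = np.array([0.0, 0.0, 1.0])   # slope normal (assumed)
print("Rn = %.2f, Rt = %.2f" % restitution(v_in, v_out, normal))
```

In practice the velocity vectors themselves come from fitting ballistic arcs between consecutive impact points, which is the trajectory reconstruction step the abstract describes.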
Development problem analysis of correlation leak detector’s software
NASA Astrophysics Data System (ADS)
Faerman, V. A.; Avramchuk, V. S.; Marukyan, V. M.
2018-05-01
This article examines the practical application and structure of correlation leak detector software and analyzes the problem of its design. The first part of the paper shows why developing correlation leak detection equipment is expedient for improving the operating efficiency of public utility networks; the functional structure of correlation leak detectors is analyzed and the tasks of their software are defined. The second part examines several steps in developing the software package (requirements gathering, definition of the program structure, and creation of the software concept) in the context of experience gained with a hardware-software prototype of a correlation leak detector.
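The core computation such software performs is classic cross-correlation time-delay estimation: the delay at the correlation peak, together with the propagation speed and sensor spacing, fixes the leak position on the pipe. A self-contained sketch under assumed values, not the authors' package:

```python
# Leak location by cross-correlation: the leak noise reaches two sensors
# with a delay found at the peak of their cross-correlation. Sampling rate,
# propagation speed, spacing, and signals are all assumed for the demo.
import numpy as np
from scipy.signal import correlate, correlation_lags

FS = 10_000.0   # sampling rate, Hz (assumed)
V = 1200.0      # propagation speed of leak noise along the pipe, m/s (assumed)
L = 100.0       # distance between the two sensors, m (assumed)

rng = np.random.default_rng(7)
src = rng.normal(size=30_000)     # broadband leak noise at the source
D = 150                           # extra samples of travel time to sensor A
M = 20_000
sig_a = src[:M]                   # sensor A (farther from the leak) lags
sig_b = src[D:D + M]              # sensor B hears the same noise earlier

xcorr = correlate(sig_a, sig_b, mode="full")
lags = correlation_lags(M, M, mode="full")
tau = lags[np.argmax(xcorr)] / FS          # t_A - t_B, here ~ +15 ms
d_from_a = (L + V * tau) / 2               # leak distance from sensor A
print(f"delay {tau * 1e3:.1f} ms -> leak at {d_from_a:.1f} m from sensor A")
```

The position formula follows from t_A - t_B = (2·d_A - L) / V for a leak at distance d_A from sensor A.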
Bieri, Michael; d'Auvergne, Edward J; Gooley, Paul R
2011-06-01
Investigation of protein dynamics on the ps-ns and μs-ms timeframes provides detailed insight into the mechanisms of enzymes and the binding properties of proteins. Nuclear magnetic resonance (NMR) is an excellent tool for studying protein dynamics at atomic resolution. Analysis of relaxation data using model-free analysis can be a tedious and time-consuming process that requires good knowledge of scripting procedures. The software relaxGUI was developed for fast and simple model-free analysis and is fully integrated into the software package relax. It is written in Python and uses wxPython to build the graphical user interface (GUI) for maximum performance and multi-platform use. This software allows the analysis of NMR relaxation data with ease and the generation of publication-quality graphs as well as color-coded images of molecular structures. The interface is designed for simple data analysis and management. The software was tested and validated against the command-line version of relax.
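For context, the quantity a model-free analysis fits is the Lipari-Szabo spectral density; below is a compact sketch of the original two-timescale form, not code from relax or relaxGUI, with illustrative parameter values.

```python
# Lipari-Szabo model-free spectral density (original form):
#   J(w) = (2/5) [ S^2*tm / (1+(w*tm)^2) + (1-S^2)*t / (1+(w*t)^2) ],
#   with 1/t = 1/tm + 1/te.
import numpy as np

def j_modelfree(omega, s2, tau_m, tau_e):
    tau = 1.0 / (1.0 / tau_m + 1.0 / tau_e)
    return 0.4 * (s2 * tau_m / (1 + (omega * tau_m) ** 2)
                  + (1 - s2) * tau / (1 + (omega * tau) ** 2))

omega_N = 2 * np.pi * 60.8e6   # approx. 15N Larmor frequency at 14.1 T, rad/s
# Illustrative values: order parameter 0.85, 8 ns tumbling, 50 ps internal motion
print(j_modelfree(omega_N, s2=0.85, tau_m=8e-9, tau_e=50e-12))
```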
Grasso, Chiara; Trevisan, Morena; Fiano, Valentina; Tarallo, Valentina; De Marco, Laura; Sacerdote, Carlotta; Richiardi, Lorenzo; Merletti, Franco; Gillio-Tos, Anna
2016-01-01
Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited for many applications which aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. The commercially available pyrosequencing systems can harbor two different types of software, which allow analysis in AQ or CpG mode, respectively; both are widely employed for DNA methylation analysis. The aim of the study was to assess the performance of the two pyrosequencing software types for DNA methylation analysis at CpG sites. Although CpG mode was specifically created for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. As proof of equivalent performance of the two software types for this type of analysis is not available, the focus of this paper was to evaluate whether the two modes currently used for CpG methylation assessment by pyrosequencing give overlapping results. We compared the performance of the two software types in quantifying DNA methylation in the promoters of selected genes (GSTP1, MGMT, LINE-1) by testing two case series, which included DNA from paraffin-embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. We found discrepancies in the quality assignments of DNA methylation assays made by the two pyrosequencing software types. Compared to the software for analysis in AQ mode, less permissive criteria are applied by the Pyro Q-CpG software, which enables analysis in CpG mode. CpG mode warns operators about potentially unsatisfactory performance of the assay and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. The implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation results achievable by pyrosequencing.
Lou, Junyang; Obuchowski, Nancy A; Krishnaswamy, Amar; Popovic, Zoran; Flamm, Scott D; Kapadia, Samir R; Svensson, Lars G; Bolen, Michael A; Desai, Milind Y; Halliburton, Sandra S; Tuzcu, E Murat; Schoenhagen, Paul
2015-01-01
Preprocedural 3-dimensional CT imaging of the aortic annular plane plays a critical role for transcatheter aortic valve replacement (TAVR) planning; however, manual reconstructions are complex. Automated analysis software may improve reproducibility and agreement between readers but is incompletely validated. In 110 TAVR patients (mean age, 81 years; 37% female) undergoing preprocedural multidetector CT, automated reconstruction of the aortic annular plane and planimetry of the annulus was performed with a prototype of now commercially available software (syngo.CT Cardiac Function-Valve Pilot; Siemens Healthcare, Erlangen, Germany). Fully automated, semiautomated, and manual annulus measurements were compared. Intrareader and inter-reader agreement, intermodality agreement, and interchangeability were analyzed. Finally, the impact of these measurements on recommended valve size was evaluated. Semiautomated analysis required major correction in 5 patients (4.5%). In the remaining 95.5%, only minor correction was performed. Mean manual annulus area was significantly smaller than fully automated results (P < .001 for both readers) but similar to semiautomated measurements (5.0 vs 5.4 vs 4.9 cm², respectively). The frequency of concordant recommendations for valve size increased if manual analysis was replaced with the semiautomated method (60% agreement was improved to 82.4%; 95% confidence interval for the difference [69.1%-83.4%]). Semiautomated aortic annulus analysis, with minor correction by the user, provides reliable results in the context of TAVR annulus evaluation. Copyright © 2015 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
O'Kelly, F; Nason, G J; Manecksha, R P; Cascio, S; Quinn, F J; Leonard, M; Koyle, M A; Farhat, W; Leveridge, M J
2017-10-01
Social media (SoMe) comprises a number of internet-based applications that have the capability to disseminate multimodal media and allow for unprecedented inter-user connectivity. The role of Twitter has been studied in conferences and education; moreover, there is increasing evidence that patients are more likely to use social media for their own health education. The aim of this study was to assess the impact of social media platforms on the impact factor of both urological and paediatric journals that publish on paediatric urology, and to assess parental awareness of social media in paediatric urology. A filtered Journal of Citation Reports (JCR) search was performed for the period 2012-2016 for journals that published articles on paediatric urology. Journals were ranked according to impact factor, and each individual journal website was accessed to assess for the presence of social media. Parents in paediatric urology clinics and non-paediatric urology patients also filled out a questionnaire to assess awareness of and attitudes to social media. All statistical analysis was performed using Prism 6 software (Prism 6, GraphPad Software, California, USA). Overall, there were 50 urological journals and 39 paediatric journals, with mean impact factors of 2.303 and 1.766, respectively. There was an overall average increase in impact factor across all urological journals between 2012 and 2016. The presence of a Twitter feed was associated with a statistically significant rise in impact factor over the four years (P = 0.017). The cohort of parents was statistically more likely to have completed post-secondary education, to have and to access a social media profile, to use it for health education, and to use it to access journal/physician/hospital social media accounts. This study examined, for the first time, the role of social media in paediatric urology, and demonstrated that SoMe use is associated with a positive influence on impact factor, and also a parental appetite for it. Limitations included a non-externally validated questionnaire. There may also have been bias in larger journals that generate and maintain social media platforms such as Twitter, which may in turn influence impact factor. Social media use within paediatric urology was associated with a higher impact factor, which remained significant after 4 years of analysis. Parents were more likely to use a wide variety of social media to search for conditions and physicians/healthcare providers; therefore, journals and institutions need to embrace and endorse SoMe as a potential source of important clinical information. Copyright © 2017. Published by Elsevier Ltd.
NASA Technical Reports Server (NTRS)
Moser, D. E.; Suggs, R. M.; Kupferschmidt, L.; Feldman, J.
2015-01-01
A bright impact flash detected by the NASA Lunar Impact Monitoring Program in March 2013 brought into focus the importance of determining the impact flash location. A process for locating the impact flash, and presumably its associated crater, was developed using commercially available software tools. The process was successfully applied to the March 2013 impact flash and put into production on an additional 300 impact flashes. The goal today: provide a description of the geolocation technique developed.
Toward Baseline Software Anomalies in NASA Missions
NASA Technical Reports Server (NTRS)
Layman, Lucas; Zelkowitz, Marvin; Basili, Victor; Nikora, Allen P.
2012-01-01
In this fast abstract, we provide preliminary findings from an analysis of 14,500 spacecraft anomalies from unmanned NASA missions. We provide baselines for the distributions of software vs. non-software anomalies in spaceflight systems, the risk ratings of software anomalies, and the corrective actions associated with software anomalies.
Calculating point of origin of blood spatter using laser scanning technology.
Hakim, Nashad; Liscio, Eugene
2015-03-01
The point of origin of an impact pattern is important in establishing the chain of events in a bloodletting incident. In this study, the accuracy and reproducibility of point of origin estimation using the FARO Scene software with the FARO Focus3D laser scanner were determined. Five impact patterns were created for each of three combinations of distances from the floor (z) and the front wall (x). The fifteen spatters were created using a custom impact rig, scanned using the laser scanner, photographed using a DSLR camera, and processed using the Scene software. Overall results gave an SD = 3.49 cm (p < 0.0001) in the x-direction, SD = 1.14 cm (p = 0.9291) in the y-direction, and SD = 9.08 cm (p < 0.0115) in the z-direction. The technique performs within literature ranges of accepted accuracy and reproducibility and is comparable to results reported for other virtual stringing software. © 2015 American Academy of Forensic Sciences.
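Virtual stringing automates the textbook tangent method; below is a simplified sketch of that method (straight-line flight assumed, made-up stain measurements), not FARO Scene's algorithm.

```python
# Tangent method for point-of-origin height: impact angle from the stain's
# width/length ratio, height = distance-to-convergence * tan(angle).
# Stain measurements and distances below are invented for illustration.
import math

stains = [
    # (width_mm, length_mm, distance to 2D area of convergence, m)
    (2.0, 4.0, 0.80),
    (1.8, 4.5, 1.10),
    (2.2, 3.9, 0.95),
]

heights = []
for width, length, dist in stains:
    alpha = math.asin(width / length)       # impact angle of the droplet
    heights.append(dist * math.tan(alpha))  # "string" height above convergence

z = sum(heights) / len(heights)
print(f"estimated point-of-origin height: {z:.2f} m")
```

The straight-line assumption ignores droplet gravity and drag, which is one reason z-direction error (SD = 9.08 cm above) tends to exceed the in-plane errors.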
White, Gary C.; Hines, J.E.
2004-01-01
The reality is that the statistical methods used for analysis of data depend upon the availability of software. Analysis of marked animal data is no different than the rest of the statistical field. The methods used for analysis are those that are available in reliable software packages. Thus, the critical importance of having reliable, up-to-date software available to biologists is obvious. Statisticians have continued to develop more robust models, ever expanding the suite of potential analysis methods available. But without software to implement these newer methods, they will languish in the abstract, and not be applied to the problems deserving them.

In the Computers and Software Session, two new software packages are described, a comparison of implementations of methods for the estimation of nest survival is provided, and a more speculative paper about how the next generation of software might be structured is presented.

Rotella et al. (2004) compare nest survival estimation with different software packages: SAS logistic regression, SAS non-linear mixed models, and Program MARK. Nests are assumed to be visited at various, possibly infrequent, intervals. All of the approaches described compute nest survival with the same likelihood, and require that the age of the nest is known to account for nests that eventually hatch. However, each approach offers advantages and disadvantages, explored by Rotella et al. (2004).

Efford et al. (2004) present a new software package called DENSITY. The package computes population abundance and density from trapping arrays and other detection methods with a new and unique approach. DENSITY represents the first major addition to the analysis of trapping arrays in 20 years.

Barker & White (2004) discuss how existing software such as Program MARK requires that each new model's likelihood be programmed specifically for that model. They wishfully think that future software might allow the user to combine pieces of likelihood functions together to generate estimates. The idea is interesting, and maybe some bright young statistician can work out the specifics to implement the procedure.

Choquet et al. (2004) describe MSURGE, a software package that implements the multistate capture-recapture models. The unique feature of MSURGE is that the design matrix is constructed with an interpreted language called GEMACO. Because MSURGE is limited to just multistate models, the special requirements of these likelihoods can be provided for.

The software and methods presented in these papers give biologists and wildlife managers an expanding range of possibilities for data analysis. Although ease-of-use is generally getting better, it does not replace the need for understanding of the requirements and structure of the models being computed. The internet provides access to many free software packages as well as user discussion groups to share knowledge and ideas. (A starting point for wildlife-related applications is http://www.phidot.org.)
The Strategic WAste Minimization Initiative (SWAMI) Software, Version 2.0, is a tool that uses process analysis to identify waste minimization opportunities within an industrial setting. The software requires user-supplied information for process definition, as well as materia...
Integrating Model-Based Verification into Software Design Education
ERIC Educational Resources Information Center
Yilmaz, Levent; Wang, Shuo
2005-01-01
Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…
1992-04-01
contractor's existing data collection, analysis and corrective action system shall be utilized, with modification only as necessary to meet the... either from test or from analysis of field data. The procedures of MIL-STD-756B assume that the reliability of a... to generate sufficient data to report a statistically valid reliability figure for a class of software. Casual data gathering accumulates data more...
NASA Technical Reports Server (NTRS)
Osgood, Cathy; Williams, Kevin; Gentry, Philip; Brownfield, Dana; Hallstrom, John; Stuit, Tim
2012-01-01
Orbit Software Suite is used to support a variety of NASA/DM (Dependable Multiprocessor) mission planning and analysis activities on the IPS (Intrusion Prevention System) platform. The suite of Orbit software tools (Orbit Design and Orbit Dynamics) resides on IPS/Linux workstations, and is used to perform mission design and analysis tasks corresponding to trajectory/launch window, rendezvous, and proximity operations flight segments. A list of tools in Orbit Software Suite represents tool versions established during/after the Equipment Rehost-3 Project.
Free and Open Source Software for land degradation vulnerability assessment
NASA Astrophysics Data System (ADS)
Imbrenda, Vito; Calamita, Giuseppe; Coluzzi, Rosa; D'Emilio, Mariagrazia; Lanfredi, Maria Teresa; Perrone, Angela; Ragosta, Maria; Simoniello, Tiziana
2013-04-01
Nowadays the role of FOSS software in scientific research is becoming increasingly important. Besides the important issues of reduced licence costs, legality and security, there are many other reasons that make FOSS software attractive. Firstly, opening the code is a warranty of quality, permitting thousands of developers around the world to check the code and fix bugs rather than relying on vendors' claims. FOSS communities are usually enthusiastic about helping other users solve problems and about expanding or customizing software (flexibility). Most important for this study, interoperability allows the user-friendly QGIS to be combined with the powerful GRASS-GIS and the rich statistical methods of R, in order to process remote sensing data and perform geo-statistical analysis in a single environment. This study is focused on land degradation (i.e. the reduction in the capacity of the land to provide ecosystem goods and services and assure its functions) and in particular on the estimation of vulnerability levels, in order to suggest appropriate policy actions to reduce/halt land degradation impacts, using the above-mentioned software. The area investigated is the Basilicata Region (Southern Italy), where large natural areas are mixed with anthropized areas. To identify different levels of vulnerability we adopted the Environmentally Sensitive Areas (ESAs) model, based on the combination of indicators related to soil, climate, vegetation and anthropic stress. Such indicators were estimated by using the following data sources:
- Basilicata Region Geoportal, to assess soil vulnerability;
- DESERTNET2 project, to evaluate potential vegetation vulnerability and climate vulnerability;
- NDVI-MODIS satellite time series (2000-2010) with 250 m resolution, available as 16-day composites from the NASA LP DAAC, to characterize the dynamic component of vegetation;
- Agricultural Census data 2010, Corine Land Cover 2006 and morphological information, to assess the vulnerability to anthropic factors mainly connected with agricultural and grazing management.
To achieve the final ESAs Index, depicting the overall vulnerability to degradation of the investigated area, we applied the geometric mean to cross-normalized indices related to each examined component. In this context QGIS was used to display data and to perform basic GIS calculations, whereas GRASS was used for map-algebra operations and image processing. Finally, R was used for statistical analysis (Principal Component Analysis) aimed at determining the relative importance of each adopted indicator. Our results show that the GRASS, QGIS and R software are suitable to map land degradation vulnerability and identify highly vulnerable areas in which rehabilitation/recovery interventions are urgent. In addition, they allow us to highlight the most important drivers of degradation, thus supplying basic information for setting up intervention strategies. Ultimately, Free Open Source Software delivers a fair chance for geoscientific investigations thanks to its high interoperability and flexibility, enabling the accuracy of the data to be preserved and processing time to be reduced. Moreover, the presence of several communities that steadily support users allows for achieving high-quality results, making free open source software a valuable and easy alternative to conventional commercial software.
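The aggregation step is simple enough to show directly: the ESA index is the geometric mean of the normalized component indices. The values below are illustrative, and the conventional 1.0-2.0 normalization range is an assumption of this sketch.

```python
# ESA-style aggregation: overall index = geometric mean of the normalized
# component quality indices. Values are illustrative, not the paper's data.
import numpy as np

# soil, climate, vegetation, and anthropic-stress quality indices per pixel,
# each assumed normalized to a 1.0-2.0 range (low to high sensitivity)
sqi = np.array([1.2, 1.6, 1.9])
cqi = np.array([1.1, 1.4, 1.8])
vqi = np.array([1.3, 1.5, 1.7])
aqi = np.array([1.0, 1.3, 1.9])

esai = (sqi * cqi * vqi * aqi) ** 0.25   # geometric mean of the four indices
print(esai)   # higher values = more vulnerable to land degradation
```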
Vocal parameters and voice-related quality of life in adult women with and without ovarian function.
Ferraz, Pablo Rodrigo Rocha; Bertoldo, Simão Veras; Costa, Luanne Gabrielle Morais; Serra, Emmeliny Cristini Nogueira; Silva, Eduardo Magalhães; Brito, Luciane Maria Oliveira; Chein, Maria Bethânia da Costa
2013-05-01
To identify the perceptual and acoustic parameters of voice in adult women with and without ovarian function and their impact on voice-related quality of life. Cross-sectional and analytical study of 106 women divided into two groups: G1, with ovarian function (n=43), and G2, without physiological ovarian function (n=63). The women were instructed to sustain the vowel "a" and the sounds /s/ and /z/ at habitual pitch and loudness. They were also asked to classify their voices and answer the voice-related quality of life (V-RQOL) questionnaire. The perceptual analysis of the vocal samples was performed by three speech-language pathologists using the GRBASI (G: grade; R: roughness; B: breathiness; A: asthenia; S: strain; I: instability) scale. The acoustic analysis was carried out with the software VoxMetria 2.7h (CTS Informatica). The data were analyzed using descriptive statistics. In the perceptual analysis, both groups showed a mild deviation for the parameters roughness, strain, and instability, but only G2 showed a mild deviation in the overall degree of dysphonia. The mean fundamental frequency was significantly lower for G2, with a difference of 17.41 Hz between the two groups. There was no impact in any of the V-RQOL domains for this group. With menopause, women's voices change, affecting some vocal parameters; however, there is no direct impact on voice-related quality of life. Copyright © 2013 The Voice Foundation. Published by Mosby, Inc. All rights reserved.
Distributed Engine Control Empirical/Analytical Verification Tools
NASA Technical Reports Server (NTRS)
DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan
2013-01-01
NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as to swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique capabilities to study the effects of a given change to the control system in the context of the distributed paradigm. The simulation tool can support treatment of all components within the control system, both virtual and real; these include the communication data network, smart sensor and actuator nodes, the centralized control system (FADEC, full-authority digital engine control), and the aircraft engine itself. The DECsim tool allows simulation-based prototyping of control laws, control architectures, and decentralization strategies before hardware is integrated into the system. With the configuration specified, the simulator allows a variety of key factors to be systematically assessed. Such factors include control system performance, reliability, weight, and bandwidth utilization.
Putting Safety in the Software
NASA Technical Reports Server (NTRS)
Wetherholt, Martha S.; Berens, Kalynnda M.; Hardy, Sandra (Technical Monitor)
2001-01-01
Software is a vital component of nearly every piece of modern technology. It is not a 'sub-system', able to be separated out from the system as a whole, but a 'co-system' that controls, manipulates, or interacts with the hardware and with the end user. Software has its fingers into all the pieces of the pie. If that 'pie', the system, can lead to injury, death, loss of major equipment, or impact your business bottom line, then software safety becomes vitally important. Learning to think about software from a safety perspective is the focus of this paper. We want you to think of software as part of the safety critical system, a major part. This requires 'system thinking' - being able to grasp the whole picture. Software's contribution to modern technology is both good and potentially bad. Software allows more complex and useful devices to be built. It can also contribute to plane crashes and power outages. We want you to see software in a whole new light, see it as a contributor to system hazards, and also as a possible fix or mitigation to some of those hazards.
Engineering and Techno-Economic Assessment | Concentrating Solar Power
Assesses performance and technology deployment, and investigates the environmental benefits and impacts of utility-scale systems. The Regional Energy Deployment System (ReEDS) is a software model used to determine energy and environmental impacts; a companion model estimates the economic impacts of constructing and operating power generation and biofuel plants at the...
The Impact of a Computerized Dietary Assessment on Nutrition Knowledge
ERIC Educational Resources Information Center
Hensleigh, Katherine Elizabeth; Eddy, James M.; Wang, Min Qi; Dennison, Darwin; Chaney, J. Don
2004-01-01
In recent years, many health educators have integrated computer applications into their health education program interventions. The assessment of the impact of these interventions is limited. This study assessed the impact of the Pyramid Challenge nutrition software program on nutrition knowledge levels of students enrolled in traditional personal…
Arcella, D; Soggiu, M E; Leclercq, C
2003-10-01
For the assessment of exposure to food-borne chemicals, the most commonly used methods in the European Union follow a deterministic approach based on conservative assumptions. Over the past few years, to get a more realistic view of exposure to food chemicals, risk managers have become more interested in the probabilistic approach. Within the EU-funded 'Monte Carlo' project, a stochastic model of exposure to chemical substances from the diet and a computer software program were developed. The aim of this paper was to validate the model with respect to the intake of saccharin from table-top sweeteners and cyclamate from soft drinks by Italian teenagers using the software, and to evaluate the impact of the inclusion/exclusion of indicators of market share and brand loyalty through a sensitivity analysis. Data on food consumption and the concentration of sweeteners were collected. A food frequency questionnaire aimed at identifying females who were high consumers of sugar-free soft drinks and/or of table-top sweeteners was filled in by 3982 teenagers living in the District of Rome. Moreover, 362 subjects participated in a detailed food survey by recording, at brand level, all foods and beverages ingested over 12 days. Producers were asked to provide the intense sweetener concentrations of sugar-free products. Results showed that consumer behaviour with respect to brands has an impact on exposure assessments. Only probabilistic models that took into account indicators of market share and brand loyalty met the validation criteria.
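A toy version of such a stochastic intake model shows where market share and brand loyalty enter: brand choice is sampled from market shares, with a loyal fraction of consumers fixed to one brand. All distributions and numbers below are invented and are not the project's model.

```python
# Toy Monte-Carlo intake model: daily sweetener intake simulated by sampling
# servings, serving size, and brand (via market share), with a brand-loyal
# fraction of consumers sticking to a single brand. All values are invented.
import numpy as np

rng = np.random.default_rng(3)
brands = np.array([120.0, 250.0, 80.0])     # mg sweetener per litre, by brand
market_share = np.array([0.5, 0.3, 0.2])
loyal_fraction = 0.7                        # consumers who always buy one brand

def simulate_person(days=12):
    loyal = rng.random() < loyal_fraction
    fixed_brand = rng.choice(3, p=market_share)
    total = 0.0
    for _ in range(days):
        servings = rng.poisson(1.2)                     # drinks per day
        litres = servings * rng.normal(0.33, 0.05)      # serving size, L
        brand = fixed_brand if loyal else rng.choice(3, p=market_share)
        total += max(litres, 0.0) * brands[brand]
    return total / days                                 # mean mg/day

intakes = np.array([simulate_person() for _ in range(5000)])
print("mean intake %.0f mg/day, 97.5th percentile %.0f mg/day"
      % (intakes.mean(), np.percentile(intakes, 97.5)))
```

Toggling `loyal_fraction` between 0 and 1 mimics the paper's inclusion/exclusion sensitivity analysis: loyalty widens the upper tail of the intake distribution, since loyal consumers of the most concentrated brand accumulate consistently high intakes.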