Learn by Yourself: The Self-Learning Tools for Qualitative Analysis Software Packages
ERIC Educational Resources Information Center
Freitas, Fábio; Ribeiro, Jaime; Brandão, Catarina; Reis, Luís Paulo; de Souza, Francislê Neri; Costa, António Pedro
2017-01-01
Computer Assisted Qualitative Data Analysis Software (CAQDAS) packages are tools that help researchers develop qualitative research projects. These software packages help users with tasks such as transcription analysis, coding and text interpretation, writing and annotation, content search and analysis, recursive abstraction, grounded theory…
Analyzing qualitative data with computer software.
Weitzman, E A
1999-01-01
OBJECTIVE: To provide health services researchers with an overview of the qualitative data analysis process and the role of software within it; to provide a principled approach to choosing among software packages to support qualitative data analysis; to alert researchers to the potential benefits and limitations of such software; and to provide an overview of the developments to be expected in the field in the near future. DATA SOURCES, STUDY DESIGN, METHODS: This article does not include reports of empirical research. CONCLUSIONS: Software for qualitative data analysis can benefit the researcher in terms of speed, consistency, rigor, and access to analytic methods not available by hand. Software, however, is not a replacement for methodological training. PMID:10591282
Bridging the Qualitative/Quantitative Software Divide
Annechino, Rachelle; Antin, Tamar M. J.; Lee, Juliet P.
2011-01-01
To compare and combine qualitative and quantitative data collected from respondents in a mixed methods study, the research team developed a relational database to merge survey responses stored and analyzed in SPSS with semistructured interview responses stored and analyzed in the qualitative software package ATLAS.ti. The process of developing the database is explored, along with practical considerations for researchers who may wish to use similar methods. PMID:22003318
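As a rough sketch of the kind of relational linkage this abstract describes, the following joins quantitative survey scores with qualitative codes by respondent ID. The schema, respondent IDs, codes, and values are hypothetical, not the study's actual database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Quantitative survey responses, as they might be exported from SPSS.
cur.execute("CREATE TABLE survey (respondent_id TEXT PRIMARY KEY, "
            "age INTEGER, freq_score INTEGER)")
cur.executemany("INSERT INTO survey VALUES (?, ?, ?)",
                [("R01", 24, 5), ("R02", 31, 2)])

# Qualitative codes and quotations, as they might be exported from ATLAS.ti.
cur.execute("CREATE TABLE codes (respondent_id TEXT, code TEXT, quotation TEXT)")
cur.executemany("INSERT INTO codes VALUES (?, ?, ?)",
                [("R01", "peer influence", "my friends all did it"),
                 ("R02", "risk awareness", "I worry about long-term harm")])

# The relational join ties each coded quotation back to that respondent's
# survey data, enabling side-by-side comparison of the two data types.
rows = cur.execute("""
    SELECT s.respondent_id, s.freq_score, c.code
    FROM survey s JOIN codes c ON s.respondent_id = c.respondent_id
    ORDER BY s.respondent_id
""").fetchall()
print(rows)
```

The join key (a shared respondent identifier) is what makes the merge possible; both packages must export it consistently.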
ERIC Educational Resources Information Center
Williams, Lawrence H., Jr.
2013-01-01
This qualitative study analyzed the experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve, others facilitate ambiguity and uncertainty by…
The Implication of Using NVivo Software in Qualitative Data Analysis: Evidence-Based Reflections.
Zamawe, F C
2015-03-01
For a long time, electronic data analysis has been associated with quantitative methods. However, Computer Assisted Qualitative Data Analysis Software (CAQDAS) packages are increasingly being developed. Although CAQDAS has existed for decades, very few qualitative health researchers report using it. This may be due to the difficulty of mastering the software and the misconceptions associated with using CAQDAS. While the issue of mastering CAQDAS has received ample attention, little has been done to address these misconceptions. In this paper, the author reflects on his experience of interacting with one of the popular CAQDAS packages (NVivo) in order to provide evidence-based implications of using the software. The key message is that, unlike statistical software, the main function of CAQDAS is not to analyse data but rather to aid the analysis process, of which the researcher must always remain in control. In other words, researchers must understand that no software can analyse qualitative data on its own. CAQDAS packages are essentially data management packages that support the researcher during analysis.
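The "data management, not analysis" point above can be illustrated with a minimal code-and-retrieve sketch: the researcher assigns the codes; the software merely stores and retrieves them. The interview IDs, excerpts, and codes below are invented for illustration, not drawn from the paper:

```python
from collections import defaultdict

# (interview id, excerpt, researcher-assigned codes) -- the coding judgment
# is the human's; no software step here "analyses" the text.
segments = [
    ("I1", "The training took weeks to complete.", ["learning curve"]),
    ("I2", "I thought the program would code for me.", ["misconception"]),
    ("I3", "Mastering the menus was the hard part.", ["learning curve"]),
]

# Build a code index: code -> list of (interview id, excerpt).
index = defaultdict(list)
for doc_id, excerpt, codes in segments:
    for code in codes:
        index[code].append((doc_id, excerpt))

# Retrieval: every excerpt the researcher tagged "learning curve".
for doc_id, excerpt in index["learning curve"]:
    print(doc_id, excerpt)
```

Everything interpretive (choosing and applying codes) happens before the software is involved; the package only makes retrieval fast and consistent.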
Computing in Qualitative Analysis: A Healthy Development?
ERIC Educational Resources Information Center
Richards, Lyn; Richards, Tom
1991-01-01
Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software package. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…
Pathways to Lean Software Development: An Analysis of Effective Methods of Change
ERIC Educational Resources Information Center
Hanson, Richard D.
2014-01-01
This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is…
Collaboration Strategies to Reduce Technical Debt
ERIC Educational Resources Information Center
Miko, Jeffrey Allen
2017-01-01
Inadequate software development collaboration processes can allow technical debt to accumulate increasing future maintenance costs and the chance of system failures. The purpose of this qualitative case study was to explore collaboration strategies software development leaders use to reduce the amount of technical debt created by software…
ERIC Educational Resources Information Center
Holcomb, Glenda S.
2010-01-01
This qualitative, phenomenological doctoral dissertation research study explored software project team members' perceptions of changing organizational cultures based on management decisions made at project deviation points. The research study provided a view into challenged or failing government software projects through the lived experiences…
Open Crowdsourcing: Leveraging Community Software Developers for IT Projects
ERIC Educational Resources Information Center
Phair, Derek
2012-01-01
This qualitative exploratory single-case study was designed to examine and understand the use of volunteer community participants as software developers and other project related roles, such as testers, in completing a web-based application project by a non-profit organization. This study analyzed the strategic decision to engage crowd…
The Value of Open Source Software Tools in Qualitative Research
ERIC Educational Resources Information Center
Greenberg, Gary
2011-01-01
In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…
Computer Aided Teaching of Digital Signal Processing.
ERIC Educational Resources Information Center
Castro, Ian P.
1990-01-01
Describes a microcomputer-based software package developed at the University of Surrey for teaching digital signal processing to undergraduate science and engineering students. Menu-driven software capabilities are explained, including demonstration of qualitative concepts and experimentation with quantitative data, and examples are given of…
From Print to Pixels: Practitioners' Reflections on the Use of Qualitative Data Analysis Software.
ERIC Educational Resources Information Center
Gilbert, Linda S.
This paper studied how individual qualitative researchers perceive that their research procedures and perspectives have been influenced by the adoption of computer-assisted qualitative data analysis software. The study focused on NUD*IST software (Non-numerical Unstructured Data Indexing, Searching, and Theorizing). The seven participants ranged from new…
ERIC Educational Resources Information Center
Margerum-Leys, Jon; Kupperman, Jeff; Boyle-Heimann, Kristen
This paper presents perspectives on the use of data analysis software in the process of qualitative research. These perspectives were gained in the conduct of three qualitative research studies that differed in theoretical frames, areas of interest, and scope. Their common use of a particular data analysis software package allows the exploration…
Design and implementation of a Windows NT network to support CNC activities
NASA Technical Reports Server (NTRS)
Shearrow, C. A.
1996-01-01
The Manufacturing, Materials, & Processes Technology Division is undergoing dramatic changes to bring its manufacturing practices current with today's technological revolution. The Division is developing Computer-Aided Design and Computer-Aided Manufacturing (CAD/CAM) abilities. The development of resource tracking is underway in the form of an accounting software package called Infisy. These two efforts will bring the division into the 1980's in relationship to manufacturing processes. Computer Integrated Manufacturing (CIM) is the final phase of change to be implemented. This document is a qualitative study and application of a CIM application capable of finishing the changes necessary to bring the manufacturing practices into the 1990's. The documentation provided in this qualitative research effort includes discovery of the current status of manufacturing in the Manufacturing, Materials, & Processes Technology Division, including the software, hardware, network, and mode of operation. The proposed direction of research includes a network design, computers to be used, software to be used, machine-to-computer connections, an estimated timeline for implementation, and a cost estimate. Recommendations for the division's improvement include actions to be taken, software to utilize, and computer configurations.
ERIC Educational Resources Information Center
Zambak, Vecihi S.; Tyminski, Andrew M.
2017-01-01
This study characterises the development of Specialised Content Knowledge (SCK) with dynamic geometry software (DGS) throughout a semester. The research employed a single-case study with the embedded units of three pre-service middle grades mathematics teachers. Qualitative data were collected, and factors affecting these three teachers' SCK…
ERIC Educational Resources Information Center
Rush, S. Craig
2014-01-01
This article draws on the author's experience using qualitative video and audio analysis, most notably through use of the Transana qualitative video and audio analysis software program, as an alternative method for teaching IQ administration skills to students in a graduate psychology program. Qualitative video and audio analysis may be useful for…
Quantitative analyses for elucidating mechanisms of cell fate commitment in the mouse blastocyst
NASA Astrophysics Data System (ADS)
Saiz, Néstor; Kang, Minjung; Puliafito, Alberto; Schrode, Nadine; Xenopoulos, Panagiotis; Lou, Xinghua; Di Talia, Stefano; Hadjantonakis, Anna-Katerina
2015-03-01
In recent years we have witnessed a shift from qualitative image analysis towards higher resolution, quantitative analyses of imaging data in developmental biology. This shift has been fueled by technological advances in both imaging and analysis software. We have recently developed a tool for accurate, semi-automated nuclear segmentation of imaging data from early mouse embryos and embryonic stem cells. We have applied this software to the study of the first lineage decisions that take place during mouse development and established analysis pipelines for both static and time-lapse imaging experiments. In this paper we summarize the conclusions from these studies to illustrate how quantitative, single-cell level analysis of imaging data can unveil biological processes that cannot be revealed by traditional qualitative studies.
Potential of the Cogex Software Platform to Replace Logbooks in Capstone Design Projects
ERIC Educational Resources Information Center
Foley, David; Charron, François; Plante, Jean-Sébastien
2018-01-01
Recent technologies are offering the power to share and grow knowledge and ideas in unprecedented ways. The CogEx software platform was developed to take advantage of the digital world with innovative ideas to support designers' work in both industrial and academic contexts. This paper presents a qualitative study on the usage of CogEx during…
Pathways to lean software development: An analysis of effective methods of change
NASA Astrophysics Data System (ADS)
Hanson, Richard D.
This qualitative Delphi study explored the challenges that exist in delivering software on time, within budget, and with the original scope identified. The literature review identified many attempts over the past several decades to reform the methods used to develop software. These attempts found that the classical waterfall method, which is firmly entrenched in American business today, was to blame for this difficulty (Chatterjee, 2010). Each of the proponents of new methods sought to remove waste, lighten the process, and implement lean principles in software development. Through this study, the experts evaluated the barriers to effective development principles and defined the leadership qualities necessary to overcome these barriers. The barriers identified were resistance to change, risk and reward issues, and management buy-in. Thirty experts in software development from several Fortune 500 companies across the United States explored each of these issues in detail. The conclusion reached by these experts was that visionary leadership is necessary to overcome these challenges.
Developing a database for pedestrians' earthquake emergency evacuation in indoor scenarios.
Zhou, Junxue; Li, Sha; Nie, Gaozhong; Fan, Xiwei; Tan, Jinxian; Li, Huayue; Pang, Xiaoke
2018-01-01
With the booming development of evacuation simulation software, developing an extensive database in indoor scenarios for evacuation models is imperative. In this paper, we conduct a qualitative and quantitative analysis of the collected videotapes and aim to provide a complete and unitary database of pedestrians' earthquake emergency response behaviors in indoor scenarios, including human-environment interactions. Using the qualitative analysis method, we extract keyword groups and keywords that code the response modes of pedestrians and construct a general decision flowchart using chronological organization. Using the quantitative analysis method, we analyze data on the delay time, evacuation speed, evacuation route and emergency exit choices. Furthermore, we study the effect of classroom layout on emergency evacuation. The database for indoor scenarios provides reliable input parameters and allows the construction of real and effective constraints for use in software and mathematical models. The database can also be used to validate the accuracy of evacuation models.
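A hypothetical worked example of the quantitative measures this abstract names, delay time (alarm to first movement) and evacuation speed (route length over movement time). The timestamps and distances below are invented; the paper's videotape data are not reproduced in the abstract:

```python
alarm_t = 0.0  # alarm onset, seconds

# (pedestrian id, time of first movement, time of exit, route length in metres)
observations = [
    ("P1", 2.5, 14.5, 18.0),
    ("P2", 4.0, 18.0, 21.0),
]

for pid, start, exit_t, dist in observations:
    delay = start - alarm_t            # pre-movement delay time
    speed = dist / (exit_t - start)    # mean speed over the evacuation route, m/s
    print(f"{pid}: delay={delay:.1f}s speed={speed:.2f} m/s")
```

Aggregating such per-pedestrian measures across many videotaped events is what yields the distributions an evacuation model needs as input parameters.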
Popelka, Stanislav; Stachoň, Zdeněk; Šašinka, Čeněk; Doležalová, Jitka
2016-01-01
The mixed research design is a progressive methodological discourse that combines the advantages of quantitative and qualitative methods. Its possibilities of application are, however, dependent on the efficiency with which the particular research techniques are used and combined. The aim of the paper is to introduce the possible combination of Hypothesis with EyeTribe tracker. The Hypothesis is intended for quantitative data acquisition and the EyeTribe is intended for qualitative (eye-tracking) data recording. In the first part of the paper, Hypothesis software is described. The Hypothesis platform provides an environment for web-based computerized experiment design and mass data collection. Then, evaluation of the accuracy of data recorded by EyeTribe tracker was performed with the use of concurrent recording together with the SMI RED 250 eye-tracker. Both qualitative and quantitative results showed that data accuracy is sufficient for cartographic research. In the third part of the paper, a system for connecting EyeTribe tracker and Hypothesis software is presented. The interconnection was performed with the help of developed web application HypOgama. The created system uses open-source software OGAMA for recording the eye-movements of participants together with quantitative data from Hypothesis. The final part of the paper describes the integrated research system combining Hypothesis and EyeTribe. PMID:27087805
Integrating Formal Methods and Testing 2002
NASA Technical Reports Server (NTRS)
Cukic, Bojan
2002-01-01
Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting for not only formal verification and program testing, but also the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in the statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (failure rates of 10^-4 or lower). The coming years shall address the methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The goals are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a given confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.
Computer-assisted qualitative data analysis software.
Cope, Diane G
2014-05-01
Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.
Hucka, M; Finney, A; Bornstein, B J; Keating, S M; Shapiro, B E; Matthews, J; Kovitz, B L; Schilstra, M J; Funahashi, A; Doyle, J C; Kitano, H
2004-06-01
Biologists are increasingly recognising that computational modelling is crucial for making sense of the vast quantities of complex experimental data that are now being collected. The systems biology field needs agreed-upon information standards if models are to be shared, evaluated and developed cooperatively. Over the last four years, our team has been developing the Systems Biology Markup Language (SBML) in collaboration with an international community of modellers and software developers. SBML has become a de facto standard format for representing formal, quantitative and qualitative models at the level of biochemical reactions and regulatory networks. In this article, we summarise the current and upcoming versions of SBML and our efforts at developing software infrastructure for supporting and broadening its use. We also provide a brief overview of the many SBML-compatible software tools available today.
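To make the "de facto standard format" concrete, the sketch below reads species and reaction identifiers from a minimal SBML-like Level 2 fragment using only the standard library. The fragment is hand-written and heavily trimmed for illustration, not a complete or validated SBML model; real workflows would use a dedicated library such as libSBML:

```python
import xml.etree.ElementTree as ET

# A stripped-down, hypothetical SBML Level 2 document: one model with two
# species and one reaction. Real SBML files carry much more detail
# (compartments, kinetic laws, units, annotations).
SBML = """<sbml xmlns="http://www.sbml.org/sbml/level2" level="2" version="1">
  <model id="enzyme_demo">
    <listOfSpecies>
      <species id="S" name="substrate"/>
      <species id="P" name="product"/>
    </listOfSpecies>
    <listOfReactions>
      <reaction id="conversion"/>
    </listOfReactions>
  </model>
</sbml>"""

# SBML elements live in a versioned XML namespace, so queries must qualify it.
ns = {"s": "http://www.sbml.org/sbml/level2"}
root = ET.fromstring(SBML)
species = [e.get("id") for e in root.findall(".//s:species", ns)]
reactions = [e.get("id") for e in root.findall(".//s:reaction", ns)]
print(species, reactions)
```

Because the structure is plain, namespaced XML, any SBML-compatible tool can exchange such models regardless of the simulator or editor that produced them.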
NASA Technical Reports Server (NTRS)
Trevino, Luis; Brown, Terry; Crumbley, R. T. (Technical Monitor)
2001-01-01
The problem to be addressed in this paper is to explore how the use of Soft Computing Technologies (SCT) could be employed to improve overall vehicle system safety, reliability, and rocket engine performance by development of a qualitative and reliable engine control system (QRECS). Specifically, this will be addressed by enhancing rocket engine control using SCT, innovative data mining tools, and sound software engineering practices used in Marshall's Flight Software Group (FSG). The principal goals for addressing the issue of quality are to improve software management, software development time, software maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power level transitions. The intent is not to discuss any shortcomings of existing engine control methodologies, but to provide alternative design choices for control, implementation, performance, and sustaining engineering, all relative to addressing the issue of reliability. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion (system level), software engineering for embedded flight software systems, and soft computing technologies (i.e., neural networks, fuzzy logic, data mining, and Bayesian belief networks); some of which are briefed in this paper. For this effort, the targeted demonstration rocket engine testbed is the MC-1 engine (formerly FASTRAC) which is simulated with hardware and software in the Marshall Avionics & Software Testbed (MAST) laboratory that currently resides at NASA's Marshall Space Flight Center, building 4476, and is managed by the Avionics Department. A brief plan of action for design, development, implementation, and testing a Phase One effort for QRECS is given, along with expected results. Phase One will focus on development of a Smart Start Engine Module and a Mainstage Engine Module for proper engine start and mainstage engine operations.
The overall intent is to demonstrate that by employing soft computing technologies, the quality and reliability of the overall approach to engine controller development are further improved and vehicle safety is further ensured. The final product this paper proposes is an approach to the development of an alternative low-cost engine controller capable of performing in unique vision spacecraft vehicles requiring low-cost advanced avionics architectures for autonomous operations from engine pre-start to engine shutdown.
A Clustering-Based Approach to Enriching Code Foraging Environment.
Niu, Nan; Jin, Xiaoyu; Niu, Zhendong; Cheng, Jing-Ru C; Li, Ling; Kataev, Mikhail Yu
2016-09-01
Developers often spend valuable time navigating and seeking relevant code during software maintenance. Currently, there is a lack of theoretical foundations to guide tool design and evaluation to best shape the code base for developers. This paper contributes a unified code navigation theory in light of optimal food-foraging principles. We further develop a novel framework for automatically assessing the foraging mechanisms in the context of program investigation. We use the framework to examine to what extent the clustering of software entities affects code foraging. Our quantitative analysis of long-lived open-source projects suggests that clustering enriches the software environment and improves foraging efficiency. Our qualitative inquiry reveals concrete insights into real developers' behavior. Our research opens the avenue toward building a new set of ecologically valid code navigation tools.
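A toy illustration of the clustering idea above: if related software entities are grouped, a developer landing on one file finds its likely "prey" nearby. Here files are greedily clustered by Jaccard similarity of the identifiers they contain. The file names, identifiers, and the 0.2 threshold are invented; this is not the paper's actual assessment framework:

```python
# Hypothetical files mapped to the identifiers they mention.
files = {
    "parser.py": {"tokenize", "ast_node", "parse"},
    "lexer.py":  {"tokenize", "scan", "token"},
    "render.py": {"draw", "canvas", "layout"},
    "canvas.py": {"draw", "canvas", "pixel"},
}

def jaccard(a, b):
    """Similarity of two identifier sets: |intersection| / |union|."""
    return len(a & b) / len(a | b)

# Greedy single-link grouping: a file joins the first existing cluster
# containing some member it is sufficiently similar to, else starts its own.
clusters = []
for name, idents in files.items():
    for cluster in clusters:
        if any(jaccard(idents, files[other]) >= 0.2 for other in cluster):
            cluster.append(name)
            break
    else:
        clusters.append([name])

print(clusters)
```

The intuition maps onto foraging theory: clustering raises the local density of relevant code ("prey"), so less time is spent traveling between patches.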
ERIC Educational Resources Information Center
Paulus, Trena M.; Bennett, Ann M.
2017-01-01
While research on teaching qualitative methods in education has increased, few studies explore teaching qualitative data analysis software within graduate-level methods courses. During 2013, we required students in several such courses to use ATLAS.ti™ as a project management tool for their assignments. By supporting students' early experiences…
An approach to integrating and creating flexible software environments
NASA Technical Reports Server (NTRS)
Bellman, Kirstie L.
1992-01-01
Engineers and scientists are attempting to represent, analyze, and reason about increasingly complex systems. Many researchers have been developing new ways of creating increasingly open environments. In this research on VEHICLES, a conceptual design environment for space systems, an approach to flexibility and integration called 'wrapping' was developed, based on the collection and subsequent processing of explicit qualitative descriptions of all the software resources in the environment. Currently, a simulation, VSIM, is available and is used to study both the types of wrapping descriptions and the processes necessary to use the metaknowledge to combine, select, adapt, and explain some of the software resources used in VEHICLES. What was learned about the types of knowledge necessary for the wrapping approach is described, along with the implications of wrapping for several key software engineering issues.
Agile methods in biomedical software development: a multi-site experience report.
Kane, David W; Hohman, Moses M; Cerami, Ethan G; McCormick, Michael W; Kuhlmman, Karl F; Byrd, Jeff A
2006-05-30
Agile is an iterative approach to software development that relies on strong collaboration and automation to keep pace with dynamic environments. We have successfully used agile development approaches to create and maintain biomedical software, including software for bioinformatics. This paper reports on a qualitative study of our experiences using these methods. We have found that agile methods are well suited to the exploratory and iterative nature of scientific inquiry. They provide a robust framework for reproducing scientific results and for developing clinical support systems. The agile development approach also provides a model for collaboration between software engineers and researchers. We present our experience using agile methodologies in projects at six different biomedical software development organizations. The organizations include academic, commercial and government development teams, and included both bioinformatics and clinical support applications. We found that agile practices were a match for the needs of our biomedical projects and contributed to the success of our organizations. We found that the agile development approach was a good fit for our organizations, and that these practices should be applicable and valuable to other biomedical software development efforts. Although we found differences in how agile methods were used, we were also able to identify a set of core practices that were common to all of the groups, and that could be a focus for others seeking to adopt these methods. PMID:16734914
Selecting a software development methodology. [of digital flight control systems
NASA Technical Reports Server (NTRS)
Jones, R. E.
1981-01-01
The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques; specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgments are expressed about their effectiveness and cost. The reasons why quantitative assessments are not possible will be documented.
Domain analysis for the reuse of software development experiences
NASA Technical Reports Server (NTRS)
Basili, V. R.; Briand, L. C.; Thomas, W. M.
1994-01-01
We need to be able to learn from past experiences so that we can improve our software processes and products. The Experience Factory is an organizational structure designed to support and encourage the effective reuse of software experiences. This structure consists of two organizations, separating project development concerns from the organizational concerns of experience packaging and learning. The experience factory provides the processes and support for analyzing, packaging, and improving the organization's stored experience. The project organization is structured to reuse this stored experience in its development efforts. However, a number of questions arise: What past experiences are relevant? Can they all be used (reused) on our current project? How do we take advantage of what has been learned in other parts of the organization? How do we take advantage of experience in the world at large? Can someone else's best practices be used in our organization with confidence? This paper describes approaches to help answer these questions. We propose both quantitative and qualitative approaches for effectively reusing software development experiences.
The Interaction between Multimedia Data Analysis and Theory Development in Design Research
ERIC Educational Resources Information Center
van Nes, Fenna; Doorman, Michiel
2010-01-01
Mathematics education researchers conducting instruction experiments using a design research methodology are challenged with the analysis of often complex and large amounts of qualitative data. In this paper, we present two case studies that show how multimedia analysis software can greatly support video data analysis and theory development in…
Building an experience factory for maintenance
NASA Technical Reports Server (NTRS)
Valett, Jon D.; Condon, Steven E.; Briand, Lionel; Kim, Yong-Mi; Basili, Victor R.
1994-01-01
This paper reports the preliminary results of a study of the software maintenance process in the Flight Dynamics Division (FDD) of the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC). This study is being conducted by the Software Engineering Laboratory (SEL), a research organization sponsored by the Software Engineering Branch of the FDD, which investigates the effectiveness of software engineering technologies when applied to the development of applications software. This software maintenance study began in October 1993 and is being conducted using the Quality Improvement Paradigm (QIP), a process improvement strategy based on three iterative steps: understanding, assessing, and packaging. The preliminary results represent the outcome of the understanding phase, during which SEL researchers characterized the maintenance environment, product, and process. Findings indicate that a combination of quantitative and qualitative analysis is effective for studying the software maintenance process, that additional measures should be collected for maintenance (as opposed to new development), and that characteristics such as effort, error rate, and productivity are best considered on a 'release' basis rather than on a project basis. The research thus far has documented some basic differences between new development and software maintenance. It lays the foundation for further application of the QIP to investigate means of improving the maintenance process and product in the FDD.
A Decision Model for Supporting Task Allocation Processes in Global Software Development
NASA Astrophysics Data System (ADS)
Lamersdorf, Ansgar; Münch, Jürgen; Rombach, Dieter
Today, software-intensive systems are increasingly being developed in a globally distributed way. However, besides its benefits, global development also bears a set of risks and problems. One critical factor for successful project management of distributed software development is the allocation of tasks to sites, as this is assumed to have a major influence on the benefits and risks. We introduce a model that aims at improving management processes in globally distributed projects by giving decision support for task allocation that systematically considers multiple criteria. The criteria and causal relationships were identified in a literature study and refined in a qualitative interview study. The model uses existing approaches from distributed systems and statistical modeling. The article gives an overview of the problem and related work, introduces the empirical and theoretical foundations of the model, and shows the use of the model in an example scenario.
ERIC Educational Resources Information Center
Putten, Jim Vander; Nolen, Amanda L.
2010-01-01
This study compared qualitative research results obtained by manual constant comparative analysis with results obtained by computer software analysis of the same data. An investigation of issues of trustworthiness and accuracy ensued. Results indicated that the inductive constant comparative data analysis generated 51 codes and two coding levels…
A qualitative approach to systemic diagnosis of the SSME
NASA Technical Reports Server (NTRS)
Bickmore, Timothy W.; Maul, William A.
1993-01-01
A generic software architecture has been developed for posttest diagnostics of rocket engines, and is presently being applied to the posttest analysis of the SSME. This investigation deals with the Systems Section module of the architecture, which is presently under development. Overviews of the manual SSME systems analysis process and the overall SSME diagnostic system architecture are presented.
ERIC Educational Resources Information Center
Vos, Hans J.
1994-01-01
Describes the construction of a model of computer-assisted instruction using a qualitative block diagram based on general systems theory (GST) as a framework. Subject matter representation is discussed, and appendices include system variables and system equations of the GST model, as well as an example of developing flexible courseware. (Contains…
Organizational Analysis of the United States Army Evaluation Center
2014-12-01
analysis of qualitative or quantitative data obtained from design reviews, hardware inspections, M&S, hardware and software testing, metrics review... Research Development Test & Evaluation (RDT&E) appropriation account. The Defense Acquisition Portal ACQuipedia website describes RDT&E as “one of the... research, design, development, test and evaluation, production, installation, operation, and maintenance; data collection; processing and analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Basham, Bryan D.
1989-01-01
CONFIG is a modeling and simulation tool prototype for analyzing the normal and faulty qualitative behaviors of engineered systems. Qualitative modeling and discrete-event simulation have been adapted and integrated, to support early development, during system design, of software and procedures for management of failures, especially in diagnostic expert systems. Qualitative component models are defined in terms of normal and faulty modes and processes, which are defined by invocation statements and effect statements with time delays. System models are constructed graphically by using instances of components and relations from object-oriented hierarchical model libraries. Extension and reuse of CONFIG models and analysis capabilities in hybrid rule- and model-based expert fault-management support systems are discussed.
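The mode-and-delayed-effect scheme described above can be illustrated as a toy discrete-event simulation. CONFIG's real implementation is not described in this abstract, so every name below (Component, simulate, the valve/pump fault-propagation example) is hypothetical:

```python
import heapq

# Toy sketch of qualitative component modes with delayed effects; all names
# and the valve/pump example are hypothetical, not taken from CONFIG itself.
class Component:
    def __init__(self, name, mode="normal"):
        self.name = name
        self.mode = mode

def simulate(events, horizon=10):
    """events: list of (time, component, new_mode); returns the mode history."""
    heapq.heapify(events)
    history = []
    while events and events[0][0] <= horizon:
        t, comp, mode = heapq.heappop(events)
        comp.mode = mode
        history.append((t, comp.name, mode))
        if mode == "stuck_closed":
            # A faulty valve mode posts a delayed loss-of-flow effect downstream.
            heapq.heappush(events, (t + 2, pump, "no_inflow"))
    return history

valve = Component("valve")
pump = Component("pump")
trace = simulate([(1, valve, "stuck_closed")])
```

Running the sketch propagates the injected valve fault to the pump two time units later, mimicking the invocation-plus-time-delay effect statements the abstract describes.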
White, Michael J.; Judd, Maya D.; Poliandri, Simone
2012-01-01
Although there has been much optimistic discussion of integrating quantitative and qualitative findings into sociological analysis, there remains a gap regarding the application of mixed approaches. We examine the potential gains and pitfalls of such integration in the context of the growing analytic power of contemporary qualitative data analysis software (QDAS) programs. We illustrate the issues with our own research in a mixed-methods project examining low fertility in Italy, a project that combines analysis of large nationally representative survey data with qualitative in-depth interviews with women across four (4) cities in Italy. Despite the enthusiasm for mixed-methods research, the available software appears to be underutilized. In addition, we suggest that the sociological research community will want to address several conceptual and inferential issues with these approaches. PMID:23543938
Making Visible the Coding Process: Using Qualitative Data Software in a Post-Structural Study
ERIC Educational Resources Information Center
Ryan, Mary
2009-01-01
Qualitative research methods require transparency to ensure the "trustworthiness" of the data analysis. The intricate processes of organising, coding and analysing the data are often rendered invisible in the presentation of the research findings, which requires a "leap of faith" for the reader. Computer assisted data analysis software can be used…
Lindoerfer, Doris; Mansmann, Ulrich
2017-07-01
Patient registries are instrumental for medical research. Often their structures are complex and their implementations use composite software systems to meet the wide spectrum of challenges. Commercial and open-source systems are available for registry implementation, but many research groups develop their own systems. Methodological approaches to the selection of software, as well as to the construction of proprietary systems, are needed. We propose an evidence-based checklist, summarizing essential items for patient registry software systems (CIPROS), to accelerate the requirements engineering process. Requirements engineering activities for software systems follow traditional software requirements elicitation methods, general software requirements specification (SRS) templates, and standards. We performed a multistep procedure to develop a specific evidence-based CIPROS checklist: (1) a systematic literature review to build a comprehensive collection of technical concepts, (2) a qualitative content analysis to define a catalogue of relevant criteria, and (3) a checklist to construct a minimal appraisal standard. CIPROS is based on 64 publications and covers twelve sections with a total of 72 items. CIPROS also defines software requirements. Comparing CIPROS with traditional software requirements elicitation methods, SRS templates, and standards shows broad consensus but differences in issues regarding registry-specific aspects. Using an evidence-based approach to requirements engineering for registry software adds aspects to the traditional methods and accelerates the software engineering process for registry software. The method we used to construct CIPROS serves as a potential template for creating evidence-based checklists in other fields. The CIPROS list supports developers in assessing requirements for existing systems and formulating requirements for their own systems, while strengthening the reporting of patient registry software system descriptions.
It may be a first step to create standards for patient registry software system assessments. Copyright © 2017 Elsevier Inc. All rights reserved.
[Stressor and stress reduction strategies for computer software engineers].
Asakura, Takashi
2002-07-01
First, in this article we discuss 10 significant occupational stressors for computer software engineers, based on a review of the scientific literature on their stress and mental health. The stressors include 1) quantitative work overload, 2) time pressure, 3) qualitative workload, 4) speed and diffusion of technological innovation, and technological divergence, 5) low discretionary power, 6) underdeveloped career pattern, 7) low earnings/reward from jobs, 8) difficulties in managing a project team for software development and establishing a support system, 9) difficulties in customer relations, and 10) personality characteristics. In addition, we delineate the working and organizational conditions that cause such occupational stressors in order to find strategies to reduce those stressors in their workplaces. Finally, we suggest three stressor and stress reduction strategies for software engineers.
Comprehensive Quantitative Analysis on Privacy Leak Behavior
Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan
2013-01-01
Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046
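The abstract names the four metrics and an overall leak degree but does not reproduce the Privacy Petri Net formulas. As a loose illustration only, not the paper's actual definition, one way such metrics could be aggregated, assuming each is normalized to [0, 1]:

```python
# Illustrative aggregate only; the real PPN-based metric definitions are in
# the paper and are not reproduced here.
def overall_leak_degree(possibility, severity, crypticity, manipulability,
                        weights=(0.25, 0.25, 0.25, 0.25)):
    metrics = (possibility, severity, crypticity, manipulability)
    return sum(w * m for w, m in zip(weights, metrics))

degree = overall_leak_degree(0.8, 0.6, 0.4, 0.2)  # 0.5 with equal weights
```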
Iterative categorization (IC): a systematic technique for analysing qualitative data
2016-01-01
Abstract The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. PMID:26806155
Hybrid Modeling for Testing Intelligent Software for Lunar-Mars Closed Life Support
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Nicholson, Leonard S. (Technical Monitor)
1999-01-01
Intelligent software is being developed for closed life support systems with biological components, for human exploration of the Moon and Mars. The intelligent software functions include planning/scheduling, reactive discrete control and sequencing, management of continuous control, and fault detection, diagnosis, and management of failures and errors. Four types of modeling information have been essential to system modeling and simulation to develop and test the software and to provide operational model-based what-if analyses: discrete component operational and failure modes; continuous dynamic performance within component modes, modeled qualitatively or quantitatively; configuration of flows and power among components in the system; and operations activities and scenarios. CONFIG, a multi-purpose discrete event simulation tool that integrates all four types of models for use throughout the engineering and operations life cycle, has been used to model components and systems involved in the production and transfer of oxygen and carbon dioxide in a plant-growth chamber and between that chamber and a habitation chamber with physicochemical systems for gas processing.
Design of Mobile Health Tools to Promote Goal Achievement in Self-Management Tasks
Henderson, Geoffrey; Parmanto, Bambang
2017-01-01
Background: Goal-setting within rehabilitation is a common practice ultimately geared toward helping patients make functional progress. Objective: The purposes of this study were to (1) qualitatively analyze data from a wellness program for patients with spina bifida (SB) and spinal cord injury (SCI) in order to generate software requirements for a goal-setting module to support their complex goal-setting routines, (2) design a prototype of a goal-setting module within an existing mobile health (mHealth) system, and (3) identify what educational content might be necessary to integrate into the system. Methods: A total of 750 goals were analyzed from patients with SB and SCI enrolled in a wellness program. These goals were qualitatively analyzed in order to operationalize a set of software requirements for an mHealth goal-setting module and identify important educational content. Results: Those of male sex (P=.02) and with SCI diagnosis (P<.001) were more likely to achieve goals than females or those with SB. Temporality (P<.001) and type (P<.001) of goal were associated with likelihood that the goal would be achieved. Nearly all (210/213; 98.6%) of the fact-finding goals were achieved. There was no significant difference in achievement based on goal theme. Checklists, data tracking, and fact-finding tools were identified as three functionalities that could support goal-setting and achievement in an mHealth system. Based on the qualitative analysis, a list of software requirements for a goal-setting module was generated, and a prototype was developed. Targets for educational content were also generated. Conclusions: Innovative mHealth tools can be developed to support commonly set goals by individuals with disabilities. PMID:28739558
Rupcic, Sonia; Tamrat, Tigest; Kachnowski, Stan
2012-11-01
This study reviews the state of diabetes information technology (IT) initiatives and presents a set of recommendations for improvement based on interviews with commercial IT innovators. Semistructured interviews were conducted with 10 technology developers, representing 12 of the most successful IT companies in the world. Average interview time was approximately 45 min. Interviews were audio-recorded, transcribed, and entered into ATLAS.ti for qualitative data analysis. Themes were identified through a process of selective and open coding by three researchers. We identified two practices, common among successful IT companies, that have allowed them to avoid or surmount the challenges that confront healthcare professionals involved in diabetes IT development: (1) employing a diverse research team of software developers and engineers, statisticians, consumers, and business people and (2) conducting rigorous research and analytics on technology use and user preferences. Because of the nature of their respective fields, healthcare professionals and commercial innovators face different constraints. With these in mind we present three recommendations, informed by practices shared by successful commercial developers, for those involved in developing diabetes IT programming: (1) include software engineers on the implementation team throughout the intervention, (2) conduct more extensive baseline testing of users and monitor the usage data derived from the technology itself, and (3) pursue Institutional Review Board-exempt research.
Avila, Javier; Sostmann, Kai; Breckwoldt, Jan; Peters, Harm
2016-06-03
Electronic portfolios (ePortfolios) are used to document and support learning activities. E-portfolios with mobile capabilities allow even more flexibility. However, the development or acquisition of ePortfolio software is often costly, and at the same time, commercially available systems may not sufficiently fit the institution's needs. The aim of this study was to design and evaluate an ePortfolio system with mobile capabilities using a free and open-source software solution. We created an online ePortfolio environment using the blogging software WordPress, selected on the basis of its reported capability features by a qualitative weight-and-sum method. Technical implementation and usability were evaluated by 25 medical students during their clinical training by quantitative and qualitative means, using online questionnaires and focus groups. The WordPress ePortfolio environment allowed students a broad spectrum of activities, often documented via mobile devices: collection of multimedia evidence, posting reflections, messaging, web publishing, ePortfolio searches, collaborative learning, knowledge management in a content management system including a wiki and RSS feeds, and the use of study aid tools. The students' experience with WordPress revealed a few technical problems, and this report provides workarounds. The WordPress ePortfolio was rated positively by the students as a content management system (67 % of the students), for exchange with other students (74 %), as a note pad for reflections (53 %), and for its potential as an information source for assessment (48 %) and exchange with a mentor (68 %). On the negative side, 74 % of the students in this pilot study did not find it easy to get started with the system, and 63 % rated the ePortfolio as not being user-friendly. Qualitative analysis indicated a need for more introductory information and training.
It is possible to build an advanced ePortfolio system with mobile capabilities with the free and open source software WordPress. This allows institutions without proprietary software to build a sophisticated ePortfolio system adapted to their needs with relatively few resources. The implementation of WordPress should be accompanied by introductory courses in the use of the software and its apps in order to facilitate its usability.
Validating agent oriented methodology (AOM) for netlogo modelling and simulation
NASA Astrophysics Data System (ADS)
WaiShiang, Cheah; Nissom, Shane; YeeWai, Sim; Sharbini, Hamizan
2017-10-01
AOM (Agent Oriented Modeling) is a comprehensive and unified agent methodology for agent oriented software development. The AOM methodology was proposed to aid developers by introducing techniques, terminology, notation, and guidelines during agent system development. Although the AOM methodology is claimed to be capable of developing complex real-world systems, its potential is yet to be realized and recognized by the mainstream software community, and the adoption of AOM is still in its infancy. Among the reasons is that there are few case studies or success stories of AOM. This paper presents two case studies on the adoption of AOM for individual-based modelling and simulation. It demonstrates how AOM is useful for epidemiology and ecological studies, and hence further validates AOM in a qualitative manner.
Using Discursis to enhance the qualitative analysis of hospital pharmacist-patient interactions.
Chevalier, Bernadette A M; Watson, Bernadette M; Barras, Michael A; Cottrell, William N; Angus, Daniel J
2018-01-01
Pharmacist-patient communication during medication counselling has been successfully investigated using Communication Accommodation Theory (CAT). Communication researchers in other healthcare professions have utilised Discursis software as an adjunct to their manual qualitative analysis processes. Discursis provides a visual, chronological representation of communication exchanges and identifies patterns of interactant engagement. The aim of this study was to describe how Discursis software was used to enhance previously conducted qualitative analysis of pharmacist-patient interactions (by visualising pharmacist-patient speech patterns, episodes of engagement, and identifying CAT strategies employed by pharmacists within these episodes). Visual plots from 48 transcribed audio recordings of pharmacist-patient exchanges were generated by Discursis. Representative plots were selected to show moderate-high and low-level speaker engagement. Details of engagement were investigated for pharmacist application of CAT strategies (approximation, interpretability, discourse management, emotional expression, and interpersonal control). Discursis plots allowed for identification of distinct patterns occurring within pharmacist-patient exchanges. Moderate-high pharmacist-patient engagement was characterised by multiple off-diagonal squares, while alternating single-coloured squares depicted low engagement. Engagement episodes were associated with multiple CAT strategies such as discourse management (open-ended questions). Patterns reflecting pharmacist or patient speaker dominance were dependent on clinical setting. Discursis analysis of pharmacist-patient interactions, a novel application of the technology in health communication, was found to be an effective visualisation tool to pinpoint episodes for CAT analysis. Discursis has numerous practical and theoretical applications for future health communication research and training.
Researchers can use the software to support qualitative analysis where large data sets can be quickly reviewed to identify key areas for concentrated analysis. Because Discursis plots are easily generated from audio recorded transcripts, they are conducive as teaching tools for both students and practitioners to assess and develop their communication skills.
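The engagement plots described above rest on scoring conceptual similarity between pairs of speaker turns. Discursis's actual (proprietary) algorithm is not given here; a rough stand-in using Jaccard word overlap, with hypothetical dialogue, is:

```python
# Rough sketch of conceptual recurrence via Jaccard word overlap between
# speaker turns; Discursis's real similarity measure differs.
def overlap(a, b):
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

# Hypothetical pharmacist-patient turns; high off-diagonal scores correspond
# to the off-diagonal squares of engagement described above.
turns = [
    "how do you take this medication at home",
    "i take this medication every morning at home",
    "any side effects so far",
]
matrix = [[overlap(a, b) for b in turns] for a in turns]
```

Plotting `matrix` as a heat map yields a Discursis-like recurrence view: the first two turns share vocabulary (an engagement episode), while the third does not.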
Awoonor-Williams, John Koku; Schmitt, Margaret L; Tiah, Janet; Ndago, Joyce; Asuru, Rofina; Bawah, Ayaga A; Phillips, James F
2016-01-01
In 2010, the Ghana Health Service launched a program of cooperation with the Tanzania Ministry of Health and Social Welfare that was designed to adapt Tanzania's PLANREP budgeting and reporting tool to Ghana's primary health care program. The product of this collaboration is a system of budgeting, data visualization, and reporting that is known as the District Health Planning and Reporting Tool (DiHPART). This study was conducted to evaluate the design and implementation processes (technical, procedures, feedback, maintenance, and monitoring) of the DiHPART tool in northern Ghana. This paper reports on a qualitative appraisal of user reactions to the DiHPART system and implications of pilot experience for national scale-up. A total of 20 health officials responsible for financial planning operations were drawn from the national, regional, and district levels of the health system and interviewed in open-ended discussions about their reactions to DiHPART and suggestions for systems development. The findings show that technical shortcomings merit correction before scale-up can proceed. The review makes note of features of the software system that could be developed, based on experience gained from the pilot. Changes in the national system of financial reporting and budgeting complicate DiHPART utilization. This attests to the importance of pursuing a software application framework that anticipates the need for automated software generation. Despite challenges encountered in the pilot, the results lend support to the notion that evidence-based budgeting merits development and implementation in Ghana.
Communication and Organization in Software Development: An Empirical Study
NASA Technical Reports Server (NTRS)
Seaman, Carolyn B.; Basili, Victor R.
1996-01-01
The empirical study described in this paper addresses the issue of communication among members of a software development organization. The independent variables are various attributes of organizational structure. The dependent variable is the effort spent on sharing information which is required by the software development process in use. The research questions upon which the study is based ask whether or not these attributes of organizational structure have an effect on the amount of communication effort expended. In addition, there are a number of blocking variables which have been identified. These are used to account for factors other than organizational structure which may have an effect on communication effort. The study uses both quantitative and qualitative methods for data collection and analysis. These methods include participant observation, structured interviews, and graphical data presentation. The results of this study indicate that several attributes of organizational structure do affect communication effort, but not in a simple, straightforward way. In particular, the distances between communicators in the reporting structure of the organization, as well as in the physical layout of offices, affects how quickly they can share needed information, especially during meetings. These results provide a better understanding of how organizational structure helps or hinders communication in software development.
Test Driven Development of a Parameterized Ice Sheet Component
NASA Astrophysics Data System (ADS)
Clune, T.
2011-12-01
Test driven development (TDD) is a software development methodology that offers many advantages over traditional approaches, including reduced development and maintenance costs, improved reliability, and superior design quality. Although TDD is widely accepted in many software communities, its suitability for scientific software is largely undemonstrated and warrants a degree of skepticism. Indeed, numerical algorithms pose several challenges to unit testing in general, and TDD in particular. Among these challenges are the need to have simple, non-redundant closed-form expressions to compare against the results obtained from the implementation, as well as realistic error estimates. The necessity for serial and parallel performance raises additional concerns for many scientific applications. In previous work I demonstrated that TDD performed well for the development of a relatively simple numerical model that simulates the growth of snowflakes, but the results were anecdotal and of limited relevance to far more complex software components typical of climate models. This investigation has now been extended by successfully applying TDD to the implementation of a substantial portion of a new parameterized ice sheet component within a full climate model. After a brief introduction to TDD, I will present techniques that address some of the obstacles encountered with numerical algorithms. I will conclude with some quantitative and qualitative comparisons against climate components developed in a more traditional manner.
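The obstacle the abstract highlights, pinning a numerical result to a simple closed-form expression with a realistic error estimate, can be sketched as a hypothetical test-first example (trapezoid rule checked against the exact integral of x^2):

```python
# TDD-style sketch for a numerical kernel: the test encodes a closed-form
# expectation with a realistic O(h^2) error bound, written before (and
# driving) the implementation.
def trapezoid(f, a, b, n):
    """Composite trapezoid rule on [a, b] with n panels."""
    h = (b - a) / n
    return h * (0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n)))

def test_trapezoid_matches_closed_form():
    # Integral of x^2 over [0, 1] is exactly 1/3; trapezoid error ~ h^2 / 6.
    approx = trapezoid(lambda x: x * x, 0.0, 1.0, 100)
    assert abs(approx - 1.0 / 3.0) < 2e-5

test_trapezoid_matches_closed_form()
```

The tolerance is not arbitrary: for a function with bounded second derivative the composite rule's error is (b-a) h^2 f''(ξ)/12, which for x^2 with n=100 is about 1.7e-5.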
Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo
2016-01-01
The importance of quality assurance in the software development process cannot be overemphasized, because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization, and improvement, amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered areas on quality planning, adherence to standardized processes, and the inherent challenges, this work has been extended to include quality control, software process improvement, and international quality standard organization membership. It also makes comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. The qualitative research approach, specifically the use of questionnaire research instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, and this can be the cause of low patronage. Moreover, software practitioners are aware neither of international standards organizations nor of the required process improvement techniques; as such, their claimed standards are not aligned with those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha, and it proved reliable.
For the software industry in developing countries to grow strong and become a viable source of external revenue, software assurance practices have to be taken seriously, because their effect is evident in the final product. Moreover, quality frameworks and tools that require minimal time and cost are highly needed in these countries.
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
Read, Sue; Nte, Sol; Corcoran, Patsy; Stephens, Richard
2013-05-01
Loss is a universal experience and death is perceived as the ultimate loss. The overarching aim of this research is to produce a qualitative, flexible, interactive, computerised tool to support the facilitation of emotional expressions around loss for people with intellectual disabilities. This paper explores the process of using Participatory Action Research (PAR) to develop this tool. Participatory Action Research provided the indicative framework for the process of developing a software tool that is likely to be used in practice. People with intellectual disability worked alongside researchers to produce an accessible, flexible piece of software that can facilitate storytelling around loss and bereavement and promote spontaneous expression that can be shared with others. This tool has the capacity to enable individuals to capture experiences in a storyboard format; that can be stored; is easily retrievable; can be printed out; and could feasibly be personalised by the insertion of photographs. © 2012 Blackwell Publishing Ltd.
2015-05-30
study used quantitative and qualitative analytical methods in the examination of software versus hardware maintenance trends and forecasts, human and ... financial resources at TYAD and SEC, and overall compliance with Title 10 mandates (e.g., 10 USC 2466). Quantitative methods were executed by ... Systems (PEO EIS). These methods will provide quantitative-based analysis on which to base and justify trends and gaps, as well as qualitative methods
ERIC Educational Resources Information Center
Read, Sue; Nte, Sol; Corcoran, Patsy; Stephens, Richard
2013-01-01
Background: Loss is a universal experience and death is perceived as the ultimate loss. The overarching aim of this research is to produce a qualitative, flexible, interactive, computerised tool to support the facilitation of emotional expressions around loss for people with intellectual disabilities. This paper explores the process of using…
ERIC Educational Resources Information Center
Sandler, Heidi J.
2016-01-01
The purpose of this grounded theory study was to examine the relationship between corporate culture (artifacts, values, and assumptions) and the creative endeavor of innovation in the software development industry. Innovation, the active implementation of creative ideas, is a widespread enterprise in the corporate world, especially in the areas of…
ERIC Educational Resources Information Center
Bueno de Mesquita, Paul; Dean, Ross F.; Young, Betty J.
2010-01-01
Advances in digital video technology create opportunities for more detailed qualitative analyses of actual teaching practice in science and other subject areas. User-friendly digital cameras and highly developed, flexible video-analysis software programs have made the tasks of video capture, editing, transcription, and subsequent data analysis…
Exploring the Convergence of Sequences in the Embodied World Using GeoGebra
ERIC Educational Resources Information Center
de Moura Fonseca, Daila Silva Seabra; de Oliveira Lino Franchi, Regina Helena
2016-01-01
This study addresses the embodied approach of convergence of numerical sequences using the GeoGebra software. We discuss activities that were applied in regular calculus classes, as a part of a research which used a qualitative methodology and aimed to identify contributions of the development of activities based on the embodiment of concepts,…
ERIC Educational Resources Information Center
Günersel, Adalet B.; Fleming, Steven A.
2013-01-01
Research shows that computer-based simulations and animations are especially helpful in fields such as chemistry where concepts are abstract and cannot be directly observed. Bio-Organic Reaction Animations (BioORA) is a freely available 3D visualization software program developed to help students understand the chemistry of biomolecular events.…
The Case for Open Source Software: The Interactional Discourse Lab
ERIC Educational Resources Information Center
Choi, Seongsook
2016-01-01
Computational techniques and software applications for the quantitative content analysis of texts are now well established, and many qualitative data software applications enable the manipulation of input variables and the visualization of complex relations between them via interactive and informative graphical interfaces. Although advances in…
Awoonor-Williams, John Koku; Schmitt, Margaret L.; Tiah, Janet; Ndago, Joyce; Asuru, Rofina; Bawah, Ayaga A.; Phillips, James F.
2016-01-01
Background: In 2010, the Ghana Health Service launched a program of cooperation with the Tanzania Ministry of Health and Social Welfare that was designed to adapt Tanzania's PLANREP budgeting and reporting tool to Ghana's primary health care program. The product of this collaboration is a system of budgeting, data visualization, and reporting known as the District Health Planning and Reporting Tool (DiHPART). Objective: This study was conducted to evaluate the design and implementation processes (technical, procedures, feedback, maintenance, and monitoring) of the DiHPART tool in northern Ghana. Design: This paper reports on a qualitative appraisal of user reactions to the DiHPART system and the implications of pilot experience for national scale-up. A total of 20 health officials responsible for financial planning operations were drawn from the national, regional, and district levels of the health system and interviewed in open-ended discussions about their reactions to DiHPART and their suggestions for systems development. Results: The findings show that technical shortcomings merit correction before scale-up can proceed. The review notes features of the software system that could be developed further, based on experience gained from the pilot. Changes in the national system of financial reporting and budgeting complicate DiHPART utilization, which attests to the importance of pursuing a software application framework that anticipates the need for automated software generation. Conclusions: Despite challenges encountered in the pilot, the results lend support to the notion that evidence-based budgeting merits development and implementation in Ghana. PMID:27246868
Development and weighting of a life cycle assessment screening model
NASA Astrophysics Data System (ADS)
Bates, Wayne E.; O'Shaughnessy, James; Johnson, Sharon A.; Sisson, Richard
2004-02-01
Nearly all life cycle assessment tools available today are high-priced, comprehensive and quantitative models requiring a significant amount of data collection and data input. In addition, most of the available software packages require a great deal of training time to learn how to operate the model software. Even after this time investment, results are not guaranteed because of the number of estimations and assumptions often necessary to run the model. As a result, product development and design teams and environmental specialists need a simplified tool that will allow for the qualitative evaluation and "screening" of various design options. This paper presents the development and design of a generic, qualitative life cycle screening model and demonstrates its applicability and ease of use. The model uses qualitative environmental, health and safety factors, based on site- or product-specific issues, to sensitize the overall results for a given set of conditions. The paper also evaluates the impact of different population input ranking values on model output. The final analysis is based on site- or product-specific variables. The user can then evaluate various design changes and the apparent impact or improvement on the environment, health and safety, compliance cost and overall corporate liability. Major input parameters can be varied, and factors such as materials use, pollution prevention, waste minimization, worker safety, product life, environmental impacts, return on investment, and recycling are evaluated. The flexibility of the model format will be discussed in order to demonstrate the applicability and usefulness within nearly any industry sector. Finally, an example using audience input value scores will be compared to other population input results.
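The mechanics of a qualitative screening model of the kind described, ordinal ratings sensitized by user-supplied weights, can be sketched as a simple weighted-sum score. The factor names, weights and five-point scale below are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical sketch of a weighted qualitative screening score: each design
# option is rated on an ordinal 1 (poor) to 5 (good) scale per factor, and
# user weights sensitize the aggregate. Names and numbers are invented.
def screening_score(ratings, weights):
    """Weighted sum of ordinal ratings, normalized so the best possible score is 1.0."""
    total = sum(weights[f] * r for f, r in ratings.items())
    max_total = sum(weights[f] * 5 for f in ratings)
    return total / max_total

weights = {"materials_use": 3, "worker_safety": 5, "recyclability": 2}
option_a = {"materials_use": 4, "worker_safety": 5, "recyclability": 2}
option_b = {"materials_use": 5, "worker_safety": 3, "recyclability": 4}

print(round(screening_score(option_a, weights), 2))  # → 0.82
print(round(screening_score(option_b, weights), 2))  # → 0.76
```

Varying the weight vector is the "sensitizing" step: under a safety-heavy weighting option A screens higher, while a recycling-heavy weighting would favour option B, which is exactly the kind of what-if comparison a screening tool is meant to make cheap.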
NASA Technical Reports Server (NTRS)
Eichmann, David A.
1992-01-01
We present a user interface for a software reuse repository that relies both on the informal semantics of faceted classification and the formal semantics of type signatures for abstract data types. The result is an interface providing both structural and qualitative feedback to a software reuser.
Patel, Vaishali N; Riley, Anne W
2007-10-01
A multiple case study was conducted to examine how staff in child out-of-home care programs used data from an Outcomes Management System (OMS) and other sources to inform decision-making. Data collection consisted of thirty-seven semi-structured interviews with clinicians, managers, and directors from two treatment foster care programs and two residential treatment centers, and individuals involved with developing the OMS; and observations of clinical and quality management meetings. Case study and grounded theory methodology guided analyses. The application of qualitative data analysis software is described. Results show that although staff rarely used data from the OMS, they did rely on other sources of systematically collected information to inform clinical, quality management, and program decisions. Analyses of how staff used these data suggest that improving the utility of OMS will involve encouraging staff to participate in data-based decision-making, and designing and implementing OMS in a manner that reflects how decision-making processes operate.
Emerging Uses of Computer Technology in Qualitative Research.
ERIC Educational Resources Information Center
Parker, D. Randall
The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…
Iterative categorization (IC): a systematic technique for analysing qualitative data.
Neale, Joanne
2016-06-01
The processes of analysing qualitative data, particularly the stage between coding and publication, are often vague and/or poorly explained within addiction science and research more broadly. A simple but rigorous and transparent technique for analysing qualitative textual data, developed within the field of addiction, is described. The technique, iterative categorization (IC), is suitable for use with inductive and deductive codes and can support a range of common analytical approaches, e.g. thematic analysis, Framework, constant comparison, analytical induction, content analysis, conversational analysis, discourse analysis, interpretative phenomenological analysis and narrative analysis. Once the data have been coded, the only software required is a standard word processing package. Worked examples are provided. © 2016 The Authors. Addiction published by John Wiley & Sons Ltd on behalf of Society for the Study of Addiction.
Keedle, Hazel; Schmied, Virginia; Burns, Elaine; Dahlen, Hannah
2018-01-01
This article explores the development and evaluation of a smartphone mobile software application (app) to collect qualitative data. The app was specifically designed to capture real-time qualitative data from women planning a vaginal birth after caesarean delivery. This article outlines the design and development of the app to include funding, ethics, and the recruitment of an app developer, as well as the evaluation of using the app by seven participants. Data collection methods used in qualitative research include interviews and focus groups (either online, face-to-face, or by phone), participant diaries, or observations of interactions. This article identifies an alternative data collection methodology using a smartphone app to collect real-time data. The app provides real-time data and instant access to data alongside the ability to access participants from a variety of locations. This allows the researcher to gain insight into the experiences of participants through audio or video recordings in longitudinal studies without the need for constant interactions or interviews with participants. Using smartphone applications can allow researchers to access participants who are traditionally hard to reach and access their data in real time. Evaluating these apps before use in research is invaluable. © 2017 Sigma Theta Tau International.
ERIC Educational Resources Information Center
Garcia, Criselda G.; Hooper, H. H., Jr.
2011-01-01
The purpose of the qualitative study using a phenomenological approach was to gain insight of preservice teachers' experiences with a WebCT seminar designed to develop critical thinking and problem-solving skills in a Hispanic-Serving Institution's teacher education program. By applying a "holistic approach" to analyze data, NVivo software was…
ERIC Educational Resources Information Center
Antony, Laljith
2016-01-01
Failing to prevent leaks of confidential and proprietary information to unauthorized users from software applications is a major challenge that companies face. Access control policies defined in software applications with access control mechanisms are unable to prevent information leaks from software applications to unauthorized users. Role-based…
Boolean Classes and Qualitative Inquiry. WCER Working Paper No. 2006-3
ERIC Educational Resources Information Center
Nathan, Mitchell J.; Jackson, Kristi
2006-01-01
The prominent role of Boolean classes in qualitative data analysis software is viewed by some as an encroachment of logical positivism on qualitative research methodology. The authors articulate an embodiment perspective, in which Boolean classes are viewed as conceptual metaphors for apprehending and manipulating data, concepts, and categories in…
Beyond Constant Comparison Qualitative Data Analysis: Using NVivo
ERIC Educational Resources Information Center
Leech, Nancy L.; Onwuegbuzie, Anthony J.
2011-01-01
The purposes of this paper are to outline seven types of qualitative data analysis techniques, to present step-by-step guidance for conducting these analyses via a computer-assisted qualitative data analysis software program (i.e., NVivo9), and to present screenshots of the data analysis process. Specifically, the following seven analyses are…
Using Discursis to enhance the qualitative analysis of hospital pharmacist-patient interactions
Barras, Michael A.; Angus, Daniel J.
2018-01-01
Introduction: Pharmacist-patient communication during medication counselling has been successfully investigated using Communication Accommodation Theory (CAT). Communication researchers in other healthcare professions have utilised Discursis software as an adjunct to their manual qualitative analysis processes. Discursis provides a visual, chronological representation of communication exchanges and identifies patterns of interactant engagement. Aim: The aim of this study was to describe how Discursis software was used to enhance previously conducted qualitative analysis of pharmacist-patient interactions (by visualising pharmacist-patient speech patterns, episodes of engagement, and identifying CAT strategies employed by pharmacists within these episodes). Methods: Visual plots from 48 transcribed audio recordings of pharmacist-patient exchanges were generated by Discursis. Representative plots were selected to show moderate-high and low-level speaker engagement. Details of engagement were investigated for pharmacist application of CAT strategies (approximation, interpretability, discourse management, emotional expression, and interpersonal control). Results: Discursis plots allowed for identification of distinct patterns occurring within pharmacist-patient exchanges. Moderate-high pharmacist-patient engagement was characterised by multiple off-diagonal squares, while alternating single-coloured squares depicted low engagement. Engagement episodes were associated with multiple CAT strategies, such as discourse management (open-ended questions). Patterns reflecting pharmacist or patient speaker dominance were dependent on clinical setting. Discussion and conclusions: Discursis analysis of pharmacist-patient interactions, a novel application of the technology in health communication, was found to be an effective visualisation tool to pinpoint episodes for CAT analysis.
Discursis has numerous practical and theoretical applications for future health communication research and training. Researchers can use the software to support qualitative analysis where large data sets can be quickly reviewed to identify key areas for concentrated analysis. Because Discursis plots are easily generated from audio recorded transcripts, they are conducive as teaching tools for both students and practitioners to assess and develop their communication skills. PMID:29787568
Effective factors in providing holistic care: a qualitative study.
Zamanzadeh, Vahid; Jasemi, Madineh; Valizadeh, Leila; Keogh, Brian; Taleghani, Fariba
2015-01-01
Holistic care is a comprehensive model of caring. Previous studies have shown that most nurses do not apply this method. Examining the effective factors in nurses' provision of holistic care can help with enhancing it. Studying these factors from the point of view of nurses will generate real and meaningful concepts and can help to extend this method of caring. A qualitative study was used to identify effective factors in holistic care provision. Data gathered by interviewing 14 nurses from university hospitals in Iran were analyzed with a conventional qualitative content analysis method, using MAXQDA (professional software for qualitative and mixed methods data analysis). Analysis of the data revealed three main themes as effective factors in providing holistic care: the structure of the educational system, the professional environment, and personality traits. Establishing appropriate educational and management systems and promoting religiousness and encouragement will induce nurses to provide holistic care and ultimately improve the quality of their caring.
Applying a Framework to Evaluate Assignment Marking Software: A Case Study on Lightwork
ERIC Educational Resources Information Center
Heinrich, Eva; Milne, John
2012-01-01
This article presents the findings of a qualitative evaluation on the effect of a specialised software tool on the efficiency and quality of assignment marking. The software, Lightwork, combines with the Moodle learning management system and provides support through marking rubrics and marker allocations. To enable the evaluation a framework has…
Lynne M. Westphal
2000-01-01
By using computer packages designed for qualitative data analysis a researcher can increase trustworthiness (i.e., validity and reliability) of conclusions drawn from qualitative research results. This paper examines trustworthiness issues and therole of computer software (QSR's NUD*IST) in the context of a current research project investigating the social...
Simulation Of Combat With An Expert System
NASA Technical Reports Server (NTRS)
Provenzano, J. P.
1989-01-01
Proposed expert system predicts outcomes of combat situations. Called "COBRA" (combat outcome based on rules for attrition), system selects rules for mathematical modeling of losses and discrete events in combat according to previous experiences. Used with another software module known as the "Game". Game/COBRA software system, consisting of Game and COBRA modules, provides for both quantitative and qualitative aspects in simulations of battles. Although COBRA intended for simulation of large-scale military exercises, concepts embodied in it have much broader applicability. In industrial research, knowledge-based system enables qualitative as well as quantitative simulations.
Establishing Qualitative Software Metrics in Department of the Navy Programs
2015-10-29
dedicated to provide the highest quality software to its users. In doing so, there is a need for a formalized set of Software Quality Metrics. The goal...of this paper is to establish the validity of those necessary Quality metrics. In our approach we collected the data of over a dozen programs...provide the necessary variable data for our formulas and tested the formulas for validity. Keywords: metrics; software; quality I. PURPOSE Space
Towards Measurement of Confidence in Safety Cases
NASA Technical Reports Server (NTRS)
Denney, Ewen; Pai, Ganesh J.; Habli, Ibrahim
2011-01-01
Arguments in safety cases are predominantly qualitative. This is partly attributed to the lack of sufficient design and operational data necessary to measure the achievement of high-dependability targets, particularly for safety-critical functions implemented in software. The subjective nature of many forms of evidence, such as expert judgment and process maturity, also contributes to the overwhelming dependence on qualitative arguments. However, where data for quantitative measurements is systematically collected, quantitative arguments provide far greater benefits than qualitative arguments in assessing confidence in the safety case. In this paper, we propose a basis for developing and evaluating integrated qualitative and quantitative safety arguments based on the Goal Structuring Notation (GSN) and Bayesian Networks (BN). The approach we propose identifies structures within GSN-based arguments where uncertainties can be quantified. BN are then used to provide a means to reason about confidence in a probabilistic way. We illustrate our approach using a fragment of a safety case for an unmanned aerial system and conclude with some preliminary observations.
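The quantitative half of the approach, using a Bayesian network to reason about confidence probabilistically, reduces in the smallest case to a two-node network updated by Bayes' rule. The probabilities below are invented for illustration; the paper works with richer GSN-derived structures, not this toy.

```python
# Toy two-node Bayesian network: confidence in a safety claim is updated by
# one item of evidence (e.g. a passing verification test). All probability
# values here are illustrative assumptions, not from the paper.
def posterior_confidence(prior, p_evidence_given_true, p_evidence_given_false):
    """P(claim | evidence) via Bayes' rule."""
    num = p_evidence_given_true * prior
    den = num + p_evidence_given_false * (1.0 - prior)
    return num / den

# Prior belief that the claim holds, before seeing the evidence.
prior = 0.7
# Passing evidence is likely if the claim is true, less likely otherwise.
conf = posterior_confidence(prior, p_evidence_given_true=0.95,
                            p_evidence_given_false=0.30)
print(round(conf, 3))  # → 0.881
```

The same update generalizes node by node across an argument structure: each GSN goal whose uncertainty can be quantified contributes a conditional probability table, and the network propagates evidence into an overall confidence figure instead of a purely qualitative judgment.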
ERIC Educational Resources Information Center
Stuck, M. F.
This guide provides an introduction to the use of microcomputers with qualitative data. It is deliberately non-specific and rudimentary in order to be of maximum benefit to the widest possible audience of beginning microcomputer users who wish to analyze their data using software that is not specifically designed for qualitative data analysis. The…
Resilient Women Leaders: A Qualitative Investigation
ERIC Educational Resources Information Center
Baldwin, Julia; Maldonado, Nancy L.; Lacey, Candace H.; Efinger, Joan
2004-01-01
This qualitative study investigated perceptions of resilient, transformational, successful women leaders regarding their own resiliency and leadership. The ten participants provided information during semi-structured, open-ended, audio taped interviews which were transcribed, hand coded, and then analyzed with QSR N6 software. Findings indicated…
Utilizing Microsoft[R] Office to Produce and Present Recursive Frame Analysis Findings
ERIC Educational Resources Information Center
Chenail, Ronald J.; Duffy, Maureen
2011-01-01
Although researchers conducting qualitative descriptive studies, ethnographies, phenomenologies, grounded theory, and narrative inquiries commonly use computer-assisted qualitative data analysis software (CAQDAS) to manage their projects and analyses, investigators conducting discursive methodologies such as discourse or conversation analysis seem…
System testing of a production Ada (trademark) project: The GRODY study
NASA Technical Reports Server (NTRS)
Seigle, Jeffrey; Esker, Linda; Shi, Ying-Liang
1990-01-01
The use of the Ada language and design methodologies that utilize its features has a strong impact on all phases of the software development project lifecycle. At the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC), the Software Engineering Laboratory (SEL) conducted an experiment in parallel development of two flight dynamics systems in FORTRAN and Ada. The teams found some qualitative differences between the system test phases of the two projects. Although planning for system testing and conducting of tests were not generally affected by the use of Ada, the solving of problems found in system testing was generally facilitated by Ada constructs and design methodology. Most problems found in system testing were not due to difficulty with the language or methodology but to lack of experience with the application.
NASA Astrophysics Data System (ADS)
Poppe, Michaela; Zitek, Andreas; Salles, Paulo; Bredeweg, Bert; Muhar, Susanne
2010-05-01
The education system needs strategies to attract future scientists and practitioners. There is an alarming decline in the number of students choosing science subjects. Reasons for this include the perceived complexity and the lack of effective cognitive tools that enable learners to acquire the expertise in a way that fits its qualitative nature. The DynaLearn project utilises a "Learning by modelling" approach to deliver an individualised and engaging cognitive tool for acquiring conceptual knowledge. The modelling approach is based on qualitative reasoning, a research area within artificial intelligence, and allows for capturing and simulating qualitative systems knowledge. Educational activities within the DynaLearn software address topics at different levels of complexity, depending on the educational goals and settings. DynaLearn uses virtual characters in the learning environment as agents for engaging and motivating the students during their modelling exercise. The DynaLearn software represents an interactive learning environment in which learners are in control of their learning activities. The software is able to coach them individually based on their current progress, their knowledge needs and learning goals. Within the project 70 expert models on different environmental issues covering seven core topics (Earth Systems and Resources, The Living World, Human population, Land and Water Use, Energy Resources and Consumption, Pollution, and Global Changes) will be delivered. In the context of the core topic "Land and Water Use" the Institute of Hydrobiology and Aquatic Ecosystem Management has developed a model on Sustainable River Catchment Management. River systems with their catchments have been tremendously altered due to human pressures with serious consequences for the ecological integrity of riverine landscapes. 
The operation of hydropower plants, the implementation of flood protection measures, the regulation of flow and sediment regime and intensive land use in the catchments have created ecological problems. A sustainable, catchment-wide management of riverine landscapes is needed and stated by water right acts, e.g. the European Water Framework and Floods Directive. This interdisciplinary approach needs the integration of natural riverine processes, flood protection, resource management, landscape planning, and social and political aspects to achieve a sustainable development. Therefore the model shows the effects of different management strategies concerning flood protection, restoration measures and land use. The model illustrates the wide range of ecosystem services of riverine landscapes that contribute to human well-being such as water supply, hydropower generation, flood regulation, and recreational opportunities. The effects of different land use strategies in the catchment are highlighted by means of the Driver-Pressure-State-Impact-Response (DPSIR) framework. The model is used to support activities of students at the University as well as at High School within the DynaLearn Software to promote scientific culture in the secondary education system. Model fragments allow learners to re-use parts of the existing model at different levels of complexity. But learners can also construct their own conceptual system knowledge, either individually or in a collaborative setting, and using the model as a reference for comparisons of their own understanding. Of special interest for the DynaLearn project is the intended development of interdisciplinary and social skills like cooperative working, cross-linked thinking, problem solving, decision-making, and the identification of the conflicts between environment, economy, legislation, science, technology, and society. A comprehensive evaluation of the DynaLearn software is part of the project. 
To be effective, science education should focus on understanding scientific concepts and on application of scientific knowledge to everyday life. Conceptual knowledge of systems behaviour is crucial for society to understand and successfully interact with its environment. The transfer of environmental-scientific knowledge by means of the DynaLearn software to wide parts of the society can be regarded as an important contribution to that, and contributes to foster a life-long learning process.
A study about teaching quadratic functions using mathematical models and free software
NASA Astrophysics Data System (ADS)
Nepomucena, T. V.; da Silva, A. C.; Jardim, D. F.; da Silva, J. M.
2017-12-01
Given the reality of teaching Mathematics in Basic Education in Brazil, especially the teaching of functions and their relevance to students' academic development in Basic and Higher Education, this work proposes the use of educational software to support the teaching of functions in Basic Education, since computers and software stand out as options to support the teaching and learning processes. The study also adopts Didactic Transposition as its methodology for investigation and research. Alongside this survey, teaching interventions were applied to detect the main difficulties in the teaching of functions in Basic Education, and the results obtained during the interventions were analyzed qualitatively. From the discussion of the results at the end of the didactic interventions, it was verified that the results obtained were satisfactory.
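A typical modelling activity of the kind such interventions use, connecting a quadratic model to a concrete situation and reading off its vertex, can be sketched briefly. The projectile example and values below are illustrative assumptions, not taken from the study itself, which worked with free educational software in the classroom.

```python
# Illustrative quadratic-modelling activity: model projectile height as
# h(t) = a*t^2 + b*t + c and locate the maximum at the vertex. Values invented.
def quadratic(a, b, c):
    return lambda x: a * x * x + b * x + c

def vertex(a, b, c):
    """Vertex of y = a*x^2 + b*x + c, at x = -b / (2a)."""
    x = -b / (2 * a)
    return x, quadratic(a, b, c)(x)

# A ball thrown upward at 19.6 m/s from 1 m height: h(t) = -4.9 t^2 + 19.6 t + 1
t_max, h_max = vertex(-4.9, 19.6, 1.0)
print(round(t_max, 2), round(h_max, 2))  # → 2.0 20.6
```

In a free package such as GeoGebra the same activity is done graphically, sliding coefficients and watching the parabola move, which is the embodied, experimental contact with the function concept these interventions aim for.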
A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.
Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy
2016-12-01
Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements was identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
ERIC Educational Resources Information Center
Srinivasan, Srilekha; Perez, Lance C.; Palmer, Robert D.; Brooks, David W.; Wilson, Kathleen; Fowler, David
2006-01-01
A systematic study of the implementation of simulation hardware (TIMS) replacing software (MATLAB) was undertaken for advanced undergraduate and early graduate courses in electrical engineering. One outcome of the qualitative component of the study was remarkable: most students interviewed (4/4 and 6/9) perceived the software simulations as…
Programming Language CAMIL II: Implementation and Evaluation.
ERIC Educational Resources Information Center
Gardner, Edward
A reimplementation of the Computer Assisted/Managed Instruction Language (CAMIL), providing qualitative and quantitative improvements in the software, is presented. The reformatted language is described narratively, and major components of the system software are identified and discussed. Authoring aids and embedded support facilities are also described, and…
Effective Factors in Providing Holistic Care: A Qualitative Study
Zamanzadeh, Vahid; Jasemi, Madineh; Valizadeh, Leila; Keogh, Brian; Taleghani, Fariba
2015-01-01
Background: Holistic care is a comprehensive model of caring. Previous studies have shown that most nurses do not apply this method. Examining the effective factors in nurses' provision of holistic care can help with enhancing it. Studying these factors from the point of view of nurses will generate real and meaningful concepts and can help to extend this method of caring. Materials and Methods: A qualitative study was used to identify effective factors in holistic care provision. Data gathered by interviewing 14 nurses from university hospitals in Iran were analyzed using conventional qualitative content analysis and MAXQDA (professional software for qualitative and mixed-methods data analysis). Results: Analysis of data revealed three main themes as effective factors in providing holistic care: the structure of the educational system, the professional environment, and personality traits. Conclusion: Establishing appropriate educational and management systems, and promoting religiousness and encouragement, will induce nurses to provide holistic care and ultimately improve the quality of their caring. PMID:26009677
Nayak, Shalini G; Pai, Mamatha Shivananda; George, Linu Sara
2018-01-01
Background: Conceptual models developed through qualitative research are based on the unique experiences of suffering and individuals' adoptions of each participant. A wide array of problems are faced by head-and-neck cancer (HNC) patients due to disease pathology and treatment modalities, which are sufficient to influence the quality of life (QOL). Men possess greater self-acceptance and are better equipped with intrapersonal strength to cope with stress and adequacy compared to women. Methodology: A qualitative phenomenology study was conducted among seven women suffering from HNC, with the objective to understand their experiences of suffering and to describe the phenomenon. Data were collected by face-to-face, in-depth, open-ended interviews. Data were analyzed using Open Code software (OPC 4.0) following the steps of Colaizzi's process. Results: The phenomenon that emerged out of the lived experiences of HNC women was "Personified as paragon of suffering… optimistic being of achieving normalcy," with five major themes and 13 subthemes. Conclusion: The conceptual model developed with the phenomenological approach is very specific to women suffering from HNC, and will contribute to developing strategies to improve the QOL of women. PMID:29440812
ERIC Educational Resources Information Center
Burroughs-Lange, Sue G.; Lange, John
This paper evaluates the effects of using the NUDIST (Non-numerical, Unstructured Data Indexing, Searching and Theorising) computer program to organize coded, qualitative data. The use of the software is discussed within the context of the study for which it was used: an Australian study that aimed to develop a theoretical understanding of the…
Villoria, Eduardo M; Lenzi, Antônio R; Soares, Rodrigo V; Souki, Bernardo Q; Sigurdsson, Asgeir; Marques, Alexandre P; Fidel, Sandra R
2017-01-01
To describe the use of open-source software for the post-processing of CBCT imaging in the assessment of periapical lesion development after endodontic treatment. CBCT scans were retrieved from the endodontic records of two patients. Three-dimensional virtual models, voxel counting, volumetric measurement (mm³) and mean intensity of the periapical lesion were performed with ITK-SNAP v. 3.0 software. Three-dimensional models of the lesions were aligned and overlapped with the MeshLab software, which performed an automatic registration of the anatomical structures based on best fit. Qualitative and quantitative analyses of the changes in lesion size after treatment were performed with the 3DMeshMetric software. ITK-SNAP v. 3.0 showed smaller values for the voxel count and volume of the lesion segmented in yellow, indicating a reduction in lesion volume after treatment. A higher mean intensity of the image segmented in yellow was also observed, which suggested new bone formation. Colour mapping and the "point value" tool allowed visualization of the reduction of periapical lesions in several regions. Open-source software thus gives researchers and clinicians an opportunity to monitor endodontic periapical lesions.
Rehabilitation R&D Progress Reports, 1992-1993. Volume 30-31
1993-01-01
Transcripts of the videotape are being analyzed on a hypertext database and also by qualitative data analysis software (NUDIST) to determine elements... A number of videotapes have been transcribed and are being analyzed by the hypertext and NUDIST software. The first cycle is in progress, reflecting
Combining Agile and Traditional: Customer Communication in Distributed Environment
NASA Astrophysics Data System (ADS)
Korkala, Mikko; Pikkarainen, Minna; Conboy, Kieran
Distributed development is a rapidly increasing phenomenon in modern software development environments. At the same time, traditional and agile methodologies, and combinations of the two, are being used in industry. Agile approaches place a large emphasis on customer communication. However, existing knowledge on customer communication in distributed agile development seems to be lacking. In order to shed light on this topic and provide practical guidelines for companies in distributed agile environments, a qualitative case study was conducted in a large, globally distributed software company. The key finding was that it might be difficult for an agile organization to get relevant information from a traditional type of customer organization, even though the customer communication was indicated to be active and conducted via multiple different communication media. Several challenges discussed in this paper referred to an "information blackout", indicating the importance of an environment fostering meaningful communication. To evaluate whether such an environment can be created, a set of guidelines is proposed.
Social marketing and MARTIN: tools for organizing, analyzing, and interpreting qualitative data.
Higgins, J W
1998-11-01
The purpose of this article is to discuss how the computer software program MARTIN and social marketing concepts (understanding the consumer perspective, exchange, marketing mix, and segmentation) were used as organizational, analytical, and interpretive tools for qualitative data. The qualitative data are from a case study on citizen participation in a health reform policy in British Columbia. The concept of broad-based public participation is a fundamental element of health promotion and citizenship. There is a gap, however, between the promise and reality of citizen participation in health promotion. Emerging from the analysis was an understanding of the societal circumstances that inhibited or fostered participation. This article describes how the code-based, theory-building attributes of the MARTIN software facilitated a new conceptualization of participatory citizenship and generated new insights into understanding why some people participate and others do not.
Automated Simulation For Analysis And Design
NASA Technical Reports Server (NTRS)
Cantwell, E.; Shenk, Tim; Robinson, Peter; Upadhye, R.
1992-01-01
Design Assistant Workstation (DAWN) software being developed to facilitate simulation of qualitative and quantitative aspects of behavior of life-support system in spacecraft, chemical-processing plant, heating and cooling system of large building, or any of variety of systems including interacting process streams and processes. Used to analyze alternative design scenarios or specific designs of such systems. Expert system will automate part of design analysis: reason independently by simulating design scenarios and return to designer with overall evaluations and recommendations.
Modular Analytical Multicomponent Analysis in Gas Sensor Arrays
Chaiyboun, Ali; Traute, Rüdiger; Kiesewetter, Olaf; Ahlers, Simon; Müller, Gerhard; Doll, Theodor
2006-01-01
A multi-sensor system is a chemical sensor system that quantitatively and qualitatively records gases with a combination of cross-sensitive gas sensor arrays and pattern recognition software. This paper addresses the issue of data analysis for the identification of gases in a gas sensor array. We introduce a software tool for gas sensor array configuration and simulation: a modular software package for the acquisition of data from different sensors. A signal evaluation algorithm referred to as the matrix method was used in the software tool; it computes the gas concentrations from the signals of a sensor array. The tool was used to simulate an array of five sensors to determine the gas concentrations of CH4, NH3, H2, CO and C2H5OH. The results of the simulated sensor array indicate that the software tool is capable of the following: (a) identifying a gas independently of its concentration; (b) estimating the concentration of a gas, even if the system was not previously exposed to that concentration; (c) indicating when a gas concentration exceeds a certain value. A gas sensor database was built for configuring the software; with it, one can create, generate and manage scenarios and source files for the simulation. Building on the database and the simulation software, an on-line Web-based version was developed with which users can configure and simulate sensor arrays on-line.
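The paper does not spell out the matrix method here, so the following is a hedged sketch of one common formulation (the sensitivity matrix and concentrations below are entirely hypothetical): model the array linearly as signals s = M·c, where M holds each sensor's calibrated response per unit concentration of each gas, recover c by least squares, and apply a threshold to flag gases exceeding a limit:

```python
import numpy as np

# Hypothetical calibration: rows = 5 sensors, columns = 5 gases
# (CH4, NH3, H2, CO, C2H5OH); entry (i, j) is sensor i's response
# per unit concentration of gas j.
M = np.array([
    [0.9, 0.1, 0.2, 0.1, 0.0],
    [0.1, 0.8, 0.1, 0.0, 0.2],
    [0.2, 0.1, 0.7, 0.1, 0.1],
    [0.0, 0.1, 0.2, 0.9, 0.1],
    [0.1, 0.2, 0.0, 0.1, 0.8],
])

true_c = np.array([1.0, 0.0, 2.0, 0.5, 0.0])  # hypothetical concentrations
signals = M @ true_c                          # simulated array response

# Matrix method: invert the calibrated sensor model by least squares.
est_c, *_ = np.linalg.lstsq(M, signals, rcond=None)

# Threshold alarm: flag any gas whose estimated concentration is too high.
alarm = est_c > 1.5
```

With noise-free signals the least-squares solution recovers the concentrations exactly; with real sensor noise it gives the best linear estimate, which is why a cross-sensitive (non-diagonal) M still works.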
The impact of the maternal experience with a jaundiced newborn on the breastfeeding relationship.
Willis, Sharla K; Hannon, Patricia R; Scrimshaw, Susan C
2002-05-01
To examine the process by which mothers' experiences with neonatal jaundice affect breastfeeding. We used ethnographic interviews with grounded theory methodology. Audiotaped data were transcribed and analyzed for themes using ATLAS/ti qualitative data analysis software (Scientific Software Development, Berlin, Germany). We studied a total of 47 Spanish- and English-speaking breastfeeding mothers of otherwise healthy infants diagnosed with neonatal jaundice. Our outcomes were the qualitative descriptions of maternal experiences with neonatal jaundice. Interactions with medical professionals emerged as the most important factor mediating the impact of neonatal jaundice on breastfeeding. Breastfeeding orders and the level of encouragement from medical professionals toward breastfeeding had the strongest effect on feeding decisions. Maternal reaction to and understanding of information from their physicians also played an important role. Guilt was common, as many mothers felt they had caused the jaundice by breastfeeding. By providing accurate information and encouragement to breastfeed, medical professionals have great impact on whether a mother continues breastfeeding after her experience with neonatal jaundice. Health care providers must be aware of how mothers receive and interpret information related to jaundice to minimize maternal reactions, such as guilt, that have a negative impact on breastfeeding.
Label-free tissue scanner for colorectal cancer screening
NASA Astrophysics Data System (ADS)
Kandel, Mikhail E.; Sridharan, Shamira; Liang, Jon; Luo, Zelun; Han, Kevin; Macias, Virgilia; Shah, Anish; Patel, Roshan; Tangella, Krishnarao; Kajdacsy-Balla, Andre; Guzman, Grace; Popescu, Gabriel
2017-06-01
The current practice of surgical pathology relies on external contrast agents to reveal tissue architecture, which is then qualitatively examined by a trained pathologist. The diagnosis is based on the comparison with standardized empirical, qualitative assessments of limited objectivity. We propose an approach to pathology based on interferometric imaging of "unstained" biopsies, which provides unique capabilities for quantitative diagnosis and automation. We developed a label-free tissue scanner based on "quantitative phase imaging," which maps out optical path length at each point in the field of view and, thus, yields images that are sensitive to the "nanoscale" tissue architecture. Unlike analysis of stained tissue, which is qualitative in nature and affected by color balance, staining strength and imaging conditions, optical path length measurements are intrinsically quantitative, i.e., images can be compared across different instruments and clinical sites. These critical features allow us to automate the diagnosis process. We paired our interferometric optical system with highly parallelized, dedicated software algorithms for data acquisition, allowing us to image at a throughput comparable to that of commercial tissue scanners while maintaining the nanoscale sensitivity to morphology. Based on the measured phase information, we implemented software tools for autofocusing during imaging, as well as image archiving and data access. To illustrate the potential of our technology for large volume pathology screening, we established an "intrinsic marker" for colorectal disease that detects tissue with dysplasia or colorectal cancer and flags specific areas for further examination, potentially improving the efficiency of existing pathology workflows.
Tamminen, Nina; Solin, Pia; Stengård, Eija; Kannas, Lasse; Kettunen, Tarja
2017-07-01
In this study, we aimed to investigate what competencies are needed for mental health promotion in health sector practice in Finland. A qualitative study was carried out to seek the views of mental health professionals regarding mental health promotion-related competencies. The data were collected via two focus groups and a questionnaire survey of professionals working in the health sector in Finland. The focus groups consisted of a total of 13 professionals, and 20 questionnaires were received from the survey. The data were analysed using the qualitative data analysis software ATLAS.ti (Scientific Software Development GmbH, Berlin), and a content analysis was carried out. In total, 23 competencies were identified and clustered under the categories of theoretical knowledge, practical skills, and personal attitudes and values. In order to promote mental health, it is necessary to have knowledge of the principles and concepts of mental health promotion, including methods and tools for effective practices. Furthermore, a variety of skills-based competencies, such as communication and collaboration skills, were described. Personal attitudes and values included a holistic approach and respect for human rights, among others. The study provides new information on what competencies are needed to plan, implement and evaluate mental health promotion in health sector practice, with the aim of contributing to a more effective workforce. The competencies aid in planning training programmes and qualifications, as well as job descriptions and roles related to mental health promotion in health sector workplaces.
Wireless Network Simulation in Aircraft Cabins
NASA Technical Reports Server (NTRS)
Beggs, John H.; Youssef, Mennatoallah; Vahala, Linda
2004-01-01
An electromagnetic propagation prediction tool was used to predict electromagnetic field strength inside airplane cabins. A commercial software package, Wireless Insite, was used to predict power levels inside aircraft cabins and the data was compared with previously collected experimental data. It was concluded that the software could qualitatively predict electromagnetic propagation inside the aircraft cabin environment.
Transitions in Classroom Technology: Instructor Implementation of Classroom Management Software
ERIC Educational Resources Information Center
Ackerman, David; Chung, Christina; Sun, Jerry Chih-Yuan
2014-01-01
The authors look at how business instructor needs are fulfilled by classroom management software (CMS), such as Moodle, and why instructors are sometimes slow to implement it. Instructors at different universities provided both qualitative and quantitative responses regarding their use of CMS. The results indicate that the top needs fulfilled by…
An Analysis of Related Software Cycles Among Organizations, People and the Software Industry
2008-06-01
confirmatory, theory-testing type of research. Miles and Huberman (1994) claim: Much qualitative research lies between these two extremes. Something is... either graphically or in narrative form, the main things to be studied" (Miles and Huberman 1994). The "main things" mentioned here refer to the key
The Data Collector: A Qualitative Research Tool.
ERIC Educational Resources Information Center
Handler, Marianne G.; Turner, Sandra V.
Computer software that is intended to assist the qualitative researcher in the analysis of textual data is relatively new. One such program, the Data Collector, is a HyperCard computer program designed for use on the Macintosh computer. A tool for organizing and analyzing textual data obtained from observations, interviews, surveys, and other…
Exploring Global Competence with Managers in India, Japan, and the Netherlands: A Qualitative Study
ERIC Educational Resources Information Center
Ras, Gerard J. M.
2011-01-01
This qualitative study explores the meaning of global competence for global managers in three different countries. Thirty interviews were conducted with global managers in India, Japan and the Netherlands through Skype, an internet-based communication tool. Findings are reported by country in five major categories: country background, personal…
Trivedi, Prinal; Edwards, Jode W; Wang, Jelai; Gadbury, Gary L; Srinivasasainagendra, Vinodh; Zakharkin, Stanislav O; Kim, Kyoungmi; Mehta, Tapan; Brand, Jacob P L; Patki, Amit; Page, Grier P; Allison, David B
2005-04-06
Many efforts in microarray data analysis are focused on providing tools and methods for the qualitative analysis of microarray data. HDBStat! (High-Dimensional Biology-Statistics) is a software package designed for analysis of high dimensional biology data such as microarray data. It was initially developed for the analysis of microarray gene expression data, but it can also be used for some applications in proteomics and other aspects of genomics. HDBStat! provides statisticians and biologists a flexible and easy-to-use interface to analyze complex microarray data using a variety of methods for data preprocessing, quality control analysis and hypothesis testing. Results generated from data preprocessing methods, quality control analysis and hypothesis testing methods are output in the form of Excel CSV tables, graphs and an Html report summarizing data analysis. HDBStat! is a platform-independent software that is freely available to academic institutions and non-profit organizations. It can be downloaded from our website http://www.soph.uab.edu/ssg_content.asp?id=1164.
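HDBStat!'s own algorithms are not detailed in this abstract, so as a generic illustration of the per-gene hypothesis testing such packages offer alongside preprocessing and quality control (the expression matrix, group sizes and effect size below are all synthetic and hypothetical), a Welch t-statistic can be computed for every gene at once:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical expression matrix: 50 genes x 20 arrays,
# first 10 arrays in group A, last 10 in group B.
n_genes, n_per_group = 50, 10
data = rng.normal(0.0, 1.0, size=(n_genes, 2 * n_per_group))
data[0, n_per_group:] += 5.0   # gene 0 is truly differentially expressed

a, b = data[:, :n_per_group], data[:, n_per_group:]

# Welch t-statistic per gene: mean difference over its standard error,
# using each group's own sample variance (unequal variances allowed).
se = np.sqrt(a.var(ddof=1, axis=1) / n_per_group
             + b.var(ddof=1, axis=1) / n_per_group)
t = (a.mean(axis=1) - b.mean(axis=1)) / se

# Rank genes by evidence of differential expression.
top_gene = int(np.argmax(np.abs(t)))
```

In practice a package like this would convert the statistics to p-values and apply a multiple-testing correction before reporting results; the sketch stops at the test statistic.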
Wells, Stewart; Bullen, Chris
2008-01-01
This article describes the near failure of an information technology (IT) system designed to support a government-funded, primary care-based hepatitis B screening program in New Zealand. Qualitative methods were used to collect data and construct an explanatory model. Multiple incorrect assumptions were made about participants, primary care workflows and IT capacity, software vendor user knowledge, and the health IT infrastructure. Political factors delayed system development and it was implemented untested, almost failing. An intensive rescue strategy included system modifications, relaxation of data validity rules, close engagement with software vendors, and provision of intensive on-site user support. This case study demonstrates that consideration of the social, political, technological, and health care contexts is important for successful implementation of public health informatics projects.
Barriers to Early Detection of Breast Cancer Among African American Females Over Age of 55
2005-02-01
used for data analysis. NUDIST, software for qualitative data analysis, will be used for systematic coding. All transcripts, as well as interviewer notes... will be coded in NUDIST. Dr. Smith and Mr. Worts will jointly develop the NUDIST coding system. Each of them will separately code each transcript and... already provided training in NUDIST to Dr. Smith and Mr. Worts. All interviews will be conducted by the Principal Investigator for this study who is
Management of natural resources through automatic cartographic inventory
NASA Technical Reports Server (NTRS)
Rey, P.; Gourinard, Y.; Cambou, F. (Principal Investigator)
1973-01-01
The author has identified the following significant results of the ARNICA program from August 1972 to January 1973: (1) establishment of image-to-object correspondence codes for all types of soil use and forestry in northern Spain; (2) establishment of a transfer procedure between qualitative (remote identification and remote interpretation) and quantitative (numerization, storage, automatic statistical cartography) use of images; (3) organization of microdensitometric data processing and automatic cartography software; and (4) development of a system for measuring reflectance simultaneously with imagery.
Experimenting with brass musical instruments
NASA Astrophysics Data System (ADS)
Lo Presto, Michael C.
2003-07-01
With the aid of microcomputer hardware and software for the introductory physics laboratory, I have developed several experiments dealing with the properties of brass musical instruments that could be used when covering sound anywhere from an introductory physics laboratory to a course in musical acoustics, or even independent studies. The results of these experiments demonstrate in a quantitative fashion the effects of the mouthpiece and bell on the frequencies of the sound waves and thus the musical pitches produced. Most introductory sources only discuss these effects qualitatively.
Modeling and Diagnostic Software for Liquefying-Fuel Rockets
NASA Technical Reports Server (NTRS)
Poll, Scott; Iverson, David; Ou, Jeremy; Sanderfer, Dwight; Patterson-Hine, Ann
2005-01-01
A report presents a study of five modeling and diagnostic computer programs considered for use in an integrated vehicle health management (IVHM) system during testing of liquefying-fuel hybrid rocket engines in the Hybrid Combustion Facility (HCF) at NASA Ames Research Center. Three of the programs -- TEAMS, L2, and RODON -- are model-based reasoning (or diagnostic) programs. The other two programs -- ICS and IMS -- do not attempt to isolate the causes of failures but can be used for detecting faults. In the study, qualitative models (in TEAMS and L2) and quantitative models (in RODON) having varying scope and completeness were created. Each of the models captured the structure and behavior of the HCF as a physical system. It was noted that in the cases of the qualitative models, the temporal aspects of the behavior of the HCF and the abstraction of sensor data are handled outside of the models, and it is necessary to develop additional code for this purpose. A need for additional code was also noted in the case of the quantitative model, though the amount of development effort needed was found to be less than that for the qualitative models.
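The tools in the report are proprietary research codes, but the fault-detection idea (as opposed to fault isolation) that monitors like ICS and IMS perform can be sketched generically: compare measurements against a model prediction and flag time steps whose residual exceeds a tolerance. The tank-pressure model, injected fault and tolerance below are all invented for illustration:

```python
def detect_fault(predicted, measured, tolerance):
    """Residual test: return the time steps where the measurement
    deviates from the model prediction by more than the tolerance."""
    return [t for t, (p, m) in enumerate(zip(predicted, measured))
            if abs(p - m) > tolerance]

# Hypothetical quantitative model: linear pressure blowdown during a burn.
predicted = [100.0 - 2.0 * t for t in range(10)]

# Simulated telemetry with a fault injected at steps 6 and 7.
measured = list(predicted)
measured[6] += 10.0
measured[7] += 10.0

faults = detect_fault(predicted, measured, tolerance=5.0)
```

A qualitative model would instead compare sign-of-change abstractions (rising, steady, falling) rather than numeric residuals, which is the scope difference the study draws between the two model families.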
Doing Qualitative Research Using Your Computer: A Practical Guide
ERIC Educational Resources Information Center
Hahn, Chris
2008-01-01
This book is a practical, hands-on guide to using commonly available everyday technology, including Microsoft software, to manage and streamline research projects. It uses straightforward, everyday language to walk readers through this process, drawing on a wide range of examples to demonstrate how easy it is to use such software. This guide is…
Originality Detection Software in a Graduate Policy Course: A Mixed-Methods Evaluation of Plagiarism
ERIC Educational Resources Information Center
Dreuth Zeman, Laura; Steen, Julie A.; Metz Zeman, Natalie
2011-01-01
The authors used a mixed-methods approach to evaluate the use of Turnitin originality detection software in a graduate social work course. Qualitative analysis of student responses revealed positive and negative perceptions of the software, the time spent completing assignments, and the tone of the class. Quantitative analysis of students' originality scores indicated a short-term…
ICT Teachers' Acceptance of "Scratch" as Algorithm Visualization Software
ERIC Educational Resources Information Center
Saltan, Fatih; Kara, Mehmet
2016-01-01
This study aims to investigate the acceptance of ICT teachers pertaining to the use of Scratch as an Algorithm Visualization (AV) software in terms of perceived ease of use and perceived usefulness. An embedded mixed method research design was used in the study, in which qualitative data were embedded in quantitative ones and used to explain the…
ERIC Educational Resources Information Center
Matheson, Jennifer L.
2007-01-01
Transcribing interview data is a time-consuming task that most qualitative researchers dislike. Transcribing is even more difficult for people with physical limitations because traditional transcribing requires manual dexterity and the ability to sit at a computer for long stretches of time. Researchers have begun to explore using an automated…
Park, Sophie Elizabeth; Thomas, James
2018-06-07
It can be challenging to decide which evidence synthesis software to choose when doing a systematic review. This article discusses some of the important questions to consider in relation to the chosen method and synthesis approach. Software can support researchers in a range of ways; this article considers a range of review conditions and software solutions, for example facilitating contemporaneous collaboration across time and geographical space, in-built bias assessment tools, and line-by-line coding for qualitative textual analysis. EPPI-Reviewer is review software for research synthesis managed by the EPPI-Centre, UCL Institute of Education. EPPI-Reviewer includes text mining automation technologies, and Version 5 supports data sharing and re-use across the systematic review community. Open-source software will soon be released, and the EPPI-Centre will continue to offer the software as a cloud-based service. The software is offered via a subscription, with a one-month (extendible) trial available and volume discounts for 'site licences'; it is free to use for Cochrane and Campbell reviews. The next EPPI-Reviewer version is being built in collaboration with the National Institute for Health and Care Excellence, using 'surveillance' of newly published research to support 'living' iterative reviews. This is achieved using a combination of machine learning and traditional information retrieval technologies to identify the type of research each new publication describes and determine its relevance for a particular review, domain or guideline. While the amount of available knowledge and research is constantly increasing, the ways in which software can support the focus and relevance of data identification are also developing fast. Software advances are maximising the opportunities for the production of relevant and timely reviews.
Hu, Zhi-yu; Zhang, Lei; Ma, Wei-guang; Yan, Xiao-juan; Li, Zhi-xin; Zhang, Yong-zhi; Wang, Le; Dong, Lei; Yin, Wang-bao; Jia, Suo-tang
2012-03-01
Self-designed software for identifying LIBS spectral lines is introduced. Integrated with LabVIEW, the software can smooth spectral lines and pick peaks, employing second-difference and threshold methods. Characteristic spectra of several elements are matched against the NIST database, realizing automatic spectral line identification and qualitative analysis of the basic composition of the sample. This software can analyze spectra conveniently and rapidly, and will be a useful tool for LIBS.
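The paper names the second-difference and threshold methods without giving details, so the following is a minimal sketch of one way that combination is commonly implemented (the synthetic two-line spectrum and threshold value are illustrative, not from the paper): a channel is a peak candidate when it is a local maximum, its intensity exceeds the threshold, and its second difference is negative, i.e. the curvature looks like a peak rather than noise:

```python
import numpy as np

def pick_peaks(y, threshold):
    """Indices of peak candidates: local maxima above the intensity
    threshold whose second difference is negative (peak-like curvature)."""
    d2 = np.diff(y, n=2)          # d2[k] = y[k+2] - 2*y[k+1] + y[k]
    return [i for i in range(1, len(y) - 1)
            if y[i] > threshold and d2[i - 1] < 0
            and y[i] >= y[i - 1] and y[i] >= y[i + 1]]

# Synthetic spectrum with two Gaussian lines at channels 30 and 70.
x = np.arange(100)
y = 10 * np.exp(-(x - 30) ** 2 / 18.0) + 5 * np.exp(-(x - 70) ** 2 / 18.0)
peaks = pick_peaks(y, threshold=2.0)
```

On real LIBS data the spectrum would be smoothed first (as the paper does) so that noise spikes do not satisfy the curvature test, and the detected channel positions would then be matched against NIST line wavelengths.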
Semi-automatic computerized approach to radiological quantification in rheumatoid arthritis
NASA Astrophysics Data System (ADS)
Steiner, Wolfgang; Schoeffmann, Sylvia; Prommegger, Andrea; Boegl, Karl; Klinger, Thomas; Peloschek, Philipp; Kainberger, Franz
2004-04-01
Rheumatoid Arthritis (RA) is a common systemic disease predominantly involving the joints. Precise diagnosis and follow-up therapy require objective quantification. For this purpose, radiological analyses using standardized scoring systems are considered to be the most appropriate method. The aim of our study is to develop semi-automatic image analysis software especially applicable to the scoring of joints in rheumatic disorders. The X-Ray RheumaCoach software delivers various scoring systems (Larsen score and Ratingen-Rau score) which can be applied by the scorer. In addition to the qualitative assessment of joints performed by the radiologist, a semi-automatic image analysis for joint detection and measurement of bone diameters and swollen tissue supports the image assessment process. More than 3000 radiographs of hands and feet from more than 200 RA patients were collected, analyzed, and statistically evaluated. Radiographs were quantified using the conventional paper-based Larsen score and the X-Ray RheumaCoach software. The use of the software shortened the scoring time by about 25 percent and reduced the rate of erroneous scorings in all our studies. Compared to paper-based scoring methods, the X-Ray RheumaCoach software offers several advantages: (i) structured data analysis and input that minimizes variance by standardization, (ii) faster and more precise calculation of sum scores and indices, (iii) permanent data storage and fast access to the software's database, (iv) the possibility of cross-calculation to other scores, (v) semi-automatic assessment of images, and (vi) reliable documentation of results in the form of graphical printouts.
Quantitative reactive modeling and verification.
Henzinger, Thomas A
Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.
[Reporting echocardiography exams with the G8-Cardio ANMCO software].
Badano, L P; Marchesini, A; Pizzuti, A; Mantero, A; Cianflone, D; Neri, E; Caira, P; Tubaro, M
2001-03-01
The availability of a common computerized program for echocardiographic study archiving and reporting at a national and/or international level could make it possible to standardize the echo reports of different echocardiographic laboratories, to use the wealth of data thus obtainable with echocardiography, and to exploit its widespread territorial distribution, with the aim of collecting echocardiographic data in a standard format for epidemiological, scientific and administrative purposes. To develop such software, an ad hoc joint National Association of Hospital Cardiologists and Italian Society of Echocardiography task force worked in conjunction with the Italian branch of Agilent Technologies to standardize the phraseology of accepted echocardiographic terms and of the quantitative parameters derived from transthoracic and transesophageal echocardiographic examination at rest as well as during exercise and pharmacological stress, and to develop ad hoc software. This echocardiographic study archiving and reporting program is part of the whole G8-Cardio ANMCO software developed to computerize the whole cardiological chart. The software has been developed by Agilent Technologies to provide a fast, easily accessible, and easy-to-use report generator for the non-computer specialist, using the Oracle 7.3 DBMS and PowerBuilder 5.0 to develop a user-friendly interface. The number of qualitative and quantitative variables contained in the program is 733 for echocardiography at rest, while it depends on the stressor and on the length of the examination for stress echo (dipyridamole 214-384, dobutamine 236-406, exercise 198-392). The program was tested and refined in our laboratory between November 1999 and May 2000. During this time period, 291 resting and 56 stress echocardiographic studies were reported and recorded in a database.
On average, each resting echocardiographic study lasting 10 +/- 4 (range 5-17) min was recorded using 50 +/- 11 (range 33-67) variables and 41,566 bytes of hard-disk memory space. Stress echocardiographic studies, each lasting 7 +/- 5 (range 5-21) min, were recorded using 143 +/- 74 (range 38-194) variables and 38,531 bytes of hard-disk memory space. To our knowledge this software represents the first experience of a common computerized program for echo archiving and reporting carried out at national level.
Early Algebra with Graphics Software as a Type II Application of Technology
ERIC Educational Resources Information Center
Abramovich, Sergei
2006-01-01
This paper describes the use of Kid Pix--graphics software for creative activities of young children--in the context of early algebra as determined by the mathematics core curriculum of New York State. It shows how grade-two appropriate pedagogy makes it possible to bring about a qualitative change in the learning process of those commonly…
ERIC Educational Resources Information Center
Holstein, Simona; Cohen, Anat
2016-01-01
The characteristics of successful MOOCs were explored in this study. Thousands of student reviews regarding five xMOOCs (Massive Open Online Course) in the fields of software, science, and management were extracted from the Coursetalk website and analyzed by quantitative and qualitative methods using the Garrison, Anderson, and Archer (2000)…
ERIC Educational Resources Information Center
Bruton, Samuel; Childers, Dan
2016-01-01
Recently, the usage of plagiarism detection software such as Turnitin® has increased dramatically among university instructors. At the same time, academic criticism of this software's employment has also increased. We interviewed 23 faculty members from various departments at a medium-sized, public university in the southeastern US to determine…
ERIC Educational Resources Information Center
Akre, Christina; Michaud, Pierre-Andre; Berchtold, Andre; Suris, Joan-Carles
2010-01-01
The purpose of this article is to identify tobacco and cannabis co-consumptions and consumers' perceptions of each substance. A qualitative study including 22 youths (14 males) aged 15-21 years in seven individual interviews and five focus groups. Discussions were recorded, transcribed verbatim and transferred to Atlas.ti software for narrative…
Sociotechnical Challenges of Developing an Interoperable Personal Health Record
Gaskin, G.L.; Longhurst, C.A.; Slayton, R.; Das, A.K.
2011-01-01
Objectives To analyze sociotechnical issues involved in the process of developing an interoperable commercial Personal Health Record (PHR) in a hospital setting, and to create guidelines for future PHR implementations. Methods This qualitative study utilized observational research and semi-structured interviews with 8 members of the hospital team, as gathered over a 28-week period of developing and adapting a vendor-based PHR at Lucile Packard Children's Hospital at Stanford University. A grounded theory approach was utilized to code and analyze over 100 pages of typewritten field notes and interview transcripts. This grounded analysis allowed themes to surface during the data collection process which were subsequently explored in greater detail in the observations and interviews. Results Four major themes emerged: (1) Multidisciplinary teamwork helped team members identify crucial features of the PHR; (2) Divergent goals for the PHR existed even within the hospital team; (3) Differing organizational conceptions of the end-user between the hospital and software company differentially shaped expectations for the final product; (4) Difficulties with coordination and accountability between the hospital and software company caused major delays and expenses and strained the relationship between hospital and software vendor. Conclusions Though commercial interoperable PHRs have great potential to improve healthcare, the process of designing and developing such systems is an inherently sociotechnical process with many complex issues and barriers. This paper offers recommendations based on the lessons learned to guide future development of such PHRs. PMID:22003373
Multimodality Data Integration in Epilepsy
Muzik, Otto; Chugani, Diane C.; Zou, Guangyu; Hua, Jing; Lu, Yi; Lu, Shiyong; Asano, Eishi; Chugani, Harry T.
2007-01-01
An important goal of software development in the medical field is the design of methods which are able to integrate information obtained from various imaging and nonimaging modalities into a cohesive framework in order to understand the results of qualitatively different measurements in a larger context. Moreover, it is essential to assess the various features of the data quantitatively so that relationships in anatomical and functional domains between complementing modalities can be expressed mathematically. This paper presents a clinically feasible software environment for the quantitative assessment of the relationship among biochemical functions as assessed by PET imaging and electrophysiological parameters derived from intracranial EEG. Based on the developed software tools, quantitative results obtained from individual modalities can be merged into a data structure allowing a consistent framework for advanced data mining techniques and 3D visualization. Moreover, an effort was made to derive quantitative variables (such as the spatial proximity index, SPI) characterizing the relationship between complementing modalities on a more generic level as a prerequisite for efficient data mining strategies. We describe the implementation of this software environment in twelve children (mean age 5.2 ± 4.3 years) with medically intractable partial epilepsy who underwent both high-resolution structural MR and functional PET imaging. Our experiments demonstrate that our approach will lead to a better understanding of the mechanisms of epileptogenesis and might ultimately have an impact on treatment. Moreover, our software environment holds promise to be useful in many other neurological disorders, where integration of multimodality data is crucial for a better understanding of the underlying disease mechanisms. PMID:17710251
Shoemaker, Michael J; Platko, Christina M; Cleghorn, Susan M; Booth, Andrew
2014-07-01
The purpose of this retrospective qualitative case report is to describe how a case-based, virtual patient interprofessional education (IPE) simulation activity was utilized to achieve physician assistant (PA), physical therapy (PT) and occupational therapy (OT) student IPE learning outcomes. Following completion of a virtual patient case, 30 PA, 46 PT and 24 OT students were required to develop a comprehensive, written treatment plan and respond to reflective questions. A qualitative analysis of the submitted written assignment was used to determine whether IPE learning objectives were met. Student responses revealed three themes that supported the learning objectives of the IPE experience: benefits of collaborative care, role clarification and relevance of the IPE experience for future practice. A case-based, IPE simulation activity for physician assistant and rehabilitation students using a computerized virtual patient software program effectively facilitated achievement of the IPE learning objectives, including development of greater student awareness of other professions and ways in which collaborative patient care can be provided.
[Problem based learning from the perspective of tutors].
Navarro Hernández, Nancy; Illesca P, Mónica; Cabezas G, Mirtha
2009-02-01
Problem-based learning is a student-centered learning technique that develops deductive, constructive and reasoning capacities among students. Teachers must adapt to this paradigm of constructing rather than transmitting knowledge. To interpret the importance of tutors in problem-based learning during a module on health research and management given to medical, nursing, physical therapy, midwifery, technology and nutrition students. Eight teachers who participated in a module using problem-based learning agreed to take part in an in-depth interview. The qualitative analysis of the recorded textual information was performed using the ATLAS.ti software. We identified 662 meaning units, grouped into 29 descriptive categories, with eight emerging meta-categories. The sequential and cross-generated qualitative analysis produced four domains: competence among students, competence of teachers, student-centered learning and the evaluation process. Multiprofessional problem-based learning contributes to the development of generic competences among future health professionals, such as multidisciplinary work, critical capacity and social skills. Teachers must support students in the context of their problems and social situation.
Francis, Jacinta; Giles-Corti, Billie; Wood, Lisa; Knuiman, Matthew
2014-12-01
Neighbourhood characteristics have been linked to a range of health outcomes, including mental health. Despite the growth of master planned estates (MPEs) within Australia, few studies have investigated the physical and social correlates of mental health in residents of new housing developments. Methods This study aimed to identify the facilitators of, and barriers to, mentally healthy neighbourhoods using focus groups with residents of MPEs in metropolitan Perth, Western Australia. Focus group interviews were analysed using qualitative research software package QSR NVivo. Results and Conclusions Results suggest that mental health is strongly influenced by a sense of community and security, as well as an aesthetically pleasing environment. Residents of MPEs may experience a strong sense of community due to similarities in life-stage and the community building efforts of property developers. Expanding population size, social exclusion, and insufficient services may negatively affect the mental health of residents in MPEs. SO WHAT?: Identifying correlates of mentally healthy neighbourhoods may help urban planners design residential areas that promote healthy living.
NASA Technical Reports Server (NTRS)
Pineda, Evan J.; Waas, Anthony M.; Bednarcyk, Brett A.; Arnold, Steven M.; Collier, Craig S.
2009-01-01
This preliminary report demonstrates the capabilities of the recently developed software implementation that links the Generalized Method of Cells to explicit finite element analysis by extending a previous development which tied the generalized method of cells to implicit finite elements. The multiscale framework, which uses explicit finite elements at the global scale and the generalized method of cells at the microscale, is detailed. This implementation is suitable both for dynamic mechanics problems and for static problems exhibiting drastic and sudden changes in material properties, which often encounter convergence issues with commercial implicit solvers. Progressive failure analysis of stiffened and un-stiffened fiber-reinforced laminates subjected to normal blast pressure loads was performed and is used to demonstrate the capabilities of this framework. The focus of this report is to document the development of the software implementation; thus, no comparison between the results of the models and experimental data is drawn. However, the validity of the results is assessed qualitatively through the observation of failure paths, stress contours, and the distribution of system energies.
NASA Astrophysics Data System (ADS)
Ma, Xiaoli; Guo, Xiaoyu; Song, Yuelin; Qiao, Lirui; Wang, Wenguang; Zhao, Mingbo; Tu, Pengfei; Jiang, Yong
2016-12-01
Clarification of the chemical composition of traditional Chinese medicine formulas (TCMFs) is a challenge due to the variety of structures and the complexity of plant matrices. Herein, an integrated strategy was developed by hyphenating ultra-performance liquid chromatography (UPLC), quadrupole time-of-flight (Q-TOF), hybrid triple quadrupole-linear ion trap mass spectrometry (Qtrap-MS), and the novel post-acquisition data processing software UNIFI to achieve automatic, rapid, accurate, and comprehensive qualitative and quantitative analysis of the chemical components in TCMFs. As a proof-of-concept, the chemical profiling of Baoyuan decoction (BYD), which is an ancient TCMF that is clinically used for the treatment of coronary heart disease that consists of Ginseng Radix et Rhizoma, Astragali Radix, Glycyrrhizae Radix et Rhizoma Praeparata Cum Melle, and Cinnamomi Cortex, was performed. As many as 236 compounds were plausibly or unambiguously identified, and 175 compounds were quantified or relatively quantified by the scheduled multiple reaction monitoring (sMRM) method. The findings demonstrate that the strategy integrating the rapidity of UNIFI software, the efficiency of UPLC, the accuracy of Q-TOF-MS, and the sensitivity and quantitation ability of Qtrap-MS provides a method for the efficient and comprehensive chemome characterization and quality control of complex TCMFs.
Horodniceanu, Erica G; Bal, Vasudha; Dhatt, Harman; Carter, John A; Huang, Vicky; Lasch, Kathryn
2017-06-23
Compliance, palatability, gastrointestinal (GI) symptom, and treatment satisfaction patient- and observer-reported outcome (PRO, ObsRO) measures were developed/modified for patients with transfusion-dependent anemias or myelodysplastic syndrome (MDS) requiring iron chelation therapy (ICT). This qualitative cross-sectional observational study used grounded theory data collection and analysis methods and followed PRO/ObsRO development industry guidance. Patients and caregivers of patients with transfusion-dependent anemias or MDS were individually interviewed face-to-face to cognitively debrief the Compliance, Palatability, GI Symptom Diary, and Modified Satisfaction with Iron Chelation Therapy (SICT) instruments presented electronically. Interviews were conducted in sets. Interviews began open-endedly to spontaneously elicit ICT experiences. Item modifications were debriefed during the later interviews. Interviews were audio recorded, transcribed, and coded. Data were analyzed using ATLAS.ti qualitative research software. Twenty-one interviews were completed (Set 1: 5 patients, 6 caregivers; Set 2: 6 patients, 4 caregivers) in 6 US cities. Mean age was 43 years for patients and 9 years for children of caregivers. Conditions requiring ICT use across groups included transfusion-dependent anemias (85.7%) and MDS (14.3%). Concepts spontaneously reported were consistent with the instruments debriefed. Interview analysis resulted in PRO and ObsRO versions of each instrument: Compliance (2 items), Palatability (4 items), GI Symptom Diary (6 items), and Modified SICT (PRO = 13, ObsRO = 17 items). Qualitative research data from the cognitive interviews support the content validity and relevance of the instruments developed/modified. Quantitative validation of these PRO and ObsRO measures, testing validity, reliability, and responsiveness, is needed before future research use with new formulations of oral ICT.
NASA Technical Reports Server (NTRS)
Kibbee, G. W.
1978-01-01
The development and evaluation of a DC-9-10 runway directional control simulator, together with the evaluation results, are described. An existing wide-bodied flight simulator was modified to this aircraft configuration. The simulator was structured to use either of two antiskid simulations: (1) an analog mechanization that used aircraft hardware, or (2) a digital software simulation. After the simulation was developed, it was evaluated by 14 pilots who made 818 simulated flights. These evaluations involved landings, rejected takeoffs, and various ground maneuvers. Qualitatively, most pilots evaluated the simulator as realistic, with good potential, especially for pilot training for adverse runway conditions.
ERIC Educational Resources Information Center
Zengin, Yilmaz
2017-01-01
The purpose of this study was to determine the effect of the flipped classroom approach designed by using Khan Academy and free open source software on students' academic achievement and to examine students' views about this approach. The research was evaluated in the light of both qualitative and quantitative data. Twenty-eight students studying…
Tan, Xuhua; Lin, Haotian; Lin, Zhuoling; Chen, Jingjing; Tang, Xiangchen; Luo, Lixia; Chen, Weirong; Liu, Yizhi
2016-03-01
The objective of this study was to investigate capsular outcomes 12 months after pediatric cataract surgery without intraocular lens implantation via qualitative classification and quantitative measurement. This cross-sectional study was approved by the institutional review board of Zhongshan Ophthalmic Center of Sun Yat-sen University in Guangzhou, China. Digital coaxial retro-illumination photographs (DCRPs) of 329 aphakic pediatric eyes were obtained 12 months after pediatric cataract surgery without intraocular lens implantation. The capsular regions assessed were as follows: anterior capsule opening area (ACOA), posterior capsule opening area (PCOA), and posterior capsule opening opacity (PCOO). Capsular outcomes were qualitatively classified into 3 types based on the PCOO: Type I, capsule with mild opacification but no invasion into the capsule opening; Type II, capsule with moderate opacification accompanied by contraction of the ACOA and invasion of the occluding part of the PCOA; and Type III, capsule with severe opacification accompanied by total occlusion of the PCOA. Software was developed to quantitatively measure the ACOA, PCOA, and PCOO using standardized DCRPs. The relationships between the accurate intraoperative anterior and posterior capsulorhexis sizes and the qualitative capsular types were statistically analyzed. The DCRPs of 315 aphakic eyes (95.8%) of 191 children were included. Capsular outcomes were classified into 3 types: Type I, 120 eyes (38.1%); Type II, 157 eyes (49.8%); Type III, 38 eyes (12.1%). The scores of the capsular outcomes were negatively correlated with intraoperative anterior capsulorhexis size (R = -0.572, P < 0.001), but no significant correlation with intraoperative posterior capsulorhexis size (R = -0.16, P = 0.122) was observed.
The ACOA significantly decreased from Type I to Type II to Type III, the PCOA increased in size from Type I to Type II, and the PCOO increased from Type II to Type III (all P < 0.05). Capsular outcomes after pediatric cataract surgery can be qualitatively classified and quantitatively measured by acquisition, division, definition, and user-friendly software analyses of high-quality digital coaxial retro-illumination photographs.
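As a rough illustration of the kind of quantitative measurement such software performs, the area of a segmented capsule opening can be computed from a binary mask and a pixel calibration; both the mask format and the calibration factor here are assumptions for this sketch, not details from the study:

```python
import numpy as np

def opening_area_mm2(mask, mm_per_pixel):
    """Area of a binary opening mask in mm^2.
    `mask` is 1 inside the opening, 0 outside; `mm_per_pixel` is an
    assumed calibration factor, not a value from the paper."""
    # Each foreground pixel covers mm_per_pixel^2 of physical area
    return float(mask.sum()) * mm_per_pixel ** 2
```

With a 4x4-pixel opening and a calibration of 0.5 mm per pixel, the measured area is 16 x 0.25 = 4 mm².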
Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros
2018-02-01
Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per-coronary-segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
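For a sense of how such an association can be quantified, here is a minimal sketch of an odds ratio with a 95% confidence interval computed from a 2x2 table; the counts below are invented for illustration, and the study itself used logistic regression on continuous measurements rather than this simpler dichotomized form:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio and Woolf 95% CI for a 2x2 table:
    a = feature present & high-risk plaque, b = feature present & not,
    c = feature absent & high-risk plaque, d = feature absent & not.
    All counts are hypothetical examples, not study data."""
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) by the Woolf method
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)
```

A confidence interval that excludes 1.0, as in the abstract's reported intervals, indicates a statistically significant association.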
IPL Processing of the Viking Orbiter Images of Mars
NASA Technical Reports Server (NTRS)
Ruiz, R. M.; Elliott, D. A.; Yagi, G. M.; Pomphrey, R. B.; Power, M. A.; Farrell, W., Jr.; Lorre, J. J.; Benton, W. D.; Dewar, R. E.; Cullen, L. E.
1977-01-01
The Viking orbiter cameras returned over 9000 images of Mars during the 6-month nominal mission. Digital image processing was required to produce products suitable for quantitative and qualitative scientific interpretation. Processing included the production of surface elevation data using computer stereophotogrammetric techniques, crater classification based on geomorphological characteristics, and the generation of color products using multiple black-and-white images recorded through spectral filters. The Image Processing Laboratory of the Jet Propulsion Laboratory was responsible for the design, development, and application of the software required to produce these 'second-order' products.
Advanced ballistic range technology
NASA Technical Reports Server (NTRS)
Yates, Leslie A.
1993-01-01
Experimental interferograms, schlieren, and shadowgraphs are used for quantitative and qualitative flow-field studies. These images are created by passing light through a flow field, and the recorded intensity patterns are functions of the phase shift and angular deflection of the light. As part of grant NCC2-583, techniques and software have been developed for obtaining phase shifts from finite-fringe interferograms and for constructing optical images from Computational Fluid Dynamics (CFD) solutions. During the period from 1 November 1992 to 30 June 1993, research efforts were concentrated on improving these techniques.
NASA integrated vehicle health management technology experiment for X-37
NASA Astrophysics Data System (ADS)
Schwabacher, Mark; Samuels, Jeff; Brownston, Lee
2002-07-01
The NASA Integrated Vehicle Health Management (IVHM) Technology Experiment for X-37 was intended to run IVHM software on board the X-37 spacecraft. The X-37 is an unpiloted vehicle designed to orbit the Earth for up to 21 days before landing on a runway. The objectives of the experiment were to demonstrate the benefits of in-flight IVHM to the operation of a Reusable Launch Vehicle, to advance the Technology Readiness Level of this IVHM technology within a flight environment, and to demonstrate that the IVHM software could operate on the Vehicle Management Computer. The scope of the experiment was to perform real-time fault detection and isolation for X-37's electrical power system and electro-mechanical actuators. The experiment used Livingstone, a software system that performs diagnosis using a qualitative, model-based reasoning approach that searches system-wide interactions to detect and isolate failures. Two of the challenges we faced were to make this research software more efficient so that it would fit within the limited computational resources that were available to us on the X-37 spacecraft, and to modify it so that it satisfied the X-37's software safety requirements. Although the experiment is currently unfunded, the development effort resulted in major improvements in Livingstone's efficiency and safety. This paper reviews some of the details of the modeling and integration efforts, and some of the lessons that were learned.
A model-based approach for automated in vitro cell tracking and chemotaxis analyses.
Debeir, Olivier; Camby, Isabelle; Kiss, Robert; Van Ham, Philippe; Decaestecker, Christine
2004-07-01
Chemotaxis may be studied in two main ways: 1) counting cells passing through an insert (e.g., using Boyden chambers), and 2) directly observing cell cultures (e.g., using Dunn chambers), both in response to stationary concentration gradients. This article promotes the use of Dunn chambers and in vitro cell tracking, achieved by video microscopy coupled with automatic image analysis software, in order to extract quantitative and qualitative measurements characterizing the response of cells to a diffusible chemical agent. Previously, we set up a video microscopy system coupled with image analysis software that was able to compute cell trajectories from in vitro cell cultures. In the present study, we introduce new software that extends the application field of this system to chemotaxis studies. This software is based on an adapted version of the active contour methodology, enabling each cell to be efficiently tracked for hours and resulting in detailed descriptions of individual cell trajectories. The major advantages of this method come from improved robustness with respect to variability in cell morphologies between different cell lines and dynamic changes in cell shape during cell migration. Moreover, the software includes a very small number of parameters which do not require overly sensitive tuning. Finally, the running time of the software is very short, allowing improved acquisition frequency and, consequently, improved descriptions of complex cell trajectories, i.e., trajectories including cell division and cell crossing. We validated this software on several artificial and real cell culture experiments in Dunn chambers, including comparisons with manual (human-controlled) analyses. We developed new software and data analysis tools for automated cell tracking which enable cell chemotaxis to be efficiently analyzed. Copyright 2004 Wiley-Liss, Inc.
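The paper's tracker is based on active contours; purely as a simpler illustration of how frame-to-frame trajectories can be assembled from detected cell positions, a greedy nearest-neighbour linking step might look like the following (the distance gate is an assumed parameter, and real trackers must also handle division and crossing, as the abstract notes):

```python
import math

def link_frames(prev_positions, curr_positions, max_dist=20.0):
    """Greedy nearest-neighbour linking of cell centroids between two
    frames -- a much simpler stand-in for active-contour tracking,
    shown only to illustrate trajectory building.
    Returns {index in prev frame: index in curr frame}."""
    links = {}
    taken = set()
    for i, (x0, y0) in enumerate(prev_positions):
        best, best_d = None, max_dist
        for j, (x1, y1) in enumerate(curr_positions):
            if j in taken:
                continue  # each current-frame cell links at most once
            d = math.hypot(x1 - x0, y1 - y0)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            links[i] = best
            taken.add(best)
    return links
```

Chaining these links across frames yields per-cell trajectories from which speed and directional bias toward the chemical gradient can be measured.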
Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software.
Lopez, Natalia; Perez, Elisa; Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E
2018-01-01
The rehabilitation process is a fundamental stage in the recovery of people's capabilities. However, evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a platform that tracks the movement of an individual's upper limb using Kinect sensor(s) during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and reporting of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. The agreement of joint angles measured with the proposed software and the goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of the Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for clinical use. The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the Kinect-based software for rehabilitation and other applications, and the experts' opinion endorses the choice of our approach for clinical practice.
Comparison of the new measurement technique with established goniometric methods determines that the proposed software agrees sufficiently to be used interchangeably.
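The Bland-Altman agreement analysis used to compare the Kinect-based measurements with the goniometer reduces to a short computation of the bias and the 95% limits of agreement; this sketch assumes paired measurements in the same units (the sample values in the test are invented):

```python
import numpy as np

def bland_altman_limits(method_a, method_b):
    """Bias and 95% limits of agreement between two measurement methods
    (e.g., software-derived joint angles vs. a goniometer)."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diffs = a - b
    bias = diffs.mean()                 # mean difference between methods
    sd = diffs.std(ddof=1)              # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

If nearly all paired differences fall inside the returned limits, the two methods can be considered interchangeable for clinical purposes, which is the conclusion the abstract reports.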
Nang, Roberto N; Monahan, Felicia; Diehl, Glendon B; French, Daniel
2015-04-01
Many institutions collect reports in databases to make important lessons learned available to their members. The Uniformed Services University of the Health Sciences collaborated with the Peacekeeping and Stability Operations Institute to conduct a descriptive and qualitative analysis of global health engagements (GHEs) contained in the Stability Operations Lessons Learned and Information Management System (SOLLIMS). This study used a summative qualitative content analysis approach involving six steps: (1) a comprehensive search; (2) a two-stage reading and screening process to identify first-hand, health-related records; (3) qualitative and quantitative data analysis using the software program MAXQDA; (4) a word cloud to illustrate word frequencies and interrelationships; (5) coding of individual themes and validation of the coding scheme; and (6) identification of relationships in the data and overarching lessons learned. The individual codes applied to the largest number of text segments included: planning, personnel, interorganizational coordination, communication/information sharing, and resources/supplies. When compared to the Department of Defense's (DoD's) evolving GHE principles and capabilities, the SOLLIMS coding scheme appeared to align well with the list of GHE capabilities developed by the DoD Global Health Working Group. The results of this study will inform practitioners of global health and encourage additional qualitative analysis of other lessons-learned databases. Reprint & Copyright © 2015 Association of Military Surgeons of the U.S.
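The word-frequency counts that underlie a word cloud (step 4 above) can be sketched as follows. This is a simplified illustration; the record snippets and the stop-word list are hypothetical, and MAXQDA's own tooling was used in the study.

```python
from collections import Counter
import re

STOPWORDS = {"the", "and", "of", "to", "in", "a", "for", "with", "on"}

def word_frequencies(records, top_n=5):
    """Count content-word frequencies across lessons-learned records --
    the counts that a word cloud visualizes."""
    words = []
    for text in records:
        words += [w for w in re.findall(r"[a-z]+", text.lower())
                  if w not in STOPWORDS]
    return Counter(words).most_common(top_n)

# Hypothetical record snippets
records = [
    "Planning and coordination with host-nation personnel improved communication.",
    "Supplies and personnel planning remained the main constraint.",
]
print(word_frequencies(records, top_n=3))
```

A word cloud then simply scales each word's display size in proportion to its count.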
Design and implementation of an online systemic human anatomy course with laboratory.
Attardi, Stefanie M; Rogers, Kem A
2015-01-01
Systemic Human Anatomy is a full credit, upper year undergraduate course with a (prosection) laboratory component at Western University, Canada. To meet enrollment demands beyond the physical space of the laboratory facility, a fully online section was developed to run concurrently with the traditional face-to-face (F2F) course. Lectures given to F2F students are simultaneously broadcast to online students using collaborative software (Blackboard Collaborate). The same collaborative software is used by a teaching assistant to deliver laboratory demonstrations in which three-dimensional (3D) virtual anatomical models are manipulated. Ten commercial software programs were reviewed to determine their suitability for demonstrating the virtual models, resulting in the selection of Netter's 3D Interactive Anatomy. Supplementary online materials for the central nervous system were developed by creating 360° images of plastinated prosected brain specimens and a website through which they could be accessed. This is the first description of a fully online undergraduate anatomy course with a live, interactive laboratory component. Preliminary data comparing the online and F2F student grades suggest that previous student academic performance, and not course delivery format, predicts performance in anatomy. Future qualitative studies will reveal student perceptions about their learning experiences in both of the course delivery formats. © 2014 American Association of Anatomists.
Zapata-Villa, Carolina; Agudelo-Suárez, Andrés A; Cardona-Arango, Doris; Ronda-Pérez, Elena
2017-12-14
This study aims to understand the migratory experience and the employment, work, and health conditions of migrants returned from Spain to Colombia. A qualitative study was conducted by means of 23 semi-structured interviews with Colombian returned migrant workers. Qualitative narrative content analysis was performed using the Atlas.ti software. The main findings are represented by nine categories that emerged from the participants' discourses: (1) impact of the economic crisis on work and employment conditions in Spain; (2) economic crisis and return; (3) characteristics of returnees; (4) returnees' perceptions of Colombia; (5) the role of social support networks; (6) employment and working conditions in Colombia; (7) health and wellbeing; (8) future plans and expectations; and (9) the experience of being an immigrant. Participants' adjustment difficulties are evidenced by the return migration process and by the conditions of the social, political, and economic system in Colombia. Return migration represents a reconfiguration of the personal and working lives of this population. This situation requires the development of global public health policies and strategies to facilitate the adaptation of these people.
ERIC Educational Resources Information Center
O'Sullivan, Saskia Katarina Emily; Harrison, Timothy Guy
2016-01-01
This qualitative study indicates that Chinese-origin students completing their pre-university education in a British school have particular difficulties related to sociocultural change, pedagogical differences, affective aspects, cognitive demand, and language learning. These are discussed. The use of a pre-laboratory software resource to support…
The East-German Research Landscape in Transition. Part C. Research at East-German Universities
1993-03-10
Software system solutions for quality assurance and precision measurement tasks. Consulting on automated process control and computer-aided quality assurance. Consulting on interface problems and light-signalling technology. Consulting on the selection and use of metric… +49 (351) 463-2786, with seventeen institutes. DEPARTMENT FOR CIVIL ENGINEERING, WATER AND FOREST TECHNOLOGY (Fakultät für Bau…)
Successful Strategies for Activity and Wellness after Spinal Cord Injury
2016-10-01
Concurrent qualitative and quantitative methods (a mixed-methods approach) will be used to address Specific Aims 3 and 4. The types of assessment and… Beatrice Kiratli, PhD. RECIPIENT: Palo Alto Veterans Institute for Research, Palo Alto, CA 94304. REPORT DATE: October 2016. TYPE OF REPORT… 37/40) with 25% of transcript analysis accomplished. Staff training in qualitative research and use of mind-mapping software has been completed with
Smith, Selina A.; Whitehead, Mary S.; Sheats, Joyce Q.; Fontenot, Brittney; Alema-Mensah, Ernest; Ansa, Benjamin
2016-01-01
Background There is a proliferation of lifestyle-oriented mobile technologies; however, few have targeted users. Through intervention mapping, investigators and community partners completed Steps 1–3 (needs assessment, formulation of change objectives, and selection of theory-based methods) of a process to develop a mobile cancer prevention application (app) for cancer prevention. The aim of this qualitative study was to complete Step 4 (intervention development) by eliciting input from African American (AA) breast cancer survivors (BCSs) to guide app development. Methods Four focus group discussions (n=60) and three individual semi-structured interviews (n=36) were conducted with AA BCSs (40–72 years of age) to assess barriers and strategies for lifestyle change. All focus groups and interviews were recorded and transcribed verbatim. Data were analyzed with NVivo qualitative data analysis software version 10, allowing categories, themes, and patterns to emerge. Results Three categories and related themes emerged from the analysis: 1) perceptions about modifiable risk factors; 2) strategies related to adherence to cancer prevention guidelines; and 3) app components to address barriers to adherence. Participant perceptions, strategies, and recommended components guided development of the app. Conclusions For development of a mobile cancer prevention app, these findings will assist investigators in targeting features that are usable, acceptable, and accessible for AA BCSs. PMID:27583307
North, Carol S; Pollio, David E; Pfefferbaum, Betty; Megivern, Deborah; Vythilingam, Meena; Westerhaus, Elizabeth Terry; Martin, Gregory J; Hong, Barry A
2005-08-01
Systematic studies of the mental health effects of bioterrorism on exposed populations have not been carried out. Exploratory focus groups were conducted with an exposed population to provide qualitative data and inform empirical research. Five focus groups, with a total of 28 political worker volunteers, were conducted 3 months after the October 15, 2001, anthrax attack on Capitol Hill. More than 2000 transcribed focus group passages were categorized using qualitative software. The category with the most items was authorities' response (23% of passages), and much of this discussion pertained to communication by authorities. The category with the fewest items was symptoms (4%). The issues identified lay less within individuals and more between them and authorities. Risk communication by authorities regarding safety and medical issues was a prominent concern among Capitol Hill office staff workers regarding the anthrax incident. This suggests a focus on risk communication in developing interventions, but more systematic investigation is needed.
Qualitative and Quantitative Pedigree Analysis: Graph Theory, Computer Software, and Case Studies.
ERIC Educational Resources Information Center
Jungck, John R.; Soderberg, Patti
1995-01-01
Presents a series of elementary mathematical tools for re-representing pedigrees, pedigree generators, pedigree-driven database management systems, and case studies for exploring genetic relationships. (MKR)
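One elementary graph-theoretic re-representation of a pedigree, in the spirit of the tools described above, is a directed graph mapping each individual to its known parents, from which relationships such as ancestry (and, via shared ancestors, consanguinity) can be computed. This sketch uses hypothetical individuals and is not the authors' software.

```python
def ancestors(pedigree, individual):
    """Collect all ancestors of an individual in a pedigree represented
    as a directed graph mapping each individual to its known parents."""
    found = set()
    stack = list(pedigree.get(individual, ()))
    while stack:
        person = stack.pop()
        if person not in found:
            found.add(person)
            stack.extend(pedigree.get(person, ()))
    return found

# Hypothetical three-generation pedigree: individual -> (parents)
pedigree = {
    "III-1": ("II-1", "II-2"),
    "II-1": ("I-1", "I-2"),
    "II-2": ("I-3", "I-4"),
}
print(sorted(ancestors(pedigree, "III-1")))
# Common ancestors of two individuals reveal shared descent:
#   ancestors(pedigree, a) & ancestors(pedigree, b)
```

The same structure extends naturally to a pedigree-driven database: each node can carry genotype or phenotype attributes, and queries become graph traversals.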
Qualitative Analysis for Maintenance Process Assessment
NASA Technical Reports Server (NTRS)
Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor
1996-01-01
In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.
Pros and Cons of Clinical Pathway Software Management: A Qualitative Study.
Aarnoutse, M F; Brinkkemper, S; de Mul, M; Askari, M
2018-01-01
In this study we aimed to assess the perceived effectiveness of clinical pathway management software for healthcare professionals. A case study of the clinical pathway management software program Check-It was performed in three departments at an academic medical center. Four months after the implementation of the software, interviews were held with healthcare professionals who work with the system. The questions were posed in a semi-structured interview format, and the participants were asked about the perceived positive or negative effects of Check-It and whether they thought the software was effective for them. The interviews were recorded, transcribed, and analyzed based on grounded theory, using different coding techniques. Our results showed fewer overlooked tasks, pre-filled orders and letters, a better overview, and increased protocol insight as positive aspects of using the software. A lack of flexibility was experienced as a negative aspect.
Cao, Di; Wang, Qing; Jin, Jing; Qiu, Maosong; Zhou, Lian; Zhou, Xinghong; Li, Hui; Zhao, Zhongxiang
2018-03-01
Ilex pubescens Hook et Arn mainly contains triterpenoids that possess antithrombotic, anti-inflammatory and analgesic effects. Quantitative and qualitative analyses of the triterpenoids in I. pubescens can be useful for determining the authenticity and quality of raw materials and guiding its clinical preparation. To establish a method for rapid and comprehensive analysis of triterpenoids in I. pubescens using ultra-high-performance liquid chromatography coupled to electrospray ionisation and quadrupole time-of-flight-mass spectrometry (UPLC-ESI-QTOF-MS), which will also be applied to evaluate the contents of nine triterpenoids among root, root heartwood and root bark of I. pubescens to judge the value of the root bark to avoid wastage. UPLC-ESI-QTOF-MS data from the extracts of I. pubescens in negative mode were analysed using Peakview and Masterview software that provided molecular weight, mass errors, isotope pattern fit and MS/MS fragments for the identification of triterpenoids. The quantification of nine investigated compounds of I. pubescens was accomplished using MultiQuant software. A total of 33 triterpenoids, five phenolic acids, two lignans and a flavonol were characterised in only 14 min. The total content of the nine compounds in the root bark was generally slightly higher than that of the root and root heartwood, which has not been reported before. The developed UPLC-ESI-QTOF-MS method was proven to be rapid and comprehensive for simultaneous qualitative and quantitative analyses of the characteristic triterpenoids in I. pubescens. The results may provide a basis for holistic quality control and metabolic studies of I. pubescens, as well as serve as a reference for the analysis of other Ilex plants. Copyright © 2017 John Wiley & Sons, Ltd.
Two Strategies for Qualitative Content Analysis: An Intramethod Approach to Triangulation.
Renz, Susan M; Carrington, Jane M; Badger, Terry A
2018-04-01
The overarching aim of qualitative research is to gain an understanding of certain social phenomena. Qualitative research involves the studied use and collection of empirical materials, all to describe moments and meanings in individuals' lives. Data derived from these various materials require a form of analysis of the content, focusing on written or spoken language as communication, to provide context and understanding of the message. Qualitative research often involves the collection of data through extensive interviews, note taking, and tape recording. These methods are time- and labor-intensive. With the advances in computerized text analysis software, the practice of combining methods to analyze qualitative data can assist the researcher in making large data sets more manageable and enhance the trustworthiness of the results. This article describes a novel process of combining two methods of qualitative data analysis, or intramethod triangulation, as a means to provide a deeper analysis of text.
Development of a 3-D Rehabilitation System for Upper Limbs Using ER Actuators in a Nedo Project
NASA Astrophysics Data System (ADS)
Furusho, Junji; Koyanagi, Ken'ichi; Nakanishi, Kazuhiko; Ryu, Ushio; Takenaka, Shigekazu; Inoue, Akio; Domen, Kazuhisa; Miyakoshi, Koichi
New training methods and exercises for upper limb rehabilitation are made possible by the application of robotics and virtual reality technology. These technologies can also provide quantitative evaluations and enhance the qualitative effect of training. We joined the 5-year project "Rehabilitation System for the Upper Limbs and Lower Limbs" managed by NEDO (the New Energy and Industrial Technology Development Organization, a semi-governmental organization under the Ministry of Economy, Trade and Industry of Japan) and developed a 3-DOF exercise machine for upper limbs (EMUL) using ER actuators. In this paper, we also present the development of software for motion exercise training and some results of clinical evaluation. Moreover, we discuss how ER actuators ensure mechanical safety.
Impact of the 3-D model strategy on science learning of the solar system
NASA Astrophysics Data System (ADS)
Alharbi, Mohammed
The purpose of this mixed-method study, quantitative and descriptive, was to determine whether first-middle grade (seventh grade) students at Saudi schools are able to learn and use the Autodesk Maya software to create their own 3-D models and animations, and whether their use of the software influences their study habits and their understanding of the school subject matter. The study revealed that there is value to science students in using 3-D software to create 3-D models for science assignments. This study also addressed the middle-school students' ability to learn 3-D software in art class and then use it in their science class. The success of this study may open the way to considering the impact of 3-D modeling on other school subjects, such as mathematics, art, and geography. When students start using graphic design, including 3-D software, at a young age, they tend to develop personal creativity and skills. The success of this study, if applied in schools, will provide the community with skillful young designers and increase awareness of graphic design and new 3-D technology. An experimental method was used to answer the quantitative research question: are there significant differences among learning methods using 3-D models (no 3-D, premade 3-D, and create 3-D) in a science class about the solar system, in terms of their impact on students' science achievement scores? A descriptive method was used to answer the qualitative research questions concerning the difficulty of learning and using the Autodesk Maya software, the time students take to use the basic Polygon and Animation features of the software, and the quality of students' work.
Pludwinski, Sarah; Ahmad, Farah; Wayne, Noah; Ritvo, Paul
2016-04-01
We investigated the experience of individuals diagnosed with type 2 diabetes mellitus (T2DM) who participated in an intervention whose key elements were the provision of a smartphone and self-monitoring software. The interviews focused on the use of a smartphone and its effects on motivation for health behavior change. This was a qualitative evaluation of participants in a larger T2DM self-management randomized controlled trial (RCT) conducted at the Black Creek Community Health Centre (BCCHC) in Toronto, Canada (ClinicalTrials.gov identifier: NCT02036892). The study is based on semi-structured interviews (n = 11) that were audio-recorded and analyzed with a thematic analytic approach. The RCT compared the effectiveness of six months of smartphone-based self-monitoring and health coaching with a control group who received health coaching without internet- or smartphone-based assistance. Qualitative data analyses yielded four major themes describing participant experience: (a) 'smartphone and software' describes smartphone use in relation to health behavior change; (b) 'health coach' describes how client/health coach relationships were assisted by smartphone use; (c) 'overall experience' describes perceptions of the overall intervention; and (d) 'frustrations in managing chronic conditions' describes difficulties with the complexities of T2DM management from a patient perspective. The findings suggest that T2DM interventions assisted by smartphone software and health coaches actively engage individuals in improved hemoglobin A1c (HbA1c) control. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
Fazliev, A.
2009-04-01
The information and knowledge layers of an information-computational system for water spectroscopy are described. Semantic metadata for all the tasks of the domain information model, which form the basis of these layers, have been studied. The principles of semantic metadata determination, and the mechanisms of their use in systematizing information in molecular spectroscopy, are revealed. The software developed for working with semantic metadata is described as well. Formation of a domain model in the framework of the Semantic Web is based on an explicit specification of its conceptualization or, in other words, its ontologies. The formation of a conceptualization for molecular spectroscopy was described in Refs. [1, 2], where two chains of tasks, a direct task chain and an inverse task chain, were selected as a zeroth approximation of the knowledge domain description. The solution schemes of these tasks define the data-layer approximation of the knowledge domain conceptualization. The properties of spectroscopy task solutions lead to a step-by-step extension of the molecular spectroscopy conceptualization; the information layer of the information system corresponds to this extension. An advantage of a molecular spectroscopy model designed as a chain of tasks is that one can explicitly define data and metadata at each step of the solution of these chained tasks. The metadata structure (the properties of task solutions) in the knowledge domain also has the form of a chain, in which the input data and metadata of the previous task become metadata of the following tasks. The term metadata is used here in its narrow sense: metadata are the properties of spectroscopy task solutions. Semantic metadata, represented with the help of OWL [3], are formed automatically and are individuals of classes (the A-box). The union of the T-box and the A-box is an ontology that can be processed with the help of an inference engine.
In this work we analyzed the formation of individuals of applied molecular spectroscopy ontologies, as well as the software used to create them by means of the OWL DL language. The results are presented as an information layer and a knowledge layer in the W@DIS information system [4].

1 FORMATION OF INDIVIDUALS OF THE WATER SPECTROSCOPY APPLIED ONTOLOGY

The applied task ontology contains an explicit description of the input and output data of the physical tasks solved in the two chains of molecular spectroscopy tasks. Besides the physical concepts related to spectroscopy task solutions, an information source, a key concept of the knowledge domain information model, is also used. Each solution of a knowledge domain task is linked to an information source, which contains a reference to the published task solution, the molecule, and the task solution properties. Each information source allows us to identify a particular knowledge domain task solution contained in the information system. The classes of the water spectroscopy applied ontology are formed on the basis of the molecular spectroscopy concept taxonomy; they are defined by constraints on the properties of the selected conceptualization. Extension of the applied ontology in the W@DIS information system follows two scenarios. Individuals (ontology facts, or axioms) are formed when a task solution is uploaded into the information system. Ontology operations involving the molecular spectroscopy taxonomy and individuals are performed solely by the user; for this purpose the Protege ontology editor was used. Software was designed and implemented for the formation, processing, and visualization of the individuals of knowledge domain tasks. The method of individual formation determines the sequence of steps for generating the individuals of the created ontology. Task solution properties (metadata) have qualitative and quantitative values.
Qualitative metadata describe the qualitative side of a task, such as the solution method or other information that can be explicitly specified by the ObjectProperty constructs of the OWL DL language. Quantitative metadata describe the quantitative properties of a task solution, such as the minimal and maximal data values or other information that can be obtained by programmed algorithmic operations; these metadata are related to the DatatypeProperty constructs of the OWL specification language. Quantitative metadata can be obtained automatically during data upload into the information system. Since ObjectProperty values are objects, processing qualitative metadata requires logical constraints. For tasks solved within the W@DIS ICS, qualitative metadata can be formed automatically (for example, in the spectral function calculation task). The methods used to translate qualitative metadata into quantitative form amount to a coarsened representation of knowledge in the knowledge domain. A key point in the formation of the applied ontology of molecular spectroscopy tasks is the existence of two ways of obtaining data: the experimental method (metadata for experimental data contain a description of the equipment, the experimental conditions, and so on) at the initial stage, with inverse task solutions at the following stages; and the calculation method (metadata for calculated data are closely related to the metadata used to describe the physical and mathematical models of molecular spectroscopy).

2 SOFTWARE FOR ONTOLOGY OPERATION

Data collection in the water spectroscopy information system is organized as a workflow that contains operations such as creating an information source, entering bibliographic data on publications, forming the schema of uploaded data, and so on. Metadata are also generated in the information source. Two methods are used for their formation: automatic metadata generation and manual metadata generation (performed by the user).
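The distinction drawn above, between quantitative metadata computed automatically on data upload and qualitative metadata that must be stated explicitly, can be sketched as follows. This is a simplified illustration, not W@DIS code; the property names and line positions are hypothetical.

```python
def build_task_metadata(line_positions_cm, solution_method):
    """Sketch of forming metadata for a spectroscopy task solution:
    quantitative values (analogous to OWL DatatypeProperty values) are
    derived automatically from the uploaded data set, while qualitative
    values (analogous to ObjectProperty values) are supplied explicitly."""
    return {
        # quantitative metadata: computed from the data set itself
        "minWavenumber": min(line_positions_cm),
        "maxWavenumber": max(line_positions_cm),
        "numberOfLines": len(line_positions_cm),
        # qualitative metadata: stated by the user or calculation pipeline
        "solutionMethod": solution_method,
    }

# Hypothetical water-vapour line positions (cm^-1)
lines = [1594.7, 3657.1, 3755.9]
meta = build_task_metadata(lines, "variational calculation")
print(meta)
```

In the ontology these key-value pairs would become property assertions on an individual of the corresponding task class, ready for processing by an inference engine.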
Software support for actions related to metadata formation is provided by the META+ module. The functions of the META+ module can be divided into two groups: the first contains the functions needed by the software developer, the second those needed by a user of the information system. The META+ module functions needed by the developer are:

1. creation of the taxonomy (T-boxes) of applied ontology classes of knowledge domain tasks;
2. creation of instances of task classes;
3. creation of task data schemes in the form of an XML pattern based on XML syntax (the XML pattern is developed for the instance generator and created according to rules imposed by the software generator implementation);
4. implementation of metadata value calculation algorithms;
5. creation of a request interface and additional knowledge-processing functions for the solution of these tasks;
6. unification of the created functions and interfaces into one information system.

The following sequence is universal for the generation of individuals of task classes that form chains. Special interfaces for managing user operations are provided for the software developer in the META+ module, including means for updating qualitative metadata values when data are re-uploaded to an information source.
The functions needed by the end user are:

- visualization and editing of data sets, taking their metadata into account, e.g. display of the unique number of bands in transitions for a given data source;
- export of OWL/RDF models from the information system to the environment in XML syntax;
- visualization of instances of classes of the applied ontology of molecular spectroscopy tasks;
- import of OWL/RDF models into the information system and their integration with the domain vocabulary;
- formation of additional knowledge in the knowledge domain for the construction of ontological instances of task classes using GTML formats and their processing;
- formation of additional knowledge in the knowledge domain for the construction of instances of task classes, using software algorithms for data set processing;
- semantic search through an interface that formulates questions as related triplets in order to obtain an adequate answer.

3 STRUCTURE OF THE META+ MODULE

The META+ software module that provides the above functions contains the following components:

- a knowledge base that stores the semantic metadata and taxonomies of the information system;
- the third-party software libraries POWL and RAP [5], which provide access to the ontological storage;
- function classes and libraries that form the core of the module and perform the formation, storage, and visualization of class instances;
- configuration files and module patterns that allow one to adjust and organize the operation of the different functional blocks.

The META+ module also contains scripts and patterns implemented according to the rules of the W@DIS information system development environment:

- scripts for interaction with the environment by means of the software core of the information system.
These scripts provide web-oriented interactive communication;

- patterns for the formation of the functionality visualization realized by the scripts.

The software core of the scientific information-computational system W@DIS is built with the MVC (Model-View-Controller) design pattern, which allows the logic of an application to be separated from its representation. It realizes the interaction of three logical components, providing interactivity with the environment via the Web and performing its preprocessing. The functions of the «Controller» logical component are realized by scripts designed according to the rules imposed by the software core of the information system; each script is a definite object-oriented class with an obligatory initiation method called "start". The functions that present the results of domain application operation (the "View" component) are sets of HTML patterns that visualize those results with the help of additional constructions processed by the software core of the system. Besides interacting with the software core of the scientific information system, this module also deals with the configuration files of the software core and its database. Such an organization provides tighter integration with the software core and a more reliable connection with the supporting environment.

4 CONCLUSION

In this work the problems of semantic metadata creation have been discussed for an information system oriented toward the representation of information in molecular spectroscopy. The method of forming semantic metadata and functions, as well as the realization and structure of the META+ module, have been described. The architecture of the META+ module is closely related to the existing software of the "Molecular Spectroscopy" scientific information system. The module is implemented using modern approaches to the development of Web-oriented applications.
It uses the existing applied interfaces. The developed software allows us to:

- perform automatic metadata annotation of calculated task solutions directly in the information system;
- perform automatic metadata annotation of task solutions whose results were obtained outside the information system, forming an instance of the solved task on the basis of the entered data;
- use ontological instances of task solutions to identify data in the information tasks of viewing, comparison, and search solved by the information system;
- export applied task ontologies for use by external tools;
- solve the semantic search task according to a pattern, using a question-answer interface.

5 ACKNOWLEDGEMENT

The authors are grateful to RFBR for financial support of the development of the distributed information system for molecular spectroscopy.

REFERENCES

[1] A. D. Bykov, A. Z. Fazliev, N. N. Filippov, A. V. Kozodoev, A. I. Privezentsev, L. N. Sinitsa, M. V. Tonkov, and M. Yu. Tretyakov, "Distributed information system on atmospheric spectroscopy," Geophysical Research Abstracts, SRef-ID: 1607-7962/gra/EGU2007-A-01906, 2007, v. 9, p. 01906.
[2] A. I. Privezentsev and A. Z. Fazliev, "Applied task ontology for the systematization of molecular spectroscopy information resources," Proceedings of the 9th Russian Scientific Conference "Electronic Libraries: Advanced Methods and Technologies, Electronic Collections" (RCDL'2007), Pereslavl-Zalesskii, 2007, part 1, pp. 201-210.
[3] OWL Web Ontology Language Semantics and Abstract Syntax, W3C Recommendation, 10 February 2004, http://www.w3.org/TR/2004/REC-owl-semantics-20040210/
[4] W@DIS information system, http://wadis.saga.iao.ru
[5] RAP library, http://www4.wiwiss.fu-berlin.de/bizer/rdfapi/
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eilert, A.J.; Danley, W.J.; Wang, Xiaolu
1995-12-31
A near-infrared analyzer utilizing state-of-the-art acousto-optic tunable filter (AOTF) spectrometry with digital wavelength control and a high-D*, extended-range InGaAs TE-cooled detector provides excellent wavelength repeatability (better than 0.02 nm) and a very high signal-to-noise ratio. Because the AOTF dispersive element is completely solid-state (no moving parts), as is the entire spectrometer, the instrument is small, rugged and very reliable. Using this spectrometer, methods employing chemometrics have been developed and applied to measure organic contaminants such as gasoline and a variety of jet fuels in water. Qualitative identification of contaminants was achieved with discriminant analysis software developed specifically for this task. Both the technique of grouping sample spectra into specific clusters based on Mahalanobis distances and that of matching each spectrum with the most representative member of the appropriate group of calibration spectra were used to identify contaminants. After initial classification, appropriate quantitative chemometric calibrations may be applied to more accurately assess the level of contamination. The instrument will be used to evaluate ground water supplies.
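The Mahalanobis-distance cluster grouping mentioned above can be sketched as follows. This is an illustrative re-implementation, not the instrument's discriminant analysis software, and the cluster names and data are hypothetical.

```python
# Sketch: assign a sample spectrum to the nearest contaminant cluster by
# Mahalanobis distance. Clusters are assumed to be summarized by a mean
# spectrum and an inverse covariance matrix (illustrative, not the actual
# calibration model of the analyzer described in the abstract).
import numpy as np

def mahalanobis(x, mean, cov_inv):
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

def classify(spectrum, clusters):
    """clusters: dict name -> (mean vector, inverse covariance matrix).
    Returns the nearest cluster name and all distances."""
    distances = {name: mahalanobis(spectrum, mu, ci)
                 for name, (mu, ci) in clusters.items()}
    return min(distances, key=distances.get), distances
```

In practice the cluster statistics would come from calibration spectra; here they would be estimated with `np.mean` and `np.linalg.inv(np.cov(...))` over each group.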
Neville, C; Da Costa, D; Mill, C; Rochon, M; Aviña-Zubieta, J A; Pineau, C A; Eng, D; Fortin, P R
2014-02-01
Systemic lupus erythematosus is an inflammatory autoimmune disease associated with high morbidity and unacceptable mortality. A major challenge for persons with lupus is coping with their illness and complex care. Our objective was to identify the informational and resource needs of persons with lupus, rheumatologists, and allied health professionals treating lupus. Our findings will be applied toward the development of an innovative web-based technology, the Lupus Interactive Navigator (LIN™), to facilitate and support engagement and self-management for persons with lupus. Eight focus groups were conducted: four groups of persons with lupus (n=29), three groups of rheumatologists (n=20), and one group of allied health professionals (n=8). The groups were held in British Columbia, Ontario, and Quebec. All sessions were audio-recorded and transcribed verbatim. Qualitative analysis was performed using grounded theory. The transcripts were reviewed independently and coded by the moderator and co-moderator using 1) qualitative data analysis software developed by Provalis Research, Montreal, Canada, and 2) manual coding. Four main themes emerged: 1) specific information and resource needs; 2) barriers to engagement in health care; 3) facilitators of engagement in health care; and 4) tools identified as helpful for the self-management of lupus. These findings will help guide the scope of LIN™ with relevant information topics and specific tools that will be most helpful to the diverse needs of persons with lupus and their health care providers.
The NASA Integrated Vehicle Health Management Technology Experiment for X-37
NASA Technical Reports Server (NTRS)
Schwabacher, Mark; Samuels, Jeff; Brownston, Lee; Clancy, Daniel (Technical Monitor)
2002-01-01
The NASA Integrated Vehicle Health Management (IVHM) Technology Experiment for X-37 was intended to run IVHM software on-board the X-37 spacecraft. The X-37 is intended to be an unpiloted vehicle that would orbit the Earth for up to 21 days before landing on a runway. The objectives of the experiment were to demonstrate the benefits of in-flight IVHM to the operation of a Reusable Launch Vehicle, to advance the Technology Readiness Level of this IVHM technology within a flight environment, and to demonstrate that the IVHM software could operate on the Vehicle Management Computer. The scope of the experiment was to perform real-time fault detection and isolation for X-37's electrical power system and electro-mechanical actuators. The experiment used Livingstone, a software system that performs diagnosis using a qualitative, model-based reasoning approach that searches system-wide interactions to detect and isolate failures. Two of the challenges we faced were to make this research software more efficient so that it would fit within the limited computational resources that were available to us on the X-37 spacecraft, and to modify it so that it satisfied the X-37's software safety requirements. Although the experiment is currently unfunded, the development effort had value in that it resulted in major improvements in Livingstone's efficiency and safety. This paper reviews some of the details of the modeling and integration efforts, and some of the lessons that were learned.
Perfusion CT in acute stroke: effectiveness of automatically-generated colour maps.
Ukmar, Maja; Degrassi, Ferruccio; Pozzi Mucelli, Roberta Antea; Neri, Francesca; Mucelli, Fabio Pozzi; Cova, Maria Assunta
2017-04-01
To evaluate the accuracy of perfusion CT (pCT) in the definition of the infarcted core and the penumbra, comparing the data obtained from the evaluation of parametric maps [cerebral blood volume (CBV), cerebral blood flow (CBF) and mean transit time (MTT)] with software-generated colour maps. A retrospective analysis was performed to identify patients with suspected acute ischaemic stroke who had undergone unenhanced CT and pCT within 4.5 h of symptom onset. A qualitative evaluation of the CBV, CBF and MTT maps was performed, followed by an analysis of the colour maps automatically generated by the software. 26 patients were identified, but direct CT follow-up was performed on only 19 patients after 24-48 h. In the qualitative analysis, 14 patients showed perfusion abnormalities. Specifically, 29 perfusion deficit areas were detected, of which 15 suggested penumbra and the remaining 14 suggested infarct. As for the automatically software-generated maps, 12 patients showed perfusion abnormalities; 25 perfusion deficit areas were identified, of which 15 suggested penumbra and the other 10 infarct. McNemar's test showed no statistically significant difference between the two methods of evaluation in highlighting infarcted areas later confirmed at CT follow-up. We demonstrated that pCT provides good diagnostic accuracy in the identification of acute ischaemic lesions. The limits of identification of the lesions mainly lie at the level of the pons and in the basal ganglia area. Qualitative analysis proved more efficient than software-generated maps in the identification of perfusion lesions. However, software-generated maps proved very useful in the emergency setting. Advances in knowledge: The use of CT perfusion is requested for increasingly more patients in order to optimize treatment, thanks also to the technological evolution of CT, which now allows a whole-brain study. 
The need to perform CT perfusion studies in the emergency setting as well could present a problem for physicians who are not used to interpreting the parametric maps (CBV, MTT etc.). The software-generated maps could be of value in these settings, helping less experienced physicians differentiate between the different areas.
Virtual Planetary Analysis Environment for Remote Science
NASA Technical Reports Server (NTRS)
Keely, Leslie; Beyer, Ross; Edwards, Laurence; Lees, David
2009-01-01
All of the data for NASA's current planetary missions and most data for field experiments are collected via orbiting spacecraft, aircraft, and robotic explorers. Mission scientists are unable to employ traditional field methods when operating remotely. We have developed a virtual exploration tool for remote sites with data analysis capabilities that extend human perception quantitatively and qualitatively. Scientists and mission engineers can use it to explore a realistic representation of a remote site. It also provides software tools to "touch" and "measure" remote sites with an immediacy that boosts scientific productivity and is essential for mission operations.
McCord, Layne K; Scarfe, William C; Naylor, Rachel H; Scheetz, James P; Silveira, Anibal; Gillespie, Kevin R
2007-05-01
The objectives of this study were to assess the effect of JPEG 2000 compression of hand-wrist radiographs on observers' qualitative image quality ratings and to compare these with a software-derived quantitative image quality index. Fifteen hand-wrist radiographs were digitized and saved as TIFF and JPEG 2000 images at 4 levels of compression (20:1, 40:1, 60:1, and 80:1). The images, including rereads, were viewed by 13 orthodontic residents who rated image quality on a scale of 1 to 5. A quantitative analysis was also performed using readily available software based on the human visual system (Image Quality Measure Computer Program, version 6.2, Mitre, Bedford, Mass). ANOVA was used to determine the optimal compression level (P ≤ .05). When we compared subjective indexes, JPEG compression greater than 60:1 significantly reduced image quality. When we used quantitative indexes, the JPEG 2000 images had lower quality at all compression ratios compared with the original TIFF images. There was excellent correlation (R2 > 0.92) between qualitative and quantitative indexes. Image Quality Measure indexes are more sensitive than subjective image quality assessments in quantifying image degradation with compression. There is potential for this software-based quantitative method in determining the optimal compression ratio for any image without the use of subjective raters.
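The Image Quality Measure program used in the study is based on a human-visual-system model and is not reproduced here. As a generic stand-in, the sketch below computes PSNR, a simple quantitative index comparing a compressed image against the original; function and variable names are illustrative.

```python
# Sketch of a quantitative image quality index: peak signal-to-noise ratio
# between an original (e.g. TIFF) image and its compressed version.
# This is NOT the HVS-based Image Quality Measure used in the study.
import numpy as np

def psnr(original, compressed, max_value=255.0):
    """PSNR in dB; higher means less degradation."""
    mse = np.mean((original.astype(float) - compressed.astype(float)) ** 2)
    if mse == 0:
        return float("inf")   # identical images
    return float(10 * np.log10(max_value ** 2 / mse))
```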
Learning to consult with computers.
Liaw, S T; Marty, J J
2001-07-01
To develop and evaluate a strategy to teach skills and issues associated with computers in the consultation. An overview lecture plus a workshop before and a workshop after practice placements, during the 10-week general practice (GP) term in the 5th year of the University of Melbourne medical course. Pre- and post-intervention study using a mix of qualitative and quantitative methods within a strategic evaluation framework. Self-reported attitudes and skills with clinical applications before, during and after the intervention. Most students had significant general computer experience but little in the medical area. They found the workshops relevant, interesting and easy to follow. The role-play approach facilitated students' learning of relevant communication and consulting skills and an appreciation of issues associated with using the information technology tools in simulated clinical situations to augment and complement their consulting skills. The workshops and exposure to GP systems were associated with an increase in the use of clinical software, more realistic expectations of existing clinical and medical record software and an understanding of the barriers to the use of computers in the consultation. The educational intervention assisted students to develop and express an understanding of the importance of consulting and communication skills in teaching and learning about medical informatics tools, hardware and software design, workplace issues and the impact of clinical computer systems on the consultation and patient care.
ESSAA: Embedded system safety analysis assistant
NASA Technical Reports Server (NTRS)
Wallace, Peter; Holzer, Joseph; Guarro, Sergio; Hyatt, Larry
1987-01-01
The Embedded System Safety Analysis Assistant (ESSAA) is a knowledge-based tool that can assist in identifying disaster scenarios. Embedded software can issue hazardous control commands to the surrounding hardware. ESSAA is intended to work from outputs to inputs, as a complement to simulation and verification methods. Rather than treating the software in isolation, it examines the context in which the software is to be deployed. Given a specified disastrous outcome, ESSAA works from a qualitative, abstract model of the complete system to infer sets of environmental conditions and/or failures that could cause that outcome. The scenarios can then be examined in depth for plausibility using existing techniques.
Practices and Challenges of Growth Monitoring and Promotion in Ethiopia: A Qualitative Study
Moser, Albine; Blanco, Roman; Spigt, Mark; Dinant, Geert Jan
2014-01-01
ABSTRACT The use of growth monitoring and promotion (GMP) has become widespread. It is a potential contributor towards achieving the Millennium Development Goals of halving hunger and reducing child mortality by two-thirds by 2015. GMP is regarded as a prerequisite for good child health, yet several studies have shown a discrepancy between its purpose and its practice, and the high prevalence of malnutrition in many developing countries seems to confirm this. A descriptive qualitative study was carried out from April to September 2011. Focus group discussions and in-depth interviews were conducted amongst mothers and health workers. Data were analyzed using a qualitative content analysis technique, with the support of ATLAS.ti 5.0 software. The results suggest that most mothers were aware of the need for regular weight monitoring, while health workers also seemed to be well aware of GMP and to practise it according to the international guidelines. However, there was a deficit in maternal knowledge with regard to child-feeding and a lack of basic resources to keep and/or buy healthful and nutritionally rich food. Furthermore, the role of the husband was not always supportive of proper child-feeding. In general, GMP is unlikely to succeed if mothers lack awareness of proper child-feeding practices and if they are not supported by their husbands. PMID:25395907
A Generic Modeling Process to Support Functional Fault Model Development
NASA Technical Reports Server (NTRS)
Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.
2016-01-01
Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.
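As a hypothetical illustration of the failure-effect propagation described above (the graph structure and node names are assumptions, not NASA's modeling conventions), an FFM can be viewed as a directed graph traversed from a failure mode to the observation points it can affect:

```python
# Sketch: failure-effect propagation in a functional fault model as a
# breadth-first traversal of a directed graph from a failure mode to the
# observation points it reaches. Node names are illustrative only.
from collections import deque

def reachable_observations(graph, failure_mode, observations):
    """graph: dict node -> list of downstream nodes.
    Returns the sorted observation points reachable from failure_mode."""
    seen, queue = {failure_mode}, deque([failure_mode])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return sorted(seen & set(observations))
```

Diagnosis then inverts this mapping: the set of failure modes whose reachable observations match the observed symptoms are the candidate faults.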
Descriptions of Free and Freeware Software in the Mathematics Teaching
NASA Astrophysics Data System (ADS)
Antunes de Macedo, Josue; Neves de Almeida, Samara; Voelzke, Marcos Rincon
2016-05-01
This paper presents an analysis and cataloguing of free and freeware mathematical software available on the internet, a brief explanation of each, and the types of licenses for use in teaching and learning. The methodology is based on qualitative research. Among the different types of software found, Winmat stands out in algebra: it works with linear algebra, matrices and linear systems. In geometry, GeoGebra can be used in the study of functions, plane and spatial geometry, algebra and calculus. For graphing, Graph and Graphequation can be cited. With the Graphmatica software, it is possible to build various graphs of mathematical equations on the same screen, representing cartesian equations, inequalities and parametric functions, among others. Winplot allows the user to build graphs of functions and mathematical equations in two and three dimensions. Thus, this work aims to present teachers with some free math software suitable for use in the classroom.
Lessons learned using Web conference technology for online focus group interviews.
Tuttas, Carol A
2015-01-01
Researchers use Internet technology for data collection in qualitative studies. In the literature there are published accounts of synchronous (real-time) and more commonly, asynchronous (not-real-time) focus group data collection methods supported by Internet technology in the form of email correspondence, LISTSERVs, discussion boards, and chat rooms. Real-time audiovisual Web conference technology offers qualitative researchers a promising alternative means to carry out focus groups. In this methodological article I describe how I used Web conference technology to host online focus groups for a qualitative study about job integration experiences of travel nurses geographically dispersed across the United States. I describe lessons learned from the use of this innovative method for qualitative data collection, including a brief overview about the use of dictation software for transcription. This new knowledge is useful to researchers considering Web conference technology to carry out focus group data collection in qualitative research. © The Author(s) 2014.
Priority of VHS Development Based in Potential Area using Principal Component Analysis
NASA Astrophysics Data System (ADS)
Meirawan, D.; Ana, A.; Saripudin, S.
2018-02-01
The current condition of VHS is still inadequate in quality, quantity and relevance. The purpose of this research is to analyse the development of VHS based on regional potential using principal component analysis (PCA) in Bandung, Indonesia. This study used descriptive qualitative analysis, reducing secondary data to principal components. The method used is Principal Component Analysis (PCA) with the Minitab statistics software. The results indicate that the areas with the lowest scores are the priority for construction of new VHS, with programs of majors matched to the development of regional potential. Based on the PCA scores, the main priority for VHS development in Bandung is Saguling, which has the lowest PCA value of 416.92 in area 1, followed by Cihampelas with the lowest PCA value in area 2 and Padalarang with the lowest PCA value in area 3.
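The study used Minitab for its PCA; the underlying computation can be sketched as an eigen-decomposition of the covariance matrix. The data in the usage example are illustrative, not the study's regional indicators.

```python
# Minimal PCA sketch: project mean-centered observations onto the
# directions of largest variance. Not the Minitab procedure itself,
# but the same underlying computation.
import numpy as np

def pca_scores(data, n_components=1):
    """data: observations x variables array. Returns PC scores."""
    data = np.asarray(data, dtype=float)
    centered = data - data.mean(axis=0)
    cov = np.cov(centered, rowvar=False)          # variables x variables
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]             # largest variance first
    components = eigvecs[:, order[:n_components]]
    return centered @ components                  # one score row per observation
```

Areas would then be ranked by their first-component score, as in the study's prioritization.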
Software Engineering Research/Developer Collaborations (C104)
NASA Technical Reports Server (NTRS)
Shell, Elaine; Shull, Forrest
2005-01-01
The goal of this collaboration was to produce Flight Software Branch (FSB) process standards for software inspections which could be used across three new missions within the FSB. The standard was developed by Dr. Forrest Shull (Fraunhofer Center for Experimental Software Engineering, Maryland) using the Perspective-Based Inspection (PBI) approach (PBI research has been funded by SARP), then tested on a pilot Branch project. Because the short time scale of the collaboration ruled out a quantitative evaluation, the standard's suitability for roll-out to other Branch projects would be decided on a qualitative measure: whether the standard received high ratings from Branch personnel as to usability and overall satisfaction. The project used for piloting the Perspective-Based Inspection approach was a multi-mission framework designed for reuse. This was a good choice because key representatives from the three new missions would be involved in the inspections. The perspective-based approach was applied to produce inspection procedures tailored for the specific quality needs of the branch. The technical information to do so was largely drawn from a series of interviews with Branch personnel. The framework team used the procedures to review requirements. The inspections were useful for indicating that a restructuring of the requirements document was needed, which led to changes in the development project plan. The standard was sent out to other Branch personnel for review, and their response was very positive. However, important changes were identified because the perspective of Attitude Control System (ACS) developers had not been adequately represented, a result of the specific personnel interviewed. The net result is that, with some further work to incorporate the ACS perspective, and in synchrony with the roll-out of independent Branch standards, the PBI approach will be implemented in the FSB. 
Also, the project intends to continue its collaboration with the technology provider (Dr. Forrest Shull) past the end of the grant, to allow a more rigorous quantitative evaluation.
Development of the electronic health records for nursing education (EHRNE) software program.
Kowitlawakul, Yanika; Wang, Ling; Chan, Sally Wai-Chi
2013-12-01
This paper outlines preliminary research of an innovative software program that enables the use of an electronic health record in a nursing education curriculum. The software application program is called EHRNE, which stands for Electronic Heath Record for Nursing Education. The aim of EHRNE is to enhance student's learning of health informatics when they are working in the simulation laboratory. Integrating EHRNE into the nursing curriculum exposes students to electronic health records before they go into the workplace. A qualitative study was conducted using focus group interviews of nine nursing students. Nursing students' perceptions of using the EHRNE application were explored. The interviews were audio-taped and transcribed verbatim. The data was analyzed following the Colaizzi (1978) guideline. Four main categories that related to the EHRNE application were identified from the interviews: functionality, data management, timing and complexity, and accessibility. The analysis of the data revealed advantages and limitations of using EHRNE in the classroom setting. Integrating the EHRNE program into the curriculum will promote students' awareness of electronic documentation and enhance students' learning in the simulation laboratory. Preliminary findings suggested that before integrating the EHRNE program into the nursing curriculum, educational sessions for both students and faculty outlining the software's purpose, advantages, and limitations were needed. Following the educational sessions, further investigation of students' perceptions and learning using the EHRNE program is recommended. Copyright © 2012 Elsevier Ltd. All rights reserved.
A virtual environment for medical radiation collaborative learning.
Bridge, Pete; Trapp, Jamie V; Kastanis, Lazaros; Pack, Darren; Parker, Jacqui C
2015-06-01
A software-based environment was developed to provide practical training in medical radiation principles and safety. The Virtual Radiation Laboratory application allowed students to conduct virtual experiments using simulated diagnostic and radiotherapy X-ray generators. The experiments were designed to teach students about the inverse square law, half value layer and radiation protection measures and utilised genuine clinical and experimental data. Evaluation of the application was conducted in order to ascertain the impact of the software on students' understanding, satisfaction and collaborative learning skills and also to determine potential further improvements to the software and guidelines for its continued use. Feedback was gathered via an anonymous online survey consisting of a mixture of Likert-style questions and short answer open questions. Student feedback was highly positive with 80 % of students reporting increased understanding of radiation protection principles. Furthermore 72 % enjoyed using the software and 87 % of students felt that the project facilitated collaboration within small groups. The main themes arising in the qualitative feedback comments related to efficiency and effectiveness of teaching, safety of environment, collaboration and realism. Staff and students both report gains in efficiency and effectiveness associated with the virtual experiments. In addition students particularly value the visualisation of "invisible" physical principles and increased opportunity for experimentation and collaborative problem-based learning. Similar ventures will benefit from adopting an approach that allows for individual experimentation while visualizing challenging concepts.
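The two radiation principles the virtual experiments target can be sketched as follows. This is a simplified model with illustrative function names; real exposure calculations involve further factors such as scatter and beam quality.

```python
# Sketch of the physics behind the virtual experiments: the inverse square
# law and half-value-layer (HVL) attenuation. Illustrative only.

def inverse_square(intensity_at_1m, distance_m):
    """Intensity falls off with the square of distance from the source."""
    return intensity_at_1m / distance_m ** 2

def attenuated(intensity, thickness_mm, hvl_mm):
    """Each half-value layer of shielding halves the transmitted intensity."""
    return intensity * 0.5 ** (thickness_mm / hvl_mm)
```

For example, doubling the distance quarters the intensity, and two HVLs of shielding transmit one quarter of the beam.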
Bonekamp, S; Ghosh, P; Crawford, S; Solga, S F; Horska, A; Brancati, F L; Diehl, A M; Smith, S; Clark, J M
2008-01-01
To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features and assess the reliability of measurement results. Feature evaluation and test-retest reliability of software packages (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision) used in manual, semi-automated or automated segmentation of abdominal adipose tissue. A random sample of 15 obese adults with type 2 diabetes. Axial T1-weighted spin echo images centered at vertebral bodies of L2-L3 were acquired at 1.5 T. Five software packages were evaluated (NIHImage, SliceOmatic, Analyze, HippoFat and EasyVision), comparing manual, semi-automated and automated segmentation approaches. Images were segmented into cross-sectional area (CSA) and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. The intra-class correlation coefficient was used to obtain test-retest reliability. Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error, and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test-retest reliability. There were some differences between the software packages in qualitative aspects, such as user friendliness. Four out of five packages provided essentially the same results with respect to the inter- and intra-rater reproducibility. 
Our results using SliceOmatic, Analyze or NIHImage were comparable and could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages.
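The test-retest reliability reported above is quantified with the intra-class correlation coefficient. A minimal sketch of a one-way random-effects ICC(1,1), assuming an n-subjects by k-repeats layout, is shown below; the study's actual ICC variant may differ.

```python
# Sketch: one-way random-effects intra-class correlation ICC(1,1),
# used to quantify test-retest reliability of repeated measurements.
import numpy as np

def icc_oneway(measurements):
    """measurements: n_subjects x k_repeats array. Returns ICC(1,1)."""
    m = np.asarray(measurements, dtype=float)
    n, k = m.shape
    grand = m.mean()
    # Between-subjects and within-subject mean squares from one-way ANOVA.
    ms_between = k * np.sum((m.mean(axis=1) - grand) ** 2) / (n - 1)
    ms_within = np.sum((m - m.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return float((ms_between - ms_within) / (ms_between + (k - 1) * ms_within))
```

Perfectly repeatable measurements give an ICC of 1; added within-subject noise pulls the coefficient toward 0.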
Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software
Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E.
2018-01-01
Background The rehabilitation process is a fundamental stage in the recovery of people's capabilities. However, evaluation of the process is performed by physiatrists and medical doctors mostly on the basis of observation, that is, a subjective appreciation of the patient's evolution. This paper proposes a platform that tracks the movement of an individual's upper limb using Kinect sensor(s), to be applied to patients during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. Methods The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and reporting of the patient's data. For clinical purposes, the software's measurements are statistically validated with three different methodologies, comparing them with a goniometer in terms of agreement and repeatability. Results The agreement of joint angles measured with the proposed software and a goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, indicating interchangeability of the two techniques. Additionally, the results of the Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for clinical use. Conclusion The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantifying rehabilitation success. Its simplicity, low cost, and visualization possibilities enhance the use of Kinect-based software for rehabilitation and other applications, and the experts' opinion endorses the choice of our approach for clinical practice. 
Comparison of the new measurement technique with established goniometric methods determines that the proposed software agrees sufficiently to be used interchangeably. PMID:29750166
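The Bland-Altman agreement analysis described above can be sketched as follows; the data in the usage example are illustrative, not the study's joint-angle measurements.

```python
# Sketch: Bland-Altman agreement between two measurement methods
# (e.g. software-derived joint angles vs. a goniometer): mean bias and
# 95% limits of agreement (bias +/- 1.96 * SD of the differences).
import numpy as np

def bland_altman(method_a, method_b):
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diffs = a - b
    bias = float(diffs.mean())
    sd = float(diffs.std(ddof=1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Two methods are considered interchangeable when nearly all differences fall within these limits and the limits are clinically acceptable.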
Muehlwald, S; Buchner, N; Kroh, L W
2018-03-23
Because of the high number of possible pesticide residues and their chemical complexity, it is necessary to develop methods which cover a broad range of pesticides. In this work, a qualitative multi-screening method for pesticides was developed by use of HPLC-ESI-Q-TOF. 110 pesticides were chosen for the creation of a personal compound database and library (PCDL). The MassHunter Qualitative Analysis software from Agilent Technologies was used to identify the analytes. The software parameter settings were optimised to produce a low number of false positive as well as false negative results. The method was validated for 78 selected pesticides. However, the validation criteria were not fulfilled for 45 analytes. Due to this result, investigations were started to elucidate reasons for the low detectability. It could be demonstrated that the three main causes of the signal suppression were the co-eluting matrix (matrix effect), the low sensitivity of the analyte in standard solution and the fragmentation of the analyte in the ion source (in-source collision-induced dissociation). In this paper different examples are discussed showing that the impact of these three causes is different for each analyte. For example, it is possible that an analyte with low signal intensity and an intense fragmentation in the ion source is detectable in a difficult matrix, whereas an analyte with a high sensitivity and a low fragmentation is not detectable in a simple matrix. Additionally, it could be shown that in-source fragments are a helpful tool for an unambiguous identification. Copyright © 2018 Elsevier B.V. All rights reserved.
Learning Visualization Strategies: A qualitative investigation
NASA Astrophysics Data System (ADS)
Halpern, Daniel; Oh, Kyong Eun; Tremaine, Marilyn; Chiang, James; Bemis, Karen; Silver, Deborah
2015-12-01
The following study investigates the range of strategies individuals develop to infer and interpret cross-sections of three-dimensional objects. We focus on the mental representations and problem-solving processes employed by 11 individuals, with the goal of building training applications that integrate the strategies developed by the participants in our study. Our results suggest that although spatial transformation and perspective-taking techniques are useful for visualizing cross-section problems, these visual processes are augmented by analytical thinking. Further, our study shows that participants employ general analytic strategies for extended periods, which evolve through practice into a set of progressively more expert strategies. Theoretical implications are discussed, and five main findings are recommended for integration into the design of education software that facilitates visual learning and comprehension.
C-quence: a tool for analyzing qualitative sequential data.
Duncan, Starkey; Collier, Nicholson T
2002-02-01
C-quence is a software application that matches sequential patterns of qualitative data specified by the user and calculates the rate of occurrence of these patterns in a data set. Although it was designed to facilitate analyses of face-to-face interaction, it is applicable to any data set involving categorical data and sequential information. C-quence queries are constructed using a graphical user interface. The program does not limit the complexity of the sequential patterns specified by the user.
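The core operation C-quence performs, matching a user-specified sequential pattern against categorical data and computing its rate of occurrence, can be sketched in a few lines. The event codes and pattern below are hypothetical illustrations, not C-quence's actual data model or query syntax:

```python
# Sketch of C-quence-style matching: count occurrences of a specified
# sequential pattern in a stream of categorical codes and report its
# rate per event. Codes and pattern here are hypothetical.
def pattern_rate(events, pattern):
    """Return (count, rate) of `pattern` as a contiguous subsequence."""
    n, m = len(events), len(pattern)
    count = sum(events[i:i + m] == pattern for i in range(n - m + 1))
    return count, count / n if n else 0.0

# Example: turn-taking codes from a face-to-face interaction transcript
events = ["gaze", "nod", "speak", "gaze", "nod", "speak", "pause", "gaze"]
count, rate = pattern_rate(events, ["gaze", "nod", "speak"])
print(count, round(rate, 3))  # 2 occurrences
```

As the abstract notes, nothing here is specific to face-to-face interaction: any categorical, sequentially ordered data set can be queried this way.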
Aural rehabilitation through music workshops for cochlear implant users.
van Besouw, Rachel M; Nicholls, David R; Oliver, Benjamin R; Hodkinson, Sarah M; Grasmeder, Mary L
2014-04-01
It has been reported that after speech perception, music appreciation is the second most commonly expressed requirement among cochlear implant (CI) recipients. Certain features of music are known to be more readily accessible; however, provision of music rehabilitation for adult CI users is limited. A series of music workshops was organized to (1) enable attendees to explore which aspects of music they are able to perceive and appreciate; (2) raise awareness of listening strategies, technology, and rehabilitation resources for music; and (3) develop ideas, and prototype software, for inclusion in a music rehabilitation program. The therapeutic value of music workshops was concurrently investigated. A qualitative, longitudinal study was used. Two consultation meetings were held before a series of nine music workshops that occurred over a period of 5 mo. Five adult CI users participated in consultations before the workshops. Twenty-eight adult CI users from the South of England Cochlear Implant Centre attended at least one of the workshops. Participants could attend as many workshops as they wished. Each workshop lasted between 2 and 2.5 hr and included individual computer-based and group activities. Responses to open-ended questions were transcribed in the consultation meetings and used to develop workshop activities. A preworkshop survey was used to determine attendees' aspirations and expectations. Postworkshop surveys were used to qualitatively and quantitatively evaluate attendees' immediate reactions to the workshop content, software, and perceived benefits. A 2-month postworkshop survey evaluated the longer-term impact of the workshops. Overall reaction to the workshops and prototype software was positive. All attendees indicated that they anticipated changing how they engaged with music as a result of the workshops, and data from the preworkshop and postworkshop surveys suggest a positive change in listening habits. 
The workshops proved to be an effective means of simultaneously encouraging music exploration in a social and safe environment and obtaining feedback on prototype rehabilitation materials. Survey data suggested that through group listening and practical activities, certain aspects of music can be accessible and rewarding through a CI, leading to positive changes in attitude and behavior toward music. American Academy of Audiology.
Capsular Outcomes After Pediatric Cataract Surgery Without Intraocular Lens Implantation
Tan, Xuhua; Lin, Haotian; Lin, Zhuoling; Chen, Jingjing; Tang, Xiangchen; Luo, Lixia; Chen, Weirong; Liu, Yizhi
2016-01-01
Abstract The objective of this study was to investigate capsular outcomes 12 months after pediatric cataract surgery without intraocular lens implantation, via qualitative classification and quantitative measurement. This cross-sectional study was approved by the institutional review board of the Zhongshan Ophthalmic Center of Sun Yat-sen University in Guangzhou, China. Digital coaxial retro-illumination photographs (DCRPs) of 329 aphakic pediatric eyes were obtained 12 months after pediatric cataract surgery without intraocular lens implantation. Capsule measures were divided as follows: anterior capsule opening area (ACOA), posterior capsule opening area (PCOA), and posterior capsule opening opacity (PCOO). Capsular outcomes were qualitatively classified into 3 types based on the PCOO: Type I—capsule with mild opacification but no invasion into the capsule opening; Type II—capsule with moderate opacification accompanied by contraction of the ACOA and invasion into the occluding part of the PCOA; and Type III—capsule with severe opacification accompanied by total occlusion of the PCOA. Software was developed to quantitatively measure the ACOA, PCOA, and PCOO using standardized DCRPs. The relationships between the accurate intraoperative anterior and posterior capsulorhexis sizes and the qualitative capsular types were statistically analyzed. The DCRPs of 315 aphakic eyes (95.8%) of 191 children were included. Capsular outcomes were classified into 3 types: Type I—120 eyes (38.1%); Type II—157 eyes (49.8%); Type III—38 eyes (12.1%). The scores of the capsular outcomes were negatively correlated with intraoperative anterior capsulorhexis size (R = −0.572, P < 0.001), but no significant correlation with intraoperative posterior capsulorhexis size (R = −0.16, P = 0.122) was observed. 
The ACOA significantly decreased from Type I to Type II to Type III, the PCOA increased in size from Type I to Type II, and the PCOO increased from Type II to Type III (all P < 0.05). Capsular outcomes after pediatric cataract surgery can be qualitatively classified and quantitatively measured by acquisition, division, definition, and user-friendly software analyses of high-quality digital coaxial retro-illumination photographs. PMID:26962807
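The negative correlation reported above (outcome score falling as capsulorhexis size grows) can be illustrated with a minimal correlation sketch. The sizes and scores below are hypothetical, and a plain Pearson coefficient is used here for simplicity; the paper's exact correlation statistic is not specified in the abstract:

```python
# Correlation between capsulorhexis size and capsular-outcome score
# (Type I=1 ... Type III=3): a stdlib-only sketch on hypothetical data.
import math

size = [4.0, 4.5, 5.0, 5.5, 6.0, 4.2, 5.8, 6.2]   # mm, hypothetical
score = [3, 3, 2, 2, 1, 3, 1, 1]                   # outcome type

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(size, score)
print(round(r, 3))  # strongly negative, mirroring the reported direction
```

Because the outcome score is ordinal, a rank-based coefficient such as Spearman's rho would also be a defensible choice for this kind of data.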
Exploring faculty perceptions towards electronic health records for nursing education.
Kowitlawakul, Y; Chan, S W C; Wang, L; Wang, W
2014-12-01
The use of electronic health records in nursing education is rapidly increasing worldwide. The successful implementation of an electronic health records software program for nursing education relies on students as well as nursing faculty members. This study aimed to explore the experiences and perceptions of nursing faculty members using the software program, and to identify the factors influencing its successful implementation. This exploratory qualitative study was conducted using in-depth individual interviews at a university in Singapore. Seven faculty members participated in the study. The data were gathered and analysed at the end of the semester in the 2012/2013 academic year. The participants' perceptions of the software program were organized into three main categories: innovation, transition and integration. The participants perceived this technology as innovative, with both value and challenges for the users. In addition, using the new software program was perceived as a transitional process. The integration of this technology required time from faculty members and students, as well as support from administrators. The software program had only been implemented for 2-3 months at the time of the interviews. Consequently, the participants might have lacked the necessary skills, competence and confidence to implement it successfully. In addition, unequal exposure to the software program might have had an impact on participants' perceptions. The findings show that the integration of electronic health records into nursing education curricula depends on the faculty members' experiences with the new technology, as well as their perceptions of it. Hence, cultivating a positive attitude towards the use of new technologies is important. Electronic health records are significant applications of health information technology. 
Health informatics competency should be included as a required competency component in faculty professional development policy and programmes. © 2014 International Council of Nurses.
Transient Region Coverage in the Propulsion IVHM Technology Experiment
NASA Technical Reports Server (NTRS)
Balaban, Edward; Sweet, Adam; Bajwa, Anupa; Maul, William; Fulton, Chris; Chicatelli, Amy
2004-01-01
Over the last several years, researchers at the NASA Glenn and Ames Research Centers have developed a real-time fault detection and isolation system for propulsion subsystems of future space vehicles. The Propulsion IVHM Technology Experiment (PITEX), as it is called, follows the model-based diagnostic methodology and employs Livingstone, developed at NASA Ames, as its reasoning engine. The system has been tested on flight-like hardware through a series of nominal and fault scenarios. These scenarios have been developed using a highly detailed simulation of the X-34 flight demonstrator main propulsion system and include realistic failures involving valves, regulators, microswitches, and sensors. This paper focuses on one of the recent research and development efforts under PITEX - to provide more complete transient region coverage. It describes the development of the transient monitors, the corresponding modeling methodology, and the interface software responsible for coordinating the flow of information between the quantitative monitors and the qualitative, discrete representation used by Livingstone.
Computational representation of the aponeuroses as NURBS surfaces in 3D musculoskeletal models.
Wu, Florence T H; Ng-Thow-Hing, Victor; Singh, Karan; Agur, Anne M; McKee, Nancy H
2007-11-01
Computational musculoskeletal (MSK) models - 3D graphics-based models that accurately simulate the anatomical architecture and/or the biomechanical behaviour of organ systems consisting of skeletal muscles, tendons, ligaments, cartilage and bones - are valued biomedical tools, with applications ranging from pathological diagnosis to surgical planning. However, current MSK models are often limited by their oversimplifications in anatomical geometries, sometimes lacking discrete representations of connective tissue components entirely, which ultimately affect their accuracy in biomechanical simulation. In particular, the aponeuroses - the flattened fibrous connective sheets connecting muscle fibres to tendons - have never been geometrically modeled. The initiative was thus to extend Anatomy3D - a previously developed software bundle for reconstructing muscle fibre architecture - to incorporate aponeurosis-modeling capacity. Two different algorithms for aponeurosis reconstruction were written in the MEL scripting language of the animation software Maya 6.0, using its NURBS (non-uniform rational B-splines) modeling functionality for aponeurosis surface representation. Both algorithms were validated qualitatively against anatomical and functional criteria.
[Feather--data acquisition in gynaecology and obstetrics].
Oppelt, P; Plathow, D; Oppelt, A; Stähler, J; Petrich, S; Scharl, A; Costa, S; Jesgarz, J; Kaufmann, M; Bergh, B
2002-07-01
Nowadays, many types of medical documentation are based on computer facilities. Unfortunately, this involves the considerable disadvantage that almost every department and specialty has its own software programs, with the physician having to learn a whole range of different programs. In addition, data sometimes have to be entered twice, since although open interfaces are often available, the elaborate programming required to transfer data from outside programs makes the financial costs too high. Since 1995, the Department of Gynecology and Obstetrics of the University of Frankfurt am Main has therefore developed a consistent program of its own under Windows NT for in-patient facilities, as well as for some outpatient services. The program does not aim to achieve everything that is technically possible, but focuses primarily on user requirements. In addition to the general requirements for medical documentation in gynecology and obstetrics, the program can also handle perinatal inquiries and gynecological quality control (QSmed [Qualitätssicherung in der Medizin] of the BQS [Bundesgeschäftsstelle Qualitätssicherung]).
Effective nutrition education for Aboriginal Australians: lessons from a diabetes cooking course.
Abbott, Penelope A; Davison, Joyce E; Moore, Louise F; Rubinstein, Raechelle
2012-01-01
To examine the experiences of Aboriginal Australians with or at risk of diabetes who attended urban community cooking courses in 2002-2007; and to develop recommendations for increasing the uptake and effectiveness of nutrition education in Aboriginal communities. Descriptive qualitative approach using semistructured interviews with 23 Aboriginal course participants aged 19-72. Verbatim transcripts were coded using NVivo 7 software, and qualitative analysis was undertaken. Engagement and learning were increased by emphasizing the social aspects of the program, holding the course in a familiar Aboriginal community-controlled health setting and using small group learning with Aboriginal peers. Partnership with a vocational training institute provided teaching expertise, but there was conflict between vocational and health promotion objectives. Nutrition programs for Aboriginal Australians should be social, flexible, and held in accessible, culturally appropriate settings and focus on healthful cooking techniques using simple, affordable ingredients. Copyright © 2012 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Schulz-Behrendt, C; Salzwedel, A; Rabe, S; Ortmann, K; Völler, H
2017-06-01
This study investigated the subjective biopsychosocial effects of coronary heart disease (CHD), coping strategies, and social support in patients undergoing cardiac rehabilitation (CR) who had extensive work-related problems. A qualitative investigation was performed in 17 patients (48.9±7.0 y, 13 male) with extensive work-related problems (SIMBO-C>30). All patients were interviewed with structured surveys. Data analysis was performed using software based on the content analysis approach of Mayring. With regard to the effects of disease, patients indicated social aspects, including occupational aspects (62%), more often than physical or mental factors (9% and 29%, respectively). The coping strategies applied and the support services used mainly focused on physical impairments (70% and 45%, respectively). The development of appropriate coping strategies was insufficient, although the social effects of disease were subjectively meaningful for patients in CR. © Georg Thieme Verlag KG Stuttgart · New York.
Joost, Stéphane; Kalbermatten, Michael; Bezault, Etienne; Seehausen, Ole
2012-01-01
When searching for loci possibly under selection in the genome, an alternative to population genetics theoretical models is to establish allele distribution models (ADM) for each locus to directly correlate allelic frequencies and environmental variables such as precipitation, temperature, or sun radiation. Such an approach, implementing multiple logistic regression models in parallel, was implemented in a computing program named MATSAM. Recently, this application was improved in order to support qualitative environmental predictors as well as to permit the identification of associations between genomic variation and individual phenotypes, allowing the detection of loci involved in the genetic architecture of polymorphic characters. Here, we present the corresponding methodological developments and compare the results produced by software implementing population genetics theoretical models (DFDIST and BAYESCAN) and ADM (MATSAM) in an empirical context to detect signatures of genomic divergence associated with speciation in Lake Victoria cichlid fishes.
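The core of the ADM approach described above, a logistic regression of allele presence on an environmental predictor, can be sketched as follows. The sampling sites, temperatures, and allele calls are hypothetical, and a plain gradient-descent fit stands in for whatever estimator MATSAM actually uses; the program fits many such models in parallel, one per locus:

```python
# Allele distribution model (ADM) sketch: logistic regression of allele
# presence/absence on an environmental variable, fitted by gradient
# descent on the log-loss. Data and settings are hypothetical.
import math

temp   = [10, 12, 14, 16, 18, 20, 22, 24, 26, 28]   # °C at sampling sites
allele = [0,  0,  0,  0,  1,  0,  1,  1,  1,  1]    # allele observed?

def fit_logistic(x, y, lr=0.01, epochs=5000):
    """Fit P(y=1|x) = sigmoid(b0 + b1*x); return (b0, b1)."""
    b0, b1, n = 0.0, 0.0, len(x)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += p - yi            # gradient w.r.t. intercept
            g1 += (p - yi) * xi     # gradient w.r.t. slope
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

b0, b1 = fit_logistic([t / 10 for t in temp], allele)  # scaled predictor
print(b1 > 0)  # allele frequency rises with temperature in this toy data
```

A significant slope (b1) for a locus, assessed against an intercept-only model, is what flags that locus as a candidate for environmental selection.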
The impact of smart metal artefact reduction algorithm for use in radiotherapy treatment planning.
Guilfoile, Connor; Rampant, Peter; House, Michael
2017-06-01
The presence of metal artefacts in computed tomography (CT) creates issues in radiation oncology. The loss of anatomical information and incorrect Hounsfield unit (HU) values produce inaccuracies in dose calculations, leading to suboptimal patient treatment. Metal artefact reduction (MAR) algorithms were developed to combat these problems. This study provides a qualitative and quantitative analysis of the "Smart MAR" software (General Electric Healthcare, Chicago, IL, USA), determining its usefulness in a clinical setting. A detailed analysis was conducted using both patient and phantom data, noting any improvements in HU values and dosimetry with the GE-MAR enabled. This study indicates qualitative improvements in the severity of the streak artefacts produced by metals, allowing for easier patient contouring. Furthermore, the GE-MAR managed to recover previously lost anatomical information. Additionally, phantom data showed an improvement in HU values with GE-MAR correction, producing more accurate point dose calculations in the treatment planning system. Overall, the GE-MAR is a useful tool and is suitable for clinical environments.
Atkinson, Thomas M.; DeBusk, Kendra P.A.; Liepa, Astra M.; Scanlon, Michael; Coons, Stephen Joel
2016-01-01
PURPOSE To describe the process and results of the preliminary qualitative development of a new symptom-based PRO measure intended to assess treatment benefit in advanced non-small cell lung cancer (NSCLC) clinical trials. METHODS Individual qualitative interviews were conducted with adult NSCLC (Stage I–IV) patients in the US. Experienced interviewers conducted concept elicitation (CE) and cognitive interviews using semi-structured interview guides. The CE interview guide was used to elicit spontaneous reports of symptom experiences along with probing to further explore and confirm concepts. Interview transcripts were coded and analyzed by professional qualitative coders using Atlas.ti software, and were summarized by like-content using an iterative coding framework. Data from the CE interviews were considered alongside existing literature and clinical expert opinion during an item-generation process, leading to development of a preliminary version of the NSCLC Symptom Assessment Questionnaire (NSCLC-SAQ). Three waves of cognitive interviews were conducted to evaluate concept relevance, item interpretability, and structure of the draft items to facilitate further instrument refinement. FINDINGS Fifty-one patients (mean age 64.9 [SD=11.2]; 51.0% female) participated in the CE interviews. A total of 1,897 expressions of NSCLC-related symptoms were identified and coded in interview transcripts, representing approximately 42 distinct symptom concepts. A 9-item initial draft instrument was developed for testing in three waves of cognitive interviews with additional NSCLC patients (n=20), during which both paper and electronic versions of the instrument were evaluated and refined. Participant responses and feedback during cognitive interviews led to the removal of 2 items and substantial modifications to others. IMPLICATIONS The NSCLC-SAQ is a 7-item PRO measure intended for use in advanced NSCLC clinical trials to support medical product labelling. 
The NSCLC-SAQ uses a 7-day recall period and verbal rating scales. It was developed in accordance with the FDA’s PRO Guidance and scientific best practices, and the resulting qualitative interview data provide evidence of content validity. The NSCLC-SAQ has been prepared in both paper and electronic administration formats and a tablet computer-based version is currently undergoing quantitative testing to confirm its measurement properties and support FDA qualification. PMID:27041408
The life sciences mass spectrometry research unit.
Hopfgartner, Gérard; Varesio, Emmanuel
2012-01-01
The Life Sciences Mass Spectrometry (LSMS) research unit focuses on the development of novel analytical workflows based on innovative mass spectrometric and software tools for the analysis of low molecular weight compounds, peptides and proteins in complex biological matrices. The present article summarizes some of the recent work of the unit: i) the application of matrix-assisted laser desorption/ionization (MALDI) for mass spectrometry imaging (MSI) of drug of abuse in hair, ii) the use of high resolution mass spectrometry for simultaneous qualitative/quantitative analysis in drug metabolism and metabolomics, and iii) the absolute quantitation of proteins by mass spectrometry using the selected reaction monitoring mode.
3-D interactive visualisation tools for HI spectral line imaging
NASA Astrophysics Data System (ADS)
van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.
2017-06-01
Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.
Shachak, Aviv; Dow, Rustam; Barnsley, Jan; Tu, Karen; Domb, Sharon; Jadad, Alejandro R; Lemieux-Charles, Louise
2013-06-04
Tutorials and user manuals are important forms of impersonal support for using software applications, including electronic medical records (EMRs). Differences between user and vendor documentation may indicate support needs that are not sufficiently addressed by the official documentation, and reveal new elements that may inform the design of tutorials and user manuals. What are the differences between user-generated tutorials and manuals for an EMR and the official user manual from the software vendor? Effective design of tutorials and user manuals requires careful packaging of information, balance between declarative and procedural texts, an action- and task-oriented approach, support for error recognition and recovery, and effective use of visual elements. No previous research has compared these elements between formal and informal documents. We conducted a mixed methods study. Seven tutorials and two manuals for an EMR were collected from three family health teams and compared with the official user manual from the software vendor. Documents were qualitatively analyzed using a framework analysis approach in relation to the principles of technical documentation described above. Subsets of the data were quantitatively analyzed using cross-tabulation to compare the types of error information and visual cues in screen captures between user- and vendor-generated manuals. The user-developed tutorials and manuals differed from the vendor-developed manual in that they contained mostly procedural and not declarative information; were customized to the specific workflow, user roles, and patient characteristics; contained more error information related to work processes than to software usage; and used explicit visual cues on screen captures to help users identify window elements. These findings imply that to support EMR implementation, tutorials and manuals need to be customized and adapted to specific organizational contexts and workflows. 
The main limitation of the study is its generalizability. Future research should address this limitation and may explore alternative approaches to software documentation, such as modular manuals or participatory design.
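The cross-tabulation step described above, comparing types of coded error information between user- and vendor-generated documents, can be sketched with a simple contingency count. The coded fragments below are hypothetical stand-ins for the study's qualitative coding output:

```python
# Cross-tabulation sketch: contingency counts of error-information
# types by document source. The coded pairs are hypothetical.
from collections import Counter

coded = [
    ("user",   "work-process error"), ("user",   "work-process error"),
    ("user",   "software error"),     ("user",   "work-process error"),
    ("vendor", "software error"),     ("vendor", "software error"),
    ("vendor", "work-process error"), ("vendor", "software error"),
]

table = Counter(coded)  # (source, code) -> count
for source in ("user", "vendor"):
    row = {code: n for (src, code), n in table.items() if src == source}
    print(source, row)
```

From a table like this, a chi-squared or Fisher exact test would quantify whether the distribution of error types genuinely differs between the two document sources.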
Software Tools to Support the Assessment of System Health
NASA Technical Reports Server (NTRS)
Melcher, Kevin J.
2013-01-01
This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP), based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. 
S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of sensors that optimally meet the performance goals and the constraints. It identifies optimal sensor suite solutions by utilizing a merit (i.e., cost) function with one of several available optimization approaches. As part of its analysis, S4 can expose fault conditions that are difficult to diagnose due to an incomplete diagnostic philosophy and/or a lack of sensors. S4 was originally developed and applied to liquid rocket engines. It was subsequently used to study the optimized selection of sensors for a simulation-based aircraft engine diagnostic system. The ETA Tool is a software-based analysis tool that augments the testability analysis and reporting capabilities of a commercial-off-the-shelf (COTS) package. An initial diagnostic assessment is performed by the COTS software using a user-developed, qualitative, directed-graph model of the system being analyzed. The ETA Tool accesses system design information captured within the model and the associated testability analysis output to create a series of six reports for various system engineering needs. These reports are highlighted in the presentation. The ETA Tool was developed by NASA to support the verification of fault management requirements early in the launch vehicle design process. Due to their early development during the design process, the TEAMS-based diagnostic model and the ETA Tool were able to positively influence the system design by highlighting gaps in failure detection, fault isolation, and failure recovery.
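The merit-function-driven sensor selection that S4 performs can be illustrated with a toy greedy heuristic. The sensor names, merit contributions, costs, and budget below are hypothetical, and S4 itself offers several optimization approaches beyond this simple ratio-greedy sketch:

```python
# Greedy sketch of S4-style sensor selection: choose a sensor suite
# that maximizes a merit (diagnostic-performance) function under a
# cost cap. All values here are hypothetical illustrations.
sensors = {  # name: (merit contribution, cost)
    "N1_speed":  (0.40, 3.0),
    "EGT":       (0.35, 2.0),
    "fuel_flow": (0.20, 1.0),
    "P3":        (0.15, 2.5),
    "vibration": (0.10, 4.0),
}
BUDGET = 7.0

suite, cost, merit = [], 0.0, 0.0
# Repeatedly add the sensor with the best merit-per-cost ratio that fits
for name in sorted(sensors, key=lambda s: sensors[s][0] / sensors[s][1],
                   reverse=True):
    m, c = sensors[name]
    if cost + c <= BUDGET:
        suite.append(name)
        cost += c
        merit += m
print(suite, cost, round(merit, 2))
```

A real S4 run would evaluate candidate suites against a fault-simulation model rather than additive per-sensor scores, and could use exhaustive or stochastic search in place of this greedy pass.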
SimGraph: A Flight Simulation Data Visualization Workstation
NASA Technical Reports Server (NTRS)
Kaplan, Joseph A.; Kenney, Patrick S.
1997-01-01
Today's modern flight simulation research produces vast amounts of time-sensitive data, making a qualitative analysis of the data difficult while it remains in a numerical representation. Therefore, a method of merging related data together and presenting it to the user in a more comprehensible format is necessary. Simulation Graphics (SimGraph) is an object-oriented data visualization software package that presents simulation data in animated graphical displays for easy interpretation. Data produced from a flight simulation is presented by SimGraph in several different formats, including: 3-Dimensional Views, Cockpit Control Views, Heads-Up Displays, Strip Charts, and Status Indicators. SimGraph can accommodate the addition of new graphical displays to allow the software to be customized to each user's particular environment. A new display can be developed and added to SimGraph without having to design a new application, allowing the graphics programmer to focus on the development of the graphical display. The SimGraph framework can be reused for a wide variety of visualization tasks. Although it was created for the flight simulation facilities at NASA Langley Research Center, SimGraph can be reconfigured to almost any data visualization environment. This paper describes the capabilities and operations of SimGraph.
Distance education course on spatial multi-hazard risk assessment, using Open Source software
NASA Astrophysics Data System (ADS)
van Westen, C. J.; Frigerio, S.
2009-04-01
As part of the capacity building activities of the United Nations University - ITC School on Disaster Geo-Information Management (UNU-ITC DGIM), the International Institute for Geoinformation Science and Earth Observation (ITC) has developed a distance education course on the application of Geographic Information Systems for multi-hazard risk assessment. This course is designed for academic staff, as well as for professionals working in (non-)governmental organizations where knowledge of disaster risk management is essential. The course guides the participants through the entire process of risk assessment, on the basis of a case study of a city exposed to multiple hazards in a developing country. The course consists of eight modules, each with a guide book explaining the theoretical background and guiding the participants through spatial data requirements for risk assessment, hazard assessment procedures, generation of elements-at-risk databases, vulnerability assessment, qualitative and quantitative risk assessment methods, risk evaluation, and risk reduction. Linked to the theory is a large set of exercises, with exercise descriptions, answer sheets, demos, and GIS data. The exercises deal with four different types of hazards: earthquakes, flooding, technological hazards, and landslides. One important consideration in designing the course was that people from developing countries should not be restricted in using it by the financial burden of software acquisition. Therefore, the aim was to use Open Source software as a basis. The GIS exercises are written for the ILWIS software. All exercises have also been integrated into a WebGIS using the open source software CartoWeb (released under the GNU license), which is modular and customizable thanks to its object-oriented architecture and is based on a hierarchical structure (to manage and organize every package of information for every step required in risk assessment). 
Switches have been defined for every component of the risk assessment course, and through various menus the user can set the options for every exercise. For every layer of information, tools for querying, printing, searching, and surface analysis are implemented, allowing maps to be compared at different scales and interpreted on-line.
Outcomes assessment of dental hygiene clinical teaching workshops.
Wallace, Juanita S; Infante, Taline D
2008-10-01
Faculty development courses related to acquiring clinical teaching skills in the health professions are limited. Consequently, the Department of Dental Hygiene at the University of Texas Health Science Center at San Antonio conducted a series of clinical teaching workshops to address clinical teaching methodology. The goal of these workshops was to promote a problem-solving learning atmosphere for dental hygiene faculty to acquire and share sound clinical teaching strategies. To determine the value of the annual workshops on clinical teaching and evaluation, a web-based qualitative program assessment was developed using software by Survey Tracker. Four open-ended questions were designed to elicit perceptions regarding what significant changes in teaching strategies were achieved, what barriers or challenges were encountered in making these changes, and what strategies were used to overcome the barriers. The assessment was sent to dental hygiene educators representing thirty-eight dental hygiene programs who had participated in two or more of these workshops. Twenty-eight programs provided collective responses to the questions, and the narrative data were analyzed, using a qualitative methodology. Responses revealed that programs had made productive changes to their clinical education curricula and the information gained from the workshops had a positive effect on clinical teaching.
The declared barriers of the large developing countries waste management projects: The STAR model.
Bufoni, André Luiz; Oliveira, Luciano Basto; Rosa, Luiz Pinguelli
2016-06-01
The aim of this study is to investigate and describe the system of barriers that precludes the feasibility, or limits the performance, of waste management projects, through an analysis of the barriers declared in the 432 large waste management projects registered as CDM during the period 2004-2014. The final product is a proposed conceptual model of waste management barriers (STAR), supported by the literature and corroborated by project design documents. This paper uses the computer-assisted qualitative content analysis (CAQCA) methodology with the qualitative data analysis (QDA) software NVivo®, across 890 text fragments, to investigate the motives that support our conclusions. Results suggest that the main barriers can be classified into five types: sociopolitical, technological, regulatory, financial, and human resources constraints. Results also suggest that, beyond the waste management industry itself, these projects face additional disadvantages related to the same barriers inherent in other renewable energy initiatives. The STAR model sheds some light on the interactivity and dynamics of the main constraints of the industry, describing the mutual influences and relationships among them. Future research is needed to understand these relationships better and more comprehensively, and to ease the development of tools to alleviate or eliminate them. Copyright © 2016 Elsevier Ltd. All rights reserved.
Bastani, Peivand; Mehralian, Gholamhossein; Dinarvand, Rasoul
2015-01-01
The aim of this study was to review the current methods of pharmaceutical purchasing by Iranian insurance organizations within the World Bank conceptual framework model, so as to provide applicable pharmaceutical resource allocation and purchasing (RAP) arrangements in Iran. This qualitative study was conducted through a qualitative document analysis (QDA), applying the four-step Scott method in document selection, and through 20 semi-structured interviews using a triangulation method. Furthermore, the data were analyzed applying a five-step framework analysis using Atlas-ti software. The QDA showed that the purchasers face many structural, financing, payment, delivery, and service procurement and purchasing challenges. Moreover, the findings of the interviews are presented in three sections: demand side, supply side, and the price and incentive regime. Localizing RAP arrangements as a World Bank framework in a developing country like Iran requires the following prerequisites for implementing strategic purchasing in the pharmaceutical sector: improvement of accessibility, subsidiary mechanisms, reimbursement of new drugs, rational use, a uniform pharmacopeia, best-supplier selection, reduction of induced demand and moral hazard, and payment reform. For Iran, these customized aspects are more varied and detailed than those proposed in the World Bank model for developing countries.
Hetrick, Evan M; Kramer, Timothy T; Risley, Donald S
2017-03-17
Based on a column-screening exercise, a column ranking system was developed for sample mixtures containing any combination of 26 sugar and sugar alcohol analytes using 16 polar stationary phases in the HILIC mode with acetonitrile/water or acetone/water mobile phases. Each analyte was evaluated on the HILIC columns with gradient elution and the subsequent chromatography data was compiled into a statistical software package where any subset of the analytes can be selected and the columns are then ranked by the greatest separation. Since these analytes lack chromophores, aerosol-based detectors, including an evaporative light scattering detector (ELSD) and a charged aerosol detector (CAD) were employed for qualitative and quantitative detection. Example qualitative applications are provided to illustrate the practicality and efficiency of this HILIC column ranking. Furthermore, the design-space approach was used as a starting point for a quantitative method for the trace analysis of glucose in trehalose samples in a complex matrix. Knowledge gained from evaluating the design-space led to rapid development of a capable method as demonstrated through validation of the following parameters: specificity, accuracy, precision, linearity, limit of quantitation, limit of detection, and range. Copyright © 2017 Elsevier B.V. All rights reserved.
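The ranking idea described above (choose an analyte subset, then order the columns by the separation they achieve) can be sketched as follows. The retention times, column names, and the minimum-adjacent-difference score used here are invented placeholders for illustration, not the published dataset or the authors' actual ranking statistic:

```python
# Hypothetical sketch: score each column by its worst-case (minimum)
# retention-time gap between adjacent peaks of the selected analytes,
# then rank columns best-first. All values below are made up.

retention = {   # minutes, per column, per analyte (illustrative only)
    "BEH Amide": {"glucose": 6.2, "trehalose": 9.8, "sorbitol": 7.1},
    "ZIC-HILIC": {"glucose": 5.0, "trehalose": 5.3, "sorbitol": 5.1},
    "Silica":    {"glucose": 3.9, "trehalose": 6.4, "sorbitol": 5.6},
}

def score(column, analytes):
    """Smallest gap between adjacent retention times: the critical pair."""
    times = sorted(retention[column][a] for a in analytes)
    return min(b - a for a, b in zip(times, times[1:]))

def rank_columns(analytes):
    """Columns ordered from greatest to smallest critical-pair separation."""
    return sorted(retention, key=lambda c: score(c, analytes), reverse=True)

print(rank_columns(["glucose", "trehalose", "sorbitol"]))
```

Any subset of analytes can be passed to `rank_columns`, mirroring the described ability to re-rank the 16 stationary phases for an arbitrary mixture.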
For Better or For Worse: Environmental Health Promotion in ...
Environmental Health Education (EHE) is most effective when it incorporates environmental science, risk education, and health education. When paired with the local knowledge of community members, EHE can promote health equity and community action, especially for socially disadvantaged communities, which are disproportionately exposed to environmental hazards. Developing EHE programs that inform residents about toxic exposures that damage their health and affect their quality of life is critical for them to understand their true risk. The community of interest is a public housing development surrounded by landfills, hazardous waste sites, and manufacturing facilities located in a Midwestern city of the United States (Chicago, Illinois). An environmental justice organization, People for Community Recovery (PCR), was the community partner. Data was collected during one week in March 2009 from community residents using both qualitative and quantitative research methods, including both a focus group and a survey instrument provided to two different resident groups, to understand their attitudes/beliefs about environmental hazards, including exposure to hazardous wastes, landfills, and lead, and their preferences for EHE. The data was analyzed using standard qualitative analytical procedures and statistical software, when appropriate. This research assesses the impact that Environmental Health Education (EHE) can have on: improved civic engagement (i.e., increased int
Development of a testlet generator in re-engineering the Indonesian physics national-exams
NASA Astrophysics Data System (ADS)
Mindyarto, Budi Naini; Mardapi, Djemari; Bastari
2017-08-01
The Indonesian physics national exams are end-of-course summative assessments that could be utilized to support assessment for learning in physics education. This paper discusses the development and evaluation of a testlet generator based on a re-engineering of the Indonesian physics national exams. The exam problems were dissected and decomposed into testlets that probe a deeper understanding of the underlying physical concepts by inserting a qualitative question and an accompanying scientific reasoning question. A template-based generator was built to help teachers generate testlet variants that are more conducive to the development of students' scientific attitudes than the original simple multiple-choice format. The testlet generator was built using open source software technologies and was evaluated with a focus on black-box testing, exploring the generator's execution, inputs and outputs. The results showed that the developed testlet generator correctly performed its functions of validating inputs, generating testlet variants, and accommodating polytomous item characteristics.
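A template-based generator of the kind described can be sketched minimally as below. The template text, slot values, and field names are assumptions made for illustration; the actual generator's template format and technology stack are not described in the abstract:

```python
import random

# Hypothetical sketch of a template-based testlet generator: a quantitative
# stem is paired with a qualitative question and a reasoning prompt, and
# numeric slot values are varied to produce testlet variants.

TEMPLATE = {
    "stem": "A car accelerates uniformly from rest to {v} m/s in {t} s. "
            "What is its acceleration?",
    "qualitative": "Without computing, will doubling {t} (same {v}) "
                   "increase or decrease the acceleration?",
    "reasoning": "Explain your answer using the definition a = dv/dt.",
    "slots": {"v": [10, 20, 30], "t": [4, 5, 8]},
}

def generate_testlet(template, rng):
    """Fill each slot with a randomly chosen value to produce one variant."""
    values = {k: rng.choice(opts) for k, opts in template["slots"].items()}
    return {
        "stem": template["stem"].format(**values),
        "qualitative": template["qualitative"].format(**values),
        "reasoning": template["reasoning"],
        "answer": values["v"] / values["t"],  # key for the quantitative part
    }

rng = random.Random(42)
variants = [generate_testlet(TEMPLATE, rng) for _ in range(3)]
for v in variants:
    print(v["stem"])
```

Each variant bundles the quantitative item with its qualitative and reasoning follow-ups, which is the "testlet" structure the paper describes.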
Yamazaki, Hiroshi; Slingsby, Brian Taylor; Takahashi, Miyako; Hayashi, Yoko; Sugimori, Hiroki; Nakayama, Takeo
2009-12-01
Although qualitative studies have increased since the 1990s, some reports note that relatively few influential journals published them up until 2000. This study critically reviewed the characteristics of qualitative studies published in top tier medical journals since 2000. We assessed full texts of qualitative studies published between 2000 and 2004 in the Annals of Internal Medicine, BMJ, JAMA, Lancet, and New England Journal of Medicine. We found 80 qualitative studies, of which 73 (91%) were published in BMJ. Only 10 studies (13%) combined qualitative and quantitative methods. Sixty-two studies (78%) used only one method of data collection. Interviews dominated the choice of data collection. The median sample size was 36 (range: 9-383). Thirty-three studies (41%) did not specify the type of analysis used but rather described the analytic process in detail. The rest indicated the mode of data analysis, in which the most prevalent methods were the constant comparative method (23%) and the grounded theory approach (22%). Qualitative data analysis software was used by 33 studies (41%). Among influential journals of general medicine, only BMJ consistently published an average of 15 qualitative study reports between 2000 and 2004. These findings lend insight into what qualities and characteristics make a qualitative study worthy of consideration to be published in an influential journal, primarily BMJ.
Methods for the thematic synthesis of qualitative research in systematic reviews
Thomas, James; Harden, Angela
2008-01-01
Background There is a growing recognition of the value of synthesising qualitative research in the evidence base in order to facilitate effective and appropriate health care. In response to this, methods for undertaking these syntheses are currently being developed. Thematic analysis is a method that is often used to analyse data in primary qualitative research. This paper reports on the use of this type of analysis in systematic reviews to bring together and integrate the findings of multiple qualitative studies. Methods We describe thematic synthesis, outline several steps for its conduct and illustrate the process and outcome of this approach using a completed review of health promotion research. Thematic synthesis has three stages: the coding of text 'line-by-line'; the development of 'descriptive themes'; and the generation of 'analytical themes'. While the development of descriptive themes remains 'close' to the primary studies, the analytical themes represent a stage of interpretation whereby the reviewers 'go beyond' the primary studies and generate new interpretive constructs, explanations or hypotheses. The use of computer software can facilitate this method of synthesis; detailed guidance is given on how this can be achieved. Results We used thematic synthesis to combine the studies of children's views and identified key themes to explore in the intervention studies. Most interventions were based in school and often combined learning about health benefits with 'hands-on' experience. The studies of children's views suggested that fruit and vegetables should be treated in different ways, and that messages should not focus on health warnings. Interventions that were in line with these suggestions tended to be more effective. Thematic synthesis enabled us to stay 'close' to the results of the primary studies, synthesising them in a transparent way, and facilitating the explicit production of new concepts and hypotheses. 
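The three stages (line-by-line coding, descriptive themes close to the codes, analytical themes that go beyond them) can be illustrated with a toy data structure. The excerpts, codes, and theme groupings below are invented; they show only the mechanical support that software can provide, not the review's actual data:

```python
from collections import defaultdict

# Stage 1: line-by-line codes attached to excerpts from primary studies.
coded_lines = [
    ("study1", "fruit tastes good at break time", ["taste", "school context"]),
    ("study2", "vegetables seen as grown-up food", ["identity"]),
    ("study3", "health messages are boring", ["dislike of health framing"]),
    ("study3", "liked picking strawberries", ["taste", "hands-on"]),
]

# Stage 2: descriptive themes stay 'close' to the codes.
descriptive = {
    "enjoyment": {"taste", "hands-on"},
    "meanings of food": {"identity", "dislike of health framing"},
    "settings": {"school context"},
}

def lines_per_theme(coded_lines, descriptive):
    """Group coded excerpts under each descriptive theme they touch."""
    out = defaultdict(list)
    for study, text, codes in coded_lines:
        for theme, members in descriptive.items():
            if members & set(codes):
                out[theme].append((study, text))
    return dict(out)

themes = lines_per_theme(coded_lines, descriptive)

# Stage 3 (analytical) is interpretive work by the reviewers, e.g. the
# hypothesis that interventions should foreground enjoyment rather than
# health warnings; software can only surface the grouped evidence above.
```

The point of the sketch is the transparency the paper emphasizes: every theme remains traceable back to the study and excerpt it came from.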
Conclusion We compare thematic synthesis to other methods for the synthesis of qualitative research, discussing issues of context and rigour. Thematic synthesis is presented as a tried and tested method that preserves an explicit and transparent link between conclusions and the text of primary studies; as such it preserves principles that have traditionally been important to systematic reviewing. PMID:18616818
Control of surface thermal scratch of strip in tandem cold rolling
NASA Astrophysics Data System (ADS)
Chen, Jinshan; Li, Changsheng
2014-07-01
Thermal scratch seriously affects the surface quality of cold-rolled stainless steel strip. Some researchers have carried out qualitative and theoretical studies in this field. However, there is currently a lack of research on the effective forecasting and control of thermal scratch defects in practical production, especially in tandem cold rolling. In order to establish a precise mathematical model of oil film thickness in the deformation zone, lubrication in the cold rolling of SUS410L stainless steel strip is studied, and the major factors affecting oil film thickness are analyzed. Based on statistical principles, a mathematical model of the critical oil film thickness in the deformation zone for thermal scratch is built by fitting and regression analysis, and a criterion for detecting thermal scratch defects, based on temperature comparison, is then put forward. Storing and calling data through SQL Server 2010, software for thermal scratch defect control in the tandem cold rolling of stainless steel was developed in Microsoft Visual Studio 2008 using the MFC technique, and then put into practical production. Statistics indicate that the hit rate for thermal scratch prediction is as high as 92.38%, and the occurrence rate of thermal scratch decreased by 89.13%. Owing to the application of the software, the rolling speed was increased by approximately 9.3%. The software developed provides an effective solution to the problem of thermal scratch defects in tandem cold rolling, and helps to improve the surface quality of stainless steel strip in practical production.
Serials Evaluation: An Innovative Approach.
ERIC Educational Resources Information Center
Berger, Marilyn; Devine, Jane
1990-01-01
Describes a method of analyzing serials collections in special libraries that combines evaluative criteria with database management technology. Choice of computer software is discussed, qualitative information used to evaluate subject coverage is examined, and quantitative and descriptive data that can be used for collection management are…
Effects of MicroCAD on Learning Fundamental Engineering Graphical Concepts: A Qualitative Study.
ERIC Educational Resources Information Center
Leach, James A.; Gull, Randall L.
1990-01-01
Students' reactions and performances were examined when taught engineering geometry concepts using a standard microcomputer-aided drafting software package. Two sample groups were compared based on their computer experience. Included are the methodology, data analysis, and conclusions. (KR)
Conducting clinical post-conference in clinical teaching: a qualitative study.
Hsu, Li-Ling
2007-08-01
The aim of this study was to explore nurse educators' perceptions regarding clinical postconferences. Additional aims included the exploration of interaction characteristics between students and faculty in clinical postconferences. Nursing students are challenged to think and learn in ways that will prepare them for practice in a complex health care environment. Clinical postconferences give students the opportunity to share knowledge gained through transformative learning and provide a forum for discussion and critical thinking. Faculty members must guide students as the latter participate in discussions, develop problem-solving skills and express feelings and attitudes in clinical conferences. The study used qualitative research methods, including participant observation and an open-ended questionnaire. Participant observers watched interaction activities between teachers and students in clinical postconferences. A total of 20 clinical postconferences, two conferences per teacher, were observed. The Non-Numerical Unstructured Data Indexing Searching and Theory-building qualitative software program was used in data analysis. Research findings indicated that, of the six taxonomy question levels, lower-level questions (knowledge and comprehension questions) were the ones mostly asked by faculty members in clinical postconferences. The most frequently used guideline was task orientation, which is related to practice goals and was found in discussions of assignments, reading reports, discussions of clinical experiences, role plays, psychomotor skill practice, quizzes and student evaluations. It is an essential responsibility of nurse educators to employ postconferences to assist students in applying their knowledge in practical situations, in developing professional values and in enhancing their problem-solving abilities.
Arman, Soroor; Golmohammadi, Farnaz; Maracy, Mohammadreza; Molaeinezhad, Mitra
2018-01-01
Despite wide-ranging pharmacotherapy for bipolar adolescents, many of them show deficits in functioning, with a high relapse rate. The aim of the current study was to develop a manual and investigate the efficacy of group cognitive-behavioral therapy (G-CBT) for female bipolar adolescents. During the first, qualitative phase of a mixed-methods study, a G-CBT manual was developed. Then, 32 female bipolar adolescents aged 12-19 years, receiving usual maintenance medications (UMM), were selected. Participants were randomized to a control group (UMM) and an intervention group (five 2-hour weekly sessions based on the G-CBT manual, with UMM). The parents in the intervention group participated in three parallel sessions. All participants completed the following questionnaires before the intervention and 1, 3, and 6 months after the initiation of the study: the Young Mania Rating Scale, the Children's Depression Inventory (CDI) and the Global Assessment of Functioning. The results were analyzed using SPSS 21 software. The concurrent qualitative phase was analyzed through thematic analysis. The results showed no significant differences in any questionnaire scores through the intervention and follow-up sessions (P > 0.05). However, using the cutoff point of the CDI, G-CBT was effective for the intervention group (relapse rate: 25% vs. 44.4%). Two themes were extracted from the second, qualitative phase: emotion recognition and emotion regulation, especially in anger control. The results showed that the addition of G-CBT to UMM led to a decrease in depressive scores but had no effect on manic symptoms and relapse rate.
Virtual reality haptic dissection.
Erolin, Caroline; Wilkinson, Caroline; Soames, Roger
2011-12-01
This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist, and investigate cross-discipline collaborations in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills before they gain experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively and qualitatively.
The development of participatory health research among incarcerated women in a Canadian prison
Murphy, K.; Hanson, D.; Hemingway, C.; Ramsden, V.; Buxton, J.; Granger-Brown, A.; Condello, L-L.; Buchanan, M.; Espinoza-Magana, N.; Edworthy, G.; Hislop, T. G.
2009-01-01
This paper describes the development of a unique prison participatory research project, in which incarcerated women formed a research team, the research activities and the lessons learned. The participatory action research project was conducted in the main short sentence minimum/medium security women's prison located in a Western Canadian province. An ethnographic multi-method approach was used for data collection and analysis. Quantitative data was collected by surveys and analysed using descriptive statistics. Qualitative data was collected from orientation package entries, audio recordings, and written archives of research team discussions, forums and debriefings, and presentations. These data and ethnographic observations were transcribed and analysed using iterative and interpretative qualitative methods and NVivo 7 software. Up to 15 women worked each day as prison research team members; a total of 190 women participated at some time in the project between November 2005 and August 2007. Incarcerated women peer researchers developed the research processes including opportunities for them to develop leadership and technical skills. Through these processes, including data collection and analysis, nine health goals emerged. Lessons learned from the research processes were confirmed by the common themes that emerged from thematic analysis of the research activity data. Incarceration provides a unique opportunity for engagement of women as expert partners alongside academic researchers and primary care workers in participatory research processes to improve their health. PMID:25759141
Modeling crime events by d-separation method
NASA Astrophysics Data System (ADS)
Aarthee, R.; Ezhilmaran, D.
2017-11-01
Problematic legal cases have recently called for a scientifically founded method of dealing with the qualitative and quantitative roles of evidence in a case [1]. To deal with the quantitative role, we propose a d-separation method for modeling crime events. d-separation is a graphical criterion for identifying independence in a directed acyclic graph. By developing a d-separation method, we aim to lay the foundations for a software support tool that can handle evidential reasoning in legal cases. Such a tool is meant to be used by a judge or juror, in alliance with various experts who can provide information about the details of the case. This will hopefully improve communication between judges or jurors and experts. The proposed method can uncover more valid independencies than other graphical criteria.
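The d-separation criterion the abstract relies on can be checked algorithmically. The sketch below uses the classic moralization formulation (restrict to the ancestral graph, marry co-parents, drop directions, delete the conditioning set, test connectivity); it is a generic illustration of d-separation, not the paper's own software:

```python
from collections import defaultdict

def d_separated(parents, xs, ys, zs):
    """True iff node sets xs and ys are d-separated given zs in a DAG.

    `parents` maps each node to the set of its parents. Uses the
    moralization criterion on the ancestral graph of xs | ys | zs.
    """
    def ancestors(nodes):
        seen, stack = set(), list(nodes)
        while stack:
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                stack.extend(parents.get(n, ()))
        return seen

    keep = ancestors(set(xs) | set(ys) | set(zs))

    # Moralize: undirected parent-child edges plus edges between co-parents.
    adj = defaultdict(set)
    for child in keep:
        ps = [p for p in parents.get(child, ()) if p in keep]
        for p in ps:
            adj[p].add(child); adj[child].add(p)
        for i in range(len(ps)):
            for j in range(i + 1, len(ps)):
                adj[ps[i]].add(ps[j]); adj[ps[j]].add(ps[i])

    # Delete conditioning nodes; search for a path from xs to ys.
    blocked = set(zs)
    frontier, seen = [x for x in xs if x not in blocked], set()
    while frontier:
        n = frontier.pop()
        if n in ys:
            return False          # connecting path found: not d-separated
        if n in seen:
            continue
        seen.add(n)
        frontier.extend(m for m in adj[n] if m not in blocked)
    return True
```

For a collider A -> C <- B, the function reports A and B independent marginally but dependent once C is observed, exactly the behavior a crime-event model would exploit (two independent pieces of evidence becoming dependent given the observed outcome).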
Leveraging Modeling Approaches: Reaction Networks and Rules
Blinov, Michael L.; Moraru, Ion I.
2012-01-01
We have witnessed an explosive growth in research involving mathematical models and computer simulations of intracellular molecular interactions, ranging from metabolic pathways to signaling and gene regulatory networks. Many software tools have been developed to aid in the study of such biological systems, some of which have a wealth of features for model building and visualization, and powerful capabilities for simulation and data analysis. Novel high resolution and/or high throughput experimental techniques have led to an abundance of qualitative and quantitative data related to the spatio-temporal distribution of molecules and complexes, their interactions kinetics, and functional modifications. Based on this information, computational biology researchers are attempting to build larger and more detailed models. However, this has proved to be a major challenge. Traditionally, modeling tools require the explicit specification of all molecular species and interactions in a model, which can quickly become a major limitation in the case of complex networks – the number of ways biomolecules can combine to form multimolecular complexes can be combinatorially large. Recently, a new breed of software tools has been created to address the problems faced when building models marked by combinatorial complexity. These have a different approach for model specification, using reaction rules and species patterns. Here we compare the traditional modeling approach with the new rule-based methods. We make a case for combining the capabilities of conventional simulation software with the unique features and flexibility of a rule-based approach in a single software platform for building models of molecular interaction networks. PMID:22161349
Probabilistic seasonal Forecasts to deterministic Farm Leve Decisions: Innovative Approach
NASA Astrophysics Data System (ADS)
Mwangi, M. W.
2015-12-01
Climate change and vulnerability are major challenges to ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for the uptake and scale-up of climate information services necessary for achieving climate risk development. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys and semi-structured interviews. Data were analyzed using both quantitative and qualitative techniques. Quantitative data were analyzed using the Statistical Package for the Social Sciences (SPSS) software. Qualitative data were analyzed by establishing categories and themes, relationships/patterns and conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.
DOT National Transportation Integrated Search
2013-01-01
The simulator was once a very expensive, large-scale mechanical device for training military pilots or astronauts. Modern computers, linking sophisticated software and large-screen displays, have yielded simulators for the desktop or configured as sm...
NASA Astrophysics Data System (ADS)
Latief, F. D. E.; Mohammad, I. H.; Rarasati, A. D.
2017-11-01
Digital imaging of a concrete sample by high-resolution X-ray micro computed tomography (μ-CT) has been conducted to assess the characteristics of the sample's structure. A standard procedure of image acquisition, reconstruction and image processing using a particular scanning device, the Bruker SkyScan 1173 High Energy Micro-CT, is elaborated. Qualitative and quantitative analyses were performed on the sample to give a basic idea of the capabilities of the system and its bundled software package. Calculations of total VOI volume, object volume, percent object volume, total VOI surface, object surface, object surface/volume ratio, object surface density, structure thickness, structure separation and total porosity were conducted and analysed. This paper should serve as a brief description of how the device can produce the preferred image quality, as well as of the ability of the bundled software packages to support qualitative and quantitative analysis.
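Several of the listed quantities follow directly from a binarized voxel volume. The sketch below shows how percent object volume and total porosity could be computed on a synthetic array; the variable names and the crude voxel-face surface estimate are assumptions for illustration, not the SkyScan software's actual (mesh-based) algorithms:

```python
import numpy as np

# Synthetic binarized VOI: 1 = solid material, 0 = pore/background.
rng = np.random.default_rng(0)
voi = (rng.random((50, 50, 50)) > 0.25).astype(np.uint8)

total_voi_volume = voi.size                # TV, in voxels
object_volume = int(voi.sum())             # OV, solid voxels
percent_object_volume = 100.0 * object_volume / total_voi_volume  # OV/TV
total_porosity = 100.0 - percent_object_volume

# Surface voxels: solid voxels with at least one empty 6-neighbour
# (a simple stand-in for the reported object surface).
padded = np.pad(voi, 1)
neigh_sum = sum(np.roll(padded, s, a)[1:-1, 1:-1, 1:-1]
                for a in range(3) for s in (-1, 1))
object_surface = int(((voi == 1) & (neigh_sum < 6)).sum())

print(round(percent_object_volume, 1), round(total_porosity, 1))
```

On real data the binarization threshold dominates all of these numbers, which is why the acquisition and image-processing procedure is elaborated before the quantitative step.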
It's a sentence, not a word: insights from a keyword analysis in cancer communication.
Taylor, Kimberly; Thorne, Sally; Oliffe, John L
2015-01-01
Keyword analysis has been championed as a methodological option for expanding the insights that can be extracted from qualitative datasets using various properties available in qualitative software. Intrigued by the pioneering applications of Clive Seale and his colleagues in this regard, we conducted keyword analyses for word frequency and "keyness" on a qualitative database of interview transcripts from a study on cancer communication. We then subjected the results from these operations to an in-depth contextual inquiry by resituating word instances within their original speech contexts, finding that most of what had initially appeared as group variations broke down under close analysis. In this article, we illustrate the various threads of analysis, and explain how they unraveled under closer scrutiny. On the basis of this tentative exercise, we conclude that a healthy skepticism for the benefits of keyword analysis within a qualitative investigative process seems warranted. © The Author(s) 2014.
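The "keyness" statistic underlying this kind of analysis is commonly a log-likelihood (G2) comparison of a word's frequency in a target corpus against a reference corpus. The sketch below implements that standard statistic on two invented miniature corpora; it illustrates the mechanics only, not the study's data or the exact computation performed by qualitative software packages:

```python
import math
from collections import Counter

def keyness(target_tokens, reference_tokens):
    """Log-likelihood keyness score for every word in either corpus."""
    t, r = Counter(target_tokens), Counter(reference_tokens)
    nt, nr = sum(t.values()), sum(r.values())
    scores = {}
    for w in set(t) | set(r):
        a, b = t[w], r[w]
        # Expected counts under the null of equal relative frequency.
        ea = nt * (a + b) / (nt + nr)
        eb = nr * (a + b) / (nt + nr)
        scores[w] = 2 * sum(o * math.log(o / e)
                            for o, e in ((a, ea), (b, eb)) if o > 0)
    return scores

target = "doctor said the scan the doctor said wait".split()
reference = "we talked about family and about waiting and family".split()
scores = keyness(target, reference)
```

A high score flags a word as unusually frequent in one corpus relative to the other; the article's caution is that such flags only become meaningful once each instance is resituated in its original speech context.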
Space Flight Software Development Software for Intelligent System Health Management
NASA Technical Reports Server (NTRS)
Trevino, Luis C.; Crumbley, Tim
2004-01-01
The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.
Continuous time Boolean modeling for biological signaling: application of Gillespie algorithm.
Stoll, Gautier; Viara, Eric; Barillot, Emmanuel; Calzone, Laurence
2012-08-29
Mathematical modeling is used as a systems biology tool to answer biological questions and, more precisely, to validate a network that describes biological observations and to predict the effect of perturbations. This article presents an algorithm for modeling biological networks in a discrete framework with continuous time. There exist two major types of mathematical modeling approaches: (1) quantitative modeling, representing the concentrations of chemical species by real numbers, mainly based on differential equations and the formalism of chemical kinetics; and (2) qualitative modeling, representing chemical species concentrations or activities by a finite set of discrete values. Both approaches answer particular (and often different) biological questions. The qualitative approach permits a simpler and less detailed description of the biological system: it efficiently identifies stable states, but is ill-suited to describing the transient kinetics leading to those states, because time is represented by discrete steps. Quantitative modeling, on the other hand, can describe the dynamical behavior of biological processes more accurately, as it follows the evolution of the concentrations or activities of chemical species as a function of time, but it requires a large amount of parameter information that is difficult to find in the literature. Here, we propose a modeling framework based on a qualitative approach that is intrinsically continuous in time. The algorithm presented in this article fills the gap between qualitative and quantitative modeling. It is based on a continuous-time Markov process applied to a Boolean state space. In order to describe the temporal evolution of the biological process we wish to model, we explicitly specify the transition rates for each node. For that purpose, we built a language that can be seen as a generalization of Boolean equations.
Mathematically, this approach can be translated into a set of ordinary differential equations on probability distributions. We developed a C++ software tool, MaBoSS, that simulates such a system by applying kinetic Monte-Carlo (the Gillespie algorithm) on the Boolean state space. This software, parallelized and optimized, computes the temporal evolution of probability distributions and estimates stationary distributions. Applications of Boolean kinetic Monte-Carlo are demonstrated for three qualitative models: a toy model, a published model of the p53/Mdm2 interaction, and a published model of the mammalian cell cycle. Our approach allows us to describe kinetic phenomena that were difficult to handle in the original models. In particular, transient effects are represented by time-dependent probability distributions, interpretable in terms of cell populations.
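The core of the Boolean kinetic Monte-Carlo scheme described above can be sketched in a few lines of Python. This is a minimal illustration of Gillespie-style simulation on a Boolean state space, not MaBoSS's actual C++ implementation; the two-node toy model and its rate functions are invented for the example.

```python
import random

def boolean_gillespie(state, rate_fns, t_max, rng=random.Random(0)):
    """Simulate a continuous-time Markov process on a Boolean state space.

    state: tuple of 0/1 node values.
    rate_fns: rate_fns[i](state) gives the rate at which node i flips in
    the current state (0 means that transition is disabled).
    Returns the trajectory as a list of (time, state) pairs.
    """
    t, traj = 0.0, [(0.0, state)]
    while t < t_max:
        rates = [f(state) for f in rate_fns]
        total = sum(rates)
        if total == 0:                        # absorbing (stable) state
            break
        t += rng.expovariate(total)           # exponential waiting time
        r, acc = rng.uniform(0, total), 0.0   # choose which node flips,
        for i, k in enumerate(rates):         # proportional to its rate
            acc += k
            if r <= acc:
                state = state[:i] + (1 - state[i],) + state[i + 1:]
                break
        traj.append((t, state))
    return traj

# Hypothetical toy model: A's flip rate depends on B, and vice versa.
rate_fns = [
    lambda s: 1.0 if s[1] == s[0] else 0.5,   # node A flip rate
    lambda s: 2.0 if s[0] != s[1] else 0.2,   # node B flip rate
]
trajectory = boolean_gillespie((1, 0), rate_fns, t_max=10.0)
```

Time-dependent probability distributions, as in MaBoSS, would then be estimated by averaging many such trajectories at fixed time points.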
Google glass based immunochromatographic diagnostic test analysis
NASA Astrophysics Data System (ADS)
Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan
2015-03-01
Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable, hands-free, software-only interface, as an alternative to bulkier desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS coordinates) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.
What Experiences in Medical School Trigger Professional Identity Development?
Kay, Denise; Berry, Andrea; Coles, Nicholas A
2018-04-02
Phenomenon: This qualitative inquiry used conceptual change theory as a theoretical lens to illuminate experiences in medical school that trigger professional identity formation. According to conceptual change theory, changes in personal conceptualizations are initiated when cognitive disequilibrium is introduced. We sought to identify the experiences that trigger cognitive disequilibrium and to subsequently describe students' perceptions of self-in-profession prior to the experience; the nature of the experience; and, when applicable, the outcomes of the experience. This article summarizes findings from portions of data collected in a larger qualitative study conducted at a new medical school in the United States that utilizes diverse pedagogies and experiences to develop student knowledge, clinical skills, attitudes, and dispositions. Primary data sources included focus groups and individual interviews with students across the 4 years of the curriculum (audio data). Secondary data included students' comments from course and end-of-year evaluations for the 2013-2017 classes (text data). Data treatment tools available in robust qualitative software, NVivo 10, were utilized to expedite coding of both audio and text data. Content analysis was adopted as the analysis method for both audio and text data. We identified four experiences that triggered cognitive disequilibrium in relationship to students' perceptions of self-in-profession: (a) transition from undergraduate student to medical student, (b) clinical experiences in the preclinical years, (c) exposure to the business of medicine, and (d) exposure to physicians in clinical practice. Insights: We believe these experiences represent vulnerable periods of professional identity formation during medical school. 
Educators interested in purposefully shaping curriculum to encourage adaptive professional identity development during medical school may find it useful to integrate educational interventions that assist students with navigating the disequilibrium that is introduced during these periods.
A semi-quantitative and thematic analysis of medical student attitudes towards M-Learning.
Green, Ben L; Kennedy, Iain; Hassanzadeh, Hadi; Sharma, Suneal; Frith, Gareth; Darling, Jonathan C
2015-10-01
Smartphone and mobile application technology have in recent years furthered the development of novel learning and assessment resources. 'MBChB Mobile' is a pioneering mobile learning (M-Learning) programme at the University of Leeds, United Kingdom, which provides all senior medical students with iPhone handsets complete with academic applications, assessment software and a virtual reflective environment. This study aimed to evaluate the impact of MBChB Mobile on student learning. Ethical approval was granted to invite fourth- and fifth-year medical students to participate in a semi-quantitative questionnaire: data were collected anonymously with informed consent and analysed where appropriate using the chi-squared test of association. Qualitative data generated through focus group participation were subjected to both content and thematic analysis. A total of 278 of 519 (53.6%) invited participants responded. Overall, 72.6% of students agreed that MBChB Mobile enhanced their learning experience; however, this was significantly related to overall usage (P < 0.001) and self-reported mobile technology proficiency (P < 0.001). Qualitative data revealed barriers to efficacy including technical software issues, non-transferability to different mobile devices, and perceived patient acceptability. As one of the largest evaluative studies, and the only quantitative study, of smartphone-assisted M-Learning in undergraduate medical education, this evaluation of MBChB Mobile suggests that smartphone and application technology enhance students' learning experience. Barriers to implementation may be addressed through the provision of tailored learning resources, along with user-defined support systems, and appropriate means of ensuring acceptability to patients. © 2015 John Wiley & Sons, Ltd.
[AC-STB: dedicated software for managed healthcare of chronic headache patients].
Wallasch, T-M; Bek, J; Pabel, R; Modahl, M; Demir, M; Straube, A
2009-04-01
This paper examines a new approach to managed healthcare where a network of care providers exchanges patient information through the internet. Integrating networks of clinical specialists and general care providers promises to achieve qualitative and economic improvements in the German healthcare system. In practice, problems related to patient management and data exchange between the managing clinic and assorted caregivers arise. The implementation and use of a cross-spectrum computerized solution for the management of patients and their care is the key for a successful managed healthcare system. This paper documents the managed healthcare of chronic headache patients and the development of an IT-solution capable of providing distributed patient care and case management.
Client-server programs analysis in the EPOCA environment
NASA Astrophysics Data System (ADS)
Donatelli, Susanna; Mazzocca, Nicola; Russo, Stefano
1996-09-01
Client-server processing is a popular paradigm for distributed computing. In the development of client-server programs, the designer has first to ensure that the implementation behaves correctly, in particular that it is deadlock-free. Second, the designer has to guarantee that the program meets predefined performance requirements. This paper addresses the issues in the analysis of client-server programs in EPOCA. EPOCA is a computer-aided software engineering (CASE) support system that allows the automated construction and analysis of generalized stochastic Petri net (GSPN) models of concurrent applications. The paper describes, on the basis of a realistic case study, how client-server systems are modelled in EPOCA, and the kind of qualitative and quantitative analysis supported by its tools.
The Digital Divide through the Lens of Critical Race Theory: The Digitally Denied
ERIC Educational Resources Information Center
Hollins, Stacy Gee
2015-01-01
The purpose of this qualitative research study was to examine African American community college students' availability to technological resources and how that availability affects their success. In this study, technological resources include access to the internet, software, hardware, technology training, technology support, and community…
USDA-ARS?s Scientific Manuscript database
Photography has been a welcome tool in documenting and conveying qualitative soil information. When coupled with image analysis software, the usefulness of digital cameras can be increased to advance the field of micropedology. The determination of a Representative Elementary Area (REA) still rema...
The Use of Technology in the Medical Assisting Classroom
ERIC Educational Resources Information Center
Kozielski, Tracy L.
2014-01-01
The growing presence of technology in health care has infiltrated educational institutions. Numerous software and hardware technologies have been designed to improve student learning; however, their use in the classroom is unclear. The purpose of this qualitative case study was to examine the experiences of medical assisting faculty using…
Virtual Classroom: Reflections of Online Learning
ERIC Educational Resources Information Center
Michael, Kathy
2012-01-01
Purpose: The purpose of this study is to identify student and staff experiences with online learning at higher education (HE) using the software Elluminate Live! Design/methodology/approach: This paper adopts a qualitative approach, focusing on the reflections of participants (student and teacher) collated over a 12 month period of piloting online…
Primary School Principals' Experiences with Smartphone Apps
ERIC Educational Resources Information Center
Çakir, Rahman; Aktay, Sayim
2016-01-01
Smartphones are not just pieces of hardware; they also incorporate software features such as communication systems. The aim of this study is to examine primary school principals' experiences with smartphone applications. To shed light on this subject, a qualitative design was adopted. Criterion sampling has been intentionally…
Assessing Adult Student Reactions to Assistive Technology in Writing Instruction
ERIC Educational Resources Information Center
Mueller, Julie; Wood, Eileen; Hunt, Jen; Specht, Jacqueline
2009-01-01
The authors examined the implementation of assistive technology in a community literacy centre's writing program for adult learners. Quantitative and qualitative analyses indicated that (a) software and instructional methods for writing must be selected according to the needs of and in conjunction with adult learners, (b) learners needed…
NASA Astrophysics Data System (ADS)
Basri, Shuib; O'Connor, Rory V.
This paper is concerned with understanding the issues that affect the adoption of software process standards by Very Small Entities (VSEs), their needs from process standards, and their willingness to engage with the new ISO/IEC 29110 standard in particular. In order to achieve this goal, a series of industry data collection studies was undertaken with a collection of VSEs. A twin-track approach of qualitative data collection (interviews and focus groups) and quantitative data collection (questionnaire) was undertaken. Data analysis was completed separately and the final results were merged using the coding mechanisms of grounded theory. This paper serves as a roadmap both for researchers wishing to understand the issues of process standards adoption by very small companies and for the software process standards community.
[Propensity score matching in SPSS].
Huang, Fuqiang; DU, Chunlin; Sun, Menghui; Ning, Bing; Luo, Ying; An, Shengli
2015-11-01
To realize propensity score matching in the PS Matching module of SPSS and interpret the analysis results. The R software, a plug-in linking R to the corresponding version of SPSS, and the propensity score matching package were installed. A PS Matching module was thereby added to the SPSS interface, and its use was demonstrated with test data. Score estimation and nearest-neighbor matching were achieved with the PS Matching module, and the results of qualitative and quantitative statistical description and evaluation were presented in graphical form. Propensity score matching can be accomplished conveniently using SPSS software.
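For illustration, the nearest-neighbor step at the heart of propensity score matching can be sketched in Python. This is a generic greedy 1:1 matching-without-replacement algorithm with a caliper, not necessarily the exact behavior of SPSS's PS Matching module; the subject IDs and scores are made up.

```python
def nearest_neighbor_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.

    treated, control: dicts mapping subject id -> propensity score.
    Returns (treated_id, control_id) pairs; each control is used at
    most once (matching without replacement), and pairs whose score
    difference exceeds the caliper are discarded.
    """
    pairs, available = [], dict(control)
    # Process treated subjects in score order so results are stable.
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]           # control consumed
    return pairs

# Hypothetical propensity scores for treated and control subjects.
treated = {"T1": 0.31, "T2": 0.62, "T3": 0.90}
control = {"C1": 0.30, "C2": 0.58, "C3": 0.33, "C4": 0.95}
matches = nearest_neighbor_match(treated, control)
# → [('T1', 'C1'), ('T2', 'C2'), ('T3', 'C4')]
```

In the SPSS workflow described above, the propensity scores themselves would first be estimated by logistic regression on the covariates.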
Virtual reality haptic human dissection.
Needham, Caroline; Wilkinson, Caroline; Soames, Roger
2011-01-01
This project aims to create a three-dimensional digital model of the human hand and wrist which can be virtually 'dissected' through a haptic interface. Tissue properties will be added to the various anatomical structures to replicate a realistic look and feel. The project will explore the role of the medical artist and investigate the cross-discipline collaborations required in the field of virtual anatomy. The software will be used to train anatomy students in dissection skills before experience on a real cadaver. The effectiveness of the software will be evaluated and assessed both quantitatively and qualitatively.
Estimation of sample size and testing power (part 5).
Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo
2012-02-01
Estimation of sample size and testing power is an important component of research design. This article introduces methods for estimating sample size and testing power in difference tests for quantitative and qualitative data under the single-group design, the paired design, and the crossover design. Specifically, it presents the relevant formulas for each of the three designs, their realization both directly from the formulas and via the POWER procedure of SAS software, and worked examples, which will benefit researchers in implementing the repetition principle.
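As an illustration of the kind of calculation the article covers, the normal-approximation sample-size formula for a two-sided paired (or single-group) difference test can be sketched in Python. This is a generic textbook approximation, not necessarily the exact formula used by the article or by SAS's POWER procedure.

```python
import math
from statistics import NormalDist

def paired_sample_size(delta, sd, alpha=0.05, power=0.8):
    """Sample size for a two-sided paired (or one-sample) difference
    test, using the normal approximation:

        n = ((z_{1-alpha/2} + z_{1-beta}) * sd / delta) ** 2

    delta: smallest difference worth detecting.
    sd: standard deviation of the (within-pair) differences.
    """
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # e.g. 1.96 for alpha=0.05
    z_b = NormalDist().inv_cdf(power)           # e.g. 0.84 for power=0.80
    n = ((z_a + z_b) * sd / delta) ** 2
    return math.ceil(n)                         # round up to whole subjects

# Detect a mean difference of 0.5 SD units at 5% alpha, 80% power.
n = paired_sample_size(delta=0.5, sd=1.0)
# → 32
```

A t-based calculation (as SAS performs) would add a few subjects to account for estimating the standard deviation.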
Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes
2014-01-01
The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNA-seq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to the specific dataset at hand. Software to create publication-quality heatmaps is developed with the R programming language, the C++ programming language, and the OpenGL application programming interface (API) to create industry-grade, high-performance graphics. We created a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative that allows researchers with minimal prior coding experience to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. The simplicity of HeatmapGenerator is that it only requires the user to upload a preformatted input file and download the publicly available R software language, among a few other operating-system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/.
The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
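The "scaling" feature mentioned in the abstract commonly means z-scoring each gene's row before colors are assigned, so the heatmap shows relative rather than absolute expression. Here is a minimal Python sketch of that preprocessing step; this is a common interpretation of such a feature, not HeatmapGenerator's actual code.

```python
from statistics import mean, stdev

def row_scale(matrix):
    """Z-score each row (gene) of an expression matrix, the usual
    'scale by row' preprocessing step for heatmaps: each value becomes
    (value - row mean) / row standard deviation, so colors encode
    relative expression across samples rather than absolute magnitude.
    Constant rows are mapped to all zeros to avoid division by zero.
    """
    scaled = []
    for row in matrix:
        m, s = mean(row), stdev(row)
        scaled.append([(x - m) / s if s else 0.0 for x in row])
    return scaled

# Two hypothetical genes measured across three samples.
expression = [[1.0, 2.0, 3.0],    # increasing expression
              [5.0, 5.0, 5.0]]    # constant expression
scaled = row_scale(expression)
# → [[-1.0, 0.0, 1.0], [0.0, 0.0, 0.0]]
```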
Workforce development to provide person-centered care
Austrom, Mary Guerriero; Carvell, Carly A.; Alder, Catherine A.; Gao, Sujuan; Boustani, Malaz; LaMantia, Michael
2018-01-01
Objectives Describe the development of a competent workforce committed to providing patient-centered care to persons with dementia and/or depression and their caregivers; to report on qualitative analyses of our workforce’s case reports about their experiences; and to present lessons learned about developing and implementing a collaborative care community-based model using our new workforce that we call care coordinator assistants (CCAs). Method Sixteen CCAs were recruited and trained in person-centered care, use of mobile office, electronic medical record system, community resources, and team member support. CCAs wrote case reports quarterly that were analyzed for patient-centered care themes. Results Qualitative analysis of 73 cases using NVivo software identified six patient-centered care themes: (1) patient familiarity/understanding; (2) patient interest/engagement encouraged; (3) flexibility and continuity of care; (4) caregiver support/engagement; (5) effective utilization/integration of training; and (6) teamwork. Most frequently reported themes were patient familiarity – 91.8% of case reports included reference to patient familiarity, 67.1% included references to teamwork and 61.6% of case reports included the theme flexibility/continuity of care. CCAs made a mean number of 15.7 (SD = 15.6) visits, with most visits for coordination of care services, followed by home visits and phone visits to over 1200 patients in 12 months. Discussion Person-centered care can be effectively implemented by well-trained CCAs in the community. PMID:26666358
Marcon, Luciano; Diego, Xavier; Sharpe, James; Müller, Patrick
2016-04-08
The Turing reaction-diffusion model explains how identical cells can self-organize to form spatial patterns. It has been suggested that extracellular signaling molecules with different diffusion coefficients underlie this model, but the contribution of cell-autonomous signaling components is largely unknown. We developed an automated mathematical analysis to derive a catalog of realistic Turing networks. This analysis reveals that in the presence of cell-autonomous factors, networks can form a pattern with equally diffusing signals and even for any combination of diffusion coefficients. We provide a software (available at http://www.RDNets.com) to explore these networks and to constrain topologies with qualitative and quantitative experimental data. We use the software to examine the self-organizing networks that control embryonic axis specification and digit patterning. Finally, we demonstrate how existing synthetic circuits can be extended with additional feedbacks to form Turing reaction-diffusion systems. Our study offers a new theoretical framework to understand multicellular pattern formation and enables the wide-spread use of mathematical biology to engineer synthetic patterning systems.
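For context, the classic two-component Turing analysis that these networks generalize reduces to a stability check on the linearized Jacobian; the abstract's key point is that cell-autonomous factors relax the differential-diffusion requirement that this check encodes. The sketch below implements the textbook condition, with illustrative Jacobian values not taken from the paper.

```python
def is_turing_unstable(J, du, dv):
    """Classic two-component Turing conditions for Jacobian
    J = [[a, b], [c, d]] with diffusion coefficients du, dv:
    the homogeneous steady state must be stable without diffusion
    (trace < 0, determinant > 0) yet unstable to spatial perturbations
    once diffusion is added.
    """
    (a, b), (c, d) = J
    tr, det = a + d, a * d - b * c
    stable_without_diffusion = tr < 0 and det > 0
    h = dv * a + du * d                    # diffusion-weighted trace term
    diffusion_driven = h > 0 and h * h > 4 * du * dv * det
    return stable_without_diffusion and diffusion_driven

# Illustrative activator-inhibitor Jacobian: no pattern with equal
# diffusion, but patterning once the inhibitor diffuses much faster.
J = [[1.0, -1.0], [2.0, -1.5]]
equal = is_turing_unstable(J, du=1.0, dv=1.0)    # → False
fast_inhibitor = is_turing_unstable(J, du=1.0, dv=30.0)  # → True
```

The paper's result is that adding non-diffusing, cell-autonomous nodes to the network enlarges the set of (du, dv) pairs for which patterning occurs, up to and including du = dv.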
NASA Astrophysics Data System (ADS)
Zarante, Paola Helena Barros; Sodré, José Ricardo
2018-07-01
This work presents a numerical simulation model for aldehyde formation and exhaust emissions from ethanol-fueled spark ignition engines. The aldehyde simulation model was developed using FORTRAN software, with the input data obtained from the dedicated engine cycle simulation software AVL BOOST. The model calculates formaldehyde and acetaldehyde concentrations from post-flame partial oxidation of methane, ethane and unburned ethanol. The calculated values were compared with experimental data obtained from a mid-size sedan powered by a 1.4-l spark ignition engine, tested on a chassis dynamometer. Exhaust aldehyde concentrations were determined using a Fourier Transform Infrared (FTIR) Spectroscopy analyzer. In general, the results demonstrate that the concentrations of aldehydes and the source elements increased with engine speed and exhaust gas temperature. The measured acetaldehyde concentrations showed values from 3 to 6 times higher than formaldehyde in the range studied. The model could predict reasonably well the qualitative experimental trends, with the quantitative results showing a maximum discrepancy of 39% for acetaldehyde concentration and 21 ppm for exhaust formaldehyde.
BioNetSim: a Petri net-based modeling tool for simulations of biochemical processes.
Gao, Junhui; Li, Li; Wu, Xiaolin; Wei, Dong-Qing
2012-03-01
BioNetSim, a Petri net-based software package for modeling and simulating biochemical processes, is presented in this paper, including its design and implementation: logic construction, real-time access to KEGG (Kyoto Encyclopedia of Genes and Genomes), and the BioModel database. Furthermore, glycolysis is simulated as an example of its application. BioNetSim is a helpful tool for researchers to download data, model biological networks, and simulate complicated biochemical processes. Gene regulatory networks, metabolic pathways, signaling pathways, and kinetics of cell interaction are all available in BioNetSim, which makes modeling more efficient and effective. Like other Petri net-based software packages, BioNetSim does well in graphical presentation and mathematical construction. Moreover, it shows several strengths: (1) it creates models in a database; (2) it realizes real-time access to KEGG and BioModel and transfers the data to Petri nets; (3) it provides qualitative analysis, such as the computation of constants; (4) it generates graphs tracing the concentration of every molecule during the simulation process.
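The basic Petri net execution semantics underlying a tool like BioNetSim can be sketched in a few lines of Python. This is a generic illustration of token firing, not BioNetSim's implementation, and the glycolysis-like transition below is a simplified stand-in.

```python
def fire(marking, transitions):
    """One step of Petri net execution: fire the first enabled transition.

    marking: dict mapping place name -> token count.
    transitions: list of (inputs, outputs) pairs of dicts giving arc
    weights. A transition is enabled when every input place holds at
    least as many tokens as its arc weight.
    Returns (new_marking, fired_index), or (marking, None) on deadlock.
    """
    for i, (ins, outs) in enumerate(transitions):
        if all(marking.get(p, 0) >= w for p, w in ins.items()):
            new = dict(marking)
            for p, w in ins.items():          # consume input tokens
                new[p] -= w
            for p, w in outs.items():         # produce output tokens
                new[p] = new.get(p, 0) + w
            return new, i
    return marking, None

# Simplified glycolysis-like step: glucose + 2 ATP -> F16BP + 2 ADP.
net = [({"glucose": 1, "ATP": 2}, {"F16BP": 1, "ADP": 2})]
m, fired = fire({"glucose": 1, "ATP": 2}, net)
# → m == {'glucose': 0, 'ATP': 0, 'F16BP': 1, 'ADP': 2}, fired == 0
```

Quantitative simulation, as in BioNetSim, would additionally attach rate laws to transitions and fire them stochastically or continuously rather than in a fixed order.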
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knirsch, Fabian; Engel, Dominik; Neureiter, Christian
In a smart grid, data and information are transported, transmitted, stored, and processed with various stakeholders having to cooperate effectively. Furthermore, personal data is the key to many smart grid applications, and therefore privacy impacts have to be taken into account. For an effective smart grid, well-integrated solutions are crucial, and for achieving a high degree of customer acceptance, privacy should already be considered at design time of the system. To assist system engineers in the early design phase, frameworks for the automated privacy evaluation of use cases are important. For evaluation, use cases for services and software architectures need to be formally captured in a standardized and commonly understood manner. In order to ensure this common understanding for all kinds of stakeholders, reference models have recently been developed. In this paper we present a model-driven approach for the automated assessment of such services and software architectures in the smart grid that builds on the standardized reference models. The focus of qualitative and quantitative evaluation is on privacy. For evaluation, the framework draws on use cases from the University of Southern California microgrid.
van der Leij, Christiaan; Lavini, Cristina; van de Sande, Marleen G H; de Hair, Marjolein J H; Wijffels, Christophe; Maas, Mario
2015-12-01
To compare the between-session reproducibility of dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) combined with time-intensity curve (TIC)-shape analysis in arthritis patients, within one scanner and between two different scanners, and to compare this method with qualitative analysis and pharmacokinetic modeling (PKM). Fifteen knee joint arthritis patients were included and scanned twice on a closed-bore 1.5T scanner (n = 9, group 1), or on a closed-bore 1.5T and on an open-bore 1.0T scanner (n = 6, group 2). DCE-MRI data were postprocessed using in-house developed software ("Dynamo"). Disease activity was assessed. Disease activity was comparable between the two visits. In group 1 qualitative analysis showed the highest reproducibility with intraclass correlation coefficients (ICCs) between 0.78 and 0.98 and root mean square-coefficients of variation (RMS-CoV) of 8.0%-14.9%. TIC-shape analysis showed a slightly lower reproducibility with similar ICCs (0.78-0.97) but higher RMS-CoV (18.3%-42.9%). The PKM analysis showed the lowest reproducibility with ICCs between 0.39 and 0.64 (RMS-CoV 21.5%-51.9%). In group 2 TIC-shape analysis of the two most important TIC-shape types showed the highest reproducibility with ICCs of 0.78 and 0.71 (RMS-CoV 29.8% and 59.4%) and outperformed the reproducibility of the most important qualitative parameter (ICC 0.31, RMS-CoV 45.1%) and the within-scanner reproducibility of PKM analysis. TIC-shape analysis is a robust postprocessing method within one scanner, almost as reproducible as the qualitative analysis. Between scanners, the reproducibility of the most important TIC-shapes outperform that of the most important qualitative parameter and the within-scanner reproducibility of PKM analysis. © 2015 Wiley Periodicals, Inc.
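For reference, one common way to compute the root-mean-square coefficient of variation (RMS-CoV) reported in reproducibility studies like the one above can be sketched in Python for test-retest pairs; the exact definition used by the authors may differ.

```python
import math

def rms_cov(pairs):
    """Root-mean-square coefficient of variation (in percent) for
    test-retest measurement pairs.

    For each subject with measurements (x1, x2), the within-subject SD
    is |x1 - x2| / sqrt(2); RMS-CoV aggregates the per-subject SD/mean
    ratios as a root mean square across subjects.
    """
    ratios = []
    for x1, x2 in pairs:
        m = (x1 + x2) / 2                   # subject's mean value
        sd = abs(x1 - x2) / math.sqrt(2)    # within-subject SD for a pair
        ratios.append((sd / m) ** 2)
    return math.sqrt(sum(ratios) / len(ratios)) * 100

# Hypothetical scan-rescan values for two subjects.
perfect = rms_cov([(10.0, 10.0)])   # identical sessions → 0.0
typical = rms_cov([(9.0, 11.0)])    # ~14% between-session variation
```

Lower RMS-CoV means better between-session reproducibility, which is how the TIC-shape, qualitative, and pharmacokinetic methods are ranked in the study.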
Therapeutic Alliances in Stroke Rehabilitation: A Meta-Ethnography.
Lawton, Michelle; Haddock, Gillian; Conroy, Paul; Sage, Karen
2016-11-01
To synthesize qualitative studies exploring patients' and professionals' perspectives and experiences of developing and maintaining therapeutic alliances in stroke rehabilitation. A systematic literature search was conducted using the following electronic databases: PsycINFO, CINAHL, Embase, MEDLINE, Allied and Complementary Medicine Database, Applied Social Sciences Index and Abstracts, and ComDisDome from inception to May 2014. This was supplemented by hand searching, reference tracking, generic web searching, and e-mail contact with experts. Qualitative peer reviewed articles reporting experiences or perceptions of the patient or professional in relation to therapeutic alliance construction and maintenance in stroke rehabilitation were selected for inclusion. After a process of exclusion, 17 publications were included in the synthesis. All text identified in the results and discussion sections of the selected studies were extracted verbatim for analysis in a qualitative software program. Studies were critically appraised independently by 2 reviewers. Articles were synthesized using a technique of meta-ethnography. Four overarching themes emerged from the process of reciprocal translation: (1) the professional-patient relationship: degree of connectedness; (2) asymmetrical contributions; (3) the process of collaboration: finding the middle ground; and (4) system drivers. The findings from the meta-ethnography suggest that the balance of power between the patient and professional is asymmetrically distributed in the construction of the alliance. However, given that none of the studies included in the review addressed therapeutic alliance as a primary research area, further research is required to develop a conceptual framework relevant to stroke rehabilitation, in order to determine how this construct contributes to treatment efficacy. Copyright © 2016 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Okuga, Monica; Nabirye, Rose Chalo; Sewankambo, Nelson Kaulukusi; Nakanjako, Damalie
2017-01-01
Limited data are available on the experiences of parental HIV disclosure to children in Uganda. We conducted a qualitative study comprising sixteen in-depth interviews and four focus group discussions with parents receiving highly active antiretroviral therapy. Analysis was done using Atlas.ti qualitative research software. Back-and-forth triangulation was done between transcripts of the in-depth interviews and focus group discussions, and themes and subthemes were developed. Barriers to parents' disclosure included perceptions that children are too young to understand what HIV infection means and fears of secondary disclosure by the children. Immediate outcomes of disclosure included children getting scared and crying, although such instances often gave way to more enduring positive experiences for the parents, such as support in adherence to medical care, help in household chores, and a decrease in financial demands from the children. Country-specific interventions are needed to improve the process of parental HIV disclosure to children and this should encompass preparation on how to deal with the immediate psychological challenges associated with the parent's disclosure. PMID:29209538
Key Health Information Technologies and Related Issues for Iran: A Qualitative Study.
Hemmat, Morteza; Ayatollahi, Haleh; Maleki, Mohammadreza; Saghafi, Fatemeh
2018-01-01
Planning for the future of Health Information Technology (HIT) requires applying a systematic approach when conducting foresight studies. The aim of this study was to identify key health information technologies and related issues for Iran until 2025. This was a qualitative study and the participants included experts and policy makers in the field of health information technology. In-depth semi-structured interviews were conducted and data were analyzed by using framework analysis and MAXQDA software. The findings revealed that the development of a national health information network, electronic health records, patient health records, a cloud-based service center, interoperability standards, patient monitoring technologies, telehealth, mhealth, clinical decision support systems, and health information technology and mhealth infrastructure were the key technologies for the future. These technologies could influence the economic, organizational and individual levels. To achieve them, the economic and organizational obstacles need to be overcome. In this study, a number of key technologies and related issues were identified. This approach can help to focus on the most important technologies in the future and to prioritize these technologies for better resource allocation and policy making.
Integrated restructurable flight control system demonstration results
NASA Technical Reports Server (NTRS)
Weiss, Jerold L.; Hsu, John Y.
1987-01-01
The purpose of this study was to examine the complementary capabilities of several restructurable flight control system (RFCS) concepts through the integration of these technologies into a complete system. Performance issues were addressed through a re-examination of RFCS functional requirements, and through a qualitative analysis of the design issues that, if properly addressed during integration, will lead to the highest possible degree of fault-tolerant performance. Software developed under previous phases of this contract and under NAS1-18004 was modified and integrated into a complete RFCS subroutine for NASA's B-737 simulation. The integration of these modules involved the development of methods for dealing with the mismatch between the outputs of the failure detection module and the input requirements of the automatic control system redesign module. The performance of this demonstration system was examined through extensive simulation trials.
Tavener, Meredith; Chojenta, Catherine; Loxton, Deborah
2016-07-15
Objectives and importance of study: The purpose of this study was to illustrate how qualitative free-text comments, collected within the context of a health survey, represent a rich data source for understanding specific phenomena. Work conducted with data from the Australian Longitudinal Study on Women's Health (ALSWH) was used to demonstrate the breadth and depth of qualitative information that can be collected. The ALSWH has been collecting data on women's health since 1996, and represents a unique opportunity for understanding lived experiences across the lifecourse. A multiple case study design was used to demonstrate the techniques that researchers have used to manage free-text qualitative comments collected by the ALSWH. Eleven projects conducted using free-text comments are discussed according to the method of analysis. These methods include coding (both inductively and deductively), longitudinal analyses and software-based analyses. This work shows that free-text comments are a data resource in their own right, and have the potential to provide rich and valuable information about a wide variety of topics.
Bastani, Peivand; Mehralian, Gholamhossein; Dinarvand, Rasoul
2015-01-01
Objective: The aim of this study was to review the current methods of pharmaceutical purchasing by Iranian insurance organizations within the World Bank conceptual framework model, so as to provide applicable pharmaceutical resource allocation and purchasing (RAP) arrangements in Iran. Methods: This qualitative study was conducted through a qualitative document analysis (QDA), applying the four-step Scott method in document selection, and through 20 semi-structured interviews using a triangulation method. Furthermore, the data were analyzed by applying a five-step framework analysis using Atlas.ti software. Findings: The QDA showed that purchasers face many structural, financing, payment, delivery, and service procurement and purchasing challenges. The findings of the interviews are presented in three sections: the demand side, the supply side, and the price and incentive regime. Conclusion: Localizing RAP arrangements as a World Bank framework in a developing country like Iran identifies the following as prerequisites for implementing strategic purchasing in the pharmaceutical sector: improvement of accessibility, subsidiary mechanisms, reimbursement of new drugs, rational use, a uniform pharmacopeia, best-supplier selection, reduction of induced demand and moral hazard, and payment reform. For Iran, these customized aspects are more varied and detailed than those proposed in the World Bank model for developing countries. PMID:25710045
2013-01-01
Background The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. Results We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-Seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Conclusions Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools. PMID:24209455
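The series of quantitative and qualitative junction filters this abstract describes can be illustrated with a minimal sketch. The field names, thresholds, and motif list below are invented for illustration; this is not Spanki's actual implementation.

```python
# Minimal sketch of junction-level filtering in the spirit of the
# quantitative and qualitative filters described above. Field names and
# thresholds are invented for illustration, not Spanki's implementation.

# Canonical splice motifs; junctions at rarer motifs are more often
# false alignments, per the error sources listed in the abstract.
MAJOR_MOTIFS = {"GT-AG", "GC-AG"}

def filter_junctions(junctions, min_reads=5, min_overhang=8):
    """Keep junctions with strong read support, adequate anchor
    overhang, and a major splice motif."""
    passed = []
    for j in junctions:
        if j["reads"] < min_reads:          # quantitative: coverage
            continue
        if j["overhang"] < min_overhang:    # quantitative: anchor length
            continue
        if j["motif"] not in MAJOR_MOTIFS:  # qualitative: motif class
            continue
        passed.append(j)
    return passed

junctions = [
    {"id": "chr2L:100-200", "reads": 42, "overhang": 20, "motif": "GT-AG"},
    {"id": "chr2L:150-260", "reads": 2, "overhang": 25, "motif": "GT-AG"},
    {"id": "chr3R:500-900", "reads": 30, "overhang": 30, "motif": "AT-AC"},
]
kept = filter_junctions(junctions)
print([j["id"] for j in kept])  # only the well-supported GT-AG junction passes
```

Thresholding on read support and anchor overhang before motif checks mirrors the idea of restricting downstream splicing comparisons to high-confidence junction detections.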
Corrêa, Ana Grasielle Dionísio; de Assis, Gilda Aparecida; do Nascimento, Marilena; de Deus Lopes, Roseli
2017-04-01
Augmented Reality musical software (GenVirtual) is a technology that primarily allows users to develop music activities for rehabilitation. This study aimed to analyse the perceptions of health care professionals regarding the clinical utility of GenVirtual. A second objective was to identify improvements to GenVirtual software and similar technologies. Music therapists, occupational therapists, physiotherapists and speech and language therapists who assist people with physical and cognitive disabilities were enrolled in three focus groups. The qualitative data were analysed through inductive thematic analysis. Three main themes were identified: the use of GenVirtual in health care areas; opportunities for realistic application of GenVirtual; and limitations in the use of GenVirtual. The registration units identified were: motor stimulation, cognitive stimulation, verbal learning, recreation activity, musicality, accessibility, motivation, sonic accuracy, interference of lighting, poor sound, children and adults. This research suggested that GenVirtual is a complementary tool to conventional clinical practice and has great potential for the motor and cognitive rehabilitation of children and adults. Implications for Rehabilitation Gaining health professionals' perceptions of the Augmented Reality musical game (GenVirtual) gives valuable information as to the clinical utility of the software. GenVirtual was perceived as a tool that could enhance the motor and cognitive rehabilitation process. GenVirtual was viewed as a tool that could enhance clinical practice and communication among various agencies, but it was suggested that it should be used with caution to avoid confusion and replacement of important services.
Sturgill, David; Malone, John H; Sun, Xia; Smith, Harold E; Rabinow, Leonard; Samson, Marie-Laure; Oliver, Brian
2013-11-09
The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-Seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools.
Marsan, Josianne; Paré, Guy
2013-08-01
Open source software (OSS) adoption and use in health care organizations (HCOs) is relatively low in developed countries, but several contextual factors have recently encouraged the consideration of the possible role of OSS in information technology (IT) application portfolios. This article aims at developing a research model for investigating the antecedents of OSS adoption decisions in HCOs. Based on a conceptual framework derived from a synthesis of the literature on IT adoption in organizations, we conducted 18 semi-structured interviews with IT experts from all levels of the Province of Quebec's health and social services sector in Canada. We also interviewed 10 IT suppliers in the province. A qualitative data analysis of the interviews was performed to identify major antecedents of OSS adoption decisions in HCOs. Eight factors associated with three distinct theoretical perspectives influence OSS adoption. More specifically, they are associated with the classical diffusion of innovations theory, the theory of resources, as well as institutional theory and its spin-off, the organizing vision theory. The factors fall under three categories: the characteristics of OSS as an innovation, the characteristics of the HCO with respect to its ability to absorb OSS, and the characteristics of the external environment with respect to institutional pressures and public discourse surrounding OSS. We shed light on two novel factors that closely interact with each other: (1) interest of the health care community in the public discourse surrounding OSS, and (2) clarity, consistency and richness of this discourse, whether found in magazines or other media. OSS still raises many questions and presents several challenges for HCOs. It is crucial that the different factors that explain an HCO's decision on OSS adoption be considered simultaneously. Doing so allows a better understanding of HCOs' rationale when deciding to adopt, or not to adopt, OSS. 
Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Online faculty development for creating E-learning materials.
Niebuhr, Virginia; Niebuhr, Bruce; Trumble, Julie; Urbani, Mary Jo
2014-01-01
Faculty who want to develop e-learning materials face pedagogical challenges of transforming instruction for the online environment, especially as many have never experienced online learning themselves. They face technical challenges of learning new software and time challenges of not all being able to be in the same place at the same time to learn these new skills. The objective of the Any Day Any Place Teaching (ADAPT) faculty development program was to create an online experience in which faculty could learn to produce e-learning materials. The ADAPT curriculum included units on instructional design, copyright principles and peer review, all for the online environment, and units on specific software tools. Participants experienced asynchronous and synchronous methods, including a learning management system, PC-based videoconferencing, online discussions, desktop sharing, an online toolbox and optional face-to-face labs. Project outcomes were e-learning materials developed and participants' evaluations of the experience. Likert scale responses for five instructional units (quantitative) were analyzed for distance from neutral using one-sample t-tests. Interview data (qualitative) were analyzed with assurance of data trustworthiness and thematic analysis techniques. Participants were 27 interprofessional faculty. They evaluated the program instruction as easy to access, engaging and logically presented. They reported increased confidence in new skills and increased awareness of copyright issues, yet continued to have time management challenges and remained uncomfortable about peer review. They produced 22 new instructional materials. Online faculty development methods are helpful for faculty learning to create e-learning materials. Recommendations are made to increase the success of such a faculty development program.
Computer-Controlled Cylindrical Polishing Process for Large X-Ray Mirror Mandrels
NASA Technical Reports Server (NTRS)
Khan, Gufran S.; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian
2010-01-01
We are developing high-energy grazing incidence shell optics for hard-x-ray telescopes. The resolution of a mirror shell depends on the quality of the cylindrical mandrel from which it is replicated. Mid-spatial-frequency axial figure error is a dominant contributor in the error budget of the mandrel. This paper presents our efforts to develop a deterministic cylindrical polishing process in order to keep the mid-spatial-frequency axial figure errors to a minimum. Simulation software is developed to model the residual surface figure errors of a mandrel due to the polishing process parameters and the tools used, as well as to compute the optical performance of the optics. The study carried out using the developed software was focused on establishing a relationship between the polishing process parameters and the mid-spatial-frequency error generation. The process parameters modeled are the speeds of the lap and the mandrel, the tool's influence function, the contour path (dwell) of the tools, their shape and the distribution of the tools on the polishing lap. Using the inputs from the mathematical model, a mandrel having a conical approximation to the Wolter-1 geometry has been polished on a newly developed computer-controlled cylindrical polishing machine. The preliminary results of a series of polishing experiments demonstrate a qualitative agreement with the developed model. We report our first experimental results and discuss plans for further improvements in the polishing process. The ability to simulate the polishing process is critical to optimize the polishing process, improve the mandrel quality and significantly reduce the cost of mandrel production.
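The kind of polishing simulation this abstract describes, in which removal follows from a dwell map and a tool influence function, can be sketched in one dimension. The error profile, influence function, and dwell law below are hypothetical, not the authors' model.

```python
import numpy as np

# One-dimensional sketch of deterministic polishing: material removal is
# modeled as the convolution of a dwell map with the tool's influence
# function (a Preston-type model). The error profile, influence function,
# and dwell law are hypothetical, not the authors' simulation.

x = np.linspace(0.0, 100.0, 501)                    # axial position (mm)
figure_error = 0.2 * np.sin(2 * np.pi * x / 20.0)   # mid-spatial-frequency error (um)

# Gaussian tool influence function, normalized to unit removal per dwell.
tif_x = np.linspace(-10.0, 10.0, 101)               # same 0.2 mm grid spacing as x
tif = np.exp(-0.5 * (tif_x / 3.0) ** 2)
tif /= tif.sum()

# Naive dwell map: dwell longer where more material must come off.
dwell = figure_error - figure_error.min()

removal = np.convolve(dwell, tif, mode="same")
residual = figure_error - (removal - removal.mean())

# The influence function low-pass filters the dwell command, so part of
# the mid-spatial-frequency error always survives; rms quantifies it.
print(f"rms error before {figure_error.std():.4f} um, after {residual.std():.4f} um")
```

The toy already shows the qualitative point in the abstract: the tool's footprint smooths the commanded removal, so residual mid-spatial-frequency error depends on the width of the influence function relative to the error period.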
Lyons, Antonia C; Goodwin, Ian; McCreanor, Tim; Griffin, Christine
2015-04-01
Understandings of health behaviors can be enriched by using innovative qualitative research designs. We illustrate this with a project that used multiple qualitative methods to explore the confluence of young adults' drinking behaviors and social networking practices in Aotearoa, New Zealand. Participants were 18- to 25-year-old males and females from diverse ethnic, class, and occupational backgrounds. In Stage 1, 34 friendship focus group discussions were video-recorded with 141 young adults who talked about their drinking and social networking practices. In Stage 2, 23 individual interviews were conducted using screen-capture software and video to record participants showing and discussing their Facebook pages. In Stage 3, a database of Web-based material regarding drinking and alcohol was developed and analyzed. In friendship group data, young adults co-constructed accounts of drinking practices and networking about drinking via Facebook as intensely social and pleasurable. However, this pleasure was less prominent in individual interviews, where there was greater explication of unpleasant or problematic experiences and practices. The pleasure derived from drinking and social networking practices was also differentiated by ethnicity, gender, and social class. Juxtaposing the Web-based data with participants' talk about their drinking and social media use showed the deep penetration of online alcohol marketing into young people's social worlds. Multiple qualitative methods, generating multimodal datasets, allowed valuable nuanced insights into young adults' drinking practices and social networking behaviors. This knowledge can usefully inform health policy, health promotion strategies, and targeted health interventions. (c) 2015 APA, all rights reserved.
Three-dimensional analysis of third molar development to estimate age of majority.
Márquez-Ruiz, Ana Belén; Treviño-Tijerina, María Concepción; González-Herrera, Lucas; Sánchez, Belén; González-Ramírez, Amanda Rocío; Valenzuela, Aurora
2017-09-01
Third molars are one of the few biological markers available for age estimation in undocumented juveniles close to the legal age of majority, assuming an age of 18 years as the most frequent legal demarcation between child and adult status. To obtain more accurate visualization and evaluation of third molar mineralization patterns from computed tomography images, a new software application, DentaVol©, was developed. Third molar mineralization according to qualitative (Demirjian's maturational stage) and quantitative parameters (third molar volume) of dental development was assessed in multi-slice helical computed tomography images of both maxillary arches displayed by DentaVol© from 135 individuals (62 females and 73 males) aged between 14 and 23 years. Intra- and inter-observer agreement values were remarkably high for both evaluation procedures and for all third molars. A linear correlation between third molar mineralization and chronological age was found, with third molar maturity occurring earlier in males than in females. Assessment of dental development with both procedures, by using DentaVol© software, can be considered a good indicator of age of majority (18 years or older) in all third molars. Our results indicated that virtual computed tomography imaging can be considered a valid alternative to orthopantomography for evaluations of third molar mineralization, and therefore a complementary tool for determining the age of majority. Copyright © 2017 The Chartered Society of Forensic Sciences. Published by Elsevier B.V. All rights reserved.
Consumer involvement in dietary guideline development: opinions from European stakeholders.
Brown, Kerry A; Hermoso, Maria; Timotijevic, Lada; Barnett, Julie; Lillegaard, Inger Therese L; Řehůřková, Irena; Larrañaga, Ainhoa; Lončarević-Srmić, Azra; Andersen, Lene Frost; Ruprich, Jiří; Fernández-Celemín, Laura; Raats, Monique M
2013-05-01
The involvement of consumers in the development of dietary guidelines has been promoted by national and international bodies. Yet, few best practice guidelines have been established to assist with such involvement. Qualitative semi-structured interviews explored stakeholders' beliefs about consumer involvement in dietary guideline development. Interviews were conducted in six European countries: the Czech Republic, Germany, Norway, Serbia, Spain and the UK. Seventy-seven stakeholders were interviewed. Stakeholders were grouped as government, scientific advisory body, professional and academic, industry or non-government organisations. Response rate ranged from 45 % to 95 %. Thematic analysis was conducted with the assistance of NVivo qualitative software. Analysis identified two main themes: (i) type of consumer involvement and (ii) pros and cons of consumer involvement. Direct consumer involvement (e.g. consumer organisations) in the decision-making process was discussed as a facilitator to guideline communication towards the end of the process. Indirect consumer involvement (e.g. consumer research data) was considered at both the beginning and the end of the process. Cons to consumer involvement included the effect of vested interests on objectivity; consumer disinterest; and complications in terms of time, finance and technical understanding. Pros related to increased credibility and trust in the process. Stakeholders acknowledged benefits to consumer involvement during the development of dietary guidelines, but remained unclear on the advantage of direct contributions to the scientific content of guidelines. In the absence of established best practice, clarity on the type and reasons for consumer involvement would benefit all actors.
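Software-assisted thematic coding of the kind several of these records describe (NVivo node tagging and tallying) can be approximated crudely in a few lines. The codebook and excerpts below are invented for illustration and do not reflect NVivo's internals or any study's actual codes.

```python
from collections import Counter

# Toy illustration of software-assisted thematic coding: interview
# excerpts are tagged with themes from a codebook, then tallied.
# The codebook and excerpts are invented; this is not NVivo's mechanism.

codebook = {
    "type_of_involvement": ["consumer organisation", "consumer research"],
    "cons_of_involvement": ["vested interest", "disinterest", "technical understanding"],
    "pros_of_involvement": ["credibility", "trust"],
}

def code_excerpt(text, codebook):
    """Return every theme whose keywords appear in the excerpt."""
    lowered = text.lower()
    return [theme for theme, keywords in codebook.items()
            if any(k in lowered for k in keywords)]

excerpts = [
    "Involving a consumer organisation late in the process helped communication.",
    "There is a risk of vested interests reducing objectivity.",
    "Direct involvement added credibility and trust to the process.",
]

tally = Counter(t for e in excerpts for t in code_excerpt(e, codebook))
print(dict(tally))  # one excerpt matches each theme in this toy corpus
```

Real qualitative analysis is interpretive rather than keyword-driven; the point of the sketch is only that once excerpts carry theme tags, frequency summaries across a corpus are mechanical.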
Dow, Rustam; Barnsley, Jan; Tu, Karen; Domb, Sharon; Jadad, Alejandro R.; Lemieux-Charles, Louise
2015-01-01
Research problem Tutorials and user manuals are important forms of impersonal support for using software applications including electronic medical records (EMRs). Differences between user- and vendor documentation may indicate support needs, which are not sufficiently addressed by the official documentation, and reveal new elements that may inform the design of tutorials and user manuals. Research question What are the differences between user-generated tutorials and manuals for an EMR and the official user manual from the software vendor? Literature review Effective design of tutorials and user manuals requires careful packaging of information, balance between declarative and procedural texts, an action and task-oriented approach, support for error recognition and recovery, and effective use of visual elements. No previous research compared these elements between formal and informal documents. Methodology We conducted a mixed methods study. Seven tutorials and two manuals for an EMR were collected from three family health teams and compared with the official user manual from the software vendor. Documents were qualitatively analyzed using a framework analysis approach in relation to the principles of technical documentation described above. Subsets of the data were quantitatively analyzed using cross-tabulation to compare the types of error information and visual cues in screen captures between user- and vendor-generated manuals. Results and discussion The user-developed tutorials and manuals differed from the vendor-developed manual in that they contained mostly procedural and not declarative information; were customized to the specific workflow, user roles, and patient characteristics; contained more error information related to work processes than to software usage; and used explicit visual cues on screen captures to help users identify window elements.
These findings imply that to support EMR implementation, tutorials and manuals need to be customized and adapted to specific organizational contexts and workflows. The main limitation of the study is its generalizability. Future research should address this limitation and may explore alternative approaches to software documentation, such as modular manuals or participatory design. PMID:26190888
Göbl, Rüdiger; Navab, Nassir; Hennersperger, Christoph
2018-06-01
Research in ultrasound imaging is limited in reproducibility by two factors. First, many existing ultrasound pipelines are protected by intellectual property, rendering exchange of code difficult. Second, most pipelines are implemented in special hardware, resulting in limited flexibility of implemented processing steps on such platforms. With SUPRA, we propose an open-source pipeline for fully software-defined ultrasound processing for real-time applications to alleviate these problems. Covering all steps from beamforming to output of B-mode images, SUPRA can help improve the reproducibility of results and make modifications to the image acquisition mode accessible to the research community. We evaluate the pipeline qualitatively, quantitatively, and regarding its run time. The pipeline shows image quality comparable to that of a clinical system and, backed by point spread function measurements, a comparable resolution. Including all processing stages of a usual ultrasound pipeline, the run-time analysis shows that it can be executed in 2D and 3D on consumer GPUs in real time. Our software ultrasound pipeline opens up research in image acquisition. Given access to ultrasound data from early stages (raw channel data, radiofrequency data), it simplifies development in imaging. Furthermore, it tackles the reproducibility of research results, as code can be shared easily and even be executed without dedicated ultrasound hardware.
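The stages the abstract says SUPRA covers, from beamforming to B-mode output, can be sketched with a toy delay-and-sum pipeline. Everything below is an illustrative simplification with synthetic data, not SUPRA's code.

```python
import numpy as np

# Toy sketch of a software-defined ultrasound pipeline in the spirit of
# the abstract: delay-and-sum beamforming, envelope detection, and log
# compression to a B-mode line. Illustrative only, not SUPRA's code.

def hilbert_mask(n):
    """Frequency-domain mask that turns a real signal into its analytic signal."""
    h = np.zeros(n)
    h[0] = 1.0
    h[1 : (n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return h

def delay_and_sum(channel_data, delays_samples):
    """Sum channel signals after per-channel integer sample delays."""
    n_ch, n_samp = channel_data.shape
    out = np.zeros(n_samp)
    for ch in range(n_ch):
        d = int(delays_samples[ch])
        out[: n_samp - d] += channel_data[ch, d:]
    return out

def bmode_line(rf_line, dynamic_range_db=60.0):
    """Envelope detection followed by normalized log compression."""
    analytic = np.fft.ifft(np.fft.fft(rf_line) * hilbert_mask(len(rf_line)))
    env = np.abs(analytic)
    env /= env.max() + 1e-12
    return np.clip(20.0 * np.log10(env + 1e-12), -dynamic_range_db, 0.0)

# Synthetic data: the same Gaussian-windowed echo on 8 channels plus noise.
rng = np.random.default_rng(0)
t = np.arange(256)
pulse = np.sin(2 * np.pi * t / 16) * np.exp(-(((t - 128) / 10.0) ** 2))
channels = 0.01 * rng.standard_normal((8, 256)) + pulse
line = delay_and_sum(channels, np.zeros(8, dtype=int))
img_line = bmode_line(line)  # values in [-60, 0] dB, peak near the echo
```

Having these stages in plain software, as the abstract argues, is what makes each step (apodization, different beamformers, compression curves) easy to swap and share; a hardware pipeline fixes them.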
Caiata-Zufferey, Maria; Pagani, Olivia; Cina, Viviane; Membrez, Véronique; Taborelli, Monica; Unger, Sheila; Murphy, Anne; Monnerat, Christian; Chappuis, Pierre O
2015-09-01
Women carrying BRCA1/BRCA2 germ-line mutations have an increased risk of developing breast/ovarian cancer. To minimize this risk, international guidelines recommend lifelong surveillance and preventive measures. This study explores the challenges that unaffected women genetically predisposed to breast/ovarian cancer face in managing their risk over time and the psychosocial processes behind these challenges. Between 2011 and 2013, biographical qualitative interviews were conducted in Switzerland with 32 unaffected French- and Italian-speaking women carrying BRCA1/BRCA2 mutations. Their mutation status had been known for at least 3 years (mean, 6 years). Data were analyzed through constant comparative analysis using software for qualitative analysis. From the time these women received their positive genetic test results, they were encouraged to follow medical guidelines. Meanwhile, their adherence to these guidelines was constantly questioned by their social and medical environments. As a result of these contradictory pressures, BRCA1/BRCA2 mutation carriers experienced a sense of disorientation about the most appropriate way of dealing with genetic risk. Given the contradictory attitudes of health-care professionals in caring for unaffected BRCA1/BRCA2 mutation carriers, there is an urgent need to educate physicians in dealing with genetically at-risk women and to promote a shared representation of this condition among them. Genet Med 17(9):726-732.
Perception of masculinity amongst young Malaysian men: a qualitative study of university students.
Fazli Khalaf, Zahra; Low, Wah Yun; Ghorbani, Behzad; Merghati Khoei, Effat
2013-11-11
Perception of masculinity plays an important role in men's lifestyles and health behaviors. Although the importance of masculinity has been widely discussed in men's health literature, very little is known about the meanings of masculinity in the Malaysian setting. This research aimed to explore the meanings of masculinity among Malaysian university men. This qualitative study utilized in-depth interviews with 34 young Malaysian university men, aged 20-30 years, from three main ethnic groups in Malaysia (Malay, Chinese and Indian). A thematic analysis approach was used to extract themes from the data. NVIVO v8 qualitative software was used for data management. From the data collected, several concepts emerged that reflected the meanings of masculinity from the participants' viewpoints. These meanings were associated with a combination of traditional and non-traditional norms that generally benefit men who behave according to culturally dominant role expectations. These included: "Having a good body shape", "being respected", "having success with women", "being a family man", and "having financial independence". Socio-cultural factors, such as family environment, religion, public media and popular lifestyle patterns, helped to shape and reinforce the meanings of masculinities among university men. This study revealed that the university context provided a particular culture for construction and reinforcement of the meanings of masculinities, which educators should consider to help in the development of healthy masculinities.
Effects of Social Injustice on Breast Health–Seeking Behaviors of Low-Income Women
Bowen, Shelly-Ann; Williams, Edith M.; Stoneberg-Cooper, Chayah M.; Glover, Saundra H.; Williams, Michelle S.; Byrd, Michael D.
2014-01-01
Purpose The study uses qualitative research to gain a better understanding of what occurs after low-income women receive an abnormal breast screening and the factors that influence their decisions and behavior. A heuristic model is presented for understanding this complexity. Design Qualitative research methods were used to elicit social and cultural themes related to breast cancer screening follow-up. Setting Individual telephone interviews were conducted with 16 women with a confirmed breast anomaly. Participants Low-income women screened through a national breast cancer early detection program. Method Grounded theory using selective coding was employed to elicit factors that influenced the understanding and follow-up of an abnormal breast screening result. Interviews were digitally recorded, transcribed, and uploaded into NVivo 8, a qualitative management and analysis software package. Results For women (16, or 72% of case management referrals) below 250% of the poverty level, the impact of social and economic inequities creates a psychosocial context underlined by structural and cultural barriers to treatment that forecasts the mechanism that generates differences in health outcomes. The absence of insurance due to underemployment and unemployment and inadequate public infrastructure intensified emotional stress impacting participants’ health decisions. Conclusion The findings that emerged offer explanations of how consistent patterns of social injustice impact treatment decisions in a high-risk vulnerable population that have implications for health promotion research and systems-level program improvement and development. PMID:23448411
Software Acquisition Program Dynamics
2011-10-24
greatest capability, which requires latest technologies
• Contractors prefer using latest technologies to boost staff competency for future bids
• Risk ... mistakes
• Build foundation to test future mitigation/solution approaches to assess value
• Qualitatively validate new approaches before applying them to ...
ERIC Educational Resources Information Center
Penn-Edwards, Sorrel
2010-01-01
The qualitative research methodology of phenomenography has traditionally required a manual sorting and analysis of interview data. In this paper I explore a potential means of streamlining this procedure by considering a computer aided process not previously reported upon. Two methods of lexicological analysis, manual and automatic, were examined…
Beyond the Quantitative and Qualitative Divide: Research in Art Education as Border Skirmish.
ERIC Educational Resources Information Center
Sullivan, Graeme
1996-01-01
Analyzes a research project that utilizes a coherent conceptual model of art education research incorporating the demand for empirical rigor and providing for diverse interpretive frameworks. Briefly profiles the NUD*IST (Non-numerical Unstructured Data Indexing Searching and Theorizing) software system that can organize and retrieve complex…
Writing in the Ether: A Collaborative Approach to Academic Research.
ERIC Educational Resources Information Center
Winograd, David; Milton, Katherine
The purpose of this paper is to shed light on the developmental stages of academic publication collaborations through both research on the collaborative process itself and analysis of the discovery process. Using the qualitative software package NUD*IST, the teleconferencing system FirstClass, and standard e-mail, the study…
Computer Series, 82. The Application of Expert Systems in the General Chemistry Laboratory.
ERIC Educational Resources Information Center
Settle, Frank A., Jr.
1987-01-01
Describes the construction of expert computer systems using artificial intelligence technology and commercially available software, known as an expert system shell. Provides two applications: a simple one, the identification of seven white substances, and a more complicated one involving the qualitative analysis of six metal ions. (TW)
Quizlet: What the Students Think--A Qualitative Data Analysis
ERIC Educational Resources Information Center
Lander, Bruce
2016-01-01
The immediate area of interest in this study is the primary building block of all foreign languages: vocabulary acquisition. Due to recent updates and innovations in educational software, foreign language educators now have a huge supply of ever improving tools to help enhance, transform and completely modify learning. Despite this surge in…
A Grounded Theory Study of the Relationship between E-Mail and Burnout
ERIC Educational Resources Information Center
Camargo, Marta Rocha
2008-01-01
Introduction: This study consisted of a qualitative investigation into the role of e-mail in work-related burnout among high technology employees working full time and on-site for Internet, hardware, and software companies. Method: Grounded theory methodology was used to provide a systemic approach in categorising, sorting, and analysing data…
A Qualitative Content Analysis of Early Algebra Education iOS Apps for Primary Children
ERIC Educational Resources Information Center
Ledbetter, Lissa S.
2017-01-01
Educational software applications (apps) on multi-touch, mobile devices provide a promising space to help learners work toward long-term educational goals, like learning with understanding (Bransford, Brown, & Cocking, 2000). Such goals are particularly relevant in supporting a learner's efforts to become more mathematically literate. Yet, a…
Perceived Educational Values of Omani School Principals
ERIC Educational Resources Information Center
Al-Ani, Wajeha Thabit; Al-Harthi, Aisha Salim
2017-01-01
This qualitative study investigated the perceived educational values of Omani school principals. Data were collected using a semi-structured interview form which focused on the core values of school administration as perceived by a sample of 44 school principals; a focus group interview was also held. Data were analysed using Nvivo software. The…
Teachers' Perceptions Regarding School Principals' Coaching Skills
ERIC Educational Resources Information Center
Yirci, Ramazan; Özdemir, Tuncay Yavuz; Kartal, Seçil Eda; Kocabas, Ibrahim
2014-01-01
The purpose of this study was to find out teachers' perceptions about school principals' coaching skills. The study was carried out within qualitative research methods. The study group included 76 teachers in Elazig and 73 teachers in Kahramanmaras provinces of Turkey. All the data were processed using Nvivo 9 software. The results indicate that…
ERIC Educational Resources Information Center
Kato, Fumie; Spring, Ryan; Mori, Chikako
2016-01-01
Providing learners of a foreign language with meaningful opportunities for interactions, specifically with native speakers, is especially challenging for instructors. One way to overcome this obstacle is through video-synchronous computer-mediated communication tools such as Skype software. This study reports quantitative and qualitative data from…
E-Classical Fairy Tales: Multimedia Builder as a Tool
ERIC Educational Resources Information Center
Eteokleous, Nikleia; Ktoridou, Despo; Tsolakidis, Symeon
2011-01-01
The study examines pre-service teachers' experiences in delivering a traditional-classical fairy tale using the Multimedia Builder software, in other words an e-fairy tale. A case study approach was employed, collecting qualitative data through classroom observations and focus groups. The results focus on pre-service teachers' reactions, opinions,…
The Use of "Socrative" in ESL Classrooms: Towards Active Learning
ERIC Educational Resources Information Center
El Shaban, Abir
2017-01-01
The online student response system (SRS) is a technological tool that can be effectively implemented in English language classroom contexts and be used to promote students' active learning. In this qualitative study, "Socrative", a Web 2.0 software tool, was integrated with active learning activities and used as an SRS to explore English…
NASA Technical Reports Server (NTRS)
Voigt, S. (Editor); Beskenis, S. (Editor)
1985-01-01
Issues in the development of software for the Space Station are discussed. Software acquisition and management, software development environment, standards, information system support for software developers, and a future software advisory board are addressed.
Development of Risk Assessment Matrix for NASA Engineering and Safety Center
NASA Technical Reports Server (NTRS)
Malone, Roy W., Jr.; Moses, Kelly
2004-01-01
This paper describes a study whose principal goal was the development of a sufficiently detailed 5 x 5 Risk Matrix Scorecard. The purpose of this scorecard is to outline the criteria by which technical issues can be qualitatively and initially prioritized. The tool using this scorecard has been proposed as one of the information resources the NASA Engineering and Safety Center (NESC) takes into consideration when making decisions with respect to incoming information on safety concerns across the entire NASA agency. This paper discusses in detail each element of the risk matrix scorecard, the definitions for those elements, and the rationale behind the development of those definitions. The scorecard development was performed in parallel with the tailoring of the existing Futron Corporation Integrated Risk Management Application (IRMA) software tool. IRMA was tailored to fit NESC needs for evaluating incoming safety concerns and was renamed the NESC Assessment Risk Management Application (NAFMA), which is still in the development phase.
Gallart, Francesc; Cid, Núria; Latron, Jérôme; Llorens, Pilar; Bonada, Núria; Jeuffroy, Justin; Jiménez-Argudo, Sara-María; Vega, Rosa-María; Solà, Carolina; Soria, Maria; Bardina, Mònica; Hernández-Casahuga, Antoni-Josep; Fidalgo, Aránzazu; Estrela, Teodoro; Munné, Antoni; Prat, Narcís
2017-12-31
When the regime of a river is not perennial, there are four main difficulties with the use of hydrographs for assessing hydrological alteration: i) the main hydrological features relevant for biological communities are not quantitative (discharges) but qualitative (phases such as flowing water, stagnant pools or lack of surface water), ii) stream flow records do not inform on the temporal occurrence of stagnant pools, iii) as most of the temporary streams are ungauged, their regime has to be evaluated by alternative methods such as remote sensing or citizen science, and iv) the biological quality assessment of the ecological status of a temporary stream must follow a sampling schedule and references adapted to the flow-pool-dry regime. To overcome these challenges within an operational approach, the freely available software tool TREHS has been developed within the EU LIFE TRIVERS project. This software permits the input of information from flow simulations obtained with any rainfall-runoff model (to set an unimpacted reference stream regime) and compares this with the information obtained from flow gauging records (if available) and interviews with local people, as well as instantaneous observations by individuals and interpretation of ground-level or aerial photographs. Up to six metrics defining the permanence of water flow, the presence of stagnant pools and their temporal patterns of occurrence are used to determine natural and observed river regimes and to assess the degree of hydrological alteration. A new regime classification specifically designed for temporary rivers was developed using the metrics that measure the relative permanence of the three main phases: flow, disconnected pools and dry stream bed.
Finally, the software characterizes the differences between the natural and actual regimes, diagnoses the hydrological status (degree of hydrological alteration), assesses the significance and robustness of the diagnosis, and recommends the best periods for biological quality sampling. Copyright © 2017 Elsevier B.V. All rights reserved.
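The phase-based metrics described can be illustrated with a minimal sketch. The phase labels, threshold values, and regime class names below are simplified assumptions for illustration; they are not the actual six TREHS metrics or its regime classification.

```python
def regime_metrics(phases):
    """Relative permanence of each hydrological phase.

    `phases` is a chronological sequence of phase labels (e.g. one per
    month): "flow", "pools" (disconnected stagnant pools), or "dry".
    Returns the fraction of observations spent in each phase.
    """
    n = len(phases)
    if n == 0:
        raise ValueError("no observations")
    return {phase: phases.count(phase) / n for phase in ("flow", "pools", "dry")}


def classify(metrics, flow_threshold=0.99):
    """Crude regime classification from relative phase permanence.

    Thresholds and class names are illustrative assumptions only.
    """
    if metrics["flow"] >= flow_threshold:
        return "perennial"
    if metrics["dry"] == 0:
        # Flow ceases at times, but surface water persists in pools.
        return "intermittent-pools"
    return "intermittent-dry" if metrics["flow"] >= 0.5 else "ephemeral"
```

Comparing the metrics computed from a simulated (reference) phase series with those from an observed series would then give a simple measure of hydrological alteration, in the spirit of the tool's diagnosis step.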
CytoSpectre: a tool for spectral analysis of oriented structures on cellular and subcellular levels.
Kartasalo, Kimmo; Pölönen, Risto-Pekka; Ojala, Marisa; Rasku, Jyrki; Lekkala, Jukka; Aalto-Setälä, Katriina; Kallio, Pasi
2015-10-26
Orientation and the degree of isotropy are important in many biological systems such as the sarcomeres of cardiomyocytes and other fibrillar structures of the cytoskeleton. Image-based analysis of such structures is often limited to qualitative evaluation by human experts, hampering the throughput, repeatability and reliability of the analyses. Software tools are not readily available for this purpose and the existing methods typically rely at least partly on manual operation. We developed CytoSpectre, an automated tool based on spectral analysis, allowing the quantification of orientation and also size distributions of structures in microscopy images. CytoSpectre utilizes the Fourier transform to estimate the power spectrum of an image and, based on the spectrum, computes parameter values describing, among others, the mean orientation, isotropy and size of target structures. The analysis can be further tuned to focus on targets of a particular size at cellular or subcellular scales. The software can be operated via a graphical user interface without any programming expertise. We analyzed the performance of CytoSpectre by extensive simulations using artificial images, by benchmarking against FibrilTool and by comparison with manual measurements performed for real images by a panel of human experts. The software was found to be tolerant against noise and blurring and superior to FibrilTool when analyzing realistic targets with degraded image quality. The analysis of real images indicated generally good agreement between computational and manual results while also revealing notable expert-to-expert variation. Moreover, the experiment showed that CytoSpectre can handle images obtained from different cell types using different microscopy techniques. Finally, we studied the effect of mechanical stretching on cardiomyocytes to demonstrate the software in an actual experiment and observed changes in cellular orientation in response to stretching.
CytoSpectre, a versatile, easy-to-use software tool for spectral analysis of microscopy images was developed. The tool is compatible with most 2D images and can be used to analyze targets at different scales. We expect the tool to be useful in diverse applications dealing with structures whose orientation and size distributions are of interest. While designed for the biological field, the software could also be useful in non-biological applications.
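The core idea behind this kind of spectral orientation analysis can be sketched in a few lines: estimate the power spectrum with a 2D Fourier transform, then find the angle at which spectral power concentrates. This is a minimal illustration under stated assumptions (function name, 1-degree binning, and DC masking are choices made here), not CytoSpectre's actual implementation.

```python
import numpy as np

def dominant_orientation(image):
    """Estimate the dominant orientation (degrees, in [0, 180)) of
    structures in a 2D grayscale image from the angular distribution
    of its power spectrum."""
    # Power spectrum with the zero-frequency component shifted to the center.
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    y, x = np.mgrid[0:h, 0:w]
    # Angle of each frequency component, folded into [0, 180).
    angles = np.degrees(np.arctan2(y - cy, x - cx)) % 180.0
    radius = np.hypot(y - cy, x - cx)
    mask = radius > 2  # ignore the DC peak and the lowest frequencies
    # Accumulate spectral power into 1-degree angular bins.
    power, _ = np.histogram(angles[mask], bins=np.arange(0, 181),
                            weights=spectrum[mask])
    peak_freq_angle = power.argmax()
    # Structures oriented at angle t concentrate spectral power
    # perpendicular to t, so rotate the peak by 90 degrees.
    return (peak_freq_angle + 90) % 180
```

For example, an image of vertical stripes (intensity varying along x) concentrates power on the horizontal frequency axis, so the function reports an orientation of 90 degrees; a real tool would additionally weight by radial frequency bands to select target sizes.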
Longo, Christopher J; Fitch, Margaret; Grignon, Michel; McAndrew, Alison
2016-11-01
This research informs existing work by examining the full scope of out-of-pocket costs and lost income, patients' private insurance behaviors, and their overall management of finances during their cancer treatment. The intent was to gain a deeper understanding of patient circumstances and the related costs. Participant qualitative interviews were conducted in person during outpatient clinic visits or by telephone and were recorded between June 2011 and July 2012. Interviews were transcribed verbatim and subjected to a descriptive qualitative analysis. The research team collaborated early in the process (after three subjects were enrolled) to develop a preliminary coding framework. The coding framework was modified to incorporate additional emerging content until saturation of data was evident. Transcripts were coded using the qualitative software NVivo version 9.0. Fifteen patients agreed to participate in the study and 14 completed the interview (seven breast, three colorectal, two lung, and two prostate). Consistent with existing published work, participants expressed concerns regarding expenses related to medications, complementary/alternative medicines, devices, parking and travel. These concerns were exacerbated if patients did not have insurance or lost insurance coverage due to loss of work. Although many acknowledged in hindsight that additional insurance would have helped, they also recognized that at the time of their diagnoses, it was not a viable option. Previously unidentified categorical costs identified in this study included modifications to housing arrangements or renovations, special clothing, fitness costs and the impact of an altered diet. We confirmed the results of earlier Canadian quantitative work. Additionally, cost categories not previously explored were identified, which will facilitate the development of an improved and more comprehensive quantitative questionnaire for future research. 
Many patients indicated that supplemental health insurance would have made their cancer journey less stressful, highlighting existing gaps in the government funded health care system.
A Quantitative Study of Global Software Development Teams, Requirements, and Software Projects
ERIC Educational Resources Information Center
Parker, Linda L.
2016-01-01
The study explored the relationship between global software development teams, effective software requirements, and stakeholders' perception of successful software development projects within the field of information technology management. It examined the critical relationship between Global Software Development (GSD) teams creating effective…
The Knowledge-Based Software Assistant: Beyond CASE
NASA Technical Reports Server (NTRS)
Carozzoni, Joseph A.
1993-01-01
This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.
Development of Evidence-Based Health Policy Documents in Developing Countries: A Case of Iran
Imani-Nasab, Mohammad Hasan; Seyedin, Hesam; Majdzadeh, Reza; Yazdizadeh, Bahareh; Salehi, Masoud
2014-01-01
Background: Evidence-based policy documents that are well developed by senior civil servants and are timely available can reduce the barriers to evidence utilization by health policy makers. This study examined the barriers and facilitators in developing evidence-based health policy documents from the perspective of their producers in a developing country. Methods: In a qualitative study with a framework analysis approach, we conducted semi-structured interviews using purposive and snowball sampling. A qualitative analysis software (MAXQDA-10) was used to apply the codes and manage the data. This study was theory-based and the results were compared to exploratory studies about the factors influencing evidence-based health policymaking. Results: 18 codes and three main themes of behavioral, normative, and control beliefs were identified. Factors that influence the development of evidence-based policy documents were identified by the participants: behavioral beliefs included quality of policy documents, use of resources, knowledge and innovation, being time-consuming and contextualization; normative beliefs included policy authorities, policymakers, policy administrators, and co-workers; and control beliefs included recruitment policy, performance management, empowerment, management stability, physical environment, access to evidence, policy making process, and effect of other factors. Conclusion: Most of the cited barriers to the development of evidence-based policy were related to control beliefs, i.e. barriers at the organizational and health system levels. This study identified the factors that influence the development of evidence-based policy documents based on the components of the theory of planned behavior. But in exploratory studies on evidence utilization by health policymakers, the identified factors were only related to control behaviors. 
This suggests that the theoretical approach may be preferable to the exploratory approach in identifying the barriers and facilitators of a behavior. PMID:24762343
NASA Astrophysics Data System (ADS)
Brown, Linda Lou
Federal educational policy, the No Child Left Behind Act of 2001, focused attention on America's education with conspicuous results. One aspect, the highly qualified classroom teacher and principal (HQ) requirement, was taxing since states established individual accountability structures. The impact of HQ and the use of data-informed decision-making (DIDM) in the monitoring of Texas elementary science education by campus administrators, Campus Instruction Leaders (CILs), bears crucially on 5th grade students' learning and achievement. Forty years of research has shown improved student results when sustained, supported, and focused professional development (PD) for teachers is available. Using mixed methods research, this study applied quantitative and qualitative analysis to two electronic, on-line surveys: the Texas Elementary, Intermediate or Middle School Teacher Survey(c) and the Texas Elementary Campus Administrator Survey(c), with results from 22.3% of Texas school districts representing 487 elementary campuses surveyed. Participants were drawn through random, stratified sampling of 5th grade teachers who attended local Texas Regional Collaboratives science professional development (PD) programs between 2003-2008. Survey information was compared statistically to campus-level average passing rate scores on the 5th grade science TAKS using the Statistical Package for the Social Sciences (SPSS). Written comments from both surveys were analyzed with qualitative analysis software (NVivo). Due to the level of uncertainty of variables within a large statewide study, Mauchly's test of sphericity was used to validate the repeated measures factor ANOVAs. Although few individual results were statistically significant, when jointly analyzed, striking constructs were revealed regarding the impact of HQ policy applications and elementary CILs' use of data-informed decisions on improving 5th grade students' achievement and teachers' PD learning of science content.
Some constructs included the use of data-warehouse programs; teachers' applications of DIDM to modify lessons for differentiated science instruction; the number of years teachers attended science PD; and teachers' influence on CILs' staffing decisions. Yet CILs reported that 14% of Texas elementary campuses had limited or no science education programs due to federal policy requirements for reading and mathematics. Three hypothesis components were supported and accepted; the research data resulted in two models addressing elementary science, science education PD, and CILs' impact on federal policy applications.
Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis
NASA Technical Reports Server (NTRS)
Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven
1997-01-01
Systematic software construction offers the potential of elevating software engineering from an art-form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.
Predicting Software Suitability Using a Bayesian Belief Network
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.
2005-01-01
The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
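The cause-effect structure described can be illustrated with a toy discrete belief network in which product suitability depends on two parent factors. The variables, states, and probability values below are illustrative assumptions, not the paper's trained network, and inference here is simple enumeration over unobserved parents.

```python
# Priors over two hypothesized drivers (illustrative numbers only).
P_skill = {"high": 0.5, "low": 0.5}
P_maturity = {"high": 0.4, "low": 0.6}

# Conditional probability table: P(product is suitable | skill, maturity).
# Values are assumed for illustration.
P_suitable = {
    ("high", "high"): 0.90,
    ("high", "low"): 0.70,
    ("low", "high"): 0.55,
    ("low", "low"): 0.25,
}

def p_suitable(skill=None, maturity=None):
    """Probability the product is suitable, optionally conditioned on
    observed parent states, by enumerating the unobserved parents."""
    num = den = 0.0
    for s, ps in P_skill.items():
        if skill is not None and s != skill:
            continue
        for m, pm in P_maturity.items():
            if maturity is not None and m != maturity:
                continue
            w = ps * pm            # joint weight of this parent combination
            num += w * P_suitable[(s, m)]
            den += w
    return num / den
```

Observing evidence about a parent shifts the forecast: with these numbers, `p_suitable(skill="high")` exceeds `p_suitable(skill="low")`, mirroring the hypothesized effect of team skill on end quality.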
The Effects of Development Team Skill on Software Product Quality
NASA Technical Reports Server (NTRS)
Beaver, Justin M.; Schiavone, Guy A.
2006-01-01
This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, and was derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
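The correlation step described can be sketched with a plain Pearson coefficient between a team-skill score and one quality metric per project. The data values below are hypothetical, not the study's 26-project measurements.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-project data: team skill score vs. a design-adequacy metric.
skill = [2.1, 3.4, 3.9, 4.5, 5.0, 2.8]
design_adequacy = [0.55, 0.70, 0.72, 0.85, 0.90, 0.60]
```

With this toy data the coefficient is strongly positive, which is the shape of result the paper reports for design and implementation adequacy; a full analysis would repeat this for each of the five ISO/IEC 9126 metrics and test significance.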
NASA Astrophysics Data System (ADS)
Diaz-Merced, Wanda Liz; Casado, Johanna; Garcia, Beatriz; Aarnio, Alicia; Knierman, Karen; Monkiewicz, Jacqueline
2018-01-01
"Big Data" is a subject that has taken on special relevance today, particularly in Astrophysics, where continuous advances in technology are leading to ever larger data sets. A multimodal approach to the perception of astronomical data (achieved through sonification used for the processing of data) increases the detection of signals at very low signal-to-noise ratios and is of special importance for achieving greater inclusion in the field of Astronomy. In the last ten years, different software tools have been developed that perform the sonification of astronomical data from tables or databases; among them, the best known and in multiplatform development are Sonification Sandbox, MathTrack, and xSonify. In order to determine the accessibility of software, we propose to start by carrying out a conformity analysis against ISO (International Organization for Standardization) 9241-171:2008. This standard establishes the general guidelines that must be taken into account for accessibility in software design, and it applies to software used at work, in public places, and at home. To analyze the accessibility of web databases, we take into account the Web Content Accessibility Guidelines (WCAG) 2.0, accepted and published by ISO in the ISO/IEC 40500:2012 standard. In this poster, we present a User Centered Design (UCD), Human Computer Interaction (HCI), and User Experience (UX) framework to address a non-segregational provision of access to bibliographic databases and telemetry databases in Astronomy. Our framework is based on an ISO evaluation of a selection of databases such as ADS, Simbad and SDSS. The WCAG 2.0 and ISO 9241-171:2008 should not be taken as absolute accessibility standards: these guidelines are very general, are not absolute, and do not address particularities. They are not to be taken as a substitute for UCD, HCI, and UX design and evaluation.
Based on our results, this research presents the framework for a focus group and qualitative data analysis aimed to lay the foundations for the employment of UCD functionalities on astronomical databases.
Printing quality control automation
NASA Astrophysics Data System (ADS)
Trapeznikova, O. V.
2018-04-01
One of the most important problems in standardizing the offset printing process is the control of print quality and its automation. To solve the problem, software has been developed that takes into account the specifics of printing system components and their behavior in the printing process. To characterize the distribution of the ink layer on the printed substrate, the deviation of the ink layer thickness on the sheet from the nominal surface is suggested. Constructing the geometric projections of the color gamut bodies allows visualization of the color reproduction gamut of printing systems in brightness ranges and specific color sectors, which provides a qualitative comparison of systems by the reproduction of individual colors over varying ranges of brightness.
NASA Astrophysics Data System (ADS)
Adamczewski-Musch, J.; Akishin, P.; Becker, K.-H.; Belogurov, S.; Bendarouach, J.; Boldyreva, N.; Deveaux, C.; Dobyrn, V.; Dürr, M.; Eschke, J.; Förtsch, J.; Heep, J.; Höhne, C.; Kampert, K.-H.; Kochenda, L.; Kopfer, J.; Kravtsov, P.; Kres, I.; Lebedev, S.; Lebedeva, E.; Leonova, E.; Linev, S.; Mahmoud, T.; Michel, J.; Miftakhov, N.; Niebur, W.; Ovcharenko, E.; Patel, V.; Pauly, C.; Pfeifer, D.; Querchfeld, S.; Rautenberg, J.; Reinecke, S.; Riabov, Y.; Roshchin, E.; Samsonov, V.; Schetinin, V.; Tarasenkova, O.; Traxler, M.; Ugur, C.; Vznuzdaev, E.; Vznuzdaev, M.
2017-12-01
The Compressed Baryonic Matter (CBM) experiment at the future Facility for Antiproton and Ion Research (FAIR) will investigate the phase diagram of strongly interacting matter at high net-baryon density and moderate temperature in A+A collisions. One of the key detectors of CBM for exploring this physics program is a Ring Imaging CHerenkov (RICH) detector for electron identification. For high performance of the RICH detector, precise mirror alignment is essential. A three-step correction cycle has been developed, which will be discussed: first, a qualitative, fast check of the mirror positions; second, a quantitative determination of possible misalignments; and third, a software correction routine allowing proper functioning of the RICH under misalignment conditions.
Evans, Catrin; Tweheyo, Ritah; McGarry, Julie; Eldridge, Jeanette; McCormick, Carol; Nkoyo, Valentine; Higginbottom, Gina Marie Awoko
2017-12-14
Female genital mutilation (FGM) is an issue of global concern. High levels of migration mean that healthcare systems in higher-income western countries are increasingly being challenged to respond to the care needs of affected communities. Research has identified significant challenges in the provision of, and access to, FGM-related healthcare. There is a lack of confidence and competence among health professionals in providing appropriate care, suggesting an urgent need for evidence-based service development in this area. This study will involve two systematic reviews of qualitative evidence to explore the experiences, needs, barriers and facilitators to seeking and providing FGM-related healthcare in high-income (Organisation for Economic Cooperation and Development) countries, from the perspectives of: (1) women and girls who have undergone FGM and (2) health professionals. Twelve databases including MEDLINE, EMBASE, PsycINFO, ASSIA, Web of Science, ERIC, CINAHL, and POPLINE will be searched with no limits on publication year. Relevant grey literature will be identified from digital sources and professional networks. Two reviewers will independently screen, select and critically appraise the studies. Study quality will be assessed using the Joanna Briggs Institute Qualitative Assessment and Review Instrument appraisal tool. Findings will be extracted into NVivo software. Synthesis will involve inductive thematic analysis, including in-depth reading, line-by-line coding of the findings, development of descriptive themes and re-coding to higher-level analytical themes. Confidence in the review findings will be assessed using the CERQual approach. Findings will be integrated into a comprehensive set of recommendations for research, policy and practice. The syntheses will be reported as per the Enhancing Transparency in Reporting the Synthesis of Qualitative Research (ENTREQ) statement.
Two reviews will be published in peer-reviewed journals and an integrated report disseminated at stakeholder engagement events. CRD42015030001: 2015 and CRD42015030004: 2015. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
NASA Astrophysics Data System (ADS)
Brewer, Denise
The air transport industry (ATI) is a dynamic, communal, international, and intercultural environment in which the daily operations of airlines, airports, and service providers are dependent on information technology (IT). Many of the IT legacy systems are more than 30 years old, and current regulations and the globally distributed workplace have brought profound changes to the way the ATI community interacts. The purpose of the study was to identify the areas of resistance to change in the ATI community and the corresponding factors in change management requirements that minimize product development delays and lead to a successful and timely shift from legacy to open web-based systems in upgrading ATI operations. The research questions centered on product development team processes as well as the members' perceived need for acceptance of change. A qualitative case study approach rooted in complexity theory was employed using a single case of an intercultural product development team dispersed globally. Qualitative data gathered from questionnaires were organized using NVivo software, in which the words and themes were coded. Once coded, themes emerged identifying the areas of resistance within the product development team. Results of follow-up interviews with team members suggest that building intercultural relationships before and during project execution, focusing on common team goals, and developing relationships that enhance interpersonal respect, understanding, and overall communication help overcome resistance to change. Positive social change in the form of intercultural group effectiveness, evidenced in increased team functioning during major project transitions, is likely to result when global managers devote time to cultural understanding.
Computer-Aided Software Engineering - An approach to real-time software development
NASA Technical Reports Server (NTRS)
Walker, Carrie K.; Turkovich, John J.
1989-01-01
A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.
NASA Astrophysics Data System (ADS)
Radakovic, Nenad; McDougall, Douglas
2012-10-01
This classroom note illustrates how dynamic visualization can be used to teach conditional probability and Bayes' theorem. There are two features of the visualization that make it an ideal pedagogical tool in probability instruction. The first feature is the use of area-proportional Venn diagrams that, along with showing qualitative relationships, describe the quantitative relationship between two sets. The second feature is the slider and animation component of dynamic geometry software enabling students to observe how the change in the base rate of an event influences conditional probability. A hypothetical instructional sequence using a well-known breast cancer example is described.
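The base-rate effect the note illustrates with a slider can be reproduced numerically. A minimal sketch using the figures commonly quoted for the breast cancer example (1% prevalence, 80% test sensitivity, 9.6% false-positive rate; these values are assumptions, not taken from the note itself):

```python
# Bayes' theorem: P(D | +) = P(+ | D) P(D) / P(+),
# where P(+) = P(+ | D) P(D) + P(+ | not D) P(not D).

def posterior(prior, sensitivity, false_positive_rate):
    """Probability of disease given a positive test result."""
    p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_positive

# Sliding the base rate, as with the dynamic-geometry slider, shows how
# strongly prevalence drives the conditional probability.
for prior in (0.01, 0.10):
    print(round(posterior(prior, 0.80, 0.096), 3))
```

Raising the base rate from 1% to 10% roughly sextuples the posterior probability, which is exactly the relationship the area-proportional Venn diagrams make visible.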
Cason-Wilkerson, Rochelle; Goldberg, Shauna; Albright, Karen; Allison, Mandy; Haemer, Matthew
2015-01-01
Background: Childhood obesity disproportionately affects low-income minority populations, yet there is a paucity of literature about effective interventions in this population. This study sought to understand the experience of low-income, majority-Hispanic families engaged in obesity treatment. Methods: We conducted six focus groups (2=English, 4=Spanish) with families who completed a community-based, family-oriented obesity treatment program, using standard qualitative focus group interview methods. Sessions were recorded, transcribed, and analyzed for thematic content. Two coders using the software program ATLAS.ti (v.7.0; Scientific Software Development GmbH, Berlin, Germany) coded each transcript independently; reflexive team analysis with three study team members was used to reach a consensus. Results: Participants (n=37) indicated high program satisfaction. Parents reported buying less junk/fast food, increased consumption of fruits and vegetables, preparing and eating more meals as a family, and increasing their families' physical activity (PA). Four barrier themes and three facilitator themes emerged. Barrier themes were time and financial cost, parents' lack of time and energy, influence of family members, and challenges regarding the physical environment. Facilitator themes were skill building around healthy eating and parenting, family involvement, and long-term health concerns. An unanticipated finding, parents reported, was that the changes resulted in children sleeping better, feeling happier, and being less irritable. Conclusions: Despite low-income families experiencing barriers to the lifestyle changes needed to manage obesity, they made positive dietary changes and increased PA by learning specific skills and including the whole family in those changes. Additionally, some unexpected benefits were noted, including improved sleep, less irritability, and children appearing happier.
Future studies should consider using these parent-identified outcomes as secondary measures of program effectiveness. PMID:25715107
Chiang, Kuei-Feng; Wang, Hsiu-Hung
2016-07-01
To examine nurses' experiences regarding the benefits and obstacles of using a smart mobile device application in home care. The popularity of mobile phones and Internet technology has established an opportunity for interaction between patients and health care professionals. Line is an application allowing instant communication that is available for free globally. However, the literature relating to use of Line in this area is limited. A qualitative study involving individual in-depth interviews. Participants included community nurses (N = 17) from six home care facilities in southern Taiwan who had used Line for home care of chronically ill patients for at least six months. The study was conducted using semi-structured in-depth interviews, which were recorded and converted into transcripts for content analysis. Seven themes emerged from data analysis: reduction in medical care consumption and costs, reduction in workload and stress, facilitating improvement in the quality of care, promotion of the nurse-patient relationship, perceived risk, lack of organisational incentives and operating procedures and disturbance to personal life. Nurses considered Line valuable for use in home care. While this application has diverse functions, its video transfer function could in particular help nursing staff make prompt decisions about patients' problems and promote nurse-patient relationships. However, there might be hidden risks including legal consequences, safety risks to patients, possible violations of professionalism and increased risk of nurse burnout. Increasing nursing staff awareness of using mobile messaging software applications is necessary. This study provides relevant information about the benefits, disadvantages, risks and limitations of nurses' use of Line. The study also provides suggestions for software programmers and future organisational strategy and development. © 2016 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Yetman, G.; Downs, R. R.
2011-12-01
Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively.
Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.
The Elements of an Effective Software Development Plan - Software Development Process Guidebook
2011-11-11
standards and practices required for all XMPL software development. This SDP implements the <corporate> Standard Software Process (SSP), as tailored... Developing and integrating reusable software products • Approach to managing COTS/Reuse software implementation • COTS/Reuse software selection... final selection and submit to change board for approval. MAINTENANCE: Monitor current products for obsolescence or end of support. Track new
Artificial intelligence approaches to software engineering
NASA Technical Reports Server (NTRS)
Johannes, James D.; Macdonald, James R.
1988-01-01
Artificial intelligence approaches to software engineering are examined. The software development life cycle is a sequence of not-so-well-defined phases. Improved techniques for developing systems have been formulated over the past 15 years, but pressure to reduce costs continues, and software development technology seems to be standing still. The primary objective of the knowledge-based approach to software development presented in this paper is to avoid problem areas that lead to schedule slippages, cost overruns, or software products that fall short of their desired goals. Identifying and resolving software problems early, often in the phase in which they first occur, has been shown to contribute significantly to reducing risk in software development. Software development is not a mechanical process but a basic human activity. It requires clear thinking, work, and rework to be successful. The artificial intelligence approaches to software engineering presented here support the software development life cycle by changing current practices and methods: these should be replaced by better techniques that improve the process of software development and the quality of the resulting products. The software development process can be structured into well-defined steps whose interfaces are standardized, supported, and checked by automated procedures that provide error detection, produce documentation, and ultimately support the actual design of complex programs.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.
1992-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into development of advanced technologies for Computer Aided Software Engineering (CASE).
Comprehensive Design Reliability Activities for Aerospace Propulsion Systems
NASA Technical Reports Server (NTRS)
Christenson, R. L.; Whitley, M. R.; Knight, K. C.
2000-01-01
This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on the mechanical design reliability of propulsion systems. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.
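As a worked illustration of the quantitative side of such an analysis (the component reliabilities below are invented, not taken from the publication), classic reliability block-diagram arithmetic combines series chains and redundant components as follows:

```python
# Reliability block-diagram arithmetic:
#   components in series multiply: R = prod(R_i)
#   redundant (parallel) components: R = 1 - prod(1 - R_i)
from functools import reduce

def series(rs):
    """Reliability of components in series (all must work)."""
    return reduce(lambda acc, r: acc * r, rs, 1.0)

def parallel(rs):
    """Reliability of redundant components (at least one must work)."""
    return 1.0 - reduce(lambda acc, r: acc * (1.0 - r), rs, 1.0)

# Hypothetical subsystem: a redundant turbopump pair feeding one chamber.
pump_pair = parallel([0.95, 0.95])   # redundancy lifts 0.95 to 0.9975
system = series([pump_pair, 0.99])   # chamber in series with the pair
print(round(system, 6))
```

Sensitivity and trade studies of the kind the publication describes amount to re-running this arithmetic while varying individual component values.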
Development of a Mobile User Interface for Image-based Dietary Assessment.
Kim, Sungye; Schap, Tusarebecca; Bosch, Marc; Maciejewski, Ross; Delp, Edward J; Ebert, David S; Boushey, Carol J
2010-12-31
In this paper, we present a mobile user interface for image-based dietary assessment. The mobile user interface provides a front end to client-server image recognition and portion estimation software. In the client-server configuration, the user interactively records a series of food images using a built-in camera on the mobile device. Images are sent from the mobile device to the server, and the calorie content of the meal is estimated. In this paper, we describe and discuss the design and development of our mobile user interface features. We trace the design concepts from initial ideas through implementation. For each concept, we discuss qualitative user feedback from participants using the mobile client application. We then discuss future designs, including work on design considerations that allow the user to interactively correct errors in the automatic processing while reducing the user burden associated with classical pen-and-paper dietary records.
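Once the server has recognized the foods and estimated portion sizes, the calorie estimate itself reduces to simple arithmetic. A sketch of that final step; the energy densities below are illustrative values, not the system's actual food database:

```python
# Final step of image-based dietary assessment: convert recognized
# foods and estimated portion masses into a calorie total.
# Energy densities (kcal per gram) are illustrative, not from the paper.
KCAL_PER_GRAM = {"rice": 1.3, "chicken": 1.65, "broccoli": 0.35}

def meal_calories(portions):
    """portions: mapping of recognized food name -> estimated grams."""
    return sum(KCAL_PER_GRAM[food] * grams for food, grams in portions.items())

# Hypothetical server output for one photographed meal.
print(round(meal_calories({"rice": 150, "chicken": 100, "broccoli": 80})))
```

The interactive error correction discussed above would amount to the user editing this mapping (swapping a misrecognized food or adjusting grams) before the total is recomputed.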
Combining continuing education with expert consultation via telemedicine in Cambodia.
Engle, Xavier; Aird, James; Tho, Ly; Bintcliffe, Fiona; Monsell, Fergal; Gollogly, Jim; Noor, Saqib
2014-04-01
Telemedicine has the potential to increase access to both clinical consultation and continuing medical education in Cambodia. We present a Cambodian surgical centre's experience with a collaboration in which complicated orthopaedic cases were presented to a panel of consultants using free online videoconferencing software, providing a combined opportunity for both continuing education and the enhancement of patient care. Effects of the case conference on patient care were examined via a retrospective review and clinician perspectives were elicited via a qualitative survey. The case conference altered patient care in 69% of cases. All Cambodian staff reported learning from the conference and 78% reported changes in their care for patients not presented at the conference. Real-time videoconferencing between consultants in the developed world and physicians in a developing country may be an effective, low-cost and easily replicable means of combining direct benefits to patient care with continuing medical education.
ERIC Educational Resources Information Center
Biju, Soly Mathew
2008-01-01
Many software development firms are now adopting the agile software development method. This method involves the customer at every level of software development, thus reducing the impact of change in the requirement at a later stage. In this article, the principles of the agile method for software development are explored and there is a focus on…
Development of a comprehensive software engineering environment
NASA Technical Reports Server (NTRS)
Hartrum, Thomas C.; Lamont, Gary B.
1987-01-01
The generation of a set of tools for the software lifecycle is a recurring theme in the software engineering literature. The development of such tools and their integration into a software development environment is a difficult task because of the magnitude (number of variables) and the complexity (combinatorics) of the software lifecycle process. An initial global approach was initiated in 1982 as the Software Development Workbench (SDW). Continuing efforts focus on tool development, tool integration, human interfacing, data dictionaries, and testing algorithms. Current efforts emphasize natural language interfaces, expert-system software development associates, and distributed environments with Ada as the target language. The current implementation of the SDW is on a VAX-11/780. Other software development tools are being networked through engineering workstations.
Reducing Risk in DoD Software-Intensive Systems Development
2016-03-01
intensive systems development risk. This research addresses the use of the Technical Readiness Assessment (TRA) using the nine-level software Technology...The software TRLs are ineffective in reducing technical risk for the software component development. • Without the software TRLs, there is no...effective method to perform software TRA or reduce the technical development risk. The software component will behave as a new, untried technology in nearly
ERIC Educational Resources Information Center
Radakovic, Nenad; McDougall, Douglas
2012-01-01
This classroom note illustrates how dynamic visualization can be used to teach conditional probability and Bayes' theorem. There are two features of the visualization that make it an ideal pedagogical tool in probability instruction. The first feature is the use of area-proportional Venn diagrams that, along with showing qualitative relationships,…
ERIC Educational Resources Information Center
Cárdenas-Claros, Mónica Stella
2015-01-01
This paper reports on the findings of two qualitative exploratory studies that sought to investigate design features of help options in computer-based L2 listening materials. Informed by principles of participatory design, language learners, software designers, language teachers, and a computer programmer worked collaboratively in a series of…
Nonstandard Career Trajectories and Their Various Forms
ERIC Educational Resources Information Center
Fournier, Genevive; Bujold, Charles
2005-01-01
A sample of 124 participants (62 men, 62 women) took part in this qualitative study of people who had experienced nonstandard work for the previous 3 years. Participants were interviewed individually in semistructured interviews of approximately 2 hours. On the basis of a content analysis conducted with the NUD*IST analysis software,…
ERIC Educational Resources Information Center
Osler, James Edward
2013-01-01
This paper discusses the implementation of the Tri-Squared Test as an advanced statistical measure used to verify and validate the research outcomes of Educational Technology software. A mathematical and epistemological rationale is provided for the transformative process of qualitative data into quantitative outcomes through the Tri-Squared Test…
A Qualitative Investigation of an All-Female Group in a Software Engineering Course Project
ERIC Educational Resources Information Center
Cox, Anthony; Fisher, Maryanne
2008-01-01
Past research suggests that single-sex educational environments provide many benefits to women's learning. Similarly, as indicated by their under-representation, it is known that there are problems in attracting and subsequently retaining women in information technology disciplines. In an effort to improve the enrollment and retention of women, we…
ERIC Educational Resources Information Center
Caughlan, Samantha; Kelly, Sean
2004-01-01
Quantitative analyses using CLASS 3.0 software and qualitative discourse analyses were conducted on the instructional and institutional effects of tracking in high- and low-track American literature classes taught by the same teacher, a participant in a national study of the effects of dialogic classroom discourse patterns on student achievement.…
Using Qualitative Data Analysis Software in Teaching about Group Work Practice
ERIC Educational Resources Information Center
Macgowan, Mark J.; Beaulaurier, Richard L.
2005-01-01
Courses on social group work have traditionally relied on in-class role plays to teach group work skills. The most common technological aid in such courses has been analog videotape. In recent years new technologies have emerged that allow the instructor to customize and tailor didactic experiences to individual classes and individual learners.…
Teaching Media Design by Using Scrum. A Qualitative Study within a Media Informatics Elective Course
ERIC Educational Resources Information Center
Herrmann, Ines; Münster, Sander; Tietz, Vincent; Uhlemann, Rainer
2017-01-01
Cross-disciplinary skills are today's key skills for media informatics students to gain employment after graduation. However, such problem-based learning projects almost never take place due to organizational struggles. The authors suggest Scrum, a framework that is increasingly used in software engineering, as a solution for the challenges. Scrum…
Using Content Analysis Software to Analyze Survey Comments
ERIC Educational Resources Information Center
Dennis, Bradford W.; Bower, Tim
2008-01-01
In order to get the most from LibQUAL+[TM] qualitative data, libraries must organize and classify the comments of their patrons. The challenge is to do this effectively and efficiently. This article illustrates how researchers at Western Michigan University Libraries utilized ATLAS.ti 5.0 to organize, classify, and consolidate the LibQUAL+[TM]…
Electronic Storybooks: A Constructivist Approach to Improving Reading Motivation in Grade 1 Students
ERIC Educational Resources Information Center
Ciampa, Katia
2012-01-01
This study stemmed from a concern of the perceived decline in students' reading motivation after the early years of schooling. This research investigated the effectiveness of online eBooks on eight grade 1 students' reading motivation. Eight students were given ten 25-minute sessions with the software programs over 15 weeks. Qualitative data were…
Hussain-Alkhateeb, Laith; Kroeger, Axel; Olliaro, Piero; Rocklöv, Joacim; Sewe, Maquins Odhiambo; Tejeda, Gustavo; Benitez, David; Gill, Balvinder; Hakim, S. Lokman; Gomes Carvalho, Roberta; Bowman, Leigh; Petzold, Max
2018-01-01
Background: Dengue outbreaks are increasing in frequency over space and time, affecting people's health and burdening resource-constrained health systems. The ability to detect emerging outbreaks early is key to mounting an effective response. The early warning and response system (EWARS) is a toolkit that provides countries with early-warning systems for efficient and cost-effective local responses. EWARS uses outbreak and alarm indicators to derive prediction models that can be used prospectively to predict a forthcoming dengue outbreak at district level. Methods: We report on the development of the EWARS tool, based on users' recommendations, into a convenient, user-friendly and reliable software aided by a user's workbook, and on its field testing in 30 health districts in Brazil, Malaysia and Mexico. Findings: 34 health officers from the 30 study districts who had used the original EWARS for 7 to 10 months responded to a questionnaire with mainly open-ended questions. Qualitative content analysis showed that participants were generally satisfied with the tool but preferred open-access over commercial software. EWARS users also stated that the geographical unit should be the district, while access to meteorological information should be improved. These recommendations were incorporated into the second-generation EWARS-R, which uses the free R software combined with recent surveillance data and achieved higher sensitivities and positive predictive values of alarm signals compared to the first-generation EWARS. Currently the use of satellite data for meteorological information is being tested and a dashboard is being developed to increase the user-friendliness of the tool. The inclusion of other Aedes-borne viral diseases is under discussion. Conclusion: EWARS is a pragmatic and useful tool for detecting imminent dengue outbreaks to trigger early response activities. PMID:29727447
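The sensitivity and positive predictive value of alarm signals reported above are simple ratios over weekly alarm and outbreak indicators. A sketch with invented week-by-week data (EWARS itself derives alarms from surveillance-based prediction models, not from this toy series):

```python
# Sensitivity = share of true outbreak weeks for which an alarm fired;
# PPV = share of fired alarms that coincided with a true outbreak week.

def alarm_performance(alarms, outbreaks):
    """alarms, outbreaks: parallel 0/1 series, one entry per week."""
    true_positives = sum(a and o for a, o in zip(alarms, outbreaks))
    sensitivity = true_positives / sum(outbreaks)
    ppv = true_positives / sum(alarms)
    return sensitivity, ppv

# Hypothetical eight-week district series.
alarms    = [0, 1, 1, 0, 1, 0, 0, 1]
outbreaks = [0, 1, 1, 0, 0, 0, 1, 1]
sens, ppv = alarm_performance(alarms, outbreaks)
print(round(sens, 2), round(ppv, 2))
```

Improving both ratios at once, as EWARS-R reportedly did over the first-generation tool, means fewer missed outbreaks and fewer false alarms.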
Operable Data Management for Ocean Observing Systems
NASA Astrophysics Data System (ADS)
Chavez, F. P.; Graybeal, J. B.; Godin, M. A.
2004-12-01
As oceanographic observing systems become more numerous and complex, data management solutions must follow. Most existing oceanographic data management systems fall into one of three categories: they have been developed as dedicated solutions, with limited application to other observing systems; they expect that data will be pre-processed into well-defined formats, such as netCDF; or they are conceived as robust, generic data management solutions, with complexity (high) and maturity and adoption rates (low) to match. Each approach has strengths and weaknesses; no approach yet fully addresses, nor takes advantage of, the sophistication of ocean observing systems as they are now conceived. In this presentation we describe critical data management requirements for advanced ocean observing systems, of the type envisioned by ORION and IOOS. By defining common requirements -- functional, qualitative, and programmatic -- for all such ocean observing systems, the performance and nature of the general data management solution can be characterized. Issues such as scalability, maintaining metadata relationships, data access security, visualization, and operational flexibility suggest baseline architectural characteristics, which may in turn lead to reusable components and approaches. Interoperability with other data management systems, with standards-based solutions in metadata specification and data transport protocols, and with the data management infrastructure envisioned by IOOS and ORION, can also be used to define necessary capabilities. Finally, some requirements for the software infrastructure of ocean observing systems can be inferred. Early operational results and lessons learned, from development and operations of MBARI ocean observing systems, are used to illustrate key requirements, choices, and challenges. 
Reference systems include the Monterey Ocean Observing System (MOOS), its component software systems (Software Infrastructure and Applications for MOOS, and the Shore Side Data System), and the Autonomous Ocean Sampling Network (AOSN).
Hussain-Alkhateeb, Laith; Kroeger, Axel; Olliaro, Piero; Rocklöv, Joacim; Sewe, Maquins Odhiambo; Tejeda, Gustavo; Benitez, David; Gill, Balvinder; Hakim, S Lokman; Gomes Carvalho, Roberta; Bowman, Leigh; Petzold, Max
2018-01-01
Modification of infant hypothyroidism and phenylketonuria screening program using electronic tools.
Taheri, Behjat; Haddadpoor, Asefeh; Mirkhalafzadeh, Mahmood; Mazroei, Fariba; Aghdak, Pezhman; Nasri, Mehran; Bahrami, Gholamreza
2017-01-01
Congenital hypothyroidism and phenylketonuria (PKU) are the most common causes of preventable mental retardation in infants worldwide. Timely diagnosis and treatment of these disorders can have lasting effects on the mental development of newborns. However, there are several problems at different stages of screening programs that, along with imposing heavy costs, can reduce the precision of the screening, increasing the chance of undiagnosed cases, which in turn can have damaging consequences for society. Therefore, given these problems and the importance of information systems in facilitating the management and improving the quality of health care, the aim of this study was to improve the screening process for hypothyroidism and PKU in infants with the help of electronic resources. The current study is a qualitative action research project designed to improve the quality of screening, services, performance, implementation effectiveness, and management of the hypothyroidism and PKU screening program in Isfahan province. To this end, web-based software was designed. Programming was carried out in Delphi.net, with SQL Server 2008 used for database management. Given the weaknesses, problems, and limitations of the hypothyroidism and PKU screening program, and the importance of these diseases on a national scale, this study resulted in the design of hypothyroidism and PKU screening software for infants in Isfahan province. The inputs and outputs of the software were designed at three levels: the health care centers in charge of the screening program, the provincial reference lab, and the health and treatment network of Isfahan province.
The software's features include immediate registration of sample data at the time and location of sampling; the ability for the provincial reference laboratory and the health centers of the different districts to instantly observe, monitor, and follow up on samples at any moment; online verification of samples by the reference lab; creation of a daily schedule for the reference lab; and receipt of results directly from the analysis equipment, entered into the database without the need for user input. The implementation of the hypothyroidism screening software led to an increase in the quality and efficiency of the screening program, minimized the risk of human error in the process, and solved many of the previous limitations of the screening program, which were the main goals for implementing this software. The implementation also improved the precision and quality of services provided for these two diseases and the accuracy and precision of data inputs, by making it possible to enter sample data at the place and time of sampling. This in turn enabled management based on precise data, helped develop a comprehensive database, and improved the satisfaction of service recipients.
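The register-then-verify workflow described in this abstract can be sketched with a small relational schema; the table, column, and function names below are illustrative, not the Isfahan system's actual design (which used Delphi.net and SQL Server), and SQLite stands in for the real database:

```python
import sqlite3
from datetime import datetime, timezone

# Illustrative schema: one row per newborn screening sample,
# registered at the point of sampling and later verified by the lab.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE samples (
        sample_id    INTEGER PRIMARY KEY,
        center       TEXT NOT NULL,   -- health care center that took the sample
        collected_at TEXT NOT NULL,   -- registered at time/place of sampling
        tsh_result   REAL,            -- filled in from the analysis equipment
        verified     INTEGER DEFAULT 0  -- set by the provincial reference lab
    )""")

def register_sample(center):
    """Immediate registration at the time and location of sampling."""
    cur = conn.execute(
        "INSERT INTO samples (center, collected_at) VALUES (?, ?)",
        (center, datetime.now(timezone.utc).isoformat()))
    return cur.lastrowid

def record_result(sample_id, tsh):
    """Result arrives from the analyzer; no manual user input needed."""
    conn.execute("UPDATE samples SET tsh_result = ? WHERE sample_id = ?",
                 (tsh, sample_id))

def verify(sample_id):
    """Online verification by the reference lab."""
    conn.execute("UPDATE samples SET verified = 1 WHERE sample_id = ?",
                 (sample_id,))

sid = register_sample("Center A")
record_result(sid, 4.2)
verify(sid)
row = conn.execute(
    "SELECT center, tsh_result, verified FROM samples WHERE sample_id = ?",
    (sid,)).fetchone()
print(row)  # ('Center A', 4.2, 1)
```

Because every row carries its collection timestamp and verification flag, each center and the reference lab can monitor a sample's status the moment it is registered, which is the core of the claimed quality gain.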
Evolution of Secondary Software Businesses: Understanding Industry Dynamics
NASA Astrophysics Data System (ADS)
Tyrväinen, Pasi; Warsta, Juhani; Seppänen, Veikko
The primary software industry originated in IBM's decision to unbundle software-related computer system development activities to external partners. This kind of outsourcing of an enterprise's internal software development activity is a common way to start a new software business serving a vertical software market. It combines knowledge of the vertical market's processes with competence in software development. In this research, we present and analyze the key figures of the Finnish secondary software industry, in order to quantify its interaction with the primary software industry during the period 2000-2003. On the basis of the empirical data, we present a model for the evolution of a secondary software business, which makes the industry dynamics explicit. It represents the shift from internal software developed for competitive advantage to the development of products supporting standard business processes on top of standardized technologies. We also discuss the implications for software business strategies in each phase.
Doherty, Megan L; Owusu-Dabo, Ellis; Kantanka, Osei Sarfo; Brawer, Rickie O; Plumb, James D
2014-10-14
Urban centers in Sub-Saharan Africa, such as Kumasi, Ghana, are especially impacted by the dual burden of infectious and non-communicable disease (NCD), including a rise in type 2 diabetes mellitus (T2DM) prevalence. To develop effective intervention programs, the World Health Organization recommends more research to better understand the relationship between food consumption and the escalation of non-communicable diseases such as T2DM. This study provides qualitative information about current food knowledge, attitudes and practices among T2DM patients and their caregivers in the region of Kumasi, Ghana. In this qualitative study, three focus group discussions (30 persons in total) and 10 individual interviews were used to assess the food preferences, knowledge, attitudes and practices of patients with T2DM as well as of caregivers responsible for food preparation. Participants included both urban and rural dwellers. Hospital-based health talks were observed, a dietician was interviewed, and educational documents were collected. Themes were identified and coded using NVivo 10 software. Findings suggest that messages regarding sweetened foods, fats, use of seasonings and meal timing are followed. However, confusion exists regarding the impact of fruits, food portioning, plantains and processed foods on health outcomes for diabetic patients. Results also revealed a problem-solving approach to increasing vegetable consumption, and a concern about unhealthy food preferences among younger generations. Education about the impact of commonly available carbohydrates on blood sugar should be emphasized; messaging on portion sizes and certain foods should be more consistent; the economic benefits of local vegetable consumption should be promoted; and a research-informed T2DM prevention campaign should be developed specifically for younger generations.
Messer, Lynne C; Parnell, Heather; Huffaker, Renee; Wooldredge, Rich; Wilkin, Aimee
2012-10-01
The Regional Health Information Integration Project (RHIIP) has developed the Carolina HIV Information Cooperative regional health information organization (CHIC RHIO). The CHIC RHIO was implemented to improve patient care and health outcomes by enhancing communication among geographically disconnected networks of HIV care providers in rural North Carolina. CHIC RHIO comprises one medical clinic and five AIDS Service Organizations (ASOs) serving clients in eight rural counties. Communication among the CHIC RHIO members is facilitated by CAREWare software. The RHIIP team assessed organizational readiness to change, facilitated relationship-building for CHIC RHIO, created the CHIC RHIO, and used both qualitative and quantitative approaches to evaluate the process-related effects of implementing a data-sharing intervention. We found that the CHIC RHIO member organizations were ready to engage in the IT intervention prior to its implementation, which most likely contributed to its successful adoption. The qualitative findings indicate that CHIC RHIO members personally benefited, and perceived that their clients benefited, from participation in the information exchange. The quantitative results echoed the qualitative findings; following the CHIC RHIO intervention, quality improvements were noted in the ASO and medical clinic relationships, information exchange, and perceived level of patient care. Furthermore, hopes for what data sharing would accomplish were overly high at the beginning of the project, requiring a recalibration of expectations as the project came to a close. Innovative strategies for health information exchange can be implemented in rural communities to increase communication among providers. With this increased communication comes the potential for improved health outcomes and, in turn, healthier communities. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.
1992-01-01
The United States' big space projects of the coming decades, such as Space Station and the Human Exploration Initiative, will require the development of many millions of lines of mission-critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and productivity of large software development projects. New trends in CASE technology are outlined, and the paper describes how the Software Technology Branch (STB) at JSC is endeavoring to provide some of these CASE solutions for NASA. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.
NASA Astrophysics Data System (ADS)
Wang, Cuihuan; Kim, Leonard; Barnard, Nicola; Khan, Atif; Pierce, Mark C.
2016-02-01
Our long-term goal is to develop a high-resolution imaging method for comprehensive assessment of tissue removed during lumpectomy procedures. By identifying regions of high-grade disease within the excised specimen, we aim to develop patient-specific post-operative radiation treatment regimens. We have assembled a benchtop spectral-domain optical coherence tomography (SD-OCT) system with a 1320 nm center wavelength. Automated beam scanning enables "sub-volumes" spanning 5 mm x 5 mm x 2 mm (500 A-lines x 500 B-scans x 2 mm in depth) to be collected in under 15 seconds. A motorized sample positioning stage enables multiple sub-volumes to be acquired across an entire tissue specimen. Sub-volumes are rendered from individual B-scans in 3D Slicer software and en face (XY) images are extracted at specific depths. These images are then tiled together using MosaicJ software to produce a large-area en face view (up to 40 mm x 25 mm). After OCT imaging, specimens were sectioned and stained with H&E, allowing comparison between OCT image features and disease markers on histopathology. This manuscript describes the technical aspects of image acquisition and reconstruction, and reports an initial qualitative comparison between large-area en face OCT images and H&E-stained tissue sections. Future goals include developing image reconstruction algorithms for mapping an entire sample, and registering OCT image volumes with clinical CT and MRI images for post-operative treatment planning.
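The mosaicking step (extracting an en face slice from each sub-volume at a fixed depth, then tiling the slices into a large-area view) can be sketched as below. The array shapes and grid layout are toy-sized for illustration; the study itself used 3D Slicer and MosaicJ rather than custom code:

```python
def en_face_slice(volume, depth_index):
    """Extract an en face (XY) image at a fixed depth from a sub-volume.

    volume: nested list indexed as [b_scan][a_line][depth].
    """
    return [[a_line[depth_index] for a_line in b_scan] for b_scan in volume]

def tile_mosaic(tiles, grid_cols):
    """Tile equally sized en face images into one large-area image,
    row-major, with `grid_cols` tiles per mosaic row."""
    rows_per_tile = len(tiles[0])
    mosaic = []
    for start in range(0, len(tiles), grid_cols):
        tile_row = tiles[start:start + grid_cols]
        for r in range(rows_per_tile):
            mosaic.append([px for tile in tile_row for px in tile[r]])
    return mosaic

# Toy example: a 2 x 2 x 3 sub-volume, sliced at depth 1,
# then four copies tiled into a 2x2 grid.
vol = [[[i * 10 + j for j in range(3)] for i in range(2)] for _ in range(2)]
tile = en_face_slice(vol, depth_index=1)     # 2x2 en face image
mosaic = tile_mosaic([tile] * 4, grid_cols=2)
print(len(mosaic), len(mosaic[0]))  # 4 4
```

Real mosaicking also needs registration of overlapping tile edges, which is what MosaicJ provides and this butt-joined sketch omits.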
Development and Testing of a Smartphone Application Prototype for Oral Health Promotion.
Nolen, Sara L; Giblin-Scanlon, Lori J; Boyd, Linda D; Rainchuso, Lori
2018-04-01
Purpose: The purpose of this study was to develop and test a smartphone application (app) prototype, ToothSense, as an oral health promotion tool for the prevention of Early Childhood Caries (ECC) based on the Theory of Planned Behavior (TPB). Methods: A quantitative and qualitative design process based on the TPB was used for app development in the first phase of the study. A behavioral intervention technology model was used to document the design of the app's features, accounting for Doshi's intervention strategies for the TPB. Beta testing of the app was hosted via an online software program. Testers were presented with a series of tasks and prompts followed by a 5-point Likert-scale questionnaire that quantitatively measured perceptions of the app's interactive design based on Jakob Nielsen's usability principles and behavioral strategies. A Net Promoter Score was calculated to determine the testers' likelihood to recommend the app prototype. Audio and video aspects of the app were qualitatively measured using a template approach. Results: Beta testers agreed the app met the majority of the five usability statements. The Net Promoter Score indicated a likelihood to recommend the app prototype. Thematic analyses revealed the following themes: interface design, navigation, terminology, information, and oral health promotion. Conclusion: Beta testing results from this study provided health promotion project design information for the prevention of ECC using the TPB and highlighted the importance and usability of a smartphone app for oral health promotion. Copyright © 2018 The American Dental Hygienists' Association.
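The Net Promoter Score used above is computed from 0-10 "likelihood to recommend" ratings by the standard convention; the ratings below are invented, not the study's data:

```python
def net_promoter_score(ratings):
    """NPS = %promoters (9-10) minus %detractors (0-6) on 0-10 ratings."""
    if not ratings:
        raise ValueError("no ratings")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Toy beta-test panel of 10 testers: 6 promoters, 3 passives, 1 detractor.
ratings = [10, 9, 9, 8, 8, 7, 10, 6, 9, 10]
print(net_promoter_score(ratings))  # 50.0
```

Note that passives (7-8) count in the denominator but not in either group, so the score ranges from -100 to +100; any positive score indicates a net likelihood to recommend.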
Rashidian, Hamideh; Nedjat, Saharnaz; Majdzadeh, Reza; Gholami, Jaleh; Haghjou, Leila; Abdollahi, Bahar Sadeghi; Davatchi, Fereydoun; Rashidian, Arash
2013-09-25
Patient preference is one of the main components of clinical decision making, and has led to the development of patient decision aids. The goal of this study was to describe physicians' and patients' viewpoints on the barriers and limitations of using patient decision aids in Iran, their proposed solutions, and the benefits of using these tools. This qualitative study was conducted in 2011 in Iran through in-depth interviews with 14 physicians and 8 arthritis patients. Interviewees were selected through purposeful and maximum variation sampling. As an example, a patient decision aid on the treatment of knee arthritis was developed from literature reviews and expert opinion, and was presented at the time of interview. Thematic analysis of the data was conducted using the OpenCode software. The results were summarized into three categories and ten codes. The extracted categories were the perceived benefits of using the tools, as well as the patient-related and physician-related barriers to using decision aids. The following barriers to using patient decision aids were identified in this study: lack of patient and physician training in shared decision making, a shortage of specialists per capita, low treatment tariffs, and the lack of an exact evaluation system for patient participation in decision making. No doubt these barriers demand the health authorities' special attention. Hence, despite patients' and physicians' inclination toward using patient decision aids, these problems have hindered the practical use of these tools in Iran, a developing country.
LIME: 3D visualisation and interpretation of virtual geoscience models
NASA Astrophysics Data System (ADS)
Buckley, Simon; Ringdal, Kari; Dolva, Benjamin; Naumann, Nicole; Kurz, Tobias
2017-04-01
Three-dimensional and photorealistic acquisition of surface topography, using methods such as laser scanning and photogrammetry, has become widespread across the geosciences over the last decade. With recent innovations in photogrammetric processing software, robust and automated data capture hardware, and novel sensor platforms, including unmanned aerial vehicles, obtaining 3D representations of exposed topography has never been easier. In addition to 3D datasets, fusion of surface geometry with imaging sensors, such as multi/hyperspectral, thermal and ground-based InSAR, and geophysical methods, create novel and highly visual datasets that provide a fundamental spatial framework to address open geoscience research questions. Although data capture and processing routines are becoming well-established and widely reported in the scientific literature, challenges remain related to the analysis, co-visualisation and presentation of 3D photorealistic models, especially for new users (e.g. students and scientists new to geomatics methods). Interpretation and measurement is essential for quantitative analysis of 3D datasets, and qualitative methods are valuable for presentation purposes, for planning and in education. Motivated by this background, the current contribution presents LIME, a lightweight and high performance 3D software for interpreting and co-visualising 3D models and related image data in geoscience applications. The software focuses on novel data integration and visualisation of 3D topography with image sources such as hyperspectral imagery, logs and interpretation panels, geophysical datasets and georeferenced maps and images. High quality visual output can be generated for dissemination purposes, to aid researchers with communication of their research results. 
The background of the software is described, and case studies from outcrop geology, hyperspectral mineral mapping, and geophysical-geospatial data integration are used to showcase the novel methods developed.
Characterizing and Assessing a Large-Scale Software Maintenance Organization
NASA Technical Reports Server (NTRS)
Briand, Lionel; Melo, Walcelio; Seaman, Carolyn; Basili, Victor
1995-01-01
One important component of a software process is the organizational context in which the process is enacted. This component is often missing or incomplete in current process modeling approaches. One technique for modeling this perspective is the Actor-Dependency (AD) Model. This paper reports on a case study which used this approach to analyze and assess a large software maintenance organization. Our goal was to identify the approach's strengths and weaknesses while providing practical recommendations for improvement and research directions. The AD model was found to be very useful in capturing the important properties of the organizational context of the maintenance process, and aided in the understanding of the flaws found in this process. However, a number of opportunities for extending and improving the AD model were identified. Among others, there is a need to incorporate quantitative information to complement the qualitative model.
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Lewicki, Scott; Morgan, Scott
2011-01-01
The measurement techniques for organizations that have achieved the Software Engineering Institute's CMMI Maturity Levels 4 and 5 are well documented. On the other hand, how to measure effectively when an organization is at Maturity Level 3 is less well understood, especially when there is no consistency in tool use and there is extensive tailoring of the organizational software processes. Most organizations fail in their attempts to generate, collect, and analyze standard process improvement metrics under these conditions. But at JPL, NASA's prime center for deep space robotic exploration, we have a long history of proving there is always a solution: it just may not be what you expected. In this paper we describe the wide variety of qualitative and quantitative techniques we have been implementing over the last few years, including the various approaches used to communicate the results to both software technical managers and senior managers.
NASA Astrophysics Data System (ADS)
Zengin, Yılmaz
2017-11-01
The purpose of this study is to determine the effect of GeoGebra software on pre-service mathematics teachers' attitudes towards proof and proving and to determine pre-service teachers' pre- and post-views regarding proof. The study lasted nine weeks and the participants of the study consisted of 24 pre-service mathematics teachers. The study used the 'Attitude Scale Towards Proof and Proving' and an open-ended questionnaire that were administered before and after the intervention as data collection tools. Paired samples t-test analysis was used for the analysis of quantitative data and content and descriptive analyses were utilized for the analysis of qualitative data. As a result of the data analysis, it was determined that GeoGebra software was an effective tool in increasing pre-service teachers' attitudes towards proof and proving.
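The paired samples t-test applied to the pre/post attitude scores can be sketched in pure Python; the scores below are invented for illustration, not the study's data, and a real analysis would also look up the p-value for n-1 degrees of freedom:

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired samples t statistic: t = mean(d) / (sd(d) / sqrt(n)),
    where d are the per-participant post-minus-pre differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Invented attitude scores for 6 participants before/after the intervention.
pre  = [3.1, 2.8, 3.5, 2.9, 3.3, 3.0]
post = [3.9, 3.4, 3.8, 3.6, 3.7, 3.8]
t = paired_t(pre, post)
print(round(t, 2))
```

Pairing each participant with themselves removes between-subject variability, which is why a paired test (rather than an independent-samples test) fits a pre/post design like this one.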
Software Development as Music Education Research
ERIC Educational Resources Information Center
Brown, Andrew R.
2007-01-01
This paper discusses how software development can be used as a method for music education research. It explains how software development can externalize ideas, stimulate action and reflection, and provide evidence to support the educative value of new software-based experiences. Parallels between the interactive software development process and…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-22
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes for Digital... Software Life Cycle Processes for Digital Computer Software used in Safety Systems of Nuclear Power Plants... clarifications, the enhanced consensus practices for developing software life-cycle processes for digital...
Generic domain models in software engineering
NASA Technical Reports Server (NTRS)
Maiden, Neil
1992-01-01
This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.
A knowledge-based approach to identification and adaptation in dynamical systems control
NASA Technical Reports Server (NTRS)
Glass, B. J.; Wong, C. M.
1988-01-01
Artificial intelligence techniques are applied to the problems of model form and parameter identification of large-scale dynamic systems. The object-oriented knowledge representation is discussed in the context of causal modeling and qualitative reasoning. Structured sets of rules are used for implementing qualitative component simulations, for catching qualitative discrepancies and quantitative bound violations, and for making reconfiguration and control decisions that affect the physical system. These decisions are executed by backward-chaining through a knowledge base of control action tasks. This approach was implemented for two examples: a triple quadrupole mass spectrometer and a two-phase thermal testbed. Results of tests with both of these systems demonstrate that the software replicates some or most of the functionality of a human operator, thereby reducing the need for a human-in-the-loop in the lower levels of control of these complex systems.
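The backward-chaining step described above (proving a control-action goal by recursively proving its premises) can be sketched as follows; the rule and fact names are invented, not drawn from the spectrometer or thermal-testbed knowledge bases:

```python
# Minimal backward-chaining sketch over a rule base of control-action
# tasks. Each goal maps to alternative premise sets; any one set
# that can be fully proven suffices to establish the goal.
RULES = {
    "reconfigure_pump": [["pressure_high", "valve_ok"]],
    "pressure_high":    [["sensor_p_above_limit"]],
    "valve_ok":         [["valve_selftest_passed"]],
}
FACTS = {"sensor_p_above_limit", "valve_selftest_passed"}

def prove(goal, facts=FACTS, rules=RULES):
    """Return True if `goal` is a known fact or derivable by
    backward-chaining through the rules."""
    if goal in facts:
        return True
    return any(all(prove(p, facts, rules) for p in premises)
               for premises in rules.get(goal, []))

print(prove("reconfigure_pump"))  # True
```

A production system would add cycle detection and would execute the proven control action as a side effect; this sketch only shows the goal-to-subgoal chaining itself.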
A Qualitative Study on Organizational Factors Affecting Occupational Accidents
ESKANDARI, Davood; JAFARI, Mohammad Javad; MEHRABI, Yadollah; KIAN, Mostafa Pouya; CHARKHAND, Hossein; MIRGHOTBI, Mostafa
2017-01-01
Background: Technical, human, operational and organizational factors have been influencing the sequence of occupational accidents. Among them, organizational factors play a major role in causing occupational accidents. The aim of this research was to understand the Iranian safety experts’ experiences and perception of organizational factors. Methods: This qualitative study was conducted in 2015 by using the content analysis technique. Data were collected through semi-structured interviews with 17 safety experts working in Iranian universities and industries and analyzed with a conventional qualitative content analysis method using the MAXQDA software. Results: Eleven organizational factors’ sub-themes were identified: management commitment, management participation, employee involvement, communication, blame culture, education and training, job satisfaction, interpersonal relationship, supervision, continuous improvement, and reward system. The participants considered these factors as effective on occupational accidents. Conclusion: The mentioned 11 organizational factors are probably involved in occupational accidents in Iran. Naturally, improving organizational factors can increase the safety performance and reduce occupational accidents. PMID:28435824
Poghosyan, Lusine; Poghosyan, Hermine; Berlin, Kristen; Truzyan, Nune; Danielyan, Lusine; Khourshudyan, Kristine
2012-09-01
The purpose of this qualitative descriptive study was to explore the views of head and staff nurses about nursing practice in the hospitals of Armenia. Armenia inherited its nursing frameworks from the Soviet Union. After the Soviet collapse, many changes took place to reform nursing. However, to date little has been systematically documented about nursing practice in Armenia. A qualitative descriptive design was implemented. Three major hospitals in Yerevan, the capital city of Armenia, participated in the study. Purposeful sampling was used. Forty-three nurses participated: 29 staff nurses and 14 head nurses. Data were collected through five focus groups comprising seven to ten participants each. A focus group guide was developed. The researcher facilitated the discussions in Armenian, which were audio-taped. The research assistant took notes. Data were transcribed and translated into English, imported into ATLAS.ti 6.1 qualitative software, and analysed by three authors. Five themes were extracted. A lack-of-role-clarity theme was identified from the head nurse data. A practice environment theme was identified from the staff nurse data. Nursing education; value, respect and appreciation of nursing; and becoming a nurse were common themes identified from both head and staff nurse data. Head nurses lack autonomy, do not have clear roles, and are burdened with documentation. Staff nurses practice in challenging work environments with inadequate staffing and demanding workloads. All nurses reported the need to improve nursing education. This is the first study conducted in Armenia exploring nursing practice in hospitals from the nurses' perspectives. Nurses face challenges that may impact their wellbeing and patient care. Understanding the challenges nursing practice faces in the hospitals of Armenia will help administrators and care providers take action to improve nursing practice and subsequently patient care. © 2012 Blackwell Publishing Ltd.
Métier de sociologue, approche inductive et objet d'analyse. Brèves remarques à partir de Bourdieu.
Hamel, Jacques
2015-05-01
This article seeks to reveal the role played by the inductive approach in sociology. Grounded Theory assumes its full importance in formulating sociological explanations. However, the theory does pose a problem, in that the "method" is not based on clearly defined operations, which remain implicit. This article attempts to show that the object of analysis (what is being analyzed) makes perceptible the operations implicitly conceived by the analyst, based on Grounded Theory. Qualitative analysis software, such as ATLAS.ti, also makes it possible to shed light on these operations. The article is illustrated by the theory of Pierre Bourdieu and the epistemological considerations he developed as a result of his qualitative inquiry, La Misère du monde. © 2015 Canadian Sociological Association/La Société canadienne de sociologie.
NASA Astrophysics Data System (ADS)
Hwang, L.; Kellogg, L. H.
2017-12-01
Curation of software promotes discoverability and accessibility, and works hand in hand with scholarly citation to ascribe value to, and provide recognition for, software development. To meet this challenge, the Computational Infrastructure for Geodynamics (CIG) maintains a community repository built on custom and open tools to promote discovery, access, identification, credit, and provenance of research software for the geodynamics community. CIG (geodynamics.org) originated from recognition of the tremendous effort required to develop sound software, and of the need to reduce duplication of effort and to sustain community codes. CIG curates software across six domains and has developed, and follows, software best practices that include establishing test cases, documentation, and a citable publication for each software package. CIG software landing web pages provide access to current and past releases; many packages are also accessible through the CIG community repository on GitHub. CIG has now developed abc (attribution builder for citation) to enable software users to give credit to software developers. abc uses Zenodo as an archive and as the mechanism for obtaining a unique identifier (DOI) for scientific software. To assemble the metadata, we searched the software's documentation and research publications and then asked the primary developers to verify the result. In this process, we have learned that each development community approaches software attribution differently. The metadata gathered are based on guidelines established by groups such as FORCE11 and OntoSoft. The rollout of abc is gradual, as developers tend to be forward-looking and are rarely willing to go back and archive prior releases in Zenodo. Going forward, all actively developed packages will use the Zenodo and GitHub integration to automate the archival process when a new release is issued. How to handle legacy software, multi-authored libraries, and the assignment of roles to software remain open issues.
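The metadata-assembly step the abstract describes can be illustrated with a small sketch. Everything here is hypothetical: the field names are modeled loosely on Zenodo-style deposition metadata, and `build_citation`, the package name, and the DOI are invented for illustration, not abc's actual schema or API.

```python
# Hypothetical sketch of assembling software citation metadata before
# archiving a release. Field names loosely follow Zenodo's deposition
# metadata; they are illustrative, not the actual abc schema.

def build_citation(metadata):
    """Format a human-readable citation from a metadata record."""
    authors = "; ".join(metadata["creators"])
    return (f'{authors} ({metadata["year"]}). {metadata["title"]}, '
            f'version {metadata["version"]}. doi:{metadata["doi"]}')

release = {
    "title": "ExampleGeodynamicsCode",   # hypothetical package name
    "version": "2.1.0",
    "year": 2017,
    "creators": ["Developer, A.", "Developer, B."],
    "doi": "10.5281/zenodo.0000000",     # placeholder DOI
}

print(build_citation(release))
```

Once a release is archived, the DOI minted by the archive is the piece that makes the citation stable and resolvable, which is why the metadata record carries it alongside the author list.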
SEI Software Engineering Education Directory.
1987-02-01
Software Design and Development, Gilbert, Philip. Systems: CDC Cyber 170/750, CDC Cyber 170/760, DEC PDP 11/44, PRIME, AT&T 3B5, IBM PC, IBM XT, IBM RT, Macintosh, VAX 8300. Software System Development and Laboratory, CS 480/480L, U P X T. Textbooks: Software Design and Development, Gilbert, Philip. Systems: CDC... Acting Chair, (618) 692-2386. Courses: Software Design and Development, CS 424, U P E Y. Textbooks: Software Design and Development, Gilbert, Philip. Topics
McLoone, J K; Watts, K J; Menzies, S W; Barlow-Stewart, K; Mann, G J; Kasparian, N A
2013-09-01
Providing ongoing clinical care that adequately addresses patients' medical, psychosocial and information needs is challenging, particularly for patient groups at increased risk of developing life-threatening disease such as malignant melanoma. This study examined a model of clinical care developed by the High Risk Clinic (HRC) of the Sydney Melanoma Diagnostic Centre in relation to patient satisfaction. Semi-structured telephone interviews were conducted and analyzed using the framework of Miles and Huberman, and themes were organized using the qualitative software package QSR NVivo 8. Twenty HRC patients participated in the study (9 men, 11 women; mean age 57.6 years, age range 34-74 years; response rate 91%). Satisfaction with clinical care at the HRC was high. Factors contributing to patient satisfaction included: rapid and regular access to physicians who were perceived by participants as experts, the development of confidence and trust in one's treating doctor, and a sense of being cared about and understood by one's healthcare team. Although one-third of the participants reported some inconveniences in attending the clinic, these were viewed as minor difficulties and not significant barriers to care. Formal psychological support was not sought or expected by participants, although many expressed long-standing melanoma-related fears and concerns. Accessible, expert medical attention, delivered in a patient-centered manner, was integral to melanoma survivors' satisfaction with clinical management. Appropriate referrals to psychological support may further increase satisfaction with clinical care. Copyright © 2013 John Wiley & Sons, Ltd.
Advanced software development workstation project: Engineering scripting language. Graphical editor
NASA Technical Reports Server (NTRS)
1992-01-01
Software development is widely considered to be a bottleneck in the development of complex systems, both during initial development and during maintenance of deployed systems. The cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software parts composition system.
Using XML and XSLT for flexible elicitation of mental-health risk knowledge.
Buckingham, C D; Ahmed, A; Adams, A E
2007-03-01
Current tools for assessing risks associated with mental-health problems require assessors to make high-level judgements based on clinical experience. This paper describes how new technologies can enhance qualitative research methods to identify lower-level cues underlying these judgements, which can be collected by people without a specialist mental-health background. Content analysis of interviews with 46 multidisciplinary mental-health experts exposed the cues and their interrelationships, which were represented by a mind map using software that stores maps as XML. All 46 mind maps were integrated into a single XML knowledge structure and analysed by a Lisp program to generate quantitative information about the numbers of experts associated with each part of it. The knowledge was refined by the experts, using software developed in Flash to record their collective views within the XML itself. These views specified how the XML should be transformed by XSLT, a technology for rendering XML, which resulted in a validated hierarchical knowledge structure associating patient cues with risks. Changing knowledge elicitation requirements were accommodated by flexible transformations of XML data using XSLT, which also facilitated generation of multiple data-gathering tools suiting different assessment circumstances and levels of mental-health knowledge.
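The aggregation step described above (a program walking the merged XML knowledge structure to count how many experts are associated with each part of it) can be sketched in Python rather than Lisp. The XML schema, element names, and `expert_counts` function below are invented for illustration; the study's actual knowledge structure is not reproduced here.

```python
# Illustrative sketch (not the authors' actual program or schema):
# given a merged XML knowledge structure in which each cue records the
# experts who mentioned it, derive per-cue expert counts.
import xml.etree.ElementTree as ET

MERGED = """
<knowledge>
  <cue name="social withdrawal">
    <expert id="e1"/><expert id="e2"/><expert id="e3"/>
  </cue>
  <cue name="sleep disturbance">
    <expert id="e2"/>
  </cue>
</knowledge>
"""

def expert_counts(xml_text):
    root = ET.fromstring(xml_text)
    # Count distinct expert ids attached to each cue element.
    return {cue.get("name"): len({e.get("id") for e in cue.findall("expert")})
            for cue in root.findall("cue")}

print(expert_counts(MERGED))
```

Storing the expert associations inside the XML itself is what lets later stages (such as the XSLT renderings the abstract mentions) filter or weight cues by the number of experts endorsing them.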
Fenwick, Eva K; Pesudovs, Konrad; Khadka, Jyoti; Dirani, Mohamed; Rees, Gwyn; Wong, Tien Y; Lamoureux, Ecosse L
2012-12-01
Assessing the efficacy of treatment modalities for diabetic retinopathy (DR) from the patient's perspective is restricted due to the lack of a comprehensive patient-reported outcome measure. We are developing a DR-specific quality of life (QoL) item bank, and we report here on the qualitative results from the first phase of this project. Eight focus groups and 18 semi-structured interviews were conducted with 57 patients with DR. The sessions were transcribed verbatim and iteratively analysed using the constant comparative method and NVivo software. Participants had a median age of 58 years (range 27-83 years). Twenty-seven (47%) participants had proliferative DR in the better eye, and 14 (25%) had clinically significant macular oedema. Nine QoL domains were identified, namely visual symptoms, ocular surface symptoms, vision-related activity limitation, mobility, emotional well-being, health concerns, convenience, social, and economic. Participants described many vision-related activity limitations, particularly under challenging lighting conditions; however, socioemotional issues were equally important. Participants felt frustrated by their visual restrictions, were concerned about further vision loss, and had difficulty coping with this uncertainty. Restrictions on driving were pervasive, affecting transport, social life, relationships, responsibilities, work and independence. Patients with DR experience many socioemotional issues in addition to vision-related activity limitations. Data from this study will inform the development of a DR-specific QoL item bank.
Cresswell, Kathrin M; Lee, Lisa; Mozaffar, Hajar; Williams, Robin; Sheikh, Aziz
2017-10-01
To explore and understand approaches to user engagement through investigating the range of ways in which health care workers and organizations accommodated the introduction of computerized physician order entry (CPOE) and computerized decision support (CDS) for hospital prescribing. Six hospitals in England, United Kingdom. Qualitative case study. We undertook qualitative semi-structured interviews, non-participant observations of meetings and system use, and collected organizational documents over three time periods from six hospitals. Thematic analysis was initially undertaken within individual cases, followed by cross-case comparisons. We conducted 173 interviews and 24 observations and collected 17 documents between 2011 and 2015. We found that perceived individual and safety benefits among different user groups tended to facilitate engagement in some, while other, less engaged groups developed resistance and unsanctioned workarounds if systems were perceived to be inadequate. We identified both the opportunity and the need for sustained engagement across user groups around system enhancement (e.g., through customizing software) and the development of user competencies and effective use. There is an urgent need to move away from an episodic view of engagement focused on the preimplementation phase toward more continuous, holistic attempts to engage with and respond to end-users. © Health Research and Educational Trust.
Sacks, Emma; Bailey, Joanne Motiño; Robles, Chayla; Low, Lisa Kane
2013-01-01
Traditional birth attendants (TBAs) have limited ability to reduce maternal mortality, but may be able to have a significant impact on neonatal survival. This qualitative study explores TBAs' experience with neonatal care in a rural Honduran community. In 6 semistructured focus groups, TBAs described services they routinely provide to newborns. Using ATLAS.ti Version 6.0 (ATLAS.ti Scientific Software Development GmbH, Berlin, Germany), transcripts were coded by bilingual researchers and analyzed by thematic content. TBAs demonstrated limited knowledge of newborn physiology, yet were aware of many internationally recommended practices. Despite attempts to follow recommendations, all TBAs expressed difficulty due to resource constraints. TBAs were strong advocates of immediate breast-feeding and skin-to-skin care, but they did not demonstrate knowledge regarding delayed bathing and thermal care. Most TBAs stated that a sick neonate could be identified immediately at birth; thus, infections or other illnesses that develop in later days may be missed. TBAs did not believe they could have averted neonatal complications or deaths that had occurred under their care. For most healthy newborns, TBAs are the primary providers until the 2-month vaccine visit at the healthcare clinic. Improved TBA training focused on infection symptomatology, physiology, and thermoregulation for newborns may increase opportunities for improved health and timely referrals to healthcare facilities.
Bianchi, Monica; Bagnasco, Annamaria; Aleo, Giuseppe; Catania, Gianluca; Zanini, Milko Patrick; Timmins, Fiona; Carnevale, Franco; Sasso, Loredana
2018-05-01
This article presents a qualitative research protocol to explore and understand the interprofessional collaboration (IPC) preparation process implemented by clinical tutors and students of different professions involved in interprofessional education (IPE). Many studies have shown that IPE initiatives improve students' understanding of the roles and responsibilities of other professionals. This improves students' attitudes towards other professions, facilitating mutual respect and IPC. However, there is limited information about how students are prepared to work collaboratively within interprofessional teams. This is a constructivist grounded theory (GT) study, which will involve data collection through in-depth semi-structured interviews (with 9-15 students and 6-9 clinical tutors), participant observations, and the analysis of documentation. After the data are analysed, coded, integrated, and compared, a second round of interviews may be conducted if necessary to explore any particularly interesting aspects or clarify any issues. This will then be followed by focused and theoretical coding. Qualitative data analysis will be conducted with the support of NVivo 10 software (QSR International, Victoria, Australia). A better conceptual understanding will help determine whether IPE experiences have contributed to the acquisition of competencies considered important for IPC, and whether they have facilitated the development of teamwork attitudes.
Designing a Medical Tourism Website: A Qualitative Study.
Samadbeik, Mahnaz; Asadi, Heshmatollah; Mohseni, Mohammad; Takbiri, Afsaneh; Moosavi, Ahmad; Garavand, Ali
2017-02-01
Informing plays a prominent role in attracting medical tourists, and the availability of proper medical information systems is one of the most important tools for attracting them. Iran's ability in designing and implementing information networks has remained largely unknown. The current study aimed to explore the information needs for designing a medical tourism website. This qualitative study was conducted in 2015 to design a Hospital Medical-Tourism Website (HMTW). A purposive sampling method was used and data were gathered using a semi-structured questionnaire. In total, 12 faculty members and experts in the field of medical tourism were interviewed. Data were analyzed using MAXQDA 10 software. In total, 41 sub-themes and 10 themes were identified. The themes included the introduction of the hospital, a general guide for patients, tourism information, information on the hospital's physicians, costs, treatment follow-up, online hospital appointment scheduling on the website, statistics and news of hospital medical tourism, a photo gallery, and contacts. Among the themes, participants most strongly emphasized costs (100%), tourism information (91.6%), information on the hospital's physicians (83.3%), and treatment follow-up (83.3%). This profitable industry can be developed by taking these information requirements into account when building a hospital medical tourism website. PMID:28451562
Jorbozeh, Hamideh; Dehdari, Tahereh; Ashoorkhani, Mahnaz; Taghdisi, Mohammad Hossein
2014-01-01
Background: Empowerment of children and adolescents in terms of social skills is critical for promoting their social health. Objectives: This study attempts to explore a framework of factors influencing the empowerment of primary school students by means of peer mediation from the stakeholders' point of view, using a qualitative content analysis design. Patients and Methods: This study was a qualitative content analysis (conventional method). Seven focus group discussions and six in-depth interviews were conducted with schoolchildren, parents and education authorities. Following each interview, recordings were entered into the OpenCode software and analyzed. Data collection continued until data saturation. Results: Within the provided framework, the participants' views and comments were classified into two major categories, "educational empowerment" and "social empowerment", and into two themes, "program" and "advocacy". The "program" theme included factors such as design and implementation, development, maintenance and improvement, and individual and social impact. The "advocacy" theme included factors such as social, emotional and physical support. Conclusions: The framework components explained here regarding peer mediation are useful for designing peace education programs and for empowering school-age children in peer mediation. PMID:25763191
Alameddine, Mohamad; Yassoub, Rami; Mourad, Yara; Khodr, Hiba
2017-01-01
This study explores the recruitment and retention conditions influencing primary health care (PHC) human resources for health (HRH) in Qatar and suggests strategies for their improvement. A qualitative design employing semistructured key informant interviews with PHC stakeholders in Qatar was utilized. Key interviewees were initially identified, and snowball sampling was used to recruit additional interviewees until the saturation point was reached. Interview scripts were transcribed and then analyzed thematically using the NVivo software package. Thematic analysis yielded a number of themes. Under recruitment, these included the centrality of enhancing collaboration with academic institutions, enhancing extrinsic benefits, and strengthening human resources recruitment and management practices. Dedicated support needs to be provided to expatriate HRH, especially with regard to housing services, children's schooling, and streamlined administrative processes for relocation. Findings revealed that job security, continuous professional development, objective performance appraisal systems, enhanced job transparency, and remuneration are key retention concerns. The study provides a number of recommendations for the proper recruitment and retention of HRH. Health planners and decision makers must take these recommendations into consideration to ensure the presence of a competent and sustainable HRH in the PHC sector in the future. PMID:28853314
Key Health Information Technologies and Related Issues for Iran: A Qualitative Study
Hemmat, Morteza; Ayatollahi, Haleh; Maleki, Mohammadreza; Saghafi, Fatemeh
2018-01-01
Background and Objective: Planning for the future of Health Information Technology (HIT) requires applying a systematic approach when conducting foresight studies. The aim of this study was to identify key health information technologies and related issues for Iran until 2025. Methods: This was a qualitative study and the participants included experts and policy makers in the field of health information technology. In-depth semi-structured interviews were conducted and data were analyzed by using framework analysis and MAXQDA software. Results: The findings revealed that the development of a national health information network, electronic health records, patient health records, a cloud-based service center, interoperability standards, patient monitoring technologies, telehealth, mhealth, clinical decision support systems, and health information technology and mhealth infrastructure were found to be the key technologies for the future. These technologies could have influence at the economic, organizational and individual levels. To achieve them, the economic and organizational obstacles need to be overcome. Conclusion: In this study, a number of key technologies and related issues were identified. This approach can help to focus on the most important technologies in the future and to prioritize these technologies for better resource allocation and policy making. PMID:29854016
A Prototype for the Support of Integrated Software Process Development and Improvement
NASA Astrophysics Data System (ADS)
Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian
An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process can result in efficiency. This paper therefore proposes a software process maintenance framework which consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.
Four applications of a software data collection and analysis methodology
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.
1985-01-01
The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.
NASA Technical Reports Server (NTRS)
Lindsey, Tony; Pecheur, Charles
2004-01-01
Livingstone PathFinder (LPF) is a simulation-based computer program for verifying autonomous diagnostic software. LPF is designed especially to be applied to NASA's Livingstone computer program, which implements a qualitative-model-based algorithm that diagnoses faults in a complex automated system (e.g., an exploratory robot, spacecraft, or aircraft). LPF forms a software test bed containing a Livingstone diagnosis engine, embedded in a simulated operating environment consisting of a simulator of the system to be diagnosed by Livingstone and a driver program that issues commands and faults according to a nondeterministic scenario provided by the user. LPF runs the test bed through all executions allowed by the scenario, checking for various selectable error conditions after each step. All components of the test bed are instrumented, so that execution can be single-stepped both backward and forward. The architecture of LPF is modular and includes generic interfaces to facilitate substitution of alternative versions of its different parts. Altogether, LPF provides a flexible, extensible framework for simulation-based analysis of diagnostic software; these characteristics also render it amenable to application to diagnostic programs other than Livingstone.
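The exploration strategy the abstract describes (running the engine through every execution a nondeterministic scenario allows, checking error conditions after each step) amounts to a depth-first walk of a branching event tree. The sketch below is a toy illustration of that idea under invented names; it is not LPF's actual API, and the "simulator" here is just a set of accumulated events.

```python
# A minimal sketch of scenario-driven exhaustive exploration: walk
# every execution a branching scenario allows, checking an error
# condition after each step. All names are invented for illustration.

def explore(state, scenario, check, trace=()):
    """Depth-first walk of all executions allowed by the scenario.

    `scenario` maps a state to the list of events (commands or injected
    faults) permitted next; `check` flags error states. Returns every
    complete trace paired with whether an error was observed on it.
    """
    events = scenario(state)
    if not events:                      # scenario exhausted: one execution
        return [(trace, False)]
    results = []
    for event in events:                # branch on each allowed event
        nxt = state | {event}           # toy "simulator": accumulate events
        if check(nxt):
            results.append((trace + (event,), True))
        else:
            results.extend(explore(nxt, scenario, check, trace + (event,)))
    return results

# Toy scenario: inject one of two faults, then issue a command.
def scenario(state):
    if not state:
        return ["fault-A", "fault-B"]
    if "cmd" not in state:
        return ["cmd"]
    return []

runs = explore(frozenset(), scenario,
               check=lambda s: "fault-B" in s and "cmd" in s)
print(len(runs))  # number of distinct executions explored
```

Because the recursion keeps the full trace for each branch, stepping "backward" is simply a matter of truncating a trace and replaying, which is the essence of the single-step instrumentation the abstract mentions.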
Software Safety Progress in NASA
NASA Technical Reports Server (NTRS)
Radley, Charles F.
1995-01-01
NASA has developed guidelines for the development and analysis of safety-critical software. These guidelines have been documented in a Guidebook for Safety Critical Software Development and Analysis. The guidelines represent a practical 'how to' approach, to assist software developers and safety analysts with cost-effective methods for software safety. They provide guidance for implementing the recent NASA Software Safety Standard NSS-1740.13, which was released as an 'Interim' version in June 1994 and is scheduled for formal adoption in late 1995. This paper is a survey of the methods in general use, resulting in the NASA guidelines for safety-critical software development and analysis.
pyam: Python Implementation of YaM
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan
2012-01-01
pyam is a software development framework with tools for facilitating the rapid development of software in a concurrent software development environment. pyam provides solutions for development challenges associated with software reuse, managing multiple software configurations, developing software product lines, and multiple-platform development and build management. pyam uses release-early, release-often development cycles to allow developers to integrate their changes incrementally into the system on a continual basis. It facilitates the creation and merging of branches to support the isolated development of immature software, to avoid impacting the stability of the development effort. It uses modules and packages to organize and share software across multiple software products, and uses the concepts of link and work modules to reduce sandbox setup times even when the code base is large. One side benefit is the enforcement of strong module-level encapsulation of a module's functionality and interface. This increases design transparency, system stability, and software reuse. pyam is written in Python and is organized as a set of utilities on top of the open-source SVN version control system. All development software is organized into a collection of modules. pyam packages are defined as sub-collections of the available modules. Developers can set up private sandboxes for module/package development. All module/package development takes place on private SVN branches. High-level pyam commands support the setup, update, and release of modules and packages. Released and pre-built versions of modules are available to developers. Developers can tailor the source/link module mix for their sandboxes so that new sandboxes (even large ones) can be built up easily and quickly by pointing to pre-existing module releases. All inter-module interfaces are publicly exported via links. A minimal, but uniform, convention is used for building modules.
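The link/work module distinction can be illustrated with a minimal sketch: a sandbox checks out source only for the modules under active development and points everything else at pre-built releases. The module names and the `plan_sandbox` helper below are invented for illustration and are not pyam's actual interface.

```python
# Hypothetical illustration of the link/work module idea: a sandbox
# mixes "work" modules (source checkouts for local editing) with
# "link" modules (pointers to pre-built releases), so sandbox setup
# stays fast even when the code base is large.

def plan_sandbox(all_modules, work):
    """Return, for each module, whether the sandbox gets source or a link."""
    plan = {}
    for name, latest_release in all_modules.items():
        if name in work:
            plan[name] = "source checkout (private branch)"
        else:
            plan[name] = f"link -> release {latest_release}"
    return plan

# Invented module names and release tags.
modules = {"Dynamics": "r42", "Models": "r17", "Viz": "r8"}
print(plan_sandbox(modules, work={"Models"}))
```

The payoff of this arrangement is that only the one module being edited needs a full checkout and build; the rest resolve instantly to released artifacts, which is how large sandboxes can be assembled quickly.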
Clinical software development for the Web: lessons learned from the BOADICEA project
2012-01-01
Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. 
BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389
Clinical software development for the Web: lessons learned from the BOADICEA project.
Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F
2012-04-10
Rural patients’ experiences accessing surgery in British Columbia
Humber, Nancy; Dickinson, Paul
2010-01-01
Background More than 33% of Canadians live in rural areas. The vulnerability of rural surgical patients makes them particularly sensitive to barriers to accessing health care. This study aims to describe rural patients’ experiences accessing local non-specialist, family physician–surgeon care and regional specialist surgical care when no local surgical care was available. Methods We conducted a qualitative pilot study of self-selected patients. Interviews were analyzed using a modified Delphi technique and NVivo qualitative software. Results The needs of rural surgical patients were reflective of Maslow’s hierarchy of needs: physiologic, safety and security, community belonging and self-esteem/self-actualization. Rural patients expressed a strong desire for individualized care in a familiar environment. When such care was not available, patients found it difficult to meet even basic physiologic needs. Maternity patients and marginalized populations were particularly vulnerable. Conclusion Rural patients seem to prefer individualized care in a familiar environment to address more of their qualitative emotional, psychological and cultural needs rather than only the physiologic needs of surgery. Larger studies are needed to delineate more clearly the qualitative aspects of surgical care. PMID:21092429
Barolia, Rubina Iqbal; Clark, Alexander M; Higginbottom, Gina M A
2013-01-01
Introduction There is a misconception that cardiovascular disease (CVD) is the burden of wealthy nations, but, in fact, it is the leading cause of death and of disability-adjusted life years lost worldwide. Healthy diets are an essential factor in the prevention of CVD. However, promoting a healthy diet is challenging, particularly for people with low socioeconomic status (SES), because poverty is linked with many risk behaviours such as smoking, unhealthy eating and obesity. Multiple factors, cultural values and beliefs interact and make healthy eating very challenging. The effects of these factors in the context of low-SES populations with CVD are largely unknown. To address this gap, this study will examine the factors that affect decisions about consuming a healthy diet in Pakistanis with low SES who suffer from CVD. Methods and analysis A qualitative method of interpretive description will be used. Twenty-five participants will be selected from two cardiac rehabilitation (CR) centres in Karachi, Pakistan. Face-to-face interviews using a critical realist framework will be used to understand individual and contextual factors in the food choices of people with low SES and CVD. ATLAS.ti qualitative data analysis software will be used to identify themes and patterns in the interview data. Ethics and discussion Ethical approvals were received from the Ethics Review Board of the University of Alberta, Canada, and Aga Khan University, Karachi, Pakistan. The findings will generate new knowledge about which factors influence the food choices of Pakistanis with CVD and low SES, and how, to provide insight into the development of an operational framework for designing interventions for the prevention of CVD. For knowledge-translation purposes, we will publish the findings in highly accessed, peer-reviewed scientific and health policy journals at the national and international levels.
This research protocol received an IDRC (International Development Research Centre) doctoral award from the International Development Research Centre, Ottawa, Canada. PMID:24309173
Luckett, Tim; Davidson, Patricia M; Green, Anna; Boyle, Frances; Stubbs, John; Lovell, Melanie
2013-08-01
Cancer pain is a common, burdensome problem, which is not well managed despite evidence-based guidelines. To develop insights for managing barriers and optimizing facilitators to adult cancer pain assessment and management within a comprehensive framework of patient care. We undertook a systematic review and synthesis of qualitative studies. Medline, PsycINFO, Embase, AMED, CINAHL, and Sociological Abstracts were searched from May 20 to 26, 2011. To be included, the articles had to be published in a peer-reviewed journal since 2000; written in English; and report original qualitative studies on the perspectives of patients, their significant others, or health care providers. Article quality was rated using the checklist of Kitto et al. Thematic synthesis followed a three-stage approach using Evidence for Policy and Practice Information and Co-ordinating Centre-Reviewer 4 software: 1) free line-by-line coding of "Results," 2) organization into "descriptive" themes, and 3) development of "analytical" themes informative to our objective. At Stage 3, a conceptual framework was selected from the peer-reviewed literature according to prima facie "fit" for descriptive themes. Of 659 articles screened, 70 met the criteria, reporting 65 studies with 48 patient, 19 caregiver, and 21 health care provider samples. Authors rarely reported reflexivity or negative cases. Mead and Bower's model of patient-centered care accommodated 85% of the descriptive themes; 12% more related to the caregiver and service/system factors. Three themes could not be accommodated. Findings highlight the need to integrate patient/family education within improved communication, individualize care, use more nonpharmacological strategies, empower patients/families to self-manage pain, and reorganize multidisciplinary roles around patient-centered care and outcomes. These conclusions require validation via consensus and intervention trials. Copyright © 2013 U.S. Cancer Pain Relief Committee. 
Published by Elsevier Inc. All rights reserved.
Strategies for a Creative Future with Computer Science, Quality Design and Communicability
NASA Astrophysics Data System (ADS)
Cipolla Ficarra, Francisco V.; Villarreal, Maria
In the current work, the importance of the two-way triad between computer science, design and communicability is presented. It is demonstrated that the quality principles of software engineering are not universal, since they are disappearing from university training. Besides, a short analysis of the term "creativity" makes apparent the existence of plagiarism as a human factor that damages the future of communicability applied to the on-line and off-line contents of open software. A set of measures and guidelines is presented so that the triad works correctly again in the coming years, fostering the qualitative design of interactive systems on-line and/or off-line.
NecroQuant: quantitative assessment of radiological necrosis
NASA Astrophysics Data System (ADS)
Hwang, Darryl H.; Mohamed, Passant; Varghese, Bino A.; Cen, Steven Y.; Duddalwar, Vinay
2017-11-01
Clinicians can now objectively quantify tumor necrosis by Hounsfield units and enhancement characteristics from multiphase contrast-enhanced CT imaging. NecroQuant has been designed to work as part of a radiomics pipeline. The software is a departure from the conventional qualitative assessment of tumor necrosis, as it provides the user (radiologists and researchers) with a simple interface to precisely and interactively define and measure necrosis in contrast-enhanced CT images. Although the software is tested here on renal masses, it can be reconfigured to assess tumor necrosis across a variety of tumors from different body sites, providing a generalized, open, portable, and extensible quantitative analysis platform that is widely applicable across cancer types to quantify tumor necrosis.
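The quantification described rests on counting region-of-interest voxels whose attenuation falls inside a necrosis window of Hounsfield units. A minimal sketch of that idea, with a hypothetical function name and illustrative thresholds (not NecroQuant's actual interface or defaults):

```python
import numpy as np

def necrosis_fraction(hu_volume, roi_mask, hu_low=-10.0, hu_high=30.0):
    """Fraction of tumor-ROI voxels whose Hounsfield units fall in a
    user-defined necrosis window. Thresholds here are illustrative only."""
    roi = hu_volume[roi_mask]  # flatten the masked voxels
    necrotic = (roi >= hu_low) & (roi <= hu_high)
    return necrotic.sum() / roi.size

# Synthetic CT volume: enhancing tumor (~120 HU) around a necrotic core (~15 HU)
vol = np.full((20, 20, 20), 120.0)
vol[5:15, 5:15, 5:15] = 15.0
mask = np.ones_like(vol, dtype=bool)
print(round(float(necrosis_fraction(vol, mask)), 3))  # 0.125 (1000 of 8000 voxels)
```

The same windowing generalizes to multiphase studies by comparing fractions across contrast phases.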
Bahramian, Hoda; Mohebbi, Simin Z; Khami, Mohammad Reza; Quinonez, Rocio Beatriz
2018-05-10
Pregnant women are vulnerable to a wide range of oral health conditions that could be harmful to their own health and to their future child. Despite the usefulness of regular dental service utilization in the prevention and early detection of oral diseases, it is notably low among pregnant women. In this qualitative study, we aimed to explore barriers and facilitators influencing pregnant women's dental service utilization. Using a triangulation approach, we included pregnant women (n = 22) from two public health centers, and midwives (n = 8) and dentists (n = 12) from 12 other public centers in Tehran (Iran). Data were gathered through face-to-face semi-structured interviews and focus group discussions. The analysis of qualitative data was performed using conventional content analysis with MAXQDA10 software. Reported barriers to dental service utilization among pregnant women were categorized under emerging themes: lack of knowledge and misbelief, cost of dental care, physiological changes, fear and other psychological conditions, time constraints, dentists' unwillingness to treat pregnant women, cultural taboos and lack of interprofessional collaboration. Solutions proposed by dentists, midwives and pregnant women to improve dental care utilization during pregnancy were categorized under three themes: provision of knowledge, financial support and establishing supportive policies. Understanding perceived barriers to dental service utilization during pregnancy can serve as baseline information for planning and formulating appropriate oral health education, financial support, and legislation tailored to lower-income pregnant women, midwives and dentists in countries with developing oral health care systems.
Family diabetes matters: a view from the other side.
Samuel-Hodge, Carmen D; Cene, Crystal W; Corsino, Leonor; Thomas, Chelsea; Svetkey, Laura P
2013-03-01
Typically, chronic disease self-management happens in a family context, and for African American adults living with diabetes, family seems to matter in self-management processes. Many qualitative studies describe family diabetes interactions from the perspective of adults living with diabetes, but we have not heard from family members. To explore patient and family perspectives on family interactions around diabetes. Qualitative study using focus group methodology. PARTICIPANTS & APPROACH: We conducted eight audiotaped focus groups among African Americans (four with patients with diabetes and four with family members not diagnosed with diabetes), with a focus on topics of family communication, conflict, and support. The digital files were transcribed verbatim, coded, and analyzed using qualitative data analysis software. Directed content analysis and grounded theory approaches guided the interpretation of code summaries. Focus groups included 67 participants (81 % female, mean age 64 years). Family members primarily included spouses, siblings, and adult children/grandchildren. For patients with diabetes, central issues included shifting family roles to accommodate diabetes and conflicts stemming from family advice-giving. Family members described discomfort with the perceived need to police or "stand over" the diabetic family member, not wanting to "throw diabetes in their [relative's] face," perceiving their communications as unhelpful, and confusion about their role in diabetes care. These concepts generated an emergent theme of "family diabetes silence." Diabetes silence, role adjustments, and conflict appear to be important aspects to address in family-centered diabetes self-management interventions. Contextual data gathered through formative research can inform such family-centered intervention development.
A qualitative study of user perceptions of mobile health apps.
Peng, Wei; Kanthawala, Shaheen; Yuan, Shupei; Hussain, Syed Ali
2016-11-14
Mobile apps for health exist in large numbers today, but oftentimes, consumers do not continue to use them after a brief period of initial usage, are averse to using them at all, or are unaware that such apps even exist. The purpose of our study was to examine and qualitatively determine the design and content elements of health apps that facilitate or impede usage from the users' perspective. In 2014, six focus groups and five individual interviews were conducted in the Midwest region of the U.S. with 44 smartphone owners of various socioeconomic statuses. The participants were asked about their general and health-specific mobile app usage. They were then shown specific features of exemplar health apps and prompted to discuss their perceptions. The focus groups and interviews were audio recorded, transcribed verbatim, and coded using the software NVivo. Inductive thematic analysis was adopted to analyze the data, and nine themes were identified: 1) barriers to adoption of health apps, 2) barriers to continued use of health apps, 3) motivators, 4) information and personalized guidance, 5) tracking for awareness and progress, 6) credibility, 7) goal setting, 8) reminders, and 9) sharing personal information. The themes were mapped to theories for interpretation of the results. This qualitative research with a diverse pool of participants extended previous research on the challenges and opportunities of health apps. The findings provide researchers, app designers, and health care providers with insights on how to develop and evaluate health apps from the users' perspective.
Perception of masculinity amongst young Malaysian men: a qualitative study of university students
2013-01-01
Background Perception of masculinity plays an important role in men's lifestyles and health behaviors. Although the importance of masculinity has been widely discussed in men's health literature, very little is known about the meanings of masculinity in the Malaysian setting. This research aimed to explore the meanings of masculinity among Malaysian university men. Methods This qualitative study utilized in-depth interviews with 34 young Malaysian university men, aged 20–30 years, from the three main ethnic groups in Malaysia (Malay, Chinese and Indian). A thematic analysis approach was used to analyze the data. NVivo v8 qualitative software was used for data management. Results From the data collected, several concepts emerged that reflected the meanings of masculinity from the participants' viewpoints. These meanings were associated with a combination of traditional and non-traditional norms that generally benefit men who behave according to culturally dominant role expectations. These included: "having a good body shape", "being respected", "having success with women", "being a family man", and "having financial independence". Socio-cultural factors, such as family environment, religion, public media and popular lifestyle patterns, helped to shape and reinforce the meanings of masculinities among university men. Conclusions This study revealed that the university context provided a particular culture for the construction and reinforcement of the meanings of masculinities, which should be considered by educators to help in the development of healthy masculinities. PMID:24215138
Breaking Bad News in Oncology: A Metasynthesis.
Bousquet, Guilhem; Orri, Massimiliano; Winterman, Sabine; Brugière, Charlotte; Verneuil, Laurence; Revah-Levy, Anne
2015-08-01
The delivery of bad news by oncologists to their patients is a key moment in the physician-patient relationship. We performed a systematic review of qualitative studies (a metasynthesis) that focused on the experiences and points of view of oncologists about breaking bad news to patients. We searched international publications to identify relevant qualitative research exploring oncologists' perspectives about this topic. Thematic analysis, which compensates for the potential lack of generalizability of the primary studies by their conjoint interpretation, was used to identify key themes and synthesize them. NVivo qualitative analysis software was used. We identified 40 articles (> 600 oncologists) from 12 countries and assessed their quality as good according to the Critical Appraisal Skills Programme (CASP). Two main themes emerged: the patient-oncologist encounter during the breaking of bad news, comprising essential aspects of the communication, including the process of dealing with emotions; and external factors shaping the patient-oncologist encounter, composed of factors that influence the announcement beyond the physician-patient relationship: the family, systemic and institutional factors, and cultural factors. Breaking bad news is a balancing act that requires oncologists to adapt continually to different factors: their individual relationships with the patient, the patient's family, the institutional and systemic environment, and the cultural milieu. Extending the development of the ability to personalize and adapt therapeutic treatment to this realm of communications would be a major step forward from the stereotyped way that oncologists are currently trained in communication skills. © 2015 by American Society of Clinical Oncology.
ERIC Educational Resources Information Center
Lanting, Ashley
This study used a qualitative research methodology to examine how four primary teachers used a district literacy performance assessment. Data were collected through observations, interviews, and documents. Grounded theory and NUD*IST software were used for text analysis and theory building. Findings show that a theory-grounded teacher-empowered…
ERIC Educational Resources Information Center
Jocius, Robin
2013-01-01
This qualitative study explores how adolescent high school students in an AP English class used multiple forms of media (the internet, digital video, slide show software, video editing tools, literary texts, and writing) to respond to and analyze a contemporary novel, "The Kite Runner". Using a multimodal analysis framework, the author explores…
ERIC Educational Resources Information Center
Vasquez-Colina, Maria D.; Maslin-Ostrowski, Pat; Baba, Suria
2017-01-01
This case study used qualitative and quantitative methods to investigate challenges of learning and teaching research methods by examining graduate students' use of collaborative technology (i.e., digital tools that enable collaboration and information seeking such as software and social media) and students' computer self-efficacy. We conducted…
Transfer of training through a science education professional development program
NASA Astrophysics Data System (ADS)
Sowards, Alan Bosworth
Educational research substantiates that effective professional development models must be developed in order for reform-based teaching strategies to be implemented in classrooms. This study examined the effectiveness of an established reform-based science education professional development program, Project LIFE. The study investigated what impact Project LIFE had on participants' implementation of reform-based instruction in their classrooms three years after participation in the science inservice program. Participants in the case studies described their use of reform-based instruction and the program factors that influenced transfer of training to their classrooms. Subjects of the study were 5th--10th grade teachers who participated in the 1997--98 Project LIFE professional development program. The study employed a mixed design including both qualitative and quantitative methodology. The qualitative data were collected from multiple sources, which included an open-ended survey, classroom observations, structured interviews, and artifacts. Three teachers were purposefully selected for case studies, with teacher approval and authorization from building principals. Interview responses from the three case studies were further analyzed qualitatively using the microcomputer software NUD*IST. Tables and figures generated from NUD*IST graphically represented the case study teachers' responses and case comparisons across six established categories: (1) continued implementation of reform-based instruction, (2) use of reform-based instruction, (3) program factors supporting transfer of training, (4) professional development, (5) goals of Project LIFE, and (6) critical issues in science education. Paired t-tests were used to analyze the quantitative data collected from the Survey of Attitudes Toward Science and Science Teaching. The study concluded that the 1997--98 Project LIFE participants continued to implement reform-based instruction in their classrooms three years later.
According to the teachers, the program factors having the most influence on transferring training to their classrooms were the positive responses from students; reflections with other teachers regarding instructional activities and strategies; the modeling of activities and strategies they received from Project LIFE staff while participating in the program; and the teachers' commitment to reform-based instruction. These findings are important for advancing national science reform goals. In order for teachers to be able to implement science-reform-based instruction in their classrooms, they must experience effective professional development models. Designers of professional development programs must understand which factors in staff development programs contribute most to transfer of training.
Al-Hussaini, Ali; Tomkinson, Alun
2016-01-01
Undergraduate otolaryngology exposure is limited. It may be consolidated by the use of an iBook as a self-study tool. Following invitation to participate by email, five focus groups were formed, each consisting of six medical students (18 female, 12 male, median age 23 years). The focus group transcripts were imported to the qualitative data analysis software NVivo (QSR International, UK). The iBook was found to have a clear and consistent presentation, and a focused and user-friendly style, with reasonable interactivity and a good range of well-integrated media elements. It was, overall, perceived to be a valuable educational resource by the medical students.
User systems guidelines for software projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abrahamson, L.
1986-04-01
This manual presents guidelines for software standards which were developed so that software project-development teams and management involved in approving the software could have a generalized view of all phases in the software production procedure and the steps involved in completing each phase. Guidelines are presented for six phases of software development: project definition, building a user interface, designing software, writing code, testing code, and preparing software documentation. The discussions for each phase include examples illustrating the recommended guidelines. 45 refs.
Three-dimensional micro-scale strain mapping in living biological soft tissues.
Moo, Eng Kuan; Sibole, Scott C; Han, Sang Kuy; Herzog, Walter
2018-04-01
Non-invasive characterization of the mechanical micro-environment surrounding cells in biological tissues at multiple length scales is important for the understanding of the role of mechanics in regulating the biosynthesis and phenotype of cells. However, there is a lack of imaging methods that allow for characterization of the cell micro-environment in three-dimensional (3D) space. The aims of this study were (i) to develop a multi-photon laser microscopy protocol capable of imprinting 3D grid lines onto living tissue at a high spatial resolution, and (ii) to develop image processing software capable of analyzing the resulting microscopic images and performing high resolution 3D strain analyses. Using articular cartilage as the biological tissue of interest, we present a novel two-photon excitation imaging technique for measuring the internal 3D kinematics in intact cartilage at sub-micrometer resolution, spanning length scales from the tissue to the cell level. Using custom image processing software, we provide accurate and robust 3D micro-strain analysis that allows for detailed qualitative and quantitative assessment of the 3D tissue kinematics. This novel technique preserves tissue structural integrity post-scanning, therefore allowing for multiple strain measurements at different time points in the same specimen. The proposed technique is versatile and opens doors for experimental and theoretical investigations on the relationship between tissue deformation and cell biosynthesis. Studies of this nature may enhance our understanding of the mechanisms underlying cell mechano-transduction, and thus, adaptation and degeneration of soft connective tissues. We presented a novel two-photon excitation imaging technique for measuring the internal 3D kinematics in intact cartilage at sub-micrometer resolution, spanning from tissue length scale to cellular length scale. 
Using custom image processing software (lsmgridtrack), we provide accurate and robust micro-strain analysis that allows for detailed qualitative and quantitative assessment of the 3D tissue kinematics. The approach presented here can also be applied to other biological tissues, such as the meniscus and annulus fibrosus, as well as to tissue-engineered constructs, for the characterization of their mechanical properties. This imaging technique opens doors for experimental and theoretical investigations of the relationship between tissue deformation and cell biosynthesis. Studies of this nature may enhance our understanding of the mechanisms underlying cell mechano-transduction, and thus, adaptation and degeneration of soft connective tissues. Copyright © 2018 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
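The grid-based kinematic analysis described above ultimately reduces, at each grid cell, to evaluating a strain measure from an estimated deformation gradient. A minimal sketch of that final step under standard continuum-mechanics definitions (this is not the lsmgridtrack implementation, which the abstract does not show):

```python
import numpy as np

def green_lagrange_strain(F):
    """Green-Lagrange strain tensor E = 0.5 * (F^T F - I) from a 3x3
    deformation gradient F, e.g. one fitted to tracked grid displacements."""
    F = np.asarray(F, dtype=float)
    return 0.5 * (F.T @ F - np.eye(3))

# 5% uniaxial stretch along z, as photobleached grid planes might record
F = np.diag([1.0, 1.0, 1.05])
E = green_lagrange_strain(F)
print(round(float(E[2, 2]), 5))  # 0.05125 = 0.5 * (1.05**2 - 1)
```

In practice F would be estimated per grid cell by differentiating the displacement field of the tracked grid intersections.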
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valerio, Luis G.; Arvidson, Kirk B.; Chanderbhan, Ronald F.
2007-07-01
Consistent with the U.S. Food and Drug Administration (FDA) Critical Path Initiative, predictive toxicology software programs employing quantitative structure-activity relationship (QSAR) models are currently under evaluation for regulatory risk assessment and scientific decision support for highly sensitive endpoints such as carcinogenicity, mutagenicity and reproductive toxicity. At the FDA's Center for Food Safety and Applied Nutrition's Office of Food Additive Safety and the Center for Drug Evaluation and Research's Informatics and Computational Safety Analysis Staff (ICSAS), the use of computational SAR tools for both qualitative and quantitative risk assessment applications is being developed and evaluated. One tool of current interest is MDL-QSAR predictive discriminant analysis modeling of rodent carcinogenicity, which has been previously evaluated for pharmaceutical applications by the FDA ICSAS. The study described in this paper aims to evaluate the utility of this software to estimate the carcinogenic potential of small, organic, naturally occurring chemicals found in the human diet. In addition, a group of 19 known synthetic dietary constituents that were positive in rodent carcinogenicity studies served as a control group. In the test group of naturally occurring chemicals, 101 were found to be suitable for predictive modeling using this software's discriminant analysis modeling approach. Predictions performed on these compounds were compared to published experimental evidence of each compound's carcinogenic potential. Experimental evidence included relevant toxicological studies such as rodent cancer bioassays, rodent anti-carcinogenicity studies, genotoxic studies, and the presence of chemical structural alerts. Statistical indices of predictive performance were calculated to assess the utility of the predictive modeling method.
Results revealed good predictive performance using this software's rodent carcinogenicity module, whose training set of over 1200 chemicals, comprised primarily of pharmaceutical, industrial and some natural products, was developed under an FDA-MDL cooperative research and development agreement (CRADA). The predictive performance for this group of dietary natural products and the control group was 97% sensitivity and 80% concordance. Specificity was marginal at 53%. This study finds that the in silico QSAR analysis employing this software's rodent carcinogenicity database is capable of identifying the rodent carcinogenic potential of naturally occurring organic molecules found in the human diet with a high degree of sensitivity. It is the first study to demonstrate successful QSAR predictive modeling of naturally occurring carcinogens found in the human diet using an external validation test. Further test validation of this software and expansion of the training data set for dietary chemicals will help to support the future use of such QSAR methods for screening and prioritizing the risk of dietary chemicals when actual animal data are inadequate, equivocal, or absent.
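The reported indices follow the standard confusion-matrix definitions. A minimal sketch with illustrative counts (the paper reports only the aggregate percentages, not the underlying matrix, so the numbers below are made up to match sensitivity and specificity only):

```python
def performance_indices(tp, fn, tn, fp):
    """Sensitivity, specificity and concordance (overall accuracy), as
    typically reported for QSAR carcinogenicity predictions."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    concordance = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, concordance

# Illustrative counts only, chosen to reproduce 97% sensitivity / 53% specificity
sens, spec, conc = performance_indices(tp=97, fn=3, tn=53, fp=47)
print(sens, spec, conc)  # 0.97 0.53 0.75
```

High sensitivity with marginal specificity, as in the study, means few missed carcinogens at the cost of many false alarms, an understandable trade-off for a screening tool.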
A measurement system for large, complex software programs
NASA Technical Reports Server (NTRS)
Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.
1994-01-01
This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
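The cost/quality relationship described above is often expressed as a multiplicative parametric model. The sketch below illustrates the general shape of such a model; the base rate and multiplier values are assumptions for illustration, not the models actually used in the paper:

```python
def expected_errors(ksloc, criticality=1.0, env_factor=1.0, competence=1.0,
                    base_rate=5.0):
    """Illustrative quality model: expected latent errors grow linearly with
    size (in thousands of source lines, KSLOC) and are scaled by criticality
    and development-environment multipliers, and divided by a team-competence
    multiplier. All parameter values here are hypothetical."""
    return base_rate * ksloc * criticality * env_factor / competence

# 100-KSLOC flight-critical build with a strong team and a mature environment
print(expected_errors(100, criticality=1.5, env_factor=0.8, competence=1.2))  # 500.0
```

The criticality multiplier is what ties the quality estimate back to cost: a higher expected error count implies more labor budgeted for independent verification and validation.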
Numerical simulation of synthesis gas incineration
NASA Astrophysics Data System (ADS)
Kazakov, A. V.; Khaustov, S. A.; Tabakaev, R. B.; Belousova, Y. A.
2016-04-01
The authors have analysed the expediency of the suggested method for utilizing low-grade fuels. Thermal processing of solid raw materials into a gaseous fuel, called synthesis gas, is investigated. The technical challenges concerning the applicability of existing gas equipment, developed and extensively tested exclusively for natural gas, were considered. For this purpose, a computer simulation of three-dimensional syngas-incinerating flame dynamics was performed by means of the ANSYS Multiphysics engineering software. The subjects of study were: the three-dimensional aerodynamic flame structure, heat-release and temperature fields, and a set of combustion properties including the flare range and the concentration distribution of burnout reagents. The obtained results were presented in the form of time-averaged pathlines with color indexing and can be used for qualitative and quantitative evaluation of the particular features of complex multicomponent gas incineration.
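One concrete input any such combustion setup requires is the stoichiometric air demand of the fuel gas, which follows directly from its composition. A minimal sketch under illustrative assumptions (the composition below is hypothetical, not the syngas simulated in the paper, and air is taken as 21 mol% O2):

```python
# Moles of O2 consumed per mole of each combustible species
O2_DEMAND = {"CO": 0.5, "H2": 0.5, "CH4": 2.0}

def stoich_air(syngas):
    """Stoichiometric air requirement (mol air per mol fuel gas) for a
    syngas given as mole fractions; inert species contribute nothing."""
    o2 = sum(O2_DEMAND.get(species, 0.0) * x for species, x in syngas.items())
    return o2 / 0.21  # mole fraction of O2 in air

# Hypothetical low-calorific syngas from solid-fuel thermal processing
fuel = {"CO": 0.25, "H2": 0.15, "CH4": 0.03, "N2": 0.45, "CO2": 0.12}
print(round(stoich_air(fuel), 3))  # 1.238
```

The low air demand relative to natural gas (roughly 9.5 mol air per mol CH4) is one reason burners tuned for natural gas cannot simply be reused for syngas.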
Maintenance = reuse-oriented software development
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1989-01-01
Maintenance is viewed as a reuse process. In this context, a set of models that can be used to support the maintenance process is discussed. A high level reuse framework is presented that characterizes the object of reuse, the process for adapting that object for its target application, and the reused object within its target application. Based upon this framework, a qualitative comparison is offered of the three maintenance process models with regard to their strengths and weaknesses and the circumstances in which they are appropriate. To provide a more systematic, quantitative approach for evaluating the appropriateness of the particular maintenance model, a measurement scheme is provided, based upon the reuse framework, in the form of an organized set of questions that need to be answered. To support the reuse perspective, a set of reuse enablers are discussed.
Solutions in radiology services management: a literature review.
Pereira, Aline Garcia; Vergara, Lizandra Garcia Lupi; Merino, Eugenio Andrés Díaz; Wagner, Adriano
2015-01-01
The present study reviewed the literature to identify solutions to problems observed in radiology services. A basic, qualitative, exploratory literature review was carried out in the Scopus and SciELO databases, using the Mendeley and Adobe Illustrator CC software. In the databases, 565 papers were identified, 120 of them available as free PDFs. Problems observed in the radiology sector are related to procedure scheduling, humanization, lack of training, poor knowledge and use of management techniques, and interaction with users. Design management provides services with useful solutions such as benchmarking, CRM, the Lean approach, service blueprinting, and continuing education, among others. Literature review is an important tool for identifying problems and their solutions. However, considering the small number of studies on the management of radiology services, this remains a promising field for deeper research.
The dynamics of software development project management: An integrative systems dynamic perspective
NASA Technical Reports Server (NTRS)
Vandervelde, W. E.; Abdel-Hamid, T.
1984-01-01
Rather than continuing to focus on software development projects per se, the system dynamics modeling approach outlined is extended to investigate a broader set of issues pertaining to the software development organization. Rather than trace the life cycle(s) of one or more software projects, the focus is on the operations of a software development department as a continuous stream of software products is developed, placed into operation, and maintained. A number of research questions are "ripe" for investigation, including: (1) the efficacy of different organizational structures in different software development environments, (2) personnel turnover, (3) the impact of management approaches such as management by objectives, and (4) the organizational/environmental determinants of productivity.
Austin, Peter C
2010-04-22
Multilevel logistic regression models are increasingly being used to analyze clustered data in medical, public health, epidemiological, and educational research. Procedures for estimating the parameters of such models are available in many statistical software packages. There is currently little evidence on the minimum number of clusters necessary to reliably fit multilevel regression models. We conducted a Monte Carlo study to compare the performance of different statistical software procedures for estimating multilevel logistic regression models when the number of clusters was low. We examined procedures available in BUGS, HLM, R, SAS, and Stata. We found that there were qualitative differences in the performance of different software procedures for estimating multilevel logistic models when the number of clusters was low. Among the likelihood-based procedures, estimation methods based on adaptive Gauss-Hermite approximations to the likelihood (glmer in R and xtlogit in Stata) or adaptive Gaussian quadrature (Proc NLMIXED in SAS) tended to have superior performance for estimating variance components when the number of clusters was small, compared to software procedures based on penalized quasi-likelihood. However, only Bayesian estimation with BUGS allowed for accurate estimation of variance components when there were fewer than 10 clusters. For all statistical software procedures, estimation of variance components tended to be poor when there were only five subjects per cluster, regardless of the number of clusters.
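The data-generating process behind such a Monte Carlo comparison can be sketched compactly: clustered binary outcomes drawn from a random-intercept logistic model. The parameter names and values below are illustrative assumptions, not taken from the paper:

```python
import math
import random

def simulate_clustered_binary(n_clusters, n_per_cluster, beta0=-1.0,
                              sigma_u=1.0, seed=42):
    """Generate clustered binary outcomes from a random-intercept
    logistic model: logit P(y=1) = beta0 + u_j, with cluster effect
    u_j ~ N(0, sigma_u^2). Returns a list of (cluster, outcome)
    pairs suitable for feeding to a multilevel estimator."""
    rng = random.Random(seed)
    data = []
    for j in range(n_clusters):
        u_j = rng.gauss(0.0, sigma_u)                # cluster random effect
        p = 1.0 / (1.0 + math.exp(-(beta0 + u_j)))   # inverse logit
        for _ in range(n_per_cluster):
            data.append((j, 1 if rng.random() < p else 0))
    return data

# Five clusters of ten subjects -- the small-sample regime the study probes:
sample = simulate_clustered_binary(n_clusters=5, n_per_cluster=10)
```

A Monte Carlo study repeats this generation many times, fits each replicate with each estimator (e.g. glmer, xtlogit, Proc NLMIXED), and compares the recovered variance components to the known sigma_u.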
Wright, Adam; Sittig, Dean F; Ash, Joan S; Erickson, Jessica L; Hickman, Trang T; Paterno, Marilyn; Gebhardt, Eric; McMullen, Carmit; Tsurikova, Ruslana; Dixon, Brian E; Fraser, Greg; Simonaitis, Linas; Sonnenberg, Frank A; Middleton, Blackford
2015-11-01
To identify challenges, lessons learned and best practices for service-oriented clinical decision support, based on the results of the Clinical Decision Support Consortium, a multi-site study which developed, implemented and evaluated clinical decision support services in a diverse range of electronic health records. Ethnographic investigation using the rapid assessment process, a procedure for agile qualitative data collection and analysis, including clinical observation, system demonstrations and analysis and 91 interviews. We identified challenges and lessons learned in eight dimensions: (1) hardware and software computing infrastructure, (2) clinical content, (3) human-computer interface, (4) people, (5) workflow and communication, (6) internal organizational policies, procedures, environment and culture, (7) external rules, regulations, and pressures and (8) system measurement and monitoring. Key challenges included performance issues (particularly related to data retrieval), differences in terminologies used across sites, workflow variability and the need for a legal framework. Based on the challenges and lessons learned, we identified eight best practices for developers and implementers of service-oriented clinical decision support: (1) optimize performance, or make asynchronous calls, (2) be liberal in what you accept (particularly for terminology), (3) foster clinical transparency, (4) develop a legal framework, (5) support a flexible front-end, (6) dedicate human resources, (7) support peer-to-peer communication, (8) improve standards. The Clinical Decision Support Consortium successfully developed a clinical decision support service and implemented it in four different electronic health records and four diverse clinical sites; however, the process was arduous. The lessons identified by the Consortium may be useful for other developers and implementers of clinical decision support services. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
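Best practice (1), "optimize performance, or make asynchronous calls", can be sketched with a concurrent client pattern. The service function and its return shape are hypothetical stand-ins for a real remote decision-support call:

```python
import asyncio

async def call_cds_service(patient_id):
    """Stand-in for a remote clinical-decision-support call; a real
    client would issue an HTTP request here. The sleep models the
    data-retrieval latency the Consortium identified as a key issue."""
    await asyncio.sleep(0.1)
    return {"patient": patient_id, "alerts": []}

async def main():
    # Issuing the calls concurrently rather than serially keeps total
    # wait time near one round-trip instead of five:
    return await asyncio.gather(*(call_cds_service(p) for p in range(5)))

results = asyncio.run(main())
```

The same structure applies whether the EHR front end batches requests or fires them per rule; the point is that the user interface is never blocked on a chain of sequential service calls.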
Developing sustainable software solutions for bioinformatics by the “Butterfly” paradigm
Ahmed, Zeeshan; Zeeshan, Saman; Dandekar, Thomas
2014-01-01
Software design and sustainable software engineering are essential for the long-term development of bioinformatics software. Typical challenges in an academic environment are short-term contracts, island solutions, pragmatic approaches and loose documentation. Upcoming new challenges are big data, complex data sets, software compatibility and rapid changes in data representation. Our approach to coping with these challenges consists of iterative intertwined cycles of development (the “Butterfly” paradigm) for key steps in scientific software engineering. User feedback is valued, as is software planning in a sustainable and interoperable way. Tool usage should be easy and intuitive. A middleware layer supports a user-friendly Graphical User Interface (GUI) as well as independent database/tool development. We validated the approach in our own software development and compared the different design paradigms in various software solutions. PMID:25383181
Microcomputer software development facilities
NASA Technical Reports Server (NTRS)
Gorman, J. S.; Mathiasen, C.
1980-01-01
A more efficient and cost-effective method for developing microcomputer software is to utilize a host computer with high-speed peripheral support. Application programs such as cross assemblers, loaders, and simulators are implemented in the host computer for each of the microcomputers for which software development is required. The host computer is configured to operate in time-share mode for multiple users. The remote terminals, printers, and downloading capabilities provided are based on user requirements. With this configuration a user, either local or remote, can use the host computer for microcomputer software development. Once the software is developed (through the code and modular debug stage) it can be downloaded to the development system or emulator in a test area where hardware/software integration can proceed. The microcomputer software program sources reside in the host computer and can be edited, assembled, loaded, and then downloaded as required until the software development project has been completed.
Modular Rocket Engine Control Software (MRECS)
NASA Technical Reports Server (NTRS)
Tarrant, C.; Crook, J.
1998-01-01
The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state-of-the-art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for advanced engine control systems that will result in lower software maintenance (operations) costs. It effectively accommodates software requirement changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives, benefits, and status of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software reverification time related to software changes. MRECS was recently modified to support a Space Shuttle Main Engine (SSME) hot-fire test. Cold Flow and Flight Readiness Testing were completed before the test was cancelled. Currently, the program is focused on supporting NASA MSFC in accomplishing development testing of the Fastrac Engine, part of NASA's Low Cost Technologies (LCT) Program. MRECS will be used for all engine development testing.
Four simple recommendations to encourage best practices in research software
Jiménez, Rafael C.; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll.; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C.; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S.; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J.; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V.; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S.; Crouch, Steve
2017-01-01
Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations. PMID:28751965
A Legal Guide for the Software Developer.
ERIC Educational Resources Information Center
Minnesota Small Business Assistance Office, St. Paul.
This booklet has been prepared to familiarize the inventor, creator, or developer of a new computer software product or software invention with the basic legal issues involved in developing, protecting, and distributing the software in the United States. Basic types of software protection and related legal matters are discussed in detail,…
Understanding Acceptance of Software Metrics--A Developer Perspective
ERIC Educational Resources Information Center
Umarji, Medha
2009-01-01
Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…
Current Practice in Software Development for Computational Neuroscience and How to Improve It
Gewaltig, Marc-Oliver; Cannon, Robert
2014-01-01
Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research. PMID:24465191
Coombs, Maureen A; Davidson, Judy E; Nunnally, Mark E; Wickline, Mary A; Curtis, J Randall
2017-08-01
To explore the importance, challenges, and opportunities using qualitative research to enhance development of clinical practice guidelines, using recent guidelines for family-centered care in the ICU as an example. In developing the Society of Critical Care Medicine guidelines for family-centered care in the neonatal ICU, PICU, and adult ICU, we developed an innovative adaptation of the Grading of Recommendations, Assessments, Development and Evaluations approach to explicitly incorporate qualitative research. Using Grading of Recommendations, Assessments, Development and Evaluations and the Council of Medical Specialty Societies principles, we conducted a systematic review of qualitative research to establish family-centered domains and outcomes. Thematic analyses were undertaken on study findings and used to support Population, Intervention, Comparison, Outcome question development. We identified and employed three approaches using qualitative research in these guidelines. First, previously published qualitative research was used to identify important domains for the Population, Intervention, Comparison, Outcome questions. Second, this qualitative research was used to identify and prioritize key outcomes to be evaluated. Finally, we used qualitative methods, member checking with patients and families, to validate the process and outcome of the guideline development. In this, a novel report, we provide direction for standardizing the use of qualitative evidence in future guidelines. Recommendations are made to incorporate qualitative literature review and appraisal, include qualitative methodologists in guideline taskforce teams, and develop training for evaluation of qualitative research into guideline development procedures. Effective methods of involving patients and families as members of guideline development represent opportunities for future work.
Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design
NASA Technical Reports Server (NTRS)
Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.
2003-01-01
A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.
SAO mission support software and data standards, version 1.0
NASA Technical Reports Server (NTRS)
Hsieh, P.
1993-01-01
This document defines the software developed by the SAO AXAF Mission Support (MS) Program and defines standards for the software development process and control of data products generated by the software. The SAO MS is tasked to develop and use software to perform a variety of functions in support of the AXAF mission. Software is developed by software engineers and scientists, and commercial off-the-shelf (COTS) software is used either directly or customized through the use of scripts to implement analysis procedures. Software controls real-time laboratory instruments, performs data archiving, displays data, and generates model predictions. Much software is used in the analysis of data to generate data products that are required by the AXAF project, for example, on-orbit mirror performance predictions or detailed characterization of the mirror reflection performance with energy.
Qualitative review of usability problems in health information systems for radiology.
Dias, Camila Rodrigues; Pereira, Marluce Rodrigues; Freire, André Pimenta
2017-12-01
Radiology processes are commonly supported by Radiology Information System (RIS), Picture Archiving and Communication System (PACS) and other software for radiology. However, these information technologies can present usability problems that affect the performance of radiologists and physicians, especially considering the complexity of the tasks involved. The purpose of this study was to extract, classify and analyze qualitatively the usability problems in PACS, RIS and other software for radiology. A systematic review was performed to extract usability problems reported in empirical usability studies in the literature. The usability problems were categorized as violations of Nielsen and Molich's usability heuristics. The qualitative analysis indicated the causes and the effects of the identified usability problems. From the 431 papers initially identified, 10 met the study criteria. The analysis of the papers identified 90 instances of usability problems, classified into categories corresponding to established usability heuristics. The five heuristics with the highest number of instances of usability problems were "Flexibility and efficiency of use", "Consistency and standards", "Match between system and the real world", "Recognition rather than recall" and "Help and documentation", respectively. These problems can make the interaction time consuming, causing delays in tasks, dissatisfaction, frustration, preventing users from enjoying all the benefits and functionalities of the system, as well as leading to more errors and difficulties in carrying out clinical analyses. Furthermore, the present paper showed a lack of studies performed on systems for radiology, especially usability evaluations using formal methods of evaluation involving the final users. Copyright © 2017 Elsevier Inc. All rights reserved.
Miere, Alexandra; Oubraham, Hassiba; Amoroso, Francesca; Butori, Pauline; Astroz, Polina; Semoun, Oudy; Bruyere, Elsa; Pedinielli, Alexandre; Addou-Regnard, Manar; Jung, Camille; Cohen, Salomon Y; Souied, Eric H
2018-01-01
To compare the qualitative and quantitative choroidal neovascularization (CNV) changes after antivascular endothelial growth factor (anti-VEGF) therapy in treatment-naïve and treated eyes with age-related macular degeneration (AMD) using optical coherence tomography angiography (OCTA). Consecutive patients with neovascular AMD underwent multimodal imaging, including OCTA (AngioPlex, CIRRUS HD-OCT model 5000; Carl Zeiss Meditec, Inc., Dublin, OH) at baseline and at three monthly follow-up visits. Treatment-naïve AMD patients undergoing anti-VEGF loading phase were included in group A, while treated patients were included in group B. Qualitative and quantitative OCTA analyses were performed on the outer retina to choriocapillaris (ORCC) slab. CNV size was measured using free image analysis software (ImageJ, open-source image processing software, version 2.0.0). Twenty-five eyes of 25 patients were enrolled in our study (mean age 78.32 ± 6.8 years): 13 treatment-naïve eyes in group A and 12 treated eyes in group B. While qualitative analysis revealed no significant differences from baseline to follow-up in the two groups, quantitative analysis showed in group A a significant decrease in lesion area (P = 0.023); in group B, no significant change in the lesion area was observed during anti-VEGF therapy (P = 0.93). Treatment-naïve and treated eyes with CNV secondary to neovascular AMD respond differently to anti-VEGF therapy. This should be taken into account when using OCTA for CNV follow-up or planning therapeutic strategies.
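The lesion-area measurement described — thresholding an en-face OCTA slab and converting the suprathreshold pixel count to area — can be sketched in a few lines. The threshold value and pixel size below are illustrative assumptions, not the study's calibration:

```python
def lesion_area_mm2(image, threshold, pixel_mm2):
    """Count suprathreshold pixels in a 2D en-face angiogram slab and
    convert the count to area, the same thresholded-pixel-count
    measurement an ImageJ workflow performs. `image` is a list of rows
    of intensity values; `threshold` and `pixel_mm2` are hypothetical
    parameters a real analysis would calibrate."""
    count = sum(1 for row in image for v in row if v >= threshold)
    return count * pixel_mm2

# Toy 4x4 slab: five pixels at or above threshold 128
slab = [
    [200, 180,  50,  10],
    [150, 130,  40,   5],
    [ 90,  80,  20,   0],
    [128,  60,  10,   0],
]
area = lesion_area_mm2(slab, threshold=128, pixel_mm2=0.01)
```

Comparing such area values across monthly visits is what yields the per-group change statistics the abstract reports.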
Software technology insertion: A study of success factors
NASA Technical Reports Server (NTRS)
Lydon, Tom
1990-01-01
Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software together, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. However, in practice, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.
NASA Technical Reports Server (NTRS)
Church, Victor E.; Long, D.; Hartenstein, Ray; Perez-Davila, Alfredo
1992-01-01
A set of functional requirements for software configuration management (CM) and metrics reporting for Space Station Freedom ground systems software are described. This report is one of a series from a study of the interfaces among the Ground Systems Development Environment (GSDE), the development systems for the Space Station Training Facility (SSTF) and the Space Station Control Center (SSCC), and the target systems for SSCC and SSTF. The focus is on the CM of the software following delivery to NASA and on the software metrics that relate to the quality and maintainability of the delivered software. The CM and metrics requirements address specific problems that occur in large-scale software development. Mechanisms to assist in the continuing improvement of mission operations software development are described.
Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee
2018-06-01
We quantitatively analyzed background parenchymal enhancement (BPE) in whole breast according to menstrual cycle and compared it with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software with MATLAB. From each voxel of whole breast, the software calculated BPE using following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group showing significant difference (p = .009 for minimal vs. mild, p < 0.001 for other comparisons). Spearman's correlation test showed that there was strong significant correlation between qualitative and quantitative BPE (r = 0.63, p < 0.001). The mean BPE value was 48.7% for patients in the first week of the menstrual cycle, 43.5% in the second week, 49% in the third week, and 49.4% for those in the fourth week. The difference between the second and fourth weeks was significant (p = .005). Median, 90th percentile, and 10th percentile values were also significantly different between the second and fourth weeks but not different in other comparisons (first vs. second, first vs. third, first vs. fourth, second vs. third, or third vs. fourth). Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
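The per-voxel computation follows directly from the equation the abstract gives; a minimal sketch (voxel lists and intensity values are illustrative, and a real pipeline would iterate over a segmented 3D volume):

```python
def bpe_percent(si_post, si_base):
    """Per-voxel background parenchymal enhancement, exactly as the
    abstract's equation: (SI_post - SI_base) / SI_base * 100, where
    SI_post is signal intensity 1 min 30 s after contrast injection."""
    return (si_post - si_base) / si_base * 100.0

def mean_bpe(post_voxels, base_voxels):
    """Average BPE over all voxels of the whole-breast volume,
    giving the per-patient summary value the study reports."""
    values = [bpe_percent(p, b) for p, b in zip(post_voxels, base_voxels)]
    return sum(values) / len(values)

# Two voxels enhancing from 100 -> 140 and 200 -> 260 give 40% and 30%:
avg = mean_bpe([140.0, 260.0], [100.0, 200.0])
```

The study's 90th- and 10th-percentile summaries would be computed from the same per-voxel value list.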
The JINR Tier1 Site Simulation for Research and Development Purposes
NASA Astrophysics Data System (ADS)
Korenkov, V.; Nechaevskiy, A.; Ososkov, G.; Pryahina, D.; Trofimov, V.; Uzhinskiy, A.; Voytishin, N.
2016-02-01
Distributed complex computing systems for data storage and processing are in common use in the majority of modern scientific centers. The design of such systems is usually based on recommendations obtained from a preliminary simulation model built and executed only once. However, big experiments last for years and decades, and their computing systems develop not only quantitatively but also qualitatively. Even with the substantial effort invested in the design phase to understand a system's configuration, it would be hard to develop the system without additional research into its future evolution. Developers and operators face the problem of predicting system behaviour after planned modifications. A system for simulating grid and cloud services has been developed at LIT (JINR, Dubna). This simulation system is focused on improving the efficiency of grid/cloud structure development by using work-quality indicators of a real system. The development of such software is very important for building new grid/cloud infrastructures for big scientific experiments such as the JINR Tier1 site for WLCG. The simulation of some processes of the Tier1 site is considered as an example of applying our approach.
ERIC Educational Resources Information Center
Ichu, Emmanuel A.
2010-01-01
Software quality is perhaps one of the most sought-after attributes in product development; however, this goal often goes unattained. Problem factors in software development, and how they have affected the maintainability of delivered software systems, require thorough investigation. It was, therefore, very important to understand software…
Automated Software Development Workstation (ASDW)
NASA Technical Reports Server (NTRS)
Fridge, Ernie
1990-01-01
Software development is a serious bottleneck in the construction of complex automated systems. Increased reuse of software designs and components has been viewed as a way to relieve this bottleneck. One approach to achieving software reusability is through the development and use of software parts composition systems. A software parts composition system is a software development environment comprising a parts description language for modeling parts and their interfaces, a catalog of existing parts, a composition editor that aids a user in specifying a new application from existing parts, and a code generator that takes a specification and generates an implementation of the new application in a target language. The Automated Software Development Workstation (ASDW) is an expert system shell that provides the capabilities required to develop and manipulate these software parts composition systems. The ASDW is now in beta testing at the Johnson Space Center. Future work centers on responding to user feedback to enhance capability and usability, on expanding the scope of the software life cycle that is covered, and on providing solutions for handling very large libraries of reusable components.
Promoting Science Software Best Practices: A Scientist's Perspective (Invited)
NASA Astrophysics Data System (ADS)
Blanton, B. O.
2013-12-01
Software is at the core of most modern scientific activities, and as societal awareness of, and impacts from, extreme weather, disasters, and climate and global change continue to increase, the roles that scientific software play in analyses and decision-making are brought more to the forefront. Reproducibility of research results (particularly those that enter into the decision-making arena) and open access to the software is essential for scientific and scientists' credibility. This has been highlighted in a recent article by Joppa et al (Troubling Trends in Scientific Software Use, Science Magazine, May 2013) that describes reasons for particular software being chosen by scientists, including that the "developer is well-respected" and on "recommendation from a close colleague". This reliance on recommendation, Joppa et al conclude, is fraught with risks to both sciences and scientists. Scientists must frequently take software for granted, assuming that it performs as expected and advertised and that the software itself has been validated and results verified. This is largely due to the manner in which much software is written and developed; in an ad hoc manner, with an inconsistent funding stream, and with little application of core software engineering best practices. Insufficient documentation, limited test cases, and code unavailability are significant barriers to informed and intelligent science software usage. This situation is exacerbated when the scientist becomes the software developer out of necessity due to resource constraints. Adoption of, and adherence to, best practices in scientific software development will substantially increase intelligent software usage and promote a sustainable evolution of the science as encoded in the software. We describe a typical scientist's perspective on using and developing scientific software in the context of storm surge research and forecasting applications that have real-time objectives and regulatory constraints. 
These include perspectives on what scientists/users of software can contribute back to the software development process, examples of successful scientist/developer interactions, and the competition between "getting it done" and "getting it done right".
Reuse at the Software Productivity Consortium
NASA Technical Reports Server (NTRS)
Weiss, David M.
1989-01-01
The Software Productivity Consortium is sponsored by 14 aerospace companies as a developer of software engineering methods and tools. Software reuse and prototyping are currently the major emphasis areas. The Methodology and Measurement Project in the Software Technology Exploration Division has developed some concepts for reuse which they intend to develop into a synthesis process. They have identified two approaches to software reuse: opportunistic and systematic. The assumptions underlying the systematic approach, phrased as hypotheses, are the following: the redevelopment hypothesis, i.e., software developers solve the same problems repeatedly; the oracle hypothesis, i.e., developers are able to predict variations from one redevelopment to others; and the organizational hypothesis, i.e., software must be organized according to behavior and structure to take advantage of the predictions that the developers make. The conceptual basis for reuse includes: program families, information hiding, abstract interfaces, uses and information hiding hierarchies, and process structure. The primary reusable software characteristics are black-box descriptions, structural descriptions, and composition and decomposition based on program families. Automated support can be provided for systematic reuse, and the Consortium is developing a prototype reuse library and guidebook. The software synthesis process that the Consortium is aiming toward includes modeling, refinement, prototyping, reuse, assessment, and new construction.
Automated support for experience-based software management
NASA Technical Reports Server (NTRS)
Valett, Jon D.
1992-01-01
To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka
2014-01-01
For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives that make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess strong analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states (emotions and moods) deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants investigating the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results support the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. This study makes the following contributions: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical problem-solving skills in empirical software engineering, and (3) raising the need to study the human factors of software engineering from a multidisciplinary viewpoint.
Software Quality Assurance Metrics
NASA Technical Reports Server (NTRS)
McRae, Kalindra A.
2004-01-01
Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements with a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated; the set includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process used to develop a product: the process is measured to improve it, and the product is measured to increase quality throughout the software life cycle. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce it, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. In the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see whether any could be implemented in their software assurance life cycle process.
ERIC Educational Resources Information Center
Vratulis, Vetta; Morton, Charlene
2011-01-01
This qualitative research study is an exploration of the merit and shortcomings of using a combination of the music software GarageBand[TM] and an electronic bulletin board to facilitate musical and peer learning in a 3-month elementary music methods curriculum and instruction course. A pedagogical objective of this assignment was to increase the…
Software Configuration Management Guidebook
NASA Technical Reports Server (NTRS)
1995-01-01
The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes which are used in software development. The Software Assurance Guidebook, SMAP-GB-A201, issued in September, 1989, provides an overall picture of the concepts and practices of NASA in software assurance. Lower level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the Software Configuration Management Guidebook which describes software configuration management in a way that is compatible with practices in industry and at NASA Centers. Software configuration management is a key software development process, and is essential for doing software assurance.
Software Reuse Within the Earth Science Community
NASA Technical Reports Server (NTRS)
Marshall, James J.; Olding, Steve; Wolfe, Robert E.; Delnore, Victor E.
2006-01-01
Scientific missions in the Earth sciences frequently require cost-effective, highly reliable, and easy-to-use software, which can be a challenge for software developers to provide. The NASA Earth Science Enterprise (ESE) spends a significant amount of resources developing software components and other software development artifacts that may also be of value if reused in other projects requiring similar functionality. In general, software reuse is often defined as utilizing existing software artifacts. Software reuse can improve productivity and quality while decreasing the cost of software development, as documented by case studies in the literature. Since large software systems are often the results of the integration of many smaller and sometimes reusable components, ensuring reusability of such software components becomes a necessity. Indeed, designing software components with reusability as a requirement can increase the software reuse potential within a community such as the NASA ESE community. The NASA Earth Science Data Systems (ESDS) Software Reuse Working Group is chartered to oversee the development of a process that will maximize the reuse potential of existing software components while recommending strategies for maximizing the reusability potential of yet-to-be-designed components. As part of this work, two surveys of the Earth science community were conducted. The first was performed in 2004 and distributed among government employees and contractors. A follow-up survey was performed in 2005 and distributed among a wider community, to include members of industry and academia. The surveys were designed to collect information on subjects such as the current software reuse practices of Earth science software developers, why they choose to reuse software, and what perceived barriers prevent them from reusing software. In this paper, we compare the results of these surveys, summarize the observed trends, and discuss the findings. 
The results are very similar, with the second, larger survey confirming the basic results of the first, smaller survey. The results suggest that reuse of ESE software can drive down the cost and time of system development, increase flexibility and responsiveness of these systems to new technologies and requirements, and increase effective and accountable community participation.
Software engineering methodologies and tools
NASA Technical Reports Server (NTRS)
Wilcox, Lawrence M.
1993-01-01
Over the years many engineering disciplines have developed, including chemical, electronic, and others. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, run over budget, fail to give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1,000 lines of deployed code. More and more systems require larger and more complex software for support, and as this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity: it has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, yet computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, drawing on actual installation of and experimentation with some specific tools.
Impact of Agile Software Development Model on Software Maintainability
ERIC Educational Resources Information Center
Gawali, Ajay R.
2012-01-01
Software maintenance and support costs account for up to 60% of the overall software life cycle cost and often burdens tightly budgeted information technology (IT) organizations. Agile software development approach delivers business value early, but implications on software maintainability are still unknown. The purpose of this quantitative study…
ERIC Educational Resources Information Center
Boyd, David W.
1993-01-01
Asserts that a new generation of software authoring applications has led to improvements in the development of economics education software. Describes new software development applications and discusses how to use them. Concludes that object-oriented programming helps economists develop their own courseware. (CFR)
ERIC Educational Resources Information Center
Kramer, Aleksey
2013-01-01
The topic of software security has become paramount in information technology (IT) related scholarly research. Researchers have addressed numerous software security topics touching on all phases of the Software Development Life Cycle (SDLC): requirements gathering phase, design phase, development phase, testing phase, and maintenance phase.…
Application of industry-standard guidelines for the validation of avionics software
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J.; Shagnea, Anita M.
1990-01-01
The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.
A Comparison of Authoring Software for Developing Mathematics Self-Learning Software Packages.
ERIC Educational Resources Information Center
Suen, Che-yin; Pok, Yang-ming
Four years ago, the authors started to develop a self-paced mathematics learning software called NPMaths by using an authoring package called Tencore. However, NPMaths had some weak points. A development team was hence formed to develop similar software called Mathematics On Line. This time the team used another development language called…
Developing midwifery practice through work-based learning: an exploratory study.
Marshall, Jayne E
2012-09-01
To explore what effect the introduction of a Work-Based Learning Module undertaken by midwives in a range of maternity settings has had on their personal professional development, as well as the impact on developing local maternity and neonatal care provision. A case study approach was used consisting of mixed methods. Quantitative data were collected through questionnaires from midwives and their Clinical Supervisors at the end of the module, with a survey questionnaire to each midwifery manager, six months following the implementation of the midwives' project in practice. Qualitative data were collected by focus groups at six different work place locations, with health professionals who had experienced the midwives' projects within the workplace. Quantitative data were manually analysed whereas content analysis was used to identify recurrent themes from the qualitative data, with the support of Computer Assisted Qualitative Data Analysis Software. The University of Nottingham granted ethical approval for the study. Twelve midwives who undertook the work-based module, their respective Clinical Supervisors (n = 12), their employers/managers (n = 12) and health professionals (n = 28) within six individual National Health Service Trusts in the East Midlands of the United Kingdom took part in the study. The work-based learning module not only led to the personal and professional development of the midwife, but also to improving multi-professional collaboration and the consequential development of maternity services within the local Trusts. The value of leading change by completing an innovative and tangible work-based project through a flexible mode of study strengthened the midwives' clinical credibility among colleagues and employers and supports the philosophy of inter-professional learning and working. 
This novel work-based approach to learning has the potential to further develop the provision of post-registration education for midwives and other health professionals, as it helps to bridge the theory-practice gap. Learning in the workplace is efficient and cost-effective for both employee and employer and strengthens the link between higher education and the workplace. Furthermore, as the principles of work-based learning could be transferred to contexts outside the United Kingdom, such an approach has the potential to directly influence the development of global midwifery education and maternity services and ultimately benefit mothers, their babies, and families throughout the world. Copyright © 2012 Elsevier Ltd. All rights reserved.
Cornford, Tony; Barber, Nicholas; Avery, Anthony; Takian, Amirhossein; Lichtner, Valentina; Petrakaki, Dimitra; Crowe, Sarah; Marsden, Kate; Robertson, Ann; Morrison, Zoe; Klecun, Ela; Prescott, Robin; Quinn, Casey; Jani, Yogini; Ficociello, Maryam; Voutsina, Katerina; Paton, James; Fernando, Bernard; Jacklin, Ann; Cresswell, Kathrin
2011-01-01
Objectives To evaluate the implementation and adoption of the NHS detailed care records service in “early adopter” hospitals in England. Design Theoretically informed, longitudinal qualitative evaluation based on case studies. Setting 12 “early adopter” NHS acute hospitals and specialist care settings studied over two and a half years. Data sources Data were collected through in depth interviews, observations, and relevant documents relating directly to case study sites and to wider national developments that were perceived to impact on the implementation strategy. Data were thematically analysed, initially within and then across cases. The dataset consisted of 431 semistructured interviews with key stakeholders, including hospital staff, developers, and governmental stakeholders; 590 hours of observations of strategic meetings and use of the software in context; 334 sets of notes from observations, researchers’ field notes, and notes from national conferences; 809 NHS documents; and 58 regional and national documents. Results Implementation has proceeded more slowly, with a narrower scope and substantially less clinical functionality than was originally planned. The national strategy had considerable local consequences (summarised under five key themes), and wider national developments impacted heavily on implementation and adoption. More specifically, delays related to unrealistic expectations about the capabilities of systems; the time needed to build, configure, and customise the software; the work needed to ensure that systems were supporting provision of care; and the needs of end users for training and support. Other factors hampering progress included the changing milieu of NHS policy and priorities; repeatedly renegotiated national contracts; different stages of development of diverse NHS care records service systems; and a complex communication process between different stakeholders, along with contractual arrangements that largely excluded NHS providers. 
There was early evidence that deploying systems resulted in important learning within and between organisations and the development of relevant competencies within NHS hospitals. Conclusions Implementation of the NHS Care Records Service in “early adopter” sites proved time consuming and challenging, with as yet limited discernible benefits for clinicians and no clear advantages for patients. Although our results might not be directly transferable to later adopting sites because the functionalities we evaluated were new and untried in the English context, they shed light on the processes involved in implementing major new systems. The move to increased local decision making that we advocated based on our interim analysis has been pursued and welcomed by the NHS, but it is important that policymakers do not lose sight of the overall goal of an integrated interoperable solution. PMID:22006942
Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi
2011-01-01
AQCEL enables automatic reconstruction of single-photon emission computed tomogram (SPECT) without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemisphere of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). During quantitative estimation using ACZ, the mean CBF values of 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. 
While comparing the 24 3DSRT regions of the 12 patients, the squares of the correlation coefficient between the AQCEL and conventional methods were 0.973 and 0.986 for the normal and affected sides at rest, respectively, and 0.977 and 0.984 for the normal and affected sides after ACZ loading, respectively. The quality of images reconstructed using the application software AQCEL was superior to that obtained using the conventional method after ACZ loading, and high quantitative correlations were observed at rest and after ACZ loading. This software can be applied in clinical practice and is a useful tool for improving reproducibility and throughput.
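Squared correlation coefficients like those reported in the AQCEL comparison above can be computed for any pair of paired measurements in a few lines of NumPy. This is a generic sketch, not the AQCEL software; the variable names and CBF values are hypothetical.

```python
import numpy as np

def r_squared(x, y):
    # Square of the Pearson correlation coefficient between paired values
    r = np.corrcoef(x, y)[0, 1]
    return float(r * r)

# Hypothetical paired regional CBF values (ml/100 g/min) from two methods;
# here the second method reads exactly 4% higher, so r^2 is essentially 1
conventional = np.array([38.0, 42.0, 45.0, 50.0])
aqcel = np.array([39.52, 43.68, 46.80, 52.00])
r2 = r_squared(conventional, aqcel)
```

Values such as the 0.995 reported in the abstract indicate near-linear agreement between the two reconstruction methods.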
NASA Technical Reports Server (NTRS)
1982-01-01
An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Ackley, Keith A.; Crump, Wes; Sanders, Les
1991-01-01
The design of the Framework Processor (FP) component of the Framework Programmable Software Development Platform (FFP) is described. The FFP is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by the model, this Framework Processor will take advantage of an integrated operating environment to provide automated support for the management and control of the software development process so that costly mistakes during the development phase can be eliminated.
Development of a patient-specific 3D dose evaluation program for QA in radiation therapy
NASA Astrophysics Data System (ADS)
Lee, Suk; Chang, Kyung Hwan; Cao, Yuan Jie; Shim, Jang Bo; Yang, Dae Sik; Park, Young Je; Yoon, Won Sup; Kim, Chul Yong
2015-03-01
We present preliminary results for a 3-dimensional dose evaluation software system (PDRESS, patient-specific 3-dimensional dose real evaluation system). Scanned computed tomography (CT) images obtained by using dosimetry were transferred to the radiation treatment planning system (ECLIPSE, VARIAN, Palo Alto, CA), where the intensity-modulated radiation therapy (IMRT) nasopharynx plan was designed. We used a 10 MV photon beam (CLiX, VARIAN, Palo Alto, CA) to deliver the nasopharynx treatment plan. After irradiation, the TENOMAG dosimeter was scanned using a VISTA™ scanner. The scanned data were reconstructed using VistaRecon software to obtain a 3D dose distribution of the optical density. An optical-CT scanner was used to read out the dose distribution in the gel dosimeter. Moreover, we developed PDRESS by using Flatform, developed by our group, to display the 3D dose distribution by loading into the standalone PDRESS the DICOM RT data exported from the radiotherapy treatment plan (RTP) and the optical-CT-reconstructed VFF file. An ionization chamber and EBT film were used to compare the dose distribution calculated from the RTP with that measured by using a gel dosimeter. The agreement between the normalized EBT, gel dosimeter, and RTP data was evaluated using both qualitative and quantitative methods, such as the isodose distribution, dose difference, point values, and profiles. The profiles showed good agreement between the RTP data and the gel dosimeter data, and the precision of the dose distribution was within ±3%. The results of this study also showed significant discrepancies between the dose distribution calculated from the treatment plan and the dose distribution measured with a TENOMAG gel and scanned with an optical CT scanner.
The 3D dose evaluation software system (PDRESS, patient-specific dose real evaluation system) developed in this study evaluates the accuracy of three-dimensional dose distributions. Further applications of the system are expected to result from future studies.
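The point-value and dose-difference comparison described above can be sketched as a simple pass-rate calculation; this is an illustrative stand-in, not the PDRESS implementation, and the function name, tolerance handling, and toy dose values are assumptions.

```python
# Hypothetical sketch of a voxel-wise dose-difference comparison: the
# fraction of points where planned (RTP) and measured (gel) doses agree
# within +/-3% of the normalization dose. Values are invented toy data.

def dose_difference_pass_rate(planned, measured, norm_dose, tol=0.03):
    """Fraction of points where |planned - measured| <= tol * norm_dose."""
    assert len(planned) == len(measured)
    limit = tol * norm_dose
    passed = sum(1 for p, m in zip(planned, measured) if abs(p - m) <= limit)
    return passed / len(planned)

# Toy 1D "profile": planned vs. measured doses in Gy, normalized to 2 Gy.
planned = [2.00, 1.95, 1.80, 1.50, 1.00]
measured = [2.02, 1.93, 1.90, 1.49, 1.01]
rate = dose_difference_pass_rate(planned, measured, norm_dose=2.0)
# 4 of the 5 toy points agree within 0.06 Gy, so rate == 0.8
```

A real evaluation would run this per voxel over the full 3D grids and would typically be combined with gamma analysis rather than dose difference alone.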
NASA Technical Reports Server (NTRS)
1992-01-01
This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.
IPLaminator: an ImageJ plugin for automated binning and quantification of retinal lamination.
Li, Shuai; Woodfin, Michael; Long, Seth S; Fuerst, Peter G
2016-01-16
Information in the brain is often segregated into spatially organized layers that reflect the function of the embedded circuits. This is perhaps best exemplified in the layering, or lamination, of the retinal inner plexiform layer (IPL). The neurites of the retinal ganglion, amacrine and bipolar cell subtypes that form synapses in the IPL are precisely organized in highly refined strata within the IPL. Studies focused on developmental organization and cell morphology often use this layered stratification to characterize cells and identify the function of genes in development of the retina. A current limitation to such analysis is the lack of standardized tools to quantitatively analyze this complex structure. Most previous work on neuron stratification in the IPL is qualitative and descriptive. In this study we report the development of an intuitive platform to rapidly and reproducibly assay IPL lamination. The novel ImageJ-based software plugin we developed, IPLaminator, rapidly analyzes neurite stratification patterns in the retina and other neural tissues. A range of user options allows researchers to bin IPL stratification based on fixed points, such as the neurites of cholinergic amacrine cells, or to define a number of bins into which the IPL will be divided. Options to analyze tissues such as cortex were also added. Statistical analysis of the output then allows a quantitative value to be assigned to differences in laminar patterning observed in different models, genotypes or across developmental time. IPLaminator is an easy-to-use software application that will greatly speed and standardize quantification of neuron organization.
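The core binning step that such a plugin automates can be illustrated with a minimal sketch; the function name, the toy depth/intensity profile, and the bin count below are assumptions for illustration, not IPLaminator's actual API.

```python
# A minimal sketch (not the IPLaminator source) of the binning idea:
# divide normalized IPL depth [0, 1] into a user-chosen number of bins
# and sum a labeling-intensity profile into each bin, so stratification
# patterns can be compared quantitatively across genotypes.

def bin_ipl_profile(depths, intensities, n_bins):
    """Sum intensity values into n_bins equal-width bins over depth [0, 1]."""
    bins = [0.0] * n_bins
    for d, i in zip(depths, intensities):
        idx = min(int(d * n_bins), n_bins - 1)  # clamp depth == 1.0
        bins[idx] += i
    return bins

# Toy profile: labeling concentrated near 40% IPL depth.
depths = [0.1, 0.35, 0.40, 0.45, 0.9]
intensities = [1.0, 5.0, 9.0, 4.0, 0.5]
print(bin_ipl_profile(depths, intensities, 5))  # [1.0, 5.0, 13.0, 0.0, 0.5]
```

Binning against fixed anatomical landmarks (such as the cholinergic bands) would simply replace the uniform bin edges with measured reference depths.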
Organizational management practices for achieving software process improvement
NASA Technical Reports Server (NTRS)
Kandt, Ronald Kirk
2004-01-01
The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.
Engel, Nora; Wachter, Keri; Pai, Madhukar; Gallarda, Jim; Boehme, Catharina; Celentano, Isabelle; Weintraub, Rebecca
2016-01-01
Several barriers challenge the development, adoption and scale-up of diagnostics in low- and middle-income countries. An innovative global health discussion platform allows capturing insights from the global health community on factors driving demand and supply for diagnostics. We conducted a qualitative content analysis of the online discussion 'Advancing Care Delivery: Driving Demand and Supply of Diagnostics' organised by the Global Health Delivery Project (GHD) (http://www.ghdonline.org/) at Harvard University. The discussion, driven by 12 expert panellists, explored what must be done to develop delivery systems, business models, new technologies, interoperability standards, and governance mechanisms to ensure that patients receive the right diagnostic at the right time. The GHD Online (GHDonline) platform reaches over 19 000 members from 185 countries. Participants (N=99) in the diagnostics discussion included academics, non-governmental organisations, manufacturers, policymakers, and physicians. Data were coded and overarching categories analysed using qualitative data analysis software. Participants considered technical characteristics of diagnostics as smaller barriers to effective use of diagnostics compared with operational and health system challenges, such as logistics, poor fit with user needs, cost, workforce, infrastructure, access, weak regulation and political commitment. Suggested solutions included: health system strengthening with patient-centred delivery; strengthened innovation processes; improved knowledge base; harmonised guidelines and evaluation; supply chain innovations; and mechanisms for ensuring quality and capacity. Engaging and connecting the different actors involved in diagnostic development and use is paramount for improving diagnostics. While the discussion participants were not representative of all actors involved, the platform enabled a discussion between globally acknowledged experts and physicians working in different countries.
2013-01-01
Background Patient preference is one of the main components of clinical decision making, and has therefore led to the development of patient decision aids. The goal of this study was to describe physicians’ and patients’ viewpoints on the barriers and limitations of using patient decision aids in Iran, their proposed solutions, and the benefits of using these tools. Methods This qualitative study was conducted in 2011 in Iran through in-depth interviews with 14 physicians and 8 arthritis patients. Interviewees were selected through purposeful and maximum variation sampling. As an example, a patient decision aid on the treatment of knee arthritis was developed based on a literature review and expert opinion, and was presented at the time of interview. Thematic analysis was conducted to analyze the data using the OpenCode software. Results The results were summarized into three categories and ten codes. The extracted categories were the perceived benefits of using the tools, as well as the patient-related and physician-related barriers to using decision aids. The following barriers to using patient decision aids were identified in this study: lack of patient and physician training in shared decision making, a low number of specialists per capita, low treatment tariffs, and lack of an exact evaluation system for patient participation in decision making. Conclusions No doubt these barriers demand the health authorities’ special attention. Hence, despite patients’ and physicians’ inclination toward using patient decision aids, these problems have hindered the practical usage of these tools in Iran as a developing country. PMID:24066792
How well does voice interaction work in space?
NASA Technical Reports Server (NTRS)
Morris, Randy B.; Whitmore, Mihriban; Adam, Susan C.
1993-01-01
The methods and results of an evaluation of the Voice Navigator software package are discussed. The first phase, or ground phase, of the study consisted of creating, or training, computer voice files of specific commands; each of six commands was repeated eight times. The files were then tested for recognition accuracy by the software aboard the microgravity aircraft. During the second phase, both voice training and testing were performed in microgravity. Inflight training was done because of problems encountered in phase one, which were believed to be caused by ambient noise levels. Both quantitative and qualitative data were collected. Only one of the commands was found to offer consistently high recognition rates across subjects during the second phase.
Third-Party Software's Trust Quagmire.
Voas, J; Hurlburt, G
2015-12-01
Current software development has trended toward integrating independent software sub-functions to create more complete software systems. Software sub-functions are often not homegrown; instead, they are developed by unknown third-party organizations and reside in software marketplaces owned or controlled by others. Such software sub-functions raise legitimate concerns about quality, origin, functionality, security, and interoperability, to name a few. This article surveys key technical difficulties in confidently building systems from acquired software sub-functions by calling out the principal software supply chain actors.
Wang, Xiaofeng; Abrahamsson, Pekka
2014-01-01
For more than thirty years, it has been claimed that a way to improve software developers’ productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states—emotions and moods—deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint.
PMID:24688866
Whole earth modeling: developing and disseminating scientific software for computational geophysics.
NASA Astrophysics Data System (ADS)
Kellogg, L. H.
2016-12-01
Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and a need to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open-source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.
Software Quality Perceptions of Stakeholders Involved in the Software Development Process
ERIC Educational Resources Information Center
Padmanabhan, Priya
2013-01-01
Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…
Computer-aided software development process design
NASA Technical Reports Server (NTRS)
Lin, Chi Y.; Levary, Reuven R.
1989-01-01
The authors describe an intelligent tool designed to aid managers of software development projects in planning, managing, and controlling the development process of medium- to large-scale software projects. Its purpose is to reduce uncertainties in the budget, personnel, and schedule planning of software development projects. It is based on a dynamic model of the software development and maintenance life-cycle process. This dynamic process is composed of a number of time-varying, interacting developmental phases, each characterized by its intended functions and requirements. System dynamics is used as a modeling methodology. The resulting Software Life-Cycle Simulator (SLICS) and the hybrid expert simulation system of which it is a subsystem are described.
A high order approach to flight software development and testing
NASA Technical Reports Server (NTRS)
Steinbacher, J.
1981-01-01
The use of a software development facility is discussed as a means of producing a reliable and maintainable ECS software system, and as a means of providing efficient use of the ECS hardware test facility. Principles applied to software design are given, including modularity, abstraction, hiding, and uniformity. The general objectives of each phase of the software life cycle are also given, including testing, maintenance, code development, and requirement specifications. Software development facility tools are summarized, and tool deficiencies recognized in the code development and testing phases are considered. Due to limited lab resources, the functional simulation capabilities may be indispensable in the testing phase.
Software Reliability Analysis of NASA Space Flight Software: A Practical Experience
Sukhwani, Harish; Alonso, Javier; Trivedi, Kishor S.; Mcginnis, Issac
2017-01-01
In this paper, we present the software reliability analysis of the flight software of a recently launched space mission. For our analysis, we use the defect reports collected during the flight software development. We find that this software was developed in multiple releases, each release spanning all software life-cycle phases. We also find that the software releases were developed and tested on four different hardware platforms, ranging from off-the-shelf or emulation hardware to actual flight hardware. For releases that exhibit reliability growth or decay, we fit Software Reliability Growth Models (SRGM); otherwise we fit a distribution function. We find that most releases exhibit reliability growth, with Log-Logistic (NHPP) and S-Shaped (NHPP) as the best-fit SRGMs. For the releases that experience reliability decay, we investigate the causes. We find that such releases were the first software releases to be tested on a new hardware platform, and hence they encountered major hardware integration issues. Such releases also seem to have been developed under time pressure in order to start testing on the new hardware platform sooner. They exhibit poor reliability growth, and hence a high predicted failure rate. Other problems include hardware specification changes and delivery delays from vendors. Thus, our analysis provides critical insights and inputs to management for improving the software development process. As NASA has moved toward product line engineering for its flight software development, software for future space missions will be developed in a similar manner, and hence the analysis results for this mission can be considered a baseline for future flight software missions. PMID:29278255
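For readers unfamiliar with SRGM fitting, the delayed S-shaped NHPP mean value function, m(t) = a * (1 - (1 + b*t) * exp(-b*t)), can be fit to cumulative defect counts roughly as below; the data and the coarse grid search are toy stand-ins for the maximum-likelihood or least-squares fitting a real reliability analysis would use.

```python
# Illustrative sketch: fit the delayed S-shaped NHPP mean value function
# to cumulative defect counts, as done for releases showing reliability
# growth. Parameters a (total expected defects) and b (detection rate)
# are recovered here by a coarse grid search over toy data.
import math

def m_s_shaped(t, a, b):
    """Expected cumulative defects by time t under the S-shaped NHPP model."""
    return a * (1.0 - (1.0 + b * t) * math.exp(-b * t))

def fit_grid(times, counts, a_grid, b_grid):
    """Return (a, b) minimizing squared error over a coarse parameter grid."""
    best = None
    for a in a_grid:
        for b in b_grid:
            sse = sum((m_s_shaped(t, a, b) - c) ** 2
                      for t, c in zip(times, counts))
            if best is None or sse < best[0]:
                best = (sse, a, b)
    return best[1], best[2]

# Toy cumulative defect counts generated from a = 100, b = 0.5 (per week).
times = [2, 4, 6, 8, 10, 12]
counts = [m_s_shaped(t, 100, 0.5) for t in times]
a_hat, b_hat = fit_grid(times, counts,
                        a_grid=[80, 90, 100, 110],
                        b_grid=[0.3, 0.4, 0.5, 0.6])
```

The fitted model's derivative gives the failure intensity, which is what "high predicted failure rate" refers to for the decaying releases.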
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-02
... NUCLEAR REGULATORY COMMISSION [NRC-2012-0195] Developing Software Life Cycle Processes Used in... revised regulatory guide (RG), revision 1 of RG 1.173, ``Developing Software Life Cycle Processes for... Developing a Software Project Life Cycle Process,'' issued 2006, with the clarifications and exceptions as...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-02
... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... software developers can provide input on these technical specifications for the Common Formats Version 1.1... specifications, which provide direction to software developers that plan to implement the Common Formats...
IT Software Development and IT Operations Strategic Alignment: An Agile DevOps Model
ERIC Educational Resources Information Center
Hart, Michael
2017-01-01
Information Technology (IT) departments that include development and operations are essential to developing software that meets customer needs. DevOps is a term originally constructed from software development and IT operations. DevOps includes the collaboration of all stakeholders, such as software engineers and systems administrators, involved in the…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-25
... Software Developers on the Technical Specifications for Common Formats for Patient Safety Data Collection... designed as an interactive forum where PSOs and software developers can provide input on these technical... updated event descriptions, forms, and technical specifications for software developers. As an update to...
Trujillo, Elena María; Suárez, Daniel Enrique; Lema, Mariana; Londoño, Alicia
2015-02-01
In Colombia, the use of alcohol is one of the main risky behaviors engaged in by adolescents, given that alcohol is the principal drug of abuse in this age group. Understanding how adolescents learn about risk and behavior is important in developing effective prevention programs. Social Learning Theory underlines the importance of social interaction in the learning process. It suggests that learning can occur in three ways: through a live model, in which a person enacts the desired behavior; through verbal instruction, in which the desired behavior is described; and through symbolic learning, in which modeling occurs through the influence of the media. This study explores these three forms of learning in the perception of risk and behavior related to the use of alcohol in a group of students between 12 and 14 years of age in Bogotá, Colombia. This is a qualitative research study, part of a larger study exploring the social representations of risk and alcohol use in adolescents and their communities. The sample group included 160 students from two middle schools (7th and 8th graders) in Bogotá, Colombia. Six sessions of participant observation, 12 semi-structured interviews, and 12 focus group discussions were conducted for data collection. Data were analyzed using the ATLAS.ti software (V7.0) (ATLAS.ti Scientific Software Development GmbH, London, UK), and categories of analysis were developed using a framework analysis approach. Adolescents can identify several risks related to the use of alcohol, which, for the most part, appear to have been learned through verbal instruction. However, this risk recognition does not appear to correlate with their behavior. Parental modeling and messages conveyed by the media represent two other significant sources of learning that constantly contradict the messages relayed through verbal instruction and correlate to a greater extent with adolescent behavior.
The three different forms of learning described by Social Learning Theory play a significant role in the construction of risk perception and behavior in adolescents. This underlines the necessity of consciously evaluating how examples set by adults as well as the ideas expressed by the media influence adolescents' attitudes and behavior, ensuring that these do not directly contradict and ultimately obliterate the messages we are constantly trying to convey to this age group.
Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges
NASA Astrophysics Data System (ADS)
Maruping, Likoebe M.
Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.
Paiva, Carlos Eduardo; Siquelli, Felipe Augusto Ferreira; Zaia, Gabriela Rossi; de Andrade, Diocésio Alves Pinto; Borges, Marcos Aristoteles; Jácome, Alexandre A; Giroldo, Gisele Augusta Sousa Nascimento; Santos, Henrique Amorim; Hahn, Elizabeth A; Uemura, Gilberto; Paiva, Bianca Sakamoto Ribeiro
2016-01-01
To develop and validate a new multimedia instrument to measure health-related quality of life (HRQOL) in Portuguese-speaking patients with cancer. A mixed-methods study was conducted in a large Brazilian cancer hospital. The instrument was developed along the following sequential phases: identification of HRQOL issues through qualitative content analysis of individual interviews, evaluation of the most important items according to the patients, review of the literature, evaluation by an expert committee, and pretesting. In sequence, an exploratory factor analysis was conducted (pilot testing, n = 149) to reduce the number of items and to define domains and scores. The psychometric properties of the IQualiV-OG-21 were measured in a large multicentre Brazilian study (n = 323). Software containing multimedia resources was developed to facilitate self-administration of the IQualiV-OG-21; its feasibility and patients' preferences ("paper and pencil" vs. software) were further tested (n = 54). An exploratory factor analysis reduced the 30-item instrument to 21 items. The IQualiV-OG-21 was divided into 6 domains: emotional, physical, existential, interpersonal relationships, functional, and financial. The multicentre study confirmed that it was valid and reliable. The electronic multimedia instrument was easy to complete and acceptable to patients. Regarding preferences, 61.1% of patients preferred the electronic format over the paper-and-pencil format. The IQualiV-OG-21 is a new, valid, and reliable multimedia HRQOL instrument that is well understood, even by patients with low literacy skills, and can be answered quickly. It is a useful new tool that can be translated and tested in other cultures and languages.
An assessment of space shuttle flight software development processes
NASA Technical Reports Server (NTRS)
1993-01-01
In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.
Modernization of software quality assurance
NASA Technical Reports Server (NTRS)
Bhaumik, Gokul
1988-01-01
Customer satisfaction depends not only on functional performance but also on the quality characteristics of software products. An examination of this quality aspect of software products will provide a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects, which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper explains how this definition was developed and how it is used.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Waters, M.D.; Stack, H.F.; Garrett, N.E.
A graphic approach, termed a Genetic Activity Profile (GAP), was developed to display a matrix of data on the genetic and related effects of selected chemical agents. The profiles provide a visual overview of the quantitative (doses) and qualitative (test results) data for each chemical. Either the lowest effective dose or the highest ineffective dose is recorded for each agent and bioassay. Up to 200 different test systems are represented across the GAP. Bioassay systems are organized according to the phylogeny of the test organisms and the end points of genetic activity. The methodology for producing and evaluating genetic activity profiles was developed in collaboration with the International Agency for Research on Cancer (IARC). Data on individual chemicals were compiled by IARC and by the US Environmental Protection Agency (EPA). Data are available on 343 compounds selected from volumes 1-53 of the IARC Monographs and on 115 compounds identified as Superfund Priority Substances. Software to display the GAPs on an IBM-compatible personal computer is available from the authors. Structurally similar compounds frequently display qualitatively and quantitatively similar profiles of genetic activity. Through examination of the patterns of GAPs of pairs and groups of chemicals, it is possible to make more informed decisions regarding the selection of test batteries to be used in the evaluation of chemical analogs. GAPs provided useful data for the development of weight-of-evidence hazard ranking schemes. Also, some knowledge of the potential genetic activity of complex environmental mixtures may be gained from an assessment of the genetic activity profiles of component chemicals. The fundamental techniques and computer programs devised for the GAP database may be used to develop similar databases in other disciplines. 36 refs., 2 figs.
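The dose bookkeeping behind a GAP matrix can be sketched as follows; the chemical names, test-system codes, and doses are hypothetical, and the update rules are a plausible reading of "lowest effective dose or highest ineffective dose", not the authors' actual software.

```python
# Hedged sketch of GAP-style bookkeeping: for each (chemical, bioassay)
# pair, keep the lowest effective dose for positive results and the
# highest ineffective dose for negative results. All names/doses are toy.

def update_gap(gap, agent, assay, dose, positive):
    key = (agent, assay)
    entry = gap.get(key)
    if entry is None:
        gap[key] = (dose, positive)
        return
    old_dose, old_positive = entry
    if positive and not old_positive:
        gap[key] = (dose, True)                  # a positive supersedes
    elif positive and old_positive:
        gap[key] = (min(dose, old_dose), True)   # lowest effective dose
    elif not positive and not old_positive:
        gap[key] = (max(dose, old_dose), False)  # highest ineffective dose
    # a negative result never overrides an existing positive

gap = {}
update_gap(gap, "chem-X", "SA9", 50.0, True)
update_gap(gap, "chem-X", "SA9", 10.0, True)    # lower effective dose wins
update_gap(gap, "chem-X", "SCE", 200.0, False)
update_gap(gap, "chem-X", "SCE", 500.0, False)  # higher ineffective dose wins
```

Plotting each stored dose as a bar above or below a baseline, ordered by test-system phylogeny, yields the profile display described in the abstract.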
Macedo, Nayana Damiani; Buzin, Aline Rodrigues; de Araujo, Isabela Bastos Binotti Abreu; Nogueira, Breno Valentim; de Andrade, Tadeu Uggere; Endringer, Denise Coutinho; Lenz, Dominik
2017-02-01
The current study proposes an automated machine learning approach for the quantification of cells in cell death pathways according to DNA fragmentation. A total of 17 images of kidney histological slide samples from male Wistar rats were used. The slides were photographed using an Axio Zeiss Vert.A1 microscope with a 40x objective lens coupled with an Axio Cam MRC Zeiss camera and Zen 2012 software. The images were analyzed using CellProfiler (version 2.1.1) and CellProfiler Analyst open-source software. Out of the 10,378 objects, 4970 (47.9%) were identified as TUNEL positive, and 5408 (52.1%) were identified as TUNEL negative. On average, the sensitivity and specificity values of the machine learning approach were 0.80 and 0.77, respectively. Image cytometry provides a quantitative analytical alternative to the more traditional qualitative methods more commonly used in studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
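Sensitivity and specificity for a binary classifier of this kind follow directly from a confusion matrix. The sketch below uses generic counts chosen to reproduce the reported averages, not the study's raw per-image data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    # Sensitivity: fraction of truly positive objects (e.g., TUNEL-positive)
    # that the classifier recovers; specificity: fraction of true negatives
    # correctly left unlabeled.
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Illustrative counts only: 80 of 100 positives found, 77 of 100 negatives kept.
sens, spec = sensitivity_specificity(tp=80, fn=20, tn=77, fp=23)
```

Here `sens` is 0.80 and `spec` is 0.77, matching the averages quoted in the abstract.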
Beamline Insertions Manager at Jefferson Lab
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Michael C.
2015-09-01
The beam viewer system at Jefferson Lab provides operators and beam physicists with qualitative and quantitative information on the transverse electron beam properties. There are over 140 beam viewers installed on the 12 GeV CEBAF accelerator. This paper describes an upgrade consisting of replacing the EPICS-based system tasked with managing all viewers with a mixed system utilizing EPICS and high-level software. Most devices, particularly the beam viewers, cannot be safely inserted into the beam line during high-current beam operations. Software is partly responsible for protecting the machine from untimely insertions. The multiplicity of beam-blocking and beam-vulnerable devices motivates us to try a data-driven approach. The beamline insertions application components are centrally managed and configured through an object-oriented software framework created for this purpose. A rules-based engine tracks the configuration and status of every device, along with the beam status of the machine segment containing the device. The application uses this information to decide which device actions are allowed at any given time.
University Approaches to Software Copyright and Licensure Policies.
ERIC Educational Resources Information Center
Hawkins, Brian L.
Issues of copyright policy and software licensure at Drexel University that were developed during the introduction of a new microcomputing program are discussed. Channels for software distribution include: individual purchase of externally-produced software, distribution of internally-developed software, institutional licensure, and "read…
Modular Rocket Engine Control Software (MRECS)
NASA Technical Reports Server (NTRS)
Tarrant, Charlie; Crook, Jerry
1997-01-01
The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state of the art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for a generic, advanced engine control system that will result in lower software maintenance (operations) costs. It effectively accommodates software requirements changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives and benefits of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, software reuse, and reduced software reverification time after software changes. Currently, the program is focused on supporting MSFC in accomplishing a Space Shuttle Main Engine (SSME) hot-fire test at Stennis Space Center and the Low Cost Boost Technology (LCBT) Program.
NA-42 TI Shared Software Component Library FY2011 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knudson, Christa K.; Rutz, Frederick C.; Dorow, Kevin E.
The NA-42 TI program initiated an effort in FY2010 to standardize its software development efforts with the long term goal of migrating toward a software management approach that will allow for the sharing and reuse of code developed within the TI program, improve integration, ensure a level of software documentation, and reduce development costs. The Pacific Northwest National Laboratory (PNNL) has been tasked with two activities that support this mission. PNNL has been tasked with the identification, selection, and implementation of a Shared Software Component Library. The intent of the library is to provide a common repository that is accessible by all authorized NA-42 software development teams. The repository facilitates software reuse through a searchable and easy to use web based interface. As software is submitted to the repository, the component registration process captures meta-data and provides version control for compiled libraries, documentation, and source code. This meta-data is then available for retrieval and review as part of library search results. In FY2010, PNNL and staff from the Remote Sensing Laboratory (RSL) teamed up to develop a software application with the goal of replacing the aging Aerial Measuring System (AMS). The application under development includes an Advanced Visualization and Integration of Data (AVID) framework and associated AMS modules. Throughout development, PNNL and RSL have utilized a common AMS code repository for collaborative code development. The AMS repository is hosted by PNNL, is restricted to the project development team, is accessed via two different geographic locations and continues to be used. The knowledge gained from the collaboration and hosting of this repository in conjunction with PNNL software development and systems engineering capabilities were used in the selection of a package to be used in the implementation of the software component library on behalf of NA-42 TI.
The second task managed by PNNL is the development and continued maintenance of the NA-42 TI Software Development Questionnaire. This questionnaire is intended to help software development teams working under NA-42 TI in documenting their development activities. When sufficiently completed, the questionnaire illustrates that the software development activities recorded incorporate significant aspects of the software engineering lifecycle. The questionnaire template is updated as comments are received from NA-42 and/or its development teams and revised versions distributed to those using the questionnaire. PNNL also maintains a list of questionnaire recipients. The blank questionnaire template, the AVID and AMS software being developed, and the completed AVID AMS specific questionnaire are being used as the initial content to be established in the TI Component Library. This report summarizes the approach taken to identify requirements, search for and evaluate technologies, and the approach taken for installation of the software needed to host the component library. Additionally, it defines the process by which users request access for the contribution and retrieval of library content.
Duan, Fang; Liu, Yuhong; Chen, Xiang; Congdon, Nathan; Zhang, Jian; Chen, Qianyun; Chen, Lingling; Chen, Xi; Zhang, Xiulan; Yu, Chengpu; Liu, Yizhi
2017-01-01
Objective To identify the reasons for low adherence among patients with diabetic retinopathy (DR) in southern China using a qualitative method. Methods Exploratory in-depth interviews were conducted with 27 diabetic patients with proliferative diabetic retinopathy who required vitrectomy surgery at Zhongshan Ophthalmic Centre, Sun Yat-sen University, from March to August 2015. Qualitative data analysis and research software (ATLAS.ti 7) was used for data processing and analysis. Results Factors influencing timely visits included lack of DR-related knowledge; fear and worries about insulin; interactions between patients and society, combined with the complexity of emotions and social culture; and the economic burden of treatment. Conclusions Although the reasons for low adherence involved social, emotional, cultural and economic factors, the key issue was the lack of awareness and knowledge of DR. Our findings have several practical implications for health policymakers and programme planners in China. PMID:28348188
As EPA’s environmental research expands into new areas that involve the development of software, quality assurance concepts and procedures that were originally developed for environmental data collection may not be appropriate. Fortunately, software quality assurance is a ...
Precise Documentation: The Key to Better Software
NASA Astrophysics Data System (ADS)
Parnas, David Lorge
The prime cause of the sorry “state of the art” in software development is our failure to produce good design documentation. Poor documentation is the cause of many errors and reduces efficiency in every phase of a software product's development and use. Most software developers believe that “documentation” refers to a collection of wordy, unstructured, introductory descriptions, thousands of pages that nobody wanted to write and nobody trusts. In contrast, engineers in more traditional disciplines think of precise blueprints, circuit diagrams, and mathematical specifications of component properties. Software developers do not know how to produce precise documents for software. Software developers also think that documentation is something written after the software has been developed. In other fields of engineering, much of the documentation is written before and during development; it represents forethought, not afterthought. Among the benefits of better documentation would be: easier reuse of old designs, better communication about requirements, more useful design reviews, easier integration of separately written modules, more effective code inspection, more effective testing, and more efficient corrections and improvements. This paper explains how to produce and use precise software documentation and illustrates the methods with several examples.
Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project
NASA Technical Reports Server (NTRS)
Wallick, Michael N.; Mittman, David S.; Shams, Khawaja S.; Bachmann, Andrew G.; Ludowise, Melissa
2013-01-01
This software simplifies the process of having to set up an Eclipse IDE programming environment for the members of the cross-NASA center project, Ensemble. It achieves this by assembling all the necessary add-ons and custom tools/preferences. This software is unique in that it allows developers in the Ensemble Project (approximately 20 to 40 at any time) across multiple NASA centers to set up a development environment almost instantly and work on Ensemble software. The software automatically has the source code repositories and other vital information and settings included. The Eclipse IDE is an open-source development framework. The NASA (Ensemble-specific) version of the software includes Ensemble-specific plug-ins as well as settings for the Ensemble project. This software saves developers the time and hassle of setting up a programming environment, making sure that everything is set up in the correct manner for Ensemble development. Existing software (i.e., standard Eclipse) requires an intensive setup process that is both time-consuming and error prone. This software is built once by a single user and tested, allowing other developers to simply download and use the software.
Shuttle avionics software trials, tribulations and success
NASA Technical Reports Server (NTRS)
Henderson, O. L.
1985-01-01
The early problems and the solutions developed to provide the required quality software needed to support the space shuttle engine development program are described. The decision to use a programmable digital control system on the space shuttle engine was primarily based upon the need for a flexible control system capable of supporting the total engine mission on a large complex pump fed engine. The mission definition included all control phases from ground checkout through post shutdown propellant dumping. The flexibility of the controller through reprogrammable software allowed the system to respond to the technical challenges and innovation required to develop both the engine and controller hardware. This same flexibility, however, placed a severe strain on the capability of the software development and verification organization. The overall development program required that the software facility accommodate significant growth in both the software requirements and the number of software packages delivered. This challenge was met by reorganization and evolution in the process of developing and verifying software.
Software-Engineering Process Simulation (SEPS) model
NASA Technical Reports Server (NTRS)
Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.
1992-01-01
The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform post-mortem assessments.
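The system-dynamics idea behind models of this kind is that work flows between stocks (backlog, completed work) at rates governed by feedback, such as rework re-entering the backlog. A minimal sketch of that flavor of model (not the actual SEPS model; the rate and error parameters are illustrative assumptions):

```python
# Minimal system-dynamics sketch (not the actual SEPS model): tasks flow from a
# backlog into completed work at a rate limited by staff productivity, while a
# fixed fraction of finished work is found defective and fed back as rework.
def simulate(backlog, staff, productivity=1.0, error_rate=0.1, dt=1.0, steps=52):
    completed = 0.0
    for _ in range(steps):
        rate = min(backlog / dt, staff * productivity)  # cannot do more than exists
        done = rate * dt
        backlog -= done
        completed += done * (1 - error_rate)
        backlog += done * error_rate  # rework feedback loop into the backlog
    return completed, backlog
```

Because every processed task either completes or returns to the backlog, total work is conserved at each step; the feedback loop simply stretches the schedule.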
Insights into software development in Japan
NASA Technical Reports Server (NTRS)
Duvall, Lorraine M.
1992-01-01
The interdependence of the U.S.-Japanese economies makes it imperative that we in the United States understand how business and technology developments take place in Japan. We can gain insight into these developments in software engineering by studying the context in which Japanese software is developed, the practices that are used, the problems encountered, the setting surrounding these problems, and the resolution of these problems. Context includes the technological and sociological characteristics of the software development environment, the software processes applied, personnel involved in the development process, and the corporate and social culture surrounding the development. Presented in this paper is a summary of results of a study that addresses these issues. Data for this study was collected during a three month visit to Japan where the author interviewed 20 software managers representing nine companies involved in developing software in Japan. These data are compared to similar data from the United States in which 12 managers from five companies were interviewed.
Software for Automated Image-to-Image Co-registration
NASA Technical Reports Server (NTRS)
Benkelman, Cody A.; Hughes, Heidi
2007-01-01
The project objectives are: a) Develop software to fine-tune image-to-image co-registration, presuming images are orthorectified prior to input; b) Create a reusable software development kit (SDK) to enable incorporation of these tools into other software; c) Provide automated testing for quantitative analysis; and d) Develop software that applies multiple techniques to achieve subpixel precision in the co-registration of image pairs.
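One common route to automated translation estimation between an image pair is phase correlation. The sketch below recovers an integer-pixel circular shift with NumPy; subpixel refinement (e.g., fitting a surface around the correlation peak) would follow the same pattern. This is an illustration of the general technique, not the tool's actual algorithm:

```python
import numpy as np

def estimate_shift(ref, moving):
    """Estimate the (row, col) shift d such that moving ≈ np.roll(ref, d)."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(moving)
    cps = np.conj(F1) * F2
    cps /= np.abs(cps) + 1e-12           # normalized cross-power spectrum
    corr = np.fft.ifft2(cps).real        # delta-like peak at the shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = [float(p) for p in peak]
    for i, n in enumerate(corr.shape):   # wrap large offsets to negative shifts
        if shift[i] > n // 2:
            shift[i] -= n
    return shift
```

For an exact circular shift of a well-textured image the correlation peak is sharp, so the integer estimate is exact; real orthoimage pairs need windowing and subpixel peak fitting on top of this.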
Measuring the software process and product: Lessons learned in the SEL
NASA Technical Reports Server (NTRS)
Basili, V. R.
1985-01-01
The software development process and product can and should be measured. The software measurement process at the Software Engineering Laboratory (SEL) has taught a major lesson: develop a goal-driven paradigm (also characterized as a goal/question/metric paradigm) for data collection. Project analysis under this paradigm leads to a design for evaluating and improving the methodology of software development and maintenance.
NASA Technical Reports Server (NTRS)
1976-01-01
Only a few efforts are currently underway to develop an adequate technology base for the various themes. Particular attention must be given to software commonality and evolutionary capability, to increased system integrity and autonomy, and to improved communications among the program users, the program developers, and the programs themselves. There is a need for a quantum improvement in software development methods and for increasing the awareness of software by all concerned. Major thrusts identified include: (1) data and systems management; (2) software technology for autonomous systems; (3) technology and methods for improving the software development process; (4) advances related to systems of software elements, including their architecture, their attributes as systems, and their interfaces with users and other systems; and (5) applications of software, including both the basic algorithms used in a number of applications and the software specific to a particular theme or discipline area. The impact of each theme on software is assessed.
A design and implementation methodology for diagnostic systems
NASA Technical Reports Server (NTRS)
Williams, Linda J. F.
1988-01-01
A methodology for design and implementation of diagnostic systems is presented. Also discussed are the advantages of embedding a diagnostic system in a host system environment. The methodology utilizes an architecture for diagnostic system development that is hierarchical and makes use of object-oriented representation techniques. Additionally, qualitative models are used to describe the host system components and their behavior. The methodology architecture includes a diagnostic engine that utilizes heuristic knowledge to control the sequence of diagnostic reasoning. The methodology provides an integrated approach to development of diagnostic system requirements that is more rigorous than standard systems engineering techniques. The advantages of using this methodology during various life cycle phases of the host systems (e.g., National Aerospace Plane (NASP)) include: the capability to analyze diagnostic instrumentation requirements during the host system design phase, a ready software architecture for implementation of diagnostics in the host system, and the opportunity to analyze instrumentation for failure coverage in safety critical host system operations.
Development of a Mobile User Interface for Image-based Dietary Assessment
Kim, SungYe; Schap, TusaRebecca; Bosch, Marc; Maciejewski, Ross; Delp, Edward J.; Ebert, David S.; Boushey, Carol J.
2011-01-01
In this paper, we present a mobile user interface for image-based dietary assessment. The mobile user interface provides a front end to client-server image recognition and portion estimation software. In the client-server configuration, the user interactively records a series of food images using a built-in camera on the mobile device. Images are sent from the mobile device to the server, and the calorie content of the meal is estimated. In this paper, we describe and discuss the design and development of our mobile user interface features. We discuss the design concepts, through initial ideas and implementations. For each concept, we discuss qualitative user feedback from participants using the mobile client application. We then discuss future designs, including work on design considerations for the mobile application to allow the user to interactively correct errors in the automatic processing while reducing the user burden associated with classical pen-and-paper dietary records. PMID:24455755
Preparation for an online asynchronous university doctoral course. Lessons learned.
Milstead, J A; Nelson, R
1998-01-01
This article addresses the development of the initial course in the first completely online doctoral program in nursing. Synchronous and asynchronous methods of distance education were assessed. Planning focused at the university, school, and course levels. University planning involved the technical infrastructure, registration, student services, and library services. School planning examined administrative commitment and faculty commitment and willingness. Course planning focused on marketing, precourse information, time frame, modular design, planned interaction, and professor availability and support. Implementation issues centered on getting students connected, learning the software, changing instructional methods, and managing chats. Traditional methods of evaluating student learning and course evaluation were supplemented with the development of qualitative and quantitative tools to gather data for making administrative decisions. The Dean and faculty agreed that the internet was an effective method of delivering content in the initial Health Policy course. The Dean and faculty agreed to continue the PhD program online for one cohort and continue to evaluate student progress and faculty and student satisfaction.
Hand hygiene among healthcare workers: A qualitative meta summary using the GRADE-CERQual process
Chatfield, Sheryl L.; DeBois, Kristen; Nolan, Rachael; Crawford, Hannah; Hallam, Jeffrey S.
2016-01-01
Background: Hand hygiene is considered an effective and potentially modifiable infection control behaviour among healthcare workers (HCW). Several meta-studies have been published that compare quantitatively expressed findings, but limited efforts have been made to synthesise qualitative research. Objectives: This paper provides the first report of integrated findings from qualitative research reports on hand hygiene compliance among HCW worldwide that employs the GRADE-CERQual process of quality assessment. Methods: We conducted database searches and identified 36 reports in which authors conducted qualitative or mixed methods research on hand hygiene compliance among HCW. We used Dedoose analysis software to facilitate extraction of relevant excerpts. We applied the GRADE-CERQual process to describe relative confidence as high, moderate or low for nine aggregate findings. Findings: Highest confidence findings included that HCW believe they have access to adequate training, and that management and resource support are sometimes lacking. Individual, subjective criteria also influence hand hygiene. Discussion: These results suggest the need for further investigation into healthcare cultures that are perceived as supportive for infection control. Surveillance processes have potential, especially if information is perceived by HCW as timely and relevant. PMID:28989515
Hotel housekeeping work influences on hypertension management.
Sanon, Marie-Anne
2013-12-01
Characteristics of hotel housekeeping work increase the risk for hypertension development. Little is known about the influences of such work on hypertension management. For this qualitative study, 27 Haitian immigrant hotel housekeepers from Miami-Dade County, FL were interviewed. Interview transcripts were analyzed with the assistance of the Atlas.ti software for code and theme identification. Influences of hotel housekeeping work on hypertension management arose both at the individual and system levels. Factors at the individual level included co-worker dynamics and maintenance of transmigrant life. Factors at the system level included supervisory support, workload, work pace, and work hiring practices. No positive influences were reported for workload and hiring practices. Workplace interventions may be beneficial for effective hypertension management among hotel housekeepers. These work influences must be considered when determining effective methods for hypertension management among hotel housekeepers. © 2013 Wiley Periodicals, Inc.
Learning Human Aspects of Collaborative Software Development
ERIC Educational Resources Information Center
Hadar, Irit; Sherman, Sofia; Hazzan, Orit
2008-01-01
Collaboration has become increasingly widespread in the software industry as systems have become larger and more complex, adding human complexity to the technological complexity already involved in developing software systems. To deal with this complexity, human-centric software development methods, such as Extreme Programming and other agile…
Estimating Software-Development Costs With Greater Accuracy
NASA Technical Reports Server (NTRS)
Baker, Dan; Hihn, Jairus; Lum, Karen
2008-01-01
COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development-effort-estimation techniques are used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
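The core idea of calibrating an effort model against historical data by exhaustive parameter search can be sketched briefly. This is a toy COCOMO-style grid search, not the actual '2cee' methodology; the project data and parameter ranges are invented for illustration:

```python
import itertools
import numpy as np

# Hypothetical historical projects: (size in KSLOC, actual effort in person-months).
history = [(10, 25.0), (50, 160.0), (100, 350.0), (200, 780.0)]

def effort(ksloc, a, b):
    # COCOMO-style form: effort = a * size**b
    return a * ksloc ** b

def calibrate(history):
    """Exhaustive search over (a, b) minimizing mean relative error (MRE).

    Returns (a, b, mre); the residual MRE doubles as a crude estimate of the
    model's estimation error on this data set.
    """
    best = None
    for a, b in itertools.product(np.linspace(1.0, 4.0, 61),
                                  np.linspace(0.9, 1.3, 41)):
        errs = [abs(effort(s, a, b) - e) / e for s, e in history]
        mre = sum(errs) / len(errs)
        if best is None or mre < best[2]:
            best = (a, b, mre)
    return best
```

Reporting the residual error alongside the estimate mirrors objective (1) above: the calibrated model returns both a cost prediction and a measure of its own uncertainty.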
Barriers and facilitators to Electronic Medical Record (EMR) use in an urban slum.
Jawhari, Badeia; Keenan, Louanne; Zakus, David; Ludwick, Dave; Isaac, Abraam; Saleh, Abdullah; Hayward, Robert
2016-10-01
Rapid urbanization has led to the growth of urban slums and increased healthcare burdens for vulnerable populations. Electronic Medical Records (EMRs) have the potential to improve continuity of care for slum residents, but their implementation is complicated by technical and non-technical limitations. This study sought practical insights about facilitators and barriers to EMR implementation in urban slum environments. Descriptive qualitative method was used to explore staff perceptions about a recent open-source EMR deployment in two primary care clinics in Kibera, Nairobi. Participants were interviewed using open-ended, semi-structured questions. Content analysis was used when exploring transcribed data. Three major themes - systems, software, and social considerations - emerged from content analysis, with sustainability concerns prevailing. Although participants reported many systems (e.g., power, network, Internet, hardware, interoperability) and software (e.g., data integrity, confidentiality, function) challenges, social factors (e.g., identity management, training, use incentives) appeared the most important impediments to sustainability. These findings are consistent with what others have reported, especially the importance of practical barriers to EMR deployments in resource-constrained settings. Other findings contribute unique insights about social determinants of EMR impact in slum settings, including the challenge of multiple-identity management and development of meaningful incentives to staff compliance. This study exposes front-line experiences with opportunities and shortcomings of EMR implementations in urban slum primary care clinics. Although the promise is great, there are a number of unique system, software and social challenges that EMR advocates should address before expecting sustainable EMR use in resource-constrained settings. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
Integrating systems biology models and biomedical ontologies
2011-01-01
Background Systems biology is an approach to biology that emphasizes the structure and dynamic behavior of biological systems and the interactions that occur within them. To succeed, systems biology crucially depends on the accessibility and integration of data across domains and levels of granularity. Biomedical ontologies were developed to facilitate such an integration of data and are often used to annotate biosimulation models in systems biology. Results We provide a framework to integrate representations of in silico systems biology with those of in vivo biology as described by biomedical ontologies and demonstrate this framework using the Systems Biology Markup Language. We developed the SBML Harvester software that automatically converts annotated SBML models into OWL and we apply our software to those biosimulation models that are contained in the BioModels Database. We utilize the resulting knowledge base for complex biological queries that can bridge levels of granularity, verify models based on the biological phenomenon they represent and provide a means to establish a basic qualitative layer on which to express the semantics of biosimulation models. Conclusions We establish an information flow between biomedical ontologies and biosimulation models and we demonstrate that the integration of annotated biosimulation models and biomedical ontologies enables the verification of models as well as expressive queries. Establishing a bi-directional information flow between systems biology and biomedical ontologies has the potential to enable large-scale analyses of biological systems that span levels of granularity from molecules to organisms. PMID:21835028
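The core of such a conversion is mapping each annotated model element to an assertion linking it to the ontology class its annotation names. A minimal sketch of that mapping, emitting plain triples rather than the SBML Harvester's actual OWL output; the model structure, URIs, and predicate names here are assumptions for illustration:

```python
# Hedged sketch: map MIRIAM-style annotations on model species to class
# assertions expressed as subject/predicate/object triples. Not the actual
# SBML Harvester; identifiers below are illustrative.
model = {
    "species": [
        {"id": "glc", "annotation": "http://identifiers.org/chebi/CHEBI:17234"},
        {"id": "atp", "annotation": "http://identifiers.org/chebi/CHEBI:15422"},
    ]
}

def to_triples(model, model_uri="http://example.org/model1"):
    triples = []
    for sp in model["species"]:
        subject = f"{model_uri}#{sp['id']}"
        # rdf:type assertion tying the model entity to its ontology class.
        triples.append((subject, "rdf:type", sp["annotation"]))
        # Link the entity back to its containing model for cross-model queries.
        triples.append((subject, "partOfModel", model_uri))
    return triples
```

Once model entities are typed against ontology classes this way, ontology-level queries (e.g., "all models containing a carbohydrate") reduce to graph queries over the combined knowledge base.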
An Experimental Characterization System for Deep Ultra-Violet (UV) Photoresists
NASA Astrophysics Data System (ADS)
Drako, Dean M.; Partlo, William N.; Oldham, William G.; Neureuther, Andrew R.
1989-08-01
A versatile system designed specifically for experimental automated photoresist characterization has been constructed utilizing an excimer laser source for exposure at 248nm. The system was assembled, as much as possible, from commercially available components in order to facilitate its replication. The software and hardware are completely documented in a University of California-Berkeley Engineering Research Lab Memo. An IBM PC-AT compatible computer controls an excimer laser, operates a Fourier Transform Infrared (FTIR) Spectrometer, measures and records the energy of each laser pulse (incident, reflected, and transmitted), opens and closes shutters, and operates two linear stages for sample movement. All operations (except FTIR data reduction) are managed by a control program written in the "C" language. The system is capable of measuring total exposure dose, performing bleaching measurements, creating and recording exposure pulse sequences, and generating exposure patterns suitable for multiple channel monitoring of the development. The total exposure energy, energy per pulse, and pulse rate are selectable over a wide range. The system contains an in-situ Fourier Transform Infrared Spectrometer for qualitative and quantitative analysis of the photoresist baking and exposure processes (baking is not done in-situ). FTIR may be performed in transmission or reflection. The FTIR data will form the basis of comprehensive multi-state resist models. The system's versatility facilitates the development of new automated and repeatable experiments. Simple controlling software, utilizing the provided interface sub-routines, can be written to control new experiments and collect data.
A novel navigation system for maxillary positioning in orthognathic surgery: Preclinical evaluation.
Lutz, Jean-Christophe; Nicolau, Stéphane; Agnus, Vincent; Bodin, Frédéric; Wilk, Astrid; Bruant-Rodier, Catherine; Rémond, Yves; Soler, Luc
2015-11-01
Appropriate positioning of the maxilla is critical in orthognathic surgery. As opposed to splint-based positioning, navigation systems are versatile and appropriate in assessing the vertical dimension. Bulk and disruption to the line of sight are drawbacks of optical navigation systems. Our aim was to develop and assess a novel navigation system based on electromagnetic tracking of the maxilla, including real-time registration of head movements. Since the software interface has proved to greatly influence the accuracy of the procedure, we purposely designed and evaluated an original, user-friendly interface. A sample of 12 surgeons had to navigate the phantom osteotomized maxilla to eight given target positions using the software we have developed. Time and accuracy (translational error and angular error) were compared between a conventional and a navigated session. A questionnaire provided qualitative evaluation. Our system allows a reduction in variability of time and accuracy among different operators. Accuracy was improved in all surgeons (mean translational error difference = 1.11 mm, mean angular error difference = 1.32°). Operative time was decreased in trainees. Therefore, they would benefit from such a system that could also serve for educational purposes. The majority of surgeons who strongly agreed that such a navigation system would prove very helpful in complex deformities also stated that it would be helpful in everyday orthognathic procedures. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
Workstation-Based Avionics Simulator to Support Mars Science Laboratory Flight Software Development
NASA Technical Reports Server (NTRS)
Henriquez, David; Canham, Timothy; Chang, Johnny T.; McMahon, Elihu
2008-01-01
The Mars Science Laboratory developed the WorkStation TestSet (WSTS) to support flight software development. The WSTS is a non-real-time flight avionics simulator that is designed to be completely software-based and run on a workstation-class Linux PC. This provides flight software developers with their own virtual avionics testbed and allows device-level and functional software testing when hardware testbeds are either not yet available or have limited availability. The WSTS has successfully off-loaded many flight software development activities from the project testbeds. At the time of writing, the WSTS has averaged an order of magnitude more usage than the project's hardware testbeds.
NASA Technical Reports Server (NTRS)
Gaffney, J. E., Jr.; Judge, R. W.
1981-01-01
A model of a software development process is described. The software development process is seen to consist of a sequence of activities, such as 'program design' and 'module development' (or coding). A manpower estimate is made by multiplying code size by the rates (man months per thousand lines of code) for each of the activities relevant to the particular case of interest and summing up the results. The effect of four objectively determinable factors (organization, software product type, computer type, and code type) on productivity values for each of nine principal software development activities was assessed. Four factors were identified which account for 39% of the observed productivity variation.
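The activity-based estimate described in this abstract (code size multiplied by a per-activity rate, then summed) can be sketched in a few lines. The activity names and rate values below are illustrative assumptions, not the published figures from Gaffney and Judge's model.

```python
# Activity-based effort estimate: total person-months =
# sum over activities of size (KLOC) * rate (person-months per KLOC).
# The activities and rates here are hypothetical placeholders.
ACTIVITY_RATES = {
    "program design": 0.8,
    "module development": 1.5,  # i.e., coding
    "integration and test": 1.0,
    "documentation": 0.4,
}

def estimate_effort(kloc: float, rates: dict[str, float]) -> float:
    """Return the total person-month estimate for a product of
    `kloc` thousand lines of code, summed across all activities."""
    return sum(kloc * rate for rate in rates.values())

if __name__ == "__main__":
    # A 20 KLOC product under the illustrative rates above.
    total = estimate_effort(20.0, ACTIVITY_RATES)
    print(f"Estimated effort: {total:.1f} person-months")
```

Because each activity contributes its own rate, the model lets an estimator include only the activities relevant to a given project, as the abstract describes.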
COSTMODL: An automated software development cost estimation tool
NASA Technical Reports Server (NTRS)
Roush, George B.
1991-01-01
The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, in both the public and private sectors. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1986-01-01
Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increase with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process, i.e., the SAGA project, comprising the research necessary to design and build a software development environment which automates the software engineering process. Progress under SAGA is described.
Factors Influencing Fast-Food Consumption Among Adolescents in Tehran: A Qualitative Study
Askari Majabadi, Hesamedin; Solhi, Mahnaz; Montazeri, Ali; Shojaeizadeh, Davoud; Nejat, Saharnaz; Khalajabadi Farahani, Farideh; Djazayeri, Abolghasem
2016-01-01
Background: The consumption of different types of fast food is increasingly growing in all parts of the world, in both developed and developing countries. Because of the changes and transitions in the lifestyle and dietary habits of people, an increasing number of people from different age groups, particularly adolescents and young adults, are inclined toward consuming fast food. Objectives: The objective of this study was to investigate the factors influencing fast-food consumption among adolescents in Tehran, Iran. Patients and Methods: The present qualitative study was conducted in 2012 - 2013 in Tehran, the capital of Iran. To achieve the objective of this study, 42 adolescents were enrolled through a purposive sampling method, and the required data was collected via individual semi-structured in-depth interviews. Data collection and analysis were carried out simultaneously, and the collected data was analyzed via thematic content analysis using MAXQDA 10 software. Results: After coding the transcribed interviews, the findings were categorized into three main themes as follows: personal views, social factors, and family factors. Each theme included several categories and subcategories, and the coded sentences and phrases were placed under each category and subcategory. Conclusions: The results of this study showed that the number of factors promoting fast-food consumption appeared to be greater than the number of inhibiting factors and that diverse factors at the individual and social levels influenced fast-food consumption among adolescents. PMID:27247793
Nadir, Maha; Hamza, Muhammad; Mehmood, Nadir
2018-01-01
The biopsychosocial (BPS) model has been a mainstay of the ideal practice of modern medicine. It is credited with improving patient care, compliance, and satisfaction and with reducing doctor-patient conflict. The study aimed to understand the importance given to the BPS model during routine doctor-patient interactions in public sector hospitals of a developing country where health resources are limited. The study was conducted in Rawalpindi, Pakistan, using a qualitative design. Structured interviews were conducted with 44 patients from the surgical and medical units of Benazir Bhutto Hospital and Holy Family Hospital. The questions were formulated based on patient-centered interviewing methods by reviewing the literature on the BPS model. The analysis was done thematically using the qualitative data analysis software NVivo 11. The study revealed four emerging themes: (1) lack of doctor-patient rapport; (2) utilization of a paternalistic approach during treatment; (3) utilization of a reductionist biomedical approach during treatment; and (4) patients' concern with their improvement in health and the doctor's demeanor. The study highlights the fact that the BPS model is not given considerable importance while taking routine medical history. The process remains doctor centered and paternalistic. However, patients are more concerned with their improvement in health than with whether or not they are being provided informational care. Sequential studies will have to be conducted to determine whether this significantly affects patient care and compliance and whether the BPS model is workable in the healthcare systems of the third world.
Student midwives and paramedic students' experiences of shared learning in pre-hospital childbirth.
Feltham, Christina; Foster, Julie; Davidson, Tom; Ralph, Stewart
2016-06-01
To explore the experiences of midwifery and paramedic students undertaking interprofessional learning. A one day interprofessional learning workshop incorporating peer assisted learning for undergraduate pre-registration midwifery and paramedic students was developed based on collaborative practice theory and simulation based learning. Twenty-five student midwives and thirty-one paramedic students participated in one of two identical workshops conducted over separate days. Videoed focus group sessions were held following the workshop sessions in order to obtain qualitative data around student experience. Qualitative data analysis software (ATLAS.ti) was used to collate the transcriptions from the focus group sessions and the video recordings were scrutinised. Thematic analysis was adopted. Four main themes were identified around the understanding of each other's roles and responsibilities, the value of interprofessional learning, organisation and future learning. Students appeared to benefit from a variety of learning opportunities including interprofessional learning and peer assisted learning through the adoption of both formal and informal teaching methods, including simulation based learning. A positive regard for each other's profession including professional practice, professional governing bodies, professional codes and scope of practice was apparent. Students expressed a desire to undertake similar workshops with other professional students. Interprofessional learning workshops were found to be a positive experience for the students involved. Consideration needs to be given to developing interprofessional learning with other student groups aligned with midwifery at appropriate times in relation to stage of education. Copyright © 2016 Elsevier Ltd. All rights reserved.
Using computer-based video analysis in the study of fidgety movements.
Adde, Lars; Helbostad, Jorunn L; Jensenius, Alexander Refsum; Taraldsen, Gunnar; Støen, Ragnhild
2009-09-01
Absence of fidgety movements (FM) in high-risk infants is a strong marker for later cerebral palsy (CP). FMs can be classified by the General Movement Assessment (GMA), based on Gestalt perception of the infant's movement pattern. More objective movement analysis may be provided by computer-based technology. The aim of this study was to explore the feasibility of a computer-based video analysis of infants' spontaneous movements in classifying non-fidgety versus fidgety movements. GMA was performed from video material of the fidgety period in 82 term and preterm infants at low and high risk of developing CP. The same videos were analysed using purpose-built software called the General Movement Toolbox (GMT), with visualisation of the infant's movements for qualitative analyses. Variables derived from the calculation of displacement of pixels from one video frame to the next were used for quantitative analyses. Visual representations from GMT showed easily recognisable patterns of FMs. Of the eight quantitative variables derived, the variability in displacement of a spatial centre of active pixels in the image had the highest sensitivity (81.5%) and specificity (70.0%) in classifying FMs. By setting triage thresholds at 90% sensitivity and specificity for FM, the need for further referral was reduced by 70%. Video recordings can be used for qualitative and quantitative analyses of FMs provided by GMT. GMT is easy to implement in clinical practice, and may provide assistance in detecting infants without FMs.
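The quantitative variable this abstract singles out, variability in the displacement of a spatial centre of active pixels, can be sketched as follows: frame differencing marks "active" pixels, their centroid is tracked across frames, and the spread of the centroid's step-to-step displacement is summarised. The intensity threshold and array layout are illustrative assumptions, not the GMT implementation.

```python
import numpy as np

def centroid_displacement_variability(frames: np.ndarray,
                                      thresh: float = 10.0) -> float:
    """Given a (T, H, W) array of grayscale frames, return the standard
    deviation of the frame-to-frame displacement of the centroid of
    'active' pixels (pixels whose intensity changed by more than `thresh`
    between consecutive frames)."""
    centroids = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        # Mark pixels whose intensity changed noticeably between frames.
        active = np.abs(curr.astype(float) - prev.astype(float)) > thresh
        ys, xs = np.nonzero(active)
        if len(xs) == 0:  # no motion detected between this pair of frames
            continue
        centroids.append((xs.mean(), ys.mean()))
    if len(centroids) < 2:
        return 0.0
    c = np.array(centroids)
    # Euclidean displacement of the centroid between consecutive frames.
    steps = np.linalg.norm(np.diff(c, axis=0), axis=1)
    return float(steps.std())
```

Under this definition, smooth and regular motion yields low variability, while the irregular, jerky character of fidgety movements would yield a higher value, which is the intuition behind using it as a classifier.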
NASA Software Assurance's Roles in Research and Technology
NASA Technical Reports Server (NTRS)
Wetherholt, Martha
2010-01-01
This slide presentation reviews the interactions between the scientists and engineers doing research and technology work and the software developers and others doing software assurance. There is a discussion of the role of Safety and Mission Assurance (SMA) in developing software to be used for research and technology, and of the importance of this role as the technology moves up through the technology readiness levels (TRLs). There is also a call to change the way software is developed.