Reliability Centered Maintenance - Methodologies
NASA Technical Reports Server (NTRS)
Kammerer, Catherine C.
2009-01-01
Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many Lean Six Sigma (L6S) tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of an L6S project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.
Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean
2017-03-01
In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised. The same is true in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools inspired by systems thinking and complexity theories and the methodological lessons learned from their application. These tools were used in a case study; detailed results of that study are being prepared for publication in additional articles. Applying a complexity 'lens', the case study examines the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implications for the subject under study, and the existing tools associated with those theories which inspired the design of the data collection and analysis process. The tools and their application processes are presented in the results section, followed in the discussion section by a critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, each tool bringing a different and complementary perspective on the system.
2010-09-01
…application of existing assessment tools that may be applicable to Marine Air Ground Task Force (MAGTF) Command, Control, Communications and Computers (C4)… assessment tools and analysis concepts that may be extended to the Marine Corps' C4 System of Systems assessment methodology as a means to obtain a…
System engineering toolbox for design-oriented engineers
NASA Technical Reports Server (NTRS)
Goldberg, B. E.; Everhart, K.; Stevens, R.; Babbitt, N., III; Clemens, P.; Stout, L.
1994-01-01
This system engineering toolbox is designed to provide tools and methodologies to the design-oriented systems engineer. A tool is defined as a set of procedures to accomplish a specific function. A methodology is defined as a collection of tools, rules, and postulates to accomplish a purpose. For each concept addressed in the toolbox, the following information is provided: (1) description, (2) application, (3) procedures, (4) examples, if practical, (5) advantages, (6) limitations, and (7) bibliography and/or references. The scope of the document includes concept development tools, system safety and reliability tools, design-related analytical tools, graphical data interpretation tools, a brief description of common statistical tools and methodologies, so-called total quality management tools, and trend analysis tools. Both relationship to project phase and primary functional usage of the tools are also delineated. The toolbox also includes a case study for illustrative purposes. Fifty-five tools are delineated in the text.
ERIC Educational Resources Information Center
Kalathaki, Maria
2015-01-01
The Greek school community emphasizes the discovery-oriented direction of teaching methodology in school Environmental Education (EE) in order to promote Education for Sustainable Development (ESD). In ESD school projects the methodology used is experiential teamwork for inquiry-based learning. The proposed tool checks whether and how a school…
Drenkard, K N
2001-01-01
The application of a strategic planning methodology for the discipline of nursing is described in use by a large, nonprofit integrated healthcare system. The methodology uses a transformational leadership assessment tool, quality planning methods, and large group intervention to engage nurses in the implementation of strategies. Based on systems theory, the methodology outlined by the author has application at any level in an organization, from an entire delivery network, to a patient care unit. The author discusses getting started on a strategic planning journey, tools that are useful in the process, integrating already existing business plans into the strategies for nursing, preliminary measures to monitor progress, and lessons learned along the journey.
KeyWare: an open wireless distributed computing environment
NASA Astrophysics Data System (ADS)
Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir
1995-12-01
Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist in LAN-based applications. A wireless distributed computing environment (KeyWare(TM)) based on intelligent agents within a multiple client multiple server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object oriented tools and methodologies enables direct asynchronous invocation of agent-based services supplemented by tool-sets matched to supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.
NASA Technical Reports Server (NTRS)
Young, Larry A.; Yetter, Jeffrey A.; Guynn, Mark D.
2006-01-01
Maturation of intelligent systems technologies and their incorporation into aerial platforms are dictating the development of new analysis tools and incorporation of such tools into existing system analysis methodologies in order to fully capture the trade-offs of autonomy on vehicle and mission success. A first-order "system analysis of autonomy" methodology is outlined in this paper. Further, this analysis methodology is subsequently applied to notional high-altitude long-endurance (HALE) aerial vehicle missions.
A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators.
Sánchez, Borja Bordel; Alcarria, Ramón; Sánchez-Picot, Álvaro; Sánchez-de-Rivera, Diego
2017-09-22
Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. The main cause of the lack of co-simulation instruments to simulate all domains together is the extreme difficulty of combining and synchronizing the various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users' needs and requirements and various additional factors such as the development team's experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal.
A Methodology for the Design of Application-Specific Cyber-Physical Social Sensing Co-Simulators
Sánchez-Picot, Álvaro
2017-01-01
Cyber-Physical Social Sensing (CPSS) is a new trend in the context of pervasive sensing. In these new systems, various domains coexist in time, evolve together and influence each other. Thus, application-specific tools are necessary for specifying and validating designs and simulating systems. However, nowadays, different tools are employed to simulate each domain independently. The main cause of the lack of co-simulation instruments to simulate all domains together is the extreme difficulty of combining and synchronizing the various tools. In order to reduce that difficulty, an adequate architecture for the final co-simulator must be selected. Therefore, in this paper the authors investigate and propose a methodology for the design of CPSS co-simulation tools. The paper describes the four steps that software architects should follow in order to design the most adequate co-simulator for a certain application, considering the final users' needs and requirements and various additional factors such as the development team's experience. Moreover, the first practical use case of the proposed methodology is provided. An experimental validation is also included in order to evaluate the performance of the proposed co-simulator and to determine the correctness of the proposal. PMID:28937610
Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Williams, Mark L
2007-01-01
Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.
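To make the notion of an eigenvalue sensitivity coefficient concrete, the sketch below estimates S = (dk/k)/(dΣ/Σ) by direct perturbation of a one-group infinite-medium model. It is only a minimal illustration of the kind of quantity TSUNAMI-3D computes, not the SCALE implementation (which uses adjoint-weighted perturbation theory), and the cross-section values are invented placeholders.

```python
# Hedged sketch: first-order k-eigenvalue sensitivity by direct perturbation.
# One-group infinite medium: k_inf = nu*Sigma_f / Sigma_a. Values are illustrative.

def k_inf(nu_sigma_f: float, sigma_a: float) -> float:
    """Infinite-medium multiplication factor for a one-group model."""
    return nu_sigma_f / sigma_a

def sensitivity(param: str, nu_sigma_f: float, sigma_a: float, rel_step: float = 0.01) -> float:
    """Central-difference estimate of S = (dk/k) / (dx/x) for parameter x."""
    base = {"nu_sigma_f": nu_sigma_f, "sigma_a": sigma_a}
    up, dn = dict(base), dict(base)
    up[param] *= 1.0 + rel_step
    dn[param] *= 1.0 - rel_step
    return ((k_inf(**up) - k_inf(**dn)) / k_inf(**base)) / (2.0 * rel_step)

if __name__ == "__main__":
    nu_sf, sa = 0.0065, 0.0060   # illustrative macroscopic cross sections (1/cm)
    print("k_inf            =", round(k_inf(nu_sf, sa), 5))
    print("S(k; nu*Sigma_f) =", round(sensitivity("nu_sigma_f", nu_sf, sa), 4))  # ~ +1
    print("S(k; Sigma_a)    =", round(sensitivity("sigma_a", nu_sf, sa), 4))     # ~ -1
```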
Intelligent systems/software engineering methodology - A process to manage cost and risk
NASA Technical Reports Server (NTRS)
Friedlander, Carl; Lehrer, Nancy
1991-01-01
A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.
NASA Astrophysics Data System (ADS)
Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David
2014-01-01
Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.
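As a minimal illustration of one common prognostics building block referred to above (trending a health indicator and extrapolating a remaining-useful-life estimate), the sketch below fits a linear degradation model to synthetic data; it does not reproduce the 5S methodology or the algorithm-selection tools described in the paper.

```python
# Hedged sketch: a toy remaining-useful-life (RUL) estimate from a degrading
# health indicator. Data, trend model, and failure threshold are synthetic
# placeholders, not the 5S methodology described in the paper.
import numpy as np

def estimate_rul(times, health, failure_threshold):
    """Fit a linear trend to the health indicator and extrapolate the time
    at which it crosses the failure threshold; return time remaining."""
    slope, intercept = np.polyfit(times, health, deg=1)
    if slope >= 0:
        return float("inf")  # no degradation trend detected
    t_fail = (failure_threshold - intercept) / slope
    return max(t_fail - times[-1], 0.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(0.0, 10.0)                               # e.g., operating hours
    h = 1.0 - 0.05 * t + 0.01 * rng.standard_normal(t.size)  # synthetic health index
    print("Estimated RUL (hours):", round(estimate_rul(t, h, failure_threshold=0.3), 1))
```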
2016-12-22
…assumptions of behavior. This research proposes an information theoretic methodology to discover such complex network structures and dynamics while overcoming… the difficulties historically associated with their study. Indeed, this was the first application of an information theoretic methodology as a tool…
Systems scenarios: a tool for facilitating the socio-technical design of work systems.
Hughes, Helen P N; Clegg, Chris W; Bolton, Lucy E; Machon, Lauren C
2017-10-01
The socio-technical systems approach to design is well documented. Recognising the benefits of this approach, organisations are increasingly trying to work with systems, rather than their component parts. However, few tools attempt to analyse the complexity inherent in such systems, in ways that generate useful, practical outputs. In this paper, we outline the 'System Scenarios Tool' (SST), which is a novel, applied methodology that can be used by designers, end-users, consultants or researchers to help design or re-design work systems. The paper introduces the SST using examples of its application, and describes the potential benefits of its use, before reflecting on its limitations. Finally, we discuss potential opportunities for the tool, and describe sets of circumstances in which it might be used. Practitioner Summary: The paper presents a novel, applied methodological tool, named the 'Systems Scenarios Tool'. We believe this tool can be used as a point of reference by designers, end-users, consultants or researchers, to help design or re-design work systems. Included in the paper are two worked examples, demonstrating the tool's application.
NASA Technical Reports Server (NTRS)
Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.
1989-01-01
A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still, some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies being used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths, but none would satisfy all of the needs of the Networks Division. Areas for improvement of the methodology being used by the Networks Division are listed with recommendations for specific action.
Stochastic response surface methodology: A study in the human health area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, Teresa A., E-mail: teresa.oliveira@uab.pt; Oliveira, Amílcar, E-mail: amilcar.oliveira@uab.pt; Centro de Estatística e Aplicações, Universidade de Lisboa
2015-03-10
In this paper we review Stochastic Response Surface Methodology as a tool for modeling uncertainty in the context of Risk Analysis. An application to survival analysis in the breast cancer context is implemented with the R software.
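The cited application is implemented in R; as a language-neutral illustration of the response-surface idea (fit a cheap polynomial surrogate to sampled model evaluations, then propagate input uncertainty through it by Monte Carlo), a hedged Python sketch with an invented toy model follows.

```python
# Hedged sketch of the response-surface idea: fit a cheap polynomial surrogate
# to an expensive model evaluated at sampled inputs, then propagate input
# uncertainty through the surrogate by Monte Carlo. The model and input
# distribution are illustrative placeholders, not the breast-cancer study.
import numpy as np

def expensive_model(x):
    """Stand-in for a costly simulation or survival model."""
    return np.exp(-0.5 * x) + 0.1 * x**2

rng = np.random.default_rng(0)

# Design points: a small sample of the uncertain input.
x_design = rng.normal(loc=1.0, scale=0.3, size=30)
y_design = expensive_model(x_design)

# Quadratic response surface (degree-2 polynomial in one variable).
surrogate = np.poly1d(np.polyfit(x_design, y_design, deg=2))

# Propagate uncertainty through the cheap surrogate.
x_mc = rng.normal(loc=1.0, scale=0.3, size=100_000)
y_mc = surrogate(x_mc)
print("surrogate mean :", round(y_mc.mean(), 4))
print("surrogate std  :", round(y_mc.std(), 4))
print("95th percentile:", round(np.percentile(y_mc, 95), 4))
```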
Methodological and Pedagogical Potential of Reflection in Development of Contemporary Didactics
ERIC Educational Resources Information Center
Chupina, Valentina A.; Pleshakova, Anastasiia Yu.; Konovalova, Maria E.
2016-01-01
The relevance of the issue under research stems from the need of practical pedagogy to expand the methodological and methodical tools of contemporary didactics. The purpose of the article is to identify the methodological core of reflection as a form of thinking and to provide insight into it on the basis of the systematic attributes of the…
The PHM-Ethics methodology: interdisciplinary technology assessment of personal health monitoring.
Schmidt, Silke; Verweij, Marcel
2013-01-01
This contribution briefly introduces the PHM-Ethics project and the PHM methodology. Within the PHM-Ethics project, a set of tools and modules has been developed that may assist in the evaluation and assessment of new technologies for personal health monitoring, referred to as the "PHM methodology" or "PHM toolbox". An overview of this interdisciplinary methodology and its constituent modules is provided, and areas of application and intended target groups are indicated.
Microfluidic tools for cell biological research
Velve-Casquillas, Guilhem; Le Berre, Maël; Piel, Matthieu; Tran, Phong T.
2010-01-01
Microfluidic technology is creating powerful tools for cell biologists to control the complete cellular microenvironment, leading to new questions and new discoveries. We review here the basic concepts and methodologies in designing microfluidic devices, and their diverse cell biological applications. PMID:21152269
Software reuse in spacecraft planning and scheduling systems
NASA Technical Reports Server (NTRS)
Mclean, David; Tuchman, Alan; Broseghini, Todd; Yen, Wen; Page, Brenda; Johnson, Jay; Bogovich, Lynn; Burkhardt, Chris; Mcintyre, James; Klein, Scott
1993-01-01
The use of a software toolkit and development methodology that supports software reuse is described. The toolkit includes source-code-level library modules and stand-alone tools which support such tasks as data reformatting and report generation, simple relational database applications, user interfaces, tactical planning, strategic planning and documentation. The current toolkit is written in C and supports applications that run on IBM PCs under DOS and UNIX-based workstations under OpenLook and Motif. The toolkit is fully integrated for building scheduling systems that reuse AI knowledge base technology. A typical scheduling scenario and three examples of applications that utilize the reuse toolkit will be briefly described. In addition to the tools themselves, a description of the software evolution and reuse methodology that was used is presented.
1987-06-01
…evaluation and chip layout planning for VLSI digital systems. A high-level applicative (functional) language, implemented at UCLA, allows combining of… operating system. The complexity of VLSI requires the application of CAD tools at all levels of the design process. In order to be effective, these tools must be adaptive to the specific design. In this project we studied a design method based on the use of applicative languages…
2009-03-01
…applications relating to this research and the results they have obtained, as well as the background on LEEDR. Chapter 3 will detail the methodology… different in that the snow dissipates faster and it is better to descend slower, at rates of 200–300 ft/min…
Modern Psychometric Methodology: Applications of Item Response Theory
ERIC Educational Resources Information Center
Reid, Christine A.; Kolakowsky-Hayner, Stephanie A.; Lewis, Allen N.; Armstrong, Amy J.
2007-01-01
Item response theory (IRT) methodology is introduced as a tool for improving assessment instruments used with people who have disabilities. Need for this approach in rehabilitation is emphasized; differences between IRT and classical test theory are clarified. Concepts essential to understanding IRT are defined, necessary data assumptions are…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynn, R.Y.S.; Bolmarcich, J.J.
The purpose of this Memorandum is to propose a prototype procedure which the Office of Munitions might employ to exercise, in a supportive joint fashion, two of its High Level Conventional Munitions Models, namely, the OSD Threat Methodology and the Joint Munitions Assessment and Planning (JMAP) model. The joint application of JMAP and the OSD Threat Methodology provides a tool to optimize munitions stockpiles. The remainder of this Memorandum comprises five parts. The first is a description of the structure and use of the OSD Threat Methodology. The second is a description of JMAP and its use. The third discusses the concept of the joint application of JMAP and the OSD Threat Methodology. The fourth displays sample output of the joint application. The fifth is a summary and epilogue. Finally, three appendices contain details of the formulation, data, and computer code.
Application of CFD in Indonesian Research: A review
NASA Astrophysics Data System (ADS)
Ambarita, H.; Siregar, M. R.; Kishinami, K.; Daimaruya, M.; Kawai, H.
2018-04-01
Computational Fluid Dynamics (CFD) is a numerical method that solves fluid flow and related governing equations using a computational tool. Studies on CFD, its methodology and its application as a research tool are increasing. In this study, the application of CFD by Indonesian researchers is briefly reviewed. The main objective is to explore the characteristics of CFD applications by Indonesian researchers. Considering its size and reputation, this study uses the Scopus publication index as its database. All documents in Scopus related to CFD with at least one Indonesian-affiliated researcher were collected for review. Research topics, CFD methods, and simulation results are reviewed in brief. The results show that 260 documents were found in the Scopus-indexed literature. These documents comprise 125 research articles, 135 conference papers, 1 book and 1 review. Among the research articles, only a few researchers focus on the development of CFD methodology; almost all of the articles use CFD as a research tool in a particular application, such as aircraft, wind power and heat exchangers. The topics of the 125 research articles can be divided into 12 specific applications and 1 miscellaneous application. The most popular application is Heating, Ventilating and Air Conditioning, followed by reactor, transportation and heat exchanger applications. The most popular commercial CFD code is ANSYS Fluent, and only a few researchers use CFX.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman, Carol S.; Benzinger, Leonora; Beshers, George; Hammerslag, David; Kimball, John; Kirslis, Peter A.; Render, Hal; Richards, Paul; Terwilliger, Robert
1985-01-01
The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. The SAGA system consists of a small number of software components that are adapted by the meta-tools into specific tools for use in the software development application. The modules are designed so that the meta-tools can construct an environment which is both integrated and flexible. The SAGA project is documented in several papers which are presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seimenis, Ioannis; Tsekos, Nikolaos V.; Keroglou, Christoforos
2012-04-15
Purpose: The aim of this work was to develop and test a general methodology for the planning and performance of robot-assisted, MR-guided interventions. This methodology also includes the employment of software tools with appropriately tailored routines to effectively exploit the capabilities of MRI and address the relevant spatial limitations. Methods: The described methodology consists of: (1) patient-customized feasibility study that focuses on the geometric limitations imposed by the gantry, the robotic hardware, and interventional tools, as well as the patient; (2) stereotactic preoperative planning for initial positioning of the manipulator and alignment of its end-effector with a selected target; and (3) real-time, intraoperative tool tracking and monitoring of the actual intervention execution. Testing was performed inside a standard 1.5T MRI scanner in which the MR-compatible manipulator is deployed to provide the required access. Results: A volunteer imaging study demonstrates the application of the feasibility stage. A phantom study on needle targeting is also presented, demonstrating the applicability and effectiveness of the proposed preoperative and intraoperative stages of the methodology. For this purpose, a manually actuated, MR-compatible robotic manipulation system was used to accurately acquire a prescribed target through alternative approaching paths. Conclusions: The methodology presented and experimentally examined allows the effective performance of MR-guided interventions. It is suitable for, but not restricted to, needle-targeting applications assisted by a robotic manipulation system, which can be deployed inside a cylindrical scanner to provide the required access to the patient facilitating real-time guidance and monitoring.
A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks
Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos
2016-01-01
Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance; and during the operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. During the runtime, we can check the behavior of the WSN accordingly to the results obtained at design time and we can detect sudden and unexpected failures, in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568
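As a loose, much-simplified illustration of the kind of design-time correctness check such a tool automates (not the ADVISES tool or its event-based formalism), the sketch below enumerates single-node failures in an invented static-routing topology and verifies that every surviving node can still reach the sink.

```python
# Hedged, highly simplified illustration of a design-time dependability check
# for a static-routing WSN (not the ADVISES tool): for every single-node
# failure, verify that all surviving nodes can still reach the sink.
from collections import deque

# Illustrative static topology: node -> set of neighbours (bidirectional links).
TOPOLOGY = {
    "sink": {"A", "B"},
    "A": {"sink", "C"},
    "B": {"sink", "C", "D"},
    "C": {"A", "B", "D"},
    "D": {"B", "C"},
}

def reachable(topology, source, failed):
    """Breadth-first search over the topology, skipping the failed node."""
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        for nxt in topology[node]:
            if nxt != failed and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def check_single_failure_tolerance(topology, sink="sink"):
    """Return the list of single-node failures that disconnect some survivor."""
    violations = []
    for failed in topology:
        if failed == sink:
            continue
        alive = set(topology) - {failed}
        if not alive <= reachable(topology, sink, failed):
            violations.append(failed)
    return violations

if __name__ == "__main__":
    bad = check_single_failure_tolerance(TOPOLOGY)
    print("Property holds for all single failures" if not bad
          else f"Connectivity lost when these nodes fail: {bad}")
```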
Rodríguez-Prieto, V; Vicente-Rubiano, M; Sánchez-Matamoros, A; Rubio-Guerri, C; Melero, M; Martínez-López, B; Martínez-Avilés, M; Hoinville, L; Vergne, T; Comin, A; Schauer, B; Dórea, F; Pfeiffer, D U; Sánchez-Vizcaíno, J M
2015-07-01
In this globalized world, the spread of new, exotic and re-emerging diseases has become one of the most important threats to animal production and public health. This systematic review analyses conventional and novel early detection methods applied to surveillance. In all, 125 scientific documents were considered for this study. Exotic (n = 49) and re-emerging (n = 27) diseases constituted the most frequently represented health threats. In addition, the majority of studies were related to zoonoses (n = 66). The approaches found in the review could be divided into surveillance modalities, both active (n = 23) and passive (n = 5), and tools and methodologies that support surveillance activities (n = 57). Combinations of surveillance modalities and tools (n = 40) were also found. Risk-based approaches were very common (n = 60), especially in the papers describing tools and methodologies (n = 50). The main applications, benefits and limitations of each approach were extracted from the papers. This information will be very useful for informing the development of tools to facilitate the design of cost-effective surveillance strategies. Thus, the current literature review provides key information about the advantages, disadvantages, limitations and potential application of methodologies for the early detection of new, exotic and re-emerging diseases.
Zeng, Xiantao; Zhang, Yonggang; Kwong, Joey S W; Zhang, Chao; Li, Sheng; Sun, Feng; Niu, Yuming; Du, Liang
2015-02-01
To systematically review the methodological assessment tools for pre-clinical and clinical studies, systematic review and meta-analysis, and clinical practice guideline. We searched PubMed, the Cochrane Handbook for Systematic Reviews of Interventions, Joanna Briggs Institute (JBI) Reviewers Manual, Centre for Reviews and Dissemination, Critical Appraisal Skills Programme (CASP), Scottish Intercollegiate Guidelines Network (SIGN), and the National Institute for Clinical Excellence (NICE) up to May 20th, 2014. Two authors selected studies and extracted data; quantitative analysis was performed to summarize the characteristics of included tools. We included a total of 21 assessment tools for analysis. A number of tools were developed by academic organizations, and some were developed by only a small group of researchers. The JBI developed the highest number of methodological assessment tools, with CASP coming second. Tools for assessing the methodological quality of randomized controlled studies were most abundant. The Cochrane Collaboration's tool for assessing risk of bias is the best available tool for assessing RCTs. For cohort and case-control studies, we recommend the use of the Newcastle-Ottawa Scale. The Methodological Index for Non-Randomized Studies (MINORS) is an excellent tool for assessing non-randomized interventional studies, and the Agency for Healthcare Research and Quality (AHRQ) methodology checklist is applicable for cross-sectional studies. For diagnostic accuracy test studies, the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool is recommended; the SYstematic Review Centre for Laboratory animal Experimentation (SYRCLE) risk of bias tool is available for assessing animal studies; Assessment of Multiple Systematic Reviews (AMSTAR) is a measurement tool for systematic reviews/meta-analyses; an 18-item tool has been developed for appraising case series studies, and the Appraisal of Guidelines, Research and Evaluation (AGREE)-II instrument is widely used to evaluate clinical practice guidelines. We have successfully identified a variety of methodological assessment tools for different types of study design. However, further efforts in the development of critical appraisal tools are warranted, since there is currently a lack of such tools for other fields, e.g. genetic studies, and some existing tools (for nested case-control studies and case reports, for example) are in need of updating to be in line with current research practice and rigor. In addition, it is very important that all critical appraisal tools remain objective and that performance bias is effectively avoided. © 2015 Chinese Cochrane Center, West China Hospital of Sichuan University and Wiley Publishing Asia Pty Ltd.
The report gives results of a research project to develop tools and methodologies to measure aerosol chemical and particle dispersion through space. These tools can be used to devise pollution prevention strategies that could reduce occupant chemical exposures and guide manufactu...
Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza
2012-01-01
Little has been done to investigate the application of injury-specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled into a qualitative study conducted using focus group discussions. Haddon's matrix was used to develop an interview guide and was also used throughout the analysis phase. The main analysis clusters were the pre-event level/human (including risky behaviors, beliefs and cultural factors, and knowledge and education), pre-event level/object, pre-event level/environment, and the event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology at both the data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries that may be useful for prevention or future quantitative research.
GuidosToolbox: universal digital image object analysis
Peter Vogt; Kurt Riitters
2017-01-01
The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, user-friendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...
Application of Bayesian and cost benefit risk analysis in water resources management
NASA Astrophysics Data System (ADS)
Varouchakis, E. A.; Palogos, I.; Karatzas, G. P.
2016-03-01
Decision making is a significant tool in water resources management applications. This technical note approaches a decision dilemma that has not yet been considered for the water resources management of a watershed. A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined applies a scaled, parabolic fine that varies with the extent of over-pumping violations, in contrast to common practices that usually consider short-term fines. The methodological steps are presented in detail, together with originally developed code. An application of this kind, and in such detail, is new. The results indicate that uncertainty in the probability is the driving issue that determines the optimal decision under each methodology, and depending on how the unknown probability is handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers to examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause inside an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits.
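A minimal sketch of the expected-cost comparison that underlies such a Bayesian decision analysis is given below: the violation probability is updated from an audit observation, and the build/do-not-build choice is the action with the lower expected cost. All costs, probabilities and likelihoods are invented placeholders, not values from the study.

```python
# Hedged sketch of a Bayesian expected-cost comparison for a build / do-not-build
# decision. All monetary values and probabilities are invented placeholders,
# not the figures used in the cited study.

def expected_cost(action, p_violation):
    """Expected cost of an action given the probability of a water-table
    limit violation occurring inside the audit interval."""
    if action == "build_reservoir":
        capital_cost = 2_000_000.0          # one-off construction cost
        residual_fine = 50_000.0            # small residual exposure to fines
        return capital_cost + p_violation * residual_fine
    if action == "keep_pumping":
        expected_fine = 3_000_000.0         # fine if a violation is detected
        return p_violation * expected_fine
    raise ValueError(action)

def bayes_update(prior, likelihood_given_violation, likelihood_given_ok):
    """Update the violation probability after observing an audit outcome."""
    joint_v = prior * likelihood_given_violation
    joint_ok = (1.0 - prior) * likelihood_given_ok
    return joint_v / (joint_v + joint_ok)

if __name__ == "__main__":
    prior = 0.4                                   # prior P(violation)
    posterior = bayes_update(prior, 0.9, 0.2)     # after an unfavourable audit
    for p in (prior, posterior):
        best = min(("build_reservoir", "keep_pumping"),
                   key=lambda a: expected_cost(a, p))
        print(f"P(violation) = {p:.2f} -> choose {best}")
```

With these placeholder numbers the preferred action flips from continued pumping to building the reservoir once the audit raises the violation probability, which is the kind of sensitivity to probability handling the note describes.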
Predictive Model and Methodology for Heat Treatment Distortion Final Report CRADA No. TC-298-92
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikkel, D. J.; McCabe, J.
This project was a multi-lab, multi-partner CRADA involving LLNL, Los Alamos National Laboratory, Sandia National Laboratories, Oak Ridge National Laboratory, Martin Marietta Energy Systems and the industrial partner, The National Center of Manufacturing Sciences (NCMS). A number of member companies of NCMS participated, including General Motors Corporation, Ford Motor Company, The Torrington Company, Gear Research, the Illinois Institute of Technology Research Institute, and Deformation Control Technology. LLNL was the lead laboratory for metrology technology used for validation of the computational tool/methodology. LLNL was also the lead laboratory for the development of the software user interface for the computational tool. This report focuses on the participation of LLNL and NCMS. The purpose of the project was to develop a computational tool/methodology that engineers would use to predict the effects of heat treatment on the size and shape of industrial parts made of quench-hardenable alloys. Initially, the target application of the tool was gears for automotive power trains.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janjusic, Tommy; Kartsaklis, Christos
Memory scalability is an enduring problem and bottleneck that plagues many parallel codes. Parallel codes designed for High Performance Systems are typically designed over the span of several, and in some instances 10+, years. As a result, optimization practices which were appropriate for earlier systems may no longer be valid and thus require careful optimization consideration. Specifically, parallel codes whose memory footprint is a function of their scalability must be carefully considered for future exa-scale systems. In this paper we present a methodology and tool to study the memory scalability of parallel codes. Using our methodology we evaluate an application's memory footprint as a function of scalability, which we coined memory efficiency, and describe our results. In particular, using our in-house tools we can pinpoint the specific application components which contribute to the application's overall memory footprint (application data structures, libraries, etc.).
Overview of Automotive Core Tools: Applications and Benefits
NASA Astrophysics Data System (ADS)
Doshi, Jigar A.; Desai, Darshak
2017-08-01
Continuous improvement of product and process quality is always a challenging and creative task in today's era of globalization. Various quality tools are available and used for this purpose; some of them are successful and a few of them are not. Considering the complexity of the continuous quality improvement (CQI) process, various new techniques have been introduced by industry, as well as proposed by researchers and academia; Lean Manufacturing, Six Sigma, and Lean Six Sigma are some of these techniques. In recent years, new tools are being adopted by industry, especially automotive, called Automotive Core Tools (ACT). The intention of this paper is to review the applications and benefits, along with existing research, on Automotive Core Tools with special emphasis on continuous quality improvement. The methodology uses an extensive review of literature through reputed publications: journals, conference proceedings, research theses, etc. This paper provides an overview of ACT, its enablers and exertions, how it evolved into sophisticated methodologies, and the benefits realised in organisations. It should be of value to practitioners of Automotive Core Tools and to academics who are interested in how CQI can be achieved using ACT. It needs to be stressed here that this paper is not intended to scorn Automotive Core Tools; rather, its purpose is limited to providing a balance to the prevailing positive views toward ACT.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaminsky, J.; Tschanz, J.F.
In order to address barriers to community energy-conservation efforts, DOE has established the Comprehensive Community Energy Management (CCEM) program. The role of CCEM is to provide direction and technical support for energy-conservation efforts at the local level. The program to date has included project efforts to develop combinations and variations of community energy planning and management tools applicable to communities of diverse characteristics. This paper describes the salient features of some of the tools and relates them to the testing program soon to begin in several pilot-study communities. Two methodologies that arose within such an actual planning context are taken from DOE-sponsored projects in Clarksburg, West Virginia and the proposed new capital city for Alaska. Energy management in smaller communities and/or communities with limited funding and manpower resources has received special attention. One project of this type developed a general methodology that emphasizes efficient ways for small communities to reach agreement on local energy problems and potential solutions; by this guidance, the community is led to understand where it should concentrate its efforts in subsequent management activities. Another project concerns rapid growth of either a new or an existing community that could easily outstrip the management resources available locally. This methodology strives to enable the community to seize the opportunity for energy conservation through integrating the design of its energy systems and its development pattern. The last methodology creates applicable tools for comprehensive community energy planning. (MCW)
Innovative Technologies for Multicultural Education Needs
ERIC Educational Resources Information Center
Ferdig, Richard E.; Coutts, Jade; DiPietro, Joseph; Lok, Benjamin; Davis, Niki
2007-01-01
Purpose: The purpose of this paper is to discuss several technology applications that are being used to address current problems or opportunities related to multicultural education. Design/methodology/approach: Five technology applications or technology-related projects are discussed, including a teacher education literacy tool, social networking…
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results, is presented. A discussion of possible future steps in this research area is given.
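As a small illustration of the aggregation step (not the ten-phase elicitation and calibration procedure itself), the sketch below combines several experts' distributions for one uncertain input with a weighted linear opinion pool; the expert estimates and calibration weights are invented.

```python
# Hedged sketch: aggregating several experts' uncertainty estimates for one
# input parameter with a weighted linear opinion pool. Expert estimates and
# calibration weights are invented for illustration; this is not the ten-phase
# procedure described in the report.
import numpy as np

rng = np.random.default_rng(1)

# Each expert gives a (mean, standard deviation) for the uncertain parameter,
# interpreted here as a normal distribution; weights reflect calibration scores.
experts = [
    {"mean": 950.0,  "std": 40.0, "weight": 0.5},
    {"mean": 1000.0, "std": 60.0, "weight": 0.3},
    {"mean": 1100.0, "std": 80.0, "weight": 0.2},
]

def pooled_samples(experts, n=100_000):
    """Draw from the mixture distribution implied by the linear opinion pool."""
    weights = np.array([e["weight"] for e in experts])
    choice = rng.choice(len(experts), size=n, p=weights / weights.sum())
    means = np.array([e["mean"] for e in experts])[choice]
    stds = np.array([e["std"] for e in experts])[choice]
    return rng.normal(means, stds)

samples = pooled_samples(experts)
print("aggregated mean :", round(samples.mean(), 1))
print("aggregated std  :", round(samples.std(), 1))
print("5th-95th pct    :", np.round(np.percentile(samples, [5, 95]), 1))
```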
Bosslet, Gabriel T; Carlos, W Graham; Tybor, David J; McCallister, Jennifer; Huebert, Candace; Henderson, Ashley; Miles, Matthew C; Twigg, Homer; Sears, Catherine R; Brown, Cynthia; Farber, Mark O; Lahm, Tim; Buckley, John D
2017-04-01
Few data have been published regarding scoring tools for the selection of postgraduate medical trainee candidates that have wide applicability. The authors present a novel scoring tool developed to assist postgraduate programs in generating an institution-specific rank list derived from selected elements of the U.S. Electronic Residency Application System (ERAS) application. The authors developed and validated an ERAS and interview-day scoring tool at five pulmonary and critical care fellowship programs: the ERAS Application Scoring Tool-Interview Scoring Tool. This scoring tool was then tested for intrarater correlation versus subjective rankings of ERAS applications. The tool-development process was repeated at four other institutions, and the tool was applied alongside and compared with the "traditional" ranking methods at the five programs and with the submitted National Residency Match Program rank lists. The ERAS Application Scoring Tool correlated highly with subjective faculty rankings at the primary institution (average Spearman's r = 0.77). The ERAS Application Scoring Tool-Interview Scoring Tool method correlated well with traditional ranking methodology at all five institutions (Spearman's r = 0.54, 0.65, 0.72, 0.77, and 0.84). This study validates a process for selecting and weighting components of the ERAS application and interview day to create a customizable, institution-specific tool for ranking candidates to postgraduate medical education programs. This scoring system can be used in future studies to compare the outcomes of fellowship training.
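The correlation check reported above can be reproduced in a few lines; the sketch below compares an invented set of tool scores against subjective ranks with Spearman's rank correlation, assuming scipy is available.

```python
# Hedged sketch: comparing a scoring-tool ranking with a subjective faculty
# ranking using Spearman's rank correlation, as in the validation described
# above. The candidate scores and ranks below are invented.
from scipy.stats import spearmanr

tool_scores      = [87, 92, 75, 81, 95, 68, 73, 90]   # higher score = stronger candidate
subjective_ranks = [3, 2, 6, 5, 1, 8, 7, 4]           # 1 = most preferred

# High tool scores should correspond to better (lower) subjective ranks, so
# strong agreement shows up as rho near -1; negate it for easier reading.
rho, p_value = spearmanr(tool_scores, subjective_ranks)
print(f"Spearman rho (agreement) = {-rho:.2f}, p = {p_value:.3f}")
```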
Delivering spacecraft control centers with embedded knowledge-based systems: The methodology issue
NASA Technical Reports Server (NTRS)
Ayache, S.; Haziza, M.; Cayrac, D.
1994-01-01
Matra Marconi Space (MMS) occupies a leading place in Europe in the domain of satellite and space data processing systems. The maturity of knowledge-based system (KBS) technology and the theoretical and practical experience acquired in the development of prototype, pre-operational and operational applications make it possible today to consider the wide operational deployment of KBSs in space applications. In this perspective, MMS has to prepare the introduction of the new methods and support tools that will form the basis of the development of such systems. This paper introduces elements of the MMS methodology initiatives in the domain and the main rationale that motivated the approach. These initiatives develop along two main axes: knowledge engineering methods and tools, and a hybrid method approach for coexisting knowledge-based and conventional developments.
Integrated Systems Health Management (ISHM) Toolkit
NASA Technical Reports Server (NTRS)
Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim
2013-01-01
A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.
Methodology of management of dredging operations II. Applications.
Junqua, G; Abriak, N E; Gregoire, P; Dubois, V; Mac Farlane, F; Damidot, D
2006-04-01
This paper presents the new methodology for the management of dredging operations. Derived partly from existing methodologies (OECD, PNUE, AIPCN), it aims to be more comprehensive, combining the qualities and complementarities of the previous methodologies. The methodology has been applied at the Port of Dunkirk (France). A characterization of the sediments of this site has allowed a zoning of the Port to be established into zones of probable sediment homogeneity. Moreover, sources of pollution have been identified, with a view to prevention. Ways of improving the value of the waste have also been developed to answer regional needs, from the standpoint of competitive and territorial intelligence; their development has required a pooling of resources between professionals, research centres and local communities, according to the principles of industrial ecology. Lastly, a MultiCriteria Decision-Making Aid (M.C.D.M.A.) tool has been used to determine the most relevant scenario (or alternative, or action) for a dredging operation planned by the Port of Dunkirk. These applications have confirmed the relevance of this methodology for the management of dredging operations.
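As an illustration of the decision-aiding step, the sketch below ranks invented dredging scenarios with a simple weighted-sum multicriteria model; the criteria, weights and scores are placeholders, and the actual M.C.D.M.A. tool used for the Port of Dunkirk is not reproduced here.

```python
# Hedged sketch of a weighted-sum multicriteria ranking of dredging scenarios.
# Criteria, weights, and scores are invented for illustration and do not
# reproduce the M.C.D.M.A. tool actually used for the Port of Dunkirk.
criteria_weights = {"cost": 0.35, "environmental_impact": 0.40, "reuse_potential": 0.25}

# Scores are normalised to [0, 1], where 1 is always the best outcome
# (low cost and low impact are already converted to "higher is better").
scenarios = {
    "offshore_disposal":   {"cost": 0.8, "environmental_impact": 0.3, "reuse_potential": 0.1},
    "confined_facility":   {"cost": 0.5, "environmental_impact": 0.6, "reuse_potential": 0.4},
    "treatment_and_reuse": {"cost": 0.3, "environmental_impact": 0.8, "reuse_potential": 0.9},
}

def weighted_score(scores, weights):
    """Aggregate criterion scores into a single weighted-sum value."""
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(scenarios,
                 key=lambda s: weighted_score(scenarios[s], criteria_weights),
                 reverse=True)
for name in ranking:
    print(f"{name:20s} {weighted_score(scenarios[name], criteria_weights):.2f}")
```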
USDA-ARS's Scientific Manuscript database
Due to economic and environmental consequences of nitrogen (N) lost from fertilizer applications in corn (Zea mays L.), considerable public and industry attention has been devoted to development of N decision tools. Now a wide variety of tools are available to farmers for managing N inputs. However,...
Cavalcante, Fátima Gonçalves; Minayo, Maria Cecília de Souza; Gutierrez, Denise Machado Duran; de Sousa, Girliani Silva; da Silva, Raimunda Magalhães; Moura, Rosylaine; Meneghel, Stela Nazareth; Grubits, Sonia; Conte, Marta; Cavalcante, Ana Célia Sousa; Figueiredo, Ana Elisa Bastos; Mangas, Raimunda Matilde do Nascimento; Fachola, María Cristina Heuguerot; Izquierdo, Giovane Mendieta
2015-06-01
The article analyses the quality and consistency of a comprehensive interview guide, adapted to study attempted suicide and suicidal ideation among the elderly, and imparts the method followed in applying this tool. The objective is to show how the use of a semi-structured interview and the organization and data analysis set-up were tested and perfected by a network of researchers from twelve universities or research centers in Brazil, Uruguay and Colombia. The method involved application and evaluation of the tool and joint production of an instruction manual on data collection, systematization and analysis. The methodology was followed in 67 interviews with elderly people aged 60 or older and in 34 interviews with health professionals in thirteen Brazilian municipalities and in Montevideo and Bogotá, allowing the consistency of the tool and the applicability of the method to be checked, during the process and at the end. The enhanced guide and the instructions for reproducing it are presented herein. The results indicate the suitability and credibility of this methodological approach, tested and certified in interdisciplinary and interinstitutional terms.
Applications of an OO Methodology and CASE to a DAQ System
NASA Astrophysics Data System (ADS)
Bee, C. P.; Eshghi, S.; Jones, R.; Kolos, S.; Magherini, C.; Maidantchik, C.; Mapelli, L.; Mornacchi, G.; Niculescu, M.; Patel, A.; Prigent, D.; Spiwoks, R.; Soloviev, I.; Caprini, M.; Duval, P. Y.; Etienne, F.; Ferrato, D.; Le van Suu, A.; Qian, Z.; Gaponenko, I.; Merzliakov, Y.; Ambrosini, G.; Ferrari, R.; Fumagalli, G.; Polesello, G.
The RD13 project has evaluated the use of the Object Oriented Information Engineering (OOIE) method during the development of several software components connected to the DAQ system. The method is supported by a sophisticated commercial CASE tool (Object Management Workbench) and programming environment (Kappa) which covers the full life-cycle of the software, including model simulation, code generation and application deployment. This paper gives an overview of the method, the CASE tool, and the DAQ components which have been developed, and relates our experiences with the method and tool, its integration into our development environment, and the spiral life-cycle it supports.
NASA Technical Reports Server (NTRS)
Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe
2008-01-01
NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for Lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high-level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology that is described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper presents an overview of the strategic analysis methodology and sample results from its application to the Constellation Program lunar architecture.
Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard
2013-01-01
Purpose: With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and intrainstitutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. Methods: A software application was constructed to automate analysis of patient outcomes data using a wide range of statistical metrics, by combining use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. Results: The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operator characteristic curves to identify a threshold value and combines use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set to identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. Conclusions: The work demonstrates the viability of the design approach and the software tool for analysis of large data sets. PMID:24320426
Christopher A. Lupoli; Wayde C. Morse; Conner Bailey; John Schelhas
2015-01-01
Two prominent critiques of volunteer tourism are that it is a top-down imposed form of development treating host communities as passive recipients of international aid, and that the impacts of volunteer tourism in host communities are not systematically evaluated. To address this we identified a pre-existing participatory methodology for assessing community...
NASA Astrophysics Data System (ADS)
Fitkov-Norris, Elena; Yeghiazarian, Ara
2016-11-01
The analytical tools available to social scientists have traditionally been adapted from tools originally designed for the analysis of natural science phenomena. This article discusses the applicability of system dynamics, a qualitative-based modelling approach, as a possible analysis and simulation tool that bridges the gap between the social and natural sciences. After a brief overview of the system dynamics modelling methodology, the advantages as well as the limiting factors of applying system dynamics in the field of social sciences and human interactions are discussed. Issues arise with regard to the operationalization and quantification of latent constructs at the simulation-building stage of the system dynamics methodology, and measurement theory is proposed as a ready-and-waiting solution to the problem of dynamic model calibration, with a view to improving simulation model reliability and validity and encouraging the development of standardised, modular system dynamics models that can be used in social science research.
Advanced project management : training manual.
DOT National Transportation Integrated Search
2006-07-14
This course identifies the principles and methodologies adopted by the Indiana Department of Transportation (INDOT) to support successful project management and delivery. Project management requires the application of knowledge, skills, tools, and te...
Application of Benchmark Dose Methodology to a Variety of Endpoints and Exposures
This latest beta version (1.1b) of the U.S. Environmental Protection Agency (EPA) Benchmark Dose Software (BMDS) is being distributed for public comment. The BMDS system is being developed as a tool to facilitate the application of benchmark dose (BMD) methods to EPA hazardous p...
Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)
NASA Astrophysics Data System (ADS)
Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.
2014-04-01
A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal framework of analysis to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and the conceptual approaches as a comprehensive and practical management tool.
Patwardhan, Anjali; Patwardhan, Prakash
2009-01-01
In the recent climate of consumerism and consumer-focused care, health and social care needs to be more responsive than ever before. Consumer needs and preferences can be elicited with accepted validity and reliability only by strict methodological control, customisation of the questionnaire and skilled interpretation. To construct, conduct, interpret and implement surveys for improved service provision requires a trained workforce and infrastructure. This article aims to appraise various aspects of consumer surveys and to assess their value as effective service improvement tools. The customer is the sole reason organisations exist. Consumer surveys are used worldwide as service and quality-of-care improvement tools by all types of service providers, including health service providers. The article critically appraises the value of consumer surveys as service improvement tools in health services and their future applications. No one type of survey is the best or ideal. The key is the selection of the correct survey methodology, unique and customised for the particular type/aspect of care being evaluated. The method used should reflect the importance of the information required. Methodological rigor is essential for the effectiveness of consumer surveys as service improvement tools. Unfortunately, so far there is no universal consensus on the superiority of one particular methodology over another or any benefit of one specific methodology in a given situation. More training and some dedicated resource allocation are required to develop consumer surveys. More research is needed to develop specific survey methodologies and evaluation techniques for improved validity and reliability of the surveys as service improvement tools. Measurement of consumer preferences/priorities, evaluation of services and key performance scores is not easy. Consumer surveys seem to be impressive tools, as they give the customer a voice for change or modification. However, from a scientific point of view, their credibility in service improvement in terms of reproducibility, reliability and validity has remained debatable. This article is a critical appraisal of the value of consumer surveys as a service improvement tool in health services, a lesson which needs to be learnt.
Supporting Open Access to European Academic Courses: The ASK-CDM-ECTS Tool
ERIC Educational Resources Information Center
Sampson, Demetrios G.; Zervas, Panagiotis
2013-01-01
Purpose: This paper aims to present and evaluate a web-based tool, namely ASK-CDM-ECTS, which facilitates authoring and publishing on the web descriptions of (open) academic courses in machine-readable format using an application profile of the Course Description Metadata (CDM) specification, namely CDM-ECTS. Design/methodology/approach: The paper…
The Engineering of Engineering Education: Curriculum Development from a Designer's Point of View
ERIC Educational Resources Information Center
Rompelman, Otto; De Graaff, Erik
2006-01-01
Engineers have a set of powerful tools at their disposal for designing robust and reliable technical systems. In educational design these tools are seldom applied. This paper explores the application of concepts from the systems approach in an educational context. The paradigms of design methodology and systems engineering appear to be suitable…
[The GIPSY-RECPAM model: a versatile approach for integrated evaluation in cardiologic care].
Carinci, F
2009-01-01
The tree-structured methodology applied in the GISSI-PSICOLOGIA project, although developed in the framework of the earliest GISSI studies, represents a powerful tool to analyze different aspects of cardiologic care. The GISSI-PSICOLOGIA project has delivered a novel methodology based on the joint application of psychometric tools and sophisticated statistical techniques. Its prospective use could allow the building of effective epidemiological models relevant to the prognosis of the cardiologic patient. The various features of the RECPAM method allow versatile use in the framework of modern e-health projects. The study used the Cognitive Behavioral Assessment H Form (CBA-H) psychometric scales. The potential for its future application in Italian cardiology is relevant, particularly to assist the planning of systems for integrated care and routine evaluation of the cardiologic patient.
Measuring attitudes towards the dying process: A systematic review of tools.
Groebe, Bernadette; Strupp, Julia; Eisenmann, Yvonne; Schmidt, Holger; Schlomann, Anna; Rietz, Christian; Voltz, Raymond
2018-04-01
At the end of life, anxious attitudes concerning the dying process are common in patients in palliative care. Measurement tools can identify vulnerabilities, resources and the need for subsequent treatment to relieve suffering and support well-being. The aim was to systematically review available tools measuring attitudes towards dying, their operationalization, the method of measurement and their methodological quality, including generalizability to different contexts. A systematic review was conducted according to the PRISMA Statement, with the methodological quality of tools assessed by standardized review criteria. MEDLINE, PsycINFO, PsyndexTests and the Health and Psychosocial Instruments were searched from their inception to April 2017. A total of 94 identified studies reported the development and/or validation of 44 tools. Of these, 37 were questionnaires and 7 were alternative measurement methods (e.g. projective measures). In 34 of the 37 questionnaires, the emotional evaluation (e.g. anxiety) towards dying is measured. Dying is operationalized in general items (n = 20), in several specific aspects of dying (n = 34) and as the dying of others (n = 14). The methodological quality of the tools was reported inconsistently. Nine tools reported good internal consistency. Of the 37 tools, 4 were validated in a clinical sample (e.g. terminal cancer; Huntington disease), indicating questionable generalizability to clinical contexts for most tools. Many tools exist to measure attitudes towards the dying process using different endpoints. This overview can serve as a decision framework on which tool to apply in which context. For clinical application, only a few tools were available. Further validation of existing tools and potential alternative methods in various populations is needed.
NASA Astrophysics Data System (ADS)
Brennan-Tonetta, Margaret
This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reductions. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock and policy options. The methodology and decision support tool are applicable to other states and regions (with location-specific modifications), thus contributing to the achievement of state and federal goals of renewable energy utilization. (3) Development of policy recommendations based on the results of the decision support tool that will help to guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first-ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. At present, the database is not able to perform analysis of tradeoffs across broad policy objectives such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste. Instead, it operates one level below that with comparisons of kWh or GGE generated by different feedstock/technology combinations at the state and county level. Modification of the model to incorporate factors that will enable the analysis of broader energy policy issues, such as those mentioned above, is recommended for future research efforts.
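A hedged arithmetic sketch of the kind of feedstock/technology comparison the decision support tool performs; all tonnages, heating values and conversion efficiencies below are placeholders, not New Jersey data.

```python
# Hedged sketch: kWh potential for feedstock/technology pairs from tonnage,
# energy content and an assumed electric conversion efficiency.
feedstocks = {"food waste": {"dry_tonnes": 120_000, "mj_per_kg": 15.0},
              "wood residue": {"dry_tonnes": 80_000, "mj_per_kg": 18.5}}
technologies = {"anaerobic digestion": 0.25, "gasification": 0.30}   # illustrative efficiencies

for f, props in feedstocks.items():
    for t, eff in technologies.items():
        mj = props["dry_tonnes"] * 1000 * props["mj_per_kg"] * eff   # MJ of electricity
        kwh = mj / 3.6                                               # 1 kWh = 3.6 MJ
        print(f"{f} via {t}: {kwh / 1e6:.1f} GWh/yr")
```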
Crossing trend analysis methodology and application for Turkish rainfall records
NASA Astrophysics Data System (ADS)
Şen, Zekâi
2018-01-01
Trend analyses are necessary tools for depicting a possible general increase or decrease in a given time series. There are many versions of trend identification methodologies, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, regression line, and Şen's innovative trend analysis. The literature has many papers about the use, pros and cons, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend from the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure and does not depend on the type of the probability distribution function. The validity of this method is presented through an extensive Monte Carlo simulation technique and its comparison with other existing trend identification methodologies. The application of the methodology is presented for a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
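A minimal sketch of the crossing idea as read from the abstract above (not the author's code): candidate trend lines through the series centroid are scored by how often the residuals cross zero, and the slope with the most crossings is kept.

```python
# Hedged sketch of crossing-based trend identification.
import numpy as np

def crossing_trend_slope(x, slopes=np.linspace(-1.0, 1.0, 401)):
    t = np.arange(len(x))
    tc, xc = t.mean(), x.mean()                 # centroid of the time series
    best_slope, best_crossings = 0.0, -1
    for s in slopes:
        resid = x - (xc + s * (t - tc))         # deviation from candidate trend line
        up_crossings = np.sum((resid[:-1] <= 0) & (resid[1:] > 0))
        if up_crossings > best_crossings:
            best_slope, best_crossings = s, up_crossings
    return best_slope, best_crossings

rng = np.random.default_rng(1)
series = 0.05 * np.arange(100) + rng.normal(0, 1, 100)   # toy annual extreme rainfall anomaly
print(crossing_trend_slope(series))
```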
An Assessment of IMPAC - Integrated Methodology for Propulsion and Airframe Controls
NASA Technical Reports Server (NTRS)
Walker, G. P.; Wagner, E. A.; Bodden, D. S.
1996-01-01
This report documents the work done under a NASA-sponsored contract to transition to industry the technologies developed under the NASA Lewis Research Center IMPAC (Integrated Methodology for Propulsion and Airframe Control) program. The critical steps in IMPAC are exercised on an example integrated flight/propulsion control design for linear airframe/engine models of a conceptual STOVL (Short Take-Off and Vertical Landing) aircraft, and MATRIXX (TM) executive files to implement each step are developed. The results from the example study are analyzed, and lessons learned are listed along with recommendations that will improve the application of each design step. The end product of this research is a set of software requirements for developing a user-friendly control design tool which will automate the steps in the IMPAC methodology. Prototypes for a graphical user interface (GUI) are sketched to specify how the tool will interact with the user, and it is recommended that the tool be built around existing computer-aided control design software packages.
Kansei, surfaces and perception engineering
NASA Astrophysics Data System (ADS)
Rosen, B.-G.; Eriksson, L.; Bergman, M.
2016-09-01
The aesthetic and pleasing properties of a product are important and add significantly to its meaning and relevance. Customer sensation and perception are largely about psychological factors. There has been a strong industrial and academic need for, and interest in, methods and tools to quantify and link product properties to the human response, but a lack of studies of the impact of surfaces. In this study, affective surface engineering is used to illustrate and model the link between customer expectations and perception and controllable product surface properties. The results highlight the use of the soft metrology concept for linking the physical and human factors contributing to the perception of products. Examples of surface applications of the Kansei methodology are presented from the sauna bath, health care, architectural and hygiene tissue application areas to illustrate, discuss and confirm the strength of the methodology. In the conclusions of the study, future research in soft metrology is proposed to allow understanding and modelling of product perception and sensations, in combination with further development of the Kansei surface engineering methodology and software tools.
Methodology Investigation of AI(Artificial Intelligence) Test Officer Support Tool. Volume 1
1989-03-01
Keywords: Artificial Intelligence, Expert Systems, Automated Aids to Testing. This report covers the application of Artificial Intelligence techniques to the problem of creating automated tools to
Generation Y, Learner Autonomy and the Potential of Web 2.0 Tools for Language Learning and Teaching
ERIC Educational Resources Information Center
Morgan, Liam
2012-01-01
Purpose: The purpose of this paper is to examine the relationship between the development of learner autonomy and the application of Web 2.0 tools in the language classroom. Design/methodology/approach: The approach taken is that of qualitative action research within an explicit theoretical framework and the data were collected via surveys and…
Charpentier, Ronald R.; Moore, Thomas E.; Gautier, D.L.
2017-11-15
The methodological procedures used in the geologic assessments of the 2008 Circum-Arctic Resource Appraisal (CARA) were based largely on the methodology developed for the 2000 U.S. Geological Survey World Petroleum Assessment. The main variables were probability distributions for numbers and sizes of undiscovered accumulations with an associated risk of occurrence. The CARA methodology expanded on the previous methodology in providing additional tools and procedures more applicable to the many Arctic basins that have little or no exploration history. Most importantly, geologic analogs from a database constructed for this study were used in many of the assessments to constrain numbers and sizes of undiscovered oil and gas accumulations.
ERIC Educational Resources Information Center
Westermeyer, Juan Carlos Briede; Ortuno, Bernabe Hernandis
2011-01-01
This study describes the application of new product concurrent design methodologies in the context of industrial design education. The sketch has been used many times as a tool of creative expression, especially in the conceptual design stage, in an intuitive way and somewhat out of the context of the real needs that the…
NASA Technical Reports Server (NTRS)
Biernacki, John; Juhasz, John; Sadler, Gerald
1991-01-01
A team of Space Station Freedom (SSF) system engineers is in the process of an extensive analysis of the SSF requirements, particularly those pertaining to the electrical power system (EPS). The objective of this analysis is the development of a comprehensive, computer-based requirements model, using an enhanced modern structured analysis methodology (EMSA). Such a model provides a detailed and consistent representation of the system's requirements. The process outlined in the EMSA methodology is unique in that it allows the graphical modeling of real-time system state transitions, as well as functional requirements and data relationships, to be implemented using modern computer-based tools. These tools permit flexible updating and continuous maintenance of the models. Initial findings resulting from the application of EMSA to the EPS have benefited the space station program by linking requirements to design, providing traceability of requirements, identifying discrepancies, and fostering an understanding of the EPS.
Case formulation and management using pattern-based formulation (PBF) methodology: clinical case 1.
Fernando, Irosh; Cohen, Martin
2014-02-01
A tool for psychiatric case formulation known as pattern-based formulation (PBF) has recently been introduced. This paper presents an application of this methodology in formulating and managing complex clinical cases. The symptomatology of the clinical presentation has been parsed into individual clinical phenomena and interpreted by selecting explanatory models. The clinical presentation demonstrates how PBF has been used as a clinical tool to guide clinicians' thinking, taking a structured approach to managing multiple issues using a broad range of management strategies. In doing so, the paper also introduces a number of patterns related to the observed clinical phenomena that can be re-used as explanatory models when formulating other clinical cases. It is expected that this paper will assist clinicians, and particularly trainees, to better understand the PBF methodology and apply it to improve their formulation skills.
Methodology for building confidence measures
NASA Astrophysics Data System (ADS)
Bramson, Aaron L.
2004-04-01
This paper presents a generalized methodology for propagating known or estimated levels of individual source document truth reliability to determine the confidence level of a combined output. Initial document certainty levels are augmented by (i) combining the reliability measures of multiple sources, (ii) incorporating the truth reinforcement of related elements, and (iii) incorporating the importance of the individual elements for determining the probability of truth for the whole. The result is a measure of confidence in system output based on establishing links among the truth values of inputs. This methodology was developed for application to a multi-component situation awareness tool under development at the Air Force Research Laboratory in Rome, New York. Determining how improvements in data quality and the variety of documents collected affect the probability of a correct situational detection helps optimize the performance of the tool overall.
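Since the abstract gives no formulas, the following is only a hedged illustration under an assumed independent-sources model: combined confidence is the probability that at least one corroborating source is truthful, weighted by element importance; all names and numbers are hypothetical.

```python
# Hedged illustration only; not the paper's actual combination rules.
from math import prod

def combined_confidence(source_reliabilities, element_importance):
    # (i) combine reliabilities of multiple sources reporting the same element
    per_element = {e: 1 - prod(1 - r for r in rs) for e, rs in source_reliabilities.items()}
    # (iii) weight each element by its importance for the overall conclusion
    total_w = sum(element_importance.values())
    return sum(per_element[e] * w for e, w in element_importance.items()) / total_w

sources = {"location": [0.7, 0.6], "identity": [0.8], "intent": [0.5, 0.5, 0.4]}
weights = {"location": 2.0, "identity": 3.0, "intent": 1.0}
print(combined_confidence(sources, weights))
```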
NASA Astrophysics Data System (ADS)
Scaini, C.; Felpeto, A.; Martí, J.; Carniel, R.
2014-05-01
This paper presents a GIS-based methodology to estimate damages produced by volcanic eruptions. The methodology consists of four parts: definition and simulation of eruptive scenarios, exposure analysis, vulnerability assessment and estimation of expected damages. Multi-hazard eruptive scenarios are defined for the Teide-Pico Viejo active volcanic complex, and simulated through the VORIS tool. The exposure analysis identifies the elements exposed to the hazard at stake and focuses on the relevant assets for the study area. The vulnerability analysis is based on previous studies on the built environment and complemented with the analysis of transportation and urban infrastructures. Damage assessment is performed by associating a qualitative damage rating to each combination of hazard and vulnerability. This operation consists of a GIS-based overlay, performed for each hazardous phenomenon considered and for each exposed element. The methodology is then automated into a GIS-based tool using an ArcGIS® program. Given the eruptive scenarios and the characteristics of the exposed elements, the tool produces expected damage maps. The tool is applied to the Icod Valley (North of Tenerife Island), which is likely to be affected by volcanic phenomena in case of an eruption from both the Teide-Pico Viejo volcanic complex and the North-West basaltic rift. Results are thematic maps of vulnerability and damage that can be displayed at different levels of detail, depending on user preferences. The aim of the tool is to facilitate territorial planning and risk management in active volcanic areas.
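A much-simplified, hedged illustration of the overlay step described above: a qualitative damage rating is looked up from hazard and vulnerability classes cell by cell; the classes and rating matrix are assumptions, not the authors' calibration.

```python
# Hedged sketch of a raster overlay producing a qualitative damage rating.
import numpy as np

# damage_matrix[hazard_class][vulnerability_class] -> damage rating (0 none .. 3 high)
damage_matrix = np.array([[0, 0, 1],
                          [1, 2, 2],
                          [2, 3, 3]])

hazard = np.array([[0, 1, 2],          # raster of hazard intensity classes
                   [1, 2, 2]])
vulnerability = np.array([[2, 1, 0],   # raster of exposed-element vulnerability classes
                          [1, 1, 2]])

expected_damage = damage_matrix[hazard, vulnerability]   # cell-by-cell lookup
print(expected_damage)
```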
Applications of Support Vector Machines In Chemo And Bioinformatics
NASA Astrophysics Data System (ADS)
Jayaraman, V. K.; Sundararajan, V.
2010-10-01
Conventional linear and nonlinear tools for classification, regression and data-driven modeling are rapidly being replaced by newer techniques and tools based on artificial intelligence and machine learning. While linear techniques are not applicable to inherently nonlinear problems, the newer methods serve as attractive alternatives for solving real-life problems. Support Vector Machine (SVM) classifiers are a set of universal feed-forward-network-based classification algorithms formulated from statistical learning theory and the structural risk minimization principle. SVM regression closely follows the classification methodology. In this work, recent applications of SVM in chemo- and bioinformatics are described with suitable illustrative examples.
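A generic SVM classification sketch in Python (scikit-learn), illustrating the kind of workflow the abstract refers to; the synthetic features stand in for descriptor- or sequence-derived inputs used in chemo- and bioinformatics.

```python
# Hedged, generic SVM classification example on synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = SVC(kernel="rbf", C=1.0, gamma="scale")   # RBF kernel handles nonlinear boundaries
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```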
[SciELO: method for electronic publishing].
Laerte Packer, A; Rocha Biojone, M; Antonio, I; Mayumi Takemaka, R; Pedroso García, A; Costa da Silva, A; Toshiyuki Murasaki, R; Mylek, C; Carvalho Reisl, O; Rocha F Delbucio, H C
2001-01-01
It describes the SciELO (Scientific Electronic Library Online) Methodology for the electronic publishing of scientific periodicals, examining issues such as the transition from traditional printed publication to electronic publishing, the scientific communication process, the principles on which the methodology's development was founded, its application in the building of the SciELO site, its modules and components, the tools used for its construction, etc. The article also discusses the potentialities and trends for the area in Brazil and Latin America, pointing out questions and proposals which should be investigated and solved by the methodology. It concludes that the SciELO Methodology is an efficient, flexible and comprehensive solution for scientific electronic publishing.
Application of the Hardman methodology to the Army Remotely Piloted Vehicle (RPV)
NASA Technical Reports Server (NTRS)
1983-01-01
The application of the HARDMAN Methodology to the Remotely Piloted Vehicle (RPV) is described. The methodology was used to analyze the manpower, personnel, and training (MPT) requirements of the proposed RPV system design for a number of operating scenarios. The RPV system is defined as consisting of the equipment, personnel, and operational procedures needed to perform five basic artillery missions: reconnaissance, target acquisition, artillery adjustment, target designation and damage assessment. The RPV design evaluated includes an air vehicle (AV), a modular integrated communications and navigation system (MICNS), a ground control station (GCS), a launch subsystem (LS), a recovery subsystem (RS), and a number of ground support requirements. The HARDMAN Methodology is an integrated set of data base management techniques and analytic tools, designed to provide timely and fully documented assessments of the human resource requirements associated with an emerging system's design.
Design and Analysis of Turbines for Space Applications
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Dorney, Daniel J.; Huber, Frank W.
2003-01-01
In order to mitigate the risk of rocket propulsion development, efficient, accurate, detailed fluid dynamics analysis of the turbomachinery is necessary. This analysis is used for component development, design parametrics, performance prediction, and environment definition. To support this requirement, a task was developed at NASA Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. The turbine chosen on which to demonstrate the procedure was a supersonic design suitable for a reusable launch vehicle (RLV). The hot gas path and blading were redesigned to obtain an increased efficiency. The redesign of the turbine was conducted with a consideration of system requirements, realizing that a highly efficient turbine that, for example, significantly increases engine weight, is of limited benefit. Both preliminary and detailed designs were considered. To generate an improved design, one-dimensional (1D) design and analysis tools, computational fluid dynamics (CFD), response surface methodology (RSM), and neural nets (NN) were used.
ERIC Educational Resources Information Center
Dawe, Gerald F. M.; Vetter, Arnie; Martin, Stephen
2004-01-01
A sustainability audit of Holme Lacy College is described. The approach adopted a "triple bottom line" assessment, comprising a number of key steps: a scoping review utilising a revised Royal Institution of Chartered Surveyors project appraisal tool; an environmental impact assessment based on ecological footprinting and a social and…
Applying axiomatic design to a medication distribution system
NASA Astrophysics Data System (ADS)
Raguini, Pepito B.
As the need to minimize medication errors drives many medical facilities to come up with robust solutions to the most common error that affects patients' safety, these hospitals would be wise to put a concerted effort into finding methodologies that can facilitate an optimized medication distribution system. If the hospitals' upper management is looking for an optimization method that is an ideal fit, it is just as important that the right tool be selected for the application at hand. In the present work, we propose the application of Axiomatic Design (AD), which is a process that focuses on the generation and selection of functional requirements to meet the customer needs for product and/or process design. The appeal of the axiomatic approach is that it provides both a formal design process and a set of technical coefficients for meeting the customer's needs. Thus, AD offers a strategy for the effective integration of people, design methods, design tools and design data. Therefore, we propose applying the AD methodology to medical applications with the main objective of allowing nurses the opportunity to provide cost-effective delivery of medications to inpatients, thereby improving the quality of patient care. The AD methodology will be implemented through the use of focused stores, where medications can be readily stored and conveniently located near patients, as well as a mobile apparatus that can also store medications and is commonly used by hospitals, the medication cart. Moreover, a robust methodology called the focused store methodology will be introduced and developed for both the uncapacitated and capacitated case studies, which will set up an appropriate AD framework and design problem for a medication distribution case study.
SYRCLE’s risk of bias tool for animal studies
2014-01-01
Background Systematic Reviews (SRs) of experimental animal studies are not yet common practice, but awareness of the merits of conducting such SRs is steadily increasing. As animal intervention studies differ from randomized clinical trials (RCT) in many aspects, the methodology for SRs of clinical trials needs to be adapted and optimized for animal intervention studies. The Cochrane Collaboration developed a Risk of Bias (RoB) tool to establish consistency and avoid discrepancies in assessing the methodological quality of RCTs. A similar initiative is warranted in the field of animal experimentation. Methods We provide an RoB tool for animal intervention studies (SYRCLE’s RoB tool). This tool is based on the Cochrane RoB tool and has been adjusted for aspects of bias that play a specific role in animal intervention studies. To enhance transparency and applicability, we formulated signalling questions to facilitate judgment. Results The resulting RoB tool for animal studies contains 10 entries. These entries are related to selection bias, performance bias, detection bias, attrition bias, reporting bias and other biases. Half these items are in agreement with the items in the Cochrane RoB tool. Most of the variations between the two tools are due to differences in design between RCTs and animal studies. Shortcomings in, or unfamiliarity with, specific aspects of experimental design of animal studies compared to clinical studies also play a role. Conclusions SYRCLE’s RoB tool is an adapted version of the Cochrane RoB tool. Widespread adoption and implementation of this tool will facilitate and improve critical appraisal of evidence from animal studies. This may subsequently enhance the efficiency of translating animal research into clinical practice and increase awareness of the necessity of improving the methodological quality of animal studies. PMID:24667063
Doing accelerator physics using SDDS, UNIX, and EPICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borland, M.; Emery, L.; Sereno, N.
1995-12-31
The use of the SDDS (Self-Describing Data Sets) file protocol, together with the UNIX operating system and EPICS (Experimental Physics and Industrial Controls System), has proved powerful during the commissioning of the APS (Advanced Photon Source) accelerator complex. The SDDS file protocol has permitted a tool-oriented approach to developing applications, wherein generic programs are written that function as part of multiple applications. While EPICS-specific tools were written for data collection, automated experiment execution, closed-loop control, and so forth, data processing and display are done with the SDDS Toolkit. Experiments and data reduction are implemented as UNIX shell scripts that coordinate the execution of EPICS-specific tools and SDDS tools. Because of the power and generic nature of the individual tools and of the UNIX shell environment, automated experiments can be prepared and executed rapidly in response to unanticipated needs or new ideas. Examples are given of the application of this methodology to beam motion characterization, beam-position-monitor offset measurements, and klystron characterization.
The GenABEL Project for statistical genomics.
Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S
2016-01-01
Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.
Designing application software in wide area network settings
NASA Technical Reports Server (NTRS)
Makpangou, Mesaac; Birman, Ken
1990-01-01
Progress in methodologies for developing robust local area network software has not been matched by similar results for wide area settings. The design of application software spanning multiple local area environments is examined. For important classes of applications, simple design techniques are presented that yield fault tolerant wide area programs. An implementation of these techniques as a set of tools for use within the ISIS system is described.
Application of risk analysis in water resourses management
NASA Astrophysics Data System (ADS)
Varouchakis, Emmanouil; Palogos, Ioannis
2017-04-01
A common cost-benefit analysis approach, which is novel in the risk analysis of hydrologic/hydraulic applications, and a Bayesian decision analysis are applied to aid the decision making on whether or not to construct a water reservoir for irrigation purposes. The alternative option examined is a scaled parabolic fine that varies with over-pumping violations, in contrast to common practices that usually consider short-term fines. Such an application, and in such detail, represents new feedback. The results indicate that the probability uncertainty is the driving issue that determines the optimal decision with each methodology, and, depending on how the unknown probability is handled, each methodology may lead to a different optimal decision. Thus, the proposed tool can help decision makers (stakeholders) to examine and compare different scenarios using two different approaches before making a decision, considering the cost of a hydrologic/hydraulic project and the varied economic charges that water table limit violations can cause inside an audit interval. In contrast to practices that assess the effect of each proposed action separately, considering only current knowledge of the examined issue, this tool aids decision making by considering prior information and the sampling distribution of future successful audits. The tool has been developed as a web service for easier stakeholder access.
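A hedged toy comparison of the two approaches mentioned above; the reservoir cost, parabolic fine schedule and prior are invented for the example.

```python
# Hedged illustration: point-estimate cost-benefit vs. Bayesian expected-cost decision.
import numpy as np

cost_reservoir = 1_000_000            # capital cost of building the reservoir (illustrative)
def fine(violation_fraction):         # scaled parabolic fine for over-pumping violations
    return 3_000_000 * violation_fraction ** 2

# Classical cost-benefit with a single point estimate of the violation fraction
p_hat = 0.4
print("build" if cost_reservoir < fine(p_hat) else "do not build")

# Bayesian decision analysis: average the fine over a prior (updatable with audit data)
prior_samples = np.random.default_rng(2).beta(4, 6, 10_000)   # prior on violation fraction
expected_fine = fine(prior_samples).mean()
print("build" if cost_reservoir < expected_fine else "do not build")
```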
Requirement Metrics for Risk Identification
NASA Technical Reports Server (NTRS)
Hammer, Theodore; Huffman, Lenore; Wilson, William; Rosenberg, Linda; Hyatt, Lawrence
1996-01-01
The Software Assurance Technology Center (SATC) is part of the Office of Mission Assurance of the Goddard Space Flight Center (GSFC). The SATC's mission is to assist National Aeronautics and Space Administration (NASA) projects to improve the quality of software which they acquire or develop. The SATC's efforts are currently focused on the development and use of metric methodologies and tools that identify and assess risks associated with software performance and scheduled delivery. This starts at the requirements phase, where the SATC, in conjunction with software projects at GSFC and other NASA centers is working to identify tools and metric methodologies to assist project managers in identifying and mitigating risks. This paper discusses requirement metrics currently being used at NASA in a collaborative effort between the SATC and the Quality Assurance Office at GSFC to utilize the information available through the application of requirements management tools.
GAMES II Project: a general architecture for medical knowledge-based systems.
Bruno, F; Kindler, H; Leaning, M; Moustakis, V; Scherrer, J R; Schreiber, G; Stefanelli, M
1994-10-01
GAMES II aims at developing a comprehensive and commercially viable methodology to avoid problems ordinarily occurring in KBS development. GAMES II methodology proposes to design a KBS starting from an epistemological model of medical reasoning (the Select and Test Model). The design is viewed as a process of adding symbol level information to the epistemological model. The architectural framework provided by GAMES II integrates the use of different formalisms and techniques providing a large set of tools. The user can select the most suitable one for representing a piece of knowledge after a careful analysis of its epistemological characteristics. Special attention is devoted to the tools dealing with knowledge acquisition (both manual and automatic). A panel of practicing physicians are assessing the medical value of such a framework and its related tools by using it in a practical application.
Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design
NASA Technical Reports Server (NTRS)
Wuerer, J. E.; Gran, M.; Held, T. W.
1994-01-01
The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.
Semantic integration of gene expression analysis tools and data sources using software connectors.
Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G
2013-10-25
The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
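A hedged sketch of the connector idea (not the authors' implementation): an adapter applies declarative transformation rules that map source fields onto terms of a shared reference ontology before handing data to an analysis tool; all class, field and term names are hypothetical.

```python
# Hedged sketch of a software connector with declarative transformation rules.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class TransformationRule:
    source_field: str
    target_term: str                      # term in a shared gene-expression ontology (assumed)
    convert: Callable[[str], object] = str

class Connector:
    def __init__(self, rules: List[TransformationRule]):
        self.rules = rules

    def translate(self, record: Dict[str, str]) -> Dict[str, object]:
        # map each raw field onto its ontology term, converting the value on the way
        return {r.target_term: r.convert(record[r.source_field])
                for r in self.rules if r.source_field in record}

rules = [TransformationRule("probe_id", "geo:ProbeIdentifier"),
         TransformationRule("signal", "geo:ExpressionLevel", float)]
raw = {"probe_id": "AFFX-1001", "signal": "7.82"}
print(Connector(rules).translate(raw))
```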
Deckard, Gloria J; Borkowski, Nancy; Diaz, Deisell; Sanchez, Carlos; Boisette, Serge A
2010-01-01
Designated primary care clinics largely serve low-income and uninsured patients who present with a disproportionate number of chronic illnesses and face great difficulty in obtaining the medical care they need, particularly access to specialty physicians. With limited capacity for providing specialty care, these primary care clinics generally refer patients to safety net hospitals' specialty ambulatory care clinics. A large public safety net health system successfully improved the effectiveness and efficiency of the specialty clinic referral process through the application of Lean Six Sigma, an advanced process-improvement methodology and set of tools driven by statistics and engineering concepts.
Díaz-Zuccarini, V.; Narracott, A.J.; Burriesci, G.; Zervides, C.; Rafiroiu, D.; Jones, D.; Hose, D.R.; Lawford, P.V.
2009-01-01
This paper describes the use of diverse software tools in cardiovascular applications. These tools were primarily developed in the field of engineering and the applications presented push the boundaries of the software to address events related to venous and arterial valve closure, exploration of dynamic boundary conditions or the inclusion of multi-scale boundary conditions from protein to organ levels. The future of cardiovascular research and the challenges that modellers and clinicians face from validation to clinical uptake are discussed from an end-user perspective. PMID:19487202
NASA Technical Reports Server (NTRS)
Mayer, Richard
1988-01-01
The integrated development support environment (IDSE) is a suite of integrated software tools that provide intelligent support for information modelling. These tools assist in function, information, and process modeling. Additional tools exist to assist in gathering and analyzing information to be modeled. This is a user's guide to application of the IDSE. Sections covering the requirements and design of each of the tools are presented. There are currently three integrated computer aided manufacturing definition (IDEF) modeling methodologies: IDEF0, IDEF1, and IDEF2. Also, four appendices exist to describe hardware and software requirements, installation procedures, and basic hardware usage.
Space system operations and support cost analysis using Markov chains
NASA Technical Reports Server (NTRS)
Unal, Resit; Dean, Edwin B.; Moore, Arlene A.; Fairbairn, Robert E.
1990-01-01
This paper evaluates the use of the Markov chain process in probabilistic life cycle cost analysis and suggests further uses of the process as a design aid tool. A methodology is developed for estimating operations and support costs and the expected life for reusable space transportation systems. Application of the methodology is demonstrated for the case of a hypothetical space transportation vehicle. A sensitivity analysis is carried out to explore the effects of uncertainty in key model inputs.
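A hedged sketch of the kind of computation such a methodology implies (not the paper's actual model): an absorbing Markov chain over vehicle states, a per-visit cost for each transient state, and expected visits from the fundamental matrix N = (I - Q)^-1.

```python
# Hedged sketch of an absorbing-Markov-chain O&S cost estimate; states, transition
# probabilities and costs are invented for illustration.
import numpy as np

states = ["flight", "scheduled maint.", "unscheduled maint."]      # transient states
Q = np.array([[0.00, 0.70, 0.25],     # flight -> maintenance states
              [0.90, 0.00, 0.05],     # scheduled maint. -> next flight
              [0.85, 0.10, 0.00]])    # unscheduled maint. -> next flight
# the remaining probability in each row goes to an absorbing "retired" state
visit_cost = np.array([2.0e6, 0.5e6, 1.5e6])   # cost per visit, dollars (illustrative)

N = np.linalg.inv(np.eye(3) - Q)               # expected visits from each starting state
expected_visits = N[0]                          # starting in "flight"
print("expected flights before retirement:", expected_visits[0])
print("expected O&S cost:", expected_visits @ visit_cost)
```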
Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J
2008-04-01
Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology has been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
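A minimal sketch of the core Taguchi calculation mentioned above: a small L4 orthogonal array for three two-level factors and the larger-the-better signal-to-noise ratio; the factor names and replicate yields are illustrative.

```python
# Hedged sketch of a Taguchi L4 design and larger-the-better S/N analysis.
import numpy as np

L4 = np.array([[0, 0, 0],     # each row: levels of factors A, B, C for one run
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])
# replicate yields for each run (e.g. product titres from a fermentation, illustrative)
y = np.array([[52, 55], [61, 59], [48, 50], [70, 68]], dtype=float)

sn = -10 * np.log10(np.mean(1.0 / y**2, axis=1))      # larger-the-better S/N per run

for f in range(3):                                     # main effect of each factor on S/N
    low = sn[L4[:, f] == 0].mean()
    high = sn[L4[:, f] == 1].mean()
    print(f"factor {'ABC'[f]}: best level =", 0 if low > high else 1)
```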
NASA Astrophysics Data System (ADS)
Bruno, N.; Roncella, R.
2018-05-01
The need to safeguard and preserve Cultural Heritage (CH) is increasing; especially in Italy, where the number of historical buildings is considerable, having efficient and standardized processes of CH management and conservation becomes strategic. At present, there are no tools capable of fulfilling all the specific functions required by Cultural Heritage documentation and, due to the complexity of historical assets, there is no solution as flexible and customizable as CH-specific needs require. Nevertheless, the BIM methodology can represent the most effective solution, on condition that proper methodologies, tools and functions are made available. The paper describes ongoing research on the implementation of a Historical BIM system for Parma cathedral, aimed at its maintenance, conservation and restoration. Its main goal was to give a concrete answer to the lack of specific tools required by Cultural Heritage documentation: organized and coordinated storage and management of historical data, easy analysis and query, time management, 3D modelling of irregular shapes, flexibility, user-friendliness, etc. The paper describes the project and the implemented methodology, focusing mainly on the survey and modelling phases. In describing the methodology, critical issues about the creation of an HBIM are highlighted, with the aim of outlining a workflow applicable also in other similar contexts.
Applications of artificial intelligence V; Proceedings of the Meeting, Orlando, FL, May 18-20, 1987
NASA Technical Reports Server (NTRS)
Gilmore, John F. (Editor)
1987-01-01
The papers contained in this volume focus on current trends in applications of artificial intelligence. Topics discussed include expert systems, image understanding, artificial intelligence tools, knowledge-based systems, heuristic systems, manufacturing applications, and image analysis. Papers are presented on expert system issues in automated, autonomous space vehicle rendezvous; traditional versus rule-based programming techniques; applications to the control of optional flight information; methodology for evaluating knowledge-based systems; and real-time advisory system for airborne early warning.
Aglago, Elom Kouassivi; Landais, Edwige; Nicolas, Geneviève; Margetts, Barrie; Leclercq, Catherine; Allemand, Pauline; Aderibigbe, Olaide; Agueh, Victoire Damienne; Amuna, Paul; Annor, George Amponsah; El Ati, Jalila; Coates, Jennifer; Colaiezzi, Brooke; Compaore, Ella; Delisle, Hélène; Faber, Mieke; Fungo, Robert; Gouado, Inocent; El Hamdouchi, Asmaa; Hounkpatin, Waliou Amoussa; Konan, Amoin Georgette; Labzizi, Saloua; Ledo, James; Mahachi, Carol; Maruapula, Segametsi Ditshebo; Mathe, Nonsikelelo; Mbabazi, Muniirah; Mirembe, Mandy Wilja; Mizéhoun-Adissoda, Carmelle; Nzi, Clement Diby; Pisa, Pedro Terrence; El Rhazi, Karima; Zotor, Francis; Slimani, Nadia
2017-06-19
Collection of reliable and comparable individual food consumption data is of primary importance to better understand, control and monitor malnutrition and its related comorbidities in low- and middle-income countries (LMICs), including in Africa. The lack of standardised dietary tools and their related research support infrastructure remains a major obstacle to implementing concerted and region-specific research and action plans worldwide. Citing the magnitude and importance of this challenge, the International Agency for Research on Cancer (IARC/WHO) launched the "Global Nutrition Surveillance initiative" to pilot test the use of a standardized 24-h dietary recall research tool (GloboDiet), validated in Europe, in other regions. In this regard, the development of GloboDiet-Africa can be optimised by a better understanding of the local specific methodological needs, barriers and opportunities. The study aimed to evaluate the standardized 24-h dietary recall research tool (GloboDiet) as a possible common methodology for research and surveillance across Africa. A consultative panel of African and international experts in dietary assessment participated in six e-workshop sessions. They completed an in-depth e-questionnaire to evaluate the GloboDiet dietary methodology before and after participating in the e-workshop. The 29 experts expressed their satisfaction with the potential of the software to address local specific needs when evaluating the main structure of the software, the stepwise approach for data collection and the standardisation concept. Nevertheless, additional information to better describe local foods and recipes, as well as particular culinary patterns (e.g. mortar pounding), was proposed. Furthermore, food quantification in shared-plate and shared-bowl eating situations and the interviewing of populations with low literacy skills, especially in rural settings, were acknowledged as requiring further specific considerations and appropriate solutions. An overall positive evaluation of the GloboDiet methodology by both African and international experts supports the flexibility and potential applicability of this tool in diverse African settings and sets a positive platform for improved dietary monitoring and surveillance. Following this evaluation, a prerequisite for future implementation and/or adaptation of GloboDiet in Africa, rigorous and robust capacity building as well as knowledge transfer will be required to road-map a stepwise approach to implementing this methodology across pilot African countries/regions.
NASA Astrophysics Data System (ADS)
Zolfaghari, Mohammad R.
2009-07-01
Recent achievements in computer and information technology have provided the necessary tools to extend the application of probabilistic seismic hazard mapping from its traditional engineering use to many other applications, such as risk mitigation, disaster management, post-disaster recovery planning, catastrophe loss estimation and risk management. Due to the lack of proper knowledge of the factors controlling seismic hazards, there are always uncertainties associated with all steps involved in developing and using seismic hazard models. While some of these uncertainties can be controlled by more accurate and reliable input data, many of the data and assumptions used in seismic hazard studies remain highly uncertain and contribute to the uncertainty of the final results. In this paper a new methodology for the assessment of seismic hazard is described. The proposed approach provides a practical means of capturing spatial variations of seismological and tectonic characteristics, which allows better treatment of their uncertainties. In the proposed approach, GIS raster-based data models are used to represent geographical features in a cell-based system. The cell-based source model proposed in this paper provides a framework for incorporating many geographically referenced seismotectonic factors into seismic hazard modelling, such as seismic source boundaries, rupture geometry, seismic activity rate, focal depth and the choice of attenuation functions. The proposed methodology improves several aspects of the standard analytical tools currently used for the assessment and mapping of regional seismic hazard, makes the best use of recent advances in computer software and hardware, and is well structured for implementation with conventional GIS tools.
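The abstract stops short of implementation detail; as an illustration only, the following is a minimal sketch of how a raster cell-based source model can aggregate hazard at a site. The grid, activity rates and attenuation coefficients are invented for the example and are not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

# Illustrative 10 x 10 km raster of source cells (cell-based data model).
# Each cell carries a hypothetical annual rate of M >= 5 events.
xs, ys = np.meshgrid(np.arange(5, 100, 10), np.arange(5, 100, 10))
rates = np.full(xs.shape, 1e-3)                 # events / year / cell

site = np.array([50.0, 50.0])                   # site coordinates (km)
dist = np.hypot(xs - site[0], ys - site[1])     # epicentral distance per cell

def attenuation(magnitude, r_km):
    """Generic attenuation relation: mean ln(PGA in g) and sigma (illustrative)."""
    ln_pga = -3.5 + 1.0 * magnitude - 1.5 * np.log(r_km + 10.0)
    return ln_pga, 0.6

# Annual rate of exceeding 0.1 g at the site, summed over all cells
# (a single representative magnitude is used for brevity).
ln_mean, sigma = attenuation(5.5, dist)
p_exceed = 1.0 - norm.cdf(np.log(0.1), loc=ln_mean, scale=sigma)
annual_rate = np.sum(rates * p_exceed)
print(f"Annual exceedance rate of 0.1 g: {annual_rate:.2e}")
print(f"50-year exceedance probability: {1 - np.exp(-50 * annual_rate):.2%}")
```

In a full implementation each cell would also carry its own magnitude-frequency relation, focal depth and attenuation choice, exactly the geographically referenced factors the abstract lists.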
Towards sustainable mobile systems configurations: Application to a tuna purse seiner.
García Rellán, A; Vázquez Brea, C; Bello Bugallo, P M
2018-08-01
Fishing is one of the most important marine activities. It contributes to both overfishing and marine pollution, the two main threats to the ocean environment. In this context, the aim of this work is to investigate and validate methodologies for identifying more sustainable operating configurations for a tuna purse seiner. The proposed methodology is based on a previous one applied to secondary industrial systems, taking into account the Integrated Pollution Prevention and Control focus developed for the most potentially polluting industrial sources. The idea is to apply the same type of methodologies and concepts used for stationary (point-source) secondary industry to a primary, mobile industrial activity. This methodology combines two tools: "Material and Energy Flow Analysis" (a tool from industrial metabolism) and "Best Available Techniques Analysis". The first provides a way to detect "Improvable Flows" in the system, and the second provides a way to define sustainable options to improve them. Five main Improvable Flows have been identified in the selected case study, the activity of a purse seiner, most of them related to energy consumption and air emissions in different stages of the fishing activity. Thirty-one Best Available Techniques candidates that could potentially improve the sustainability of the activity have been inventoried for the system; seven of them have not yet been implemented in the case study. The potential improvements of the system proposed by this work are related to energy efficiency, waste management, and the prevention and control of air emissions. The methodology proves to be a good tool not only for sustainable stationary systems but also for sustainable mobile systems, such as the fishing activity in the oceans, as validated here for the tuna purse seiner. The practical application of the identified technologies to fishing systems will contribute to preventing and reducing marine pollution, one of the greatest threats to today's oceans. Copyright © 2017 Elsevier B.V. All rights reserved.
The GenABEL Project for statistical genomics
Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.
2016-01-01
Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of the software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices, including the use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381
Techniques and Tools for Performance Tuning of Parallel and Distributed Scientific Applications
NASA Technical Reports Server (NTRS)
Sarukkai, Sekhar R.; VanderWijngaart, Rob F.; Castagnera, Karen (Technical Monitor)
1994-01-01
Performance degradation in scientific computing on parallel and distributed computer systems can be caused by numerous factors. In this half-day tutorial we explain the important methodological issues involved in obtaining codes that have good performance potential. We then discuss the possible obstacles to realizing that potential on contemporary hardware platforms and give an overview of the software tools currently available for identifying performance bottlenecks. Finally, some realistic examples are used to illustrate the actual use and utility of such tools.
A Screening Method for Assessing Cumulative Impacts
Alexeeff, George V.; Faust, John B.; August, Laura Meehan; Milanes, Carmen; Randles, Karen; Zeise, Lauren; Denton, Joan
2012-01-01
The California Environmental Protection Agency (Cal/EPA) Environmental Justice Action Plan calls for guidelines for evaluating “cumulative impacts.” As a first step toward such guidelines, a screening methodology for assessing cumulative impacts in communities was developed. The method, presented here, is based on the working definition of cumulative impacts adopted by Cal/EPA [1]: “Cumulative impacts means exposures, public health or environmental effects from the combined emissions and discharges in a geographic area, including environmental pollution from all sources, whether single or multi-media, routinely, accidentally, or otherwise released. Impacts will take into account sensitive populations and socio-economic factors, where applicable and to the extent data are available.” The screening methodology is built on this definition as well as current scientific understanding of environmental pollution and its adverse impacts on health, including the influence of both intrinsic, biological factors and non-intrinsic socioeconomic factors in mediating the effects of pollutant exposures. It addresses disparities in the distribution of pollution and health outcomes. The methodology provides a science-based tool to screen places for relative cumulative impacts, incorporating both the pollution burden on a community (including exposures to pollutants and their public health and environmental effects) and community characteristics, specifically sensitivity and socioeconomic factors. The screening methodology provides relative rankings to distinguish more highly impacted communities from less impacted ones. It may also help identify which factors are the greatest contributors to a community’s cumulative impact. It is not designed to provide quantitative estimates of community-level health impacts. A pilot screening analysis is presented here to illustrate the application of this methodology. Once guidelines are adopted, the methodology can serve as a screening tool to help Cal/EPA programs prioritize their activities and target those communities with the greatest cumulative impacts. PMID:22470315
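The abstract does not specify how the pollution-burden and community-characteristic components are combined into a relative ranking; purely as a hypothetical illustration, a multiplicative aggregation of component averages could be sketched as follows (the indicator names, values and weighting scheme are invented, not the Cal/EPA method).

```python
# Hypothetical indicator scores per community, scaled 0-10 (illustrative only).
communities = {
    "Community A": {"pollution_burden": [7.2, 6.5, 8.0],   # e.g. exposures, effects
                    "population_chars": [6.0, 7.5]},        # e.g. sensitivity, socioeconomic
    "Community B": {"pollution_burden": [3.1, 2.0, 4.2],
                    "population_chars": [2.5, 3.0]},
}

def relative_impact(c):
    """Assumed multiplicative aggregation of the two component averages."""
    burden = sum(c["pollution_burden"]) / len(c["pollution_burden"])
    popchar = sum(c["population_chars"]) / len(c["population_chars"])
    return burden * popchar

ranked = sorted(communities, key=lambda k: relative_impact(communities[k]), reverse=True)
for name in ranked:
    print(name, round(relative_impact(communities[name]), 1))
```

Such a relative score supports the ranking use described above but, as the abstract notes, it is not a quantitative estimate of community-level health impact.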
Computer-Aided Sensor Development Focused on Security Issues.
Bialas, Andrzej
2016-05-26
The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.
Computer-Aided Sensor Development Focused on Security Issues
Bialas, Andrzej
2016-01-01
The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research. PMID:27240360
Abdur-Rashid, Khalil; Furber, Steven Woodward; Abdul-Basser, Taha
2013-04-01
We survey the meta-ethical tools and institutional processes that traditional Islamic ethicists apply when deliberating on bioethical issues. We present a typology of these methodological elements, giving particular attention to the meta-ethical techniques and devices that traditional Islamic ethicists employ in the absence of decisive or univocal authoritative texts or in the absence of established transmitted cases. In describing how traditional Islamic ethicists work, we demonstrate that these experts possess a variety of discursive tools. We find that the ethical responsa (i.e., the products of the application of the tools that we describe) are generally characterized by internal consistency. We also conclude that Islamic ethical reasoning on bioethical issues, while clearly scripture-based, is also characterized by strong consequentialist elements and possesses clear principles-based characteristics. The paper contributes to the study of bioethics by familiarizing non-specialists in Islamic ethics with the role, scope, and applicability of key Islamic ethical concepts, such as "aims" (maqāṣid), "universals" (kulliyyāt), "interest" (maṣlaḥa), "maxims" (qawā`id), "controls" (ḍawābit), "differentiators" (furūq), "preponderization" (tarjīḥ), and "extension" (tafrī`).
Szaleniec, Maciej
2012-01-01
Artificial Neural Networks (ANNs) are introduced as robust and versatile tools in quantitative structure-activity relationship (QSAR) modeling. Their application to the modeling of enzyme reactivity is discussed, along with methodological issues. Methods of input variable selection, optimization of network internal structure, data set division and model validation are discussed. The application of ANNs in the modeling of enzyme activity over the last 20 years is briefly recounted. The discussed methodology is exemplified by the case of ethylbenzene dehydrogenase (EBDH). Intelligent Problem Solver and genetic algorithms are applied for input vector selection, whereas k-means clustering is used to partition the data into training and test cases. The obtained models exhibit high correlation between the predicted and experimental values (R(2) > 0.9). Sensitivity analyses and study of the response curves are used as tools for the physicochemical interpretation of the models in terms of the EBDH reaction mechanism. Neural networks are shown to be a versatile tool for the construction of robust QSAR models that can be applied to a range of aspects important in drug design and the prediction of biological activity.
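As a concrete illustration of the workflow described above (leaving descriptor selection aside), a minimal QSAR-style sketch using scikit-learn is given below; the k-means-based partitioning mirrors the data-set division step mentioned in the abstract, while the synthetic descriptors, network size and all other settings are invented for the example and are not the EBDH study's.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 6))                  # synthetic molecular descriptors
y = X @ np.array([1.5, -2.0, 0.7, 0.0, 0.3, -1.1]) + rng.normal(0, 0.2, 120)

# k-means partitioning: one representative compound per cluster goes to the
# test set, so the test cases span the whole descriptor space.
clusters = KMeans(n_clusters=24, n_init=10, random_state=0).fit_predict(X)
test_idx = np.array([np.where(clusters == c)[0][0] for c in range(24)])
train_idx = np.setdiff1d(np.arange(len(X)), test_idx)

# Small feed-forward ANN as the QSAR regressor
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
ann.fit(X[train_idx], y[train_idx])
print("test R^2:", round(r2_score(y[test_idx], ann.predict(X[test_idx])), 3))
```

Sensitivity analysis of such a fitted model (perturbing one descriptor at a time and observing the response) is what allows the physicochemical interpretation mentioned in the abstract.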
Validation of highly reliable, real-time knowledge-based systems
NASA Technical Reports Server (NTRS)
Johnson, Sally C.
1988-01-01
Knowledge-based systems have the potential to greatly increase the capabilities of future aircraft and spacecraft and to significantly reduce support manpower needed for the space station and other space missions. However, a credible validation methodology must be developed before knowledge-based systems can be used for life- or mission-critical applications. Experience with conventional software has shown that the use of good software engineering techniques and static analysis tools can greatly reduce the time needed for testing and simulation of a system. Since exhaustive testing is infeasible, reliability must be built into the software during the design and implementation phases. Unfortunately, many of the software engineering techniques and tools used for conventional software are of little use in the development of knowledge-based systems. Therefore, research at Langley is focused on developing a set of guidelines, methods, and prototype validation tools for building highly reliable, knowledge-based systems. The use of a comprehensive methodology for building highly reliable, knowledge-based systems should significantly decrease the time needed for testing and simulation. A proven record of delivering reliable systems at the beginning of the highly visible testing and simulation phases is crucial to the acceptance of knowledge-based systems in critical applications.
NASA Astrophysics Data System (ADS)
Marti, Joan; Bartolini, Stefania; Becerril, Laura
2016-04-01
VeTOOLS is a project funded by the European Commission's Humanitarian Aid and Civil Protection department (ECHO) that aims at creating an integrated software platform specially designed to assess and manage volcanic risk. The project facilitates interaction and cooperation between scientists and Civil Protection Agencies in order to share, unify, and exchange procedures, methodologies and technologies to effectively reduce the impacts of volcanic disasters. The project aims at 1) improving and developing volcanic risk assessment and management capacities in active volcanic regions; 2) developing universal methodologies, scenario definitions, response strategies and alert protocols to cope with the full range of volcanic threats; 3) improving quantitative methods and tools for vulnerability and risk assessment; and 4) defining thresholds and protocols for civil protection. With these objectives, the VeTOOLS project addresses two of the Sendai Framework resolutions: i) provide guidance on methodologies and standards for risk assessments, disaster risk modelling and the use of data; and ii) promote and support the availability and application of science and technology to decision-making. It thus offers a good example of how close collaboration between science and civil protection can contribute effectively to DRR. European Commission ECHO Grant SI2.695524
Process synthesis involving multi-period operations by the P-graph framework
The P-graph (process graph) framework is an effective tool for process-network synthesis (PNS). Here we extended it to multi-period operations. The efficacy of the P-graph methodology has been demonstrated by numerous applications. The unambiguous representation of processes and ...
Fault Tree Analysis: An Emerging Methodology for Instructional Science.
ERIC Educational Resources Information Center
Wood, R. Kent; And Others
1979-01-01
Describes Fault Tree Analysis, a tool for systems analysis which attempts to identify possible modes of failure in systems to increase the probability of success. The article defines the technique and presents the steps of FTA construction, focusing on its application to education. (RAO)
DOT National Transportation Integrated Search
2002-07-01
The purpose of the work is to validate the safety assessment methodology previously developed for passenger rail vehicle dynamics, which requires the application of simulation tools as well as testing of vehicles under different track scenarios. This...
Khan, F I; Abbasi, S A
2000-07-10
Fault tree analysis (FTA) is based on constructing a hypothetical tree of base events (initiating events) branching into numerous other sub-events, propagating the fault and eventually leading to the top event (accident). It has been a powerful technique used traditionally in identifying hazards in nuclear installations and power industries. As the systematic articulation of the fault tree is associated with assigning probabilities to each fault, the exercise is also sometimes called probabilistic risk assessment. But powerful as this technique is, it is also very cumbersome and costly, limiting its area of application. We have developed a new algorithm based on analytical simulation (named as AS-II), which makes the application of FTA simpler, quicker, and cheaper; thus opening up the possibility of its wider use in risk assessment in chemical process industries. Based on the methodology we have developed a computer-automated tool. The details are presented in this paper.
Wolfs, Vincent; Villazon, Mauricio Florencio; Willems, Patrick
2013-01-01
Applications such as real-time control, uncertainty analysis and optimization require an extensive number of model iterations. Full hydrodynamic sewer models are not sufficient for these applications due to the excessive computation time. Simplifications are consequently required. A lumped conceptual modelling approach results in a much faster calculation. The process of identifying and calibrating the conceptual model structure could, however, be time-consuming. Moreover, many conceptual models lack accuracy, or do not account for backwater effects. To overcome these problems, a modelling methodology was developed which is suited for semi-automatic calibration. The methodology is tested for the sewer system of the city of Geel in the Grote Nete river basin in Belgium, using both synthetic design storm events and long time series of rainfall input. A MATLAB/Simulink(®) tool was developed to guide the modeller through the step-wise model construction, reducing significantly the time required for the conceptual modelling process.
Sadiq, Rehan; Rodriguez, Manuel J
2005-04-01
Interpreting the water quality data routinely generated for control and monitoring purposes in water distribution systems is a complicated task for utility managers. In fact, data for diverse water quality indicators (physico-chemical and microbiological) are generated at different times and at different locations in the distribution system. To simplify and improve the understanding and interpretation of water quality, methodologies for the aggregation and fusion of data must be developed. In this paper, the Dempster-Shafer theory, also called the theory of evidence, is introduced as a potential methodology for interpreting water quality data. The conceptual basis of this methodology and the process for its implementation are presented through two applications. The first application deals with the fusion of spatial water quality data, while the second deals with the development of a water quality index based on key monitored indicators. Based on the results obtained, the authors discuss the potential contribution of the theory of evidence as a decision-making tool for water quality management.
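For readers unfamiliar with the theory of evidence, Dempster's rule of combination (the core operation behind the data fusion described above) can be sketched as follows; the frame of discernment and the mass assignments are invented for the example and do not come from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments.

    Masses are dicts keyed by frozensets over the frame of discernment;
    conflicting mass is discarded and the remainder renormalized.
    """
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

GOOD, POOR = frozenset({"good"}), frozenset({"poor"})
EITHER = GOOD | POOR   # ignorance: "good or poor"

# Hypothetical evidence from two water quality indicators
turbidity = {GOOD: 0.6, POOR: 0.1, EITHER: 0.3}
chlorine = {GOOD: 0.5, POOR: 0.2, EITHER: 0.3}
for hypothesis, mass in dempster_combine(turbidity, chlorine).items():
    print(set(hypothesis), round(mass, 3))
```

The explicit mass on ignorance is what distinguishes this kind of fusion from a simple weighted average of indicator scores.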
Molinos-Senante, María; Hernández-Sancho, Francesc; Sala-Garrido, Ramón
2012-01-01
The concept of sustainability involves the integration of economic, environmental, and social aspects and this also applies in the field of wastewater treatment. Economic feasibility studies are a key tool for selecting the most appropriate option from a set of technological proposals. Moreover, these studies are needed to assess the viability of transferring new technologies from pilot-scale to full-scale. In traditional economic feasibility studies, the benefits that have no market price, such as environmental benefits, are not considered and are therefore underestimated. To overcome this limitation, we propose a new methodology to assess the economic viability of wastewater treatment technologies that considers internal and external impacts. The estimation of the costs is based on the use of cost functions. To quantify the environmental benefits from wastewater treatment, the distance function methodology is proposed to estimate the shadow price of each pollutant removed in the wastewater treatment. The application of this methodological approach by decision makers enables the calculation of the true costs and benefits associated with each alternative technology. The proposed methodology is presented as a useful tool to support decision making.
Enhanced Learning Methodologies and the Implementation of an Identification Course
NASA Astrophysics Data System (ADS)
Guidorzi, Roberto
This paper proposes some considerations on the role played by information and communication technologies in the evolution of educational systems and describes the design philosophy and realization of a basic course on dynamic system identification that relies on constructivist methodologies and on the use of e-learning environments. It also reports some of the opinions expressed by the students on the effectiveness of the available tools and on their role in acquiring proficiency in applying identification techniques to the modeling of real processes.
NASA Astrophysics Data System (ADS)
Neuville, R.; Pouliot, J.; Poux, F.; Hallot, P.; De Rudder, L.; Billen, R.
2017-10-01
This paper deals with the establishment of a comprehensive methodological framework that defines 3D visualisation rules and its application in a decision support tool. Whilst the use of 3D models grows in many application fields, their visualisation remains challenging from the point of view of the mapping and rendering choices needed to suitably support the decision-making process. Indeed, a great number of 3D visualisation techniques exist but, as far as we know, a decision support tool that facilitates the production of an efficient 3D visualisation is still missing. This is why a comprehensive methodological framework is proposed in order to build decision tables for specific data, tasks and contexts. Based on the second-order logic formalism, we define a set of functions and propositions among and between two collections of entities: on the one hand static retinal variables (hue, size, shape, etc.) and 3D environment parameters (directional lighting, shadow, haze, etc.), and on the other hand their effect(s) with regard to specific visual tasks. This enables 3D visualisation rules to be defined according to four categories: consequence, compatibility, potential incompatibility and incompatibility. In this paper, the application of the methodological framework is demonstrated for an urban visualisation at high density, considering a specific set of entities. On the basis of our analysis and the results of many studies conducted in 3D semiotics, which refers to the study of symbols and how they relay information, the truth values of the propositions are determined. 3D visualisation rules are then extracted for the considered context and set of entities and are presented in a decision table with a colour coding. Finally, the decision table is implemented in a plugin developed with three.js, a cross-browser JavaScript library. The plugin consists of a sidebar and warning windows that help the designer in the use of a set of static retinal variables and 3D environment parameters.
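The decision table itself is specific to the paper's entities and is not reproduced in the abstract; a deliberately simplified sketch of how such a table could be encoded and queried in a design tool is shown below. The task, context, variables and their category assignments are all hypothetical; only the four rule categories are taken from the abstract.

```python
# Illustrative encoding of a 3D visualisation decision table: for a given
# visual task and context, each static retinal variable or 3D environment
# parameter is assigned one of the four rule categories named in the paper.
CONSEQUENCE, COMPATIBLE, POTENTIAL_INCOMPAT, INCOMPATIBLE = (
    "consequence", "compatibility", "potential incompatibility", "incompatibility")

DECISION_TABLE = {
    ("identify landmark", "urban, high density"): {
        "hue": COMPATIBLE,
        "size": POTENTIAL_INCOMPAT,           # e.g. occlusion between buildings
        "directional lighting": CONSEQUENCE,  # e.g. implies shadow handling
        "haze": INCOMPATIBLE,
    },
}

def evaluate(task, context, chosen_variables):
    """Return the rule category for each variable the designer wants to use."""
    rules = DECISION_TABLE.get((task, context), {})
    return {v: rules.get(v, "not evaluated") for v in chosen_variables}

print(evaluate("identify landmark", "urban, high density",
               ["hue", "haze", "shape"]))
```

A plugin sidebar of the kind described above would surface these categories as warnings while the designer selects retinal variables and environment parameters.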
Modeling Methodologies for Design and Control of Solid Oxide Fuel Cell APUs
NASA Astrophysics Data System (ADS)
Pianese, C.; Sorrentino, M.
2009-08-01
Among the existing fuel cell technologies, Solid Oxide Fuel Cells (SOFC) are particularly suitable for both stationary and mobile applications, due to their high energy conversion efficiencies, modularity, high fuel flexibility, and low emissions and noise. Moreover, their high working temperatures enable their use in efficient cogeneration applications. SOFCs are entering a pre-industrial era, and strong interest in design tools has grown in recent years. Optimal system configuration, component sizing, and control and diagnostic system design require computational tools that meet the conflicting needs of accuracy, affordable computation time, limited experimental effort and flexibility. The paper gives an overview of control-oriented modeling of SOFCs at both the single-cell and stack level. Such an approach provides useful simulation tools for designing and controlling SOFC APUs destined for a wide application area, ranging from automotive to marine and airplane APUs.
Coussot, G; Ladner, Y; Bayart, C; Faye, C; Vigier, V; Perrin, C
2015-01-09
This work aims at studying the potential of an on-line capillary electrophoresis (CE)-based digestion methodology for evaluating the degradability of polymer-drug conjugates in the presence of free trypsin (in-solution digestion). A sandwich plug injection scheme with transverse diffusion of laminar flow profiles (TDLFP) was used to achieve on-line digestion. Electrophoretic separation conditions were established using poly-L-lysine (PLL) as the reference substrate. Comparison with off-line digestion was carried out to demonstrate the feasibility of the proposed methodology. The applicability of the on-line CE-based digestion methodology was evaluated for two PLL-drug conjugates and for the first four generations of dendrigrafts of lysine (DGL). Different electrophoretic profiles showing the formation of di-, tri- and tetralysine were observed for the PLL-drug conjugates and DGL. These findings are in good agreement with the nature of the linker used to attach the drug to the PLL structure and with the predicted degradability of DGL. The applicability of the present on-line methodology was also successfully demonstrated for the hydrolysis of protein conjugates. In summary, the described methodology provides a powerful tool for the rapid study of biodegradable polymers. Copyright © 2014 Elsevier B.V. All rights reserved.
Incorporation of lean methodology into pharmacy residency programs.
John, Natalie; Snider, Holly; Edgerton, Lisa; Whalin, Laurie
2017-03-15
The implementation of lean methodology into pharmacy residency programs at a community teaching hospital is described. New Hanover Regional Medical Center, a community teaching hospital in southeastern North Carolina, fully adopted a lean culture in 2010. Given the success of lean strategies organizationally, this methodology was used to assist with the evaluation and development of its pharmacy residency programs in 2014. Lean tools and activities have also been incorporated into residency requirements and rotation learning activities. The majority of lean events correspond to the required competency areas evaluating leadership and management, teaching, and education. These events have included participation in and facilitation of various lean problem-solving and communication tools. The application of the 4 rules of lean has resulted in enhanced management of the programs and provides a set of tools by which continual quality improvement can be ensured. Regular communication and direct involvement of all invested parties have been critical in developing and sustaining new improvements. In addition to program enhancements, lean methodology offers novel methods by which residents may be incorporated into leadership activities. The incorporation of lean methodology into pharmacy residency programs has translated into a variety of realized and potential benefits for the programs, the preceptors and residents, and the health system. Specific areas of growth have included quality-improvement processes, the expansion of leadership opportunities for residents, and improved communication among program directors, preceptors, and residents. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
Benefit-cost methodology study with example application of the use of wind generators
NASA Technical Reports Server (NTRS)
Zimmer, R. P.; Justus, C. G.; Mason, R. M.; Robinette, S. L.; Sassone, P. G.; Schaffer, W. A.
1975-01-01
An example application for cost-benefit methodology is presented for the use of wind generators. The approach adopted for the example application consisted of the following activities: (1) surveying of the available wind data and wind power system information, (2) developing models which quantitatively described wind distributions, wind power systems, and cost-benefit differences between conventional systems and wind power systems, and (3) applying the cost-benefit methodology to compare a conventional electrical energy generation system with systems which included wind power generators. Wind speed distribution data were obtained from sites throughout the contiguous United States and were used to compute plant factor contours shown on an annual and seasonal basis. Plant factor values (ratio of average output power to rated power) are found to be as high as 0.6 (on an annual average basis) in portions of the central U. S. and in sections of the New England coastal area. Two types of wind power systems were selected for the application of the cost-benefit methodology. A cost-benefit model was designed and implemented on a computer to establish a practical tool for studying the relative costs and benefits of wind power systems under a variety of conditions and to efficiently and effectively perform associated sensitivity analyses.
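As a worked illustration of the plant factor concept used above (ratio of average output power to rated power), the sketch below estimates it for a Rayleigh wind-speed regime and a generic turbine power curve; the cut-in, rated and cut-out speeds are invented and do not correspond to the systems studied in the report.

```python
import numpy as np

def plant_factor(mean_speed, v_cut_in=4.0, v_rated=12.0, v_cut_out=25.0, n=100_000):
    """Plant factor (average output / rated power) for a Rayleigh wind regime.

    The power curve and speed thresholds are illustrative, not from the study.
    """
    rng = np.random.default_rng(1)
    # Rayleigh distribution parameterised by the given mean wind speed
    v = rng.rayleigh(scale=mean_speed * np.sqrt(2.0 / np.pi), size=n)
    p = np.zeros_like(v)                      # normalised power output (0..1)
    rising = (v >= v_cut_in) & (v < v_rated)
    p[rising] = (v[rising] ** 3 - v_cut_in ** 3) / (v_rated ** 3 - v_cut_in ** 3)
    p[(v >= v_rated) & (v <= v_cut_out)] = 1.0
    return p.mean()

for m in (5.0, 7.0, 9.0):
    print(f"mean wind speed {m} m/s -> plant factor {plant_factor(m):.2f}")
```

Plant factors of this kind, mapped over regional wind statistics, are exactly the inputs that feed the cost-benefit comparison between conventional and wind-augmented generation described in the abstract.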
KNOW ESSENTIALS: a tool for informed decisions in the absence of formal HTA systems.
Mathew, Joseph L
2011-04-01
Most developing countries and resource-limited settings lack robust health technology assessment (HTA) systems. Because the development of locally relevant HTA is not immediately viable, and the extrapolation of external HTA is inappropriate, a new model for evaluating health technologies is required. The aim of this study was to describe the development and application of KNOW ESSENTIALS, a tool facilitating evidence-based decisions on health technologies by stakeholders in settings lacking formal HTA systems. Current HTA methodology was examined through literature search. Additional issues relevant to resource-limited settings, but not adequately addressed in current methodology, were identified through further literature search, appraisal of contextually relevant issues, discussion with healthcare professionals familiar with the local context, and personal experience. A set of thirteen elements important for evidence-based decisions was identified, selected and combined into a tool with the mnemonic KNOW ESSENTIALS. Detailed definitions for each element, coding for the elements, and a system to evaluate a given health technology using the tool were developed. Developing countries and resource-limited settings face several challenges to informed decision making. Models that are relevant and applicable in high-income countries are unlikely in such settings. KNOW ESSENTIALS is an alternative that facilitates evidence-based decision making by stakeholders without formal expertise in HTA. The tool could be particularly useful, as an interim measure, in healthcare systems that are developing HTA capacity. It could also be useful anywhere when rapid evidence-based decisions on health technologies are required.
Evaluating an Inquiry-Based Bioinformatics Course Using Q Methodology
ERIC Educational Resources Information Center
Ramlo, Susan E.; McConnell, David; Duan, Zhong-Hui; Moore, Francisco B.
2008-01-01
Faculty at a Midwestern metropolitan public university recently developed a course on bioinformatics that emphasized collaboration and inquiry. Bioinformatics, essentially the application of computational tools to biological data, is inherently interdisciplinary. Thus part of the challenge of creating this course was serving the needs and…
Encouraging the learning of hydraulic engineering subjects in agricultural engineering schools
NASA Astrophysics Data System (ADS)
Rodríguez Sinobas, Leonor; Sánchez Calvo, Raúl
2014-09-01
Several methodological approaches to improve the understanding and motivation of students in Hydraulic Engineering courses have been adopted in the Agricultural Engineering School at the Technical University of Madrid. Over three years, students' progress and satisfaction have been assessed by continuous monitoring and the use of online and web tools in two undergraduate courses. Results from their application to encourage learning and communication skills in Hydraulic Engineering subjects are analysed and compared to the initial situation. Students' academic performance has improved since their introduction, but surveys among students showed that not all the methodological proposals were perceived as beneficial. Their participation in the online, classroom and reading activities was low, although these activities were rated positively.
Disposable Screen Printed Electrochemical Sensors: Tools for Environmental Monitoring
Hayat, Akhtar; Marty, Jean Louis
2014-01-01
Screen printing technology is a widely used technique for the fabrication of electrochemical sensors. This methodology is likely to underpin the progressive drive towards miniaturized, sensitive and portable devices, and has already established its route from “lab-to-market” for a plethora of sensors. The application of these sensors for analysis of environmental samples has been the major focus of research in this field. As a consequence, this work will focus on recent important advances in the design and fabrication of disposable screen printed sensors for the electrochemical detection of environmental contaminants. Special emphasis is given on sensor fabrication methodology, operating details and performance characteristics for environmental applications. PMID:24932865
Characteristics of a semi-custom library development system
NASA Technical Reports Server (NTRS)
Yancey, M.; Cannon, R.
1990-01-01
Standard cell and gate array macro libraries are in common use with workstation computer aided design (CAD) tools for semi-custom application specific integrated circuit (ASIC) design and have resulted in significant improvements in overall design efficiency as contrasted with custom design methodologies. Similar design methodology enhancements providing for the efficient development of the library cells themselves are an important factor in responding to the need for continuous technology improvement. The characteristics of a library development system that provides design flexibility and productivity enhancements for the library development engineer, as libraries are produced in state-of-the-art process technologies, are presented. An overview of Gould's library development system ('Accolade') is also presented.
Manson, Amy; Poyade, Matthieu; Rea, Paul
2015-10-19
The use of computer-aided learning in education can be advantageous, especially when interactive three-dimensional (3D) models are used to aid the learning of complex 3D structures. The anatomy of the ventricular system of the brain is difficult to fully understand because it is seldom seen in 3D, as is the flow of cerebrospinal fluid (CSF). This article outlines a workflow for the creation of an interactive training tool for the cerebral ventricular system, an educationally challenging area of anatomy, based on widely available computer software packages. Using MR images of the cerebral ventricular system and several widely available commercial and free software packages, the techniques of 3D modelling, texturing, sculpting, image editing and animation were combined into a workflow for creating an interactive educational and training tool, focussed on the anatomy of the cerebral ventricular system and the flow of cerebrospinal fluid. We have successfully created a robust methodology, using key software packages, for building an interactive education and training tool. This has resulted in an application which details the anatomy of the ventricular system and the flow of cerebrospinal fluid using an anatomically accurate 3D model. In addition, the established workflow presented here also shows how tutorials, animations and self-assessment tools can be embedded into the training application. The workflow established here for generating educational and training material demonstrating cerebral ventricular anatomy and the flow of cerebrospinal fluid has enormous potential to be adopted into student training in this field. With the digital age advancing rapidly, it could be used as an innovative tool alongside other methodologies for the training of future healthcare practitioners and scientists. The workflow could also be used in the creation of other tools, which could be developed for use not only on desktop and laptop computers but also on smartphones, tablets and fully immersive stereoscopic environments. It could also form the basis on which to build surgical simulations enhanced with haptic interaction.
CFD applications: The Lockheed perspective
NASA Technical Reports Server (NTRS)
Miranda, Luis R.
1987-01-01
The Numerical Aerodynamic Simulator (NAS) epitomizes the coming of age of supercomputing and opens exciting horizons in the world of numerical simulation. An overview of supercomputing at Lockheed Corporation in the area of Computational Fluid Dynamics (CFD) is presented. This overview focuses on developments and applications of CFD as an aircraft design tool and attempts to present an assessment, within this context, of the state of the art in CFD methodology.
Methodology and Reporting of Mobile Health and Smartphone Application Studies for Schizophrenia
Torous, John; Firth, Joseph; Mueller, Nora; Onnela, J.P.; Baker, Justin T.
2016-01-01
The increasing prevalence of mobile devices among patients of all demographic groups has the potential to transform the ways we diagnose, monitor, treat, and study mental illness. As new tools and technologies emerge, clinicians and researchers are confronted with an increasing array of options for clinical assessment, through digital capture of the essential behavioral elements of a condition, and for intervention, through formalized treatments, coaching, and other technology-assisted means of patient communication. And yet, as with any new set of tools for the assessment or treatment of a medical condition, establishing and adhering to reporting guidelines (i.e., what works and under what conditions) is an essential component of the translational research process. Here, we review the methodological strengths and weaknesses of the existing literature on smartphone and wearable device studies in schizophrenia, using the recently published World Health Organization mHealth Evidence Reporting and Assessment (mERA) guidelines for evaluating mobile health applications. While growing evidence supports the feasibility of using several mobile tools in severe mental illness, most studies to date have failed to adequately report accessibility, interoperability, costs, scalability, replicability, data security, usability testing, or compliance with national guidelines or regulatory statutes. Future research efforts addressing these specific gaps in the literature will help advance our understanding and realize the clinical potential of these new tools of psychiatry. PMID:28234658
Evaluation in industry of a draft code of practice for manual handling.
Ashby, Liz; Tappin, David; Bentley, Tim
2004-05-01
This paper reports findings from a study which evaluated the draft New Zealand Code of Practice for Manual Handling. The evaluation assessed the ease of use, applicability and validity of the Code and in particular the associated manual handling hazard assessment tools, within New Zealand industry. The Code was studied in a sample of eight companies from four sectors of industry. Subjective feedback and objective findings indicated that the Code was useful, applicable and informative. The manual handling hazard assessment tools incorporated in the Code could be adequately applied by most users, with risk assessment outcomes largely consistent with the findings of researchers using more specific ergonomics methodologies. However, some changes were recommended to the risk assessment tools to improve usability and validity. The evaluation concluded that both the Code and the tools within it would benefit from simplification, improved typography and layout, and industry-specific information on manual handling hazards.
The Role of Research in Making Interactive Products Effective.
ERIC Educational Resources Information Center
Rossi, Robert J.
1986-01-01
Argues that research and development (R&D) methods should be utilized to develop new technologies for training and retailing and describes useful research tools--critical incident methodology, task analysis, performance recording. Discussion covers R&D applications to interactive systems development in the areas of product need, customer…
Using GIS Tools and Environmental Scanning to Forecast Industry Workforce Needs
ERIC Educational Resources Information Center
Gaertner, Elaine; Fleming, Kevin; Marquez, Michelle
2009-01-01
The Centers of Excellence (COE) provide regional workforce data on high growth, high demand industries and occupations for use by community colleges in program planning and resource enhancement. This article discusses the environmental scanning research methodology and its application to data-driven decision making in community college program…
Engaging or Distracting: Children's Tablet Computer Use in Education
ERIC Educational Resources Information Center
McEwen, Rhonda N.; Dubé, Adam K.
2015-01-01
Communications studies and psychology offer analytical and methodological tools that when combined have the potential to bring novel perspectives on human interaction with technologies. In this study of children using simple and complex mathematics applications on tablet computers, cognitive load theory is used to answer the question: how…
Noyes, Jane; Booth, Andrew; Flemming, Kate; Garside, Ruth; Harden, Angela; Lewin, Simon; Pantoja, Tomas; Hannes, Karin; Cargo, Margaret; Thomas, James
2018-05-01
The Cochrane Qualitative and Implementation Methods Group develops and publishes guidance on the synthesis of qualitative and mixed-method implementation evidence. Choice of appropriate methodologies, methods, and tools is essential when developing a rigorous protocol and conducting the synthesis. Cochrane authors who conduct qualitative evidence syntheses have thus far used a small number of relatively simple methods to address similarly written questions. Cochrane has invested in methodological work to develop new tools and to encourage the production of exemplar reviews to show the value of more innovative methods that address a wider range of questions. In this paper in the series, we report updated guidance on the selection of tools to assess methodological limitations in qualitative studies and on methods to extract and synthesize qualitative evidence. We recommend application of GRADE-CERQual (Grading of Recommendations Assessment, Development, and Evaluation-Confidence in the Evidence from Reviews of Qualitative research) to assess confidence in qualitative synthesized findings. This guidance aims to support review authors in undertaking a qualitative evidence synthesis that is intended to be integrated subsequently with the findings of one or more Cochrane reviews of the effects of similar interventions. The review of intervention effects may be undertaken concurrently with or separately from the qualitative evidence synthesis. We encourage further development through reflection and formal testing. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Allen, Cheryl L.
1991-01-01
Enhanced engineering tools can be obtained through the integration of expert system methodologies and existing design software. The application of these methodologies to the spacecraft design and cost model (SDCM) software provides an improved technique for the selection of hardware for unmanned spacecraft subsystem design. The knowledge engineering system (KES) expert system development tool was used to implement a smarter equipment selection algorithm than is currently achievable through the use of a standard data base system. The guidance, navigation, and control subsystem of the SDCM software was chosen as the initial subsystem for implementation. The portions of the SDCM code which compute the selection criteria and constraints remain intact, and the expert system equipment selection algorithm is embedded within this existing code. The architecture of this new methodology is described and its implementation is reported. The project background and a brief overview of the expert system are described, and once the details of the design are characterized, an example of its implementation is demonstrated.
Investigating transport pathways in the ocean
NASA Astrophysics Data System (ADS)
Griffa, Annalisa; Haza, Angelique; Özgökmen, Tamay M.; Molcard, Anne; Taillandier, Vincent; Schroeder, Katrin; Chang, Yeon; Poulain, P.-M.
2013-01-01
The ocean is a very complex medium, with scales of motion that range from thousands of kilometers down to the dissipation scales. Transport by ocean currents plays an important role in many practical applications, ranging from climatic problems to coastal management and accident mitigation at sea. Understanding transport is challenging because of the chaotic nature of particle motion. In the last decade, new methods have been put forth to improve our understanding of transport. Powerful tools are provided by dynamical systems theory, which allows the identification of barriers to transport and their time variability for a given flow. A shortcoming of this approach, though, is that it assumes the velocity field is known with good accuracy, which is not always the case in practical applications. Improving model performance in terms of transport can be addressed using another important methodology that has been developed recently, namely the assimilation of Lagrangian data provided by floating buoys. The two methodologies are technically different but in many ways complementary. In this paper, we review examples of applications of both methodologies performed by the authors in the last few years, considering flows at different scales and in various ocean basins. The results are among the very first examples of applications of these methodologies to the real ocean, including testing with in-situ Lagrangian data. The results are discussed in the general framework of the extended fields related to these methodologies, pointing out open questions and potential for improvement, with an outlook toward future strategies.
Merly, Corinne; Chapman, Antony; Mouvet, Christophe
2012-01-01
Research results in environmental and socio-economic sciences are often under-used by stakeholders involved in the management of natural resources. To minimise this gap, the FP6 EU interdisciplinary project AquaTerra (AT) developed an end-users' integration methodology in order to ensure that the data, knowledge and tools related to the soil-water-sediment system that were generated by the project were delivered in a meaningful way for end-users, thus improving their uptake. The methodology and examples of its application are presented in this paper. From the 408 project deliverables, 96 key findings were identified, 53 related to data and knowledge, and 43 describing advanced tools. River Basin Management (RBM) stakeholders workshops identified 8 main RBM issues and 25 specific stakeholders' questions related to RBM which were classified into seven groups of cross-cutting issues, namely scale, climate change, non-climatic change, the need for systemic approaches, communication and participation, international and inter-basin coordination and collaboration, and the implementation of the Water Framework Directive. The integration methodology enabled an assessment of how AT key findings meet stakeholders' demands, and for each main RBM issue and for each specific question, described the added-value of the AT project in terms of knowledge and tools generated, key parameters to consider, and recommendations that can be made to stakeholders and the wider scientific community. Added value and limitations of the integration methodology and its outcomes are discussed and recommendations are provided to further improve integration methodology and bridge the gaps between scientific research data and their potential uptake by end-users.
Reverse Engineering and Security Evaluation of Commercial Tags for RFID-Based IoT Applications.
Fernández-Caramés, Tiago M; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Castedo, Luis
2016-12-24
The Internet of Things (IoT) is a distributed system of physical objects that requires the seamless integration of hardware (e.g., sensors, actuators, electronics) and network communications in order to collect and exchange data. IoT smart objects need to be somehow identified to determine the origin of the data and to automatically detect the elements around us. One of the best positioned technologies to perform identification is RFID (Radio Frequency Identification), which in the last years has gained a lot of popularity in applications like access control, payment cards or logistics. Despite its popularity, RFID security has not been properly handled in numerous applications. To foster security in such applications, this article includes three main contributions. First, in order to establish the basics, a detailed review of the most common flaws found in RFID-based IoT systems is provided, including the latest attacks described in the literature. Second, a novel methodology that eases the detection and mitigation of such flaws is presented. Third, the latest RFID security tools are analyzed and the methodology proposed is applied through one of them (Proxmark 3) to validate it. Thus, the methodology is tested in different scenarios where tags are commonly used for identification. In such systems it was possible to clone transponders, extract information, and even emulate both tags and readers. Therefore, it is shown that the methodology proposed is useful for auditing security and reverse engineering RFID communications in IoT applications. It must be noted that, although this paper is aimed at fostering RFID communications security in IoT applications, the methodology can be applied to any RFID communications protocol.
Reverse Engineering and Security Evaluation of Commercial Tags for RFID-Based IoT Applications
Fernández-Caramés, Tiago M.; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Castedo, Luis
2016-01-01
The Internet of Things (IoT) is a distributed system of physical objects that requires the seamless integration of hardware (e.g., sensors, actuators, electronics) and network communications in order to collect and exchange data. IoT smart objects need to be somehow identified to determine the origin of the data and to automatically detect the elements around us. One of the best positioned technologies to perform identification is RFID (Radio Frequency Identification), which in the last years has gained a lot of popularity in applications like access control, payment cards or logistics. Despite its popularity, RFID security has not been properly handled in numerous applications. To foster security in such applications, this article includes three main contributions. First, in order to establish the basics, a detailed review of the most common flaws found in RFID-based IoT systems is provided, including the latest attacks described in the literature. Second, a novel methodology that eases the detection and mitigation of such flaws is presented. Third, the latest RFID security tools are analyzed and the methodology proposed is applied through one of them (Proxmark 3) to validate it. Thus, the methodology is tested in different scenarios where tags are commonly used for identification. In such systems it was possible to clone transponders, extract information, and even emulate both tags and readers. Therefore, it is shown that the methodology proposed is useful for auditing security and reverse engineering RFID communications in IoT applications. It must be noted that, although this paper is aimed at fostering RFID communications security in IoT applications, the methodology can be applied to any RFID communications protocol. PMID:28029119
Willemet, Marie; Vennin, Samuel; Alastruey, Jordi
2016-12-08
Many physiological indexes and algorithms based on pulse wave analysis have been suggested in order to better assess cardiovascular function. Because these tools are often computed from in-vivo hemodynamic measurements, their validation is time-consuming, challenging, and biased by measurement errors. Recently, a new methodology has been suggested to assess theoretically these computed tools: a database of virtual subjects generated using numerical 1D-0D modeling of arterial hemodynamics. The generated set of simulations encloses a wide selection of healthy cases that could be encountered in a clinical study. We applied this new methodology to three different case studies that demonstrate the potential of our new tool, and illustrated each of them with a clinically relevant example: (i) we assessed the accuracy of indexes estimating pulse wave velocity; (ii) we validated and refined an algorithm that computes central blood pressure; and (iii) we investigated theoretical mechanisms behind the augmentation index. Our database of virtual subjects is a new tool to assist the clinician: it provides insight into the physical mechanisms underlying the correlations observed in clinical practice. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.
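The abstract mentions indexes estimated from pulse wave analysis, such as pulse wave velocity (PWV); as a self-contained illustration of the kind of computed tool that a virtual-subject database can be used to validate, a simple foot-to-foot PWV estimate from two synthetic pressure waveforms is sketched below. The waveforms, sampling rate and foot-detection rule are invented simplifications and are not the authors' algorithms.

```python
import numpy as np

def foot_to_foot_pwv(p_proximal, p_distal, fs_hz, path_length_m):
    """PWV from the foot-to-foot transit time of two pressure waveforms.

    The wave foot is located, very crudely, at the maximum of the second
    derivative before the systolic peak (illustrative detector only).
    """
    def foot_index(p):
        d2 = np.gradient(np.gradient(p))
        upstroke = slice(0, int(np.argmax(p)))   # search before the systolic peak
        return np.argmax(d2[upstroke])
    dt = (foot_index(p_distal) - foot_index(p_proximal)) / fs_hz
    return path_length_m / dt

# Synthetic waveforms: identical pulse shifted by 40 ms over a 0.5 m path
fs = 1000
t = np.arange(0.0, 1.0, 1.0 / fs)

def pulse(delay_s):
    """Synthetic pressure pulse (Gaussian upstroke) delayed by delay_s seconds."""
    return np.exp(-((t - 0.2 - delay_s) / 0.05) ** 2)

print(f"PWV ~ {foot_to_foot_pwv(pulse(0.0), pulse(0.04), fs, 0.5):.1f} m/s")  # ~12.5
```

Running such an estimator over thousands of simulated subjects with known ground-truth wave speeds is what allows its accuracy to be quantified free of measurement error, as described above.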
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, R. H.; Badger, W.; Beckman, C. S.; Beshers, G.; Hammerslag, D.; Kimball, J.; Kirslis, P. A.; Render, H.; Richards, P.; Terwilliger, R.
1984-01-01
The project to automate the management of software production systems is described. The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. Several major components of the SAGA system have been completed in prototype form. The construction methods are described.
Decision support and disease management: a logic engineering approach.
Fox, J; Thomson, R
1998-12-01
This paper describes the development and application of PROforma, a unified technology for clinical decision support and disease management. Work leading to the implementation of PROforma has been carried out in a series of projects funded by European agencies over the past 13 years. The work has been based on logic engineering, a distinct design and development methodology that combines concepts from knowledge engineering, logic programming, and software engineering. Several of the projects have used the approach to demonstrate a wide range of applications in primary and specialist care and clinical research. Concurrent academic research projects have provided a sound theoretical basis for the safety-critical elements of the methodology. The principal technical results of the work are the PROforma logic language for defining clinical processes and an associated suite of software tools for delivering applications, such as decision support and disease management procedures. The language supports four standard objects (decisions, plans, actions, and enquiries), each of which has an intuitive meaning with well-understood logical semantics. The development toolset includes a powerful visual programming environment for composing applications from these standard components, for verifying consistency and completeness of the resulting specification and for delivering stand-alone or embeddable applications. Tools and applications that have resulted from the work are described and illustrated, with examples from specialist cancer care and primary care. The results of a number of evaluation activities are included to illustrate the utility of the technology.
Recent advances in systems metabolic engineering tools and strategies.
Chae, Tong Un; Choi, So Young; Kim, Je Woong; Ko, Yoo-Sung; Lee, Sang Yup
2017-10-01
Metabolic engineering has been playing increasingly important roles in developing microbial cell factories for the production of various chemicals and materials to achieve a sustainable chemical industry. Nowadays, many tools and strategies are available for performing systems metabolic engineering, which allows systems-level metabolic engineering in more sophisticated and diverse ways by adopting rapidly advancing methodologies and tools of systems biology, synthetic biology and evolutionary engineering. As an outcome, development of more efficient microbial cell factories has become possible. Here, we review recent advances in systems metabolic engineering tools and strategies together with accompanying application examples. In addition, we describe how these tools and strategies work together in simultaneous and synergistic ways to develop novel microbial cell factories. Copyright © 2017 Elsevier Ltd. All rights reserved.
Parametric evaluation of the cost effectiveness of Shuttle payload vibroacoustic test plans
NASA Technical Reports Server (NTRS)
Stahle, C. V.; Gongloff, H. R.; Keegan, W. B.; Young, J. P.
1978-01-01
Consideration is given to alternate vibroacoustic test plans for sortie and free flyer Shuttle payloads. Statistical decision models for nine test plans provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology is a major step toward the development of a useful tool for the quantitative tailoring of vibroacoustic test programs to sortie and free flyer payloads. A broader application of the methodology is now possible by the use of the OCTAVE computer code.
Parallel adaptive discontinuous Galerkin approximation for thin layer avalanche modeling
NASA Astrophysics Data System (ADS)
Patra, A. K.; Nichita, C. C.; Bauer, A. C.; Pitman, E. B.; Bursik, M.; Sheridan, M. F.
2006-08-01
This paper describes the development of highly accurate adaptive discontinuous Galerkin schemes for the solution of the equations arising from a thin layer type model of debris flows. Such flows have wide applicability in the analysis of avalanches induced by many natural calamities, e.g. volcanoes, earthquakes, etc. These schemes are coupled with special parallel solution methodologies to produce a simulation tool capable of very high-order numerical accuracy. The methodology successfully replicates cold rock avalanches at Mount Rainier, Washington and hot volcanic particulate flows at Colima Volcano, Mexico.
Analytical aspects of plant metabolite profiling platforms: current standings and future aims.
Seger, Christoph; Sturm, Sonja
2007-02-01
Over the past years, metabolic profiling has been established as a comprehensive systems biology tool. Mass spectrometry or NMR spectroscopy-based technology platforms combined with unsupervised or supervised multivariate statistical methodologies allow a deep insight into the complex metabolite patterns of plant-derived samples. Within this review, we provide a thorough introduction to the analytical hard- and software requirements of metabolic profiling platforms. Methodological limitations are addressed, and the metabolic profiling workflow is exemplified by summarizing recent applications ranging from model systems to more applied topics.
NASA Technical Reports Server (NTRS)
1974-01-01
The purpose of the BRAVO User's Manual is to describe the BRAVO methodology in terms of step-by-step procedures. The BRAVO methodology then becomes a tool which a team of analysts can utilize to perform cost effectiveness analyses on potential future space applications with a relatively general set of input information and a relatively small expenditure of resources. An overview of the BRAVO procedure is given by describing the complete procedure in a general form.
RLV Turbine Performance Optimization
NASA Technical Reports Server (NTRS)
Griffin, Lisa W.; Dorney, Daniel J.
2001-01-01
A task was developed at NASA/Marshall Space Flight Center (MSFC) to improve turbine aerodynamic performance through the application of advanced design and analysis tools. There are four major objectives of this task: 1) to develop, enhance, and integrate advanced turbine aerodynamic design and analysis tools; 2) to develop the methodology for application of the analytical techniques; 3) to demonstrate the benefits of the advanced turbine design procedure through its application to a relevant turbine design point; and 4) to verify the optimized design and analysis with testing. Final results of the preliminary design and the results of the two-dimensional (2D) detailed design of the first-stage vane of a supersonic turbine suitable for a reusable launch vehicle (RLV) are presented. Analytical techniques for obtaining the results are also discussed.
[Progress in methodological characteristics of clinical practice guideline for osteoarthritis].
Xing, D; Wang, B; Lin, J H
2017-06-01
At present, several clinical practice guidelines for the treatment of osteoarthritis have been developed by institutes or societies. The ultimate purpose of developing clinical practice guidelines is to formulate the treatment process for osteoarthritis effectively. However, the methodologies used in developing clinical practice guidelines may influence the translation and application of those guidelines in treating osteoarthritis. The present study summarized the methodological features of individual clinical practice guidelines and presented the tools for quality evaluation of clinical practice guidelines. The limitations of the current osteoarthritis guidelines of China are also indicated. The review article might help relevant institutions improve the quality of guideline development and clinical translation.
A human factors methodology for real-time support applications
NASA Technical Reports Server (NTRS)
Murphy, E. D.; Vanbalen, P. M.; Mitchell, C. M.
1983-01-01
A general approach to the human factors (HF) analysis of new or existing projects at NASA/Goddard is delineated. Because the methodology evolved from HF evaluations of the Mission Planning Terminal (MPT) and the Earth Radiation Budget Satellite Mission Operations Room (ERBS MOR), it is directed specifically to the HF analysis of real-time support applications. Major topics included for discussion are the process of establishing a working relationship between the Human Factors Group (HFG) and the project, orientation of HF analysts to the project, human factors analysis and review, and coordination with major cycles of system development. Sub-topics include specific areas for analysis and appropriate HF tools. Management support functions are outlined. References provide a guide to sources of further information.
ART-Ada design project, phase 2
NASA Technical Reports Server (NTRS)
Lee, S. Daniel; Allen, Bradley P.
1990-01-01
Interest in deploying expert systems in Ada has increased. An Ada-based expert system tool called ART-Ada is described, which was built to support research into the language and methodological issues of expert systems in Ada. ART-Ada allows applications of an existing expert system tool called ART-IM (Automated Reasoning Tool for Information Management) to be deployed in various Ada environments. ART-IM, a C-based expert system tool, is used to generate Ada source code which is compiled and linked with an Ada-based inference engine to produce an Ada executable image. ART-Ada is being used to implement several expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.
Application of damage tolerance methodology in certification of the Piaggio P-180 Avanti
NASA Technical Reports Server (NTRS)
Johnson, Jerry
1992-01-01
The Piaggio P-180 Avanti, a twin pusher-prop engine nine-passenger business aircraft was certified in 1990, to the requirements of FAR Part 23 and Associated Special Conditions for Composite Structure. Certification included the application of a damage tolerant methodology to the design of the composite forward wing and empennage (vertical fin, horizontal stabilizer, tailcone, and rudder) structure. This methodology included an extensive analytical evaluation coupled with sub-component and full-scale testing of the structure. The work from the Damage Tolerance Analysis Assessment was incorporated into the full-scale testing. Damage representing hazards such as dropped tools, ground equipment, handling, and runway debris, was applied to the test articles. Additional substantiation included allowing manufacturing discrepancies to exist unrepaired on the full-scale articles and simulated bondline failures in critical elements. The importance of full-scale testing in the critical environmental conditions and the application of critical damage are addressed. The implication of damage tolerance on static and fatigue testing is discussed. Good correlation between finite element solutions and experimental test data was observed.
Integration of infrared thermography into various maintenance methodologies
NASA Astrophysics Data System (ADS)
Morgan, William T.
1993-04-01
Maintenance methodologies are in developmental stages throughout the world as global competitiveness drives all industries to improve operational efficiencies. Rapid technical advancement has placed additional strain on maintenance organizations to progressively change. Accompanying needs for advanced training and documentation is the demand for utilization of various analytical instruments and quantitative methods. Infrared thermography is one of the primary elements of engineered approaches to maintenance. Current maintenance methodologies can be divided into six categories: Routine ('Breakdown'), Preventive, Predictive, Proactive, Reliability-Based, and Total Productive (TPM) maintenance. Each of these methodologies has a distinctive approach to achieving improved operational efficiencies. A popular thought is that infrared thermography is a Predictive maintenance tool. While this is true, it is also true that it can be effectively integrated into each of the maintenance methodologies to achieve desired results. The six maintenance strategies will be defined, and infrared applications integrated into each will be presented in tabular form.
Utilizing the Project Method for Teaching Culture and Intercultural Competence
ERIC Educational Resources Information Center
Euler, Sasha S.
2017-01-01
This article presents a detailed methodological outline for teaching culture through project work. It is argued that because project work makes it possible to gain transferrable and applicable knowledge and insight, it is the ideal tool for teaching culture with the aim of achieving real intercultural communicative competence (ICC). Preceding the…
ERIC Educational Resources Information Center
Valasek, Mark A.; Repa, Joyce J.
2005-01-01
In recent years, real-time polymerase chain reaction (PCR) has emerged as a robust and widely used methodology for biological investigation because it can detect and quantify very small amounts of specific nucleic acid sequences. As a research tool, a major application of this technology is the rapid and accurate assessment of changes in gene…
Eye Tracking: A Brief Guide for Developmental Researchers
ERIC Educational Resources Information Center
Feng, Gary
2011-01-01
Eye tracking offers a powerful research tool for developmental scientists. In this brief article, the author introduces the methodology and issues associated with its applications in developmental research, beginning with an overview of eye movements and eye-tracking technologies, followed by examples of how it is used to study the developing mind…
USDA-ARS?s Scientific Manuscript database
The application of genotyping by sequencing (GBS) approaches, combined with data imputation methodologies, is narrowing the genetic knowledge gap between major and understudied, minor crops. GBS is an excellent tool to characterize the genomic structure of recently domesticated (~200 years) and unde...
Assessment of Development of the Learning Organization Concept in Jordanian Industrial Companies
ERIC Educational Resources Information Center
Khadra, Marah F. Abu; Rawabdeh, Ibrahim A.
2006-01-01
Purpose: The purpose of this research is to examine the impact on organizational performance of the application of management and human resource practices, and to attempt to outline key elements and assess development of the learning organization (LO) concept in Jordan. Design/methodology/approach: The tool described in this article assesses…
Seeking an Online Social Media Radar
ERIC Educational Resources Information Center
ter Veen, James
2014-01-01
Purpose: The purpose of this paper is to explore how the application of Systems Engineering tools and techniques can be applied to rapidly process and analyze the vast amounts of data present in social media in order to yield practical knowledge for Command and Control (C2) systems. Design/methodology/approach: Based upon comparative analysis of…
Aggarwal, Vasudha; Ha, Taekjip
2014-11-01
Macromolecular interactions play a central role in many biological processes. Protein-protein interactions have mostly been studied by co-immunoprecipitation, which cannot provide quantitative information on all possible molecular connections present in the complex. We will review a new approach that allows cellular proteins and biomolecular complexes to be studied in real time at the single-molecule level. This technique is called single-molecule pull-down (SiMPull), because it integrates principles of conventional immunoprecipitation with powerful single-molecule fluorescence microscopy. SiMPull is used to count how many copies of each protein are present in the physiological complexes found in the cytosol and membranes. Concurrently, it serves as a single-molecule biochemical tool to perform functional studies on the pulled-down proteins. In this review, we will focus on the detailed methodology of SiMPull, its salient features and a wide range of biological applications in comparison with other biosensing tools. © 2014 WILEY Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-05-01
Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate the viability of the approach as an effective engineering tool for analysis and optimization.
2009-11-24
Key elements include fielding the Critical Asset Prioritization Methodology (CAPM) tool, which will allow prioritization of critical assets residing on military installations; managing costs; providing energy leadership; and adopting financial standards to enable transparency across the Air Force.
Molecular plant breeding: methodology and achievements.
Varshney, Rajeev K; Hoisington, Dave A; Nayak, Spurthi N; Graner, Andreas
2009-01-01
The progress made in DNA marker technology has been remarkable and exciting in recent years. DNA markers have proved valuable tools in various analyses in plant breeding, for example, early generation selection, enrichment of complex F(1)s, choice of donor parent in backcrossing, recovery of recurrent parent genotype in backcrossing, linkage block analysis and selection. Other main areas of applications of molecular markers in plant breeding include germplasm characterization/fingerprinting, determining seed purity, systematic sampling of germplasm, and phylogenetic analysis. Molecular markers, thus, have proved powerful tools in replacing the bioassays and there are now many examples available to show the efficacy of such markers. We have illustrated some basic concepts and methodology of applying molecular markers for enhancing the selection efficiency in plant breeding. Some successful examples of product developments of molecular breeding have also been presented.
Novel optical methodologies in studying mechanical signal transduction in mammalian cells
NASA Technical Reports Server (NTRS)
Stamatas, G. N.; McIntire, L. V.
1999-01-01
For the last 3 decades evidence has been accumulating that some types of mammalian cells respond to their mechanically active environment by altering their morphology, growth rate, and metabolism. The study of such responses is very important in understanding physiological and pathological conditions ranging from bone formation to atherosclerosis. Obtaining this knowledge has been the goal of an active research area in bioengineering termed cell mechanotransduction. The advancement of optical methodologies used in cell biology research has provided the tools to elucidate cellular mechanisms that would otherwise be impossible to visualize. Combined with molecular biology techniques, they give engineers invaluable tools for understanding the chemical pathways involved in mechanotransduction. Herein we briefly review the current knowledge on mechanical signal transduction in mammalian cells, focusing on the application of novel optical techniques in the ongoing research.
Optimization as a Tool for Consistency Maintenance in Multi-Resolution Simulation
NASA Technical Reports Server (NTRS)
Drewry, Darren T; Reynolds, Jr , Paul F; Emanuel, William R
2006-01-01
The need for new approaches to the consistent simulation of related phenomena at multiple levels of resolution is great. While many fields of application would benefit from a complete and approachable solution to this problem, such solutions have proven extremely difficult. We present a multi-resolution simulation methodology that uses numerical optimization as a tool for maintaining external consistency between models of the same phenomena operating at different levels of temporal and/or spatial resolution. Our approach follows from previous work in the disparate fields of inverse modeling and spacetime constraint-based animation. As a case study, our methodology is applied to two environmental models of forest canopy processes that make overlapping predictions under unique sets of operating assumptions, and which execute at different temporal resolutions. Experimental results are presented and future directions are addressed.
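Illustrative note (not from the cited work): the core idea, maintaining external consistency by tuning one model so that its output matches the aggregated output of a model at another resolution, can be sketched with a generic optimizer. The toy example below calibrates a single parameter of a coarse annual-step model against the aggregated output of a fine daily-step model; both models, the forcing data, and the parameter are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Fine-resolution model: daily uptake driven by a daily forcing series.
def fine_model(daily_forcing, efficiency=0.8):
    return efficiency * daily_forcing            # daily uptake

# Coarse-resolution model: annual uptake from mean forcing and one free parameter k.
def coarse_model(mean_forcing, k):
    return k * mean_forcing * 365.0              # annual uptake

rng = np.random.default_rng(0)
daily_forcing = rng.gamma(shape=2.0, scale=3.0, size=365)   # hypothetical forcing
fine_annual = fine_model(daily_forcing).sum()               # aggregated fine output

# External consistency: choose k so the coarse model reproduces the fine aggregate.
objective = lambda k: (coarse_model(daily_forcing.mean(), k) - fine_annual) ** 2
res = minimize_scalar(objective, bounds=(0.0, 10.0), method="bounded")
print(f"calibrated k = {res.x:.3f}, mismatch = {objective(res.x):.2e}")
```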
A Development Architecture for Serious Games Using BCI (Brain Computer Interface) Sensors
Sung, Yunsick; Cho, Kyungeun; Um, Kyhyun
2012-01-01
Games that use brainwaves via brain–computer interface (BCI) devices, to improve brain functions are known as BCI serious games. Due to the difficulty of developing BCI serious games, various BCI engines and authoring tools are required, and these reduce the development time and cost. However, it is desirable to reduce the amount of technical knowledge of brain functions and BCI devices needed by game developers. Moreover, a systematic BCI serious game development process is required. In this paper, we present a methodology for the development of BCI serious games. We describe an architecture, authoring tools, and development process of the proposed methodology, and apply it to a game development approach for patients with mild cognitive impairment as an example. This application demonstrates that BCI serious games can be developed on the basis of expert-verified theories. PMID:23202227
Gates, Allison; Gates, Michelle; Duarte, Gonçalo; Cary, Maria; Becker, Monika; Prediger, Barbara; Vandermeer, Ben; Fernandes, Ricardo M; Pieper, Dawid; Hartling, Lisa
2018-06-13
Systematic reviews (SRs) of randomised controlled trials (RCTs) can provide the best evidence to inform decision-making, but their methodological and reporting quality varies. Tools exist to guide the critical appraisal of quality and risk of bias in SRs, but evaluations of their measurement properties are limited. We will investigate the interrater reliability (IRR), usability, and applicability of A MeaSurement Tool to Assess systematic Reviews (AMSTAR), AMSTAR 2, and Risk Of Bias In Systematic reviews (ROBIS) for SRs in the fields of biomedicine and public health. An international team of researchers at three collaborating centres will undertake the study. We will use a random sample of 30 SRs of RCTs investigating therapeutic interventions indexed in MEDLINE in February 2014. Two reviewers at each centre will appraise the quality and risk of bias in each SR using AMSTAR, AMSTAR 2, and ROBIS. We will record the time to complete each assessment and for the two reviewers to reach consensus for each SR. We will extract the descriptive characteristics of each SR, the included studies, participants, interventions, and comparators. We will also extract the direction and strength of the results and conclusions for the primary outcome. We will summarise the descriptive characteristics of the SRs using means and standard deviations, or frequencies and proportions. To test for interrater reliability between reviewers and between the consensus agreements of reviewer pairs, we will use Gwet's AC 1 statistic. For comparability to previous evaluations, we will also calculate weighted Cohen's kappa and Fleiss' kappa statistics. To estimate usability, we will calculate the mean time to complete the appraisal and to reach consensus for each tool. To inform applications of the tools, we will test for statistical associations between quality scores and risk of bias judgments, and the results and conclusions of the SRs. Appraising the methodological and reporting quality of SRs is necessary to determine the trustworthiness of their conclusions. Which tool may be most reliably applied and how the appraisals should be used is uncertain; the usability of newly developed tools is unknown. This investigation of common (AMSTAR) and newly developed (AMSTAR 2, ROBIS) tools will provide empiric data to inform their application, interpretation, and refinement.
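Illustrative note (not part of the registered protocol): the planned interrater reliability statistics for two raters making binary (yes/no) item judgments can be sketched as below; for the two-category case, Cohen's kappa and Gwet's AC1 differ only in how chance agreement is estimated. The ratings shown are invented.

```python
import numpy as np

def agreement_stats(r1, r2):
    """Observed agreement, Cohen's kappa, and Gwet's AC1 for two raters
    making binary (1 = yes / 0 = no) judgments on the same items."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    pa = np.mean(r1 == r2)                       # observed agreement
    p1, p2 = r1.mean(), r2.mean()                # each rater's "yes" rate
    pe_kappa = p1 * p2 + (1 - p1) * (1 - p2)     # chance agreement (kappa)
    pi = (p1 + p2) / 2
    pe_ac1 = 2 * pi * (1 - pi)                   # chance agreement (AC1, 2 categories)
    kappa = (pa - pe_kappa) / (1 - pe_kappa)
    ac1 = (pa - pe_ac1) / (1 - pe_ac1)
    return pa, kappa, ac1

# Hypothetical ratings of one appraisal item across 30 systematic reviews.
rater_a = [1] * 22 + [0] * 8
rater_b = [1] * 20 + [0] * 2 + [1] * 2 + [0] * 6
pa, kappa, ac1 = agreement_stats(rater_a, rater_b)
print(f"observed={pa:.2f}  kappa={kappa:.2f}  AC1={ac1:.2f}")
```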
The 1988 Goddard Conference on Space Applications of Artificial Intelligence
NASA Technical Reports Server (NTRS)
Rash, James (Editor); Hughes, Peter (Editor)
1988-01-01
This publication comprises the papers presented at the 1988 Goddard Conference on Space Applications of Artificial Intelligence held at the NASA/Goddard Space Flight Center, Greenbelt, Maryland on May 24, 1988. The purpose of this annual conference is to provide a forum in which current research and development directed at space applications of artificial intelligence can be presented and discussed. The papers in these proceedings fall into the following areas: mission operations support, planning and scheduling; fault isolation/diagnosis; image processing and machine vision; data management; modeling and simulation; and development tools/methodologies.
Developing CORBA-Based Distributed Scientific Applications From Legacy Fortran Programs
NASA Technical Reports Server (NTRS)
Sang, Janche; Kim, Chan; Lopez, Isaac
2000-01-01
An efficient methodology is presented for integrating legacy applications written in Fortran into a distributed object framework. Issues and strategies regarding the conversion and decomposition of Fortran codes into Common Object Request Broker Architecture (CORBA) objects are discussed. Fortran codes are modified as little as possible as they are decomposed into modules and wrapped as objects. A new conversion tool takes the Fortran application as input and generates the C/C++ header file and Interface Definition Language (IDL) file. In addition, the performance of the client-server computing is evaluated.
Methodology of decreasing software complexity using ontology
NASA Astrophysics Data System (ADS)
Dąbrowska-Kubik, Katarzyna
2015-09-01
In this paper, a model of a web application's source code, based on the OSD (Ontology for Software Development) ontology, is proposed. This model is applied to the implementation and maintenance phases of the software development process through the DevOntoCreator tool [5]. The aim of this solution is to decrease the software complexity of the source code using a range of maintenance techniques, such as creation of documentation and elimination of dead code, cloned code, or previously known bugs [1][2]. This approach is expected to reduce the software maintenance costs of web applications.
High-Dimensional Sparse Factor Modeling: Applications in Gene Expression Genomics
Carvalho, Carlos M.; Chang, Jeffrey; Lucas, Joseph E.; Nevins, Joseph R.; Wang, Quanli; West, Mike
2010-01-01
We describe studies in molecular profiling and biological pathway analysis that use sparse latent factor and regression models for microarray gene expression data. We discuss breast cancer applications and key aspects of the modeling and computational methodology. Our case studies aim to investigate and characterize heterogeneity of structure related to specific oncogenic pathways, as well as links between aggregate patterns in gene expression profiles and clinical biomarkers. Based on the metaphor of statistically derived “factors” as representing biological “subpathway” structure, we explore the decomposition of fitted sparse factor models into pathway subcomponents and investigate how these components overlay multiple aspects of known biological activity. Our methodology is based on sparsity modeling of multivariate regression, ANOVA, and latent factor models, as well as a class of models that combines all components. Hierarchical sparsity priors address questions of dimension reduction and multiple comparisons, as well as scalability of the methodology. The models include practically relevant non-Gaussian/nonparametric components for latent structure, underlying often quite complex non-Gaussianity in multivariate expression patterns. Model search and fitting are addressed through stochastic simulation and evolutionary stochastic search methods that are exemplified in the oncogenic pathway studies. Supplementary supporting material provides more details of the applications, as well as examples of the use of freely available software tools for implementing the methodology. PMID:21218139
A semi-quantitative approach to GMO risk-benefit analysis.
Morris, E Jane
2011-10-01
In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.
Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A
1995-06-01
PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.
[Experimental methodology for evaluating the characteristics of avionics graphics platforms].
NASA Astrophysics Data System (ADS)
Legault, Vincent
Within a context where the aviation industry intensifies the development of new visually appealing features and where time-to-market must be as short as possible, rapid graphics processing benchmarking in a certified avionics environment becomes an important issue. With this work we intend to demonstrate that it is possible to deploy a high-performance graphics application on an avionics platform that uses certified graphical COTS components. Moreover, we would like to bring to the avionics community a methodology that allows developers to identify the elements needed for graphics system optimisation, and to provide them with tools that can measure the complexity of this type of application and the amount of resources required to properly scale a graphics system to their needs. As far as we know, no graphics performance profiling tool dedicated to critical embedded architectures has been proposed. We thus had the idea of implementing a specialized benchmarking tool that would be an appropriate and effective solution to this problem. Our solution consists in extracting the key graphics specifications from a legacy application and using them afterwards in a 3D image generation application.
The Virtual Physiological Human ToolKit.
Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V
2010-08-28
The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.
Cancer diagnosis by infrared spectroscopy: methodological aspects
NASA Astrophysics Data System (ADS)
Jackson, Michael; Kim, Keith; Tetteh, John; Mansfield, James R.; Dolenko, Brion; Somorjai, Raymond L.; Orr, F. W.; Watson, Peter H.; Mantsch, Henry H.
1998-04-01
IR spectroscopy is proving to be a powerful tool for the study and diagnosis of cancer. The application of IR spectroscopy to the analysis of cultured tumor cells and grading of breast cancer sections is outlined. Potential sources of error in spectral interpretation due to variations in sample histology and artifacts associated with sample storage and preparation are discussed. The application of statistical techniques to assess differences between spectra and to non-subjectively classify spectra is demonstrated.
Use of the instream flow incremental methodology: a tool for negotiation
Cavendish, Mary G.; Duncan, Margaret I.
1986-01-01
The resolution of conflicts arising from differing values and water uses requires technical information and negotiating skills. This article outlines the Instream Flow Incremental Methodology (IFIM), developed by the US Fish and Wildlife Service, and demonstrates that its use to quantify flows necessary to protect desired instream values aids negotiation by illustrating areas of agreement and possible compromises between conflicting water interests. Pursuant to a Section 404 permit application to the US Army Corps of Engineers made by City Utilities of Springfield, Missouri, in 1978, IFIM provided the means by which City Utilities, concerned with a secure water supply for a growing population, and those advocating instream values were satisfied that their requirements were met. In tracing the 15-month process, the authors conclude that the application of IFIM, as well as the cooperative stance adopted by the parties involved, were the key ingredients of the successful permit application.
Application of atomic force microscopy as a nanotechnology tool in food science.
Yang, Hongshun; Wang, Yifen; Lai, Shaojuan; An, Hongjie; Li, Yunfei; Chen, Fusheng
2007-05-01
Atomic force microscopy (AFM) provides a method for detecting nanoscale structural information. First, this review explains the fundamentals of AFM, including principle, manipulation, and analysis. Applications of AFM are then reported in food science and technology research, including qualitative macromolecule and polymer imaging, complicated or quantitative structure analysis, molecular interaction, molecular manipulation, surface topography, and nanofood characterization. The results suggested that AFM could provide insightful knowledge of food properties, and that AFM analysis could be used to illustrate some mechanisms of property changes during processing and storage. However, the main current difficulty in applying AFM to food research is the lack of appropriate methodology for different food systems. A better understanding of AFM technology and the development of corresponding methodology for complicated food systems would lead to a more in-depth understanding of food properties at the macromolecular level and broaden their applications. The AFM results could greatly improve food processing and storage technologies.
Transmission line relay mis-operation detection based on time-synchronized field data
Esmaeilian, Ahad; Popovic, Tomo; Kezunovic, Mladen
2015-05-04
In this paper, a real-time tool to detect transmission line relay mis-operation is implemented. The tool uses time-synchronized measurements obtained from both ends of the line during disturbances. The proposed fault analysis tool comes into the picture only after the protective device has operated and tripped the line. The proposed methodology is able not only to detect, classify, and locate transmission line faults, but also to accurately confirm whether the line was tripped due to a mis-operation of protective relays. The analysis report includes either a detailed description of the fault type and location or detection of relay mis-operation. As such, it can be a source of very useful information to support the system restoration. The focus of the paper is on the implementation requirements that allow practical application of the methodology, which is illustrated using field data obtained from a real power system. Testing and validation are done using the field data recorded by digital fault recorders and protective relays. The test data included several hundred event records corresponding to both relay mis-operations and actual faults. The discussion of results addresses various challenges encountered during the implementation and validation of the presented methodology.
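Illustrative note (not the authors' implementation): one ingredient that two-ended, time-synchronized measurements make possible is closed-form fault location. The sketch below solves the classic lumped-parameter, single-phase equation VS - m*ZL*IS = VR - (1 - m)*ZL*IR for the per-unit fault distance m; the line impedance and phasors are hypothetical.

```python
import numpy as np

def two_ended_fault_location(vs, i_s, vr, ir, z_line):
    """Per-unit distance m to the fault from the sending end, from
    time-synchronized voltage/current phasors at both ends and the total
    series line impedance (lumped-parameter, single-phase illustration):
        vs - m*z_line*i_s = vr - (1 - m)*z_line*ir
    """
    m = (vs - vr + z_line * ir) / (z_line * (i_s + ir))
    return m.real

# Hypothetical 100 km line, fault at 37 km (m = 0.37).
z_line = complex(8.0, 40.0)                   # total line impedance, ohms
m_true, v_fault = 0.37, 55e3 * np.exp(1j * 0.1)
i_s, ir = 1200 * np.exp(-1j * 0.6), 800 * np.exp(-1j * 0.7)   # amperes
vs = v_fault + m_true * z_line * i_s
vr = v_fault + (1 - m_true) * z_line * ir
print(f"estimated fault location: {two_ended_fault_location(vs, i_s, vr, ir, z_line):.2f} p.u.")
```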
From LCAs to simplified models: a generic methodology applied to wind power electricity.
Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle
2013-02-05
This study presents a generic methodology to produce simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on the application of global sensitivity analysis to identify key parameters explaining the impact variability of systems over their life cycle. Simplified models are built upon the identification of such key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size, considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influential parameters. Greenhouse Gas (GHG) performances have been plotted as a function of these identified key parameters. Using these curves, the GHG performance of a specific wind turbine can be estimated, thus avoiding the undertaking of an extensive Life Cycle Assessment (LCA). This methodology should be useful for decision makers, providing them with a robust but simple support tool for assessing the environmental performance of energy systems.
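Illustrative note (not the published model or its fitted coefficients): a simplified wind-power GHG model of the general form described, with per-kWh intensity driven by the two key parameters identified (load factor and lifetime), can be sketched as follows. The turbine rating and life-cycle emission total are assumed values, not the study's data.

```python
def ghg_intensity(load_factor, lifetime_years,
                  rated_power_kw=2000.0,               # hypothetical medium-size turbine
                  lifecycle_emissions_kgco2eq=1.8e6):  # hypothetical cradle-to-grave total
    """Simplified GHG performance (g CO2eq per kWh) of an onshore wind turbine,
    expressed as a function of the two key parameters identified by the
    sensitivity analysis: load factor and lifetime."""
    lifetime_output_kwh = rated_power_kw * load_factor * 8760.0 * lifetime_years
    return lifecycle_emissions_kgco2eq * 1000.0 / lifetime_output_kwh

for lf in (0.20, 0.25, 0.30):
    for life in (20, 25):
        print(f"load factor {lf:.2f}, lifetime {life} y: "
              f"{ghg_intensity(lf, life):5.1f} g CO2eq/kWh")
```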
Tools and methodologies to support more sustainable biofuel feedstock production.
Dragisic, Christine; Ashkenazi, Erica; Bede, Lucio; Honzák, Miroslav; Killeen, Tim; Paglia, Adriano; Semroc, Bambi; Savy, Conrad
2011-02-01
Increasingly, government regulations, voluntary standards, and company guidelines require that biofuel production complies with sustainability criteria. For some stakeholders, however, compliance with these criteria may seem complex, costly, or unfeasible. What existing tools, then, might facilitate compliance with a variety of biofuel-related sustainability criteria? This paper presents four existing tools and methodologies that can help stakeholders assess (and mitigate) potential risks associated with feedstock production, and can thus facilitate compliance with requirements under different requirement systems. These include the Integrated Biodiversity Assessment Tool (IBAT), the ARtificial Intelligence for Ecosystem Services (ARIES) tool, the Responsible Cultivation Areas (RCA) methodology, and the related Biofuels + Forest Carbon (Biofuel + FC) methodology.
Reliability modelling and analysis of thermal MEMS
NASA Astrophysics Data System (ADS)
Muratet, Sylvaine; Lavu, Srikanth; Fourniols, Jean-Yves; Bell, George; Desmulliez, Marc P. Y.
2006-04-01
This paper presents a MEMS reliability study methodology based on the novel concept of 'virtual prototyping'. This methodology can be used for the development of reliable sensors or actuators and also to characterize their behaviour under specific use conditions and applications. The methodology is demonstrated on the U-shaped micro electro thermal actuator used as a test vehicle. To demonstrate this approach, a 'virtual prototype' has been developed with the modeling tools MatLab and VHDL-AMS. A best-practice FMEA (Failure Mode and Effect Analysis) is applied to the thermal MEMS to investigate and assess the failure mechanisms. The reliability study is performed by injecting the identified faults into the 'virtual prototype'. The reliability characterization methodology predicts the evolution of the behavior of these MEMS as a function of the number of cycles of operation and specific operational conditions.
An Approach to V&V of Embedded Adaptive Systems
NASA Technical Reports Server (NTRS)
Liu, Yan; Yerramalla, Sampath; Fuller, Edgar; Cukic, Bojan; Gururajan, Srikaruth
2004-01-01
Rigorous Verification and Validation (V&V) techniques are essential for high assurance systems. Lately, the performance of some of these systems is enhanced by embedded adaptive components in order to cope with environmental changes. Although the ability to adapt is appealing, it actually poses a problem in terms of V&V. Since uncertainties induced by environmental changes have a significant impact on system behavior, the applicability of conventional V&V techniques is limited. In safety-critical applications such as flight control systems, the mechanisms of change must be observed, diagnosed, accommodated, and well understood prior to deployment. In this paper, we propose a non-conventional V&V approach suitable for online adaptive systems. We apply our approach to an intelligent flight control system that employs a particular type of Neural Network (NN) as the adaptive learning paradigm. The presented methodology consists of a novelty detection technique and online stability monitoring tools. The novelty detection technique is based on Support Vector Data Description, which detects novel (abnormal) data patterns. The online stability monitoring tools, based on Lyapunov's Stability Theory, detect unstable learning behavior in neural networks. Case studies based on a high fidelity simulator of NASA's Intelligent Flight Control System demonstrate a successful application of the presented V&V methodology.
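Illustrative note (not the flight-control code): the novelty-detection step can be approximated with scikit-learn's OneClassSVM, a close relative of Support Vector Data Description when used with an RBF kernel; it learns an envelope around nominal data and flags patterns that fall outside it. The data below are synthetic.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
# Hypothetical nominal sensor patterns seen during prior operation (2 features).
nominal = rng.normal(loc=0.0, scale=1.0, size=(500, 2))

# One-class model of the nominal data envelope (stand-in for SVDD).
detector = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(nominal)

# New observations: one nominal-looking, one far outside the training envelope.
new_points = np.array([[0.3, -0.5],
                       [6.0, 7.0]])
labels = detector.predict(new_points)        # +1 = inlier, -1 = novelty
for x, y in zip(new_points, labels):
    print(x, "novel" if y == -1 else "nominal")
```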
Imaging screening of catastrophic neurological events using a software tool: preliminary results.
Fernandes, A P; Gomes, A; Veiga, J; Ermida, D; Vardasca, T
2015-05-01
In Portugal, as in most countries, the most frequent organ donors are brain-dead donors. To answer the increasing need for transplants, donation programs have been implemented. The goal is to recognize virtually all the possible and potential brain-dead donors admitted to hospitals. The aim of this work was to describe preliminary results of a software application designed to identify victims of devastating neurological injury who may progress to brain death and can be possible organ donors. This was an observational, longitudinal study with retrospective data collection. The software application is an automatic algorithm based on natural language processing of selected keywords/expressions present in the cranio-encephalic computerized tomography (CE CT) scan reports to identify catastrophic neurological situations, with e-mail notification to the Transplant Coordinator (TC). The first 7 months of this application were analyzed and compared with the standard clinical evaluation methodology. The imaging identification tool showed a sensitivity of 77% and a specificity of 66%; the positive predictive value (PPV) was 0.8 and the negative predictive value (NPV) was 0.7 for the identification of catastrophic neurological events. The methodology proposed in this work seems promising in improving the screening efficiency of critical neurological events. Copyright © 2015 Elsevier Inc. All rights reserved.
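Illustrative note (not the hospital's algorithm or keyword list): a keyword/expression screen over CE CT report text has the general shape sketched below, with matches triggering notification of the transplant coordinator. The keyword patterns and the report text are invented.

```python
import re

# Hypothetical keyword/expression list for catastrophic neurological findings.
KEYWORDS = [
    r"massive (intracranial|intracerebral) h(a)?emorrhage",
    r"diffuse cerebral (o)?edema",
    r"herniation",
    r"loss of grey[- ]white differentiation",
]
PATTERN = re.compile("|".join(KEYWORDS), flags=re.IGNORECASE)

def flag_report(report_text):
    """Return the matched expressions if the CE CT report suggests a
    catastrophic neurological event, else an empty list."""
    return [m.group(0) for m in PATTERN.finditer(report_text)]

report = ("Large right MCA territory infarct with massive intracerebral "
          "haemorrhage and early uncal herniation.")
hits = flag_report(report)
if hits:
    print("notify transplant coordinator:", hits)   # stand-in for e-mail notification
```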
Díaz-Ferguson, Edgardo E; Moyer, Gregory R
2014-12-01
Genetic material (short DNA fragments) left behind by species in nonliving components of the environment (e.g. soil, sediment, or water) is defined as environmental DNA (eDNA). This DNA has been previously described as particulate DNA and has been used to detect and describe microbial communities in marine sediments since the mid-1980s and phytoplankton communities in the water column since the early 1990s. More recently, eDNA has been used to monitor invasive or endangered vertebrate and invertebrate species. While there is a steady increase in the applicability of eDNA as a monitoring tool, a variety of eDNA applications are emerging in fields such as forensics, population and community ecology, and taxonomy. This review provides scientists with an understanding of the methods underlying eDNA detection as well as applications, key methodological considerations, and emerging areas of interest for its use in the ecology and conservation of freshwater and marine environments.
Incorporating unnatural amino acids to engineer biocatalysts for industrial bioprocess applications.
Ravikumar, Yuvaraj; Nadarajan, Saravanan Prabhu; Hyeon Yoo, Tae; Lee, Chong-Soon; Yun, Hyungdon
2015-12-01
Bioprocess engineering with biocatalysts broadly spans the development and the actual application of enzymes in an industrial context. Recently, both the use of bioprocess engineering and the development and employment of enzyme engineering techniques have been increasing rapidly. Importantly, engineering techniques that incorporate unnatural amino acids (UAAs) in vivo have begun to produce enzymes with greater stability and altered catalytic properties. Despite the growth of this technique, its potential value in bioprocess applications remains to be fully exploited. In this review, we explore the methodologies involved in UAA incorporation as well as ways to synthesize these UAAs. In addition, we summarize recent efforts to increase the yield of UAA-engineered proteins in Escherichia coli and also the application of this tool in enzyme engineering. Furthermore, this protein engineering tool, based on the incorporation of UAAs, can be used to develop immobilized enzymes that are ideal for bioprocess applications. Considering the potential of this tool and by exploiting these engineered enzymes, we expect the field of bioprocess engineering to open up new opportunities for biocatalysis in the near future. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Popovich, Ivor; Windsor, Bethany; Jordan, Vanessa; Showell, Marian; Shea, Bev; Farquhar, Cynthia M.
2012-01-01
Background Systematic reviews are used widely to guide health care decisions. Several tools have been created to assess systematic review quality. The measurement tool for assessing the methodological quality of systematic reviews known as the AMSTAR tool applies a yes/no score to eleven relevant domains of review methodology. This tool has been reworked so that each domain is scored based on a four point scale, producing R-AMSTAR. Methods and Findings We aimed to compare the AMSTAR and R-AMSTAR tools in assessing systematic reviews in the field of assisted reproduction for subfertility. All published systematic reviews on assisted reproductive technology, with the latest search for studies taking place from 2007–2011, were considered. Reviews that contained no included studies or considered diagnostic outcomes were excluded. Thirty each of Cochrane and non-Cochrane reviews were randomly selected from a search of relevant databases. Both tools were then applied to all sixty reviews. The results were converted to percentage scores and all reviews graded and ranked based on this. AMSTAR produced a much wider variation in percentage scores and achieved higher inter-rater reliability than R-AMSTAR according to kappa statistics. The average rating for Cochrane reviews was consistent between the two tools (88.3% for R-AMSTAR versus 83.6% for AMSTAR) but inconsistent for non-Cochrane reviews (63.9% R-AMSTAR vs. 38.5% AMSTAR). In comparing the rankings generated between the two tools Cochrane reviews changed an average of 4.2 places, compared to 2.9 for non-Cochrane. Conclusion R-AMSTAR provided greater guidance in the assessment of domains and produced quantitative results. However, there were many problems with the construction of its criteria and AMSTAR was much easier to apply consistently. We recommend that AMSTAR incorporates the findings of this study and produces additional guidance for its application in order to improve its reliability and usefulness. PMID:23300526
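Illustrative note (not the study data): the score-conversion step described, mapping AMSTAR's eleven yes/no items and R-AMSTAR's eleven four-point items onto percentage scores so that reviews can be graded and ranked on a common scale, can be sketched as follows. All review names and item scores are invented.

```python
# Hypothetical item-level scores for three reviews.
amstar = {                      # 11 items, each 0 (no) or 1 (yes)
    "Review A": [1, 1, 1, 0, 1, 1, 1, 1, 0, 1, 1],
    "Review B": [1, 0, 1, 0, 0, 1, 1, 0, 0, 0, 1],
    "Review C": [1, 1, 0, 1, 1, 1, 0, 1, 1, 1, 0],
}
r_amstar = {                    # 11 items, each scored 1-4
    "Review A": [4, 4, 3, 2, 4, 4, 3, 4, 2, 4, 3],
    "Review B": [3, 2, 3, 1, 2, 3, 3, 2, 1, 2, 3],
    "Review C": [4, 3, 2, 3, 4, 4, 2, 3, 3, 4, 2],
}

def pct(scores, max_per_item):
    """Convert item scores to a percentage of the maximum attainable total."""
    return 100.0 * sum(scores) / (max_per_item * len(scores))

for name in amstar:
    a = pct(amstar[name], 1)                  # AMSTAR: max 1 per item
    r = pct(r_amstar[name], 4)                # R-AMSTAR: max 4 per item
    print(f"{name}: AMSTAR {a:5.1f}%   R-AMSTAR {r:5.1f}%")

ranking = sorted(amstar, key=lambda n: pct(amstar[n], 1), reverse=True)
print("AMSTAR ranking:", " > ".join(ranking))
```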
Teixeira, Carlos A; Russo, Mário; Matos, Cristina; Bentes, Isabel
2014-12-01
This article describes an accurate methodology for the operational, economic, and environmental assessment of municipal solid waste collection. The proposed methodological tool uses key performance indicators to evaluate independent operational and economic efficiency and performance of municipal solid waste collection practices. These key performance indicators are then used in life cycle inventories and life cycle impact assessment. Finally, the life cycle assessment environmental profiles provide the environmental assessment. We also report a successful application of this tool through a case study in the Portuguese city of Porto. Preliminary results demonstrate the applicability of the methodological tool to real cases. Some of the findings highlight the significant differences between average mixed and selective collection in effective distance (2.14 km t⁻¹; 16.12 km t⁻¹), fuel consumption (3.96 L t⁻¹; 15.37 L t⁻¹), crew productivity (0.98 t h⁻¹ worker⁻¹; 0.23 t h⁻¹ worker⁻¹), cost (45.90 € t⁻¹; 241.20 € t⁻¹), and global warming impact (19.95 kg CO2eq t⁻¹; 57.47 kg CO2eq t⁻¹). Preliminary results consistently indicate: (a) higher global performance of mixed collection as compared with selective collection; (b) dependency of collection performance, even in urban areas, on the waste generation rate and density; (c) the decline of selective collection performance with decreasing source-separated material density and recycling collection rate; and (d) that the main threats to collection route efficiency are extensive collection distances, high fuel consumption vehicles, and reduced crew productivity. © The Author(s) 2014.
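Illustrative note (not the Porto dataset): the per-tonne key performance indicators used in the operational and economic assessment can be computed from route-level records as sketched below; all route figures and the diesel emission factor are assumed for illustration only.

```python
# Hypothetical route-level records: tonnes collected, distance (km),
# fuel (L), worker-hours, and cost (EUR) for one collection round.
routes = {
    "mixed":     dict(tonnes=9.5, km=21.0, fuel_l=38.0, worker_h=9.5, cost_eur=430.0),
    "selective": dict(tonnes=1.8, km=29.0, fuel_l=27.0, worker_h=7.5, cost_eur=410.0),
}

EF_DIESEL = 3.17   # assumed kg CO2eq per litre of diesel (well-to-wheel)

for name, r in routes.items():
    t = r["tonnes"]
    print(f"{name:9s}"
          f"  distance {r['km'] / t:5.2f} km/t"
          f"  fuel {r['fuel_l'] / t:5.2f} L/t"
          f"  productivity {t / r['worker_h']:.2f} t/(h*worker)"
          f"  cost {r['cost_eur'] / t:6.2f} EUR/t"
          f"  GHG {r['fuel_l'] * EF_DIESEL / t:5.2f} kg CO2eq/t")
```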
Lerner, Richard M
2015-06-01
The bold claim that developmental science can contribute to both enhancing positive development among diverse individuals across the life span and promoting social justice in their communities, nations and regions is supported by decades of theoretical, methodological and research contributions. To explain the basis of this claim, I describe the relational developmental systems (RDS) metamodel that frames contemporary developmental science, and I present an example of a programme of research within the adolescent portion of the life span that is associated with this metamodel and is pertinent to promoting positive human development. I then discuss methodological issues associated with using RDS-based models as frames for research and application. Finally, I explain how the theoretical and methodological ideas associated with RDS thinking may provide the scholarly tools needed by developmental scientists seeking to contribute to human thriving and to advance social justice in the Global South. © 2015 International Union of Psychological Science.
Chambaron, Stéphanie; Ginhac, Dominique; Perruchet, Pierre
2008-05-01
Serial reaction time tasks and, more generally, the visual-motor sequential paradigms are increasingly popular tools in a variety of research domains, from studies on implicit learning in laboratory contexts to the assessment of residual learning capabilities of patients in clinical settings. A consequence of this success, however, is the increased variability in paradigms and the difficulty inherent in respecting the methodological principles that two decades of experimental investigations have made more and more stringent. The purpose of the present article is to address those problems. We present a user-friendly application that simplifies running classical experiments, but is flexible enough to permit a broad range of nonstandard manipulations for more specific objectives. Basic methodological guidelines are also provided, as are suggestions for using the software to explore unconventional directions of research. The most recent version of gSRT-Soft may be obtained for free by contacting the authors.
Integrated computational materials engineering: Tools, simulations and new applications
Madison, Jonathan D.
2016-03-30
Here, Integrated Computational Materials Engineering (ICME) is a relatively new methodology full of tremendous potential to revolutionize how science, engineering and manufacturing work together. ICME was motivated by the desire to derive greater understanding throughout each portion of the development life cycle of materials, while simultaneously reducing the time from discovery to implementation [1,2].
An application generator for rapid prototyping of Ada real-time control software
NASA Technical Reports Server (NTRS)
Johnson, Jim; Biglari, Haik; Lehman, Larry
1990-01-01
The need to increase engineering productivity and decrease software life cycle costs in real-time system development establishes a motivation for a method of rapid prototyping. The design by iterative rapid prototyping technique is described. A tool which facilitates such a design methodology for the generation of embedded control software is described.
Stacked Autoencoders for Outlier Detection in Over-the-Horizon Radar Signals
Protopapadakis, Eftychios; Doulamis, Anastasios; Doulamis, Nikolaos; Dres, Dimitrios; Bimpas, Matthaios
2017-01-01
Detection of outliers in radar signals is a considerable challenge in maritime surveillance applications. High-Frequency Surface-Wave (HFSW) radars have attracted significant interest as potential tools for long-range target identification and outlier detection at over-the-horizon (OTH) distances. However, a number of disadvantages, such as their low spatial resolution and presence of clutter, have a negative impact on their accuracy. In this paper, we explore the applicability of deep learning techniques for detecting deviations from the norm in behavioral patterns of vessels (outliers) as they are tracked from an OTH radar. The proposed methodology exploits the nonlinear mapping capabilities of deep stacked autoencoders in combination with density-based clustering. A comparative experimental evaluation of the approach shows promising results in terms of the proposed methodology's performance. PMID:29312449
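The abstract does not give implementation details; the following is a minimal sketch of the general idea of combining a small stacked autoencoder with density-based clustering (PyTorch and scikit-learn's DBSCAN on synthetic track features; the layer sizes and hyperparameters are assumptions, not the authors' configuration):

```python
import torch
import torch.nn as nn
import numpy as np
from sklearn.cluster import DBSCAN

# Synthetic vessel-track feature vectors (e.g. speed, heading change, range rate, ...)
X = torch.tensor(np.random.rand(500, 8), dtype=torch.float32)

# Small stacked autoencoder: 8 -> 4 -> 2 -> 4 -> 8
encoder = nn.Sequential(nn.Linear(8, 4), nn.ReLU(), nn.Linear(4, 2))
decoder = nn.Sequential(nn.Linear(2, 4), nn.ReLU(), nn.Linear(4, 8))
model = nn.Sequential(encoder, decoder)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(200):                      # reconstruction training
    opt.zero_grad()
    loss = loss_fn(model(X), X)
    loss.backward()
    opt.step()

# Density-based clustering in the learned latent space;
# points DBSCAN leaves unassigned to any cluster (label -1) are candidate outliers.
with torch.no_grad():
    Z = encoder(X).numpy()
labels = DBSCAN(eps=0.1, min_samples=5).fit_predict(Z)
outliers = np.where(labels == -1)[0]
print(f"{len(outliers)} candidate outlier tracks")
```

In a real application the input features would come from tracked vessel behaviour rather than random numbers, and the clustering parameters would need tuning to the radar data.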
Rosella, L; Bowman, C; Pach, B; Morgan, S; Fitzpatrick, T; Goel, V
2016-07-01
Most quality appraisal tools were developed for clinical medicine and tend to be study-specific with a strong emphasis on risk of bias. In order to be more relevant to public health, an appropriate quality appraisal tool needs to be less reliant on the evidence hierarchy and consider practice applicability. Given the broad range of study designs used in public health, the objective of this study was to develop and validate a meta-tool that combines public health-focused principles of appraisal coupled with a set of design-specific companion tools. Several design methods were used to develop and validate the tool including literature review, synthesis, and validation with a reference standard. A search of critical appraisal tools relevant to public health was conducted; core concepts were collated. The resulting framework was piloted during three feedback sessions with public health practitioners. Following subsequent revisions, the final meta-tool, the Meta Quality Appraisal Tool (MetaQAT), was then validated through a content analysis of appraisals conducted by two groups of experienced public health researchers (MetaQAT vs generic appraisal form). The MetaQAT framework consists of four domains: relevancy, reliability, validity, and applicability. In addition, a companion tool was assembled from existing critical appraisal tools to provide study design-specific guidance on validity appraisal. Content analysis showed similar methodological and generalizability concerns were raised by both groups; however, the MetaQAT appraisers commented more extensively on applicability to public health practice. Critical appraisal tools designed for clinical medicine have limitations for use in the context of public health. The meta-tool structure of the MetaQAT allows for rigorous appraisal, while allowing users to simultaneously appraise the multitude of study designs relevant to public health research and assess non-standard domains, such as applicability. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
Thermo-hydro-mechanical-chemical processes in fractured-porous media: Benchmarks and examples
NASA Astrophysics Data System (ADS)
Kolditz, O.; Shao, H.; Görke, U.; Kalbacher, T.; Bauer, S.; McDermott, C. I.; Wang, W.
2012-12-01
The book comprises an assembly of benchmarks and examples for porous media mechanics collected over the last twenty years. Analysis of thermo-hydro-mechanical-chemical (THMC) processes is essential to many applications in environmental engineering, such as geological waste deposition, geothermal energy utilisation, carbon capture and storage, water resources management, hydrology, and even climate change. In order to assess the feasibility as well as the safety of geotechnical applications, process-based modelling is the only tool able to put numbers on, i.e. to quantify, future scenarios. This places a huge responsibility on the reliability of computational tools. Benchmarking is an appropriate methodology to verify the quality of modelling tools based on best practices. Moreover, benchmarking and code comparison foster community efforts. The benchmark book is part of the OpenGeoSys initiative - an open source project to share knowledge and experience in environmental analysis and scientific computation.
Batchelor, Hannah K; Kendall, Richard; Desset-Brethes, Sabine; Alex, Rainer; Ernest, Terry B
2013-11-01
Biopharmaceutics is routinely used in the design and development of medicines to generate science based evidence to predict in vivo performance; the application of this knowledge specifically to paediatric medicines development is yet to be explored. The aim of this review is to present the current status of available biopharmaceutical tools and tests including solubility, permeability and dissolution that may be appropriate for use in the development of immediate release oral paediatric medicines. The existing tools used in adults are discussed together with any limitations for their use within paediatric populations. The results of this review highlight several knowledge gaps in current methodologies in paediatric biopharmaceutics. The authors provide recommendations based on existing knowledge to adapt tests to better represent paediatric patient populations and also provide suggestions for future research that may lead to better tools to evaluate paediatric medicines. Copyright © 2013 Elsevier B.V. All rights reserved.
PDTRT special section: Methodological issues in personality disorder research.
Widiger, Thomas A
2017-10-01
Personality Disorders: Theory, Research, and Treatment includes a rolling, ongoing Special Section concerned with methodological issues in personality disorder research. This third edition of the series includes two articles. The first is by Brian Hicks, Angus Clark, and Emily Durbin: "Person-Centered Approaches in the Study of Personality Disorders." The second article is by Steve Balsis: "Item Response Theory Applications in Personality Disorder Research." Both articles should be excellent resources for future research, and certainly for manuscripts submitted to this journal that use these analytic tools. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Program management aid for redundancy selection and operational guidelines
NASA Technical Reports Server (NTRS)
Hodge, P. W.; Davis, W. L.; Frumkin, B.
1972-01-01
Although this criterion was developed specifically for use on the shuttle program, it has application to many other multi-mission programs (e.g. aircraft or mechanisms). The methodology employed is directly applicable even if the tools (nomographs and equations) are for mission-peculiar cases. The redundancy selection criterion was developed to ensure that both the design and operational cost impacts (life cycle costs) were considered in the selection of the quantity of operational redundancy. These tools were developed as aids in expediting the decision process and are not intended as an automatic decision maker. This approach to redundancy selection is unique in that it enables a pseudo systems analysis to be performed on an equipment basis without waiting for all designs to be hardened.
A GIS semiautomatic tool for classifying and mapping wetland soils
NASA Astrophysics Data System (ADS)
Moreno-Ramón, Héctor; Marqués-Mateu, Angel; Ibáñez-Asensio, Sara
2016-04-01
Wetlands are one of the most productive and biodiverse ecosystems in the world. Water is the main resource and controls the relationships between agents and factors that determine the quality of the wetland. However, vegetation, wildlife and soils are also essential factors for understanding these environments. Soils have possibly been the least studied resource because of their sampling problems, and as a result wetland soils have sometimes been classified only broadly. The traditional methodology states that homogeneous soil units should be based on the five soil-forming factors. Problems appear when the variation of one soil-forming factor is too small to differentiate a change in soil units, or when another factor that is not usually taken into account (e.g. a fluctuating water table) is at play. This is the case of the Albufera of Valencia, a coastal wetland located on the central eastern coast of the Iberian Peninsula (Spain). Its saline water table fluctuates throughout the year and generates differences in soils. The objectives of this study were therefore to establish a reliable methodology that avoids these problems and to develop a GIS tool for defining homogeneous soil units in wetlands. This step is essential for the soil scientist, who has to decide the number of soil profiles in a study. The research was conducted with data from 133 soil pits from a previous study in the wetland, in which soil parameters of 401 samples (organic carbon, salinity, carbonates, n-value, etc.) were analysed. In a first stage, GIS layers were generated according to depth, using Bayesian Maximum Entropy. Subsequently, a program based on decision tree algorithms was designed in a GIS environment. The goal of this tool was to create, for each soil variable, a single layer according to the different diagnostic criteria of Soil Taxonomy (properties, horizons and diagnostic epipedons). The program then generated a set of layers with the geographical information corresponding to each diagnostic criterion. Finally, the superposition of these layers produced the homogeneous soil units where the soil scientist should locate the soil profiles. Historically, the Albufera of Valencia had been treated as a single homogeneous soil unit, but after applying the methodology and the GIS tool six homogeneous units were identified. The outcome shows that only six profiles would have been necessary, against the 19 profiles opened when the original study was carried out. In conclusion, the methodology and the GIS tool can be employed in areas where the soil-forming factors cannot be distinguished, and their application together with rapid measurement methods could make the definition of homogeneous units more economical.
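The study's actual decision rules follow Soil Taxonomy diagnostic criteria; the sketch below only illustrates the layer-superposition idea, with NumPy rasters standing in for GIS layers and invented threshold values that are not the criteria used for the Albufera of Valencia:

```python
import numpy as np

# Interpolated property rasters for the wetland (values are synthetic)
organic_carbon = np.random.uniform(0.5, 12.0, (100, 100))   # %
salinity_ece   = np.random.uniform(0.5, 30.0, (100, 100))   # dS/m
n_value        = np.random.uniform(0.2, 1.2, (100, 100))

# One boolean layer per (hypothetical) diagnostic criterion
criteria = {
    "high_oc": organic_carbon > 8.0,
    "saline":  salinity_ece > 16.0,
    "fluid":   n_value > 0.7,
}

# Superpose the criterion layers into a single code per cell; each distinct
# code is a candidate homogeneous soil unit where one profile would be opened.
unit_code = np.zeros((100, 100), dtype=int)
for bit, layer in enumerate(criteria.values()):
    unit_code += layer.astype(int) << bit

print("homogeneous units found:", len(np.unique(unit_code)))
```

A GIS implementation would apply the same logic to interpolated raster layers and report the resulting unit polygons to the soil scientist.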
Use of SCALE Continuous-Energy Monte Carlo Tools for Eigenvalue Sensitivity Coefficient Calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Perfetti, Christopher M; Rearden, Bradley T
2013-01-01
The TSUNAMI code within the SCALE code system makes use of eigenvalue sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different critical systems, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved fidelity and the desire to extend TSUNAMI analysis to advanced applications has motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) Monte Carlo applications. The CLUTCH and Iterated Fission Probability (IFP) eigenvalue sensitivity methods were recently implemented in the CE KENO framework to generate the capability for TSUNAMI-3D to perform eigenvalue sensitivity calculations in continuous-energy applications. This work explores the improvements in accuracy that can be gained in eigenvalue and eigenvalue sensitivity calculations through the use of the SCALE CE KENO and CE TSUNAMI continuous-energy Monte Carlo tools as compared to multigroup tools. The CE KENO and CE TSUNAMI tools were used to analyze two difficult models of critical benchmarks, and produced eigenvalue and eigenvalue sensitivity coefficient results that showed a marked improvement in accuracy. The CLUTCH sensitivity method in particular excelled in terms of efficiency and computational memory requirements.
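For context, the eigenvalue sensitivity coefficient computed by tools such as TSUNAMI is conventionally defined as the fractional change in the effective multiplication factor per fractional change in a cross section; this standard definition is added here for readability and is not quoted from the abstract:

```latex
S_{k,\Sigma} \;=\; \frac{\Sigma}{k_{\mathrm{eff}}}\,
                   \frac{\partial k_{\mathrm{eff}}}{\partial \Sigma}
```

Methods such as CLUTCH and IFP differ in how this derivative is estimated during the Monte Carlo random walk, which drives the efficiency and memory trade-offs noted above.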
Production methodologies of polymeric and hydrogel particles for drug delivery applications.
Lima, Ana Catarina; Sher, Praveen; Mano, João F
2012-02-01
Polymeric particles are ideal vehicles for controlled delivery applications due to their ability to encapsulate a variety of substances, namely low- and high-molecular mass therapeutics, antigens or DNA. Micro- and nanoscale spherical materials have been developed as carriers for therapies, using appropriate methodologies, in order to achieve prolonged and controlled drug administration. This paper reviews the methodologies used for the production of polymeric micro/nanoparticles. Emulsions, phase separation, spray drying, ionic gelation, polyelectrolyte complexation and supercritical fluid precipitation are all widely used processes for polymeric micro/nanoencapsulation. This paper also discusses the recent developments and patents reported in this field. Other less conventional methodologies are also described, such as the use of superhydrophobic substrates to produce hydrogel and polymeric particulate biomaterials. Polymeric drug delivery systems have gained increased importance due to the need to improve the efficiency and versatility of existing therapies. This allows the development of innovative concepts that could create more efficient systems, which in turn may address many healthcare needs worldwide. The existing methods to produce polymeric release systems have some critical drawbacks, which compromise the efficiency of these techniques. Improvements and development of new methodologies could be achieved by using multidisciplinary approaches and tools taken from other subjects, including nanotechnologies, biomimetics, tissue engineering, polymer science or microfluidics.
Integration of tablet technologies in the e-laboratory of cytology: a health technology assessment.
Giansanti, Daniele; Pochini, Marco; Giovagnoli, Maria Rosaria
2014-10-01
Although tablet systems are becoming a powerful technology, particularly useful in every application of medical imaging, to date no one has investigated the acceptance and performance of this technology in digital cytology. The specific aims of the work were (1) to design a health technology assessment (HTA) tool to assess, in terms of performance and acceptance, the introduction of tablet technologies (wearable, portable, and non portable) in the e-laboratories of cytology and (2) to test the tool in a first significant application of digital cytology. An HTA tool was proposed operating on a domain of five dimensions of investigation comprising the basic information of the product of digital cytology, the perceived subjective quality of images, the assessment of the virtual navigation on the e-slide, the assessment of the information and communication technologies features, and the diagnostic power. Six e-slides regarding studies of cervicovaginal cytology digitalized by means of an Aperio ( www.aperio.com ) scanner and uploaded onto the www.digitalslide.it Web site were used for testing the methodology on three different network connections. Three experts of cytology successfully tested the methodology on seven tablets found suitable for the study in their own standard configuration. Specific indexes furnished by the tool indicated both a high degree of performance and subjective acceptance of the investigated technology. The HTA tool thus could be useful to investigate new tablet technologies in digital cytology and furnish stakeholders with useful information that may help them make decisions involving the healthcare system. From a global point of view the study demonstrates the feasibility of using the tablet technology in digital cytology.
Effective methodology to derive strategic decisions from ESA exploration technology roadmaps
NASA Astrophysics Data System (ADS)
Cresto Aleina, Sara; Viola, Nicole; Fusaro, Roberta; Saccoccia, Giorgio
2016-09-01
Top priorities in future international space exploration missions regard the achievement of the necessary maturation of enabling technologies, thereby allowing Europe to play a role commensurate with its industrial, operational and scientific capabilities. As part of the actions derived from this commitment, ESA Technology Roadmaps for Exploration represent a powerful tool to prioritise R&D activities in technologies for space exploration and to support the preparation of a consistent procurement plan for space exploration technologies in Europe. The roadmaps illustrate not only the technology procurement (to TRL 8) paths for specific missions envisaged in the present timeframe, but also the achievement for Europe of technological milestones enabling operational capabilities and building blocks essential for current and future exploration missions. Coordination of requirements and funding sources among all European stakeholders (ESA, EU, national agencies, and industry) is one of the objectives of these roadmaps, which also show possible applications of the technologies beyond space exploration, both at ESA and outside. The present paper describes the activity that supports the on-going work at ESA on the elaboration and update of these roadmaps and related tools, in order to critique the approach followed, to suggest methodologies for assessing the roadmaps, and to derive strategic decisions for the advancement of space exploration in Europe. After a review of technology areas, missions/programmes and related building blocks (architectures) and operational capabilities, technology applicability analyses are presented. The aim is to identify whether a specific technology is required, applicable, or potentially a demonstrator in the building blocks of the proposed mission concepts. In this way, for each technology it is possible to outline one or more specific plans to increase TRL up to the required level. In practice, this translates into two possible solutions: on the one hand, approved mission concepts are complemented with the required technologies if the latter can be considered applicable or demonstrators; on the other, if they are neither applicable nor demonstrators, new missions, i.e. technology demonstrators based on multidisciplinary grouping of key technologies, shall be evaluated, so as to proceed through incremental steps. Finally, techniques to determine priorities in technology procurement are identified, and methodologies to rank the required technologies are proposed. In addition, a tool is presented that estimates the percentage of the technologies required for the final destination that are implementable at each intermediate destination of the incremental approach.
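As a purely illustrative sketch of the percentage estimate mentioned at the end of the abstract (the destinations, technology identifiers and applicability flags below are invented, not ESA roadmap data):

```python
# Technologies required for the final destination, and which of them are
# applicable (or flyable as demonstrators) at each intermediate destination.
required_for_final = {"T1", "T2", "T3", "T4", "T5"}
applicable_at = {
    "LEO demo":      {"T1", "T2"},
    "Lunar orbit":   {"T1", "T2", "T4"},
    "Lunar surface": {"T1", "T2", "T3", "T4"},
}

for destination, techs in applicable_at.items():
    share = 100.0 * len(techs & required_for_final) / len(required_for_final)
    print(f"{destination}: {share:.0f}% of final-destination technologies implementable")
```

A ranking of this kind, repeated over the roadmap's building blocks, is one way the incremental-step argument described above could be quantified.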
Understanding the impact of TV commercials: electrical neuroimaging.
Vecchiato, Giovanni; Kong, Wanzeng; Maglione, Anton Giulio; Wei, Daming
2012-01-01
Today, there is a greater interest in the marketing world in using neuroimaging tools to evaluate the efficacy of TV commercials. This field of research is known as neuromarketing. In this article, we illustrate some applications of electrical neuroimaging, a discipline that uses electroencephalography (EEG) and intensive signal processing techniques for the evaluation of marketing stimuli. We also show how the proper usage of these methodologies can provide information related to memorization and attention while people are watching marketing-relevant stimuli. We note that temporal and frequency patterns of EEG signals are able to provide possible descriptors that convey information about the cognitive process in subjects observing commercial advertisements (ads). Such information could be unobtainable through common tools used in standard marketing research. Evidence of this research shows how EEG methodologies could be employed to better design new products that marketers are going to promote and to analyze the global impact of video commercials already broadcast on TV.
Assessing Hydrologic Impacts of Future Land Cover Change ...
Long-term land-use and land cover change and their associated impacts pose critical challenges to sustaining vital hydrological ecosystem services for future generations. In this study, a methodology was developed on the San Pedro River Basin to characterize hydrologic impacts from future urban growth through time. This methodology was then expanded and utilized to characterize the changing hydrology on the South Platte River Basin. Future urban growth is represented by housing density maps generated in decadal intervals from 2010 to 2100, produced by the U.S. Environmental Protection Agency (EPA) Integrated Climate and Land-Use Scenarios (ICLUS) project. ICLUS developed future housing density maps by adapting the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) social, economic, and demographic storylines to the conterminous United States. To characterize hydrologic impacts from future growth, the housing density maps were reclassified to National Land Cover Database (NLCD) 2006 land cover classes and used to parameterize the Soil and Water Assessment Tool (SWAT) using the Automated Geospatial Watershed Assessment (AGWA) tool. The objectives of this project were to 1) develop and describe a methodology for adapting the ICLUS data for use in AGWA as an approach to evaluate basin-wide impacts of development on water quantity and quality, 2) present initial results from the application of the methodology to
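The reclassification table itself is not reproduced in the abstract; a hypothetical sketch of mapping ICLUS housing-density values to NLCD 2006 developed classes (the threshold values are invented for illustration only) might look like:

```python
# NLCD 2006 developed-class codes: 21 open space, 22 low, 23 medium, 24 high intensity
def density_to_nlcd(units_per_hectare: float) -> int:
    """Reclassify an ICLUS housing-density cell to an NLCD developed class.
    Thresholds are illustrative only, not the values used in the study."""
    if units_per_hectare < 0.1:
        return 21
    elif units_per_hectare < 1.0:
        return 22
    elif units_per_hectare < 4.0:
        return 23
    else:
        return 24

print([density_to_nlcd(d) for d in (0.05, 0.5, 2.0, 8.0)])   # [21, 22, 23, 24]
```

Reclassified rasters of this kind are what a tool such as AGWA would then use to parameterize the SWAT land cover inputs.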
Nucleic acids-based tools for ballast water surveillance, monitoring, and research
NASA Astrophysics Data System (ADS)
Darling, John A.; Frederick, Raymond M.
2018-03-01
Understanding the risks of biological invasion posed by ballast water, whether in the context of compliance testing, routine monitoring, or basic research, is fundamentally an exercise in biodiversity assessment, and as such should take advantage of the best tools available for tackling that problem. The past several decades have seen growing application of genetic methods for the study of biodiversity, driven in large part by dramatic technological advances in nucleic acids analysis. Monitoring approaches based on such methods have the potential to dramatically increase sampling throughput for biodiversity assessments, and to improve on the sensitivity, specificity, and taxonomic accuracy of traditional approaches. The application of targeted detection tools (largely focused on PCR but increasingly incorporating novel probe-based methodologies) has led to a paradigm shift in rare species monitoring, and such tools have already been applied for early detection in the context of ballast water surveillance. Rapid improvements in community profiling approaches based on high throughput sequencing (HTS) could similarly impact broader efforts to catalogue the biodiversity present in ballast tanks, and could provide novel opportunities to better understand the risks of biotic exchange posed by ballast water transport, and the effectiveness of attempts to mitigate those risks. These various approaches still face considerable challenges to effective implementation, depending on particular management or research needs. Compliance testing, for instance, remains dependent on accurate quantification of viable target organisms; while tools based on RNA detection show promise in this context, the demands of such testing require considerable additional investment in methods development. In general surveillance and research contexts, both targeted and community-based approaches are still limited by various factors: quantification remains a challenge (especially for taxa in larger size classes), gaps in nucleic acids reference databases are still considerable, uncertainties in taxonomic assignment methods persist, and many applications have not yet matured sufficiently to offer standardized methods capable of meeting rigorous quality assurance standards. Nevertheless, the potential value of these tools, their growing utilization in biodiversity monitoring, and the rapid methodological advances over the past decade all suggest that they should be seriously considered for inclusion in the ballast water surveillance toolkit.
Analysis of Alternatives for Risk Assessment Methodologies and Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.
The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.
Application of Six Sigma towards improving surgical outcomes.
Shukla, P J; Barreto, S G; Nadkarni, M S
2008-01-01
Six Sigma is a 'process excellence' tool targeting continuous improvement, achieved by providing a methodology for improving key steps of a process. It is ripe for application in health care, since almost all health care processes require a near-zero tolerance for mistakes. The aim of this study is to apply the Six Sigma methodology to a clinical surgical process and to assess the improvement (if any) in outcomes and patient care. The guiding principles of Six Sigma, namely DMAIC (Define, Measure, Analyze, Improve, Control), were used to analyze the impact of the double stapling technique (DST) on improving sphincter preservation rates for rectal cancer. The analysis using the Six Sigma methodology revealed a Sigma score of 2.10 in relation to successful sphincter preservation. This score demonstrates an improvement over the previous technique (73% versus the previous 54%). This study represents one of the first clinical applications of Six Sigma in the surgical field. By understanding, accepting, and applying the principles of Six Sigma, we have an opportunity to transfer a very successful management philosophy to facilitate the identification of key steps that can improve outcomes and ultimately patient safety and the quality of surgical care provided.
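As a hedged illustration of where a Sigma score such as the reported 2.10 can come from, the conventional calculation converts the process yield to a z-value and adds the customary 1.5-sigma long-term shift (assuming that convention was used by the authors):

```python
from scipy.stats import norm

def sigma_level(yield_fraction: float, shift: float = 1.5) -> float:
    """Long-term Sigma level from process yield, using the customary 1.5-sigma shift."""
    return norm.ppf(yield_fraction) + shift

print(round(sigma_level(0.73), 2))  # ~2.11, close to the 2.10 reported for DST
print(round(sigma_level(0.54), 2))  # ~1.60 for the earlier technique
```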
Niaksu, Olegas; Zaptorius, Jonas
2014-01-01
This paper presents a methodology suitable for creating a performance-related remuneration system in the healthcare sector that would meet requirements for efficiency and sustainable quality of healthcare services. A methodology for performance indicator selection, ranking and a posteriori evaluation is proposed and discussed. The Priority Distribution Method is applied for unbiased performance criteria weighting, and data mining methods are proposed to monitor and evaluate the results of the motivation system. We developed a method for healthcare-specific criteria selection consisting of 8 steps, and proposed and demonstrated the application of the Priority Distribution Method for weighting the selected criteria. Moreover, a set of data mining methods for evaluation of the motivational system outcomes was proposed. The described methodology for calculating performance-related payment needs practical approbation. We plan to develop semi-automated tools for monitoring institutional and personal performance indicators. The final step would be approbation of the methodology in a healthcare facility.
Automated Methodologies for the Design of Flow Diagrams for Development and Maintenance Activities
NASA Astrophysics Data System (ADS)
Shivanand M., Handigund; Shweta, Bhat
The Software Requirements Specification (SRS) is a text document prepared by strategic management incorporating the requirements of the organization. These requirements of the ongoing business/project development process involve the software tools, the hardware devices, the manual procedures, the application programs and the communication commands. These components are appropriately ordered, in different flow diagrams (viz. activity chart, workflow diagram, activity diagram, component diagram and deployment diagram), for achieving the mission of the concerned process in both project development and ongoing business processes. This paper proposes two generic, automatic methodologies for the design of the various flow diagrams of (i) project development activities and (ii) the ongoing business process. The methodologies also resolve the ensuing deadlocks in the flow diagrams and determine the critical paths for the activity chart. Though both methodologies are independent, each complements the other in authenticating its correctness and completeness.
Sanchez-Vazquez, Manuel J; Nielen, Mirjam; Edwards, Sandra A; Gunn, George J; Lewis, Fraser I
2012-08-31
Abattoir-detected pathologies are of crucial importance to both pig production and food safety. Usually, more than one pathology coexists in a pig herd, although it often remains unknown how these different pathologies interrelate. Identification of the associations between different pathologies may facilitate an improved understanding of their underlying biological linkage, and support veterinarians in encouraging control strategies aimed at reducing the prevalence of not just one, but two or more conditions simultaneously. Multi-dimensional machine learning methodology was used to identify associations between ten typical pathologies in 6485 batches of slaughtered finishing pigs, assisting the comprehension of their biological association. Pathologies potentially associated with septicaemia (e.g. pericarditis, peritonitis) appear interrelated, suggesting on-going bacterial challenges by pathogens such as Haemophilus parasuis and Streptococcus suis. Furthermore, hepatic scarring appears interrelated with both milk spot livers (Ascaris suum) and bacteria-related pathologies, suggesting a potential multi-pathogen nature for this pathology. The application of novel multi-dimensional machine learning methodology provided new insights into how typical pig pathologies are potentially interrelated at batch level. The methodology presented is a powerful exploratory tool to generate hypotheses, applicable to a wide range of studies in veterinary research.
A primer on systematic reviews in toxicology.
Hoffmann, Sebastian; de Vries, Rob B M; Stephens, Martin L; Beck, Nancy B; Dirven, Hubert A A M; Fowle, John R; Goodman, Julie E; Hartung, Thomas; Kimber, Ian; Lalu, Manoj M; Thayer, Kristina; Whaley, Paul; Wikoff, Daniele; Tsaioun, Katya
2017-07-01
Systematic reviews, pioneered in the clinical field, provide a transparent, methodologically rigorous and reproducible means of summarizing the available evidence on a precisely framed research question. Having matured to a well-established approach in many research fields, systematic reviews are receiving increasing attention as a potential tool for answering toxicological questions. In the larger framework of evidence-based toxicology, the advantages and obstacles of, as well as the approaches for, adapting and adopting systematic reviews to toxicology are still being explored. To provide the toxicology community with a starting point for conducting or understanding systematic reviews, we herein summarized available guidance documents from various fields of application. We have elaborated on the systematic review process by breaking it down into ten steps, starting with planning the project, framing the question, and writing and publishing the protocol, and concluding with interpretation and reporting. In addition, we have identified the specific methodological challenges of toxicological questions and have summarized how these can be addressed. Ultimately, this primer is intended to stimulate scientific discussions of the identified issues to fuel the development of toxicology-specific methodology and to encourage the application of systematic review methodology to toxicological issues.
Samsi, Siddharth; Krishnamurthy, Ashok K.; Gurcan, Metin N.
2012-01-01
Follicular Lymphoma (FL) is one of the most common non-Hodgkin lymphomas in the United States. Diagnosis and grading of FL are based on the review of histopathological tissue sections under a microscope and are influenced by human factors such as fatigue and reader bias. Computer-aided image analysis tools can help improve the accuracy of diagnosis and grading and act as another tool at the pathologist’s disposal. Our group has been developing algorithms for identifying follicles in immunohistochemical images. These algorithms have been tested and validated on small images extracted from whole slide images. However, the use of these algorithms for analyzing the entire whole slide image requires significant changes to the processing methodology, since the images are relatively large (on the order of 100k × 100k pixels). In this paper we discuss the challenges involved in analyzing whole slide images and propose potential computational methodologies for addressing these challenges. We discuss the use of parallel computing tools on commodity clusters and compare performance of the serial and parallel implementations of our approach. PMID:22962572
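The paper's specific parallelization strategy is not detailed in the abstract; the sketch below only illustrates the generic tile-and-process pattern for an image on the order of 100k × 100k pixels, using the Python standard library's multiprocessing (the tile reader and the per-tile analysis are placeholders, not the authors' follicle-detection algorithm):

```python
from multiprocessing import Pool
from itertools import product
import numpy as np

TILE = 2048
WIDTH = HEIGHT = 100_000            # whole-slide image dimensions (pixels)

def read_tile(x, y):
    # Placeholder: a real implementation would read the region from the WSI file
    # via a slide-reading library instead of generating random pixels.
    return np.random.randint(0, 255, (TILE, TILE), dtype=np.uint8)

def process_tile(origin):
    x, y = origin
    tile = read_tile(x, y)
    # Placeholder analysis: count pixels above a threshold in this tile.
    return (x, y, int((tile > 200).sum()))

if __name__ == "__main__":
    origins = list(product(range(0, WIDTH, TILE), range(0, HEIGHT, TILE)))[:16]  # small demo subset
    with Pool(processes=4) as pool:
        for x, y, count in pool.map(process_tile, origins):
            print(f"tile ({x},{y}): {count} candidate pixels")
```

On a commodity cluster the same decomposition would be distributed across nodes rather than local processes, with per-tile results merged in a reduction step.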
NASA Technical Reports Server (NTRS)
Rash, James L. (Editor)
1988-01-01
This publication comprises the papers presented at the 1988 Goddard Conference on Space Applications of Artificial Intelligence held at the NASA/Goddard Space Flight Center, Greenbelt, Maryland on May 24, 1988. The purpose of this annual conference is to provide a forum in which current research and development directed at space applications of artificial intelligence can be presented and discussed. The papers in these proceedings fall into the following areas: mission operations support, planning and scheduling; fault isolation/diagnosis; image processing and machine vision; data management; modeling and simulation; and development tools and methodologies.
Successfully Transitioning Science Research to Space Weather Applications
NASA Technical Reports Server (NTRS)
Spann, James
2012-01-01
The awareness of potentially significant impacts of space weather on space- and ground-based technological systems has generated a strong desire in many sectors of government and industry to effectively transform knowledge and understanding of the variable space environment into useful tools and applications for use by those entities responsible for systems that may be vulnerable to space weather impacts. Essentially, effectively transitioning science knowledge to useful applications relevant to space weather has become important. This talk will present proven methodologies that have been demonstrated to be effective, and how in the current environment those can be applied to space weather transition efforts.
Structural Embeddings: Mechanization with Method
NASA Technical Reports Server (NTRS)
Munoz, Cesar; Rushby, John
1999-01-01
The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.
Recent Advances in Algal Genetic Tool Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
R. Dahlin, Lukas; T. Guarnieri, Michael
2016-06-24
The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.
Munthe-Kaas, Heather; Bohren, Meghan A; Glenton, Claire; Lewin, Simon; Noyes, Jane; Tunçalp, Özge; Booth, Andrew; Garside, Ruth; Colvin, Christopher J; Wainwright, Megan; Rashidian, Arash; Flottorp, Signe; Carlsen, Benedicte
2018-01-25
The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group. The approach has been developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's methodological limitations component. We developed the methodological limitations component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual methodological limitations component within several qualitative evidence syntheses before agreeing on the current definition and principles for application. When applying CERQual, we define methodological limitations as the extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding. In this paper, we describe the methodological limitations component and its rationale and offer guidance on how to assess methodological limitations of a review finding as part of the CERQual approach. This guidance outlines the information required to assess the methodological limitations component, the steps that need to be taken to assess methodological limitations of data contributing to a review finding, and examples of methodological limitation assessments. This paper provides guidance for review authors and others on undertaking an assessment of methodological limitations in the context of the CERQual approach. More work is needed to determine which criteria critical appraisal tools should include when assessing methodological limitations. We currently recommend that whichever tool is used, review authors provide a transparent description of their assessments of methodological limitations in a review finding. We expect the CERQual approach and its individual components to develop further as our experiences with the practical implementation of the approach increase.
Marshall Space Flight Center's Virtual Reality Applications Program 1993
NASA Technical Reports Server (NTRS)
Hale, Joseph P., II
1993-01-01
A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, teleoperations planning, etc.). The long-term goal of this technology program is to enable specialized Human Factors analyses earlier in the hardware and operations development process and to develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process where perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.
ERIC Educational Resources Information Center
Sulaiman V., Rasheed; Hall, Andy; Kalaivani, N. J.; Dorai, Kumuda; Reddy, T. S. Vamsidhar
2012-01-01
Purpose: This article reviews the experience of ICT applications as a tool for putting research derived knowledge into use for innovation in South Asia. Design/methodology/approach: The article uses the contemporary understanding of communication and innovation in reviewing the experience of ICTs in putting new knowledge into use in South Asia.…
NASA Technical Reports Server (NTRS)
Dulchavsky, Scott A.; Sargsyan, A.E.
2009-01-01
This slide presentation reviews the use of ultrasound as a diagnostic tool in microgravity environments. The goals of research into ultrasound usage in space environments are to: (1) determine the accuracy of ultrasound in novel clinical conditions; (2) determine optimal training methodologies; (3) determine microgravity-associated changes; and (4) develop an intuitive ultrasound catalog to enhance autonomous medical care. Uses of ultrasound technology in terrestrial applications are also reviewed.
ERIC Educational Resources Information Center
Márquez, Manuel; Chaves, Beatriz
2016-01-01
The application of a methodology based on S.C. Dik's Functionalist Grammar linguistic principles, which is addressed to the teaching of Latin to secondary students, has resulted in a quantitative improvement in students' acquisition process of knowledge. To do so, we have used a self-learning tool, an ad hoc dictionary, of which the use in…
Tagliaferri, Roberto; Longo, Giuseppe; Milano, Leopoldo; Acernese, Fausto; Barone, Fabrizio; Ciaramella, Angelo; De Rosa, Rosario; Donalek, Ciro; Eleuteri, Antonio; Raiconi, Giancarlo; Sessa, Salvatore; Staiano, Antonino; Volpicelli, Alfredo
2003-01-01
In the last decade, the use of neural networks (NN) and of other soft computing methods has begun to spread also in the astronomical community which, due to the required accuracy of the measurements, is usually reluctant to use automatic tools to perform even the most common tasks of data reduction and data mining. The federation of heterogeneous large astronomical databases which is foreseen in the framework of the astrophysical virtual observatory and national virtual observatory projects is, however, posing unprecedented data mining and visualization problems which will find a rather natural and user-friendly answer in artificial intelligence tools based on NNs, fuzzy sets or genetic algorithms. This review is aimed at both astronomers (who often have little knowledge of the methodological background) and computer scientists (who often know little about potentially interesting applications), and is therefore structured as follows: after giving a short introduction to the subject, we summarize the methodological background and focus our attention on some of the most interesting fields of application, namely: object extraction and classification, time series analysis, noise identification, and data mining. Most of the original work described in the paper has been performed in the framework of the AstroNeural collaboration (Napoli-Salerno).
van Noort, Betteke Maria; Pfeiffer, Ernst; Lehmkuhl, Ulrike; Kappel, Viola
2013-11-01
Adults with anorexia nervosa (AN) show weaknesses in several cognitive functions before and after weight restoration. There is a great demand for standardized examinations of executive functioning in the field of child and adolescent AN. Previous studies exhibited methodological inconsistencies regarding test selection and operationalization of cognitive functions, making the interpretation of their findings difficult. In order to overcome these inconsistencies, a neuropsychological assessment tool, the "Ravello Profile," was developed, though previously not available in German. This paper presents a German adaptation of the Ravello Profile and illustrates its applicability in children and adolescents via three case descriptions. The Ravello Profile was adapted for the German-speaking area. The applicability of the Ravello Profile was evaluated in three children and adolescents with AN. The cases presented confirm the feasible implementation of this adaptation of the Ravello Profile, both in children and adolescents. Hence, it enables a methodologically consistent examination of executive functioning in German-speaking children, adolescents, and adults with AN. Using the Ravello Profile, the role of cognitive functions in the development of AN can be systematically examined over a broad age range.
Closing the Certification Gaps in Adaptive Flight Control Software
NASA Technical Reports Server (NTRS)
Jacklin, Stephen A.
2008-01-01
Over the last five decades, extensive research has been performed to design and develop adaptive control systems for aerospace systems and other applications where the capability to change controller behavior at different operating conditions is highly desirable. Although adaptive flight control has been partially implemented through the use of gain-scheduled control, truly adaptive control systems using learning algorithms and on-line system identification methods have not seen commercial deployment. The reason is that the certification process for adaptive flight control software for use in the national airspace has not yet been decided. The purpose of this paper is to examine the gaps between the state-of-the-art methodologies used to certify conventional (i.e., non-adaptive) flight control system software and what will likely be needed to satisfy FAA airworthiness requirements. These gaps include the lack of a certification plan or process guide, the need to develop verification and validation tools and methodologies to analyze adaptive controller stability and convergence, as well as the development of metrics to evaluate adaptive controller performance at off-nominal flight conditions. This paper presents the major certification gap areas, a description of the current state of the verification methodologies, and what further research efforts will likely be needed to close the gaps remaining in current certification practices. It is envisioned that closing the gap will require certain advances in simulation methods, comprehensive methods to determine learning algorithm stability and convergence rates, the development of performance metrics for adaptive controllers, the application of formal software assurance methods, the application of on-line software monitoring tools for adaptive controller health assessment, and the development of a certification case for adaptive system safety of flight.
NASA Technical Reports Server (NTRS)
Gupta, Pramod; Loparo, Kenneth; Mackall, Dale; Schumann, Johann; Soares, Fola
2004-01-01
Recent research has shown that adaptive neural-based control systems are very effective in restoring stability and control of an aircraft in the presence of damage or failures. The application of an adaptive neural network within a flight-critical control system requires a thorough and proven process to ensure safe and proper flight operation. Unique testing tools have been developed as part of a process to perform verification and validation (V&V) of the real-time adaptive neural networks used in a recent adaptive flight control system, and to evaluate the performance of the online-trained neural networks. The tools will help in certification by the FAA and in the successful deployment of neural network based adaptive controllers in safety-critical applications. The process to perform verification and validation is evaluated against a typical neural adaptive controller and the results are discussed.
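The testing tools themselves are not described in the abstract; as one hedged illustration of the kind of runtime check such tools can support, a monitor might track the moving average of the adaptive network's prediction error against a bound (the window size and bound below are arbitrary illustrative values, not part of the described process):

```python
from collections import deque

class PredictionErrorMonitor:
    """Flags when the moving average of the adaptive network's prediction error
    exceeds a bound (window size and bound are illustrative values only)."""
    def __init__(self, window: int = 50, bound: float = 0.05):
        self.errors = deque(maxlen=window)
        self.bound = bound

    def update(self, predicted: float, measured: float) -> bool:
        self.errors.append(abs(predicted - measured))
        avg = sum(self.errors) / len(self.errors)
        return avg > self.bound            # True -> raise a V&V flag

monitor = PredictionErrorMonitor()
for pred, meas in [(0.10, 0.11), (0.20, 0.19), (0.30, 0.45)]:
    print(monitor.update(pred, meas))
```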
Integrating interface slicing into software engineering processes
NASA Technical Reports Server (NTRS)
Beck, Jon
1993-01-01
Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.
Ground System Architectures Workshop GMSEC SERVICES SUITE (GSS): an Agile Development Story
NASA Technical Reports Server (NTRS)
Ly, Vuong
2017-01-01
The GMSEC (Goddard Mission Services Evolution Center) Services Suite (GSS) is a collection of tools and software services, along with a robust customizable web-based portal, that enables the user to capture, monitor, report, and analyze system-wide GMSEC data. Given our plug-and-play architecture and the need for rapid system development, we opted to follow the Scrum Agile methodology for software development. Being one of the first few projects to implement the Agile methodology at NASA GSFC, in this presentation we will present our approaches, tools, successes, and challenges in implementing this methodology. The GMSEC architecture provides a scalable, extensible ground and flight system for existing and future missions. GMSEC comes with a robust Application Programming Interface (GMSEC API) and a core set of Java-based GMSEC components that facilitate the development of a GMSEC-based ground system. Over the past few years, we have seen an uptick in the number of customers who are moving from a native desktop application environment to a web-based environment, particularly for data monitoring and analysis. We also see a need to provide separation of the business logic from the GUI display for our Java-based components and to consolidate all the GUI displays into one interface. This combination of separation and consolidation brings immediate value to a GMSEC-based ground system through increased ease of data access via a uniform interface, built-in security measures, centralized configuration management, and ease of feature extensibility.
Error-rate prediction for programmable circuits: methodology, tools and studied cases
NASA Astrophysics Data System (ADS)
Velazco, Raoul
2013-05-01
This work presents an approach to predict the error rates due to Single Event Upsets (SEU) occurring in programmable circuits as a consequence of the impact of energetic particles present in the environment in which the circuits operate. For a chosen application, the error rate is predicted by combining the results obtained from radiation ground testing with the results of fault injection campaigns performed off-beam, during which huge numbers of SEUs are injected during the execution of the studied application. The goal of this strategy is to obtain accurate results about different applications' error rates without using particle accelerator facilities, thus significantly reducing the cost of the sensitivity evaluation. As a case study, this methodology was applied to a complex processor, the PowerPC 7448, executing a program issued from a real space application, and to a crypto-processor application implemented in an SRAM-based FPGA and accepted to be embedded in the payload of a scientific satellite of NASA. The accuracy of predicted error rates was confirmed by comparing, for the same circuit and application, predictions with measurements issued from radiation ground testing performed at the Cyclone cyclotron of the Heavy Ion Facility (HIF) at Louvain-la-Neuve (Belgium).
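In the fault-injection literature this kind of prediction is commonly expressed as the product of the statically measured cross-section and the injection-derived error probability; the notation below is a common convention added for readability, not an equation quoted from the paper:

```latex
\sigma_{\mathrm{app}} \;=\; \sigma_{\mathrm{SEU}} \times \tau_{\mathrm{inj}},
\qquad
\tau_{\mathrm{err}} \;=\; \sigma_{\mathrm{app}} \times \Phi
```

Here σ_SEU is the static cross-section obtained from radiation ground testing, τ_inj is the fraction of injected upsets that produced an application-level error during the off-beam fault injection campaign, and Φ is the particle flux of the target operational environment.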
Shea, Beverley J; Grimshaw, Jeremy M; Wells, George A; Boers, Maarten; Andersson, Neil; Hamel, Candyce; Porter, Ashley C; Tugwell, Peter; Moher, David; Bouter, Lex M
2007-02-15
Our objective was to develop an instrument to assess the methodological quality of systematic reviews, building upon previous tools, empirical evidence and expert consensus. A 37-item assessment tool was formed by combining 1) the enhanced Overview Quality Assessment Questionnaire (OQAQ), 2) a checklist created by Sacks, and 3) three additional items recently judged to be of methodological importance. This tool was applied to 99 paper-based and 52 electronic systematic reviews. Exploratory factor analysis was used to identify underlying components. The results were considered by methodological experts using a nominal group technique aimed at item reduction and design of an assessment tool with face and content validity. The factor analysis identified 11 components. From each component, one item was selected by the nominal group. The resulting instrument was judged to have face and content validity. A measurement tool for the 'assessment of multiple systematic reviews' (AMSTAR) was developed. The tool consists of 11 items and has good face and content validity for measuring the methodological quality of systematic reviews. Additional studies are needed with a focus on the reproducibility and construct validity of AMSTAR, before strong recommendations can be made on its use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hugo, Jacques
The software application is called "HFE-Trace". This is an integrated method and tool for the management of Human Factors Engineering analyses and related data. Its primary purpose is to support the coherent and consistent application of the nuclear industry's best practices for human factors engineering work. The software is a custom Microsoft® Access® application. The application is used (in conjunction with other tools such as spreadsheets, checklists and normal documents where necessary) to collect data on the design of a new nuclear power plant from subject matter experts and other sources. This information is then used to identify potential system and functional breakdowns of the intended power plant design. This information is expanded by developing extensive descriptions of all functions, as well as system performance parameters, operating limits and constraints, and operational conditions. Once these have been verified, the human factors elements are added to each function, including intended operator role, function allocation considerations, prohibited actions, primary task categories, and primary work station. In addition, the application includes a computational method to assess a number of factors (such as system and process complexity, workload, environmental conditions, procedures, and regulations) that may shape operator performance. This is a unique methodology based upon principles described in NUREG/CR-3331 ("A methodology for allocating nuclear power plant control functions to human or automatic control") and it results in a semi-quantified allocation of functions to three or more levels of automation for a conceptual automation system. The aggregate of all this information is then linked to the Task Analysis section of the application, where the existing information on all operator functions is transformed into task information and ultimately into design requirements for Human-System Interfaces and Control Rooms. This final step includes assessment of methods to prevent potential operator errors.
Tool Support for Software Lookup Table Optimization
Wilcox, Chris; Strout, Michelle Mills; Bieman, James M.
2011-01-01
A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
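As a hedged illustration of the basic transformation this kind of tool automates, the sketch below replaces calls to a costly elementary function with a precomputed table lookup; the table size and nearest-entry lookup are assumptions, and Mesa's actual domain profiling, error analysis, and code generation are not shown.

```python
# Hedged sketch of the core LUT idea; parameters are illustrative assumptions.
import math

class SinLUT:
    """Approximate sin(x) on [0, 2*pi) with a precomputed table."""
    def __init__(self, size=4096):
        self.size = size
        self.step = 2.0 * math.pi / size
        self.table = [math.sin(i * self.step) for i in range(size)]

    def __call__(self, x):
        # Nearest-entry lookup trades accuracy for speed (fuzzy reuse).
        idx = int((x % (2.0 * math.pi)) / self.step) % self.size
        return self.table[idx]

fast_sin = SinLUT()
x = 1.234
print(fast_sin(x), math.sin(x))  # approximation vs. exact value
```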
Support Tool in the Diagnosis of Sales Price of Dental Plans
NASA Astrophysics Data System (ADS)
de Oliveira, Raquel A. F.; Lóscio, Bernadette F.; Pinheiro, Plácido Rogério
Formatting a price table to be used by a company is an activity that cannot be performed only empirically. Statistical and actuarial methodologies are increasingly and widely applied by companies, primarily in the health plan business. The growing use of these techniques gives the managers of these companies more security and lower risk exposure while assisting them in making decisions. The aim of this paper is to present a tool, developed in Java and PL/PgSQL, for calculating the price of dental health plans.
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
A variety of artificial intelligence techniques which could be used with regard to NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user defined configuration.
[Customer and patient satisfaction. An appropriate management tool in hospitals?].
Pawils, S; Trojan, A; Nickel, S; Bleich, C
2012-09-01
Recently, the concept of patient satisfaction has been established as an essential part of the quality management of hospitals. Despite the concept's lack of theoretical and methodological foundations, patient surveys on subjective hospital experiences contribute immensely to the improvement of hospitals. What needs to be considered critically in this context is the concept of customer satisfaction for patients, the theoretical integration of empirical results, the reduction of false satisfaction indications and the application of risk-adjusted versus naïve benchmarking of data. This paper aims to contribute to the theoretical discussion of the topic and to build a basis for planning methodologically sound patient surveys.
A computer simulator for development of engineering system design methodologies
NASA Technical Reports Server (NTRS)
Padula, S. L.; Sobieszczanski-Sobieski, J.
1987-01-01
A computer program designed to simulate and improve engineering system design methodology is described. The simulator mimics the qualitative behavior and data couplings occurring among the subsystems of a complex engineering system. It eliminates the engineering analyses in the subsystems by replacing them with judiciously chosen analytical functions. With the cost of analysis eliminated, the simulator is used for experimentation with a large variety of candidate algorithms for multilevel design optimization to choose the best ones for the actual application. Thus, the simulator serves as a development tool for multilevel design optimization strategy. The simulator concept, implementation, and status are described and illustrated with examples.
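A minimal sketch of the simulator idea follows: each subsystem analysis is replaced by a cheap analytic surrogate, and a simple coordination loop resolves the data couplings. The surrogate functions and coupling structure are illustrative assumptions, not the functions used in the cited simulator.

```python
# Hedged sketch: subsystem analyses replaced by analytic surrogates so that
# coordination/optimization strategies can be exercised cheaply.
def aero(x, y_struct):   # surrogate standing in for an aerodynamics analysis
    return 1.0 + 0.1 * x - 0.05 * y_struct

def struct(x, y_aero):   # surrogate standing in for a structures analysis
    return 0.5 * x + 0.2 * y_aero

def coupled_system(x, iters=20):
    """Fixed-point iteration over the coupled surrogate subsystems."""
    y_a, y_s = 0.0, 0.0
    for _ in range(iters):
        y_a = aero(x, y_s)
        y_s = struct(x, y_a)
    return y_a, y_s

print(coupled_system(2.0))  # converged coupled outputs for design variable x = 2
```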
Ensemble: an Architecture for Mission-Operations Software
NASA Technical Reports Server (NTRS)
Norris, Jeffrey; Powell, Mark; Fox, Jason; Rabe, Kenneth; Shu, IHsiang; McCurdy, Michael; Vera, Alonso
2008-01-01
Ensemble is the name of an open architecture for, and a methodology for the development of, spacecraft mission operations software. Ensemble is also potentially applicable to the development of non-spacecraft mission-operations- type software. Ensemble capitalizes on the strengths of the open-source Eclipse software and its architecture to address several issues that have arisen repeatedly in the development of mission-operations software: Heretofore, mission-operations application programs have been developed in disparate programming environments and integrated during the final stages of development of missions. The programs have been poorly integrated, and it has been costly to develop, test, and deploy them. Users of each program have been forced to interact with several different graphical user interfaces (GUIs). Also, the strategy typically used in integrating the programs has yielded serial chains of operational software tools of such a nature that during use of a given tool, it has not been possible to gain access to the capabilities afforded by other tools. In contrast, the Ensemble approach offers a low-risk path towards tighter integration of mission-operations software tools.
Development of a nursing handoff tool: a web-based application to enhance patient safety.
Goldsmith, Denise; Boomhower, Marc; Lancaster, Diane R; Antonelli, Mary; Kenyon, Mary Anne Murphy; Benoit, Angela; Chang, Frank; Dykes, Patricia C
2010-11-13
Dynamic and complex clinical environments present many challenges for effective communication among health care providers. The omission of accurate, timely, easily accessible vital information by health care providers significantly increases risk of patient harm and can have devastating consequences for patient care. An effective nursing handoff supports the standardized transfer of accurate, timely, critical patient information, as well as continuity of care and treatment, resulting in enhanced patient safety. The Brigham and Women's/Faulkner Hospital Healthcare Information Technology Innovation Program (HIP) is supporting the development of a web based nursing handoff tool (NHT). The goal of this project is to develop a "proof of concept" handoff application to be evaluated by nurses on the inpatient intermediate care units. The handoff tool would enable nurses to use existing knowledge of evidence-based handoff methodology in their everyday practice to improve patient care and safety. In this paper, we discuss the results of nursing focus groups designed to identify the current state of handoff practice as well as the functional and data element requirements of a web based Nursing Handoff Tool (NHT).
Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe
2018-01-17
Recent advances in remotely sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used (the sensors, the UAV, and the flight operations), the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remote sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such a methodology would provide researchers, agronomists, and UAV practitioners with reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications.
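As a hedged illustration of the kind of correlation analysis described, the sketch below derives a vegetation index from two multispectral bands and correlates it with ground infestation scores; all values are simulated assumptions, not the study's data.

```python
# Hedged sketch: NDVI from simulated band reflectances vs. ground ratings.
import numpy as np

red = np.array([0.12, 0.18, 0.25, 0.30, 0.35])      # red reflectance per plot
nir = np.array([0.62, 0.55, 0.48, 0.40, 0.33])      # near-infrared reflectance
ground_score = np.array([0.5, 1.0, 2.0, 2.5, 3.0])  # assumed infestation rating

ndvi = (nir - red) / (nir + red)
r = np.corrcoef(ndvi, ground_score)[0, 1]
print(f"NDVI vs. ground score correlation: {r:.2f}")
```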
Archetype modeling methodology.
Moner, David; Maldonado, José Alberto; Robles, Montserrat
2018-03-01
Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.
Selected Tether Applications Cost Model
NASA Technical Reports Server (NTRS)
Keeley, Michael G.
1988-01-01
Diverse cost-estimating techniques and data combined into single program. Selected Tether Applications Cost Model (STACOM 1.0) is interactive accounting software tool providing means for combining several independent cost-estimating programs into fully-integrated mathematical model capable of assessing costs, analyzing benefits, providing file-handling utilities, and putting out information in text and graphical forms to screen, printer, or plotter. Program based on Lotus 1-2-3, version 2.0. Developed to provide clear, concise traceability and visibility into methodology and rationale for estimating costs and benefits of operations of Space Station tether deployer system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Phillips, Jason, E-mail: jp1@tiscali.co.uk
Evaluating sustainability from EIA-based assessments has been problematic at best. This is due to the use of reductionist and qualitative approaches, which are dependent upon the perspective of the assessor(s). Therefore, a more rigorous and holistic approach is required to evaluate sustainability in a more consistent way. In this paper, a matrix-based methodology to assess the indicated level and nature of sustainability for any project, policy, indicators, legislation, regulation, or other framework is described. The Geocybernetic Assessment Matrix (GAM) is designed to evaluate the level and nature of sustainability or unsustainability occurring in respect of the fundamental and complex geocybernetic paradigms. The GAM method is described in detail with respect to the theory behind it and the methodology. The GAM is then demonstrated using an appropriate case study — Part 1 of the UK Climate Change Act (2008) concerning carbon budgets and targets. The results indicate that Part 1 of the Act may not achieve the desired goals in contributing towards sustainable development through the stated mechanisms for carbon budgets and targets. The paper then discusses the broader context of the GAM with respect to the core themes evident in the development and application of the GAM: sustainability science; sustainability assessment; application value of the GAM; and future research and development. - Highlights: • A new assessment tool called the Geocybernetic Assessment Matrix (GAM) is described. • GAM evaluates the level and nature of sustainability or unsustainability. • GAM demonstrated by application to Part 1 of the UK Climate Change Act (CCA). • Part 1 of CCA has significant flaws in achieving a sustainable pathway. • GAM offers a potentially useful tool for quantitatively evaluating sustainability.
Application of Six Sigma/CAP methodology: controlling blood-product utilization and costs.
Neri, Robert A; Mason, Cindy E; Demko, Lisa A
2008-01-01
Blood-product components are a limited commodity whose cost is rising. Many patients benefit from their use, but patients who receive transfusions face an unnecessary increased risk for developing infections; fatal, febrile, or allergic reactions; and circulatory overload. To improve patient care, safety, and resource stewardship, transfusion practices must be evaluated for appropriateness (Wilson et al. 2002). A multihospital health system undertook a rigorous study of blood-product utilization patterns and management processes to address cost-control problems in the organization. The system leveraged two process improvement tools widely implemented outside of the healthcare industry: (1) Six Sigma methodology to identify blood-utilization drivers and to standardize transfusion practice, and (2) change acceleration process model to drive effective change. The initiative resulted in a decreased rate of inappropriate transfusions of packed red blood cell from 16 percent to less than 5 percent, improved clinician use of a blood-component order form, establishment of internal benchmarks, enhanced laboratory-to-clinician communication, and better blood-product expense control. The project further demonstrated how out-of-industry tools and methodologies can be adopted, adapted, and systematically applied to generate positive change (Black and Revere 2006).
Proceedings: USACERL/ASCE First Joint Conference on Expert Systems, 29-30 June 1988
1989-01-01
[Table of contents fragments: KNOWLEDGE-BASED GRAPHIC DIALOGUES; MAD, AN EXPERT ...] ... methodology of inductive shallow modeling was developed. Inductive systems may become powerful shallow modeling tools applicable to a large class of ... analysis was conducted using a statistical package, Trajectories. Four different types of relationships were analyzed: linear, logarithmic, power, and ...
ERIC Educational Resources Information Center
Fatokun, J. O.; Fatokun, K. V. F.
2013-01-01
In this paper, we present the concept of problem-based learning as a tool for learning Mathematics and Chemistry, and in fact, all sciences, using life situations or simulated scenario. The methodology involves some level of brain storming. Here, active learning takes place and knowledge gained by students either way through a collaborative…
Daniel J. Leduc; Thomas G. Matney; Keith L. Belli; V. Clark Baldwin
2001-01-01
Artificial neural networks (NN) are becoming a popular estimation tool. Because they require no assumptions about the form of a fitting function, they can free the modeler from reliance on parametric approximating functions that may or may not satisfactorily fit the observed data. To date there have been few applications in forestry science, but as better NN software...
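A minimal sketch of the idea, assuming simulated data and an off-the-shelf network, is shown below: a small neural network is fitted to noisy observations without specifying a parametric curve form. The variable names and the underlying relationship are illustrative assumptions, not forestry data from the study.

```python
# Hedged sketch: nonparametric fitting with a small neural network.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
diameter = rng.uniform(5, 50, 200).reshape(-1, 1)                  # assumed predictor
volume = 0.002 * diameter[:, 0] ** 2.4 + rng.normal(0, 0.5, 200)   # unknown "true" form

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(diameter, volume)
print(model.predict([[30.0]]))  # estimated response for a 30-unit diameter
```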
NASA Astrophysics Data System (ADS)
Atlas, R. M.
2016-12-01
Observing System Simulation Experiments (OSSEs) provide an effective method for evaluating the potential impact of proposed new observing systems, as well as for evaluating trade-offs in observing system design, and in developing and assessing improved methodology for assimilating new observations. As such, OSSEs can be an important tool for determining science and user requirements, and for incorporating these requirements into the planning for future missions. Detailed OSSEs have been conducted at NASA/ GSFC and NOAA/AOML in collaboration with Simpson Weather Associates and operational data assimilation centers over the last three decades. These OSSEs determined correctly the quantitative potential for several proposed satellite observing systems to improve weather analysis and prediction prior to their launch, evaluated trade-offs in orbits, coverage and accuracy for space-based wind lidars, and were used in the development of the methodology that led to the first beneficial impacts of satellite surface winds on numerical weather prediction. In this talk, the speaker will summarize the development of OSSE methodology, early and current applications of OSSEs and how OSSEs will evolve in order to enhance mission planning.
Delvosalle, Christian; Fievez, Cécile; Pipart, Aurore; Debray, Bruno
2006-03-31
In the frame of the Accidental Risk Assessment Methodology for Industries (ARAMIS) project, this paper aims at presenting the work carried out in the part of the project devoted to the definition of accident scenarios. This topic is a key-point in risk assessment and serves as basis for the whole risk quantification. The first result of the work is the building of a methodology for the identification of major accident hazards (MIMAH), which is carried out with the development of generic fault and event trees based on a typology of equipment and substances. The term "major accidents" must be understood as the worst accidents likely to occur on the equipment, assuming that no safety systems are installed. A second methodology, called methodology for the identification of reference accident scenarios (MIRAS) takes into account the influence of safety systems on both the frequencies and possible consequences of accidents. This methodology leads to identify more realistic accident scenarios. The reference accident scenarios are chosen with the help of a tool called "risk matrix", crossing the frequency and the consequences of accidents. This paper presents both methodologies and an application on an ethylene oxide storage.
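A hedged sketch of the "risk matrix" step, which crosses a frequency class with a consequence class to select reference scenarios, is given below; the class labels and matrix entries are illustrative assumptions, not the ARAMIS calibration.

```python
# Hedged sketch of a frequency x consequence risk matrix; labels and entries
# are illustrative assumptions.
FREQ_CLASSES = ["very unlikely", "unlikely", "possible", "frequent"]
CONS_CLASSES = ["minor", "serious", "severe", "catastrophic"]

# Row = frequency class, column = consequence class.
RISK_MATRIX = [
    ["low",    "low",    "medium",    "high"],
    ["low",    "medium", "medium",    "high"],
    ["medium", "medium", "high",      "very high"],
    ["medium", "high",   "very high", "very high"],
]

def risk_level(freq, cons):
    return RISK_MATRIX[FREQ_CLASSES.index(freq)][CONS_CLASSES.index(cons)]

print(risk_level("possible", "severe"))  # -> 'high'
```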
A framework for assessing the adequacy and effectiveness of software development methodologies
NASA Technical Reports Server (NTRS)
Arthur, James D.; Nance, Richard E.
1990-01-01
Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-01
... an improved understanding of methodological challenges associated with integrating existing tools and... methodological challenges associated with integrating existing tools (e.g., climate models, downscaling... sensitivity to methodological choices such as different approaches for downscaling global climate change...
Validation of Predictors of Fall Events in Hospitalized Patients With Cancer.
Weed-Pfaff, Samantha H; Nutter, Benjamin; Bena, James F; Forney, Jennifer; Field, Rosemary; Szoka, Lynn; Karius, Diana; Akins, Patti; Colvin, Christina M; Albert, Nancy M
2016-10-01
A seven-item cancer-specific fall risk tool (Cleveland Clinic Capone-Albert [CC-CA] Fall Risk Score) was shown to have a strong concordance index for predicting falls; however, validation of the model is needed. The aims of this study were to validate that the CC-CA Fall Risk Score, made up of six factors, predicts falls in patients with cancer and to determine if the CC-CA Fall Risk Score performs better than the Morse Fall Tool. Using a prospective, comparative methodology, data were collected from electronic health records of patients hospitalized for cancer care in four hospitals. Risk factors from each tool were recorded, when applicable. Multivariable models were created to predict the probability of a fall. A concordance index for each fall tool was calculated. The CC-CA Fall Risk Score provided higher discrimination than the Morse Fall Tool in predicting fall events in patients hospitalized for cancer management.
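For illustration only, the sketch below fits a multivariable logistic model to simulated data and reports a concordance index (equivalent to the area under the ROC curve); the six predictors are hypothetical stand-ins, not the CC-CA items or study data.

```python
# Hedged sketch: multivariable fall-probability model + concordance index.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 6))                       # six hypothetical risk factors
logit = 0.8 * X[:, 0] + 0.5 * X[:, 1] - 0.3 * X[:, 2]
fell = rng.random(500) < 1 / (1 + np.exp(-logit))   # simulated fall events

model = LogisticRegression(max_iter=1000).fit(X, fell)
prob = model.predict_proba(X)[:, 1]
print("concordance index:", round(roc_auc_score(fell, prob), 3))
```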
NASA Technical Reports Server (NTRS)
Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)
2002-01-01
This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.
NASA Technical Reports Server (NTRS)
Ling, Lisa
2014-01-01
For the purpose of performing safety analysis and risk assessment for a potential off-nominal atmospheric reentry resulting in vehicle breakup, a synthesis of trajectory propagation coupled with thermal analysis and the evaluation of node failure is required to predict the sequence of events, the timeline, and the progressive demise of spacecraft components. To provide this capability, the Simulation for Prediction of Entry Article Demise (SPEAD) analysis tool was developed. The software and methodology have been validated against actual flights, telemetry data, and validated software, and safety/risk analyses were performed for various programs using SPEAD. This report discusses the capabilities, modeling, validation, and application of the SPEAD analysis tool.
DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS
2017-10-01
University of Southern California, October 2017 final report (contract FA8750-15-C-0203). ... of this project was to investigate the state-of-the-art in design and optimization of single-flux quantum (SFQ) logic circuits, e.g., RSFQ and ERSFQ
Translational benchmark risk analysis
Piegorsch, Walter W.
2010-01-01
Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283
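A minimal worked example of the benchmark-dose idea is sketched below: a fitted dose-response model is inverted at a chosen benchmark response (BMR). The one-hit model form and the parameter value are assumptions for illustration, not a recommended risk-assessment choice.

```python
# Hedged sketch: benchmark dose for a one-hit model, extra risk = 1 - exp(-b*d).
import math

def bmd_one_hit(b, bmr=0.10):
    """Dose at which extra risk 1 - exp(-b*d) equals the benchmark response."""
    return -math.log(1.0 - bmr) / b

b_fitted = 0.05  # assumed fitted potency parameter (per mg/kg-day)
print(f"BMD10 = {bmd_one_hit(b_fitted):.2f} mg/kg-day")
```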
Brown, Ashleigh F.; Upjohn, Melissa
2018-01-01
The majority of horses, donkeys and mules (equids) are in low- and middle-income countries, where they remain a key source of labour in the construction, agriculture and tourism industries, as well as supporting households daily through transporting people and staple goods. Globally, approximately 600 million people depend on working equids for their livelihood. Safeguarding the welfare of these animals is essential for them to work, as well as for the intrinsic value of the animal’s quality of life. In order to manage animal welfare, it must be measured. Over the past decade, welfare assessment methodologies have emerged for different species, more recently for equids. We present the Standardised Equine-Based Welfare Assessment Tool (SEBWAT) for working equids. The tool is unique, in that it has been applied in practice by a non-governmental organisation (NGO) for six years across Low-Middle-Income Countries (LMICs). We describe the revision of the tool from an original to a second version, the tool methodology and user training process and how data collection and analysis have been conducted. We describe its application at scale, where it has been used more than 71,000 times in 11 countries. Case study examples are given from the tool being used for a needs assessment in Guatemala and monitoring welfare change in Jordan. We conclude by describing the main benefits and limitations for how the tool could be applied by others on working equids in LMICs and how it may develop in the future. PMID:29466391
Cost benefit analysis for smart grid projects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karali, Nihan; He, Gang; Mauzey, J
The U.S. is unusual in that a definition of the term "smart grid" was written into legislation, appearing in the Energy Independence and Security Act (2007). When the recession called for stimulus spending and the American Recovery and Reinvestment Act (ARRA, 2009) was passed, a framework already existed for identification of smart grid projects. About $4.5B of the U.S. Department of Energy's (U.S. DOE's) $37B allocation from ARRA was directed to smart grid projects of two types, investment grants and demonstrations. Matching funds from other sources more than doubled the total value of ARRA-funded smart grid projects. The Smart Grid Investment Grant Program (SGIG) consumed all but $620M of the ARRA funds, which was available for the 32 projects in the Smart Grid Demonstration Program (SGDP, or demonstrations). Given the economic potential of these projects and the substantial investments required, there was keen interest in estimating the benefits of the projects (i.e., quantifying and monetizing the performance of smart grid technologies). Common method development and application, data collection, and analysis to calculate and publicize the benefits were central objectives of the program. For this purpose standard methods and a software tool, the Smart Grid Computational Tool (SGCT), were developed by U.S. DOE and a spreadsheet model was made freely available to grantees and other analysts. The methodology was intended to define smart grid technologies or assets, the mechanisms by which they generate functions, their impacts and, ultimately, their benefits. The SGCT and its application to the Demonstration Projects are described, and actual projects in Southern California and in China are selected to test and illustrate the tool. The usefulness of the methodology and tool for international analyses is then assessed.
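As a hedged sketch of the benefit-monetization arithmetic underlying tools of this kind, the example below discounts assumed annual benefits against an assumed capital cost; the figures and discount rate are illustrative, not SGCT inputs or outputs.

```python
# Hedged sketch: net present value of an assumed smart grid investment.
def npv(rate, cash_flows):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

capital_cost = -2_000_000            # assumed year-0 investment
annual_benefit = [350_000] * 10      # assumed avoided outages, deferred capacity, ...
print(f"NPV = {npv(0.07, [capital_cost] + annual_benefit):,.0f} USD")
```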
Predicting Great Lakes fish yields: tools and constraints
Lewis, C.A.; Schupp, D.H.; Taylor, W.W.; Collins, J.J.; Hatch, Richard W.
1987-01-01
Prediction of yield is a critical component of fisheries management. The development of sound yield prediction methodology and the application of the results of yield prediction are central to the evolution of strategies to achieve stated goals for Great Lakes fisheries and to the measurement of progress toward those goals. Despite general availability of species yield models, yield prediction for many Great Lakes fisheries has been poor due to the instability of the fish communities and the inadequacy of available data. A host of biological, institutional, and societal factors constrain both the development of sound predictions and their application to management. Improved predictive capability requires increased stability of Great Lakes fisheries through rehabilitation of well-integrated communities, improvement of data collection, data standardization and information-sharing mechanisms, and further development of the methodology for yield prediction. Most important is the creation of a better-informed public that will in turn establish the political will to do what is required.
Gómez, Aina G; Ondiviela, Bárbara; Puente, Araceli; Juanes, José A
2015-05-15
This work presents a standard and unified procedure for assessment of environmental risks at the contaminant source level in port aquatic systems. Using this method, port managers and local authorities will be able to hierarchically classify environmental hazards and proceed with the most suitable management actions. This procedure combines rigorously selected parameters and indicators to estimate the environmental risk of each contaminant source based on its probability, consequences and vulnerability. The spatio-temporal variability of multiple stressors (agents) and receptors (endpoints) is taken into account to provide accurate estimations for application of precisely defined measures. The developed methodology is tested on a wide range of different scenarios via application in six European ports. The validation process confirms its usefulness, versatility and adaptability as a management tool for port water quality in Europe and worldwide. Copyright © 2015 Elsevier Ltd. All rights reserved.
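A minimal sketch of ranking contaminant sources by combining probability, consequence, and vulnerability scores is shown below; the multiplicative scheme and the source names are assumptions, not the published procedure.

```python
# Hedged sketch: ranking contaminant sources from three assumed 0..1 scores.
def source_risk(probability, consequence, vulnerability):
    """All inputs scored on 0..1; a higher product means a higher priority."""
    return probability * consequence * vulnerability

sources = {
    "ship-repair runoff": source_risk(0.7, 0.8, 0.6),
    "fuel bunkering":     source_risk(0.4, 0.9, 0.5),
    "stormwater outfall": source_risk(0.9, 0.3, 0.4),
}
for name, r in sorted(sources.items(), key=lambda kv: -kv[1]):
    print(f"{name:20s} {r:.2f}")
```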
DNA Sequencing in Cultural Heritage.
Vai, Stefania; Lari, Martina; Caramelli, David
2016-02-01
During the last three decades, DNA analysis of degraded samples has proven to be an important research tool in anthropology, archaeozoology, molecular evolution, and population genetics. Its application to topics such as determining the species origin of prehistoric and historic objects, identifying famous individuals, and characterizing samples important for historical, archeological, or evolutionary reconstructions also gives paleogenetics an important role in the enhancement of cultural heritage. Rapid methodological improvements in recent years have led to a revolution that permits the recovery of even complete genomes from highly degraded samples, with the possibility of going back 400,000 years for samples from temperate regions and 700,000 years for permafrozen remains, and of analyzing even more recent material that has been subjected to harsh biochemical treatments. Here we review the different methodological approaches used so far for the molecular analysis of degraded samples and their application in some case studies.
A methodological approach for designing a usable ontology-based GUI in healthcare.
Lasierra, N; Kushniruk, A; Alesanco, A; Borycki, E; García, J
2013-01-01
This paper presents a methodological approach to the design and evaluation of an interface for an ontology-based system used for designing care plans for monitoring patients at home. In order to define the care plans, physicians need a tool for creating instances of the ontology and configuring some rules. Our purpose is to develop an interface to allow clinicians to interact with the ontology. Although ontology-driven applications do not necessarily present the ontology in the user interface, it is our hypothesis that showing selected parts of the ontology in a "usable" way could enhance clinician's understanding and make easier the definition of the care plans. Based on prototyping and iterative testing, this methodology combines visualization techniques and usability methods. Preliminary results obtained after a formative evaluation indicate the effectiveness of suggested combination.
ERIC Educational Resources Information Center
Yarime, Masaru; Tanaka, Yuko
2012-01-01
Assessment tools influence incentives to higher education institutions by encouraging them to move towards sustainability. A review of 16 sustainability assessment tools was conducted to examine the recent trends in the issues and methodologies addressed in assessment tools quantitatively and qualitatively. The characteristics of the current…
34 CFR 75.210 - General selection criteria.
Code of Federal Regulations, 2011 CFR
2011-07-01
... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...
34 CFR 75.210 - General selection criteria.
Code of Federal Regulations, 2014 CFR
2014-07-01
... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...
34 CFR 75.210 - General selection criteria.
Code of Federal Regulations, 2012 CFR
2012-07-01
... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...
34 CFR 75.210 - General selection criteria.
Code of Federal Regulations, 2010 CFR
2010-07-01
... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...
34 CFR 75.210 - General selection criteria.
Code of Federal Regulations, 2013 CFR
2013-07-01
... for research activities, and the use of appropriate theoretical and methodological tools, including... project implementation, and the use of appropriate methodological tools to ensure successful achievement...
De Belvis, Antonio Giulio; Specchia, Maria Lucia; Ferriero, Anna Maria; Capizzi, Silvio
2017-01-01
Risk management is a key tool in Clinical Governance. Our project aimed to define, share, apply and measure the impact of tools and methodologies for the continuous improvement of quality of care, especially in relation to the multi-disciplinary and integrated management of the hyperglycemic patient in hospital settings. A training project, coordinated by a scientific board of experts in diabetes and health management and an Expert Meeting with representatives of all the participating centers was launched in 2014. The project involved eight hospitals through the organization of meetings with five managers and 25 speakers, including diabetologists, internists, pharmacists and nurses. The analysis showed a wide variability in the adoption of tools and processes towards a comprehensive and coordinated management of hyperglycemic patients.
NASA Astrophysics Data System (ADS)
Huang, Xiao
2006-04-01
Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of the research investigation is to provide decision-making bodies and the practicing engineer a design process and tool box for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV for the first time to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required towards a generic (configuration independent) hands-on flight vehicle conceptual design synthesis methodology. This process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology & algorithm for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed and documented with focus on design transparency, physical understanding and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. As illustrated, the HTHL SAV hands-on design methodology offers growth potential in that the same methodology can be continually updated and extended to other SAV configuration concepts, such as the Vertical Takeoff and Vertical Landing (VTVL) SAV class. Having developed, validated and calibrated the methodology for HTHL designs in the 'hands-on' mode, the report provides an outlook how the methodology will be integrated into a prototype computerized design synthesis software AVDS-PrADOSAV in a follow-on step.
System learning approach to assess sustainability and ...
This paper presents a methodology that combines the power of an Artificial Neural Network and Information Theory to forecast variables describing the condition of a regional system. The novelty and strength of this approach is in the application of Fisher information, a key method in Information Theory, to preserve trends in the historical data and prevent overfitting of projections. The methodology was applied to demographic, environmental, food and energy consumption, and agricultural production in the San Luis Basin regional system in Colorado, U.S.A. These variables are important for tracking conditions in human and natural systems. However, available data are often so far out of date that they limit the ability to manage these systems. Results indicate that the approaches developed provide viable tools for forecasting outcomes with the aim of assisting management toward sustainable trends. This methodology is also applicable for modeling different scenarios in other dynamic systems. Indicators are indispensable for tracking conditions in human and natural systems; however, available data are sometimes so far out of date that they limit the ability to gauge system status. Techniques like regression and simulation are not sufficient because system characteristics have to be modeled, which risks oversimplification of complex dynamics. This work presents a methodology combining the power of an Artificial Neural Network and Information Theory to capture patterns in a real dyna
NASA Astrophysics Data System (ADS)
Servigne, S.; Gripay, Y.; Pinarer, O.; Samuel, J.; Ozgovde, A.; Jay, J.
2016-09-01
Concerning energy consumption and monitoring architectures, our first goal is to develop a sustainable declarative monitoring architecture for lower energy consumption, taking into account the monitoring system itself. Our second goal is to develop theoretical and practical tools to model, explore and exploit heterogeneous data from various sources in order to understand a phenomenon such as the energy consumption of a smart building versus its inhabitants' social behaviours. We focus on a generic model for data acquisition campaigns based on the concept of a generic sensor. The concept of a generic sensor is centered on acquired data and on their inherent multi-dimensional structure, to support complex domain-specific or field-oriented analysis processes. We consider that a methodological breakthrough may pave the way to deep understanding of voluminous and heterogeneous scientific data sets. Our use case concerns the energy efficiency of buildings, to understand the relationship between physical phenomena and user behaviors. The aim of this paper is to present our methodology and results concerning the architecture and user-centric tools.
Satellite Vulnerability to Space Debris- An Improved 3D Risk Assessment Methodology
NASA Astrophysics Data System (ADS)
Grassi, Lilith; Destefanis, Roberto; Tiboldo, Francesca; Donath, Therese; Winterboer, Arne; Evand, Leanne; Janovsky, Rolf; Kempf, Scott; Rudolph, Martin; Schafer, Frank; Gelhaus, Johannes
2013-08-01
The work described in the present paper, performed as a part of the P²-ROTECT project, presents an enhanced method to evaluate satellite vulnerability to micrometeoroids and orbital debris (MMOD), using the ESABASE2/Debris tool (developed under ESA contract). Starting from the estimation of induced failures on spacecraft (S/C) components and from the computation of lethal impacts (with an energy leading to the loss of the satellite), and considering the equipment redundancies and interactions between components, the debris-induced S/C functional impairment is assessed. The developed methodology, illustrated through its application to a case study satellite, includes the capability to estimate the number of failures on internal components, overcoming the limitations of current tools which do not allow propagating the debris cloud inside the S/C. The ballistic limit of internal equipment behind a sandwich panel structure is evaluated through the implementation of the Schäfer Ryan Lambert (SRL) Ballistic Limit Equation (BLE).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kapuscinski, A.R.; Hallerman, E.M.
Among the many methodologies encompassing biotechnology in aquaculture, this report addresses: the production of genetically modified aquatic organisms (aquatic GMOs) by gene transfer, chromosome set manipulation, or hybridization or protoplast fusion between species; new health management tools, including DNA-based diagnostics and recombinant DNA vaccines; marker-assisted selection; cryopreservation; and stock marking. These methodologies offer a wide range of potential economic benefits for aquaculture by providing improved or new means to affect the mix of necessary material inputs, enhance production efficiency, or improve product quality. Advances in aquaculture through biotechnology could stimulate growth of the aquaculture industry to supply a larger proportion of consumer demand, and thereby reduce pressure on natural stocks from over-harvest. Judicious application of gamete cryopreservation and chromosome set manipulations to achieve sterilization could reduce environmental risks of some aquaculture operations. Given the significant losses to disease in many aquaculture enterprises, potential benefits of DNA-based health management tools are very high and appear to pose no major environmental risks or social concerns.
Rocket-Based Combined Cycle Engine Technology Development: Inlet CFD Validation and Application
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Yungster, S.
1996-01-01
A CFD methodology has been developed for inlet analyses of Rocket-Based Combined Cycle (RBCC) Engines. A full Navier-Stokes analysis code, NPARC, was used in conjunction with pre- and post-processing tools to obtain a complete description of the flow field and integrated inlet performance. This methodology was developed and validated using results from a subscale test of the inlet to a RBCC 'Strut-Jet' engine performed in the NASA Lewis 1 x 1 ft. supersonic wind tunnel. Results obtained from this study include analyses at flight Mach numbers of 5 and 6 for super-critical operating conditions. These results showed excellent agreement with experimental data. The analysis tools were also used to obtain pre-test performance and operability predictions for the RBCC demonstrator engine planned for testing in the NASA Lewis Hypersonic Test Facility. This analysis calculated the baseline fuel-off internal force of the engine which is needed to determine the net thrust with fuel on.
Esposito, Pasquale; Dal Canton, Antonio
2014-11-06
Evaluation and improvement of the quality of care provided to patients are of crucial importance in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment and clinical audit. The clinical audit consists of measuring a clinical outcome or a process against well-defined standards, set on the principles of evidence-based medicine, in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, overall the present evidence is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principle of clinical audit, focusing on experiences performed in nephrology settings.
A novel methodology for in-process monitoring of flow forming
NASA Astrophysics Data System (ADS)
Appleby, Andrew; Conway, Alastair; Ion, William
2017-10-01
Flow forming (FF) is an incremental cold working process with near-net-shape forming capability. Failures by fracture due to high deformation can be unexpected and sometimes catastrophic, causing tool damage. If process failures can be identified in real time, an automatic cut-out could prevent costly tool damage. Sound and vibration monitoring is well established and commercially viable in the machining sector to detect current and incipient process failures, but not for FF. A broad-frequency microphone was used to record the sound signature of the manufacturing cycle for a series of FF parts. Parts were flow formed using single and multiple passes, and flaws were introduced into some of the parts to simulate the presence of spontaneously initiated cracks. The results show that this methodology is capable of identifying both introduced defects and spontaneous failures during flow forming. Further investigation is needed to categorise and identify different modes of failure and identify further potential applications in rotary forming.
Yang, Fang; Liao, Xiangzhi; Tian, Yuan; Li, Guiying
2017-04-01
Exosomes, nanovesicles secreted by most types of cells, exist in virtually all bodily fluids. Their rich nucleic acid and protein content makes them potentially valuable biomarkers for noninvasive molecular diagnostics. They also show promise, after further development, to serve as a drug delivery system. Unfortunately, existing exosome separation technologies, such as ultracentrifugation and methods incorporating magnetic beads, are time-consuming, laborious and yield exosomes of only low purity. Thus, a more effective separation method is highly desirable. Microfluidic platforms are ideal tools for exosome separation, since they enable fast, cost-efficient, portable and precise processing of nanoparticles and small volumes of liquid samples. Recently, several microfluidic-based exosome separation technologies have been studied. In this article, the advantages of the most recent technologies, as well as their limitations, challenges and potential uses in novel microfluidic exosome separation and collection applications, are reviewed. This review outlines the uses of new powerful microfluidic exosome detection tools for biologists and clinicians, as well as exosome separation tools for microfluidic engineers. Current challenges of exosome separation methodologies are also described, in order to highlight areas for future research and development. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
[Handbook for the preparation of evidence-based documents. Tools derived from scientific knowledge].
Carrión-Camacho, M R; Martínez-Brocca, M A; Paneque-Sánchez-Toscano, I; Valencia-Martín, R; Palomino-García, A; Muñoz-Durán, C; Tamayo-López, M J; González-Eiris-Delgado, C; Otero-Candelera, R; Ortega-Ruiz, F; Sobrino-Márquez, J M; Jiménez-García-Bóveda, R; Fernández-Quero, M; Campos-Pareja, A M
2013-01-01
This handbook is intended to be an accessible, easy-to-consult guide to help professionals produce or adapt Evidence-Based Documents. Such documents will help standardize both clinical practice and decision-making, the quality always being monitored in such a way that established references are complied with. The Evidence-Based Health Care Committee, a member of the "Virgen del Rocío" University Hospital quality structure, proposed the preparation of a handbook to produce Evidence-Based Documents including: a description of products, characteristics, qualities, uses, methodology of production, and application scope of every one of them. The handbook consists of seven Evidence-Based tools, one chapter on critical analysis methodology of scientific literature, one chapter with internet resources, and some appendices with different assessment tools. This handbook provides general practitioners with a great opportunity to improve quality and a guideline to standardize clinical healthcare, and managers with a strategy to promote and encourage the development of documents in an effort to reduce clinical practice variability, as well as giving patients the opportunity of taking part in planning their own care. Copyright © 2011 SECA. Published by Elsevier España. All rights reserved.
Hors, Cora; Goldberg, Anna Carla; Almeida, Ederson Haroldo Pereira de; Babio Júnior, Fernando Galan; Rizzo, Luiz Vicente
2012-01-01
To introduce a program for the management of scientific research in a general hospital, employing the business management tools Lean Six Sigma and PMBOK for project management in this area. The Lean Six Sigma methodology was used to improve the management of the institution's scientific research through a specific tool (DMAIC) for the identification, implementation and posterior analysis, based on PMBOK practices, of the solutions found. We present our solutions for the management of institutional research projects at the Sociedade Beneficente Israelita Brasileira Albert Einstein. The solutions were classified into four headings: people, processes, systems and organizational culture. A preliminary analysis of these solutions showed them to be completely or partially compliant with the processes described in the PMBOK Guide. In this post facto study, we verified that the solutions drawn from a project using Lean Six Sigma methodology and based on PMBOK enabled the improvement of our processes dealing with the management of scientific research carried out in the institution and constitute a model to contribute to the search for innovative science management solutions by other institutions dealing with scientific research in Brazil.
Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach
NASA Astrophysics Data System (ADS)
Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh
2017-03-01
Retreat mining is always accompanied by a great number of accidents, and most of them are due to roof fall. Therefore, the development of methodologies to evaluate roof fall susceptibility (RFS) seems essential. Ghasemi et al. (2012) proposed a systematic methodology to assess roof fall risk during retreat mining based on the classic risk assessment approach. The main shortcoming of this method is that it ignores subjective uncertainties arising from the linguistic input values of some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remedy this deficiency and improve the method, a novel methodology to assess RFS using a fuzzy approach is presented in this paper. The application of the fuzzy approach provides an effective tool to handle these subjective uncertainties. Furthermore, the fuzzy analytical hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors during the development of this method. The methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of the Tabas Central Mine (TCM), Iran. The results indicate that this methodology is effective and efficient in assessing RFS.
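To illustrate how linguistic inputs can be handled, the sketch below maps a crisp factor value onto overlapping fuzzy sets with triangular membership functions; the factor, breakpoints, and labels are assumptions, not the paper's membership functions.

```python
# Hedged sketch: triangular fuzzy memberships for one assumed roof-quality factor.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def roof_quality_memberships(rmr):   # rmr: assumed rock mass rating of the roof
    return {
        "poor":     tri(rmr, 0, 25, 50),
        "moderate": tri(rmr, 30, 50, 70),
        "good":     tri(rmr, 50, 75, 100),
    }

print(roof_quality_memberships(42))  # partially 'poor', partially 'moderate'
```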
Gómez-García, Francisco; Ruano, Juan; Gay-Mimbrera, Jesus; Aguilar-Luque, Macarena; Sanz-Cabanillas, Juan Luis; Alcalde-Mellado, Patricia; Maestre-López, Beatriz; Carmona-Fernández, Pedro Jesús; González-Padilla, Marcelino; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz
2017-12-01
No gold standard exists to assess methodological quality of systematic reviews (SRs). Although Assessing the Methodological Quality of Systematic Reviews (AMSTAR) is widely accepted for analyzing quality, the ROBIS instrument has recently been developed. This study aimed to compare the capacity of both instruments to capture the quality of SRs concerning psoriasis interventions. Systematic literature searches were undertaken on relevant databases. For each review, methodological quality and bias risk were evaluated using the AMSTAR and ROBIS tools. Descriptive and principal component analyses were conducted to describe similarities and discrepancies between both assessment tools. We classified 139 intervention SRs as displaying high/moderate/low methodological quality and as high/low risk of bias. A high risk of bias was detected for most SRs classified as displaying high or moderate methodological quality by AMSTAR. When comparing ROBIS result profiles, responses to domain 4 signaling questions showed the greatest differences between bias risk assessments, whereas domain 2 items showed the least. When considering SRs published about psoriasis, methodological quality remains suboptimal, and the risk of bias is elevated, even for SRs exhibiting high methodological quality. Furthermore, the AMSTAR and ROBIS tools may be considered as complementary when conducting quality assessment of SRs. Copyright © 2017 Elsevier Inc. All rights reserved.
Partitioning an object-oriented terminology schema.
Gu, H; Perl, Y; Halper, M; Geller, J; Kuo, F; Cimino, J J
2001-07-01
Controlled medical terminologies are increasingly becoming strategic components of various healthcare enterprises. However, the typical medical terminology can be difficult to exploit due to its extensive size and high density. The schema of a medical terminology offered by an object-oriented representation is a valuable tool in providing an abstract view of the terminology, enhancing comprehensibility and making it more usable. However, schemas themselves can be large and unwieldy. We present a methodology for partitioning a medical terminology schema into manageably sized fragments that promote increased comprehension. Our methodology has a refinement process for the subclass hierarchy of the terminology schema. The methodology is carried out by a medical domain expert in conjunction with a computer. The expert is guided by a set of three modeling rules, which guarantee that the resulting partitioned schema consists of a forest of trees. This makes it easier to understand and consequently use the medical terminology. The application of our methodology to the schema of the Medical Entities Dictionary (MED) is presented.
Horizon Mission Methodology - A tool for the study of technology innovation and new paradigms
NASA Technical Reports Server (NTRS)
Anderson, John L.
1993-01-01
The Horizon Mission (HM) methodology was developed to provide a means of identifying and evaluating highly innovative, breakthrough technology concepts (BTCs) and for assessing their potential impact on advanced space missions. The methodology is based on identifying new capabilities needed by hypothetical 'horizon' space missions having performance requirements that cannot be met even by extrapolating known space technologies. Normal human evaluation of new ideas such as BTCs appears to be governed (and limited) by 'inner models of reality' defined as paradigms. Thus, new ideas are evaluated by old models. This paper describes the use of the HM Methodology to define possible future paradigms that would provide alternatives to evaluation by current paradigms. The approach is to represent a future paradigm by a set of new BTC-based capabilities - called a paradigm abstract. The paper describes methods of constructing and using the abstracts for evaluating BTCs for space applications and for exploring the concept of paradigms and paradigm shifts as a representation of technology innovation.
Cristy Watkins; Lynne M. Westphal
2015-01-01
In this paper, we describe our application of Ostrom et al.'s ADICO syntax, a grammatical tool based in the Institutional Analysis and Development framework, to a study of ecological restoration decision making in the Chicago Wilderness region. As this method has only been used to look at written policy and/or extractive natural resource management systems, our...
Predicting Networked Strategic Behavior via Machine Learning and Game Theory
2015-01-13
The funding for this project was used to develop basic models, methodology and algorithms for the application of machine learning and related tools to settings in which strategic behavior is central. Among the topics studied was the development of simple behavioral models explaining and predicting human subject behavior in networked strategic experiments from prior work. These included experiments in biased voting and networked trading, among others.
Development of a Neural Network-Based Renewable Energy Forecasting Framework for Process Industries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Soobin; Ryu, Jun-Hyung; Hodge, Bri-Mathias
2016-06-25
This paper presents a neural network-based forecasting framework for photovoltaic power (PV) generation as a decision-supporting tool to employ renewable energies in the process industry. The applicability of the proposed framework is illustrated by comparing its performance against other methodologies such as linear and nonlinear time series modelling approaches. A case study of an actual PV power plant in South Korea is presented.
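A minimal sketch, assuming synthetic hourly data in place of the South Korean plant measurements, of the kind of neural-network forecaster such a framework compares against time series baselines: a small feed-forward network predicts the next hour of PV output from the three previous hours.

```python
# Illustrative sketch only (not the paper's framework): feed-forward network
# forecasting next-hour PV output from lagged values of a synthetic series.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
hours = np.arange(24 * 60)
# Daylight-shaped signal plus noise, standing in for measured PV power.
pv = np.clip(np.sin((hours % 24 - 6) / 12 * np.pi), 0, None) + 0.05 * rng.normal(size=hours.size)

lags = 3
X = np.column_stack([pv[i:i - lags] for i in range(lags)])  # hours t-3 .. t-1
y = pv[lags:]                                               # hour t
split = int(0.8 * len(y))

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
print("test RMSE:", np.sqrt(np.mean((model.predict(X[split:]) - y[split:]) ** 2)))
```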
Source Code Vulnerability Assessment Methodology
2008-09-01
Information Sciences Directorate's (CISD) Center for Intrusion Detection Monitoring and Protection (CIMP) to reverse engineer tools captured by...application terminates. It is possible, however, to write past the buffer boundary in a controlled way such that the value for EIP can be overwritten with...vulnerability is widely known and has been exploited in the past. This work provides a proof-of-concept for the ARL/SLAD CAM and exploit development process
System Analysis Applied to Autonomy: Application to Human-Rated Lunar/Mars Landers
NASA Technical Reports Server (NTRS)
Young, Larry A.
2006-01-01
System analysis is an essential technical discipline for the modern design of spacecraft and their associated missions. Specifically, system analysis is a powerful aid in identifying and prioritizing the required technologies needed for mission and/or vehicle development efforts. Maturation of intelligent systems technologies, and their incorporation into spacecraft systems, are dictating the development of new analysis tools, and incorporation of such tools into existing system analysis methodologies, in order to fully capture the trade-offs of autonomy on vehicle and mission success. A "system analysis of autonomy" methodology will be outlined and applied to a set of notional human-rated lunar/Mars lander missions toward answering these questions: 1. what is the optimum level of vehicle autonomy and intelligence required? and 2. what are the specific attributes of an autonomous system implementation essential for a given surface lander mission/application in order to maximize mission success? Future human-rated lunar/Mars landers, though nominally under the control of their crew, will, nonetheless, be highly automated systems. These automated systems will range from mission/flight control functions, to vehicle health monitoring and prognostication, to life-support and other "housekeeping" functions. The optimum degree of autonomy afforded to these spacecraft systems/functions has profound implications from an exploration system architecture standpoint.
Analytical framework and tool kit for SEA follow-up
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran
2009-04-15
Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.
Structured representation for requirements and specifications
NASA Technical Reports Server (NTRS)
Cohen, Gerald C.; Fisher, Gene; Frincke, Deborah; Wolber, Dave
1991-01-01
This document was generated in support of NASA contract NAS1-18586, Design and Validation of Digital Flight Control Systems suitable for Fly-By-Wire Applications, Task Assignment 2. Task 2 is associated with a formal representation of requirements and specifications. In particular, this document contains results associated with the development of a Wide-Spectrum Requirements Specification Language (WSRSL) that can be used to express system requirements and specifications in both stylized and formal forms. Included with this development are prototype tools to support the specification language. In addition a preliminary requirements specification methodology based on the WSRSL has been developed. Lastly, the methodology has been applied to an Advanced Subsonic Civil Transport Flight Control System.
Using Counter-Stories to Challenge Stock Stories about Traveller Families
ERIC Educational Resources Information Center
D'Arcy, Kate
2017-01-01
Critical Race Theory (CRT) is formed from a series of different methodological tools to expose and address racism and discrimination. Counter-stories are one of these tools. This article considers the potential of counter-stories as a methodological, theoretical and practical tool to analyse existing educational inequalities for Traveller…
NASA Astrophysics Data System (ADS)
Bedrina, T.; Parodi, A.; Quarati, A.; Clematis, A.; Rebora, N.; Laiosa, D.
2012-04-01
One of the critical issues in Hydro-Meteorological Research (HMR) is better exploitation of data archives from a multidisciplinary perspective. Different Earth science databases offer a huge amount of observational data, which often need to be assembled, processed, and combined according to HM scientists' needs. Cooperation between scientists active in HMR and in Information and Communication Technologies (ICT) is essential for developing innovative tools and applications for manipulating, aggregating, and re-arranging heterogeneous information in a flexible way. This paper describes an application devoted to the collection and integration of HM datasets, originated by public or private sources and freely exposed via web service APIs. The application uses mashup concepts, a technology that has recently become very popular in many fields (Chow S.-W., 2007). Mashup means combining data and/or programs published by external online sources into an integrated experience. It appears to be a promising methodology for responding to the many data-related activities in which HM researchers are involved daily (e.g., finding and retrieving high-volume data; learning formats and developing readers; extracting parameters; performing filtering and masking; developing analysis and visualization tools). The specific case study of the recent extreme rainfall event that occurred over Genoa, Italy, on 4 November 2011 is presented through the integration of semi-professional weather observational networks as a freely available data source in addition to official weather networks.
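A minimal sketch of the mashup idea, with hypothetical payloads standing in for the web-service responses of an official gauge network and a semi-professional station feed: two rainfall series are aligned on their timestamps into one integrated table.

```python
# Sketch under stated assumptions: both record lists are hypothetical JSON-style
# payloads already fetched from external APIs; the integration step merges them.
import pandas as pd

official = [  # hypothetical payload from an official web-service API
    {"time": "2011-11-04T10:00", "rain_mm": 42.0},
    {"time": "2011-11-04T11:00", "rain_mm": 78.5},
]
amateur = [   # hypothetical payload from a semi-professional station feed
    {"time": "2011-11-04T10:00", "rain_mm": 39.2},
    {"time": "2011-11-04T11:00", "rain_mm": 81.0},
]

df = pd.merge(
    pd.DataFrame(official).rename(columns={"rain_mm": "official_mm"}),
    pd.DataFrame(amateur).rename(columns={"rain_mm": "amateur_mm"}),
    on="time", how="outer",
).sort_values("time")
print(df)
```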
NASA Astrophysics Data System (ADS)
Grubert, Emily; Siders, Anne
2016-09-01
Digitally-aided reviews of large bodies of text-based information, such as academic literature, are growing in capability but are not yet common in environmental fields. Environmental sciences and studies can benefit from application of digital tools to create comprehensive, replicable, interdisciplinary reviews that provide rapid, up-to-date, and policy-relevant reports of existing work. This work reviews the potential for applications of computational text mining and analysis tools originating in the humanities to environmental science and policy questions. Two process-oriented case studies of digitally-aided environmental literature reviews and meta-analyses illustrate potential benefits and limitations. A medium-sized, medium-resolution review (∼8000 journal abstracts and titles) focuses on topic modeling as a rapid way to identify thematic changes over time. A small, high-resolution review (∼300 full text journal articles) combines collocation and network analysis with manual coding to synthesize and question empirical field work. We note that even small digitally-aided analyses are close to the upper limit of what can be done manually. Established computational methods developed in humanities disciplines and refined by humanities and social science scholars to interrogate large bodies of textual data are applicable and useful in environmental sciences but have not yet been widely applied. Two case studies provide evidence that digital tools can enhance insight. Two major conclusions emerge. First, digital tools enable scholars to engage large literatures rapidly and, in some cases, more comprehensively than is possible manually. Digital tools can confirm manually identified patterns or identify additional patterns visible only at a large scale. Second, digital tools allow for more replicable and transparent conclusions to be drawn from literature reviews and meta-analyses. The methodological subfields of digital humanities and computational social sciences will likely continue to create innovative tools for analyzing large bodies of text, providing opportunities for interdisciplinary collaboration with the environmental fields.
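An illustrative sketch of the medium-resolution workflow described above (not the case study's pipeline): scikit-learn's LDA applied to a handful of stand-in abstracts, printing the top words per topic as a rapid way to surface themes across a review corpus.

```python
# Minimal topic-modelling sketch; the documents are stand-ins, a real review
# would feed thousands of abstracts through the same steps.
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

abstracts = [
    "groundwater recharge urban aquifer contamination",
    "climate adaptation policy coastal flooding governance",
    "aquifer recharge modelling urban water quality",
    "flooding risk policy adaptation stakeholder engagement",
]
vec = CountVectorizer()
X = vec.fit_transform(abstracts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

terms = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = [terms[i] for i in comp.argsort()[-4:][::-1]]  # highest-weight words
    print(f"topic {k}: {', '.join(top)}")
```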
Cornejo, E; Fungairiño, S G; Barandica, J M; Serrano, J M; Zorrilla, J M; Gómez, T; Zapata, F J; Acosta, F J
2016-01-15
Improving the efficiency of management in protected areas is imperative in a generalized context of limited conservation budgets. However, this is overlooked due to flaws in problem definition, general disregard for cost information, and a lack of suitable tools for measuring costs and management quality. This study describes an innovative methodological framework, implemented in the web application SIGEIN, focused on maximizing the quality of management against its costs, establishing an explicit justification for any decision. The tool integrates, with this aim, a procedure for prioritizing management objects according to a conservation value, modified by a functional criterion; a project management module; and a module for management of continuous assessment. This appraisal associates the relevance of the conservation targets, the efficacy of the methods employed, both resource and personnel investments, and the resulting costs. Preliminary results of a prototypical SIGEIN application on the Site of Community Importance Chafarinas Islands are included. Copyright © 2015 Elsevier Ltd. All rights reserved.
Vanegas, Fernando; Weiss, John; Gonzalez, Felipe
2018-01-01
Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a (UAV) remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used—the sensors, the UAV, and the flight operations—the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remote sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such methodology would provide researchers, agronomists, and UAV practitioners reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications. PMID:29342101
França da Silva, Anne Kastelianne; Penachini da Costa de Rezende Barbosa, Marianne; Marques Vanderlei, Franciele; Destro Christofaro, Diego Giuliano; Marques Vanderlei, Luiz Carlos
2016-05-01
The use of heart rate variability as a tool capable of discriminating individuals with diabetes mellitus is still little explored, as its use has been limited to comparing those with and without the disease. Thus, the purpose of this study was to verify the use of heart rate variability as a tool for diagnostic and prognostic evaluation in persons with diabetes and to identify whether there are cutoff points generated from the use of this tool in these individuals. A search was conducted in the electronic databases MEDLINE, Cochrane Library, Web of Science, EMBASE, and LILACS, from the oldest records until January 2015, by means of descriptors related to the target condition, evaluated tool, and evaluation method. All the studies were evaluated for methodological quality using the QUADAS-2 instrument. Eight studies were selected. In general, the studies showed that heart rate variability is useful for discriminating cardiac autonomic neuropathy in persons with diabetes, and that sample entropy, the SD1/SD2 indices, SDANN, HF, and the slope of TFC have better discriminatory power to detect autonomic dysfunction, with sensitivity and specificity values ranging from 72% to 100% and 71% to 97%, respectively. Although there are methodological differences in the indices used, in general this tool demonstrated good sensitivity and specificity and can be used as an additional and/or complementary tool to the conventional autonomic tests, in order to obtain safer and more effective diagnosis, contributing to better risk stratification of these patients. © 2016 Wiley Periodicals, Inc.
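Two of the cited indices can be computed directly from an RR-interval series. The sketch below (hypothetical values, not the review's data) derives the Poincaré SD1 and SD2 descriptors and their ratio.

```python
# Sketch of two of the cited indices (not the review's code): Poincaré SD1 and
# SD2 from a stand-in series of RR intervals in milliseconds.
import numpy as np

rr = np.array([812, 830, 845, 790, 805, 820, 840, 815, 800, 825], dtype=float)  # hypothetical
diff = np.diff(rr)

sd1 = np.sqrt(np.var(diff, ddof=1) / 2)           # short-term (beat-to-beat) variability
sd2 = np.sqrt(2 * np.var(rr, ddof=1) - sd1 ** 2)  # long-term variability
print(f"SD1 = {sd1:.1f} ms, SD2 = {sd2:.1f} ms, SD1/SD2 = {sd1 / sd2:.2f}")
```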
Using quality assessment tools to critically appraise ageing research: a guide for clinicians.
Harrison, Jennifer Kirsty; Reid, James; Quinn, Terry J; Shenkin, Susan Deborah
2017-05-01
Evidence based medicine tells us that we should not accept published research at face value. Even research from established teams published in the highest impact journals can have methodological flaws, biases and limited generalisability. The critical appraisal of research studies can seem daunting, but tools are available to make the process easier for the non-specialist. Understanding the language and process of quality assessment is essential when considering or conducting research, and is also valuable for all clinicians who use published research to inform their clinical practice. We present a review written specifically for the practising geriatrician. This considers how quality is defined in relation to the methodological conduct and reporting of research. Having established why quality assessment is important, we present and critique tools which are available to standardise quality assessment. We consider five study designs: RCTs, non-randomised studies, observational studies, systematic reviews and diagnostic test accuracy studies. Quality assessment for each of these study designs is illustrated with an example of published cognitive research. The practical applications of the tools are highlighted, with guidance on their strengths and limitations. We signpost educational resources and offer specific advice for use of these tools. We hope that all geriatricians become comfortable with critical appraisal of published research and that use of the tools described in this review - along with awareness of their strengths and limitations - become a part of teaching, journal clubs and practice. © The Author 2016. Published by Oxford University Press on behalf of the British Geriatrics Society.
Computational Design of Functionalized Metal–Organic Framework Nodes for Catalysis
2017-01-01
Recent progress in the synthesis and characterization of metal–organic frameworks (MOFs) has opened the door to an increasing number of possible catalytic applications. The great versatility of MOFs creates a large chemical space, whose thorough experimental examination becomes practically impossible. Therefore, computational modeling is a key tool to support, rationalize, and guide experimental efforts. In this outlook we survey the main methodologies employed to model MOFs for catalysis, and we review selected recent studies on the functionalization of their nodes. We pay special attention to catalytic applications involving natural gas conversion. PMID:29392172
Computer-Aided Analysis of Patents for Product Technology Maturity Forecasting
NASA Astrophysics Data System (ADS)
Liang, Yanhong; Gan, Dequan; Guo, Yingchun; Zhang, Peng
Product technology maturity forecasting is vital for any enterprise seeking to seize the opportunity for innovation and remain competitive over the long term. The Theory of Inventive Problem Solving (TRIZ) is acknowledged both as a systematic methodology for innovation and as a powerful tool for technology forecasting. Based on TRIZ, the state of the art in assessing the technology maturity of products and the limits of its application are discussed. With the application of text mining and patent analysis technologies, this paper proposes a computer-aided approach for product technology maturity forecasting that can overcome the shortcomings of current methods.
Application of lean thinking to health care: issues and observations
Joosten, Tom; Bongers, Inge; Janssen, Richard
2009-01-01
Background: Incidents and quality problems are a prime cause of health care leaders' calls to redesign health care delivery. One of the concepts used is lean thinking. Yet, lean often leads to resistance. Also, there is a lack of high quality evidence supporting lean premises. In this paper, we present an overview of lean thinking and its application to health care. Development, theory and application of lean thinking to health care: Lean thinking evolved from a tool designed to improve operational shop-floor performance at an automotive manufacturer to a management approach with both operational and sociotechnical aspects. Sociotechnical dynamics have until recently not received much attention. At the same time, a balanced approach might lead to a situation where operational and sociotechnical improvements are mutually reinforcing. Application to health care has been limited and focussed mainly on operational aspects using original lean tools. A more integrative approach would be to pay more attention to sociotechnical dynamics of lean implementation efforts. Also, the need to use the original lean tools may be limited, because health care may have different instruments and tools already in use that are in line with lean thinking principles. Discussion: We believe lean thinking has the potential to improve health care delivery. At the same time, there are methodological and practical considerations that need to be taken into account. Otherwise, lean implementation will be superficial and fail, adding to existing resistance and making it more difficult to improve health care in the long term. PMID:19696048
Multi-criteria decision analysis in environmental sciences: ten years of applications and trends.
Huang, Ivy B; Keisler, Jeffrey; Linkov, Igor
2011-09-01
Decision-making in environmental projects requires consideration of trade-offs between socio-political, environmental, and economic impacts and is often complicated by various stakeholder views. Multi-criteria decision analysis (MCDA) emerged as a formal methodology to combine available technical information and stakeholder values to support decisions in many fields and can be especially valuable in environmental decision making. This study reviews environmental applications of MCDA. Over 300 papers published between 2000 and 2009 reporting MCDA applications in the environmental field were identified through a series of queries in the Web of Science database. The papers were classified by their environmental application area and by decision or intervention type. In addition, the papers were also classified by the MCDA methods used in the analysis (analytic hierarchy process, multi-attribute utility theory, and outranking). The results suggest that there has been significant growth in environmental applications of MCDA over the last decade across all environmental application areas. Multiple MCDA tools have been successfully used for environmental applications. Even though the use of specific methods and tools varies across application areas and geographic regions, our review of a few papers where several methods were used in parallel on the same problem indicates that the recommended course of action does not vary significantly with the method applied. Published by Elsevier B.V.
INTEGRATION OF POLLUTION PREVENTION TOOLS
A prototype computer-based decision support system was designed to provide small businesses with an integrated pollution prevention methodology. Preliminary research involved compilation of an inventory of existing pollution prevention tools (i.e., methodologies, software, etc.),...
Water flow algorithm decision support tool for travelling salesman problem
NASA Astrophysics Data System (ADS)
Kamarudin, Anis Aklima; Othman, Zulaiha Ali; Sarim, Hafiz Mohd
2016-08-01
This paper discusses the role of a Decision Support Tool (DST) for the Travelling Salesman Problem (TSP) in helping researchers working in this area obtain better results from the proposed algorithm. A study was conducted using the Rapid Application Development (RAD) model as the methodology, which includes requirement planning, user design, construction, and cutover. The Water Flow Algorithm (WFA) with an improved initialization technique is used as the proposed algorithm for evaluating effectiveness against TSP cases. The DST evaluation comprised usability testing covering system use, quality of information, quality of interface, and overall satisfaction. The evaluation is needed to determine whether the tool can assist users in making decisions to solve TSP problems with the proposed algorithm. Statistical results show the ability of the tool to help researchers conduct experiments on the WFA with the improved TSP initialization.
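For orientation only, and not the Water Flow Algorithm itself: the sketch below builds a nearest-neighbour tour of the kind such initialization improvements typically replace or refine, shown on a few hypothetical city coordinates.

```python
# Constructive TSP initialization sketch (nearest neighbour), used here only to
# illustrate what "initialization technique" means; coordinates are hypothetical.
import math

cities = [(0, 0), (3, 1), (1, 4), (5, 5), (2, 2)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbour_tour(points, start=0):
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist(points[last], points[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

tour = nearest_neighbour_tour(cities)
length = sum(dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
             for i in range(len(tour)))
print("tour:", tour, "length:", round(length, 2))
```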
Promoting climate literacy through social engagement: the Green Ninja Project
NASA Astrophysics Data System (ADS)
Cordero, E. C.; Todd, A.
2012-12-01
One of the challenges of communicating climate change to younger audiences is the disconnect between global issues and local impacts. The Green Ninja is a climate-action superhero that aims to energize young people about climate science through media and social engagement tools. In this presentation, we'll highlight two of the tools designed to help K-12 students implement appropriate local mitigation strategies. A mobile phone application builds and supports a social community around taking action at local businesses regarding themes such as food, packaging and energy efficiency. An energy efficiency contest in local schools utilizes smart meter technology to provide feedback on household energy use and conservation. These tools are supported by films and lesson plans that link formal and informal education channels. The effectiveness of these methodologies as tools to engage young people in climate science and action will be discussed.
GetReal in network meta-analysis: a review of the methodology.
Efthimiou, Orestis; Debray, Thomas P A; van Valkenhoef, Gert; Trelle, Sven; Panayidou, Klea; Moons, Karel G M; Reitsma, Johannes B; Shang, Aijing; Salanti, Georgia
2016-09-01
Pairwise meta-analysis is an established statistical tool for synthesizing evidence from multiple trials, but it is informative only about the relative efficacy of two specific interventions. The usefulness of pairwise meta-analysis is thus limited in real-life medical practice, where many competing interventions may be available for a certain condition and studies informing some of the pairwise comparisons may be lacking. This commonly encountered scenario has led to the development of network meta-analysis (NMA). In the last decade, several applications, methodological developments, and empirical studies in NMA have been published, and the area is thriving as its relevance to public health is increasingly recognized. This article presents a review of the relevant literature on NMA methodology aiming to pinpoint the developments that have appeared in the field. Copyright © 2016 John Wiley & Sons, Ltd. Copyright © 2016 John Wiley & Sons, Ltd.
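As a reminder of the pairwise building block that NMA generalises (a textbook formula, not the GetReal code), the sketch below pools log odds ratios from three hypothetical trials by fixed-effect inverse-variance weighting.

```python
# Fixed-effect inverse-variance pooling sketch with hypothetical trial data.
import math

log_or = [0.42, 0.15, 0.33]  # hypothetical per-trial log odds ratios
se = [0.21, 0.18, 0.25]      # hypothetical standard errors

w = [1 / s**2 for s in se]                                  # inverse-variance weights
pooled = sum(wi * yi for wi, yi in zip(w, log_or)) / sum(w)
pooled_se = math.sqrt(1 / sum(w))
print(f"pooled log OR = {pooled:.3f} "
      f"(95% CI {pooled - 1.96 * pooled_se:.3f} to {pooled + 1.96 * pooled_se:.3f})")
```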
Das, Anup Kumar; Mandal, Vivekananda; Mandal, Subhash C
2014-01-01
Extraction forms the very basic step in research on natural products for drug discovery, and a poorly optimised and planned extraction methodology can jeopardise the entire mission. The aim is to provide a vivid picture of different chemometric tools and of planning for process optimisation and method development in the extraction of botanical material, with emphasis on microwave-assisted extraction (MAE). A review of studies involving the application of chemometric tools in combination with MAE of botanical materials was undertaken in order to discover which extraction factors were significant. To optimise a response by fine-tuning those factors, experimental design, or statistical design of experiments (DoE), a core area of study in chemometrics, was then used for statistical analysis and interpretation. In this review, a brief explanation of the different aspects and methodologies related to MAE of botanical materials that were subjected to experimental design, along with some general chemometric tools and the steps involved in the practice of MAE, is presented. A detailed study of the various factors and responses involved in the optimisation is also presented. This article will assist in obtaining better insight into the chemometric strategies of process optimisation and method development, which will in turn improve decision-making in selecting influential extraction parameters. Copyright © 2013 John Wiley & Sons, Ltd.
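A minimal sketch of the DoE starting point, with hypothetical MAE factors and levels: a 2^3 full factorial run matrix of the sort that would be paired with measured extraction yields for statistical analysis.

```python
# Full factorial design sketch; factor names and levels are hypothetical, not
# drawn from the reviewed studies.
from itertools import product

factors = {
    "power_W": (300, 600),        # microwave power, low/high
    "time_min": (5, 15),          # irradiation time
    "solvent_ratio": (10, 30),    # mL solvent per g material
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, 1):
    print(f"run {i}: {run}")
```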
A technical guide to tDCS, and related non-invasive brain stimulation tools
Woods, AJ; Antal, A; Bikson, M; Boggio, PS; Brunoni, AR; Celnik, P; Cohen, LG; Fregni, F; Herrmann, CS; Kappenman, ES; Knotkova, H; Liebetanz, D; Miniussi, C; Miranda, PC; Paulus, W; Priori, A; Reato, D; Stagg, C; Wenderoth, N; Nitsche, MA
2015-01-01
Transcranial electrical stimulation (tES), including transcranial direct and alternating current stimulation (tDCS, tACS) are non-invasive brain stimulation techniques increasingly used for modulation of central nervous system excitability in humans. Here we address methodological issues required for tES application. This review covers technical aspects of tES, as well as applications like exploration of brain physiology, modelling approaches, tES in cognitive neurosciences, and interventional approaches. It aims to help the reader to appropriately design and conduct studies involving these brain stimulation techniques, understand limitations and avoid shortcomings, which might hamper the scientific rigor and potential applications in the clinical domain. PMID:26652115
Podometrics as a Potential Clinical Tool for Glomerular Disease Management.
Kikuchi, Masao; Wickman, Larysa; Hodgin, Jeffrey B; Wiggins, Roger C
2015-05-01
Chronic kidney disease culminating in end-stage kidney disease is a major public health problem costing in excess of $40 billion per year with high morbidity and mortality. Current tools for glomerular disease monitoring lack precision and contribute to poor outcome. The podocyte depletion hypothesis describes the major mechanisms underlying the progression of glomerular diseases, which are responsible for more than 80% of cases of end-stage kidney disease. The question arises of whether this new knowledge can be used to improve outcomes and reduce costs. Podocytes have unique characteristics that make them an attractive monitoring tool. Methodologies for estimating podocyte number, size, density, glomerular volume and other parameters in routine kidney biopsies, and the rate of podocyte detachment from glomeruli into urine (podometrics) now have been developed and validated. They potentially fill important gaps in the glomerular disease monitoring toolbox. The application of these tools to glomerular disease groups shows good correlation with outcome, although data validating their use for individual decision making is not yet available. Given the urgency of the clinical problem, we argue that the time has come to focus on testing these tools for application to individualized clinical decision making toward more effective progression prevention. Copyright © 2015 Elsevier Inc. All rights reserved.
A systematic review of the research evidence on cross-country features of illegal abortions
Aghaei, Farideh; Shaghaghi, Abdolreza; Sarbakhsh, Parvin
2017-01-01
Background: There are contrasting debates about abortion, and prohibitory regulations pose serious public health challenges, especially in underdeveloped and developing countries. Due to the paucity of empirical evidence, this study was conducted to explore the existing cumulative knowledge, with special focus on the applied methodology. Methods: A comprehensive review of articles published from January 1995 to December 2015 was performed. Several databases, including Embase, PubMed, Cochrane, and databases of the Iranian medical journals, were searched using combinations of relevant Medical Subject Headings (MeSH) terms and their equivalents, i.e., induced abortion, embryotomy, criminal abortion, and illegal abortion. The STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) statement was used for appraisal of the cross-sectional studies and the Consolidated Criteria for Reporting Qualitative Research (COREQ) checklist for the qualitative reports. After removal of duplicates and irrelevant publications, 36 articles remained for data analysis. Results: Wide heterogeneity was observed in the methodologies used, with no standard data collection tool. Face-to-face interviews and self-administered questionnaires were the most commonly reported data collection methods. Married and unemployed women in the 26-30 year age group with low socioeconomic backgrounds were the most typical illegal abortees in the included studies. Conclusion: Despite limitations in accessing all relevant publications and the inclusion only of reports written in English or Persian, the accumulated knowledge may be applicable to developing a potentially inclusive data collection tool and hence improving the quality of data collection and/or the application of a more robust study design in future investigations. PMID:28695098
MiRduplexSVM: A High-Performing MiRNA-Duplex Prediction and Evaluation Methodology
Karathanasis, Nestoras; Tsamardinos, Ioannis; Poirazi, Panayiota
2015-01-01
We address the problem of predicting the position of a miRNA duplex on a microRNA hairpin via the development and application of a novel SVM-based methodology. Our method combines a unique problem representation and an unbiased optimization protocol to learn from mirBase19.0 an accurate predictive model, termed MiRduplexSVM. This is the first model that provides precise information about all four ends of the miRNA duplex. We show that (a) our method outperforms four state-of-the-art tools, namely MaturePred, MiRPara, MatureBayes, MiRdup as well as a Simple Geometric Locator when applied on the same training datasets employed for each tool and evaluated on a common blind test set. (b) In all comparisons, MiRduplexSVM shows superior performance, achieving up to a 60% increase in prediction accuracy for mammalian hairpins and can generalize very well on plant hairpins, without any special optimization. (c) The tool has a number of important applications such as the ability to accurately predict the miRNA or the miRNA*, given the opposite strand of a duplex. Its performance on this task is superior to the 2nts overhang rule commonly used in computational studies and similar to that of a comparative genomic approach, without the need for prior knowledge or the complexity of performing multiple alignments. Finally, it is able to evaluate novel, potential miRNAs found either computationally or experimentally. In relation with recent confidence evaluation methods used in miRBase, MiRduplexSVM was successful in identifying high confidence potential miRNAs. PMID:25961860
MPHASYS: a mouse phenotype analysis system
Calder, R Brent; Beems, Rudolf B; van Steeg, Harry; Mian, I Saira; Lohman, Paul HM; Vijg, Jan
2007-01-01
Background Systematic, high-throughput studies of mouse phenotypes have been hampered by the inability to analyze individual animal data from a multitude of sources in an integrated manner. Studies generally make comparisons at the level of genotype or treatment thereby excluding associations that may be subtle or involve compound phenotypes. Additionally, the lack of integrated, standardized ontologies and methodologies for data exchange has inhibited scientific collaboration and discovery. Results Here we introduce a Mouse Phenotype Analysis System (MPHASYS), a platform for integrating data generated by studies of mouse models of human biology and disease such as aging and cancer. This computational platform is designed to provide a standardized methodology for working with animal data; a framework for data entry, analysis and sharing; and ontologies and methodologies for ensuring accurate data capture. We describe the tools that currently comprise MPHASYS, primarily ones related to mouse pathology, and outline its use in a study of individual animal-specific patterns of multiple pathology in mice harboring a specific germline mutation in the DNA repair and transcription-specific gene Xpd. Conclusion MPHASYS is a system for analyzing multiple data types from individual animals. It provides a framework for developing data analysis applications, and tools for collecting and distributing high-quality data. The software is platform independent and freely available under an open-source license [1]. PMID:17553167
Gao, Tingting; Wang, Xiaochang C; Chen, Rong; Ngo, Huu Hao; Guo, Wenshan
2015-04-01
Disability adjusted life year (DALY) has been widely used since the 1990s for evaluating the global and/or regional burden of diseases. As many environmental pollutants are hazardous to human health, DALY is also recognized as an indicator to quantify the health impact of environmental pollution in terms of disease burden. Based on literature reviews, this article aims to give an overview of the applicable methodologies and research directions for using DALY as a tool for quantitative assessment of environmental pollution. With an introduction of the methodological framework of DALY, the requirements on data collection and manipulation for quantifying disease burdens are summarized. Regarding environmental pollutants hazardous to human beings, health effect/risk evaluation is indispensable for transforming pollution data into disease data through exposure and dose-response analyses, which require careful selection of models and determination of parameters. Following the methodological discussions, real cases are analyzed with attention paid to chemical pollutants and pathogens usually encountered in environmental pollution. It can be seen from existing studies that DALY is advantageous over conventional environmental impact assessment for quantification and comparison of the risks resulting from environmental pollution. However, further studies are still required to standardize the methods of health effect evaluation for varied pollutants under varied circumstances before DALY calculation. Copyright © 2014 Elsevier B.V. All rights reserved.
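The core identity is DALY = YLL + YLD, with years of life lost (YLL) driven by attributable deaths and years lived with disability (YLD) by cases, disability weight, and duration. The worked sketch below uses hypothetical, undiscounted figures, not values from the reviewed studies.

```python
# Simplified, undiscounted DALY calculation; all figures are hypothetical.
deaths = 120                # attributable deaths
life_left = 25.0            # average years of life lost per death
cases = 4_000               # non-fatal cases attributable to the pollutant
disability_weight = 0.12    # severity of the health state (0..1)
duration = 2.5              # average years lived with the condition

yll = deaths * life_left
yld = cases * disability_weight * duration
print(f"YLL = {yll:.0f}, YLD = {yld:.0f}, DALY = {yll + yld:.0f}")
```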
Jaraíz, Martín; Enríquez, Lourdes; Pinacho, Ruth; Rubio, José E; Lesarri, Alberto; López-Pérez, José L
2017-04-07
A novel DFT-based Reaction Kinetics (DFT-RK) simulation approach, employed in combination with real-time data from reaction monitoring instrumentation (like UV-vis, FTIR, Raman, and 2D NMR benchtop spectrometers), is shown to provide a detailed methodology for the analysis and design of complex synthetic chemistry schemes. As an example, it is applied to the opening of epoxides by titanocene in THF, a catalytic system with abundant experimental data available. Through a DFT-RK analysis of real-time IR data, we have developed a comprehensive mechanistic model that opens new perspectives to understand previous experiments. Although derived specifically from the opening of epoxides, the prediction capabilities of the model, built on elementary reactions, together with its practical side (reaction kinetics simulations of real experimental conditions) make it a useful simulation tool for the design of new experiments, as well as for the conception and development of improved versions of the reagents. From the perspective of the methodology employed, because both the computational (DFT-RK) and the experimental (spectroscopic data) components can follow the time evolution of several species simultaneously, it is expected to provide a helpful tool for the study of complex systems in synthetic chemistry.
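A generic illustration of the reaction-kinetics half of such an analysis, under assumed rate constants and a toy A -> B -> C mechanism rather than the titanocene/epoxide system: the rate equations are integrated to give concentration profiles of the kind compared against real-time spectroscopic traces.

```python
# Reaction-kinetics simulation sketch; mechanism and rate constants are
# hypothetical, not the system studied in the paper.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 0.8, 0.2  # assumed rate constants, 1/min

def rhs(t, c):
    a, b, _ = c
    return [-k1 * a, k1 * a - k2 * b, k2 * b]  # d[A]/dt, d[B]/dt, d[C]/dt

sol = solve_ivp(rhs, (0.0, 20.0), [1.0, 0.0, 0.0], t_eval=np.linspace(0, 20, 5))
for t, (a, b, c) in zip(sol.t, sol.y.T):
    print(f"t={t:5.1f} min  [A]={a:.3f}  [B]={b:.3f}  [C]={c:.3f}")
```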
NASA Astrophysics Data System (ADS)
Ćwikła, G.; Gwiazda, A.; Banaś, W.; Monica, Z.; Foit, K.
2017-08-01
The article presents a study of the possible application of selected methods of complex system description that can support the Manufacturing Information Acquisition System (MIAS) methodology, which describes how to design a data acquisition system for collecting and processing real-time data on the functioning of a production system, as needed for company management. MIAS can enable conversion into a Cyber-Physical Production System. MIAS gathers and pre-processes data on the state of the production system, including, e.g., the realisation of production orders and the state of machines, materials, and human resources. A systematised approach and model-based development are proposed for improving the quality of the design of MIAS methodology-based complex systems supporting data acquisition in various types of companies. Graphical specification can be the baseline for any model-based development in specified areas. The possibility of applying SysML and BPMN, both UML-based languages representing different approaches to modelling the requirements, architecture, and implementation of the data acquisition system, as tools supporting the description of the required features of MIAS, was considered.
Clarke, Brydie; Swinburn, Boyd; Sacks, Gary
2016-10-13
Theories of the policy process are recommended as tools to help explain both policy stasis and change. A systematic review of the application of such theoretical frameworks within the field of obesity prevention policy was conducted. A meta-synthesis was also undertaken to identify the key influences on policy decision-making. The review identified 17 studies of obesity prevention policy underpinned by political science theories. The majority of included studies were conducted in the United States (US), with significant heterogeneity in terms of policy level (e.g., national, state) studied, areas of focus, and methodologies used. Many of the included studies were methodologically limited, in regard to rigour and trustworthiness. Prominent themes identified included the role of groups and networks, political institutions, and political system characteristics, issue framing, the use of evidence, personal values and beliefs, prevailing political ideology, and timing. The limited application of political science theories indicates a need for future theoretically based research into the complexity of policy-making and multiple influences on obesity prevention policy processes.
Preloaded joint analysis methodology for space flight systems
NASA Technical Reports Server (NTRS)
Chambers, Jeffrey A.
1995-01-01
This report contains a compilation of some of the most basic equations governing simple preloaded joint systems and discusses the more common modes of failure associated with such hardware. It is intended to provide the mechanical designer with the tools necessary for designing a basic bolted joint. Although the information presented is intended to aid in the engineering of space flight structures, the fundamentals are equally applicable to other forms of mechanical design.
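One of the most basic relations such a compilation covers is the short-form torque-preload equation T = K * F * d. The sketch below works it with hypothetical numbers and an assumed nut factor; it is an illustration, not a design value.

```python
# Short-form torque-preload relation, worked with hypothetical values.
K = 0.2          # nut factor, dimensionless (assumed, lubrication dependent)
F = 20_000.0     # desired preload, N (hypothetical)
d = 0.008        # nominal fastener diameter, m (M8)

T = K * F * d    # installation torque, N*m
print(f"target installation torque: {T:.1f} N*m")
```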
On the Use of Accelerated Aging Methods for Screening High Temperature Polymeric Composite Materials
NASA Technical Reports Server (NTRS)
Gates, Thomas S.; Grayson, Michael A.
1999-01-01
A rational approach to the problem of accelerated testing of high temperature polymeric composites is discussed. The methods provided are considered tools useful in the screening of new materials systems for long-term application to extreme environments that include elevated temperature, moisture, oxygen, and mechanical load. The need for reproducible mechanisms, indicator properties, and real-time data are outlined as well as the methodologies for specific aging mechanisms.
A New Interface Specification Methodology and its Application to Transducer Synthesis
1988-05-01
Designs are described in three domains: behavioral, structural, and physical. Within each domain, descriptive methods are distinguished by the level of abstraction they emphasize; the Gajski-Kuhn Y-chart's three axes correspond to these three domains for describing designs.
NASA Astrophysics Data System (ADS)
Di Guilmi, Corrado; Gallegati, Mauro; Landini, Simone
2017-04-01
Preface; List of tables; List of figures; 1. Introduction; Part I. Methodological Notes and Tools: 2. The state space notion; 3. The master equation; Part II. Applications to HIA Based Models: 4. Financial fragility and macroeconomic dynamics I: heterogeneity and interaction; 5. Financial fragility and macroeconomic dynamics II: learning; Part III. Conclusions: 6. Conclusive remarks; Part IV. Appendices and Complements: Appendix A: Complements to Chapter 3; Appendix B: Solving the ME to solve the ABM; Appendix C: Specifying transition rates; Index.
Quantification of groundwater recharge in urban environments.
Tubau, Isabel; Vázquez-Suñé, Enric; Carrera, Jesús; Valhondo, Cristina; Criollo, Rotman
2017-08-15
Groundwater management in urban areas requires detailed knowledge of the hydrogeological system as well as adequate tools for predicting the amount of groundwater and the evolution of water quality. In that context, a key difference between urban and natural areas lies in recharge evaluation. A large number of studies evaluating recharge in urban areas have been published since the 1990s, with no specific methodology emerging. Most of these methods show that recharge rates are generally higher in urban settings than in natural settings. Methods such as mixing ratios or groundwater modeling can be used to better estimate the relative importance of different sources of recharge and may prove to be good tools for total recharge evaluation. However, accurate evaluation of this input is difficult. The objective is to present a methodology that helps overcome those difficulties and allows the variability in space and time of recharge into urban aquifers to be quantified. Recharge calculations were initially performed by defining and applying analytical equations, and validation was assessed based on groundwater flow and solute transport modeling. The methodology is applicable to complex systems because it considers the temporal variability of all water sources. This allows managers of urban groundwater to evaluate the relative contribution of different recharge sources at the city scale by considering quantity and quality factors. The methodology is applied to the assessment of recharge sources in the Barcelona city aquifers. Copyright © 2017 Elsevier B.V. All rights reserved.
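A minimal sketch of the mixing-ratio idea mentioned above, with hypothetical chloride concentrations: the share of recharge attributable to river infiltration versus mains leakage is estimated from a two-end-member mass balance.

```python
# Two-end-member mixing sketch; concentrations are hypothetical, not from the
# Barcelona case study.
c_sample = 95.0   # mg/L chloride in the groundwater sample
c_mains = 30.0    # mg/L in the water-mains (leakage) end member
c_river = 140.0   # mg/L in the river end member

f_river = (c_sample - c_mains) / (c_river - c_mains)
print(f"river-derived fraction: {f_river:.2f}, mains-derived: {1 - f_river:.2f}")
```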
NASA Technical Reports Server (NTRS)
Generazio, Edward R.
2015-01-01
Directed Design of Experiments for Validating Probability of Detection Capability of NDE Systems (DOEPOD) Manual v.1.2. The capability of an inspection system is established by applying various methodologies to determine the probability of detection (POD). One accepted metric of an adequate inspection system is that there is 95% confidence that the POD is greater than 90% (90/95 POD). Design of experiments for validating probability of detection capability of nondestructive evaluation (NDE) systems (DOEPOD) is a methodology implemented via software to serve as a diagnostic tool providing detailed analysis of POD test data, guidance on establishing data distribution requirements, and resolution of test issues. DOEPOD relies on observed occurrences. The DOEPOD capability has been developed to provide an efficient and accurate methodology that yields observed POD and confidence bounds for both hit/miss and signal amplitude testing. DOEPOD does not assume prescribed POD logarithmic or similar functions with assumed adequacy over a wide range of flaw sizes and inspection system technologies, so multi-parameter curve fitting or model optimization approaches to generate a POD curve are not required. An application of DOEPOD in support of inspector qualification is included.
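The 90/95 criterion has a simple zero-miss form that the sketch below illustrates (it is not the DOEPOD software): if the true POD were only 0.90, the probability of observing n hits in n hit/miss trials is 0.90^n, so demonstrating 90% POD at 95% confidence requires that quantity to drop below 0.05, which first happens at n = 29.

```python
# Zero-miss 90/95 POD check; a textbook binomial argument, not DOEPOD itself.
def demonstrates_90_95(hits: int, trials: int) -> bool:
    if hits != trials:           # simplified: only the all-hits case is handled
        return False
    return 0.90 ** trials < 0.05  # chance of this outcome if POD were only 0.90

for n in (28, 29, 30):
    print(f"{n} of {n} hits demonstrates 90/95 POD:", demonstrates_90_95(n, n))
```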
Application of tolerance limits to the characterization of image registration performance.
Fedorov, Andriy; Wells, William M; Kikinis, Ron; Tempany, Clare M; Vangel, Mark G
2014-07-01
Deformable image registration is used increasingly in image-guided interventions and other applications. However, validation and characterization of registration performance remain areas that require further study. We propose an analysis methodology for deriving tolerance limits on the initial conditions for deformable registration that reliably lead to a successful registration. This approach results in a concise summary of the probability of registration failure, while accounting for the variability in the test data. The (β, γ) tolerance limit can be interpreted as a value of the input parameter that leads to a successful registration outcome in at least 100β% of cases with 100γ% confidence. The utility of the methodology is illustrated by summarizing the performance of a deformable registration algorithm evaluated in three different experimental setups of increasing complexity. Our examples are based on clinical data collected during MRI-guided prostate biopsy, registered using a publicly available deformable registration tool. The results indicate that the proposed methodology can be used to generate concise graphical summaries of the experiments, as well as a probabilistic estimate of the registration outcome for a future sample. Its use may facilitate improved objective assessment, comparison and retrospective stress-testing of deformable registration.
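One distribution-free construction of a one-sided (β, γ) tolerance limit, given here as a generic illustration rather than the authors' estimator, takes the sample minimum of n observed thresholds; the confidence that at least 100β% of future cases lie above it is 1 - β^n.

```python
# Distribution-free one-sided tolerance limit sketch: using the sample minimum
# as the limit, confidence = 1 - beta**n (generic construction, hypothetical n).
def confidence_of_min_as_limit(n: int, beta: float) -> float:
    return 1.0 - beta ** n

beta = 0.90  # required coverage: at least 90% of future cases above the limit
for n in (20, 29, 50):
    print(f"n={n}: confidence = {confidence_of_min_as_limit(n, beta):.3f}")
```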
Frederiksen, Kirsten; Lomborg, Kirsten; Beedholm, Kirsten
2015-09-01
This study takes its point of departure in an oft-voiced critique that the French philosopher Michel Foucault gives discourse priority over practice, thereby being deterministic and leaving little space for the individual to act as an agent. Based on an interpretation of the latter part of Foucault's oeuvre, we argue against this critique and provide a methodological discussion of the perception that Foucault's method constitutes, primarily, discourse analysis. We argue that it is possible to overcome this critique of Foucault's work by the application of methodological tools adapted from Foucault's later writings and his diagnosis of his own work as studies of forms of problematization. To shed light on the possibilities that this approach offers to the researcher, we present a reading of aspects of Foucault's work, with a focus on his notion of forms of problematization. Furthermore, we elaborate on concepts from his so-called genealogical period, namely 'the dispositive', strategy and tactics. Our interpretation is supported by examples from a study of the emergence of Danish nursing education, which is based on an analytical framework that we developed in the light of an interpretation of aspects of Foucault's work. © 2015 John Wiley & Sons Ltd.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munasinghe, M.; Meier, P.
1988-01-01
Given the importance of energy in modern economies, the first part of the volume is devoted to examining some of the key conceptual and analytical tools available for energy-policy analysis and planning. Policy tools and institutional frameworks that will facilitate better energy management are also discussed. Energy-policy analysis is explained, while effective energy management techniques are discussed to achieve desirable national objectives, using a selected set of policies and policy instruments. In the second part of the volume, the actual application of the principles set out earlier is explained through a case study of Sri Lanka. The monograph integrates the many aspects of the short-term programs already begun with the options for the medium to long term, and ends with the outline of a long-term strategy for Sri Lanka.
NASA Astrophysics Data System (ADS)
Chevrié, Mathieu; Farges, Christophe; Sabatier, Jocelyn; Guillemard, Franck; Pradere, Laetitia
2017-04-01
In the automotive application field, reducing electric conductor dimensions is important for decreasing the embedded mass and the manufacturing costs. It is thus essential to develop tools to optimize the wire diameter according to thermal constraints, together with protection algorithms that maintain a high level of safety. In order to develop such tools and algorithms, accurate electro-thermal models of electric wires are required. However, thermal equation solutions lead to implicit fractional transfer functions involving an exponential that cannot be embedded in an automotive on-board computer. This paper thus proposes an integer order transfer function approximation methodology based on a spatial discretization for this class of fractional transfer functions. Moreover, the H2-norm is used to minimize the approximation error. The accuracy of the proposed approach is confirmed with measured data on a 1.5 mm2 wire implemented in a dedicated test bench.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
NASA Astrophysics Data System (ADS)
Saranya, Kunaparaju; John Rozario Jegaraj, J.; Ramesh Kumar, Katta; Venkateshwara Rao, Ghanta
2016-06-01
With the increased trend in automation of the modern manufacturing industry, human intervention in routine, repetitive and data-specific activities of manufacturing is greatly reduced. In this paper, an attempt has been made to reduce the human intervention in the selection of optimal cutting tools and process parameters for metal cutting applications, using Artificial Intelligence techniques. Generally, the selection of an appropriate cutting tool and parameters in metal cutting is carried out by an experienced technician or cutting tool expert based on their knowledge or an extensive search of a huge cutting tool database. The proposed approach replaces the existing practice of physically searching for tools in databooks/tool catalogues with an intelligent knowledge-based selection system. This system employs artificial intelligence based techniques such as artificial neural networks, fuzzy logic and genetic algorithms for decision making and optimization. This intelligence-based optimal tool selection strategy was developed and implemented using MathWorks MATLAB Version 7.11.0. The cutting tool database was obtained from the tool catalogues of different tool manufacturers. This paper discusses in detail the methodology and strategies employed for selection of appropriate cutting tools and optimization of process parameters based on multi-objective optimization criteria considering material removal rate, tool life and tool cost.
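As a toy illustration of the multi-objective criteria named above (material removal rate, tool life, tool cost), the sketch below ranks hypothetical catalogue entries by a normalised weighted score. It is a simplified stand-in, not the ANN/fuzzy/GA system described in the abstract; tool names, values and weights are invented.

```python
# A minimal sketch (hypothetical data and weights) of ranking candidate
# cutting tools by a weighted multi-objective score: maximise material
# removal rate and tool life, minimise tool cost.
import numpy as np

tools = ["insert_A", "insert_B", "insert_C"]
mrr  = np.array([120.0, 150.0, 95.0])   # cm^3/min (hypothetical)
life = np.array([35.0, 22.0, 48.0])     # min of tool life
cost = np.array([8.5, 12.0, 6.0])       # cost per cutting edge

def normalise(x, maximise=True):
    x = (x - x.min()) / (x.max() - x.min())
    return x if maximise else 1.0 - x

weights = {"mrr": 0.40, "life": 0.35, "cost": 0.25}
score = (weights["mrr"] * normalise(mrr)
         + weights["life"] * normalise(life)
         + weights["cost"] * normalise(cost, maximise=False))

best = tools[int(np.argmax(score))]
print(dict(zip(tools, np.round(score, 3))), "->", best)
```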
Tenca, A; Schievano, A; Lonati, S; Malagutti, L; Oberti, R; Adani, F
2011-09-01
This study aimed at finding applicable tools for favouring dark fermentation application in full-scale biogas plants in the near future. First, the focus was on obtaining mixed microbial cultures from natural sources (soil inocula and anaerobically digested materials) able to efficiently produce bio-hydrogen by dark fermentation. Batch reactors with proper substrate (1 gL(glucose)(-1)) and metabolite concentrations allowed high H(2) yields (2.8 ± 0.66 mol H(2)mol(glucose)(-1)), comparable to those achieved with pure microbial cultures. The application of this methodology to four organic substrates of possible interest for full-scale plants showed promising and repeatable bio-H(2) potential (BHP=202 ± 3 NL(H2)kg(VS)(-1)) from the organic fraction of municipal source-separated waste (OFMSW). Nevertheless, the fermentation in a lab-scale CSTR (currently the most widespread type of biogas plant) of a concentrated organic mixture of OFMSW (126 g(TS)L(-1)) resulted in only 30% of its BHP, showing that further improvements are still needed for future full-scale applications of dark fermentation. Copyright © 2011 Elsevier Ltd. All rights reserved.
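For readers comparing the two yield units quoted above (mol H2 per mol glucose vs. normal litres of H2 per unit of substrate), the small conversion below assumes an ideal-gas molar volume of 22.414 NL/mol and a glucose molar mass of 180.16 g/mol; it is purely illustrative and does not use data from the study.

```python
# A small worked unit conversion under stated assumptions (ideal gas at
# 0 degC / 1 atm; glucose molar mass 180.16 g/mol): mol H2 per mol glucose
# to normal litres of H2 per gram of glucose.
MOLAR_VOLUME_NL = 22.414      # NL per mol
M_GLUCOSE = 180.16            # g/mol

def mol_per_mol_to_NL_per_g(yield_mol_per_mol):
    return yield_mol_per_mol * MOLAR_VOLUME_NL / M_GLUCOSE

print(round(mol_per_mol_to_NL_per_g(2.8), 3), "NL H2 per g glucose")  # ~0.348
```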
Virtual surgery in a (tele-)radiology framework.
Glombitza, G; Evers, H; Hassfeld, S; Engelmann, U; Meinzer, H P
1999-09-01
This paper presents telemedicine as an extension of a teleradiology framework through tools for virtual surgery. To classify the described methods and applications, the research field of virtual reality (VR) is broadly reviewed. Differences with respect to technical equipment, methodological requirements and areas of application are pointed out. Desktop VR, augmented reality, and virtual reality are differentiated and discussed in some typical contexts of diagnostic support, surgical planning, therapeutic procedures, simulation and training. Visualization techniques are compared as a prerequisite for virtual reality and assigned to distinct levels of immersion. The advantage of a hybrid visualization kernel is emphasized with respect to the desktop VR applications that are subsequently shown. Moreover, software design aspects are considered by outlining functional openness in the architecture of the host system. Here, a teleradiology workstation was extended by dedicated tools for surgical planning through a plug-in mechanism. Examples of recent areas of application are introduced such as liver tumor resection planning, diagnostic support in heart surgery, and craniofacial surgery planning. In the future, surgical planning systems will become more important. They will benefit from improvements in image acquisition and communication, new image processing approaches, and techniques for data presentation. This will facilitate preoperative planning and intraoperative applications.
Hounsome, J; Whittington, R; Brown, A; Greenhill, B; McGuire, J
2018-01-01
While structured professional judgement approaches to assessing and managing the risk of violence have been extensively examined in mental health/forensic settings, the application of the findings to people with an intellectual disability is less extensively researched and reviewed. This review aimed to assess whether risk assessment tools have adequate predictive validity for violence in adults with an intellectual disability. Standard systematic review methodology was used to identify and synthesize appropriate studies. A total of 14 studies were identified as meeting the inclusion criteria. These studies assessed the predictive validity of 18 different risk assessment tools, mainly in forensic settings, and were generally of high quality. All studies concluded that the tools assessed were successful in predicting violence. There is good-quality evidence that risk assessment tools are valid for people with intellectual disability who offend, but further research is required to validate these tools more widely in this population. © 2016 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Maseda, F. J.; Martija, I.; Martija, I.
2012-01-01
This paper describes a novel Electrical Machine and Power Electronic Training Tool (EM&PE[subscript TT]), a methodology for using it, and associated experimental educational activities. The training tool is implemented by recreating a whole power electronics system, divided into modular blocks. This process is similar to that applied when…
Applied Virtual Reality Research and Applications at NASA/Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Hale, Joseph P.
1995-01-01
A Virtual Reality (VR) applications program has been under development at NASA/Marshall Space Flight Center (MSFC) since 1989. The objectives of the MSFC VR Applications Program are to develop, assess, validate, and utilize VR in hardware development, operations development and support, mission operations training and science training. Before this technology can be utilized with confidence in these applications, it must be validated for each particular class of application. That is, the precision and reliability with which it maps onto real settings and scenarios, representative of a class, must be calculated and assessed. The approach of the MSFC VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems. Specific validation studies for selected classes of applications have been completed or are currently underway. These include macro-ergonomic "control-room class" design analysis, Spacelab stowage reconfiguration training, a full-body micro-gravity functional reach simulator, and a gross anatomy teaching simulator. This paper describes the MSFC VR Applications Program and the validation studies.
Abrams, Marc
2013-01-01
Small interfering RNA (siRNA) therapeutics have advanced from bench to clinical trials in recent years, along with new tools developed to enable detection of siRNA delivered at the organ, cell, and subcellular levels. Preclinical models of siRNA delivery have benefitted from methodologies such as stem-loop quantitative polymerase chain reaction, histological in situ immunofluorescent staining, endosomal escape assay, and RNA-induced silencing complex loading assay. These technologies have accelerated the detection and optimization of siRNA platforms to overcome the challenges associated with delivering therapeutic oligonucleotides to the cytosol of specific target cells. This review focuses on the methodologies and their application in the biodistribution of siRNA delivered by lipid nanoparticles. PMID:23504369
Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update
Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.
2012-01-01
Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR are its accuracy with external calibration, the lack of any requirement for identical reference materials, high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experimental evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996
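The core ratio calculation behind qHNMR with external calibration can be sketched as follows; the integrals, proton counts and calibrant concentration are hypothetical, and real applications require identical acquisition conditions and validated correction factors.

```python
# A minimal sketch of the basic qHNMR ratio calculation with external
# calibration: analyte concentration from proton-normalised signal integrals
# measured against a separately acquired calibrant. Values are hypothetical.
def qhnmr_concentration(I_analyte, n_H_analyte, I_cal, n_H_cal, c_cal):
    """c_analyte = (I_a / n_a) / (I_cal / n_cal) * c_cal."""
    return (I_analyte / n_H_analyte) / (I_cal / n_H_cal) * c_cal

# e.g. analyte integral 1.85 over 2 protons vs. calibrant 1.00 over 3 protons
print(qhnmr_concentration(1.85, 2, 1.00, 3, c_cal=5.0), "mM")
```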
Consolidating drug data on a global scale using Linked Data.
Jovanovik, Milos; Trajanov, Dimitar
2017-01-21
Drug product data is available on the Web in a distributed fashion. The reasons lie within the regulatory domains, which exist on a national level. As a consequence, the drug data available on the Web are independently curated by national institutions from each country, leaving the data in varying languages, with a varying structure, granularity level and format, on different locations on the Web. Therefore, one of the main challenges in the realm of drug data is the consolidation and integration of large amounts of heterogeneous data into a comprehensive dataspace, for the purpose of developing data-driven applications. In recent years, the adoption of the Linked Data principles has enabled data publishers to provide structured data on the Web and contextually interlink them with other public datasets, effectively de-siloing them. Defining methodological guidelines and specialized tools for generating Linked Data in the drug domain, applicable on a global scale, is a crucial step to achieving the necessary levels of data consolidation and alignment needed for the development of a global dataset of drug product data. This dataset would then enable a myriad of new usage scenarios, which can, for instance, provide insight into the global availability of different drug categories in different parts of the world. We developed a methodology and a set of tools which support the process of generating Linked Data in the drug domain. Using them, we generated the LinkedDrugs dataset by seamlessly transforming, consolidating and publishing high-quality, 5-star Linked Drug Data from twenty-three countries, containing over 248,000 drug products, over 99,000,000 RDF triples and over 278,000 links to generic drugs from the LOD Cloud. Using the linked nature of the dataset, we demonstrate its ability to support advanced usage scenarios in the drug domain. The process of generating the LinkedDrugs dataset demonstrates the applicability of the methodological guidelines and the supporting tools in transforming drug product data from various, independent and distributed sources, into a comprehensive Linked Drug Data dataset. The presented user-centric and analytical usage scenarios over the dataset show the advantages of having a de-siloed, consolidated and comprehensive dataspace of drug data available via the existing infrastructure of the Web.
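A hypothetical sketch of the kind of record-to-RDF transformation such a methodology supports is shown below, using rdflib. The vocabulary terms, namespace and example record are illustrative only and do not reproduce the LinkedDrugs schema or its interlinking pipeline.

```python
# A minimal, hypothetical sketch of turning one nationally curated drug
# record into Linked Data with rdflib. Namespace, vocabulary choices and the
# example record are assumptions made for illustration.
from rdflib import Graph, Literal, Namespace, RDF, URIRef

SCHEMA = Namespace("http://schema.org/")
EX = Namespace("http://example.org/drugs/")      # hypothetical namespace

record = {"id": "mk-0001", "name": "Paracetamol 500 mg tablets",
          "activeIngredient": "Paracetamol", "country": "MK"}

g = Graph()
drug = URIRef(EX[record["id"]])
g.add((drug, RDF.type, SCHEMA.Drug))
g.add((drug, SCHEMA.name, Literal(record["name"])))
g.add((drug, SCHEMA.activeIngredient, Literal(record["activeIngredient"])))
g.add((drug, SCHEMA.countryOfOrigin, Literal(record["country"])))
# Interlinking step (illustrative): point to a generic drug in the LOD Cloud.
g.add((drug, SCHEMA.sameAs, URIRef("http://dbpedia.org/resource/Paracetamol")))

print(g.serialize(format="turtle"))
```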
Life cycle costing of food waste: A review of methodological approaches.
De Menna, Fabio; Dietershagen, Jana; Loubiere, Marion; Vittuari, Matteo
2018-03-01
Food waste (FW) is a global problem that is receiving increasing attention due to its environmental and economic impacts. Appropriate FW prevention, valorization, and management routes could mitigate or avoid these effects. Life cycle thinking and approaches, such as life cycle costing (LCC), may represent suitable tools to assess the sustainability of these routes. This study analyzes different LCC methodological aspects and approaches to evaluate FW management and valorization routes. A systematic literature review was carried out with a focus on different LCC approaches, their application to food, FW, and waste systems, as well as on specific methodological aspects. The review consisted of three phases: a collection phase, an iterative phase with experts' consultation, and a final literature classification. Journal papers and reports were retrieved from selected databases and search engines. The standardization of LCC methodologies is still in its infancy due to a lack of consensus over definitions and approaches. Research on the life cycle cost of FW is limited and generally focused on FW management, rather than prevention or valorization of specific flows. FW prevention, valorization, and management require a consistent integration of LCC and Life Cycle Assessment (LCA) to avoid tradeoffs between environmental and economic impacts. This entails a proper investigation of methodological differences between attributional and consequential modelling in LCC, especially with regard to functional unit, system boundaries, multi-functionality, included costs, and assessed impacts. Further efforts could also aim at finding the most effective and transparent categorization of costs, in particular when dealing with multiple stakeholders sustaining costs of FW. Interpretation of results from LCC of FW should take into account the effect on larger economic systems. Additional key performance indicators and analytical tools could be included in consequential approaches. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
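A minimal numerical sketch of an LCC aggregation of the type discussed above is given below: stage costs per functional unit (e.g., per tonne of food waste treated) discounted over an assessment period. Stage names, cost values and the discount rate are hypothetical.

```python
# A minimal LCC aggregation sketch under stated assumptions: annual stage
# costs per functional unit (EUR per tonne of food waste), a 10-year horizon
# and a 3% discount rate. Revenues/avoided costs enter with a minus sign.
stage_costs = {
    "collection": 45.0,
    "transport": 12.0,
    "anaerobic_digestion": 30.0,
    "digestate_management": 8.0,
    "avoided_energy": -15.0,
}

def life_cycle_cost(stage_costs, years=10, discount_rate=0.03):
    annual = sum(stage_costs.values())
    return sum(annual / (1 + discount_rate) ** t for t in range(1, years + 1))

print(round(life_cycle_cost(stage_costs), 1), "EUR per tonne FW (10-yr NPV)")
```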
NASA Astrophysics Data System (ADS)
Vazquez Rascon, Maria de Lourdes
This thesis focuses on the implementation of a participatory and transparent decision-making tool for wind farm projects. The tool is based on an (argumentative) framework that reflects the value systems of the stakeholders involved in these projects, and it employs two multicriteria methods, multicriteria decision aid (MCDA) and participatory geographic information systems (GIS), making it possible to represent these value systems by criteria and indicators to be evaluated. The stakeholders' value systems allow the inclusion of environmental, economic and socio-cultural aspects of wind energy projects and, thus, a vision of sustainable wind project development. This vision is analyzed using the 16 sustainability principles included in Quebec's Sustainable Development Act. Four specific objectives were defined to ensure a logical completion of the work and the development of a successful tool: designing a methodology to couple MCDA and participatory GIS, testing the developed methodology in a case study, performing a robustness analysis to address strategic issues, and analyzing the strengths, weaknesses, opportunities and threats of the developed methodology. Achieving the first objective produced a decision-making tool called Territorial Intelligence Modeling for Energy Development (the TIMED approach). The TIMED approach is visually represented by a figure expressing the idea of a co-constructed decision in which all stakeholders are the focus of the methodology. TIMED is composed of four modules: multicriteria decision analysis, participatory geographic information systems, active involvement of the stakeholders, and scientific knowledge/local knowledge. The integration of these four modules allows the analysis of different wind turbine implementation scenarios in order to choose the best one, based on a participatory and transparent decision-making process that takes into account stakeholders' concerns. The second objective enabled the testing of TIMED in an ex-post study of a wind farm in operation since 2006. In this test, 11 people participated, representing four stakeholder categories: the private sector, the public sector, experts and civil society. This test allowed us to analyze the current situation in which wind projects are developed in Quebec. The concerns of some stakeholders regarding situations not considered in the current context were explored through the third objective, which allowed us to run simulations taking into account strategic-level assumptions; examples of such assumptions are the communication tools used to approach the host community and the type of park ownership. Finally, the fourth objective, a SWOT analysis with the participation of eight experts, allowed us to verify the extent to which the TIMED approach succeeded in constructing four spaces for participatory decision-making: physical, intellectual, emotional and procedural. From this analysis, 116 strengths, 28 weaknesses, 32 constraints and 54 opportunities were identified.
Contributions, applications, limitations and extensions of this research include: providing a participatory decision-making methodology that takes into account socio-cultural, environmental and economic variables; holding reflection sessions on a wind farm in operation; the MCDA knowledge acquired by participants involved in testing the proposed methodology; taking into account the physical, intellectual, emotional and procedural spaces needed to articulate a participatory decision; using the proposed methodology with renewable energy sources other than wind; the need for an interdisciplinary team to apply the methodology; access to quality data; access to information technologies; the right to public participation; the neutrality of experts; the relationships between experts and non-experts; cultural constraints; improvement of the designed indicators; the implementation of a Web platform for participatory decision-making; and the writing of a manual on the use of the developed methodology. Keywords: wind farm, multicriteria decision, geographic information systems, TIMED approach, sustainable wind energy projects development, renewable energy, social participation, robustness concern, SWOT analysis.
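The participatory multicriteria aggregation that TIMED builds on can be illustrated schematically with the sketch below, where each stakeholder group weights the criteria and the scenario rankings are compared per group rather than collapsed into a single number. Scenarios, criteria, scores and weights are hypothetical; the actual TIMED modules are considerably richer.

```python
# A minimal participatory weighted-sum MCDA sketch (hypothetical data):
# normalised scenario scores per criterion, weighted separately by each
# stakeholder group, producing one ranking per group for comparison.
import numpy as np

criteria  = ["landscape", "noise", "jobs", "energy_yield"]
scenarios = {"site_A": [0.4, 0.6, 0.7, 0.9],   # 0..1, higher = better
             "site_B": [0.8, 0.7, 0.5, 0.6],
             "site_C": [0.6, 0.5, 0.8, 0.7]}

group_weights = {"civil_society":  [0.40, 0.30, 0.20, 0.10],
                 "private_sector": [0.10, 0.10, 0.20, 0.60],
                 "public_sector":  [0.25, 0.25, 0.25, 0.25]}

for group, w in group_weights.items():
    w = np.asarray(w)
    ranking = sorted(scenarios, key=lambda s: -float(np.dot(w, scenarios[s])))
    print(group, "->", ranking)
```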
Simulation and Analyses of Multi-Body Separation in Launch Vehicle Staging Environment
NASA Technical Reports Server (NTRS)
Pamadi, Bandu N.; Hotchko, Nathaniel J.; Samareh, Jamshid; Covell, Peter F.; Tartabini, Paul V.
2006-01-01
The development of methodologies, techniques, and tools for analysis and simulation of multi-body separation is critically needed for successful design and operation of next-generation launch vehicles. As a part of this activity, the ConSep simulation tool is being developed. ConSep is a generic MATLAB-based front- and back-end to the commercially available ADAMS solver, an industry-standard package for solving multi-body dynamic problems. This paper discusses the 3-body separation capability in ConSep and its application to the separation of the Shuttle Solid Rocket Boosters (SRBs) from the External Tank (ET) and the Orbiter. The results are compared with STS-1 flight data.
Shi, Wuxian; Chance, Mark R.
2010-01-01
About one-third of all proteins are associated with a metal. Metalloproteomics is defined as the structural and functional characterization of metalloproteins on a genome-wide scale. The methodologies utilized in metalloproteomics, including both forward (bottom-up) and reverse (top-down) technologies, to provide information on the identity, quantity and function of metalloproteins are discussed. Important techniques frequently employed in metalloproteomics include classical proteomics tools such as mass spectrometry and 2-D gels, immobilized-metal affinity chromatography, bioinformatics sequence analysis and homology modeling, X-ray absorption spectroscopy and other synchrotron radiation based tools. Combinative applications of these techniques provide a powerful approach to understand the function of metalloproteins. PMID:21130021
Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A
2017-12-01
Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.
ERIC Educational Resources Information Center
Socha, Teresa; Potter, Tom; Potter, Stephanie; Jickling, Bob
2016-01-01
This paper shares our experiences using pinhole photography with adolescents as both a pedagogical tool to support and deepen adolescent experiences in wild nature, and as a visual methodological tool to elucidate their experiences. Reflecting on a journey that explored the nature-based experiences of two adolescents on a family canoe trip in…
A Tool for the Administration and Management of University Profile Information
ERIC Educational Resources Information Center
Bulchand, Jacques; Rodriguez, Jorge; Chattah, Ana C.
2005-01-01
Purpose: The purpose of this paper is to present a management tool that helps to achieve the objectives of the plan for info-tech systems and communications of the University of Las Palmas de Gran Canaria for the 2003-2006 period. Design/methodology/approach: The methodology used in this case is nothing if not practical. The chosen tool involved…
Assessment of Near-Field Sonic Boom Simulation Tools
NASA Technical Reports Server (NTRS)
Casper, J. H.; Cliff, S. E.; Thomas, S. D.; Park, M. A.; McMullen, M. S.; Melton, J. E.; Durston, D. A.
2008-01-01
A recent study for the Supersonics Project, within the National Aeronautics and Space Administration, has been conducted to assess current in-house capabilities for the prediction of near-field sonic boom. Such capabilities are required to simulate the highly nonlinear flow near an aircraft, wherein a sonic-boom signature is generated. There are many available computational fluid dynamics codes that could be used to provide the near-field flow for a sonic boom calculation. However, such codes have typically been developed for applications involving aerodynamic configuration, for which an efficiently generated computational mesh is usually not optimum for a sonic boom prediction. Preliminary guidelines are suggested to characterize a state-of-the-art sonic boom prediction methodology. The available simulation tools that are best suited to incorporate into that methodology are identified; preliminary test cases are presented in support of the selection. During this phase of process definition and tool selection, parallel research was conducted in an attempt to establish criteria that link the properties of a computational mesh to the accuracy of a sonic boom prediction. Such properties include sufficient grid density near shocks and within the zone of influence, which are achieved by adaptation and mesh refinement strategies. Prediction accuracy is validated by comparison with wind tunnel data.
Sjögren, Jonathan; Andersson, Linda; Mejàre, Malin; Olsson, Fredrik
2017-01-01
Fab fragments are valuable research tools in various areas of science including applications in imaging, binding studies, removal of Fc-mediated effector functions, mass spectrometry, infection biology, and many others. The enzymatic tools for the generation of Fab fragments have been discovered through basic research within the field of molecular bacterial pathogenesis. Today, these enzymes are widely applied as research tools and in this chapter, we describe methodologies based on bacterial enzymes to generate Fab fragments from both human and mouse IgG. For all human IgG subclasses, the IdeS enzyme from Streptococcus pyogenes has been applied to generate F(ab')2 fragments that subsequently can be reduced under mild conditions to generate a homogenous pool of Fab' fragments. The enzyme Kgp from Porphyromonas gingivalis has been applied to generate intact Fab fragments from human IgG1 and the Fab fragments can be purified using a CH1-specific affinity resin. The SpeB protease, also from S. pyogenes, is able to digest mouse IgGs and has been applied to digest antibodies and Fab fragments can be purified on light chain affinity resins. In this chapter, we describe methodologies that can be used to obtain Fab fragments from human and mouse IgG using bacterial proteases.
Nakamura, Shinichiro; Kondo, Yasushi; Matsubae, Kazuyo; Nakajima, Kenichi; Nagasaka, Tetsuya
2011-02-01
Identification of the flow of materials and substances associated with a product system provides useful information for Life Cycle Analysis (LCA), and contributes to extending the scope of complementarity between LCA and Material Flow Analysis/Substance Flow Analysis (MFA/SFA), the two major tools of industrial ecology. This paper proposes a new methodology based on input-output analysis for identifying the physical input-output flow of individual materials associated with the production of a unit of a given product, the unit physical input-output by materials (UPIOM). While the Sankey diagram has been a standard tool for the visualization of MFA/SFA, with an increase in the complexity of the flows under consideration, which will be the case when economy-wide intersectoral flows of materials are involved, the Sankey diagram may become too complex for effective visualization. An alternative way to visually represent material flows is proposed which makes use of triangulation of the flow matrix based on degrees of fabrication. The proposed methodology is applied to the flow of pig iron and iron and steel scrap associated with the production of a passenger car in Japan. Its usefulness in identifying a specific MFA pattern from the original IO table is demonstrated.
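The generic input-output step that UPIOM-type analyses build on can be sketched numerically as follows. The three-sector coefficients and material intensities are hypothetical, and the code shows standard Leontief algebra rather than the authors' exact UPIOM formulation.

```python
# A minimal numerical sketch (hypothetical 3-sector data): total sectoral
# output x = (I - A)^-1 f induced by one unit of final demand for a product,
# and the associated physical material input obtained via material
# intensities per unit of output.
import numpy as np

A = np.array([[0.10, 0.05, 0.00],     # input coefficients
              [0.20, 0.10, 0.15],     # rows/cols: steel, parts, vehicles
              [0.00, 0.30, 0.05]])
f = np.array([0.0, 0.0, 1.0])         # one unit of final demand for vehicles

x = np.linalg.solve(np.eye(3) - A, f) # total output required in each sector

material_intensity = np.array([1.4, 0.3, 0.1])   # e.g. t of pig iron per unit output
pig_iron_by_sector = material_intensity * x
print("output:", np.round(x, 3), "| pig iron by sector [t]:", np.round(pig_iron_by_sector, 3))
```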
Qu, Y J; Yang, Z R; Sun, F; Zhan, S Y
2018-04-10
This paper introduces the Revised Tool for the Quality Assessment of Diagnostic Accuracy Studies (QUADAS-2), including its development and a comparison with the original QUADAS, and illustrates the application of QUADAS-2 to a published diagnostic accuracy study included in a systematic review and meta-analysis. QUADAS-2 presents considerable improvement over the original tool. Confusing items included in QUADAS have been removed, and the quality score of the original study has been replaced by ratings of risk of bias and applicability. This is implemented through descriptions of four main domains with minimal overlap and by answering the signalling questions in each domain. Rating the risk of bias and applicability as 'high', 'low' or 'unclear' is in line with the Cochrane risk-of-bias assessment for intervention studies, and replaces the total quality score used in QUADAS. QUADAS-2 is also applicable to diagnostic accuracy studies in which follow-up without prognosis is involved in the gold standard. It is useful for assessing the overall methodological quality of a study, although it is more time-consuming than the original QUADAS. However, QUADAS-2 needs to be modified for application in comparative studies of diagnostic accuracy, and we hope users will follow the updates and provide their feedback online.
Use of digital technologies for nasal prosthesis manufacturing.
Palousek, David; Rosicky, Jiri; Koutny, Daniel
2014-04-01
Digital technologies are becoming more accessible for common use in medical applications; however, their uptake in prosthetic and orthotic laboratories is limited because of the persistent perception that they are difficult to apply to real patients. This article aims to offer a real example in the area of human facial prostheses. It describes the utilization of optical digitization, computational modelling, rapid prototyping, mould fabrication and manufacturing of a nasal silicone prosthesis. This technical note defines the key points of the methodology and aspires to contribute to the introduction of a certified manufacturing procedure. The results show that the technologies used reduce the manufacturing time, reflect the patient's requirements and allow the manufacture of high-quality prostheses for missing asymmetric facial parts. The methodology provides a good basis for further development and is usable in clinical practice. Clinical relevance: Utilization of digital technologies in the facial prosthesis manufacturing process can contribute to higher patient comfort and production efficiency, but requires a higher initial investment and experience with software tools.
A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification
NASA Astrophysics Data System (ADS)
Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.
MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), have demonstrated good performance but not without drawbacks, already discussed by the authors. On the other hand, preliminary application of Genetic Algorithms (GAs) has already been reported in the literature by the authors regarding the peak detection problem encountered in MRS quantification using the Voigt line shape model. This paper investigates a novel constrained genetic algorithm involving a generic and adaptively defined fitness function, which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experimentation on artificial MRS signals interleaved with noise, with regard to its signal fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown here illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.
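A minimal stand-in for the constrained evolutionary fitting described above is sketched below: a single pseudo-Voigt peak is fitted to a noisy synthetic signal with SciPy's differential evolution, with parameter bounds playing the role of the constraints. This is a swapped-in evolutionary optimiser for illustration, not the authors' constrained GA or their adaptive fitness function.

```python
# A minimal sketch, not the authors' algorithm: fit one pseudo-Voigt peak to
# a noisy synthetic signal with differential evolution; bounds act as the
# constraints and the fitness is the sum-of-squares misfit.
import numpy as np
from scipy.optimize import differential_evolution

x = np.linspace(-5, 5, 400)

def pseudo_voigt(x, amp, x0, width, eta):
    gauss = np.exp(-4 * np.log(2) * (x - x0) ** 2 / width ** 2)
    lorentz = 1.0 / (1.0 + 4 * (x - x0) ** 2 / width ** 2)
    return amp * (eta * lorentz + (1 - eta) * gauss)

rng = np.random.default_rng(1)
true = pseudo_voigt(x, amp=2.0, x0=0.7, width=1.2, eta=0.3)
signal = true + 0.05 * rng.standard_normal(x.size)          # noisy peak

def fitness(p):                                              # misfit to minimise
    return np.sum((pseudo_voigt(x, *p) - signal) ** 2)

bounds = [(0.1, 5.0), (-2.0, 2.0), (0.1, 3.0), (0.0, 1.0)]   # amp, x0, width, eta
result = differential_evolution(fitness, bounds, seed=0)
print("estimated (amp, x0, width, eta):", np.round(result.x, 3))
```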
Bioinspired Methodology for Artificial Olfaction
Raman, Baranidharan; Hertz, Joshua L.; Benkstein, Kurt D.; Semancik, Steve
2008-01-01
Artificial olfaction is a potential tool for noninvasive chemical monitoring. Application of “electronic noses” typically involves recognition of “pretrained” chemicals, while long-term operation and generalization of training to allow chemical classification of “unknown” analytes remain challenges. The latter analytical capability is critically important, as it is unfeasible to pre-expose the sensor to every analyte it might encounter. Here, we demonstrate a biologically inspired approach where the recognition and generalization problems are decoupled and resolved in a hierarchical fashion. Analyte composition is refined in a progression from general (e.g., target is a hydrocarbon) to precise (e.g., target is ethane), using highly optimized response features for each step. We validate this approach using a MEMS-based chemiresistive microsensor array. We show that this approach, a unique departure from existing methodologies in artificial olfaction, allows the recognition module to better mitigate sensor-aging effects and to better classify unknowns, enhancing the utility of chemical sensors for real-world applications. PMID:18855409
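The recognise-then-refine hierarchy described above can be illustrated with a two-stage classifier on synthetic sensor-array data, as in the sketch below (scikit-learn). It demonstrates only the decoupling of coarse and fine classification, not the MEMS chemiresistive microsensor pipeline; all data and class labels are synthetic.

```python
# A minimal hierarchical-classification sketch on synthetic data: first
# predict the broad chemical family, then refine to a specific analyte with
# a family-specific model. Purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, d = 300, 8                                    # synthetic sensor-array responses
X = rng.normal(size=(n, d))
family = (X[:, 0] > 0).astype(int)               # 0 = "hydrocarbon", 1 = "alcohol"
analyte = family * 2 + (X[:, 1] > 0)             # two analytes per family

coarse = LogisticRegression(max_iter=1000).fit(X, family)
fine = {f: LogisticRegression(max_iter=1000).fit(X[family == f], analyte[family == f])
        for f in (0, 1)}

def classify(sample):
    f = int(coarse.predict(sample.reshape(1, -1))[0])
    return f, int(fine[f].predict(sample.reshape(1, -1))[0])

print(classify(rng.normal(size=d)))
```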
Octopus: A Design Methodology for Motion Capture Wearables.
Marin, Javier; Blanco, Teresa; Marin, Jose J
2017-08-15
Human motion capture (MoCap) is widely recognised for its usefulness and application in different fields, such as health, sports, and leisure; therefore, its inclusion in current wearables (MoCap-wearables) is increasing, and it may be very useful in a context of intelligent objects interconnected with each other and to the cloud in the Internet of Things (IoT). However, capturing human movement adequately requires addressing difficult-to-satisfy requirements, which means that the applications that are possible with this technology are held back by a series of accessibility barriers, some technological and some regarding usability. To overcome these barriers and generate products with greater wearability that are more efficient and accessible, factors are compiled through a review of publications and market research. The result of this analysis is a design methodology called Octopus, which ranks these factors and schematises them. Octopus provides a tool that can help define design requirements for multidisciplinary teams, generating a common framework and offering a new method of communication between them.
Strategies and Methodologies for Developing Microbial Detoxification Systems to Mitigate Mycotoxins
Zhu, Yan; Hassan, Yousef I.; Lepp, Dion; Shao, Suqin; Zhou, Ting
2017-01-01
Mycotoxins, the secondary metabolites of mycotoxigenic fungi, have been found in almost all agricultural commodities worldwide, causing enormous economic losses in livestock production and severe human health problems. Compared to traditional physical adsorption and chemical reactions, interest in biological detoxification methods that are environmentally sound, safe and highly efficient has seen a significant increase in recent years. However, researchers in this field have been facing tremendous unexpected challenges and are eager to find solutions. This review summarizes and assesses the research strategies and methodologies in each phase of the development of microbiological solutions for mycotoxin mitigation. These include screening of functional microbial consortia from natural samples, isolation and identification of single colonies with biotransformation activity, investigation of the physiological characteristics of isolated strains, identification and assessment of the toxicities of biotransformation products, purification of functional enzymes and the application of mycotoxin decontamination to feed/food production. A full understanding and appropriate application of this toolbox should be helpful for the development of novel microbiological solutions for mycotoxin detoxification. PMID:28387743
Soloperto, Alessandro; Palazzolo, Gemma; Tsushima, Hanako; Chieregatti, Evelina; Vassalli, Massimo; Difato, Francesco
2016-01-01
Current optical approaches are progressing far beyond the scope of monitoring the structure and function of living matter, and they are becoming widely recognized as extremely precise, minimally-invasive, contact-free handling tools. Laser manipulation of living tissues, single cells, or even single-molecules is becoming a well-established methodology, thus founding the onset of new experimental paradigms and research fields. Indeed, a tightly focused pulsed laser source permits complex tasks such as developing engineered bioscaffolds, applying calibrated forces, transfecting, stimulating, or even ablating single cells with subcellular precision, and operating intracellular surgical protocols at the level of single organelles. In the present review, we report the state of the art of laser manipulation in neuroscience, to inspire future applications of light-assisted tools in nano-neurosurgery. PMID:27013962
2003-06-01
[Abstract not recoverable from the source record. The surviving fragments describe data access via relational databases (RDBMS) and Structured Query Language (SQL), data import/export handled by macros written in Visual Basic for Applications (VBA) in MS Excel, and a class diagram (Figure 20, "Iteration two") linking a Tech OASIS export script, an import filter, and a data-processing method.]
Mixed methods for telehealth research.
Caffery, Liam J; Martin-Khan, Melinda; Wade, Victoria
2017-10-01
Mixed methods research is important to health services research because the integrated qualitative and quantitative investigation can give a more comprehensive understanding of complex interventions such as telehealth than can a single-method study. Further, mixed methods research is applicable to translational research and program evaluation. Study designs relevant to telehealth research are described and supported by examples. Quality assessment tools, frameworks to assist in the reporting and review of mixed methods research, and related methodologies are also discussed.
Advanced piloted aircraft flight control system design methodology. Volume 1: Knowledge base
NASA Technical Reports Server (NTRS)
Mcruer, Duane T.; Myers, Thomas T.
1988-01-01
The development of a comprehensive and eclectic methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design stages starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. While theory and associated computational means are an important aspect of the design methodology, the lore, knowledge and experience elements, which guide and govern applications, are critical features. This material is presented as summary tables, outlines, recipes, empirical data, lists, etc., which encapsulate a great deal of expert knowledge. Much of this is presented in topical knowledge summaries which are attached as Supplements. The composite of the supplements and the main body elements constitutes a first cut at a Mark 1 Knowledge Base for manned-aircraft flight control.
A Relevancy Algorithm for Curating Earth Science Data Around Phenomenon
NASA Technical Reports Server (NTRS)
Maskey, Manil; Ramachandran, Rahul; Li, Xiang; Weigel, Amanda; Bugbee, Kaylin; Gatlin, Patrick; Miller, J. J.
2017-01-01
Earth science data are being collected for various science needs and applications, processed using different algorithms at multiple resolutions and coverages, and then archived at different archiving centers for distribution and stewardship, causing difficulty in data discovery. Curation, which typically occurs in museums, art galleries, and libraries, is traditionally defined as the process of collecting and organizing information around a common subject matter or a topic of interest. Curating data sets around topics or areas of interest addresses some of the data discovery needs in the field of Earth science, especially for unanticipated users of data. This paper describes a methodology to automate search and selection of data around specific phenomena. Different components of the methodology including the assumptions, the process, and the relevancy ranking algorithm are described. The paper makes two unique contributions to improving data search and discovery capabilities. First, the paper describes a novel methodology developed for automatically curating data around a topic using Earth science metadata records. Second, the methodology has been implemented as a standalone web service that is utilized to augment search and usability of data in a variety of tools.
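A toy version of the relevancy-ranking step is sketched below: hypothetical dataset metadata records are scored against a phenomenon-specific keyword set and ranked. The deployed service works with richer Earth science metadata and a more elaborate ranking algorithm; records, keywords and the scoring rule here are assumptions for illustration.

```python
# A minimal relevancy-ranking sketch (hypothetical records and keywords):
# score each metadata record by crude keyword overlap with a phenomenon
# vocabulary and sort by descending score.
records = [
    {"id": "ds-01", "summary": "GPM precipitation radar retrievals for tropical cyclones"},
    {"id": "ds-02", "summary": "Global land surface temperature climatology"},
    {"id": "ds-03", "summary": "Hurricane wind speed and precipitation from passive microwave sensors"},
]
phenomenon_keywords = {"hurricane", "cyclone", "precipitation", "wind"}

def relevancy(record, keywords):
    tokens = set(record["summary"].lower().replace(",", " ").split())
    # crude prefix matching stands in for stemming
    return sum(1 for k in keywords if any(t.startswith(k[:6]) for t in tokens))

ranked = sorted(records, key=lambda r: -relevancy(r, phenomenon_keywords))
for r in ranked:
    print(relevancy(r, phenomenon_keywords), r["id"], r["summary"])
```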
Methodology for the Assessment of the Ecotoxicological Potential of Construction Materials
Rodrigues, Patrícia; Silvestre, José D.; Flores-Colen, Inês; Viegas, Cristina A.; de Brito, Jorge; Kurad, Rawaz; Demertzi, Martha
2017-01-01
Innovation in construction materials (CM) implies changing their composition by incorporating raw materials, usually non-traditional ones, which confer the desired characteristics. However, this practice may have unknown risks. This paper discusses the ecotoxicological potential associated with raw and construction materials, and proposes and applies a methodology for the assessment of their ecotoxicological potential. This methodology is based on existing laws, such as Regulation (European Commission) No. 1907/2006 (REACH—Registration, Evaluation, Authorization and Restriction of Chemicals) and Regulation (European Commission) No. 1272/2008 (CLP—Classification, Labelling and Packaging). Its application and validation showed that raw material without clear evidence of ecotoxicological potential, but with some ability to release chemicals, can lead to the formulation of a CM with a slightly lower hazardousness in terms of chemical characterization despite a slightly higher ecotoxicological potential than the raw materials. The proposed methodology can be a useful tool for the development and manufacturing of products and the design choice of the most appropriate CM, aiming at the reduction of their environmental impact and contributing to construction sustainability. PMID:28773011
A Test Methodology for Determining Space-Readiness of Xilinx SRAM-Based FPGA Designs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quinn, Heather M; Graham, Paul S; Morgan, Keith S
2008-01-01
Using reconfigurable, static random-access memory (SRAM) based field-programmable gate arrays (FPGAs) for space-based computation has been an exciting area of research for the past decade. Since both the circuit and the circuit's state are stored in radiation-sensitive memory, both can be altered by the harsh space radiation environment. Both the circuit and the circuit's state can be protected by triple-modular redundancy (TMR), but applying TMR to FPGA user designs is often an error-prone process. Faulty application of TMR could cause the FPGA user circuit to output incorrect data. This paper describes a three-tiered methodology for testing FPGA user designs for space-readiness. We describe the standard approach to testing FPGA user designs using a particle accelerator, as well as two methods using fault injection and a modeling tool. While accelerator testing is the current 'gold standard' for pre-launch testing, we believe the use of fault injection and modeling tools allows for easy, cheap and uniform access for discovering errors early in the design process.
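The two concepts named above, TMR and fault injection, can be illustrated conceptually in software as below. This is plain Python for exposition only, not FPGA design tooling; the toy "module" function and the single-bit-flip fault model are assumptions.

```python
# A conceptual sketch: triple-modular redundancy as a bitwise majority vote
# over three module copies, and fault injection as a random single-bit flip
# used to check that one upset cannot corrupt the voted output.
import random

def module(x):                      # toy 8-bit user logic
    return (x * 3 + 1) & 0xFF

def flip_random_bit(value):
    return value ^ (1 << random.randrange(8))

def tmr_vote(a, b, c):              # bitwise majority of three 8-bit words
    return (a & b) | (a & c) | (b & c)

def run_with_injection(x, faulty_copy):
    outs = [module(x), module(x), module(x)]
    outs[faulty_copy] = flip_random_bit(outs[faulty_copy])   # single upset
    return tmr_vote(*outs)

random.seed(0)
ok = all(run_with_injection(x, random.randrange(3)) == module(x) for x in range(256))
print("single-fault upsets masked by TMR:", ok)
```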
Esposito, Pasquale; Dal Canton, Antonio
2014-01-01
Evaluation and improvement of the quality of care provided to patients are of crucial importance in daily clinical practice and in health policy planning and financing. Different tools have been developed, including incident analysis, health technology assessment and clinical audit. The clinical audit consists of measuring a clinical outcome or a process against well-defined standards, set on the principles of evidence-based medicine, in order to identify the changes needed to improve the quality of care. In particular, patients suffering from chronic renal diseases present many problems that have been set as topics for clinical audit projects, such as hypertension, anaemia and mineral metabolism management. Although the results of these studies have been encouraging, demonstrating the effectiveness of audit, overall the present evidence is not clearly in favour of clinical audit. These findings call attention to the need for further studies to validate this methodology in different operating scenarios. This review examines the principles of clinical audit, focusing on experiences performed in nephrology settings. PMID:25374819
NASA Technical Reports Server (NTRS)
Lo, Yunnhon; Johnson, Stephen B.; Breckenridge, Jonathan T.
2014-01-01
SHM/FM theory has been successfully applied to the selection of the baseline set of abort triggers for the NASA SLS, and quantitative assessment played a useful role in the decision process. M&FM, which is new within NASA MSFC, required the most new work, since this kind of quantitative analysis had never been done before; it required development of the methodology and a tool to mechanize the process, and it established new relationships with the other groups. The process is now an accepted part of the SLS design process and will likely be applied to similar programs in the future at NASA MSFC. Future improvements include improving technical accuracy by differentiating crew survivability due to an abort from survivability when no immediate abort occurs (e.g., a small explosion with little debris), accounting for the contingent dependence of secondary triggers on primary triggers, and allocating the incremental loss-of-crew (LOC) benefit of each trigger when it is added to the previously selected triggers, as well as reducing future costs through the development of a specialized tool. The methodology can be applied to any manned or unmanned vehicle, in space or terrestrial.
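The greedy selection logic implied by allocating an incremental LOC benefit per trigger can be sketched schematically as below. All failure probabilities, trigger names and coverage fractions are invented purely for illustration and bear no relation to SLS data or to the actual M&FM analysis.

```python
# A schematic sketch (entirely hypothetical numbers): abort triggers are
# added greedily by their incremental loss-of-crew (LOC) benefit given the
# triggers already selected, modelled here as coverage of independent
# failure-probability mass.
failure_modes = {"engine": 0.004, "booster": 0.003, "tank": 0.002, "avionics": 0.001}
trigger_coverage = {                 # fraction of each mode a trigger can catch
    "chamber_pressure_low": {"engine": 0.9},
    "case_breach_detect":   {"booster": 0.8},
    "ullage_pressure_loss": {"tank": 0.7, "engine": 0.2},
    "nav_state_invalid":    {"avionics": 0.6},
}

def loc_benefit(selected):
    covered = {m: 0.0 for m in failure_modes}
    for t in selected:
        for m, frac in trigger_coverage[t].items():
            covered[m] = max(covered[m], frac)          # no double counting
    return sum(failure_modes[m] * covered[m] for m in failure_modes)

selected, remaining = [], set(trigger_coverage)
while remaining:
    best = max(remaining, key=lambda t: loc_benefit(selected + [t]) - loc_benefit(selected))
    selected.append(best)
    remaining.remove(best)
    print(f"add {best:22s} cumulative LOC benefit = {loc_benefit(selected):.5f}")
```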
Machine intelligence and autonomy for aerospace systems
NASA Technical Reports Server (NTRS)
Heer, Ewald (Editor); Lum, Henry (Editor)
1988-01-01
The present volume discusses progress toward intelligent robot systems in aerospace applications, NASA Space Program automation and robotics efforts, the supervisory control of telerobotics in space, machine intelligence and crew/vehicle interfaces, expert-system terms and building tools, and knowledge-acquisition for autonomous systems. Also discussed are methods for validation of knowledge-based systems, a design methodology for knowledge-based management systems, knowledge-based simulation for aerospace systems, knowledge-based diagnosis, planning and scheduling methods in AI, the treatment of uncertainty in AI, vision-sensing techniques in aerospace applications, image-understanding techniques, tactile sensing for robots, distributed sensor integration, and the control of articulated and deformable space structures.
Palomino, Robert M.; Hamlyn, Rebecca; Liu, Zongyuan; ...
2017-04-27
In this paper we provide a summary of the recent development of ambient pressure X-ray photoelectron spectroscopy (AP-XPS) and its application to catalytic surface chemistry. The methodology as well as significant advantages and challenges associated with this novel technique are described. Details about specific examples of using AP-XPS to probe surface chemistry under working reaction conditions for a number of reactions are explained: CO oxidation, water-gas shift (WGS), CO 2 hydrogenation, dry reforming of methane (DRM) and ethanol steam reforming (ESR). In conclusion, we discuss insights into the future development of the AP-XPS technique and its applications.
Space Weather Monitoring for ISS Space Environments Engineering and Crew Auroral Observations
NASA Technical Reports Server (NTRS)
Minow, Joseph I.; Pettit, Donald R.; Hartman, William A.
2012-01-01
The awareness of potentially significant impacts of space weather on space- and ground-based technological systems has generated a strong desire in many sectors of government and industry to effectively transform knowledge and understanding of the variable space environment into useful tools and applications for use by those entities responsible for systems that may be vulnerable to space weather impacts. Essentially, effectively transitioning science knowledge to useful applications relevant to space weather has become important. This talk will present proven methodologies that have been demonstrated to be effective, and how in the current environment those can be applied to space weather transition efforts.
ERIC Educational Resources Information Center
Sikaliuk, Anzhela
2014-01-01
The role and importance of situational methodology as one of the pedagogical tools of influence on the formation of socio-ethical values of future managers in higher schools of Ukraine and Germany have been theoretically substantiated. The possibilities of situational methodology influence on the formation of socio-ethical values of…
PET/MRI for neurologic applications.
Catana, Ciprian; Drzezga, Alexander; Heiss, Wolf-Dieter; Rosen, Bruce R
2012-12-01
PET and MRI provide complementary information in the study of the human brain. Simultaneous PET/MRI data acquisition allows the spatial and temporal correlation of the measured signals, creating opportunities impossible to realize using stand-alone instruments. This paper reviews the methodologic improvements and potential neurologic and psychiatric applications of this novel technology. We first present methods for improving the performance and information content of each modality by using the information provided by the other technique. On the PET side, we discuss methods that use the simultaneously acquired MRI data to improve the PET data quantification. On the MRI side, we present how improved PET quantification can be used to validate several MRI techniques. Finally, we describe promising research, translational, and clinical applications that can benefit from these advanced tools.
A technical guide to tDCS, and related non-invasive brain stimulation tools.
Woods, A J; Antal, A; Bikson, M; Boggio, P S; Brunoni, A R; Celnik, P; Cohen, L G; Fregni, F; Herrmann, C S; Kappenman, E S; Knotkova, H; Liebetanz, D; Miniussi, C; Miranda, P C; Paulus, W; Priori, A; Reato, D; Stagg, C; Wenderoth, N; Nitsche, M A
2016-02-01
Transcranial electrical stimulation (tES), including transcranial direct and alternating current stimulation (tDCS, tACS) are non-invasive brain stimulation techniques increasingly used for modulation of central nervous system excitability in humans. Here we address methodological issues required for tES application. This review covers technical aspects of tES, as well as applications like exploration of brain physiology, modelling approaches, tES in cognitive neurosciences, and interventional approaches. It aims to help the reader to appropriately design and conduct studies involving these brain stimulation techniques, understand limitations and avoid shortcomings, which might hamper the scientific rigor and potential applications in the clinical domain. Copyright © 2015 International Federation of Clinical Neurophysiology. All rights reserved.
Spanish methodological approach for biosphere assessment of radioactive waste disposal.
Agüero, A; Pinedo, P; Cancio, D; Simón, I; Moraleda, M; Pérez-Sánchez, D; Trueba, C
2007-10-01
The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS "Reference Biospheres Methodology" and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates.
Propulsion integration of hypersonic air-breathing vehicles utilizing a top-down design methodology
NASA Astrophysics Data System (ADS)
Kirkpatrick, Brad Kenneth
In recent years, a focus of aerospace engineering design has been the development of advanced design methodologies and frameworks to account for increasingly complex and integrated vehicles. Techniques such as parametric modeling, global vehicle analyses, and interdisciplinary data sharing have been employed in an attempt to improve the design process. The purpose of this study is to introduce a new approach to integrated vehicle design known as the top-down design methodology. In the top-down design methodology, the main idea is to relate design changes on the vehicle system and sub-system level to a set of over-arching performance and customer requirements. Rather than focusing on the performance of an individual system, the system is analyzed in terms of the net effect it has on the overall vehicle and other vehicle systems. This detailed level of analysis can only be accomplished through the use of high fidelity computational tools such as Computational Fluid Dynamics (CFD) or Finite Element Analysis (FEA). The utility of the top-down design methodology is investigated through its application to the conceptual and preliminary design of a long-range hypersonic air-breathing vehicle for a hypothetical next generation hypersonic vehicle (NHRV) program. System-level design is demonstrated through the development of the nozzle section of the propulsion system. From this demonstration of the methodology, conclusions are made about the benefits, drawbacks, and cost of using the methodology.
Gurdak, Jason J.; Qi, Sharon L.; Geisler, Michael L.
2009-01-01
The U.S. Geological Survey Raster Error Propagation Tool (REPTool) is a custom tool for use with the Environmental System Research Institute (ESRI) ArcGIS Desktop application to estimate error propagation and prediction uncertainty in raster processing operations and geospatial modeling. REPTool is designed to introduce concepts of error and uncertainty in geospatial data and modeling and provide users of ArcGIS Desktop a geoprocessing tool and methodology to consider how error affects geospatial model output. Similar to other geoprocessing tools available in ArcGIS Desktop, REPTool can be run from a dialog window, from the ArcMap command line, or from a Python script. REPTool consists of public-domain, Python-based packages that implement Latin Hypercube Sampling within a probabilistic framework to track error propagation in geospatial models and quantitatively estimate the uncertainty of the model output. Users may specify error for each input raster or model coefficient represented in the geospatial model. The error for the input rasters may be specified as either spatially invariant or spatially variable across the spatial domain. Users may specify model output as a distribution of uncertainty for each raster cell. REPTool uses the Relative Variance Contribution method to quantify the relative error contribution from the two primary components in the geospatial model - errors in the model input data and coefficients of the model variables. REPTool is appropriate for many types of geospatial processing operations, modeling applications, and related research questions, including applications that consider spatially invariant or spatially variable error in geospatial data.
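To make the error-propagation workflow concrete, the sketch below mimics the core idea in Python: Latin Hypercube samples of the input errors are pushed through a toy geospatial model to produce a per-cell distribution of output uncertainty. The rasters, error magnitudes, and the linear model are invented for illustration; this is not REPTool's actual code.

```python
import numpy as np
from scipy.stats import norm, qmc

# Hypothetical 3x3 input rasters and a toy linear geospatial model:
# output = a * raster1 + b * raster2 (the coefficients also carry error).
raster1 = np.array([[10.0, 12.0, 11.0], [9.0, 10.5, 11.5], [10.2, 9.8, 10.9]])
raster2 = np.array([[2.0, 2.2, 1.9], [2.1, 2.0, 2.3], [1.8, 2.4, 2.1]])
a, b = 3.0, 1.5                 # model coefficients (assumed)
sigma1, sigma2 = 0.5, 0.1       # spatially invariant input-raster errors (assumed)
sigma_a, sigma_b = 0.2, 0.05    # coefficient errors (assumed)

# One Latin Hypercube dimension per error source.
n_samples = 1000
lhs = qmc.LatinHypercube(d=4, seed=42).random(n_samples)
e1 = norm.ppf(lhs[:, 0], scale=sigma1)
e2 = norm.ppf(lhs[:, 1], scale=sigma2)
ea = norm.ppf(lhs[:, 2], scale=sigma_a)
eb = norm.ppf(lhs[:, 3], scale=sigma_b)

# Propagate every sampled error realization through the model, cell by cell.
outputs = np.empty((n_samples,) + raster1.shape)
for k in range(n_samples):
    outputs[k] = (a + ea[k]) * (raster1 + e1[k]) + (b + eb[k]) * (raster2 + e2[k])

# Per-cell uncertainty of the model output.
print("output mean per cell:\n", outputs.mean(axis=0))
print("output std per cell:\n", outputs.std(axis=0))
```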
ERIC Educational Resources Information Center
Mankowska, Anna
2016-01-01
Little, if any, examination of using play-based tools to examine children's opinions in research exists in the current literature. Therefore, this paper is meant to address that gap within the literature and showcase the study about the use of a specific play-based methodological tool in qualitative research. This specific tool called social board…
A Software Engineering Environment for the Navy.
1982-03-31
[Garbled OCR extract of the report's table of contents; legible entries include "Part II: Description of a Software Engineering Environment", "Data Base", "Flow of Management: Activity to Methodology to Tool", "Pipelining for Activity-Specific Tools", and "Methodologies and Tools: Correctness Analysis".]
NASA Astrophysics Data System (ADS)
Kempf, Scott; Schäfer, Frank K.; Cardone, Tiziana; Ferreira, Ivo; Gerené, Sam; Destefanis, Roberto; Grassi, Lilith
2016-12-01
During recent years, the state-of-the-art risk assessment of the threat posed to spacecraft by micrometeoroids and space debris has been expanded to the analysis of failure modes of internal spacecraft components. This method can now be used to perform risk analyses for satellites to assess various failure levels - from failure of specific sub-systems to catastrophic break-up. This new assessment methodology is based on triple-wall ballistic limit equations (BLEs), specifically the Schäfer-Ryan-Lambert (SRL) BLE, which is applicable for describing failure threshold levels for satellite components following a hypervelocity impact. The methodology is implemented in the form of the software tool Particle Impact Risk and vulnerability Analysis Tool (PIRAT). During a recent European Space Agency (ESA) funded study, the PIRAT functionality was expanded in order to provide an interface to ESA's Concurrent Design Facility (CDF). The additions include a geometry importer and an OCDT (Open Concurrent Design Tool) interface. The new interface provides both the expanded geometrical flexibility, which is provided by external computer aided design (CAD) modelling, and an ease of import of existing data without the need for extensive preparation of the model. The reduced effort required to perform vulnerability analyses makes it feasible for application during early design phase, at which point modifications to satellite design can be undertaken with relatively little extra effort. The integration of PIRAT in the CDF represents the first time that vulnerability analyses can be performed in-session in ESA's CDF and the first time that comprehensive vulnerability studies can be applied cost-effectively in early design phase in general.
NASA Astrophysics Data System (ADS)
Fahrul Hassan, Mohd; Jusoh, Suhada; Zaini Yunos, Muhamad; Arifin, A. M. T.; Ismail, A. E.; Rasidi Ibrahim, M.; Zulafif Rahim, M.
2017-09-01
Portable water filters have grown significantly in recent years. The use of a water bottle with a hand-pump water filtration unit has been suggested to replace bottled water during outdoor recreational activities and for emergency supplies. However, water quality remains an issue because of contamination from residual plant waste, bacteria, and so on. Based on these issues, this study was carried out to design a portable water filter that uses a membrane filtration system by applying Design for Six Sigma. The Design for Six Sigma methodology consists of five stages: Define, Measure, Analyze, Design and Verify. Several tools were used in each stage to meet a specific objective. In the Define stage, a questionnaire approach was used to identify potential users' needs for a future portable water filter. Next, the Quality Function Deployment (QFD) tool was used in the Measure stage to translate the users' needs into engineering characteristics. Based on the information from the Measure stage, morphological chart and weighted decision matrix tools were used in the Analyze stage, which covered several activities including concept generation and selection. Once selection of the final concept was completed, a detail drawing was made in the Design stage. A prototype was then developed in the Verify stage to conduct proof-of-concept testing. The results obtained from each stage are reported in this paper. From this study, it can be concluded that applying Design for Six Sigma to the design of a future portable water filter based on a membrane filtration system is a good starting point in the search for a new alternative concept with complete supporting documentation.
Economic Consequence Analysis of Disasters: The ECAT Software Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Adam; Prager, Fynn; Chen, Zhenhua
This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
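The "reduced form" idea can be illustrated with a short sketch: many synthetic scenario runs of a consequence model are generated, a single regression is fitted to them, and new events are then scored from the regression alone. The surrogate function, explanatory variables, and coefficients below are stand-ins for demonstration, not E-CAT's Excel/VBA implementation or its CGE model.

```python
import numpy as np

rng = np.random.default_rng(0)

def cge_surrogate(duration, severity, resilience):
    """Stand-in for a full CGE run: returns a GDP loss (in $B) for one scenario."""
    return 2.0 * duration + 15.0 * severity - 8.0 * resilience + rng.normal(0.0, 0.5)

# Step 1: run many synthetic scenarios (threat characteristics + background conditions).
n = 500
X = np.column_stack([
    rng.uniform(1, 30, n),   # duration of disruption (days)
    rng.uniform(0, 1, n),    # severity index
    rng.uniform(0, 1, n),    # resilience factor
])
y = np.array([cge_surrogate(*row) for row in X])

# Step 2: estimate a single reduced-form regression from the synthetic data.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Step 3: rapid estimate for a new event, without re-running the complex model.
new_event = np.array([1.0, 10.0, 0.7, 0.3])   # intercept, duration, severity, resilience
print("estimated loss ($B): %.1f" % (new_event @ coef))
```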
NASA Technical Reports Server (NTRS)
Culbert, Chris
1990-01-01
Although they have reached a point of commercial viability, expert systems were originally developed in artificial intelligence (AI) research environments. Many of the available tools still work best in such environments. These environments typically utilize special hardware such as LISP machines and relatively unfamiliar languages such as LISP or Prolog. Space Station applications will require deep integration of expert system technology with applications developed in conventional languages, specifically Ada. The ability to apply automation to Space Station functions could be greatly enhanced by widespread availability of state-of-the-art expert system tools based on Ada. Although there have been some efforts to examine the use of Ada for AI applications, there are few, if any, existing products which provide state-of-the-art AI capabilities in an Ada tool. The goal of the ART/Ada Design Project is to conduct research into the implementation in Ada of state-of-the-art hybrid expert systems building tools (ESBT's). This project takes the following approach: using the existing design of the ART-IM ESBT as a starting point, analyze the impact of the Ada language and Ada development methodologies on that design; redesign the system in Ada; and analyze its performance. The research project will attempt to achieve a comprehensive understanding of the potential for embedding expert systems in Ada systems for eventual application in future Space Station Freedom projects. During Phase 1 of the project, initial requirements analysis, design, and implementation of the kernel subset of ART-IM functionality was completed. During Phase 2, the effort has been focused on the implementation and performance analysis of several versions with increasing functionality. Since production quality ART/Ada tools will not be available for a considerable time, an additional subtask of this project will be the completion of an Ada version of the CLIPS expert system shell developed by NASA. This tool will provide full syntactic compatibility with any eventual products of the ART/Ada design while allowing SSFP developers early access to this technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com; Khamehchi, Ehsan
Conditioning stochastic simulations are very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike the previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space where all the primary and secondary data are easily mapped onto. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.
The risk of bias in systematic reviews tool showed fair reliability and good construct validity.
Bühn, Stefanie; Mathes, Tim; Prengel, Peggy; Wegewitz, Uta; Ostermann, Thomas; Robens, Sibylle; Pieper, Dawid
2017-11-01
There is a movement from generic quality checklists toward a more domain-based approach in critical appraisal tools. This study aimed to report on a first experience with the newly developed risk of bias in systematic reviews (ROBIS) tool and compare it with A Measurement Tool to Assess Systematic Reviews (AMSTAR), that is, the most commonly used tool to assess methodological quality of systematic reviews, while assessing validity, reliability, and applicability. Validation study with four reviewers based on 16 systematic reviews in the field of occupational health. Interrater reliability (IRR) of all four raters was highest for domain 2 (Fleiss' kappa κ = 0.56) and lowest for domain 4 (κ = 0.04). For ROBIS, median IRR was κ = 0.52 (range 0.13-0.88) for the experienced pair of raters compared to κ = 0.32 (range 0.12-0.76) for the less experienced pair of raters. The percentage of "yes" scores of each review of ROBIS ratings was strongly correlated with the AMSTAR ratings (r_s = 0.76; P = 0.01). ROBIS has fair reliability and good construct validity to assess the risk of bias in systematic reviews. More validation studies are needed to investigate reliability and applicability, in particular. Copyright © 2017 Elsevier Inc. All rights reserved.
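For readers unfamiliar with the agreement statistic reported above, the sketch below computes Fleiss' kappa for a multi-rater rating matrix; the simulated ratings (16 reviews by 4 raters, three categories) are randomly generated for illustration and are not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
ratings = rng.integers(0, 3, size=(16, 4))   # 16 reviews x 4 raters, categories 0..2 (invented)

def fleiss_kappa(ratings, n_categories):
    n_subjects, n_raters = ratings.shape
    # counts[i, j] = number of raters assigning category j to subject i
    counts = np.array([[np.sum(row == j) for j in range(n_categories)] for row in ratings])
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)                 # category proportions
    p_i = (np.sum(counts ** 2, axis=1) - n_raters) / (n_raters * (n_raters - 1))
    p_bar, p_e = p_i.mean(), np.sum(p_j ** 2)                          # observed vs. chance agreement
    return (p_bar - p_e) / (1 - p_e)

print("Fleiss' kappa = %.2f" % fleiss_kappa(ratings, n_categories=3))
```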
Single-scan 2D NMR: An Emerging Tool in Analytical Spectroscopy
Giraudeau, Patrick; Frydman, Lucio
2016-01-01
Two-dimensional Nuclear Magnetic Resonance (2D NMR) spectroscopy is widely used in chemical and biochemical analyses. Multidimensional NMR is also witnessing an increased use in quantitative and metabolic screening applications. Conventional 2D NMR experiments, however, are affected by inherently long acquisition durations, arising from their need to sample the frequencies involved along their indirect domains in an incremented, scan-by-scan nature. A decade ago a so-called "ultrafast" (UF) approach was proposed, capable of delivering arbitrary 2D NMR spectra involving any kind of homo- or hetero-nuclear correlations, in a single scan. During the intervening years the performance of this sub-second 2D NMR methodology has been greatly improved, and UF 2D NMR is rapidly becoming a powerful analytical tool witnessing an expanded scope of applications. The present review summarizes the principles and the main developments which have contributed to the success of this approach, and focuses on applications which have been recently demonstrated in various areas of analytical chemistry - from the real-time monitoring of chemical and biochemical processes, to extensions in hyphenated techniques and in quantitative applications. PMID:25014342
Knapp, Jennifer; Macdonald, Michael; Malone, David; Hamon, Nicholas; Richardson, Jason H
2015-09-26
Malaria vector control technology has remained largely static for decades and there is a pressing need for innovative control tools and methodology to radically improve the quality and efficiency of current vector control practices. This report summarizes a workshop jointly organized by the Innovative Vector Control Consortium (IVCC) and the Armed Forces Pest Management Board (AFPMB) focused on public health pesticide application technology. Three main topics were discussed: the limitations of current tools and techniques used for indoor residual spraying (IRS), technology innovation to improve the efficacy of IRS programmes, and truly disruptive application technology beyond IRS. The group identified several opportunities to improve application technology, including: ensuring all IRS programmes are using constant flow valves and erosion-resistant tips; introducing compression sprayer improvements that help minimize pesticide waste and human error; and moving beyond IRS by embracing the potential for new larval source management techniques and next generation technology such as unmanned "smart" spray systems. The meeting served to lay the foundation for broader collaboration between the IVCC and AFPMB and partners in industry, the World Health Organization, the Bill and Melinda Gates Foundation and others.
Methodological challenges of validating a clinical decision-making tool in the practice environment.
Brennan, Caitlin W; Daly, Barbara J
2015-04-01
Validating a measurement tool intended for use in the practice environment poses challenges that may not be present when validating a tool intended solely for research purposes. The aim of this article is to describe the methodological challenges of validating a clinical decision-making tool, the Oncology Acuity Tool, which nurses use to make nurse assignment and staffing decisions prospectively each shift. Data were derived from a larger validation study, during which several methodological challenges arose. Revisions to the tool, including conducting iterative feedback cycles with end users, were necessary before the validation study was initiated. The "true" value of patient acuity is unknown, and thus, two approaches to inter-rater reliability assessment were used. Discordant perspectives existed between experts and end users. Balancing psychometric rigor with clinical relevance may be achieved through establishing research-practice partnerships, seeking active and continuous feedback with end users, and weighing traditional statistical rules of thumb with practical considerations. © The Author(s) 2014.
Materials Selection Criteria for Nuclear Power Applications: A Decision Algorithm
NASA Astrophysics Data System (ADS)
Rodríguez-Prieto, Álvaro; Camacho, Ana María; Sebastián, Miguel Ángel
2016-02-01
An innovative methodology based on stringency levels is proposed in this paper and improves the current selection method for structural materials used in demanding industrial applications. This paper describes a new approach for quantifying the stringency of materials requirements based on a novel deterministic algorithm to prevent potential failures. We have applied the new methodology to different standardized specifications used in pressure vessels design, such as SA-533 Grade B Cl.1, SA-508 Cl.3 (issued by the American Society of Mechanical Engineers), DIN 20MnMoNi55 (issued by the German Institute of Standardization) and 16MND5 (issued by the French Nuclear Commission) specifications and determine the influence of design code selection. This study is based on key scientific publications on the influence of chemical composition on the mechanical behavior of materials, which were not considered when the technological requirements were established in the aforementioned specifications. For this purpose, a new method to quantify the efficacy of each standard has been developed using a deterministic algorithm. The process of assigning relative weights was performed by consulting a panel of experts in materials selection for reactor pressure vessels to provide a more objective methodology; thus, the resulting mathematical calculations for quantitative analysis are greatly simplified. The final results show that steel DIN 20MnMoNi55 is the best material option. Additionally, more recently developed materials such as DIN 20MnMoNi55, 16MND5 and SA-508 Cl.3 exhibit mechanical requirements more stringent than SA-533 Grade B Cl.1. The methodology presented in this paper can be used as a decision tool in selection of materials for a wide range of applications.
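A minimal illustration of a weighted stringency-scoring decision algorithm of this kind is sketched below; the requirement weights and per-specification scores are invented placeholders, not the values elicited from the study's expert panel.

```python
# Illustrative weighted stringency scoring; weights and scores are invented.
requirements = ["chemical_composition", "tensile_strength", "toughness", "inspection"]
weights = {"chemical_composition": 0.35, "tensile_strength": 0.25,
           "toughness": 0.30, "inspection": 0.10}

# Stringency score (0-10, higher = more demanding) per requirement and specification.
specs = {
    "SA-533 Grade B Cl.1": {"chemical_composition": 5, "tensile_strength": 6, "toughness": 5, "inspection": 6},
    "SA-508 Cl.3":         {"chemical_composition": 6, "tensile_strength": 6, "toughness": 7, "inspection": 6},
    "DIN 20MnMoNi55":      {"chemical_composition": 8, "tensile_strength": 7, "toughness": 8, "inspection": 7},
    "16MND5":              {"chemical_composition": 7, "tensile_strength": 7, "toughness": 7, "inspection": 7},
}

def stringency(scores):
    return sum(weights[r] * scores[r] for r in requirements)

for name, scores in sorted(specs.items(), key=lambda kv: stringency(kv[1]), reverse=True):
    print(f"{name:22s} stringency = {stringency(scores):.2f}")
```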
Computer-aided design for metabolic engineering.
Fernández-Castané, Alfred; Fehér, Tamás; Carbonell, Pablo; Pauthenier, Cyrille; Faulon, Jean-Loup
2014-12-20
The development and application of biotechnology-based strategies has had a great socio-economic impact and is likely to play a crucial role in the foundation of more sustainable and efficient industrial processes. Within biotechnology, metabolic engineering aims at the directed improvement of cellular properties, often with the goal of synthesizing a target chemical compound. The use of computer-aided design (CAD) tools, along with the continuously emerging advanced genetic engineering techniques, has allowed metabolic engineering to broaden and streamline the process of heterologous compound production. In this work, we review the CAD tools available for metabolic engineering with an emphasis on retrosynthesis methodologies. Recent advances in genetic engineering strategies for pathway implementation and optimization are also reviewed, as well as a range of bioanalytical tools to validate in silico predictions. A case study applying retrosynthesis is presented as an experimental verification of the output from Retropath, the first complete automated computational pipeline applicable to metabolic engineering. Applying this CAD pipeline, together with genetic reassembly and optimization of culture conditions, led to improved production of the plant flavonoid pinocembrin. Coupling CAD tools with advanced genetic engineering strategies and bioprocess optimization is crucial for enhanced product yields and will be of great value for the development of non-natural products through sustainable biotechnological processes. Copyright © 2014 Elsevier B.V. All rights reserved.
Scott, Anna Mae; Hofmann, Björn; Gutiérrez-Ibarluzea, Iñaki; Bakke Lysdahl, Kristin; Sandman, Lars; Bombard, Yvonne
2017-01-01
Introduction: Assessment of ethics issues is an important part of health technology assessments (HTA). However, in terms of existence of quality assessment tools, ethics for HTA is methodologically underdeveloped in comparison to other areas of HTA, such as clinical or cost effectiveness. Objective: To methodologically advance ethics for HTA by: (1) proposing and elaborating Q-SEA, the first instrument for quality assessment of ethics analyses, and (2) applying Q-SEA to a sample systematic review of ethics for HTA, in order to illustrate and facilitate its use. Methods: To develop a list of items for the Q-SEA instrument, we systematically reviewed the literature on methodology in ethics for HTA, reviewed HTA organizations’ websites, and solicited views from 32 experts in the field of ethics for HTA at two 2-day workshops. We subsequently refined Q-SEA through its application to an ethics analysis conducted for HTA. Results: Q-SEA instrument consists of two domains – the process domain and the output domain. The process domain consists of 5 elements: research question, literature search, inclusion/exclusion criteria, perspective, and ethics framework. The output domain consists of 5 elements: completeness, bias, implications, conceptual clarification, and conflicting values. Conclusion: Q-SEA is the first instrument for quality assessment of ethics analyses in HTA. Further refinements to the instrument to enhance its usability continue. PMID:28326147
Electronic Design Automation: Integrating the Design and Manufacturing Functions
NASA Technical Reports Server (NTRS)
Bachnak, Rafic; Salkowski, Charles
1997-01-01
As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.
A Perspective on Computational Human Performance Models as Design Tools
NASA Technical Reports Server (NTRS)
Jones, Patricia M.
2010-01-01
The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.
NASA Technical Reports Server (NTRS)
1974-01-01
The BRAVO User's Manual is presented which describes the BRAVO methodology in terms of step-by-step procedures, so that it may be used as a tool for a team of analysts performing cost effectiveness analyses on potential future space applications. BRAVO requires a relatively general set of input information and a relatively small expenditure of resources. For Vol. 1, see N74-12493; for Vol. 2, see N74-14530.
Castanheira, Guilherme; Bragança, Luís
2014-01-01
This paper analyses the current trends in sustainability assessment. After about 15 years from the launch of sustainability assessment tools, focused on buildings evaluation, the paradigm of sustainability assessment tools is changing from the building scale to the built environment scale. Currently European cities and cities around the world are concerned with sustainable development, as well as its evolution. Cities seek a way to adapt to contemporary changes, in order to meet the required needs and ensure population's well-being. Considering this, the new generations of sustainability assessment tools are being developed to be used to guide and help cities and urban areas to become more sustainable. Following the trend of the most important sustainability assessment tools, the sustainability assessment tool SBTool(PT) is also developing its version for assessing the sustainability of the built environment, namely, the urban planning projects and the urban regeneration projects, to be developed in Portugal, the SBTool(PT)-UP. The application of the methodology to three case studies will demonstrate its feasibility; at the same time this will identify the best practices which will serve as reference for new projects, thereby assisting the development of the tool.
NASA Astrophysics Data System (ADS)
Yuen, K.; Chang, G.; Basilio, R. R.; Hatfield, J.; Cox, E. L.
2017-12-01
The prevalence and availability of NASA remote sensing data over the last 40+ years have produced many opportunities for the development of science derived data applications. However, extending and systematically integrating the applications into decision support models and tools have been sporadic and incomplete. Despite efforts among the research communities and external partners, implementation challenges exist and still remain to be addressed. In order to effectively address the systemic gap between the research and applications communities, steps must be taken to effectively bridge that gap: specific goals, a clear plan, and a concerted and diligent effort are needed to produce the desired results. The Orbiting Carbon Observatory-2 (OCO-2) mission sponsored a pilot effort on science data applications with the specific intent of building strategic partnerships, so that organizations and individuals could effectively use OCO-2 data products for application development. The successful partnership with the USDA/ARS National Laboratory for Agriculture and the Environment (NLAE) has laid the foundation for: 1) requirements and lessons for establishing a strategic partnership for application development, 2) building opportunities and growing partnerships for new missions such as OCO-3, and 3) the development of a methodology and approach for integrating application development into a mission life cycle. This presentation will provide an overview of the OCO-2 pilot effort, deliverables, the methodology, implementation, and best practices.
Digital modeling of end-mill cutting tools for FEM applications from the active cutting contour
NASA Astrophysics Data System (ADS)
Salguero, Jorge; Marcos, M.; Batista, M.; Gómez, A.; Mayuet, P.; Bienvenido, R.
2012-04-01
A technique currently in wide use in the research field of machining by material removal is simulation using the Finite Element Method (FEM). Nevertheless, although it is widely used for processes that allow approximations to orthogonal cutting, such as shaping, it is scarcely used for more complex processes, such as milling. This is due principally to the complex geometry of the cutting tools in these processes and the need to carry out the studies in an oblique cutting configuration. This paper presents a methodology for the geometrical characterization of commercial end-mill cutting tools by extracting the cutting tool contour using optical metrology and then using this geometry to model the active cutting zone with 3D CAD software. This model is easily exportable to different CAD formats, such as IGES or STEP, and importable into FEM software, where it is possible to study the in-service behavior of the tools.
Using m-learning on nursing courses to improve learning.
de Marcos Ortega, Luis; Barchino Plata, Roberto; Jiménez Rodríguez, María Lourdes; Hilera González, José Ramón; Martínez Herráiz, José Javier; Gutiérrez de Mesa, José Antonio; Gutiérrez Martínez, José María; Otón Tortosa, Salvador
2011-05-01
Modern handheld devices and wireless communications foster new kinds of communication and interaction that can define new approaches to teaching and learning. Mobile learning (m-learning) seeks to use them extensively, in exactly the same way in which e-learning uses personal computers and wired communication technologies. In this new mobile environment, new applications and educational models need to be created and tested to confirm (or reject) their validity and usefulness. In this article, we present a mobile tool aimed at self-assessment, which allows students to test their knowledge at any place and at any time. The degree to which the students' achievement improved is also evaluated, and a survey of the students' opinion of the new tool was conducted. An experimental group of 20- to 21-year-old nursing students was chosen to test the tool. Results show that this kind of tool improves students' achievement and does not make it necessary to introduce substantial changes in current teaching activities and methodology.
A climate responsive urban design tool: a platform to improve energy efficiency in a dry hot climate
NASA Astrophysics Data System (ADS)
El Dallal, Norhan; Visser, Florentine
2017-09-01
In the Middle East and North Africa (MENA) region, new urban developments should address the climatic conditions to improve outdoor comfort and to reduce the energy consumption of buildings. This article describes a design tool that supports climate responsive design for a dry hot climate. The approach takes the climate as an initiator for the conceptual urban form with a more energy-efficient urban morphology. The methodology relates the different passive strategies suitable for the major climate conditions in the MENA region (dry-hot) to design parameters that create the urban form. This parametric design approach is the basis for a tool that generates conceptual climate responsive urban forms so as to assist the urban designer early in the design process. Various conceptual scenarios, generated by a computational model, are the results of the proposed platform. A practical application of the approach is conducted on a New Urban Community in Aswan (Egypt), showing the economic feasibility of the resulting urban form and morphology, and the proposed tool.
Mobile applications in children with cerebral palsy.
Rodríguez Mariblanca, M; Cano de la Cuerda, R
2017-12-21
Cerebral palsy (CP) is one of the most common developmental disorders. Technological development has enabled a transformation of the healthcare sector, which can offer more individualised, participatory, and preventive services. Within this context of new technology applied to the healthcare sector, mobile applications, or apps, constitute a very promising tool for the management of children with CP. The purpose of this article is to perform a systematic review of the information published about various mobile applications either directly related to CP or with potential to be useful in the context of the disease, and to describe, analyse, and classify these applications. A literature search was carried out to gather articles published in English or Spanish between 2011 and 2017 which presented, analysed, or validated applications either specifically designed or potentially useful for CP. Furthermore, a search for mobile applications was conducted in the main mobile application markets. A total of 63 applications were found in biomedical databases and mobile application markets, of which 40 were potentially useful for CP and 23 were specifically designed for the condition (11 for information, 3 for evaluation, and 9 for treatment). There are numerous mobile applications either specifically designed for or with potential to be useful in the field of CP. However, despite the existing scientific evidence, the low methodological quality of scientific articles makes it impossible to generalise the use of these tools. Copyright © 2017 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.
Alternative power supply systems for remote industrial customers
NASA Astrophysics Data System (ADS)
Kharlamova, N. V.; Khalyasmaa, A. I.; Eroshenko, S. A.
2017-06-01
The paper addresses the problem of alternative power supply of remote industrial clusters with renewable electric energy generation. As a result of a comparison of different technologies, consideration is given to wind energy application. The authors present a methodology for calculating mean expected wind generation output, based on the Weibull distribution, which provides an effective express tool for preliminary assessment of the required installed generation capacity. The case study is based on real data, including a database of meteorological information, relief characteristics, power system topology, etc. The estimation of wind generation feasibility for a specific territory is followed by power flow calculations using the Monte Carlo methodology. Finally, the paper provides a set of recommendations to ensure safe and reliable power supply for the final customers and, subsequently, to provide sustainable development of the regions located far from megalopolises and industrial centres.
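The Weibull-based estimate of mean expected output amounts to integrating a turbine power curve against the Weibull wind-speed density, as sketched below. All parameter values (shape, scale, cut-in/rated/cut-out speeds, rated power) are assumed for illustration and are not those of the case study.

```python
import numpy as np

k, c = 2.0, 7.5                           # assumed Weibull shape and scale (m/s) for the site
v_in, v_rated, v_out = 3.0, 12.0, 25.0    # assumed cut-in, rated, and cut-out speeds (m/s)
p_rated = 2.0                             # assumed rated power per turbine (MW)

def power_curve(v):
    """Simplified turbine output: cubic ramp between cut-in and rated speed."""
    ramp = p_rated * (v**3 - v_in**3) / (v_rated**3 - v_in**3)
    p = np.where((v >= v_in) & (v < v_rated), ramp, 0.0)
    return np.where((v >= v_rated) & (v <= v_out), p_rated, p)

def weibull_pdf(v):
    return (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))

# Mean expected output = integral of the power curve weighted by the wind-speed density.
v = np.linspace(0.0, 30.0, 3001)
mean_power = np.sum(power_curve(v) * weibull_pdf(v)) * (v[1] - v[0])
print(f"mean output: {mean_power:.2f} MW, capacity factor: {mean_power / p_rated:.1%}")
```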
Fracture Mechanics for Composites: State of the Art and Challenges
NASA Technical Reports Server (NTRS)
Krueger, Ronald
2006-01-01
Interlaminar fracture mechanics has proven useful for characterizing the onset of delaminations in composites and has been used with limited success, primarily to investigate onset in fracture toughness specimens and laboratory-size coupon-type specimens. Future acceptance of the methodology by industry and certification authorities, however, requires the successful demonstration of the methodology on the structural level. In this paper, the state of the art in fracture toughness characterization and interlaminar fracture mechanics analysis tools is described. To demonstrate the application on the structural level, a stringer-reinforced panel was selected. Full implementation of interlaminar fracture mechanics in design, however, remains a challenge and requires a continuing development effort of codes to calculate energy release rates and advancements in delamination onset and growth criteria under mixed-mode conditions.
Auray-Blais, Christiane; Maranda, Bruno; Lavoie, Pamela
2014-09-25
Creatine synthesis and transport disorders, Triple H syndrome and ornithine transcarbamylase deficiency are treatable inborn errors of metabolism. Early screening of patients was found to be beneficial. Mass spectrometry analysis of specific urinary biomarkers might lead to early detection and treatment in the neonatal period. We developed a high-throughput mass spectrometry methodology applicable to newborn screening using dried urine on filter paper for these aforementioned diseases. A high-throughput methodology was devised for the simultaneous analysis of creatine, guanidineacetic acid, orotic acid, uracil, creatinine and respective internal standards, using both positive and negative electrospray ionization modes, depending on the compound. The precision and accuracy varied by <15%. Stability during storage at different temperatures was confirmed for three weeks. The limits of detection and quantification for each biomarker varied from 0.3 to 6.3 μmol/l and from 1.0 to 20.9 μmol/l, respectively. Analyses of urine specimens from affected patients revealed abnormal results. Targeted biomarkers in urine were detected in the first weeks of life. This rapid, simple and robust liquid chromatography/tandem mass spectrometry methodology is an efficient tool applicable to urine screening for inherited disorders by biochemical laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Holden, Jacob; Van Til, Harrison J; Wood, Eric W
A data-informed model to predict energy use for a proposed vehicle trip has been developed in this paper. The methodology leverages nearly 1 million miles of real-world driving data to generate the estimation model. Driving is categorized at the sub-trip level by average speed, road gradient, and road network geometry, then aggregated by category. An average energy consumption rate is determined for each category, creating an energy rates look-up table. Proposed vehicle trips are then categorized in the same manner, and estimated energy rates are appended from the look-up table. The methodology is robust and applicable to almost any type of driving data. The model has been trained on vehicle global positioning system data from the Transportation Secure Data Center at the National Renewable Energy Laboratory and validated against on-road fuel consumption data from testing in Phoenix, Arizona. The estimation model has demonstrated an error range of 8.6% to 13.8%. The model results can be used to inform control strategies in routing tools, such as change in departure time, alternate routing, and alternate destinations to reduce energy consumption. This work provides a highly extensible framework that allows the model to be tuned to a specific driver or vehicle type.
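A toy version of the categorize-and-aggregate workflow is sketched below: historical sub-trip records are binned by average speed and road gradient to build an energy-rates look-up table, and a proposed trip is then scored from that table. Column names, bin edges, and the sample records are assumptions, not the actual schema or data used in the study.

```python
import pandas as pd

speed_bins = [0, 15, 30, 45, 60, 200]        # mph (assumed bin edges)
grade_bins = [-100, -3, -1, 1, 3, 100]       # percent road gradient (assumed bin edges)

def categorize(df):
    df = df.copy()
    df["speed_cat"] = pd.cut(df["avg_speed_mph"], speed_bins)
    df["grade_cat"] = pd.cut(df["road_grade_pct"], grade_bins)
    return df

# Step 1: build the energy-rates look-up table from historical sub-trip records.
history = pd.DataFrame({
    "avg_speed_mph":  [12, 28, 42, 55, 33, 18],
    "road_grade_pct": [0.5, -2.0, 1.5, 0.2, 2.5, -0.5],
    "kwh_per_mile":   [0.32, 0.24, 0.22, 0.20, 0.27, 0.30],
})
rates = (categorize(history)
         .groupby(["speed_cat", "grade_cat"], observed=True)["kwh_per_mile"].mean())
rate_lookup = rates.to_dict()                # keyed by (speed interval, grade interval)

# Step 2: categorize the sub-trips of a proposed route and append the rates.
proposed = categorize(pd.DataFrame({
    "avg_speed_mph":  [14, 30, 50],
    "road_grade_pct": [0.3, -1.8, 0.1],
    "miles":          [2.0, 5.0, 12.0],
}))
proposed["kwh_per_mile"] = [rate_lookup.get((s, g))
                            for s, g in zip(proposed["speed_cat"], proposed["grade_cat"])]
print("estimated trip energy (kWh):", (proposed["kwh_per_mile"] * proposed["miles"]).sum())
```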
Silva, Carmen; Cabral, João Alexandre; Hughes, Samantha Jane; Santos, Mário
2017-03-01
Worldwide ecological impact assessments of wind farms have gathered relevant information on bat activity patterns. Since conventional bat study methods require intensive field work, the prediction of bat activity might prove useful by anticipating activity patterns and estimating attractiveness concomitant with the wind farm location. A novel framework was developed, based on the stochastic dynamic methodology (StDM) principles, to predict bat activity on mountain ridges with wind farms. We illustrate the framework application using regional data from North Portugal by merging information from several environmental monitoring programmes associated with diverse wind energy facilities that enable integrating the multifactorial influences of meteorological conditions, land cover and geographical variables on bat activity patterns. Output from this innovative methodology can anticipate episodes of exceptional bat activity, which, if correlated with collision probability, can be used to guide wind farm management strategy such as halting wind turbines during hazardous periods. If properly calibrated with regional gradients of environmental variables from mountain ridges with windfarms, the proposed methodology can be used as a complementary tool in environmental impact assessments and ecological monitoring, using predicted bat activity to assist decision making concerning the future location of wind farms and the implementation of effective mitigation measures. Copyright © 2016 Elsevier B.V. All rights reserved.
Colombini, Daniela; Occhipinti, Enrico; Peluso, Raffaele; Montomoli, Loretta
2012-01-01
In August 2009, an international group was founded with the task of developing a "toolkit for MSD prevention" under the IEA and in collaboration with the World Health Organization. In accordance with the ISO standard 11228 series and the new Draft ISO TR 12259, "Application document guides for the potential user", our group developed a preliminary "mapping" methodology for occupational hazards in the craft industry, supported by software (Excel®, free download at: www.epmresearch.org). The possible users of toolkits are: members of health and safety committees; health and safety representatives; line supervisors; foremen; workers; government representatives; health workers providing basic occupational health services; and occupational health and safety specialists. The proposed methodology, using specific key entries and quick assessment criteria, allows a simple identification of ergonomics hazards and estimation of risk to be made. It is thus possible to decide for which occupational hazards a more exhaustive risk assessment will be necessary and which occupational consultant should be involved (occupational physician, safety engineer, industrial hygienist, etc.). The methodology has been applied in different situations in small and medium-sized Italian craft enterprises: leather goods, food, technical dental work, production of artistic ceramics and stained glass, and beekeeping activities. The results are synthetically reported and discussed in this paper.
Community-based early warning systems for flood risk mitigation in Nepal
NASA Astrophysics Data System (ADS)
Smith, Paul J.; Brown, Sarah; Dugar, Sumit
2017-03-01
This paper focuses on the use of community-based early warning systems for flood resilience in Nepal. The first part of the work outlines the evolution and current status of these community-based systems, highlighting the limited lead times currently available for early warning. The second part of the paper focuses on the development of a robust operational flood forecasting methodology for use by the Nepal Department of Hydrology and Meteorology (DHM) to enhance early warning lead times. The methodology uses data-based physically interpretable time series models and data assimilation to generate probabilistic forecasts, which are presented in a simple visual tool. The approach is designed to work in situations of limited data availability with an emphasis on sustainability and appropriate technology. The successful application of the forecast methodology to the flood-prone Karnali River basin in western Nepal is outlined, increasing lead times from 2-3 to 7-8 h. The challenges faced in communicating probabilistic forecasts to the last mile of the existing community-based early warning systems across Nepal are discussed. The paper concludes with an assessment of the applicability of this approach in basins and countries beyond Karnali and Nepal and an overview of key lessons learnt from this initiative.
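As a minimal illustration of the data-based forecasting and data assimilation idea, the sketch below propagates a first-order water-level model with a scalar Kalman update and reports a simple probabilistic interval. The transfer-function coefficients, noise variances, and gauge readings are invented and do not represent the Karnali configuration.

```python
# Assumed first-order model: level[t+1] = a * level[t] + b * upstream[t]
a, b = 0.85, 0.12          # transfer-function coefficients (invented)
q, r = 0.02, 0.05          # process and observation noise variances (invented)

level, p = 1.0, 0.1        # current state estimate (m) and its variance
upstream_obs = [2.1, 2.4, 2.9, 3.3]          # hourly upstream gauge readings (invented)
downstream_obs = [1.15, 1.30, None, None]    # downstream readings (None = not yet observed)

for t, (u, y) in enumerate(zip(upstream_obs, downstream_obs), start=1):
    # Forecast step: propagate the state and its uncertainty one hour ahead.
    level = a * level + b * u
    p = a * a * p + q
    if y is not None:
        # Data assimilation step: scalar Kalman update with the new observation.
        gain = p / (p + r)
        level += gain * (y - level)
        p *= (1 - gain)
    lo, hi = level - 1.64 * p ** 0.5, level + 1.64 * p ** 0.5
    print(f"t+{t} h: forecast level {level:.2f} m (90% interval {lo:.2f}-{hi:.2f} m)")
```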
Assessing mobile health applications with twitter analytics.
Pai, Rajesh R; Alathur, Sreejith
2018-05-01
Advancement in the field of information technology and the rise in the use of the Internet have changed people's lives by enabling various services online. In recent times, the healthcare sector, which faces its own service delivery challenges, has started promoting and using mobile health applications with the intention of cutting costs and making care accessible and affordable to the people. The objective of the study is to perform sentiment analysis using Twitter data to measure the perception and use of various mobile health applications among citizens. The methodology followed in this research is qualitative, with the data extracted from the social networking site Twitter through the tool RStudio. This tool, with the help of the Twitter Application Programming Interface, requested one thousand tweets each for four different phrases related to mobile health applications (apps): "fitness app", "diabetes app", "meditation app", and "cancer app". Based on the tweets, sentiment analysis was carried out, and polarity and emotions were measured. Except for the cancer app, there exists a positive polarity towards the fitness, diabetes, and meditation apps among users. Following a systems thinking approach for the results, this paper also explains the causal relationships between the accessibility and acceptability of mobile health applications, which helps healthcare facilities and application developers in understanding and analyzing the dynamics involved in adopting a new system or modifying an existing one. Copyright © 2018 Elsevier B.V. All rights reserved.
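A minimal lexicon-based polarity tally in the spirit of the study's sentiment analysis is sketched below (in Python rather than the RStudio workflow the authors used); the tiny lexicon and sample tweets are invented, and collecting real tweets would require authenticated access to the Twitter Application Programming Interface.

```python
# Invented lexicon and tweets; real data collection needs Twitter API credentials.
positive = {"love", "great", "helpful", "easy", "motivating"}
negative = {"hate", "confusing", "crashes", "useless", "worried"}

tweets = {
    "fitness app":    ["love this fitness app great progress tracking",
                       "the fitness app crashes sometimes"],
    "meditation app": ["meditation app is so helpful and easy to follow"],
}

for phrase, messages in tweets.items():
    score = 0
    for message in messages:
        words = set(message.lower().split())
        score += len(words & positive) - len(words & negative)
    polarity = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    print(f"{phrase}: score {score:+d} ({polarity})")
```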
MPEG-4 solutions for virtualizing RDP-based applications
NASA Astrophysics Data System (ADS)
Joveski, Bojan; Mitrea, Mihai; Ganji, Rama-Rao
2014-02-01
The present paper provides the proof of concept for the use of the MPEG-4 multimedia scene representations (BiFS and LASeR) as a virtualization tool for RDP-based applications (e.g. MS Windows applications). Two main applicative benefits are thus granted. First, any legacy application can be virtualized without additional programming effort. Second, heterogeneous mobile devices (different manufacturers, OS) can collaboratively enjoy full multimedia experiences. From the methodological point of view, the main novelty consists in (1) designing an architecture allowing the conversion of the RDP content into a semantic multimedia scene-graph and its subsequent rendering on the client and (2) providing the underlying scene-graph management and interactivity tools. Experiments consider 5 users and two RDP applications (MS Word and Internet Explorer), and benchmark our solution against two state-of-the-art technologies (VNC and FreeRDP). The visual quality is evaluated by six objective measures (e.g. PSNR<37dB, SSIM<0.99). The network traffic evaluation shows that: (1) for text editing, the MPEG-based solutions outperform VNC by a factor of 1.8 while being 2 times heavier than FreeRDP; (2) for Internet browsing, the MPEG solutions outperform both VNC and FreeRDP by factors of 1.9 and 1.5, respectively. The average round-trip times (less than 40 ms) cope with real-time application constraints.
NASA Astrophysics Data System (ADS)
Doss, Derek J.; Heiselman, Jon S.; Collins, Jarrod A.; Weis, Jared A.; Clements, Logan W.; Geevarghese, Sunil K.; Miga, Michael I.
2017-03-01
Sparse surface digitization with an optically tracked stylus for use in an organ surface-based image-to-physical registration is an established approach for image-guided open liver surgery procedures. However, variability in sparse data collections during open hepatic procedures can produce disparity in registration alignments. In part, this variability arises from inconsistencies with the patterns and fidelity of collected intraoperative data. The liver lacks distinct landmarks and experiences considerable soft tissue deformation. Furthermore, data coverage of the organ is often incomplete or unevenly distributed. While more robust feature-based registration methodologies have been developed for image-guided liver surgery, it is still unclear how variation in sparse intraoperative data affects registration. In this work, we have developed an application to allow surgeons to study the performance of surface digitization patterns on registration. Given the intrinsic nature of soft-tissue, we incorporate realistic organ deformation when assessing fidelity of a rigid registration methodology. We report the construction of our application and preliminary registration results using four participants. Our preliminary results indicate that registration quality improves as users acquire more experience selecting patterns of sparse intraoperative surface data.
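One building block of such surface-based image-to-physical alignment is a least-squares rigid fit between corresponding point sets. The sketch below applies the standard Kabsch/Arun solution to synthetic points; it is not the registration pipeline described above, and the simulated transform and noise level are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(3)
image_pts = rng.uniform(0, 100, size=(50, 3))   # synthetic points on the preoperative model (mm)

# Simulate "physical space" points: a known rotation + translation + digitization noise.
theta = np.radians(12.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -3.0, 8.0])
physical_pts = image_pts @ R_true.T + t_true + rng.normal(0.0, 0.5, image_pts.shape)

# Kabsch/Arun: find R, t minimizing ||R * image + t - physical|| over corresponding points.
mu_i, mu_p = image_pts.mean(axis=0), physical_pts.mean(axis=0)
H = (image_pts - mu_i).T @ (physical_pts - mu_p)
U, _, Vt = np.linalg.svd(H)
D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])   # guard against reflections
R = Vt.T @ D @ U.T
t = mu_p - R @ mu_i

residuals = np.linalg.norm(image_pts @ R.T + t - physical_pts, axis=1)
print(f"mean registration error: {residuals.mean():.2f} mm")
```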
Boronat, F; Budia, A; Broseta, E; Ruiz-Cerdá, J L; Vivas-Consuelo, D
To describe the application of the Lean methodology as a method for continuously improving the efficiency of a urology department in a tertiary hospital. The implementation of the Lean Healthcare methodology in a urology department was conducted in 3 phases: 1) team training and improvement of feedback among the practitioners, 2) management by process and superspecialisation and 3) improvement of indicators (continuous improvement). The indicators were obtained from the Hospital's information systems. The main source of information was the Balanced Scorecard for health systems management (CUIDISS). The comparison with other urology departments at the autonomous-community and national level was performed through the same platform with the help of the Hospital's records department (IASIST). A baseline was established with the indicators obtained in 2011 for the comparative analysis of the results after implementing the Lean Healthcare methodology. The implementation of this methodology translated into high practitioner satisfaction and improved quality indicators, reaching a risk-adjusted complication index (RACI) of 0.59 and a risk-adjusted mortality rate (RAMR) of 0.24 in 4 years. A value of 0.61 was reached with the efficiency indicator (risk-adjusted length of stay [RALOS] index), with a savings of 2869 stays compared with national benchmarking (IASIST). The risk-adjusted readmissions index (RARI) was the only indicator above the standard, with a value of 1.36, but with progressive annual improvement. The Lean methodology can be effectively applied to a urology department of a tertiary hospital to improve efficiency, obtaining significant and continuous improvements in all its indicators, as well as practitioner satisfaction. Team training, management by process, continuous improvement and delegation of responsibilities have been shown to be the fundamental pillars of this methodology. Copyright © 2017 AEU. Publicado por Elsevier España, S.L.U. All rights reserved.
Investigating surety methodologies for cognitive systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Caudell, Thomas P.; Peercy, David Eugene; Mills, Kristy
2006-11-01
Advances in cognitive science provide a foundation for new tools that promise to advance human capabilities with significant positive impacts. As with any new technology breakthrough, associated technical and non-technical risks are involved. Sandia has mitigated both technical and non-technical risks by applying advanced surety methodologies in such areas as nuclear weapons, nuclear reactor safety, nuclear materials transport, and energy systems. In order to apply surety to the development of cognitive systems, we must understand the concepts and principles that characterize the certainty of a system's operation as well as the risk areas of cognitive sciences. This SAND report documents a preliminary spectrum of risks involved with cognitive sciences, and identifies some surety methodologies that can be applied to potentially mitigate such risks. Some potential areas for further study are recommended. In particular, a recommendation is made to develop a cognitive systems epistemology framework for more detailed study of these risk areas and applications of surety methods and techniques.
NASA Astrophysics Data System (ADS)
Ferreira, L. E. T.; Vareda, L. V.; Hanai, J. B.; Sousa, J. L. A. O.; Silva, A. I.
2017-05-01
A modal dynamic analysis is used as the tool to evaluate the fracture toughness of concrete from the results of notched-through beam tests. The dimensionless functions describing the relation between the frequencies and specimen geometry, used for identifying the variation in the natural frequency as a function of crack depth, are first determined for a 150 × 150 × 500-mm notched-through specimen. The frequency decrease resulting from the propagating crack is modeled through a modal/fracture mechanics approach, leading to determination of an effective crack length. This length, obtained numerically, is used to evaluate the fracture toughness of concrete, the critical crack mouth opening displacement, and the proposed brittleness index. The methodology is applied to tests performed on high-strength concrete specimens. The frequency response for each specimen is evaluated before and after each crack propagation step. The methodology is then validated by comparison with results from the application of other methodologies described in the literature and suggested by RILEM.
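The paper determines its frequency-versus-crack-depth functions numerically; as a rough illustration of how such a calibration curve can be inverted to an effective crack length from a measured frequency drop, consider the sketch below. The calibration values are placeholders, not the paper's results.

```python
import numpy as np

# Hypothetical calibration: relative crack depth a/W versus normalized
# first natural frequency f/f0 for a notched-through beam (placeholder values).
alpha = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])              # a/W
freq_ratio = np.array([1.00, 0.97, 0.92, 0.85, 0.76, 0.65])   # f/f0, decreasing

def effective_crack_depth(measured_ratio):
    """Invert the monotonically decreasing calibration curve by interpolation."""
    # np.interp needs increasing x, so interpolate on the reversed arrays.
    return float(np.interp(measured_ratio, freq_ratio[::-1], alpha[::-1]))

print(effective_crack_depth(0.88))  # a/W corresponding to a 12% frequency drop
```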
Data-Driven Simulation-Enhanced Optimization of People-Based Print Production Service
NASA Astrophysics Data System (ADS)
Rai, Sudhendu
This paper describes a systematic six-step data-driven simulation-based methodology for optimizing people-based service systems that operate on a large distributed scale and exhibit high variety and variability. The methodology is exemplified through its application within the printing services industry, where it has been successfully deployed by Xerox Corporation across small, mid-sized and large print shops, generating over $250 million in profits across the customer value chain. Each step of the methodology is described in detail: co-development and testing of innovative concepts in partnership with customers; development of software and hardware tools to implement the innovative concepts; establishment of work processes and practices for customer engagement and service implementation; creation of training and infrastructure for large-scale deployment; integration of the innovative offering within the framework of existing corporate offerings; and, lastly, monitoring and deployment of the financial and operational metrics for estimating the return on investment and continually renewing the offering.
Muñoz-Colmenero, Marta; Martínez, Jose Luis; Roca, Agustín; Garcia-Vazquez, Eva
2017-01-01
Next Generation Sequencing methodologies are considered the next step within DNA-based methods, and their applicability in different fields is being evaluated. Here, we tested the usefulness of the Ion Torrent Personal Genome Machine (PGM) in food traceability, analyzing candies as a model of highly processed foods, and compared the results with those obtained by PCR-cloning-sequencing (PCR-CS). The majority of samples exhibited consistency between methodologies, yielding more information and species per product from the PGM platform than from PCR-CS. Significantly higher AT-content in sequences of the same species was also obtained from PGM. This, together with some taxonomical discrepancies between methodologies, suggests that the PGM platform is still premature for use in food traceability of complex, highly processed products. It could be a good option for analysis of less complex foods, saving time and cost per sample. Copyright © 2016 Elsevier Ltd. All rights reserved.
The multi-copy simultaneous search methodology: a fundamental tool for structure-based drug design.
Schubert, Christian R; Stultz, Collin M
2009-08-01
Fragment-based ligand design approaches, such as the multi-copy simultaneous search (MCSS) methodology, have proven to be useful tools in the search for novel therapeutic compounds that bind pre-specified targets of known structure. MCSS offers a variety of advantages over more traditional high-throughput screening methods, and has been applied successfully to challenging targets. The methodology is quite general and can be used to construct functionality maps for proteins, DNA, and RNA. In this review, we describe the main aspects of the MCSS method and outline the general use of the methodology as a fundamental tool to guide the design of de novo lead compounds. We focus our discussion on the evaluation of MCSS results and the incorporation of protein flexibility into the methodology. In addition, we demonstrate on several specific examples how the information arising from the MCSS functionality maps has been successfully used to predict ligand binding to protein targets and RNA.
Ju, Feng; Zhang, Tong
2015-11-03
Recent advances in DNA sequencing technologies have prompted the widespread application of metagenomics for the investigation of novel bioresources (e.g., industrial enzymes and bioactive molecules) and unknown biohazards (e.g., pathogens and antibiotic resistance genes) in natural and engineered microbial systems across multiple disciplines. This review discusses the rigorous experimental design and sample preparation in the context of applying metagenomics in environmental sciences and biotechnology. Moreover, this review summarizes the principles, methodologies, and state-of-the-art bioinformatics procedures, tools and database resources for metagenomics applications and discusses two popular strategies (analysis of unassembled reads versus assembled contigs/draft genomes) for quantitative or qualitative insights of microbial community structure and functions. Overall, this review aims to facilitate more extensive application of metagenomics in the investigation of uncultured microorganisms, novel enzymes, microbe-environment interactions, and biohazards in biotechnological applications where microbial communities are engineered for bioenergy production, wastewater treatment, and bioremediation.
A Healthy Lifestyle Intervention Application.
Tufte, Trond; Babic, Ankica
2017-01-01
In this project, an mHealth tool for smartphones has been developed using the Design Science methodology, with the goal of promoting an active lifestyle. This was undertaken by implementing social and physical activity stimulating features within the application MoveFit. Users can opt to utilize just a feature or two, or engage in social activities of different intensity. Regular and expert users evaluated the application in order to meet usability requirements. In addition, a field expert and a focus group contributed to assessing the application's potential to increase physical activity. The app collected enough data to document its effect; it was possible to demonstrate that the app was capable of promoting physical activity. User testing also showed appreciation of the various features, such as social networking, activity monitoring, and route/activity creation.
The nanomaterial toolkit for neuroengineering
NASA Astrophysics Data System (ADS)
Shah, Shreyas
2016-10-01
There is a growing interest in developing effective tools to better probe the central nervous system (CNS), to understand how it works and to treat neural diseases, injuries and cancer. The intrinsic complexity of the CNS has made this a challenging task for decades. Yet, with the extraordinary recent advances in nanotechnology and nanoscience, there is a general consensus on the immense value and potential of nanoscale tools for engineering neural systems. In this review, an overview of specialized nanomaterials which have proven to be the most effective tools in neuroscience is provided. After a brief background on the prominent challenges in the field, a variety of organic and inorganic-based nanomaterials are described, with particular emphasis on the distinctive properties that make them versatile and highly suitable in the context of the CNS. Building on this robust nano-inspired foundation, the rational design and application of nanomaterials can enable the generation of new methodologies to greatly advance the neuroscience frontier.
Computer assisted screening, correction, and analysis of historical weather measurements
NASA Astrophysics Data System (ADS)
Burnette, Dorian J.; Stahle, David W.
2013-04-01
A computer program, Historical Observation Tools (HOB Tools), has been developed to facilitate many of the calculations used by historical climatologists to develop instrumental and documentary temperature and precipitation datasets and makes them readily accessible to other researchers. The primitive methodology used by the early weather observers makes the application of standard techniques difficult. HOB Tools provides a step-by-step framework to visually and statistically assess, adjust, and reconstruct historical temperature and precipitation datasets. These routines include the ability to check for undocumented discontinuities, adjust temperature data for poor thermometer exposures and diurnal averaging, and assess and adjust daily precipitation data for undercount. This paper provides an overview of the Visual Basic.NET program and a demonstration of how it can assist in the development of extended temperature and precipitation datasets using modern and early instrumental measurements from the United States.
Developments in the Tools and Methodologies of Synthetic Biology
Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul
2014-01-01
Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788
Tsang, Michael P; Kikuchi-Uehara, Emi; Sonnemann, Guido W; Aymonier, Cyril; Hirao, Masahiko
2017-08-04
It has been some 15 years since the topics of sustainability and nanotechnologies first appeared together in the scientific literature and became a focus of organizations' research and policy developments. On the one hand, this focus is directed towards approaches and tools for risk assessment and management and on the other hand towards life-cycle thinking and assessment. Comparable to their application for regular chemicals, each tool is seen to serve separate objectives as it relates to evaluating nanotechnologies' safety or resource efficiency, respectively. While nanomaterials may provide resource efficient production and consumption, this must balance any potential hazards they pose across their life-cycles. This Perspective advocates for integrating these two tools at the methodological level for achieving this objective, and it explains what advantages and challenges this offers decision-makers while highlighting what research is needed to further enhance integration.
NASAL-Geom, a free upper respiratory tract 3D model reconstruction software
NASA Astrophysics Data System (ADS)
Cercos-Pita, J. L.; Cal, I. R.; Duque, D.; de Moreta, G. Sanjuán
2018-02-01
The tool NASAL-Geom, a free upper respiratory tract 3D model reconstruction software, is described here. As free software, researchers and professionals are welcome to obtain, analyze, improve and redistribute it, potentially increasing the rate of development and at the same time reducing ethical conflicts regarding medical applications that cannot be analyzed. Additionally, the tool has been optimized for the specific task of reading upper respiratory tract Computerized Tomography scans and producing 3D geometries. The reconstruction process is divided into three stages: preprocessing (including Metal Artifact Reduction, noise removal, and feature enhancement), segmentation (where the nasal cavity is identified), and 3D geometry reconstruction. The tool has been automated (i.e., no human intervention is required), a critical feature for avoiding bias in the reconstructed geometries. The applied methodology is discussed, as well as the program's robustness and precision.
NASA Astrophysics Data System (ADS)
Aubert, A. H.; Schnepel, O.; Kraft, P.; Houska, T.; Plesca, I.; Orlowski, N.; Breuer, L.
2015-11-01
This paper addresses education and communication in hydrology and geosciences. Many approaches can be used, such as well-known seminars, modelling exercises and practical field work, but outdoor learning in our discipline is a must, and this paper focuses on the recent development of a new outdoor learning tool at the landscape scale. To facilitate improved teaching and hands-on experience, we designed the Studienlandschaft Schwingbachtal. The site, equipped with field instrumentation, education trails, and geocaches, has now been complemented with an augmented reality app that adds virtual teaching objects to the real landscape. The app development is detailed, to serve as a methodology for people wishing to implement such a tool. The resulting application, namely the Schwingbachtal App, is described as an example. We conclude that such an app is useful for communication and education purposes, making learning pleasant and offering personalized options.
Program and Project Management Framework
NASA Technical Reports Server (NTRS)
Butler, Cassandra D.
2002-01-01
The primary objective of this project was to develop a framework and system architecture for integrating program and project management tools that may be applied consistently throughout Kennedy Space Center (KSC) to optimize planning, cost estimating, risk management, and project control. Project management methodology used in building interactive systems to accommodate the needs of the project managers is applied as a key component in assessing the usefulness and applicability of the framework and tools developed. Research for the project included investigation and analysis of industrial practices, KSC standards, policies, and techniques, Systems Management Office (SMO) personnel, and other documented experiences of project management experts. In addition, this project documents best practices derived from the literature as well as new or developing project management models, practices, and techniques.
NASA Technical Reports Server (NTRS)
Kerr, Andrew W.
1990-01-01
The utilization of advanced simulation technology in the development of the non-real-time MANPRINT design tools in the Army/NASA Aircrew-Aircraft Integration (A3I) program is described. A description is then given of the Crew Station Research and Development Facilities, the primary tool for the application of MANPRINT principles. The purpose of the A3I program is to develop a rational, predictive methodology for helicopter cockpit system design that integrates human factors engineering with other principles at an early stage in the development process, avoiding the high cost of previous system design methods. Enabling technologies such as the MIDAS work station are examined, and the potential of low-cost parallel-processing systems is indicated.
Biosecurity Risk Assessment Methodology (BioRAM) v. 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
CASKEY, SUSAN; GAUDIOSO, JENNIFER; SALERNO, REYNOLDS
Sandia National Laboratories International Biological Threat Reduction Dept (SNL/IBTR) has an ongoing mission to enhance biosecurity assessment methodologies, tools, and guidance. These will aid labs seeking to implement biosecurity as advocated in the WHO's recently released Biorisk Management: Lab Biosecurity Guidance. BioRAM 2.0 is the software tool initially developed using the SNL LDRD process and designed to complement the "Laboratory Biosecurity Risk Handbook," written by Ren Salerno and Jennifer Gaudioso, which defines biosecurity risk assessment methodologies.
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
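For orientation, a minimal sketch of the conventional nested (double-loop) reliability based design optimization formulation — the baseline against which the unilevel and decoupled methods are compared — is given below. The limit state, the distributions, the target failure probability, and the one-dimensional design search are illustrative assumptions, not the test problems of this work.

```python
import numpy as np

rng = np.random.default_rng(0)
P_TARGET = 0.01          # assumed allowable failure probability
N_MC = 20_000            # Monte Carlo samples for the inner reliability loop

def cost(d):
    """Assumed deterministic objective: minimize a sizing variable d."""
    return d

def failure_probability(d):
    """Inner loop: estimate P[g < 0] for an assumed limit state g = capacity - load,
    where capacity scales with the design variable d and both quantities are random."""
    capacity = d * rng.normal(1.0, 0.05, N_MC)   # random strength factor
    load = rng.normal(1.0, 0.20, N_MC)           # random demand
    return float(np.mean(capacity - load < 0.0))

# Outer loop: crude one-dimensional search for the cheapest design that meets
# the reliability constraint (a real RBDO code would use a gradient-based optimizer).
candidates = np.linspace(1.0, 2.0, 41)
feasible = [d for d in candidates if failure_probability(d) <= P_TARGET]
best = min(feasible, key=cost) if feasible else None
print(f"smallest reliable design: d = {best:.3f}")
```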
Ceramic tools insert assesment based on vickers indentation methodology
NASA Astrophysics Data System (ADS)
Husni; Rizal, Muhammad; Aziz M, M.; Wahyu, M.
2018-05-01
In interrupted cutting, the risk of tool chipping or fracture is higher than in continuous cutting. Therefore, the selection of suitable ceramic tools for interrupted cutting applications becomes an important issue in assuring that the cutting process runs effectively. At present, the performance of ceramic tools is assessed by conducting cutting tests, which are time- and cost-consuming. In this study, the performance of ceramic tools was evaluated using a hardness tester. The technique has certain advantages compared with more conventional methods: the experiment is straightforward, involving minimal specimen preparation, and the amount of material needed is small. Three types of ceramic tool inserts, AS10, CC650 and K090, were used; each tool was polished and Vickers indentation tests were performed at loads of 0.2, 0.5, 1, 2.5, 5 and 10 kgf. The results revealed that, among the loads used in the tests, an indentation load of 5 kgf consistently produced well-defined cracks. Among the cutting tools tested, AS10 produced the shortest crack length, followed by CC650 and K090. The shortest crack length, for AS10, indicates that this insert has the highest dynamic load resistance among those tested.
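The abstract does not state which indentation-fracture relation was used to turn load and crack length into a toughness estimate; one widely used Anstis-type relation is sketched below, with the material constants chosen purely for illustration.

```python
import math

def vickers_fracture_toughness(load_kgf, crack_length_um, E_GPa, H_GPa):
    """Anstis-type indentation fracture toughness estimate:
        Kc = 0.016 * sqrt(E/H) * P / c**1.5
    with P the indentation load (N) and c the radial crack length measured
    from the indent centre (m). Returns Kc in MPa*m^0.5."""
    P = load_kgf * 9.80665            # kgf -> N
    c = crack_length_um * 1e-6        # micrometres -> m
    Kc_Pa = 0.016 * math.sqrt(E_GPa / H_GPa) * P / c**1.5
    return Kc_Pa / 1e6                # Pa*m^0.5 -> MPa*m^0.5

# Illustrative numbers only (not the paper's measurements):
print(vickers_fracture_toughness(load_kgf=5, crack_length_um=150, E_GPa=400, H_GPa=17))
```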
The Application of Ultrasonic Inspection to Crimped Electrical Connections
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott; Perey, Daniel F.; Yost, William T.
2010-01-01
The development of a new ultrasonic measurement technique to quantitatively assess wire crimp terminations is discussed. The development of a prototype instrument, based on a modified, commercially available, crimp tool, is demonstrated for applying this technique when wire crimps are installed. The crimp tool has three separate crimping locations that accommodate the three different ferrule diameters. The crimp tool in this study is capable of crimping wire diameters ranging from 12 to 26 American Wire Gauge (AWG). A transducer design is presented that allows for interrogation of each of the three crimp locations on the crimp tool without reconfiguring the device. An analysis methodology, based on transmitted ultrasonic energy and timing of the first received pulse is shown to correlate to both crimp location in the tool and the AWG of the crimp/ferrule combination. The detectability of a number of the crimp failure pathologies, such as missing strands, partially inserted wires and incomplete crimp compression, is discussed. A wave propagation model, solved by finite element analysis, describes the compressional ultrasonic wave propagation through the junction during the crimping process.
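As a rough sketch of the two waveform features the analysis relies on — transmitted ultrasonic energy and the timing of the first received pulse — the following assumes a digitized received signal and a simple threshold-crossing arrival detector. The threshold fraction and sampling rate are assumptions, not the instrument's actual parameters.

```python
import numpy as np

def crimp_signal_features(waveform, fs, threshold_fraction=0.1):
    """Return (energy, time_of_flight) for a received ultrasonic waveform.

    energy         -- sum of squared samples (proportional to transmitted energy)
    time_of_flight -- time of the first sample whose absolute amplitude exceeds
                      threshold_fraction of the peak amplitude, in seconds
    """
    w = np.asarray(waveform, dtype=float)
    energy = float(np.sum(w ** 2))
    threshold = threshold_fraction * np.max(np.abs(w))
    first_idx = int(np.argmax(np.abs(w) >= threshold))   # first threshold crossing
    return energy, first_idx / fs
```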
NASA Technical Reports Server (NTRS)
Fatig, Michael
1993-01-01
Flight operations and the preparation for them have become increasingly complex as mission complexity increases. Further, the mission model dictates that a significant increase in flight operations activities is upon us. Finally, there is a need for process improvement and economy in the operations arena. It is therefore time that we recognize flight operations as a complex process requiring a defined, structured, life-cycle approach vitally linked to space segment, ground segment, and science operations processes. With this recognition, an FOT Tool Kit consisting of six major components was developed to provide tools that guide flight operations activities throughout the mission life cycle. The major components of the FOT Tool Kit and the concepts behind the flight operations life cycle process, as developed at NASA's GSFC for GSFC-based missions, are addressed. The Tool Kit is intended to improve the productivity, quality, cost, and schedule performance of flight operations tasks through the use of documented, structured methodologies; knowledge of past lessons learned and upcoming new technology; and reuse and sharing of key products and special application programs made possible through the development of standardized key products and special program directories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bri Rolston
2005-06-01
Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist and are easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of what exploit technology and attack methodologies are being developed in the Information Technology (IT) security research community, within both the black hat and white hat communities. Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between those threats and the defensive capabilities of control systems can be analyzed. The results of the gap analysis drive changes in the cyber security of critical infrastructure networks to close the gap between current exploits and existing defenses. The analysis also provides defenders with an idea of how threat technology is evolving and how defenses will need to be modified to address these emerging trends.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Tianzhen; Yan, Da; D'Oca, Simona
Occupant behavior has significant impacts on building energy performance and occupant comfort. However, occupant behavior is not well understood and is often oversimplified in the building life cycle, due to its stochastic, diverse, complex, and interdisciplinary nature. The use of simplified methods or tools to quantify the impacts of occupant behavior in building performance simulations significantly contributes to performance gaps between simulated models and actual building energy consumption. Therefore, it is crucial to understand occupant behavior in a comprehensive way, integrating qualitative approaches and data- and model-driven quantitative approaches, and employing appropriate tools to guide the design and operation of low-energy residential and commercial buildings that integrate technological and human dimensions. This paper presents ten questions, highlighting some of the most important issues regarding concepts, applications, and methodologies in occupant behavior research. The proposed questions and answers aim to provide insights into occupant behavior for current and future researchers, designers, and policy makers, and most importantly, to inspire innovative research and applications to increase energy efficiency and reduce energy use in buildings.
Factors that impact nurses' use of electronic mail (e-mail).
Hughes, J A; Pakieser, R A
1999-01-01
As electronic applications are used increasingly in healthcare, nurses are being challenged to adopt them. Electronic mail (e-mail) is an electronic tool with general as well as healthcare uses. E-mail use may be an opportunity to learn a tool that requires skills similar to those used in other applications. This study aimed to identify barriers and facilitators that impact nurses' use of e-mail in the workplace. Data for this study were gathered using focus group methodology. Content analysis identified and labeled factors into seven major categories. Specific factors identified were generally consistent with those previously described in the literature as affecting use of computers in general. However, there were several additional factors identified that were not reported in other previous studies: lack of face-to-face communication, individual writing skills, recency of any educational experience, volume of mail received, password integrity, and technical support. Findings from this study provide information for any individual involved in introducing or updating an e-mail system in a healthcare environment.
ON IDENTIFIABILITY OF NONLINEAR ODE MODELS AND APPLICATIONS IN VIRAL DYNAMICS
MIAO, HONGYU; XIA, XIAOHUA; PERELSON, ALAN S.; WU, HULIN
2011-01-01
Ordinary differential equations (ODE) are a powerful tool for modeling dynamic processes with wide applications in a variety of scientific fields. Over the last 2 decades, ODEs have also emerged as a prevailing tool in various biomedical research fields, especially in infectious disease modeling. In practice, it is important and necessary to determine unknown parameters in ODE models based on experimental data. Identifiability analysis is the first step in determining unknown parameters in ODE models, and such analysis techniques for nonlinear ODE models are still under development. In this article, we review identifiability analysis methodologies for nonlinear ODE models developed in the past one to two decades, including structural identifiability analysis, practical identifiability analysis and sensitivity-based identifiability analysis. Some advanced topics and ongoing research are also briefly reviewed. Finally, some examples from modeling viral dynamics of HIV, influenza and hepatitis viruses are given to illustrate how to apply these identifiability analysis methods in practice. PMID:21785515
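As a concrete illustration of the sensitivity-based identifiability analysis reviewed here, the sketch below builds an (unscaled) Fisher information matrix from finite-difference output sensitivities and inspects its rank and conditioning; the toy exponential-decay model stands in for a viral dynamics ODE and is purely an assumption.

```python
import numpy as np

def model_output(params, t):
    """Toy model y(t) = A * exp(-k * t), used only for illustration."""
    A, k = params
    return A * np.exp(-k * t)

def fisher_information(params, t, rel_step=1e-6):
    """Finite-difference sensitivity matrix S (n_times x n_params) and F = S^T S
    (up to the measurement-noise scaling)."""
    p = np.asarray(params, dtype=float)
    y0 = model_output(p, t)
    S = np.zeros((t.size, p.size))
    for j in range(p.size):
        dp = np.zeros_like(p)
        dp[j] = rel_step * max(abs(p[j]), 1.0)
        S[:, j] = (model_output(p + dp, t) - y0) / dp[j]
    return S.T @ S

t = np.linspace(0.0, 10.0, 50)
F = fisher_information([2.0, 0.3], t)
# Rank deficiency or a very large condition number flags non-identifiable parameters.
print("rank:", np.linalg.matrix_rank(F), "condition number:", np.linalg.cond(F))
```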
Recognition-mediated activation of therapeutic gold nanoparticles inside living cells
NASA Astrophysics Data System (ADS)
Kim, Chaekyu; Agasti, Sarit S.; Zhu, Zhengjiang; Isaacs, Lyle; Rotello, Vincent M.
2010-11-01
Supramolecular chemistry provides a versatile tool for the organization of molecular systems into functional structures and the actuation of these assemblies for applications through the reversible association between complementary components. Use of this methodology in living systems, however, represents a significant challenge owing to the chemical complexity of cellular environments and lack of selectivity of conventional supramolecular interactions. Herein, we present a host-guest system featuring diaminohexane-terminated gold nanoparticles (AuNP-NH2) and complementary cucurbit[7]uril (CB[7]). In this system, threading of CB[7] on the particle surface reduces the cytotoxicity of AuNP-NH2 through sequestration of the particle in endosomes. Intracellular triggering of the therapeutic effect of AuNP-NH2 was then achieved through the administration of 1-adamantylamine (ADA), removing CB[7] from the nanoparticle surface, causing the endosomal release and concomitant in situ cytotoxicity of AuNP-NH2. This supramolecular strategy for intracellular activation provides a new tool for potential therapeutic applications.
Shaikh, Faiq; Franc, Benjamin; Allen, Erastus; Sala, Evis; Awan, Omer; Hendrata, Kenneth; Halabi, Safwan; Mohiuddin, Sohaib; Malik, Sana; Hadley, Dexter; Shrestha, Rasu
2018-03-01
Enterprise imaging has channeled various technological innovations to the field of clinical radiology, ranging from advanced imaging equipment and postacquisition iterative reconstruction tools to image analysis and computer-aided detection tools. More recently, the advancement in the field of quantitative image analysis coupled with machine learning-based data analytics, classification, and integration has ushered in the era of radiomics, a paradigm shift that holds tremendous potential in clinical decision support as well as drug discovery. However, there are important issues to consider to incorporate radiomics into a clinically applicable system and a commercially viable solution. In this two-part series, we offer insights into the development of the translational pipeline for radiomics from methodology to clinical implementation (Part 1) and from that point to enterprise development (Part 2). In Part 2 of this two-part series, we study the components of the strategy pipeline, from clinical implementation to building enterprise solutions. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
Belanger, Scott; Barron, Mace; Craig, Peter; Dyer, Scott; Galay-Burgos, Malyka; Hamer, Mick; Marshall, Stuart; Posthuma, Leo; Raimondo, Sandy; Whitehouse, Paul
2017-07-01
A species sensitivity distribution (SSD) is a probability model of the variation of species sensitivities to a stressor, in particular chemical exposure. The SSD approach has been used as a decision support tool in environmental protection and management since the 1980s, and the ecotoxicological, statistical, and regulatory basis and applications continue to evolve. This article summarizes the findings of a 2014 workshop held by the European Centre for Toxicology and Ecotoxicology of Chemicals and the UK Environment Agency in Amsterdam, The Netherlands, on the ecological relevance, statistical basis, and regulatory applications of SSDs. An array of research recommendations categorized under the topical areas of use of SSDs, ecological considerations, guideline considerations, method development and validation, toxicity data, mechanistic understanding, and uncertainty were identified and prioritized. A rationale for the most critical research needs identified in the workshop is provided. The workshop reviewed the technical basis and historical development and application of SSDs, described approaches to estimating generic and scenario-specific SSD-based thresholds, evaluated utility and application of SSDs as diagnostic tools, and presented new statistical approaches to formulate SSDs. Collectively, these address many of the research needs to expand and improve their application. The highest priority work, from a pragmatic regulatory point of view, is to develop a guidance of best practices that could act as a basis for global harmonization and discussions regarding the SSD methodology and tools. Integr Environ Assess Manag 2017;13:664-674. © 2016 SETAC. © 2016 SETAC.
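In its most common form, an SSD is fitted as a log-normal distribution of single-species toxicity endpoints, from which a hazardous concentration for 5% of species (HC5) or a potentially affected fraction at a given exposure can be read off. The sketch below assumes log-normality and uses invented endpoint values; SciPy's normal distribution is used only for the percentile calculations.

```python
import numpy as np
from scipy import stats

# Assumed single-species toxicity endpoints (e.g., EC50s in ug/L); illustrative only.
endpoints = np.array([12.0, 35.0, 48.0, 60.0, 110.0, 150.0, 300.0, 520.0])

log_endpoints = np.log10(endpoints)
mu, sigma = log_endpoints.mean(), log_endpoints.std(ddof=1)

# HC5: concentration expected to exceed the sensitivity of only 5% of species.
hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)

def potentially_affected_fraction(conc):
    """Fraction of species whose sensitivity lies below the exposure concentration."""
    return stats.norm.cdf(np.log10(conc), loc=mu, scale=sigma)

print(f"HC5 ~ {hc5:.1f} ug/L, PAF at 100 ug/L ~ {potentially_affected_fraction(100):.2f}")
```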
Banerjee, Shyamashree; Gupta, Parth Sarthi Sen; Nayek, Arnab; Das, Sunit; Sur, Vishma Pratap; Seth, Pratyay; Islam, Rifat Nawaz Ul; Bandyopadhyay, Amal K
2015-01-01
Automated genome sequencing procedures are enriching the sequence database very fast. To achieve a balance between the entry of sequences into the database and their analyses, efficient software is required. To this end PHYSICO2, compared to the earlier PHYSICO and other public domain tools, is most efficient in that it i] extracts physicochemical, window-dependent and homologous position-based substitution (PWS) properties, including positional and BLOCK-specific diversity and conservation, ii] provides users with optional flexibility in setting relevant input parameters, iii] helps users prepare the BLOCK-FASTA file by use of the program's Automated Block Preparation Tool, iv] performs fast, accurate and user-friendly analyses and v] redirects itemized outputs in Excel format along with detailed methodology. The program package contains documentation describing the application of the methods. Overall, the program acts as an efficient PWS analyzer and finds application in sequence bioinformatics. PHYSICO2 is freely available at http://sourceforge.net/projects/physico2/ along with its documentation at https://sourceforge.net/projects/physico2/files/Documentation.pdf/download for all users.
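PHYSICO2's own conservation and diversity measures are not reproduced here; a generic per-column Shannon-entropy calculation over a gap-free BLOCK alignment is a common way to express positional conservation, sketched below with a toy alignment.

```python
import math
from collections import Counter

def column_entropy(column):
    """Shannon entropy (bits) of one alignment column; 0 = fully conserved."""
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def block_conservation(block):
    """Per-position entropy for a gap-free BLOCK given as equal-length sequences."""
    return [column_entropy(col) for col in zip(*block)]

block = ["MKVLAT", "MKILAT", "MRVLGT", "MKVLAT"]  # toy sequences, assumption
print(block_conservation(block))
```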
Focused and Steady-State Characteristics of Shaped Sonic Boom Signatures: Prediction and Analysis
NASA Technical Reports Server (NTRS)
Maglieri, Domenic J.; Bobbitt, Percy J.; Massey, Steven J.; Plotkin, Kenneth J.; Kandil, Osama A.; Zheng, Xudong
2011-01-01
The objective of this study is to examine the effect of flight, at off-design conditions, on the propagated sonic boom pressure signatures of a small "low-boom" supersonic aircraft. The amplification, or focusing, of the low magnitude "shaped" signatures produced by maneuvers such as the accelerations from transonic to supersonic speeds, climbs, turns, pull-up and pushovers is the concern. To analyze these effects, new and/or improved theoretical tools have been developed, in addition to the use of existing methodology. Several shaped signatures are considered in the application of these tools to the study of selected maneuvers and off-design conditions. The results of these applications are reported in this paper as well as the details of the new analytical tools. Finally, the magnitude of the focused boom problem for "low boom" supersonic aircraft designs has been more accurately quantified and potential "mitigations" suggested. In general, "shaped boom" signatures, designed for cruise flight, such as asymmetric and symmetric flat-top and initial-shock ramp waveforms retain their basic shape during transition flight. Complex and asymmetric and symmetric initial shock ramp waveforms provide lower magnitude focus boom levels than N-waves or asymmetric and symmetric flat-top signatures.
Role of Knowledge Management in Development and Lifecycle Management of Biopharmaceuticals.
Rathore, Anurag S; Garcia-Aponte, Oscar Fabián; Golabgir, Aydin; Vallejo-Diaz, Bibiana Margarita; Herwig, Christoph
2017-02-01
Knowledge Management (KM) is a key enabler for achieving quality in a lifecycle approach for production of biopharmaceuticals. Due to the important role that it plays towards successful implementation of Quality by Design (QbD), an analysis of KM solutions is needed. This work provides a comprehensive review of the interface between KM and QbD-driven biopharmaceutical production systems as perceived by academic as well as industrial viewpoints. A comprehensive set of 356 publications addressing the applications of KM tools to QbD-related tasks were screened and a query to gather industrial inputs from 17 major biopharmaceutical organizations was performed. Three KM tool classes were identified as having high relevance for biopharmaceutical production systems and have been further explored: knowledge indicators, ontologies, and process modeling. A proposed categorization of 16 distinct KM tool classes allowed for the identification of holistic technologies supporting QbD. In addition, the classification allowed for addressing the disparity between industrial and academic expectations regarding the application of KM methodologies. This is a first of a kind attempt and thus we think that this paper would be of considerable interest to those in academia and industry that are engaged in accelerating development and commercialization of biopharmaceuticals.
PET/MRI for Neurological Applications
Catana, Ciprian; Drzezga, Alexander; Heiss, Wolf-Dieter; Rosen, Bruce R.
2013-01-01
PET and MRI provide complementary information in the study of the human brain. Simultaneous PET/MR data acquisition allows the spatial and temporal correlation of the measured signals, opening up opportunities impossible to realize using stand-alone instruments. This paper reviews the methodological improvements and potential neurological and psychiatric applications of this novel technology. We first present methods for improving the performance and information content of each modality by using the information provided by the other technique. On the PET side, we discuss methods that use the simultaneously acquired MR data to improve the PET data quantification. On the MR side, we present how improved PET quantification could be used to validate a number of MR techniques. Finally, we describe promising research, translational and clinical applications that could benefit from these advanced tools. PMID:23143086
Lightweight Composite Materials for Heavy Duty Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pruez, Jacky; Shoukry, Samir; Williams, Gergis
The main objective of this project is to develop, analyze and validate data, methodologies and tools that support widespread applications of automotive lightweighting technologies. Two underlying principles are guiding the research efforts towards this objective: • Seamless integration between the lightweight materials selected for certain vehicle systems, cost-effective methods for their design and manufacturing, and practical means to enhance their durability while reducing their Life-Cycle-Costs (LCC). • Smooth migration of the experience and findings accumulated so far at WVU in the areas of designing with lightweight materials, innovative joining concepts and durability predictions, from applications to the area of weight savings for heavy vehicle systems and hydrogen storage tanks, to lightweighting applications of selected systems or assemblies in light-duty vehicles.
A numerical identifiability test for state-space models--application to optimal experimental design.
Hidalgo, M E; Ayesa, E
2001-01-01
This paper describes a mathematical tool for identifiability analysis, easily applicable to high order non-linear systems modelled in state-space and implementable in simulators with a time-discrete approach. This procedure also permits a rigorous analysis of the expected estimation errors (average and maximum) in calibration experiments. The methodology is based on the recursive numerical evaluation of the information matrix during the simulation of a calibration experiment and on the setting-up of a group of information parameters based on geometric interpretations of this matrix. As an example of the utility of the proposed test, the paper presents its application to an optimal experimental design of ASM Model No. 1 calibration, in order to estimate the maximum specific growth rate µH and the concentration of heterotrophic biomass XBH.
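One standard way to turn the recursively accumulated information matrix into the expected estimation errors mentioned here is through the Cramér-Rao lower bound, i.e., the diagonal of the inverse information matrix. The sketch below uses a placeholder 2×2 matrix for (µH, XBH); the values are not results from the ASM No. 1 application.

```python
import numpy as np

def expected_parameter_errors(information_matrix, param_values):
    """Cramér-Rao lower bounds: standard deviations and relative (%) errors
    obtained from the diagonal of the inverse information matrix."""
    covariance = np.linalg.inv(information_matrix)
    std = np.sqrt(np.diag(covariance))
    return std, 100.0 * std / np.abs(param_values)

# Placeholder information matrix for (muH, XBH) -- illustrative values only.
F = np.array([[4.0e3, 1.2e2],
              [1.2e2, 9.0e1]])
std, rel = expected_parameter_errors(F, param_values=np.array([6.0, 1500.0]))
print(std, rel)
```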
NASA-Ames workload research program
NASA Technical Reports Server (NTRS)
Hart, Sandra
1988-01-01
Research has been underway for several years to develop valid and reliable measures and predictors of workload as a function of operator state, task requirements, and system resources. Although the initial focus of this research was on aeronautics, the underlying principles and methodologies are equally applicable to space, and provide a set of tools that NASA and its contractors can use to evaluate design alternatives from the perspective of the astronauts. Objectives and approach of the research program are described, as well as the resources used in conducting research and the conceptual framework around which the program evolved. Next, standardized tasks are described, in addition to predictive models and assessment techniques and their application to the space program. Finally, some of the operational applications of these tasks and measures are reviewed.
Review of computational fluid dynamics (CFD) researches on nano fluid flow through micro channel
NASA Astrophysics Data System (ADS)
Dewangan, Satish Kumar
2018-05-01
Nanofluids are becoming promising heat transfer fluids due to their improved thermo-physical properties and heat transfer performance. Microchannel heat transfer has potential application in cooling high-power-density microchips in CPU systems, micro power systems and many such miniature thermal systems which need advanced cooling capacity. Use of nanofluids enhances the effectiveness of such systems. Computational Fluid Dynamics (CFD) is a very powerful tool for computational analysis of various physical processes. Its application to flow and heat transfer analysis of nanofluids is catching up very fast. The present paper gives a brief account of the CFD methodology and also summarizes its application to nanofluid flow and heat transfer for microchannel cases.
Simulation validation and management
NASA Astrophysics Data System (ADS)
Illgen, John D.
1995-06-01
Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, it has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique use of computer-assisted software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper describes the ISTI-developed methodology and how CASE tools are used in its support. Case studies are discussed.
Prioritization methodology for chemical replacement
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott
1995-01-01
Since United States of America federal legislation has required ozone depleting chemicals (class 1 & 2) to be banned from production, The National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated as a development of a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach for prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used in order to determine numerical values which would correspond to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced). The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology to be quantitatively compared in several categories, and a QFD matrix which allows process/chemical pairs to be rated against one another for importance (using consistent categories). Depending on the need for application, one can choose the part(s) needed or have the methodology completed in its entirety. For example, if a program needs to show the risk of changing a process/chemical one may choose to use part of Matrix A and Matrix C. If a chemical is being used, and the process must be changed; one might use the Process Concerns part of Matrix D for the existing process and all possible replacement processes. If an overall analysis of a program is needed, one may request the QFD to be completed.
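The QFD matrix described here ultimately produces a weighted-sum rating for each chemical/process pair; a minimal sketch of that kind of scoring is shown below. The concern categories, importance weights, and ratings are invented placeholders for illustration, not the workbook's actual categories or values.

```python
# Concern categories and their importance weights (placeholders, not the workbook's values).
weights = {"regulatory deadline": 5, "worker exposure": 4,
           "availability of alternatives": 3, "process criticality": 2}

# Ratings of each chemical/process pair against the concerns (1 = low, 9 = high).
candidates = {
    "solvent A / cleaning": {"regulatory deadline": 9, "worker exposure": 3,
                             "availability of alternatives": 7, "process criticality": 5},
    "blowing agent B / foam": {"regulatory deadline": 3, "worker exposure": 7,
                               "availability of alternatives": 2, "process criticality": 8},
}

def priority(scores):
    """QFD-style weighted sum: higher means more urgent to replace."""
    return sum(weights[c] * scores[c] for c in weights)

for name, scores in sorted(candidates.items(), key=lambda kv: -priority(kv[1])):
    print(name, priority(scores))
```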
Protein structural similarity search by Ramachandran codes
Lo, Wei-Cheng; Huang, Po-Jung; Chang, Chih-Hung; Lyu, Ping-Chiang
2007-01-01
Background Protein structural data has increased exponentially, such that fast and accurate tools are necessary for structural similarity searching. To improve the search speed, several methods have been designed to reduce three-dimensional protein structures to one-dimensional text strings that are then analyzed by traditional sequence alignment methods; however, the accuracy is usually sacrificed and the speed is still unable to match sequence similarity search tools. Here, we aimed to improve the linear encoding methodology and develop efficient search tools that can rapidly retrieve structural homologs from large protein databases. Results We propose a new linear encoding method, SARST (Structural similarity search Aided by Ramachandran Sequential Transformation). SARST transforms protein structures into text strings through a Ramachandran map organized by nearest-neighbor clustering and uses a regenerative approach to produce substitution matrices. Then, classical sequence similarity search methods can be applied to the structural similarity search. Its accuracy is similar to that of Combinatorial Extension (CE), and it works over 243,000 times faster, searching 34,000 proteins in 0.34 sec with a 3.2-GHz CPU. SARST provides statistically meaningful expectation values to assess the retrieved information. It has been implemented into a web service and a stand-alone Java program that is able to run on many different platforms. Conclusion As a database search method, SARST can rapidly distinguish high from low similarities and efficiently retrieve homologous structures. It demonstrates that the easily accessible linear encoding methodology has the potential to serve as a foundation for efficient protein structural similarity search tools. These search tools should be applicable to automated and high-throughput functional annotation or prediction for the ever-increasing number of published protein structures in this post-genomic era. PMID:17716377
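SARST's actual Ramachandran clusters and substitution matrices are not given in this summary; the sketch below illustrates only the general linear-encoding idea — assigning each residue's (phi, psi) pair to the nearest of a few cluster centres and emitting a one-letter code that standard sequence tools can then align — using invented cluster centres.

```python
import math

# Invented cluster centres on the Ramachandran map (degrees): code -> (phi, psi).
CLUSTERS = {"A": (-60.0, -45.0),   # roughly alpha-helical
            "B": (-120.0, 130.0),  # roughly beta-strand
            "L": (60.0, 45.0),     # left-handed helical region
            "P": (-75.0, 150.0)}   # polyproline-like

def angle_distance(a, b):
    """Smallest absolute difference between two angles in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def encode(phi_psi_list):
    """Map each (phi, psi) pair to the letter of its nearest cluster centre."""
    def nearest(phi, psi):
        return min(CLUSTERS, key=lambda k: math.hypot(
            angle_distance(phi, CLUSTERS[k][0]), angle_distance(psi, CLUSTERS[k][1])))
    return "".join(nearest(phi, psi) for phi, psi in phi_psi_list)

print(encode([(-63, -42), (-118, 127), (58, 47)]))  # -> "ABL"
```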
NASA Astrophysics Data System (ADS)
Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.
2012-09-01
Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
De Brouwere, Katleen; Cornelis, Christa; Arvanitis, Athanasios; Brown, Terry; Crump, Derrick; Harrison, Paul; Jantunen, Matti; Price, Paul; Torfs, Rudi
2014-05-01
The maximum cumulative ratio (MCR) method allows the categorisation of mixtures according to whether the mixture is of concern for toxicity and if so whether this is driven by one substance or multiple substances. The aim of the present study was to explore, by application of the MCR approach, whether health risks due to indoor air pollution are dominated by one substance or are due to concurrent exposure to various substances. Analysis was undertaken on monitoring data of four European indoor studies (giving five datasets), involving 1800 records of indoor air or personal exposure. Application of the MCR methodology requires knowledge of the concentrations of chemicals in a mixture together with health-based reference values for those chemicals. For this evaluation, single substance health-based reference values (RVs) were selected through a structured review process. The MCR analysis found high variability in the proportion of samples of concern for mixture toxicity. The fraction of samples in these groups of concern varied from 2% (Flemish schools) to 77% (EXPOLIS, Basel, indoor), the variation being due not only to the variation in indoor air contaminant levels across the studies but also to other factors such as differences in number and type of substances monitored, analytical performance, and choice of RVs. However, in 4 out of the 5 datasets, a considerable proportion of cases were found where a chemical-by-chemical approach failed to identify the need for the investigation of combined risk assessment. Although the MCR methodology applied in the current study provides no consideration of commonality of endpoints, it provides a tool for discrimination between those mixtures requiring further combined risk assessment and those for which a single-substance assessment is sufficient. Copyright © 2014 Elsevier B.V. All rights reserved.
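The MCR itself is built from per-substance hazard quotients: HQ_i = concentration_i / reference value_i, the hazard index HI = Σ HQ_i, and MCR = HI / max(HQ_i). The sketch below applies that definition to a single illustrative indoor-air record; the substances, concentrations, and reference values are placeholders, not data from the four European indoor studies analysed here.

```python
def mcr_summary(concentrations, reference_values):
    """Hazard quotients, hazard index and maximum cumulative ratio for one sample.

    MCR = HI / max(HQ); an MCR close to 1 means one substance dominates the
    mixture risk, larger values mean the risk is spread over several substances.
    """
    hq = {s: concentrations[s] / reference_values[s] for s in concentrations}
    hi = sum(hq.values())
    mcr = hi / max(hq.values())
    return hq, hi, mcr

# Placeholder indoor-air concentrations (ug/m3) and reference values (ug/m3).
conc = {"formaldehyde": 25.0, "benzene": 2.0, "toluene": 60.0}
rv = {"formaldehyde": 100.0, "benzene": 5.0, "toluene": 260.0}
hq, hi, mcr = mcr_summary(conc, rv)
print(hq, round(hi, 2), round(mcr, 2))
```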
Rodríguez-Pérez, Aitana; Alfaro-Lara, Eva Rocío; Albiñana-Perez, Sandra; Nieto-Martín, María Dolores; Díez-Manglano, Jesús; Pérez-Guerrero, Concepción; Santos-Ramos, Bernardo
2017-11-01
To create a tool to identify drugs and clinical situations that offer an opportunity for deprescribing in patients with multimorbidity. A literature review, complemented by electronic brainstorming, was carried out; subsequently, a panel of experts applied the Delphi methodology. The experts assessed the criteria identified in the literature and brainstorming as possible situations for deprescribing. They were also asked to assess the influence of life prognosis on each criterion. A tool was composed of the most appropriate criteria according to the strength of their evidence, usefulness in patients with multimorbidity and applicability in clinical practice. Out of a total of 100, 27 criteria were selected for inclusion in the final list. It was named the LESS-CHRON criteria (List of Evidence-baSed depreScribing for CHRONic patients), and was organized by the anatomical group of the Anatomical, Therapeutic, Chemical (ATC) classification system of the drug to be deprescribed. Each criterion contains: the indication for which the drug is prescribed, the clinical situation that offers an opportunity to deprescribe, the clinical variable to be monitored, and the minimum time to follow up the patient after deprescribing. The "LESS-CHRON criteria" are the result of a comprehensive and standardized methodology to identify clinical situations for deprescribing drugs in chronic patients with multimorbidity. Geriatr Gerontol Int 2017; 17: 2200-2207. © 2017 Japan Geriatrics Society.
A systematic collaborative process for assessing launch vehicle propulsion technologies
NASA Astrophysics Data System (ADS)
Odom, Pat R.
1999-01-01
A systematic, collaborative process for prioritizing candidate investments in space transportation systems technologies has been developed for the NASA Space Transportation Programs Office. The purpose of the process is to provide a repeatable and auditable basis for selecting technology investments to enable achievement of NASA's strategic space transportation objectives. The paper describes the current multilevel process and supporting software tool that has been developed. Technologies are prioritized across system applications to produce integrated portfolios for recommended funding. An example application of the process to the assessment of launch vehicle propulsion technologies is described and illustrated. The methodologies discussed in the paper are expected to help NASA and industry ensure maximum returns from technology investments under constrained budgets.
Diagnosis Related Groups as a Casemix/Management Tool for Hospice Patients
Johnson-Hürzeler, R.; Leary, Robert J.; Hill, Claire L.
1983-01-01
To control the costs of care and to remain prepared for changes in reimbursement methodologies, health care organizations are beginning to analyze their casemix and their costs per case of providing care. Increasing importance is thus assigned to the search for valid casemix measures and to the construction of information systems which will support casemix investigations. After two years of information systems development, The Connecticut Hospice has begun its search for casemix measures that are applicable to the care of the dying. In this paper, we present our findings on the application of one casemix measure - the DRG - in the specialized area of nonsurgical care of the terminally ill.
NASA Astrophysics Data System (ADS)
Kassem, M.; Soize, C.; Gagliardini, L.
2011-02-01
In a recent work [ Journal of Sound and Vibration 323 (2009) 849-863] the authors presented an energy-density field approach for the vibroacoustic analysis of complex structures in the low and medium frequency ranges. In this approach, a local vibroacoustic energy model as well as a simplification of this model were constructed. In this paper, firstly an extension of the previous theory is performed in order to include the case of general input forces and secondly, a structural partitioning methodology is presented along with a set of tools used for the construction of a partitioning. Finally, an application is presented for an automotive vehicle.
[Application of mass spectrometry in mycology].
Quiles Melero, Inmaculada; Peláez, Teresa; Rezusta López, Antonio; Garcia-Rodríguez, Julio
2016-06-01
MALDI-TOF (matrix-assisted laser desorption ionization time-of-flight) mass spectrometry (MS) is becoming an essential tool in most microbiology laboratories. At present, by using a characteristic fungal profile obtained from whole cells or through simple extraction protocols, MALDI-TOF MS allows the identification of pathogenic fungi with a high performance potential. This methodology decreases the laboratory turnaround time, optimizing the detection of mycoses. This article describes the state-of-the-art of the use of MALDI-TOF MS for the detection of human clinical fungal pathogens in the laboratory and discusses the future applications of this technology, which will further improve routine mycological diagnosis. Copyright © 2016 Elsevier España, S.L.U. All rights reserved.
Gulf Coast Clean Energy Application Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dillingham, Gavin
The Gulf Coast Clean Energy Application Center was initiated to significantly improve market and regulatory conditions for the implementation of combined heat and power technologies. The GC CEAC was responsible for the development of CHP in Texas, Louisiana and Oklahoma. Through this program we employed a variety of outreach and education techniques, developed and deployed assessment tools and conducted market assessments. These efforts resulted in the growth of the combined heat and power market in the Gulf Coast region with a realization of more efficient energy generation, reduced emissions and a more resilient infrastructure. Specific to research, we did not formally investigate any techniques with any formal research design or methodology.
Kobayashi, Leo; Gosbee, John W; Merck, Derek L
2017-07-01
(1) To develop a clinical microsystem simulation methodology for alarm fatigue research with a human factors engineering (HFE) assessment framework and (2) to explore its application to the comparative examination of different approaches to patient monitoring and provider notification. Problems with the design, implementation, and real-world use of patient monitoring systems result in alarm fatigue. A multidisciplinary team is developing an open-source tool kit to promote bedside informatics research and mitigate alarm fatigue. Simulation, HFE, and computer science experts created a novel simulation methodology to study alarm fatigue. Featuring multiple interconnected simulated patient scenarios with scripted timeline, "distractor" patient care tasks, and triggered true and false alarms, the methodology incorporated objective metrics to assess provider and system performance. Developed materials were implemented during institutional review board-approved study sessions that assessed and compared an experimental multiparametric alerting system with a standard monitor telemetry system for subject response, use characteristics, and end-user feedback. A four-patient simulation setup featuring objective metrics for participant task-related performance and response to alarms was developed along with accompanying structured HFE assessment (questionnaire and interview) for monitor systems use testing. Two pilot and four study sessions with individual nurse subjects elicited true alarm and false alarm responses (including diversion from assigned tasks) as well as nonresponses to true alarms. In-simulation observation and subject questionnaires were used to test the experimental system's approach to suppressing false alarms and alerting providers. A novel investigative methodology applied simulation and HFE techniques to replicate and study alarm fatigue in controlled settings for systems assessment and experimental research purposes.
Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan
2016-10-01
A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as a function of predefined control objectives. Several constrained optimization problems corresponding to different qualitative operation states of the system are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long term reachability of the control objectives by the fuzzy logic controller. The methodology is highlighted using a single-stage side-stream partial nitritation/Anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting the long-term influent disturbances, and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested, and showed robustness, against measurement noise levels typical for wastewater sensors. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller using merely expert and intuitive knowledge performed worse. This proved the importance of using a systematic methodology for the derivation of the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy logic control applications for other biological processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
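To make the role of the membership-function critical points more tangible, a small illustrative sketch follows. It shows only a triangular membership function whose critical points (a, b, c) would, in the methodology above, come from the constrained optimization problems rather than be chosen by hand; the numeric values and the "ammonium is HIGH" set are hypothetical:

```python
# Illustrative sketch only: a triangular membership function parameterized by its
# critical points (a, b, c). In the methodology above these points are derived from
# constrained optimization problems tied to the control objectives.
def triangular(x, a, b, c):
    """Degree of membership of x in a triangular fuzzy set with support [a, c] and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy set "ammonium is HIGH" for the partial nitritation/Anammox case study.
print(triangular(28.0, a=10.0, b=25.0, c=40.0))   # -> 0.8
```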
Artificial Intelligence Methodologies and Their Application to Diabetes
Rigla, Mercedes; García-Sáez, Gema; Pons, Belén; Hernando, Maria Elena
2017-01-01
In the past decade diabetes management has been transformed by the addition of continuous glucose monitoring and insulin pump data. More recently, a wide variety of functions and physiologic variables, such as heart rate, hours of sleep, number of steps walked and movement, have been available through wristbands or watches. New data, hydration, geolocation, and barometric pressure, among others, will be incorporated in the future. All these parameters, when analyzed, can be helpful for patients and doctors' decision support. Similar new scenarios have appeared in most medical fields, in such a way that in recent years, there has been an increased interest in the development and application of the methods of artificial intelligence (AI) to decision support and knowledge acquisition. Multidisciplinary research teams composed of computer engineers and doctors are increasingly frequent, mirroring the need for cooperation in this new topic. AI, as a science, can be defined as the ability to make computers do things that would require intelligence if done by humans. Increasingly, diabetes-related journals have been incorporating publications focused on AI tools applied to diabetes. In summary, diabetes management scenarios have undergone a deep transformation that forces diabetologists to incorporate skills from new areas. This recently needed knowledge includes AI tools, which have become part of diabetes health care. The aim of this article is to explain in an easy and plain way the most used AI methodologies to promote the involvement of health care providers (doctors and nurses) in this field. PMID:28539087
[Evaluation of medication risk in pregnant women: methodology of evaluation and risk management].
Eléfant, E; Sainte-Croix, A
1997-01-01
This round table discussion was devoted to the description of the tools currently available for the evaluation of drug risks and management during pregnancy. Five topics were submitted for discussion: pre-clinical data, methodological tools, benefit/risk ratio before prescription, teratogenic or fetal risk evaluation, legal comments.
NASA/CARES dual-use ceramic technology spinoff applications
NASA Technical Reports Server (NTRS)
Powers, Lynn M.; Janosik, Lesley A.; Gyekenyesi, John P.; Nemeth, Noel N.
1994-01-01
NASA has developed software that enables American industry to establish the reliability and life of ceramic structures in a wide variety of 21st Century applications. Designing ceramic components to survive at higher temperatures than the capability of most metals and in severe loading environments involves the disciplines of statistics and fracture mechanics. Successful application of advanced ceramics requires knowledge of material properties and the use of a probabilistic brittle material design methodology. The NASA program, known as CARES (Ceramics Analysis and Reliability Evaluation of Structures), is a comprehensive general purpose design tool that predicts the probability of failure of a ceramic component as a function of its time in service. The latest version of this software, CARES/LIFE, is coupled to several commercially available finite element analysis programs (ANSYS, MSC/NASTRAN, ABAQUS, COSMOS/M, MARC), resulting in an advanced integrated design tool which is adapted to the computing environment of the user. The NASA-developed CARES software has been successfully used by industrial, government, and academic organizations to design and optimize ceramic components for many demanding applications. Industrial sectors impacted by this program include aerospace, automotive, electronic, medical, and energy applications. Dual-use applications include engine components, graphite and ceramic high temperature valves, TV picture tubes, ceramic bearings, electronic chips, glass building panels, infrared windows, radiant heater tubes, heat exchangers, and artificial hips, knee caps, and teeth.
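As a point of reference for the probabilistic brittle-material idea, the sketch below shows only the basic two-parameter Weibull failure-probability relation that such design methods build on; it is not the CARES algorithm itself, and the strength parameters used are hypothetical:

```python
import math

# Not the CARES/LIFE algorithm, only the basic two-parameter Weibull relation that
# probabilistic brittle-material design builds on; sigma0 and m are hypothetical values.
def weibull_failure_probability(sigma, volume, sigma0, m):
    """Probability of failure of a uniformly stressed ceramic volume (unit-volume scaling)."""
    return 1.0 - math.exp(-volume * (sigma / sigma0) ** m)

# Example: applied stress 300 MPa, characteristic strength 500 MPa, Weibull modulus 10.
print(weibull_failure_probability(sigma=300.0, volume=1.0, sigma0=500.0, m=10.0))
```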
Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.
Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen
2015-11-01
Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is imbedded in the EUROCAT Central Database for annual application to centrally held registry data. The methodology was incrementally adapted to improve the utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating date of conception as unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting of multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies.
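The core of the case-count-based scan window can be illustrated with a very small sketch: for each window of k consecutive cases (ordered by estimated date of conception), find the shortest calendar span that contains them. The significance evaluation, which EUROCAT performs with a lookup table in place of Monte Carlo simulation, is not shown, and the dates below are hypothetical registry cases:

```python
# Minimal, illustrative sketch of the case-count-based scan idea described above.
from datetime import date

def shortest_span_per_window(conception_dates, k):
    """Shortest span in days covering any k consecutive cases."""
    d = sorted(conception_dates)
    return min((d[i + k - 1] - d[i]).days for i in range(len(d) - k + 1))

cases = [date(2014, m, 15) for m in (1, 2, 3, 3, 3, 6, 9, 12)]   # hypothetical cases
print(shortest_span_per_window(cases, k=3))   # -> 0 days (three cases share March)
```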
Moody, Jonathan B; Lee, Benjamin C; Corbett, James R; Ficaro, Edward P; Murthy, Venkatesh L
2015-10-01
A number of exciting advances in PET/CT technology and improvements in methodology have recently converged to enhance the feasibility of routine clinical quantification of myocardial blood flow and flow reserve. Recent promising clinical results are pointing toward an important role for myocardial blood flow in the care of patients. Absolute blood flow quantification can be a powerful clinical tool, but its utility will depend on maintaining precision and accuracy in the face of numerous potential sources of methodological errors. Here we review recent data and highlight the impact of PET instrumentation, image reconstruction, and quantification methods, and we emphasize (82)Rb cardiac PET which currently has the widest clinical application. It will be apparent that more data are needed, particularly in relation to newer PET technologies, as well as clinical standardization of PET protocols and methods. We provide recommendations for the methodological factors considered here. At present, myocardial flow reserve appears to be remarkably robust to various methodological errors; however, with greater attention to and more detailed understanding of these sources of error, the clinical benefits of stress-only blood flow measurement may eventually be more fully realized.
IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bazjanac, Vladimir
2008-07-01
Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).
Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen
2014-07-01
Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the applicability of this methodology for a selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
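A small sketch of the underlying calibration idea may be useful: a partial least squares model maps detector spectra to concentrations of the co-eluting proteins, which can then be evaluated on each new inline scan. The setup below uses scikit-learn and synthetic placeholder data, not the authors' Matlab® implementation or their experimental spectra:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Illustrative sketch of the PLS calibration step assumed by the inline tool above.
rng = np.random.default_rng(0)
n_samples, n_wavelengths, n_proteins = 60, 200, 3        # e.g. lysozyme, RNase A, cytochrome c
true_spectra = rng.random((n_proteins, n_wavelengths))   # pure-component spectra (hypothetical)
concentrations = rng.random((n_samples, n_proteins))     # calibration concentrations
spectra = concentrations @ true_spectra + 0.01 * rng.standard_normal((n_samples, n_wavelengths))

pls = PLSRegression(n_components=3)
pls.fit(spectra, concentrations)

new_scan = concentrations[:1] @ true_spectra             # one "inline" detector scan
print(pls.predict(new_scan))                             # estimated concentrations of the 3 proteins
```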
E-learning, dual-task, and cognitive load: The anatomy of a failed experiment.
Van Nuland, Sonya E; Rogers, Kem A
2016-01-01
The rising popularity of commercial anatomy e-learning tools has been sustained, in part, due to increased annual enrollment and a reduction in laboratory hours across educational institutions. While e-learning tools continue to gain popularity, the research methodologies used to investigate their impact on learning remain imprecise. As new user interfaces are introduced, it is critical to understand how functionality can influence the load placed on a student's memory resources, also known as cognitive load. To study cognitive load, a dual-task paradigm wherein a learner performs two tasks simultaneously is often used; however, its application within educational research remains uncommon. Using previous paradigms as a guide, a dual-task methodology was developed to assess the cognitive load imposed by two commercial anatomical e-learning tools. Results indicate that the standard dual-task paradigm, as described in the literature, is insensitive to the cognitive load disparities across e-learning tool interfaces. Confounding variables included automation of responses, task performance tradeoff, and poor understanding of primary task cognitive load requirements, leading to unreliable quantitative results. By modifying the secondary task from a basic visual response to a more cognitively demanding task, such as a modified Stroop test, the automation of secondary task responses can be reduced. Furthermore, by recording baseline measures for the primary task as well as the secondary task, it is possible for task performance tradeoff to be detected. Lastly, it is imperative that the cognitive load of the primary task be designed such that it does not overwhelm the individual's ability to learn new material. © 2015 American Association of Anatomists.
Advanced Endoscopic Navigation: Surgical Big Data, Methodology, and Applications.
Luo, Xiongbiao; Mori, Kensaku; Peters, Terry M
2018-06-04
Interventional endoscopy (e.g., bronchoscopy, colonoscopy, laparoscopy, cystoscopy) is a widely performed procedure that involves either diagnosis of suspicious lesions or guidance for minimally invasive surgery in a variety of organs within the body cavity. Endoscopy may also be used to guide the introduction of certain items (e.g., stents) into the body. Endoscopic navigation systems seek to integrate big data with multimodal information (e.g., computed tomography, magnetic resonance images, endoscopic video sequences, ultrasound images, external trackers) relative to the patient's anatomy, control the movement of medical endoscopes and surgical tools, and guide the surgeon's actions during endoscopic interventions. Nevertheless, it remains challenging to realize the next generation of context-aware navigated endoscopy. This review presents a broad survey of various aspects of endoscopic navigation, particularly with respect to the development of endoscopic navigation techniques. First, we investigate big data with multimodal information involved in endoscopic navigation. Next, we focus on numerous methodologies used for endoscopic navigation. We then review different endoscopic procedures in clinical applications. Finally, we discuss novel techniques and promising directions for the development of endoscopic navigation.
Ishii, Satoshi; Sadowsky, Michael J
2009-04-01
A large number of repetitive DNA sequences are found in multiple sites in the genomes of numerous bacteria, archaea and eukarya. While the functions of many of these repetitive sequence elements are unknown, they have proven to be useful as the basis of several powerful tools for use in molecular diagnostics, medical microbiology, epidemiological analyses and environmental microbiology. The repetitive sequence-based PCR or rep-PCR DNA fingerprint technique uses primers targeting several of these repetitive elements and PCR to generate unique DNA profiles or 'fingerprints' of individual microbial strains. Although this technique has been extensively used to examine diversity among a variety of prokaryotic microorganisms, rep-PCR DNA fingerprinting can also be applied to microbial ecology and microbial evolution studies since it has the power to distinguish microbes at the strain or isolate level. Recent advancements in rep-PCR methodology have resulted in increased accuracy, reproducibility and throughput. In this minireview, we summarize recent improvements in rep-PCR DNA fingerprinting methodology, and discuss its applications to address fundamentally important questions in microbial ecology and evolution.
An automation simulation testbed
NASA Technical Reports Server (NTRS)
Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel
1988-01-01
The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New additional ROBOSIM features, like collision detection and new kinematics simulation methods are also discussed. Based on the experiences of the work on ROBOSIM, a new graphics structural modeling environment is suggested which is intended to be a part of a new knowledge-based multiple aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First a geometrical structural model of the station is presented. This model was developed using the ROBOSIM package. Next the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), which is one of the most complex subsystems of the station. Using the multiple aspect modeling methodology, a fault propagation model of this system is being built and is described.
Risk-Informed External Hazards Analysis for Seismic and Flooding Phenomena for a Generic PWR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parisi, Carlo; Prescott, Steve; Ma, Zhegang
This report describes the activities performed during FY2017 for the US-DOE Light Water Reactor Sustainability Risk-Informed Safety Margin Characterization (LWRS-RISMC), Industry Application #2. The scope of Industry Application #2 is to deliver a risk-informed external hazards safety analysis for a representative nuclear power plant. Following the advancements achieved during the previous FYs (toolkit identification, model development), FY2017 focused on: increasing the level of realism of the analysis; and improving the tools and the coupling methodologies. In particular the following objectives were achieved: calculation of building pounding and its effects on component seismic fragility; development of SAPHIRE code PRA models for a 3-loop Westinghouse PWR; set-up of a methodology for performing static-dynamic PRA coupling between the SAPHIRE and EMRALD codes; coupling of RELAP5-3D/RAVEN for performing Best-Estimate Plus Uncertainty analysis and automatic limit surface search; and execution of sample calculations demonstrating the capabilities of the toolkit in performing risk-informed external hazards safety analyses.
Multidisciplinary design and optimization (MDO) methodology for the aircraft conceptual design
NASA Astrophysics Data System (ADS)
Iqbal, Liaquat Ullah
An integrated design and optimization methodology has been developed for the conceptual design of an aircraft. The methodology brings higher fidelity Computer Aided Design, Engineering and Manufacturing (CAD, CAE and CAM) Tools such as CATIA, FLUENT, ANSYS and SURFCAM into the conceptual design by utilizing Excel as the integrator and controller. The approach is demonstrated to integrate with many of the existing low to medium fidelity codes such as the aerodynamic panel code called CMARC and sizing and constraint analysis codes, thus providing the multi-fidelity capabilities to the aircraft designer. The higher fidelity design information from the CAD and CAE tools for the geometry, aerodynamics, structural and environmental performance is provided for the application of the structured design methods such as the Quality Function Deployment (QFD) and the Pugh's Method. The higher fidelity tools bring the quantitative aspects of a design such as precise measurements of weight, volume, surface areas, center of gravity (CG) location, lift over drag ratio, and structural weight, as well as the qualitative aspects such as external geometry definition, internal layout, and coloring scheme early in the design process. The performance and safety risks involved with the new technologies can be reduced by modeling and assessing their impact more accurately on the performance of the aircraft. The methodology also enables the design and evaluation of the novel concepts such as the blended (BWB) and the hybrid wing body (HWB) concepts. Higher fidelity computational fluid dynamics (CFD) and finite element analysis (FEA) allow verification of the claims for the performance gains in aerodynamics and ascertain risks of structural failure due to different pressure distribution in the fuselage as compared with the tube and wing design. The higher fidelity aerodynamics and structural models can lead to better cost estimates that help reduce the financial risks as well. This helps in achieving better designs with reduced risk in lesser time and cost. The approach is shown to eliminate the traditional boundary between the conceptual and the preliminary design stages, combining the two into one consolidated preliminary design phase. Several examples for the validation and utilization of the Multidisciplinary Design and Optimization (MDO) Tool are presented using missions for the Medium and High Altitude Long Range/Endurance Unmanned Aerial Vehicles (UAVs).
VASSAR: Value assessment of system architectures using rules
NASA Astrophysics Data System (ADS)
Selva, D.; Crawley, E. F.
A key step of the mission development process is the selection of a system architecture, i.e., the layout of the major high-level system design decisions. This step typically involves the identification of a set of candidate architectures and a cost-benefit analysis to compare them. Computational tools have been used in the past to bring rigor and consistency into this process. These tools can automatically generate architectures by enumerating different combinations of decisions and options. They can also evaluate these architectures by applying cost models and simplified performance models. Current performance models are purely quantitative tools that are best suited to evaluating the technical performance of a mission design. However, assessing the relative merit of a system architecture is a much more holistic task than evaluating the performance of a mission design. Indeed, the merit of a system architecture comes from satisfying a variety of stakeholder needs, some of which are easy to quantify, and some of which are harder to quantify (e.g., elegance, scientific value, political robustness, flexibility). Moreover, assessing the merit of a system architecture at these very early stages of design often requires dealing with a mix of: a) quantitative and semi-qualitative data; and b) objective and subjective information. Current computational tools are poorly suited for these purposes. In this paper, we propose a general methodology that can be used to assess the relative merit of several candidate system architectures under the presence of objective, subjective, quantitative, and qualitative stakeholder needs. The methodology is called VASSAR (Value ASsessment of System Architectures using Rules). The major underlying assumption of the VASSAR methodology is that the merit of a system architecture can be assessed by comparing the capabilities of the architecture with the stakeholder requirements. Hence, for example, a candidate architecture that fully satisfies all critical stakeholder requirements is a good architecture. The assessment process is thus fundamentally seen as a pattern matching process where capabilities match requirements, which motivates the use of rule-based expert systems (RBES). This paper describes the VASSAR methodology and shows how it can be applied to a large complex space system, namely an Earth observation satellite system. Companion papers show its applicability to the NASA space communications and navigation program and the joint NOAA-DoD NPOESS program.
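The capability-to-requirement matching idea can be conveyed with a toy sketch; it is not the actual rule-based expert system, and the requirement names, weights, and capability set below are hypothetical:

```python
# Toy sketch of the pattern-matching idea behind VASSAR: each stakeholder requirement
# is satisfied if the candidate architecture provides a matching capability, and merit
# is aggregated from weighted requirement satisfaction.
requirements = {                     # hypothetical requirements with weights
    "soil_moisture_global_daily": 0.5,
    "ocean_color_weekly": 0.3,
    "cloud_profile_monthly": 0.2,
}
capabilities = {"soil_moisture_global_daily", "ocean_color_weekly"}   # one candidate architecture

merit = sum(w for req, w in requirements.items() if req in capabilities)
print(merit)   # -> 0.8
```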
Improving ED specimen TAT using Lean Six Sigma.
Sanders, Janet H; Karr, Tedd
2015-01-01
Lean and Six Sigma are continuous improvement methodologies that have garnered international fame for improving manufacturing and service processes. Increasingly these methodologies are demonstrating their power to also improve healthcare processes. The purpose of this paper is to discuss a case study for the application of Lean and Six Sigma tools in the reduction of turnaround time (TAT) for Emergency Department (ED) specimens. This application of the scientific methodologies uncovered opportunities to improve the entire ED to lab system for the specimens. This case study provides details on the completion of a Lean Six Sigma project in a 1,000-bed tertiary care teaching hospital. Six Sigma's Define, Measure, Analyze, Improve, and Control methodology is very similar to good medical practice: first, relevant information is obtained and assembled; second, a careful and thorough diagnosis is completed; third, a treatment is proposed and implemented; and fourth, checks are made to determine if the treatment was effective. Lean's primary goal is to do more with less work and waste. The Lean methodology was used to identify and eliminate waste through rapid implementation of change. The initial focus of this project was the reduction of turnaround times for ED specimens. However, the results led to better processes for both the internal and external customers of this and other processes. The project results included: a 50 percent decrease in vials used for testing, a 50 percent decrease in unused or extra specimens, a 90 percent decrease in ED specimens without orders, a 30 percent decrease in complete blood count analysis (CBCA) Median TAT, a 50 percent decrease in CBCA TAT Variation, a 10 percent decrease in Troponin TAT Variation, an 18.2 percent decrease in URPN TAT Variation, and a 2-5 minute decrease in ED registered nurses' rainbow draw time. This case study demonstrated how the quantitative power of Six Sigma and the speed of Lean worked in harmony to improve the blood draw process for a 1,000-bed tertiary care teaching hospital. The blood draw process is a standard process used in hospitals to collect blood chemistry and hematology information for clinicians. The methods used in this case study demonstrated valuable and practical applications of process improvement methodologies that can be used for any hospital process and/or service environment. While this is not the first case study that has demonstrated the use of continuous process improvement methodologies to improve a hospital process, it is unique in the way in which it utilizes the strength of a project-focused approach that adheres more to the structure and rigor of Six Sigma and relies less on the speed of Lean. Additionally, the application of these methodologies in healthcare is emerging research.
Global Change adaptation in water resources management: the Water Change project.
Pouget, Laurent; Escaler, Isabel; Guiu, Roger; Mc Ennis, Suzy; Versini, Pierre-Antoine
2012-12-01
In recent years, water resources management has been facing new challenges due to increasing changes and their associated uncertainties, such as changes in climate, water demand or land use, which can be grouped under the term Global Change. The Water Change project (LIFE+ funding) developed a methodology and a tool to assess the Global Change impacts on water resources, thus helping river basin agencies and water companies in their long term planning and in the definition of adaptation measures. The main result of the project was the creation of a step by step methodology to assess Global Change impacts and define strategies of adaptation. This methodology was tested in the Llobregat river basin (Spain) with the objective of being applicable to any water system. It includes several steps such as setting up the problem with a DPSIR framework, developing Global Change scenarios, running river basin models and performing a cost-benefit analysis to define optimal strategies of adaptation. This methodology was supported by the creation of a flexible modelling system, which can link a wide range of models, such as hydrological, water quality, and water management models. The tool allows users to integrate their own models into the system, which can then exchange information among them automatically. This enables the simulation of interactions among multiple components of the water cycle and the rapid running of a large number of Global Change scenarios. The outcomes of this project make it possible to define and test different sets of adaptation measures for the basin that can be further evaluated through cost-benefit analysis. The integration of the results contributes to efficient decision-making on how to adapt to Global Change impacts. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Funk, Christie J.; Perry, Boyd, III; Silva, Walter A.; Newman, Brett
2014-01-01
A software program and associated methodology to study gust loading on aircraft exists for a class of geometrically simplified flexible configurations. This program consists of a simple aircraft response model with two rigid and three flexible symmetric degrees of freedom and allows for the calculation of various airplane responses due to a discrete one-minus-cosine gust as well as continuous turbulence. Simplifications, assumptions, and opportunities for potential improvements pertaining to the existing software program are first identified; then a revised version of the original software tool is developed with improved methodology to include more complex geometries, additional excitation cases, and additional output data so as to provide a more useful and precise tool for gust load analysis. In order to improve the original software program and enhance its usefulness, a wing control surface and a horizontal tail control surface are added, an extended application of the discrete one-minus-cosine gust input is employed, a supplemental continuous turbulence spectrum is implemented, and a capability to animate the total vehicle deformation response to gust inputs is included. These revisions and enhancements are implemented and an analysis of the results is used to validate the modifications.
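For readers unfamiliar with the excitation, the standard form of the discrete one-minus-cosine gust profile is sketched below; the symbols are generic and are not the program's own variables or gust parameters:

```python
import math

# Illustrative sketch of the discrete one-minus-cosine gust profile:
# U(s) = (U_ds / 2) * (1 - cos(pi * s / H)) for 0 <= s <= 2H, zero elsewhere.
def one_minus_cosine_gust(s, u_ds, h):
    """Gust velocity at penetration distance s, for design gust velocity u_ds and gradient distance h."""
    if 0.0 <= s <= 2.0 * h:
        return 0.5 * u_ds * (1.0 - math.cos(math.pi * s / h))
    return 0.0

print(one_minus_cosine_gust(s=50.0, u_ds=15.0, h=100.0))   # -> 7.5 (half amplitude at s = H/2)
```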
Remote Sensing for Crop Water Management: From ET Modelling to Services for the End Users
Calera, Alfonso; Campos, Isidro; Osann, Anna; D’Urso, Guido; Menenti, Massimo
2017-01-01
The experiences gathered during the past 30 years support the operational use of irrigation scheduling based on frequent multi-spectral image data. Currently, the operational use of dense time series of multispectral imagery at high spatial resolution makes monitoring of crop biophysical parameters feasible, capturing crop water use across the growing season, with suitable temporal and spatial resolutions. These achievements, and the availability of accurate forecasting of meteorological data, allow for precise predictions of crop water requirements with unprecedented spatial resolution. This information is greatly appreciated by the end users, i.e., professional farmers or decision-makers, and can be provided in an easy-to-use manner and in near-real-time by using the improvements achieved in web-GIS methodologies (Geographic Information Systems based on web technologies). This paper reviews the most operational and explored methods based on optical remote sensing for the assessment of crop water requirements, identifying strengths and weaknesses and proposing alternatives to advance towards full operational application of this methodology. In addition, we provide a general overview of the tools, which facilitates co-creation and collaboration with stakeholders, paying special attention to these approaches based on web-GIS tools. PMID:28492515
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKinney, Adriana L.; Varga, Tamas
Branching structures such as lungs, blood vessels and plant roots play a critical role in life. Growth, structure, and function of these branching structures have an immense effect on our lives. Therefore, quantitative size information on such structures in their native environment is invaluable for studying their growth and the effect of the environment on them. X-ray computed tomography (XCT) has been an effective tool for in situ imaging and analysis of branching structures. We developed a costless tool that approximates the surface and volume of branching structures. Our methodology of noninvasive imaging, segmentation and extraction of quantitative information is demonstrated through the analysis of a plant root in its soil medium from 3D tomography data. XCT data collected on a grass specimen was used to visualize its root structure. A suite of open-source software was employed to segment the root from the soil and determine its isosurface, which was used to calculate its volume and surface. This methodology of processing 3D data is applicable to other branching structures even when the structure of interest is of similar x-ray attenuation to its environment and difficulties arise with sample segmentation.
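The volume and surface step can be illustrated with a short sketch using open-source Python tools (scikit-image), assuming the root has already been segmented from the soil into a binary 3D array; the array, voxel size, and "root" geometry below are placeholders, not the authors' data or software suite:

```python
import numpy as np
from skimage import measure

# Illustrative sketch of the isosurface-based volume/surface step described above.
root = np.zeros((60, 60, 60), dtype=np.uint8)
root[20:40, 20:40, 10:50] = 1                      # placeholder "root" instead of real XCT data
voxel_size = 0.05                                  # mm per voxel edge (hypothetical)

volume_mm3 = root.sum() * voxel_size ** 3          # voxel-counting volume
verts, faces, _, _ = measure.marching_cubes(root, level=0.5, spacing=(voxel_size,) * 3)
surface_mm2 = measure.mesh_surface_area(verts, faces)
print(volume_mm3, surface_mm2)
```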
Assessing hydrologic impacts of future Land Change scenarios in the San Pedro River (U.S./Mexico)
NASA Astrophysics Data System (ADS)
Kepner, W. G.; Burns, S.; Sidman, G.; Levick, L.; Goodrich, D. C.; Guertin, P.; Yee, W.; Scianni, M.
2012-12-01
An approach was developed to characterize the hydrologic impacts of urban expansion through time for the San Pedro River, a watershed of immense international importance that straddles the U.S./Mexico border. Future urban growth is a key driving force altering local and regional hydrology and is represented by decadal changes in housing density maps from 2010 to 2100 derived from the Integrated Climate and Land-Use Scenarios (ICLUS) database. ICLUS developed future housing density maps by adapting the Intergovernmental Panel on Climate Change (IPCC) Special Report on Emissions Scenarios (SRES) social, economic, and demographic storylines to the conterminous United States. To characterize the hydrologic impacts of future growth, the housing density maps were reclassified to National Land Cover Database 2006 land cover classes and used to parameterize the Soil and Water Assessment Tool (SWAT) using the Automated Geospatial Watershed Assessment (AGWA) tool. The presentation will report 1) the methodology for adapting the ICLUS data for use in AGWA as an approach to evaluate basin-wide impacts of development on water quantity and quality, 2) initial results of the application of the methodology, and 3) implications of the analysis.
Yager, Douglas B.; Hofstra, Albert H.; Granitto, Matthew
2012-01-01
This report emphasizes geographic information system analysis and the display of data stored in the legacy U.S. Geological Survey National Geochemical Database for use in mineral resource investigations. Geochemical analyses of soils, stream sediments, and rocks that are archived in the National Geochemical Database provide an extensive data source for investigating geochemical anomalies. A study area in the Egan Range of east-central Nevada was used to develop a geographic information system analysis methodology for two different geochemical datasets involving detailed (Bureau of Land Management Wilderness) and reconnaissance-scale (National Uranium Resource Evaluation) investigations. ArcGIS was used to analyze and thematically map geochemical information at point locations. Watershed-boundary datasets served as a geographic reference to relate potentially anomalous sample sites with hydrologic unit codes at varying scales. The National Hydrography Dataset was analyzed with Hydrography Event Management and ArcGIS Utility Network Analyst tools to delineate potential sediment-sample provenance along a stream network. These tools can be used to track potential upstream-sediment-contributing areas to a sample site. This methodology identifies geochemically anomalous sample sites, watersheds, and streams that could help focus mineral resource investigations in the field.
Rubin, Katrine Hass; Friis-Holmberg, Teresa; Hermann, Anne Pernille; Abrahamsen, Bo; Brixen, Kim
2013-08-01
A huge number of risk assessment tools have been developed. Far from all have been validated in external studies, many lack transparent methodological evidence, and few are integrated into national guidelines. Therefore, we performed a systematic review to provide an overview of existing valid and reliable risk assessment tools for prediction of osteoporotic fractures. Additionally, we aimed to determine if the performance of each tool was sufficient for practical use, and last, to examine whether the complexity of the tools influenced their discriminative power. We searched PubMed, Embase, and Cochrane databases for papers and evaluated these with respect to methodological quality using the Quality Assessment Tool for Diagnostic Accuracy Studies (QUADAS) checklist. A total of 48 tools were identified; 20 had been externally validated; however, only six tools had been tested more than once in a population-based setting with acceptable methodological quality. None of the tools performed consistently better than the others, and simple tools (i.e., the Osteoporosis Self-assessment Tool [OST], Osteoporosis Risk Assessment Instrument [ORAI], and Garvan Fracture Risk Calculator [Garvan]) often did as well as or better than more complex tools (i.e., Simple Calculated Risk Estimation Score [SCORE], WHO Fracture Risk Assessment Tool [FRAX], and Qfracture). No studies determined the effectiveness of tools in selecting patients for therapy and thus improving fracture outcomes. High-quality studies in randomized design with population-based cohorts with different case mixes are needed. Copyright © 2013 American Society for Bone and Mineral Research.
Araújo, Jane A M; Esmerino, Erick A; Alvarenga, Verônica O; Cappato, Leandro P; Hora, Iracema C; Silva, Marcia Cristina; Freitas, Monica Q; Pimentel, Tatiana C; Walter, Eduardo H M; Sant'Ana, Anderson S; Cruz, Adriano G
2018-03-01
This study aimed to develop a checklist for good hygiene practices (GHP) for raw material of vegetable origin using the focus groups (FGs) approach (n = 4). The final checklist for commercialization of horticultural products totaled 28 questions divided into six blocks, namely: water supply; hygiene, health, and training; waste control; control of pests; packaging and traceability; and hygiene of facilities and equipment. The FG methodology was effective for developing a participatory and objective checklist, based on minimum hygiene requirements, serving as a tool for diagnosis, planning, and training in GHP of fresh vegetables, besides contributing to raising consumer awareness of food safety. The FG methodology provided useful information to establish the final checklist for GHP, with easy application, according to the participants' perception and experience.
FTDD973: A multimedia knowledge-based system and methodology for operator training and diagnostics
NASA Technical Reports Server (NTRS)
Hekmatpour, Amir; Brown, Gary; Brault, Randy; Bowen, Greg
1993-01-01
FTDD973 (973 Fabricator Training, Documentation, and Diagnostics) is an interactive multimedia knowledge-based system and methodology for computer-aided training and certification of operators, as well as tool and process diagnostics in IBM's CMOS SGP fabrication line (building 973). FTDD973 is an example of what can be achieved with modern multimedia workstations. Knowledge-based systems, hypertext, hypergraphics, high resolution images, audio, motion video, and animation are technologies that in synergy can be far more useful than each by itself. FTDD973's modular and object-oriented architecture is also an example of how improvements in software engineering are finally making it possible to combine many software modules into one application. FTDD973 is developed in ExperMedia/2, an OS/2 multimedia expert system shell for domain experts.
Learning physical descriptors for materials science by compressed sensing
NASA Astrophysics Data System (ADS)
Ghiringhelli, Luca M.; Vybiral, Jan; Ahmetcik, Emre; Ouyang, Runhai; Levchenko, Sergey V.; Draxl, Claudia; Scheffler, Matthias
2017-02-01
The availability of big data in materials science offers new routes for analyzing materials properties and functions and achieving scientific understanding. Finding structure in these data that is not directly visible by standard tools and exploitation of the scientific information requires new and dedicated methodology based on approaches from statistical learning, compressed sensing, and other recent methods from applied mathematics, computer science, statistics, signal processing, and information science. In this paper, we explain and demonstrate a compressed-sensing based methodology for feature selection, specifically for discovering physical descriptors, i.e., physical parameters that describe the material and its properties of interest, and associated equations that explicitly and quantitatively describe those relevant properties. As a showcase application and proof of concept, we describe how to build a physical model for the quantitative prediction of the crystal structure of binary compound semiconductors.
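The sparsity-driven selection idea can be illustrated with a small sketch: an l1-penalized fit drives most coefficients to zero, so the surviving features act as candidate descriptors. This uses scikit-learn's Lasso on synthetic placeholder data and is only a schematic stand-in for the compressed-sensing procedure of the paper:

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative sketch of sparsity-driven descriptor selection on synthetic data.
rng = np.random.default_rng(1)
n_materials, n_candidate_features = 80, 40
X = rng.standard_normal((n_materials, n_candidate_features))
y = 2.0 * X[:, 3] - 1.5 * X[:, 17] + 0.05 * rng.standard_normal(n_materials)  # 2 relevant features

model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)   # indices of features retained as candidate descriptors
print(selected)                          # typically features 3 and 17 survive
```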
The future is now: single-cell genomics of bacteria and archaea
Blainey, Paul C.
2013-01-01
Interest in the expanding catalog of uncultivated microorganisms, increasing recognition of heterogeneity among seemingly similar cells, and technological advances in whole-genome amplification and single-cell manipulation are driving considerable progress in single-cell genomics. Here, the spectrum of applications for single-cell genomics, key advances in the development of the field, and emerging methodology for single-cell genome sequencing are reviewed by example with attention to the diversity of approaches and their unique characteristics. Experimental strategies transcending specific methodologies are identified and organized as a road map for future studies in single-cell genomics of environmental microorganisms. Over the next decade, increasingly powerful tools for single-cell genome sequencing and analysis will play key roles in accessing the genomes of uncultivated organisms, determining the basis of microbial community functions, and fundamental aspects of microbial population biology. PMID:23298390
Magalhaes, Sandra; Banwell, Brenda; Bar-Or, Amit; Fortier, Isabel; Hanwell, Heather E; Lim, Ming; Matt, Georg E; Neuteboom, Rinze F; O'Riordan, David L; Schneider, Paul K; Pugliatti, Maura; Shatenstein, Bryna; Tansey, Catherine M; Wassmer, Evangeline; Wolfson, Christina
2018-06-01
While studying the etiology of multiple sclerosis (MS) in children has several methodological advantages over studying etiology in adults, studies are limited by small sample sizes. Using a rigorous methodological process, we developed the Pediatric MS Tool-Kit, a measurement framework that includes a minimal set of core variables to assess etiological risk factors. We solicited input from the International Pediatric MS Study Group to select three risk factors: environmental tobacco smoke (ETS) exposure, sun exposure, and vitamin D intake. To develop the Tool-Kit, we used a Delphi study involving a working group of epidemiologists, neurologists, and content experts from North America and Europe. The Tool-Kit includes six core variables to measure ETS, six to measure sun exposure, and six to measure vitamin D intake. The Tool-Kit can be accessed online (www.maelstrom-research.org/mica/network/tool-kit). The goals of the Tool-Kit are to enhance exposure measurement in newly designed pediatric MS studies and comparability of results across studies, and in the longer term to facilitate harmonization of studies, a methodological approach that can be used to circumvent issues of small sample sizes. We believe the Tool-Kit will prove to be a valuable resource to guide pediatric MS researchers in developing study-specific questionnaires.
[Ergonomic risk assessment: aspects applicable in the light of current standards].
Baracco, A; Perrelli, F; Romano, C
2010-01-01
The Italian decree law 81/2008 mentions the application of ergonomic principles as a basic tool for prevention. In this regulation we cannot find a definition either of ergonomics or of the competences required for its correct application. The Authors consider that occupational physicians have a suitable competence and knowledge on the matter, thanks to their highly specialized training. Indeed, the ergonomic doctrine shows up in the daily practice of occupational physicians, who regularly base their activity on the evaluation of the binomial "worker-task": in the management of fitness-to-work judgements they try to combine operating conditions with the worker's psychophysical state, not confining themselves to a simple expression of a medico-legal certificate. However, the legislative references to specific regulations raise difficulties for occupational physicians in dealing with aspects such as gender, age, reference values and methodological choices. The Authors debate these difficulties in the application of the rules.
ERIC Educational Resources Information Center
Mills, Carmen; Molla, Tebeje; Gale, Trevor; Cross, Russell; Parker, Stephen; Smith, Catherine
2017-01-01
This article investigates the social justice dispositions of teachers and principals in secondary schools as inferred from their metaphoric expressions. Drawing on a Bourdieuian account of disposition, our focus is the use of metaphor as a methodological tool to identify and reveal these otherwise latent forces within our data. Our analysis shows…
Methodology and Practical Tools for Enhancing an Accounting/Business Ethics Class
ERIC Educational Resources Information Center
Kreissl, Laura Jean; Upshaw, Alice
2012-01-01
While many articles have argued for the value and impact of ethics courses, few have discussed methodology, and particularly the tools used, in the implementation of accounting ethics classes. We address both in this paper in the hope of helping other instructors build or strengthen their courses. This paper describes the…
Education in Management of Data Created by New Technologies for Rapid Product Development in SMEs
ERIC Educational Resources Information Center
Shaw, A.; Aitchison, D.
2003-01-01
This paper presents outcomes from a research programme aimed at developing new tools and methodologies to assist small and medium-sized enterprises (SMEs) in rapid product development (RPD). The authors suggest that current education strategies for the teaching of RPD tools and methodologies may be of limited value unless those strategies also…
Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?
ERIC Educational Resources Information Center
Brondani, Mario; He, Sarah
2013-01-01
Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…
Challenges in the estimation of Net SURvival: The CENSUR working survival group.
Giorgi, R
2016-10-01
Net survival, the survival probability that would be observed, in a hypothetical world, where the cancer of interest would be the only possible cause of death, is a key indicator in population-based cancer studies. Accounting for mortality due to other causes, it allows cross-country comparisons or trend analyses and provides a useful indicator for public health decision-making. The objective of this study was to show how the creation and formalization of a network comprising established research teams, which already had substantial and complementary experience in both cancer survival analysis and methodological development, make it possible to meet these challenges and thus to provide more adequate tools, improve the quality and comparability of cancer survival data, and promote methodological transfers in areas of emerging interest. The Challenges in the Estimation of Net SURvival (CENSUR) working survival group is composed of international researchers highly skilled in biostatistics, methodology, and epidemiology, from different research organizations in France, the United Kingdom, Italy, Slovenia, and Canada, and involved in French (FRANCIM) and European (EUROCARE) cancer registry networks. The expected advantages are an interdisciplinary, international, synergistic network capable of addressing problems in public health, for decision-makers at different levels; tools for those in charge of net survival analyses; a common methodology that makes unbiased cross-national comparisons of cancer survival feasible; transfer of methods for net survival estimation to other specific applications (clinical research, occupational epidemiology); and dissemination of results during an international training course. The formalization of the international CENSUR working survival group was motivated by a need felt by scientists conducting population-based cancer research to discuss, develop, and monitor implementation of a common methodology to analyze net survival in order to provide useful information for cancer control and cancer policy. A "team science" approach is necessary to address new challenges concerning the estimation of net survival. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
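For readers unfamiliar with the quantity being estimated, the standard excess-hazard formulation of net survival (a general textbook formulation, not a description of the specific estimators developed by the CENSUR group) decomposes the observed hazard into an expected population hazard, usually taken from general-population life tables matched on age, sex, and calendar year, and an excess hazard attributable to the cancer of interest:

\lambda_{\mathrm{obs}}(t) = \lambda_{\mathrm{pop}}(t) + \lambda_{\mathrm{exc}}(t),
\qquad
S_{\mathrm{net}}(t) = \exp\!\left( -\int_{0}^{t} \lambda_{\mathrm{exc}}(u)\,\mathrm{d}u \right).

Net survival S_net(t) is thus the survival that would be observed if the excess (cancer-related) hazard were the only hazard acting on the cohort, which is what makes it comparable across populations with different background mortality.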
Connectivity Measures in EEG Microstructural Sleep Elements.
Sakellariou, Dimitris; Koupparis, Andreas M; Kokkinos, Vasileios; Koutroumanidis, Michalis; Kostopoulos, George K
2016-01-01
During Non-Rapid Eye Movement (NREM) sleep the brain is relatively disconnected from the environment, while connectedness between brain areas is also decreased. Evidence indicates that these dynamic connectivity changes are delivered by microstructural elements of sleep: short periods of environmental stimuli evaluation followed by sleep-promoting procedures. The connectivity patterns of the latter, among other aspects of sleep microstructure, have yet to be fully elucidated. We suggest here a methodology for assessing and investigating the connectivity patterns of EEG microstructural elements, such as sleep spindles. The methodology combines techniques at the preprocessing, estimation, error-assessment, and results-visualization levels in order to allow detailed examination of connectivity (levels and directionality of information flow) over frequency and time with notable resolution, while dealing with volume conduction and EEG reference assessment. The high temporal and frequency resolution of the methodology will allow association between the microelements and the dynamically forming networks that characterize them, and may consequently reveal aspects of the EEG microstructure. The proposed methodology is initially tested on artificially generated signals for proof of concept and subsequently applied to real EEG recordings via a custom-built MATLAB-based tool developed for such studies. Preliminary results from 843 fast sleep spindles recorded during whole-night sleep of 5 healthy volunteers indicate a prevailing pattern of interactions between centroparietal and frontal regions. We thereby present what is, to our knowledge, a first attempt to estimate the scalp EEG connectivity that characterizes fast sleep spindles via the "EEG-element connectivity" methodology we propose. Its application, via the computational tool we developed, suggests that it can investigate the connectivity patterns related to the occurrence of EEG microstructural elements. Network characterization of specified physiological or pathological EEG microstructural elements can potentially be of great importance in the understanding, identification, and prediction of health and disease. PMID:26924980
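As a purely conceptual illustration of quantifying coupling between channels during spindles (and not the authors' time-frequency, directional methodology, which was implemented in their custom MATLAB tool), the short Python sketch below computes magnitude-squared coherence in the fast-spindle band between two simulated channels using SciPy; the sampling rate, band limits, and signals are assumptions made for the example.

import numpy as np
from scipy.signal import coherence

fs = 250.0                      # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)    # 10 s of simulated data
spindle = np.sin(2 * np.pi * 13.5 * t)          # ~13.5 Hz "fast spindle" rhythm
rng = np.random.default_rng(0)
frontal = spindle + 0.5 * rng.standard_normal(t.size)
centroparietal = np.roll(spindle, 5) + 0.5 * rng.standard_normal(t.size)

# Magnitude-squared coherence between the two channels (non-directional measure)
f, cxy = coherence(frontal, centroparietal, fs=fs, nperseg=512)
band = (f >= 11) & (f <= 16)    # fast-spindle frequency band (assumed limits)
print(f"Mean coherence in 11-16 Hz band: {cxy[band].mean():.2f}")

Unlike the directed, time-resolved measures described in the abstract, coherence only indicates the strength of coupling, not the direction of information flow, so this sketch should be read as a first step toward, rather than an instance of, the proposed methodology.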