Four applications of a software data collection and analysis methodology
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Selby, Richard W., Jr.
1985-01-01
The evaluation of software technologies suffers from the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are, respectively, in the areas of software testing, cleanroom software development, characteristic software metric sets, and software error analysis.
NASA Technical Reports Server (NTRS)
Onwubiko, Chinyere; Onyebueke, Landon
1996-01-01
This is the final report covering all the work done on this project. The goal of the project is technology transfer of methodologies to improve the design process. The specific objectives are: 1. To learn and understand probabilistic design analysis using NESSUS. 2. To assign design projects on the application of NESSUS to either undergraduate or graduate students. 3. To integrate the application of NESSUS into selected senior-level courses in the Civil and Mechanical Engineering curricula. 4. To develop courseware in probabilistic design methodology to be included in a graduate-level design methodology course. 5. To study the relationship between the probabilistic design methodology and the axiomatic design methodology.
Railroad classification yard design methodology study Elkhart Yard Rehabilitation : a case study
DOT National Transportation Integrated Search
1980-02-01
This interim report documents the application of a railroad classification yard design methodology to CONRAIL's Elkhart Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodology, and ...
Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo
2010-01-01
Background: Case-crossover is one of the most widely used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design's application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
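As a rough illustration of the time-stratified referent selection highlighted in this review, the Python sketch below builds case-referent strata (matching on year, month, and weekday); the dates, the pm10 series, and all names are hypothetical, and fitting the conditional logistic regression itself is only indicated in a comment.

```python
import pandas as pd

# Hypothetical daily exposure series and event (e.g., admission) dates.
days = pd.date_range("2020-01-01", "2020-12-31", freq="D")
pm10 = pd.Series(range(len(days)), index=days, dtype=float)  # placeholder values
event_dates = pd.to_datetime(["2020-03-10", "2020-07-21"])

rows = []
for case_day in event_dates:
    # Time-stratified referents: days in the same year and month that share
    # the case day's weekday (the case day itself is flagged, not excluded).
    stratum = days[(days.year == case_day.year)
                   & (days.month == case_day.month)
                   & (days.weekday == case_day.weekday())]
    for d in stratum:
        rows.append({"stratum": case_day, "date": d,
                     "case": int(d == case_day), "pm10": pm10[d]})

cc = pd.DataFrame(rows)
print(cc)
# Next step (not shown): conditional logistic regression of `case` on `pm10`
# within `stratum`, e.g., with statsmodels' ConditionalLogit.
```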
DOT National Transportation Integrated Search
1977-06-01
Author's abstract: In this report, a methodology for analyzing general categorical data with misclassification errors is developed and applied to the study of seat belt effectiveness. The methodology assumes the availability of an original large samp...
ERIC Educational Resources Information Center
Cheung, Alan C. K.; Slavin, Robert E.
2013-01-01
The present review examines research on the effects of educational technology applications on mathematics achievement in K-12 classrooms. Unlike previous reviews, this review applies consistent inclusion standards to focus on studies that met high methodological standards. In addition, methodological and substantive features of the studies are…
Gu, Huidong; Wang, Jian; Aubry, Anne-Françoise; Jiang, Hao; Zeng, Jianing; Easter, John; Wang, Jun-sheng; Dockens, Randy; Bifano, Marc; Burrell, Richard; Arnold, Mark E
2012-06-05
A methodology for the accurate calculation and mitigation of isotopic interferences in liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS) assays and its application in supporting microdose absolute bioavailability studies are reported for the first time. For simplicity, this calculation methodology and the strategy to minimize the isotopic interference are demonstrated using a simple molecule entity, then applied to actual development drugs. The exact isotopic interferences calculated with this methodology were often much less than the traditionally used, overestimated isotopic interferences simply based on the molecular isotope abundance. One application of the methodology is the selection of a stable isotopically labeled internal standard (SIL-IS) for an LC-MS/MS bioanalytical assay. The second application is the selection of an SIL analogue for use in intravenous (i.v.) microdosing for the determination of absolute bioavailability. In the case of microdosing, the traditional approach of calculating isotopic interferences can result in selecting a labeling scheme that overlabels the i.v.-dosed drug or leads to incorrect conclusions on the feasibility of using an SIL drug and analysis by LC-MS/MS. The methodology presented here can guide the synthesis by accurately calculating the isotopic interferences when labeling at different positions, using different selective reaction monitoring (SRM) transitions or adding more labeling positions. This methodology has been successfully applied to the selection of the labeled i.v.-dosed drugs for use in two microdose absolute bioavailability studies, before initiating the chemical synthesis. With this methodology, significant time and cost saving can be achieved in supporting microdose absolute bioavailability studies with stable labeled drugs.
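The overestimation mentioned above is easy to see in a carbon-only model: the traditional shortcut takes the whole M+k isotope abundance, while the exact calculation works per labeling scheme and transition. The sketch below computes just the binomial carbon-13 part for an invented 30-carbon drug and a 3-Da label shift; a faithful implementation would convolve all elements and account for the SRM precursor/product ion compositions.

```python
from math import comb

P13C = 0.0107  # natural abundance of carbon-13

def c13_abundance(n_carbons: int, k: int) -> float:
    """Probability that exactly k of n carbons are 13C (binomial model).

    Approximates the M+k peak abundance relative to M when carbon dominates
    the isotope pattern; H, N, O, S contributions are ignored here.
    """
    return comb(n_carbons, k) * P13C**k * (1 - P13C)**(n_carbons - k)

# Example: hypothetical 30-carbon analyte, IS labeled with 3 x 13C.
n, shift = 30, 3
print(f"analyte M+{shift} abundance relative to M: {c13_abundance(n, shift):.4%}")
```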
SOME POSSIBLE APPLICATIONS OF PROJECT OUTCOMES RESEARCH METHODOLOGY
Section I refers to the possibility of applying the theory and methodology of Project Outcomes to problems of strategic information. It is felt that ... purposes of assessing present and future organizational effectiveness. Section IV refers to the applications that our study may have for problems of ...
Multilevel Modeling: A Review of Methodological Issues and Applications
ERIC Educational Resources Information Center
Dedrick, Robert F.; Ferron, John M.; Hess, Melinda R.; Hogarty, Kristine Y.; Kromrey, Jeffrey D.; Lang, Thomas R.; Niles, John D.; Lee, Reginald S.
2009-01-01
This study analyzed the reporting of multilevel modeling applications of a sample of 99 articles from 13 peer-reviewed journals in education and the social sciences. A checklist, derived from the methodological literature on multilevel modeling and focusing on the issues of model development and specification, data considerations, estimation, and…
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1986-01-01
An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.
Dynamic Decision Making under Uncertainty and Partial Information
2017-01-30
In order to address these problems, we investigated efficient computational methodologies for dynamic decision making under uncertainty and partial information. In the course of this research, we (i) developed and studied efficient simulation-based methodologies for dynamic decision making under uncertainty and partial information; (ii) studied the application of these decision making models and methodologies to practical problems, such as those ...
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Mattern, Duane
1994-01-01
An advanced methodology for integrated flight propulsion control (IFPC) design for future aircraft, which will use propulsion system generated forces and moments for enhanced maneuver capabilities, is briefly described. This methodology has the potential to address in a systematic manner the coupling between the airframe and the propulsion subsystems typical of such enhanced maneuverability aircraft. Application of the methodology to a short take-off vertical landing (STOVL) aircraft in the landing approach to hover transition flight phase is presented, with a brief description of the various steps in the IFPC design methodology. The details of the individual steps have been described in previous publications, and the objective of this paper is to focus on how the components of the control system designed at each step integrate into the overall IFPC system. The full nonlinear IFPC system was evaluated extensively in nonreal-time simulations as well as piloted simulations. Results from the nonreal-time evaluations are presented in this paper. Lessons learned from this application study are summarized in terms of areas of potential improvements in the STOVL IFPC design as well as identification of technology development areas to enhance the applicability of the proposed design methodology.
Turbofan engine control system design using the LQG/LTR methodology
NASA Technical Reports Server (NTRS)
Garg, Sanjay
1989-01-01
Application of the linear-quadratic-Gaussian with loop-transfer-recovery methodology to design of a control system for a simplified turbofan engine model is considered. The importance of properly scaling the plant to achieve the desired target feedback loop is emphasized. The steps involved in the application of the methodology are discussed via an example, and evaluation results are presented for a reduced-order compensator. The effect of scaling the plant on the stability robustness evaluation of the closed-loop system is studied in detail.
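The plant-scaling step emphasized above can be sketched with the python-control package's lqr/lqe helpers; the two-state "engine" matrices, scaling factors, and weights below are invented for illustration and are not the paper's model.

```python
import numpy as np
import control  # python-control package

# Hypothetical 2-state, 2-input, 2-output plant (not the paper's engine model).
A = np.array([[-1.0, 0.2], [0.1, -2.0]])
B = np.array([[1.0, 0.0], [0.0, 0.5]])
C = np.eye(2)

# Scale inputs and outputs to comparable engineering ranges before loop
# shaping; the achievable target feedback loop depends on this scaling.
Su = np.diag([1.0, 10.0])   # input scaling (assumed units)
Sy = np.diag([0.1, 1.0])    # output scaling (assumed units)
Bs, Cs = B @ Su, Sy @ C

# LQR gain defines the target loop at the plant input.
K, _, _ = control.lqr(A, Bs, np.eye(2), np.eye(2))

# Loop transfer recovery: inflate fictitious process noise entering through
# Bs; as q grows, the LQG loop at the input approaches the LQR target loop.
q = 1e4
L, _, _ = control.lqe(A, Bs, Cs, q * np.eye(2), np.eye(2))
print("state-feedback gain K:\n", K, "\nobserver gain L:\n", L)
```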
Railroad classification yard design methodology study : East Deerfield Yard, a case study
DOT National Transportation Integrated Search
1980-02-01
This interim report documents the application of a railroad classification yard design methodology to Boston and Maine's East Deerfield Yard Rehabilitation. This case study effort represents Phase 2 of a larger effort to develop a yard design methodol...
Railroad classification yard technology : computer system methodology : case study : Potomac Yard
DOT National Transportation Integrated Search
1981-08-01
This report documents the application of the railroad classification yard computer system methodology to Potomac Yard of the Richmond, Fredericksburg, and Potomac Railroad Company (RF&P). This case study entailed evaluation of the yard traffic capaci...
Giansanti, Daniele; Morelli, Sandra; Maccioni, Giovanni; Guerriero, Lorenzo; Bedini, Remo; Pepe, Gennaro; Colombo, Cesare; Borghi, Gabriella; Macellari, Velio
2009-01-01
Due to major advances in information technology, telemedicine applications are ready for widespread use. Nonetheless, to allow their diffusion in National Health Care Systems (NHCSs), specific methodologies of health technology assessment (HTA) should be used to assess standardization, overall quality, interoperability, and legal, economic and cost-benefit aspects. One of the limits to the diffusion of digital tele-echocardiography (T-E) applications in the NHCS is the lack of a specific methodology for HTA. In the present study, a solution offering a structured HTA of T-E products was designed. The methodology also defines standardized quality levels for the application: the first level represents the minimum level of acceptance, while the other levels are accessory levels useful for a more accurate assessment of the product. The methodology proved useful in rationalizing the process of standardization and received a high degree of acceptance from the subjects involved in the study.
A methodological review of qualitative case study methodology in midwifery research.
Atchan, Marjorie; Davis, Deborah; Foureur, Maralyn
2016-10-01
To explore the use and application of case study research in midwifery. Case study research provides rich data for the analysis of complex issues and interventions in the healthcare disciplines; however, a gap in the midwifery research literature was identified. A methodological review of midwifery case study research using recognized templates, frameworks and reporting guidelines facilitated comprehensive analysis. An electronic database search using the date range January 2005-December 2014: Maternal and Infant Care, CINAHL Plus, Academic Search Complete, Web of Knowledge, SCOPUS, Medline, Health Collection (Informit), Cochrane Library Health Source: Nursing/Academic Edition, Wiley Online and ProQuest Central. Narrative evaluation was undertaken. Clearly worded questions reflected the problem and purpose. The application, strengths and limitations of case study methods were identified through a quality appraisal process. The review identified both case study research's applicability to midwifery and its low uptake, especially in clinical studies. Many papers included the necessary criteria to achieve rigour. The included measures of authenticity and methodology were varied. A high standard of authenticity was observed, suggesting authors considered these elements to be routine inclusions. Technical aspects were lacking in many papers, namely a lack of reflexivity and incomplete transparency of processes. This review raises the profile of case study research in midwifery. Midwives will be encouraged to explore whether case study research is suitable for their investigation. The raised profile will demonstrate further applicability and encourage support and wider adoption in the midwifery setting. © 2016 John Wiley & Sons Ltd.
Recommendations for benefit-risk assessment methodologies and visual representations.
Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul; Goginsky, Alesia; Chan, Edmond; Downey, Gerald F; Hallgreen, Christine E; Hockley, Kimberley S; Juhaeri, Juhaeri; Lieftucht, Alfons; Metcalf, Marilyn A; Noel, Rebecca A; Phillips, Lawrence D; Ashby, Deborah; Micaleff, Alain
2016-03-01
The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. A general pathway through the case studies was evident, with various classes of methodologies having roles to play at different stages. Descriptive and quantitative frameworks were widely used throughout to structure problems, with other methods such as metrics, estimation techniques and elicitation techniques providing ways to incorporate technical or numerical data from various sources. Similarly, tree diagrams and effects tables were universally adopted, with other visualisations available to suit specific methodologies or tasks as required. Every assessment was found to follow five broad stages: (i) Planning, (ii) Evidence gathering and data preparation, (iii) Analysis, (iv) Exploration and (v) Conclusion and dissemination. Adopting formal, structured approaches to benefit-risk assessment was feasible in real-world problems and facilitated clear, transparent decision-making. Prior to this work, no extensive practical application and appraisal of methodologies had been conducted using real-world case examples, leaving users with limited knowledge of their usefulness in the real world. The practical guidance provided here takes us one step closer to a harmonised approach to benefit-risk assessment from multiple perspectives. Copyright © 2016 John Wiley & Sons, Ltd.
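To make the idea of a quantitative effects table concrete, here is a minimal weighted net-benefit score in Python; the outcomes, per-1000-patient differences, weights, and normalizing ranges are all invented, and this represents only one of the many metric-style methodologies the case studies tested.

```python
# Hypothetical effects table: outcome -> (drug minus comparator per 1000
# patients, importance weight). Favorable effects positive, harms negative.
effects = {"symptom relief":    (+120, 0.5),
           "serious infection": (-8,   0.3),
           "discontinuation":   (-25,  0.2)}

# Assumed plausible ranges used to put outcomes on a common [-1, 1] scale.
ranges = {"symptom relief": 200, "serious infection": 20, "discontinuation": 100}

# Weighted net benefit: positive favors the drug, negative the comparator.
score = sum(w * d / ranges[k] for k, (d, w) in effects.items())
print(f"weighted net benefit score: {score:+.3f}")
```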
Evaluation of stormwater harvesting sites using multi criteria decision methodology
NASA Astrophysics Data System (ADS)
Inamdar, P. M.; Sharma, A. K.; Cook, Stephen; Perera, B. J. C.
2018-07-01
Selection of suitable urban stormwater harvesting sites and associated project planning are often complex due to spatial, temporal, economic, environmental and social factors, and various other related variables. This paper aims to develop a comprehensive methodology framework for evaluating stormwater harvesting sites in urban areas using Multi Criteria Decision Analysis (MCDA). In the first phase, the framework selects potential stormwater harvesting (SWH) sites using spatial characteristics in a GIS environment. In the second phase, MCDA methodology is used to evaluate and rank SWH sites in a multi-objective and multi-stakeholder environment. The paper briefly describes the first phase of the framework and focuses chiefly on the second. The application of the methodology is also demonstrated through a case study of the local government area of the City of Melbourne (CoM), Australia, for the benefit of the wider community of water professionals engaged in this area. Nine performance measures (PMs) were identified to characterise the objectives and system performance related to the eight alternative SWH sites. To reflect stakeholder interests, four stakeholder participant groups were identified, namely water authorities (WA), academics (AC), consultants (CS), and councils (CL). The decision analysis methodology broadly consisted of deriving PROMETHEE II rankings of the eight alternative SWH sites in the CoM case study under two distinct group decision making scenarios (a sketch of the ranking computation follows below). The major innovation of this work is the development and application of a comprehensive methodology framework that assists in the selection of potential SWH sites and facilitates their ranking in a multi-objective and multi-stakeholder environment. It is expected that the proposed methodology will provide water professionals and managers with better knowledge that will reduce the subjectivity in the selection and evaluation of SWH sites.
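As referenced above, the PROMETHEE II net-flow computation can be sketched in a few lines; the decision matrix, weights, and criterion directions below are placeholders rather than the study's nine PMs and elicited stakeholder weights.

```python
import numpy as np

def promethee_ii(X, weights, maximize):
    """Net outranking flows (PROMETHEE II) with the 'usual' preference function."""
    n, m = X.shape
    phi = np.zeros(n)
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            # pi(a, b): weighted count of criteria on which a strictly beats b.
            pi_ab = sum(weights[j] for j in range(m)
                        if X[a, j] != X[b, j]
                        and (X[a, j] > X[b, j]) == maximize[j])
            phi[a] += pi_ab   # contributes to a's positive flow
            phi[b] -= pi_ab   # and to b's negative flow
    return phi / (n - 1)      # net flow: higher means better rank

# Toy example: 8 candidate sites x 3 performance measures.
rng = np.random.default_rng(0)
X = rng.random((8, 3))                      # e.g., yield, cost, risk
phi = promethee_ii(X, weights=[0.5, 0.3, 0.2],
                   maximize=[True, False, False])
print("sites ranked best to worst:", np.argsort(-phi))
```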
NASA Technical Reports Server (NTRS)
Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.
1989-01-01
A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still, some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies being used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths, but none would satisfy all of the needs of the Networks Division. Areas for improvement of the methodology being used by the Networks Division are listed with recommendations for specific action.
Tautin, J.; Lebreton, J.-D.; North, P.M.
1993-01-01
Capture-recapture methodology has advanced greatly in the last twenty years and is now a major factor driving the continuing evolution of the North American bird banding program. Bird banding studies are becoming more scientific, with improved study designs and analytical procedures. Researchers and managers are gaining more reliable knowledge, which in turn improves the conservation of migratory birds. The advances in capture-recapture methodology have benefited gamebird studies primarily, but nongame bird studies will benefit similarly as they expand greatly in the next decade. Further theoretical development of capture-recapture methodology should be encouraged, and, to maximize the benefits of the methodology, work on practical applications should be increased.
The Methodology for Developing Mobile Agent Application for Ubiquitous Environment
NASA Astrophysics Data System (ADS)
Matsuzaki, Kazutaka; Yoshioka, Nobukazu; Honiden, Shinichi
This study provides a methodology that enables flexible and reusable development of mobile agent applications for mobility-aware indoor environments. The methodology, named the Workflow-awareness model, is based on the concept of a pair of mobile agents cooperating to perform a given task. A monolithic mobile agent application with numerous concerns in a mobility-aware setting is divided into a master agent (MA) and a shadow agent (SA) according to the type of task. The MA executes the main application logic, which includes monitoring the user's physical movement and coordinating various services. The SA performs additional environment-dependent tasks to aid the MA in achieving efficient execution without losing application logic. "Workflow-awareness" (WFA) means that the SA knows the MA's execution state transitions, so that the SA can provide the proper task at the proper time. A prototype implementation of the methodology was made with practical use of AspectJ, which is used to automate WFA by weaving communication modules into both the MA and the SA. The usefulness of this methodology is analyzed with respect to efficiency and software engineering aspects. As for efficiency, the overhead of WFA is small relative to the total execution time; from a software engineering view, WFA can provide a mechanism for deploying one application in various situations.
Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean
2017-03-01
In the field of development cooperation, interest in systems thinking and complex systems theories as a methodological approach is increasingly recognised, and so it is in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools inspired by systems thinking and complexity theories and the methodological lessons learned from their application. These tools were used in a case study; detailed results of this study are being prepared for publication in additional articles. Applying a complexity 'lens', the subject of the case study is the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implications for the subject under study, and the existing tools associated with those theories which inspired the design of the data collection and analysis process. The tools and their application processes are presented in the Results section, followed in the Discussion section by a critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, with each tool bringing a different and complementary perspective on the system.
A manual for conducting environmental impact studies.
DOT National Transportation Integrated Search
1971-01-01
This report suggests methodologies which should enable an interdisciplinary team to assess community values. The methodologies are applicable to the conceptual, location, and design phases of highway planning, respectively. The approach employs a wei...
Coussot, G; Ladner, Y; Bayart, C; Faye, C; Vigier, V; Perrin, C
2015-01-09
This work aims to study the potential of an on-line capillary electrophoresis (CE)-based digestion methodology for evaluating polymer-drug conjugate degradability in the presence of free trypsin (in-solution digestion). A sandwich plug injection scheme with transverse diffusion of laminar flow profiles (TDLFP) mode was used to achieve on-line digestions. Electrophoretic separation conditions were established using poly-L-lysine (PLL) as the reference substrate. A comparison with off-line digestion was carried out to demonstrate the feasibility of the proposed methodology. The applicability of the on-line CE-based digestion methodology was evaluated for two PLL-drug conjugates and for the first four generations of dendrigrafts of lysine (DGL). Different electrophoretic profiles showing the formation of di-, tri-, and tetralysine were observed for PLL-drug and DGL. These findings are in good agreement with the nature of the linker used to attach the drug to the PLL structure and with the predicted degradability of DGL. The applicability of the present on-line methodology was also successfully proven for protein conjugate hydrolysis. In summary, the described methodology provides a powerful tool for the rapid study of biodegradable polymers. Copyright © 2014 Elsevier B.V. All rights reserved.
Applications of mixed-methods methodology in clinical pharmacy research.
Hadi, Muhammad Abdul; Closs, S José
2016-06-01
Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature, and the four most commonly used designs in healthcare research are: the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, so it is advisable to adhere to this to ensure methodological rigour. When to use it: Mixed-methods methodology is best suited when the research questions require: triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another method; developing a scale/questionnaire; or answering different research questions within a single study. Two case studies have been presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.
2016-12-22
assumptions of behavior. This research proposes an information theoretic methodology to discover such complex network structures and dynamics while overcoming the difficulties historically associated with their study. Indeed, this was the first application of an information theoretic methodology as a tool ...
ERIC Educational Resources Information Center
Cosier, Meghan
2012-01-01
Historically, researchers focused on individuals with severe disabilities have utilized single-subject research methodologies to study the application of the behavioral theory to learning. In contrast, disability studies scholars have primarily used qualitative research methodologies to study quality of life or policy issues related to individuals…
Proteomic Profiling of Rat Thyroarytenoid Muscle
ERIC Educational Resources Information Center
Welham, Nathan V.; Marriott, Gerard; Bless, Diane M.
2006-01-01
Purpose: Proteomic methodologies offer promise in elucidating the systemwide cellular and molecular processes that characterize normal and diseased thyroarytenoid (TA) muscle. This study examined methodological issues central to the application of 2-dimensional sodium dodecyl sulfate polyacrylamide gel electrophoresis (2D SDS-PAGE) to the study of…
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
NASA Technical Reports Server (NTRS)
Unal, Resit; Keating, Charles; Conway, Bruce; Chytka, Trina
2004-01-01
A comprehensive expert-judgment elicitation methodology to quantify input parameter uncertainty and analysis tool uncertainty in a conceptual launch vehicle design analysis has been developed. The ten-phase methodology seeks to obtain expert judgment opinion for quantifying uncertainties as a probability distribution so that multidisciplinary risk analysis studies can be performed. The calibration and aggregation techniques presented as part of the methodology are aimed at improving individual expert estimates, and provide an approach to aggregate multiple expert judgments into a single probability distribution. The purpose of this report is to document the methodology development and its validation through application to a reference aerospace vehicle. A detailed summary of the application exercise, including calibration and aggregation results is presented. A discussion of possible future steps in this research area is given.
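A linear opinion pool is one simple way to aggregate several experts' elicited distributions into a single probability distribution, which is what the methodology's aggregation phase produces; the triangular elicitations and calibration-based weights below are hypothetical, and the paper's ten-phase procedure is considerably richer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical elicited (min, mode, max) triangular distributions per expert
# for some uncertain design parameter, plus assumed calibration weights.
experts = [(0.8, 1.0, 1.5), (0.9, 1.2, 1.8), (0.7, 1.1, 1.6)]
weights = np.array([0.5, 0.3, 0.2])

# Linear opinion pool: sample from the weighted mixture of expert densities.
n = 20_000
idx = rng.choice(len(experts), size=n, p=weights)
samples = np.array([rng.triangular(*experts[i]) for i in idx])
print("pooled mean:", samples.mean())
print("5th/95th percentiles:", np.percentile(samples, [5, 95]))
```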
NASA Astrophysics Data System (ADS)
Sharma, Amita; Sarangdevot, S. S.
2010-11-01
Aspect-Oriented Programming (AOP) methodology has been investigated in development of real world business application software—Financial Accounting Software. Eclipse-AJDT environment has been used as open source enhanced IDE support for programming in AOP language—Aspect J. Crosscutting concerns have been identified and modularized as aspects. This reduces the complexity of the design considerably due to elimination of code scattering and tangling. Improvement in modularity, quality and performance is achieved. The study concludes that AOP methodology in Eclipse-AJDT environment offers powerful support for modular design and implementation of real world quality business software.
The Expanded Application of Forensic Science and Law Enforcement Methodologies in Army Counterintelligence
Stockham, Braden E.
2017-09-01
... forensic science resources, law enforcement methodologies and procedures, and basic investigative training. In order to determine if these changes would ...
Deljavan, Reza; Sadeghi-Bazargani, Homayoun; Fouladi, Nasrin; Arshi, Shahnam; Mohammadi, Reza
2012-01-01
Little has been done to investigate the application of injury-specific qualitative research methods in the field of burn injuries. The aim of this study was to use an analytical tool (Haddon's matrix) through qualitative research methods to better understand people's perceptions about burn injuries. This study applied Haddon's matrix as a framework and an analytical tool for a qualitative research methodology in burn research. Both child and adult burn injury victims were enrolled in a qualitative study conducted using focus group discussions. Haddon's matrix was used to develop an interview guide and was also used through the analysis phase. The main analysis clusters were the pre-event level/human (including risky behaviors, beliefs and cultural factors, and knowledge and education), pre-event level/object, pre-event phase/environment, and the event and post-event phase (including fire control, emergency scald and burn wound management, traditional remedies, medical consultation, and severity indicators). This research gave rise to results that are possibly useful both for future injury research and for designing burn injury prevention plans. Haddon's matrix is applicable in a qualitative research methodology at both the data collection and data analysis phases. The study using Haddon's matrix through a qualitative research methodology yielded substantially rich information regarding burn injuries that may be useful for prevention or future quantitative research.
NASA Astrophysics Data System (ADS)
Jia, Xiaodong; Jin, Chao; Buzza, Matt; Di, Yuan; Siegel, David; Lee, Jay
2018-01-01
Successful applications of Diffusion Map (DM) in machine failure detection and diagnosis have been reported in several recent studies. DM provides an efficient way to visualize high-dimensional, complex and nonlinear machine data, and thus yields more knowledge about the machine under monitoring. In this paper, a DM-based methodology named DM-EVD is proposed for machine degradation assessment, abnormality detection and diagnosis in an online fashion. Several limitations and challenges of using DM for machine health monitoring have been analyzed and addressed. Based on the proposed DM-EVD, a deviation-based methodology is then proposed to include more dimension reduction methods. In this work, the incorporation of Laplacian Eigenmap and Principal Component Analysis (PCA) is explored; the latter algorithm is named PCA-Dev and is validated in the case study. To show the successful application of the proposed methodology, case studies from diverse fields are presented and investigated in this work. Improved results are reported by benchmarking against other machine learning algorithms.
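For readers unfamiliar with DM, a minimal diffusion-map embedding takes only a few lines of NumPy; this sketch shows the kernel / Markov-normalization / eigendecomposition core on random placeholder features, not the paper's DM-EVD health-assessment pipeline.

```python
import numpy as np

def diffusion_map(X, eps, n_coords=2, t=1):
    """Minimal diffusion-map embedding of the rows of X."""
    # Pairwise squared distances and Gaussian kernel with bandwidth eps.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / eps)
    # Row-normalize to a Markov transition matrix.
    P = K / K.sum(axis=1, keepdims=True)
    # Spectral decomposition; the top eigenvalue of P is 1 (constant vector).
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)
    vals, vecs = vals.real[order], vecs.real[:, order]
    # Drop the trivial component; scale coordinates by eigenvalue^t.
    return vecs[:, 1:n_coords + 1] * vals[1:n_coords + 1] ** t

# Toy usage on random "machine feature" vectors.
rng = np.random.default_rng(1)
emb = diffusion_map(rng.normal(size=(50, 6)), eps=5.0)
print(emb.shape)  # (50, 2)
```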
NASA Astrophysics Data System (ADS)
Lee, Jay; Wu, Fangji; Zhao, Wenyu; Ghaffari, Masoud; Liao, Linxia; Siegel, David
2014-01-01
Much research has been conducted in prognostics and health management (PHM), an emerging field in mechanical engineering that is gaining interest from both academia and industry. Most of these efforts have been in the area of machinery PHM, resulting in the development of many algorithms for this particular application. The majority of these algorithms concentrate on applications involving common rotary machinery components, such as bearings and gears. Knowledge of this prior work is a necessity for any future research efforts to be conducted; however, there has not been a comprehensive overview that details previous and on-going efforts in PHM. In addition, a systematic method for developing and deploying a PHM system has yet to be established. Such a method would enable rapid customization and integration of PHM systems for diverse applications. To address these gaps, this paper provides a comprehensive review of the PHM field, followed by an introduction of a systematic PHM design methodology, 5S methodology, for converting data to prognostics information. This methodology includes procedures for identifying critical components, as well as tools for selecting the most appropriate algorithms for specific applications. Visualization tools are presented for displaying prognostics information in an appropriate fashion for quick and accurate decision making. Industrial case studies are included in this paper to show how this methodology can help in the design of an effective PHM system.
Vertically aligned carbon nanotubes for microelectrode arrays applications.
Castro Smirnov, J R; Jover, Eric; Amade, Roger; Gabriel, Gemma; Villa, Rosa; Bertran, Enric
2012-09-01
In this work a methodology to fabricate carbon nanotube based electrodes using plasma enhanced chemical vapour deposition has been explored and defined. The final integrated microelectrode-based devices should present specific properties that make them suitable for microelectrode array applications. The methodology studied has been focused on the preparation of a highly regular and dense vertically aligned carbon nanotube (VACNT) mat compatible with the standard lithography used in microelectrode array technology.
Benefit-cost methodology study with example application of the use of wind generators
NASA Technical Reports Server (NTRS)
Zimmer, R. P.; Justus, C. G.; Mason, R. M.; Robinette, S. L.; Sassone, P. G.; Schaffer, W. A.
1975-01-01
An example application for cost-benefit methodology is presented for the use of wind generators. The approach adopted for the example application consisted of the following activities: (1) surveying of the available wind data and wind power system information, (2) developing models which quantitatively described wind distributions, wind power systems, and cost-benefit differences between conventional systems and wind power systems, and (3) applying the cost-benefit methodology to compare a conventional electrical energy generation system with systems which included wind power generators. Wind speed distribution data were obtained from sites throughout the contiguous United States and were used to compute plant factor contours shown on an annual and seasonal basis. Plant factor values (ratio of average output power to rated power) are found to be as high as 0.6 (on an annual average basis) in portions of the central U. S. and in sections of the New England coastal area. Two types of wind power systems were selected for the application of the cost-benefit methodology. A cost-benefit model was designed and implemented on a computer to establish a practical tool for studying the relative costs and benefits of wind power systems under a variety of conditions and to efficiently and effectively perform associated sensitivity analyses.
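The plant factor defined above (average output power over rated power) is straightforward to estimate once a wind-speed distribution and a turbine power curve are assumed; the sketch below uses a Weibull wind model and an idealized cubic power curve with invented turbine parameters, not the study's systems.

```python
import numpy as np

def plant_factor(k, c, v_in, v_rated, v_out, n=200_000, seed=0):
    """Plant factor = mean power / rated power for a Weibull(k, c) wind site."""
    v = c * np.random.default_rng(seed).weibull(k, n)   # wind speeds, m/s
    p = np.where(v < v_in, 0.0,                          # below cut-in: off
        np.where(v < v_rated, (v / v_rated) ** 3,        # cubic rise to rated
        np.where(v < v_out, 1.0, 0.0)))                  # rated, then cut-out
    return p.mean()

# Hypothetical site (Weibull shape 2, scale 7 m/s) and turbine speeds.
print(plant_factor(k=2.0, c=7.0, v_in=3.0, v_rated=12.0, v_out=25.0))
```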
Cooper, Chris; Booth, Andrew; Britten, Nicky; Garside, Ruth
2017-11-28
The purpose and contribution of supplementary search methods in systematic reviews is increasingly acknowledged. Numerous studies have demonstrated their potential in identifying studies or study data that would have been missed by bibliographic database searching alone. What is less certain is how supplementary search methods actually work, how they are applied, and the consequent advantages, disadvantages and resource implications of each search method. The aim of this study is to compare current practice in using supplementary search methods with methodological guidance. Four methodological handbooks in informing systematic review practice in the UK were read and audited to establish current methodological guidance. Studies evaluating the use of supplementary search methods were identified by searching five bibliographic databases. Studies were included if they (1) reported practical application of a supplementary search method (descriptive) or (2) examined the utility of a supplementary search method (analytical) or (3) identified/explored factors that impact on the utility of a supplementary method, when applied in practice. Thirty-five studies were included in this review in addition to the four methodological handbooks. Studies were published between 1989 and 2016, and dates of publication of the handbooks ranged from 1994 to 2014. Five supplementary search methods were reviewed: contacting study authors, citation chasing, handsearching, searching trial registers and web searching. There is reasonable consistency between recommended best practice (handbooks) and current practice (methodological studies) as it relates to the application of supplementary search methods. The methodological studies provide useful information on the effectiveness of the supplementary search methods, often seeking to evaluate aspects of the method to improve effectiveness or efficiency. In this way, the studies advance the understanding of the supplementary search methods. Further research is required, however, so that a rational choice can be made about which supplementary search strategies should be used, and when.
Steuten, Lotte; van de Wetering, Gijs; Groothuis-Oudshoorn, Karin; Retèl, Valesca
2013-01-01
This article provides a systematic and critical review of the evolving methods and applications of value of information (VOI) in academia and practice and discusses where future research needs to be directed. Published VOI studies were identified by conducting a computerized search on Scopus and ISI Web of Science from 1980 until December 2011 using pre-specified search terms. Only full-text papers that outlined and discussed VOI methods for medical decision making, and studies that applied VOI and explicitly discussed the results with a view to informing healthcare decision makers, were included. The included papers were divided into methodological and applied papers, based on the aim of the study. A total of 118 papers were included of which 50 % (n = 59) are methodological. A rapidly accumulating literature base on VOI from 1999 onwards for methodological papers and from 2005 onwards for applied papers is observed. Expected value of sample information (EVSI) is the preferred method of VOI to inform decision making regarding specific future studies, but real-life applications of EVSI remain scarce. Methodological challenges to VOI are numerous and include the high computational demands, dealing with non-linear models and interdependency between parameters, estimations of effective time horizons and patient populations, and structural uncertainties. VOI analysis receives increasing attention in both the methodological and the applied literature bases, but challenges to applying VOI in real-life decision making remain. For many technical and methodological challenges to VOI analytic solutions have been proposed in the literature, including leaner methods for VOI. Further research should also focus on the needs of decision makers regarding VOI.
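As a minimal concrete VOI calculation, the sketch below estimates the expected value of perfect information (EVPI) by Monte Carlo for a hypothetical two-option decision expressed as incremental net benefit; EVSI, the method the review identifies as preferred for planning specific studies, additionally requires simulating prospective study data and is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical incremental net benefit (INB) of treatment vs. comparator;
# parameter uncertainty is represented by Monte Carlo draws.
inb = rng.normal(loc=500.0, scale=2000.0, size=100_000)

# Decision under current information: adopt iff E[INB] > 0.
# EVPI = E[max(INB, 0)] - max(E[INB], 0): the expected gain from resolving
# all parameter uncertainty before deciding.
evpi = np.maximum(inb, 0.0).mean() - max(inb.mean(), 0.0)
print(f"EVPI per patient: {evpi:.0f} (same currency units as INB)")
```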
The RAAF Logistics Study. Volume 4,
1986-10-01
Use of Issue-Based Root Definitions; Application of Soft Systems Methodology to Information Systems Analysis; Conclusion; List of Abbreviations ... 'Management Control Systems', Journal of Applied Systems Analysis, Volume 6, 1979, pages 51 to 67. The soft systems methodology was developed to tackle ... Although the soft systems methodology has many advantages which recommend it to this type of study area, it does not model the time evolution of a system.
Stochastic response surface methodology: A study in the human health area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, Teresa A., E-mail: teresa.oliveira@uab.pt; Oliveira, Amílcar, E-mail: amilcar.oliveira@uab.pt; Centro de Estatística e Aplicações, Universidade de Lisboa
2015-03-10
In this paper we review Stochastic Response Surface Methodology as a tool for modeling uncertainty in the context of Risk Analysis. An application in the survival analysis in the breast cancer context is implemented with R software.
Calibration Modeling Methodology to Optimize Performance for Low Range Applications
NASA Technical Reports Server (NTRS)
McCollum, Raymond A.; Commo, Sean A.; Parker, Peter A.
2010-01-01
Calibration is a vital process in characterizing the performance of an instrument in an application environment and seeks to obtain acceptable accuracy over the entire design range. Often, project requirements specify a maximum total measurement uncertainty expressed as a percent of full scale. However, in some applications we seek enhanced performance at the low end of the range, so expressing the accuracy as a percent of reading should be considered as a modeling strategy. For example, it is common to desire to use a force balance in multiple facilities or regimes, often well below its designed full-scale capacity. This paper presents a general statistical methodology for optimizing calibration mathematical models based on a percent-of-reading accuracy requirement, which has broad application in all types of transducer applications where low-range performance is required. A case study illustrates the proposed methodology for the Mars Entry Atmospheric Data System, which employs seven strain-gage based pressure transducers mounted on the heatshield of the Mars Science Laboratory mission.
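One way to encode a percent-of-reading objective in the calibration fit is to weight each residual by the inverse square of the applied load, so relative rather than absolute errors are minimized; the calibration points below are fabricated, and the paper's statistical methodology is more general than this sketch.

```python
import numpy as np

# Fabricated calibration data: applied load vs. transducer output (volts).
applied = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 200.0, 400.0])
output = np.array([0.051, 0.100, 0.252, 0.499, 1.003, 1.998, 4.010])

# OLS minimizes absolute residuals (percent of full scale); weighting by
# 1/applied^2 minimizes relative residuals (percent of reading), improving
# the fit at the low end of the range.
w = np.sqrt(1.0 / applied**2)
X = np.vander(output, 2)                       # model: applied ~ a*output + b
coef, *_ = np.linalg.lstsq(X * w[:, None], applied * w, rcond=None)

residual_pct = 100 * (X @ coef - applied) / applied
print("percent-of-reading residuals:", np.round(residual_pct, 3))
```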
The Professionalization of Dental Students: The Application of Socio-Anthropological Methodology.
ERIC Educational Resources Information Center
Platt, Larry A.; Bailey, Wilfrid C.
This paper discusses the advantages of using both qualitative and quantitative methodological procedures in investigating attitudinal and perception changes in the population studied. This project is part of a 4-year longitudinal study involving 24 dental students and 29 faculty members of a new southern dental school. The paper reviews some of…
ERIC Educational Resources Information Center
Williams, Lawrence H., Jr.
2013-01-01
This qualitative study analyzed experiences of twenty software developers. The research showed that all software development methodologies are distinct from each other. While some, such as waterfall, focus on traditional, plan-driven approaches that allow software requirements and design to evolve; others facilitate ambiguity and uncertainty by…
An omnibus test for family-based association studies with multiple SNPs and multiple phenotypes.
Lasky-Su, Jessica; Murphy, Amy; McQueen, Matthew B; Weiss, Scott; Lange, Christoph
2010-06-01
We propose an omnibus family-based association test (MFBAT) that can be applied to multiple markers and multiple phenotypes and that has only one degree of freedom. The proposed test statistic extends current FBAT methodology to incorporate multiple markers as well as multiple phenotypes. Using simulation studies, power estimates for the proposed methodology are compared with the standard methodologies. On the basis of these simulations, we find that MFBAT substantially outperforms other methods, including haplotypic approaches and doing multiple tests with single single-nucleotide polymorphisms (SNPs) and single phenotypes. The practical relevance of the approach is illustrated by an application to asthma in which SNP/phenotype combinations are identified and reach overall significance that would not have been identified using other approaches. This methodology is directly applicable to cases in which there are multiple SNPs, such as candidate gene studies, cases in which there are multiple phenotypes, such as expression data, and cases in which there are multiple phenotypes and genotypes, such as genome-wide association studies that incorporate expression profiles as phenotypes. This program is available in the PBAT analysis package.
High-Dimensional Sparse Factor Modeling: Applications in Gene Expression Genomics
Carvalho, Carlos M.; Chang, Jeffrey; Lucas, Joseph E.; Nevins, Joseph R.; Wang, Quanli; West, Mike
2010-01-01
We describe studies in molecular profiling and biological pathway analysis that use sparse latent factor and regression models for microarray gene expression data. We discuss breast cancer applications and key aspects of the modeling and computational methodology. Our case studies aim to investigate and characterize heterogeneity of structure related to specific oncogenic pathways, as well as links between aggregate patterns in gene expression profiles and clinical biomarkers. Based on the metaphor of statistically derived “factors” as representing biological “subpathway” structure, we explore the decomposition of fitted sparse factor models into pathway subcomponents and investigate how these components overlay multiple aspects of known biological activity. Our methodology is based on sparsity modeling of multivariate regression, ANOVA, and latent factor models, as well as a class of models that combines all components. Hierarchical sparsity priors address questions of dimension reduction and multiple comparisons, as well as scalability of the methodology. The models include practically relevant non-Gaussian/nonparametric components for latent structure, underlying often quite complex non-Gaussianity in multivariate expression patterns. Model search and fitting are addressed through stochastic simulation and evolutionary stochastic search methods that are exemplified in the oncogenic pathway studies. Supplementary supporting material provides more details of the applications, as well as examples of the use of freely available software tools for implementing the methodology. PMID:21218139
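The "factors as subpathways" metaphor can be illustrated with any sparse factorization; the sketch below runs scikit-learn's SparsePCA on a toy expression matrix with one planted gene module, purely as an L1 stand-in for the Bayesian sparsity priors the paper actually uses.

```python
import numpy as np
from sklearn.decomposition import SparsePCA

# Toy expression matrix: 100 samples x 50 genes, with genes 0-9 sharing a
# latent "pathway" activity signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))
activity = rng.normal(size=100)
X[:, :10] += np.outer(activity, np.ones(10))

spca = SparsePCA(n_components=3, alpha=1.0, random_state=0)
scores = spca.fit_transform(X)

# Sparse loadings: most genes get exactly zero weight in each factor, so a
# factor reads as a small gene set ("subpathway") rather than a dense blend.
print("nonzero genes per factor:",
      (np.abs(spca.components_) > 1e-8).sum(axis=1))
```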
A Systematic Determination of Skill and Simulator Requirements for Airplane Pilot Certification
DOT National Transportation Integrated Search
1985-03-01
This research report describes: (1) the FAA's ATP airman certification system; (2) needs of the system regarding simulator use; (3) a systematic methodology for meeting these needs; (4) application of the methodology; (5) results of the study; and (6...
ERIC Educational Resources Information Center
Nordstrum, Lee E.; LeMahieu, Paul G.; Dodd, Karen
2017-01-01
Purpose: This paper is one of seven in this volume elaborating different approaches to quality improvement in education. This paper aims to delineate a methodology called Deliverology. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study showing an application of Deliverology in the…
Methodology for estimating helicopter performance and weights using limited data
NASA Technical Reports Server (NTRS)
Baserga, Claudio; Ingalls, Charles; Lee, Henry; Peyran, Richard
1990-01-01
Methodology is developed and described for estimating the flight performance and weights of a helicopter for which limited data are available. The methodology is based on assumptions which couple knowledge of the technology of the helicopter under study with detailed data from well documented helicopters thought to be of similar technology. The approach, analysis assumptions, technology modeling, and the use of reference helicopter data are discussed. Application of the methodology is illustrated with an investigation of the Agusta A129 Mangusta.
ERIC Educational Resources Information Center
Gaven, Patricia; Williams, R. David
An experiment is proposed which will study the advantages of satellite technology as a means for the standardization of teaching methodology in an attempt to socially integrate the rural Alaskan native. With "Man: A Course of Study" as the curricular base of the experiment, there will be a Library Experiment Program for Adults using…
Nanosatellite and Plug-and-Play Architecture 2 (NAPA 2)
2017-02-28
potentially other militarily relevant roles. The "i-Missions" focus area studies the kinetics of rapid mission development. The methodology involves ... the US and Sweden in the Nanosatellite and Plug-and-play Architecture or "NAPA" program) is to pioneer a methodology for creating mission capable 6U spacecraft. The methodology involves interchangeable blackbox (self-describing) components, software (middleware and applications), advanced ...
Abad, Sergi; Pérez, Xavier; Planas, Antoni; Turon, Xavier
2014-04-01
Recently, the need for crude glycerol valorisation from the biodiesel industry has generated many studies for practical and economic applications. Amongst them, fermentations based on glycerol media for the production of high value metabolites are prominent applications. This has generated a need to develop analytical techniques which allow fast and simple glycerol monitoring during fermentation. The methodology should be fast and inexpensive to be adopted in research, as well as in industrial applications. In this study three different methods were analysed and compared: two common methodologies based on liquid chromatography and enzymatic kits, and the new method based on a DotBlot assay coupled with image analysis. The new methodology is faster and cheaper than the other conventional methods, with comparable performance. Good linearity, precision and accuracy were achieved in the lower range (10 or 15 g/L to depletion), the most common range of glycerol concentrations to monitor fermentations in terms of growth kinetics. Copyright © 2014 Elsevier B.V. All rights reserved.
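A DotBlot-style readout ultimately reduces to an intensity-versus-concentration calibration curve plus inverse prediction; the standards below are invented values inside the linear range reported above, not the study's data.

```python
import numpy as np

# Invented standards: glycerol (g/L) vs. mean spot intensity from image analysis.
conc = np.array([0.0, 2.5, 5.0, 7.5, 10.0, 15.0])
intensity = np.array([0.02, 0.21, 0.40, 0.61, 0.79, 1.18])

# Fit the calibration line over the linear range.
slope, intercept = np.polyfit(conc, intensity, 1)

def glycerol_from_intensity(i: float) -> float:
    """Inverse prediction: concentration of an unknown from its intensity."""
    return (i - intercept) / slope

print(f"estimated glycerol: {glycerol_from_intensity(0.50):.2f} g/L")
```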
ERIC Educational Resources Information Center
Nickerson, Carol A.; McClelland, Gary H.
1988-01-01
A methodology is developed based on axiomatic conjoint measurement to accompany a fertility decision-making model. The usefulness of the model is then demonstrated via an application to a study of contraceptive choice (N=100 male and female family-planning clinic clients). Finally, the validity of the model is evaluated. (TJH)
Development of Methodologies Evaluating Emissions from Metal-Containing Explosives and Propellants
Experiments were performed to develop methodologies that will allow determination of pollutant emission factors for gases and particles produced by ... micrometer, 16 by weight). Although not included here, the analysis methods described will be directly applicable to the study of pyrotechnics.
Temporal Code-Driven Stimulation: Definition and Application to Electric Fish Signaling
Lareo, Angel; Forlim, Caroline G.; Pinto, Reynaldo D.; Varona, Pablo; Rodriguez, Francisco de Borja
2016-01-01
Closed-loop activity-dependent stimulation is a powerful methodology to assess information processing in biological systems. In this context, the development of novel protocols, their implementation in bioinformatics toolboxes and their application to different description levels open up a wide range of possibilities in the study of biological systems. We developed a methodology for studying biological signals representing them as temporal sequences of binary events. A specific sequence of these events (code) is chosen to deliver a predefined stimulation in a closed-loop manner. The response to this code-driven stimulation can be used to characterize the system. This methodology was implemented in a real time toolbox and tested in the context of electric fish signaling. We show that while there are codes that evoke a response that cannot be distinguished from a control recording without stimulation, other codes evoke a characteristic distinct response. We also compare the code-driven response to open-loop stimulation. The discussed experiments validate the proposed methodology and the software toolbox. PMID:27766078
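The core of code-driven stimulation, detecting a chosen binary code in the ongoing event sequence and triggering a stimulus the moment it completes, can be sketched offline as follows; the event sequence and code are toy placeholders, and the authors' real-time toolbox additionally handles signal acquisition and stimulus delivery.

```python
import numpy as np

def code_driven_hits(events, code):
    """Return the bin indices at which the target code completes."""
    code = np.asarray(code)
    hits = []
    for t in range(len(code), len(events) + 1):
        if np.array_equal(events[t - len(code):t], code):
            hits.append(t)  # a closed-loop system would stimulate here
    return hits

# Toy binary sequence (1 = an electric-organ discharge fell in that time bin).
rng = np.random.default_rng(0)
events = rng.integers(0, 2, size=200)
print("stimulation triggers at bins:", code_driven_hits(events, [1, 0, 1, 1]))
```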
Samuel, Gbeminiyi O; Hoffmann, Sebastian; Wright, Robert A; Lalu, Manoj Mathew; Patlewicz, Grace; Becker, Richard A; DeGeorge, George L; Fergusson, Dean; Hartung, Thomas; Lewis, R Jeffrey; Stephens, Martin L
2016-01-01
Assessments of methodological and reporting quality are critical to adequately judging the credibility of a study's conclusions and to gauging its potential reproducibility. To aid those seeking to assess the methodological or reporting quality of studies relevant to toxicology, we conducted a scoping review of the available guidance with respect to four types of studies: in vivo and in vitro, (quantitative) structure-activity relationships ([Q]SARs), physico-chemical, and human observational studies. Our aims were to identify the available guidance in this diverse literature, briefly summarize each document, and distill the common elements of these documents for each study type. In general, we found considerable guidance for in vivo and human studies, but only one paper addressed in vitro studies exclusively. The guidance for (Q)SAR studies and physico-chemical studies was scant but authoritative. There was substantial overlap across guidance documents in the proposed criteria for both methodological and reporting quality. Some guidance documents address toxicology research directly, whereas others address preclinical research generally or clinical research and therefore may not be fully applicable to the toxicology context without some translation. Another challenge is the degree to which assessments of methodological quality in toxicology should focus on risk of bias - as in clinical medicine and healthcare - or be broadened to include other quality measures, such as confirming the identity of test substances prior to exposure. Our review is intended primarily for those in toxicology and risk assessment seeking an entry point into the extensive and diverse literature on methodological and reporting quality applicable to their work. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Cho, Kyung Hwa; Lee, Seungwon; Ham, Young Sik; Hwang, Jin Hwan; Cha, Sung Min; Park, Yongeun; Kim, Joon Ha
2009-01-01
The present study proposes a methodology for determining the effective dispersion coefficient based on field measurements performed in Gwangju (GJ) Creek in South Korea, which is environmentally degraded by artificial interferences such as weirs and culverts. Many previous works determining the dispersion coefficient were limited in application due to the complexity of, and artificial interferences in, natural streams. Therefore, the sequential combination of an N-Tank-In-Series (NTIS) model and an Advection-Dispersion-Reaction (ADR) model is proposed in this study for evaluating the dispersion process in complex stream channels. A series of water quality data was intensively monitored in the field to determine the effective dispersion coefficient of E. coli on a rainy day. As a result, the suggested methodology reasonably estimated the dispersion coefficient for GJ Creek at 1.25 m²/s. The sequential combined method also provided Number of tanks-Velocity-Dispersion coefficient (NVD) curves for convenient evaluation of the dispersion coefficients of other rivers or streams. Compared with previous studies, the present methodology is quite general and simple for determining effective dispersion coefficients applicable to other rivers and streams.
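The NVD curves build on the classical equivalence between the tanks-in-series and dispersion models: for weak dispersion, N tanks over a reach of length L with mean velocity U correspond to a Peclet number of roughly 2N, i.e., D = UL/(2N). The sketch below illustrates only that textbook relation, with invented reach parameters rather than the GJ Creek calibration.

```python
# Sketch of the tanks-in-series / dispersion-model equivalence that underlies
# an NVD-type curve. For weak dispersion, N tanks over reach length L with
# mean velocity U correspond to Peclet number Pe ~ 2N, i.e. D = U*L/(2N).
# Numbers below are illustrative, not the GJ Creek values.
def effective_dispersion(u_m_s: float, length_m: float, n_tanks: int) -> float:
    """Effective longitudinal dispersion coefficient (m^2/s)."""
    return u_m_s * length_m / (2.0 * n_tanks)

for n in (5, 10, 20):
    print(n, effective_dispersion(0.5, 2000.0, n))  # D falls as N grows
```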
Formulating accident occurrence as a survival process.
Chang, H L; Jovanis, P P
1990-10-01
A conceptual framework for accident occurrence is developed based on the principle of the driver as an information processor. The framework underlies the development of a modeling approach that is consistent with the definition of exposure to risk as a repeated trial. Survival theory is proposed as a statistical technique that is consistent with the conceptual structure and allows the exploration of a wide range of factors that contribute to highway operating risk. This survival model of accident occurrence is developed at a disaggregate level, allowing safety researchers to broaden the scope of studies which may be limited by the use of traditional aggregate approaches. An application of the approach to motor carrier safety is discussed as are potential applications to a variety of transportation industries. Lastly, a typology of highway safety research methodologies is developed to compare the properties of four safety methodologies: laboratory experiments, on-the-road studies, multidisciplinary accident investigations, and correlational studies. The survival theory formulation has a mathematical structure that is compatible with each safety methodology, so it may facilitate the integration of findings across methodologies.
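As a concrete instance of the survival formulation, a constant-hazard (exponential) model treats each unit of driving exposure as a repeated trial of risk. The sketch below uses synthetic exposure data; the disaggregate models discussed in the paper would add covariates, for example through proportional-hazards regression.

```python
import numpy as np

# Minimal survival sketch under a constant-hazard (exponential) assumption:
# lambda is estimated as events per unit exposure, and S(t) is the
# probability of surviving t hours accident-free. Data are synthetic.
exposure_hours = np.array([120.0, 95.0, 210.0, 80.0, 150.0])  # per driver
accidents = np.array([1, 0, 2, 0, 1])

lam = accidents.sum() / exposure_hours.sum()   # MLE of the hazard rate

def survival(t_hours):
    return np.exp(-lam * t_hours)

print(f"hazard = {lam:.4f}/h, S(100h) = {survival(100):.3f}")
```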
Fadyl, Joanna K; Nicholls, David A; McPherson, Kathryn M
2013-09-01
Discourse analysis following the work of Michel Foucault has become a valuable methodology in the critical analysis of a broad range of topics relating to health. However, it can be a daunting task, in that there seems to be both a huge number of possible approaches to carrying out this type of project, and an abundance of different, often conflicting, opinions about what counts as 'Foucauldian'. This article takes the position that methodological design should be informed by ongoing discussion and applied as appropriate to a particular area of inquiry. The discussion given offers an interpretation and application of Foucault's methodological principles, integrating a reading of Foucault with applications of his work by other authors, showing how this is then applied to interrogate the practice of vocational rehabilitation. It is intended as a contribution to methodological discussion in this area, offering an interpretation of various methodological elements described by Foucault, alongside specific application of these aspects.
Product environmental footprint in policy and market decisions: Applicability and impact assessment.
Lehmann, Annekatrin; Bach, Vanessa; Finkbeiner, Matthias
2015-07-01
In April 2013, the European Commission published the Product and Organisation Environmental Footprint (PEF/OEF) methodology, a life cycle-based multicriteria measure of the environmental performance of products, services, and organizations. With its approach of "comparability over flexibility," the PEF/OEF methodology aims at harmonizing existing methods, while decreasing the flexibility provided by the International Organization for Standardization (ISO) standards regarding methodological choices. Currently, a 3-y pilot phase is running, aiming at testing the methodology and developing product category and organization sector rules (PEFCR/OEFSR). Although a harmonized method is in theory a good idea, the PEF/OEF methodology presents challenges, including a risk of confusion and limitations in applicability to practice. The paper discusses the main differences between the PEF and ISO methodologies and highlights challenges regarding PEF applicability, with a focus on impact assessment. Some methodological aspects of the PEF and PEFCR Guides are found to contradict ISO 14044 (2006) and ISO 14025 (2006). Others, such as the prohibition of inventory cutoffs, are impractical. The evaluation of the impact assessment methods proposed in the PEF/OEF Guide showed that the predefined methods for water consumption, land use, and abiotic resources are not adequate because of modeling artefacts, missing inventory data, or incomplete characterization factors. However, the methods for global warming and ozone depletion perform very well. The results of this study are relevant for the PEF (and OEF) pilot phase, which aims at testing the PEF (OEF) methodology (and potentially adapting it) as well as addressing challenges and coping with them. © 2015 SETAC.
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Greco, Patricia
2017-01-01
Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Lean for Education. Design/methodology/approach: The paper presents the origins, theoretical foundations, core concepts and a case study demonstrating an application in US education,…
Nonlinear and adaptive control
NASA Technical Reports Server (NTRS)
Athans, Michael
1989-01-01
The primary thrust of the research was to conduct fundamental research in the theories and methodologies for designing complex high-performance multivariable feedback control systems, and to conduct feasibility studies in application areas of interest to NASA sponsors that point out advantages and shortcomings of available control system design methodologies.
Determining The Various Perspectives And Consensus Within A Classroom Using Q Methodology
NASA Astrophysics Data System (ADS)
Ramlo, Susan E.
2008-10-01
Q methodology was developed by PhD physicist and psychologist William Stephenson 73 years ago as a new way of investigating people's views of any topic. Yet its application has primarily been in the fields of marketing, psychology, and political science. Still, Q offers an opportunity for the physics education research community to determine the perspectives and consensus within a group, such as a classroom, related to topics of interest such as the nature of science and epistemology. This paper presents the basics of using Q methodology with a classroom application as an example and subsequent comparisons of this example's results to similar studies using qualitative and survey methods.
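Computationally, Q methodology inverts conventional factor analysis: correlations are computed between persons' Q-sorts rather than between items, and the resulting factors group participants who share a perspective. A minimal sketch with synthetic sorts follows; real studies would rotate the factors and interpret them against the sorted statements.

```python
import numpy as np

# Q methodology sketch: unlike R-type factor analysis, correlations are
# computed between persons (their Q-sorts), and factors group people who
# share a viewpoint. Synthetic data: 8 participants ranking 20 statements.
rng = np.random.default_rng(1)
sorts = rng.normal(size=(8, 20))          # rows = participants' Q-sorts

person_corr = np.corrcoef(sorts)          # 8 x 8 person-by-person matrix
eigvals, eigvecs = np.linalg.eigh(person_corr)
order = np.argsort(eigvals)[::-1]
loadings = eigvecs[:, order[:2]] * np.sqrt(eigvals[order[:2]])
print(loadings.round(2))                  # who loads on which perspective
```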
A Visual Programming Methodology for Tactical Aircrew Scheduling and Other Applications
1991-12-01
programming methodology and environment of a user-specific application remains with and is delivered as part of the application, then there is another factor...animation is useful, not only for scheduling applications, but as a general programming methodology. Of course, there are a number of improvements...possible using Excel because there is nothing to prevent access to cells. However, it is easy to imagine a spreadsheet which can support the
Waring, Mike; Bielfeldt, Stephan; Mätzold, Katja; Wilhelm, Klaus-Peter
2013-02-01
Chronic wounds require frequent dressing changes. Adhesive dressings used for this indication can be damaging to the stratum corneum, particularly in the elderly where the skin tends to be thinner. Understanding the level of damage caused by dressing removal can aid dressing selection. This study used a novel methodology that applied a stain to the skin and measured the intensity of that stain after repeated application and removal of a series of different adhesive types. Additionally, a traditional method of measuring skin barrier damage (transepidermal water loss) was also undertaken and compared with the staining methodology. The staining methodology and measurement of transepidermal water loss differentiated the adhesive dressings, showing that silicone adhesives caused least trauma to the skin. The staining methodology was shown to be as effective as transepidermal water loss in detecting damage to the stratum corneum and was shown to detect disruption of the barrier earlier than the traditional technique. © 2012 John Wiley & Sons A/S.
Debrus, Benjamin; Lebrun, Pierre; Ceccato, Attilio; Caliaro, Gabriel; Rozet, Eric; Nistor, Iolanda; Oprean, Radu; Rupérez, Francisco J; Barbas, Coral; Boulanger, Bruno; Hubert, Philippe
2011-04-08
HPLC separations of an unknown sample mixture and a pharmaceutical formulation have been optimized using a recently developed chemometric methodology proposed by W. Dewé et al. in 2004 and improved by P. Lebrun et al. in 2008. This methodology is based on experimental designs which are used to model the retention times of the compounds of interest. The prediction accuracy and the robustness of the optimal separation, including an uncertainty study, were then evaluated. Finally, the design space (ICH Q8(R1) guideline) was computed as the probability for a criterion to lie in a selected range of acceptance. Furthermore, the chromatograms were read automatically. Peak detection and peak matching were carried out with a previously developed methodology using independent component analysis published by B. Debrus et al. in 2009. These successful applications underline the high potential of these methodologies for the automated development of chromatographic methods. Copyright © 2011 Elsevier B.V. All rights reserved.
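The design-space computation, i.e., the probability that a separation criterion lies within its acceptance range, lends itself to a simple Monte Carlo illustration. The sketch below assumes invented retention-time predictions and uncertainty; it mimics the published approach only in spirit.

```python
import numpy as np

# Sketch of a design-space computation as used in quality-by-design HPLC
# development: for a candidate operating point, predicted retention times
# are sampled from their uncertainty, and the design space is the region
# where P(resolution criterion met) exceeds a threshold. All parameters
# here are invented for illustration.
rng = np.random.default_rng(2)

def p_separation(rt_means, rt_sd=0.15, min_gap=0.5, n=10_000):
    draws = rng.normal(rt_means, rt_sd, size=(n, len(rt_means)))
    draws.sort(axis=1)
    gaps = np.diff(draws, axis=1).min(axis=1)
    return (gaps >= min_gap).mean()       # probability criterion is met

print(p_separation([4.2, 5.1, 6.8]))      # one point in the factor space
```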
Remote sensing applied to agriculture: Basic principles, methodology, and applications
NASA Technical Reports Server (NTRS)
Dejesusparada, N. (Principal Investigator); Mendonca, F. J.
1981-01-01
The general principles of remote sensing techniques as applied to agriculture and the methods of data analysis are described. The theoretical spectral responses of crops; the reflectance, transmittance, and absorptance of plants; the interactions of plants and soils with reflected energy; leaf morphology; and the factors which affect the reflectance of vegetation cover are discussed. The methodologies of visual and computer-aided analyses of LANDSAT data are presented. Finally, a case study wherein infrared film was used to detect crop anomalies and other data applications are described.
Sodium MRI: Methods and applications
Madelin, Guillaume; Lee, Jae-Seung; Regatte, Ravinder R.; Jerschow, Alexej
2014-01-01
Sodium NMR spectroscopy and MRI have become popular in recent years through the increased availability of high-field MRI scanners, advanced scanner hardware and improved methodology. Sodium MRI is being evaluated for stroke and tumor detection, for breast cancer studies, and for the assessment of osteoarthritis and muscle and kidney functions, to name just a few. In this article, we aim to present an up-to-date review of the theoretical background, the methodology, the challenges and limitations, and current and potential new applications of sodium MRI. PMID:24815363
Nguyen, Ha T.; Pearce, Joshua M.; Harrap, Rob; Barber, Gerald
2012-01-01
A methodology is provided for the application of Light Detection and Ranging (LiDAR) to automated solar photovoltaic (PV) deployment analysis on the regional scale. Challenges in urban information extraction and management for solar PV deployment assessment are determined and quantitative solutions are offered. This paper provides the following contributions: (i) a methodology that is consistent with recommendations from existing literature advocating the integration of cross-disciplinary competences in remote sensing (RS), GIS, computer vision and urban environmental studies; (ii) a robust methodology that can work with low-resolution, incomplete data and reconstruct vegetation and buildings separately, but concurrently; (iii) recommendations for a future generation of software. A case study is presented as an example of the methodology. Experience from the case study, such as the trade-off between time consumption and data quality, is discussed to highlight the need for connectivity between demographic information, electrical engineering schemes and GIS, and a typical fraction of solar-useful roofs extracted per method is reported. Finally, conclusions are developed to provide a final methodology to extract the most useful information from the lowest-resolution and least comprehensive data to provide solar electric assessments over large areas, which can be adapted anywhere in the world. PMID:22666044
Applying a contemporary grounded theory methodology.
Licqurish, Sharon; Seibold, Carmel
2011-01-01
The aim of this paper is to discuss the application of a contemporary grounded theory methodology to a research project exploring the experiences of students studying for a degree in midwifery. Grounded theory is a qualitative research approach developed by Glaser and Strauss in the 1960s, but the methodology for this study was modelled on Clarke's (2005) approach and was underpinned by a symbolic interactionist theoretical perspective, the post-structuralist theories of Michel Foucault and a constructionist epistemology. The study participants were 19 midwifery students completing their final placement. Data were collected through individual in-depth interviews and participant observation, and analysed using the grounded theory analysis techniques of coding, constant comparative analysis and theoretical sampling, as well as situational maps. The analysis focused on social action and interaction and the operation of power in the students' environment. The social process in which the students were involved, as well as the actors and discourses that affected the students' competency development, were highlighted. The methodology allowed a thorough exploration of the students' experiences of achieving competency. However, some difficulties were encountered. One of the major issues related to the understanding and application of complex sociological theories that challenged positivist notions of truth and power. Furthermore, the mapping processes were complex. Despite these minor challenges, the authors recommend applying this methodology to other similar research projects.
A systematic review of grounded theory studies in physiotherapy.
Ali, Nancy; May, Stephen; Grafton, Kate
2018-05-23
This systematic review aimed at appraising the methodological rigor of grounded theory research published in the field of physiotherapy to assess how the methodology is understood and applied. A secondary aim was to provide research implications drawn from the findings to guide future grounded theory methodology (GTM) research. A systematic search was conducted in MEDLINE, CINHAL, SPORT Discus, Science Direct, PubMed, Scopus, and Web of Science to identify studies in the field of physiotherapy that reported using GTM and/or methods in the study title and/or abstract. The descriptive characteristics and methodological quality of eligible studies were examined using grounded theory methodology assessment guidelines. The review included 68 studies conducted between 1998 and 2017. The findings showed that GTM is becoming increasingly used by physiotherapy researchers. Thirty-six studies (53%) demonstrated a good understanding and appropriate application of GTM. Thirty-two studies (47%) presented descriptive findings and were considered to be of poor methodological quality. There are several key tenets of GTM that are integral to the iterative process of qualitative theorizing and need to be applied throughout all research practices including sampling, data collection, and analysis.
An Application of Six Sigma to Reduce Supplier Quality Cost
NASA Astrophysics Data System (ADS)
Gaikwad, Lokpriya Mohanrao; Teli, Shivagond Nagappa; Majali, Vijay Shashikant; Bhushi, Umesh Mahadevappa
2016-01-01
This article presents an application of Six Sigma to reduce supplier quality cost in the manufacturing industry. Although there is wider acceptance of Six Sigma in many organizations today, there is still a lack of in-depth case studies of Six Sigma. For the present research the case study methodology was used. The company decided to reduce quality cost and improve selected processes using Six Sigma methodologies. Given the lack of case studies dealing with Six Sigma, especially in individual manufacturing organizations, this article could also be of value to practitioners. This paper discusses quality and productivity improvement in a supplier enterprise through a case study. The paper deals with an application of the Six Sigma define-measure-analyze-improve-control methodology in an industry, which provides a framework to identify, quantify and eliminate sources of variation in the operational process in question, to optimize the operation variables, and to improve and sustain performance, viz. process yield, with well-executed control plans. Six Sigma improves the process performance (process yield) of the critical operational process, leading to better utilization of resources, decreased variation and consistent quality of the process output.
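For readers unfamiliar with the quantitative core of Six Sigma, the conventional conversion from a defect rate (defects per million opportunities, DPMO) to a process sigma level, including the customary 1.5-sigma long-term shift, is sketched below; the defect rates shown are textbook reference points, not the supplier's figures.

```python
from scipy.stats import norm

# Back-of-envelope sigma-level calculation used in Six Sigma projects:
# convert a defect rate (DPMO) into a process sigma with the conventional
# 1.5-sigma long-term shift. Figures are illustrative.
def sigma_level(dpmo: float) -> float:
    return norm.ppf(1 - dpmo / 1_000_000) + 1.5

for dpmo in (66_807, 6_210, 233, 3.4):
    print(dpmo, round(sigma_level(dpmo), 2))  # ~3, 4, 5, 6 sigma
```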
Yusuf, Afiqah; Elsabbagh, Mayada
2015-12-15
Identifying biomarkers for autism can improve outcomes for those affected. Engaging diverse stakeholders in the research process using community-based participatory research (CBPR) can accelerate biomarker discovery into clinical applications. However, there are limited examples of stakeholder involvement in autism research, possibly due to conceptual and practical concerns. We evaluate the applicability of CBPR principles to biomarker discovery in autism and critically review empirical studies adopting these principles. Using a scoping review methodology, we identified and evaluated seven studies using CBPR principles in biomarker discovery. The limited number of studies in biomarker discovery adopting CBPR principles, coupled with their methodological limitations, suggests that such applications are feasible but challenging. These studies illustrate three CBPR themes: community assessment, setting global priorities, and collaboration in research design. We propose that further research using participatory principles would be useful in accelerating the pace of discovery and the development of clinically meaningful biomarkers. For this goal to be successful we advocate for increased attention to previously identified conceptual and methodological challenges to participatory approaches in health research, including improving scientific rigor and developing long-term partnerships among stakeholders.
Meta-Study as Diagnostic: Toward Content Over Form in Qualitative Synthesis.
Frost, Julia; Garside, Ruth; Cooper, Chris; Britten, Nicky
2016-02-01
Having previously conducted qualitative syntheses of the diabetes literature, we wanted to explore the changes in theoretical approaches, methodological practices, and the construction of substantive knowledge which have recently been presented in the qualitative diabetes literature. The aim of this research was to explore the feasibility of synthesizing existing qualitative syntheses of patient perspectives of diabetes using meta-study methodology. A systematic review of qualitative literature, published between 2000 and 2013, was conducted. Six articles were identified as qualitative syntheses. The meta-study methodology was used to compare the theoretical, methodological, analytic, and synthetic processes across the six studies, exploring the potential for an overarching synthesis. We identified that while research questions have increasingly concentrated on specific aspects of diabetes, the focus on systematic review processes has led to the neglect of qualitative theory and methods. This can inhibit the production of compelling results with meaningful clinical applications. Although unable to produce a synthesis of syntheses, we recommend that researchers who conduct qualitative syntheses pay equal attention to qualitative traditions and systematic review processes, to produce research products that are both credible and applicable. © The Author(s) 2015.
Qualitative case study methodology in nursing research: an integrative review.
Anthony, Susan; Jack, Susan
2009-06-01
This paper is a report of an integrative review conducted to critically analyse the contemporary use of qualitative case study methodology in nursing research. Increasing complexity in health care and increasing use of case study in nursing research support the need for current examination of this methodology. In 2007, a search for case study research (published 2005-2007) indexed in the CINAHL, MEDLINE, EMBASE, PsychINFO, Sociological Abstracts and SCOPUS databases was conducted. A sample of 42 case study research papers met the inclusion criteria. Whittemore and Knafl's integrative review method guided the analysis. Confusion exists about the name, nature and use of case study. This methodology, including terminology and concepts, is often invisible in qualitative study titles and abstracts. Case study is an exclusive methodology and an adjunct to exploring particular aspects of phenomena under investigation in larger or mixed-methods studies. A high quality of case study exists in nursing research. Judicious selection and diligent application of literature review methods promote the development of nursing science. Case study is becoming entrenched in the nursing research lexicon as a well-accepted methodology for studying phenomena in health and social care, and its growing use warrants continued appraisal to promote nursing knowledge development. Attention to all case study elements, process and publication is important in promoting authenticity, methodological quality and visibility.
ERIC Educational Resources Information Center
LeMahieu, Paul G.; Nordstrum, Lee E.; Cudney, Elizabeth A.
2017-01-01
Purpose: This paper is one of seven in this volume that aims to elaborate different approaches to quality improvement in education. It delineates a methodology called Six Sigma. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study demonstrating an application of Six Sigma in a…
Guidelines for the Design and Conduct of Clinical Studies in Knee Articular Cartilage Repair
Mithoefer, Kai; Saris, Daniel B.F.; Farr, Jack; Kon, Elizaveta; Zaslav, Kenneth; Cole, Brian J.; Ranstam, Jonas; Yao, Jian; Shive, Matthew; Levine, David; Dalemans, Wilfried; Brittberg, Mats
2011-01-01
Objective: To summarize current clinical research practice and develop methodological standards for objective scientific evaluation of knee cartilage repair procedures and products. Design: A comprehensive literature review was performed of high-level original studies providing information relevant for the design of clinical studies on articular cartilage repair in the knee. Analysis of cartilage repair publications and synopses of ongoing trials were used to identify important criteria for the design, reporting, and interpretation of studies in this field. Results: Current literature reflects the methodological limitations of the scientific evidence available for articular cartilage repair. However, clinical trial databases of ongoing trials document a trend suggesting improved study designs and clinical evaluation methodology. Based on the current scientific information and standards of clinical care, detailed methodological recommendations were developed for the statistical study design, patient recruitment, control group considerations, study endpoint definition, documentation of results, use of validated patient-reported outcome instruments, and inclusion and exclusion criteria for the design and conduct of scientifically sound cartilage repair study protocols. A consensus statement among the International Cartilage Repair Society (ICRS) and contributing authors experienced in clinical trial design and implementation was achieved. Conclusions: High-quality clinical research methodology is critical for the optimal evaluation of current and new cartilage repair technologies. In addition to generally applicable principles for orthopedic study design, specific criteria and considerations apply to cartilage repair studies. Systematic application of these criteria and considerations can facilitate study designs that are scientifically rigorous, ethical, practical, and appropriate for the question(s) being addressed in any given cartilage repair research project. PMID:26069574
Analysis of Turbulent Boundary-Layer over Rough Surfaces with Application to Projectile Aerodynamics
1988-12-01
... dimensional roughness. II. CLASSIFICATION OF PREDICTION METHODS: Prediction methods can be classified into two main approaches: 1) correlation methodologies ... data are available. V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES; 1. COMPONENT BUILD-UP IN DRAG: the new correlation can be used for an engineering ...
Improving ED specimen TAT using Lean Six Sigma.
Sanders, Janet H; Karr, Tedd
2015-01-01
Lean and Six Sigma are continuous improvement methodologies that have garnered international fame for improving manufacturing and service processes. Increasingly these methodologies are demonstrating their power to also improve healthcare processes. The purpose of this paper is to discuss a case study of the application of Lean and Six Sigma tools in the reduction of turnaround time (TAT) for Emergency Department (ED) specimens. This application of the scientific methodologies uncovered opportunities to improve the entire ED-to-lab system for the specimens. This case study provides details on the completion of a Lean Six Sigma project in a 1,000-bed tertiary care teaching hospital. Six Sigma's Define, Measure, Analyze, Improve, and Control methodology is very similar to good medical practice: first, relevant information is obtained and assembled; second, a careful and thorough diagnosis is completed; third, a treatment is proposed and implemented; and fourth, checks are made to determine if the treatment was effective. Lean's primary goal is to do more with less work and waste. The Lean methodology was used to identify and eliminate waste through rapid implementation of change. The initial focus of this project was the reduction of turnaround times for ED specimens. However, the results led to better processes for both the internal and external customers of this and other processes. The project results included: a 50 percent decrease in vials used for testing, a 50 percent decrease in unused or extra specimens, a 90 percent decrease in ED specimens without orders, a 30 percent decrease in complete blood count analysis (CBCA) median TAT, a 50 percent decrease in CBCA TAT variation, a 10 percent decrease in troponin TAT variation, an 18.2 percent decrease in URPN TAT variation, and a 2-5 minute decrease in ED registered nurses' rainbow draw time. This case study demonstrated how the quantitative power of Six Sigma and the speed of Lean worked in harmony to improve the blood draw process for a 1,000-bed tertiary care teaching hospital. The blood draw process is a standard process used in hospitals to collect blood chemistry and hematology information for clinicians. The methods used in this case study demonstrated valuable and practical applications of process improvement methodologies that can be used for any hospital process and/or service environment. While this is not the first case study to demonstrate the use of continuous process improvement methodologies to improve a hospital process, it is unique in the way it utilizes the strength of a project-focused approach that adheres more to the structure and rigor of Six Sigma and relies less on the speed of Lean. Additionally, the application of these methodologies in healthcare is emerging research.
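The reported TAT improvements are the kind of statistics computed from before/after distributions of specimen turnaround times. The sketch below uses synthetic lognormal stand-ins, not the hospital's data, to show a typical median and spread (IQR) comparison.

```python
import numpy as np

# Sketch of the before/after comparison behind reported TAT gains: median
# and interquartile range of specimen turnaround times. Data are synthetic.
rng = np.random.default_rng(3)
before = rng.lognormal(mean=3.6, sigma=0.45, size=500)   # minutes
after = rng.lognormal(mean=3.25, sigma=0.30, size=500)

def summarize(x):
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    return med, q3 - q1

m_b, iqr_b = summarize(before)
m_a, iqr_a = summarize(after)
print(f"median: -{(1 - m_a / m_b) * 100:.0f}%, IQR: -{(1 - iqr_a / iqr_b) * 100:.0f}%")
```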
NASA Astrophysics Data System (ADS)
Hendikawati, P.; Arifudin, R.; Zahid, M. Z.
2018-03-01
This study aims to design an Android statistics data analysis application, making statistical analysis easier to access on mobile devices. The statistics data analysis application covers various topics in basic statistics along with a parametric statistical data analysis module. The output of the application is parametric statistical analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed in the Java programming language. The server-side programming language is PHP with the CodeIgniter framework, and the database is MySQL. The system development methodology used is the Waterfall methodology, with the stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and make it easier for students to perform statistical analysis on mobile devices.
Data Centric Development Methodology
ERIC Educational Resources Information Center
Khoury, Fadi E.
2012-01-01
Data centric applications, an important effort of software development in large organizations, have mostly adopted a software methodology, such as Waterfall or the Rational Unified Process, as the framework for their development. These methodologies could work for structural, procedural, or object-oriented applications, but fail to capture…
C3I system modification and EMC (electromagnetic compatibility) methodology, volume 1
NASA Astrophysics Data System (ADS)
Wilson, J. L.; Jolly, M. B.
1984-01-01
A methodology (i.e., a consistent set of procedures) for assessing the electromagnetic compatibility (EMC) of RF subsystem modifications on C3I aircraft was generated during this study (Volume 1). An IEMCAP (Intrasystem Electromagnetic Compatibility Analysis Program) database for the E-3A (AWACS) C3I aircraft RF subsystem was extracted to support the design of the EMC assessment methodology (Volume 2). Mock modifications were performed on the E-3A database to assess, using a preliminary form of the methodology, the resulting EMC impact. Application of the preliminary assessment methodology to modifications in the E-3A database served to fine-tune the form of the final assessment methodology. The resulting final assessment methodology is documented in this report in conjunction with the overall study goals, procedures, and database. It is recommended that a similar EMC assessment methodology be developed for the power subsystem within C3I aircraft. It is further recommended that future EMC assessment methodologies be developed around expert systems (i.e., computer intelligent agents) to manage both the extensive detail involved and the user's requirement for transparency.
Evaluation of the HARDMAN comparability methodology for manpower, personnel and training
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Butler, R.; Gray, V.; Rosenberg, L.
1984-01-01
The methodology evaluation and recommendations are part of an effort to improve the Hardware versus Manpower (HARDMAN) methodology for projecting manpower, personnel, and training (MPT) to support new acquisitions. Several different validity tests are employed to evaluate the methodology. The methodology conforms fairly well with both the MPT user needs and other accepted manpower modeling techniques. Audits of three completed HARDMAN applications reveal only a small number of potential problem areas compared to the total number of issues investigated. The reliability study results conform well with the problem areas uncovered through the audits. The results of the accuracy studies suggest that the manpower life-cycle cost component is only marginally sensitive to changes in other related cost variables. Even with some minor problems, the methodology seems sound and has good near-term utility to the Army. Recommendations are provided to firm up the problem areas revealed through the evaluation.
ERIC Educational Resources Information Center
Gilpatrick, Eleanor
This report contains the results of a pilot test which represents the first complete field test of methodological work begun in October 1967 under a Federal grant for the purpose of job analysis in the health services. This 4-year Health Services Mobility Study permitted basic research, field testing, practical application, and policy involvement…
On the importance of methods in hydrological modelling. Perspectives from a case study
NASA Astrophysics Data System (ADS)
Fenicia, Fabrizio; Kavetski, Dmitri
2017-04-01
The hydrological community generally appreciates that developing any non-trivial hydrological model requires a multitude of modelling choices. These choices may range from a (seemingly) straightforward application of mass conservation to the (often) guesswork-like selection of constitutive functions, parameter values, etc. The application of a model itself requires a myriad of methodological choices: the selection of numerical solvers, objective functions for model calibration, validation approaches, performance metrics, etc. Not unreasonably, hydrologists embarking on ever more ambitious projects prioritize hydrological insight over the morass of methodological choices. Perhaps to emphasize "ideas" over "methods", some journals have even reduced the font size of the methodology sections of their articles. However, the very nature of modelling is that seemingly routine methodological choices can significantly affect the conclusions of case studies and investigations, making it dangerous to skimp on methodological details in an enthusiastic rush towards the next great hydrological idea. This talk shares modelling insights from a hydrological study of a 300 km2 catchment in Luxembourg, where the diversity of hydrograph dynamics observed at 10 locations begs the question of whether external forcings or internal catchment properties act as dominant controls on streamflow generation. The hydrological insights are fascinating (at least to us), but in this talk we emphasize the impact of modelling methodology on case study conclusions and recommendations. How did we construct our prior set of hydrological model hypotheses? What numerical solver was implemented, and why was an objective function based on Bayesian theory deployed? And what would have happened had we omitted model cross-validation, or not used a systematic hypothesis testing approach?
Bigger is Better, but at What Cost? Estimating the Economic Value of Incremental Data Assets.
Dalessandro, Brian; Perlich, Claudia; Raeder, Troy
2014-06-01
Many firms depend on third-party vendors to supply data for commercial predictive modeling applications. An issue that has received very little attention in the prior research literature is the estimation of a fair price for purchased data. In this work we present a methodology for estimating the economic value of adding incremental data to predictive modeling applications, and present two case studies. The methodology starts with estimating the effect that incremental data has on model performance in terms of common classification evaluation metrics. This effect is then translated into economic units, which gives an expected economic value that the firm might realize with the acquisition of a particular data asset. With this estimate a firm can then set a data acquisition price that targets a particular return on investment. This article presents the methodology in full detail and illustrates it in the context of two marketing case studies.
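The two-step logic of the methodology, estimating the model-performance lift from a data asset and then translating that lift into currency to back out a price meeting a target return on investment, can be sketched with invented numbers as follows; every figure below is an assumption for illustration.

```python
# Sketch of the two-step logic: (1) estimate the lift a data asset gives a
# predictive model; (2) translate that lift into currency and back out a
# price that hits a target ROI. All numbers are invented.
n_customers = 1_000_000
value_per_conversion = 2.0                 # dollars, assumed
base_rate, augmented_rate = 0.011, 0.013   # conversion without/with the data

incremental_value = n_customers * (augmented_rate - base_rate) * value_per_conversion
target_roi = 0.5                           # firm requires a 50% return
max_price = incremental_value / (1 + target_roi)
print(f"expected lift ${incremental_value:,.0f}, max price ${max_price:,.0f}")
```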
Forecasting the Economic Impact of Future Space Station Operations
NASA Technical Reports Server (NTRS)
Summer, R. A.; Smolensky, S. M.; Muir, A. H.
1967-01-01
Recent manned and unmanned Earth-orbital operations have suggested great promise of improved knowledge and of substantial economic and associated benefits to be derived from services offered by a space station. Proposed application areas include agriculture, forestry, hydrology, public health, oceanography, natural disaster warning, and search/rescue operations. The need for reliable estimates of economic and related Earth-oriented benefits to be realized from Earth-orbital operations is discussed and recent work in this area is reviewed. Emphasis is given to those services based on remote sensing. Requirements for a uniform, comprehensive and flexible methodology are discussed. A brief review of the suggested methodology is presented. This methodology will be exercised through five case studies which were chosen from a gross inventory of almost 400 user candidates. The relationship of case study results to benefits in broader application areas is discussed. Some management implications of possible future program implementation are included.
A methodology for selecting optimum organizations for space communities
NASA Technical Reports Server (NTRS)
Ragusa, J. M.
1978-01-01
This paper suggests that a methodology exists for selecting optimum organizations for future space communities of various sizes and purposes. Results of an exploratory study to identify an optimum hypothetical organizational structure for a large earth-orbiting multidisciplinary research and applications (R&A) Space Base manned by a mixed crew of technologists are presented. Since such a facility does not presently exist, in situ empirical testing was not possible. Study activity was, therefore, concerned with the identification of a desired organizational structural model rather than the empirical testing of it. The principal finding of this research was that a four-level project type 'total matrix' model will optimize the effectiveness of Space Base technologists. An overall conclusion which can be reached from the research is that application of this methodology, or portions of it, may provide planning insights for the formal organizations which will be needed during the Space Industrialization Age.
NASA Astrophysics Data System (ADS)
Polatidis, Heracles; Morales, Jan Borràs
2016-11-01
In this paper a methodological framework for increasing the actual implementation potential of wind farms is developed and applied. The framework is based on multi-criteria decision aid techniques that perform an integrated technical and societal evaluation of a number of potential wind power projects, each a variation of a pre-existing actual proposal that faces implementation difficulties. A number of evaluation criteria are established and assessed via related software or comparatively evaluated among each other on a semi-qualitative basis. The preferences of a diverse audience of pertinent stakeholders can also be incorporated in the overall analysis. The result of the process is the identification of a new project with increased implementation potential compared with the original proposal. The methodology is tested in a case study of a wind farm in the UK and relevant conclusions are drawn.
Q and you: The application of Q methodology in recreation research
Whitney Ward
2010-01-01
Researchers have used various qualitative and quantitative methods to deal with subjectivity in studying people's recreation experiences. Q methodology has been the most effective approach for analyzing both qualitative and quantitative aspects of experience, including attitudes or perceptions. The method is composed of two main components--Q sorting and Q factor...
Towards a Trans-Disciplinary Methodology for a Game-Based Intervention Development Process
ERIC Educational Resources Information Center
Arnab, Sylvester; Clarke, Samantha
2017-01-01
The application of game-based learning adds play into educational and instructional contexts. Even though there is a lack of standard methodologies or formulaic frameworks to better inform game-based intervention development, there exist scientific and empirical studies that can serve as benchmarks for establishing scientific validity in terms of…
NASA Astrophysics Data System (ADS)
Madariaga, J. M.; Torre-Fdez, I.; Ruiz-Galende, P.; Aramendia, J.; Gomez-Nubla, L.; Fdez-Ortiz de Vallejuelo, S.; Maguregui, M.; Castro, K.; Arana, G.
2018-04-01
Advanced methodologies based on Raman spectroscopy are proposed to detect prebiotic and biotic molecules in returned samples from Mars: (a) optical microscopy with confocal micro-Raman, (b) the SCA instrument, (c) Raman Imaging. Examples for NWA 6148.
USDA-ARS?s Scientific Manuscript database
As remote sensing and variable rate technology are becoming more available for aerial applicators, practical methodologies on effective integration of these technologies are needed for site-specific aerial applications of crop production and protection materials. The objectives of this study were to...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seimenis, Ioannis; Tsekos, Nikolaos V.; Keroglou, Christoforos
2012-04-15
Purpose: The aim of this work was to develop and test a general methodology for the planning and performance of robot-assisted, MR-guided interventions. This methodology also includes the employment of software tools with appropriately tailored routines to effectively exploit the capabilities of MRI and address the relevant spatial limitations. Methods: The described methodology consists of: (1) a patient-customized feasibility study that focuses on the geometric limitations imposed by the gantry, the robotic hardware, and interventional tools, as well as the patient; (2) stereotactic preoperative planning for initial positioning of the manipulator and alignment of its end-effector with a selected target; and (3) real-time, intraoperative tool tracking and monitoring of the actual intervention execution. Testing was performed inside a standard 1.5T MRI scanner in which the MR-compatible manipulator is deployed to provide the required access. Results: A volunteer imaging study demonstrates the application of the feasibility stage. A phantom study on needle targeting is also presented, demonstrating the applicability and effectiveness of the proposed preoperative and intraoperative stages of the methodology. For this purpose, a manually actuated, MR-compatible robotic manipulation system was used to accurately acquire a prescribed target through alternative approaching paths. Conclusions: The methodology presented and experimentally examined allows the effective performance of MR-guided interventions. It is suitable for, but not restricted to, needle-targeting applications assisted by a robotic manipulation system, which can be deployed inside a cylindrical scanner to provide the required access to the patient, facilitating real-time guidance and monitoring.
Semantic Network Adaptation Based on QoS Pattern Recognition for Multimedia Streams
NASA Astrophysics Data System (ADS)
Exposito, Ernesto; Gineste, Mathieu; Lamolle, Myriam; Gomez, Jorge
This article proposes an ontology-based pattern recognition methodology to compute and represent common QoS properties of the Application Data Units (ADUs) of multimedia streams. The use of this ontology by mechanisms located at different layers of the communication architecture allows fine-grained per-packet self-optimization of communication services with respect to actual application requirements. A case study showing how this methodology is used by error control mechanisms in the context of wireless networks is presented in order to demonstrate the feasibility and advantages of this approach.
Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H
2018-07-01
Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.
Close Combat Missile Methodology Study
2010-10-14
Charpentier, Ronald R.; Moore, Thomas E.; Gautier, D.L.
2017-11-15
The methodological procedures used in the geologic assessments of the 2008 Circum-Arctic Resource Appraisal (CARA) were based largely on the methodology developed for the 2000 U.S. Geological Survey World Petroleum Assessment. The main variables were probability distributions for numbers and sizes of undiscovered accumulations with an associated risk of occurrence. The CARA methodology expanded on the previous methodology in providing additional tools and procedures more applicable to the many Arctic basins that have little or no exploration history. Most importantly, geologic analogs from a database constructed for this study were used in many of the assessments to constrain numbers and sizes of undiscovered oil and gas accumulations.
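The assessment variables described here, probability distributions for the numbers and sizes of undiscovered accumulations together with a risk of occurrence, map naturally onto a Monte Carlo aggregation. The sketch below uses invented Poisson and lognormal choices purely to show the structure; the CARA inputs were expert-elicited and analog-constrained, not these.

```python
import numpy as np

# Monte Carlo sketch of an undiscovered-accumulation assessment: sample the
# number of accumulations and their sizes from probability distributions,
# apply a geologic risk of occurrence, and aggregate. Distributions and
# parameters are illustrative only.
rng = np.random.default_rng(4)
n_trials = 20_000
risk = 0.6                                   # P(petroleum system works)

works = rng.random(n_trials) < risk
counts = rng.poisson(5, n_trials) * works    # number of accumulations
totals = np.array([rng.lognormal(3.0, 1.0, c).sum() for c in counts])

print(np.percentile(totals, [5, 50, 95]))    # F95/F50/F5-style fractiles
```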
A Generalizable Methodology for Quantifying User Satisfaction
NASA Astrophysics Data System (ADS)
Huang, Te-Yuan; Chen, Kuan-Ta; Huang, Polly; Lei, Chin-Laung
Quantifying user satisfaction is essential, because the results can help service providers deliver better services. In this work, we propose a generalizable methodology, based on survival analysis, to quantify user satisfaction in terms of session times, i.e., the length of time users stay with an application. Unlike subjective human surveys, our methodology is based solely on passive measurement, which is more cost-efficient and better able to capture subconscious reactions. Furthermore, by using session times, rather than a specific performance indicator, such as the level of distortion of voice signals, the effects of other factors, like loudness and sidetone, can also be captured by the developed models. Like survival analysis, our methodology is characterized by low complexity and a simple model-developing process. The feasibility of our methodology is demonstrated through case studies of ShenZhou Online, a commercial MMORPG in Taiwan, and the most prevalent VoIP application in the world, namely Skype. Through the model development process, we can also identify the most significant performance factors and their impacts on user satisfaction and discuss how they can be exploited to improve user experience and optimize resource allocation.
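Session times are right-censored when sessions are still open at the end of a measurement window, which is the technical reason survival analysis fits this problem. A minimal Kaplan-Meier estimator over a handful of synthetic session lengths is sketched below.

```python
import numpy as np

# Kaplan-Meier sketch for session-time analysis: sessions still open at
# measurement end are right-censored. Synthetic session lengths in minutes.
durations = np.array([5.0, 12.0, 12.0, 30.0, 47.0, 60.0, 60.0])
observed = np.array([1, 1, 0, 1, 1, 0, 1])   # 0 = censored session

surv, s = [], 1.0
for t in np.unique(durations[observed == 1]):
    at_risk = (durations >= t).sum()
    deaths = ((durations == t) & (observed == 1)).sum()
    s *= 1 - deaths / at_risk
    surv.append((t, s))
print(surv)   # stepwise estimate of P(session length > t)
```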
Schematic representation of case study research designs.
Rosenberg, John P; Yates, Patsy M
2007-11-01
The paper is a report of a study to demonstrate how the use of schematics can provide procedural clarity and promote rigour in the conduct of case study research. Case study research is a methodologically flexible approach to research design that focuses on a particular case - whether an individual, a collective or a phenomenon of interest. It is known as the 'study of the particular' for its thorough investigation of particular, real-life situations and is gaining increased attention in nursing and social research. However, the methodological flexibility it offers can leave the novice researcher uncertain of suitable procedural steps required to ensure methodological rigour. This article provides a real example of a case study research design that utilizes schematic representation drawn from a doctoral study of the integration of health promotion principles and practices into a palliative care organization. The issues discussed are: (1) the definition and application of case study research design; (2) the application of schematics in research; (3) the procedural steps and their contribution to the maintenance of rigour; and (4) the benefits and risks of schematics in case study research. The inclusion of visual representations of design with accompanying explanatory text is recommended in reporting case study research methods.
Analysis of the impact of safeguards criteria
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mullen, M.F.; Reardon, P.T.
As part of the US Program of Technical Assistance to IAEA Safeguards, the Pacific Northwest Laboratory (PNL) was asked to assist in developing and demonstrating a model for assessing the impact of setting criteria for the application of IAEA safeguards. This report presents the results of PNL's work on the task. The report is in three parts. The first explains the technical approach and methodology. The second contains an example application of the methodology. The third presents the conclusions of the study. PNL used the model and computer programs developed as part of Task C.5 (Estimation of Inspection Efforts) of the Program of Technical Assistance. The example application of the methodology involves low-enriched uranium conversion and fuel fabrication facilities. The effects of variations in seven parameters are considered: false alarm probability, goal probability of detection, detection goal quantity, the plant operator's measurement capability, the inspector's variables measurement capability, the inspector's attributes measurement capability, and annual plant throughput. Among the key results and conclusions of the analysis are the following: the variables with the greatest impact on the probability of detection are the inspector's measurement capability, the goal quantity, and the throughput; the variables with the greatest impact on inspection costs are the throughput, the goal quantity, and the goal probability of detection; there are important interactions between variables, that is, the effects of a given variable often depend on the level or value of some other variable. With the methodology used in this study, these interactions can be quantitatively analyzed, and reasonably good approximate prediction equations can be developed.
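One quantitative ingredient behind such detection-probability analyses is attribute sampling: if d of N items are defective and n are verified at random, the chance of catching at least one defect follows from the hypergeometric distribution. The sketch below illustrates this with invented parameters; the actual study used the Task C.5 model and programs.

```python
# Attribute-sampling logic behind inspection-effort models: if d of N items
# are defective and n are verified at random, the chance of seeing at least
# one defect is 1 - C(N-d, n)/C(N, n). Parameters are illustrative, not
# IAEA criteria values.
from math import comb

def detection_probability(n_items: int, n_defects: int, n_sampled: int) -> float:
    return 1 - comb(n_items - n_defects, n_sampled) / comb(n_items, n_sampled)

for n in (10, 30, 60):
    print(n, round(detection_probability(200, 5, n), 3))
```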
Janke, Leandro; Lima, André O S; Millet, Maurice; Radetski, Claudemir M
2013-01-01
In Brazil, solid waste disposal sites have operated without consideration of environmental criteria, and these areas are characterized by methane (CH4) emissions during the anaerobic degradation of organic matter. The United Nations has made efforts to control this situation through the United Nations Framework Convention on Climate Change (UNFCCC) and the Kyoto Protocol, whereby projects that seek to reduce emissions of greenhouse gases (GHG) can be financially rewarded through Certified Emission Reductions (CERs) if they respect the requirements established by the Clean Development Mechanism (CDM), such as the use of methodologies approved by the CDM Executive Board (CDM-EB). Thus, a methodology was developed according to the CDM standards for the aeration, excavation and composting of closed Municipal Solid Waste (MSW) landfills, which was submitted to the CDM-EB for assessment and, after its approval, applied to a real case study in Maringá City (Brazil) with a view to avoiding negative environmental impacts due to the production of methane and leachates even after closure. This paper describes the establishment of this CDM-EB-approved methodology to determine baseline emissions, project emissions and the resultant emission reductions with the application of appropriate aeration, excavation and composting practices at closed MSW landfills. A further result of applying the methodology in the landfill case study was that an ex-ante emission reduction of 74,013 tCO2 equivalent could be achieved if the proposed CDM project activity were implemented.
NASA Technical Reports Server (NTRS)
Mason, Gregory S.; Berg, Martin C.; Mukhopadhyay, Vivek
2002-01-01
To study the effectiveness of various control system design methodologies, the NASA Langley Research Center initiated the Benchmark Active Controls Project. In this project, the various methodologies were applied to design a flutter suppression system for the Benchmark Active Controls Technology (BACT) Wing. This report describes the user's manual and software toolbox developed at the University of Washington to design a multirate flutter suppression control law for the BACT wing.
Design consideration of resonance inverters with electro-technological application
NASA Astrophysics Data System (ADS)
Hinov, Nikolay
2017-12-01
This study presents design considerations for resonance inverters with electro-technological applications. The methodology presented here resulted from the author's investigations and analyses of different types and operating regimes of resonance inverters. Schemes of resonant inverters without inverse diodes are considered. The first harmonic method is used in the analysis and design; for inverters with electro-technological applications this method gives very good accuracy and does not require a complex and heavy mathematical apparatus. The proposed methodology is easy to use and is suitable for training students in power electronics. The validity of the results is confirmed by simulation and by experimental work on physical prototypes.
ERIC Educational Resources Information Center
Ram, Shri; Anbu K., John Paul; Kataria, Sanjay
2011-01-01
Purpose: This paper seeks to provide an insight into the implementation of some of the innovative Web 2.0 applications at Jaypee University of Information Technology with the aim of exploring the expectations of the users and their awareness and usage of such applications. Design/methodology/approach: The study was undertaken at the Learning…
A multicriteria-based methodology for site prioritisation in sediment management.
Alvarez-Guerra, Manuel; Viguri, Javier R; Voulvoulis, Nikolaos
2009-08-01
Decision-making for sediment management is a complex task that incorporates the selection of areas for remediation and the assessment of options for any mitigation required. The application of Multicriteria Analysis (MCA) to rank different areas according to their need for sediment management provides a great opportunity for prioritisation, the first step in an integrated methodology that ultimately aims to assess and select suitable alternatives for managing the identified priority sites. This paper develops a methodology that starts with the delimitation of management units within areas of study, followed by the application of MCA methods that allow ranking of these management units according to their need for remediation. The proposed process considers not only scientific evidence on sediment quality, but also other relevant aspects such as the social and economic criteria associated with such decisions. The methodology is illustrated with its application to the case study area of the Bay of Santander in northern Spain, highlighting some of the implications of utilising different MCA methods in the process. It also uses site-specific data to assess the subjectivity in the decision-making process, mainly reflected through the assignment of criteria weights and uncertainties in the criteria scores. Analysis of the sensitivity of the results to these factors is used to assess the stability and robustness of the ranking as a first step of the sediment management decision-making process.
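A minimal weighted-sum MCA sketch with a crude weight-perturbation stability check, in the spirit of the sensitivity analysis described above; the management units, criteria scores and weights are invented:

```python
# Hedged sketch: weighted-sum ranking of management units plus a simple
# robustness check under random weight perturbations.
import numpy as np

units = ["MU-1", "MU-2", "MU-3"]
# rows: units; columns: sediment quality, social, economic (0-1, higher = more need)
scores = np.array([[0.9, 0.4, 0.6],
                   [0.5, 0.8, 0.3],
                   [0.7, 0.6, 0.9]])
weights = np.array([0.5, 0.25, 0.25])

def rank(w):
    need = scores @ (w / w.sum())          # normalized weighted-sum score
    return [units[i] for i in np.argsort(-need)]

base = rank(weights)
print("base ranking:", base)
rng = np.random.default_rng(0)
perturbed = [rank(weights * rng.uniform(0.8, 1.2, 3)) for _ in range(200)]
stable = sum(r[0] == base[0] for r in perturbed) / len(perturbed)
print(f"top-priority unit unchanged in {stable:.0%} of weight perturbations")
```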
78 FR 77399 - Basic Health Program: Proposed Federal Funding Methodology for Program Year 2015
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-23
... American Indians and Alaska Natives F. Example Application of the BHP Funding Methodology III. Collection... effectively 138 percent due to the application of a required 5 percent income disregard in determining the... correct errors in applying the methodology (such as mathematical errors). Under section 1331(d)(3)(ii) of...
NASA Technical Reports Server (NTRS)
Celaya, Jose R.; Kulkarni, Chetan S.; Biswas, Gautam; Goebel, Kai
2012-01-01
A remaining useful life prediction methodology for electrolytic capacitors is presented, based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are good candidates for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stress. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the degradation progression data from accelerated aging provide an avenue for validating applications of the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.
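The sketch below shows the generic shape of such an approach: a Kalman filter tracking a linear degradation state (normalized capacitance and its drift rate), then extrapolating to an end-of-life threshold. The degradation model, thresholds and numbers are illustrative assumptions; the paper's empirical model is not reproduced here:

```python
# Hedged sketch: Kalman filter over state [capacitance, drift rate] with
# synthetic aging data, then a linear extrapolation to an EOL threshold.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-drift degradation model
H = np.array([[1.0, 0.0]])
Q = np.diag([1e-6, 1e-8])               # process noise (assumed)
R = np.array([[1e-4]])                  # measurement noise (assumed)

x = np.array([1.0, -1e-4])              # normalized capacitance and drift
P = np.eye(2) * 1e-3
eol = 0.8                               # assume 20% capacitance loss = EOL

def kf_step(x, P, z):
    x = F @ x; P = F @ P @ F.T + Q                 # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # gain
    x = x + K @ (np.array([z]) - H @ x)            # update
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(1)
for t in range(200):                               # synthetic aging data
    z = 1.0 - 2e-4 * t + rng.normal(0, 1e-2)
    x, P = kf_step(x, P, z)

rul = (eol - x[0]) / x[1] if x[1] < 0 else np.inf  # steps to threshold
print(f"estimated state {x.round(4)}, RUL ~ {rul:.0f} aging steps")
```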
Intelligent systems/software engineering methodology - A process to manage cost and risk
NASA Technical Reports Server (NTRS)
Friedlander, Carl; Lehrer, Nancy
1991-01-01
A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology, a refinement of both evolutionary and spiral development methodologies, is appropriate for the development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools, used to manage the development process and to record and report the emerging design, are also described.
ERIC Educational Resources Information Center
Galeeva, Railya B.
2016-01-01
Purpose: The purpose of this study is to demonstrate an adaptation of the SERVQUAL survey method for measuring the quality of higher educational services in a Russian university context. We use a new analysis and a graphical technique for presentation of results. Design/methodology/approach: The methodology of this research follows the classic…
ERIC Educational Resources Information Center
Sun, Shuyan; Pan, Wei
2014-01-01
As applications of multilevel modelling in educational research increase, researchers realize that multilevel data collected in many educational settings are often not purely nested. The most common multilevel non-nested data structure is one that involves student mobility in longitudinal studies. This article provides a methodological review of…
Visual sensitivity of river recreation to power plants
David H. Blau; Michael C. Bowie
1979-01-01
The consultants were asked by the Power Plant Siting Staff of the Minnesota Environmental Quality Council to develop a methodology for evaluating the sensitivity of river-related recreational activities to visual intrusion by large coal-fired power plants. The methodology, which is applicable to any major stream in the state, was developed and tested on a case study...
Working in the Methodological "Outfield": The Case of Bourdieu and Occupational Therapy
ERIC Educational Resources Information Center
Watson, Jo; Grenfell, Michael
2016-01-01
The article reports on a study of methodological innovation involving occupational therapy (OT) students in higher education (HE). It is based on an original project which examined the experiences and outcomes of non-traditional entrants to pre-registration OT education. A feature of the original project was the application of the epistemological…
Long-term land use and land cover change, and the associated impacts, pose critical challenges to sustaining healthy communities and ecosystems. In this study, a methodology was developed to use parcel data to evaluate land use trends in southeast Arizona’s San Pedro River Water...
Disease Risk Score (DRS) as a Confounder Summary Method: Systematic Review and Recommendations
Tadrous, Mina; Gagne, Joshua J.; Stürmer, Til; Cadarette, Suzanne M.
2013-01-01
Purpose To systematically examine trends and applications of the disease risk score (DRS) as a confounder summary method. Methods We completed a systematic search of MEDLINE and Web of Science® to identify all English language articles that applied DRS methods. We tabulated the number of publications by year and type (empirical application, methodological contribution, or review paper) and summarized methods used in empirical applications overall and by publication year (<2000, ≥2000). Results Of 714 unique articles identified, 97 examined DRS methods and 86 were empirical applications. We observed a bimodal distribution in the number of publications over time, with a peak in 1979-1980 and a resurgence since 2000. The majority of applications with methodological detail derived DRS using logistic regression (47%), used DRS as a categorical variable in regression (93%), and applied DRS in a non-experimental cohort (47%) or case-control (42%) study. Few studies examined effect modification by outcome risk (23%). Conclusion Use of DRS methods has increased yet remains low. Comparative effectiveness research may benefit from more DRS applications, particularly to examine effect modification by outcome risk. Standardized terminology may facilitate identification, application, and comprehension of DRS methods. More research is needed to support the application of DRS methods, particularly in case-control studies. PMID:23172692
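For readers unfamiliar with DRS mechanics, here is a hedged sketch of the most common variant reported above: a logistic-regression DRS fitted among the unexposed, then used as a stratification variable. All data and variable names are simulated:

```python
# Hedged sketch: disease risk score via logistic regression on simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 20_000
age = rng.normal(60, 10, n)
comorbid = rng.binomial(1, 0.3, n)
exposed = rng.binomial(1, 1 / (1 + np.exp(-(age - 60) / 10)))      # confounded
logit = -5 + 0.04 * age + 0.8 * comorbid + 0.5 * exposed
disease = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([age, comorbid])
# Fit the outcome model among the UNEXPOSED only, then score everyone.
drs_model = LogisticRegression().fit(X[exposed == 0], disease[exposed == 0])
drs = drs_model.predict_proba(X)[:, 1]

quintile = np.digitize(drs, np.quantile(drs, [0.2, 0.4, 0.6, 0.8]))
for q in range(5):                       # exposure effect within DRS strata
    m = quintile == q
    r1 = disease[m & (exposed == 1)].mean()
    r0 = disease[m & (exposed == 0)].mean()
    print(f"DRS quintile {q + 1}: risk ratio ~ {r1 / r0:.2f}")
```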
Self-Organisation and Capacity Building: Sustaining the Change
ERIC Educational Resources Information Center
Bain, Alan; Walker, Allan; Chan, Anissa
2011-01-01
Purpose: The paper aims to describe the application of theoretical principles derived from a study of self-organisation and complex systems theory and their application to school-based capacity building to support planned change. Design/methodology/approach: The paper employs a case example in a Hong Kong School to illustrate the application of…
Error-rate prediction for programmable circuits: methodology, tools and studied cases
NASA Astrophysics Data System (ADS)
Velazco, Raoul
2013-05-01
This work presents an approach to predicting the error rates due to Single Event Upsets (SEUs) occurring in programmable circuits as a consequence of the impact of energetic particles present in the environment in which the circuits operate. For a chosen application, the error rate is predicted by combining the results of radiation ground testing with the results of fault injection campaigns performed off-beam, during which huge numbers of SEUs are injected during execution of the studied application. The goal of this strategy is to obtain accurate error-rate estimates for different applications without using particle accelerator facilities, thus significantly reducing the cost of the sensitivity evaluation. As a case study, this methodology was applied to a complex processor, the PowerPC 7448, executing a program issued from a real space application, and to a crypto-processor application implemented in an SRAM-based FPGA and accepted to be embedded in the payload of a NASA scientific satellite. The accuracy of the predicted error rates was confirmed by comparing, for the same circuit and application, predictions with measurements from radiation ground testing performed at the Cyclone cyclotron of the Heavy Ion Facility (HIF) at Louvain-la-Neuve (Belgium).
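A minimal sketch of the combination rule implied above: the static per-bit cross-section from ground testing, times the fraction of injected SEUs that produce an application error, scaled by the environment's particle flux. All parameter values are placeholders, not figures from the paper:

```python
# Hedged sketch of error-rate prediction from ground testing + fault injection.
def predicted_error_rate(sigma_bit_cm2, n_bits, err_per_seu, flux_p_cm2_s):
    seu_rate = sigma_bit_cm2 * n_bits * flux_p_cm2_s   # SEUs per second
    return seu_rate * err_per_seu                      # application errors/s

rate = predicted_error_rate(
    sigma_bit_cm2=1e-14,     # per-bit cross-section from accelerator testing
    n_bits=32 * 2**20,       # sensitive bits in the design (assumed)
    err_per_seu=0.12,        # fraction of injected SEUs causing app errors
    flux_p_cm2_s=2.0,        # environment-dependent particle flux (assumed)
)
print(f"{rate * 86400:.2e} application errors per day")
```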
Application of CFD in Indonesian Research: A review
NASA Astrophysics Data System (ADS)
Ambarita, H.; Siregar, M. R.; Kishinami, K.; Daimaruya, M.; Kawai, H.
2018-04-01
Computational Fluid Dynamics (CFD) is a numerical method that solves fluid flow and related governing equations using a computational tool. Studies on CFD, its methodology and its application as a research tool are increasing. In this study, the application of CFD by Indonesian researchers is briefly reviewed. The main objective is to explore the characteristics of CFD applications among Indonesian researchers. Considering its size and reputation, this study uses the Scopus publication index as its database. All documents in Scopus related to CFD with at least one author affiliated with an Indonesian institution were collected for review. Research topics, CFD methods and simulation results are reviewed in brief. The results show 260 documents in the Scopus-indexed literature: 125 research articles, 135 conference papers, 1 book and 1 review. Among the research articles, only a few focus on the development of CFD methodology; almost all use CFD as a research tool in a particular application, such as aircraft, wind power and heat exchangers. The topics of the 125 research articles can be divided into 12 specific applications and 1 miscellaneous application. The most popular application is heating, ventilating and air conditioning, followed by reactor, transportation and heat exchanger applications. The most popular commercial CFD code is ANSYS Fluent; only a few researchers use CFX.
Corvalán, Roberto M; Osses, Mauricio; Urrutia, Cristian M
2002-02-01
Depending on the final application, several methodologies for traffic emission estimation have been developed. Emission estimation based on total miles traveled or other average factors is a sufficient approach only for extended areas such as national or worldwide regions. For road emission control and strategy design, microscale analysis based on real-world emission estimates is often required. This involves the actual driving behavior and emission factors of the local vehicle fleet under study. This paper reports on a microscale model for hot road emissions and its application to the metropolitan region of the city of Santiago, Chile. The methodology considers street-by-street hot emission estimation with its temporal and spatial distribution. The input data come from experimental emission factors based on local driving patterns and from traffic surveys of traffic flows for different vehicle categories. The methodology developed is able to estimate hourly hot road CO, total unburned hydrocarbons (THCs), particulate matter (PM), and NO(x) emissions for predefined day types and vehicle categories.
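The street-by-street accounting reduces to flow times link length times a per-category emission factor, summed over links, categories and pollutants. The sketch below uses invented factors and flows:

```python
# Hedged sketch of link-level hot emission totals; all values are invented.
emission_factors_g_km = {           # per-vehicle hot emission factors
    "car": {"CO": 3.0, "NOx": 0.6},
    "bus": {"CO": 6.0, "NOx": 9.0},
}
links = [                           # (length in km, hourly flow by category)
    {"length": 0.8, "flow": {"car": 1200, "bus": 40}},
    {"length": 1.5, "flow": {"car": 800, "bus": 25}},
]

totals = {"CO": 0.0, "NOx": 0.0}
for link in links:
    for cat, veh_per_h in link["flow"].items():
        for pol, ef in emission_factors_g_km[cat].items():
            totals[pol] += veh_per_h * link["length"] * ef   # g/h on this link
print({p: f"{g / 1000:.1f} kg/h" for p, g in totals.items()})
```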
Makar, Susan; Malanowski, Amanda; Rapp, Katie
2016-01-01
The Information Services Office (ISO) of the National Institute of Standards and Technology (NIST) proactively sought out an opportunity to present the findings of a study that showed the impact of NIST’s forensic research output to its internal customers and outside researchers. ISO analyzed the impact of NIST’s contributions to the peer-reviewed forensic journal literature through citation analysis and network visualizations. The findings of this study were compiled into a poster that was presented during the Forensics@NIST Symposium in December 2014. ISO’s study informed the forensic research community where NIST has had some of the greatest scholarly impact. This paper describes the methodology used to assess the impact of NIST’s forensic publications and shares the results, outcomes, and impacts of ISO’s study and poster presentation. This methodology is adaptable and applicable to other research fields and to other libraries. It has improved the recognition of ISO’s capabilities within NIST and resulted in application of the methodology to additional scientific disciplines. PMID:27956754
Brain Dynamics: Methodological Issues and Applications in Psychiatric and Neurologic Diseases
NASA Astrophysics Data System (ADS)
Pezard, Laurent
The human brain is a complex dynamical system generating the EEG signal. Numerical methods developed to study complex physical dynamics have been used to characterize the EEG since the mid-eighties. This endeavor raises several issues related to the specificity of the EEG. Firstly, theoretical and methodological studies should address the major differences between the dynamics of the human brain and those of physical systems. Secondly, this approach to the EEG signal should prove relevant for dealing with physiological or clinical problems. A set of studies performed in our group is presented here within the context of these two problematic aspects. After a discussion of methodological drawbacks, we review numerical simulations related to the high dimension and spatial extension of brain dynamics. Experimental studies in neurologic and psychiatric disease are then presented. We conclude that while it is now clear that brain dynamics change in relation to clinical situations, methodological problems remain largely unsolved.
Modified Dynamic Inversion to Control Large Flexible Aircraft: What's Going On?
NASA Technical Reports Server (NTRS)
Gregory, Irene M.
1999-01-01
High performance aircraft of the future will be designed lighter and more maneuverable, and will operate over an ever-expanding flight envelope. From the flight control perspective, one of the largest differences between current and future advanced aircraft is elasticity. Over the last decade, the dynamic inversion methodology has gained considerable popularity in application to highly maneuverable fighter aircraft, which were treated as rigid vehicles. This paper explores the application of dynamic inversion to an advanced highly flexible aircraft. An initial application has been made to a large flexible supersonic aircraft. In the course of controller design for this advanced vehicle, modifications were made to the standard dynamic inversion methodology, and the results were deemed rather promising. An analytical study has been undertaken to better understand the nature of these modifications and to determine their general applicability. This paper presents the results of this initial analytical look at the modifications to dynamic inversion for control of large flexible aircraft.
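For readers unfamiliar with the baseline technique, here is a minimal scalar sketch of standard dynamic inversion (not the paper's flexible-aircraft modifications): for dynamics x_dot = f(x) + g(x)u, choose u = (v - f(x)) / g(x) so the closed loop follows the desired dynamics v. All coefficients are invented:

```python
# Hedged sketch: scalar dynamic inversion tracking a pitch-rate command.
def f(x):            # nominal rigid-body moment (illustrative)
    return -0.8 * x

def g(x):            # control effectiveness (must remain invertible)
    return 2.0

def dynamic_inversion(x, x_cmd, bandwidth=3.0):
    v = bandwidth * (x_cmd - x)      # desired first-order error dynamics
    return (v - f(x)) / g(x)         # invert the plant

x, dt = 0.0, 0.01
for _ in range(300):                 # track a 1 rad/s command for 3 s
    u = dynamic_inversion(x, x_cmd=1.0)
    x += dt * (f(x) + g(x) * u)
print(f"pitch rate after 3 s: {x:.3f} (command 1.0)")
```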
Taylor, J V; DiBennardo, R; Linares, G H; Goldman, A D; DeForest, P R
1984-07-01
A case study is presented to demonstrate the utility of the team approach to the identification of human remains, and to illustrate a methodological innovation developed by MFAT. Case 1 represents the first of several planned case studies, each designed to present new methodological solutions to standard problems in identification. The present case describes a test, by application, of race and sex assessment of the postcranial skeleton by discriminant function analysis.
Flamm, Christoph; Graef, Andreas; Pirker, Susanne; Baumgartner, Christoph; Deistler, Manfred
2013-01-01
Granger causality is a useful concept for studying causal relations in networks. However, numerical problems occur when applying the corresponding methodology to high-dimensional time series showing co-movement, e.g. EEG recordings or economic data. In order to deal with these shortcomings, we propose a novel method for the causal analysis of such multivariate time series based on Granger causality and factor models. We present the theoretical background, successfully assess our methodology with the help of simulated data and show a potential application in EEG analysis of epileptic seizures. PMID:23354014
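A hedged sketch of the two ingredients named in the abstract: dimension reduction of co-moving channels to a few factors (plain PCA here, as a stand-in for the paper's factor model), followed by Granger-causality testing on a VAR fitted to the factors. The data are simulated:

```python
# Hedged sketch: PCA factors + VAR Granger-causality test on simulated
# co-moving multichannel data (a stand-in for EEG recordings).
import numpy as np
from sklearn.decomposition import PCA
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(3)
T, n_ch = 1000, 20
driver = np.zeros(T)
for t in range(1, T):                        # latent AR(1) source
    driver[t] = 0.9 * driver[t - 1] + rng.normal()
follower = np.roll(driver, 2) + 0.5 * rng.normal(size=T)  # lags the driver

# Channels are noisy mixtures of the two sources, hence strong co-movement.
mixing = rng.normal(size=(2, n_ch))
channels = np.column_stack([driver, follower]) @ mixing + rng.normal(size=(T, n_ch))

factors = PCA(n_components=2).fit_transform(channels)
res = VAR(factors).fit(maxlags=5)
print(res.test_causality(caused=1, causing=0, kind="f").summary())
```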
Application of numerical methods to heat transfer and thermal stress analysis of aerospace vehicles
NASA Technical Reports Server (NTRS)
Wieting, A. R.
1979-01-01
The paper describes a thermal-structural design analysis study of a fuel-injection strut for a hydrogen-cooled scramjet engine for a supersonic transport, utilizing finite-element methodology. Applications of finite-element and finite-difference codes to the thermal-structural design-analysis of space transports and structures are discussed. The interaction between the thermal and structural analyses has led to development of finite-element thermal methodology to improve the integration between these two disciplines. The integrated thermal-structural analysis capability developed within the framework of a computer code is outlined.
NASA Astrophysics Data System (ADS)
Halbe, Johannes; Pahl-Wostl, Claudia; Adamowski, Jan
2018-01-01
Multiple barriers constrain the widespread application of participatory methods in water management, including the more technical focus of most water agencies, additional cost and time requirements for stakeholder involvement, as well as institutional structures that impede collaborative management. This paper presents a stepwise methodological framework that addresses the challenges of context-sensitive initiation, design and institutionalization of participatory modeling processes. The methodological framework consists of five successive stages: (1) problem framing and stakeholder analysis, (2) process design, (3) individual modeling, (4) group model building, and (5) institutionalized participatory modeling. The Management and Transition Framework is used for problem diagnosis (Stage One), context-sensitive process design (Stage Two) and analysis of requirements for the institutionalization of participatory water management (Stage Five). Conceptual modeling is used to initiate participatory modeling processes (Stage Three) and ensure a high compatibility with quantitative modeling approaches (Stage Four). This paper describes the proposed participatory model building (PMB) framework and provides a case study of its application in Québec, Canada. The results of the Québec study demonstrate the applicability of the PMB framework for initiating and designing participatory model building processes and analyzing barriers towards institutionalization.
Application of Six Sigma towards improving surgical outcomes.
Shukla, P J; Barreto, S G; Nadkarni, M S
2008-01-01
Six Sigma is a 'process excellence' tool targeting continuous improvement, achieved by providing a methodology for improving key steps of a process. It is ripe for application in health care, since almost all health care processes require a near-zero tolerance for mistakes. The aim of this study is to apply the Six Sigma methodology to a clinical surgical process and to assess the improvement (if any) in outcomes and patient care. The guiding principles of Six Sigma, namely DMAIC (Define, Measure, Analyze, Improve, Control), were used to analyze the impact of the double stapling technique (DST) on improving sphincter preservation rates for rectal cancer. The analysis using the Six Sigma methodology revealed a sigma score of 2.10 in relation to successful sphincter preservation, an improvement over the previous technique (73% versus the previous 54%). This study represents one of the first clinical applications of Six Sigma in the surgical field. By understanding, accepting, and applying the principles of Six Sigma, we have an opportunity to transfer a very successful management philosophy to facilitate the identification of key steps that can improve outcomes and ultimately patient safety and the quality of surgical care provided.
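The quoted sigma score can be reproduced with the usual yield-to-sigma conversion (the normal quantile of the success rate plus the conventional 1.5-sigma shift). This is a hedged reconstruction, since the paper does not spell out its arithmetic:

```python
# Hedged reconstruction of the sigma-score arithmetic.
from scipy.stats import norm

def sigma_score(success_rate, shift=1.5):
    # Conventional process sigma: normal quantile of yield + 1.5-sigma shift.
    return norm.ppf(success_rate) + shift

print(f"{sigma_score(0.73):.2f}")   # ~2.11, close to the study's 2.10
print(f"{sigma_score(0.54):.2f}")   # previous technique, ~1.60
```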
ERIC Educational Resources Information Center
Association for Education in Journalism and Mass Communication.
The Communication Theory and Methodology section of the Proceedings contains the following 20 papers: "Information Sufficiency and Risk Communication" (Robert J. Griffin, Kurt Neuwirth, and Sharon Dunwoody); "The Therapeutic Application of Television: An Experimental Study" (Charles Kingsley); "A Path Model Examining the…
Epel, Boris; Sundramoorthy, Subramanian V.; Barth, Eugene D.; Mailer, Colin; Halpern, Howard J.
2011-01-01
Purpose: The authors compare two electron paramagnetic resonance imaging modalities at 250 MHz to determine advantages and disadvantages of those modalities for in vivo oxygen imaging. Methods: Electron spin echo (ESE) and continuous wave (CW) methodologies were used to obtain three-dimensional images of a narrow linewidth, water soluble, nontoxic oxygen-sensitive trityl molecule OX063 in vitro and in vivo. The authors also examined sequential images obtained from the same animal injected intravenously with trityl spin probe to determine temporal stability of methodologies. Results: A study of phantoms with different oxygen concentrations revealed a threefold advantage of the ESE methodology in terms of reduced imaging time and more precise oxygen resolution for samples with less than 70 torr oxygen partial pressure. Above ~100 torr, CW performed better. The images produced by both methodologies showed pO2 distributions with similar mean values. However, ESE images demonstrated superior performance in low pO2 regions while missing voxels in high pO2 regions. Conclusions: ESE and CW have different areas of applicability. ESE is superior for hypoxia studies in tumors. PMID:21626937
Manfredi, Simone; Cristobal, Jorge
2016-09-01
Responding to the latest policy needs, the work presented in this article aims to develop a life-cycle-based framework methodology to quantitatively evaluate the environmental and economic sustainability of European food waste management options. The methodology is structured into six steps aimed at defining the boundaries and scope of the evaluation, evaluating environmental and economic impacts, and identifying best-performing options. The methodology can accommodate additional assessment criteria, for example the social dimension of sustainability, thus moving towards a comprehensive sustainability assessment framework. A numerical case study is also developed to provide an example of the application of the proposed methodology to an average European context. Different options for food waste treatment are compared, including landfilling, composting, anaerobic digestion and incineration. The environmental dimension is evaluated with the software EASETECH, while the economic assessment is based on different indicators expressing the costs associated with food waste management. Results show that the proposed methodology allows for straightforward identification of the most sustainable options for food waste and thus can provide factual support to decision- and policy-making. However, results also depend markedly on a number of user-defined assumptions, for example the choice of the indicators used to express environmental and economic performance. © The Author(s) 2016.
Qualitative Research Literature: A Bibliographic Essay.
ERIC Educational Resources Information Center
Horn, Jim
1998-01-01
Presents selected literature that exemplifies (in theory and in practice) four methodological frameworks that have found wide application in qualitative studies: symbolic interactionism, phenomenological description, constructivist hermeneutics, and critical studies. (Author/LRW)
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
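The core PFA pattern, propagating parameter uncertainty through a deterministic failure model to obtain a failure-probability estimate, can be sketched by Monte Carlo. The fatigue model, distributions and numbers below are invented stand-ins, and the statistical updating with test or flight experience is omitted:

```python
# Hedged sketch: Monte Carlo failure probability from a Paris-law fatigue
# life with uncertain inputs (all values illustrative; MPa and m units).
import numpy as np

rng = np.random.default_rng(4)
n = 100_000
service_cycles = 1e6

a0 = rng.lognormal(mean=np.log(1e-4), sigma=0.3, size=n)   # initial crack, m
C = rng.lognormal(mean=np.log(5e-12), sigma=0.4, size=n)   # Paris coefficient
dsigma = rng.normal(120.0, 10.0, size=n)                   # stress range, MPa
m, a_crit, Y = 3.0, 5e-3, 1.12

# Closed-form Paris-law life for m != 2:
#   N_f = (a0^(1-m/2) - a_crit^(1-m/2)) / ((m/2 - 1) * C * (Y*dsigma*sqrt(pi))^m)
dK_coef = (Y * dsigma * np.sqrt(np.pi)) ** m
n_fail = (a0 ** (1 - m / 2) - a_crit ** (1 - m / 2)) / ((m / 2 - 1) * C * dK_coef)

p_fail = np.mean(n_fail < service_cycles)
print(f"estimated failure probability over service life: {p_fail:.4f}")
```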
1978-09-01
This report describes an effort to specify a software design methodology applicable to the Air Force software environment. Available methodologies...of techniques for proof of correctness, design specification, and performance assessment of static designs. The rational methodology selected is a
End State: The Fallacy of Modern Military Planning
2017-04-06
operational planning for non-linear, complex scenarios requires application of non-linear, advanced planning techniques such as design methodology...cannot be approached in a linear, mechanistic manner by a universal planning methodology. Theater/global campaign plans and theater strategies offer no...strategic environments, and instead prescribes a universal linear methodology that pays no mind to strategic complexity. This universal application
The Role of Ambulatory Assessment in Psychological Science.
Trull, Timothy J; Ebner-Priemer, Ulrich
2014-12-01
We describe the current use and future promise of an innovative methodology, ambulatory assessment (AA), that can be used to investigate psychological, emotional, behavioral, and biological processes of individuals in their daily life. The term AA encompasses a wide range of methods used to study people in their natural environment, including momentary self-report, observational, and physiological methods. We emphasize applications of AA that integrate two or more of these methods, discuss the smart phone as a hub or access point for AA, and discuss future applications of AA methodology to the science of psychology. We pay particular attention to the development and application of Wireless Body Area Networks (WBANs) that can be implemented with smart phones and wireless physiological monitoring devices, and we close by discussing future applications of this approach to matters relevant to psychological science.
Application-specific coarse-grained reconfigurable array: architecture and design methodology
NASA Astrophysics Data System (ADS)
Zhou, Li; Liu, Dongpei; Zhang, Jianfeng; Liu, Hengzhu
2015-06-01
Coarse-grained reconfigurable arrays (CGRAs) have shown potential for application in embedded systems in recent years. Numerous reconfigurable processing elements (PEs) in CGRAs provide flexibility while maintaining high performance by exploiting different levels of parallelism. However, a performance and efficiency gap remains between CGRAs and application-specific integrated circuits (ASICs). Some application domains, such as software-defined radios (SDRs), require flexibility even as performance demands increase, so more effective CGRA architectures are needed. Customising a CGRA to its application can improve performance and efficiency. This study proposes an application-specific CGRA architecture template composed of generic PEs (GPEs) and special PEs (SPEs). The hardware of the SPE can be customised to accelerate specific computational patterns. An automatic design methodology that includes pattern identification and application-specific function unit generation is also presented, along with a mapping algorithm based on ant colony optimisation. Experimental results on the SDR target domain show that, compared with other general-purpose and application-specific reconfigurable architectures, the CGRA generated by the proposed method performs more efficiently for the given applications.
Crozier, Sarah E; Cassell, Catherine M
2016-06-01
The use of longitudinal methodology as a means of capturing the intricacies in complex organizational phenomena is well documented, and many different research strategies for longitudinal designs have been put forward from both a qualitative and quantitative stance. This study explores a specific emergent qualitative methodology, audio diaries, and assesses their utility for work psychology research, drawing on the findings from a four-stage study addressing transient working patterns and stress in UK temporary workers. Specifically, we explore some important methodological, analytical and technical issues for practitioners and researchers who seek to use these methods, and explain how this type of methodology has much to offer when studying stress and affective experiences at work. We provide support for the need to implement pluralistic and complementary methodological approaches in unearthing the depth in sense-making, and assert their capacity to further illuminate the process orientation of stress. This study illustrates the importance of verbalization in documenting stress and affective experience as a mechanism for accessing cognitive processes in making sense of such experience. This study compares audio diaries with more traditional qualitative methods to assess applicability to different research contexts. This study also provides practical guidance and a methodological framework for the design of audio diary research, taking into account challenges and solutions for researchers and practitioners.
Applying axiomatic design to a medication distribution system
NASA Astrophysics Data System (ADS)
Raguini, Pepito B.
As the need to minimize medication errors drives many medical facilities to seek robust solutions to the most common errors affecting patient safety, these hospitals would be wise to put a concerted effort into finding methodologies that can facilitate an optimized medication distribution system. If a hospital's upper management is looking for an optimization method that is an ideal fit, it is just as important that the right tool be selected for the application at hand. In the present work, we propose the application of Axiomatic Design (AD), a process that focuses on the generation and selection of functional requirements to meet the customer needs for product and/or process design. The appeal of the axiomatic approach is that it provides both a formal design process and a set of technical coefficients for meeting the customer's needs. AD thus offers a strategy for the effective integration of people, design methods, design tools and design data. We therefore propose the AD methodology for medical applications, with the main objective of allowing nurses to provide cost-effective delivery of medications to inpatients, thereby improving the quality of patient care. The AD methodology is implemented through the use of focused stores, where medications can be readily stored conveniently near patients, as well as through a mobile apparatus commonly used by hospitals that can also store medications: the medication cart. Moreover, a robust methodology, the focused-store methodology, is introduced and developed for both uncapacitated and capacitated case studies, setting up an appropriate AD framework and design problem for a medication distribution case study.
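To give a flavor of what an AD-style check looks like in practice, the sketch below encodes a hypothetical design matrix for the medication-distribution example and classifies it against the independence axiom. The functional requirements, design parameters and matrix entries are all invented:

```python
# Hedged sketch: independence-axiom classification of a design matrix A,
# where A[i, j] = 1 if design parameter j affects functional requirement i.
import numpy as np

frs = ["deliver dose on time", "track inventory", "minimize nurse travel"]
dps = ["focused stores near patients", "barcode logging", "cart routes"]
A = np.array([[1, 0, 1],
              [0, 1, 0],
              [0, 0, 1]])

def classify(A):
    off_diagonal = A[~np.eye(len(A), dtype=bool)]
    if not off_diagonal.any():
        return "uncoupled (ideal: each FR has its own DP)"
    if not np.triu(A, 1).any() or not np.tril(A, -1).any():
        return "decoupled (workable: set DPs in triangular order)"
    return "coupled (redesign recommended)"

print(classify(A))   # upper-triangular here -> decoupled
```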
Recent studies of measures to improve basamid soil disinfestation.
Van Wambeke, E
2011-01-01
Basamid micro-granule is used worldwide as a broad-spectrum soil fumigant generator and has replaced methyl bromide for many applications. The factors determining the success of an application, from soil preparation and conditions through application and soil sealing or tarping, as well as the operations and hygienic measures after the fumigant contact time, have been known for decades. This paper describes studies from the last six years on the improvement of application methods, both from the viewpoint of homogeneous incorporation of the granule over the soil profile to be treated and from that of preventing premature loss of the gaseous active methyl isothiocyanate (MITC) by using improved tarping materials. Both result in lower environmental exposure and better biological performance of the application. In that respect, product incorporation in soil was studied in France and in Italy with more recent commercially available Basamid application machinery, and 29 plastic films were compared for their MITC barrier properties with an in-house developed method. Film testing allowed clear categorization into standard (monolayer) films, V.I.F. (Virtually Impermeable Film) and T.I.F. (Totally Impermeable Film). The paper presents the methodology and results of the granule incorporation study, with trials of two specific Basamid application machines compared with a classic rotovator; the methodology and results of the plastic film barrier-property comparison; and directives to minimize exposure and maximize performance.
Chuen, Onn Chiu; Yusoff, Sumiani
2012-03-01
This study assessed the benefits of applying the Clean Development Mechanism (CDM) to the waste treatment system of a local palm oil mill in Malaysia. Life cycle assessment (LCA) was conducted to assess the environmental impacts of the greenhouse gas (GHG) reduction from the CDM application. Emission reduction calculations used the methodology based on AM002 (Avoided Wastewater and On-site Energy Use Emissions in the Industrial Sector) Version 4, published by the United Nations Framework Convention on Climate Change (UNFCCC). The results showed that introducing CDM in the palm oil mill, by converting the biogas captured from palm oil mill effluent (POME) treatment into power generation, could reduce emissions by approximately 0.12 tonnes of CO2 equivalent (tCO2e) and generate 30 kWh of power per tonne of fresh fruit bunch processed. Thus, the application of the CDM methodology to palm oil mill wastewater treatment could reduce the overall environmental impact generated by the mill by up to one quarter.
Novel thermal management system design methodology for power lithium-ion battery
NASA Astrophysics Data System (ADS)
Nieto, Nerea; Díaz, Luis; Gastelurrutia, Jon; Blanco, Francisco; Ramos, Juan Carlos; Rivas, Alejandro
2014-12-01
Battery packs composed of large-format lithium-ion cells are increasingly being adopted in hybrid and pure electric vehicles in order to use energy more efficiently and improve environmental performance. Safety and cycle life are two of the main concerns regarding this technology, and both are closely related to the cells' operating behavior and temperature asymmetries in the system. The temperature of the cells in battery packs therefore needs to be controlled by thermal management systems (TMSs). In the present paper an improved design methodology for developing TMSs is proposed. This methodology involves the development of different mathematical models for heat generation, transmission, and dissipation, and their coupling and integration into the battery pack product design methodology in order to improve overall safety and performance. The methodology is validated by comparing simulation results with laboratory measurements on a single module of the battery pack designed at IK4-IKERLAN for a traction application. The maximum difference between model predictions and experimental temperature data is 2 °C. The models developed have shown potential for use in battery thermal management studies for EV/HEV applications, since they allow for scalability with accuracy and reasonable simulation time.
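The heat generation/transmission/dissipation coupling described above can be illustrated with a lumped single-cell energy balance, m·c·dT/dt = I²·R_int − h·A·(T − T_amb). The parameters below are illustrative assumptions, not the paper's validated model:

```python
# Hedged sketch: lumped thermal model of one cell under constant current.
m_c = 900.0      # cell thermal mass, J/K (assumed)
R_int = 0.002    # internal resistance, ohm (assumed)
h_A = 0.9        # convective conductance to coolant/air, W/K (assumed)
T_amb = 25.0     # coolant temperature, deg C

def simulate(current_a, t_end=3600.0, dt=1.0, T0=25.0):
    T = T0
    for _ in range(int(t_end / dt)):
        q_gen = current_a**2 * R_int      # Joule heating, W
        q_out = h_A * (T - T_amb)         # dissipation, W
        T += dt * (q_gen - q_out) / m_c   # explicit Euler step
    return T

for i in (50.0, 100.0):
    print(f"I = {i:>5.0f} A -> cell temperature after 1 h: {simulate(i):.1f} C")
```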
Kansei, surfaces and perception engineering
NASA Astrophysics Data System (ADS)
Rosen, B.-G.; Eriksson, L.; Bergman, M.
2016-09-01
The aesthetic and pleasing properties of a product are important and add significantly to its meaning and relevance. Customer sensation and perception are largely about psychological factors. There has been strong industrial and academic interest in methods and tools to quantify product properties and link them to human response, but studies of the impact of surfaces are lacking. In this study, affective surface engineering is used to illustrate and model the link between customer expectations and perception and controllable product surface properties. The results highlight the use of the soft metrology concept for linking the physical and human factors contributing to the perception of products. Surface applications of the Kansei methodology are presented from sauna bath, health care, architectural and hygiene-tissue application areas to illustrate, discuss and confirm the strength of the methodology. In conclusion, future research in soft metrology is proposed to allow understanding and modelling of product perception and sensations, in combination with further development of the Kansei surface engineering methodology and software tools.
A semi-quantitative approach to GMO risk-benefit analysis.
Morris, E Jane
2011-10-01
In many countries there are increasing calls for the benefits of genetically modified organisms (GMOs) to be considered as well as the risks, and for a risk-benefit analysis to form an integral part of GMO regulatory frameworks. This trend represents a shift away from the strict emphasis on risks, which is encapsulated in the Precautionary Principle that forms the basis for the Cartagena Protocol on Biosafety, and which is reflected in the national legislation of many countries. The introduction of risk-benefit analysis of GMOs would be facilitated if clear methodologies were available to support the analysis. Up to now, methodologies for risk-benefit analysis that would be applicable to the introduction of GMOs have not been well defined. This paper describes a relatively simple semi-quantitative methodology that could be easily applied as a decision support tool, giving particular consideration to the needs of regulators in developing countries where there are limited resources and experience. The application of the methodology is demonstrated using the release of an insect resistant maize variety in South Africa as a case study. The applicability of the method in the South African regulatory system is also discussed, as an example of what might be involved in introducing changes into an existing regulatory process.
Longo, S; Hospido, A; Lema, J M; Mauricio-Iglesias, M
2018-05-10
This article examines the potential benefits of using Data Envelopment Analysis (DEA) for conducting energy-efficiency assessment of wastewater treatment plants (WWTPs). WWTPs are characteristically heterogeneous (in size, technology, climate, function ...), which limits the correct application of DEA. This paper proposes and describes the Robust Energy Efficiency DEA (REED) in its various stages, a systematic state-of-the-art methodology aimed at including exogenous variables in nonparametric frontier models and especially designed for WWTP operation. In particular, the methodology systematizes the modelling process by presenting an integrated framework for selecting the correct variables and appropriate models, possibly tackling the effect of exogenous factors. As a result, the application of REED improves the quality of the efficiency estimates and hence the significance of benchmarking. For the reader's convenience, this article is presented as a step-by-step guideline that takes the user through the determination of WWTP energy efficiency from beginning to end. The application and benefits of the developed methodology are demonstrated by a case study comparing the energy efficiency of a set of 399 WWTPs operating in different countries and under heterogeneous environmental conditions. Copyright © 2018 Elsevier Ltd. All rights reserved.
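The DEA core of such a methodology (without REED's exogenous-variable handling, which the article adds on top) can be sketched as an input-oriented CCR model solved once per plant: minimize the factor theta by which the plant's energy input could shrink while a convex-cone combination of peers still matches its outputs. The plants and data below are invented:

```python
# Hedged sketch: input-oriented CCR DEA via linear programming (scipy).
import numpy as np
from scipy.optimize import linprog

energy = np.array([120.0, 90.0, 150.0, 60.0])     # input: kWh/day per plant
outputs = np.array([[10.0, 8.0, 11.0, 6.0],       # output 1: m3 treated
                    [0.9, 0.8, 0.7, 0.9]])        # output 2: removal efficiency

n = len(energy)
for k in range(n):
    # decision variables: [theta, lambda_1..lambda_n]; minimize theta
    c = np.concatenate([[1.0], np.zeros(n)])
    A_ub = [np.concatenate([[-energy[k]], energy])]      # peers' energy <= theta*E_k
    b_ub = [0.0]
    for r in range(outputs.shape[0]):                    # peers' outputs >= plant k's
        A_ub.append(np.concatenate([[0.0], -outputs[r]]))
        b_ub.append(-outputs[r, k])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    print(f"plant {k + 1}: efficiency score = {res.x[0]:.2f}")
```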
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lynn, R.Y.S.; Bolmarcich, J.J.
The purpose of this Memorandum is to propose a prototype procedure which the Office of Munitions might employ to exercise, in a supportive joint fashion, two of its High Level Conventional Munitions Models, namely, the OSD Threat Methodology and the Joint Munitions Assessment and Planning (JMAP) model. The joint application of JMAP and the OSD Threat Methodology provides a tool to optimize munitions stockpiles. The remainder of this Memorandum comprises five parts. The first is a description of the structure and use of the OSD Threat Methodology. The second is a description of JMAP and its use. The third discusses the concept of the joint application of JMAP and OSD Threat Methodology. The fourth displays sample output of the joint application. The fifth is a summary and epilogue. Finally, three appendices contain details of the formulation, data, and computer code.
Basic principles, methodology, and applications of remote sensing in agriculture
NASA Technical Reports Server (NTRS)
Moreira, M. A. (Principal Investigator); Deassuncao, G. V.
1984-01-01
The basic principles of remote sensing applied to agriculture and the methods used in data analysis are described. Emphasis is placed on the importance of developing a methodology that may aid crop forecasting, on basic concepts of the spectral signatures of vegetation, on the methodology of LANDSAT data utilization in agriculture, and on the agricultural applications of the remote sensing program of INPE (Institute for Space Research).
Methodological triangulation: an approach to understanding data.
Bekhet, Abir K; Zauszniewski, Jaclene A
2012-01-01
To describe the use of methodological triangulation in a study of how people who had moved to retirement communities were adjusting. Methodological triangulation involves using more than one kind of method to study a phenomenon. It has been found to be beneficial in providing confirmation of findings, more comprehensive data, increased validity and enhanced understanding of studied phenomena. While many researchers have used this well-established technique, there are few published examples of its use. The authors used methodological triangulation in their study of people who had moved to retirement communities in Ohio, US. A blended qualitative and quantitative approach was used. The collected qualitative data complemented and clarified the quantitative findings by helping to identify common themes. Qualitative data also helped in understanding interventions for promoting 'pulling' factors and for overcoming 'pushing' factors of participants. The authors used focused research questions to reflect the research's purpose and four evaluative criteria--'truth value', 'applicability', 'consistency' and 'neutrality'--to ensure rigour. This paper provides an example of how methodological triangulation can be used in nursing research. It identifies challenges associated with methodological triangulation, recommends strategies for overcoming them, provides a rationale for using triangulation and explains how to maintain rigour. Methodological triangulation can be used to enhance the analysis and the interpretation of findings. As data are drawn from multiple sources, it broadens the researcher's insight into the different issues underlying the phenomena being studied.
Carrol, N V; Gagon, J P
1983-01-01
Because of increasing competition, it is becoming more important that health care providers pursue consumer-based market segmentation strategies. This paper presents a methodology for identifying and describing consumer segments in health service markets, and demonstrates the use of the methodology by presenting a study of consumer segments in the ambulatory care pharmacy market.
ERIC Educational Resources Information Center
Serafin, Ana Gil
This study examined the application of the Basic Direct Instruction Model (BDIM), a methodology designed to maximize student interest in instrumental and methodological courses, to graduate level educational leadership students. The research used qualitative techniques and a participatory approach with a sample of 92 beginning level Masters…
Reliability Centered Maintenance - Methodologies
NASA Technical Reports Server (NTRS)
Kammerer, Catherine C.
2009-01-01
Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.
Nicolay, C R; Purkayastha, S; Greenhalgh, A; Benn, J; Chaturvedi, S; Phillips, N; Darzi, A
2012-03-01
The demand for the highest-quality patient care coupled with pressure on funding has led to the increasing use of quality improvement (QI) methodologies from the manufacturing industry. The aim of this systematic review was to identify and evaluate the application and effectiveness of these QI methodologies to the field of surgery. MEDLINE, the Cochrane Database, Allied and Complementary Medicine Database, British Nursing Index, Cumulative Index to Nursing and Allied Health Literature, Embase, Health Business(™) Elite, the Health Management Information Consortium and PsycINFO(®) were searched according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses statement. Empirical studies were included that implemented a described QI methodology to surgical care and analysed a named outcome statistically. Some 34 of 1595 articles identified met the inclusion criteria after consensus from two independent investigators. Nine studies described continuous quality improvement (CQI), five Six Sigma, five total quality management (TQM), five plan-do-study-act (PDSA) or plan-do-check-act (PDCA) cycles, five statistical process control (SPC) or statistical quality control (SQC), four Lean and one Lean Six Sigma; 20 of the studies were undertaken in the USA. The most common aims were to reduce complications or improve outcomes (11), to reduce infection (7), and to reduce theatre delays (7). There was one randomized controlled trial. QI methodologies from industry can have significant effects on improving surgical care, from reducing infection rates to increasing operating room efficiency. The evidence is generally of suboptimal quality, and rigorous randomized multicentre studies are needed to bring evidence-based management into the same league as evidence-based medicine. Copyright © 2011 British Journal of Surgery Society Ltd. Published by John Wiley & Sons, Ltd.
Discrete and continuous dynamics modeling of a mass moving on a flexible structure
NASA Technical Reports Server (NTRS)
Herman, Deborah Ann
1992-01-01
A general discrete methodology for modeling the dynamics of a mass that moves on the surface of a flexible structure is developed. This problem was motivated by the Space Station/Mobile Transporter system. A model reduction approach is developed to make the methodology applicable to large structural systems. To validate the discrete methodology, continuous formulations are also developed. Three different systems are examined: (1) simply-supported beam, (2) free-free beam, and (3) free-free beam with two points of contact between the mass and the flexible beam. In addition to validating the methodology, parametric studies were performed to examine how the system's physical properties affect its dynamics.
González, Martín Maximino León
2009-10-01
With the purpose to analyze the health strategic planning model based on determinants experienced in the municipality of Campo Bom, Rio Grande do Sul State, it was conducted an observational, qualitative study, of documental analysis as well as an evaluation of new process technologies in local health administration. This study contains an analysis of the methodological coherency and applicability of this model, based on the revision of the elaborated plans. The plans presented at Campo Bom case shows the possibility of integration and applicability at local level, of a health strategic planning model oriented to the new health concepts considering elements of different theoretical developments that enables the response to the most common local needs and situations. It was identified evolutional stages of health planning and analyzed integrative elements of the model and limitations of its application, pointing to the need of support the deepening on the study and the development of the field.
Reliability modelling and analysis of thermal MEMS
NASA Astrophysics Data System (ADS)
Muratet, Sylvaine; Lavu, Srikanth; Fourniols, Jean-Yves; Bell, George; Desmulliez, Marc P. Y.
2006-04-01
This paper presents a MEMS reliability study methodology based on the novel concept of 'virtual prototyping'. This methodology can be used for the development of reliable sensors and actuators, and also to characterize their behaviour under specific use conditions and applications. The methodology is demonstrated on the U-shaped micro electrothermal actuator used as a test vehicle. To demonstrate this approach, a 'virtual prototype' was developed with the modeling tools MATLAB and VHDL-AMS. A best-practice FMEA (Failure Mode and Effect Analysis) is applied to the thermal MEMS to investigate and assess the failure mechanisms. The reliability study is performed by injecting the identified faults into the 'virtual prototype'. The reliability characterization methodology predicts the evolution of the behaviour of these MEMS as a function of the number of operating cycles and specific operational conditions.
Implications for Application of Qualitative Methods to Library and Information Science Research.
ERIC Educational Resources Information Center
Grover, Robert; Glazier, Jack
1985-01-01
Presents conceptual framework for library and information science research and analyzes research methodology that has application for information science, using as example results of study conducted by authors. Rationale for use of qualitative research methods in theory building is discussed and qualitative and quantitative research methods are…
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
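As a rough illustration of the PFA idea (not the documented NASA software or its models), the following Monte Carlo sketch propagates assumed parameter uncertainties through a conventional engineering model, here a hypothetical S-N fatigue curve, to estimate a failure probability for one failure mode.

```python
# Minimal PFA-style sketch: uncertain inputs through an S-N fatigue model.
# All distributions and parameter values are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
stress = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)   # MPa, assumed
A      = rng.lognormal(mean=np.log(1e12), sigma=0.30, size=n)    # S-N coefficient, assumed
m      = 3.0                                                      # S-N exponent, assumed

life = A / stress**m            # cycles to failure from the S-N model
mission = 10_000.0              # required mission life in cycles (assumed)
p_fail = np.mean(life < mission)
print(f"estimated failure probability for this mode: {p_fail:.2e}")
```

In the full PFA structure these prior distributions would then be updated with test and flight experience; the sketch stops at the first step.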
ERIC Educational Resources Information Center
Medina-Dominguez, Fuensanta; Sanchez-Segura, Maria-Isabel; Mora-Soto, Arturo; Amescua, Antonio
2010-01-01
The development of collaborative Web applications does not follow a software engineering methodology. This is because when university students study Web applications in general, and collaborative Web portals in particular, they are not being trained in the use of software engineering techniques to develop collaborative Web portals. This paper…
ERIC Educational Resources Information Center
Tekinarslan, Erkan
2013-01-01
The purpose of this study is to investigate the effects of screencasts on Turkish undergraduate students' achievement and knowledge acquisition in spreadsheet applications. The methodology of the study is based on a pretest-posttest experimental design with a control group. A total of 66 undergraduate students in two groups (n = 33 in…
Frosini, Francesco; Miniati, Roberto; Grillone, Saverio; Dori, Fabrizio; Gentili, Guido Biffi; Belardinelli, Andrea
2016-11-14
This study proposes and tests an integrated methodology combining Health Technology Assessment (HTA) and Failure Modes, Effects and Criticality Analysis (FMECA) to assess specific aspects of robotic surgery involving safety, process and technology. The integrated methodology applies specific techniques from HTA together with the most typical models from reliability engineering, such as FMEA/FMECA. The study also included on-site data collection and interviews with medical personnel. The total number of robotic procedures included in the analysis was 44: 28 in urology and 16 in general surgery. The main outcomes concern the comparative evaluation of robotic, laparoscopic and open surgery; the risk analysis and mitigation interventions derive from the FMECA application. The small sample size available for the study represents an important bias, especially for the reliability of the clinical outcomes. Despite this, the study appears to confirm the more favourable trend in surgical times for robotics compared with the open technique, as well as the clinical benefits of robotics in urology. The situation in general surgery is more complex, where the only directly measured clinical benefit of robotics is the lowest blood transfusion rate.
NASA Astrophysics Data System (ADS)
Çakır, Süleyman
2017-10-01
In this study, a two-phase methodology for resource allocation problems in a fuzzy environment is proposed. In the first phase, the imprecise Shannon's entropy method and the acceptability index are suggested, for the first time in the literature, to select the input and output variables to be used in the data envelopment analysis (DEA) application. In the second phase, an interval inverse DEA model is executed for resource allocation in the short run. To exemplify the practicality of the proposed fuzzy model, a real case application was conducted involving 16 cement firms listed on Borsa Istanbul. The results of the case application indicated that the proposed hybrid model is a viable procedure for handling input-output selection and resource allocation problems under fuzzy conditions. The presented methodology can also lend itself to different applications, such as multi-criteria decision-making problems.
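For orientation, the sketch below computes plain CCR DEA efficiency scores with a linear-programming solver; it is a simplified stand-in for the paper's interval inverse DEA (the entropy-based variable selection and interval extensions are not reproduced), and the firms' input-output data are invented.

```python
# Standard input-oriented CCR DEA (multiplier form) via scipy's linprog.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])  # inputs per firm (invented)
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                      # outputs per firm (invented)
n, m = X.shape
s = Y.shape[1]

for o in range(n):
    c = np.concatenate([-Y[o], np.zeros(m)])            # maximise u.y_o (linprog minimises)
    A_ub = np.hstack([Y, -X])                           # u.y_j - v.x_j <= 0 for every firm j
    A_eq = np.concatenate([np.zeros(s), X[o]])[None, :] # normalisation v.x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0])
    print(f"firm {o}: CCR efficiency = {-res.fun:.3f}")
```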
Zhang, Melvyn W B; Ho, Cyrus S H; Ho, Roger C M
2014-01-01
Smartphone and smartphone-application use has become markedly more prevalent over the recent decade. Previous research has highlighted the lack of critical appraisal of new applications, and has described a method of creating an application using just an Internet browser and a text editor, although this does not eliminate the challenges faced by clinicians. Moreover, despite the high rate of smartphone application usage and acceptance, it is well recognized that developing smartphone applications catered to clinicians' needs, and helpful for their daily educational needs, would cost clinicians and their centers a great deal. The objectives of the current research are thus to highlight a cost-effective methodology for developing interactive educational smartphone applications, and to determine whether medical students are receptive to having smartphone applications, together with their perspectives on the contents within. We elaborate how the Mastering Psychiatry Online Portal and web-based mobile application were developed using HTML5 as the core programming language. The online portal and web-based application were launched in July 2012 and usage data were obtained. Subsequently, a native application was developed, funded by an educational grant, and students were recruited after their end-of-posting clinical examination to fill in a survey questionnaire on their perspectives. Since inception, the online portal has received a total of 15,803 views, with 2,109 copies of the online textbook downloaded; 5,895 viewers have watched the training videos from start to finish, and 722 users have accessed the mobile textbook application. A total of 185 students participated in the perspectives survey, with the majority holding positive views about the implementation of a smartphone application in psychiatry. This is one of the few studies that describe how an educational application can be developed using a simple and cost-effective methodology, and it also demonstrates students' perspectives on smartphones in psychiatric education. Our methods might apply to future research involving the use of technology in education.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lala, J.H.; Nagle, G.A.; Harper, R.E.
1993-05-01
The Maglev control computer system should be designed to verifiably possess high reliability and safety as well as high availability to make Maglev a dependable and attractive transportation alternative to the public. A Maglev control computer system has been designed using a design-for-validation methodology developed earlier under NASA and SDIO sponsorship for real-time aerospace applications. The present study starts by defining the maglev mission scenario and ends with the definition of a maglev control computer architecture. Key intermediate steps included definitions of functional and dependability requirements, synthesis of two candidate architectures, development of qualitative and quantitative evaluation criteria, and analytical modeling of the dependability characteristics of the two architectures. Finally, the applicability of the design-for-validation methodology was also illustrated by applying it to the German Transrapid TR07 maglev control system.
Application of ion chromatography in pharmaceutical and drug analysis.
Jenke, Dennis
2011-08-01
Ion chromatography (IC) has developed and matured into an important analytical methodology in a number of diverse applications and industries, including pharmaceuticals. This manuscript provides a review of IC applications for the determinations of active and inactive ingredients, excipients, degradation products, and impurities relevant to pharmaceutical analyses and thus serves as a resource for investigators looking for insights into the use of the IC methodology in this field of application.
Ancient DNA studies: new perspectives on old samples
2012-01-01
In spite of past controversies, the field of ancient DNA is now a reliable research area, thanks to recent methodological improvements. A series of recent large-scale studies have revealed the true potential of ancient DNA samples for studying the processes of evolution and for testing models and assumptions commonly used to reconstruct patterns of evolution and to analyze population genetics and palaeoecological changes. Recent advances in DNA technologies, such as next-generation sequencing, make it possible to recover DNA information from archaeological and paleontological remains, allowing us to go back in time and study the genetic relationships between extinct organisms and their contemporary relatives. With next-generation sequencing methodologies, DNA sequences can be retrieved even from samples (for example, human remains) for which the technical pitfalls of classical methodologies required stringent criteria to guarantee the reliability of the results. In this paper, we review the methodologies applied to ancient DNA analysis and the perspectives that next-generation sequencing applications provide in this field. PMID:22697611
Methodology of management of dredging operations II. Applications.
Junqua, G; Abriak, N E; Gregoire, P; Dubois, V; Mac Farlane, F; Damidot, D
2006-04-01
This paper presents a new methodology for the management of dredging operations. Derived in part from existing methodologies (OECD, PNUE, AIPCN), it aims to be more comprehensive, combining the qualities and complementarities of the previous methodologies. The methodology was applied at the Port of Dunkirk (France). A characterization of the sediments of this site allowed the Port to be divided into zones of probable sediment homogeneity. Moreover, sources of pollution were identified, with a view to prevention. Ways of adding value to the dredged waste were also developed to meet regional needs, from the standpoint of competitive and territorial intelligence. Their development required a pooling of resources among professionals, research centres and local communities, according to the principles of industrial ecology. Lastly, a MultiCriteria Decision-Making Aid (MCDMA) tool was used to determine the most relevant scenario (or alternative, or action) for a dredging operation planned by the Port of Dunkirk. These applications confirmed the relevance of this methodology for the management of dredging operations.
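As a rough stand-in for the MCDMA step (the abstract does not specify the exact method), the following weighted-sum sketch ranks hypothetical dredging scenarios; the criteria, weights, and scores are all invented for illustration.

```python
# Weighted-sum multicriteria ranking of candidate dredging scenarios.
import numpy as np

scenarios = ["offshore disposal", "quay backfill reuse", "brick-making reuse"]
criteria  = {"cost": 0.4, "environmental impact": 0.35, "regional benefit": 0.25}
# normalised scores in [0, 1], higher is better: rows = scenarios, cols = criteria
scores = np.array([[0.8, 0.3, 0.1],
                   [0.5, 0.6, 0.7],
                   [0.3, 0.7, 0.9]])

weights = np.array(list(criteria.values()))
totals = scores @ weights
for name, total in sorted(zip(scenarios, totals), key=lambda p: -p[1]):
    print(f"{name}: {total:.2f}")
```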
NASA Astrophysics Data System (ADS)
Aguilar Cisneros, Jorge; Vargas Martinez, Hector; Pedroza Melendez, Alejandro; Alonso Arevalo, Miguel
2013-09-01
Mexico is a country that is just beginning to build experience in software for satellite applications. This is a delicate situation because, in the near future, we will need to develop software for SATEX-II (the Mexican Experimental Satellite), a project of SOMECyTA (the Mexican Society of Aerospace Science and Technology). We have experience applying software development methodologies, such as TSP (Team Software Process) and Scrum, in other areas. We analyzed these methodologies and concluded that they can be applied to develop the software for SATEX-II, supported by the ESA PSS-05-0 standard, in particular ESA PSS-05-11. Our analysis focused on the main characteristics of each methodology and how these methodologies could be used together with the ESA PSS-05-0 standard. Our outcomes may be used, in general, by teams who need to build small satellites; in particular, they will be used when we build the on-board software applications for SATEX-II.
Rao, Ravella Sreenivas; Kumar, C Ganesh; Prakasham, R Shetty; Hobbs, Phil J
2008-04-01
Success in experiments and/or technology mainly depends on a properly designed process or product. The traditional method of process optimization involves the study of one variable at a time, which requires a number of combinations of experiments that are time, cost and labor intensive. The Taguchi method of design of experiments is a simple statistical tool involving a system of tabulated designs (arrays) that allows a maximum number of main effects to be estimated in an unbiased (orthogonal) fashion with a minimum number of experimental runs. It has been applied to predict the significant contribution of the design variable(s) and the optimum combination of each variable by conducting experiments on a real-time basis. The modeling that is performed essentially relates signal-to-noise ratio to the control variables in a 'main effect only' approach. This approach enables both multiple response and dynamic problems to be studied by handling noise factors. Taguchi principles and concepts have made extensive contributions to industry by bringing focused awareness to robustness, noise and quality. This methodology has been widely applied in many industrial sectors; however, its application in biological sciences has been limited. In the present review, the application and comparison of the Taguchi methodology has been emphasized with specific case studies in the field of biotechnology, particularly in diverse areas like fermentation, food processing, molecular biology, wastewater treatment and bioremediation.
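To make the mechanics concrete, the sketch below runs a textbook Taguchi analysis: an L4(2^3) orthogonal array, a larger-the-better signal-to-noise ratio, and per-factor main effects. The replicated responses are invented, not drawn from the case studies reviewed.

```python
# Taguchi L4(2^3) design with larger-the-better S/N ratio and main effects.
import numpy as np

L4 = np.array([[0, 0, 0],      # factor levels (A, B, C) for each of 4 runs
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])
# two replicated responses (e.g. fermentation yield) per run, invented values
y = np.array([[72, 75], [80, 78], [65, 68], [90, 88]], dtype=float)

# larger-the-better S/N ratio: -10 log10( mean(1 / y^2) )
sn = -10 * np.log10(np.mean(1.0 / y**2, axis=1))

for f, name in enumerate("ABC"):
    for level in (0, 1):
        effect = sn[L4[:, f] == level].mean()   # main effect of this level
        print(f"factor {name}, level {level}: mean S/N = {effect:.2f} dB")
```

The level with the higher mean S/N for each factor is taken as the optimum combination; the orthogonality of the array is what lets these main effects be estimated without bias from a handful of runs.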
Application of Risk-Based Inspection method for gas compressor station
NASA Astrophysics Data System (ADS)
Zhang, Meng; Liang, Wei; Qiu, Zeyang; Lin, Yang
2017-05-01
Because of its complex process and large amount of equipment, a gas compressor station carries inherent risks, and research on the integrity management of gas compressor stations is currently insufficient. In this paper, the basic principle of Risk-Based Inspection (RBI) and the RBI methodology are studied, and the RBI process for a gas compressor station is developed. The corrosion loops and logistics loops of the gas compressor station are determined through study of its corrosion mechanisms and process. The probability of failure is calculated using modified coefficients, and the consequence of failure is calculated by a quantitative method. In particular, we address the application of the RBI methodology in a gas compressor station; the resulting risk ranking helps to find the best preventive inspection plan in the case study.
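A minimal sketch of the ranking step is shown below: risk is taken as probability of failure times consequence of failure on 1-5 scales and mapped to a qualitative category. The equipment items, scores, and category thresholds are assumptions, not values from the case study.

```python
# RBI-style ranking: risk = PoF x CoF on a 5x5 matrix, then sort.
pof = {"compressor": 3, "inlet scrubber": 2, "discharge cooler": 4}   # 1..5, assumed
cof = {"compressor": 5, "inlet scrubber": 3, "discharge cooler": 2}   # 1..5, assumed

def category(risk):
    # thresholds on the 5x5 matrix are an assumption for illustration
    return "high" if risk >= 15 else "medium" if risk >= 6 else "low"

for item in sorted(pof, key=lambda k: pof[k] * cof[k], reverse=True):
    risk = pof[item] * cof[item]
    print(f"{item}: PoF={pof[item]}, CoF={cof[item]}, risk={risk} ({category(risk)})")
```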
Sign Language Studies with Chimpanzees and Children.
ERIC Educational Resources Information Center
Van Cantfort, Thomas E.; Rimpau, James B.
1982-01-01
Reviews methodologies of sign language studies with chimpanzees and compares major findings of those studies with studies of human children. Considers relevance of input conditions for language acquisition, evidence used to demonstrate linguistic achievements, and application of rigorous testing procedures in developmental psycholinguistics.…
Generating social impact scenarios: A key step in making technology assessment studies
NASA Technical Reports Server (NTRS)
Jones, M. V.
1975-01-01
The MITRE methodological studies were conducted to define relevant questions in relation to the concept of total impact analysis and to provide a procedure for integrating diverse checklists of questions which trace the initial and secondary impacts of any major technological application or of society's attempts to respond to or redirect that application. Some of the results of that study are presented in tabular form.
NASA Astrophysics Data System (ADS)
Johannes, Bernd; Salnitski, Vyacheslav; Soll, Henning; Rauch, Melina; Hoermann, Hans-Juergen
For the evaluation of an operator's skill reliability, indicators of work quality as well as of psychophysiological state during the work have to be considered. The methodology and measurement equipment presented here were developed and tested in numerous terrestrial and space experiments using a simulation of a spacecraft docking on a space station. In this study, however, the method was applied to a comparable terrestrial task: the flight simulator test (FST) used in the DLR selection procedure for ab initio pilot applicants for passenger airlines. This provided a large amount of data for statistical verification of the space methodology. For the evaluation of the strain level of applicants during the FST, psychophysiological measurements were used to construct a "psychophysiological arousal vector" (PAV), which is sensitive to various individual reaction patterns of the autonomic nervous system to mental load; its changes and increases are interpreted as "strain". In the first evaluation study, 614 subjects were analyzed. The subjects first underwent a calibration procedure for the assessment of their autonomic outlet type (AOT), and on the following day they performed the FST, which included three tasks and was evaluated by instructors applying well-established and standardized rating scales. This new method may well promote a wide range of future applications in aviation and space psychology.
ERIC Educational Resources Information Center
Quilling, Mary Rintoul
The purpose of the present study is to demonstrate the utility of data analysis methodology in evaluative research relating pupil and curriculum variables to pupil achievement. Regression models which account for achievement will result from the application of the methodology to two evaluative problems--one of curriculum comparison and another…
ERIC Educational Resources Information Center
Westermeyer, Juan Carlos Briede; Ortuno, Bernabe Hernandis
2011-01-01
This study describes the application of a concurrent design methodology for new products in the context of industrial design education. The sketch has often been used as a tool of creative expression, especially in the conceptual design stage, in an intuitive way and somewhat removed from the context of the real needs that the…
Yardley, Sarah; Brosnan, Caragh; Richardson, Jane
2013-01-01
Theoretical integration is a necessary element of study design if clarification of experiential learning is to be achieved, yet there are few published examples demonstrating how this can be done. This methodological article provides a worked example of research that achieved clarification of authentic early experiences (AEEs) through a bi-directional approach to theory and data. 'Bi-directional' refers to our simultaneous use of theory to guide and interrogate empirical data and of empirical data to refine theory. We explain the five steps of our methodological approach: (1) understanding the context; (2) critique of existing applications of socio-cultural models to inform study design; (3) data generation; (4) analysis and interpretation; and (5) theoretical development through a novel application of Metis. These steps resulted in an understanding of how and why different outcomes arose from students participating in AEEs. Our approach offers a mechanism for clarification without which evidence-based, effective ways to maximise constructive learning cannot be developed. In our example it also contributed to greater theoretical understanding of the influence of social interactions. By sharing this example of research undertaken to develop both theory and educational practice, we hope to assist others seeking to conduct similar research.
Non-linear forecasting in high-frequency financial time series
NASA Astrophysics Data System (ADS)
Strozzi, F.; Zaldívar, J. M.
2005-08-01
A new methodology based on state-space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested using 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis, which states that no profitable information about future movements can be obtained by studying the past price series. In our (off-line) analysis, a positive gain was obtained in all of these series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.
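The sketch below illustrates the underlying idea: one-step forecasting from a delay-coordinate (state-space) reconstruction with a nearest-neighbour predictor. The embedding dimension, delay, and test series are assumptions, and the authors' trading rules are not reproduced.

```python
# One-step nearest-neighbour forecast from a Takens delay embedding.
import numpy as np

def nn_forecast(series, dim=3, tau=1):
    n = len(series) - (dim - 1) * tau          # number of delay vectors
    emb = np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])
    query = emb[-1]                            # current reconstructed state
    dist = np.linalg.norm(emb[:-1] - query, axis=1)
    j = int(np.argmin(dist))                   # nearest past neighbour
    return series[j + (dim - 1) * tau + 1]     # its successor is the forecast

rng = np.random.default_rng(1)
x = np.sin(np.linspace(0, 40, 400)) + 0.05 * rng.standard_normal(400)
print(f"one-step-ahead forecast: {nn_forecast(x):.3f}")
```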
Kobayashi, Leo; Gosbee, John W; Merck, Derek L
2017-07-01
(1) To develop a clinical microsystem simulation methodology for alarm fatigue research with a human factors engineering (HFE) assessment framework and (2) to explore its application to the comparative examination of different approaches to patient monitoring and provider notification. Problems with the design, implementation, and real-world use of patient monitoring systems result in alarm fatigue. A multidisciplinary team is developing an open-source tool kit to promote bedside informatics research and mitigate alarm fatigue. Simulation, HFE, and computer science experts created a novel simulation methodology to study alarm fatigue. Featuring multiple interconnected simulated patient scenarios with scripted timeline, "distractor" patient care tasks, and triggered true and false alarms, the methodology incorporated objective metrics to assess provider and system performance. Developed materials were implemented during institutional review board-approved study sessions that assessed and compared an experimental multiparametric alerting system with a standard monitor telemetry system for subject response, use characteristics, and end-user feedback. A four-patient simulation setup featuring objective metrics for participant task-related performance and response to alarms was developed along with accompanying structured HFE assessment (questionnaire and interview) for monitor systems use testing. Two pilot and four study sessions with individual nurse subjects elicited true alarm and false alarm responses (including diversion from assigned tasks) as well as nonresponses to true alarms. In-simulation observation and subject questionnaires were used to test the experimental system's approach to suppressing false alarms and alerting providers. A novel investigative methodology applied simulation and HFE techniques to replicate and study alarm fatigue in controlled settings for systems assessment and experimental research purposes.
Using Modern Methodologies with Maintenance Software
NASA Technical Reports Server (NTRS)
Streiffert, Barbara A.; Francis, Laurie K.; Smith, Benjamin D.
2014-01-01
Jet Propulsion Laboratory uses multi-mission software produced by the Mission Planning and Sequencing (MPS) team to process, simulate, translate, and package the commands that are sent to a spacecraft. MPS works under the auspices of the Multi-Mission Ground Systems and Services (MGSS). This software consists of nineteen applications that are in maintenance, classified as either class B (mission critical) or class C (mission important). Scheduling tasks is difficult because mission needs must be addressed before any other tasks and often spring up unexpectedly; keeping track of what everyone is working on is also difficult because each person works on a different software component. Recently the group adopted Scrum, one of the newer methodologies typically used in agile development, for planning and scheduling tasks. In the Scrum development environment, teams pick the tasks to be completed within a sprint based on priority, and the team specifies the sprint length, usually a month or less. Scrum is typically used for new development of one application, and it defines a scrum master, a facilitator who tries to make sure that everything moves smoothly; a product owner, who represents the user(s) of the software; and the team. MPS is not the traditional environment for the Scrum methodology: it has many software applications in maintenance, team members working on disparate applications, many users, and work that is interruptible based on mission needs, issues, and requirements. In order to use Scrum, the methodology needed adaptation to MPS; Scrum was chosen because it is adaptable. This paper is about the development of the process for using Scrum, a new development methodology, with a team that works on disparate, interruptible tasks on multiple software applications.
Grounded theory as a method for research in speech and language therapy.
Skeat, J; Perry, A
2008-01-01
The use of qualitative methodologies in speech and language therapy has grown over the past two decades, and there is now a body of literature, both generally describing qualitative research, and detailing its applicability to health practice(s). However, there has been only limited profession-specific discussion of qualitative methodologies and their potential application to speech and language therapy. To describe the methodology of grounded theory, and to explain how it might usefully be applied to areas of speech and language research where theoretical frameworks or models are lacking. Grounded theory as a methodology for inductive theory-building from qualitative data is explained and discussed. Some differences between 'modes' of grounded theory are clarified and areas of controversy within the literature are highlighted. The past application of grounded theory to speech and language therapy, and its potential for informing research and clinical practice, are examined. This paper provides an in-depth critique of a qualitative research methodology, including an overview of the main difference between two major 'modes'. The article supports the application of a theory-building approach in the profession, which is sometimes complex to learn and apply, but worthwhile in its results. Grounded theory as a methodology has much to offer speech and language therapists and researchers. Although the majority of research and discussion around this methodology has rested within sociology and nursing, grounded theory can be applied by researchers in any field, including speech and language therapists. The benefit of the grounded theory method to researchers and practitioners lies in its application to social processes and human interactions. The resulting theory may support further research in the speech and language therapy profession.
NASA Astrophysics Data System (ADS)
Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.
2016-12-01
As critical water resources face the effects of historic drought, increased demand, and potential contamination, the need has never been greater to develop resources to effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merge topical analysis of a context-rich corpus of policy texts with multi-attributed solution sets from integrated models of water resources and other subsystems, such as mineral, food, energy, or environmental systems, to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets, with an interactive front-end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates use in a complex groundwater resources management study, highlighting multi-criteria spatial decision making and uncertainty comparisons.
Motivating Students for Project-based Learning for Application of Research Methodology Skills.
Tiwari, Ranjana; Arya, Raj Kumar; Bansal, Manoj
2017-12-01
Project-based learning (PBL) is motivational for students learning research methodology skills; it is a way to engage them and give them ownership over their own learning. The aim of this study is to use PBL for the application of research methodology skills for better learning, by encouraging an all-inclusive approach to teaching and learning rather than an individualized, tailored approach. The present study was carried out with MBBS 6th- and 7th-semester students of community medicine. Students and faculty were sensitized to PBL and the components of research methodology skills, and the students worked in small groups. The students were asked to fill in a student feedback questionnaire and the faculty a faculty feedback questionnaire; both questionnaires were assessed on a 5-point Likert scale. After the projects were submitted, document analysis was done. A total of 99 students of the 6th and 7th semester participated in PBL. About 90.91% of students agreed that PBL should be continued in subsequent batches, 73.74% felt satisfied and motivated with PBL, and 76.77% felt that they would be able to use research methodology in the near future. PBL requires considerable knowledge, effort, persistence, and self-regulation on the part of the students: they need to devise plans, gather information, and evaluate both their findings and their approach. The facilitator plays a critical role in helping students in the process by shaping opportunities for learning, guiding students' thinking, and helping them construct new understanding.
Global-local methodologies and their application to nonlinear analysis
NASA Technical Reports Server (NTRS)
Noor, Ahmed K.
1989-01-01
An assessment is made of the potential of different global-local analysis strategies for predicting the nonlinear and postbuckling responses of structures. Two postbuckling problems of composite panels are used as benchmarks and the application of different global-local methodologies to these benchmarks is outlined. The key elements of each of the global-local strategies are discussed and future research areas needed to realize the full potential of global-local methodologies are identified.
Application of Fuzzy Logic to Matrix FMECA
NASA Astrophysics Data System (ADS)
Shankar, N. Ravi; Prabhu, B. S.
2001-04-01
A methodology combining the benefits of Fuzzy Logic and Matrix FMEA is presented in this paper. The presented methodology extends risk prioritization beyond the conventional Risk Priority Number (RPN) method: fuzzy logic is used to calculate the criticality rank. The matrix approach is also improved further to develop a pictorial representation retaining all relevant qualitative and quantitative information on the relationships among several FMEA elements. The methodology is demonstrated by application to an illustrative example.
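To show what the fuzzy step can look like, the sketch below fuzzifies severity, occurrence, and detection with triangular memberships, fires a tiny illustrative rule base, and defuzzifies to a criticality rank. The memberships and rules are assumptions, not those of the paper.

```python
# Toy fuzzy criticality ranking for FMECA inputs on 1..10 scales.
def tri(x, a, b, c):
    """Triangular membership value of x for the triangle (a, b, c)."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def criticality(sev, occ, det):
    low  = lambda x: tri(x, 0, 1, 5)      # assumed membership functions
    high = lambda x: tri(x, 5, 10, 11)
    # two illustrative rules (min operator for AND), crisp rank consequents
    rules = [
        (min(high(sev), high(occ)), 9.0),  # severe and frequent -> high rank
        (min(low(sev), high(det)),  2.0),  # mild and easily detected -> low rank
    ]
    w = sum(fire for fire, _ in rules)
    # weighted-average defuzzification; fall back to mid-rank if nothing fires
    return sum(fire * rank for fire, rank in rules) / w if w else 5.0

print(f"fuzzy criticality rank: {criticality(sev=8, occ=7, det=3):.2f}")
```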
Clarke, Brydie; Swinburn, Boyd; Sacks, Gary
2016-10-13
Theories of the policy process are recommended as tools to help explain both policy stasis and change. A systematic review of the application of such theoretical frameworks within the field of obesity prevention policy was conducted. A meta-synthesis was also undertaken to identify the key influences on policy decision-making. The review identified 17 studies of obesity prevention policy underpinned by political science theories. The majority of included studies were conducted in the United States (US), with significant heterogeneity in terms of policy level (e.g., national, state) studied, areas of focus, and methodologies used. Many of the included studies were methodologically limited, in regard to rigour and trustworthiness. Prominent themes identified included the role of groups and networks, political institutions, and political system characteristics, issue framing, the use of evidence, personal values and beliefs, prevailing political ideology, and timing. The limited application of political science theories indicates a need for future theoretically based research into the complexity of policy-making and multiple influences on obesity prevention policy processes.
Maina, G; Sorasio, D; Rossi, F; Zito, D; Perrelli, E; Baracco, A
2012-01-01
Risk assessment in apiculture raises methodological problems due to the discontinuity and variability of exposure. This study analyzes a comprehensive set of potential determinants influencing biomechanical risks in apiarists, using recognized technical standards to ensure technical-scientific accuracy; it offers a simplified methodological toolkit for use in the risk assessment process and provides a user-friendly computer application. The toolkit asks the beekeeper to specify, for each month, the total number of hours worked and their distribution among different tasks; from these the application calculates the average risk index and the peak risk index. The evidence of the study indicates that some activities in this occupational area carry biomechanical risks that persist for certain tasks even when exposure time is reduced.
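A minimal sketch of the arithmetic the toolkit performs, under the assumption that each task carries a fixed biomechanical risk index: the monthly average index is the hours-weighted mean, and the peak index is the highest index among tasks actually worked. All figures are invented.

```python
# Hours-weighted average and peak biomechanical risk indices for one month.
task_risk  = {"hive inspection": 2.5, "super lifting": 6.0, "honey extraction": 4.0}
hours_june = {"hive inspection": 40, "super lifting": 25, "honey extraction": 15}

total_hours = sum(hours_june.values())
avg_index  = sum(task_risk[t] * h for t, h in hours_june.items()) / total_hours
peak_index = max(task_risk[t] for t, h in hours_june.items() if h > 0)
print(f"June: average index {avg_index:.2f}, peak index {peak_index:.1f}")
```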
A Framework for Determining the Return on Investment of Simulation-Based Training in Health Care
Bukhari, Hatim; Andreatta, Pamela; Goldiez, Brian; Rabelo, Luis
2017-01-01
This article describes a framework developed to monetize the real value of simulation-based training in health care. Significant consideration is given to incorporating the intangible and qualitative benefits, not only the tangible and quantitative benefits, of simulation-based training in health care. The framework builds from three works: the value measurement methodology (VMM) used by several departments of the US Government, a methodology documented in several books by Dr Jack Phillips to monetize various training approaches, and a traditional return on investment methodology put forth by Frost and Sullivan and Immersion Medical. All three source materials were adapted to create an integrated methodology that can be readily implemented. This article presents details on each of these methods, how they can be integrated into a single framework, and the concept and application of the framework so developed. As a test of applicability, a real case study is used to demonstrate the framework, providing real data on the correlation between pediatric patient cardiopulmonary arrest (CPA) survival rates and simulation-based mock codes at the University of Michigan tertiary care academic medical center. It is important to point out that the proposed framework offers the capability to consider a wide range of benefits and values; on the other hand, it has several limitations, which are discussed and need to be taken into consideration. PMID:28133988
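The core ROI arithmetic such a framework monetizes can be shown in a few lines; every figure below, including the monetized intangible benefit, is hypothetical and serves only to illustrate the calculation.

```python
# ROI = (benefits - costs) / costs x 100, with an intangible benefit monetized.
training_cost      = 250_000.0   # hypothetical: simulator, staff time, upkeep
tangible_benefit   = 380_000.0   # hypothetical: monetized gain from improved CPA survival
intangible_benefit = 45_000.0    # hypothetical: monetized staff confidence/retention value

net_benefit = tangible_benefit + intangible_benefit - training_cost
roi_percent = 100.0 * net_benefit / training_cost
print(f"ROI = {roi_percent:.1f}%")
```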
Corbett, Andrea M; Francis, Karen; Chapman, Ysanne
2007-04-01
Identifying a methodology to guide a study that aims to enhance service delivery can be challenging. Participatory action research offers a solution to this challenge, as it both informs and is informed by critical social theory, and using a feminist lens helps establish this approach as a suitable methodology for changing practice. This methodology embraces empowerment, self-determination, and the facilitation of agreed change as central tenets that guide the research process. Encouraged by the work of Foucault, Freire, Habermas, and Maguire, this paper explicates the philosophical assumptions underpinning critical social theory and outlines how feminist influences are complementary in exploring the processes and applications of nursing research that seeks to embrace change.
The Applicability of Course Experience Questionnaire for a Malaysian University Context
ERIC Educational Resources Information Center
Thien, Lei Mee; Ong, Mei Yean
2016-01-01
Purpose: The purpose of this study is to examine the applicability of Course Experience Questionnaire (CEQ) in a Malaysian university context. Design/methodology/approach: The CEQ was translated into Malay language using rigorous cross-cultural adaptation procedures. The Malay version CEQ was administered to 190 undergraduate students in one…
Using Sandelowski and Barroso's Meta-Synthesis Method in Advancing Qualitative Evidence.
Ludvigsen, Mette S; Hall, Elisabeth O C; Meyer, Gabriele; Fegran, Liv; Aagaard, Hanne; Uhrenfeldt, Lisbeth
2016-02-01
The purpose of this article was to iteratively account for and discuss the handling of methodological challenges in two qualitative research syntheses concerning patients' experiences of hospital transition. We applied Sandelowski and Barroso's guidelines for synthesizing qualitative research, and to our knowledge, this is the first time researchers discuss their methodological steps. In the process, we identified a need for prolonged discussions to determine mutual understandings of the methodology. We discussed how to identify the appropriate qualitative research literature and how to best conduct exhaustive literature searches on our target phenomena. Another finding concerned our status as third-order interpreters of participants' experiences and what this meant for synthesizing the primary findings. Finally, we discussed whether our studies could be classified as metasummaries or metasyntheses. Although we have some concerns regarding the applicability of the methodology, we conclude that following Sandelowski and Barroso's guidelines contributed to valid syntheses of our studies. © The Author(s) 2015.
Marzocchini, Manrico; Tatàno, Fabio; Moretti, Michela Simona; Antinori, Caterina; Orilisi, Stefano
2018-06-05
A possible approach for determining soil and groundwater quality criteria for contaminated sites is comparative risk assessment. Originating from, but not limited to, Italian interest in a decentralised (regional) implementation of comparative risk assessment, this paper first proposes an original methodology called CORIAN REG-M, created with initial attention to the context of potentially contaminated sites in the Marche Region (Central Italy). To deepen the technical-scientific knowledge and applicability of comparative risk assessment, the following characteristics of the CORIAN REG-M methodology appear relevant: the simplified but logical assumption of three categories of factors (source and transfer/transport of potential contamination, and impacted receptors) within each exposure pathway; the adaptation to the quality and quantity of data that are available or derivable at the given scale of concern; the attention to reliable but unsophisticated modelling; the achievement of a conceptual linkage to the absolute risk assessment approach; and the potential for easy updating and/or refining of the methodology. Further, application of the CORIAN REG-M methodology to case-study sites located in the Marche Region indicated the following: a positive correlation can be expected between air and direct-contact pathway scores, as well as between individual pathway scores and the overall site scores based on a root-mean-square algorithm; the exposure pathway presenting the highest variability of scores tends to be dominant at sites with the highest computed overall site scores; and the adoption of a root-mean-square algorithm can be expected to emphasise the dominant pathway in the overall site scoring.
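The root-mean-square aggregation mentioned above can be illustrated in a few lines; the pathway scores are invented, and the comparison with the arithmetic mean shows why the RMS lets a dominant pathway pull the overall site score upward.

```python
# Overall site score as the root-mean-square of per-pathway scores.
import numpy as np

pathways = {"groundwater": 62.0, "air": 38.0, "direct contact": 25.0}  # invented
scores = np.array(list(pathways.values()))

overall_rms  = np.sqrt(np.mean(scores**2))
overall_mean = scores.mean()
print(f"RMS site score {overall_rms:.1f} vs arithmetic mean {overall_mean:.1f}")
```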
Additive Manufacturing in Production: A Study Case Applying Technical Requirements
NASA Astrophysics Data System (ADS)
Ituarte, Iñigo Flores; Coatanea, Eric; Salmi, Mika; Tuomi, Jukka; Partanen, Jouni
Additive manufacturing (AM) is expanding manufacturing capabilities. However, the quality of AM-produced parts depends on a number of machine, geometry and process parameters, and the variability of these parameters affects manufacturing drastically; standardized processes and harmonized methodologies therefore need to be developed to characterize the technology for end-use applications and enable it for manufacturing. This research proposes a composite methodology integrating Taguchi Design of Experiments, multi-objective optimization and statistical process control to optimize the manufacturing process and fulfil multiple requirements imposed on an arbitrary geometry. The proposed methodology aims to characterize AM technology as a function of manufacturing process variables, as well as to perform a comparative assessment of three AM technologies (Selective Laser Sintering, Laser Stereolithography and Polyjet). Results indicate that only one machine, laser-based Stereolithography, could simultaneously fulfil the macro- and micro-level geometrical requirements, but its mechanical properties were not at the required level. Future research will study a single AM system at a time to characterize the machine's technical capabilities and stimulate pre-normative initiatives for the technology in end-use applications.
Molinos-Senante, María; Maziotis, Alexandros
2018-05-01
The water industry exhibits several structures across different countries and also within countries. Hence, several studies have been conducted to evaluate the presence of economies of scope and scale in the water industry, leading to inconclusive results; the lack of a common methodology has been identified as an important factor contributing to the divergent conclusions. This paper evaluates, for the first time, the presence of economies of scale and scope in the water industry using a flexible technology approach, integrating operational and exogenous variables of the water companies in the cost functions. The empirical application, carried out for the English and Welsh water industry, evidenced that the inclusion of exogenous variables accounts for significant differences in economies of scale and scope. Moreover, completely different results were obtained when the economies of scale and scope were estimated using common and flexible technology methodological approaches. The findings of this study reveal the importance of using an appropriate methodology to support policy decision-making processes that promote sustainable urban water activities.
NASA Astrophysics Data System (ADS)
Galan, Berta; Muñoz, Iciar; Viguri, Javier R.
2016-09-01
This paper describes the planning, the teaching activities and the evaluation of the learning and teaching process implemented in the Chemical Process Design course at the University of Cantabria, Spain. Educational methods to address the knowledge, skills and attitudes that students completing the course are expected to acquire are proposed and discussed. Undergraduate and graduate engineers' perceptions of the methodology used are evaluated by means of a questionnaire. The results of the teaching activities and the strengths and weaknesses of the proposed case study are discussed in relation to the course characteristics. The findings of the empirical evaluation show that the excessive time students had to dedicate to the case-study project and dealing with limited information were the most negative aspects, whereas an increase in the students' self-confidence and the practical application of the methodology were the most positive. Finally, improvements are discussed in order to extend the application of the methodology to other courses in the chemical engineering degree.
42 CFR 67.15 - Peer review of applications.
Code of Federal Regulations, 2010 CFR
2010-10-01
...; (viii) The extent to which women and minorities are adequately represented in study populations; (ix... conference, specifically the importance of the issue or problem being addressed, including methodological or...
42 CFR 67.15 - Peer review of applications.
Code of Federal Regulations, 2012 CFR
2012-10-01
...; (viii) The extent to which women and minorities are adequately represented in study populations; (ix... conference, specifically the importance of the issue or problem being addressed, including methodological or...
42 CFR 67.15 - Peer review of applications.
Code of Federal Regulations, 2011 CFR
2011-10-01
...; (viii) The extent to which women and minorities are adequately represented in study populations; (ix... conference, specifically the importance of the issue or problem being addressed, including methodological or...
42 CFR 67.15 - Peer review of applications.
Code of Federal Regulations, 2014 CFR
2014-10-01
...; (viii) The extent to which women and minorities are adequately represented in study populations; (ix... conference, specifically the importance of the issue or problem being addressed, including methodological or...
42 CFR 67.15 - Peer review of applications.
Code of Federal Regulations, 2013 CFR
2013-10-01
...; (viii) The extent to which women and minorities are adequately represented in study populations; (ix... conference, specifically the importance of the issue or problem being addressed, including methodological or...
Methodology Application: Logistic Regression Using the CODES Data
DOT National Transportation Integrated Search
1996-09-06
Congress directed the Secretary of Transportation, through the Intermodal Surface Transportation Efficiency Act (ISTEA) of 1991, to carry out a study or studies to determine the impact of safety belt and motorcycle helmet use. In order to carry...
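As an illustration of the methodology named in the title (not the report's actual analysis), the sketch below fits a logistic regression of severe injury on belt use and speed using simulated crash records; the data and coefficients are invented.

```python
# Logistic regression of severe injury on belt use, with simulated records.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000
belted = rng.integers(0, 2, n)                        # 1 = safety belt used
speed  = rng.normal(55, 10, n)                        # crash speed, mph
logit  = -3.0 - 1.2 * belted + 0.04 * (speed - 55)    # assumed true model
severe = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([belted, speed]))
fit = sm.Logit(severe, X).fit(disp=0)
print("odds ratio for belt use:", np.exp(fit.params[1]).round(2))  # ~ exp(-1.2)
```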
Dekant, Wolfgang; Bridges, James
2016-11-01
Quantitative weight-of-evidence (QWoE) methodology utilizes detailed scoring sheets to assess the quality/reliability of each publication on the toxicity of a chemical and gives numerical scores for quality and observed toxicity. This QWoE methodology was applied to the reproductive toxicity data on diisononylphthalate (DINP), di-n-hexylphthalate (DnHP), and dicyclohexylphthalate (DCHP) to determine whether the scientific evidence for adverse effects meets the requirements for classification as reproductive toxicants. The scores for DINP were compared with those obtained when applying the methodology to DCHP and DnHP, which have harmonized classifications. Based on the quality/reliability scores, application of the QWoE shows that the three databases are of similar quality, but the effect scores differ widely. Application of QWoE to the DINP studies resulted in an overall score well below the benchmark required to trigger classification; for DCHP, the QWoE also results in low scores. The high scores from application of the QWoE methodology to the toxicological data for DnHP represent clear evidence for adverse effects and justify a classification of DnHP as category 1B for both development and fertility. The conclusions on classification based on the QWoE are well supported by a narrative assessment of consistency and biological plausibility. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
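A minimal sketch of the QWoE arithmetic, under stated assumptions: each publication receives a quality score and an effect score, the evidence score is the quality-weighted mean effect, and classification is triggered above a benchmark. All numbers are invented; the published scoring sheets are considerably more detailed.

```python
# Quality-weighted evidence score compared with a classification benchmark.
studies = [  # (quality 0..4, effect 0..4) per publication, hypothetical
    (3.5, 0.5), (3.0, 1.0), (2.5, 0.0), (3.8, 0.8),
]
weighted = sum(q * e for q, e in studies) / sum(q for q, _ in studies)
BENCHMARK = 2.0   # assumed threshold triggering classification
verdict = "meets" if weighted >= BENCHMARK else "does not meet"
print(f"weighted effect score {weighted:.2f} {verdict} the benchmark")
```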
42 CFR 436.601 - Application of financial eligibility methodologies.
Code of Federal Regulations, 2010 CFR
2010-10-01
... VIRGIN ISLANDS General Financial Eligibility Requirements and Options § 436.601 Application of financial... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...
Brunner, Emanuel; De Herdt, Amber; Minguet, Philippe; Baldew, Se-Sergio; Probst, Michel
2013-01-01
The primary purpose was to identify randomized controlled trials investigating cognitive behaviour therapy-based (CBT) treatments applied in acute/sub-acute low back pain (LBP). The secondary purpose was to analyse the methodological properties of the included studies and to identify theory-based treatment strategies that physiotherapists can apply. A systematic literature search was conducted using four databases. The risk of bias of the included studies was assessed and their methodological properties summarized. In addition, the content and treatment theory of the detected CBT-based strategies were systematically analysed and classified into three distinctive concepts of CBT: operant, cognitive and respondent treatment. Finally, the applicability of the treatment strategies in physiotherapy practice was discussed. Eight studies were included in the present systematic review. Half of the studies suffered from a high risk of bias, and study characteristics varied in all domains of methodology, particularly in terms of treatment design and outcome measures. Graded activity, an operant treatment approach based on the principles of operant conditioning, was identified as a CBT-based strategy with traceable theoretical justification that can be applied by physiotherapists. Operant conditioning can be integrated into ambulatory physiotherapy practice and is a promising CBT-based strategy for the prevention of chronic LBP.
Frederiksen, Kirsten; Lomborg, Kirsten; Beedholm, Kirsten
2015-09-01
This study takes its point of departure in an oft-voiced critique that the French philosopher Michel Foucault gives discourse priority over practice, thereby being deterministic and leaving little space for the individual to act as an agent. Based on an interpretation of the latter part of Foucault's oeuvre, we argue against this critique and provide a methodological discussion of the perception that Foucault's method constitutes, primarily, discourse analysis. We argue that it is possible to overcome this critique of Foucault's work by the application of methodological tools adapted from Foucault's later writings and his diagnosis of his own work as studies of forms of problematization. To shed light on the possibilities that this approach offers to the researcher, we present a reading of aspects of Foucault's work, with a focus on his notion of forms of problematization. Furthermore, we elaborate on concepts from his so-called genealogical period, namely 'the dispositive', strategy and tactics. Our interpretation is supported by examples from a study of the emergence of Danish nursing education, which is based on an analytical framework that we developed in the light of an interpretation of aspects of Foucault's work. © 2015 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Corces-Zimmerman, Chris; Utt, Jamie; Cabrera, Nolan L.
2017-01-01
In this response to the article by Tanner and Corrie, the authors provide three critiques of the methodology and theoretical framing of the study with the hopes of informing future scholarship and practice. Specifically, the three critiques addressed in this paper include the integration of CWS frameworks and YPAR methodology, the application and…
Methods and pitfalls of measuring thermal preference and tolerance in lizards.
Camacho, Agustín; Rusch, Travis W
2017-08-01
Understanding methodological and biological sources of bias during the measurement of thermal parameters is essential for the advancement of thermal biology. For more than a century, studies on lizards have deepened our understanding of thermal ecophysiology, employing multiple methods to measure thermal preferences and tolerances. We reviewed 129 articles concerned with measuring preferred body temperature (PBT), voluntary thermal tolerance, and critical temperatures of lizards to offer: a) an overview of the methods used to measure and report these parameters, b) a summary of the methodological and biological factors affecting thermal preference and tolerance, c) recommendations to avoid identified pitfalls, and d) directions for continued progress in our application and understanding of these thermal parameters. We emphasize the need for more methodological and comparative studies. Lastly, we urge researchers to provide more detailed methodological descriptions and suggest ways to make their raw data more informative to increase the utility of thermal biology studies. Copyright © 2017 Elsevier Ltd. All rights reserved.
Sharma, Govind K; Kumar, Anish; Jayakumar, T; Purnachandra Rao, B; Mariyappa, N
2015-03-01
A signal processing methodology is proposed in this paper for effective reconstruction of ultrasonic signals in coarse-grained, highly scattering austenitic stainless steel. The proposed methodology comprises Ensemble Empirical Mode Decomposition (EEMD) processing of ultrasonic signals and the application of a signal minimisation algorithm to selected Intrinsic Mode Functions (IMFs) obtained by EEMD. The methodology is applied to ultrasonic signals obtained from austenitic stainless steel specimens of different grain size, with and without defects. The influence of probe frequency and the data length of a signal on EEMD decomposition is also investigated. For a particular sampling rate and probe frequency, the same range of IMFs can be used to reconstruct the ultrasonic signal, irrespective of the grain size in the range of 30-210 μm investigated in this study. The methodology is successfully employed for detection of defects in 50 mm thick coarse-grained austenitic stainless steel specimens. A signal-to-noise ratio improvement of better than 15 dB is observed for the ultrasonic signal obtained from a 25 mm deep flat bottom hole in a 200 μm grain size specimen. For ultrasonic signals obtained from defects at different depths, a minimum of 7 dB additional enhancement in SNR is achieved compared with the sum-of-selected-IMFs approach. The application of the minimisation algorithm to the EEMD-processed signal in the proposed methodology proves effective for adaptive signal reconstruction with improved signal-to-noise ratio. The methodology was further employed for successful imaging of defects in a B-scan. Copyright © 2014. Published by Elsevier B.V.
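A minimal sketch of the decomposition-and-reconstruction step is given below, using the open-source PyEMD package (installed as EMD-signal). The synthetic echo, the noise level, and the choice of IMFs 2-4 are illustrative assumptions; the paper's signal minimisation algorithm is not reproduced here.

```python
# Sketch: EEMD decomposition of a noisy ultrasonic-like signal and
# reconstruction from a selected range of IMFs. Requires: pip install EMD-signal
import numpy as np
from PyEMD import EEMD

fs = 100e6                                   # sampling rate (assumed), 100 MHz
t = np.arange(0, 20e-6, 1 / fs)
# Synthetic 5 MHz echo buried in broadband grain-like noise (illustrative only)
echo = np.exp(-((t - 10e-6) ** 2) / (2 * (0.5e-6) ** 2)) * np.sin(2 * np.pi * 5e6 * t)
rng = np.random.default_rng(0)
signal = echo + 0.8 * rng.standard_normal(t.size)

eemd = EEMD(trials=50, noise_width=0.2)
imfs = eemd.eemd(signal, t)                  # rows are IMFs, highest frequency first

selected = imfs[1:4]                         # assumed band holding the echo energy
reconstructed = selected.sum(axis=0)

def snr_db(target, residual):
    return 10 * np.log10(np.sum(target ** 2) / np.sum(residual ** 2))

print("SNR before: %.1f dB" % snr_db(echo, signal - echo))
print("SNR after:  %.1f dB" % snr_db(echo, reconstructed - echo))
```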
Characterization of Damage Accumulation in a C/SiC Composite at Elevated Temperatures
NASA Technical Reports Server (NTRS)
Telesman, Jack; Verrilli, Mike; Ghosn, Louis; Kantzos, Pete
1997-01-01
This research is part of a program aimed to evaluate and demonstrate the ability of candidate CMC materials for a variety of applications in reusable launch vehicles. The life and durability of these materials in rocket and engine applications are of major concern and there is a need to develop and validate life prediction methodology. In this study, material characterization and mechanical testing were performed in order to identify the failure modes, degradation mechanisms, and progression of damage in a C/SiC composite at elevated temperatures. The motivation for this work is to provide the relevant damage information that will form the basis for the development of a physically based life prediction methodology.
NASA Technical Reports Server (NTRS)
Chen, Y.; Nguyen, D.; Guertin, S.; Berstein, J.; White, M.; Menke, R.; Kayali, S.
2003-01-01
This paper presents a reliability evaluation methodology to obtain the statistical reliability information of memory chips for space applications when the test sample size must be kept small because of the high cost of radiation-hardened memories.
An overview of key technology thrusts at Bell Helicopter Textron
NASA Technical Reports Server (NTRS)
Harse, James H.; Yen, Jing G.; Taylor, Rodney S.
1988-01-01
Insight is provided into several key technologies at Bell. Specific topics include the results of ongoing research and development in advanced rotors, methodology development, and new configurations. The discussion of advanced rotors highlights developments on the composite bearingless rotor, including the development and testing of full-scale flight hardware as well as some of the design support analyses and verification testing. The discussion of methodology development concentrates on analytical development in aeromechanics, including correlation studies and design application. The discussion of new configurations presents the results of some advanced configuration studies, including hardware development.
Development of flight experiment task requirements. Volume 1: Summary
NASA Technical Reports Server (NTRS)
Hatterick, G. R.
1972-01-01
A study was conducted to develop the means to identify skills required of scientist passengers on advanced missions related to the space shuttle and RAM programs. The scope of the study was defined to include only the activities of on-orbit personnel which are directly related to, or required by, on-orbit experimentation and scientific investigations conducted on or supported by the shuttle orbiter. A program summary is presented which provides a description of the methodology developed, an overview of the activities performed during the study, and the results obtained through application of the methodology.
Outline of cost-benefit analysis and a case study
NASA Technical Reports Server (NTRS)
Kellizy, A.
1978-01-01
The methodology of cost-benefit analysis is reviewed and a case study involving solar cell technology is presented. Emphasis is placed on simplifying the technique in order to permit a technical person not trained in economics to undertake a cost-benefit study comparing alternative approaches to a given problem. The role of economic analysis in management decision making is discussed. In simplifying the methodology it was necessary to restrict the scope and applicability of this report. Additional considerations and constraints are outlined. Examples are worked out to demonstrate the principles. A computer program which performs the computational aspects appears in the appendix.
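The computational core of such a study reduces to discounting each alternative's yearly costs and benefits. A minimal sketch follows; the discount rate and cash flows are invented for illustration and are not taken from the report.

```python
# Sketch of the core cost-benefit arithmetic: discount each alternative's
# yearly costs and benefits, then compare NPV and benefit/cost ratio.
# All figures are illustrative, not from the report.

def npv(cash_flows, rate):
    """Present value of a list of yearly amounts (year 0 first)."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

rate = 0.07  # assumed discount rate
alternatives = {
    "solar cell line A": {"costs": [100, 10, 10, 10, 10], "benefits": [0, 40, 40, 40, 40]},
    "solar cell line B": {"costs": [60, 15, 15, 15, 15],  "benefits": [0, 30, 30, 30, 30]},
}

for name, cf in alternatives.items():
    c, b = npv(cf["costs"], rate), npv(cf["benefits"], rate)
    print(f"{name}: NPV = {b - c:+.1f}, B/C = {b / c:.2f}")
```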
ERIC Educational Resources Information Center
Moore, John W., Ed.; Moore, Elizabeth A., Ed.
1977-01-01
Discusses the role of the US Food and Drug Administration (FDA) in protecting the American public from carcinogens. Describes scientific testing methodology, risk-benefit analysis and the Delaney clause with its application to saccharin. (CP)
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
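The roll-back of such a decision tree is simple expected-value arithmetic. The sketch below illustrates the idea on an invented three-alternative sterilization decision; the probabilities and (negative) values are placeholders, not the mission analysis from the paper.

```python
# Toy roll-back of a one-stage decision tree: choose a sterilization level to
# trade mission cost against contamination risk. Numbers are invented.

# Each decision alternative leads to chance outcomes: (probability, value).
tree = {
    "full sterilization":    [(0.999, -10.0), (0.001, -1000.0)],
    "partial sterilization": [(0.99, -4.0),   (0.01, -1000.0)],
    "no sterilization":      [(0.95, -1.0),   (0.05, -1000.0)],
}

def expected_value(outcomes):
    return sum(p * v for p, v in outcomes)

best = max(tree, key=lambda alt: expected_value(tree[alt]))
for alt, outcomes in tree.items():
    print(f"{alt}: EV = {expected_value(outcomes):.2f}")
print("Best strategy:", best)
```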
New methodology for fast prediction of wheel wear evolution
NASA Astrophysics Data System (ADS)
Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.
2017-07-01
In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics over time. However, one of the principal drawbacks of the existing methodologies for calculating wear evolution is their computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. The methodology is based on two main steps: the first is the substitution of calculations over the whole network by the calculation of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred. The second is the substitution of the dynamic calculation (time integration) by a quasi-static calculation (the solution of the quasi-static situation of a vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction in computational cost while maintaining an acceptable level of accuracy (errors of the order of 5-10%). Several case studies are analysed in the paper with the objective of assessing the proposed methodology. The results obtained in the case studies support the conclusion that the proposed methodology is valid for an arbitrary vehicle running through an arbitrary track layout.
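To illustrate the flavour of the second simplification, the sketch below accumulates wear from repeated quasi-static contact evaluations at a few characteristic points. An Archard-type wear law and all numeric values are assumptions standing in for the paper's wear model.

```python
# Sketch of the iterative wear-update idea: evaluate contact quasi-statically
# at a small set of characteristic points, accumulate wear, update the profile,
# and repeat. An Archard-type law stands in for the paper's model (assumption).
import numpy as np

k_wear = 1e-14         # wear coefficient, m^2/N (assumed)
profile = np.zeros(3)  # worn depth at 3 characteristic contact points (m)

def quasi_static_contact(profile):
    """Placeholder: returns (normal force N, sliding distance m) per point.
    A real implementation solves the quasi-static vehicle/contact problem."""
    base = np.array([5e4, 4e4, 6e4])
    force = base * (1 - 0.1 * profile / (profile.max() + 1e-12))
    return force, np.full(3, 1e5)

for step in range(10):                  # each step ~ one traffic interval
    N, s = quasi_static_contact(profile)
    profile += k_wear * N * s           # Archard: worn volume ~ k * N * s

print("worn depth per point (mm):", np.round(profile * 1e3, 3))
```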
Application of low-cost methodologies for mobile phone app development.
Zhang, Melvyn; Cheow, Enquan; Ho, Cyrus Sh; Ng, Beng Yeong; Ho, Roger; Cheok, Christopher Cheng Soon
2014-12-09
The usage of mobile phones and mobile phone apps in the recent decade has indeed become more prevalent. Previous research has highlighted a method of using just the Internet browser and a text editor to create an app, but this does not eliminate the challenges faced by clinicians. More recently, two methodologies of app development have been shared, but there have been no disclosures pertaining to the costs involved. In addition, limitations such as the distribution and dissemination of the apps have not been addressed. The aims of this research article are to: (1) highlight a low-cost methodology that clinicians without technical knowledge could use to develop educational apps; (2) clarify the respective costs involved in the process of development; (3) illustrate how limitations pertaining to dissemination could be addressed; and (4) report initial utilization data for the apps and share initial users' self-rated perceptions of the apps. In this study, we present two techniques for creating a mobile app using two well-established online mobile app building websites. The costs of development are specified and the methodology for dissemination of the apps is shared. The application of the low-cost methodologies in the creation of the "Mastering Psychiatry" app for undergraduates and the "Déjà vu" app for postgraduates is discussed. A questionnaire survey was administered to undergraduate students collating their perceptions of the app. For the Mastering Psychiatry app, a cumulative total of 722 users have used the mobile app since inception, based on our analytics. For the Déjà vu app, there has been a cumulative total of 154 downloads since inception. The utilization data demonstrate the receptiveness towards these apps, and this is reinforced by the positive perceptions undergraduate students (n=185) had towards the low-cost self-developed apps. This is one of the few studies that have demonstrated low-cost methodologies of app development, as well as student and trainee receptivity toward self-created Web-based mobile phone apps. The results demonstrate that these Web-based low-cost apps are applicable in real life, and suggest that the methodologies shared in this research paper might be of benefit to other specialities and disciplines.
From LCAs to simplified models: a generic methodology applied to wind power electricity.
Padey, Pierryves; Girard, Robin; le Boulch, Denis; Blanc, Isabelle
2013-02-05
This study presents a generic methodology to produce simplified models able to provide a comprehensive life cycle impact assessment of energy pathways. The methodology relies on the application of global sensitivity analysis to identify the key parameters explaining the impact variability of systems over their life cycle. Simplified models are built upon the identification of such key parameters. The methodology is applied to one energy pathway: onshore wind turbines of medium size, considering a large sample of possible configurations representative of European conditions. Among several technological, geographical, and methodological parameters, we identified the turbine load factor and the wind turbine lifetime as the most influential parameters. Greenhouse gas (GHG) performance has been plotted as a function of these identified key parameters. Using these curves, the GHG performance of a specific wind turbine can be estimated, thus avoiding the undertaking of an extensive Life Cycle Assessment (LCA). This methodology should be useful for decision makers, providing them a robust but simple support tool for assessing the environmental performance of energy systems.
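A hedged sketch of the key-parameter screening step is shown below using SALib's Saltelli sampler and Sobol analysis. The one-line GHG model (embodied emissions spread over the lifetime production of an assumed 2 MW turbine) and the parameter bounds are illustrative, not the study's LCA model.

```python
# Sketch of variance-based global sensitivity screening with SALib.
# Requires: pip install SALib numpy
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["load_factor", "lifetime_yr", "embodied_tCO2"],
    "bounds": [[0.15, 0.45], [15, 30], [1000, 2500]],  # assumed ranges
}

X = saltelli.sample(problem, 1024)

def ghg_intensity(lf, lt, embodied):
    """g CO2-eq per kWh for an assumed 2 MW turbine (illustrative model)."""
    return embodied * 1e6 / (2000.0 * 8760.0 * lf * lt)

Y = np.array([ghg_intensity(*row) for row in X])
Si = sobol.analyze(problem, Y)
for name, s1 in zip(problem["names"], Si["S1"]):
    print(f"first-order Sobol index of {name}: {s1:.2f}")
```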
Q methodology in health economics.
Baker, Rachel; Thompson, Carl; Mannion, Russell
2006-01-01
The recognition that health economists need to understand the meaning of data if they are to adequately understand research findings which challenge conventional economic theory has led to the growth of qualitative modes of enquiry in health economics. The use of qualitative methods of exploration and description alongside quantitative techniques gives rise to a number of epistemological, ontological and methodological challenges: difficulties in accounting for subjectivity in choices, the need for rigour and transparency in method, and problems of disciplinary acceptability to health economists. Q methodology is introduced as a means of overcoming some of these challenges. We argue that Q offers a means of exploring subjectivity, beliefs and values while retaining the transparency, rigour and mathematical underpinnings of quantitative techniques. The various stages of Q methodological enquiry are outlined alongside potential areas of application in health economics, before discussing the strengths and limitations of the approach. We conclude that Q methodology is a useful addition to economists' methodological armoury and one that merits further consideration and evaluation in the study of health services.
This is one of a series of reports that present methodologies for assessing the potential risks to humans or other organisms from the disposal or reuse of municipal sludge. The sludge management practices addressed by this series include land application practices, distribution a...
Measurement of Productivity and Quality in Non-Marketable Services: With Application to Schools
ERIC Educational Resources Information Center
Fare, R.; Grosskopf, S.; Forsund, F. R.; Hayes, K.; Heshmati, A.
2006-01-01
Purpose: This paper seeks to model and compute productivity, including a measure of quality, of a service which does not have marketable outputs--namely public education at the micro level. This application is a case study for Sweden public schools. Design/methodology/approach: A Malmquist productivity index is employed which allows for multiple…
ERIC Educational Resources Information Center
Martin de Lama, M. Teresa
2015-01-01
The present article intends to show the positive evaluation of post-graduate university students at a Spanish university after the curricular integration experience and the application of CLIL scaffolding techniques. It also aims to identify areas of methodological improvements and recommendations in the application of CLIL in the referred…
Using a Mobile Application to Support Children's Writing Motivation
ERIC Educational Resources Information Center
Kanala, Sari; Nousiainen, Tuula; Kankaanranta, Marja
2013-01-01
Purpose: The purpose of this paper is to explore the use of the prototype of a mobile application for the enhancement of children's motivation for writing. The results are explored from students' and experts' perspectives. Design/methodology/approach: This study is based on a field trial and expert evaluations of a prototype of a mobile…
An e-Portfolio-Based Model for the Application and Sharing of College English ESP MOOCs
ERIC Educational Resources Information Center
Chen, Jinshi
2017-01-01
The informationalized knowledge sharing of MOOCs not only promotes the change of teaching concept and the reform of teaching methodology, but also provides a new opportunity for the teaching resource integration and sharing between different universities. The present study has constructed an e-Portfolio-based model for the application and sharing…
Wojtusiak, Janusz; Michalski, Ryszard S; Simanivanh, Thipkesone; Baranova, Ancha V
2009-12-01
Systematic reviews and meta-analysis of published clinical datasets are an important part of medical research. By combining the results of multiple studies, meta-analysis is able to increase confidence in its conclusions, validate particular study results, and sometimes lead to new findings. Extensive theory has been built on how to aggregate results from multiple studies and arrive at statistically valid conclusions. Surprisingly, very little has been done to adopt advanced machine learning methods to support meta-analysis. In this paper we describe a novel machine learning methodology that is capable of inducing accurate and easy-to-understand attributional rules from aggregated data. Thus, the methodology can be used to support traditional meta-analysis in systematic reviews. Most machine learning applications give primary attention to the predictive accuracy of the learned knowledge, and lesser attention to its understandability. Here we employed attributional rules, a special form of rules that are relatively easy to interpret for medical experts who are not necessarily trained in statistics and meta-analysis. The methodology has been implemented and initially tested on a set of publicly available clinical data describing patients with metabolic syndrome (MS). The objective of this application was to determine rules describing combinations of clinical parameters used for metabolic syndrome diagnosis, and to develop rules for predicting whether particular patients are likely to develop secondary complications of MS. The aggregated clinical data were retrieved from 20 separate hospital cohorts that included 12 groups of patients with liver disease symptoms present and 8 control groups of healthy subjects. A total of 152 attributes were used, most of which were, however, measured in different studies. The twenty most common attributes were selected for the rule learning process. By applying the developed rule learning methodology we arrived at several different possible rulesets that can be used to predict three considered complications of MS, namely nonalcoholic fatty liver disease (NAFLD), simple steatosis (SS), and nonalcoholic steatohepatitis (NASH).
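No mainstream library implements AQ-style attributional rule learning, so the sketch below uses a shallow scikit-learn decision tree rendered as readable rules as a stand-in for the flavour of the output; the data are synthetic, not the hospital cohorts analysed in the paper.

```python
# Sketch: learning human-readable rules from clinical attributes. A shallow
# decision tree printed as rules stands in for the authors' attributional
# rules (assumption). Data below are synthetic, not the paper's cohorts.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 400
X = np.column_stack([
    rng.normal(30, 6, n),    # BMI
    rng.normal(110, 25, n),  # fasting glucose (mg/dL)
    rng.normal(160, 60, n),  # triglycerides (mg/dL)
])
# Synthetic complication label loosely tied to BMI and glucose (illustrative)
y = ((X[:, 0] > 32) & (X[:, 1] > 115)).astype(int)

clf = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(export_text(clf, feature_names=["BMI", "glucose", "triglycerides"]))
```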
Methodological reviews of economic evaluations in health care: what do they target?
Hutter, Maria-Florencia; Rodríguez-Ibeas, Roberto; Antonanzas, Fernando
2014-11-01
An increasing number of published studies of economic evaluations of health technologies have been reviewed and summarized with different purposes, among them to facilitate decision-making processes. These reviews have covered different aspects of economic evaluations, using a variety of methodological approaches. The aim of this study is to analyze the methodological characteristics of the reviews of economic evaluations in health care, published during the period 1990-2010, to identify their main features and the potential missing elements. This may help to develop a common procedure for elaborating these kinds of reviews. We performed systematic searches in electronic databases (Scopus, Medline and PubMed) of methodological reviews published in English, period 1990-2010. We selected the articles whose main purpose was to review and assess the methodology applied in the economic evaluation studies. We classified the data according to the study objectives, period of the review, number of reviewed studies, methodological and non-methodological items assessed, medical specialty, type of disease and technology, databases used for the review and their main conclusions. We performed a descriptive statistical analysis and checked how generalizability issues were considered in the reviews. We identified 76 methodological reviews, 42 published in the period 1990-2001 and 34 during 2002-2010. The items assessed most frequently (by 70% of the reviews) were perspective, type of economic study, uncertainty and discounting. The reviews also described the type of intervention and disease, funding sources, country in which the evaluation took place, type of journal and author's characteristics. Regarding the intertemporal comparison, higher frequencies were found in the second period for two key methodological items: the source of effectiveness data and the models used in the studies. However, the generalizability issues that apparently are creating a growing interest in the economic evaluation literature did not receive as much attention in the reviews of the second period. The remaining items showed similar frequencies in both periods. Increasingly more reviews of economic evaluation studies aim to analyze the application of methodological principles, and offer summaries of papers classified by either diseases or health technologies. These reviews are useful for finding literature trends, aims of studies and possible deficiencies in the implementation of methods of specific health interventions. As no significant methodological improvement was clearly detected in the two periods analyzed, it would be convenient to pay more attention to the methodological aspects of the reviews.
Post-Positivist Research: Two Examples of Methodological Pluralism.
ERIC Educational Resources Information Center
Wildemuth, Barbara M.
1993-01-01
Discussion of positivist and interpretive approaches to research and postpositivism focuses on two studies that apply interpretive research in different ways: an exploratory study of user-developed computing applications conducted prior to a positivist study and a study of end-user searching behaviors conducted concurrently with a positivist…
[The GIPSY-RECPAM model: a versatile approach for integrated evaluation in cardiologic care].
Carinci, F
2009-01-01
The tree-structured methodology applied in the GISSI-PSICOLOGIA project, although developed in the framework of the earliest GISSI studies, represents a powerful tool for analyzing different aspects of cardiologic care. The GISSI-PSICOLOGIA project has delivered a novel methodology based on the joint application of psychometric tools and sophisticated statistical techniques. Its prospective use could allow the building of effective epidemiological models relevant to the prognosis of the cardiologic patient. The various features of the RECPAM method allow versatile use in the framework of modern e-health projects. The study used the Cognitive Behavioral Assessment H Form (CBA-H) psychometric scales. The potential for its future application in Italian cardiology is considerable, particularly to assist the planning of systems for integrated care and routine evaluation of the cardiologic patient.
Robust nonlinear variable selective control for networked systems
NASA Astrophysics Data System (ADS)
Rahmani, Behrooz
2016-10-01
This paper is concerned with the networked control of a class of uncertain nonlinear systems. In this way, Takagi-Sugeno (T-S) fuzzy modelling is used to extend the previously proposed variable selective control (VSC) methodology to nonlinear systems. This extension is based upon the decomposition of the nonlinear system to a set of fuzzy-blended locally linearised subsystems and further application of the VSC methodology to each subsystem. To increase the applicability of the T-S approach for uncertain nonlinear networked control systems, this study considers the asynchronous premise variables in the plant and the controller, and then introduces a robust stability analysis and control synthesis. The resulting optimal switching-fuzzy controller provides a minimum guaranteed cost on an H2 performance index. Simulation studies on three nonlinear benchmark problems demonstrate the effectiveness of the proposed method.
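The T-S construction underlying the approach blends locally linearised models with state-dependent membership weights. The sketch below simulates such a blend for two invented local models; it illustrates the modelling step only, not the paper's switching-fuzzy controller synthesis.

```python
# Sketch of the Takagi-Sugeno idea: blend locally linearised models with
# membership weights that depend on the state. Models and memberships are
# invented for illustration.
import numpy as np

A1 = np.array([[0.0, 1.0], [-1.0, -0.5]])   # local model near x1 = -1
A2 = np.array([[0.0, 1.0], [-4.0, -0.5]])   # local model near x1 = +1

def memberships(x1):
    w1 = np.clip((1 - x1) / 2, 0, 1)        # triangular memberships on [-1, 1]
    return w1, 1 - w1

x, dt = np.array([0.8, 0.0]), 0.01
for _ in range(500):                        # fuzzy-blended open-loop simulation
    w1, w2 = memberships(x[0])
    A = w1 * A1 + w2 * A2                   # state-dependent blended dynamics
    x = x + dt * (A @ x)
print("state after 5 s:", np.round(x, 4))
```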
Probabilistic failure assessment with application to solid rocket motors
NASA Technical Reports Server (NTRS)
Jan, Darrell L.; Davidson, Barry D.; Moore, Nicholas R.
1990-01-01
A quantitative methodology is being developed for assessment of risk of failure of solid rocket motors. This probabilistic methodology employs best available engineering models and available information in a stochastic framework. The framework accounts for incomplete knowledge of governing parameters, intrinsic variability, and failure model specification error. Earlier case studies have been conducted on several failure modes of the Space Shuttle Main Engine. Work in progress on application of this probabilistic approach to large solid rocket boosters such as the Advanced Solid Rocket Motor for the Space Shuttle is described. Failure due to debonding has been selected as the first case study for large solid rocket motors (SRMs) since it accounts for a significant number of historical SRM failures. Impact of incomplete knowledge of governing parameters and failure model specification errors is expected to be important.
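In its simplest form, the stochastic framework amounts to propagating uncertain governing parameters, including a failure-model specification error term, through a limit state and estimating the failure probability by Monte Carlo sampling. The debond limit state and distributions below are invented for illustration.

```python
# Generic Monte Carlo sketch of probabilistic failure assessment: propagate
# uncertain inputs through a limit-state model and estimate P(failure).
# The debond limit state and all distributions are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

bond_strength = rng.lognormal(mean=np.log(2.0), sigma=0.15, size=n)  # MPa
peak_stress   = rng.normal(1.2, 0.25, size=n)                        # MPa
model_error   = rng.normal(1.0, 0.10, size=n)  # specification-error factor

# Failure when model-adjusted stress exceeds bond strength
failed = model_error * peak_stress > bond_strength
pf = failed.mean()
se = np.sqrt(pf * (1 - pf) / n)
print(f"estimated P(failure) = {pf:.2e} +/- {2 * se:.1e}")
```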
NASA Astrophysics Data System (ADS)
Sheate, William R.; Partidário, Maria Rosário Do; Byron, Helen; Bina, Olivia; Dagg, Suzan
2008-02-01
BioScene (scenarios for reconciling biodiversity conservation with declining agriculture use in mountain areas in Europe) was a three-year project (2002-2005) funded by the European Union’s Fifth Framework Programme, and aimed to investigate the implications of agricultural restructuring and decline for biodiversity conservation in the mountain areas of Europe. The research took a case study approach to the analysis of the biodiversity processes and outcomes of different scenarios of agri-environmental change in six countries (France, Greece, Norway, Slovakia, Switzerland, and the United Kingdom) covering the major biogeographical regions of Europe. The project was coordinated by Imperial College London, and each study area had a multidisciplinary team including ecologists and social and economic experts, which sought a comprehensive understanding of the drivers for change and their implications for sustainability. A key component was the sustainability assessment (SA) of the alternative scenarios. This article discusses the development and application of the SA methodology developed for BioScene. While the methodology was objectives-led, it was also strongly grounded in baseline ecological and socio-economic data. This article also describes the engagement of stakeholder panels in each study area and the use of causal chain analysis for understanding the likely implications for land use and biodiversity of strategic drivers of change under alternative scenarios for agriculture and rural policy and for biodiversity management. Finally, this article draws conclusions for the application of SA more widely, its use with scenarios, and the benefits of stakeholder engagement in the SA process.
Brattoli, Magda; Cisternino, Ezia; Dambruoso, Paolo Rosario; de Gennaro, Gianluigi; Giungato, Pasquale; Mazzone, Antonio; Palmisani, Jolanda; Tutino, Maria
2013-01-01
The gas chromatography-olfactometry (GC-O) technique couples traditional gas chromatographic analysis with sensory detection in order to study complex mixtures of odorous substances and to identify odor-active compounds. The GC-O technique is already widely used for the evaluation of food aromas, and its application in environmental fields is increasing, thus moving odor emission assessment from solely olfactometric evaluations to the characterization of the volatile components responsible for odor nuisance. The aim of this paper is to describe the state of the art of gas chromatography-olfactometry methodology, considering the different approaches regarding operational conditions and the different methods for evaluating the olfactometric detection of odor compounds. The potential of GC-O is described, highlighting the improvements this methodology offers relative to other conventional approaches used for odor detection, such as sensor-based, sensorial and traditional gas chromatographic methods. The paper also provides an examination of the different fields of application of GC-O, principally related to fragrances and food aromas, odor nuisance produced by anthropic activities, odorous compounds emitted by materials, and medical applications. PMID:24316571
Conjugate gradient based projection - A new explicit methodology for frictional contact
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Li, Maocheng; Sha, Desong
1993-01-01
With special attention towards the applicability to parallel computation or vectorization, a new and effective explicit approach for linear complementary formulations involving a conjugate gradient based projection methodology is proposed in this study for contact problems with Coulomb friction. The overall objectives are focussed towards providing an explicit methodology of computation for the complete contact problem with friction. In this regard, the primary idea for solving the linear complementary formulations stems from an established search direction which is projected to a feasible region determined by the non-negative constraint condition; this direction is then applied to the Fletcher-Reeves conjugate gradient method resulting in a powerful explicit methodology which possesses high accuracy, excellent convergence characteristics, fast computational speed and is relatively simple to implement for contact problems involving Coulomb friction.
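The projection idea can be seen on the symmetric linear complementarity problem (LCP) alone. The sketch below solves a tiny LCP by projected gradient steps onto the non-negative region; the paper embeds the projected direction within Fletcher-Reeves conjugate gradients, which is omitted here for brevity.

```python
# Simplified sketch of the projection idea: solve the LCP
#   x >= 0,  Mx + q >= 0,  x.(Mx + q) = 0   (M symmetric positive definite)
# as min 0.5*x'Mx + q'x subject to x >= 0, by projected gradient steps.
import numpy as np

M = np.array([[4.0, 1.0], [1.0, 3.0]])
q = np.array([-1.0, -6.0])

x = np.zeros(2)
step = 1.0 / np.linalg.eigvalsh(M).max()     # safe fixed step size
for _ in range(500):
    grad = M @ x + q
    x = np.maximum(0.0, x - step * grad)     # project onto the feasible region

print("x =", np.round(x, 4))
print("complementarity x.(Mx+q) =", float(x @ (M @ x + q)))
```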
An Application of the Methodology for Assessment of the Sustainability of Air Transport System
NASA Technical Reports Server (NTRS)
Janic, Milan
2003-01-01
An assessment and operationalization of the concept of a sustainable air transport system is recognized as an important but complex research, operational and policy task. As part of academic efforts to properly address the problem, this paper aims to assess the sustainability of the air transport system. In particular, the paper describes a methodology for the assessment of sustainability and its potential application. The methodology consists of indicator systems that relate to the operational, economic, social and environmental dimensions of air transport system performance. The particular indicator systems are relevant to particular actors such as users (air travellers), air transport operators, aerospace manufacturers, local communities, governmental authorities at different levels (local, national, international), international air transport associations, pressure groups and the public. As part of the application of the methodology, specific cases are selected to estimate the particular indicators, and thus to assess the system's sustainability under given conditions.
Sanchez-Vazquez, Manuel J; Nielen, Mirjam; Edwards, Sandra A; Gunn, George J; Lewis, Fraser I
2012-08-31
Abattoir-detected pathologies are of crucial importance to both pig production and food safety. Usually, more than one pathology coexists in a pig herd, although it often remains unknown how these different pathologies interrelate. Identification of the associations between different pathologies may facilitate an improved understanding of their underlying biological linkage, and support veterinarians in encouraging control strategies aimed at reducing the prevalence of not just one, but two or more conditions simultaneously. Multi-dimensional machine learning methodology was used to identify associations between ten typical pathologies in 6485 batches of slaughtered finishing pigs, assisting the comprehension of their biological association. Pathologies potentially associated with septicaemia (e.g. pericarditis, peritonitis) appear interrelated, suggesting ongoing bacterial challenges by pathogens such as Haemophilus parasuis and Streptococcus suis. Furthermore, hepatic scarring appears interrelated with both milk spot livers (Ascaris suum) and bacteria-related pathologies, suggesting a potential multi-pathogen nature for this pathology. The application of novel multi-dimensional machine learning methodology provided new insights into how typical pig pathologies are potentially interrelated at batch level. The methodology presented is a powerful exploratory tool for generating hypotheses, applicable to a wide range of studies in veterinary research.
Analysis of experts' perception of the effectiveness of teaching methods
NASA Astrophysics Data System (ADS)
Kindra, Gurprit S.
1984-03-01
The present study attempts to shed light on the perceptions of business educators regarding the effectiveness of six methodologies in achieving Gagné's five learning outcomes. Results of this study empirically confirm the oft-stated contention that no one method is globally effective for the attainment of all objectives. Specifically, business games, traditional lecture, and case study methods are perceived to be most effective for the learning of application, knowledge acquisition, and analysis and application, respectively.
Graded activation of the intrinsic laryngeal muscles for vocal fold posturing
Chhetri, Dinesh K.; Neubauer, Juergen; Berry, David A.
2010-01-01
Previous investigations using in vivo models to study the role of intrinsic laryngeal muscles in phonation have used neuromuscular stimulation to study voice parameters. However, these studies used coarse stimulation techniques using limited levels of neuromuscular stimulation. In the current investigation, a technique for fine control of laryngeal posturing was developed using graded stimulation of the laryngeal nerves. Vocal fold strain history to graded stimulation and a methodology for establishing symmetric laryngeal activation is presented. This methodology has immediate applications for the study of laryngeal paralysis and paresis, as well as general questions of neuromuscular control of the larynx. PMID:20369979
The role of structural characteristics in video-game play motivation: a Q-methodology study.
Westwood, Dave; Griffiths, Mark D
2010-10-01
Until recently, there has been very little naturalistic study of what gaming experiences are like, and how gaming fits into people's lives. Using a recently developed structural characteristic taxonomy of video games, this study examined the psycho-structural elements of computer games that motivate gamers to play them. Using Q-sort methodology, 40 gamers participated in an online Q-sort task. Results identified six distinct types of gamers based on the factors generated: (a) story-driven solo gamers; (b) social gamers; (c) solo limited gamers; (d) hardcore online gamers; (e) solo control/identity gamers; and (f) casual gamers. These gaming types are discussed, and a brief evaluation of similar and unique elements of the different types of gamer is also offered. The current study shows Q-methodology to be a relevant and applicable method in the psychological research of gaming.
NASA Technical Reports Server (NTRS)
Nauda, A.
1982-01-01
Performance and reliability models of alternate microcomputer architectures as a methodology for optimizing system design were examined. A methodology for selecting an optimum microcomputer architecture for autonomous operation of planetary spacecraft power systems was developed. Various microcomputer system architectures are analyzed to determine their application to spacecraft power systems. It is suggested that no standardization formula or common set of guidelines exists which provides an optimum configuration for a given set of specifications.
Molinos-Senante, María; Hernández-Sancho, Francesc; Sala-Garrido, Ramón
2012-01-01
The concept of sustainability involves the integration of economic, environmental, and social aspects and this also applies in the field of wastewater treatment. Economic feasibility studies are a key tool for selecting the most appropriate option from a set of technological proposals. Moreover, these studies are needed to assess the viability of transferring new technologies from pilot-scale to full-scale. In traditional economic feasibility studies, the benefits that have no market price, such as environmental benefits, are not considered and are therefore underestimated. To overcome this limitation, we propose a new methodology to assess the economic viability of wastewater treatment technologies that considers internal and external impacts. The estimation of the costs is based on the use of cost functions. To quantify the environmental benefits from wastewater treatment, the distance function methodology is proposed to estimate the shadow price of each pollutant removed in the wastewater treatment. The application of this methodological approach by decision makers enables the calculation of the true costs and benefits associated with each alternative technology. The proposed methodology is presented as a useful tool to support decision making.
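The resulting feasibility test is simple once shadow prices are available: value each pollutant removed at its shadow price and compare annualised benefits with costs. The sketch below uses invented prices and quantities, not the estimates from the paper.

```python
# Sketch of a feasibility test with environmental (external) benefits included:
# value each pollutant removed at a shadow price and compare annualised totals.
# Shadow prices and quantities are illustrative assumptions.
shadow_price = {"N": 16.0, "P": 30.0, "SS": 0.005}      # eur per kg removed
removed_kg   = {"N": 40_000, "P": 6_000, "SS": 900_000}  # kg per year

annual_cost = 350_000.0       # annualised capital + operating cost (assumed)
internal_benefit = 120_000.0  # e.g. reclaimed water sold (assumed)

external_benefit = sum(shadow_price[p] * removed_kg[p] for p in shadow_price)
net = internal_benefit + external_benefit - annual_cost
print(f"external benefit = {external_benefit:,.0f} eur/yr, net = {net:,.0f} eur/yr")
```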
Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B
2018-05-30
Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
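An individuals (XmR) control chart of the kind used in such QI analyses can be computed in a few lines. The monthly length-of-stay values and the pre-change baseline below are invented to show the mechanics; the original study's data are not reproduced.

```python
# Sketch of an individuals (XmR) control chart on monthly mean ED length of
# stay, flagging points beyond 3-sigma limits. Values are invented.
import numpy as np

monthly_mean_los = np.array([142, 138, 145, 140, 139, 112, 108, 110, 107, 111, 109.0])
baseline = monthly_mean_los[:5]            # pre-change months (assumed)

mr = np.abs(np.diff(baseline))             # moving ranges
center = baseline.mean()
sigma = mr.mean() / 1.128                  # standard XmR sigma estimate (d2 = 1.128)
lcl, ucl = center - 3 * sigma, center + 3 * sigma

for month, value in enumerate(monthly_mean_los, start=1):
    flag = "signal" if not lcl <= value <= ucl else ""
    print(f"month {month:2d}: {value:6.1f}  {flag}")
```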
Chambaron, Stéphanie; Ginhac, Dominique; Perruchet, Pierre
2008-05-01
Serial reaction time tasks and, more generally, the visual-motor sequential paradigms are increasingly popular tools in a variety of research domains, from studies on implicit learning in laboratory contexts to the assessment of residual learning capabilities of patients in clinical settings. A consequence of this success, however, is the increased variability in paradigms and the difficulty inherent in respecting the methodological principles that two decades of experimental investigations have made more and more stringent. The purpose of the present article is to address those problems. We present a user-friendly application that simplifies running classical experiments, but is flexible enough to permit a broad range of nonstandard manipulations for more specific objectives. Basic methodological guidelines are also provided, as are suggestions for using the software to explore unconventional directions of research. The most recent version of gSRT-Soft may be obtained for free by contacting the authors.
Armour, Carl L.; Taylor, Jonathan G.
1991-01-01
This paper summarizes results of a survey conducted in 1988 of 57 U.S. Fish and Wildlife Service field offices. The purpose was to document opinions of biologists experienced in applying the Instream Flow Incremental Methodology (IFIM). Responses were received from 35 offices where 616 IFIM applications were reported. The existence of six monitoring studies designed to evaluate the adequacy of flows provided at sites was confirmed. The two principal categories reported as stumbling blocks to the successful application of IFIM were beliefs that the methodology is technically too simplistic or that it is too complex to apply. Recommendations receiving the highest scores for future initiatives to enhance IFIM use were (1) training and workshops for field biologists; and (2) improving suitability index (SI) curves and computer models, and evaluating the relationship of weighted useable area (WUA) to fish responses. The authors concur that emphasis for research should be on addressing technical concerns about SI curves and WUA.
A methodology for extending domain coverage in SemRep.
Rosemblat, Graciela; Shin, Dongwook; Kilicoglu, Halil; Sneiderman, Charles; Rindflesch, Thomas C
2013-12-01
We describe a domain-independent methodology to extend SemRep coverage beyond the biomedical domain. SemRep, a natural language processing application originally designed for biomedical texts, uses the knowledge sources provided by the Unified Medical Language System (UMLS©). Ontological and terminological extensions to the system are needed in order to support other areas of knowledge. We extended SemRep's application by developing a semantic representation of a previously unsupported domain. This was achieved by adapting well-known ontology engineering phases and integrating them with the UMLS knowledge sources on which SemRep crucially depends. While the process to extend SemRep coverage has been successfully applied in earlier projects, this paper presents in detail the step-wise approach we followed and the mechanisms implemented. A case study in the field of medical informatics illustrates how the ontology engineering phases have been adapted for optimal integration with the UMLS. We provide qualitative and quantitative results, which indicate the validity and usefulness of our methodology. Published by Elsevier Inc.
An object-oriented approach for harmonization of multimedia markup languages
NASA Astrophysics Data System (ADS)
Chen, Yih-Feng; Kuo, May-Chen; Sun, Xiaoming; Kuo, C.-C. Jay
2003-12-01
An object-oriented methodology is proposed to harmonize several different markup languages in this research. First, we adopt the Unified Modelling Language (UML) as the data model to formalize the concept and the process of the harmonization process between the eXtensible Markup Language (XML) applications. Then, we design the Harmonization eXtensible Markup Language (HXML) based on the data model and formalize the transformation between the Document Type Definitions (DTDs) of the original XML applications and HXML. The transformation between instances is also discussed. We use the harmonization of SMIL and X3D as an example to demonstrate the proposed methodology. This methodology can be generalized to various application domains.
Applications of decision analysis and related techniques to industrial engineering problems at KSC
NASA Technical Reports Server (NTRS)
Evans, Gerald W.
1995-01-01
This report provides: (1) a discussion of the origination of decision analysis problems (well-structured problems) from ill-structured problems; (2) a review of the various methodologies and software packages for decision analysis and related problem areas; (3) a discussion of how the characteristics of a decision analysis problem affect the choice of modeling methodologies, thus providing a guide as to when to choose a particular methodology; and (4) examples of applications of decision analysis to particular problems encountered by the IE Group at KSC. With respect to the specific applications at KSC, particular emphasis is placed on the use of the Demos software package (Lumina Decision Systems, 1993).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onoufriou, T.; Simpson, R.J.; Protopapas, M.
This paper presents the development and application of reliability-based inspection planning techniques for floaters. Based on previous experience from jacket structure applications, optimized inspection planning (OIP) techniques for floaters are developed. The differences between floaters and jacket structures in relation to fatigue damage, redundancy levels and inspection practice are examined and reflected in the proposed methodology. The application and benefits of these techniques are demonstrated through representative analyses, and important trends are highlighted through the results of a parametric sensitivity study.
Crossing trend analysis methodology and application for Turkish rainfall records
NASA Astrophysics Data System (ADS)
Şen, Zekâi
2018-01-01
Trend analyses are necessary tools for depicting possible general increases or decreases in a given time series. There are many versions of trend identification methodology, such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, the regression line, and Şen's innovative trend analysis. The literature has many papers about the use, pros and cons, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend through the centroid of the given time series should have the maximum number of crossings (the total number of up-crossings or down-crossings). This approach is applicable whether the time series has a dependent or independent structure, and it does not depend on the type of the probability distribution function. The validity of this method is demonstrated through an extensive Monte Carlo simulation technique and comparison with other existing trend identification methodologies. The application of the methodology is presented for a set of annual daily extreme rainfall time series from different parts of Turkey, which have a physically independent structure.
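The crossing criterion as described can be prototyped directly: rotate a candidate trend line about the series centroid and keep the slope whose residuals change sign most often. The grid of slopes and the demo series below are illustrative.

```python
# Sketch of the crossing-count trend idea: for each candidate slope through
# the centroid, count zero-crossings of the residuals and keep the maximiser.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(60.0)
x = 0.05 * t + rng.normal(0, 1, t.size)      # synthetic annual rainfall anomaly

tc, xc = t.mean(), x.mean()                   # centroid of the series

def crossings(slope):
    resid = x - (xc + slope * (t - tc))
    return np.count_nonzero(np.diff(np.sign(resid)))

slopes = np.linspace(-0.2, 0.2, 401)
best = max(slopes, key=crossings)
print(f"crossing-trend slope ~ {best:.3f} (true 0.05)")
```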
Simoens, Steven
2013-01-01
Objectives This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. Materials and Methods For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Results Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Conclusions Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation. PMID:24386474
A Formal Methodology to Design and Deploy Dependable Wireless Sensor Networks
Testa, Alessandro; Cinque, Marcello; Coronato, Antonio; Augusto, Juan Carlos
2016-01-01
Wireless Sensor Networks (WSNs) are being increasingly adopted in critical applications, where verifying the correct operation of sensor nodes is a major concern. Undesired events may undermine the mission of the WSNs. Hence, their effects need to be properly assessed before deployment, to obtain a good level of expected performance; and during the operation, in order to avoid dangerous unexpected results. In this paper, we propose a methodology that aims at assessing and improving the dependability level of WSNs by means of an event-based formal verification technique. The methodology includes a process to guide designers towards the realization of a dependable WSN and a tool (“ADVISES”) to simplify its adoption. The tool is applicable to homogeneous WSNs with static routing topologies. It allows the automatic generation of formal specifications used to check correctness properties and evaluate dependability metrics at design time and at runtime for WSNs where an acceptable percentage of faults can be defined. During the runtime, we can check the behavior of the WSN accordingly to the results obtained at design time and we can detect sudden and unexpected failures, in order to trigger recovery procedures. The effectiveness of the methodology is shown in the context of two case studies, as proof-of-concept, aiming to illustrate how the tool is helpful to drive design choices and to check the correctness properties of the WSN at runtime. Although the method scales up to very large WSNs, the applicability of the methodology may be compromised by the state space explosion of the reasoning model, which must be faced by partitioning large topologies into sub-topologies. PMID:28025568
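A toy version of the design-time dependability check, counting which single-node failures still leave every surviving sensor a route to the sink of a static routing topology, is sketched below. The topology is invented, and the graph search merely stands in for the tool's event-based formal verification.

```python
# Sketch of a design-time dependability check for a static WSN topology:
# count the single-node failures that still leave every surviving sensor a
# route to the sink. Topology is invented; the ADVISES tool itself uses
# event-based formal verification rather than this plain graph search.
links = {("s1", "r1"), ("s2", "r1"), ("s3", "r2"), ("r1", "sink"), ("r2", "sink")}
nodes = {n for link in links for n in link} - {"sink"}

def delivers(failed):
    alive = (nodes | {"sink"}) - failed
    adj = {n: set() for n in alive}
    for a, b in links:
        if a in alive and b in alive:
            adj[a].add(b)
            adj[b].add(a)
    reached, frontier = {"sink"}, ["sink"]   # breadth-first search from the sink
    while frontier:
        for nxt in adj[frontier.pop()]:
            if nxt not in reached:
                reached.add(nxt)
                frontier.append(nxt)
    return all(s in reached for s in nodes - failed)

ok = sum(delivers({n}) for n in nodes)
print(f"{ok}/{len(nodes)} single-node-failure cases keep full delivery")
```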
Security Quality Requirements Engineering (SQUARE) Methodology
2005-11-01
Requirements elicitation techniques investigated for use with SQUARE included Joint Application Development and the Accelerated Requirements Method [Wood 89, Hubbard 99], misuse cases [Jacobson 92], Soft Systems Methodology (SSM) [Checkland 89], and Quality Function Deployment (QFD) [QFD 05].
Radioactive waste disposal fees-Methodology for calculation
NASA Astrophysics Data System (ADS)
Bemš, Július; Králík, Tomáš; Kubančák, Ján; Vašíček, Jiří; Starý, Oldřich
2014-11-01
This paper summarizes the methodological approach used for calculating the fee for low- and intermediate-level radioactive waste disposal and for spent fuel disposal. The methodology itself is based on simulation of the cash flows related to the operation of the waste disposal system. The paper includes a demonstration of the methodology's application under the conditions of the Czech Republic.
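The cash-flow logic reduces to a levelized-fee computation: discount the repository's projected outflows and spread them over discounted waste deliveries. All inputs in the sketch below are illustrative, not the Czech figures.

```python
# Sketch of the fee logic: discount projected repository outflows and spread
# them over discounted waste deliveries, giving a levelized fee per m3.
# All inputs are illustrative assumptions.
costs = [5e6] * 10 + [20e6] * 5 + [1e6] * 35   # yearly outflows over 50 years
waste = [800.0] * 50                           # m3 of waste accepted per year
rate = 0.03                                    # real discount rate (assumed)

pv = lambda flows: sum(f / (1 + rate) ** y for y, f in enumerate(flows))
fee = pv(costs) / pv(waste)
print(f"levelized disposal fee ~ {fee:,.0f} per m3")
```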
DOE Office of Scientific and Technical Information (OSTI.GOV)
Janjusic, Tommy; Kartsaklis, Christos
Memory scalability is an enduring problem and bottleneck that plagues many parallel codes. Parallel codes designed for high performance systems are typically designed over the span of several, and in some instances 10+, years. As a result, optimization practices which were appropriate for earlier systems may no longer be valid and thus require careful optimization consideration. Specifically, parallel codes whose memory footprint is a function of their scalability must be carefully considered for future exa-scale systems. In this paper we present a methodology and tool to study the memory scalability of parallel codes. Using our methodology we evaluate an application's memory footprint as a function of scalability, which we coined memory efficiency, and describe our results. In particular, using our in-house tools we can pinpoint the specific application components which contribute to the application's overall memory footprint (application data-structures, libraries, etc.).
ERIC Educational Resources Information Center
Jevsikova, Tatjana; Berniukevicius, Andrius; Kurilovas, Eugenijus
2017-01-01
The paper is aimed to present a methodology of learning personalisation based on applying Resource Description Framework (RDF) standard model. Research results are two-fold: first, the results of systematic literature review on Linked Data, RDF "subject-predicate-object" triples, and Web Ontology Language (OWL) application in education…
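A minimal example of the RDF triples such a personalisation methodology builds on can be written with the rdflib package; the vocabulary URIs and learner facts below are invented for illustration.

```python
# Sketch of RDF "subject-predicate-object" triples for a learner profile,
# using rdflib. The vocabulary and facts are invented, not the paper's model.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/learning#")
g = Graph()
g.bind("ex", EX)

g.add((EX.student42, EX.hasLearningStyle, Literal("visual")))
g.add((EX.student42, EX.completedUnit, EX.unit3))
g.add((EX.unit4, EX.suitableForStyle, Literal("visual")))

# A personalisation engine could now query the triples, e.g. via SPARQL
for s, p, o in g.triples((EX.student42, None, None)):
    print(s, p, o)
print(g.serialize(format="turtle"))
```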
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-07
... complementary methodological frameworks of the IDEAL and TPLC initiatives, more comprehensive and applicable... devices, surgical operations, and invasive medical procedures; Unique study designs and reporting methods...
Applications of adenine nucleotide measurements in oceanography
NASA Technical Reports Server (NTRS)
Holm-Hansen, O.; Hodson, R.; Azam, F.
1975-01-01
The methodology involved in nucleotide measurements is outlined, along with data to support the premise that ATP concentrations in microbial cells can be extrapolated to biomass parameters. ATP concentrations in microorganisms and nucleotide analyses are studied.
Political Science, The Judicial Process, and A Legal Education
ERIC Educational Resources Information Center
Funston, Richard
1975-01-01
Application of the behavioral approach to the study of the judicial process is examined including methodological approaches used, typical findings, and "behavioralists'" rejection of the case method of studying law. The author concludes that the behavioral approach to the study of judicial politics has not been substantially productive. (JT)
Castro, Alexander Garcia; Rocca-Serra, Philippe; Stevens, Robert; Taylor, Chris; Nashar, Karim; Ragan, Mark A; Sansone, Susanna-Assunta
2006-01-01
Background Incorporation of ontologies into annotations has enabled 'semantic integration' of complex data, making explicit the knowledge within a certain field. One of the major bottlenecks in developing bio-ontologies is the lack of a unified methodology. Different methodologies have been proposed for different scenarios, but there is no agreed-upon standard methodology for building ontologies. The involvement of geographically distributed domain experts, the need for domain experts to lead the design process, the application of the ontologies and the life cycles of bio-ontologies are amongst the features not considered by previously proposed methodologies. Results Here, we present a methodology for developing ontologies within the biological domain. We describe our scenario, competency questions, results and milestones for each methodological stage. We introduce the use of concept maps during knowledge acquisition phases as a feasible transition between domain expert and knowledge engineer. Conclusion The contributions of this paper are the thorough description of the steps we suggest when building an ontology, example use of concept maps, consideration of applicability to the development of lower-level ontologies and application to decentralised environments. We have found that within our scenario conceptual maps played an important role in the development process. PMID:16725019
Application of the Hardman methodology to the Single Channel Ground-Airborne Radio System (SINCGARS)
NASA Technical Reports Server (NTRS)
1984-01-01
The HARDMAN methodology was applied to the various configurations of employment for an emerging Army multipurpose communications system. The methodology was used to analyze the manpower, personnel and training (MPT) requirements, and associated costs, of the system concepts responsive to the Army's requirement for the Single Channel Ground-Airborne Radio System (SINCGARS). The scope of the application includes the analysis of two conceptual designs (Cincinnati Electronics and ITT Aerospace/Optical Division) for operating and maintenance support, addressed through the general support maintenance echelon.
Reliability Issues and Solutions in Flexible Electronics Under Mechanical Fatigue
NASA Astrophysics Data System (ADS)
Yi, Seol-Min; Choi, In-Suk; Kim, Byoung-Joon; Joo, Young-Chang
2018-07-01
Flexible devices are of significant interest due to their potential to expand the application of smart devices into various fields, such as energy harvesting, biological applications and consumer electronics. Due to the mechanically dynamic operation of flexible electronics, their mechanical reliability must be thoroughly investigated to understand their failure mechanisms and lifetimes. Reliability issues caused by bending fatigue, one of the typical operational limitations of flexible electronics, have been studied using various test methodologies; however, the electromechanical evaluations that are essential to assess the reliability of electronic devices for flexible applications had not been investigated because the testing method was not established. By employing the in situ bending fatigue test, we have studied the failure mechanism under various conditions and parameters, such as bending strain, fatigue area, film thickness, and lateral dimensions. Moreover, various methods for improving bending reliability have been developed based on the failure mechanism. Nanostructures such as holes, pores, wires and composites of nanoparticles and nanotubes have been suggested for better reliability. Flexible devices were also investigated to find potential failures initiated by complex structures under bending fatigue strain. In this review, the recent advances in test methodology, mechanism studies, and practical applications are introduced. Additionally, perspectives, including the future advance toward stretchable electronics, are discussed based on the current achievements in research.
Delamater, Paul L; Shortridge, Ashton M; Messina, Joseph P
2013-08-22
Background Community-based health care planning and regulation necessitates grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state’s Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. Methods The clustering methodology employs a 2-step K-means + Ward’s clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical-based measure of cluster fit and characteristics of the resulting Hospital Groups. Results Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Conclusions Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units. PMID:23964905
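A minimal sketch of the two-step clustering the abstract describes (K-means to form micro-clusters, then Ward's hierarchical clustering of the micro-cluster centroids), assuming each hospital is represented by a utilization-pattern vector with appended coordinates; the data, feature layout, and cluster counts are invented, and the paper's heuristic for choosing the final number of clusters is not reproduced here.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical hospitals: 3 utilization features + 2 geographic coordinates.
X = rng.normal(size=(120, 5))

# Step 1: K-means partitions hospitals into many small micro-clusters.
km = KMeans(n_clusters=25, n_init=10, random_state=0).fit(X)

# Step 2: Ward's hierarchical clustering merges the micro-cluster centroids.
Z = linkage(km.cluster_centers_, method="ward")
centroid_groups = fcluster(Z, t=8, criterion="maxclust")

# Map each hospital to its final Hospital Group via its micro-cluster.
hospital_groups = centroid_groups[km.labels_]
print(np.bincount(hospital_groups)[1:])  # size of each of the 8 groups
```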
Case Study on Project Risk Management Planning Based on Soft System Methodology
NASA Astrophysics Data System (ADS)
Lifang, Xie; Jun, Li
This paper analyzes the soft-system characteristics of construction projects and the applicability of Soft Systems Methodology (SSM) to risk analysis, after a brief review of SSM. Taking a hydropower project as an example, it constructs the general framework of project risk management planning (PRMP) and establishes a risk management planning (RMP) system from the perspective of coordinating stakeholder interests. The paper provides ideas and methods for constructing RMP in a win-win situation through the practice of SSM.
Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M
2015-08-01
To describe the use and reporting of interrupted time series methods in drug utilization research. We completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English language articles through to December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
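Segmented regression, the most common method in the review, fits a level change and a slope change at the intervention point. A minimal sketch with statsmodels follows, using an invented monthly prescribing series; the HAC (Newey-West) covariance is one common way to handle the autocorrelation the review discusses.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n, policy_month = 48, 24
df = pd.DataFrame({"time": np.arange(n)})
df["step"] = (df["time"] >= policy_month).astype(int)   # level change
df["slope"] = np.maximum(0, df["time"] - policy_month)  # slope change
# Hypothetical prescriptions per 1,000 patients.
df["y"] = (100 + 0.5 * df["time"] - 8 * df["step"] - 0.6 * df["slope"]
           + rng.normal(0, 2, n))

model = smf.ols("y ~ time + step + slope", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 3})
print(model.params)  # step = immediate effect, slope = change in trend
```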
An update on technical and methodological aspects for cardiac PET applications.
Presotto, Luca; Busnardo, Elena; Gianolli, Luigi; Bettinardi, Valentino
2016-12-01
Positron emission tomography (PET) is indicated for a large number of cardiac diseases: perfusion and viability studies are commonly used to evaluate coronary artery disease; PET can also be used to assess sarcoidosis and endocarditis, as well as to investigate amyloidosis. Furthermore, a hot topic for research is plaque characterization. Most of these studies are technically very challenging. High count rates and short acquisition times characterize perfusion scans while very small targets have to be imaged in inflammation/infection and plaques examinations. Furthermore, cardiac PET suffers from respiratory and cardiac motion blur. Each type of studies has specific requirements from the technical and methodological point of view, thus PET systems with overall high performances are required. Furthermore, in the era of hybrid PET/computed tomography (CT) and PET/Magnetic Resonance Imaging (MRI) systems, the combination of complementary functional and anatomical information can be used to improve diagnosis and prognosis. Moreover, PET images can be qualitatively and quantitatively improved exploiting information from the other modality, using advanced algorithms. In this review we will report the latest technological and methodological innovations for PET cardiac applications, with particular reference to the state of the art of the hybrid PET/CT and PET/MRI. We will also report the most recent advancements in software, from reconstruction algorithms to image processing and analysis programs.
The methodological quality of systematic reviews of animal studies in dentistry.
Faggion, C M; Listl, S; Giannakopoulos, N N
2012-05-01
Systematic reviews and meta-analyses of animal studies are important for improving estimates of the effects of treatment and for guiding future clinical studies on humans. The purpose of this systematic review was to assess the methodological quality of systematic reviews and meta-analyses of animal studies in dentistry using a validated checklist. A literature search was conducted independently and in duplicate in the PubMed and LILACS databases. References in selected systematic reviews were assessed to identify other studies not captured by the electronic searches. The methodological quality of studies was assessed independently and in duplicate using the AMSTAR checklist; quality was scored as low, moderate, or high. The reviewers were calibrated before the assessment, and agreement between them was assessed using Cohen's Kappa statistic. Of 444 studies retrieved, 54 systematic reviews were selected after full-text assessment. Agreement between the reviewers was excellent. Only two studies were scored as high quality; 17 and 35 studies were scored as moderate and low quality, respectively. There is room for improvement in the methodological quality of systematic reviews of animal studies in dentistry. Checklists, such as AMSTAR, can guide researchers in planning and executing systematic reviews and meta-analyses. To determine the need for additional investigations in animals, and to provide good data for potential application in humans, such reviews should be based on animal experiments performed according to sound methodological principles. Copyright © 2011 Elsevier Ltd. All rights reserved.
An overview of the impact of rare disease characteristics on research methodology.
Whicher, Danielle; Philbin, Sarah; Aronson, Naomi
2018-01-19
About 30 million individuals in the United States are living with a rare disease, which by definition has a prevalence of 200,000 or fewer cases in the United States (National Organization for Rare Disorders, About NORD, 2016). Disease heterogeneity and geographic dispersion add to the difficulty of completing robust studies in small populations. Improving the ability to conduct research on rare diseases would have a significant impact on population health. The purpose of this paper is to raise awareness of methodological approaches that can address the challenges to conducting robust research on rare diseases. We conducted a landscape review of available methodological and analytic approaches to address the challenges of rare disease research. Our objectives were to: 1. identify algorithms for matching study design to rare disease attributes and the methodological approaches applicable to these algorithms; 2. draw inferences on how research communities and infrastructure can contribute to the efficiency of research on rare diseases; and 3. describe methodological approaches in the rare disease portfolio of the Patient-Centered Outcomes Research Institute (PCORI), a funder promoting both rare disease research and research infrastructure. We identified three algorithms for matching study design to rare disease or intervention characteristics (Gagne et al., BMJ 349:g6802, 2014; Gupta et al., J Clin Epidemiol 64:1085-1094, 2011; Cornu et al., Orphanet J Rare Dis 8:48, 2012) and summarized the applicable methodological and analytic approaches. From this literature we were also able to draw inferences on how an effective research infrastructure can set an agenda, prioritize studies, accelerate accrual, catalyze patient engagement and terminate poorly performing studies. Of the 24 rare disease projects in the PCORI portfolio, 11 are randomized controlled trials (RCTs) using standard designs. Thirteen are observational studies using case-control, prospective cohort, or natural history designs. PCORI has supported the development of 9 Patient-Powered Research Networks (PPRNs) focused on rare diseases. Matching research design to attributes of rare diseases and interventions can facilitate the completion of RCTs that are adequately powered. An effective research infrastructure can improve efficiency and avoid waste in rare disease research. Our review of the PCORI research portfolio demonstrates that it is feasible to conduct RCTs in rare disease. However, most of these studies use standard RCT designs, which suggests that a broader array of methodological approaches to RCTs, such as adaptive trials, cross-over trials, and early escape designs, could improve the productivity of robust research in rare diseases.
Bayesian Local Contamination Models for Multivariate Outliers
Page, Garritt L.; Dunson, David B.
2013-01-01
In studies where data are generated from multiple locations or sources it is common for there to exist observations that are quite unlike the majority. Motivated by the application of establishing a reference value in an inter-laboratory setting when outlying labs are present, we propose a local contamination model that is able to accommodate unusual multivariate realizations in a flexible way. The proposed method models the process level of a hierarchical model using a mixture with a parametric component and a possibly nonparametric contamination. Much of the flexibility in the methodology is achieved by allowing varying random subsets of the elements in the lab-specific mean vectors to be allocated to the contamination component. Computational methods are developed and the methodology is compared to three other possible approaches using a simulation study. We apply the proposed method to a NIST/NOAA sponsored inter-laboratory study which motivated the methodological development. PMID:24363465
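As a drastically simplified, univariate analogue of the contamination idea (not the paper's hierarchical multivariate model), the sketch below fits a two-component normal mixture with a shared mean via EM, where the wide component absorbs outlying labs; the data and the fixed variance inflation factor of 25 are invented.

```python
import numpy as np

def norm_pdf(x, m, v):
    return np.exp(-0.5 * (x - m) ** 2 / v) / np.sqrt(2 * np.pi * v)

rng = np.random.default_rng(2)
# Hypothetical lab means: most near the true value 10, two outlying labs.
y = np.concatenate([rng.normal(10.0, 0.5, 12), [14.2, 6.1]])

# EM for a contamination mixture with a shared mean:
# y ~ (1 - pi) * N(mu, s2)  +  pi * N(mu, 25 * s2)
mu, s2, pi = y.mean(), y.var(), 0.1
for _ in range(200):
    d_good = (1 - pi) * norm_pdf(y, mu, s2)
    d_bad = pi * norm_pdf(y, mu, 25 * s2)
    w = d_bad / (d_good + d_bad)           # posterior contamination prob.
    prec = (1 - w) / s2 + w / (25 * s2)    # per-lab precision weight
    mu = np.sum(prec * y) / np.sum(prec)
    s2 = np.sum((1 - w + w / 25) * (y - mu) ** 2) / len(y)
    pi = w.mean()

print(f"reference value estimate: {mu:.2f}")
print("labs flagged as contaminated:", np.where(w > 0.5)[0])
```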
1987-06-01
Evaluation and chip layout planning for VLSI digital systems. A high-level applicative (functional) language, implemented at UCLA, allows combining of… The complexity of VLSI requires the application of CAD tools at all levels of the design process. To be effective, these tools must be adaptive to the specific design. In this project we studied a design method based on the use of applicative languages.
Modelling of nuclear power plant decommissioning financing.
Bemš, J; Knápek, J; Králík, T; Hejhal, M; Kubančák, J; Vašíček, J
2015-06-01
Costs related to the decommissioning of nuclear power plants create a significant financial burden for nuclear power plant operators. This article discusses the various methodologies employed by selected European countries for financing the liabilities related to nuclear power plant decommissioning. The article also presents a methodology for allocating future decommissioning costs to the running costs of a nuclear power plant in the form of a fee imposed on each megawatt hour generated. The application of the methodology is presented in the form of a case study of a new nuclear power plant with an installed capacity of 1,000 MW. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
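The abstract does not give the fee formula; a common construction consistent with the cash-flow allocation it describes is a levelized fee, i.e., the charge per MWh whose discounted proceeds over the operating life equal the discounted decommissioning liability. A minimal sketch under invented parameters:

```python
# Levelized decommissioning fee: choose fee f so that the present value
# of fee revenues equals the present value of decommissioning outlays.
# All figures below are illustrative assumptions, not the paper's inputs.
capacity_mw = 1000.0
load_factor = 0.85
annual_mwh = capacity_mw * 8760 * load_factor
operating_years = 40
decom_cost = 1.2e9        # lump sum paid at end of operation (EUR)
discount_rate = 0.03

pv_cost = decom_cost / (1 + discount_rate) ** operating_years
pv_generation = sum(annual_mwh / (1 + discount_rate) ** t
                    for t in range(1, operating_years + 1))

fee_per_mwh = pv_cost / pv_generation
print(f"levelized decommissioning fee: {fee_per_mwh:.2f} EUR/MWh")
```

With these placeholder numbers the fee comes to roughly 2 EUR/MWh; the actual value depends entirely on the cost estimate, discount rate, and generation profile.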
[Progress in methodological characteristics of clinical practice guideline for osteoarthritis].
Xing, D; Wang, B; Lin, J H
2017-06-01
At present, several clinical practice guidelines for the treatment of osteoarthritis have been developed by institutes and societies. The ultimate purpose of developing clinical practice guidelines is to standardize the treatment of osteoarthritis and make it effective. However, the methodologies used in developing clinical practice guidelines may influence the translation and application of the guidelines in treating osteoarthritis. The present study summarizes the methodological features of individual clinical practice guidelines and presents tools for the quality evaluation of clinical practice guidelines. The limitations of current Chinese osteoarthritis guidelines are also indicated. This review might help relevant institutions improve the quality of guideline development and clinical translation.
Applied Coastal Oceanography--A Course That Integrates Science and Business.
ERIC Educational Resources Information Center
Montvilo, Jerome A.; Levin, Douglas R.
1998-01-01
Describes a course designed to teach students the fundamentals of coastal oceanography and the scientific methodologies used in studying this field. Business applications of this information also play an important role in the course. (DDR)
A Case Study of Measuring Process Risk for Early Insights into Software Safety
NASA Technical Reports Server (NTRS)
Layman, Lucas; Basili, Victor; Zelkowitz, Marvin V.; Fisher, Karen L.
2011-01-01
In this case study, we examine software safety risk in three flight hardware systems in NASA's Constellation spaceflight program. We applied our Technical and Process Risk Measurement (TPRM) methodology to the Constellation hazard analysis process to quantify the technical and process risks involving software safety in the early design phase of these projects. We analyzed 154 hazard reports and collected metrics to measure the prevalence of software in hazards and the specificity of descriptions of software causes of hazardous conditions. We found that for 49-70% of the 154 hazardous conditions, software could be a cause or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. The application of the TPRM methodology identified process risks in the application of the hazard analysis process itself that may lead to software safety risk.
Standards and Methodologies for Characterizing Radiobiological Impact of High-Z Nanoparticles
Subiel, Anna; Ashmore, Reece; Schettino, Giuseppe
2016-01-01
Research on the application of high-Z nanoparticles (NPs) in cancer treatment and diagnosis has recently been the subject of growing interest, with much promise being shown with regards to a potential transition into clinical practice. In spite of numerous publications related to the development and application of nanoparticles for use with ionizing radiation, the literature is lacking coherent and systematic experimental approaches to fully evaluate the radiobiological effectiveness of NPs, validate mechanistic models and allow direct comparison of the studies undertaken by various research groups. The lack of standards and established methodology is commonly recognised as a major obstacle for the transition of innovative research ideas into clinical practice. This review provides a comprehensive overview of radiobiological techniques and quantification methods used in in vitro studies on high-Z nanoparticles and aims to provide recommendations for future standardization for NP-mediated radiation research. PMID:27446499
ERIC Educational Resources Information Center
Vitale, Michael R.; Romance, Nancy
Adopting perspectives based on applications of artificial intelligence proven in industry, this paper discusses methodological strategies and issues that underlie the development of such software environments. The general concept of an expert system is discussed in the context of its relevance to the problem of increasing the accessibility of…
Wauters, Lauri D J; Miguel-Moragas, Joan San; Mommaerts, Maurice Y
2015-11-01
To gain insight into the methodology of different computer-aided design-computer-aided manufacturing (CAD-CAM) applications for the reconstruction of cranio-maxillo-facial (CMF) defects. We reviewed and analyzed the available literature pertaining to CAD-CAM for use in CMF reconstruction. We proposed a classification system for the techniques used to design and manufacture implants and cutting, drilling, and/or guiding templates. The system consists of 4 classes (I-IV), which combine the techniques used for both the implant and the template to most accurately describe the methodology used. Our classification system can be widely applied. It should facilitate communication and immediate understanding of the methodology of CAD-CAM applications for the reconstruction of CMF defects.
Reverse Engineering and Security Evaluation of Commercial Tags for RFID-Based IoT Applications.
Fernández-Caramés, Tiago M; Fraga-Lamas, Paula; Suárez-Albela, Manuel; Castedo, Luis
2016-12-24
The Internet of Things (IoT) is a distributed system of physical objects that requires the seamless integration of hardware (e.g., sensors, actuators, electronics) and network communications in order to collect and exchange data. IoT smart objects need to be somehow identified to determine the origin of the data and to automatically detect the elements around us. One of the best positioned technologies to perform identification is RFID (Radio Frequency Identification), which in the last years has gained a lot of popularity in applications like access control, payment cards or logistics. Despite its popularity, RFID security has not been properly handled in numerous applications. To foster security in such applications, this article includes three main contributions. First, in order to establish the basics, a detailed review of the most common flaws found in RFID-based IoT systems is provided, including the latest attacks described in the literature. Second, a novel methodology that eases the detection and mitigation of such flaws is presented. Third, the latest RFID security tools are analyzed and the methodology proposed is applied through one of them (Proxmark 3) to validate it. Thus, the methodology is tested in different scenarios where tags are commonly used for identification. In such systems it was possible to clone transponders, extract information, and even emulate both tags and readers. Therefore, it is shown that the methodology proposed is useful for auditing security and reverse engineering RFID communications in IoT applications. It must be noted that, although this paper is aimed at fostering RFID communications security in IoT applications, the methodology can be applied to any RFID communications protocol. PMID:28029119
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier
Iberdrola (Spanish utility) and Iberdrola Ingenieria (its engineering branch) have been developing, over the last two years, the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. Iberdrola has an in-house design and licensing reload methodology that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses for the Cofrentes Extended Power Up-rate, including: reactor heat balance, core and fuel performance, thermal-hydraulic stability, ECCS LOCA evaluation, transient analysis, Anticipated Transient Without Scram (ATWS) and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analyses included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case for the total loss of feedwater (TLFW) transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the methodology to the analysis of new transients. The analysis of a total loss of feedwater with the Cofrentes RETRAN model is included as an example of this process. (authors)
DOT National Transportation Integrated Search
1974-08-01
Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2010 CFR
2010-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2013 CFR
2013-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2014 CFR
2014-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2012 CFR
2012-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
43 CFR 11.83 - Damage determination phase-use value methodologies.
Code of Federal Regulations, 2011 CFR
2011-10-01
... subject to standards governing its application? (vi) Are methodological inputs and assumptions supported... used for unique or difficult design and estimating conditions. This methodology requires the construction of a simple design for which an estimate can be found and applied to the unique or difficult...
Researching the impact of oral health on diet and nutritional status: methodological issues.
Moynihan, Paula; Thomason, Mark; Walls, Angus; Gray-Donald, Katherine; Morais, Jose A; Ghanem, Henry; Wollin, Stephanie; Ellis, Janice; Steele, Jimmy; Lund, James; Feine, Jocelyne
2009-04-01
Assessment of the impact of dental function on diet and nutritional status requires robust methodologies and a standardised approach to increase accuracy of results and to facilitate cross-study comparisons. The objectives of this paper are: to report the outcomes of a consensus workshop that critically reviewed publications reporting on dietary methodologies in relation to the impact of oral health on nutrition; to highlight future directions for research; and to make recommendations for appropriate use of methodologies for future research. Data relevant to nutrition and dental status published from 1980 to 2005 in English were presented at the consensus workshop for discussion and appraisal. Relevant papers were retrieved through PubMed. Relevant texts were obtained from the library at Newcastle University, UK. A purposive sample of original articles that illustrated the application of a range of nutritional methodologies to the study of oral health impacts was identified. Original flagship texts on nutritional methodologies were reviewed. Numerous studies have shown an association between loss of teeth and inferior diet. Further research is required to elucidate the impact of novel approaches to prosthetic rehabilitation and the impact of contemporaneous dietary and dental intervention on diet, nutritional status, disease progression and quality of life. The recommendation of the consensus workshop was that future studies should adopt a comprehensive approach to the assessment of nutrition that encompasses measurement of diet, body composition, biochemical indices of intake and levels of nutrients, and functional biomarkers of disease.
Social capital: theory, evidence, and implications for oral health.
Rouxel, Patrick L; Heilmann, Anja; Aida, Jun; Tsakos, Georgios; Watt, Richard G
2015-04-01
In the last two decades, there has been increasing application of the concept of social capital in various fields of public health, including oral health. However, social capital is a contested concept with debates on its definition, measurement, and application. This study provides an overview of the concept of social capital, highlights the various pathways linking social capital to health, and discusses the potential implication of this concept for health policy. An extensive and diverse international literature has examined the relationship between social capital and a range of general health outcomes across the life course. A more limited but expanding literature has also demonstrated the potential influence of social capital on oral health. Much of the evidence in relation to oral health is limited by methodological shortcomings mainly related to the measurement of social capital, cross-sectional study designs, and inadequate controls for confounding factors. Further research using stronger methodological designs should explore the role of social capital in oral health and assess its potential application in the development of oral health improvement interventions. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
A methodology for collecting valid software engineering data
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Weiss, David M.
1983-01-01
An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To ensure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with those people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. Feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.
Boiocchi, Riccardo; Gernaey, Krist V; Sin, Gürkan
2016-10-01
A methodology is developed to systematically design the membership functions of fuzzy-logic controllers for multivariable systems. The methodology consists of a systematic derivation of the critical points of the membership functions as a function of predefined control objectives. Several constrained optimization problems corresponding to different qualitative operation states of the system are defined and solved to identify, in a consistent manner, the critical points of the membership functions for the input variables. The consistently identified critical points, together with the linguistic rules, determine the long-term reachability of the control objectives by the fuzzy-logic controller. The methodology is highlighted using a single-stage side-stream partial nitritation/Anammox reactor as a case study. As a result, a new fuzzy-logic controller for high and stable total nitrogen removal efficiency is designed. Rigorous simulations are carried out to evaluate and benchmark the performance of the controller. The results demonstrate that the novel control strategy is capable of rejecting long-term influent disturbances and can achieve a stable and high TN removal efficiency. Additionally, the controller was tested against measurement noise levels typical of wastewater sensors and showed robustness. A feedforward-feedback configuration using the present controller would give even better performance. In comparison, a previously developed fuzzy-logic controller using merely expert and intuitive knowledge performed worse. This proved the importance of using a systematic methodology for the derivation of the membership functions for multivariable systems. These results are promising for future applications of the controller in real full-scale plants. Furthermore, the methodology can be used as a tool to help systematically design fuzzy-logic control applications for other biological processes. Copyright © 2016 Elsevier Ltd. All rights reserved.
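The paper's membership functions come from constrained optimization of its control objectives; the sketch below only illustrates the mechanics being tuned, i.e., triangular membership functions whose critical points (a, b, c) would be the optimizer's outputs, combined through a small rule base with centroid defuzzification. The variable names and numbers are invented, not the paper's reactor model.

```python
import numpy as np

def trimf(x, a, b, c):
    """Triangular membership; a, b, c are the critical points."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Critical points below are placeholders for the optimizer's outputs.
def nh4_low(x):  return trimf(x, -1.0, 0.0, 5.0)
def nh4_high(x): return trimf(x, 3.0, 10.0, 10.1)

aeration = np.linspace(0.0, 1.0, 201)   # normalized manipulated variable
def air_low(u):  return trimf(u, -0.1, 0.0, 0.5)
def air_high(u): return trimf(u, 0.4, 1.0, 1.1)

def controller(nh4):
    # Rules: IF NH4 low THEN aeration low; IF NH4 high THEN aeration high.
    w_low, w_high = nh4_low(nh4), nh4_high(nh4)
    agg = np.maximum(np.minimum(w_low, air_low(aeration)),
                     np.minimum(w_high, air_high(aeration)))
    return np.sum(agg * aeration) / np.sum(agg)  # centroid defuzzification

for nh4 in (1.0, 4.0, 8.0):
    print(f"NH4 = {nh4:4.1f} mg/L -> aeration setpoint {controller(nh4):.2f}")
```

Shifting the critical points shifts when each rule dominates, which is exactly the degree of freedom the paper's constrained optimization fixes.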
NASA Astrophysics Data System (ADS)
Li, Z.; Che, W.; Frey, H. C.; Lau, A. K. H.
2016-12-01
Portable air monitors are currently being developed and used to enable a move toward exposure monitoring as opposed to fixed-site monitoring. Reliable methods for capturing spatial and temporal variability in exposure concentration are needed to obtain credible data from which to develop efficient exposure mitigation measures. However, there are few studies that quantify the validity and repeatability of the collected data. The objective of this study is to present and evaluate a collocated exposure monitoring (CEM) methodology that includes the calibration of portable air monitors against stationary reference equipment, side-by-side comparison of portable air monitors, personal or microenvironmental exposure monitoring, and the processing and interpretation of the collected data. The CEM methodology was evaluated by applying it to the portable monitors TSI DustTrak II Aerosol Monitor 8530 for fine particulate matter (PM2.5) and TSI Q-Trak model 7575 with probe model 982 for CO, CO2, temperature and relative humidity. In a school sampling campaign in Hong Kong in January and June 2015, calibrated side-by-side 1 Hz PM2.5 measurements showed good consistency between two sets of portable air monitors. Because side-by-side PM2.5 concentrations agreed within 2 percent most of the time, robust inferences could be drawn about differences between classroom and pedestrian measurements during school hours. The proposed CEM methodology can be widely applied in sampling campaigns that aim to simultaneously characterize pollutant concentrations in two or more locations or microenvironments. The further application of the CEM methodology to transportation exposure will be presented and discussed.
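A common calibration step, consistent with what the abstract describes, is a linear correction of each portable unit against the reference instrument, followed by a side-by-side agreement check; the data below are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical 1 Hz PM2.5 (ug/m3): reference instrument vs. a portable unit
# with a multiplicative bias, an offset, and noise.
reference = rng.uniform(10, 80, 600)
portable = 1.4 * reference + 5.0 + rng.normal(0, 1.5, 600)

# Calibration: fit portable readings to the reference with least squares.
slope, intercept = np.polyfit(portable, reference, 1)
calibrated = slope * portable + intercept

# Side-by-side check: relative deviation of calibrated vs. reference.
rel_dev = np.abs(calibrated - reference) / reference
print(f"within 2% of reference: {np.mean(rel_dev < 0.02):.0%} of samples")
```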
Knai, Cécile; Brusamento, Serena; Legido-Quigley, Helena; Saliba, Vanessa; Panteli, Dimitra; Turk, Eva; Car, Josip; McKee, Martin; Busse, Reinhard
2012-10-01
The use of evidence-based clinical guidelines is an essential component of chronic disease management. However, there is well-documented concern about variability in the quality of clinical guidelines, with evidence of persisting methodological shortcomings. The most widely accepted approach to assessing the quality of guidelines is the Appraisal of Guidelines for Research and Evaluation (AGREE) instrument. We have conducted a systematic review of the methodological quality (as assessed by AGREE) of clinical guidelines developed in Europe for the management of chronic diseases published since 2000. The systematic review was undertaken in accordance with the Cochrane methodology. The inclusion criteria were that studies should have appraised European clinical guidelines for certain selected chronic disorders using the AGREE instrument. We searched five databases (Cab Abstracts, EMBASE, MEDLINE, Trip and EPPI). Nine studies reported in 10 papers, analysing a total of 28 European guidelines from eight countries as well as pan-European, were included. There was considerable variation in the quality of clinical guidelines across the AGREE domains. The least well addressed domains were 'editorial independence' (with a mean domain score of 41%), 'applicability' (44%), 'stakeholder involvement' (55%), and 'rigour of development' (64%), while 'clarity of presentation' (80%) and 'scope and purpose' (84%) were less problematic. This review indicates that there is considerable scope for improvement in the methods used to develop clinical guidelines for the prevention, management and treatment of chronic diseases in Europe. Given the importance of decision support strategies such as clinical guidelines in chronic disease management, improvement measures should include the explicit and transparent involvement of key stakeholders (especially scientific experts, guideline users and methodological specialists) and consideration of the implications for guideline implementation and applicability early on in the process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Empirical Distributional Semantics: Methods and Biomedical Applications
Cohen, Trevor; Widdows, Dominic
2009-01-01
Over the past fifteen years, a range of methods have been developed that are able to learn human-like estimates of the semantic relatedness between terms from the way in which these terms are distributed in a corpus of unannotated natural language text. These methods have also been evaluated in a number of applications in the cognitive science, computational linguistics and the information retrieval literatures. In this paper, we review the available methodologies for derivation of semantic relatedness from free text, as well as their evaluation in a variety of biomedical and other applications. Recent methodological developments, and their applicability to several existing applications are also discussed. PMID:19232399
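As a minimal illustration of this family of methods (an LSA-style pipeline, one of several the review covers), the sketch below builds a term-document matrix, reduces it with truncated SVD, and scores relatedness as cosine similarity between term vectors; the toy corpus is invented.

```python
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import CountVectorizer

corpus = [
    "aspirin reduces fever and pain",
    "ibuprofen reduces pain and inflammation",
    "insulin regulates blood glucose",
    "glucose levels rise after meals",
]
vec = CountVectorizer()
X = vec.fit_transform(corpus)            # documents x terms
terms = list(vec.get_feature_names_out())

# Term vectors live in the columns; reduce them to 2 latent dimensions.
svd = TruncatedSVD(n_components=2, random_state=0)
term_vecs = svd.fit_transform(X.T)       # terms x 2

def relatedness(a, b):
    u, v = term_vecs[terms.index(a)], term_vecs[terms.index(b)]
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(relatedness("aspirin", "ibuprofen"))  # distributionally similar
print(relatedness("aspirin", "glucose"))    # distributionally distant
```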
NASA Astrophysics Data System (ADS)
Claeys, M.; Sinou, J.-J.; Lambelin, J.-P.; Todeschini, R.
2016-08-01
The nonlinear vibration response of an assembly with friction joints, named "Harmony", is studied both experimentally and numerically. The experimental results exhibit a softening effect and an increase in dissipation with excitation level. Modal interactions due to friction are also evidenced. The proposed numerical methodology brings together well-known structural dynamics methods, including finite elements, substructuring, harmonic balance and continuation methods. On the one hand, the application of this methodology proves its capacity to treat a complex system where several friction movements occur at the same time. On the other hand, the main contribution of this paper is the experimental and numerical evidence of modal interactions due to friction. The simulation methodology succeeds in reproducing complex forms of dynamic behavior such as these modal interactions.
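The paper's model couples finite elements, substructuring, harmonic balance and continuation; as a much smaller illustration of the harmonic balance step alone, the sketch below computes the single-harmonic response amplitude of a Duffing oscillator (a textbook stand-in for a softening nonlinearity, not the Harmony assembly model) across a frequency sweep.

```python
import numpy as np
from scipy.optimize import fsolve

# Duffing oscillator: x'' + 2*zeta*x' + x + eps*x**3 = F*cos(w*t).
# The single-harmonic ansatz x(t) = A*cos(w*t + phi) yields the amplitude
# equation ((1 - w**2)*A + 0.75*eps*A**3)**2 + (2*zeta*w*A)**2 = F**2.
zeta, eps, F = 0.05, -0.1, 0.2   # negative eps: softening behavior

def residual(A, w):
    return (((1 - w**2) * A + 0.75 * eps * A**3) ** 2
            + (2 * zeta * w * A) ** 2 - F**2)

A = 0.2  # crude continuation: reuse the last solution as the next guess
for w in np.linspace(0.6, 1.4, 17):
    A = float(fsolve(residual, A, args=(w,))[0])
    print(f"w = {w:.2f}  |A| = {abs(A):.3f}")
```

With a softening cubic term the peak of the resulting response curve leans toward lower frequencies, the same qualitative effect the experiments report.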
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The final report for the project comprises five volumes. This volume presents the study conclusions, summarizes the methodology used (more detail is found in Volume 3), discusses four case-study applications of the model, and contains profiles of coastal communities in an appendix.
ERIC Educational Resources Information Center
Savelyeva, Tamara
2013-01-01
This study addresses the methodological and conceptual challenges associated with the application of disconnected frameworks of organizational theory and case studies, focused on "efficiency, effectiveness, and economy" to investigate complex educational phenomena in post-Soviet higher education systems under the condition of…
ERIC Educational Resources Information Center
Zimman, Richard N.
Using ethnographic case study methodology (involving open-ended interviews, participant observation, and document analysis) theories of administrative organization, processes, and behavior were tested during a three-week observation of a model comprehensive (experimental) high school. Although the study is limited in its general application, it…
Blended Teaching and Learning in the School of Science and Technology of UniSIM
ERIC Educational Resources Information Center
Toon, Andrew John; Samir, Attallah; Kheng, Jennifer Huang Mui; Chew, Lim Kin; Vythilingam, Moorthy; Kiat, Stephen Low Wee
2009-01-01
Purpose: The purpose of this paper is to investigate the blended learning preferences under which adult students study mathematics, electronics and industry certificate examinations like project management and e-SAP (systems, applications and products). Design/methodology/approach: The study is based on four case studies in mathematics,…
A Proposal for Studying the Values/Reasoning Distinction in Moral Development and Training.
ERIC Educational Resources Information Center
Kaplan, Martin F.
Application of a common framework in studies of the development of social cognition can reduce conceptual and methodological ambiguities and enable clearer study of core issues. This paper describes the core issues and their attendant problems, outlines a model of information integration that addresses the issues, and describes some illustrative…
ERIC Educational Resources Information Center
Tingerthal, John Steven
2013-01-01
Using case study methodology and autoethnographic methods, this study examines a process of curricular development known as "Decoding the Disciplines" (Decoding) by documenting the experience of its application in a construction engineering mechanics course. Motivated by the call to integrate what is known about teaching and learning…
Strategically Focused Training in Six Sigma Way: A Case Study
ERIC Educational Resources Information Center
Pandey, Ashish
2007-01-01
Purpose: The purpose of the current study is to examine the utility of Six Sigma interventions as a performance measure and explore its applicability for making the training design and delivery operationally efficient and strategically effective. Design/methodology/approach: This is a single revelatory case study. Data were collected from multiple…
Fault Modeling of Extreme Scale Applications Using Machine Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vishnu, Abhinav; Dam, Hubertus van; Tallent, Nathan R.
Faults are commonplace in large scale systems. These systems experience a variety of faults such as transient, permanent and intermittent. Multi-bit faults are typically not corrected by the hardware, resulting in an error. This paper attempts to answer an important question: given a multi-bit fault in main memory, will it result in an application error (in which case a recovery algorithm should be invoked), or can it be safely ignored? We propose an application fault modeling methodology to answer this question. Given a fault signature (a set of attributes comprising system and application state), we use machine learning to create a model which predicts whether a multi-bit permanent/transient main memory fault will likely result in an error. We present the design elements, such as the fault injection methodology for covering important data structures, the application and system attributes which should be used for learning the model, the supervised learning algorithms (and potentially ensembles), and important metrics. Lastly, we use three applications (NWChem, LULESH and SVM) as examples to demonstrate the effectiveness of the proposed fault modeling methodology.
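A minimal sketch of the kind of fault model described: train a classifier on fault signatures (system and application attributes at injection time) labeled by whether the injected multi-bit fault produced an application error. The features, data, and model choice below are assumptions for illustration, not the paper's exact setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 2000
# Hypothetical fault signatures: [bits flipped, time since last write (s),
# structure is read-only (0/1), fraction of structure still to be read].
X = np.column_stack([
    rng.integers(2, 9, n),
    rng.exponential(5.0, n),
    rng.integers(0, 2, n),
    rng.uniform(0, 1, n),
])
# Hypothetical ground truth: faults in data read again soon tend to error.
y = ((X[:, 3] > 0.5) & (X[:, 1] < 5.0)).astype(int)
y = np.logical_xor(y, rng.random(n) < 0.05).astype(int)  # label noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"holdout accuracy: {clf.score(X_te, y_te):.2f}")
# At runtime, predict_proba on a new fault signature would decide whether
# to invoke recovery or safely ignore the fault.
```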
Collecting and validating experiential expertise is doable but poses methodological challenges.
Burda, Marika H F; van den Akker, Marjan; van der Horst, Frans; Lemmens, Paul; Knottnerus, J André
2016-04-01
To give an overview of important methodological challenges in collecting, validating, and further processing experiential expertise, and of ways to address these challenges. Based on our own experiences in studying the concept, operationalization, and contents of experiential expertise, we have formulated methodological issues regarding the inventory and application of experiential expertise. The methodological challenges can be categorized into six developmental research stages: the conceptualization of experiential expertise, methods to harvest experiential expertise, the validation of experiential expertise, evaluation of its effectiveness, how to translate experiential expertise into acceptable guidelines, and how to implement these. The description of methodological challenges and ways to handle them is illustrated using diabetes mellitus as an example. Experiential expertise can be defined and operationalized in terms of successful illness-related behaviors and translated into recommendations regarding life domains. Pathways have been identified to bridge the gaps between the world of patients' daily lives and the medical world. Copyright © 2016 Elsevier Inc. All rights reserved.
Luzardo, Octavio P; Almeida-González, Maira; Ruiz-Suárez, Norberto; Zumbado, Manuel; Henríquez-Hernández, Luis A; Meilán, María José; Camacho, María; Boada, Luis D
2015-09-01
Pesticides are frequently responsible for human poisonings, and information on the substance involved is often lacking. The great variety of pesticides that could be responsible for an intoxication makes it necessary to develop powerful and versatile analytical methodologies that allow the identification of the unknown toxic substance. Here we developed a methodology for the simultaneous identification and quantification of 109 highly toxic pesticides in human blood. The application of this analytical scheme would help minimize the cost of this type of chemical identification while maximizing the chances of identifying the pesticide involved. In the methodology we present here, we use liquid-liquid extraction, followed by a single purification step, and quantitation of analytes by a combination of liquid and gas chromatography, both coupled to triple-quadrupole mass spectrometry operated in multiple reaction monitoring mode. The methodology has been fully validated, and its applicability has been demonstrated in two recent cases involving one self-poisoning fatality and one non-fatal homicidal attempt. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
Setnik, Beatrice; Schoedel, Kerri A; Levy-Cooperman, Naama; Shram, Megan; Pixton, Glenn C; Roland, Carl L
With the development of opioid abuse-deterrent formulations (ADFs), there is a need to conduct well-designed human abuse potential studies to evaluate the effectiveness of their deterrent properties. Although these types of studies have been conducted for many years, largely to evaluate inherent abuse potential of a molecule and inform drug scheduling, methodological approaches have varied across studies. The focus of this review is to describe current "best practices" and methodological adaptations required to assess abuse-deterrent opioid formulations for regulatory submissions. A literature search was conducted in PubMed® to review methodological approaches (study conduct and analysis) used in opioid human abuse potential studies. Search terms included a combination of "opioid," "opiate," "abuse potential," "abuse liability," "liking," AND "pharmacodynamic," and only studies that evaluated single doses of opioids in healthy, nondependent individuals with or without prior opioid experience were included. Seventy-one human abuse potential studies meeting the prespecified criteria were identified, of which 21 studies evaluated a purported opioid ADF. Based on these studies, key methodological considerations were reviewed and summarized according to participant demographics, study prequalification, comparator and dose selection, route of administration and drug manipulation, study blinding, outcome measures and training, safety, and statistical analyses. The authors recommend careful consideration of key elements (eg, a standardized definition of a "nondependent recreational user"), as applicable, and offer key principles and "best practices" when conducting human abuse potential studies for opioid ADFs. Careful selection of appropriate study conditions is dependent on the type of ADF technology being evaluated.
Prosser, Diann J.; Nagel, Jessica L.; Marban, Paul; Ze, Luo; Day, Daniel D.; Erwin, R. Michael
2017-01-01
In recent decades, there has been increasing interest in the application of ecological indices to assess ecosystem condition in response to anthropogenic activities. An Index of Waterbird Community Integrity was previously developed for the Chesapeake Bay, USA. However, the scoring criteria were not defined well enough to generate scores for new species that were not observed in the original study. The goal of this study was to explicitly define the scoring criteria for the existing index and to develop index scores for all waterbirds of the Chesapeake Bay. The standardized index was then applied to a case study investigating the relationship between waterbird community integrity and shoreline development during late summer and late fall (2012–2014) using an alternative approach to survey methodology, which allowed for greater area coverage compared to the approach used in the original study. Index scores for both seasons were negatively related to the percentage of developed shorelines. Providing these updated tools using the detailed scoring system will facilitate future application to new species or development of the index in other estuaries worldwide. This methodology allows for consistent cross-study comparisons and can be combined with other community integrity indices, allowing for more effective estuarine management.
Evolution of 3-D geologic framework modeling and its application to groundwater flow studies
Blome, Charles D.; Smith, David V.
2012-01-01
In this Fact Sheet, the authors discuss the evolution of project 3-D subsurface framework modeling, research in hydrostratigraphy and airborne geophysics, and methodologies used to link geologic and groundwater flow models.
THE EVOLUTION OF ATOMIC SPECTROSCOPY IN MEASURING TOXIC CONTAMINANTS
Three decades of study of environmental conditions necessary for the protection of freshwater aquatic life have been limited by the development and application of analytical methodology utilizing atomic absorption, atomic fluorescence, and atomic emission spectroscopy.
The...
Studies to determine the effectiveness of longitudinal channelizing devices in work zones.
DOT National Transportation Integrated Search
2011-01-01
This report describes the methodology and results of analyses performed to determine whether the following longitudinal channelizing device (LCD) applications improve the traffic safety and operations of work zones relative to the use of standard ...
Application of the Delphi technique in healthcare maintenance.
Njuangang, Stanley; Liyanage, Champika; Akintoye, Akintola
2017-10-09
Purpose The purpose of this paper is to examine the research design, issues and considerations in the application of the Delphi technique to identify, refine and rate the critical success factors and performance measures in maintenance-associated infections. Design/methodology/approach In-depth literature review through the application of open and axial coding was applied to formulate the interview and research questions. These were used to conduct an exploratory case study of two healthcare maintenance managers, randomly selected from two National Health Service Foundation Trusts in England. The results of the exploratory case study provided the rationale for the application of the Delphi technique in this research. The different processes in the application of the Delphi technique in healthcare research are examined thoroughly. Findings This research demonstrates the need to apply and integrate different research methods to enhance the validity of the Delphi technique. The rationale for the application of the Delphi technique in this research is that some healthcare maintenance managers lack knowledge about basic infection control (IC) principles to make hospitals safe for patient care. The result of the first round of the Delphi exercise is a useful contribution in its own right. It identified a number of salient issues and differences in the opinions of the Delphi participants, notably between healthcare maintenance managers and members of the infection control team. It also resulted in useful suggestions and comments to improve the quality and presentation of the second- and third-round Delphi instruments. Practical implications This research provides a research methodology that can be adopted by researchers investigating new and emerging issues in the healthcare sector. As this research demonstrates, the Delphi technique is relevant in soliciting expert knowledge and opinion to identify performance measures to control maintenance-associated infections in hospitals. The methodology provided here could be applied by other researchers elsewhere to probe, investigate and generate rich information about new and emerging healthcare research topics. Originality/value The authors demonstrate how different research methods can be integrated to enhance the validity of the Delphi technique. For example, the results of an exploratory case study provided the rationale for the application of the Delphi technique investigating the key performance measures in maintenance-associated infections. The different processes involved in the application of the Delphi technique are also carefully explored and discussed in depth.
Sparks, A N; Gadal, L; Ni, X
2015-08-01
The primary Lepidoptera pests of sweet corn (Zea mays L. convar. saccharata) in Georgia are the corn earworm, Helicoverpa zea (Boddie), and the fall armyworm, Spodoptera frugiperda (J. E. Smith). Management of these pests typically requires multiple insecticide applications from first silking until harvest, with commercial growers frequently spraying daily. This level of insecticide use presents problems for small growers, particularly for "pick-your-own" operations. Injection of oil into the corn ear silk channel 5-8 days after silking initiation has been used to suppress damage by these insects. Initial work with this technique in Georgia provided poor results. Subsequently, a series of experiments was conducted to evaluate the efficacy of silk channel injections as an application methodology for insecticides. A single application of synthetic insecticide, at greatly reduced per acre rates compared with common foliar applications, provided excellent control of Lepidoptera insects attacking the ear tip and suppressed damage by sap beetles (Nitidulidae). While this methodology is labor-intensive, it requires a single application of insecticide at reduced rates applied ∼2 wk prior to harvest, compared with potential daily applications at full rates up to the day of harvest with foliar insecticide applications. This methodology is not likely to eliminate the need for foliar applications because of other insect pests which do not enter through the silk channel or are not affected by the specific selective insecticide used in the silk channel injection, but would greatly reduce the number of applications required. This methodology may prove particularly useful for small acreage growers. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
2013-02-07
specific biosurveillance activities as well as clinical applications and alternative versions preformatted and categorized as ‘high-tech’ and ‘low-tech’ and ... methodologies. Application for patent protection of this DoD intellectual property is underway. Leishmaniasis ... LHL assay and the need to develop novel and unique sample preparation methodologies.
French, Michael T; Salomé, Helena J; Sindelar, Jody L; McLellan, A Thomas
2002-04-01
To provide detailed methodological guidelines for using the Drug Abuse Treatment Cost Analysis Program (DATCAP) and Addiction Severity Index (ASI) in a benefit-cost analysis of addiction treatment. A representative benefit-cost analysis of three outpatient programs was conducted to demonstrate the feasibility and value of the methodological guidelines. Procedures are outlined for using resource use and cost data collected with the DATCAP. Techniques are described for converting outcome measures from the ASI to economic (dollar) benefits of treatment. Finally, principles are advanced for conducting a benefit-cost analysis and a sensitivity analysis of the estimates. The DATCAP was administered at three outpatient drug-free programs in Philadelphia, PA, for 2 consecutive fiscal years (1996 and 1997). The ASI was administered to a sample of 178 treatment clients at treatment entry and at 7 months postadmission. The DATCAP and ASI appear to have significant potential for contributing to an economic evaluation of addiction treatment. The benefit-cost analysis and subsequent sensitivity analysis all showed that total economic benefit was greater than total economic cost at the three outpatient programs, but this representative application is meant to stimulate future economic research rather than to justify treatment per se. This study used previously validated, research-proven instruments and methods to perform a practical benefit-cost analysis of real-world treatment programs. The study demonstrates one way to combine economic and clinical data and offers a methodological foundation for future economic evaluations of addiction treatment.
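Once DATCAP costs and monetized ASI outcomes are in hand, the benefit-cost arithmetic itself is straightforward. A minimal sketch with hypothetical per-client figures (not the Philadelphia estimates), including a simple one-way sensitivity analysis of the benefit-cost ratio:

```python
# Hypothetical annual per-client economic cost of treatment (DATCAP-style).
annual_cost = 2500.0

# Hypothetical monetized benefits derived from ASI outcome changes.
benefits = {
    "crime_reduction": 3100.0,
    "earnings_gain": 1200.0,
    "health_care_savings": 800.0,
}
total_benefit = sum(benefits.values())
print(f"Net benefit: {total_benefit - annual_cost:.0f}")
print(f"Benefit-cost ratio: {total_benefit / annual_cost:.2f}")

# One-way sensitivity analysis: vary each benefit component by +/-25%.
for name, value in benefits.items():
    for factor in (0.75, 1.25):
        adjusted_total = total_benefit - value + value * factor
        print(f"{name} x{factor}: B/C = {adjusted_total / annual_cost:.2f}")
```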
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jetter, R. I.; Messner, M. C.; Sham, T. -L.
The goal of the proposed integrated Elastic Perfectly-Plastic (EPP) and Simplified Model Test (SMT) methodology is to incorporate an SMT-data-based approach for creep-fatigue damage evaluation into the EPP methodology to avoid the separate evaluation of creep and fatigue damage and eliminate the requirement for stress classification in current methods, thus greatly simplifying evaluation of elevated temperature cyclic service. This methodology should minimize over-conservatism while properly accounting for localized defects and stress risers. To support the implementation of the proposed methodology and to verify the applicability of the code rules, analytical studies and evaluation of thermomechanical test results continued in FY17. This report presents the results of those studies. An EPP strain limits methodology assessment was based on recent two-bar thermal ratcheting test results on 316H stainless steel in the temperature range of 405 to 705°C. Strain range predictions from the EPP evaluation of the two-bar tests were also evaluated and compared with the experimental results. The role of sustained primary loading on cyclic life was assessed using the results of pressurized SMT data from tests on Alloy 617 at 950°C. A viscoplastic material model was used in an analytic simulation of two-bar tests to compare with EPP strain limits assessments using isochronous stress strain curves that are consistent with the viscoplastic material model. A finite element model of a prior 304H stainless steel Oak Ridge National Laboratory (ORNL) nozzle-to-sphere test was developed and used for EPP strain limits and creep-fatigue code case damage evaluations. A theoretical treatment of a recurring issue with convergence criteria for plastic shakedown illustrated the role of computer machine precision in EPP calculations.
Analysis of SBIR phase I and phase II review results at the National Institutes of Health.
Vener, K J; Calkins, B M
1991-09-01
A cohort of phase I and phase II summary statements for SBIR grant applications was evaluated to determine the strengths and weaknesses in approved and disapproved applications. Outcome variables (disapproval or unfunded status) were examined with respect to exposure variables (strengths or shortcomings). Logistic regression models were developed for comparisons to measure the predictive value of shortcomings and strengths for the outcomes. Disapproved phase I results were compared with an earlier 1985 study. Although the magnitude of the frequencies of shortcomings was greater in the present study, the relative rankings within shortcoming class were more alike than different. Also, the frequencies of shortcomings were, with one exception, not significantly different in the two studies. Differences in the summary statement review may have accounted for some differences observed between the 1985 data and results of the present study. Comparisons of Approved/Disapproved and Approved-Unfunded/Funded yielded the following observations. For phase I applicants, a lack of a clearly stated, testable hypothesis, a poorly qualified or described investigative team, and inadequate methodological approaches contributed significantly (in that order) to a rating of disapproval. A critical flaw for phase II proposals was failure to accomplish the objectives of the phase I study. Methodological issues also dominate the distinctions in both comparison groups. A clear result of the data presented here and that published previously is that SBIR applicants need continuing assistance to improve their chances of success. These results should serve as a guide to assist NIH staff as they provide information to prospective applicants focusing on key elements of the application. A continuing review of the SBIR program would be helpful to evaluate the quality of the submitted science.
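A logistic model of the kind described regresses a binary review outcome on indicators for flagged shortcomings, with exponentiated coefficients read as odds ratios. A minimal sketch on simulated data; the shortcoming categories and effect sizes are invented, not the NIH estimates:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200

# Indicator variables: [no testable hypothesis, weak team, weak methods]
X = rng.integers(0, 2, size=(n, 3))

# Simulate disapproval with assumed log-odds effects for each shortcoming.
log_odds = -1.5 + 1.2 * X[:, 0] + 0.9 * X[:, 1] + 1.0 * X[:, 2]
disapproved = rng.random(n) < 1.0 / (1.0 + np.exp(-log_odds))

model = LogisticRegression().fit(X, disapproved)
print("Odds ratios:", np.exp(model.coef_[0]))  # predictive value of each flaw
```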
75 FR 32195 - Procedures and Costs for Use of the Research Data Center
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-07
... related variables are published so that outside analysts may conduct original research and special studies... and the proposals should address why the requested data are needed for the proposed study. Overly... their application does not constitute endorsement by NCHS of the substantive, methodological...
Four receptor-oriented source apportionment models were applied to personal exposure measurements for toxic volatile organic compounds (VOCs). The measurements are from the total exposure assessment methodology studies conducted from 1980 to 1984 in New Jersey (NJ) and Califor...
Class Extraction and Classification Accuracy in Latent Class Models
ERIC Educational Resources Information Center
Wu, Qiong
2009-01-01
Despite the increasing popularity of latent class models (LCM) in educational research, methodological studies have not yet accumulated much information on the appropriate application of this modeling technique, especially with regard to requirement on sample size and number of indicators. This dissertation study represented an initial attempt to…
The Application of Survival Analysis to the Study of Psychotherapy Termination
ERIC Educational Resources Information Center
Corning, Alexadra F.; Malofeeva, Elena V.
2004-01-01
The state of the psychotherapy termination literature to date might best be characterized as inconclusive. Despite decades of studies, almost no predictors of premature termination have emerged consistently. An examination of this literature reveals a number of recurrent methodological-analytical problems that likely have contributed substantially…
Engaging or Distracting: Children's Tablet Computer Use in Education
ERIC Educational Resources Information Center
McEwen, Rhonda N.; Dubé, Adam K.
2015-01-01
Communications studies and psychology offer analytical and methodological tools that when combined have the potential to bring novel perspectives on human interaction with technologies. In this study of children using simple and complex mathematics applications on tablet computers, cognitive load theory is used to answer the question: how…
The Relevancy of Large-Scale, Quantitative Methodologies in Middle Grades Education Research
ERIC Educational Resources Information Center
Mertens, Steven B.
2006-01-01
This article examines the relevancy of large-scale, quantitative methodologies in middle grades education research. Based on recommendations from national advocacy organizations, the need for more large-scale, quantitative research, combined with the application of more rigorous methodologies, is presented. Subsequent sections describe and discuss…
48 CFR 1552.215-72 - Instructions for the Preparation of Proposals.
Code of Federal Regulations, 2011 CFR
2011-10-01
... used. If escalation is included, state the degree (percent) and methodology. The methodology shall.... If so, state the number required, the professional or technical level and the methodology used to... for which the salary is applicable; (C) List of other research Projects or proposals for which...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sandor, Debra; Chung, Donald; Keyser, David
This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wall, Thomas; Trail, Jessica; Gevondyan, Erna
During times of crisis, communities and regions rely heavily on critical infrastructure systems to support their emergency management response and recovery activities. Therefore, the resilience of critical infrastructure systems to crises is a pivotal factor in a community's overall resilience. Critical infrastructure resilience can be influenced by many factors, including State policies, which are not always uniform in their structure or application across the United States and were identified by the U.S. Department of Homeland Security as an area of particular interest with respect to their influence on the resilience of critical infrastructure systems. This study focuses on developing an analytical methodology to assess links between policy and resilience, and applies that methodology to critical infrastructure in the Transportation Systems Sector. Specifically, this study seeks to identify potentially influential linkages between State transportation capital funding policies and the resilience of bridges located on roadways that are under the management of public agencies. This study yielded notable methodological outcomes, including the general capability of the analytical methodology to yield, in the case of some States, significant results connecting State policies with critical infrastructure resilience, with the suggestion that further refinement of the methodology may be beneficial.
Liese, Angela D; Crandell, Jamie L; Tooze, Janet A; Kipnis, Victor; Bell, Ronny; Couch, Sarah C; Dabelea, Dana; Crume, Tessa L; Mayer-Davis, Elizabeth J
2015-08-14
The SEARCH Nutrition Ancillary Study aims to investigate the role of dietary intake in the development of long-term complications of type 1 diabetes in youth, and to capitalise on measurement error (ME) adjustment methodology. Using the National Cancer Institute (NCI) method for episodically consumed foods, we evaluated the relationship between sugar-sweetened beverage (SSB) intake and cardiovascular risk factor profile, with the application of ME adjustment methodology. The calibration sample included 166 youth with two FFQs and three 24-h dietary recalls within 1 month. The full sample included 2286 youth with type 1 diabetes. SSB intake was significantly associated with higher TAG, total and LDL-cholesterol concentrations, after adjusting for energy, age, diabetes duration, race/ethnicity, sex and education. The estimated effect size was larger (model coefficients increased approximately 3-fold) after the application of the NCI method than without adjustment for ME. Compared with individuals consuming one serving of SSB every 2 weeks, those who consumed one serving of SSB every 2 d had 3.7 mg/dl (0.04 mmol/l) higher TAG concentrations and 4.0 mg/dl (0.10 mmol/l) higher total cholesterol and LDL-cholesterol concentrations, after adjusting for ME and covariates. SSB intake was not associated with measures of adiposity and blood pressure. Our findings suggest that SSB intake is significantly related to increased lipid levels in youth with type 1 diabetes, and that estimates of the effect size of SSB on lipid levels are severely attenuated in the presence of ME. Future studies in youth with diabetes should consider a design that will allow for adjustment for ME when studying the influence of diet on health status.
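The attenuation reported here, and its correction, can be illustrated with a simple regression-calibration simulation. This is a sketch of the general principle only; the NCI method for episodically consumed foods additionally models the probability and amount of consumption, which is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000

# Simulated "true" SSB intake and a lipid outcome with an assumed slope of 8.
true_intake = rng.gamma(shape=2.0, scale=1.0, size=n)
triglycerides = 100.0 + 8.0 * true_intake + rng.normal(0.0, 15.0, size=n)

# Self-reported intake carries random measurement error, attenuating the slope.
reported = true_intake + rng.normal(0.0, 1.0, size=n)

def est_slope(x, y):
    return np.polyfit(x, y, 1)[0]

naive = est_slope(reported, triglycerides)  # biased toward zero
# Reliability ratio lambda; in practice estimated from replicate recalls,
# as in the 166-youth calibration sample.
lam = np.var(true_intake) / np.var(reported)
print(f"naive={naive:.2f}, corrected={naive / lam:.2f} (truth = 8)")
```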
NASA Astrophysics Data System (ADS)
Velasco-Forero, Carlos A.; Sempere-Torres, Daniel; Cassiraga, Eduardo F.; Jaime Gómez-Hernández, J.
2009-07-01
Quantitative estimation of rainfall fields has been a crucial objective from early studies of the hydrological applications of weather radar. Previous studies have suggested that flow estimations are improved when radar and rain gauge data are combined to estimate input rainfall fields. This paper reports new research carried out in this field. Classical approaches for the selection and fitting of a theoretical correlogram (or semivariogram) model (needed to apply geostatistical estimators) are avoided in this study. Instead, a non-parametric technique based on FFT is used to obtain two-dimensional positive-definite correlograms directly from radar observations, dealing with both the natural anisotropy and the temporal variation of the spatial structure of the rainfall in the estimated fields. Because these correlation maps can be automatically obtained at each time step of a given rainfall event, this technique might easily be used in operational (real-time) applications. This paper describes the development of the non-parametric estimator exploiting the advantages of FFT for the automatic computation of correlograms and provides examples of its application on a case study using six rainfall events. This methodology is applied to three different alternatives to incorporate the radar information (as a secondary variable), and a comparison of performances is provided. In particular, their ability to reproduce in estimated rainfall fields (i) the rain gauge observations (in a cross-validation analysis) and (ii) the spatial patterns of radar fields are analyzed. Results seem to indicate that the methodology of kriging with external drift (KED), in combination with the technique of automatically computing 2-D spatial correlograms, provides merged rainfall fields in good agreement with rain gauges and closest to the spatial tendencies observed in the radar rainfall fields, when compared with the other alternatives analyzed.
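The automatic correlogram computation exploits the Wiener-Khinchin relation: the autocovariance of a field is the inverse FFT of its power spectrum, which yields a positive-definite correlation map by construction. A minimal 2-D sketch on a synthetic rain field; the normalization and preprocessing choices are assumptions, not the paper's exact procedure:

```python
import numpy as np

def correlogram_2d(field: np.ndarray) -> np.ndarray:
    """Non-parametric 2-D correlogram via FFT (Wiener-Khinchin theorem)."""
    anomaly = field - field.mean()
    power = np.abs(np.fft.fft2(anomaly)) ** 2        # power spectrum
    acov = np.fft.ifft2(power).real / anomaly.size   # circular autocovariance
    corr = acov / acov.flat[0]                       # normalize: lag (0,0) = 1
    return np.fft.fftshift(corr)                     # center the zero lag

# Hypothetical radar-derived rain-rate field (mm/h) on a 64x64 grid.
radar_field = np.random.default_rng(2).gamma(2.0, 1.5, size=(64, 64))
corr_map = correlogram_2d(radar_field)               # anisotropy visible directly
```

Because the map is recomputed from the radar image at every time step, the spatial structure fed to the kriging estimator tracks the evolving anisotropy of the rain field with no manual variogram fitting.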
Applying Case-Based Method in Designing Self-Directed Online Instruction: A Formative Research Study
ERIC Educational Resources Information Center
Luo, Heng; Koszalka, Tiffany A.; Arnone, Marilyn P.; Choi, Ikseon
2018-01-01
This study investigated the case-based method (CBM) instructional-design theory and its application in designing self-directed online instruction. The purpose of this study was to validate and refine the theory for a self-directed online instruction context. Guided by formative research methodology, this study first developed an online tutorial…
ERIC Educational Resources Information Center
Nonnenmacher, Alexandra; Friedrichs, Jurgen
2013-01-01
To explain country differences in an analytical or structural dependent variable, the application of a macro-micro-model containing contextual hypotheses is necessary. Our methodological study examines whether empirical studies apply such a model. We propose that a theoretical base for country differences is well described in multilevel studies,…
NASA Astrophysics Data System (ADS)
Erduran, Sibel; Simon, Shirley; Osborne, Jonathan
2004-11-01
This paper reports some methodological approaches to the analysis of argumentation discourse developed as part of the two-and-a-half year project titled "Enhancing the Quality of Argument in School Science" supported by the Economic and Social Research Council in the United Kingdom. In this project researchers collaborated with middle-school science teachers to develop models of instructional activities in an effort to make argumentation a component of instruction. We begin the paper with a brief theoretical justification for why we consider argumentation to be of significance to science education. We then contextualize the use of Toulmin's Argument Pattern in the study of argumentation discourse and provide a justification for the methodological outcomes our approach generates. We illustrate how our work refines and develops research methodologies in argumentation analysis. In particular, we present two methodological approaches to the analysis of argumentation in whole-class as well as small-group student discussions. For each approach, we illustrate our coding scheme and some results as well as how our methodological approach has enabled our inquiry into the quality of argumentation in the classroom. We conclude with some implications for future research in argumentation in science education.
Transportation Systems Evaluation
NASA Technical Reports Server (NTRS)
Fanning, M. L.; Michelson, R. A.
1972-01-01
A methodology for the analysis of transportation systems consisting of five major interacting elements is reported. The analysis begins with the causes of travel demand: geographic, economic, and demographic characteristics as well as attitudes toward travel. Through the analysis, the interaction of these factors with the physical and economic characteristics of the transportation system is determined. The result is an evaluation of the system from the point of view of both passenger and operator. The methodology is applicable to intraurban transit systems as well as major airlines. Applications of the technique to the analysis of a PRT system and a study of intraurban air travel are given. In the discussion several unique models or techniques are mentioned: passenger preference modeling, an integrated intraurban transit model, and a series of models to perform airline analysis.
A Robustness Testing Campaign for IMA-SP Partitioning Kernels
NASA Astrophysics Data System (ADS)
Grixti, Stephen; Lopez Trecastro, Jorge; Sammut, Nicholas; Zammit-Mangion, David
2015-09-01
With time and space partitioned architectures becoming increasingly appealing to the European space sector, the dependability of partitioning kernel technology is a key factor to its applicability in European Space Agency projects. This paper explores the potential of the data type fault model, which injects faults through the Application Program Interface, in partitioning kernel robustness testing. This fault injection methodology has been tailored to investigate its relevance in uncovering vulnerabilities within partitioning kernels and potentially contributing towards fault removal campaigns within this domain. This is demonstrated through a robustness testing case study of the XtratuM partitioning kernel for SPARC LEON3 processors. The robustness campaign exposed a number of vulnerabilities in XtratuM, exhibiting the potential benefits of using such a methodology for the robustness assessment of partitioning kernels.
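In the data-type fault model, faults are injected by substituting invalid or extreme values for Application Program Interface parameters and observing whether the kernel rejects them cleanly. A minimal sketch of such a campaign; the `create_partition` call is a hypothetical stand-in, since XtratuM's actual hypercall interface is in C and is not reproduced here:

```python
import itertools

# Representative data-type faults: wrong types, boundary and degenerate values.
FAULT_VALUES = [None, -1, 2**31, float("nan"), "", [], object()]

def create_partition(name, memory_kb):
    """Stand-in for the system under test; a real campaign would invoke
    the partitioning kernel's API binding instead."""
    if not isinstance(name, str) or not isinstance(memory_kb, int) or memory_kb <= 0:
        raise ValueError("invalid argument rejected")
    return {"name": name, "memory_kb": memory_kb}

def robustness_campaign():
    results = []
    for position, fault in itertools.product(range(2), FAULT_VALUES):
        args = ["partA", 512]          # nominal arguments
        args[position] = fault         # inject one fault per call
        try:
            create_partition(*args)
            outcome = "accepted"       # silent acceptance may flag a vulnerability
        except Exception as exc:
            outcome = f"rejected ({type(exc).__name__})"
        results.append((position, repr(fault), outcome))
    return results

for row in robustness_campaign():
    print(row)
```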
NASA Astrophysics Data System (ADS)
Dionne, J. P.; Levine, J.; Makris, A.
2018-01-01
To design the next generation of blast mitigation helmets that offer increasing levels of protection against explosive devices, manufacturers must be able to rely on appropriate test methodologies and human surrogates that will differentiate the performance level of various helmet solutions and ensure user safety. Ideally, such test methodologies and associated injury thresholds should be based on widely accepted injury criteria relevant within the context of blast. Unfortunately, even though significant research has taken place over the last decade in the area of blast neurotrauma, there currently exists no agreement in terms of injury mechanisms for blast-induced traumatic brain injury. In absence of such widely accepted test methods and injury criteria, the current study presents a specific blast test methodology focusing on explosive ordnance disposal protective equipment, involving the readily available Hybrid III mannequin, initially developed for the automotive industry. The unlikely applicability of the associated brain injury criteria (based on both linear and rotational head acceleration) is discussed in the context of blast. Test results encompassing a large number of blast configurations and personal protective equipment are presented, emphasizing the possibility to develop useful correlations between blast parameters, such as the scaled distance, and mannequin engineering measurements (head acceleration). Suggestions are put forward for a practical standardized blast testing methodology taking into account limitations in the applicability of acceleration-based injury criteria as well as the inherent variability in blast testing results.
Stoop, Rahel; Clijsen, Ron; Leoni, Diego; Soldini, Emiliano; Castellini, Greta; Redaelli, Valentina; Barbero, Marco
2017-08-01
The methodological quality of controlled clinical trials (CCTs) of physiotherapeutic treatment modalities for myofascial trigger points (MTrP) has not been investigated yet. The aims were to assess the methodological quality of CCTs of physiotherapy treatments for MTrPs and to demonstrate any increase in quality over time. Systematic review. A systematic search was conducted in two databases, the Physiotherapy Evidence Database (PEDro) and the Medical Literature Analysis and Retrieval System Online (MEDLINE), using the same keywords and selection procedure corresponding to pre-defined inclusion criteria. The methodological quality, assessed by the 11-item PEDro scale, served as the outcome measure. The CCTs had to compare at least two interventions, where one intervention had to lie within the scope of physiotherapy. Participants had to be diagnosed with myofascial pain syndrome or trigger points (active or latent). A total of n = 230 studies was analysed. The cervico-thoracic region was the most frequently treated body part (n = 143). Application of electrophysical agents was the most frequent intervention. The average methodological quality reached 5.5 on the PEDro scale. A total of n = 6 studies scored 9. The average PEDro score increased by 0.7 points per decade between 1978 and 2015. The average PEDro score of CCTs for MTrP treatments does not reach the cut-off of 6 proposed for moderate to high methodological quality. Nevertheless, a promising trend towards an increase in the average methodological quality of CCTs for MTrPs was recorded. More high-quality CCT studies with thorough research procedures are recommended to enhance methodological quality. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
Tretinoin peel: a critical view
Sumita, Juliana Mayumi; Leonardi, Gislaine Ricci; Bagatin, Ediléia
2017-01-01
The tretinoin peel, also known as retinoic acid peel, is a superficial peeling often performed in dermatological clinics in Brazil. The first study on it was published in 2001 by Cuce et al. as a treatment option for melasma. Since then, other studies have reported its applicability with reasonable methodology, although without a consistent scientific background and consensus. Topical tretinoin is used for the treatment of various dermatoses such as acne, melasma, scars, skin aging and non-melanoma skin cancer. The identification of retinoid cellular receptors was reported in 1987, but a direct cause-effect relation has not been established. This article reviews studies evaluating the use of topical tretinoin as an agent for superficial chemical peels. Most of them have shown benefits in the treatment of melasma and skin aging. Better-quality methodology in study design, considering indication and intervention, is indispensable regarding concentration, vehicle and treatment regimen (interval and number of applications). Additionally, more controlled and randomized studies comparing treatment with tretinoin cream versus its use as a peeling agent, mainly for melasma and photoaging, are necessary. PMID:29186249
Kalil, Andre C; Sun, Junfeng
2014-10-01
To review Bayesian methodology and its utility to clinical decision making and research in the critical care field. Clinical, epidemiological, and biostatistical studies on Bayesian methods in PubMed and Embase from their inception to December 2013. Bayesian methods have been extensively used by a wide range of scientific fields, including astronomy, engineering, chemistry, genetics, physics, geology, paleontology, climatology, cryptography, linguistics, ecology, and computational sciences. The application of medical knowledge in clinical research is analogous to the application of medical knowledge in clinical practice. Bedside physicians have to make most diagnostic and treatment decisions on critically ill patients every day without clear-cut evidence-based medicine (more subjective than objective evidence). Similarly, clinical researchers have to make most decisions about trial design with limited available data. Bayesian methodology allows both subjective and objective aspects of knowledge to be formally measured and transparently incorporated into the design, execution, and interpretation of clinical trials. In addition, various degrees of knowledge and several hypotheses can be tested at the same time in a single clinical trial without the risk of multiplicity. Notably, the Bayesian technology is naturally suited for the interpretation of clinical trial findings for the individualized care of critically ill patients and for the optimization of public health policies. We propose that the application of the versatile Bayesian methodology in conjunction with the conventional statistical methods is not only ripe for actual use in critical care clinical research but it is also a necessary step to maximize the performance of clinical trials and its translation to the practice of critical care medicine.
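The mechanics described, formally folding prior knowledge into trial data, are easiest to see in a conjugate beta-binomial update. A minimal sketch with an invented prior and invented interim data, not drawn from any particular trial:

```python
from scipy import stats

# Prior belief about 28-day survival under a new protocol: Beta(8, 12),
# i.e., prior mean 0.40 (hypothetical, encoding earlier evidence).
survivors, n = 14, 30  # hypothetical interim trial data
posterior = stats.beta(8 + survivors, 12 + (n - survivors))

print(f"Posterior mean survival: {posterior.mean():.3f}")
print(f"95% credible interval: {posterior.interval(0.95)}")
print(f"P(survival > 0.40 | data): {1 - posterior.cdf(0.40):.3f}")
```

The last line illustrates the point made in the abstract: the posterior directly answers probability questions about the hypothesis itself, which supports individualized decision making without the multiplicity penalties of repeated frequentist testing.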
Khan, F I; Iqbal, A; Ramesh, N; Abbasi, S A
2001-10-12
As it is conventionally done, strategies for incorporating accident-prevention measures in any hazardous chemical process industry are developed on the basis of input from risk assessment. However, the two steps, risk assessment and hazard reduction (or safety) measures, are not linked interactively in the existing methodologies. This prevents a quantitative assessment of the impacts of safety measures on risk control. We have made an attempt to develop a methodology in which risk assessment steps are interactively linked with the implementation of safety measures. The resultant system tells us the extent of risk reduction achieved by each successive safety measure. It also tells, based on sophisticated maximum credible accident analysis (MCAA) and probabilistic fault tree analysis (PFTA), whether a given unit can ever be made 'safe'. The application of the methodology has been illustrated with a case study.
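The interactive link can be pictured as recomputing the fault-tree top-event probability after each safety measure changes a basic-event probability. A minimal sketch with invented, independent basic events, not the case-study unit's figures:

```python
def and_gate(*probs):
    # All basic events must occur (assumes independence).
    p = 1.0
    for pi in probs:
        p *= pi
    return p

def or_gate(*probs):
    # At least one event occurs (assumes independence).
    q = 1.0
    for pi in probs:
        q *= 1.0 - pi
    return 1.0 - q

# Top event: tank rupture = (relief valve fails AND overpressure) OR weld defect.
before = or_gate(and_gate(1e-2, 5e-2), 1e-4)

# Safety measure: a redundant relief valve cuts its failure probability tenfold.
after = or_gate(and_gate(1e-3, 5e-2), 1e-4)

print(f"P(top) before: {before:.2e}, after: {after:.2e}, "
      f"risk reduction: {100 * (1 - after / before):.0f}%")
```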
DB4US: A Decision Support System for Laboratory Information Management.
Carmona-Cejudo, José M; Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael
2012-11-14
Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates information related to laboratory quality indicators. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like user web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators, and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of different parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. In our evaluation we show the positive impact of this methodology for laboratory professionals, since the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources by the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources.
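Turn-around-time indicators of the kind DB4US precalculates reduce to simple aggregations over request timestamps. A minimal sketch with hypothetical records and an assumed 4-hour target; the application's actual warehouse schema and indicator definitions are not reproduced here:

```python
from datetime import datetime
from statistics import median

# Hypothetical request records: (sample received, result validated).
records = [
    (datetime(2012, 5, 1, 8, 0),  datetime(2012, 5, 1, 10, 30)),
    (datetime(2012, 5, 1, 9, 15), datetime(2012, 5, 1, 11, 0)),
    (datetime(2012, 5, 1, 9, 40), datetime(2012, 5, 1, 15, 5)),
]

tat_minutes = [(done - received).total_seconds() / 60 for received, done in records]

indicator = {
    "median_tat_min": median(tat_minutes),
    "pct_within_4h": 100 * sum(t <= 240 for t in tat_minutes) / len(tat_minutes),
}
print(indicator)  # precalculated so the dashboard reads it ready-to-use
```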
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
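The statistical core of PFA, propagating uncertainty about analysis parameters through a failure-phenomenon model to a failure-probability estimate, can be sketched as a Monte Carlo loop. The fatigue-life model and parameter distributions below are invented stand-ins, not the documented PFA engineering models:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Epistemic uncertainty about model parameters, sampled and propagated.
crack_growth_coeff = rng.lognormal(mean=np.log(1e-3), sigma=0.3, size=n)
applied_stress = rng.normal(300.0, 20.0, size=n)  # MPa

# Toy fatigue-life model: life shrinks with the coefficient and stress cubed.
cycles_to_failure = 1.0 / (crack_growth_coeff * (applied_stress / 300.0) ** 3)

service_cycles = 500.0
p_failure = np.mean(cycles_to_failure < service_cycles)
print(f"Estimated P(failure within {service_cycles:.0f} cycles): {p_failure:.4f}")
```

In the full methodology this analytical estimate would then be updated with test and flight experience through the prescribed statistical procedures.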
78 FR 68449 - Notice of Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-14
... RFA-HS14-002, Addressing Methodological Challenges in Research for Patients With Multiple Chronic... applications for the ``AHRQ RFA-HS14-002, Addressing Methodological Challenges in Research for Patients With...
Technology transfer methodology
NASA Technical Reports Server (NTRS)
Labotz, Rich
1991-01-01
Information on technology transfer methodology is given in viewgraph form. Topics covered include problems in economics, technology drivers, inhibitors to using improved technology in development, technology application opportunities, and co-sponsorship of technology.
An applicational process for dynamic balancing of turbomachinery shafting
NASA Technical Reports Server (NTRS)
Verhoff, Vincent G.
1990-01-01
The NASA Lewis Research Center has developed and implemented a time-efficient methodology for dynamically balancing turbomachinery shafting. This methodology minimizes costly facility downtime by using a balancing arbor (mandrel) that simulates the turbomachinery (rig) shafting. The need for precision dynamic balancing of turbomachinery shafting and for a dynamic balancing methodology is discussed in detail. Additionally, the inherent problems (and their causes and effects) associated with unbalanced turbomachinery shafting as a function of increasing shaft rotational speeds are discussed. Included are the design criteria concerning rotor weight differentials for rotors made of different materials that have similar parameters and shafting. The balancing methodology for applications where rotor replaceability is a requirement is also covered. This report is intended for use as a reference when designing, fabricating, and troubleshooting turbomachinery shafting.
Part two: Qualitative research.
Quick, J; Hall, S
2015-01-01
This second article in the series Spotlight on Research focuses on qualitative research, its applications, principles and methodologies. It provides an insight into how this approach can be used within the perioperative setting and gives advice for practitioners looking to undertake a qualitative research study.
Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik
2016-01-01
A wide range of ETL (Extract, Transform and Load) software is available, constituting a major investment market. Each ETL tool uses its own techniques for extracting, transforming and loading data into a data warehouse, which makes the task of evaluating ETL software very difficult. However, choosing the right ETL software is critical to the success or failure of any Business Intelligence project. As there are many factors impacting the selection of ETL software, the selection process can be considered a complex multi-criteria decision-making (MCDM) problem. In this study, an application of a decision-making methodology that employs two well-known MCDM techniques, namely the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), is designed. In this respect, the aim of using AHP is to analyze the structure of the ETL software selection problem and obtain weights for the selected criteria. Then, the TOPSIS technique is used to calculate the alternatives' ratings. An example is given to illustrate the proposed methodology. Finally, a software prototype demonstrating both methods is implemented.
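The two-stage pipeline is compact enough to sketch end to end: AHP converts a pairwise-comparison matrix into criteria weights via its principal eigenvector, and TOPSIS ranks alternatives by relative closeness to the ideal solution. The judgments and tool scores below are invented for illustration:

```python
import numpy as np

# --- AHP: pairwise comparisons of 3 criteria (cost, functionality, support)
# on Saaty's 1-9 scale; hypothetical judgments.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = w / w.sum()                         # criteria weights

# --- TOPSIS: score 3 hypothetical ETL tools on the same criteria.
X = np.array([[200.0, 7.0, 8.0],              # cost (k$), functionality, support
              [150.0, 6.0, 6.0],
              [300.0, 9.0, 7.0]])
R = X / np.linalg.norm(X, axis=0)             # vector-normalized decision matrix
V = R * weights                               # weighted normalized matrix
benefit = np.array([False, True, True])       # cost is a "less is better" criterion
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)           # rank tools in descending order
print(closeness)
```

A consistency check of the AHP matrix (consistency ratio below 0.1) would normally precede the TOPSIS stage.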
Discrete choice experiments in pharmacy: a review of the literature.
Naik-Panvelkar, Pradnya; Armour, Carol; Saini, Bandana
2013-02-01
Discrete choice experiments (DCEs) have been widely used to elicit patient preferences for various healthcare services and interventions. The aim of our study was to conduct an in-depth scoping review of the literature and provide a current overview of the progressive application of DCEs within the field of pharmacy. Electronic databases (MEDLINE, EMBASE, SCOPUS, ECONLIT) were searched (January 1990-August 2011) to identify published English language studies using DCEs within the pharmacy context. Data were abstracted with respect to DCE methodology and application to pharmacy. Our search identified 12 studies. The DCE methodology was utilised to elicit preferences for different aspects of pharmacy products, therapy or services. Preferences were elicited from either patients or pharmacists, with just two studies incorporating the views of both. Most reviewed studies examined preferences for process-related or provider-related aspects, with a lesser focus on health outcomes. Monetary attributes were considered important by most patients and pharmacists in the studies reviewed. Logit, probit or multinomial logit models were most commonly employed for estimation. Our study showed that the pharmacy profession has adopted the DCE methodology in a manner consistent with general health DCEs, although the number of studies is quite limited. Future studies need to examine preferences of both patients and providers for particular products or disease-state management services. Incorporation of health outcome attributes in the design, testing for external validity and the incorporation of DCE results into an economic evaluation framework to inform pharmacy policy remain important areas for future research. © 2012 The Authors. IJPP © 2012 Royal Pharmaceutical Society.
Unicomb, Rachael; Colyvas, Kim; Harrison, Elisabeth; Hewat, Sally
2015-06-01
Case-study methodology is often used to study change in the field of speech-language pathology, but it can be criticized for not being statistically robust. Yet with the heterogeneous nature of many communication disorders, case studies allow clinicians and researchers to closely observe and report on change. Such information is valuable and can further inform large-scale experimental designs. In this research note, a statistical analysis for case-study data is outlined that employs a modification to the Reliable Change Index (Jacobson & Truax, 1991). The relationship between reliable change and clinical significance is discussed. Example data are used to guide the reader through the use and application of this analysis. A method of analysis is detailed that is suitable for assessing change in measures with binary categorical outcomes. The analysis is illustrated using data from one individual, measured before and after treatment for stuttering. This approach to assessing change in categorical, binary data has potential application in speech-language pathology. It enables clinicians and researchers to analyze results from case studies for their statistical and clinical significance. This new method addresses a gap in the research design literature, that is, the lack of analysis methods for noncontinuous data (such as counts, rates, proportions of events) that may be used in case-study designs.
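The classic Jacobson and Truax index divides observed change by the standard error of the difference score; the paper's binary-outcome modification is not reproduced here. A minimal sketch of the standard computation, with hypothetical stuttering-severity values:

```python
from math import sqrt

def reliable_change_index(pre: float, post: float,
                          sd_pre: float, reliability: float) -> float:
    """Jacobson & Truax (1991) RCI; |RCI| > 1.96 suggests change
    beyond what measurement error alone would produce."""
    se_measurement = sd_pre * sqrt(1.0 - reliability)
    se_difference = sqrt(2.0) * se_measurement
    return (post - pre) / se_difference

# Hypothetical percent-syllables-stuttered scores before and after treatment.
rci = reliable_change_index(pre=12.0, post=4.0, sd_pre=3.5, reliability=0.90)
verdict = "reliable change" if abs(rci) > 1.96 else "within measurement error"
print(f"RCI = {rci:.2f} -> {verdict}")
```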
NMR contributions to structural dynamics studies of intrinsically disordered proteins
Konrat, Robert
2014-01-01
Intrinsically disordered proteins (IDPs) are characterized by substantial conformational plasticity. Given their inherent structural flexibility, X-ray crystallography is not applicable to the study of these proteins. In contrast, NMR spectroscopy offers unique opportunities for structural and dynamic studies of IDPs. The past two decades have witnessed significant development of NMR spectroscopy that couples advances in spin physics and chemistry with a broad range of applications. This article will summarize key advances in basic physical chemistry and NMR methodology, outline their limitations and envision future R&D directions. PMID:24656082
Ball, Elaine; McLoughlin, Moira; Darvill, Angela
2011-04-01
Qualitative methodology has increased in application and acceptability in all research disciplines. In nursing, it is appropriate that a plethora of qualitative methods can be found as nurses pose real-world questions about clinical, cultural and ethical issues of patient care (Johnson, 2007; Long and Johnson, 2007), yet the methods nurses readily use in pursuit of answers remain under intense scrutiny. One of the problems with qualitative methodology for nursing research is its place in the hierarchy of evidence (HOE); another is its comparison to the positivist constructs of what constitutes good research and the measurement of qualitative research against this. In order to position and strengthen its evidence base, nursing may well seek to distance itself from a qualitative perspective and utilise methods at the top of the HOE; yet given the relation of qualitative methods to nursing, this would constrain rather than broaden the profession in search of answers and an evidence base. The comparison between qualitative and quantitative can be both mutually exclusive and rhetorical; by shifting the comparison, this study takes a more reflexive position and critically appraises qualitative methods against the standards set by qualitative researchers. By comparing the design and application of qualitative methods in nursing over a two-year period, the study examined how qualitative research stands up to independent rather than comparative scrutiny. For the methods, a four-step mixed-methods approach newly constructed by the first author was used: 1. the scope of the research question was defined and inclusion criteria developed; 2. synthesis tables were constructed to organise data; 3. bibliometrics configured the data; 4. studies selected for inclusion in the review were critically appraised using a critical interpretive synthesis (Dixon-Woods et al., 2006). The paper outlines the research process as well as the findings. Results showed that of the 240 papers analysed, 27% used ad hoc or no references to qualitative methodology; methodological terms such as thematic analysis or constant comparative methods were used inconsistently; qualitative was a catch-all panacea rather than a methodology with well-argued terms or contextual definition. Copyright © 2010 Elsevier Ltd. All rights reserved.
T-4G Methodology: Undergraduate Pilot Training T-37 Phase.
ERIC Educational Resources Information Center
Woodruff, Robert R.; And Others
The report's brief introduction describes the application of T-4G methodology to the T-37 instrument phase of undergraduate pilot training. The methodology is characterized by instruction in trainers, proficiency advancement, a highly structured syllabus, the training manager concept, early exposure to instrument training, and hands-on training.…
Methodological Limitations of the Application of Expert Systems Methodology in Reading.
ERIC Educational Resources Information Center
Willson, Victor L.
Methodological deficiencies inherent in expert-novice reading research make it impossible to draw inferences about curriculum change. First, comparisons of intact groups are often used as a basis for making causal inferences about how observed characteristics affect behaviors. While comparing different groups is not by itself a useless activity,…
Applications of Mass Spectrometry for Cellular Lipid Analysis
Wang, Chunyan; Wang, Miao; Han, Xianlin
2015-01-01
Mass spectrometric analysis of cellular lipids is an enabling technology for lipidomics, which is a rapidly-developing research field. In this review, we briefly discuss the principles, advantages, and possible limitations of electrospray ionization (ESI) and matrix assisted laser desorption/ionization (MALDI) mass spectrometry-based methodologies for the analysis of lipid species. The applications of these methodologies to lipidomic research are also summarized. PMID:25598407
Recent developments and applications of immobilized laccase.
Fernández-Fernández, María; Sanromán, M Ángeles; Moldes, Diego
2013-12-01
Laccase is a promising biocatalyst with many possible applications, including bioremediation, chemical synthesis, biobleaching of paper pulp, biosensing, textile finishing and wine stabilization. The immobilization of enzymes offers several improvements for enzyme applications because the storage and operational stabilities are frequently enhanced. Moreover, the reusability of immobilized enzymes represents a great advantage compared with free enzymes. In this work, we discuss the different methodologies of enzyme immobilization that have been reported for laccases, such as adsorption, entrapment, encapsulation, covalent binding and self-immobilization. The applications of laccase immobilized by the aforementioned methodologies are presented, paying special attention to recent approaches regarding environmental applications and electrobiochemistry. Copyright © 2012 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
1974-01-01
The purpose of the BRAVO User's Manual is to describe the BRAVO methodology in terms of step-by-step procedures. The BRAVO methodology then becomes a tool which a team of analysts can utilize to perform cost effectiveness analyses on potential future space applications with a relatively general set of input information and a relatively small expenditure of resources. An overview of the BRAVO procedure is given by describing the complete procedure in a general form.
Volume, Conservation and Instruction: A Classroom Based Solomon Four Group Study of Conflict.
ERIC Educational Resources Information Center
Rowell, J. A.; Dawson, C. J.
1981-01-01
Summarizes a study to widen the applicability of Piagetian theory-based conflict methodology from individual situations to entire classes. A Solomon four group design was used to study effects of conflict instruction on students' (N=127) ability to conserve volume of noncompressible matter and to apply that knowledge to gas volume. (Author/JN)
Application of Computers in Methodical Planning of Natural and Social Studies
ERIC Educational Resources Information Center
Muradbegovic, Aida; Zufic, Janko
2005-01-01
Learning preparedness of students is becoming one of the most important issues in modern education, and it could be established through the development of a new culture of methodology and teaching at all educational levels. In this study, we started with the premise that quality teaching of the subject of natural and social studies in the first four grades…
ERIC Educational Resources Information Center
Scott, Marc A.
2012-01-01
The field of composition studies has benefitted from applications of feminist, materialist, postcolonial and similar critical theories to the teaching and study of written texts. In addition, critical theories continue to make a significant impact on the teaching and study of writing and other co-fields of inquiry such as writing center and…
Investigating transport pathways in the ocean
NASA Astrophysics Data System (ADS)
Griffa, Annalisa; Haza, Angelique; Özgökmen, Tamay M.; Molcard, Anne; Taillandier, Vincent; Schroeder, Katrin; Chang, Yeon; Poulain, P.-M.
2013-01-01
The ocean is a very complex medium with scales of motion that range from thousands of kilometers to the dissipation scales. Transport by ocean currents plays an important role in many practical applications ranging from climatic problems to coastal management and accident mitigation at sea. Understanding transport is challenging because of the chaotic nature of particle motion. In the last decade, new methods have been put forth to improve our understanding of transport. Powerful tools are provided by dynamical systems theory, which allow the identification of the barriers to transport and their time variability for a given flow. A shortcoming of this approach, though, is that it is based on the assumption that the velocity field is known with good accuracy, which is not always the case in practical applications. Improving model performance in terms of transport can be addressed using another important methodology that has been recently developed, namely the assimilation of Lagrangian data provided by floating buoys. The two methodologies are technically different but in many ways complementary. In this paper, we review examples of applications of both methodologies performed by the authors in the last few years, considering flows at different scales and in various ocean basins. The results are among the very first examples of applications of the methodologies to the real ocean, including testing with Lagrangian in-situ data. The results are discussed in the general framework of the extended fields related to these methodologies, pointing to open questions and potential for improvements, with an outlook toward future strategies.
Interpretation of remotely sensed data and its applications in oceanography
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Tanaka, K.; Inostroza, H. M.; Verdesio, J. J.
1982-01-01
The methodology of interpretation of remote sensing data and its oceanographic applications are described. The elements of image interpretation for different types of sensors are discussed. The sensors utilized are the multispectral scanner of LANDSAT, and the thermal infrared of NOAA and geostationary satellites. Visual and automatic data interpretation in studies of pollution, the Brazil current system, and upwelling along the southeastern Brazilian coast are compared.
Applications of aerospace technology in biology and medicine
NASA Technical Reports Server (NTRS)
Rouse, D. J.
1983-01-01
Utilization of NASA technology and its application to medicine is discussed. The introduction of new or improved commercially available medical products and incorporation of aerospace technology is outlined. A bipolar donor-recipient model of medical technology transfer is presented to provide a basis for the methodology. The methodology is designed to: (1) identify medical problems and NASA technology that, in combination, constitute opportunities for successful medical products; (2) obtain the early participation of industry in the transfer process; and (3) obtain acceptance by the medical community of new medical products based on NASA technology. Two commercial transfers were completed: the ocular screening device, a system for quick detection of vision problems in preschool children, and Porta-Fib III, a hospital monitoring unit. Two institutional transfers were completed: implant materials testing, the application of NASA fracture control technology to improve reliability of metallic prostheses, and incinerator monitoring, a quadrupole mass spectrometer to monitor combustion products of municipal incinerators. Mobility aids for the blind and ultrasound diagnosis of burn depth are also studied.
Patorno, Elisabetta; Patrick, Amanda R; Garry, Elizabeth M; Schneeweiss, Sebastian; Gillet, Victoria G; Bartels, Dorothee B; Masso-Gonzalez, Elvira; Seeger, John D
2014-11-01
Recent years have witnessed a growing body of observational literature on the association between glucose-lowering treatments and cardiovascular disease. However, many of the studies are based on designs or analyses that inadequately address the methodological challenges involved. We reviewed recent observational literature on the association between glucose-lowering medications and cardiovascular outcomes and assessed the design and analysis methods used, with a focus on their ability to address specific methodological challenges. We describe and illustrate these methodological issues and their impact on observed associations, providing examples from the reviewed literature. We suggest approaches that may be employed to manage these methodological challenges. From the evaluation of 81 publications of observational investigations assessing the association between glucose-lowering treatments and cardiovascular outcomes, we identified the following methodological challenges: 1) handling of temporality in administrative databases; 2) handling of risks that vary with time and treatment duration; 3) definitions of the exposure risk window; 4) handling of exposures that change over time; and 5) handling of confounding by indication. Most of these methodological challenges may be suitably addressed through application of appropriate methods. Observational research plays an increasingly important role in the evaluation of the clinical effects of diabetes treatment. Implementation of appropriate research methods holds the promise of reducing the potential for spurious findings and the risk that the spurious findings will mislead the medical community about risks and benefits of diabetes medications.
Contribution to the application of two-colour imaging to diesel combustion
NASA Astrophysics Data System (ADS)
Payri, F.; Pastor, J. V.; García, J. M.; Pastor, J. M.
2007-08-01
The two-colour method (2C) is a well-known methodology for the estimation of flame temperature and the soot-related KL factor. A 2C imaging system has been built with a single charge-coupled device (CCD) camera for visualization of the diesel flame in a single-cylinder 2-stroke engine with optical accesses. The work presented here focuses on methodological aspects. In that sense, the influence of calibration uncertainties on the measured temperature and KL factor has been analysed. In addition, a theoretical study is presented that attempts to link the true flame temperature and soot distributions with those derived from the 2C images. Finally, an experimental study has been carried out in order to show the influence of injection pressure, air density and temperature on the 2C-derived parameters. Comparison with the expected results has shown the limitations of this methodology for diesel flame analysis.
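The abstract does not reproduce the 2C equations, but the standard formulation is well established: soot emissivity follows the Hottel-Broughton model, and the true temperature is the value at which both measurement wavelengths imply the same KL. The sketch below solves that system under the Wien approximation; it is a generic textbook version, not the authors' implementation, and the wavelengths and apparent temperatures are invented for illustration.

```python
# Minimal numerical sketch of the two-colour (2C) solution using the classical
# Hottel-Broughton soot emissivity model and the Wien approximation. Generic
# formulation, not the paper's code; input values are illustrative only.
import numpy as np
from scipy.optimize import brentq

C2 = 14388.0   # second radiation constant [um*K]
ALPHA = 1.39   # Hottel-Broughton dispersion exponent for soot in the visible

def kl_factor(T, Ta, lam_um):
    """KL implied by true temperature T and apparent temperature Ta at lam_um."""
    emissivity = np.exp((C2 / lam_um) * (1.0 / T - 1.0 / Ta))  # Wien approximation
    return -lam_um**ALPHA * np.log(1.0 - emissivity)

def solve_two_colour(Ta1, Ta2, lam1_um, lam2_um):
    """The true flame temperature is where both wavelengths imply the same KL."""
    f = lambda T: kl_factor(T, Ta1, lam1_um) - kl_factor(T, Ta2, lam2_um)
    T = brentq(f, max(Ta1, Ta2) + 1.0, 4000.0)  # bracket: true T exceeds apparent T
    return T, kl_factor(T, Ta1, lam1_um)

# Illustrative apparent temperatures at 550 nm and 650 nm (the shorter wavelength
# sees a higher emissivity, hence a higher apparent temperature):
T, KL = solve_two_colour(Ta1=1950.0, Ta2=1900.0, lam1_um=0.55, lam2_um=0.65)
print(f"flame temperature ~ {T:.0f} K, KL ~ {KL:.3f}")
```

The calibration sensitivity the paper analyses can be probed with this same sketch by perturbing Ta1 and Ta2 and observing the shift in the recovered temperature and KL.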
GetReal in network meta-analysis: a review of the methodology.
Efthimiou, Orestis; Debray, Thomas P A; van Valkenhoef, Gert; Trelle, Sven; Panayidou, Klea; Moons, Karel G M; Reitsma, Johannes B; Shang, Aijing; Salanti, Georgia
2016-09-01
Pairwise meta-analysis is an established statistical tool for synthesizing evidence from multiple trials, but it is informative only about the relative efficacy of two specific interventions. The usefulness of pairwise meta-analysis is thus limited in real-life medical practice, where many competing interventions may be available for a certain condition and studies informing some of the pairwise comparisons may be lacking. This commonly encountered scenario has led to the development of network meta-analysis (NMA). In the last decade, several applications, methodological developments, and empirical studies in NMA have been published, and the area is thriving as its relevance to public health is increasingly recognized. This article presents a review of the relevant literature on NMA methodology aiming to pinpoint the developments that have appeared in the field. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Sara, Balazs; Antonini, Ernesto; Tarantini, Mario
2001-02-01
The VAMP project (VAlorization of building demolition Materials and Products, LIFE 98/ENV/IT/33) aims to build an effective and innovative information system to support decision making in selective demolition activity and to manage the valorization (recovery-reuse-recycling) of waste flows produced by the construction and demolition (C&D) sector. The VAMP information system will be tested in Italy in some case studies of selective demolition. In this paper the proposed demolition-valorization system is compared to the traditional one in a life cycle perspective, applying LCA methodology to highlight the advantages of the VAMP system from an eco-sustainability point of view. The system boundaries included demolition processes, transport of demolition wastes, and their recovery/treatment or disposal in landfill. Processes avoided due to reuse-recycling activities, such as extraction of natural resources and manufacture of building materials and components, were considered too. In this paper the data collection procedure applied in the inventory and impact assessment phases and a general overview of data availability for LCA studies in this sector are presented. Results of the application of the VAMP methodology to a case study are discussed and compared with a simulated traditional demolition of the same building. Environmental advantages of the VAMP demolition-valorization system are demonstrated quantitatively, emphasizing the special importance of reusing building components with a high energy demand for manufacture.
Application of machine learning methodology for pet-based definition of lung cancer
Kerhet, A.; Small, C.; Quon, H.; Riauka, T.; Schrader, L.; Greiner, R.; Yee, D.; McEwan, A.; Roa, W.
2010-01-01
We applied a learning methodology framework to assist in the threshold-based segmentation of non-small-cell lung cancer (NSCLC) tumours in positron-emission tomography-computed tomography (PET-CT) imaging for use in radiotherapy planning. Gated and standard free-breathing studies of two patients were independently analysed (four studies in total). Each study had a PET-CT and a treatment-planning CT image. The reference gross tumour volume (GTV) was identified by two experienced radiation oncologists who also determined reference standardized uptake value (SUV) thresholds that most closely approximated the GTV contour on each slice. A set of uptake distribution-related attributes was calculated for each PET slice. A machine learning algorithm was trained on a subset of the PET slices to cope with slice-to-slice variation in the optimal SUV threshold: that is, to predict the most appropriate SUV threshold from the calculated attributes for each slice. The algorithm's performance was evaluated using the remainder of the PET slices. A high degree of geometric similarity was achieved between the areas outlined by the predicted and the reference SUV thresholds (Jaccard index exceeding 0.82). No significant difference was found between the gated and the free-breathing results in the same patient. In this preliminary work, we demonstrated the potential applicability of a machine learning methodology as an auxiliary tool for radiation treatment planning in NSCLC. PMID:20179802
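The abstract names neither the attributes nor the learner, so the following is only a generic sketch of the two steps it describes: regress a per-slice SUV threshold from slice features, then score the thresholded region against the reference with the Jaccard index. All data and the choice of a random-forest regressor are assumptions for illustration.

```python
# Illustrative sketch (not the paper's pipeline): predict a per-slice SUV
# threshold from uptake-distribution features, then score the thresholded
# region against a reference contour via the Jaccard index. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n_slices, n_features = 80, 6
X = rng.normal(size=(n_slices, n_features))     # per-slice uptake attributes
y = 2.5 + 0.4 * X[:, 0] - 0.2 * X[:, 1] \
    + rng.normal(scale=0.05, size=n_slices)     # reference SUV thresholds

train, test = slice(0, 60), slice(60, 80)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[train], y[train])
pred_thresholds = model.predict(X[test])

def jaccard(mask_a, mask_b):
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union else 1.0

# Compare regions cut by predicted vs. reference thresholds on held-out slices
suv_slices = rng.gamma(shape=2.0, scale=1.5, size=(20, 64, 64))  # synthetic SUV maps
scores = [jaccard(s >= t_pred, s >= t_ref)
          for s, t_pred, t_ref in zip(suv_slices, pred_thresholds, y[test])]
print(f"mean Jaccard on held-out slices: {np.mean(scores):.2f}")
```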
Towards a Unified Approach to Information Integration - A review paper on data/information fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitney, Paul D.; Posse, Christian; Lei, Xingye C.
2005-10-14
Fusion of information or data from different sources is ubiquitous in many applications, from epidemiology, medical, biological, political, and intelligence to military applications. Data fusion involves integration of spectral, imaging, text, and many other sensor data. For example, in epidemiology, information is often obtained from many studies conducted by different researchers in different regions with different protocols. In the medical field, the diagnosis of a disease is often based on imaging (MRI, X-ray, CT), clinical examination, and lab results. In the biological field, information is obtained from studies conducted on many different species. In the military field, information is obtained from radar sensors, text messages, chemical and biological sensors, acoustic sensors, optical warning systems, and many other sources. Many methodologies are used in the data integration process, from classical and Bayesian methods to evidence-based expert systems. The implementation of data integration ranges from pure software design to a mixture of software and hardware. In this review we summarize the methodologies and implementations of the data fusion process, and illustrate in more detail the methodologies involved in three examples. We propose a unified multi-stage and multi-path mapping approach to the data fusion process, and point out future prospects and challenges.
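Among the methodologies the review lists, Bayesian combination is the simplest to show concretely. The sketch below fuses two independent evidence sources about a binary hypothesis with a naive-Bayes product of likelihoods; the medical-diagnosis framing and all numbers are invented for illustration, not taken from the review.

```python
# Minimal sketch of Bayesian fusion of two independent evidence sources about
# a binary hypothesis (e.g., "disease present"). All numbers are illustrative.
import numpy as np

prior = np.array([0.7, 0.3])          # P(H = absent), P(H = present)
lik_imaging = np.array([0.2, 0.8])    # P(imaging finding | H)
lik_lab = np.array([0.4, 0.9])        # P(positive lab result | H)

# Naive-Bayes combination: sources assumed conditionally independent given H
posterior = prior * lik_imaging * lik_lab
posterior /= posterior.sum()
print(f"P(present | imaging, lab) = {posterior[1]:.3f}")
```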
Design for dependability: A simulation-based approach. Ph.D. Thesis, 1993
NASA Technical Reports Server (NTRS)
Goswami, Kumar K.
1994-01-01
This research addresses issues in simulation-based system-level dependability analysis of fault-tolerant computer systems. The issues and difficulties of providing a general simulation-based approach for system-level analysis are discussed, and a methodology that addresses them is presented. The proposed methodology is designed to permit the study of a wide variety of architectures under various fault conditions. It permits detailed functional modeling of architectural features such as sparing policies, repair schemes, and routing algorithms, as well as other fault-tolerant mechanisms, and it allows the execution of actual application software. One key benefit of this approach is that the behavior of a system under faults does not have to be pre-defined as it is normally done. Instead, a system can be simulated in detail and injected with faults to determine its failure modes. The thesis describes how object-oriented design is used to incorporate this methodology into a general purpose design and fault injection package called DEPEND. A software model is presented that uses abstractions of application programs to study the behavior and effect of software on hardware faults in the early design stage when actual code is not available. Finally, an acceleration technique that combines hierarchical simulation, time acceleration algorithms and hybrid simulation to reduce simulation time is introduced.
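To make the core idea concrete (simulate, inject faults, observe failure modes rather than pre-defining them), here is a conceptual toy in Python. It is not DEPEND's actual API, which the thesis describes as an object-oriented design package; the sparing policy, rates, and mission length are all invented.

```python
# Conceptual sketch of simulation-based fault injection (not DEPEND's API):
# a primary component with N cold spares under random fault arrivals; the
# "failure mode" observed is the time at which spares are exhausted.
import random

def simulate(mission_time=1000.0, fault_rate=0.005, spares=2, seed=None):
    rng = random.Random(seed)
    t, remaining = 0.0, spares
    while True:
        t += rng.expovariate(fault_rate)   # inject next fault (exponential gaps)
        if t > mission_time:
            return None                    # system survived the mission
        if remaining == 0:
            return t                       # failure mode: spares exhausted
        remaining -= 1                     # sparing policy: swap in a cold spare

outcomes = [simulate(seed=s) for s in range(10000)]
failed = [t for t in outcomes if t is not None]
print(f"estimated P(mission failure) = {len(failed) / len(outcomes):.3f}")
```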
Rodríguez-Prieto, V; Vicente-Rubiano, M; Sánchez-Matamoros, A; Rubio-Guerri, C; Melero, M; Martínez-López, B; Martínez-Avilés, M; Hoinville, L; Vergne, T; Comin, A; Schauer, B; Dórea, F; Pfeiffer, D U; Sánchez-Vizcaíno, J M
2015-07-01
In this globalized world, the spread of new, exotic and re-emerging diseases has become one of the most important threats to animal production and public health. This systematic review analyses conventional and novel early detection methods applied to surveillance. In all, 125 scientific documents were considered for this study. Exotic (n = 49) and re-emerging (n = 27) diseases constituted the most frequently represented health threats. In addition, the majority of studies were related to zoonoses (n = 66). The approaches found in the review could be divided into surveillance modalities, both active (n = 23) and passive (n = 5), and tools and methodologies that support surveillance activities (n = 57). Combinations of surveillance modalities and tools (n = 40) were also found. Risk-based approaches were very common (n = 60), especially in the papers describing tools and methodologies (n = 50). The main applications, benefits and limitations of each approach were extracted from the papers. This information will be very useful for informing the development of tools to facilitate the design of cost-effective surveillance strategies. Thus, the current literature review provides key information about the advantages, disadvantages, limitations and potential application of methodologies for the early detection of new, exotic and re-emerging diseases.
Improving automation standards via semantic modelling: Application to ISA88.
Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès
2017-03-01
Standardization is essential for automation. Extensibility, scalability, and reusability are important features for automation software that rely on efficient modelling of the addressed systems. The work presented here is part of the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and to support the improvement of that consistency. The formalization of conceptual models and the subsequent writing of technical standards are simultaneously analyzed, and guidelines are proposed for application to future technical standards. Three paradigms are discussed for the development of domain ontologies from technical documents, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the suggested paradigm for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects that are worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Sanchez, E Y; Represa, S; Mellado, D; Balbi, K B; Acquesta, A D; Colman Lerner, J E; Porta, A A
2018-06-15
The potential impact of a technological accident can be assessed by risk estimation. Taking this into account, the latent or potential condition can be warned of and mitigated. In this work we propose a methodology to estimate the risk of technological hazards, focused on two components. The first is the processing of meteorological databases to define the most probable and conservative study scenario, and the second is the application of a local social vulnerability index to classify the population. In this case study, the risk was estimated for a hypothetical release of liquefied ammonia in a meat-packing industry in the city of La Plata, Argentina. The method consists of integrating the toxic threat zone simulated with the ALOHA software and the layer of sociodemographic classification of the affected population. The results show the areas associated with higher risks of exposure to ammonia, which are worth addressing for the prevention of disasters in the region. Advantageously, this systemic approach is methodologically flexible, as it can be applied in various scenarios based on the available information on both the exposed population and its meteorology. Furthermore, this methodology optimizes the processing of the input data and its calculation. Copyright © 2018 Elsevier B.V. All rights reserved.
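The integration step the authors describe, overlaying a simulated threat zone on a vulnerability layer, can be sketched schematically as a grid operation. The sketch below is not ALOHA's output format and every value in it is invented; only the ammonia AEGL-2 (60 min) level of concern of 160 ppm is a published figure.

```python
# Schematic sketch of the overlay step (not ALOHA's actual output format):
# combine a gridded toxic-concentration footprint with a social vulnerability
# layer to rank exposure risk per cell. All grids are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
concentration = rng.gamma(2.0, 50.0, size=(50, 50))  # ppm, hypothetical plume grid
vulnerability = rng.integers(1, 4, size=(50, 50))    # 1=low, 2=medium, 3=high

AEGL2 = 160.0                         # ammonia AEGL-2 (60 min) threshold, ppm
threat_zone = concentration >= AEGL2  # cells above the level of concern
risk = threat_zone * vulnerability    # 0 outside the zone; 1-3 inside

for level, label in [(3, "high"), (2, "medium"), (1, "low")]:
    print(f"{label}-vulnerability cells inside threat zone: {(risk == level).sum()}")
```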
Nuclear power plant digital system PRA pilot study with the dynamic flow-graph methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yau, M.; Motamed, M.; Guarro, S.
2006-07-01
Current Probabilistic Risk Assessment (PRA) methodology is well established in analyzing hardware and some of the key human interactions. However, processes for analyzing the software functions of digital systems within a plant PRA framework, and accounting for the digital system contribution to the overall risk, are not generally available, nor are they well understood and established. A recent study reviewed a number of methodologies that have potential applicability to modeling and analyzing digital systems within a PRA framework. This study identified the Dynamic Flow-graph Methodology (DFM) and the Markov Methodology as the most promising tools. As a result of this study, a task was defined under the framework of a collaborative agreement between the U.S. Nuclear Regulatory Commission (NRC) and the Ohio State Univ. (OSU). The objective of this task is to set up benchmark systems representative of digital systems used in nuclear power plants and to evaluate DFM and the Markov methodology with these benchmark systems. The first benchmark system is a typical Pressurized Water Reactor (PWR) Steam Generator (SG) Feedwater System (FWS) level control system based on earlier ASCA work with the U.S. NRC, upgraded with modern control laws. ASCA, Inc. is currently under contract to OSU to apply DFM to this benchmark system. The goal is to investigate the feasibility of using DFM to analyze and quantify digital system risk, and to integrate the DFM analytical results back into the plant event tree/fault tree PRA model. (authors)
The PHM-Ethics methodology: interdisciplinary technology assessment of personal health monitoring.
Schmidt, Silke; Verweij, Marcel
2013-01-01
The contribution briefly introduces the PHM-Ethics project and the PHM methodology. Within the PHM-Ethics project, a set of tools and modules has been developed that may assist in the evaluation and assessment of new technologies for personal health monitoring, referred to as the "PHM methodology" or "PHM toolbox". An overview of this interdisciplinary methodology and its component modules is provided, and areas of application and intended target groups are indicated.
Market projections of cellulose nanomaterial-enabled products-- Part 2: Volume estimates
John Cowie; E.M. (Ted) Bilek; Theodore H. Wegner; Jo Anne Shatkin
2014-01-01
Nanocellulose has enormous potential to provide an important materials platform in numerous product sectors. This study builds on previous work by the same authors in which likely high-volume, low-volume, and novel applications for cellulosic nanomaterials were identified. In particular, this study creates a transparent methodology and estimates the potential annual...
Mobile Phone Applications in Academic Library Services: A Students' Feedback Survey
ERIC Educational Resources Information Center
Karim, Nor Shahriza Abdul; Darus, Siti Hawa; Hussin, Ramlah
2006-01-01
Purpose: This study seeks to explore the utilization of mobile phone services in the educational environment, explore the nature of mobile phone use among university students, and investigate the perception of university students on mobile phone uses in library and information services. Design/methodology/approach: The study used a review of…
Logistic Achievement Test Scaling and Equating with Fixed versus Estimated Lower Asymptotes.
ERIC Educational Resources Information Center
Phillips, S. E.
This study compared the lower asymptotes estimated by the maximum likelihood procedures of the LOGIST computer program with those obtained via application of the Norton methodology. The study also compared the equating results from the three-parameter logistic model with those obtained from the equipercentile, Rasch, and conditional…
An Empirical Research Study of the Efficacy of Two Plagiarism-Detection Applications
ERIC Educational Resources Information Center
Hill, Jacob D.; Page, Elaine Fetyko
2009-01-01
This article describes a study of the two most popular plagiarism-detection software platforms available on today's market--Turnitin (http://www.turnitin.com/static/index.html) and SafeAssign (http://www.safeassign.com/). After a brief discussion of plagiarism's relevance to librarians, the authors examine plagiarism-detection methodology and…
A Methodology in the Teaching Process of Calculus and Its Motivation.
ERIC Educational Resources Information Center
Vasquez-Martinez, Claudio-Rafael
Because the development of calculus and science is permanent and didactic, it demands, on the one hand, analytical, deductive study and, on the other, the application of methods, rhochrematics, and resources within calculus, which allows knowledge to be dialectically shaped in its different phases and the results to be tested. For the purpose of this study, the motivation…
Revista de Investigacion Educativa, 1998 (Journal of Educational Research, 1998).
ERIC Educational Resources Information Center
Revista de Investigacion Educativa, 1998
1998-01-01
Articles in this volume focus on the following: specialized research; methodological challenges; establishment of a categorization system for sociometric analysis and its application in the multicultural classroom; a case study of factors to prevent school failure in children at risk; the KeyMatch-R scale (study of a curriculum-related diagnostic…
Knowledge Management Model: Practical Application for Competency Development
ERIC Educational Resources Information Center
Lustri, Denise; Miura, Irene; Takahashi, Sergio
2007-01-01
Purpose: This paper seeks to present a knowledge management (KM) conceptual model for competency development and a case study in a law service firm, which implemented the KM model in a competencies development program. Design/methodology/approach: The case study method was applied according to Yin (2003) concepts, focusing on a six-professional group…
NEW SAMPLING THEORY FOR MEASURING ECOSYSTEM STRUCTURE
This research considered the application of systems analysis to the study of laboratory ecosystems. The work concerned the development of a methodology which was shown to be useful in the design of laboratory experiments, the processing and interpretation of the results of these ...
Advances in Artificial Neural Networks - Methodological Development and Application
USDA-ARS?s Scientific Manuscript database
Artificial neural networks as a major soft-computing technology have been extensively studied and applied during the last three decades. Research on backpropagation training algorithms for multilayer perceptron networks has spurred development of other neural network training algorithms for other ne...
APPLICATION OF BENCHMARK DOSE METHODOLOGY TO DATA FROM PRENATAL DEVELOPMENTAL TOXICITY STUDIES
The benchmark dose (BMD) concept was applied to 246 conventional developmental toxicity datasets from government, industry and commercial laboratories. Five modeling approaches were used, two generic and three specific to developmental toxicity (DT models). BMDs for both quantal ...
Multimedia Sampling During The Application Of Biosolids On A Land Test Site
This report documents the approach, methodologies, results, and interpretation of a collaborative research study conducted by the National Risk Management Research Center (NRMRL) of the U.S. Environmental Protection Agency's (U.S. EPA's) Office of Research and Development (ORD); ...
Utility of distributed hydrologic and water quality models for watershed management and sustainability studies should be accompanied by rigorous model uncertainty analysis. However, the use of complex watershed models primarily follows the traditional {calibrate/validate/predict}...
Suggested criteria for evaluating systems engineering methodologies
NASA Technical Reports Server (NTRS)
Gates, Audrey; Paul, Arthur S.; Gill, Tepper L.
1989-01-01
Systems engineering is the application of mathematical and scientific principles to practical ends in the life-cycle of a system. A methodology for systems engineering is a carefully developed, relatively complex procedure or process for applying these mathematical and scientific principles. There are many systems engineering methodologies (or possibly many versions of a few methodologies) currently in use in government and industry. These methodologies are usually designed to meet the needs of a particular organization. It has been observed, however, that many technical and non-technical problems arise when inadequate systems engineering methodologies are applied by organizations to their systems development projects. Various criteria for evaluating systems engineering methodologies are discussed. Such criteria are developed to assist methodology-users in identifying and selecting methodologies that best fit the needs of the organization.
42 CFR 436.601 - Application of financial eligibility methodologies.
Code of Federal Regulations, 2013 CFR
2013-10-01
... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...
42 CFR 436.601 - Application of financial eligibility methodologies.
Code of Federal Regulations, 2011 CFR
2011-10-01
... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...
42 CFR 436.601 - Application of financial eligibility methodologies.
Code of Federal Regulations, 2012 CFR
2012-10-01
... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...
42 CFR 436.601 - Application of financial eligibility methodologies.
Code of Federal Regulations, 2014 CFR
2014-10-01
... methodologies in determining financial eligibility of the following groups: (i) Qualified pregnant women and children under the mandatory categorically needy group under § 436.120; (ii) Low-income pregnant women...
GPS system simulation methodology
NASA Technical Reports Server (NTRS)
Ewing, Thomas F.
1993-01-01
The following topics are presented: background; Global Positioning System (GPS) methodology overview; the graphical user interface (GUI); current models; application to space nuclear power/propulsion; and interfacing requirements. The discussion is presented in vugraph form.
A biased review of biases in Twitter studies on political collective action
NASA Astrophysics Data System (ADS)
Cihon, Peter; Yasseri, Taha
2016-08-01
In recent years researchers have gravitated to Twitter and other social media platforms as fertile ground for empirical analysis of social phenomena. Social media provides researchers access to trace data of interactions and discourse that once went unrecorded in the offline world. Researchers have sought to use these data to explain social phenomena both particular to social media and applicable to the broader social world. This paper offers a minireview of Twitter-based research on political crowd behaviour. This literature offers insight into particular social phenomena on Twitter, but often fails to use standardized methods that permit interpretation beyond individual studies. Moreover, the literature fails to ground methodologies and results in social or political theory, divorcing empirical research from the theory needed to interpret it. Rather, investigations focus primarily on methodological innovations for social media analyses, but these too often fail to sufficiently demonstrate the validity of such methodologies. This minireview considers a small number of selected papers; we analyse their (often lack of) theoretical approaches, review their methodological innovations, and offer suggestions as to the relevance of their results for political scientists and sociologists.
Methodologies for Root Locus and Loop Shaping Control Design with Comparisons
NASA Technical Reports Server (NTRS)
Kopasakis, George
2017-01-01
This paper describes some basics of the root locus control design method as well as of loop shaping, and establishes approaches to expedite the application of these two design methodologies so that control designs that meet requirements with superior performance are easily obtained. The two design approaches are compared for their ability to meet control design specifications and for ease of application using control design examples. These approaches are also compared with traditional Proportional Integral Derivative (PID) control in order to demonstrate the limitations of PID control. Robustness of these designs is covered as it pertains to these control methodologies and to the example problems.
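The root locus idea, tracking how closed-loop poles migrate as the loop gain increases, can be shown numerically with nothing more than polynomial arithmetic. The plant and compensator below are invented for illustration and are not the paper's examples.

```python
# Minimal numeric sketch of the root-locus idea (illustrative plant, not the
# paper's examples): sweep the loop gain K and track the closed-loop poles of
#   L(s) = K * (s + 1) / (s * (s + 2) * (s + 10)),  closed loop: 1 + L(s) = 0.
import numpy as np

num = np.array([1.0, 1.0])                                          # (s + 1)
den = np.polymul([1.0, 0.0], np.polymul([1.0, 2.0], [1.0, 10.0]))   # s(s+2)(s+10)

for K in [1.0, 10.0, 50.0, 200.0]:
    # characteristic polynomial: den(s) + K * num(s), with num padded to match
    char_poly = np.polyadd(den, K * np.pad(num, (den.size - num.size, 0)))
    poles = np.roots(char_poly)
    print(f"K = {K:6.1f} -> closed-loop poles: {np.round(poles, 3)}")
```

Running the sweep shows the poles starting at the open-loop poles for small K and migrating toward the open-loop zero and asymptotes as K grows, which is exactly what a root locus plot depicts graphically.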
Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales
NASA Technical Reports Server (NTRS)
Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul
2010-01-01
Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.
Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales
NASA Technical Reports Server (NTRS)
Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul
2011-01-01
Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.
CARVALHO, Suzana Papile Maciel; BRITO, Liz Magalhães; de PAIVA, Luiz Airton Saavedra; BICUDO, Lucilene Arilho Ribeiro; CROSATO, Edgard Michel; de OLIVEIRA, Rogério Nogueira
2013-01-01
Validation studies of physical anthropology methods in the different population groups are extremely important, especially in cases in which the population variations may cause problems in the identification of a native individual by the application of norms developed for different communities. Objective: This study aimed to estimate the gender of skeletons by application of the method of Oliveira, et al. (1995), previously used in a population sample from Northeast Brazil. Material and Methods: The accuracy of this method was assessed for a population from Southeast Brazil and validated by statistical tests. The method used two mandibular measurements, namely the bigonial distance and the mandibular ramus height. The sample was composed of 66 skulls and the method was applied by two examiners. The results were statistically analyzed by the paired t test, logistic discriminant analysis and logistic regression. Results: The application of the method of Oliveira, et al. (1995) in this population achieved very different outcomes between genders, with 100% for females and only 11% for males, which may be explained by ethnic differences. However, statistical adjustment of measurement data for the population analyzed allowed accuracy of 76.47% for males and 78.13% for females, with the creation of a new discriminant formula. Conclusion: Methods involving physical anthropology present a high rate of accuracy for human identification, easy application, low cost and simplicity; however, the methodologies must be validated for the different populations due to differences in ethnic patterns, which are directly related to the phenotypic aspects. In this specific case, the method of Oliveira, et al. (1995) presented good accuracy and may be used for gender estimation in Brazil in two geographic regions, namely Northeast and Southeast; however, for other regions of the country (North, Central West and South), previous methodological adjustment is recommended, as demonstrated in this study. PMID:24037076
2008-07-23
This final rule applies to the Temporary Assistance for Needy Families (TANF) program and requires States, the District of Columbia and the Territories (hereinafter referred to as the "States") to use the "benefiting program" cost allocation methodology in U.S. Office of Management and Budget (OMB) Circular A-87 (2 CFR part 225). It is the judgment and determination of HHS/ACF that the "benefiting program" cost allocation methodology is the appropriate methodology for the proper use of Federal TANF funds. The Personal Responsibility and Work Opportunity Reconciliation Act (PRWORA) of 1996 gave federally-recognized Tribes the opportunity to operate their own Tribal TANF programs. Federally-recognized Indian tribes operating approved Tribal TANF programs have always followed the "benefiting program" cost allocation methodology in accordance with OMB Circular A-87 (2 CFR part 225) and the applicable regulatory provisions at 45 CFR 286.45(c) and (d). This final rule contains no substantive changes to the proposed rule published on September 27, 2006.
NASA Astrophysics Data System (ADS)
Greca, Ileana M.
2016-03-01
Several international reports promote the use of the inquiry teaching methodology for improvements in science education at elementary school. Nevertheless, research indicates that pre-service elementary teachers have insufficient experience with this methodology and when they try to implement it, the theory they learnt in their university education clashes with the classroom practice they observe, a problem that has also been noted with other innovative methodologies. So, it appears essential for pre-service teachers to conduct supportive reflective practice during their education to integrate theory and practice, which various studies suggest is not usually done. Our study shows how opening up a third discursive space can assist this supportive reflective practice. The third discursive space appears when pre-service teachers are involved in specific activities that allow them to contrast the discourses of theoretical knowledge taught at university with practical knowledge arising from their ideas on science and science teaching and their observations during classroom practice. The case study of three pre-service teachers shows that this strategy was fundamental in helping them to integrate theory and practice, resulting in a better understanding of the inquiry methodology and its application in the classroom.
Warehouses information system design and development
NASA Astrophysics Data System (ADS)
Darajatun, R. A.; Sukanta
2017-12-01
Materials/goods handling is fundamental for companies to ensure the smooth running of their warehouses. Efficiency and organization within every aspect of the business are essential in order to gain a competitive advantage. The purpose of this research is the design and development of a Kanban-based inventory storage and delivery system. The application aims to make inventory stock checks more efficient and effective. Users can easily input finished goods from the production department, the warehouse, customers, and suppliers. The master data were designed to be as complete as possible so that the application can be used in a variety of warehouse logistics processes. The author uses the Java programming language to develop the application, which is well suited to building Java Web applications, while the database used is MySQL. The system development methodology used is the Waterfall methodology, which comprises the stages of analysis, system design, implementation, integration, and operation and maintenance. In the process of collecting data the author uses observation, interviews, and a literature review.
Sequeiros, R C P; Neng, N R; Portugal, F C M; Pinto, M L; Pires, J; Nogueira, J M F
2011-04-01
This work describes the development, validation, and application of a novel methodology for the determination of testosterone and methenolone in urine matrices by stir bar sorptive extraction using polyurethane foams [SBSE(PU)] followed by liquid desorption and high-performance liquid chromatography with diode array detection. The methodology was optimized in terms of extraction time, agitation speed, pH, ionic strength and organic modifier, as well as back-extraction solvent and desorption time. Under optimized experimental conditions, convenient accuracy was achieved, with average recoveries of 49.7 ± 8.6% for testosterone and 54.2 ± 4.7% for methenolone. Additionally, the methodology showed good precision (<9%), excellent linearity (>0.9963) and convenient detection limits (0.2-0.3 μg/L). When comparing the efficiency obtained by SBSE(PU) with that of the conventional polydimethylsiloxane phase [SBSE(PDMS)], yields up to four-fold higher are attained for the former under the same experimental conditions. The application of the proposed methodology to the analysis of testosterone and methenolone in urine matrices showed negligible matrix effects and good analytical performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arrieta, Gabriela, E-mail: tonina1903@hotmail.com; Requena, Ignacio, E-mail: requena@decsai.ugr.es; Toro, Javier, E-mail: jjtoroca@unal.edu.co
Treatment and final disposal of Municipal Solid Waste can have a significant role in the generation of negative environmental impacts. As a prevention strategy, such activities are subjected to the process of Environmental Impact Assessment (EIA). Still, the follow-up of Environmental Management Plans or mitigation measures is limited, for one due to a lack of methodological approaches. In searching for possibilities, the University of Granada (Spain) developed a diagnostic methodology named EVIAVE, which allows one to quantify, by means of indexes, the environmental impact of landfills in view of their location and the conditions of exploitation. EVIAVE is applicable within the legal framework of the European Union and can be adapted to the environmental and legal conditions of other countries. This study entails its adaptation in Colombia, for the follow-up and control of the EIA process for landfills. Modifications involved inclusion of the environmental elements flora and fauna, and the evaluation of the environmental descriptors in agreement with the concept of vulnerability. The application of the modified EVIAVE in Colombian landfills allowed us to identify the elements affected by the operating conditions and maintenance. It may be concluded that this methodology is viable and effective for the follow-up and environmental control of EIA processes for landfills, and to analyze the associated risks, as it takes into account related environmental threats and vulnerabilities. - Highlights: • A modified methodology is used to monitor and follow-up environmental impacts in landfills. • The improved methodology includes the Vulnerability of Flora and Fauna to evaluate environmental impact of landfills. • The methodology serves to identify and evaluate the sources of risk generated in the construction and siting of landfills. • Environmental vulnerability indicators improve effectiveness of the control and follow-up phases of landfill management. • The follow-up of environmental management plans may help diminish the implementation gap in Environmental Impact Assessment.
Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving
Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice
2016-01-01
The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture. PMID:27727171
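The paper presents an architecture and methodology rather than a single algorithm, but one building block common to virtually all such systems is worth showing: inverse-variance (Kalman-style) weighting of two sensors' estimates of the same quantity. The radar/camera framing and all numbers below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of one common fusion building block (not the paper's
# architecture): inverse-variance weighting of two independent, unbiased
# measurements of the same scalar state, e.g., range to a lead vehicle.
def fuse(z1, var1, z2, var2):
    """Optimal linear fusion of two independent unbiased measurements."""
    w1 = var2 / (var1 + var2)
    w2 = var1 / (var1 + var2)
    fused = w1 * z1 + w2 * z2
    fused_var = (var1 * var2) / (var1 + var2)  # below the smaller input variance
    return fused, fused_var

# Illustrative: precise radar range vs. noisier camera-derived range
range_m, range_var = fuse(z1=24.8, var1=0.04, z2=25.4, var2=0.25)
print(f"fused range: {range_m:.2f} m (variance {range_var:.3f})")
```

The fused variance is always smaller than either input's, which is the quantitative sense in which adding sensors yields a more consistent model of the surroundings.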
Methodological and Pedagogical Potential of Reflection in Development of Contemporary Didactics
ERIC Educational Resources Information Center
Chupina, Valentina A.; Pleshakova, Anastasiia Yu.; Konovalova, Maria E.
2016-01-01
The relevance of the issue under research stems from the need of practical pedagogy to expand the methodological and methodical tools of contemporary didactics. The purpose of the article is to identify the methodological core of reflection as a form of thinking and to provide insight into it on the basis of the systematic attributes of the…
Determining Faculty and Student Views: Applications of Q Methodology in Higher Education
ERIC Educational Resources Information Center
Ramlo, Susan
2012-01-01
William Stephenson specifically developed Q methodology, or Q, as a means of measuring subjectivity. Q has been used to determine perspectives/views in a wide variety of fields from marketing research to political science but less frequently in education. In higher education, the author has used Q methodology to determine views about a variety of…
A theoretical and experimental investigation of propeller performance methodologies
NASA Technical Reports Server (NTRS)
Korkan, K. D.; Gregorek, G. M.; Mikkelson, D. C.
1980-01-01
This paper briefly covers aspects related to propeller performance by means of a review of propeller methodologies; a presentation of wind tunnel propeller performance data taken in the NASA Lewis Research Center 10 x 10 wind tunnel; a discussion of the predominant limitations of existing propeller performance methodologies; and a brief review of airfoil developments appropriate for propeller applications.
The temporal structure of pollution levels in developed cities.
Barrigón Morillas, Juan Miguel; Ortiz-Caraballo, Carmen; Prieto Gajardo, Carlos
2015-06-01
Currently, the need for mobility can cause significant pollution levels in cities, with important effects on health and quality of life. Any approach to the study of urban pollution and its effects requires an analysis of spatial distribution and temporal variability. A crucial challenge is to obtain proven methodologies that improve prediction quality while saving resources in spatial and temporal sampling. This work proposes a new analytical methodology for the study of temporal structure. As a result, a model for estimating annual levels of urban traffic noise was proposed. The average errors are less than one decibel for all acoustic indicators. This opens a new working methodology for the study of urban noise. Additionally, the approach has general application to the study of traffic-related pollution impacts, with implications for urban design and possibly for economic and sociological aspects. Copyright © 2015 Elsevier B.V. All rights reserved.
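As background to the sub-decibel annual estimates reported, the standard way to aggregate sampled equivalent levels into a long-term noise indicator is an energy (logarithmic) average rather than an arithmetic one. The sketch below shows that formula; the sample values are invented and the weekly-sampling scheme is an assumption, not the paper's design.

```python
# Background sketch: the standard energy (logarithmic) average used to
# aggregate sampled equivalent levels L_Aeq,i into a long-term indicator.
# Sample values are invented for illustration.
import numpy as np

weekly_leq_db = np.array([66.2, 67.8, 65.1, 68.4, 66.9, 64.7, 67.3])  # dB(A)

# L = 10 * log10( mean( 10^(L_i / 10) ) ) -- energies, not decibels, are averaged
annual_leq = 10.0 * np.log10(np.mean(10.0 ** (weekly_leq_db / 10.0)))
print(f"energy-averaged level: {annual_leq:.1f} dB(A)")
```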
An eco-balance of a recycling plant for spent lead-acid batteries.
Salomone, Roberta; Mondello, Fabio; Lanuzza, Francesco; Micali, Giuseppe
2005-02-01
This study applies Life Cycle Assessment (LCA) methodology to present an eco-balance of a recycling plant that treats spent lead-acid batteries. The recycling plant uses pyrometallurgical treatment to obtain lead from spent batteries. The application of LCA methodology (ISO 14040 series) enabled us to assess the potential environmental impacts arising from the recycling plant's operations. Thus, net emissions of greenhouse gases as well as other major environmental consequences were examined and hot spots inside the recycling plant were identified. A sensitivity analysis was also performed on certain variables to evaluate their effect on the LCA study. The LCA presented here shows that this methodology allows all of the major environmental consequences associated with lead recycling using the pyrometallurgical process to be examined. The study highlights areas in which environmental improvements are easily achievable by a business, providing a basis for suggestions to minimize the environmental impact of its production phases, improving process and company performance in environmental terms.
Navarro, Alejandra; Puig, Rita; Fullana-I-Palmer, Pere
2017-03-01
Carbon footprint (CF) is nowadays one of the most widely used environmental indicators. The scope of a CF assessment can be corporate (when all production processes of a company are evaluated, together with upstream and downstream processes, following a life cycle approach) or product (when one of the products is evaluated throughout its life cycle). Our hypothesis was that product CF (PCF) studies usually collect corporate data, because it is easier for companies to obtain them than product data. Six main methodological issues to take into account when collecting corporate data to be used for PCF studies were postulated and discussed in the present paper: fugitive emissions, credits from waste recycling, use of "equivalent factors", reference flow definition, accumulation, and allocation of corporate values to minor products. A big project with 18 wineries, wine being one of the most important agri-food products assessed through CF methodologies, was used to study and exemplify these 6 methodological issues. One of the main conclusions was that it is indeed possible to collect corporate inventory data on a per-year basis to perform a PCF, keeping in mind the 6 methodological issues described here. In the literature, most papers present their results as a PCF, while they collected company data and obtained, in fact, a "key performance indicator" (i.e., CO2-eq emissions per unit of product produced), which is then used as a product environmental impact figure. The methodology discussed in this paper for the wine case study is widely applicable to any other product or industrial activity. Copyright © 2017 Elsevier B.V. All rights reserved.
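The distinction the paper draws, a "key performance indicator" derived from corporate data versus a true product CF, reduces to simple arithmetic worth making explicit. All numbers below are invented, and the mass-based allocation to a co-product is one illustrative option among the allocation choices the paper discusses.

```python
# Toy sketch (invented numbers) of the "key performance indicator" described in
# the paper: corporate annual emissions divided by annual production, plus one
# simple mass-based allocation between a main product and a co-product.
corporate_emissions_kg_co2eq = 1_200_000.0  # whole-winery annual inventory
bottles_produced = 1_500_000                # annual production

kpi = corporate_emissions_kg_co2eq / bottles_produced
print(f"KPI: {kpi:.2f} kg CO2-eq per bottle")  # corporate data per unit of product

# Mass-based allocation if the same processes also yield a co-product
# (e.g., pomace), so only part of the burden belongs to the bottle of wine:
mass_share_wine = 0.85
product_cf = kpi * mass_share_wine
print(f"mass-allocated product CF: {product_cf:.2f} kg CO2-eq per bottle")
```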
Evans-Agnew, Robin A; Johnson, Susan; Liu, Fuqin; Boutain, Doris M
2016-08-01
Critical discourse analysis (CDA) is a promising methodology for policy research in nursing. As a critical theoretical methodology, researchers use CDA to analyze social practices and language use in policies to examine whether such policies may promote or impede social transformation. Despite the widespread use of CDA in other disciplines such as education and sociology, nursing policy research employing CDA methodology is sparse. To advance CDA use in nursing science, it is important to outline the overall research strategies and describe the steps of CDA in policy research. This article describes, using exemplar case studies, how nursing and health policy researchers can employ CDA as a methodology. Three case studies are provided to discuss the application of CDA research methodologies in nursing policy research: (a) implementation of preconception care policies in the Zhejiang province of China, (b) formation and enactment of statewide asthma policy in Washington state of the United States, and (c) organizational implementation of employee antibullying policies in hospital systems in the Pacific Northwest of the United States. Each exemplar details how CDA guided the examination of policy within specific contexts and social practices. The variations of the CDA approaches in the three exemplars demonstrated the flexibilities and potentials for conducting policy research grounded in CDA. CDA provides novel insights for nurse researchers examining health policy formation, enactment, and implementation. © The Author(s) 2016.
Environment, genes, and experience: lessons from behavior genetics.
Barsky, Philipp I
2010-11-01
The article reviews the theoretical analysis of the problems inherent in studying the environment within behavior genetics across several periods in the development of environmental studies in behavior genetics and proposes some possible alternatives to traditional approaches to studying the environment in behavior genetics. The first period (from the end of the 1920s to the end of the 1970s), when the environment was not actually studied, is called pre-environmental; during this time, the basic principles and theoretical models of understanding environmental effects in behavior genetics were developed. The second period is characterized by the development of studies on environmental influences within the traditional behavior genetics paradigm; several approaches to studying the environment emerged in behavior genetics during this period, from the beginning of the 1980s until today. At the present time, the field is undergoing paradigmatic changes, concerned with methodology, theory, and mathematical models of genotype-environment interplay; this might be the beginning of a third period of development of environmental studies in behavior genetics. In another part, the methodological problems related to environmental studies in behavior genetics are discussed. Although the methodology used in differential psychology is applicable for assessment of differences between individuals, it is insufficient to explain the sources of these differences. In addition, we stress that psychoanalytic studies of twins and their experiences, initiated in the 1930s and continued episodically until the 1980s, could bring an interesting methodology and contribute to the explanation of puzzling findings from environmental studies of behavior genetics. Finally, we will conclude with implications from the results of environmental studies in behavior genetics, including methodological issues. Copyright © 2010 Elsevier Ltd. All rights reserved.
Application of gamma spectrometry in the Kola peninsula (in Russian)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Golovin, I.V.; Kolesnik, N.I.; Antipov, V.S.
1973-01-01
The methodology used and results obtained in gamma spectrometric studies of pre-Cambrian formations of some nickel-bearing regions of the Kola Peninsula are described. The radioactive element contents of typical metamorphic and magmatic complexes and sulfide ores are presented. (au-trans)
Methods for Estimating the Uncertainty in Emergy Table-Form Models
Emergy studies have suffered criticism due to the lack of uncertainty analysis and this shortcoming may have directly hindered the wider application and acceptance of this methodology. Recently, to fill this gap, the sources of uncertainty in emergy analysis were described and an...
Training and Farmers' Organizations' Performance
ERIC Educational Resources Information Center
Miiro, Richard F.; Matsiko, Frank B.; Mazur, Robert E.
2014-01-01
Purpose: This study sought to determine the influence of training transfer factors and actual application of training on organization level outcomes among farmer owned produce marketing organizations in Uganda. Design/methodology/approach: Interviews based on the Learning Transfer Systems Inventory (LTSI) were conducted with 120 PMO leaders…
34 CFR 462.11 - What must an application contain?
Code of Federal Regulations, 2010 CFR
2010-07-01
... the methodology and procedures used to measure the reliability of the test. (h) Construct validity... previous test, and results from validity, reliability, and equating or standard-setting studies undertaken... NRS educational functioning levels (content validity). Documentation of the extent to which the items...
Remote sensing of Qatar nearshore habitats with perspectives for coastal management.
Warren, Christopher; Dupont, Jennifer; Abdel-Moati, Mohamed; Hobeichi, Sanaa; Palandro, David; Purkis, Sam
2016-04-30
A framework is proposed for utilizing remote sensing and ground-truthing field data to map benthic habitats in the State of Qatar, with potential application across the Arabian Gulf. Ideally the methodology can be applied to optimize the efficiency and effectiveness of mapping the nearshore environment to identify sensitive habitats, monitor for change, and assist in management decisions. The framework is applied to a case study for northeastern Qatar with a key focus on identifying high sensitivity coral habitat. The study helps confirm the presence of known coral and provides detail on a region in the area of interest where corals have not been previously mapped. Challenges for the remote sensing methodology associated with natural heterogeneity of the physical and biological environment are addressed. Recommendations on the application of this approach to coastal environmental risk assessment and management planning are discussed as well as future opportunities for improvement of the framework. Copyright © 2015 Elsevier Ltd. All rights reserved.
Roosta, M; Ghaedi, M; Daneshfar, A; Sahraei, R; Asghari, A
2014-01-01
The present study focused on the removal of methylene blue (MB) from aqueous solution by ultrasound-assisted adsorption onto gold nanoparticles loaded on activated carbon (Au-NP-AC). This nanomaterial was characterized using different techniques such as SEM, XRD, and BET. The effects of variables such as pH, initial dye concentration, adsorbent dosage (g), temperature, and sonication time (min) on MB removal were studied using a central composite design (CCD), and the optimum experimental conditions were found with a desirability function (DF) combined with response surface methodology (RSM). Fitting the experimental equilibrium data to various isotherm models such as the Langmuir, Freundlich, Tempkin, and Dubinin-Radushkevich models showed the suitability and applicability of the Langmuir model. Analysis of the experimental adsorption data with various kinetic models such as the pseudo-first- and second-order, Elovich, and intraparticle diffusion models showed the applicability of the pseudo-second-order equation. A small amount of the proposed adsorbent (0.01 g) achieves successful removal of MB (RE>95%) in a short time (1.6 min) with high adsorption capacity (104-185 mg g(-1)). Copyright © 2013 Elsevier B.V. All rights reserved.
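The abstract reports that the equilibrium data follow the Langmuir isotherm. As a minimal sketch of how such a fit is typically done (the data points and initial guesses below are hypothetical, not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)."""
    return qmax * KL * Ce / (1.0 + KL * Ce)

# Hypothetical equilibrium data: Ce (mg/L) vs. adsorbed amount qe (mg/g).
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
qe = np.array([55.0, 95.0, 130.0, 155.0, 170.0, 178.0])

(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[180.0, 0.1])
residuals = qe - langmuir(Ce, qmax, KL)
r2 = 1.0 - np.sum(residuals**2) / np.sum((qe - qe.mean())**2)
print(f"qmax = {qmax:.1f} mg/g, KL = {KL:.3f} L/mg, R^2 = {r2:.3f}")
```

A good Langmuir fit (R² close to 1) supports monolayer adsorption, which is the kind of criterion such studies apply when comparing isotherm models.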
LEVELS OF SYNTHETIC MUSK COMPOUNDS IN AQUATIC ...
Synthetic musk compounds are consumer chemicals manufactured as fragrance materials. Due to their high worldwide usage and release, they frequently occur in the aquatic and marine environments. The U.S. EPA (ORD, Las Vegas) developed surface-water monitoring methodology and conducted a one-year monthly monitoring of synthetic musks in water and biota from Lake Mead (Nevada) as well as from combined sewage effluent streams feeding Lake Mead. Presented are an overview of the chemistry, the monitoring methodology, and the significance of synthetic musk compounds in the aquatic environment. The research focus of the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, the Office of Water, and ORD in the area of water quality. Located in the subtasks are the various research projects being performed in support of this Task and more in-depth coverage of each project. Briefly, each project's objective is stated below. Subtask 1: To integrate state-of-the-art technologies (polar organic chemical integrative samplers, advanced solid-phase extraction methodologies with liquid chromatography/electrospray/mass spectrometry) and apply them to studying the sources and fate of a select list of PPCPs. Application and improvement of analytical methodologies that can detect non-volatile, polar, water-soluble pharmaceuticals in source waters at levels that could be environmentally significant (at concentrations less than p
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaminsky, J.; Tschanz, J.F.
In order to address barriers to community energy-conservation efforts, DOE has established the Comprehensive Community Energy Management (CCEM) program. The role of CCEM is to provide direction and technical support for energy-conservation efforts at the local level. The program to date has included project efforts to develop combinations and variations of community energy planning and management tools applicable to communities of diverse characteristics. This paper describes the salient features of some of the tools and relates them to the testing program soon to begin in several pilot-study communities. Two methodologies that arose within such an actual planning context are taken from DOE-sponsored projects in Clarksburg, West Virginia and the proposed new capital city for Alaska. Energy management in smaller communities and/or communities with limited funding and manpower resources has received special attention. One project of this type developed a general methodology that emphasizes efficient ways for small communities to reach agreement on local energy problems and potential solutions; through this guidance, the community is led to understand where it should concentrate its efforts in subsequent management activities. Another project concerns rapid growth of either a new or an existing community that could easily outstrip the management resources available locally. This methodology strives to enable the community to seize the opportunity for energy conservation by integrating the design of its energy systems and its development pattern. The last methodology creates applicable tools for comprehensive community energy planning. (MCW)
NASA Astrophysics Data System (ADS)
Abdeljabbar Kharrat, Nourhene; Plateaux, Régis; Miladi Chaabane, Mariem; Choley, Jean-Yves; Karra, Chafik; Haddar, Mohamed
2018-05-01
The present work tackles the modeling of multi-physics systems by applying a topological approach while proceeding with a new methodology that introduces a topological modification to the structure of systems. A comparison with Magos' methodology is then made; their common ground is the use of connectivity within systems. The comparison and analysis of the different types of modeling show the importance of the topological methodology through the integration of the topological modification into the topological structure of a multi-physics system. In order to validate this methodology, the case of a Pogo-stick is studied. The first step consists in generating a topological graph of the system. The connectivity step then takes into account the contact with the ground. In the last step of this research, the MGS language (Modeling of General System) is used to model the system through equations. Finally, the results are compared to those obtained with MODELICA. The proposed methodology may therefore be generalized to model multi-physics systems that can be considered as a set of local elements.
Introducing a methodology for estimating duration of surgery in health services research.
Redelmeier, Donald A; Thiruchelvam, Deva; Daneman, Nick
2008-09-01
The duration of surgery is an indicator of the quality, risks, and efficiency of surgical procedures. We introduce a new methodology for assessing the duration of surgery based on anesthesiology billing records, along with reviewing its fundamental logic and limitations. The validity of the methodology was assessed through a population-based cohort of patients (n=480,986) undergoing elective operations in 246 Ontario hospitals with 1,084 anesthesiologists between April 1, 1992 and March 31, 2002 (10 years). The weaknesses of the methodology relate to missing data, self-serving exaggerations by providers, imprecision from clinical diversity, upper limits due to accounting regulations, fluctuations from updates over the years, national differences in reimbursement schedules, and the general failings of claims-based analyses. The strengths of the methodology are in providing data that match clinical experience, correspond to chart review, are consistent over time, can detect differences where differences would be anticipated, and might have implications for examining patient outcomes after long surgical times. We suggest that an understanding and application of large studies of surgical duration may help scientists explore selected questions concerning postoperative complications.
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost, and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle: quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control, and visual management. A metric-setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
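The GQM paradigm refines a measurement goal into questions and then into concrete metrics. A minimal sketch of that refinement as a data structure (the goal, questions, and metrics are invented for illustration, not drawn from TAME or NASA/SEL):

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    unit: str

@dataclass
class Question:
    text: str
    metrics: list[Metric] = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    questions: list[Question] = field(default_factory=list)

# A goal refined into questions, each answered by concrete metrics.
goal = Goal(
    purpose="Improve reliability of module X during system test",
    questions=[
        Question("What is the current defect density?",
                 [Metric("defects per KLOC", "1/KLOC")]),
        Question("Is defect detection improving across builds?",
                 [Metric("defects found per test-hour", "1/h")]),
    ],
)
for q in goal.questions:
    print(q.text, "->", [m.name for m in q.metrics])
```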
Kubelka, Jan
2009-04-01
Many important biochemical processes occur on the time-scales of nanoseconds and microseconds. The introduction of the laser temperature-jump (T-jump) to biophysics more than a decade ago opened these previously inaccessible time regimes up to direct experimental observation. Since then, laser T-jump methodology has evolved into one of the most versatile and generally applicable methods for studying fast biomolecular kinetics. This perspective is a review of the principles and applications of the laser T-jump technique in biophysics. A brief overview of the T-jump relaxation kinetics and the historical development of laser T-jump methodology is presented. The physical principles and practical experimental considerations that are important for the design of the laser T-jump experiments are summarized. These include the Raman conversion for generating heating pulses, considerations of size, duration and uniformity of the temperature jump, as well as potential adverse effects due to photo-acoustic waves, cavitation and thermal lensing, and their elimination. The laser T-jump apparatus developed at the NIH Laboratory of Chemical Physics is described in detail along with a brief survey of other laser T-jump designs in use today. Finally, applications of the laser T-jump in biophysics are reviewed, with an emphasis on the broad range of problems where the laser T-jump methodology has provided important new results and insights into the dynamics of the biomolecular processes.
DB4US: A Decision Support System for Laboratory Information Management
Hortas, Maria Luisa; Baena-García, Manuel; Lana-Linati, Jorge; González, Carlos; Redondo, Maximino; Morales-Bueno, Rafael
2012-01-01
Background Until recently, laboratory automation has focused primarily on improving hardware. Future advances are concentrated on intelligent software, since laboratories performing clinical diagnostic testing require improved information systems to address their data processing needs. In this paper, we propose DB4US, an application that automates the handling of information related to laboratory quality indicators. Currently, there is a lack of ready-to-use management quality measures. This application addresses this deficiency through the extraction, consolidation, statistical analysis, and visualization of data related to the use of demographics, reagents, and turn-around times. The design and implementation issues, as well as the technologies used for the implementation of this system, are discussed in this paper. Objective To develop a general methodology that integrates the computation of ready-to-use management quality measures and a dashboard to easily analyze the overall performance of a laboratory, as well as automatically detect anomalies or errors. The novelty of our approach lies in the application of integrated web-based dashboards as an information management system in hospital laboratories. Methods We propose a new methodology for laboratory information management based on the extraction, consolidation, statistical analysis, and visualization of data related to demographics, reagents, and turn-around times, offering a dashboard-like web interface to the laboratory manager. The methodology comprises a unified data warehouse that stores and consolidates multidimensional data from different data sources. The methodology is illustrated through the implementation and validation of DB4US, a novel web application based on this methodology that constructs an interface to obtain ready-to-use indicators and offers the possibility to drill down from high-level metrics to more detailed summaries. The offered indicators are calculated beforehand so that they are ready to use when the user needs them. The design is based on a set of parallel processes to precalculate indicators. The application displays information related to tests, requests, samples, and turn-around times. The dashboard is designed to show the set of indicators on a single screen. Results DB4US was deployed for the first time in the Hospital Costa del Sol in 2008. Our evaluation shows the positive impact of this methodology for laboratory professionals: the use of our application has reduced the time needed for the elaboration of the different statistical indicators and has also provided information that has been used to optimize the usage of laboratory resources through the discovery of anomalies in the indicators. DB4US users benefit from Internet-based communication of results, since this information is available from any computer without having to install any additional software. Conclusions The proposed methodology and the accompanying web application, DB4US, automate the processing of information related to laboratory quality indicators and offer a novel approach for managing laboratory-related information, benefiting from an Internet-based communication mechanism. The application of this methodology has been shown to improve the usage of time, as well as other laboratory resources. PMID:23608745
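The indicator-precalculation idea at the core of such a system can be sketched in a few lines. The following is a hypothetical illustration (the column names, timestamps, and indicator choices are assumptions, not DB4US's actual schema):

```python
import pandas as pd

# Hypothetical LIS export: one row per sample, with registration and
# validation timestamps for each test.
df = pd.DataFrame({
    "test": ["glucose", "glucose", "troponin", "troponin", "troponin"],
    "registered": pd.to_datetime(["2008-05-01 08:00", "2008-05-01 08:30",
                                  "2008-05-01 09:00", "2008-05-01 09:10",
                                  "2008-05-01 09:40"]),
    "validated":  pd.to_datetime(["2008-05-01 09:10", "2008-05-01 09:20",
                                  "2008-05-01 09:45", "2008-05-01 10:40",
                                  "2008-05-01 10:05"]),
})
df["tat_min"] = (df["validated"] - df["registered"]).dt.total_seconds() / 60

# Precalculated, ready-to-use indicators: median and 90th-percentile TAT.
indicators = df.groupby("test")["tat_min"].agg(
    median="median", p90=lambda s: s.quantile(0.9))
print(indicators)
```

In a dashboard architecture, a table like `indicators` would be refreshed by background processes so that the user interface only reads precomputed values.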
van den Noort, Josien C; Verhagen, Rens; van Dijk, Kees J; Veltink, Peter H; Vos, Michelle C P M; de Bie, Rob M A; Bour, Lo J; Heida, Ciska T
2017-10-01
This proof-of-principle study describes the methodology and explores and demonstrates the applicability of a system, consisting of miniature inertial sensors on the hand and a separate force sensor, to objectively quantify hand motor symptoms in patients with Parkinson's disease (PD) in a clinical setting (off- and on-medication conditions). Four PD patients were measured in the off- and on-dopaminergic-medication conditions. Finger tapping, rapid hand opening/closing, hand pro/supination, tremor during rest, a mental task and a kinetic task, and wrist rigidity movements were measured with the system (called the PowerGlove). To demonstrate applicability, various outcome parameters of measured hand motor symptoms of the patients in the off- vs. on-medication conditions are presented. The methodology described and the results presented show the applicability of the PowerGlove in a clinical research setting to objectively quantify hand bradykinesia, tremor and rigidity in PD patients using a single system. The PowerGlove measured a difference between the off- and on-medication conditions in all tasks in the presented patients with most of its outcome parameters. Further study of the validity and reliability of the outcome parameters is required in a larger cohort of patients, to arrive at an optimal set of parameters that can assist in clinical evaluation and decision-making.
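As a rough sketch of how inertial data can be reduced to tremor outcome parameters (this is not the PowerGlove's actual pipeline; the sample rate, frequency band, and simulated signal are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100.0                      # sample rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
# Hypothetical angular-velocity signal: 5 Hz rest tremor plus sensor noise.
gyro = 0.8 * np.sin(2 * np.pi * 5.0 * t) + 0.1 * rng.standard_normal(t.size)

sig = gyro - gyro.mean()        # remove DC offset before spectral analysis
spectrum = np.abs(np.fft.rfft(sig))
freqs = np.fft.rfftfreq(sig.size, d=1 / fs)

band = (freqs >= 3) & (freqs <= 8)   # typical parkinsonian tremor band
dominant = freqs[band][np.argmax(spectrum[band])]
rms = np.sqrt(np.mean(sig ** 2))     # amplitude-type outcome parameter
print(f"dominant tremor frequency: {dominant:.2f} Hz, RMS amplitude: {rms:.3f} rad/s")
```

Comparing such parameters between off- and on-medication recordings is the kind of contrast the study reports.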
78 FR 66681 - Census Advisory Committees
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-06
..., filing of petitions and applications and agency statements of organization and functions are examples... policies, research and methodology, tests, operations, communications/messaging and other activities to..., socioeconomic, linguistic, technological, methodological, geographic, behavioral and operational variables...
Application of a statewide intermodal freight planning methodology.
DOT National Transportation Integrated Search
2001-08-01
Anticipating the need for Virginia to comply with the new freight planning requirements mandated by ISTEA and TEA-21, the Virginia Transportation Research Council in 1998 developed a Statewide Intermodal Freight Transportation Planning Methodology, w...
ERIC Educational Resources Information Center
Exarchou, Evi; Klonari, Aikaterini; Lambrinos, Nikos; Vaitis, Michalis
2017-01-01
This study focused on the analysis of Grade-12 (Senior) students' sociocultural constructivist interactions using Web 2.0 applications during a geographical research process. Within the study's methodological framework, a transdisciplinary case study (TdCS) drawing on ethnographic and action research data was designed, implemented and analyzed in real teaching…
Materials Selection Criteria for Nuclear Power Applications: A Decision Algorithm
NASA Astrophysics Data System (ADS)
Rodríguez-Prieto, Álvaro; Camacho, Ana María; Sebastián, Miguel Ángel
2016-02-01
An innovative methodology based on stringency levels is proposed in this paper to improve the current method for selecting structural materials used in demanding industrial applications. This paper describes a new approach for quantifying the stringency of materials requirements, based on a novel deterministic algorithm, to prevent potential failures. We have applied the new methodology to different standardized specifications used in pressure vessel design, such as the SA-533 Grade B Cl.1 and SA-508 Cl.3 specifications (issued by the American Society of Mechanical Engineers), the DIN 20MnMoNi55 specification (issued by the German Institute of Standardization) and the 16MND5 specification (issued by the French Nuclear Commission), and determine the influence of design code selection. This study is based on key scientific publications on the influence of chemical composition on the mechanical behavior of materials, which were not considered when the technological requirements were established in the aforementioned specifications. For this purpose, a new method to quantify the efficacy of each standard has been developed using a deterministic algorithm. The process of assigning relative weights was performed by consulting a panel of experts in materials selection for reactor pressure vessels to provide a more objective methodology; thus, the resulting mathematical calculations for quantitative analysis are greatly simplified. The final results show that steel DIN 20MnMoNi55 is the best material option. Additionally, the more recently developed materials DIN 20MnMoNi55, 16MND5 and SA-508 Cl.3 are subject to more stringent mechanical requirements than SA-533 Grade B Cl.1. The methodology presented in this paper can be used as a decision tool in the selection of materials for a wide range of applications.
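The core of such a deterministic algorithm is a weighted aggregation of stringency levels per requirement category. A minimal sketch (the categories, levels, and weights below are illustrative placeholders, not the paper's values):

```python
# Hypothetical stringency levels (0-1, higher = more demanding) assigned to
# each requirement category, and expert-elicited relative weights.
weights = {"chemistry": 0.4, "toughness": 0.35, "inspection": 0.25}

specs = {
    "SA-533 Grade B Cl.1": {"chemistry": 0.55, "toughness": 0.60, "inspection": 0.70},
    "SA-508 Cl.3":         {"chemistry": 0.70, "toughness": 0.75, "inspection": 0.70},
    "DIN 20MnMoNi55":      {"chemistry": 0.85, "toughness": 0.80, "inspection": 0.75},
    "16MND5":              {"chemistry": 0.80, "toughness": 0.75, "inspection": 0.75},
}

def stringency_score(levels: dict[str, float]) -> float:
    """Deterministic weighted sum of per-category stringency levels."""
    return sum(weights[k] * v for k, v in levels.items())

# Rank specifications from most to least stringent.
for spec in sorted(specs, key=lambda s: stringency_score(specs[s]), reverse=True):
    print(f"{spec}: {stringency_score(specs[spec]):.3f}")
```

With the expert-elicited weights fixed, the ranking is fully deterministic and reproducible, which is the property the paper emphasizes.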
Danielson, Jeffrey J.; Poppenga, Sandra K.; Brock, John C.; Evans, Gayla A.; Tyler, Dean; Gesch, Dean B.; Thatcher, Cindy A.; Barras, John
2016-01-01
During the coming decades, coastlines will respond to widely predicted sea-level rise, storm surge, and coastal inundation flooding from disastrous events. Because physical processes in coastal environments are controlled by the geomorphology of over-the-land topography and underwater bathymetry, many applications of geospatial data in coastal environments require detailed knowledge of the near-shore topography and bathymetry. In this paper, an updated methodology used by the U.S. Geological Survey Coastal National Elevation Database (CoNED) Applications Project is presented for developing coastal topobathymetric elevation models (TBDEMs) from multiple topographic data sources with adjacent intertidal topobathymetric and offshore bathymetric sources to generate seamlessly integrated TBDEMs. This repeatable, updatable, and logically consistent methodology assimilates topographic data (land elevation) and bathymetry (water depth) into a seamless coastal elevation model. Within the overarching framework, vertical datum transformations are standardized in a workflow that interweaves spatially consistent interpolation (gridding) techniques with a land/water boundary mask delineation approach. Output gridded raster TBDEMs are stacked into a file storage system of mosaic datasets within an Esri ArcGIS geodatabase for efficient updating while maintaining current and updated spatially referenced metadata. Topobathymetric data provide a required seamless elevation product for several science application studies, such as shoreline delineation, coastal inundation mapping, sediment transport, sea-level rise, storm surge models, and tsunami impact assessment. These detailed coastal elevation data are critical to depict regions prone to climate change impacts and are essential to planners and managers responsible for mitigating the associated risks and costs to both human communities and ecosystems. The CoNED methodology approach has been used to construct integrated TBDEM models in Mobile Bay, the northern Gulf of Mexico, San Francisco Bay, the Hurricane Sandy region, and southern California.
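The land/water-mask merge step at the heart of a topobathymetric DEM can be illustrated compactly. A toy sketch on tiny arrays (real CoNED processing also performs vertical datum transformation, interpolation, and mosaicking):

```python
import numpy as np

# Hypothetical co-registered grids on a common vertical datum: land elevation
# (m, NaN over water) and bathymetric depth (m positive down, NaN over land).
topo  = np.array([[2.0, 1.2, np.nan], [1.5, 0.3, np.nan]])
bathy = np.array([[np.nan, np.nan, 3.5], [np.nan, np.nan, 2.1]])

water_mask = np.isnan(topo)                 # land/water boundary mask
tbdem = np.where(water_mask, -bathy, topo)  # depth becomes negative elevation
print(tbdem)                                # one seamless elevation surface
```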
Digital Methodology to implement the ECOUTER engagement process.
Wilson, Rebecca C; Butters, Oliver W; Clark, Tom; Minion, Joel; Turner, Andrew; Murtagh, Madeleine J
2016-01-01
ECOUTER (Employing COnceptUal schema for policy and Translation Engagement in Research; French for 'to listen') is a new stakeholder engagement method incorporating existing evidence to help participants draw upon their own knowledge of cognate issues and interact on a topic of shared concern. The results of an ECOUTER can form the basis of recommendations for research, governance, practice and/or policy. This paper describes the development of a digital methodology for the ECOUTER engagement process based on currently available mind-mapping freeware. The implementation of an ECOUTER process tailored to applications within health studies is outlined for both online and face-to-face scenarios. Limitations of the present digital methodology are discussed, highlighting the requirement for purpose-built software for ECOUTER research purposes.
Methodology of developing a smartphone application for crisis research and its clinical application.
Zhang, Melvyn W B; Ho, Cyrus S H; Fang, Pan; Lu, Yanxia; Ho, Roger C M
2014-01-01
Recent advancements in Internet-based technologies have resulted in the growth of a sub-specialized field, termed "infodemiology" and "infoveillance". Infoveillance refers to the collation of infodemiology measures for the purpose of surveillance and trending. Previous research has only demonstrated the research potential of the Web 2.0 medium for the collation of data in crisis situations. The objectives of the current study are to demonstrate the methodology of implementing a smartphone-based application for the dissemination and collation of information during a crisis situation. The Haze smartphone application was developed using an online application builder with HTML5 as the core programming language. A five-phase development method was adopted, including a) formulation of user requirements, b) system design, c) system development, d) system evaluation and finally e) system application and implementation. The smartphone application was deployed during a one-week period via a self-sponsored Facebook post and via direct dissemination of the web links by email. A total of 298 respondents took part in the survey within the application. Most of them were between 20 and 29 years old and had a university education. More individuals preferred the option of accessing and providing feedback to a survey on physical and psychological wellbeing via direct access to a Web-based questionnaire. In addition, the participants reported a mean of 4.03 physical symptoms (SD 2.6). The total Impact of Event Scale-Revised (IES-R) score was 18.47 (SD 11.69), which indicated that the study population did experience psychological stress but not posttraumatic stress disorder. A perceived dangerous Pollutant Standards Index (PSI) level and the number of physical symptoms were associated with a higher IES-R score (P<0.05). This study demonstrates how a smartphone application could potentially be used to acquire research data in a crisis situation. However, it is crucial for future research to further evaluate its effectiveness in a crisis situation.
A methodological systematic review of what's wrong with meta-ethnography reporting.
France, Emma F; Ring, Nicola; Thomas, Rebecca; Noyes, Jane; Maxwell, Margaret; Jepson, Ruth
2014-11-19
Syntheses of qualitative studies can inform health policy, services and our understanding of patient experience. Meta-ethnography is a systematic seven-phase interpretive qualitative synthesis approach well-suited to producing new theories and conceptual models. However, there are concerns about the quality of meta-ethnography reporting, particularly the analysis and synthesis processes. Our aim was to investigate the application and reporting of methods in recent meta-ethnography journal papers, focusing on the analysis and synthesis process and output. Methodological systematic review of health-related meta-ethnography journal papers published from 2012-2013. We searched six electronic databases, Google Scholar and Zetoc for papers using key terms including 'meta-ethnography.' Two authors independently screened papers by title and abstract with 100% agreement. We identified 32 relevant papers. Three authors independently extracted data and all authors analysed the application and reporting of methods using content analysis. Meta-ethnography was applied in diverse ways, sometimes inappropriately. In 13% of papers the approach did not suit the research aim. In 66% of papers reviewers did not follow the principles of meta-ethnography. The analytical and synthesis processes were poorly reported overall. In only 31% of papers reviewers clearly described how they analysed conceptual data from primary studies (phase 5, 'translation' of studies) and in only one paper (3%) reviewers explicitly described how they conducted the analytic synthesis process (phase 6). In 38% of papers we could not ascertain if reviewers had achieved any new interpretation of primary studies. In over 30% of papers seminal methodological texts which could have informed methods were not cited. We believe this is the first in-depth methodological systematic review of meta-ethnography conduct and reporting. Meta-ethnography is an evolving approach. Current reporting of methods, analysis and synthesis lacks clarity and comprehensiveness. This is a major barrier to use of meta-ethnography findings that could contribute significantly to the evidence base because it makes judging their rigour and credibility difficult. To realise the high potential value of meta-ethnography for enhancing health care and understanding patient experience requires reporting that clearly conveys the methodology, analysis and findings. Tailored meta-ethnography reporting guidelines, developed through expert consensus, could improve reporting.
Seventh NASTRAN User's Colloquium
NASA Technical Reports Server (NTRS)
1978-01-01
The general application of finite element methodology and the specific application of NASTRAN to a wide variety of static and dynamic structural problems are described. Topics include: fluids and thermal applications, NASTRAN programming, substructuring methods, unique new applications, general auxiliary programs, specific applications, and new capabilities.
Evaluating Multi-Input/Multi-Output Digital Control Systems
NASA Technical Reports Server (NTRS)
Pototzky, Anthony S.; Wieseman, Carol D.; Hoadley, Sherwood T.; Mukhopadhyay, Vivek
1994-01-01
A controller-performance-evaluation (CPE) methodology for multi-input/multi-output (MIMO) digital control systems was developed. The procedures identify potentially destabilizing controllers and confirm satisfactory performance of stabilizing ones. The methodology is generic and can be used in many types of multi-loop digital-controller applications, including digital flight-control systems, digitally controlled spacecraft structures, and actively controlled wind-tunnel models. It is also applicable to other complex, highly dynamic digital controllers, such as those in high-performance robot systems.
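One simple check in the spirit of CPE is verifying that a candidate digital controller leaves the closed loop stable. A minimal sketch (the closed-loop matrix is hypothetical, and the NASA methodology involves considerably more, such as measured open-loop frequency responses):

```python
import numpy as np

# Closed-loop state matrix of a hypothetical MIMO sampled-data system,
# x[k+1] = A_cl x[k].
A_cl = np.array([
    [0.90, 0.05, 0.00, 0.02],
    [0.00, 0.85, 0.10, 0.00],
    [0.02, 0.00, 0.95, 0.05],
    [0.00, 0.03, 0.00, 0.80],
])

eigvals = np.linalg.eigvals(A_cl)
spectral_radius = np.max(np.abs(eigvals))
# A discrete-time closed loop is stable iff all eigenvalues lie inside the
# unit circle; a radius near or above 1 flags a potentially destabilizing
# controller before it is ever connected to hardware.
print(f"spectral radius = {spectral_radius:.3f} ->",
      "stable" if spectral_radius < 1 else "potentially destabilizing")
```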
Application of Design Methodologies for Feedback Compensation Associated with Linear Systems
NASA Technical Reports Server (NTRS)
Smith, Monty J.
1996-01-01
The work that follows is concerned with the application of design methodologies for feedback compensation associated with linear systems. In general, the intent is to provide a well-behaved closed-loop system in terms of stability and robustness (internal signals remain bounded with a certain amount of uncertainty) and simultaneously achieve an acceptable level of performance. The approach here has been to convert the closed-loop system and control synthesis problem into the interpolation setting. The interpolation formulation then serves as our mathematical representation of the design process. Lifting techniques have been used to solve the corresponding interpolation and control synthesis problems. Several applications using this multiobjective design methodology have been included to show the effectiveness of these techniques. In particular, the mixed H2/H-infinity performance criterion and its algorithm have been used on several examples, including an F-18 HARV (High Angle of Attack Research Vehicle) for sensitivity performance.
Current evidence of percutaneous nucleoplasty for the cervical herniated disk: a systematic review.
Wullems, Jorgen A; Halim, Willy; van der Weegen, Walter
2014-07-01
Although percutaneous cervical nucleoplasty (PCN) has been shown to be both safe and effective, its application is still debated. PCN applied in disk herniation has not been systematically reviewed before, resulting in a limited insight into its effectiveness and safety, and the quality of available evidence. Therefore, we systematically reviewed the evidence on the efficacy and safety of PCN in patients with a (contained) herniated disk. MEDLINE, EMBASE, and the Cochrane Library (Central Register of Controlled Trials) were searched for randomized controlled trials (RCTs) and nonrandomized studies using the following keywords: "Nucleoplasty," "Cervical," "Hernia," "Herniation," "Prolapse," "Protrusion," "Intervertebral disk," and "Percutaneous disk decompression." First, all articles were appraised for methodological quality, and then, RCTs were graded for the level of evidence according a best-evidence synthesis, because a meta-analysis was not possible. Finally, the RCTs' applicability and clinical relevance also was assessed. Of 75 identified abstracts, 10 full-text articles were included (3 RCTs and 7 nonrandomized studies). These studies represented a total of 1021 patients: 823 patients (≥ 892 disks) were treated by PCN. All studies showed low methodological quality, except for two. The level of evidence of the RCTs was graded as moderate, with low to moderate applicability and clinical relevance. All included studies showed PCN to be an effective and safe procedure in the treatment of (contained) herniated disks at short-, mid-, and long-term follow-up. However, the level of evidence is moderate and shows only low to moderate applicability and clinical relevance. © 2013 World Institute of Pain.
The International Baccalaureate Diploma Programme in Mexico as Preparation for Higher Education
ERIC Educational Resources Information Center
Saavedra, Anna Rosefsky; Lavore, Elisa; Flores-Ivich, Georgina
2016-01-01
In this study we analyse the relationship between Mexican students' enrolment in the International Baccalaureate (IB) Diploma Programme (DP) and their college preparedness using a case-study methodology. We found that from the Mexican schools that offer the IB DP, most IB students are fairly successful in their college applications, such that the…
ERIC Educational Resources Information Center
Madrigal-Hopes, Diana L.; Villavicencio, Edna; Foote, Martha M.; Green, Chris
2014-01-01
This qualitative study examined the impact of a six-step framework for work-specific vocabulary instruction in adult English language learners (ELLs). Guided by research in English as a second language (ESL) methodology and the transactional theory, the researchers sought to unveil how these processes supported the acquisition and application of…
ERIC Educational Resources Information Center
Phipps, Charmaine
2013-01-01
Purpose: The purpose of this study was to identify which of the motivators of organizational citizenship behavior (OCB) present in the literature are reported as applicable to community college faculty and to examine the nature of those motivators. Methodology: A mixed-methods design was selected for this study. Emphasis was on the qualitative…
ERIC Educational Resources Information Center
Hong, Guanglei; Yu, Bing
2008-01-01
This study examines the effects of kindergarten retention on children's social-emotional development in the early, middle, and late elementary years. Previous studies have generated mixed results partly due to some major methodological challenges, including selection bias, measurement error, and divergent perceptions of multiple respondents in…
ERIC Educational Resources Information Center
Engel, Anna; Coll, Cesar; Bustos, Alfonso
2013-01-01
This work explores some methodological challenges in the application of Social Network Analysis (SNA) to the study of "Asynchronous Learning Networks" (ALN). Our interest in the SNA is situated within the framework of the study of Distributed Teaching Presence (DTP), understood as the exercise of educational influence, through a multi-method…
Aggarwal, Vasudha; Ha, Taekjip
2014-11-01
Macromolecular interactions play a central role in many biological processes. Protein-protein interactions have mostly been studied by co-immunoprecipitation, which cannot provide quantitative information on all possible molecular connections present in a complex. We review a new approach that allows cellular proteins and biomolecular complexes to be studied in real time at the single-molecule level. This technique is called single-molecule pull-down (SiMPull) because it integrates principles of conventional immunoprecipitation with powerful single-molecule fluorescence microscopy. SiMPull is used to count how many copies of each protein are present in the physiological complexes found in the cytosol and membranes. Concurrently, it serves as a single-molecule biochemical tool to perform functional studies on the pulled-down proteins. In this review, we focus on the detailed methodology of SiMPull, its salient features and a wide range of biological applications in comparison with other biosensing tools. © 2014 WILEY Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-05-01
Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational models, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate the viability of the approach as an effective engineering tool for analysis and optimization.
A new approach to assessing the water footprint of wine: an Italian case study.
Lamastra, Lucrezia; Suciu, Nicoleta Alina; Novelli, Elisa; Trevisan, Marco
2014-08-15
Agriculture is the largest freshwater consumer, accounting for 70% of the world's water withdrawal. Water footprints (WFs) are being increasingly used to indicate the impacts of water use by production systems. A new methodology to assess the WF of wine was developed in the framework of the V.I.V.A. project (Valutazione Impatto Viticoltura sull'Ambiente), launched by the Italian Ministry for the Environment in 2011 to improve the Italian wine sector's sustainability. The new methodology enables different wines from the same winery to be compared; this was achieved by calculating the gray water footprint following the Tier III approach proposed by Hoekstra et al. (2011). The impact of water use during the life cycle of grape and wine production was assessed for six different wines from the same winery in Sicily, Italy, using both the newly developed methodology (V.I.V.A.) and the classical methodology proposed by the Water Footprint Network (WFN). In all cases green water was the largest contributor to the WF, but the new methodology also detected differences between wines of the same winery. Furthermore, the V.I.V.A. methodology assesses water body contamination by pesticide application, whereas the WFN methodology considers only fertilization; this explains the higher WF of vineyard 4 calculated by V.I.V.A. compared with the WF calculated with the WFN methodology. Comparing the WFs of the wines produced with grapes from the six different vineyards, the factors most greatly influencing the results obtained in this study were: distance from the water body, fertilization rate, and the amount and eco-toxicological behavior of the active ingredients used. Copyright © 2014 Elsevier B.V. All rights reserved.
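The gray water footprint both methodologies build on is a dilution volume: the pollutant load reaching the water body divided by the difference between the ambient quality standard and the natural background concentration (Hoekstra et al., 2011). A minimal sketch with hypothetical numbers:

```python
def gray_wf(application_rate_kg_ha, leach_fraction, c_max_mg_l, c_nat_mg_l,
            yield_ton_ha):
    """Gray water footprint in m^3 per tonne of product (values hypothetical)."""
    load_kg_ha = leach_fraction * application_rate_kg_ha      # pollutant load
    # 1 mg/L == 1 g/m^3, so (kg -> g) / (g/m^3) gives m^3 of dilution water.
    dilution_m3_ha = (load_kg_ha * 1000.0) / (c_max_mg_l - c_nat_mg_l)
    return dilution_m3_ha / yield_ton_ha

# Example: 100 kg N/ha fertilizer, 10% leaching, 50 mg/L nitrate limit,
# 5 mg/L natural background, 9 t grapes/ha -> about 25 m^3 per tonne.
print(f"{gray_wf(100, 0.10, 50, 5, 9):.1f} m^3 per tonne of grapes")
```

The V.I.V.A. extension applies the same dilution logic to pesticide active ingredients, which is why its gray WF can exceed the fertilizer-only WFN figure.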
Drenkard, K N
2001-01-01
The application of a strategic planning methodology for the discipline of nursing, as used by a large nonprofit integrated healthcare system, is described. The methodology uses a transformational leadership assessment tool, quality planning methods, and large-group intervention to engage nurses in the implementation of strategies. Based on systems theory, the methodology outlined by the author has application at any level of an organization, from an entire delivery network to a patient care unit. The author discusses getting started on a strategic planning journey, tools that are useful in the process, integrating already existing business plans into the strategies for nursing, preliminary measures to monitor progress, and lessons learned along the journey.
Fostering Effective Leadership in Foreign Contexts through Study of Cultural Values
ERIC Educational Resources Information Center
Schenck, Andrew D.
2016-01-01
While leadership styles have been extensively examined, cultural biases implicit within research methodologies often preclude application of results in foreign contexts. To more holistically comprehend the impact of culture on leadership, belief systems were empirically correlated to both transactional and transformational tendencies in public…
Theory and Scholarly Inquiry Need Not Be Scientific to Be of Value.
ERIC Educational Resources Information Center
Martin, Jack
1995-01-01
Reacts to responses concerning a previous article in this issue, "Against Scientism in Psychological Counselling and Therapy." Reasserts that there are important, undeniable limitations to the application of physical science methodologies and epistemologies to the study of humans and their experiences. (JPS)
Qualitative Audience Research: Toward an Integrative Approach to Reception.
ERIC Educational Resources Information Center
Jensen, Klaus Bruhn
1987-01-01
Analyzes research about the mass communication audience and describes a theoretical and methodological framework for further empirical studies. Discusses the (1) explanatory value of qualitative research; (2) social and cultural implications of the reception process, with special reference to television; and (3) applications and social relevance…
Application of the Hardman methodology to the Army Remotely Piloted Vehicle (RPV)
NASA Technical Reports Server (NTRS)
1983-01-01
The application of the HARDMAN Methodology to the Remotely Piloted Vehicle (RPV) is described. The methodology was used to analyze the manpower, personnel, and training (MPT) requirements of the proposed RPV system design for a number of operating scenarios. The RPV system is defined as consisting of the equipment, personnel, and operational procedures needed to perform five basic artillery missions: reconnaissance, target acquisition, artillery adjustment, target designation and damage assessment. The RPV design evaluated includes an air vehicle (AV), a modular integrated communications and navigation system (MICNS), a ground control station (GCS), a launch subsystem (LS), a recovery subsystem (RS), and a number of ground support requirements. The HARDMAN Methodology is an integrated set of data base management techniques and analytic tools, designed to provide timely and fully documented assessments of the human resource requirements associated with an emerging system's design.
PDTRT special section: Methodological issues in personality disorder research.
Widiger, Thomas A
2017-10-01
Personality Disorders: Theory, Research, and Treatment includes a rolling, ongoing Special Section concerned with methodological issues in personality disorder research. This third installment of the series includes two articles. The first is by Brian Hicks, Angus Clark, and Emily Durbin: "Person-Centered Approaches in the Study of Personality Disorders." The second article is by Steve Balsis: "Item Response Theory Applications in Personality Disorder Research." Both articles should be excellent resources for future research, and certainly for manuscripts submitted to this journal that use these analytic tools. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Epistasis analysis using artificial intelligence.
Moore, Jason H; Hill, Doug P
2015-01-01
Here we introduce artificial intelligence (AI) methodology for detecting and characterizing epistasis in genetic association studies. The ultimate goal of our AI strategy is to analyze genome-wide genetics data as a human would, using sources of expert knowledge as a guide. The methodology presented here is based on computational evolution, which is a type of genetic programming. The ability to generate interesting solutions while at the same time learning how to solve the problem at hand distinguishes computational evolution from other genetic programming approaches. We provide a general overview of this approach and then present a few examples of its application to real data.
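The authors' computational-evolution system is well beyond a few lines of code. As a far simpler illustration of what "detecting epistasis" means, here is a joint-genotype association test on simulated data with a purely interactive (XOR-like) effect that neither locus shows on its own (all parameters are invented):

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(0)
n = 2000
# Hardy-Weinberg genotypes (0/1/2) with allele frequency chosen so that
# carrier frequency is ~0.5, which makes the marginal effects vanish.
snp1 = rng.binomial(2, 0.293, n)
snp2 = rng.binomial(2, 0.293, n)
# Purely epistatic model: elevated risk only when exactly one locus carries
# a risk allele (XOR of carrier status).
p = np.where((snp1 > 0) ^ (snp2 > 0), 0.6, 0.3)
case = rng.random(n) < p

# Test the 9 joint genotype classes against case/control status.
table = np.zeros((9, 2))
for g1, g2, c in zip(snp1, snp2, case):
    table[3 * g1 + g2, int(c)] += 1
chi2, pval, dof, _ = chi2_contingency(table)
print(f"joint-genotype test: chi2 = {chi2:.1f}, dof = {dof}, p = {pval:.2e}")
```

A single-locus test on these data finds little, while the joint test is strongly significant; the AI methodology searches for such joint patterns genome-wide, guided by expert knowledge.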
Development of Techniques for Visualization of Scalar and Vector Fields in the Immersive Environment
NASA Technical Reports Server (NTRS)
Bidasaria, Hari B.; Wilson, John W.; Nealy, John E.
2005-01-01
Visualization of scalar and vector fields in the immersive environment (CAVE - Cave Automated Virtual Environment) is important for its application to radiation shielding research at NASA Langley Research Center. A complete methodology and the underlying software for this purpose have been developed. The developed software has been put to use for the visualization of the Earth's magnetic field, and in particular for the study of the South Atlantic Anomaly. The methodology has also been put to use for the visualization of geomagnetically trapped protons and electrons within Earth's magnetosphere.
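For context, the geomagnetic field being visualized is often approximated, to first order, by a magnetic dipole. A sketch of sampling such a field at a point (the constants are approximate, and this is not the authors' CAVE software):

```python
import numpy as np

# Ideal dipole approximation of the geomagnetic field (SI units):
# B(r) = (mu0 / 4 pi) * (3 (m . r_hat) r_hat - m) / |r|^3
MU0_OVER_4PI = 1e-7                   # T*m/A
m = np.array([0.0, 0.0, -8.0e22])     # dipole moment (A*m^2), roughly Earth's

def dipole_field(r):
    r = np.asarray(r, dtype=float)
    rmag = np.linalg.norm(r)
    rhat = r / rmag
    return MU0_OVER_4PI * (3.0 * np.dot(m, rhat) * rhat - m) / rmag**3

# Field at one Earth radius on the equator (x-axis), in tesla:
RE = 6.371e6
print(dipole_field([RE, 0.0, 0.0]))   # ~3e-5 T, dominated by the z-component
```

Evaluating such a function over a 3-D grid yields the vector field that an immersive renderer would display as field lines or glyphs.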
Tagging Water Sources in Atmospheric Models
NASA Technical Reports Server (NTRS)
Bosilovich, M.
2003-01-01
Tagging of water sources in atmospheric models allows for quantitative diagnostics of how water is transported from its source region to its sink region. In this presentation, we review how this methodology is applied to global atmospheric models. We will present several applications of the methodology. In one example, the regional sources of water for the North American Monsoon system are evaluated by tagging the surface evaporation. In another example, the tagged water is used to quantify the global water cycling rate and residence time. We will also discuss the need for more research and the importance of these diagnostics in water cycle studies.
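The tagging idea can be shown in a toy one-dimensional advection model: a tagged moisture field receives only the evaporation from a chosen source region but is transported identically to total moisture, so their ratio gives the source's contribution everywhere downstream (all values below are arbitrary illustration, not a real atmospheric model):

```python
import numpy as np

# Toy 1-D domain: moisture q advected by the wind; q_tag tracks only water
# that evaporated from the tagged source region.
nx, u, dx, dt = 100, 5.0, 1.0e5, 1.0e4      # cells, m/s, m, s (CFL = 0.5)
q = np.full(nx, 1.0)                        # background moisture (arbitrary)
q_tag = np.zeros(nx)
source = slice(20, 30)                      # cells forming the source region
evap = 1e-6                                 # evaporation rate in the source

for _ in range(500):
    # identical upwind transport for total and tagged moisture
    q[1:] -= u * dt / dx * (q[1:] - q[:-1])
    q_tag[1:] -= u * dt / dx * (q_tag[1:] - q_tag[:-1])
    # evaporation adds to both fields, but only source evaporation is tagged
    q[source] += evap * dt
    q_tag[source] += evap * dt

frac = q_tag / q    # fraction of local moisture originating from the source
print(f"max source-water fraction downstream: {frac[30:].max():.2%}")
```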
Methodology of development of a Delirium clinical application and initial feasibility results.
Zhang, Melvyn W B; Ho, Roger C M; Sockalingam, Sanjeev
2015-01-01
Delirium is a highly prevalent condition in hospital settings, with prevalence rates ranging from 6% to 56% in previous studies. A recent review provides evidence for the need for practice tools at the point of care to increase impact and to improve patient outcomes related to delirium care. The major challenge is to help clinicians and allied healthcare workers maintain the required skill sets over time. There have been massive advances in smartphone technologies, and several papers have recently been published about how clinicians could be application developers. The following study illustrates how the authors made use of the latest advances in application-creation technologies in designing a delirium education application containing protocols appropriate to their healthcare setting. The study serves as a pilot project aimed at implementing smartphone technologies in delirium education, to determine its feasibility as well as users' perspectives towards such an implementation. The Delirium UHN Application was developed between February 2013 and September 2014. Making use of the methodologies shared by Zhang MWB et al., the authors embarked on the development of the web-based and native applications. The web-based application was developed using the HTML5 programming language with the aid of an online application builder. Psychiatry residents and allied health professionals at the University of Toronto were recruited to help evaluate the pilot web-based version of the application. Since the introduction of the web-based application during the delirium awareness week, there have been a total of 1165 unique accesses to the online web-based application. Of significance, there was a shift in the confidence levels of the participants with regard to the management of delirium after using the application. A plurality of the participants (44.0%) reported being moderately comfortable with managing delirium prior to using the application, but this changed after the implementation of the application, with 39.0% reporting being very confident and 44.0% being extremely confident about managing delirium after using the application. 69.0% of the participants also perceived the smartphone application to be of use in their clinical care for delirious patients. This study is one of the first to demonstrate the potential usage of smartphone innovations in delirium education. The current study demonstrated the feasibility of smartphone applications and showed that users perceived themselves as more able to manage delirium after using the smartphone application.
Feijoo-Cid, Maria; Moriña, David; Gómez-Ibáñez, Rebeca; Leyva-Moral, Juan M
2017-03-01
To evaluate nursing students' satisfaction with Expert Patient Illness Narratives as a teaching and learning methodology based on patient involvement. Mixed methods were used in this study: an online survey with quantitative and qualitative items designed by the researchers. Sixty-four nursing students of the Universitat Autònoma de Barcelona, attending a Medical Anthropology elective course, participated. Women more frequently considered that the new learning methodology was useful in developing the competency "to reason the presence of the triad Health-Illness-Care in all the groups, societies and historical moments" (p-value=0.02) and that it was consolidated as a learning outcome (p-value=0.022). On the other hand, men considered that this methodology facilitated the development of critical thinking (p=0.01) and the ability to identify normalized or deviant care situations (p=0.007). Students recognized the value of Expert Patient Illness Narratives in their nursing training as a way to acquire new nursing skills and broaden previously acquired knowledge. This educational innovation improved nursing skills and provided a different and richer perspective on the humanization of care. The results of the present study demonstrate that nursing students found Expert Patient Illness Narratives satisfactory as a learning and teaching methodology, and reported improvement in different areas of their training as well as the integration of new knowledge, meaning, theory applicability, and critical and reflective thinking. The involvement of patients as storytellers also provides a new humanizing perspective on care. Nonetheless, further studies of Expert Patient Illness Narratives are needed in order to improve its benefits as a teaching and learning methodology. Copyright © 2016 Elsevier Ltd. All rights reserved.
a New Ubiquitous-Based Indoor Positioning System with Minimum Extra Hardware Using Smart Phones
NASA Astrophysics Data System (ADS)
Hassany Pazoky, S.; Chehreghan, A.; Sadeghi Niaraki, A.; Abbaspour, R. Ali
2014-10-01
Knowing one's position has been an ambition in many areas, such as science, the military, and business. GPS was the realization of this wish in the 1970s. Technological advances such as ubiquitous computing, as a conquering perspective, require any service to work for any user, any place, anytime, and via any network. As GPS cannot provide services in indoor environments, many scientists began to develop indoor positioning systems (IPS). Smartphones, having penetrated our everyday lives, are a great platform to host IPS applications, and the sensors in smartphones were another big motive to develop such applications. Many researchers have been working on the topic, developing various applications. However, the applications introduced lack simplicity; in other words, they need a step counter installed or the smartphone mounted on the ankle, which makes them awkward and inapplicable in many situations. In the current study, a new IPS methodology is introduced using only the usual sensors embedded in smartphones. The robustness of this methodology cannot compete with that of the aforementioned approaches; the price paid for simplicity is decreased robustness and more complicated methods and formulations. However, methods and tricks to constrain the errors to an acceptable range are introduced as future work.
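A sketch of the kind of pedestrian dead reckoning such a methodology might build on, using only accelerometer peaks for step detection and a magnetometer heading (the simulated signal, threshold, stride length, and fixed heading are all assumptions, not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 50.0                                  # sample rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
# Hypothetical accelerometer magnitude: gravity plus ~2 Hz walking bounce.
acc = 9.81 + 1.5 * np.sin(2 * np.pi * 2.0 * t) + 0.2 * rng.standard_normal(t.size)

# Smooth, then detect steps as local maxima above a threshold.
acc_s = np.convolve(acc, np.ones(5) / 5, mode="same")
thresh = 10.5
peaks = (acc_s[1:-1] > thresh) & (acc_s[1:-1] > acc_s[:-2]) & (acc_s[1:-1] > acc_s[2:])
step_idx = np.where(peaks)[0] + 1

stride = 0.7                               # assumed stride length (m)
heading = np.deg2rad(30.0)                 # magnetometer heading, assumed fixed
pos = np.array([0.0, 0.0])
for _ in step_idx:
    pos += stride * np.array([np.cos(heading), np.sin(heading)])
print(f"{step_idx.size} steps detected, estimated position: {pos.round(2)} m")
```

The accumulated heading and stride errors in such a scheme are exactly the errors the paper proposes to constrain.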
Methodology and issues of integral experiments selection for nuclear data validation
NASA Astrophysics Data System (ADS)
Tatiana, Ivanova; Ivanov, Evgeny; Hill, Ian
2017-09-01
Nuclear data validation involves a large suite of Integral Experiments (IEs) for criticality, reactor physics and dosimetry applications [1]. Benchmarks are often taken from international handbooks [2, 3]. Depending on the application, IEs have different degrees of usefulness in validation, and the use of a single benchmark is usually not advised; indeed, it may lead to erroneous interpretations and results [1]. This work aims at quantifying the importance of benchmarks used in application-dependent cross-section validation. The approach is based on the well-known General Linear Least Squares Method (GLLSM), extended to establish biases and uncertainties for given cross sections (within a given energy interval). The statistical treatment results in a vector of weighting factors for the integral benchmarks. These factors characterize the value added by a benchmark to nuclear data validation for the given application. The methodology is illustrated by one example: selecting benchmarks for 239Pu cross-section validation. The studies were performed in the framework of Subgroup 39 (Methods and approaches to provide feedback from nuclear and covariance data adjustment for improvement of nuclear data files), established at the Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD).
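A GLLSM-style adjustment, and one way to weight benchmarks by their contribution, can be sketched with small matrices (all numbers are hypothetical; this illustrates the linear algebra, not the Subgroup 39 implementation):

```python
import numpy as np

# Prior covariance M of two cross-section parameters (relative units),
# benchmark sensitivities S (dk/k per dx/x), and calculated-vs-experiment
# discrepancies d with experimental covariance V.
M = np.diag([0.04, 0.09])
S = np.array([[0.8, 0.1],                 # each row: one integral benchmark
              [0.2, 0.7],
              [0.5, 0.5]])
V = np.diag([0.01, 0.01, 0.02])
d = np.array([0.015, -0.020, 0.005])      # (C/E - 1) discrepancies

G = S @ M @ S.T + V
adjustment = M @ S.T @ np.linalg.solve(G, d)          # GLLS parameter update
M_post = M - M @ S.T @ np.linalg.solve(G, S @ M)      # posterior covariance
print("adjustment:", adjustment.round(4))
print("posterior std:", np.sqrt(np.diag(M_post)).round(4))

# A simple importance measure: the extra posterior variance incurred when a
# benchmark is removed; a larger loss means the benchmark carries more weight.
for i in range(S.shape[0]):
    keep = [j for j in range(S.shape[0]) if j != i]
    Gk = S[keep] @ M @ S[keep].T + V[np.ix_(keep, keep)]
    Mk = M - M @ S[keep].T @ np.linalg.solve(Gk, S[keep] @ M)
    print(f"benchmark {i}: variance lost without it = "
          f"{np.trace(Mk) - np.trace(M_post):.5f}")
```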
Peer Review of Grant Applications: Criteria Used and Qualitative Study of Reviewer Practices
Abdoul, Hendy; Perrey, Christophe; Amiel, Philippe; Tubach, Florence; Gottot, Serge; Durand-Zaleski, Isabelle; Alberti, Corinne
2012-01-01
Background Peer review of grant applications has been criticized as lacking reliability. Studies showing poor agreement among reviewers supported this possibility but usually focused on reviewers' scores and failed to investigate reasons for disagreement. Here, our goal was to determine how reviewers rate applications, by investigating reviewer practices and grant assessment criteria. Methods and Findings We first collected and analyzed a convenience sample of French and international calls for proposals and assessment guidelines, from which we created an overall typology of assessment criteria comprising nine domains: relevance to the call for proposals, usefulness, originality, innovativeness, methodology, feasibility, funding, ethical aspects, and writing of the grant application. We then performed a qualitative study of reviewer practices, particularly regarding the use of assessment criteria, among reviewers of the French Academic Hospital Research Grant Agencies (Programmes Hospitaliers de Recherche Clinique, PHRCs). Semi-structured interviews and observation sessions were conducted. Both the time spent assessing each grant application and the assessment methods varied across reviewers. The assessment criteria recommended by the PHRCs were listed by all reviewers as frequently evaluated and useful. However, use of the PHRC criteria was subjective and varied across reviewers. Some reviewers gave the same weight to each assessment criterion, whereas others considered originality to be the most important criterion (12/34), followed by methodology (10/34) and feasibility (4/34). Conceivably, this variability might adversely affect the reliability of the review process, and studies evaluating this hypothesis would be of interest. Conclusions Variability across reviewers may result in mistrust among grant applicants about the review process. Consequently, ensuring transparency is of the utmost importance. Consistency in the review process could also be improved by providing common definitions for each assessment criterion and uniform requirements for grant application submissions. Further research is needed to assess the feasibility and acceptability of these measures. PMID:23029386
Analyzing checkpointing trends for applications on the IBM Blue Gene/P system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naik, H.; Gupta, R.; Beckman, P.
Current petascale systems have tens of thousands of hardware components and complex system software stacks, which increase the probability of faults occurring during the lifetime of a process. Checkpointing has been a popular method of providing fault tolerance in high-end systems. While considerable research has been done to optimize checkpointing, in practice the method still involves a high-cost overhead for users. In this paper, we study the checkpointing overhead seen by applications running on leadership-class machines such as the IBM Blue Gene/P at Argonne National Laboratory. We study various applications and design a methodology to assist users in understanding and choosing checkpointing frequency and reducing the overhead incurred. In particular, we study three popular applications -- the Grid-Based Projector-Augmented Wave application, the Carr-Parrinello Molecular Dynamics application, and a Nek5000 computational fluid dynamics application -- and analyze their memory usage and possible checkpointing trends on 32,768 processors of the Blue Gene/P system.
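A standard starting point for choosing checkpointing frequency is Young's first-order approximation, which balances checkpoint cost against the expected rework after a failure (shown here as general background with hypothetical values; the paper's methodology is based on the applications' measured memory usage):

```python
import math

def optimal_checkpoint_interval(checkpoint_cost_s, mtbf_s):
    """Young's first-order approximation: tau = sqrt(2 * C * MTBF)."""
    return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

# Hypothetical values: a 5-minute checkpoint on a machine with a 24 h MTBF.
C, MTBF = 300.0, 24 * 3600.0
tau = optimal_checkpoint_interval(C, MTBF)
overhead = C / tau          # first-order fraction of time spent checkpointing
print(f"checkpoint every {tau / 3600:.2f} h, ~{overhead:.1%} overhead")
```

With these numbers the rule suggests checkpointing every two hours at roughly 4% overhead; shrinking checkpoint size (the memory-usage lever the paper analyzes) lowers C and hence the overhead.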
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% compared with the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lower computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
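For context on the nested formulation the unilevel method is benchmarked against: a double-loop RBDO is an outer cost minimization whose probabilistic constraint invokes an inner reliability analysis. A toy sketch, assuming a single normal random variable so the inner loop has a closed form (in a real problem it would be a FORM or Monte Carlo computation; all numbers are invented):

    from scipy.optimize import minimize
    from scipy.stats import norm

    SIGMA, PF_TARGET = 0.5, 1e-3           # assumed scatter and target failure probability

    def failure_prob(d):
        """Inner loop: reliability analysis. The limit state g(X) = X - 5 with
        X ~ N(d, SIGMA) has a closed-form failure probability; in general this
        step is a FORM or Monte Carlo computation."""
        return norm.cdf((5.0 - d[0]) / SIGMA)

    # Outer loop: minimize a cost proportional to d, subject to Pf <= PF_TARGET.
    res = minimize(lambda d: d[0], x0=[6.0],
                   constraints=[{"type": "ineq",
                                 "fun": lambda d: PF_TARGET - failure_prob(d)}])
    print(res.x[0], failure_prob(res.x))   # d* ~ 6.55, Pf ~ 1e-3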
NASA Astrophysics Data System (ADS)
Moglia, Magnus; Sharma, Ashok K.; Maheepala, Shiroma
2012-07-01
Planning of regional and urban water resources, in particular with Integrated Urban Water Management approaches, often considers the inter-relationships between human uses of water, the health of the natural environment, and the cost of various management strategies. Decision makers hence typically need to consider a combination of social, environmental and economic goals. The types of strategies employed can include water efficiency measures, water sensitive urban design, stormwater management, or catchment management. Therefore, decision makers need to choose between different scenarios and to evaluate them against a number of criteria. This type of problem has a discipline devoted to it, namely Multi-Criteria Decision Analysis, which has often been applied in water management contexts. This paper describes the application of Subjective Logic in a basic Bayesian Network to a Multi-Criteria Decision Analysis problem. In doing so, it outlines a novel methodology that explicitly incorporates uncertainty and information reliability. Applying the methodology to a known case study context allows its behaviour to be explored. By making the uncertainty and reliability of assessments explicit, the approach allows the risks of various options to be assessed, which may help alleviate cognitive biases and support a well-formulated risk management policy.
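In Subjective Logic, an assessment is a binomial opinion (b, d, u, a): belief, disbelief, uncertainty (b + d + u = 1) and base rate, so source reliability enters as the uncertainty mass. A minimal sketch of the standard cumulative fusion operator for two such opinions (the criterion and numbers are invented; this is not the paper's actual model):

    def fuse(o1, o2):
        """Cumulative fusion of two binomial opinions (b, d, u, a), assuming
        equal base rates and non-zero uncertainty masses."""
        (b1, d1, u1, a), (b2, d2, u2, _) = o1, o2
        k = u1 + u2 - u1 * u2
        return ((b1 * u2 + b2 * u1) / k, (d1 * u2 + d2 * u1) / k, u1 * u2 / k, a)

    def expected(o):
        """Projected probability of an opinion: E = b + a * u."""
        b, _, u, a = o
        return b + a * u

    # Two assessments of "this option meets the environmental criterion" (invented):
    reliable_source = (0.7, 0.1, 0.2, 0.5)   # low uncertainty
    weak_source     = (0.4, 0.1, 0.5, 0.5)   # high uncertainty (less reliable data)
    fused = fuse(reliable_source, weak_source)
    print(fused, expected(fused))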
Application of tolerance limits to the characterization of image registration performance.
Fedorov, Andriy; Wells, William M; Kikinis, Ron; Tempany, Clare M; Vangel, Mark G
2014-07-01
Deformable image registration is used increasingly in image-guided interventions and other applications. However, validation and characterization of registration performance remain areas that require further study. We propose an analysis methodology for deriving tolerance limits on the initial conditions for deformable registration that reliably lead to a successful registration. This approach results in a concise summary of the probability of registration failure, while accounting for the variability in the test data. The (β, γ) tolerance limit can be interpreted as a value of the input parameter that leads to a successful registration outcome in at least 100β% of cases with 100γ% confidence. The utility of the methodology is illustrated by summarizing the performance of a deformable registration algorithm evaluated in three different experimental setups of increasing complexity. Our examples are based on clinical data collected during MRI-guided prostate biopsy registered using a publicly available deformable registration tool. The results indicate that the proposed methodology can be used to generate concise graphical summaries of the experiments, as well as a probabilistic estimate of the registration outcome for a future sample. Its use may facilitate improved objective assessment, comparison and retrospective stress-testing of deformable registration.
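One way to read a (β, γ) statement of this kind: with s successful registrations out of n trials at a given initial condition, a one-sided Clopper-Pearson bound gives the success probability exceeded with 100γ% confidence. A sketch of that check (a generic binomial tolerance computation, not necessarily the estimator used in the paper; the counts are invented):

    from scipy.stats import beta

    def lower_conf_bound(successes: int, n: int, gamma: float) -> float:
        """One-sided Clopper-Pearson lower bound on a binomial success probability."""
        if successes == 0:
            return 0.0
        return float(beta.ppf(1.0 - gamma, successes, n - successes + 1))

    # Invented counts: 58 of 60 registrations succeeded at this initial condition.
    lb = lower_conf_bound(58, 60, gamma=0.95)
    print(f"success probability >= {lb:.3f} with 95% confidence (require >= 0.90)")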
Integration Methodology For Oil-Free Shaft Support Systems: Four Steps to Success
NASA Technical Reports Server (NTRS)
Howard, Samuel A.; DellaCorte, Christopher; Bruckner, Robert J.
2010-01-01
Commercial applications for Oil-Free turbomachinery are slowly becoming a reality. Micro-turbine generators, high-speed electric motors, and electrically driven centrifugal blowers are a few examples of products available in today's commercial marketplace. Gas foil bearing technology makes most of these applications possible. A significant volume of component level research has led to recent acceptance of gas foil bearings in several specialized applications, including those mentioned above. Component tests identifying such characteristics as load carrying capacity, power loss, thermal behavior, rotordynamic coefficients, etc. all help the engineer design foil bearing machines, but the development process can be just as important. As the technology gains momentum and acceptance in a wider array of machinery, the complexity and variety of applications will grow beyond the current class of machines. Following a robust integration methodology will help improve the probability of successful development of future Oil-Free turbomachinery. This paper describes a previously successful four-step integration methodology used in the development of several Oil-Free turbomachines. Proper application of the methods put forward here enables successful design of Oil-Free turbomachinery. In addition, when significant design changes are made or unique machinery is developed, this four-step process must be considered.
Horizon Mission Methodology - A tool for the study of technology innovation and new paradigms
NASA Technical Reports Server (NTRS)
Anderson, John L.
1993-01-01
The Horizon Mission (HM) methodology was developed to provide a means of identifying and evaluating highly innovative, breakthrough technology concepts (BTCs) and for assessing their potential impact on advanced space missions. The methodology is based on identifying new capabilities needed by hypothetical 'horizon' space missions having performance requirements that cannot be met even by extrapolating known space technologies. Normal human evaluation of new ideas such as BTCs appears to be governed (and limited) by 'inner models of reality' defined as paradigms. Thus, new ideas are evaluated by old models. This paper describes the use of the HM Methodology to define possible future paradigms that would provide alternatives to evaluation by current paradigms. The approach is to represent a future paradigm by a set of new BTC-based capabilities - called a paradigm abstract. The paper describes methods of constructing and using the abstracts for evaluating BTCs for space applications and for exploring the concept of paradigms and paradigm shifts as a representation of technology innovation.
Towards sustainable mobile systems configurations: Application to a tuna purse seiner.
García Rellán, A; Vázquez Brea, C; Bello Bugallo, P M
2018-08-01
Fishing is one of the most important marine activities. It contributes to both overfishing and marine pollution, the two main threats to the ocean environment. In this context, the aim of this work is to investigate and validate methodologies for the identification of more sustainable operating configurations for a tuna purse seiner. The proposed methodology is based on a previous one applied to secondary industrial systems, taking into account the Integrated Pollution Prevention and Control focus, developed for the most potentially polluting industrial sources. The idea is to apply the same type of methodologies and concepts used for secondary industrial point sources to a primary industrial mobile activity. This methodology combines two tools: "Material and Energy Flow Analysis" (a tool from industrial metabolism) and "Best Available Techniques Analysis". The first provides a way to detect "Improvable Flows" in the system, and the second provides a way to define sustainable options to improve them. Five main Improvable Flows have been identified in the selected case study, the activity of a purse seiner, most of them related to energy consumption and air emissions in different stages of the fishing activity. Thirty-one Best Available Techniques candidates for the system, which could potentially improve the sustainability of the activity, have been inventoried. Seven of them have not yet been implemented in the case study. The potential improvements of the system proposed by this work are related to energy efficiency, waste management, and prevention and control of air emissions. This methodology proves to be a good tool not only for sustainable point-source systems but also for sustainable mobile systems such as fishing activity in the oceans, as validated here for the tuna purse seiner. The practical application of the identified technologies to fishing systems will contribute to preventing and reducing marine pollution, one of the greatest threats to today's oceans. Copyright © 2017 Elsevier B.V. All rights reserved.
Cancer diagnosis by infrared spectroscopy: methodological aspects
NASA Astrophysics Data System (ADS)
Jackson, Michael; Kim, Keith; Tetteh, John; Mansfield, James R.; Dolenko, Brion; Somorjai, Raymond L.; Orr, F. W.; Watson, Peter H.; Mantsch, Henry H.
1998-04-01
IR spectroscopy is proving to be a powerful tool for the study and diagnosis of cancer. The application of IR spectroscopy to the analysis of cultured tumor cells and grading of breast cancer sections is outlined. Potential sources of error in spectral interpretation due to variations in sample histology and artifacts associated with sample storage and preparation are discussed. The application of statistical techniques to assess differences between spectra and to non-subjectively classify spectra is demonstrated.
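As an illustration of the kind of non-subjective spectral classification mentioned above, a linear discriminant classifier can be cross-validated on spectral feature vectors. A self-contained sketch on synthetic stand-in "spectra" (the actual work used measured IR spectra and its own statistical pipeline):

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Synthetic stand-in for IR spectra: 40 absorbance vectors over 200 wavenumber
    # channels, with a small class-dependent shift in one band (purely illustrative).
    X = rng.normal(size=(40, 200))
    y = np.repeat([0, 1], 20)            # 0 = normal tissue, 1 = tumour
    X[y == 1, 80:90] += 0.8              # simulated band difference

    clf = LinearDiscriminantAnalysis()
    print(cross_val_score(clf, X, y, cv=5).mean())   # non-subjective classification rate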
NASA Technical Reports Server (NTRS)
Martino, J. P.; Lenz, R. C., Jr.; Chen, K. L.; Kahut, P.; Sekely, R.; Weiler, J.
1979-01-01
The appendices for the cross impact methodology are presented. These include: user's guide, telecommunication events, cross impacts, projection of historical trends, and projection of trends in satellite communications.
Circulating Tumor Cells: Moving Biological Insights into Detection
Chen, Lichan; Bode, Ann M; Dong, Zigang
2017-01-01
Circulating tumor cells (CTCs) have shown promising potential as liquid biopsies that facilitate early detection, prognosis, therapeutic target selection and monitoring treatment response. CTCs in most cancer patients are low in abundance and heterogeneous in morphological and phenotypic profiles, which complicate their enrichment and subsequent characterization. Several methodologies for CTC enrichment and characterization have been developed over the past few years. However, integrating recent advances in CTC biology into these methodologies and the selection of appropriate enrichment and characterization methods for specific applications are needed to improve the reliability of CTC biopsies. In this review, we summarize recent advances in the studies of CTC biology, including the mechanisms of their generation and their potential forms of existence in blood, as well as the current CTC enrichment technologies. We then critically examine the selection of methods for appropriately enriching CTCs for further investigation of their clinical applications. PMID:28819450
Feasibility and benefits of laminar flow control on supersonic cruise airplanes
NASA Technical Reports Server (NTRS)
Powell, A. G.; Agrawal, S.; Lacey, T. R.
1989-01-01
An evaluation was made of the applicability and benefits of laminar flow control (LFC) technology to supersonic cruise airplanes. Ancillary objectives were to identify the technical issues critical to supersonic LFC application, and to determine how those issues can be addressed through flight and wind-tunnel testing. Vehicle types studied include a Mach 2.2 supersonic transport configuration, a Mach 4.0 transport, and two Mach 2-class fighter concepts. Laminar flow control methodologies developed for subsonic and transonic wing laminarization were extended and applied. No intractable aerodynamic problems were found in applying LFC to airplanes of the Mach 2 class, even ones of large size. Improvements of 12 to 17 percent in lift-drag ratios were found. Several key technical issues, such as contamination avoidance and excrescence criteria, were identified. Recommendations are made for their resolution. A need for an inverse supersonic wing design methodology is indicated.
NASA Astrophysics Data System (ADS)
Nieland, Simon; Kleinschmit, Birgit; Förster, Michael
2015-05-01
Ontology-based applications hold promise in improving spatial data interoperability. In this work we use remote sensing-based biodiversity information and apply semantic formalisation and ontological inference to show improvements in data interoperability/comparability. The proposed methodology includes an observation-based, "bottom-up" engineering approach for remote sensing applications and gives a practical example of semantic mediation of geospatial products. We apply the methodology to three different nomenclatures used for remote sensing-based classification of two heathland nature conservation areas in Belgium and Germany. We analysed sensor nomenclatures with respect to their semantic formalisation and their bio-geographical differences. The results indicate that a hierarchical and transparent nomenclature is far more important for transferability than the sensor or study area. The inclusion of additional information, not necessarily belonging to a vegetation class description, is a key factor for the future success of using semantics for interoperability in remote sensing.
Application of electrical geophysics to the release of water resources, case of Ain Leuh (Morocco)
NASA Astrophysics Data System (ADS)
Zitouni, A.; Boukdir, A.; El Fjiji, H.; Baite, W.; Ekouele Mbaki, V. R.; Ben Said, H.; Echakraoui, Z.; Elissami, A.; El Maslouhi, M. R.
2018-05-01
Given the growing need for water in our country for domestic, industrial and agricultural uses, prospecting for groundwater by classical geological and hydrogeological methods remains inapplicable in regions where reconnaissance drillings or soundings are not available in sufficient number. In such cases, geophysical prospecting methods such as nuclear magnetic resonance (NMR) and ground-penetrating radar are the most commonly used, because they have shown decisive results worldwide in groundwater prospecting and resource evaluation projects. The present work, which concerns only the electrical resistivity method, presents the adopted methodological approach and a case study of its application on the plateau of Ajdir, Ain Leuh.
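For a Schlumberger electrode array, the electrical resistivity method referenced here reduces to converting each measured voltage/current pair into an apparent resistivity through a geometric factor. A sketch with invented field values (the survey's actual array geometry is not given in the abstract):

    import math

    def schlumberger_rho_a(ab_m: float, mn_m: float, dv_v: float, i_a: float) -> float:
        """Apparent resistivity (ohm.m) for a Schlumberger array:
        rho_a = K * dV / I, with K = pi * ((AB/2)^2 - (MN/2)^2) / MN."""
        k = math.pi * ((ab_m / 2.0) ** 2 - (mn_m / 2.0) ** 2) / mn_m
        return k * dv_v / i_a

    # Invented reading: AB = 100 m, MN = 10 m, dV = 25 mV, I = 200 mA -> ~97 ohm.m
    print(schlumberger_rho_a(100.0, 10.0, 0.025, 0.2))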
The effect of erosion on the fatigue limit of metallic materials for aerospace applications
NASA Astrophysics Data System (ADS)
Kordatos, E. Z.; Exarchos, D. A.; Matikas, T. E.
2018-03-01
This work deals with the study of the fatigue behavior of metallic materials for aerospace applications which have undergone erosion. In particular, an innovative non-destructive methodology based on infrared lock-in thermography was applied to aluminum samples for the rapid determination of their fatigue limit. The effect of erosion on the structural integrity of materials can lead to catastrophic failure, and therefore an efficient assessment of the fatigue behavior is of high importance. Infrared thermography (IRT), as a non-destructive, non-contact, real-time and full-field method, can be employed to rapidly determine the fatigue limit. The basic principle of this method is the detection and monitoring of the intrinsically dissipated energy due to cyclic fatigue loading. This methodology was successfully applied to both eroded and non-eroded aluminum specimens in order to evaluate the severity of erosion.
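Thermographic rapid fatigue-limit estimation is commonly done by fitting the self-heating branch of the temperature-rise-versus-stress-amplitude curve and extrapolating it back to zero temperature rise (the Risitano approach; whether this exact variant was used here is an assumption, and the data below are invented):

    import numpy as np

    # Invented lock-in thermography data: stress amplitude (MPa) vs. stabilized
    # temperature rise (K); self-heating becomes significant above the fatigue limit.
    stress = np.array([60.0, 80.0, 100.0, 120.0, 140.0, 160.0, 180.0])
    d_temp = np.array([0.02, 0.03, 0.05, 0.40, 0.85, 1.30, 1.75])

    rising = stress >= 120.0                  # assumed onset of the self-heating branch
    slope, intercept = np.polyfit(stress[rising], d_temp[rising], 1)
    print(f"estimated fatigue limit ~ {-intercept / slope:.0f} MPa")   # ~102 MPa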
Energy modelling in sensor networks
NASA Astrophysics Data System (ADS)
Schmidt, D.; Krämer, M.; Kuhn, T.; Wehn, N.
2007-06-01
Wireless sensor networks are one of the key enabling technologies for the vision of ambient intelligence. Energy resources for sensor nodes are very scarce. A key challenge is the design of energy-efficient communication protocols. Models of the energy consumption are needed to accurately simulate the efficiency of a protocol or application design, and can also be used for automatic energy optimizations in a model-driven design process. We propose a novel methodology to create models for sensor nodes based on a few simple measurements. In a case study the methodology was used to create models for MICAz nodes. The models were integrated in a simulation environment as well as in an SDL runtime framework of a model-driven design process. Measurements on a test application that was created automatically from an SDL specification showed an 80% reduction in energy consumption compared to an implementation without power saving strategies.
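A common shape for such node energy models is state-based: a current draw per operating state, with energy accumulated as current x voltage x dwell time. A minimal sketch (the per-state currents are invented stand-ins for a MICAz-class node, not the paper's measured values):

    # Invented per-state current draws (mA) at an assumed 3 V supply.
    CURRENT_MA = {"sleep": 0.02, "cpu_active": 8.0, "radio_rx": 19.7, "radio_tx": 17.4}
    VOLTAGE = 3.0

    def energy_mj(schedule):
        """Energy in millijoules for a list of (state, seconds) intervals:
        E = sum(I_state * V * t_state)."""
        return sum(CURRENT_MA[state] * VOLTAGE * t for state, t in schedule)

    # One second of operation: duty-cycled protocol vs. always-on receiver.
    duty_cycled = [("sleep", 0.99), ("radio_rx", 0.005), ("radio_tx", 0.005)]
    print(energy_mj(duty_cycled), energy_mj([("radio_rx", 1.0)]))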
Clinical practice guidelines in hypertension: a review.
Álvarez-Vargas, Mayita Lizbeth; Galvez-Olortegui, José Kelvin; Galvez-Olortegui, Tomas Vladimir; Sosa-Rosado, José Manuel; Camacho-Saavedra, Luis Arturo
2015-10-23
The aim of this study is the methodological evaluation of Clinical Practice Guidelines (CPG) in hypertension. This is the first in a series of articles reviewing and assessing the methodology and content of clinical practice guidelines in cardiology. Three clinical practice guidelines were selected, and the Appraisal of Guidelines for Research and Evaluation (AGREE II) instrument was used to assess each guideline. The guidelines obtained the lowest score in the domain of applicability (mean 43.8%), while the highest score was for clarity of presentation (mean 81.5%). The lowest percentage was found in the applicability domain (European guideline) and the highest of all scores was found in two domains: scope and purpose, and clarity of presentation (Canadian guideline). In this quality assessment, the Canadian guideline obtained the best scores under the AGREE II instrument and is recommended for use without modification.
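The percentages quoted are AGREE II scaled domain scores, computed from per-appraiser, per-item ratings on the instrument's 1-7 scale as (obtained - minimum possible) / (maximum possible - minimum possible). A sketch of that calculation (the scores below are invented, not from this review):

    def agree_domain_score(ratings):
        """AGREE II scaled domain score (%): (obtained - min) / (max - min),
        where each item is rated 1-7 by each appraiser."""
        n_cells = sum(len(r) for r in ratings)      # appraisers x items
        obtained = sum(map(sum, ratings))
        return 100.0 * (obtained - 1 * n_cells) / (7 * n_cells - 1 * n_cells)

    # Invented scores: two appraisers rating three items of one domain.
    print(f"{agree_domain_score([[3, 4, 2], [4, 3, 3]]):.1f}%")   # 36.1%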
DNA Sequencing in Cultural Heritage.
Vai, Stefania; Lari, Martina; Caramelli, David
2016-02-01
During the last three decades, DNA analysis of degraded samples has revealed itself as an important research tool in anthropology, archaeozoology, molecular evolution, and population genetics. Applications such as determining the species origin of prehistoric and historic objects, individually identifying famous personalities, and characterizing samples important for historical, archeological, or evolutionary reconstructions give paleogenetics an important role in the enhancement of cultural heritage as well. Very rapid methodological improvements in recent years have led to a revolution that permits recovering even complete genomes from highly degraded samples, with the possibility of going back in time 400,000 years for samples from temperate regions and 700,000 years for permafrozen remains, and of analyzing even more recent material that has been subjected to harsh biochemical treatments. Here we propose a review of the different methodological approaches used so far for the molecular analysis of degraded samples and their application in some case studies.
Application of Control Volume Analysis to Cerebrospinal Fluid Dynamics
NASA Astrophysics Data System (ADS)
Wei, Timothy; Cohen, Benjamin; Anor, Tomer; Madsen, Joseph
2011-11-01
Hydrocephalus is among the most common birth defects and at present can be neither prevented nor cured. Afflicted individuals face serious issues, which at present are too complicated and not well enough understood to treat via systematic therapies. This talk outlines the framework and application of a control volume methodology to clinical Phase Contrast MRI data. Specifically, integral control volume analysis utilizes a fundamental fluid dynamics methodology to quantify intracranial dynamics within a precise, direct, and physically meaningful framework. A chronically shunted, hydrocephalic patient in need of a revision procedure was used as an in vivo case study. Magnetic resonance velocity measurements within the patient's aqueduct were obtained in four biomedical states and were analyzed using the methods presented here. Pressure force estimates were obtained, showing distinct differences in amplitude, phase, and waveform shape for different intracranial states within the same individual. Thoughts on the physiological and diagnostic research and development implications/opportunities will be presented.
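The integral analysis referred to here rests on the Reynolds transport theorem applied to linear momentum over a control volume CV with bounding surface CS; in its standard form (the talk's exact formulation may differ):

    \[
    \sum \vec{F} \;=\; \frac{d}{dt}\int_{CV} \rho\,\vec{u}\,dV
      \;+\; \oint_{CS} \rho\,\vec{u}\,\left(\vec{u}\cdot\hat{n}\right)\,dA
    \]

With the control volume drawn around the aqueduct, the surface flux term can be evaluated directly from the phase-contrast MRI velocity maps, which is presumably what makes pressure force estimates recoverable from the imaging data.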
Teaching Environmental Education through PBL: Evaluation of a Teaching Intervention Program
NASA Astrophysics Data System (ADS)
Vasconcelos, Clara
2012-04-01
If our chosen aim in science education is to be inclusive and to improve students' learning achievements, then we must identify teaching methodologies that are appropriate for teaching and learning specific knowledge. Karagiorgi and Symeou (2005) remind us that instructional designers are thus challenged to translate the philosophy of constructivism into current practice. Thus, research in science education must focus on evaluating intervention programs which ensure the effective construction of knowledge and development of competencies. The present study reports the elaboration, application and evaluation of a problem-based learning (PBL) program with the aim of examining its effectiveness with students learning Environmental Education. Prior research on both PBL and Environmental Education (EE) was conducted within the context of science education so as to elaborate and construct the intervention program. Findings from these studies indicated that both the PBL methodology and EE are helpful for teachers and students. The PBL methodology has been adopted in this study since it fits logically within an application of constructivist philosophy (Hendry et al. 1999), and it was expected that this approach would assist students towards achieving a specific set of competencies (Engel 1997). On the other hand, EE has evolved at a rapid pace within many countries in the new millennium (Hart 2007), unlike any other educational area. However, many authors still appear to believe that schools are failing to prepare students adequately in EE (Walsche 2008; Winter 2007). The following section describes the research that was conducted in both areas so as to devise the intervention program.
Model-Driven Approach for Body Area Network Application Development.
Venčkauskas, Algimantas; Štuikys, Vytautas; Jusas, Nerijus; Burbaitė, Renata
2016-05-12
This paper introduces the sensor-networked IoT model as a prototype to support the design of Body Area Network (BAN) applications for healthcare. Using the model, we analyze the synergistic effect of the functional requirements (data collection from the human body and transferring it to the top level) and non-functional requirements (trade-offs between energy-security-environmental factors, treated as Quality-of-Service (QoS)). We use feature models to represent the requirements at the earliest stage for the analysis and describe a model-driven methodology to design the possible BAN applications. Firstly, we specify the requirements as the problem domain (PD) variability model for the BAN applications. Next, we introduce the generative technology (meta-programming as the solution domain (SD)) and the mapping procedure to map the PD feature-based variability model onto the SD feature model. Finally, we create an executable meta-specification that represents the BAN functionality to describe the variability of the problem domain through transformations. The meta-specification (along with the meta-language processor) is a software generator for multiple BAN-oriented applications. We validate the methodology with experiments and a case study to generate a family of programs for the BAN sensor controllers. This makes it possible to obtain an adequate measure of QoS efficiently through interactive adjustment of the meta-parameter values and the re-generation process for the concrete BAN application. PMID:27187394
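To make the generative idea concrete: a feature selection from the variability model drives a generator that emits one member of the program family. A deliberately tiny template-based sketch (the feature names and generated functions are hypothetical illustrations, not the paper's meta-language):

    from string import Template

    # Hypothetical slice of the problem-domain feature model: one selected
    # variant per feature (names invented for illustration).
    features = {"sensor": "heart_rate", "period_s": 5, "security": "aes128"}

    CONTROLLER = Template("""\
    def run_controller():
        reading = read_${sensor}()               # data-collection feature
        payload = encrypt_${security}(reading)   # security (QoS) feature
        transmit(payload)                        # transfer to the top level
    # sampling period: ${period_s} s
    """)

    print(CONTROLLER.substitute(features))       # one member of the program family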
A Practical Methodology for Disaggregating the Drivers of Drug Costs Using Administrative Data.
Lungu, Elena R; Manti, Orlando J; Levine, Mitchell A H; Clark, Douglas A; Potashnik, Tanya M; McKinley, Carol I
2017-09-01
Prescription drug expenditures represent a significant component of health care costs in Canada, with an estimated $28.8 billion spent in 2014. Identifying the major cost drivers and the effect they have on prescription drug expenditures allows policy makers and researchers to interpret current cost pressures and anticipate future expenditure levels. This study aimed to identify the major drivers of prescription drug costs and to develop a methodology to disaggregate the impact of each of the individual drivers. The methodology proposed in this study uses the Laspeyres approach for cost decomposition. This approach isolates the effect of the change in a specific factor (e.g., price) by holding the other factor(s) (e.g., quantity) constant at the base-period value. The Laspeyres approach is expanded to a multi-factorial framework to isolate and quantify several factors that drive prescription drug cost. Three broad categories of effects are considered: volume, price and drug-mix effects. For each category, important sub-effects are quantified. This study presents a new and comprehensive methodology for decomposing the change in prescription drug costs over time, including step-by-step demonstrations of how the formulas were derived. This methodology has practical applications for health policy decision makers and can aid researchers in conducting cost driver analyses. The methodology can be adjusted depending on the purpose and analytical depth of the research and data availability. © 2017 Journal of Population Therapeutics and Clinical Pharmacology. All rights reserved.
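In the Laspeyres scheme described, the price effect is computed at base-period quantities and the volume effect at base-period prices, with the remainder of the total change attributable to interaction (which the paper further splits into finer sub-effects; here it is lumped together). A two-drug sketch with invented figures:

    # Invented base-period (p0, q0) and current-period (p1, q1) figures per drug.
    drugs = {"A": {"p0": 10.0, "q0": 1000, "p1": 12.0, "q1": 1100},
             "B": {"p0": 50.0, "q0": 200,  "p1": 45.0, "q1": 300}}

    def laspeyres_effects(d):
        """Price effect at base quantities, volume effect at base prices;
        the remainder of the total change is the interaction (cross) term."""
        price  = sum(v["q0"] * (v["p1"] - v["p0"]) for v in d.values())
        volume = sum(v["p0"] * (v["q1"] - v["q0"]) for v in d.values())
        total  = sum(v["p1"] * v["q1"] - v["p0"] * v["q0"] for v in d.values())
        return price, volume, total - price - volume

    print(laspeyres_effects(drugs))   # (1000.0, 6000.0, -300.0)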
Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong
2018-03-01
Descriptive analysis with a trained sensory panel has thus far been the best-defined methodology for characterizing various products. However, in practical terms, the intensive training required for descriptive analysis has been recognized as a serious drawback. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions showed results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) have proposed a novel attribute-based methodology termed the two-step rating-based 'double-faced applicability' test with a novel output measure of applicability magnitude (d'A) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d'A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and the two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test can be used to provide a direct measure of the applicability magnitude of sensory attributes of the samples tested in terms of d'A, both for sensory characterization of individual samples and for multiple sample comparisons. This suggests that when an appropriate list of attributes for the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d'A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process. Copyright © 2017 Elsevier Ltd. All rights reserved.
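Measures like d'A are Thurstonian discrimination indices: differences between z-transformed response proportions. A generic sketch of that family of estimators (this is not the exact d'A estimator of the cited papers, and the proportions are invented):

    from scipy.stats import norm

    def d_prime(p_target: float, p_nontarget: float) -> float:
        """Thurstonian-style index: z-difference between the proportions
        calling an attribute applicable to two samples."""
        return float(norm.ppf(p_target) - norm.ppf(p_nontarget))

    # Invented data: 78% find "creamy" applicable to sample 1, 35% to sample 2.
    print(f"d' = {d_prime(0.78, 0.35):.2f}")   # ~1.16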
DOE Office of Scientific and Technical Information (OSTI.GOV)
Disney, R.K.
1994-10-01
The methodology for handling bias and uncertainty when calculational methods are used in criticality safety evaluations (CSEs) is evolving rapidly. The changes in the methodology are driven by a number of factors. One factor responsible for changes in the methodology for handling bias and uncertainty in CSEs within the overview of the US Department of Energy (DOE) is a shift in the overview function from a 'site' perception to a more uniform or 'national' perception. Other causes for change or improvement in the methodology for handling calculational bias and uncertainty are: (1) an increased demand for benchmark criticals data to expand the area (range) of applicability of existing data, (2) a demand for new data to supplement existing benchmark criticals data, (3) the increased reliance on (or need for) computational benchmarks which supplement (or replace) experimental measurements in critical assemblies, and (4) an increased demand for benchmark data applicable to the expanded range of conditions and configurations encountered in DOE site restoration and remediation.
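For a sense of how calculational bias against benchmark criticals feeds into a limit: one common, here deliberately simplified, scheme subtracts the bias, its uncertainty, and an administrative margin from unity to form an upper subcritical limit. Production evaluations use tolerance-limit statistics and trending over the area of applicability rather than this bare sketch, and all numbers below are invented:

    import statistics

    # Invented calculated-minus-benchmark keff residuals for a validation suite.
    residuals = [-0.0012, 0.0004, -0.0021, -0.0008, 0.0001, -0.0015]

    bias = min(0.0, statistics.mean(residuals))   # take no credit for positive bias
    bias_unc = statistics.stdev(residuals)        # crude 1-sigma stand-in
    ADMIN_MARGIN = 0.05                           # assumed administrative margin

    print(f"upper subcritical limit ~ {1.0 + bias - bias_unc - ADMIN_MARGIN:.4f}")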
NASA Technical Reports Server (NTRS)
Young, G.
1982-01-01
A design methodology capable of dealing with nonlinear systems, such as a controlled ecological life support system (CELSS), containing parameter uncertainty is discussed. The methodology was applied to the design of discrete time nonlinear controllers. The nonlinear controllers can be used to control either linear or nonlinear systems. Several controller strategies are presented to illustrate the design procedure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brossmann, U.B.
1981-01-01
The application of the methodological design process is demonstrated for the development of support concepts in the case of a Bitter-type magnet designed for a compact tokamak experiment aiming at ignition of a DT plasma. With this methodology, all boundary conditions and design criteria are more easily satisfied in a technically and economically sound way.
NASA Technical Reports Server (NTRS)
Celaya, Jose; Kulkarni, Chetan; Biswas, Gautam; Saha, Sankalita; Goebel, Kai
2011-01-01
A remaining useful life prediction methodology for electrolytic capacitors is presented. This methodology is based on the Kalman filter framework and an empirical degradation model. Electrolytic capacitors are used in several applications ranging from power supplies on critical avionics equipment to power drivers for electro-mechanical actuators. These devices are known for their comparatively low reliability, and given their criticality in electronics subsystems they are a good candidate for component-level prognostics and health management. Prognostics provides a way to assess the remaining useful life of a capacitor based on its current state of health and its anticipated future usage and operational conditions. We also present experimental results of an accelerated aging test under electrical stresses. The data obtained in this test form the basis for a remaining life prediction algorithm in which a model of the degradation process is suggested. This preliminary remaining life prediction algorithm serves as a demonstration of how prognostics methodologies could be used for electrolytic capacitors. In addition, the use of degradation progression data from accelerated aging provides an avenue for validation of applications of the Kalman filter based prognostics methods typically used for remaining useful life predictions in other applications.
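A minimal sketch of the idea: track capacitance and its degradation rate with a linear Kalman filter, then extrapolate to an end-of-life threshold. The state model, threshold, and all numbers below are invented stand-ins, not the paper's empirical degradation model:

    import numpy as np

    dt = 1.0                                   # aging-time step (hours), invented
    F = np.array([[1.0, dt], [0.0, 1.0]])      # state: [capacitance (uF), drift rate]
    H = np.array([[1.0, 0.0]])                 # only capacitance is measured
    Q = np.diag([1e-4, 1e-6])                  # assumed process noise
    R = np.array([[4.0]])                      # assumed measurement noise

    x, P, C0 = np.array([2200.0, -0.5]), np.eye(2), 2200.0
    for z in [2199.1, 2198.0, 2197.2, 2195.9, 2195.1]:   # synthetic readings
        x, P = F @ x, F @ P @ F.T + Q                    # predict
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
        x, P = x + K @ (np.array([z]) - H @ x), (np.eye(2) - K @ H) @ P  # update

    eol = 0.8 * C0                          # assumed end of life at 20% capacitance loss
    print(f"RUL ~ {(eol - x[0]) / x[1]:.0f} h at the estimated degradation rate")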
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fuente, Rafael de la; Iglesias, Javier; Sedano, Pablo G.
IBERDROLA (Spanish utility) and IBERDROLA INGENIERIA (engineering branch) have been developing during the last 2 yr the 110% Extended Power Uprate Project for Cofrentes BWR-6. IBERDROLA has available an in-house design and licensing reload methodology that has been approved in advance by the Spanish Nuclear Regulatory Authority. This methodology has been applied to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 and 13 and to develop a significant number of safety analyses for the Cofrentes Extended Power Uprate. Because the scope of the licensing process of the Cofrentes Extended Power Uprate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it was necessary to extend the applicability of the Cofrentes RETRAN model to the analysis of new transients. This is the case for the total loss of feedwater (TLFW) transient. This paper shows the benefits of having an in-house design and licensing methodology and describes the process of extending the applicability of the Cofrentes RETRAN model to the analysis of new transients, in particular the TLFW transient.
ERIC Educational Resources Information Center
Brock, Richard; Taber, Keith S.
2017-01-01
This paper examines the role of the microgenetic method in science education. The microgenetic method is a technique for exploring the progression of learning in detail through repeated, high-frequency observations of a learner's "performance" in some activity. Existing microgenetic studies in science education are analysed. This leads…
Critical Incident Stress Debriefing as a Trauma Intervention in First Nation Communities
ERIC Educational Resources Information Center
Hughes, Megan L.
2006-01-01
This study examines the appropriateness of a cross-cultural application of Critical Incident Stress Debriefing (CISD). Participant/observations were made of CISD workshops conducted for First Nations participants. The facilitator and five participants were interviewed using narrative methodology. Observations and interview data were examined using…
Elements of oxygen production systems using Martian atmosphere
NASA Technical Reports Server (NTRS)
Ash, R. L.; Huang, J.-K.; Johnson, P. B.; Sivertson, W. E., Jr.
1986-01-01
Hardware elements have been studied in terms of their applicability to Mars oxygen production systems. Various aspects of the system design are discussed and areas requiring further research are identified. Initial work on system reliability is discussed and a methodology for applying expert system technology to the oxygen processor is described.
Large-Scale Networked Virtual Environments: Architecture and Applications
ERIC Educational Resources Information Center
Lamotte, Wim; Quax, Peter; Flerackers, Eddy
2008-01-01
Purpose: Scalability is an important research topic in the context of networked virtual environments (NVEs). This paper aims to describe the ALVIC (Architecture for Large-scale Virtual Interactive Communities) approach to NVE scalability. Design/methodology/approach: The setup and results from two case studies are shown: a 3-D learning environment…
Impairment: The Case of Phonotactic Probability and Nonword Repetition
ERIC Educational Resources Information Center
McKean, Cristina; Letts, Carolyn; Howard, David
2013-01-01
Purpose: In this study, the authors aimed to explore the relationship between lexical and phonological knowledge in children with primary language impairment (PLI) through the application of a developmental methodology. Specifically, they tested whether there is evidence for an impairment in the process of phonological abstraction in this group of…
Model uncertainty estimation and risk assessment is essential to environmental management and informed decision making on pollution mitigation strategies. In this study, we apply a probabilistic methodology, which combines Bayesian Monte Carlo simulation and Maximum Likelihood e...
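As a sketch of the Bayesian Monte Carlo step named here: draw parameter sets from the prior, weight each by the likelihood of the observations, and form posterior-weighted estimates. Everything below (the one-parameter model, noise level, and data) is an invented stand-in for illustration:

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)
    observed = np.array([3.1, 2.8, 3.4])          # invented observations

    def model(theta):
        """Hypothetical one-parameter model prediction at each observation point."""
        return theta * np.ones_like(observed)

    theta = rng.uniform(0.0, 10.0, 5000)          # samples from a flat prior
    weights = np.array([norm.pdf(observed, model(t), 0.5).prod() for t in theta])
    weights /= weights.sum()                      # normalized likelihood weights

    print(np.sum(weights * theta))                # posterior mean, ~3.1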