2017-11-01
Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques (ARL-TR-8225, US Army Research Laboratory, November 2017).
Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases
1992-09-29
Design problem solving is a complex activity involving a number of subtasks, and a number of alternative methods potentially available ...
Minimum number of measurements for evaluating Bertholletia excelsa.
Baldoni, A B; Tonini, H; Tardin, F D; Botelho, S C C; Teodoro, P E
2017-09-27
Repeatability studies on fruit species are of great importance to identify the minimum number of measurements necessary to accurately select superior genotypes. This study aimed to identify the most efficient method to estimate the repeatability coefficient (r) and predict the minimum number of measurements needed for a more accurate evaluation of Brazil nut tree (Bertholletia excelsa) genotypes based on fruit yield. For this, we assessed the number of fruits and dry mass of seeds of 75 Brazil nut genotypes, from native forest, located in the municipality of Itaúba, MT, for 5 years. To better estimate r, four procedures were used: analysis of variance (ANOVA), principal component analysis based on the correlation matrix (CPCOR), principal component analysis based on the phenotypic variance and covariance matrix (CPCOV), and structural analysis based on the correlation matrix (mean r - AECOR). There was a significant effect of genotypes and measurements, which reveals the need to study the minimum number of measurements for selecting superior Brazil nut genotypes for a production increase. Estimates of r by ANOVA were lower than those observed with the principal component methodology and close to AECOR. The CPCOV methodology provided the highest estimate of r, which resulted in a lower number of measurements needed to identify superior Brazil nut genotypes for the number of fruits and dry mass of seeds. Based on this methodology, three measurements are necessary to predict the true value of the Brazil nut genotypes with a minimum accuracy of 85%.
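To make the estimation step above concrete, the following minimal Python sketch computes an ANOVA-based repeatability coefficient and the number of measurements required for a target determination coefficient, using the standard relations r = (MSg − MSe)/(MSg + (m − 1)MSe) and η0 = R²(1 − r)/[(1 − R²)r]. The mean squares are hypothetical, not the study's data, and the snippet is a generic illustration rather than the authors' implementation.

```python
import numpy as np

def repeatability_anova(ms_genotype, ms_error, m):
    """ANOVA estimator of the repeatability coefficient r from the
    genotype and residual mean squares of m measurements."""
    var_g = (ms_genotype - ms_error) / m      # genotypic variance component
    return var_g / (var_g + ms_error)

def min_measurements(r, target_r2=0.85):
    """Measurements needed to reach a target coefficient of determination,
    eta0 = R^2 (1 - r) / ((1 - R^2) r)."""
    return target_r2 * (1.0 - r) / ((1.0 - target_r2) * r)

# Hypothetical mean squares from a multi-year trial with m = 5 measurements
r = repeatability_anova(ms_genotype=120.0, ms_error=30.0, m=5)
print(round(r, 3), int(np.ceil(min_measurements(r, target_r2=0.85))))
```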
1983-12-30
Architecture, Design, and System Performance Assessment and Development Methodology (AD-A146 577; NSWC TR 83-324, Naval Surface Weapons Center, Silver Spring, MD).
Charpentier, Ronald R.; Moore, Thomas E.; Gautier, D.L.
2017-11-15
The methodological procedures used in the geologic assessments of the 2008 Circum-Arctic Resource Appraisal (CARA) were based largely on the methodology developed for the 2000 U.S. Geological Survey World Petroleum Assessment. The main variables were probability distributions for numbers and sizes of undiscovered accumulations with an associated risk of occurrence. The CARA methodology expanded on the previous methodology in providing additional tools and procedures more applicable to the many Arctic basins that have little or no exploration history. Most importantly, geologic analogs from a database constructed for this study were used in many of the assessments to constrain numbers and sizes of undiscovered oil and gas accumulations.
Methodology for Estimating Total Automotive Manufacturing Costs
DOT National Transportation Integrated Search
1983-04-01
A number of methodologies for estimating manufacturing costs have been developed. This report discusses the different approaches and shows that an approach to estimating manufacturing costs in the automobile industry based on surrogate plants is pref...
Inquiry-Based Learning for Older People at a University in Spain
ERIC Educational Resources Information Center
Martorell, Ingrid; Medrano, Marc; Sole, Cristian; Vila, Neus; Cabeza, Luisa F.
2009-01-01
With the increasing number of older people in the world and their interest in education, universities play an important role in providing effective learning methodologies. This paper presents a new instructional methodology implementing inquiry-based learning (IBL) in two courses focused on alternative energies in the Program for Older People at…
Prototyping a Microcomputer-Based Online Library Catalog. Occasional Papers Number 177.
ERIC Educational Resources Information Center
Lazinger, Susan S.; Shoval, Peretz
This report examines and evaluates the application of prototyping methodology in the design of a microcomputer-based online library catalog. The methodology for carrying out the research involves a five-part examination of the problem on both the theoretical and applied levels, each of which is discussed in a separate section as follows: (1) a…
ERIC Educational Resources Information Center
Hall, Wayne; Palmer, Stuart; Bennett, Mitchell
2012-01-01
Project-based learning (PBL) is a well-known student-centred methodology for engineering design education. The methodology claims to offer a number of educational benefits. This paper evaluates the student perceptions of the initial and second offering of a first-year design unit at Griffith University in Australia. It builds on an earlier…
Effective normalization for copy number variation detection from whole genome sequencing.
Janevski, Angel; Varadan, Vinay; Kamalakaran, Sitharthan; Banerjee, Nilanjana; Dimitrova, Nevenka
2012-01-01
Whole genome sequencing enables a high resolution view of the human genome and provides unique insights into genome structure at an unprecedented scale. There have been a number of tools to infer copy number variation in the genome. These tools, while validated, also include a number of parameters that are configurable to genome data being analyzed. These algorithms allow for normalization to account for individual and population-specific effects on individual genome CNV estimates but the impact of these changes on the estimated CNVs is not well characterized. We evaluate in detail the effect of normalization methodologies in two CNV algorithms FREEC and CNV-seq using whole genome sequencing data from 8 individuals spanning four populations. We apply FREEC and CNV-seq to a sequencing data set consisting of 8 genomes. We use multiple configurations corresponding to different read-count normalization methodologies in FREEC, and statistically characterize the concordance of the CNV calls between FREEC configurations and the analogous output from CNV-seq. The normalization methodologies evaluated in FREEC are: GC content, mappability and control genome. We further stratify the concordance analysis within genic, non-genic, and a collection of validated variant regions. The GC content normalization methodology generates the highest number of altered copy number regions. Both mappability and control genome normalization reduce the total number and length of copy number regions. Mappability normalization yields Jaccard indices in the 0.07 - 0.3 range, whereas using a control genome normalization yields Jaccard index values around 0.4 with normalization based on GC content. The most critical impact of using mappability as a normalization factor is substantial reduction of deletion CNV calls. The output of another method based on control genome normalization, CNV-seq, resulted in comparable CNV call profiles, and substantial agreement in variable gene and CNV region calls. Choice of read-count normalization methodology has a substantial effect on CNV calls and the use of genomic mappability or an appropriately chosen control genome can optimize the output of CNV analysis.
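The concordance comparison described above rests on a Jaccard index between sets of CNV calls. The sketch below computes a base-pair Jaccard index for two hypothetical call sets on one chromosome; the interval coordinates are invented and the function is only a simplified stand-in for the statistical comparison performed in the study.

```python
def jaccard_intervals(calls_a, calls_b):
    """Base-pair Jaccard index |A intersect B| / |A union B| of two CNV call
    sets (lists of (start, end) intervals) on a single chromosome."""
    def covered(calls):
        bases = set()
        for start, end in calls:
            bases.update(range(start, end))
        return bases
    a, b = covered(calls_a), covered(calls_b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

# Hypothetical calls from two normalization settings
freec_calls = [(100, 500), (1200, 1800)]
cnvseq_calls = [(150, 520), (1300, 1700)]
print(round(jaccard_intervals(freec_calls, cnvseq_calls), 3))
```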
Risk-Based Explosive Safety Analysis
2016-11-30
Safety siting of energetic liquids and propellants can be greatly aided by the use of risk-based methodologies. The low probability of exposed personnel and the ...
Okariz, Ana; Guraya, Teresa; Iturrondobeitia, Maider; Ibarretxe, Julen
2017-02-01
The SIRT (Simultaneous Iterative Reconstruction Technique) algorithm is commonly used in Electron Tomography to calculate the original volume of the sample from noisy images, but the results provided by this iterative procedure are strongly dependent on the specific implementation of the algorithm, as well as on the number of iterations employed for the reconstruction. In this work, a methodology for selecting the iteration number of the SIRT reconstruction that provides the most accurate segmentation is proposed. The methodology is based on the statistical analysis of the intensity profiles at the edge of the objects in the reconstructed volume. A phantom which resembles a carbon black aggregate has been created to validate the methodology, and the SIRT implementations of two free software packages (TOMOJ and TOMO3D) have been used. Copyright © 2016 Elsevier B.V. All rights reserved.
An Integrated Low-Speed Performance and Noise Prediction Methodology for Subsonic Aircraft
NASA Technical Reports Server (NTRS)
Olson, E. D.; Mavris, D. N.
2000-01-01
An integrated methodology has been assembled to compute the engine performance, takeoff and landing trajectories, and community noise levels for a subsonic commercial aircraft. Where feasible, physics-based noise analysis methods have been used to make the results more applicable to newer, revolutionary designs and to allow for a more direct evaluation of new technologies. The methodology is intended to be used with approximation methods and risk analysis techniques to allow for the analysis of a greater number of variable combinations while retaining the advantages of physics-based analysis. Details of the methodology are described and limited results are presented for a representative subsonic commercial aircraft.
Clinical relevance in anesthesia journals.
Lauritsen, Jakob; Møller, Ann M
2006-04-01
The purpose of this review is to present the latest knowledge and research on the definition and distribution of clinically relevant articles in anesthesia journals. It will also discuss the importance of the chosen methodology and outcome of articles. In the last few years, more attention has been paid to evidence-based medicine in anesthesia. Several articles on the subject have focused on the need to base clinical decisions on sound research employing both methodological rigor and clinically relevant outcomes. The number of systematic reviews in the anesthesia literature is increasing, as is the focus on diminishing the number of surrogate outcomes. It has been shown that the impact factor is not a valid measure for establishing the level of clinical relevance of a journal. This review presents definitions of clinically relevant anesthesia articles. A clinically relevant article employs both methodological rigor and a clinically relevant outcome. The terms methodological rigor and clinical outcomes are fully discussed in the review, as are problems with journal impact factors.
Applying Chomsky's Linguistic Methodology to the Clinical Interpretation of Symbolic Play.
ERIC Educational Resources Information Center
Ariel, Shlomo
This paper summarizes how Chomsky's methodological principles of linguistics may be applied to the clinical interpretation of children's play. Based on Chomsky's derivation of a "universal grammar" (the set of essential, formal, and substantive traits of any human language), a number of hypothesized formal universals of…
A Methodological Review of Structural Equation Modelling in Higher Education Research
ERIC Educational Resources Information Center
Green, Teegan
2016-01-01
Despite increases in the number of articles published in higher education journals using structural equation modelling (SEM), research addressing their statistical sufficiency, methodological appropriateness and quantitative rigour is sparse. In response, this article provides a census of all covariance-based SEM articles published up until 2013…
Methodological quality assessment of paper-based systematic reviews published in oral health.
Wasiak, J; Shen, A Y; Tan, H B; Mahar, R; Kan, G; Khoo, W R; Faggion, C M
2016-04-01
This study aimed to conduct a methodological assessment of paper-based systematic reviews (SR) published in oral health using a validated checklist. A secondary objective was to explore temporal trends on methodological quality. Two electronic databases (OVID Medline and OVID EMBASE) were searched for paper-based SR of interventions published in oral health from inception to October 2014. Manual searches of the reference lists of paper-based SR were also conducted. Methodological quality of included paper-based SR was assessed using an 11-item questionnaire, Assessment of Multiple Systematic Reviews (AMSTAR) checklist. Methodological quality was summarized using the median and inter-quartile range (IQR) of the AMSTAR score over different categories and time periods. A total of 643 paper-based SR were included. The overall median AMSTAR score was 4 (IQR 2-6). The highest median score (5) was found in the pain dentistry and periodontology fields, while the lowest median score (3) was found in implant dentistry, restorative dentistry, oral medicine, and prosthodontics. The number of paper-based SR per year and the median AMSTAR score increased over time (median score in 1990s was 2 (IQR 2-3), 2000s was 4 (IQR 2-5), and 2010 onwards was 5 (IQR 3-6)). Although the methodological quality of paper-based SR published in oral health has improved in the last few years, there is still scope for improving quality in most evaluated dental specialties. Large-scale assessment of methodological quality of dental SR highlights areas of methodological strengths and weaknesses that can be targeted in future publications to encourage better quality review methodology.
2014-03-03
As part of this research and development effort, a number of products were developed that served to advance the research and provided a testing ground for our methodologies. ... Teams, U.S. Navy SEALs, brown-water Navy personnel, and Naval Reserve Officer Training Corps midshipmen. The base conducts research and tests of newly ...
NASA Technical Reports Server (NTRS)
Li, Fei; Choudhari, Meelan M.; Carpenter, Mark H.; Malik, Mujeeb R.; Eppink, Jenna; Chang, Chau-Lyan; Streett, Craig L.
2010-01-01
A high fidelity transition prediction methodology has been applied to a swept airfoil design at a Mach number of 0.75 and chord Reynolds number of approximately 17 million, with the dual goal of an assessment of the design for the implementation and testing of roughness based crossflow transition control and continued maturation of such methodology in the context of realistic aerodynamic configurations. Roughness based transition control involves controlled seeding of suitable, subdominant crossflow modes in order to weaken the growth of naturally occurring, linearly more unstable instability modes via a nonlinear modification of the mean boundary layer profiles. Therefore, a synthesis of receptivity, linear and nonlinear growth of crossflow disturbances, and high-frequency secondary instabilities becomes desirable to model this form of control. Because experimental data is currently unavailable for passive crossflow transition control for such high Reynolds number configurations, a holistic computational approach is used to assess the feasibility of roughness based control methodology. Potential challenges inherent to this control application as well as associated difficulties in modeling this form of control in a computational setting are highlighted. At high Reynolds numbers, a broad spectrum of stationary crossflow disturbances amplify and, while it may be possible to control a specific target mode using Discrete Roughness Elements (DREs), nonlinear interaction between the control and target modes may yield strong amplification of the difference mode that could have an adverse impact on the transition delay using spanwise periodic roughness elements.
Integrating automated support for a software management cycle into the TAME system
NASA Technical Reports Server (NTRS)
Sunazuka, Toshihiko; Basili, Victor R.
1989-01-01
Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal-oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
2013-01-01
Background Community-based health care planning and regulation necessitates grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state’s Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. Methods The clustering methodology employs a 2-step K-means + Ward’s clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical-based measure of cluster fit and characteristics of the resulting Hospital Groups. Results Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Conclusions Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units. PMID:23964905
Delamater, Paul L; Shortridge, Ashton M; Messina, Joseph P
2013-08-22
Community-based health care planning and regulation necessitates grouping facilities and areal units into regions of similar health care use. Limited research has explored the methodologies used in creating these regions. We offer a new methodology that clusters facilities based on similarities in patient utilization patterns and geographic location. Our case study focused on Hospital Groups in Michigan, the allocation units used for predicting future inpatient hospital bed demand in the state's Bed Need Methodology. The scientific, practical, and political concerns that were considered throughout the formulation and development of the methodology are detailed. The clustering methodology employs a 2-step K-means + Ward's clustering algorithm to group hospitals. The final number of clusters is selected using a heuristic that integrates both a statistical-based measure of cluster fit and characteristics of the resulting Hospital Groups. Using recent hospital utilization data, the clustering methodology identified 33 Hospital Groups in Michigan. Despite being developed within the politically charged climate of Certificate of Need regulation, we have provided an objective, replicable, and sustainable methodology to create Hospital Groups. Because the methodology is built upon theoretically sound principles of clustering analysis and health care service utilization, it is highly transferable across applications and suitable for grouping facilities or areal units.
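The 2-step grouping described in both records above can be sketched as K-means followed by Ward's hierarchical clustering of the K-means centroids. The feature matrix, the number of provisional clusters, and the final cut at 33 groups are all assumptions for illustration; the snippet is not the authors' implementation or their heuristic for selecting the number of clusters.

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Hypothetical hospital features: utilization-pattern scores plus x/y location
X = rng.normal(size=(140, 4))

# Step 1: K-means into many small provisional groups
km = KMeans(n_clusters=60, n_init=10, random_state=0).fit(X)

# Step 2: Ward's hierarchical clustering of the K-means centroids
Z = linkage(km.cluster_centers_, method="ward")
centroid_labels = fcluster(Z, t=33, criterion="maxclust")   # e.g., 33 final groups

# Map each hospital to its final group through its K-means centroid
hospital_groups = centroid_labels[km.labels_]
print(len(np.unique(hospital_groups)))
```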
Munksgaard, Rasmus; Demant, Jakob; Branwen, Gwern
2016-09-01
The development of cryptomarkets has gained increasing attention from academics, including a growing scientific literature on the distribution of illegal goods using cryptomarkets. Dolliver's 2015 article "Evaluating drug trafficking on the Tor Network: Silk Road 2, the Sequel" addresses this theme by evaluating drug trafficking on one of the most well-known cryptomarkets, Silk Road 2.0. The research on cryptomarkets in general, and Dolliver's article in particular, poses a number of new methodological questions. This commentary is structured around a replication of Dolliver's original study. The replication study is not based on Dolliver's original dataset, but on a second dataset collected applying the same methodology. We have found that the results produced by Dolliver differ greatly from our replicated study. While a margin of error is to be expected, the inconsistencies we found are too great to attribute to anything other than methodological issues. The analysis and conclusions drawn from studies using these methods are promising and insightful. However, based on the replication of Dolliver's study, we suggest that datasets be made available to other researchers, and that methodology and dataset metrics (e.g., number of downloaded pages, error logs) be described thoroughly in the context of webometrics and web crawling. Copyright © 2016 Elsevier B.V. All rights reserved.
Auditing as part of the terminology design life cycle.
Min, Hua; Perl, Yehoshua; Chen, Yan; Halper, Michael; Geller, James; Wang, Yue
2006-01-01
To develop and test an auditing methodology for detecting errors in medical terminologies satisfying systematic inheritance. This methodology is based on various abstraction taxonomies that provide high-level views of a terminology and highlight potentially erroneous concepts. Our auditing methodology is based on dividing concepts of a terminology into smaller, more manageable units. First, we divide the terminology's concepts into areas according to their relationships/roles. Then each multi-rooted area is further divided into partial-areas (p-areas) that are singly-rooted. Each p-area contains a set of structurally and semantically uniform concepts. Two kinds of abstraction networks, called the area taxonomy and p-area taxonomy, are derived. These taxonomies form the basis for the auditing approach. Taxonomies tend to highlight potentially erroneous concepts in areas and p-areas. Human reviewers can focus their auditing efforts on the limited number of problematic concepts following two hypotheses on the probable concentration of errors. A sample of the area taxonomy and p-area taxonomy for the Biological Process (BP) hierarchy of the National Cancer Institute Thesaurus (NCIT) was derived from the application of our methodology to its concepts. These views led to the detection of a number of different kinds of errors that are reported, and to confirmation of the hypotheses on error concentration in this hierarchy. Our auditing methodology based on area and p-area taxonomies is an efficient tool for detecting errors in terminologies satisfying systematic inheritance of roles, and thus facilitates their maintenance. This methodology concentrates a domain expert's manual review on portions of the concepts with a high likelihood of errors.
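The partitioning step that underlies the area taxonomy can be illustrated by grouping concepts according to their exact set of relationships (roles). The toy terminology below is invented; the sketch only shows the grouping idea, not the NCIT hierarchy or the further p-area refinement by roots.

```python
from collections import defaultdict

# Toy terminology: concept -> set of relationship (role) names
concepts = {
    "Cell Death": {"has_location"},
    "Apoptosis": {"has_location", "has_initiator"},
    "Necrosis": {"has_location", "has_initiator"},
    "Signaling": {"has_initiator"},
}

# An "area" collects all concepts that exhibit exactly the same set of roles
areas = defaultdict(list)
for concept, roles in concepts.items():
    areas[frozenset(roles)].append(concept)

for roles, members in areas.items():
    print(sorted(roles), "->", sorted(members))
```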
Application of the Hardman methodology to the Army Remotely Piloted Vehicle (RPV)
NASA Technical Reports Server (NTRS)
1983-01-01
The application of the HARDMAN Methodology to the Remotely Piloted Vehicle (RPV) is described. The methodology was used to analyze the manpower, personnel, and training (MPT) requirements of the proposed RPV system design for a number of operating scenarios. The RPV system is defined as consisting of the equipment, personnel, and operational procedures needed to perform five basic artillery missions: reconnaissance, target acquisition, artillery adjustment, target designation and damage assessment. The RPV design evaluated includes an air vehicle (AV), a modular integrated communications and navigation system (MICNS), a ground control station (GCS), a launch subsystem (LS), a recovery subsystem (RS), and a number of ground support requirements. The HARDMAN Methodology is an integrated set of data base management techniques and analytic tools, designed to provide timely and fully documented assessments of the human resource requirements associated with an emerging system's design.
Impact of computational structure-based methods on drug discovery.
Reynolds, Charles H
2014-01-01
Structure-based drug design has become an indispensable tool in drug discovery. The emergence of structure-based design is due to gains in structural biology that have provided exponential growth in the number of protein crystal structures, new computational algorithms and approaches for modeling protein-ligand interactions, and the tremendous growth of raw computer power in the last 30 years. Computer modeling and simulation have made major contributions to the discovery of many groundbreaking drugs in recent years. Examples are presented that highlight the evolution of computational structure-based design methodology, and the impact of that methodology on drug discovery.
Adaptive video-based vehicle classification technique for monitoring traffic.
DOT National Transportation Integrated Search
2015-08-01
This report presents a methodology for extracting two vehicle features, vehicle length and number of axles, in order to classify vehicles from video based on the Federal Highway Administration's (FHWA) recommended vehicle classification scheme.
ERIC Educational Resources Information Center
Bierema, Andrea M.-K.; Schwartz, Renee S.; Gill, Sharon A.
2017-01-01
Recent calls for reform in education recommend science curricula to be based on central ideas instead of a larger number of topics and for alignment between current scientific research and curricula. Because alignment is rarely studied, especially for central ideas, we developed a methodology to discover the extent of alignment between primary…
Air Force Energy Program Policy Memorandum
2009-06-16
Field the Critical Asset Prioritization Methodology (CAPM) tool by Spring 2008 ... Manage costs ... Metrics: percentage of alternative/renewable fuel used for aviation fuel ... supporting critical assets residing on military installations ... Increase the number of flexible fuel systems; identify/develop privately financed/operated energy production on Air Bases; field the Critical Asset Prioritization Methodology (CAPM) tool.
Surrogate based wind farm layout optimization using manifold mapping
NASA Astrophysics Data System (ADS)
Kaja Kamaludeen, Shaafi M.; van Zuijle, Alexander; Bijl, Hester
2016-09-01
The high computational cost associated with high-fidelity wake models such as RANS or LES is the primary bottleneck to performing a direct high-fidelity wind farm layout optimization (WFLO) using accurate CFD-based wake models. Therefore, a surrogate-based multi-fidelity WFLO methodology (SWFLO) is proposed. The surrogate model is built using an SBO method referred to as manifold mapping (MM). As a verification, optimization of the spacing between two staggered wind turbines was performed using the proposed surrogate-based methodology and the performance was compared with that of direct optimization using the high-fidelity model. A significant reduction in computational cost was achieved using MM: a maximum computational cost reduction of 65%, while arriving at the same optima as the direct high-fidelity optimization. The similarity between the responses of the models, and the number and position of the mapping points, strongly influence the computational efficiency of the proposed method. As a proof of concept, a realistic WFLO of a small 7-turbine wind farm is performed using the proposed surrogate-based methodology. Two variants of the Jensen wake model with different decay coefficients were used as the fine and coarse models. The proposed SWFLO method arrived at the same optima as the fine model with a very small number of fine-model simulations.
Auditing as Part of the Terminology Design Life Cycle
Min, Hua; Perl, Yehoshua; Chen, Yan; Halper, Michael; Geller, James; Wang, Yue
2006-01-01
Objective To develop and test an auditing methodology for detecting errors in medical terminologies satisfying systematic inheritance. This methodology is based on various abstraction taxonomies that provide high-level views of a terminology and highlight potentially erroneous concepts. Design Our auditing methodology is based on dividing concepts of a terminology into smaller, more manageable units. First, we divide the terminology’s concepts into areas according to their relationships/roles. Then each multi-rooted area is further divided into partial-areas (p-areas) that are singly-rooted. Each p-area contains a set of structurally and semantically uniform concepts. Two kinds of abstraction networks, called the area taxonomy and p-area taxonomy, are derived. These taxonomies form the basis for the auditing approach. Taxonomies tend to highlight potentially erroneous concepts in areas and p-areas. Human reviewers can focus their auditing efforts on the limited number of problematic concepts following two hypotheses on the probable concentration of errors. Results A sample of the area taxonomy and p-area taxonomy for the Biological Process (BP) hierarchy of the National Cancer Institute Thesaurus (NCIT) was derived from the application of our methodology to its concepts. These views led to the detection of a number of different kinds of errors that are reported, and to confirmation of the hypotheses on error concentration in this hierarchy. Conclusion Our auditing methodology based on area and p-area taxonomies is an efficient tool for detecting errors in terminologies satisfying systematic inheritance of roles, and thus facilitates their maintenance. This methodology concentrates a domain expert’s manual review on portions of the concepts with a high likelihood of errors. PMID:16929044
Risk-based Methodology for Validation of Pharmaceutical Batch Processes.
Wiles, Frederick
2013-01-01
In January 2011, the U.S. Food and Drug Administration published new process validation guidance for pharmaceutical processes. The new guidance debunks the long-held industry notion that three consecutive validation batches or runs are all that are required to demonstrate that a process is operating in a validated state. Instead, the new guidance now emphasizes that the level of monitoring and testing performed during process performance qualification (PPQ) studies must be sufficient to demonstrate statistical confidence both within and between batches. In some cases, three qualification runs may not be enough. Nearly two years after the guidance was first published, little has been written defining a statistical methodology for determining the number of samples and qualification runs required to satisfy Stage 2 requirements of the new guidance. This article proposes using a combination of risk assessment, control charting, and capability statistics to define the monitoring and testing scheme required to show that a pharmaceutical batch process is operating in a validated state. In this methodology, an assessment of process risk is performed through application of a process failure mode, effects, and criticality analysis (PFMECA). The output of PFMECA is used to select appropriate levels of statistical confidence and coverage which, in turn, are used in capability calculations to determine when significant Stage 2 (PPQ) milestones have been met. The achievement of Stage 2 milestones signals the release of batches for commercial distribution and the reduction of monitoring and testing to commercial production levels. Individuals, moving range, and range/sigma charts are used in conjunction with capability statistics to demonstrate that the commercial process is operating in a state of statistical control. The new process validation guidance published by the U.S. Food and Drug Administration in January of 2011 indicates that the number of process validation batches or runs required to demonstrate that a pharmaceutical process is operating in a validated state should be based on sound statistical principles. The old rule of "three consecutive batches and you're done" is no longer sufficient. The guidance, however, does not provide any specific methodology for determining the number of runs required, and little has been published to augment this shortcoming. The paper titled "Risk-based Methodology for Validation of Pharmaceutical Batch Processes" describes a statistically sound methodology for determining when a statistically valid number of validation runs has been acquired based on risk assessment and calculation of process capability.
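As a rough illustration of the control-charting and capability portion of such a methodology, the sketch below computes individuals-chart control limits from the average moving range and an overall capability index against specification limits. The assay values and specification limits are hypothetical, and the snippet does not reproduce the article's PFMECA-driven choice of statistical confidence and coverage.

```python
import numpy as np

def individuals_chart_limits(x):
    """Control limits for an individuals (I) chart using the average moving
    range and the usual d2 = 1.128 constant for subgroups of size 2."""
    x = np.asarray(x, dtype=float)
    sigma_within = np.mean(np.abs(np.diff(x))) / 1.128
    center = x.mean()
    return center - 3 * sigma_within, center, center + 3 * sigma_within

def ppk(x, lsl, usl):
    """Overall capability index Ppk = min(USL - mean, mean - LSL) / (3 s)."""
    x = np.asarray(x, dtype=float)
    return min(usl - x.mean(), x.mean() - lsl) / (3 * x.std(ddof=1))

# Hypothetical assay results (% label claim) from PPQ batches, specs 95-105
data = [99.8, 100.4, 100.1, 99.5, 100.9, 100.2, 99.7, 100.3]
print(individuals_chart_limits(data), round(ppk(data, 95, 105), 2))
```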
Performance-based methodology for assessing seismic vulnerability and capacity of buildings
NASA Astrophysics Data System (ADS)
Shibin, Lin; Lili, Xie; Maosheng, Gong; Ming, Li
2010-06-01
This paper presents a performance-based methodology for the assessment of seismic vulnerability and capacity of buildings. The vulnerability assessment methodology is based on the HAZUS methodology and the improved capacity-demand-diagram method. The spectral displacement (Sd) of performance points on a capacity curve is used to estimate the damage level of a building. The relationship between Sd and peak ground acceleration (PGA) is established, and then a new vulnerability function is expressed in terms of PGA. Furthermore, the expected value of the seismic capacity index (SCev) is provided to estimate the seismic capacity of buildings based on the probability distribution of damage levels and the corresponding seismic capacity index. The results indicate that the proposed vulnerability methodology is able to assess seismic damage of a large number of building stock directly and quickly following an earthquake. The SCev provides an effective index to measure the seismic capacity of buildings and illustrate the relationship between the seismic capacity of buildings and seismic action. The estimated result is compared with damage surveys of the cities of Dujiangyan and Jiangyou in the M8.0 Wenchuan earthquake, revealing that the methodology is acceptable for seismic risk assessment and decision making. The primary reasons for discrepancies between the estimated results and the damage surveys are discussed.
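The expected seismic capacity index described above is a probability-weighted sum over damage levels. A minimal sketch, with invented damage-state probabilities and index values rather than the paper's calibrated ones:

```python
# Hypothetical damage-state probabilities at a given PGA and the seismic
# capacity index assigned to each damage level (both invented for illustration)
damage_probs = {"none": 0.35, "slight": 0.30, "moderate": 0.20,
                "extensive": 0.10, "complete": 0.05}
capacity_index = {"none": 1.0, "slight": 0.8, "moderate": 0.55,
                  "extensive": 0.3, "complete": 0.1}

# SCev = sum over damage levels of P(level) * index(level)
sc_ev = sum(damage_probs[d] * capacity_index[d] for d in damage_probs)
print(round(sc_ev, 3))
```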
Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S
2015-03-02
A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided a precise control of the reaction conditions which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed higher impact on V. fischeri, evidenced by lower EC50. The proposed methodology was validated through statistical analysis which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as alternative to microplate based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
Design and analysis of sustainable computer mouse using design for disassembly methodology
NASA Astrophysics Data System (ADS)
Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia
2017-12-01
This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. Basically, the existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The main methodology of this paper proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new design of the computer mouse using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were considered to determine the environmental impact categories. Sustainability analysis was conducted using the SolidWorks software. As a result, high-density PE gives the lowest values in the environmental impact categories while showing a high maximum stress value.
Montecinos, P; Rodewald, A M
1994-06-01
The aim of this work was to assess and compare the achievements of medical students subjected to a problem-based learning methodology. The information and comprehension categories of Bloom were tested in 17 medical students on four different occasions during the physiopathology course, using a multiple-choice knowledge test. There was a significant improvement in the number of correct answers towards the end of the course. It is concluded that these medical students obtained adequate learning achievements in the information subcategory of Bloom using a problem-based learning methodology during the physiopathology course.
Crossing trend analysis methodology and application for Turkish rainfall records
NASA Astrophysics Data System (ADS)
Şen, Zekâi
2018-01-01
Trend analyses are the necessary tools for depicting possible general increase or decrease in a given time series. There are many versions of trend identification methodologies such as the Mann-Kendall trend test, Spearman's tau, Sen's slope, regression line, and Şen's innovative trend analysis. The literature has many papers about the use, cons and pros, and comparisons of these methodologies. In this paper, a completely new approach is proposed based on the crossing properties of a time series. It is suggested that the suitable trend from the centroid of the given time series should have the maximum number of crossings (total number of up-crossings or down-crossings). This approach is applicable whether the time series has dependent or independent structure and also without any dependence on the type of the probability distribution function. The validity of this method is presented through extensive Monte Carlo simulation technique and its comparison with other existing trend identification methodologies. The application of the methodology is presented for a set of annual daily extreme rainfall time series from different parts of Turkey and they have physically independent structure.
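The crossing criterion can be illustrated by counting up-crossings of candidate trend lines through the series centroid and keeping the slope that maximizes the count. The rainfall series below is synthetic and the slope grid is arbitrary; this is a sketch of the idea, not the paper's full procedure or its Monte Carlo validation.

```python
import numpy as np

def up_crossings(series, slope):
    """Count up-crossings of the trend line through the series centroid."""
    t = np.arange(len(series))
    trend = series.mean() + slope * (t - t.mean())
    above = series > trend
    return int(np.sum(~above[:-1] & above[1:]))

rng = np.random.default_rng(1)
annual_rain = 40 + 0.3 * np.arange(50) + rng.normal(0, 5, 50)   # synthetic record

slopes = np.linspace(-1.0, 1.0, 201)
best = max(slopes, key=lambda s: up_crossings(annual_rain, s))
print(round(float(best), 2), up_crossings(annual_rain, best))
```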
Infinity Computer and Calculus
NASA Astrophysics Data System (ADS)
Sergeyev, Yaroslav D.
2007-09-01
Traditional computers work with finite numbers. Situations where the usage of infinite or infinitesimal quantities is required are studied mainly theoretically. In this survey talk, a new computational methodology (that is not related to nonstandard analysis) is described. It is based on the principle `The part is less than the whole' applied to all numbers (finite, infinite, and infinitesimal) and to all sets and processes (finite and infinite). It is shown that it becomes possible to write down finite, infinite, and infinitesimal numbers by a finite number of symbols as particular cases of a unique framework. The new methodology allows us to introduce the Infinity Computer working with all these numbers (its simulator is presented during the lecture). The new computational paradigm both gives possibilities to execute computations of a new type and simplifies fields of mathematics where infinity and/or infinitesimals are encountered. Numerous examples of the usage of the introduced computational tools are given during the lecture.
Potential and Limitations of an Improved Method to Produce Dynamometric Wheels
García de Jalón, Javier
2018-01-01
A new methodology for the estimation of tyre-contact forces is presented. The new procedure is an evolution of a previous method based on harmonic elimination techniques developed with the aim of producing low-cost dynamometric wheels. While the original method required stress measurement in many rim radial lines and the fulfillment of some rigid conditions of symmetry, the new methodology described in this article significantly reduces the number of required measurement points and greatly relaxes symmetry constraints. This can be done without compromising the estimation error level. The reduction of the number of measuring radial lines increases the ripple of demodulated signals due to non-eliminated higher-order harmonics. Therefore, it is necessary to adapt the calibration procedure to this new scenario. A new calibration procedure that takes into account the angular position of the wheel is completely described. This new methodology is tested on a standard commercial five-spoke car wheel. The obtained results are qualitatively compared to those derived from the application of the former methodology, leading to the conclusion that the new method is both simpler and more robust due to the reduction in the number of measuring points, while the contact forces' estimation error remains at an acceptable level. PMID:29439427
Automated Authorship Attribution Using Advanced Signal Classification Techniques
Ebrahimpour, Maryam; Putniņš, Tālis J.; Berryman, Matthew J.; Allison, Andrew; Ng, Brian W.-H.; Abbott, Derek
2013-01-01
In this paper, we develop two automated authorship attribution schemes, one based on Multiple Discriminant Analysis (MDA) and the other based on a Support Vector Machine (SVM). The classification features we exploit are based on word frequencies in the text. We adopt an approach of preprocessing each text by stripping it of all characters except a-z and space. This is in order to increase the portability of the software to different types of texts. We test the methodology on a corpus of undisputed English texts, and use leave-one-out cross validation to demonstrate classification accuracies in excess of 90%. We further test our methods on the Federalist Papers, which have a partly disputed authorship and a fair degree of scholarly consensus. And finally, we apply our methodology to the question of the authorship of the Letter to the Hebrews by comparing it against a number of original Greek texts of known authorship. These tests identify where some of the limitations lie, motivating a number of open questions for future work. An open source implementation of our methodology is freely available for use at https://github.com/matthewberryman/author-detection. PMID:23437047
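The word-frequency plus SVM pipeline with leave-one-out cross-validation can be sketched as follows; the four toy texts and author labels are invented, and the open-source implementation linked above should be consulted for the authors' actual feature set and MDA variant.

```python
import re
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Toy corpus and author labels (invented)
texts = ["Upon the whole it is evident that the measure is necessary",
         "The people of each state shall be bound by the laws thereof",
         "It is evident upon reflection that the measure must proceed",
         "The laws of the state bind the people in every case"]
authors = ["A", "B", "A", "B"]

# Strip everything except a-z and space, as described in the abstract
cleaned = [re.sub(r"[^a-z ]", "", t.lower()) for t in texts]

# Relative word frequencies as features
X = CountVectorizer().fit_transform(cleaned).toarray().astype(float)
X = X / X.sum(axis=1, keepdims=True)

scores = cross_val_score(SVC(kernel="linear"), X, authors, cv=LeaveOneOut())
print(scores.mean())
```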
Ho, Cheng-I; Lin, Min-Der; Lo, Shang-Lien
2010-07-01
A methodology based on the integration of a seismic-based artificial neural network (ANN) model and a geographic information system (GIS) to assess water leakage and to prioritize pipeline replacement is developed in this work. Qualified pipeline break-event data derived from the Taiwan Water Corporation Pipeline Leakage Repair Management System were analyzed. "Pipe diameter," "pipe material," and "the number of magnitude-3(+) earthquakes" were employed as the input factors of the ANN, while "the number of monthly breaks" was used as the prediction output. This study is the first attempt to use earthquake data in a break-event ANN prediction model. The spatial distribution of the pipeline break-event data was analyzed and visualized by GIS. Through this, users can swiftly identify the hotspots of the leakage areas. A northeastern township in Taiwan, frequently affected by earthquakes, is chosen as the case study. Compared to the traditional processes for determining the priorities of pipeline replacement, the methodology developed is more effective and efficient. Likewise, the methodology can overcome the difficulty of prioritizing pipeline replacement even in situations where break-event records are unavailable.
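A minimal sketch of the input-output mapping described above (pipe diameter, pipe material, and the number of magnitude-3(+) earthquakes predicting monthly breaks) using a small neural network; the data are synthetic and the network architecture is an assumption, not the authors' trained model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 300
diameter = rng.choice([100, 150, 200, 300], size=n)        # pipe diameter, mm
material = rng.integers(0, 3, size=n)                      # coded pipe material
quakes = rng.poisson(1.5, size=n)                          # magnitude-3(+) events

# Synthetic target: monthly breaks loosely driven by the three inputs
breaks = (0.02 * (300 - diameter) / 100 + 0.4 * quakes
          + 0.3 * material + rng.normal(0, 0.3, n))

X = StandardScaler().fit_transform(
    np.column_stack([diameter, material, quakes]).astype(float))

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X[:250], breaks[:250])
print(round(model.score(X[250:], breaks[250:]), 2))         # held-out R^2
```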
Learning Motion Features for Example-Based Finger Motion Estimation for Virtual Characters
NASA Astrophysics Data System (ADS)
Mousas, Christos; Anagnostopoulos, Christos-Nikolaos
2017-09-01
This paper presents a methodology for estimating the motion of a character's fingers based on the use of motion features provided by a virtual character's hand. In the presented methodology, firstly, the motion data is segmented into discrete phases. Then, a number of motion features are computed for each motion segment of a character's hand. The motion features are pre-processed using restricted Boltzmann machines, and by using the different variations of semantically similar finger gestures in a support vector machine learning mechanism, the optimal weights for each feature assigned to a metric are computed. The advantages of the presented methodology in comparison to previous solutions are the following: First, we automate the computation of optimal weights that are assigned to each motion feature counted in our metric. Second, the presented methodology achieves an increase (about 17%) in correctly estimated finger gestures in comparison to a previous method.
Andrade, Luís Renato Balbão; Amaral, Fernando Gonçalves
2012-01-01
Nanotechnology is a multidisciplinary set of techniques to manipulate matter at the nanoscale, more precisely particles below 100 nm whose characteristics, owing to their small size, are essentially different from those of the same materials in macro form. Regarding these new properties, there are knowledge gaps about the effects of these particles on the human organism and the environment. Although still considered an emerging technology, it is growing increasingly fast, as are the number of products using nanotechnologies at some production level and the number of researchers involved with the subject. Given this scenario and based on the related literature, a comprehensive methodology for health and safety at work in research laboratories with activities in nanotechnologies was developed, based on the ILO guidelines for safety and health at work management systems, to which a number of nano-specific recommendations were added. The work intends to offer food for thought on controlling the risks associated with nanotechnologies.
NASA Astrophysics Data System (ADS)
Kalanov, Temur Z.
2014-03-01
A critical analysis of the foundations of standard vector calculus is proposed. The methodological basis of the analysis is the unity of formal logic and of rational dialectics. It is proved that the vector calculus is an incorrect theory because: (a) it is not based on a correct methodological basis, namely the unity of formal logic and of rational dialectics; (b) it does not contain correct definitions of "movement," "direction," and "vector"; (c) it does not take into consideration the dimensions of physical quantities (i.e., number names, denominate numbers, concrete numbers) characterizing the concept of "physical vector," and therefore it has no natural-scientific meaning; (d) operations on "physical vectors" and the vector calculus propositions relating to "physical vectors" are contrary to formal logic.
Membrane Insertion Profiles of Peptides Probed by Molecular Dynamics Simulations
2008-07-17
In-Chul Yeh, Mark A. Olson, Michael S. Lee, and Anders ... A methodology based on molecular dynamics simulation techniques is used to probe the insertion profiles of small peptides across the membrane interface.
NASA Astrophysics Data System (ADS)
Polatidis, Heracles; Morales, Jan Borràs
2016-11-01
In this paper a methodological framework for increasing the actual applicability of wind farms is developed and applied. The framework is based on multi-criteria decision aid techniques that perform an integrated technical and societal evaluation of a number of potential wind power projects, each a variation of a pre-existing actual proposal that faces implementation difficulties. A number of evaluation criteria are established and assessed via particular related software or are comparatively evaluated among each other on a semi-qualitative basis. The preferences of a diverse audience of pertinent stakeholders can also be incorporated in the overall analysis. The result of the process is the identification of a new project that will exhibit increased actual implementation potential compared with the original proposal. The methodology is tested in a case study of a wind farm in the UK and relevant conclusions are drawn.
2010-08-01
This study presents a methodology for computing stochastic sensitivities with respect to the design variables, which are the ... random variables.
Díaz Córdova, Diego
2016-01-01
The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social network analysis. In order to illustrate these methods in action, two cases based on materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.
The Cylindrical Component Methodology Evaluation Module for MUVES-S2
2017-04-01
The Cylindrical Component Methodology Evaluation Module for MUVES-S2, by David S Butler, Marianne Kunkel, and Brian G Smith (ARL-TR-7990, US Army Research Laboratory, April 2017).
Online model-based diagnosis to support autonomous operation of an advanced life support system.
Biswas, Gautam; Manders, Eric-Jan; Ramirez, John; Mahadevan, Nagabhusan; Abdelwahed, Sherif
2004-01-01
This article describes methods for online model-based diagnosis of subsystems of the advanced life support system (ALS). The diagnosis methodology is tailored to detect, isolate, and identify faults in components of the system quickly so that fault-adaptive control techniques can be applied to maintain system operation without interruption. We describe the components of our hybrid modeling scheme and the diagnosis methodology, and then demonstrate the effectiveness of this methodology by building a detailed model of the reverse osmosis (RO) system of the water recovery system (WRS) of the ALS. This model is validated with real data collected from an experimental testbed at NASA JSC. A number of diagnosis experiments run on simulated faulty data are presented and the results are discussed.
Online model-based diagnosis to support autonomous operation of an advanced life support system
NASA Technical Reports Server (NTRS)
Biswas, Gautam; Manders, Eric-Jan; Ramirez, John; Mahadevan, Nagabhusan; Abdelwahed, Sherif
2004-01-01
This article describes methods for online model-based diagnosis of subsystems of the advanced life support system (ALS). The diagnosis methodology is tailored to detect, isolate, and identify faults in components of the system quickly so that fault-adaptive control techniques can be applied to maintain system operation without interruption. We describe the components of our hybrid modeling scheme and the diagnosis methodology, and then demonstrate the effectiveness of this methodology by building a detailed model of the reverse osmosis (RO) system of the water recovery system (WRS) of the ALS. This model is validated with real data collected from an experimental testbed at NASA JSC. A number of diagnosis experiments run on simulated faulty data are presented and the results are discussed.
A methodology for reduced order modeling and calibration of the upper atmosphere
NASA Astrophysics Data System (ADS)
Mehta, Piyush M.; Linares, Richard
2017-10-01
Atmospheric drag is the largest source of uncertainty in accurately predicting the orbit of satellites in low Earth orbit (LEO). Accurately predicting drag for objects that traverse LEO is critical to space situational awareness. Atmospheric models used for orbital drag calculations can be characterized either as empirical or physics-based (first principles based). Empirical models are fast to evaluate but offer limited real-time predictive/forecasting ability, while physics based models offer greater predictive/forecasting ability but require dedicated parallel computational resources. Also, calibration with accurate data is required for either type of models. This paper presents a new methodology based on proper orthogonal decomposition toward development of a quasi-physical, predictive, reduced order model that combines the speed of empirical and the predictive/forecasting capabilities of physics-based models. The methodology is developed to reduce the high dimensionality of physics-based models while maintaining its capabilities. We develop the methodology using the Naval Research Lab's Mass Spectrometer Incoherent Scatter model and show that the diurnal and seasonal variations can be captured using a small number of modes and parameters. We also present calibration of the reduced order model using the CHAMP and GRACE accelerometer-derived densities. Results show that the method performs well for modeling and calibration of the upper atmosphere.
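Proper orthogonal decomposition of a snapshot matrix is commonly computed with an SVD, retaining a small number of modes. The sketch below uses synthetic snapshots rather than output of the Mass Spectrometer Incoherent Scatter model and illustrates only the dimensionality-reduction step, not the calibration against CHAMP/GRACE-derived densities.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic "density" snapshots: rows = grid points, columns = time samples
n_grid, n_snap = 500, 200
t = np.linspace(0, 2 * np.pi, n_snap)
x = np.linspace(0, 1, n_grid)[:, None]
snapshots = (np.sin(2 * np.pi * x) * np.cos(t)              # slow, diurnal-like mode
             + 0.3 * np.cos(4 * np.pi * x) * np.sin(2 * t)  # faster, weaker mode
             + 0.01 * rng.normal(size=(n_grid, n_snap)))    # noise

# POD via SVD of the mean-subtracted snapshot matrix
mean_state = snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(snapshots - mean_state, full_matrices=False)

r = 5                                        # retain a small number of POD modes
modes, coeffs = U[:, :r], np.diag(s[:r]) @ Vt[:r]
reconstruction = mean_state + modes @ coeffs
print(f"captured variance: {(s[:r] ** 2).sum() / (s ** 2).sum():.4f}")
```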
Chung, Ka-Fai; Chan, Man-Sum; Lam, Ying-Yin; Lai, Cindy Sin-Yee; Yeung, Wing-Fai
2017-06-01
Insufficient sleep among students is a major school health problem. School-based sleep education programs tailored to reach a large number of students may be one of the solutions. A systematic review and meta-analysis was conducted to summarize the programs' effectiveness and current status. Electronic databases were searched up until May 2015. Randomized controlled trials of school-based sleep intervention among 10- to 19-year-old students with an outcome on total sleep duration were included. Methodological quality of the studies was assessed using the Cochrane risk of bias assessment. Seven studies were included, involving 1876 students receiving sleep education programs and 2483 attending classes-as-usual. Four weekly 50-minute sleep education classes were most commonly provided. Methodological quality was only moderate, with a high or an uncertain risk of bias in several domains. Compared to classes-as-usual, sleep education programs produced significantly longer weekday and weekend total sleep time and better mood among students at immediate post-treatment, but the improvements were not maintained at follow-up. Limited by the small number of studies and methodological limitations, the preliminary data showed that school-based sleep education programs produced short-term benefits. Future studies should explore integrating sleep education with delayed school start time or other more effective approaches. © 2017, American School Health Association.
Garg, Harish
2013-03-01
The main objective of the present paper is to propose a methodology for analyzing the behavior of complex repairable industrial systems. In real-life situations, it is difficult to find the optimal design policies for MTBF (mean time between failures), MTTR (mean time to repair) and the related costs by utilizing available resources and uncertain data. For this, an availability-cost optimization model has been constructed for determining the optimal design parameters for improving system design efficiency. The uncertainties in the data related to each component of the system are estimated with the help of fuzzy and statistical methodology in the form of triangular fuzzy numbers. Using these data, the various reliability parameters that affect system performance are obtained in the form of fuzzy membership functions by the proposed confidence-interval-based fuzzy Lambda-Tau (CIBFLT) methodology. The results computed by CIBFLT are compared with those of the existing fuzzy Lambda-Tau methodology. Sensitivity analysis of the system MTBF has also been addressed. The methodology has been illustrated through a case study of the washing unit, a main component of the paper industry. Copyright © 2012 ISA. Published by Elsevier Ltd. All rights reserved.
Roughness Based Crossflow Transition Control: A Computational Assessment
NASA Technical Reports Server (NTRS)
Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan; Streett, Craig L.; Carpenter, Mark H.
2009-01-01
A combination of parabolized stability equations and secondary instability theory has been applied to a low-speed swept airfoil model with a chord Reynolds number of 7.15 million, with the goals of (i) evaluating this methodology in the context of transition prediction for a known configuration for which roughness-based crossflow transition control has been demonstrated under flight conditions and (ii) analyzing the mechanism of transition delay via the introduction of discrete roughness elements (DRE). Roughness-based transition control involves controlled seeding of suitable, subdominant crossflow modes so as to weaken the growth of naturally occurring, linearly more unstable crossflow modes. Therefore, a synthesis of receptivity, linear and nonlinear growth of stationary crossflow disturbances, and the ensuing development of high-frequency secondary instabilities is desirable to understand the experimentally observed transition behavior. With further validation, such higher fidelity prediction methodology could be utilized to assess the potential for crossflow transition control at even higher Reynolds numbers, where experimental data are currently unavailable.
NASA Astrophysics Data System (ADS)
Wang, Lynn T.-N.; Schroeder, Uwe Paul; Madhavan, Sriram
2017-03-01
A pattern-based methodology for optimizing SADP-compliant layout designs is developed based on identifying cut mask patterns and replacing them with pre-characterized fixing solutions. A pattern-based library of difficult-to-manufacture cut patterns with pre-characterized fixing solutions is built. A pattern-based engine searches for matching patterns in the decomposed layouts. When a match is found, the engine opportunistically replaces the detected pattern with a pre-characterized fixing solution. The methodology was demonstrated on a 7 nm routed metal2 block. A small library of 30 cut patterns increased the number of more manufacturable cuts by 38% and metal-via enclosure by 13%, with a small parasitic capacitance impact of 0.3%.
Calibration methodology for proportional counters applied to yield measurements of a neutron burst.
Tarifeño-Saldivia, Ariel; Mayer, Roberto E; Pavez, Cristian; Soto, Leopoldo
2014-01-01
This paper introduces a methodology for the yield measurement of a neutron burst using neutron proportional counters. This methodology is to be applied when single neutron events cannot be resolved in time by nuclear standard electronics, or when a continuous current cannot be measured at the output of the counter. The methodology is based on the calibration of the counter in pulse mode, and the use of a statistical model to estimate the number of detected events from the accumulated charge resulting from the detection of the burst of neutrons. The model is developed and presented in full detail. For the measurement of fast neutron yields generated from plasma focus experiments using a moderated proportional counter, the implementation of the methodology is herein discussed. An experimental verification of the accuracy of the methodology is presented. An improvement of more than one order of magnitude in the accuracy of the detection system is obtained by using this methodology with respect to previous calibration methods.
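The core bookkeeping behind such a charge-based yield estimate can be sketched as follows, assuming the pulse-mode calibration provides a mean charge per detected event; the detailed statistical model in the paper is richer than this. All numerical values are illustrative placeholders.

```python
import numpy as np

# Sketch: estimate the number of detected neutrons in a burst from the total
# accumulated charge, given a pulse-mode calibration of the counter.
q_cal = np.array([1.02e-12, 0.98e-12, 1.05e-12, 0.99e-12])  # C per single-event pulse (assumed)
Q_burst = 2.4e-9                                            # C integrated over the burst (assumed)

q_mean, q_std = q_cal.mean(), q_cal.std(ddof=1)
N_est = Q_burst / q_mean                        # estimated number of detected events
# Rough uncertainty: calibration spread plus Poisson counting statistics.
rel_err = np.sqrt((q_std / q_mean) ** 2 / len(q_cal) + 1.0 / N_est)
print(f"N = {N_est:.0f} +/- {N_est * rel_err:.0f} detected neutrons")
```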
ERIC Educational Resources Information Center
Notes on Literacy, 1997
1997-01-01
The 1997 volume of "Notes on Literacy," numbers 1-4, includes the following articles: "Community Based Literacy, Burkina Faso"; "The Acquisition of a Second Writing System"; "Appropriate Methodology and Social Context"; "Literacy Megacourse Offered"; "Fitting in with Local Assumptions about…
Implementing Competency-Based Education: Challenges, Strategies, and a Decision-Making Framework
ERIC Educational Resources Information Center
Dragoo, Amie; Barrows, Richard
2016-01-01
The number of competency-based education (CBE) degree programs has increased rapidly over the past five years, yet there is little research on CBE program development. This study utilized conceptual models of higher education change and a qualitative methodology to analyze the strategies and challenges in implementing CBE business degree programs…
Using scan statistics for congenital anomalies surveillance: the EUROCAT methodology.
Teljeur, Conor; Kelly, Alan; Loane, Maria; Densem, James; Dolk, Helen
2015-11-01
Scan statistics have been used extensively to identify temporal clusters of health events. We describe the temporal cluster detection methodology adopted by the EUROCAT (European Surveillance of Congenital Anomalies) monitoring system. Since 2001, EUROCAT has implemented a variable window width scan statistic for detecting unusual temporal aggregations of congenital anomaly cases. The scan windows are based on numbers of cases rather than being defined by time. The methodology is embedded in the EUROCAT Central Database for annual application to centrally held registry data, and it has been incrementally adapted to improve its utility and to address statistical issues. Simulation exercises were used to determine the power of the methodology to identify periods of raised risk (of 1-18 months). In order to operationalize the scan methodology, a number of adaptations were needed, including: estimating the date of conception as the unit of time; deciding the maximum length (in time) and recency of clusters of interest; reporting multiple and overlapping significant clusters; replacing the Monte Carlo simulation with a lookup table to reduce computation time; and placing a threshold on underlying population change and estimating the false positive rate by simulation. Exploration of power found that raised risk periods lasting 1 month are unlikely to be detected except when the relative risk and case counts are high. The variable window width scan statistic is a useful tool for the surveillance of congenital anomalies. Numerous adaptations have improved the utility of the original methodology in the context of temporal cluster detection in congenital anomalies.
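A minimal sketch of a case-count-based scan is given below: for windows containing a fixed number of consecutive cases, the shortest elapsed time is found and judged against a Monte Carlo null of uniform occurrence. The window size, case counts, and the use of Monte Carlo rather than the EUROCAT lookup table are illustrative assumptions.

```python
import numpy as np

# Case-count-window scan sketch: find the shortest time span covering k
# consecutive cases and assess its significance by simulation under a
# uniform-occurrence null.
rng = np.random.default_rng(1)

def min_span(times, k):
    t = np.sort(times)
    return np.min(t[k - 1:] - t[:len(t) - k + 1])   # shortest span covering k cases

case_times = rng.uniform(0, 365, size=60)           # illustrative case dates (days)
k = 8
observed = min_span(case_times, k)

null = np.array([min_span(rng.uniform(0, 365, size=60), k) for _ in range(2000)])
p_value = (np.sum(null <= observed) + 1) / (len(null) + 1)
print(f"shortest {k}-case window: {observed:.1f} days, p = {p_value:.3f}")
```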
Methodological standards in single-case experimental design: Raising the bar.
Ganz, Jennifer B; Ayres, Kevin M
2018-04-12
Single-case experimental designs (SCEDs), or small-n experimental research, are frequently implemented to assess approaches to improving outcomes for people with disabilities, particularly those with low-incidence disabilities, such as some developmental disabilities. SCED has become increasingly accepted as a research design. As this literature base is needed to determine which interventions are evidence-based practices, the acceptance of SCED has resulted in increased critiques with regard to methodological quality. Recent trends include recommendations from a number of expert scholars and institutions. The purpose of this article is to summarize the recent history of methodological quality considerations, synthesize the recommendations found in the SCED literature, and provide recommendations to researchers designing SCEDs with regard to essential and aspirational standards for methodological quality. The conclusions implore SCED researchers to increase the quality of their experiments, with particular consideration of the applied nature of SCED research to be published in Research in Developmental Disabilities and beyond. Copyright © 2018 Elsevier Ltd. All rights reserved.
Experience-based co-design in an adult psychological therapies service.
Cooper, Kate; Gillmore, Chris; Hogg, Lorna
2016-01-01
Experience-based co-design (EBCD) is a methodology for service improvement and development that puts service-user voices at the heart of improving health services. The aim of this paper was to implement the EBCD methodology in a mental health setting and to investigate the challenges that arise during this process. To achieve this, a modified version of the EBCD methodology was undertaken, which involved listening to the experiences of the people who work in and use the mental health setting and sharing these experiences with the people who could effect change within the service, through collaborative work between service-users, staff and managers. EBCD was implemented within the mental health setting and was well received by service-users, staff and stakeholders. A number of modifications were necessary in this setting, for example providing high levels of support to participants. It was concluded that EBCD is a suitable methodology for service improvement in mental health settings.
Methodological quality of randomised controlled trials in burns care. A systematic review.
Danilla, Stefan; Wasiak, Jason; Searle, Susana; Arriagada, Cristian; Pedreros, Cesar; Cleland, Heather; Spinks, Anneliese
2009-11-01
To evaluate the methodological quality of published randomised controlled trials (RCTs) in burn care treatment and management, we used a predetermined search strategy to search the Ovid MEDLINE database (1950 to January 2008) and identify all English-language RCTs related to burn care. The full-text studies identified were reviewed for key demographic and methodological characteristics. Methodological trial quality was assessed using the Jadad scale. A total of 257 studies involving 14,535 patients met the inclusion criteria. The median Jadad score was 2 (out of a best possible score of 5). Information was given in the introduction and discussion sections of most RCTs, although insufficient detail was provided on randomisation, allocation concealment, and blinding. The number of RCTs increased between 1950 and 2008 (Spearman's rho=0.6129, P<0.001), although reporting quality did not improve over the same time period (P=0.1896) and was better in RCTs with larger sample sizes (median Jadad score, 4 vs. 2 points, P<0.0001). Methodological quality did not correlate with journal impact factor (P=0.2371). The reporting standards of RCTs are highly variable and less than optimal in most cases. The advent of evidence-based medicine heralds a new approach to burns care, and systematic steps are needed to improve the quality of RCTs in this field. Identifying and reviewing the existing body of RCTs not only highlights the need for burn clinicians to conduct more trials, but may also encourage them to consider the importance of conducting trials that follow appropriate, evidence-based standards.
Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions
NASA Astrophysics Data System (ADS)
De Risi, Raffaele; Goda, Katsuichiro
2017-08-01
Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast, or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area, for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
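The empirical hazard-curve step described above reduces to summing the annual rates of simulated events whose intensity exceeds each threshold. The sketch below assumes each simulated scenario carries an annual occurrence rate; the event set, rates, and depths are illustrative, and the Bayesian "robust" fit is not reproduced.

```python
import numpy as np

# Empirical tsunami hazard curve from simulation-based PTHA output.
rng = np.random.default_rng(2)
inundation_depth = rng.lognormal(mean=0.0, sigma=0.8, size=5000)   # m (illustrative)
annual_rate = np.full(5000, 1.0e-4)                                # per event (illustrative)

thresholds = np.linspace(0.1, 10.0, 100)
exceedance_rate = np.array([annual_rate[inundation_depth > h].sum() for h in thresholds])

# Mean annual rate of exceeding 2 m of inundation at this site:
print(exceedance_rate[np.searchsorted(thresholds, 2.0)])
```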
Axial Magneto-Inductive Effect in Soft Magnetic Microfibers, Test Methodology, and Experiments
2016-03-24
NUWC-NPT Technical Report 12,186, 24 March 2016.
[Why evidence-based medicine? 20 years of meta-analysis].
Ceballos, C; Valdizán, J R; Artal, A; Almárcegui, C; Allepuz, C; García Campayo, J; Fernández Liesa, R; Giraldo, P; Puértolas, T
2000-10-01
Meta-analysis, described within evidence-based medicine, has become a frequent topic in the recent medical literature. An exhaustive search of reported meta-analyses from all medical specialties is described, covering papers included in Medline or Embase between 1973 and 1998. A study of intra- and inter-reviewer reliability in selection and classification was performed. A descriptive analysis of the reported papers (frequency tables and graphics) is presented, including differences in the mean number of reported meta-analysis papers by medical specialty and year. A total of 1,518 papers were selected and classified. Most frequently found (45.91%) were: methodology (15.7%), psychiatry (11.79%), cardiology (10.01%) and oncology (8.36%). Inter-rater agreement was 0.93 in selecting papers and 0.72 in classifying them. Between 1977 and 1987 the overall mean number of reported meta-analysis studies (1.67 ± 4.10) was significantly lower than in 1988-1998 (49.54 ± 56.55) (p < 0.001). The global number of meta-analyses was positively correlated (p < 0.05) with the number of studies about fundamentals and methodology during the study period. The method used to identify meta-analysis reports can be considered adequate; however, the agreement in classifying them by medical specialty was lower. A progressive increase in the number of reported meta-analyses since 1977 can be demonstrated. The specialties with the greatest number of meta-analyses published in the literature were psychiatry, oncology and cardiology. Diffusion of knowledge about the fundamentals and methodology of meta-analysis seems to have driven an increase in performing and reporting this kind of analysis.
Case formulation and management using pattern-based formulation (PBF) methodology: clinical case 1.
Fernando, Irosh; Cohen, Martin
2014-02-01
A tool for psychiatric case formulation known as pattern-based formulation (PBF) has been recently introduced. This paper presents an application of this methodology in formulating and managing complex clinical cases. The symptomatology of the clinical presentation has been parsed into individual clinical phenomena and interpreted by selecting explanatory models. The clinical presentation demonstrates how PBF has been used as a clinical tool to guide clinicians' thinking, that takes a structured approach to manage multiple issues using a broad range of management strategies. In doing so, the paper also introduces a number of patterns related to the observed clinical phenomena that can be re-used as explanatory models when formulating other clinical cases. It is expected that this paper will assist clinicians, and particularly trainees, to better understand PBF methodology and apply it to improve their formulation skills.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Augustine, Chad
Existing methodologies for estimating the electricity generation potential of Enhanced Geothermal Systems (EGS) assume thermal recovery factors of 5% or less, resulting in relatively low volumetric electricity generation potentials for EGS reservoirs. This study proposes and develops a methodology for calculating EGS electricity generation potential based on the Gringarten conceptual model and analytical solution for heat extraction from fractured rock. The electricity generation potential of a cubic kilometer of rock as a function of temperature is calculated assuming limits on the allowed produced water temperature decline and reservoir lifetime based on surface power plant constraints. The resulting estimates of EGS electricity generation potential can be one to nearly two orders of magnitude larger than those from existing methodologies. The flow per unit fracture surface area from the Gringarten solution is found to be a key term in describing the conceptual reservoir behavior. The methodology can be applied to aid in the design of EGS reservoirs by giving minimum reservoir volume, fracture spacing, number of fractures, and flow requirements for a target reservoir power output. Limitations of the idealized model compared to actual reservoir performance and the implications for reservoir design are discussed.
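The general "heat in place times recovery times conversion" logic behind a volumetric estimate of this kind can be sketched as follows. All numerical values (rock properties, temperatures, recovery factor, conversion efficiency, project life) are illustrative assumptions, not the report's inputs.

```python
# Volumetric electricity-generation sketch for 1 km^3 of hot rock.
RHO_C = 2.55e6          # J/(m^3 K), volumetric heat capacity of rock (assumed)
VOLUME = 1.0e9          # m^3 (1 km^3)
T_ROCK = 200.0          # deg C, initial reservoir temperature (assumed)
T_REJECT = 80.0         # deg C, abandonment/outlet temperature limit (assumed)
RECOVERY = 0.20         # thermal recovery factor (assumed, above the 5% legacy value)
ETA = 0.12              # thermal-to-electric conversion efficiency (assumed)
LIFETIME_S = 30 * 365.25 * 24 * 3600.0   # 30-year project life (assumed)

heat_in_place = RHO_C * VOLUME * (T_ROCK - T_REJECT)   # J
electric_energy = heat_in_place * RECOVERY * ETA        # J over project life
average_power_mw = electric_energy / LIFETIME_S / 1.0e6
print(f"average electric potential ~ {average_power_mw:.1f} MWe per km^3")
```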
Emery, Sherry; Lee, Jungwha; Curry, Susan J; Johnson, Tim; Sporer, Amy K; Mermelstein, Robin; Flay, Brian; Warnecke, Richard
2010-02-01
Surveys of community-based programs are difficult to conduct when there is virtually no information about the number or locations of the programs of interest. This article describes the methodology used by the Helping Young Smokers Quit (HYSQ) initiative to identify and profile community-based youth smoking cessation programs in the absence of a defined sample frame. We developed a two-stage sampling design, with counties as the first-stage probability sampling units. The second stage used snowball sampling to saturation to identify individuals who administered youth smoking cessation programs across three economic sectors in each county. Multivariate analyses modeled the relationships of program screening, eligibility, and response rates with economic sector and the stratification criteria. Cumulative logit models analyzed the relationship between the number of contacts in a county and the number of programs screened, eligible, or profiled in that county. The snowball process yielded 9,983 unique and traceable contacts. Urban and high-income counties yielded significantly more screened program administrators; urban counties produced significantly more eligible programs, but there was no significant association between the county characteristics and program response rate. There is a positive relationship between the number of informants initially located and the number of programs screened, eligible, and profiled in a county. Our strategy to identify youth tobacco cessation programs could be used to create a sample frame for other nonprofit organizations that are difficult to identify due to a lack of existing directories, lists, or other traditional sample frames.
Crovelli, Robert A.; Coe, Jeffrey A.
2008-01-01
The Probabilistic Landslide Assessment Cost Estimation System (PLACES) presented in this report estimates the number and economic loss (cost) of landslides during a specified future time in individual areas, and then calculates the sum of those estimates. The analytic probabilistic methodology is based upon conditional probability theory and laws of expectation and variance. The probabilistic methodology is expressed in the form of a Microsoft Excel computer spreadsheet program. Using historical records, the PLACES spreadsheet is used to estimate the number of future damaging landslides and total damage, as economic loss, from future landslides caused by rainstorms in 10 counties of the San Francisco Bay region in California. Estimates are made for any future 5-year period of time. The estimated total number of future damaging landslides for the entire 10-county region during any future 5-year period of time is about 330. Santa Cruz County has the highest estimated number of damaging landslides (about 90), whereas Napa, San Francisco, and Solano Counties have the lowest estimated number of damaging landslides (5-6 each). Estimated direct costs from future damaging landslides for the entire 10-county region for any future 5-year period are about US $76 million (year 2000 dollars). San Mateo County has the highest estimated costs ($16.62 million), and Solano County has the lowest estimated costs (about $0.90 million).
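The expectation-and-variance logic underlying such an estimate can be illustrated with the standard formulas for a random sum (a random number of landslides, each with a random cost). The counts, costs, and the Poisson-like variance assumption below are illustrative and are not the report's county inputs.

```python
# Expectation and variance of total landslide cost as a random sum.
mean_landslides_5yr = 90.0        # expected damaging landslides in 5 years (assumed)
var_landslides_5yr = 90.0         # Poisson-like assumption: variance = mean
mean_cost_each = 0.18e6           # USD per damaging landslide (assumed)
var_cost_each = (0.10e6) ** 2     # USD^2 (assumed)

expected_total = mean_landslides_5yr * mean_cost_each
# Var(total) = E[N]*Var(C) + Var(N)*E[C]^2 for i.i.d. costs independent of N
var_total = mean_landslides_5yr * var_cost_each + var_landslides_5yr * mean_cost_each**2
print(f"E[cost] ~ ${expected_total/1e6:.1f} M, sd ~ ${var_total**0.5/1e6:.1f} M")
```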
Delta Clipper-Experimental In-Ground Effect on Base-Heating Environment
NASA Technical Reports Server (NTRS)
Wang, Ten-See
1998-01-01
A quasitransient in-ground effect method is developed to study the effect of vertical landing on a launch vehicle base-heating environment. This computational methodology is based on a three-dimensional, pressure-based, viscous flow, chemically reacting, computational fluid dynamics formulation. Important in-ground base-flow physics such as the fountain-jet formation, plume growth, air entrainment, and plume afterburning are captured with the present methodology. Convective and radiative base-heat fluxes are computed for comparison with those of a flight test. The influence of the laminar Prandtl number on the convective heat flux is included in this study. A radiative direction-dependency test is conducted using both the discrete ordinate and finite volume methods. Treatment of the plume afterburning is found to be very important for accurate prediction of the base-heat fluxes. Convective and radiative base-heat fluxes predicted by the model using a finite rate chemistry option compared reasonably well with flight-test data.
Explosion/Blast Dynamics for Constellation Launch Vehicles Assessment
NASA Technical Reports Server (NTRS)
Baer, Mel; Crawford, Dave; Hickox, Charles; Kipp, Marlin; Hertel, Gene; Morgan, Hal; Ratzel, Arthur; Cragg, Clinton H.
2009-01-01
An assessment methodology is developed to guide quantitative predictions of adverse physical environments and the subsequent effects on the Ares-1 crew launch vehicle associated with the loss of containment of cryogenic liquid propellants from the upper stage during ascent. Development of the methodology is led by a team at Sandia National Laboratories (SNL) with guidance and support from a number of National Aeronautics and Space Administration (NASA) personnel. The methodology is based on the current Ares-1 design and feasible accident scenarios. These scenarios address containment failure from debris impact or structural response to pressure or blast loading from an external source. Once containment is breached, the envisioned assessment methodology includes predictions for the sequence of physical processes stemming from cryogenic tank failure. The investigative techniques, analysis paths, and numerical simulations that comprise the proposed methodology are summarized and appropriate simulation software is identified in this report.
Seismic Hazard Estimates Using Ill-defined Macroseismic Data at Site
NASA Astrophysics Data System (ADS)
Albarello, D.; Mucciarelli, M.
A new approach is proposed for seismic hazard estimation based on documentary data concerning the local history of seismic effects. The adopted methodology allows the use of "poor" data, such as macroseismic data, within a formally coherent approach that overcomes a number of problems connected with forcing the available information into the frame of "standard" methodologies calibrated for instrumental data. The proposed methodology allows full exploitation of all the available information (which for many towns in Italy covers several centuries), making possible a correct use of macroseismic data characterized by different levels of completeness and reliability. As an application of the proposed methodology, seismic hazard estimates are presented for two towns located in Northern Italy: Bologna and Carpi.
What about N? A methodological study of sample-size reporting in focus group studies.
Carlsen, Benedicte; Glenton, Claire
2011-03-11
Focus group studies are increasingly published in health-related journals, but we know little about how researchers use this method, particularly how they determine the number of focus groups to conduct. The methodological literature commonly advises researchers to follow principles of data saturation, although practical advice on how to do this is lacking. Our objectives were, firstly, to describe the current status of sample size in focus group studies reported in health journals and, secondly, to assess whether and how researchers explain the number of focus groups they carry out. We searched PubMed for studies that had used focus groups and that had been published in open access journals during 2008, and extracted data on the number of focus groups and on any explanation authors gave for this number. We also made a qualitative assessment of how the number of groups was explained and discussed in each paper. We identified 220 papers published in 117 journals. Insufficient reporting of sample sizes was common in these papers. The number of focus groups conducted varied greatly (mean 8.4, median 5, range 1 to 96). Thirty-seven (17%) studies attempted to explain the number of groups. Six studies referred to rules of thumb in the literature, three stated that they were unable to organize more groups for practical reasons, while 28 studies stated that they had reached a point of saturation. Among those stating that they had reached a point of saturation, several appeared not to have followed principles from grounded theory, where data collection and analysis form an iterative process until saturation is reached. Studies with high numbers of focus groups did not offer explanations for the number of groups. Having too much data was not discussed as a study weakness in any of the reviewed papers. Based on these findings, we suggest that journals adopt more stringent requirements for reporting the focus group method. The often poor and inconsistent reporting seen in these studies may also reflect the lack of clear, evidence-based guidance about deciding on sample size. More empirical research is needed to develop focus group methodology.
NASA Astrophysics Data System (ADS)
Zhang, Jie; Nixon, Andrew; Barber, Tom; Budyn, Nicolas; Bevan, Rhodri; Croxford, Anthony; Wilcox, Paul
2018-04-01
In this paper, a methodology for using finite element (FE) models to validate a ray-based model in the simulation of full matrix capture (FMC) ultrasonic array data sets is proposed. The overall aim is to separate the signal contributions from different interactions in the FE results so that each individual component can be compared more easily with the ray-based model results. This is achieved by combining the results from multiple FE models of the system of interest that include progressively more geometrical features while preserving the same mesh structure. It is shown that the proposed techniques allow the interactions from a large number of different ray paths to be isolated in the FE results and compared directly with the results from a ray-based forward model.
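The model-combination idea can be sketched as a simple differencing of FMC data sets from two FE runs that share the same mesh but differ by one geometrical feature. The array shapes, zero-filled placeholder data, and the residual metric below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Isolate one interaction by differencing two FE runs on the same mesh:
# one including a geometrical feature (e.g. a back wall), one excluding it.
n_tx, n_rx, n_t = 16, 16, 2000
fmc_with_feature = np.zeros((n_tx, n_rx, n_t))      # placeholder for FE run with the feature
fmc_without_feature = np.zeros((n_tx, n_rx, n_t))   # placeholder for FE run without it

# Because the mesh (and hence the numerical error) is preserved, the difference
# contains only the signal contributions that involve the added feature.
feature_only = fmc_with_feature - fmc_without_feature

# The isolated contribution can then be compared ray path by ray path with a
# ray-based forward-model prediction, e.g. via a residual norm.
ray_model_prediction = np.zeros_like(feature_only)   # placeholder prediction
residual = np.linalg.norm(feature_only - ray_model_prediction)
print(residual)
```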
Investigation of Dynamic Modulus and Flow Number Properties of Asphalt Mixtures In Washington State
DOT National Transportation Integrated Search
2011-11-11
Pavement design is now moving toward more mechanistic-based design methodologies for the purpose of producing long-lasting and higher-performance pavements in a cost-effective manner. The recent Mechanistic-Empirical Pavement Design Guide (MEPDG)...
[Neurological sciences based on evidence].
Ceballos, C; Almárcegui, C; Artal, A; García-Campayo, J; Valdizán, J R
An exhaustive search of reported meta-analyses from all medical specialties is described, covering papers included in MEDLINE or EMBASE between 1973 and 1998. A descriptive analysis of the reported papers (frequency tables and graphics) is presented, including differences in the mean number of reported meta-analysis papers by medical specialty and year. A total of 1,514 papers were selected and classified. Between 1977 and 1987 the overall mean number of reported neurologic meta-analysis studies (1.20 +/- 1.10) was significantly lower than in 1988-1998 (11.20 +/- 7.85) (p < 0.001). The global number of neurologic meta-analyses was positively correlated (p < 0.05) with the number of studies about fundamentals and methodology during the study period. A progressive increase in the number of reported neurologic meta-analyses since 1977 can be demonstrated. Diffusion of knowledge about the fundamentals and methodology of meta-analysis seems to have driven an increase in performing and reporting this kind of analysis.
NASA Astrophysics Data System (ADS)
Wang, Lynn T.-N.; Madhavan, Sriram
2018-03-01
A pattern matching and rule-based polygon clustering methodology with DFM scoring is proposed to detect decomposition-induced manufacturability detractors and fix the layout designs prior to manufacturing. A pattern matcher scans the layout for pre-characterized patterns from a library. If a pattern is detected, rule-based clustering identifies the neighboring polygons that interact with those captured by the pattern. Then, DFM scores are computed for the possible layout fixes, and the fix with the best score is applied. The proposed methodology was applied to two 20nm products, with a chip area of 11 mm2, on the metal 2 layer. All the hotspots were resolved, and the number of DFM spacing violations decreased by 7-15%.
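The "detect, score, replace" flow described above can be sketched with a small lookup structure: each known pattern maps to candidate fixes with pre-computed DFM scores, and the best-scoring fix is applied where a match is found. The pattern names, scores, and locations are hypothetical placeholders, not the foundry library in the paper.

```python
# Hypothetical library mapping difficult cut patterns to scored fixing solutions.
FIX_LIBRARY = {
    "cut_pair_minpitch": [("shift_cut_right", 0.92), ("merge_cuts", 0.88)],
    "isolated_small_cut": [("upsize_cut", 0.95)],
}

def fix_layout(detected_patterns):
    """detected_patterns: list of (pattern_name, location) tuples from the matcher."""
    edits = []
    for name, location in detected_patterns:
        candidates = FIX_LIBRARY.get(name, [])
        if candidates:                                   # opportunistic replacement
            best_fix, score = max(candidates, key=lambda c: c[1])
            edits.append((location, best_fix, score))
    return edits

print(fix_layout([("cut_pair_minpitch", (120, 45)), ("unknown_pattern", (3, 9))]))
```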
Effective and Ethical Interviewing of Young Children in Pedagogical Context
ERIC Educational Resources Information Center
Dunphy, Elizabeth
2005-01-01
Ethical and effective interviewing of young children in relation to their learning is a challenging and complex process. This paper describes the use of an experience-based flexible and focused interview methodology in a study based on young children's views and understandings of number. It shows how the approach used builds on previous work in…
ERIC Educational Resources Information Center
Calder Stegemann, Kim; Grünke, Matthias
2014-01-01
Number sense is critical to the development of higher order mathematic abilities. However, some children have difficulty acquiring these fundamental skills and the knowledge base of effective interventions/remediation is relatively limited. Based on emerging neuro-scientific research which has identified the association between finger…
Building Work-Based Learning into the School Curriculum
ERIC Educational Resources Information Center
Asher, Jenny
2005-01-01
Purpose - The purpose of this article is to examine the increasing number of opportunities for pre-16 young people at schools in England to become involved in work related and work based programmes and to look at the key drivers of change and their impact. Design/methodology/approach - The approach is descriptive, covering current trends and also…
Data Manipulation in an XML-Based Digital Image Library
ERIC Educational Resources Information Center
Chang, Naicheng
2005-01-01
Purpose: To help to clarify the role of XML tools and standards in supporting transition and migration towards a fully XML-based environment for managing access to information. Design/methodology/approach: The Ching Digital Image Library, built on a three-tier architecture, is used as a source of examples to illustrate a number of methods of data…
42 CFR 412.424 - Methodology for calculating the Federal per diem payment amount.
Code of Federal Regulations, 2014 CFR
2014-10-01
... facilities located in a rural area as defined in § 412.402. (iii) Teaching adjustment. CMS adjusts the Federal per diem base rate by a factor to account for indirect teaching costs. (A) An inpatient psychiatric facility's teaching adjustment is based on the ratio of the number of full-time equivalent...
42 CFR 412.424 - Methodology for calculating the Federal per diem payment amount.
Code of Federal Regulations, 2013 CFR
2013-10-01
... facilities located in a rural area as defined in § 412.402. (iii) Teaching adjustment. CMS adjusts the Federal per diem base rate by a factor to account for indirect teaching costs. (A) An inpatient psychiatric facility's teaching adjustment is based on the ratio of the number of full-time equivalent...
42 CFR 412.424 - Methodology for calculating the Federal per diem payment amount.
Code of Federal Regulations, 2011 CFR
2011-10-01
... facilities located in a rural area as defined in § 412.402. (iii) Teaching adjustment. CMS adjusts the Federal per diem base rate by a factor to account for indirect teaching costs. (A) An inpatient psychiatric facility's teaching adjustment is based on the ratio of the number of full-time equivalent...
42 CFR 412.424 - Methodology for calculating the Federal per diem payment amount.
Code of Federal Regulations, 2010 CFR
2010-10-01
... facilities located in a rural area as defined in § 412.402. (iii) Teaching adjustment. CMS adjusts the Federal per diem base rate by a factor to account for indirect teaching costs. (A) An inpatient psychiatric facility's teaching adjustment is based on the ratio of the number of full-time equivalent...
42 CFR 412.424 - Methodology for calculating the Federal per diem payment amount.
Code of Federal Regulations, 2012 CFR
2012-10-01
... facilities located in a rural area as defined in § 412.402. (iii) Teaching adjustment. CMS adjusts the Federal per diem base rate by a factor to account for indirect teaching costs. (A) An inpatient psychiatric facility's teaching adjustment is based on the ratio of the number of full-time equivalent...
Manfredi, Simone; Cristobal, Jorge
2016-09-01
In response to the latest policy needs, the work presented in this article aims at developing a life-cycle-based framework methodology to quantitatively evaluate the environmental and economic sustainability of European food waste management options. The methodology is structured into six steps aimed at defining the boundaries and scope of the evaluation, evaluating environmental and economic impacts, and identifying the best performing options. The methodology is able to accommodate additional assessment criteria, for example the social dimension of sustainability, thus moving towards a comprehensive sustainability assessment framework. A numerical case study is also developed to provide an example of application of the proposed methodology to an average European context. Different options for food waste treatment are compared, including landfilling, composting, anaerobic digestion and incineration. The environmental dimension is evaluated with the software EASETECH, while the economic assessment is conducted using different indicators expressing the costs associated with food waste management. Results show that the proposed methodology allows a straightforward identification of the most sustainable options for food waste and thus can provide factual support to decision- and policy-making. However, it was also observed that the results depend markedly on a number of user-defined assumptions, for example the choice of the indicators used to express environmental and economic performance. © The Author(s) 2016.
NASA Astrophysics Data System (ADS)
Fraser, R.; Coulaud, M.; Aeschlimann, V.; Lemay, J.; Deschenes, C.
2016-11-01
With the growing share of intermittent energy sources such as wind and solar, hydroelectricity has become a first-class source of peaking energy for regulating the grid. The resulting increase in start-stop cycles may cause premature ageing of runners, both through a higher number of stress-fluctuation cycles and through higher absolute stress levels. Aiming to sustain high-quality development on fully homologous scale-model turbines, the Hydraulic Machines Laboratory (LAMH) of Laval University has developed a methodology to operate model-size turbines in transient regimes such as start-up, stop, or load rejection on its test stand. This methodology maintains a constant head while the wicket gates open or close at a model-scale speed representative of what is done on the prototype. This paper first presents the model opening speed derived from dimensionless numbers, then the methodology itself and its application. Finally, its limitations and the first results obtained with a bulb turbine are detailed.
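One common way to transpose a prototype opening time to model scale is to preserve a dimensionless opening duration; a minimal sketch is given below, assuming the similitude choice t*sqrt(g*H)/D is held constant. This particular choice and all numerical values are assumptions for illustration, not the LAMH procedure itself.

```python
import math

# Transpose a prototype wicket-gate opening time to model scale by keeping the
# dimensionless duration t * sqrt(g * H) / D identical on model and prototype.
G = 9.81
D_proto, H_proto, t_open_proto = 6.0, 45.0, 12.0    # m, m, s (assumed prototype values)
D_model, H_model = 0.30, 10.0                       # m, m (assumed model and test head)

t_open_model = t_open_proto * (D_model / D_proto) * math.sqrt(H_proto / H_model)
print(f"equivalent model opening time ~ {t_open_model:.2f} s")
```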
NASA Astrophysics Data System (ADS)
Bando, Shigeru; Watanabe, Hiroki; Asano, Hiroshi; Tsujita, Shinsuke
A methodology was developed to design the number and capacity of each piece of equipment (e.g., gas engines, batteries, thermal storage tanks) in microgrids with combined heat and power systems. We analyzed three types of microgrids: the first consists of an office building and an apartment, the second of a hospital and an apartment, and the third of a hotel, offices, and retail stores. In the methodology, annual cost is minimized by considering the partial-load efficiency of a gas engine and its economy of scale, and the optimal number and capacity of each piece of equipment and the annual operational schedule are determined using an optimal planning method. Based on calculations using this design methodology, it is found that the optimal number of gas engines is determined by the ratio of bottom to peak electricity demand and the ratio of heat to electricity demand. The optimal battery capacity, required to supply electricity only for a limited time during peak demand periods, plays an auxiliary role. The thermal storage tanks for space cooling and space heating are sized to minimize the use of auxiliary equipment such as a gas absorption chiller.
Model-Based Economic Evaluation of Treatments for Depression: A Systematic Literature Review.
Kolovos, Spyros; Bosmans, Judith E; Riper, Heleen; Chevreul, Karine; Coupé, Veerle M H; van Tulder, Maurits W
2017-09-01
An increasing number of model-based studies that evaluate the cost effectiveness of treatments for depression are being published. These studies have different characteristics and use different simulation methods. We aimed to systematically review model-based studies evaluating the cost effectiveness of treatments for depression and examine which modelling technique is most appropriate for simulating the natural course of depression. The literature search was conducted in the databases PubMed, EMBASE and PsycInfo between 1 January 2002 and 1 October 2016. Studies were eligible if they used a health economic model with quality-adjusted life-years or disability-adjusted life-years as an outcome measure. Data related to various methodological characteristics were extracted from the included studies. The available modelling techniques were evaluated based on 11 predefined criteria. This methodological review included 41 model-based studies, of which 21 used decision trees (DTs), 15 used cohort-based state-transition Markov models (CMMs), two used individual-based state-transition models (ISMs), and three used discrete-event simulation (DES) models. Just over half of the studies (54%) evaluated antidepressants compared with a control condition. The data sources, time horizons, cycle lengths, perspectives adopted and number of health states/events all varied widely between the included studies. DTs scored positively in four of the 11 criteria, CMMs in five, ISMs in six, and DES models in seven. There were substantial methodological differences between the studies. Since the individual history of each patient is important for the prognosis of depression, DES and ISM simulation methods may be more appropriate than the others for a pragmatic representation of the course of depression. However, direct comparisons between the available modelling techniques are necessary to yield firm conclusions.
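As a point of reference for the modelling techniques compared above, a cohort-based state-transition (Markov) model can be sketched in a few lines: a cohort vector is advanced through a transition matrix each cycle and state utilities are accumulated into QALYs. The three states, transition probabilities, utilities, and the 3-month cycle below are illustrative assumptions, not values from any reviewed study.

```python
import numpy as np

# Minimal cohort Markov model for depression with three states.
states = ["depressed", "remission", "relapse"]
P = np.array([[0.60, 0.35, 0.05],      # rows: from-state, columns: to-state
              [0.10, 0.80, 0.10],
              [0.50, 0.30, 0.20]])
utility = np.array([0.57, 0.81, 0.57])  # QALY weight per state (assumed)
cycle_years = 0.25
cohort = np.array([1.0, 0.0, 0.0])      # the whole cohort starts depressed

total_qalys = 0.0
for _ in range(8):                      # 2-year horizon = 8 quarterly cycles
    total_qalys += float(cohort @ utility) * cycle_years
    cohort = cohort @ P                 # advance the cohort one cycle
print(f"expected QALYs per patient over 2 years: {total_qalys:.2f}")
```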
A novel methodology for building robust design rules by using design based metrology (DBM)
NASA Astrophysics Data System (ADS)
Lee, Myeongdong; Choi, Seiryung; Choi, Jinwoo; Kim, Jeahyun; Sung, Hyunju; Yeo, Hyunyoung; Shim, Myoungseob; Jin, Gyoyoung; Chung, Eunseung; Roh, Yonghan
2013-03-01
This paper addresses a methodology for building robust design rules by using design based metrology (DBM). The conventional method for building design rules has relied on a simulation tool and a simple pattern spider mask. At the early stage of a device, the estimates from the simulation tool are poor, and the evaluation of the simple pattern spider mask is rather subjective because it depends on the experiential judgment of an engineer. In this work, we designed a large number of pattern situations including various 1D and 2D design structures. To overcome the difficulty of inspecting many types of patterns, we introduced the Design Based Metrology (DBM) of Nano Geometry Research, Inc., with which this large set of patterns could be inspected quickly. We also carried out quantitative analysis on PWQ silicon data to estimate process variability. Our methodology demonstrates high speed and accuracy for building design rules: all of the test patterns were inspected within a few hours, and the resulting silicon data were handled not by personal judgment but by statistical processing. From the results, robust design rules are successfully verified and extracted. Finally, we found that our methodology is appropriate for building robust design rules.
Accurate Energy Transaction Allocation using Path Integration and Interpolation
NASA Astrophysics Data System (ADS)
Bhide, Mandar Mohan
This thesis investigates many of the popular cost-allocation methods that are based on actual usage of the transmission network. The Energy Transaction Allocation (ETA) method originally proposed by A. Fradi, S. Brigonne and B. Wollenberg, which offers the unique advantage of accurately allocating transmission network usage, is then discussed. A modified calculation of ETA based on a simple interpolation technique is proposed. The proposed methodology not only increases the accuracy of the calculation but also reduces the number of calculations to less than half of that required by the original ETA.
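The interpolation idea can be illustrated with a small sketch: a transaction's share of a line flow is a path integral of the flow's sensitivity to that transaction as schedules ramp from zero to full, and the sensitivities are evaluated at a few points and interpolated rather than at every ramp step. The quadratic flow surrogate and the specific numbers below are hypothetical stand-ins for a full power-flow solution, not the thesis implementation.

```python
import numpy as np

# Path-integral usage allocation for two simultaneously ramped transactions,
# with sensitivities interpolated from a few evaluation points.
def F(p1, p2):
    """Hypothetical nonlinear line-flow surrogate (MW) for transaction levels p1, p2."""
    return 30.0 * p1 + 20.0 * p2 + 8.0 * p1 * p2

def sensitivity(fn, p1, p2, which, h=1e-4):
    if which == 1:
        return (fn(p1 + h, p2) - fn(p1 - h, p2)) / (2 * h)
    return (fn(p1, p2 + h) - fn(p1, p2 - h)) / (2 * h)

t_coarse = np.linspace(0.0, 1.0, 5)       # few "power-flow" evaluations along the ramp
t_fine = np.linspace(0.0, 1.0, 201)
shares = []
for which in (1, 2):
    s_coarse = [sensitivity(F, t, t, which) for t in t_coarse]
    s_fine = np.interp(t_fine, t_coarse, s_coarse)   # interpolated sensitivities
    shares.append(np.trapz(s_fine, t_fine))          # integrate dF/dp_k along the ramp
print(shares, sum(shares), F(1.0, 1.0))              # shares sum to the total flow
```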
Conceptualizing Effectiveness in Disability Research
ERIC Educational Resources Information Center
de Bruin, Catriona L.
2017-01-01
Policies promoting evidence-based practice in education typically endorse evaluations of the effectiveness of teaching strategies through specific experimental research designs and methods. A number of researchers have critiqued this approach to evaluation as narrow and called for greater methodological sophistication. This paper discusses the…
DNA-based random number generation in security circuitry.
Gearheart, Christy M; Arazi, Benjamin; Rouchka, Eric C
2010-06-01
DNA-based circuit design is an area of research in which traditional silicon-based technologies are replaced by naturally occurring phenomena taken from biochemistry and molecular biology. This research focuses on further developing DNA-based methodologies to mimic digital data manipulation. While exhibiting fundamental principles, this work was done in conjunction with the vision that DNA-based circuitry, when the technology matures, will form the basis for a tamper-proof security module, revolutionizing the meaning and concept of tamper-proofing and possibly preventing it altogether based on accurate scientific observations. A paramount part of such a solution would be self-generation of random numbers. A novel prototype schema employs solid phase synthesis of oligonucleotides for random construction of DNA sequences; temporary storage and retrieval is achieved through plasmid vectors. A discussion of how to evaluate sequence randomness is included, as well as how these techniques are applied to a simulation of the random number generation circuitry. Simulation results show generated sequences successfully pass three selected NIST random number generation tests specified for security applications.
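One of the simpler checks in the NIST SP 800-22 suite referenced above, the monobit frequency test, can be sketched directly; the A/C to 0 and G/T to 1 mapping and the short sequence below are illustrative choices, not the paper's encoding.

```python
import math

# NIST SP 800-22 monobit frequency test applied to a DNA-derived bit string.
sequence = "ATGCGTACGTTAGCCGATCGATCGGCTAAGCTTACGATCGTAGCTAGGCT"
bits = [0 if base in "AC" else 1 for base in sequence]   # assumed base-to-bit mapping

n = len(bits)
s = abs(sum(1 if b else -1 for b in bits)) / math.sqrt(n)
p_value = math.erfc(s / math.sqrt(2))
print(f"monobit p-value = {p_value:.3f} ({'pass' if p_value >= 0.01 else 'fail'})")
```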
Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos
2016-01-01
This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP), based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model, taking as a reference the measurements of a ball bar gauge located in a fixed position in the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and variety of reference distances can be created without the need for a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for the verification of AACMMs using the indexed metrology platform.
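The virtual-distance idea can be sketched as mapping a physically measured point through the platform's homogeneous transforms into the other indexed positions, so that reference distances exist between points that were never materialised. The pure rotations about one axis and the coordinates below are illustrative, not the calibrated IMP model.

```python
import numpy as np

# Create virtual reference points by transforming one measured ball-bar point
# into the platform's six indexed positions.
def rot_z(deg):
    a = np.radians(deg)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    return T

p_measured = np.array([250.0, 40.0, 120.0, 1.0])     # mm, homogeneous, position-0 frame (assumed)

virtual_points = [rot_z(60 * k) @ p_measured for k in range(6)]   # six indexed positions
# A virtual reference distance between the images of the same point in two positions:
d_virtual = np.linalg.norm(virtual_points[0][:3] - virtual_points[3][:3])
print(f"virtual reference distance: {d_virtual:.3f} mm")
```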
Navier-Stokes simulations of slender axisymmetric shapes in supersonic, turbulent flow
NASA Astrophysics Data System (ADS)
Moran, Kenneth J.; Beran, Philip S.
1994-07-01
Computational fluid dynamics is used to study flows about slender, axisymmetric bodies at very high speeds. Numerical experiments are conducted to simulate a broad range of flight conditions: Mach number is varied from 1.5 to 8 and Reynolds number from 1 x 10^6/m to 1 x 10^8/m. The primary objective is to develop and validate a computational methodology for the accurate simulation of a wide variety of flow structures. Accurate results are obtained for detached bow shocks, recompression shocks, corner-point expansions, base-flow recirculations, and turbulent boundary layers. Accuracy is assessed through comparison with theory and experimental data; computed surface pressure, shock structure, base-flow structure, and velocity profiles are within measurement accuracy throughout the range of conditions tested. The methodology is both practical and general: general in its applicability, and practical in its performance. To achieve high accuracy, modifications to previously reported techniques are implemented in the scheme. These modifications improve computed results in the vicinity of symmetry lines and in the base flow region, including the turbulent wake.
Internet-based mental health interventions.
Ybarra, Michele L; Eaton, William W
2005-06-01
Following recent reviews of community- and practice-based mental health interventions, an assessment of Internet-based interventions is provided. Although relatively new, many Internet mental health interventions have reported early results that are promising. Both therapist-led as well as self-directed online therapies indicate significant alleviation of disorder-related symptomatology. The number of studies addressing child disorders lags behind those of adults. More research is needed to address methodological issues of Internet-based treatments.
A methodology for the rigorous verification of plasma simulation codes
NASA Astrophysics Data System (ADS)
Riva, Fabio
2016-10-01
The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: verification, which is a mathematical exercise targeted at assessing that the physical model is correctly solved, and validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on verification, which in turn is composed of code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
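The solution-verification step based on Richardson extrapolation can be sketched with the standard observed-order and grid-convergence-index formulas applied to a scalar output on three systematically refined grids. The three values and the refinement ratio below are illustrative, not GBS results.

```python
import math

# Richardson extrapolation sketch: observed order of accuracy, extrapolated
# solution, and a grid convergence index from three grid levels.
f_coarse, f_medium, f_fine = 1.0480, 1.0132, 1.0034   # a scalar output on 3 grids (assumed)
r = 2.0                                               # constant grid refinement ratio

p_observed = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
f_extrapolated = f_fine + (f_fine - f_medium) / (r**p_observed - 1.0)
gci_fine = 1.25 * abs((f_fine - f_medium) / f_fine) / (r**p_observed - 1.0)

print(f"observed order ~ {p_observed:.2f}, extrapolated value ~ {f_extrapolated:.4f}")
print(f"grid convergence index (fine) ~ {100 * gci_fine:.2f}%")
```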
NASA Astrophysics Data System (ADS)
Dionne, J. P.; Levine, J.; Makris, A.
2018-01-01
To design the next generation of blast mitigation helmets that offer increasing levels of protection against explosive devices, manufacturers must be able to rely on appropriate test methodologies and human surrogates that will differentiate the performance level of various helmet solutions and ensure user safety. Ideally, such test methodologies and associated injury thresholds should be based on widely accepted injury criteria relevant within the context of blast. Unfortunately, even though significant research has taken place over the last decade in the area of blast neurotrauma, there currently exists no agreement in terms of injury mechanisms for blast-induced traumatic brain injury. In absence of such widely accepted test methods and injury criteria, the current study presents a specific blast test methodology focusing on explosive ordnance disposal protective equipment, involving the readily available Hybrid III mannequin, initially developed for the automotive industry. The unlikely applicability of the associated brain injury criteria (based on both linear and rotational head acceleration) is discussed in the context of blast. Test results encompassing a large number of blast configurations and personal protective equipment are presented, emphasizing the possibility to develop useful correlations between blast parameters, such as the scaled distance, and mannequin engineering measurements (head acceleration). Suggestions are put forward for a practical standardized blast testing methodology taking into account limitations in the applicability of acceleration-based injury criteria as well as the inherent variability in blast testing results.
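The kind of correlation mentioned above is usually organised around the Hopkinson-Cranz scaled distance, which collapses charge mass and standoff onto a single axis against which a mannequin response can be regressed. The sketch below only computes that bookkeeping; the charge masses, standoffs, and accelerations are illustrative placeholders, not the study's test matrix.

```python
# Scaled-distance bookkeeping: Z = R / W^(1/3), paired with a mannequin response.
tests = [
    {"W_kg": 0.5, "R_m": 1.0, "peak_head_accel_g": 310.0},   # illustrative test points
    {"W_kg": 1.0, "R_m": 1.5, "peak_head_accel_g": 240.0},
    {"W_kg": 2.0, "R_m": 2.5, "peak_head_accel_g": 150.0},
]
for t in tests:
    t["Z"] = t["R_m"] / t["W_kg"] ** (1.0 / 3.0)             # m/kg^(1/3)
    print(f'Z = {t["Z"]:.2f} m/kg^(1/3) -> {t["peak_head_accel_g"]:.0f} g')
```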
Reliability modelling and analysis of thermal MEMS
NASA Astrophysics Data System (ADS)
Muratet, Sylvaine; Lavu, Srikanth; Fourniols, Jean-Yves; Bell, George; Desmulliez, Marc P. Y.
2006-04-01
This paper presents a MEMS reliability study methodology based on the novel concept of 'virtual prototyping'. This methodology can be used for the development of reliable sensors or actuators and also to characterize their behaviour under specific use conditions and applications. The methodology is demonstrated on the U-shaped micro electro-thermal actuator used as a test vehicle. To demonstrate this approach, a 'virtual prototype' has been developed with the modeling tools MatLab and VHDL-AMS. A best-practice FMEA (failure mode and effects analysis) is applied to the thermal MEMS to investigate and assess the failure mechanisms. The reliability study is performed by injecting the identified faults into the 'virtual prototype'. The reliability characterization methodology predicts the evolution of the behavior of these MEMS as a function of the number of operating cycles and the specific operational conditions.
On multi-site damage identification using single-site training data
NASA Astrophysics Data System (ADS)
Barthorpe, R. J.; Manson, G.; Worden, K.
2017-11-01
This paper proposes a methodology for developing multi-site damage location systems for engineering structures that can be trained using single-site damaged-state data only. The methodology involves training a sequence of binary classifiers based upon single-site damage data and combining the developed classifiers into a robust multi-class damage locator. In this way, the multi-site damage identification problem may be decomposed into a sequence of binary decisions. In this paper Support Vector Classifiers are adopted as the means of making these binary decisions. The proposed methodology represents an advance on the state of the art in multi-site damage identification, where existing approaches require either (1) full damaged-state data from single- and multi-site damage cases or (2) the development of a physics-based model to make multi-site model predictions. The potential benefit of the proposed methodology is that a significantly reduced number of recorded damage states may be required in order to train a multi-site damage locator without recourse to physics-based model predictions. In this paper it is first demonstrated that Support Vector Classification represents an appropriate approach to the multi-site damage location problem, with methods for combining binary classifiers discussed. Next, the proposed methodology is demonstrated and evaluated through application to a real engineering structure - a Piper Tomahawk trainer aircraft wing - with its performance compared to classifiers trained using the full damaged-state dataset.
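The combination of binary classifiers can be sketched with one support vector classifier per damage site (site k versus undamaged), with the predicted site taken as the classifier returning the largest decision value. The random features, the three-site setup, and the decision rule below are illustrative assumptions, not the paper's wing dataset or its specific combination scheme.

```python
import numpy as np
from sklearn.svm import SVC

# One binary SVC per damage site, trained on single-site data only, combined
# into a multi-site locator by taking the largest decision value.
rng = np.random.default_rng(3)
n_sites, n_features = 3, 8
X_undamaged = rng.normal(0.0, 1.0, (60, n_features))
single_site_sets = [rng.normal(1.5 * (k + 1), 1.0, (60, n_features)) for k in range(n_sites)]

classifiers = []
for k in range(n_sites):
    X = np.vstack([X_undamaged, single_site_sets[k]])
    y = np.hstack([np.zeros(60), np.ones(60)])
    classifiers.append(SVC(kernel="rbf", gamma="scale").fit(X, y))

def locate(sample):
    scores = [clf.decision_function(sample.reshape(1, -1))[0] for clf in classifiers]
    return int(np.argmax(scores)) if max(scores) > 0 else None   # None = undamaged

print(locate(single_site_sets[1][0]), locate(X_undamaged[0]))
```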
Design Optimization of Gas Generator Hybrid Propulsion Boosters
NASA Technical Reports Server (NTRS)
Weldon, Vincent; Phillips, Dwight; Fink, Larry
1990-01-01
A methodology used in support of a study for NASA/MSFC to optimize the design of a gas generator hybrid propulsion booster for uprating the National Space Transportation System (NSTS) is presented. The objective was to compare alternative configurations for this booster approach, optimizing each candidate concept on different bases, in order to develop data for a trade table on which a final decision was based. The methodology is capable of processing a large number of independent and dependent variables, adjusting the overall subsystem characteristics to arrive at a best-compromise integrated design that meets various specific optimization criteria subject to selected constraints. For each system considered, a detailed weight statement was generated along with preliminary cost and reliability estimates.
Recent advances in sortase-catalyzed ligation methodology.
Antos, John M; Truttmann, Matthias C; Ploegh, Hidde L
2016-06-01
The transpeptidation reaction catalyzed by bacterial sortases continues to see increasing use in the construction of novel protein derivatives. In addition to growth in the number of applications that rely on sortase, this field has also seen methodology improvements that enhance reaction performance and scope. In this opinion, we present an overview of key developments in the practice and implementation of sortase-based strategies, including applications relevant to structural biology. Topics include the use of engineered sortases to increase reaction rates, the use of redesigned acyl donors and acceptors to mitigate reaction reversibility, and strategies for expanding the range of substrates that are compatible with a sortase-based approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
Fraher, Erin P; Knapton, Andy; Holmes, George M
2017-02-01
To outline a methodology for allocating graduate medical education (GME) training positions based on data from a workforce projection model. Demand for visits is derived from the Medical Expenditure Panel Survey and Census data. Physician supply, retirements, and geographic mobility are estimated using concatenated AMA Masterfiles and ABMS certification data. The number and specialization behaviors of residents are derived from the AAMC's GMETrack survey. We show how the methodology could be used to allocate 3,000 new GME slots over 5 years (15,000 total positions) by state and specialty to address workforce shortages in 2026. We use the model to identify shortages for 19 types of health care services provided by 35 specialties in 50 states. The new GME slots are allocated to nearly all specialties, but nine states and the District of Columbia do not receive any new positions. This analysis illustrates an objective, evidence-based methodology for allocating GME positions that could be used as the starting point for discussions about GME expansion or redistribution. © Health Research and Educational Trust.
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 1
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. This is Volume 1, an Executive Summary. Volume 2 contains Appendices A (Aerothermal Comparisons) and B (Flight-Derived h_i/h_u vs. M_inf Plots), and Volume 3 contains Appendix C (Comparison of Interference Factors among OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h_i/h_u Tables).
NASA Technical Reports Server (NTRS)
Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan; Streett, Craig L.; Carpenter, Mark H.
2011-01-01
A combination of parabolized stability equations and secondary instability theory has been applied to a low-speed swept airfoil model with a chord Reynolds number of 7.15 million, with the goals of (i) evaluating this methodology in the context of transition prediction for a known configuration for which roughness-based crossflow transition control has been demonstrated under flight conditions and (ii) analyzing the mechanism of transition delay via the introduction of discrete roughness elements (DRE). Roughness-based transition control involves controlled seeding of suitable, subdominant crossflow modes, so as to weaken the growth of naturally occurring, linearly more unstable crossflow modes. Therefore, a synthesis of receptivity, linear and nonlinear growth of stationary crossflow disturbances, and the ensuing development of high frequency secondary instabilities is desirable to understand the experimentally observed transition behavior. With further validation, such a higher fidelity prediction methodology could be utilized to assess the potential for crossflow transition control at even higher Reynolds numbers, where experimental data is currently unavailable.
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 3
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. Volume 2 contains Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h_i/h_u vs. M_inf Plots). This is Volume 3, containing Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h_i/h_u Tables).
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 2
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel data base and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing data base, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. This is Volume 2, containing Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h_i/h_u vs. M_inf Plots). Volume 3 contains Appendix C (Comparison of Interference Factors between OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h_i/h_u Tables).
PseudoBase: a database with RNA pseudoknots.
van Batenburg, F H; Gultyaev, A P; Pleij, C W; Ng, J; Oliehoek, J
2000-01-01
PseudoBase is a database containing structural, functional and sequence data related to RNA pseudoknots. It can be reached at http://wwwbio.LeidenUniv.nl/~Batenburg/PKB.html. This page directs the user to a retrieval page from which a particular pseudoknot can be chosen, to a submission page that enables the user to add pseudoknot information to the database, or to an informative page that elaborates on the various aspects of the database. For each pseudoknot, 12 items are stored, e.g. the nucleotides of the region that contains the pseudoknot, the stem positions of the pseudoknot, the EMBL accession number of the sequence that contains this pseudoknot and the support that can be given regarding the reliability of the pseudoknot. Access is via a small number of steps, using 16 different categories. The development process followed the evolutionary methodology for software development rather than the classical waterfall model or the more modern spiral model.
Bouça-Machado, Raquel; Rosário, Madalena; Alarcão, Joana; Correia-Guedes, Leonor; Abreu, Daisy; Ferreira, Joaquim J
2017-01-25
Over the past decades there has been a significant increase in the number of published clinical trials in palliative care. However, empirical evidence suggests that there are methodological problems in the design and conduct of studies, which raises questions about the validity and generalisability of the results and of the strength of the available evidence. We sought to evaluate the methodological characteristics and assess the quality of reporting of clinical trials in palliative care. We performed a systematic review of published clinical trials assessing therapeutic interventions in palliative care. Trials were identified using MEDLINE (from its inception to February 2015). We assessed methodological characteristics and describe the quality of reporting using the Cochrane Risk of Bias tool. We retrieved 107 studies. The most common medical field studied was oncology, and 43.9% of trials evaluated pharmacological interventions. Symptom control and physical dimensions (e.g. intervention on pain, breathlessness, nausea) were the palliative care-specific issues most studied. We found under-reporting of key information in particular on random sequence generation, allocation concealment, and blinding. While the number of clinical trials in palliative care has increased over time, methodological quality remains suboptimal. This compromises the quality of studies. Therefore, a greater effort is needed to enable the appropriate performance of future studies and increase the robustness of evidence-based medicine in this important field.
NASA Astrophysics Data System (ADS)
Tai, Wei; Abbasi, Mortez; Ricketts, David S.
2018-01-01
We present the analysis and design of high-power millimetre-wave power amplifier (PA) systems using zero-degree combiners (ZDCs). The methodology presented optimises the PA device sizing and the number of combined unit PAs based on device load pull simulations, driver power consumption analysis and loss analysis of the ZDC. Our analysis shows that an optimal number of N-way combined unit PAs leads to the highest power-added efficiency (PAE) for a given output power. To illustrate our design methodology, we designed a 1-W PA system at 45 GHz using a 45 nm silicon-on-insulator process and showed that an 8-way combined PA has the highest PAE that yields simulated output power of 30.6 dBm and 31% peak PAE.
Nguyen, Hien D; Ullmann, Jeremy F P; McLachlan, Geoffrey J; Voleti, Venkatakaushik; Li, Wenze; Hillman, Elizabeth M C; Reutens, David C; Janke, Andrew L
2018-02-01
Calcium is a ubiquitous messenger in neural signaling events. An increasing number of techniques are enabling visualization of neurological activity in animal models via luminescent proteins that bind to calcium ions. These techniques generate large volumes of spatially correlated time series. A model-based functional data analysis methodology via Gaussian mixtures is proposed for clustering data from such visualizations. The methodology is theoretically justified and a computationally efficient approach to estimation is suggested. An example analysis of a zebrafish imaging experiment is presented.
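A minimal sketch of the general idea, clustering recorded traces via a Gaussian mixture fitted to low-dimensional summaries, is shown below using scikit-learn; it is not the authors' mixture formulation or estimation scheme, and the synthetic traces, the PCA summary step and the number of components are assumptions.

```python
# Minimal sketch: cluster synthetic calcium-like time series with a Gaussian
# mixture fitted to a low-dimensional summary of each trace. Illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 200)

def trace(freq):
    return np.sin(2*np.pi*freq*t) + 0.3*rng.normal(size=t.size)

# Three synthetic "response types" (assumed), 100 traces each.
X = np.vstack([trace(f) for f in (0.2,)*100 + (0.5,)*100 + (1.0,)*100])

features = PCA(n_components=5).fit_transform(X)      # functional summary (assumption)
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
labels = gmm.fit_predict(features)
print("cluster sizes:", np.bincount(labels))
```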
K-Means Subject Matter Expert Refined Topic Model Methodology
2017-01-01
K-means Subject Matter Expert Refined Topic Model Methodology: Topic Model Estimation via K-means. U.S. Army TRADOC Analysis Center-Monterey, 700 Dyer Road... January 2017. Theodore T. Allen, Ph.D.; Zhenhuan... Contract number W9124N-15-P-0022.
NASA Astrophysics Data System (ADS)
Luo, Keqin
1999-11-01
The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which costs the industry tremendously in waste treatment and disposal and hinders its further development. It has therefore become urgent for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste while maintaining production competitiveness. This dissertation aims at developing a novel waste minimization (WM) methodology using artificial intelligence, fuzzy logic, and fundamental knowledge in chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules; these form the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating, and allow a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it enables the electroplating industry, for the first time, to (i) use the available WM strategies systematically, (ii) know quantitatively and accurately what is going on in each tank, and (iii) identify all WM opportunities through process improvement. This work has formed a solid foundation for the further development of powerful WM technologies for comprehensive waste minimization in the following decade.
Ethical and methodological issues in research with Sami experiencing disability.
Melbøe, Line; Hansen, Ketil Lenert; Johnsen, Bjørn-Eirik; Fedreheim, Gunn Elin; Dinesen, Tone; Minde, Gunn-Tove; Rustad, Marit
2016-01-01
A study of disability among the indigenous Sami people in Norway presented a number of ethical and methodological challenges rarely addressed in the literature. The main study was designed to examine and understand the everyday life, transitions between life stages and democratic participation of Norwegian Sami people experiencing disability. Hence, the purpose of this article is to increase the understanding of possible ethical and methodological issues in research within this field. The article describes and discusses ethical and methodological issues that arose when conducting our study and identifies some strategies for addressing issues like these. The ethical and methodological issues addressed in the article are based on a qualitative study among indigenous Norwegian Sami people experiencing disability. The data in this study were collected through 31 semi-structured in-depth interviews with altogether 24 Sami people experiencing disability and 13 next of kin of Sami people experiencing disability (8 mothers, 2 fathers, 2 sisters and 1 guardian). The researchers identified 4 main areas of ethical and methodological issues. We present these issues chronologically as they emerged in the research process: 1) concept of knowledge when designing the study, 2) gaining access, 3) data collection and 4) analysis and accountability. The knowledge generated from this study has the potential to benefit future health research, specifically of Norwegian Sami people experiencing disability, as well as health research concerning indigenous people in general, providing scientific-based insight into important ethical and methodological issues in research with indigenous people experiencing disability.
Sustainability in Brazilian Federal Universities
ERIC Educational Resources Information Center
Palma, Lisiane Celia; de Oliveira, Lessandra M.; Viacava, Keitiline R.
2011-01-01
Purpose: The purpose of this paper is to identify the number of courses related to sustainability offered in bachelor degree programs of business administration in Brazilian federal universities. Design/methodology/approach: An exploratory research was carried out based on a descriptive scope. The process of mapping federal universities in Brazil…
HRD in France: The Corporate Perspective
ERIC Educational Resources Information Center
Weil, Amandine; Woodall, Jean
2005-01-01
Purpose: To explore and describe the roles, activities and strategies of French human resource development professionals. Design/methodology/approach: This paper is based primarily on exploratory and descriptive research. A range of secondary sources on European and French human resource development is critically reviewed to generate a number of…
Statistical power calculations for mixed pharmacokinetic study designs using a population approach.
Kloprogge, Frank; Simpson, Julie A; Day, Nicholas P J; White, Nicholas J; Tarning, Joel
2014-09-01
Simultaneous modelling of dense and sparse pharmacokinetic data is possible with a population approach. To determine the number of individuals required to detect the effect of a covariate, simulation-based power calculation methodologies can be employed. The Monte Carlo Mapped Power method (a simulation-based power calculation methodology using the likelihood ratio test) was extended in the current study to perform sample size calculations for mixed pharmacokinetic studies (i.e. both sparse and dense data collection). A workflow guiding an easy and straightforward pharmacokinetic study design, considering also the cost-effectiveness of alternative study designs, was used in this analysis. Initially, data were simulated for a hypothetical drug and then for the anti-malarial drug, dihydroartemisinin. Two datasets (sampling design A: dense; sampling design B: sparse) were simulated using a pharmacokinetic model that included a binary covariate effect and subsequently re-estimated using (1) the same model and (2) a model not including the covariate effect in NONMEM 7.2. Power calculations were performed for varying numbers of patients with sampling designs A and B. Study designs with statistical power >80% were selected and further evaluated for cost-effectiveness. The simulation studies of the hypothetical drug and the anti-malarial drug dihydroartemisinin demonstrated that the simulation-based power calculation methodology, based on the Monte Carlo Mapped Power method, can be utilised to evaluate and determine the sample size of mixed (part sparsely and part densely sampled) study designs. The developed method can contribute to the design of robust and efficient pharmacokinetic studies.
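The logic of such a simulation-based power calculation can be sketched in a few lines. The example below is a deliberately simplified analogue using an ordinary linear model with a binary covariate and a likelihood ratio test; the effect size, residual variability, number of simulations and sample sizes are arbitrary assumptions, and it does not reproduce the Monte Carlo Mapped Power method or the NONMEM models themselves.

```python
# Simplified Monte Carlo power calculation: simulate data with a binary
# covariate effect, fit models with and without the covariate, and count how
# often the likelihood ratio test detects it. Parameters are illustrative.
import numpy as np
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(2)
effect, sigma, n_sim = 0.5, 1.0, 500
crit = chi2.ppf(0.95, df=1)           # LRT critical value, 1 degree of freedom

def power(n_per_group):
    hits = 0
    for _ in range(n_sim):
        cov = np.repeat([0.0, 1.0], n_per_group)          # binary covariate
        y = 1.0 + effect*cov + rng.normal(scale=sigma, size=cov.size)
        full = sm.OLS(y, sm.add_constant(cov)).fit()
        reduced = sm.OLS(y, np.ones_like(y)).fit()
        if 2*(full.llf - reduced.llf) > crit:
            hits += 1
    return hits / n_sim

for n in (10, 20, 40, 80):
    print(f"n per group = {n:3d}: estimated power = {power(n):.2f}")
```

Plotting estimated power against sample size in this way is what allows the smallest design exceeding the 80% power threshold to be selected before cost-effectiveness is considered.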
NASA Astrophysics Data System (ADS)
Kaskhedikar, Apoorva Prakash
According to the U.S. Energy Information Administration, commercial buildings represent about 40% of the United States' energy consumption, of which office buildings consume a major portion. Gauging the extent to which an individual building consumes energy in excess of its peers is the first step in initiating energy efficiency improvement. Energy benchmarking offers an initial building energy performance assessment without rigorous evaluation. Energy benchmarking tools based on the Commercial Buildings Energy Consumption Survey (CBECS) database are investigated in this thesis. This study proposes a new benchmarking methodology based on decision trees, where a relationship between the energy use intensities (EUI) and building parameters (continuous and categorical) is developed for different building types. This methodology was applied to medium office and school building types contained in the CBECS database. The Random Forest technique was used to find the most influential parameters that impact building energy use intensities. Subsequently, significant correlations were identified between EUIs and CBECS variables. Other than floor area, some of the important variables were number of workers, location, number of PCs and main cooling equipment. The coefficient of variation was used to evaluate the effectiveness of the new model. The customization technique proposed in this thesis was compared with another benchmarking model that is widely used by building owners and designers, namely ENERGY STAR's Portfolio Manager. This tool relies on standard linear regression methods, which are only able to handle continuous variables. The proposed model uses data mining techniques and was found to perform slightly better than the Portfolio Manager. The broader impact of the new benchmarking methodology is that it allows for identifying important categorical variables, and then incorporating them in a local, as opposed to a global, model framework for EUI pertinent to the building type. The ability to identify and rank the important variables is of great importance in practical implementation of benchmarking tools which rely on query-based building and HVAC variable filters specified by the user.
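A condensed sketch of this two-step idea, ranking candidate variables with a random forest and then fitting a shallow decision tree on the retained ones, is given below using scikit-learn. The synthetic building records, the variable names standing in for CBECS fields, and the model settings are all assumptions, not the thesis's actual data or tuning.

```python
# Sketch: rank candidate predictors of EUI with a random forest, then fit a
# shallow decision tree on the most influential ones. Synthetic data only.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)
n = 1000
df = pd.DataFrame({
    "floor_area":  rng.uniform(1e3, 1e5, n),
    "num_workers": rng.integers(5, 500, n),
    "num_pcs":     rng.integers(5, 800, n),
    "climate_zone": rng.integers(1, 6, n),        # categorical, label-encoded
    "cooling_type": rng.integers(0, 4, n),        # categorical, label-encoded
})
# Hypothetical EUI driven mainly by workers, PCs and climate (assumed).
eui = 40 + 0.05*df["num_workers"] + 0.03*df["num_pcs"] \
        + 3*df["climate_zone"] + rng.normal(0, 5, n)

forest = RandomForestRegressor(n_estimators=200, random_state=0).fit(df, eui)
ranked = sorted(zip(forest.feature_importances_, df.columns), reverse=True)
print("variable importance:", [(name, round(imp, 3)) for imp, name in ranked])

top = [name for _, name in ranked[:3]]            # keep the most influential variables
tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(df[top], eui)
print("tree R^2 on training data:", round(tree.score(df[top], eui), 3))
```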
Brophy, Robert H; Kluck, Dylan; Marx, Robert G
2016-05-01
In recent years, the number of articles in The American Journal of Sports Medicine (AJSM) has risen dramatically, with an increasing emphasis on evidence-based medicine in orthopaedics and sports medicine. Despite the increase in the number of articles published in AJSM over the past decade, the methodological quality of articles in 2011-2013 has improved relative to those in 2001-2003 and 1991-1993. Meta-analysis. All articles published in AJSM during 2011-2013 were reviewed and classified by study design. For each article, the use of pertinent methodologies, such as prospective data collection, randomization, control groups, and blinding, was recorded. The frequency of each article type and the use of evidence-based techniques were compared relative to 1991-1993 and 2001-2003 by use of Pearson χ2 testing. The number of research articles published in AJSM more than doubled from 402 in 1991-1993 and 423 in 2001-2003 to 953 in 2011-2013. Case reports decreased from 15.2% to 10.6% to 2.1% of articles published over the study period (P < .001). Cadaveric/human studies and meta-analysis/literature review studies increased from 5.7% to 7.1% to 12.4% (P < .001) and from 0.2% to 0.9% to 2.3% (P = .01), respectively. Randomized, prospective clinical trials increased from 2.7% to 5.9% to 7.4% (P = .007). Fewer studies used retrospective compared with prospective data collection (P < .001). More studies tested an explicit hypothesis (P < .001) and used controls (P < .001), randomization (P < .001), and blinding of those assessing outcomes (P < .001). Multi-investigator trials increased (P < .001), as did the proportion of articles citing a funding source (P < .001). Despite a dramatic increase in the number of published articles, the research published in AJSM shifted toward more prospective, randomized, controlled, and blinded designs during 2011-2013 compared with 2001-2003 and 1991-1993, demonstrating a continued improvement in methodological quality. © 2015 The Author(s).
ERIC Educational Resources Information Center
García, Nuria Alonso; Caplan, Alison
2014-01-01
While there are a number of important critical pedagogies being proposed in the field of foreign language study, more attention should be given to providing concrete examples of how to apply these ideas in the classroom. This article offers a new approach to the textual analysis of literary classics through the keyword-based methodology originally…
Rater methodology for stroboscopy: a systematic review.
Bonilha, Heather Shaw; Focht, Kendrea L; Martin-Harris, Bonnie
2015-01-01
Laryngeal endoscopy with stroboscopy (LES) remains the clinical gold standard for assessing vocal fold function. LES is used to evaluate the efficacy of voice treatments in research studies and clinical practice. LES as a voice treatment outcome tool is only as good as the clinician interpreting the recordings. Research using LES as a treatment outcome measure should be evaluated based on rater methodology and reliability. The purpose of this literature review was to evaluate the rater-related methodology from studies that use stroboscopic findings as voice treatment outcome measures. Systematic literature review. Computerized journal databases were searched for relevant articles using terms: stroboscopy and treatment. Eligible articles were categorized and evaluated for the use of rater-related methodology, reporting of number of raters, types of raters, blinding, and rater reliability. Of the 738 articles reviewed, 80 articles met inclusion criteria. More than one-third of the studies included in the review did not report the number of raters who participated in the study. Eleven studies reported results of rater reliability analysis with only two studies reporting good inter- and intrarater reliability. The comparability and use of results from treatment studies that use LES are limited by a lack of rigor in rater methodology and variable, mostly poor, inter- and intrarater reliability. To improve our ability to evaluate and use the findings from voice treatment studies that use LES features as outcome measures, greater consistency of reporting rater methodology characteristics across studies and improved rater reliability is needed. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
Academic Librarians and Project Management: An International Study
ERIC Educational Resources Information Center
Serrano, Silvia Cobo; Avilés, Rosario Arquero
2016-01-01
Because information and documentation units in libraries have responsibility for an ever-increasing number of projects, this paper aims at analyzing the discipline of project management in library and information science (LIS) from a professional perspective. To that end, the researchers employed quantitative and qualitative methodology based on a…
The Fifty Minute Ethnography: Teaching Theory through Fieldwork
ERIC Educational Resources Information Center
Trnka, Susanna
2017-01-01
Ethnography is becoming an increasingly popular research methodology used across a number of disciplines. Typically, teaching students how to write an ethnography, much less how to undertake "fieldwork" (or the ethnographic research upon which ethnographies are based), is reserved for senior- or MA-level research methods courses. This…
Teachers Implementing Entrepreneurship Education: Classroom Practices
ERIC Educational Resources Information Center
Ruskovaara, Elena; Pihkala, Timo
2013-01-01
Purpose: This study aims to highlight the entrepreneurship education practices teachers use in their work. Another target is to analyze how these practices differ based on a number of background factors. Design/methodology/approach: This article presents a quantitative analysis of 521 teachers and other entrepreneurship education actors. The paper…
An Educational Approach to Computationally Modeling Dynamical Systems
ERIC Educational Resources Information Center
Chodroff, Leah; O'Neal, Tim M.; Long, David A.; Hemkin, Sheryl
2009-01-01
Chemists have used computational science methodologies for a number of decades and their utility continues to be unabated. For this reason we developed an advanced lab in computational chemistry in which students gain understanding of general strengths and weaknesses of computation-based chemistry by working through a specific research problem.…
Conceptual and methodological challenges to integrating SEA and cumulative effects assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gunn, Jill, E-mail: jill.gunn@usask.c; Noble, Bram F.
The constraints to assessing and managing cumulative environmental effects in the context of project-based environmental assessment are well documented, and the potential benefits of a more strategic approach to cumulative effects assessment (CEA) are well argued; however, such benefits have yet to be clearly demonstrated in practice. While it is widely assumed that cumulative effects are best addressed in a strategic context, there has been little investigation as to whether CEA and strategic environmental assessment (SEA) are a 'good fit' - conceptually or methodologically. This paper identifies a number of conceptual and methodological challenges to the integration of CEA and SEA. Based on results of interviews with international experts and practitioners, this paper demonstrates that: definitions and conceptualizations of CEA are typically weak in practice; approaches to effects aggregation vary widely; a systems perspective is lacking in both SEA and CEA; the multifarious nature of SEA complicates CEA; tiering arrangements between SEA and project-based assessment are limited or non-existent; and the relationship of SEA to regional planning remains unclear.
NASA Astrophysics Data System (ADS)
Sakellariou, J. S.; Fassois, S. D.
2006-11-01
A stochastic output error (OE) vibration-based methodology for damage detection and assessment (localization and quantification) in structures under earthquake excitation is introduced. The methodology is intended for assessing the state of a structure following potential damage occurrence by exploiting vibration signal measurements produced by low-level earthquake excitations. It is based upon (a) stochastic OE model identification, (b) statistical hypothesis testing procedures for damage detection, and (c) a geometric method (GM) for damage assessment. The methodology's advantages include the effective use of the non-stationary and limited duration earthquake excitation, the handling of stochastic uncertainties, the tackling of the damage localization and quantification subproblems, the use of "small" size, simple and partial (in both the spatial and frequency bandwidth senses) identified OE-type models, and the use of a minimal number of measured vibration signals. Its feasibility and effectiveness are assessed via Monte Carlo experiments employing a simple simulation model of a 6 storey building. It is demonstrated that damage levels of 5% and 20% reduction in a storey's stiffness characteristics may be properly detected and assessed using noise-corrupted vibration signals.
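To make the flavour of steps (a) and (b) concrete, the sketch below identifies a simple least-squares time-series model on a healthy record and flags damage when the residual variance of a new record is significantly inflated, using an F test. It is a simplified analogue of residual-based detection, not the paper's stochastic OE estimator or geometric method; the simulated system, model order and test level are assumptions.

```python
# Simplified analogue of residual-based damage detection: identify a linear
# time-series model on healthy data, then compare residual variances of new
# records with an F test. The simulated system and "damage" are assumptions.
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(4)

def simulate(stiffness_scale=1.0, n=2000):
    # Toy 2nd-order system excited by noise; "damage" scales the coefficients.
    a1, a2 = -1.6*stiffness_scale, 0.81*stiffness_scale
    e = rng.normal(size=n)
    y = np.zeros(n)
    for k in range(2, n):
        y[k] = -a1*y[k-1] - a2*y[k-2] + e[k]
    return y

def ar_residuals(y, order=4, theta=None):
    # Least-squares AR model; returns residuals and the coefficients used.
    Y = y[order:]
    Phi = np.column_stack([y[order-i:-i] for i in range(1, order+1)])
    if theta is None:
        theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    return Y - Phi @ theta, theta

res_h, theta = ar_residuals(simulate(1.0))            # healthy reference model
for scale, label in [(1.0, "healthy"), (0.95, "5% stiffness loss")]:
    res_t, _ = ar_residuals(simulate(scale), theta=theta)
    F = np.var(res_t, ddof=1) / np.var(res_h, ddof=1)
    p = 1 - f_dist.cdf(F, len(res_t)-1, len(res_h)-1)
    print(f"{label:18s} F = {F:5.2f}, p = {p:.3f}")
```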
Morón-Castañeda, L H; Useche-Bernal, A; Morales-Reyes, O L; Mojica-Figueroa, I L; Palacios-Carlos, A; Ardila-Gómez, C E; Parra-Ardila, M V; Martínez-Nieto, O; Sarmiento-Echeverri, N; Rodríguez, C A; Alvarado-Heine, C; Isaza-Ruget, M A
2015-01-01
The application of the Lean methodology in health institutions is an effective tool to improve capacity and workflow, as well as to increase the level of satisfaction of patients and employees. To optimise the time of outpatient care in a clinical laboratory by implementing a methodology based on the organisation of operational procedures, in order to improve user satisfaction and reduce the number of complaints about delays in care. A quasi-experimental before-and-after study was conducted between October 2011 and September 2012. X-bar and S charts were used to observe the mean service times and standard deviation. User satisfaction was assessed using service questionnaires. A reduction of 17 minutes was observed in the time of patient care from arrival to leaving the laboratory, and a decrease of 60% in complaints about delays in care. Despite high staff turnover and a 38% increase in the number of patients seen, a culture of empowerment and continuous improvement was acquired, as well as greater efficiency and productivity in the care process, which was reflected in standards being maintained 12 months after implementation. Lean is a viable methodology for clinical laboratory procedures, improving their efficiency and effectiveness. Copyright © 2015 SECA. Published by Elsevier España. All rights reserved.
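For reference, X-bar and S control limits of the kind used above to monitor mean service times can be computed as in the sketch below; the subgroup size and the synthetic service-time data are assumptions, and the constants follow the standard c4-based formulas rather than anything specific to this study.

```python
# Sketch: X-bar and S chart limits from subgrouped service times, using the
# standard c4-based control chart constants. Data are synthetic placeholders.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(5)
n = 5                                                        # subgroup size (assumed)
subgroups = rng.normal(loc=30.0, scale=4.0, size=(40, n))    # minutes

xbar = subgroups.mean(axis=1)
s = subgroups.std(axis=1, ddof=1)
xbar_bar, s_bar = xbar.mean(), s.mean()

# c4 = sqrt(2/(n-1)) * Gamma(n/2) / Gamma((n-1)/2)
c4 = np.sqrt(2.0/(n-1)) * np.exp(gammaln(n/2) - gammaln((n-1)/2))
A3 = 3.0 / (c4*np.sqrt(n))
B3 = max(0.0, 1.0 - 3.0*np.sqrt(1.0 - c4**2)/c4)
B4 = 1.0 + 3.0*np.sqrt(1.0 - c4**2)/c4

print(f"X-bar chart: CL={xbar_bar:.2f}, LCL={xbar_bar - A3*s_bar:.2f}, UCL={xbar_bar + A3*s_bar:.2f}")
print(f"S chart:     CL={s_bar:.2f}, LCL={B3*s_bar:.2f}, UCL={B4*s_bar:.2f}")
```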
Hidden Markov Model-Based CNV Detection Algorithms for Illumina Genotyping Microarrays.
Seiser, Eric L; Innocenti, Federico
2014-01-01
Somatic alterations in DNA copy number have been well studied in numerous malignancies, yet the role of germline DNA copy number variation in cancer is still emerging. Genotyping microarrays generate allele-specific signal intensities to determine genotype, but may also be used to infer DNA copy number using additional computational approaches. Numerous tools have been developed to analyze Illumina genotype microarray data for copy number variant (CNV) discovery, although commonly utilized algorithms freely available to the public employ approaches based upon the use of hidden Markov models (HMMs). QuantiSNP, PennCNV, and GenoCN utilize HMMs with six copy number states but vary in how transition and emission probabilities are calculated. Performance of these CNV detection algorithms has been shown to be variable between both genotyping platforms and data sets, although HMM approaches generally outperform other current methods. Low sensitivity is prevalent with HMM-based algorithms, suggesting the need for continued improvement in CNV detection methodologies.
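A bare-bones illustration of the HMM idea, segmenting a noisy log-R-ratio-like track into discrete copy number states, is given below using the hmmlearn package (which is not one of the tools named above). The three-state model, the synthetic signal and the Gaussian emission assumption are simplifications; the published callers use six states and platform-specific emission models.

```python
# Sketch: segment a synthetic intensity signal into copy number states with a
# Gaussian HMM (hmmlearn), mimicking the general approach of HMM-based CNV
# callers. This is not QuantiSNP/PennCNV/GenoCN.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(6)
# Synthetic log R ratio-like track: normal (0), a deletion (-0.6), a duplication (+0.4).
means = np.concatenate([np.zeros(300), -0.6*np.ones(60), np.zeros(300),
                        0.4*np.ones(60), np.zeros(280)])
signal = (means + rng.normal(scale=0.15, size=means.size)).reshape(-1, 1)

model = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=200,
                        random_state=0)
model.fit(signal)
states = model.predict(signal)                     # most likely state path

# Report contiguous segments and their mean intensity.
changes = np.flatnonzero(np.diff(states)) + 1
for start, stop in zip(np.r_[0, changes], np.r_[changes, states.size]):
    print(f"probes {start:4d}-{stop-1:4d}: state {states[start]}, "
          f"mean = {signal[start:stop].mean():+.2f}")
```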
Clinical Studies of Biofield Therapies: Summary, Methodological Challenges, and Recommendations
Hammerschlag, Richard; Mills, Paul; Cohen, Lorenzo; Krieger, Richard; Vieten, Cassandra; Lutgendorf, Susan
2015-01-01
Biofield therapies are noninvasive therapies in which the practitioner explicitly works with a client's biofield (interacting fields of energy and information that surround living systems) to stimulate healing responses in patients. While the practice of biofield therapies has existed in Eastern and Western cultures for thousands of years, empirical research on the effectiveness of biofield therapies is still relatively nascent. In this article, we provide a summary of the state of the evidence for biofield therapies for a number of different clinical conditions. We note specific methodological issues for research in biofield therapies that need to be addressed (including practitioner-based, outcomes-based, and research design considerations), as well as provide a list of suggested next steps for biofield researchers to consider. PMID:26665043
Clinical Studies of Biofield Therapies: Summary, Methodological Challenges, and Recommendations.
Jain, Shamini; Hammerschlag, Richard; Mills, Paul; Cohen, Lorenzo; Krieger, Richard; Vieten, Cassandra; Lutgendorf, Susan
2015-11-01
Biofield therapies are noninvasive therapies in which the practitioner explicitly works with a client's biofield (interacting fields of energy and information that surround living systems) to stimulate healing responses in patients. While the practice of biofield therapies has existed in Eastern and Western cultures for thousands of years, empirical research on the effectiveness of biofield therapies is still relatively nascent. In this article, we provide a summary of the state of the evidence for biofield therapies for a number of different clinical conditions. We note specific methodological issues for research in biofield therapies that need to be addressed (including practitioner-based, outcomes-based, and research design considerations), as well as provide a list of suggested next steps for biofield researchers to consider.
Design optimization of gas generator hybrid propulsion boosters
NASA Technical Reports Server (NTRS)
Weldon, Vincent; Phillips, Dwight U.; Fink, Lawrence E.
1990-01-01
A methodology used in support of a contract study for NASA/MSFC to optimize the design of a gas generator hybrid propulsion booster for uprating the National Space Transportation System (NSTS) is presented. The objective was to compare alternative configurations for this booster approach, optimizing each candidate concept on different bases, in order to develop data for a trade table on which a final decision was based. The methodology is capable of processing a large number of independent and dependent variables, adjusting the overall subsystems characteristics to arrive at a best compromise integrated design to meet various specified optimization criteria subject to selected constraints. For each system considered, a detailed weight statement was generated along with preliminary cost and reliability estimates.
NASA Technical Reports Server (NTRS)
Cragg, Clinton H.; Bowman, Howard; Wilson, John E.
2011-01-01
The NASA Engineering and Safety Center (NESC) was requested to provide computational modeling to support the establishment of a safe separation distance surrounding the Kennedy Space Center (KSC) Vehicle Assembly Building (VAB). The two major objectives of the study were 1) establish a methodology based on thermal flux to determine safe separation distances from the Kennedy Space Center's (KSC's) Vehicle Assembly Building (VAB) with large numbers of solid propellant boosters containing hazard division 1.3 classification propellants, in case of inadvertent ignition; and 2) apply this methodology to the consideration of housing eight 5-segment solid propellant boosters in the VAB. The results of the study are contained in this report.
Scoping meta-review: introducing a new methodology.
Sarrami-Foroushani, Pooria; Travaglia, Joanne; Debono, Deborah; Clay-Williams, Robyn; Braithwaite, Jeffrey
2015-02-01
For researchers, policymakers, and practitioners facing a new field, undertaking a systematic review can typically present a challenge due to the enormous number of relevant papers. A scoping review is a method suggested for addressing this dilemma; however, scoping reviews present their own challenges. This paper introduces the "scoping meta-review" (SMR) for expanding current methodologies and is based on our experiences in mapping the field of consumer engagement in healthcare. During this process, we developed the novel SMR method. An SMR combines aspects of a scoping review and a meta-review to establish an evidence-based map of a field. Similar to a scoping review, an SMR offers a practical and flexible methodology. However, unlike in a traditional scoping review, only systematic reviews are included. Stages of the SMR include: undertaking a preliminary nonsystematic review; building a search strategy; interrogating academic literature databases; classifying and excluding studies based on titles and abstracts; saving the refined database of references; revising the search strategy; selecting and reviewing the full text papers; and thematically analyzing the selected texts and writing the report. The main benefit of an SMR is to map a new field based on high-level evidence provided by systematic reviews. © 2014 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Steinthorsson, E.; Modiano, David; Colella, Phillip
1994-01-01
A methodology for accurate and efficient simulation of unsteady, compressible flows is presented. The cornerstones of the methodology are a special discretization of the Navier-Stokes equations on structured body-fitted grid systems and an efficient solution-adaptive mesh refinement technique for structured grids. The discretization employs an explicit multidimensional upwind scheme for the inviscid fluxes and an implicit treatment of the viscous terms. The mesh refinement technique is based on the AMR algorithm of Berger and Colella. In this approach, cells on each level of refinement are organized into a small number of topologically rectangular blocks, each containing several thousand cells. The small number of blocks leads to small overhead in managing data, while their size and regular topology means that a high degree of optimization can be achieved on computers with vector processors.
Charpentier, R.R.; Klett, T.R.
2005-01-01
During the last 30 years, the methodology for assessment of undiscovered conventional oil and gas resources used by the Geological Survey has undergone considerable change. This evolution has been based on five major principles. First, the U.S. Geological Survey has responsibility for a wide range of U.S. and world assessments and requires a robust methodology suitable for immaturely explored as well as maturely explored areas. Second, the assessments should be based on as comprehensive a set of geological and exploration history data as possible. Third, the perils of methods that solely use statistical methods without geological analysis are recognized. Fourth, the methodology and course of the assessment should be documented as transparently as possible, within the limits imposed by the inevitable use of subjective judgement. Fifth, the multiple uses of the assessments require a continuing effort to provide the documentation in such ways as to increase utility to the many types of users. Undiscovered conventional oil and gas resources are those recoverable volumes in undiscovered, discrete, conventional structural or stratigraphic traps. The USGS 2000 methodology for these resources is based on a framework of assessing numbers and sizes of undiscovered oil and gas accumulations and the associated risks. The input is standardized on a form termed the Seventh Approximation Data Form for Conventional Assessment Units. Volumes of resource are then calculated using a Monte Carlo program named Emc2, but an alternative analytic (non-Monte Carlo) program named ASSESS also can be used. The resource assessment methodology continues to change. Accumulation-size distributions are being examined to determine how sensitive the results are to size-distribution assumptions. The resource assessment output is changing to provide better applicability for economic analysis. The separate methodology for assessing continuous (unconventional) resources also has been evolving. Further studies of the relationship between geologic models of conventional and continuous resources will likely impact the respective resource assessment methodologies. © 2005 International Association for Mathematical Geology.
The cost of vision loss in Canada. 1. Methodology.
Gordon, Keith D; Cruess, Alan F; Bellan, Lorne; Mitchell, Scott; Pezzullo, M Lynne
2011-08-01
This paper outlines the methodology used to estimate the cost of vision loss in Canada. The results of this study will be presented in a second paper. The cost of vision loss (VL) in Canada was estimated using a prevalence-based approach. This was done by estimating the number of people with VL in a base period (2007) and the costs associated with treating them. The cost estimates included direct health system expenditures on eye conditions that cause VL, as well as other indirect financial costs such as productivity losses. Estimates were also made of the value of the loss of healthy life, measured in Disability-Adjusted Life Years (DALYs). To estimate the number of cases of VL in the population, epidemiological data on prevalence rates were applied to population data. The number of cases of VL was stratified by gender, age, ethnicity, severity and cause. The following sources were used for estimating prevalence: population-based eye studies, Canadian surveys, Canadian journal articles and research studies, and international population-based eye studies. Direct health costs were obtained primarily from Health Canada and Canadian Institute for Health Information (CIHI) sources, while costs associated with productivity losses were based on employment information compiled by Statistics Canada and on economic theory of productivity loss. Costs related to vision rehabilitation (VR) were obtained from Canadian VR organizations. This study shows that it is possible to estimate the costs of VL for a country in the absence of ongoing local epidemiological studies. Copyright © 2011 Canadian Ophthalmological Society. Published by Elsevier Inc. All rights reserved.
Qualitative model-based diagnosis using possibility theory
NASA Technical Reports Server (NTRS)
Joslyn, Cliff
1994-01-01
The potential for the use of possibility in the qualitative model-based diagnosis of spacecraft systems is described. The first sections of the paper briefly introduce the Model-Based Diagnostic (MBD) approach to spacecraft fault diagnosis; Qualitative Modeling (QM) methodologies; and the concepts of possibilistic modeling in the context of Generalized Information Theory (GIT). Then the necessary conditions for the applicability of possibilistic methods to qualitative MBD, and a number of potential directions for such an application, are described.
Fragment-based approaches to the discovery of kinase inhibitors.
Mortenson, Paul N; Berdini, Valerio; O'Reilly, Marc
2014-01-01
Protein kinases are one of the most important families of drug targets, and aberrant kinase activity has been linked to a large number of disease areas. Although eminently targetable using small molecules, kinases present a number of challenges as drug targets, not least obtaining selectivity across such a large and relatively closely related target family. Fragment-based drug discovery involves screening simple, low-molecular weight compounds to generate initial hits against a target. These hits are then optimized to more potent compounds via medicinal chemistry, usually facilitated by structural biology. Here, we will present a number of recent examples of fragment-based approaches to the discovery of kinase inhibitors, detailing the construction of fragment-screening libraries, the identification and validation of fragment hits, and their optimization into potent and selective lead compounds. The advantages of fragment-based methodologies will be discussed, along with some of the challenges associated with using this route. Finally, we will present a number of key lessons derived both from our own experience running fragment screens against kinases and from a large number of published studies.
Investigation of Weibull statistics in fracture analysis of cast aluminum
NASA Technical Reports Server (NTRS)
Holland, Frederic A., Jr.; Zaretsky, Erwin V.
1989-01-01
The fracture strengths of two large batches of A357-T6 cast aluminum coupon specimens were compared by using two-parameter Weibull analysis. The minimum number of these specimens necessary to find the fracture strength of the material was determined. The applicability of three-parameter Weibull analysis was also investigated. A design methodology based on the combination of elementary stress analysis and Weibull statistical analysis is advanced and applied to the design of a spherical pressure vessel shell. The results from this design methodology are compared with results from the applicable ASME pressure vessel code.
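The two-parameter Weibull step can be illustrated with a few lines of SciPy, as below. The simulated strength data, the sample sizes compared and the fixed zero location parameter are assumptions, and this is not the report's exact estimation procedure.

```python
# Sketch: fit a two-parameter Weibull distribution (location fixed at zero) to
# simulated fracture strengths and inspect how the fit varies with sample size.
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
true_shape, true_scale = 8.0, 300.0               # assumed "material" parameters (MPa)

for n in (10, 30, 100):
    strengths = weibull_min.rvs(true_shape, scale=true_scale, size=n,
                                random_state=rng)
    shape, loc, scale = weibull_min.fit(strengths, floc=0)   # two-parameter fit
    # 10th-percentile strength, a common design quantity for brittle materials.
    s10 = weibull_min.ppf(0.10, shape, scale=scale)
    print(f"n={n:3d}: shape={shape:5.2f}, scale={scale:6.1f} MPa, "
          f"10th percentile={s10:6.1f} MPa")
```

Repeating such fits across sample sizes is one way to judge, as in the report, how many specimens are needed before the estimated strength distribution stabilizes.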
Navarrete-Bolaños, J L; Téllez-Martínez, M G; Miranda-López, R; Jiménez-Islas, H
2017-07-03
For any fermentation process, the production cost depends on several factors, such as the genetics of the microorganism, the process conditions, and the culture medium composition. In this work, a guideline for the design of cost-efficient culture media using a sequential approach based on response surface methodology is described. The procedure was applied to analyze and optimize a registered-trademark culture medium and a base culture medium obtained as a result of the screening analysis of different culture media used to grow the same strain according to the literature. During the experiments, the procedure quantitatively identified an appropriate array of micronutrients to obtain a significant yield and find a minimum number of culture medium ingredients without limiting the process efficiency. The resultant culture medium showed an efficiency that compares favorably with the registered trademark medium at a 95% lower cost, and reduced the number of ingredients in the base culture medium by 60% without limiting the process efficiency. These results demonstrated that, aside from satisfying the qualitative requirements, an optimum quantity of each constituent is needed to obtain a cost-effective culture medium. Further study of the process variables for the optimized culture medium and scale-up of production at the optimal values are desirable.
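The response surface step, fitting a second-order model to a small designed experiment and locating its stationary point, can be sketched as follows. The two coded factors, the simulated yield surface and the design points are invented for illustration and do not correspond to the medium components studied in the paper.

```python
# Sketch of the response surface methodology step: fit a quadratic model to a
# small two-factor face-centred design and locate its optimum. The simulated
# "yield" surface is an invented placeholder.
import numpy as np

rng = np.random.default_rng(8)
# Coded factor levels for two medium components (factorial, axial and centre points).
pts = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],
                [-1, 0], [1, 0], [0, -1], [0, 1],
                [0, 0], [0, 0], [0, 0]], dtype=float)

def true_yield(x1, x2):
    return 10 + 2*x1 + 1.5*x2 - 1.2*x1**2 - 0.8*x2**2 - 0.5*x1*x2

y = true_yield(pts[:, 0], pts[:, 1]) + rng.normal(scale=0.1, size=len(pts))

# Design matrix for the full second-order model: 1, x1, x2, x1^2, x2^2, x1*x2.
x1, x2 = pts[:, 0], pts[:, 1]
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1*x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point of the fitted quadratic: solve grad = 0.
A = np.array([[2*b[3], b[5]], [b[5], 2*b[4]]])
x_opt = np.linalg.solve(A, -b[1:3])
print("fitted coefficients:", np.round(b, 2))
print("predicted optimum (coded units):", np.round(x_opt, 2))
```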
Andreeva, Valentina A; Galan, Pilar; Julia, Chantal; Castetbon, Katia; Kesse-Guyot, Emmanuelle; Hercberg, Serge
2014-04-01
Whereas the feasibility and effectiveness of Internet-based epidemiologic research have been established, methodological support for the quality of such data is still accumulating. We aimed to identify sociodemographic differences among members of a French cohort according to willingness to provide part of one's 15-digit national identification number (personal Social Security number (PSSN)) and to assess response consistency based on information reported on the sociodemographic questionnaire and that reflected in the PSSN. We studied 100,118 persons enrolled in an Internet-based prospective cohort study, the NutriNet-Santé Study, between 2009 and 2013. Persons aged 18 years or more who resided in France and had Internet access were eligible for enrollment. The sociodemographic profiles of participants with discordant data were compared against those of participants with concordant data via 2-sided polytomous logistic regression. In total, 84,442 participants (84.3%) provided the first 7 digits of their PSSN, and among them 5,141 (6.1%) had discordant data. Our multivariate analysis revealed differences by sex, age, education, and employment as regards response consistency patterns. The results support the quality of sociodemographic data obtained online from a large and diverse volunteer sample. The quantitative description of participant profiles according to response consistency patterns could inform future methodological work in e-epidemiology.
Ethical and methodological issues in research with Sami experiencing disability
Melbøe, Line; Hansen, Ketil Lenert; Johnsen, Bjørn-Eirik; Fedreheim, Gunn Elin; Dinesen, Tone; Minde, Gunn-Tove; Rustad, Marit
2016-01-01
Background A study of disability among the indigenous Sami people in Norway presented a number of ethical and methodological challenges rarely addressed in the literature. Objectives The main study was designed to examine and understand the everyday life, transitions between life stages and democratic participation of Norwegian Sami people experiencing disability. Hence, the purpose of this article is to increase the understanding of possible ethical and methodological issues in research within this field. The article describes and discusses ethical and methodological issues that arose when conducting our study and identifies some strategies for addressing issues like these. Methods The ethical and methodological issues addressed in the article are based on a qualitative study among indigenous Norwegian Sami people experiencing disability. The data in this study were collected through 31 semi-structured in-depth interviews with altogether 24 Sami people experiencing disability and 13 next of kin of Sami people experiencing disability (8 mothers, 2 fathers, 2 sisters and 1 guardian). Findings and discussion The researchers identified 4 main areas of ethical and methodological issues. We present these issues chronologically as they emerged in the research process: 1) concept of knowledge when designing the study, 2) gaining access, 3) data collection and 4) analysis and accountability. Conclusion The knowledge generated from this study has the potential to benefit future health research, specifically of Norwegian Sami people experiencing disability, as well as health research concerning indigenous people in general, providing scientific-based insight into important ethical and methodological issues in research with indigenous people experiencing disability. PMID:27396747
Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters
NASA Technical Reports Server (NTRS)
Vlcek, Brian L.; Hendricks, Robert C.; Zaretsky, Erwin V.
2013-01-01
Leonard Johnson published a methodology for establishing the confidence that two populations of data are different. Johnson's methodology is dependent on limited combinations of test parameters (Weibull slope, mean life ratio, and degrees of freedom) and a set of complex mathematical equations. In this report, a simplified algebraic equation for confidence numbers is derived based on the original work of Johnson. The confidence numbers calculated with this equation are compared to those obtained graphically by Johnson. Using the ratios of mean life, the resultant values of confidence numbers at the 99 percent level deviate less than 1 percent from those of Johnson. At a 90 percent confidence level, the calculated values differ between +2 and 4 percent. The simplified equation is used to rank the experimental lives of three aluminum alloys (AL 2024, AL 6061, and AL 7075), each tested at three stress levels in rotating beam fatigue, analyzed using the Johnson-Weibull method, and compared to the ASTM Standard (E739-91) method of comparison. The ASTM Standard did not statistically distinguish between AL 6061 and AL 7075. However, it is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers using the Johnson-Weibull analysis. AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L_50, lives.
The Ideal Oriented Co-design Approach Revisited
NASA Astrophysics Data System (ADS)
Johnstone, Christina
A large number of different methodologies for developing information systems exist on the market. This implies that there are also a large number of "best" ways of developing those information systems. Avison and Fitzgerald (2003) state that every methodology is built on a philosophy. By philosophy they refer to the underlying attitudes and viewpoints, and the different assumptions and emphases to be found within the specific methodology.
Benchmarking gate-based quantum computers
NASA Astrophysics Data System (ADS)
Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans
2017-11-01
With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
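The sensitivity of identity-equivalent circuits to gate errors can be illustrated with a tiny state-vector simulation, as below. It uses plain NumPy rather than any particular quantum SDK, and the over-rotation error model and circuit depths are assumptions chosen purely for illustration.

```python
# Sketch: a circuit built from pairs of X rotations that should compose to the
# identity; small systematic over-rotations accumulate and show up directly in
# the probability of returning to |0>. Error model and depths are assumptions.
import numpy as np

def rx(theta):
    """Single-qubit rotation about X."""
    return np.array([[np.cos(theta/2), -1j*np.sin(theta/2)],
                     [-1j*np.sin(theta/2), np.cos(theta/2)]])

def identity_benchmark(n_pairs, over_rotation):
    """Apply n_pairs of (Rx(pi), Rx(-pi)) with a systematic angle error and
    return the probability of measuring the qubit back in |0>."""
    state = np.array([1.0 + 0j, 0.0])
    for _ in range(n_pairs):
        state = rx(np.pi + over_rotation) @ state
        state = rx(-np.pi + over_rotation) @ state
    return abs(state[0])**2

for eps in (0.0, 0.01, 0.05):
    probs = [identity_benchmark(d, eps) for d in (1, 10, 50)]
    print(f"over-rotation {eps:.2f} rad:",
          " ".join(f"depth {d:2d} -> P(0)={p:.3f}" for d, p in zip((1, 10, 50), probs)))
```

With no error the return probability stays at 1 for any depth, while even a small systematic over-rotation degrades it rapidly as the circuit gets deeper, which is why identity-equivalent circuits make simple, scalable benchmarks.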
NASA Astrophysics Data System (ADS)
Alemany, Kristina
Electric propulsion has recently become a viable technology for spacecraft, enabling shorter flight times, fewer required planetary gravity assists, larger payloads, and/or smaller launch vehicles. With the maturation of this technology, however, comes a new set of challenges in the area of trajectory design. Because low-thrust trajectory optimization has historically required long run-times and significant user-manipulation, mission design has relied on expert-based knowledge for selecting departure and arrival dates, times of flight, and/or target bodies and gravitational swing-bys. These choices are generally based on known configurations that have worked well in previous analyses or simply on trial and error. At the conceptual design level, however, the ability to explore the full extent of the design space is imperative to locating the best solutions in terms of mass and/or flight times. Beginning in 2005, the Global Trajectory Optimization Competition posed a series of difficult mission design problems, all requiring low-thrust propulsion and visiting one or more asteroids. These problems all had large ranges on the continuous variables---launch date, time of flight, and asteroid stay times (when applicable)---as well as being characterized by millions or even billions of possible asteroid sequences. Even with recent advances in low-thrust trajectory optimization, full enumeration of these problems was not possible within the stringent time limits of the competition. This investigation develops a systematic methodology for determining a broad suite of good solutions to the combinatorial, low-thrust, asteroid tour problem. The target application is for conceptual design, where broad exploration of the design space is critical, with the goal being to rapidly identify a reasonable number of promising solutions for future analysis. The proposed methodology has two steps. The first step applies a three-level heuristic sequence developed from the physics of the problem, which allows for efficient pruning of the design space. The second phase applies a global optimization scheme to locate a broad suite of good solutions to the reduced problem. The global optimization scheme developed combines a novel branch-and-bound algorithm with a genetic algorithm and an industry-standard low-thrust trajectory optimization program to solve for the following design variables: asteroid sequence, launch date, times of flight, and asteroid stay times. The methodology is developed based on a small sample problem, which is enumerated and solved so that all possible discretized solutions are known. The methodology is then validated by applying it to a larger intermediate sample problem, which also has a known solution. Next, the methodology is applied to several larger combinatorial asteroid rendezvous problems, using previously identified good solutions as validation benchmarks. These problems include the 2nd and 3rd Global Trajectory Optimization Competition problems. The methodology is shown to be capable of achieving a reduction in the number of asteroid sequences of 6-7 orders of magnitude, in terms of the number of sequences that require low-thrust optimization as compared to the number of sequences in the original problem. More than 70% of the previously known good solutions are identified, along with several new solutions that were not previously reported by any of the competitors. 
Overall, the methodology developed in this investigation provides an organized search technique for the low-thrust mission design of asteroid rendezvous problems.
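The following toy sketch is only meant to illustrate the branch-and-bound idea of pruning asteroid sequences whose partial cost already exceeds a bound; the per-leg costs are random placeholders, and the three-level heuristics, genetic algorithm and low-thrust trajectory optimizer described above are not reproduced.

    # Toy branch-and-bound over asteroid visit sequences. Per-leg costs are
    # random placeholders standing in for low-thrust transfer costs.
    import numpy as np

    rng = np.random.default_rng(1)
    n_bodies = 8
    leg_cost = rng.uniform(0.5, 3.0, size=(n_bodies + 1, n_bodies))  # row 0 = departure leg

    best = {"cost": np.inf, "seq": None}

    def search(seq, cost, remaining, budget):
        # Prune any branch whose partial cost already exceeds the incumbent or the budget.
        if cost >= min(best["cost"], budget):
            return
        if len(seq) == 3:                        # look for 3-asteroid tours in this toy problem
            best["cost"], best["seq"] = cost, tuple(seq)
            return
        last = seq[-1] + 1 if seq else 0         # row index into leg_cost (0 = departure body)
        for nxt in sorted(remaining):
            search(seq + [nxt], cost + leg_cost[last, nxt], remaining - {nxt}, budget)

    search([], 0.0, set(range(n_bodies)), budget=6.0)
    print("best 3-asteroid sequence:", best["seq"], "total cost:", round(best["cost"], 3))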
Scoring Yes-No Vocabulary Tests: Reaction Time vs. Nonword Approaches
ERIC Educational Resources Information Center
Pellicer-Sanchez, Ana; Schmitt, Norbert
2012-01-01
Despite a number of research studies investigating the Yes-No vocabulary test format, one main question remains unanswered: What is the best scoring procedure to adjust for testee overestimation of vocabulary knowledge? Different scoring methodologies have been proposed based on the inclusion and selection of nonwords in the test. However, there…
The Field Production of Water for Injection
1985-12-01
Estimated water requirements per patient type (from tabulated data): Bedridden Patient, 0.75 L/day; Average Diseased Patient, 0.50 L/day; All Diseased Patients, 0.50 L/day. (There is no feasible methodology to forecast the number of procedures per...) An estimate of the liters/day needed may be calculated based on a forecasted patient stream, including
Differential Response: Response to Hughes and Colleagues
ERIC Educational Resources Information Center
Samuels, Bryan; Brown, Brett Vaughn
2013-01-01
In their critique of differential response (DR), Hughes and colleagues raise a number of important issues that are central to broader efforts at the Administration on Children, Youth, and Families (ACYF) including the need for greater reliance on evidence-based practice in child welfare, more rigorous evaluation methodologies, and a robust set of…
Organizational Approach to the Ergonomic Examination of E-Learning Modules
ERIC Educational Resources Information Center
Lavrov, Evgeniy; Kupenko, Olena; Lavryk, Tetiana; Barchenko, Natalia
2013-01-01
With a significant increase in the number of e-learning resources the issue of quality is of current importance. An analysis of existing scientific and methodological literature shows the variety of approaches, methods and tools to evaluate e-learning materials. This paper proposes an approach based on the procedure for estimating parameters of…
Expansive Learning in a Library: Actions, Cycles and Deviations from Instructional Intentions
ERIC Educational Resources Information Center
Engestrom, Yrjo; Rantavuori, Juhana; Kerosuo, Hannele
2013-01-01
The theory of expansive learning has been applied in a large number of studies on workplace learning and organizational change. However, detailed comprehensive analyses of entire developmental interventions based on the theory of expansive learning do not exist. Such a study is needed to examine the empirical usability and methodological rigor…
The Need for Private Universities in Japan to Be Agents of Change
ERIC Educational Resources Information Center
Zhang, Rong; McCornac, Dennis C.
2013-01-01
Purpose: The purpose of this paper is to examine a number of current innovations made by private higher educational institutions in Japan to counter decreased enrollments and financial constraints. Design/methodology/approach: The design of this study is both descriptive and conceptual, based on the latest data available. Additional information…
45 CFR 284.11 - What definitions apply to this part?
Code of Federal Regulations, 2010 CFR
2010-10-01
... METHODOLOGY FOR DETERMINING WHETHER AN INCREASE IN A STATE OR TERRITORY'S CHILD POVERTY RATE IS THE RESULT OF... estimating the number and percentage of children in poverty in each State. These methods may include national estimates based on the Current Population Survey; the Small Area Income and Poverty Estimates; the annual...
Creating Sustainable Education Projects in Roatán, Honduras through Continuous Process Improvement
ERIC Educational Resources Information Center
Raven, Arjan; Randolph, Adriane B.; Heil, Shelli
2010-01-01
The investigators worked together with permanent residents of Roatán, Honduras on sustainable initiatives to help improve the island's troubled educational programs. Our initiatives focused on increasing the number of students eligible and likely to attend a university. Using a methodology based in continuous process improvement, we developed…
Community College Students, Costs and Finances: A Review of Research Literature.
ERIC Educational Resources Information Center
Hyde, William; Augenblick, John
Based on a review of the literature and ongoing research, this four-part monograph provides a composite profile of the enrollment and financial status of the nation's community colleges. After introductory material describing research methodology, Part I analyzes community college enrollment by sex, examines the increase in the number of adult…
Oral Reading Fluency Growth: A Sample of Methodology and Findings. Research Brief 6
ERIC Educational Resources Information Center
Tindal, Gerald; Nese, Joseph F. T.
2013-01-01
For the past 20 years, the growth of students' oral reading fluency has been investigated by a number of researchers using curriculum-based measurement. These researchers have used varied methods (student samples, measurement procedures, and analytical techniques) and yet have converged on a relatively consistent finding: General education…
[Paediatric neurology and habilitation in Norway].
Waaler, Per Erik; Sommerfelt, Kristian
2004-10-07
Based on results from a national survey we discuss the status and prospects of Norwegian child neurology and habilitation. A questionnaire on neurology and habilitation was sent to all 22 Norwegian departments of paediatrics. All departments responded. The organisation of services varied considerably. Only one department registered children admitted specifically for neurological disorders. Habilitation was mainly based on out-patient services. The number of out-patient neurology consultations in relation to regional population varied by a factor of 5.3 from the department with the lowest to the one with the highest number of cases. Corresponding factors were 5.9 for the number of habilitation consultations per year, 3.6 for paediatricians in child neurology and habilitation, and 5.6 for allied health professionals working in habilitation units. In Norway there were 61 physicians working in child neurology and habilitation. Several departments were active in work on methodology. Research was mainly carried out in university departments. Child neurology and habilitation services are available in all Norwegian counties. There is a need for more systematic registration of clinical activities, for research, including the effect of treatment and interventions, more work on methodology, more posts for graduate medical education in the field, better organisation of services for in-patients, and closer cooperation between paediatric, habilitation and community care services.
A knowledge base of the chemical compounds of intermediary metabolism.
Karp, P D
1992-08-01
This paper describes a publicly available knowledge base of the chemical compounds involved in intermediary metabolism. We consider the motivations for constructing a knowledge base of metabolic compounds, the methodology by which it was constructed, and the information that it currently contains. Currently the knowledge base describes 981 compounds, listing for each: synonyms for its name, a systematic name, CAS registry number, chemical formula, molecular weight, chemical structure and two-dimensional display coordinates for the structure. The Compound Knowledge Base (CompoundKB) illustrates several methodological principles that should guide the development of biological knowledge bases. I argue that biological datasets should be made available in multiple representations to increase their accessibility to end users, and I present multiple representations of the CompoundKB (knowledge base, relational data base and ASN.1 representations). I also analyze the general characteristics of these representations to provide an understanding of their relative advantages and disadvantages. Another principle is that the error rate of biological data bases should be estimated and documented; this analysis is performed for the CompoundKB.
Latent Class Analysis of Incomplete Data via an Entropy-Based Criterion
Larose, Chantal; Harel, Ofer; Kordas, Katarzyna; Dey, Dipak K.
2016-01-01
Latent class analysis is used to group categorical data into classes via a probability model. Model selection criteria then judge how well the model fits the data. When addressing incomplete data, the current methodology restricts the imputation to a single, pre-specified number of classes. We seek to develop an entropy-based model selection criterion that does not restrict the imputation to one number of clusters. Simulations show the new criterion performing well against the current standards of AIC and BIC, while a family studies application demonstrates how the criterion provides more detailed and useful results than AIC and BIC. PMID:27695391
Development and exploration of a new methodology for the fitting and analysis of XAS data.
Delgado-Jaime, Mario Ulises; Kennepohl, Pierre
2010-01-01
A new data analysis methodology for X-ray absorption near-edge spectroscopy (XANES) is introduced and tested using several examples. The methodology has been implemented within the context of a new Matlab-based program discussed in a companion related article [Delgado-Jaime et al. (2010), J. Synchrotron Rad. 17, 132-137]. The approach makes use of a Monte Carlo search method to seek appropriate starting points for a fit model, allowing for the generation of a large number of independent fits with minimal user-induced bias. The applicability of this methodology is tested using various data sets on the Cl K-edge XAS data for tetragonal CuCl(4)(2-), a common reference compound used for calibration and covalency estimation in M-Cl bonds. A new background model function that effectively blends together background profiles with spectral features is an important component of the discussed methodology. The development of a robust evaluation function to fit multiple-edge data is discussed and the implications regarding standard approaches to data analysis are discussed and explored within these examples.
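A generic sketch of the Monte Carlo starting-point idea is given below: random initial guesses are drawn, a local least-squares fit is run from each, and the resulting fits are ranked by residual. The edge-plus-peak model, energy range and noise level are invented for illustration; this is not the authors' Matlab program or their blended background model.

    # Generic sketch of a Monte Carlo search for fit starting points: draw random
    # initial guesses, run a local least-squares fit from each, and keep the fits
    # ranked by residual. Synthetic "edge-like" data; model and ranges are invented.
    import numpy as np
    from scipy.optimize import curve_fit

    def edge_model(x, height, position, width, peak_amp):
        """A step-like edge (arctan) plus one Gaussian pre-edge feature."""
        return (height / np.pi) * (np.arctan((x - position) / width) + np.pi / 2) \
               + peak_amp * np.exp(-((x - position + 1.0) ** 2) / (2 * 0.3 ** 2))

    rng = np.random.default_rng(0)
    x = np.linspace(2815, 2835, 400)                      # hypothetical energy axis (eV)
    y = edge_model(x, 1.0, 2825.0, 0.8, 0.6) + rng.normal(0, 0.02, x.size)

    fits = []
    for _ in range(200):                                  # 200 random starting points
        p0 = [rng.uniform(0.5, 2.0), rng.uniform(2820, 2830),
              rng.uniform(0.2, 2.0), rng.uniform(0.1, 1.5)]
        try:
            popt, _ = curve_fit(edge_model, x, y, p0=p0, maxfev=5000)
            fits.append((np.sum((edge_model(x, *popt) - y) ** 2), popt))
        except RuntimeError:                              # fit did not converge from this start
            pass

    fits.sort(key=lambda t: t[0])
    print("best residual:", fits[0][0], "parameters:", np.round(fits[0][1], 3))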
Becker, Christoph; Lauterbach, Gabriele; Spengler, Sarah; Dettweiler, Ulrich; Mess, Filip
2017-01-01
Background: Participants in Outdoor Education Programmes (OEPs) presumably benefit from these programmes in terms of their social and personal development, academic achievement and physical activity (PA). The aim of this systematic review was to identify studies about regular compulsory school- and curriculum-based OEPs, to categorise and evaluate reported outcomes, to assess the methodological quality, and to discuss possible benefits for students. Methods: We searched online databases to identify English- and German-language peer-reviewed journal articles that reported any outcomes on a student level. Two independent reviewers screened studies identified for eligibility and assessed the methodological quality. Results: Thirteen studies were included for analysis. Most studies used a case-study design, the average number of participants was moderate (mean value (M) = 62.17; standard deviation (SD) = 64.12), and the methodological quality was moderate on average for qualitative studies (M = 0.52; SD = 0.11), and low on average for quantitative studies (M = 0.18; SD = 0.42). Eight studies described outcomes in terms of social dimensions, seven studies in learning dimensions and four studies were subsumed under additional outcomes, i.e., PA and health. Eleven studies reported positive effects, one study reported both positive and negative effects, and one study reported negative effects. PA and mental health as outcomes were underrepresented. Conclusion: Tendencies were detected that regular compulsory school- and curriculum-based OEPs can promote students with respect to social, academic, physical and psychological dimensions. Very little is known concerning students' PA or mental health. We recommend conducting more quasi-experimental design and longitudinal studies with a greater number of participants, and a high methodological quality to further investigate these tendencies. PMID:28475167
Becker, Christoph; Lauterbach, Gabriele; Spengler, Sarah; Dettweiler, Ulrich; Mess, Filip
2017-05-05
Participants in Outdoor Education Programmes (OEPs) presumably benefit from these programmes in terms of their social and personal development, academic achievement and physical activity (PA). The aim of this systematic review was to identify studies about regular compulsory school- and curriculum-based OEPs, to categorise and evaluate reported outcomes, to assess the methodological quality, and to discuss possible benefits for students. We searched online databases to identify English- and German-language peer-reviewed journal articles that reported any outcomes on a student level. Two independent reviewers screened studies identified for eligibility and assessed the methodological quality. Thirteen studies were included for analysis. Most studies used a case-study design, the average number of participants was moderate (mean value (M) = 62.17; standard deviation (SD) = 64.12), and the methodological quality was moderate on average for qualitative studies (M = 0.52; SD = 0.11), and low on average for quantitative studies (M = 0.18; SD = 0.42). Eight studies described outcomes in terms of social dimensions, seven studies in learning dimensions and four studies were subsumed under additional outcomes, i.e., PA and health. Eleven studies reported positive effects, one study reported both positive and negative effects, and one study reported negative effects. PA and mental health as outcomes were underrepresented. Tendencies were detected that regular compulsory school- and curriculum-based OEPs can promote students with respect to social, academic, physical and psychological dimensions. Very little is known concerning students' PA or mental health. We recommend conducting more quasi-experimental design and longitudinal studies with a greater number of participants, and a high methodological quality to further investigate these tendencies.
Washington, Simon; Oh, Jutaek
2006-03-01
Transportation professionals are sometimes required to make difficult transportation safety investment decisions in the face of uncertainty. In particular, an engineer may be expected to choose among an array of technologies and/or countermeasures to remediate perceived safety problems when: (1) little information is known about the countermeasure effects on safety; (2) information is known but from different regions, states, or countries where a direct generalization may not be appropriate; (3) the technologies and/or countermeasures are relatively untested; or (4) costs prohibit the full and careful testing of each of the candidate countermeasures via before-after studies. The importance of an informed and well-considered decision based on the best possible engineering knowledge and information is imperative due to the potential impact on the numbers of human injuries and deaths that may result from these investments. This paper describes the formalization and application of a methodology to evaluate the safety benefit of countermeasures in the face of uncertainty. To illustrate the methodology, 18 countermeasures for improving safety of at-grade railroad crossings (AGRXs) in the Republic of Korea are considered. Akin to "stated preference" methods in travel survey research, the methodology applies random selection and laws of large numbers to derive accident modification factor (AMF) densities from expert opinions. In a full Bayesian analysis framework, the collective opinions in the form of AMF densities (data likelihood) are combined with prior knowledge (AMF density priors) for the 18 countermeasures to obtain 'best' estimates of AMFs (AMF posterior credible intervals). The countermeasures are then compared and recommended based on the largest safety returns with minimum risk (uncertainty). To the authors' knowledge the complete methodology is new and has not previously been applied or reported in the literature. The results demonstrate that the methodology is able to discern anticipated safety benefit differences across candidate countermeasures. For the 18 at-grade railroad crossings considered in this analysis, it was found that the top three performing countermeasures for reducing crashes are in-vehicle warning systems, obstacle detection systems, and constant warning time systems.
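As a minimal illustration of combining expert opinion with prior knowledge, the sketch below performs a conjugate normal-normal update on log(AMF) using invented expert point estimates for a single hypothetical countermeasure; the paper's full Bayesian framework, elicitation procedure and density forms are not reproduced.

    # Minimal sketch: combine hypothetical expert opinions on an accident
    # modification factor (AMF) with a prior via a conjugate normal-normal
    # update on log(AMF). All numbers are invented.
    import numpy as np

    expert_amfs = np.array([0.70, 0.85, 0.60, 0.75, 0.90])   # hypothetical expert point estimates
    log_amfs = np.log(expert_amfs)

    # Data likelihood summarised by the experts' log-AMF mean and standard error
    like_mean = log_amfs.mean()
    like_var = log_amfs.var(ddof=1) / len(log_amfs)

    # Vague prior centred on "no effect" (AMF = 1)
    prior_mean, prior_var = 0.0, 1.0

    post_var = 1.0 / (1.0 / prior_var + 1.0 / like_var)
    post_mean = post_var * (prior_mean / prior_var + like_mean / like_var)

    lo, hi = post_mean - 1.96 * np.sqrt(post_var), post_mean + 1.96 * np.sqrt(post_var)
    print(f"posterior AMF ~ {np.exp(post_mean):.2f}, 95% credible interval "
          f"({np.exp(lo):.2f}, {np.exp(hi):.2f})")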
Stability-based validation of dietary patterns obtained by cluster analysis.
Sauvageot, Nicolas; Schritz, Anna; Leite, Sonia; Alkerwi, Ala'a; Stranges, Saverio; Zannad, Faiez; Streel, Sylvie; Hoge, Axelle; Donneau, Anne-Françoise; Albert, Adelin; Guillaume, Michèle
2017-01-14
Cluster analysis is a data-driven method used to create clusters of individuals sharing similar dietary habits. However, this method requires specific choices from the user which have an influence on the results. Therefore, there is a need for an objective methodology to help researchers in their decisions during cluster analysis. The objective of this study was to use such a methodology, based on the stability of clustering solutions, to select the most appropriate clustering method and number of clusters for describing dietary patterns in the NESCAV study (Nutrition, Environment and Cardiovascular Health), a large population-based cross-sectional study in the Greater Region (N = 2298). Clustering solutions were obtained with K-means, K-medians and Ward's method and a number of clusters varying from 2 to 6. Their stability was assessed with three indices: adjusted Rand index, Cramer's V and misclassification rate. The most stable solution was obtained with the K-means method and a number of clusters equal to 3. The "Convenient" cluster, characterized by the consumption of convenient foods, was the most prevalent, with 46% of the population having this dietary behaviour. In addition, a "Prudent" and a "Non-Prudent" pattern, associated with healthy and non-healthy dietary habits respectively, were adopted by 25% and 29% of the population. The "Convenient" and "Non-Prudent" clusters were associated with higher cardiovascular risk whereas the "Prudent" pattern was associated with a decreased cardiovascular risk. Associations with other factors showed that the choice of a specific dietary pattern is part of a wider lifestyle profile. This study is of interest for both researchers and public health professionals. From a methodological standpoint, we showed that using the stability of clustering solutions could help researchers in their choices. From a public health perspective, this study showed the need for targeted health promotion campaigns describing the benefits of healthy dietary patterns.
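A minimal sketch of stability-based selection of the number of clusters is shown below: two random subsamples are clustered with K-means, the labels on their overlap are compared with the adjusted Rand index, and the k giving the most stable solutions is favoured. The data are synthetic, and the study's other methods and indices (K-medians, Ward's method, Cramer's V, misclassification rate) are not reproduced.

    # Sketch of stability-based selection of the number of clusters using
    # K-means and the adjusted Rand index over random subsamples (synthetic data).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.datasets import make_blobs
    from sklearn.metrics import adjusted_rand_score

    X, _ = make_blobs(n_samples=600, centers=3, n_features=5, random_state=0)
    rng = np.random.default_rng(0)

    def stability(X, k, n_repeats=20, frac=0.8):
        scores = []
        for _ in range(n_repeats):
            idx_a = rng.choice(len(X), int(frac * len(X)), replace=False)
            idx_b = rng.choice(len(X), int(frac * len(X)), replace=False)
            overlap = np.intersect1d(idx_a, idx_b)
            km_a = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X[idx_a])
            km_b = KMeans(n_clusters=k, n_init=10, random_state=1).fit(X[idx_b])
            scores.append(adjusted_rand_score(km_a.predict(X[overlap]),
                                              km_b.predict(X[overlap])))
        return np.mean(scores)

    for k in range(2, 7):
        print(f"k = {k}: mean adjusted Rand index over subsamples = {stability(X, k):.3f}")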
Cost Effectiveness of HPV Vaccination: A Systematic Review of Modelling Approaches.
Pink, Joshua; Parker, Ben; Petrou, Stavros
2016-09-01
A large number of economic evaluations have been published that assess alternative possible human papillomavirus (HPV) vaccination strategies. Understanding differences in the modelling methodologies used in these studies is important to assess the accuracy, comparability and generalisability of their results. The aim of this review was to identify published economic models of HPV vaccination programmes and understand how characteristics of these studies vary by geographical area, date of publication and the policy question being addressed. We performed literature searches in MEDLINE, Embase, Econlit, The Health Economic Evaluations Database (HEED) and The National Health Service Economic Evaluation Database (NHS EED). From the 1189 unique studies retrieved, 65 studies were included for data extraction based on a priori eligibility criteria. Two authors independently reviewed these articles to determine eligibility for the final review. Data were extracted from the selected studies, focussing on six key structural or methodological themes covering different aspects of the model(s) used that may influence cost-effectiveness results. More recently published studies tend to model a larger number of HPV strains, and include a larger number of HPV-associated diseases. Studies published in Europe and North America also tend to include a larger number of diseases and are more likely to incorporate the impact of herd immunity and to use more realistic assumptions around vaccine efficacy and coverage. Studies based on previous models often do not include sufficiently robust justifications as to the applicability of the adapted model to the new context. The considerable between-study heterogeneity in economic evaluations of HPV vaccination programmes makes comparisons between studies difficult, as observed differences in cost effectiveness may be driven by differences in methodology as well as by variations in funding and delivery models and estimates of model parameters. Studies should consistently report not only all simplifying assumptions made but also the estimated impact of these assumptions on the cost-effectiveness results.
Filatov, Michael; Liu, Fang; Kim, Kwang S.; ...
2016-12-22
Here, the spin-restricted ensemble-referenced Kohn-Sham (REKS) method is based on an ensemble representation of the density and is capable of correctly describing the non-dynamic electron correlation stemming from (near-)degeneracy of several electronic configurations. The existing REKS methodology describes systems with two electrons in two fractionally occupied orbitals. In this work, the REKS methodology is extended to treat systems with four fractionally occupied orbitals accommodating four electrons, and a self-consistent implementation of the REKS(4,4) method with simultaneous optimization of the orbitals and their fractional occupation numbers is reported. The new method is applied to a number of molecular systems where simultaneous dissociation of several chemical bonds takes place, as well as to the singlet ground states of organic tetraradicals 2,4-didehydrometaxylylene and 1,4,6,9-spiro[4.4]nonatetrayl.
Exploiting Multisite Gateway and pENFRUIT plasmid collection for fruit genetic engineering.
Estornell, Leandro H; Granell, Antonio; Orzaez, Diego
2012-01-01
MultiSite Gateway cloning techniques based on homologous recombination facilitate the combinatorial assembly of basic genetic pieces (i.e., promoters, CDS, and terminators) into gene expression or gene silencing cassettes. pENFRUIT is a collection of MultiSite Triple Gateway Entry vectors dedicated to genetic engineering in fruits. It comprises a number of fruit-operating promoters as well as C-terminal tags adapted to the Gateway standard. In this way, flanking regulatory/labeling sequences can be easily Gateway-assembled with a given gene of interest for its ectopic expression or silencing in fruits. The resulting gene constructs can be analyzed in stable transgenic plants or in transient expression assays, the latter allowing fast testing of the increasing number of combinations arising from MultiSite methodology. A detailed description of the use of MultiSite cloning methodology for the assembly of pENFRUIT elements is presented.
Fast underdetermined BSS architecture design methodology for real time applications.
Mopuri, Suresh; Reddy, P Sreenivasa; Acharyya, Amit; Naik, Ganesh R
2015-01-01
In this paper, we propose a high-speed architecture design methodology for the Under-determined Blind Source Separation (UBSS) algorithm using our recently proposed high-speed Discrete Hilbert Transform (DHT) targeting real-time applications. In the UBSS algorithm, unlike typical BSS, the number of sensors is less than the number of sources, which is of more interest in real-time applications. The DHT architecture has been implemented based on a sub-matrix multiplication method to compute the M-point DHT, which uses the N-point architecture recursively, where M is an integer multiple of N. The DHT architecture and the state-of-the-art architecture are coded in VHDL for a 16-bit word length, and ASIC implementation is carried out using UMC 90-nm technology at VDD = 1 V and a 1-MHz clock frequency. The proposed architecture implementation and experimental comparison results show that the DHT design is two times faster than the state-of-the-art architecture.
Remedial Action Assessment System: A computer-based methodology for conducting feasibility studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, M.K.; Buelt, J.L.; Stottlemyre, J.A.
1991-02-01
Because of the complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process. The Remedial Action Assessment System (RAAS) can be used for screening, linking, and evaluating established technology processes in support of conducting feasibility studies. It is also intended to do the same in support of corrective measures studies. The user interface employs menus, windows, help features, and graphical information while RAAS is in operation. Object-oriented programming is used to link unit processes into sets of compatible processes that form appropriate remedial alternatives. Once the remedial alternatives are formed, the RAAS methodology can evaluate them in terms of effectiveness, implementability, and cost. RAAS will access a user-selected risk assessment code to determine the reduction of risk after remedial action by each recommended alternative. The methodology will also help determine the implementability of the remedial alternatives at a site and access cost estimating tools to provide estimates of capital, operating, and maintenance costs. This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses graphical, tabular and textual information about technologies, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab.
NASA Astrophysics Data System (ADS)
Alfano, M.; Bisagni, C.
2017-01-01
The objective of the running EU project DESICOS (New Robust DESign Guideline for Imperfection Sensitive COmposite Launcher Structures) is to formulate an improved shell design methodology in order to meet the demand of the aerospace industry for lighter structures. Within the project, this article discusses the development of a probability-based methodology developed at Politecnico di Milano. It is based on the combination of the Stress-Strength Interference Method and the Latin Hypercube Method with the aim of predicting the buckling response of three sandwich composite cylindrical shells, assuming a loading condition of pure compression. The three shells are made of the same material, but have different stacking sequences and geometric dimensions. One of them has three circular cut-outs. Different types of input imperfections, treated as random variables, are taken into account independently and in combination: variability in longitudinal Young's modulus, ply misalignment, geometric imperfections, and boundary imperfections. The methodology enables a first assessment of the structural reliability of the shells through the calculation of a probabilistic buckling factor for a specified level of probability. The factor depends strongly on the reliability level, on the number of adopted samples, and on the assumptions made in modeling the input imperfections. The main advantage of the developed procedure is its versatility, as it can be applied to the buckling analysis of laminated composite shells and sandwich composite shells including different types of imperfections.
Methodology for cost analysis of film-based and filmless portable chest systems
NASA Astrophysics Data System (ADS)
Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.
1996-05-01
Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost-analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayer, J.
Based on a compilation of three estimation approaches, the total nationwide population of wild pigs in the United States numbers approximately 6.3 million animals, with that total estimate ranging from 4.4 up to 11.3 million animals. The majority of these numbers (99 percent), which were encompassed by ten states (i.e., Alabama, Arkansas, California, Florida, Georgia, Louisiana, Mississippi, Oklahoma, South Carolina and Texas), were based on defined estimation methodologies (e.g., density estimates correlated to the total potential suitable wild pig habitat statewide, statewide harvest percentages, statewide agency surveys regarding wild pig distribution and numbers). In contrast to the pre-1990 estimates, none of these more recent efforts, collectively encompassing 99 percent of the total, were based solely on anecdotal information or speculation. To that end, one can defensibly state that the wild pigs found in the United States number in the millions of animals, with the nationwide population estimated to arguably vary from about four million up to about eleven million individuals.
Predictions of first passage times in sparse discrete fracture networks using graph-based reductions
NASA Astrophysics Data System (ADS)
Hyman, J.; Hagberg, A.; Srinivasan, G.; Mohd-Yusof, J.; Viswanathan, H. S.
2017-12-01
We present a graph-based methodology to reduce the computational cost of obtaining first passage times through sparse fracture networks. We derive graph representations of generic three-dimensional discrete fracture networks (DFNs) using the DFN topology and flow boundary conditions. Subgraphs corresponding to the union of the k shortest paths between the inflow and outflow boundaries are identified and transport on their equivalent subnetworks is compared to transport through the full network. The number of paths included in the subgraphs is based on the scaling behavior of the number of edges in the graph with the number of shortest paths. First passage times through the subnetworks are in good agreement with those obtained in the full network, both for individual realizations and in distribution. Accurate estimates of first passage times are obtained with an order of magnitude reduction of CPU time and mesh size using the proposed method.
Predictions of first passage times in sparse discrete fracture networks using graph-based reductions
NASA Astrophysics Data System (ADS)
Hyman, Jeffrey D.; Hagberg, Aric; Srinivasan, Gowri; Mohd-Yusof, Jamaludin; Viswanathan, Hari
2017-07-01
We present a graph-based methodology to reduce the computational cost of obtaining first passage times through sparse fracture networks. We derive graph representations of generic three-dimensional discrete fracture networks (DFNs) using the DFN topology and flow boundary conditions. Subgraphs corresponding to the union of the k shortest paths between the inflow and outflow boundaries are identified and transport on their equivalent subnetworks is compared to transport through the full network. The number of paths included in the subgraphs is based on the scaling behavior of the number of edges in the graph with the number of shortest paths. First passage times through the subnetworks are in good agreement with those obtained in the full network, both for individual realizations and in distribution. Accurate estimates of first passage times are obtained with an order of magnitude reduction of CPU time and mesh size using the proposed method.
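A small sketch of the graph-reduction step is given below: the union of the k shortest paths between an inflow and an outflow node is extracted as a subgraph using NetworkX. The random weighted graph stands in for a graph derived from a discrete fracture network, and the transport simulation on the resulting subnetwork is not shown.

    # Sketch of the graph reduction step: keep only the union of the k shortest
    # paths between "inflow" and "outflow" nodes. A random connected weighted
    # graph stands in for the DFN-derived graph.
    import itertools
    import networkx as nx

    G = nx.connected_watts_strogatz_graph(200, 6, 0.3, seed=1)
    for u, v in G.edges:
        G.edges[u, v]["weight"] = 1.0 + (u + v) % 5    # placeholder edge weights

    source, target = 0, 199
    k = 10

    # shortest_simple_paths yields loopless paths in order of increasing total weight
    k_paths = itertools.islice(
        nx.shortest_simple_paths(G, source, target, weight="weight"), k)

    subgraph_nodes = set()
    for path in k_paths:
        subgraph_nodes.update(path)

    H = G.subgraph(subgraph_nodes).copy()
    print(f"full graph: {G.number_of_nodes()} nodes, {G.number_of_edges()} edges")
    print(f"{k}-shortest-path subgraph: {H.number_of_nodes()} nodes, "
          f"{H.number_of_edges()} edges")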
Semantic integration of gene expression analysis tools and data sources using software connectors
2013-01-01
Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380
Semantic integration of gene expression analysis tools and data sources using software connectors.
Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G
2013-10-25
The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.
Reference Intervals of Common Clinical Chemistry Analytes for Adults in Hong Kong.
Lo, Y C; Armbruster, David A
2012-04-01
Defining reference intervals is a major challenge because of the difficulty in recruiting volunteers to participate and testing samples from a significant number of healthy reference individuals. Historical literature citation intervals are often suboptimal because they are based on obsolete methods and/or only a small number of poorly defined reference samples. Blood donors in Hong Kong gave permission for additional blood to be collected for reference interval testing. The samples were tested for twenty-five routine analytes on the Abbott ARCHITECT clinical chemistry system. Results were analyzed using the Rhoads EP evaluator software program, which is based on the CLSI/IFCC C28-A guideline, and defines the reference interval as the 95% central range. Method-specific reference intervals were established for twenty-five common clinical chemistry analytes for a Chinese ethnic population. The intervals were defined for each gender separately and for genders combined. Gender-specific or combined-gender intervals were adopted as appropriate for each analyte. A large number of healthy, apparently normal blood donors from a local ethnic population were tested to provide current reference intervals for a new clinical chemistry system. Intervals were determined following an accepted international guideline. Laboratories using the same or similar methodologies may adopt these intervals if validated and deemed suitable for their patient population. Laboratories using different methodologies may be able to successfully adapt the intervals for their facilities using the reference interval transference technique based on a method comparison study.
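A minimal sketch of the nonparametric 95% central range used for such reference intervals is shown below; the simulated values stand in for real donor results, and the outlier-exclusion and gender-partitioning steps of the CLSI C28-A procedure are not shown.

    # Minimal sketch of a nonparametric reference interval: the 95% central range
    # (2.5th-97.5th percentiles) of results from reference individuals.
    # Simulated values stand in for real donor data.
    import numpy as np

    rng = np.random.default_rng(0)
    alt_results = rng.lognormal(mean=3.0, sigma=0.4, size=500)   # hypothetical ALT values (U/L)

    lower, upper = np.percentile(alt_results, [2.5, 97.5])
    print(f"reference interval (n = {alt_results.size}): {lower:.1f} - {upper:.1f} U/L")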
NASA Astrophysics Data System (ADS)
Klaas, D. K. S. Y.; Imteaz, M. A.; Sudiayem, I.; Klaas, E. M. E.; Klaas, E. C. M.
2017-10-01
In groundwater modelling, robust parameterisation of sub-surface parameters is crucial to obtaining acceptable model performance. Pilot points are an alternative in the parameterisation step for correctly configuring the distribution of parameters in a model. However, the methodologies given in current studies are considered less practical to apply under real catchment conditions. In this study, a practical approach using the geometric features of pilot points and the distribution of the hydraulic gradient over the catchment area is proposed to efficiently configure the pilot point distribution in the calibration step of a groundwater model. A new pilot point distribution technique, the Head Zonation-based (HZB) technique, which is based on the hydraulic gradient distribution of groundwater flow, is presented. Seven models with seven zone ratios (1, 5, 10, 15, 20, 25 and 30) using the HZB technique were constructed for an eogenetic karst catchment on Rote Island, Indonesia, and their performances were assessed. This study also offers insights into the trade-off between restricting and maximising the number of pilot points and provides a new methodology for selecting pilot point properties and the distribution method in the development of a physically-based groundwater model.
Mateo, Estibaliz; Sevillano, Elena
2018-07-01
In recent years, there has been a decrease in the number of medical professionals dedicated to a research career. There is evidence that students with research experience during their training acquire knowledge and skills that increase the probability of getting involved in research more successfully. In the Degree of Medicine (University of the Basque Country), the annual core subject 'Research Project' introduces students to research. The aim of this work was to implement a project-based learning methodology, with the students working on microbiology, and to analyse its results over time. Given an initial scenario, the students had to come up with a research idea related to medical microbiology and to carry out a research project, including writing a funding proposal, developing the experimental assays, and analysing and presenting their results at a congress organized by the University. Summative assessment was performed by both students and teachers. A satisfaction survey was carried out to gather the students' opinion. The overall results regarding classroom dynamics, learning results and motivation after the implementation were favourable. Students reported a greater interest in research than they had before. They would choose the project-based methodology over the traditional one.
Casaseca-de-la-Higuera, Pablo; Simmross-Wattenberg, Federico; Martín-Fernández, Marcos; Alberola-López, Carlos
2009-07-01
Discontinuation of mechanical ventilation is a challenging task that involves a number of subtle clinical issues. The gradual removal of respiratory support (referred to as weaning) should be performed as soon as autonomous respiration can be sustained. However, based on previous studies, the prediction rate of successful extubation is still below 25%. Construction of an automatic system that provides information on extubation readiness is thus desirable. Recent works have demonstrated that breathing pattern variability is a useful extubation readiness indicator, with improving performance when multiple respiratory signals are jointly processed. However, the existing methods for predictor extraction present several drawbacks when length-limited time series are to be processed in heterogeneous groups of patients. In this paper, we propose a model-based methodology for automatic readiness prediction. It is intended to deal with multichannel, nonstationary, short records of the breathing pattern. Results on experimental data yield a successful readiness prediction rate of 87.27%, which is in line with the best figures reported in the literature. A comparative analysis shows that our methodology overcomes the shortcomings of previously proposed methods when applied to length-limited records on heterogeneous groups of patients.
Nieri, Michele; Clauser, Carlo; Franceschi, Debora; Pagliaro, Umberto; Saletta, Daniele; Pini-Prato, Giovanpaolo
2007-08-01
The aim of the present study was to investigate the relationships among reported methodological, statistical, clinical and paratextual variables of randomized clinical trials (RCTs) in implant therapy, and their influence on subsequent research. The material consisted of the RCTs in implant therapy published through the end of the year 2000. Methodological, statistical, clinical and paratextual features of the articles were assessed and recorded. The perceived clinical relevance was subjectively evaluated by an experienced clinician on anonymous abstracts. The impact on research was measured by the number of citations found in the Science Citation Index. A new statistical technique (Structural learning of Bayesian Networks) was used to assess the relationships among the considered variables. Descriptive statistics revealed that the reported methodology and statistics of RCTs in implant therapy were defective. Follow-up of the studies was generally short. The perceived clinical relevance appeared to be associated with the objectives of the studies and with the number of published images in the original articles. The impact on research was related to the nationality of the involved institutions and to the number of published images. RCTs in implant therapy (until 2000) show important methodological and statistical flaws and may not be appropriate for guiding clinicians in their practice. The methodological and statistical quality of the studies did not appear to affect their impact on practice and research. Bayesian Networks suggest new and unexpected relationships among the methodological, statistical, clinical and paratextual features of RCTs.
DESIGN METHODOLOGIES AND TOOLS FOR SINGLE-FLUX QUANTUM LOGIC CIRCUITS
2017-10-01
University of Southern California, October 2017, final report. Contract number FA8750-15-C-0203. ...of this project was to investigate the state-of-the-art in design and optimization of single-flux quantum (SFQ) logic circuits, e.g., RSFQ and ERSFQ
Where does good quality qualitative health care research get published?
Richardson, Jane C; Liddle, Jennifer
2017-09-01
This short report aims to give some insight into current publication patterns for high-quality qualitative health research, using the Research Excellence Framework (REF) 2014 database. We explored patterns of publication by range and type of journal, by date and by methodological focus. We also looked at variations between the publications submitted to different Units of Assessment, focussing particularly on the one most closely aligned with our own research area of primary care. Our brief analysis demonstrates that general medical/health journals with high impact factors are the dominant routes of publication, but there is variation according to the methodological approach adopted by articles. The number of qualitative health articles submitted to REF 2014 overall was small, and even more so for articles based on mixed methods research, qualitative methodology or reviews/syntheses that included qualitative articles.
Styles of Adaptation: The Impact of Frequency and Valence of Adaptation on Preventing Substance Use
ERIC Educational Resources Information Center
Hansen, William B.; Pankratz, Melinda M.; Dusenbury, Linda; Giles, Steven M.; Bishop, Dana C.; Albritton, Jordan; Albritton, Lauren P.; Strack, Joann
2013-01-01
Purpose: To be effective, evidence-based programs should be delivered as prescribed. This suggests that adaptations that deviate from intervention goals may limit a program's effectiveness. This study aims to examine the impact that number and quality of adaptations have on substance use outcomes. Design/methodology/approach: The authors examined…
Exploring How the School Context Mediates Intern Learning in Underserved Rural Border Schools
ERIC Educational Resources Information Center
Ajayi, Lasisi
2013-01-01
This research used poststructural theories to examine a crucial issue of teacher-learning in rural border schools that are under pressure from high-stakes school accountability, fewer resources, and significant numbers of English language learners (ELLs). The methodology was based on a multiple case study of four intern teachers who participated…
A School Curriculum for Fetal Alcohol Spectrum Disorder: Advice from a Young Adult with FASD
ERIC Educational Resources Information Center
Brenna, Beverley; Burles, Meridith; Holtslander, Lorraine; Bocking, Sarah
2017-01-01
While a significant number of individuals in Canada and globally are affected by prenatal fetal alcohol exposure, scant research exists that focuses specifically on the subjective experiences of this population. Based on a single case study exploring through Photovoice methodology the life experiences of a young adult with Fetal Alcohol Spectrum…
Teaching and Learning English in a Multicultural Classroom: Strategies and Opportunities
ERIC Educational Resources Information Center
Xerri, Daniel
2016-01-01
Purpose: This paper aims to explore the beliefs and experiences of a group of teachers endeavouring to enhance their students' learning of English while adapting to a multicultural classroom reality. Design/methodology/approach: The paper is based on the results of a case study involving a number of semi-structured interviews. Findings: The paper…
ERIC Educational Resources Information Center
Bradford, William D.; Cahoon, Laty; Freel, Sara R.; Hoopes, Laura L. Mays; Eckdahl, Todd T.
2005-01-01
In order to engage their students in a core methodology of the new genomics era, an ever-increasing number of faculty at primarily undergraduate institutions are gaining access to microarray technology. Their students are conducting successful microarray experiments designed to address a variety of interesting questions. A next step in these…
Autism genetics: Methodological issues and experimental design.
Sacco, Roberto; Lintas, Carla; Persico, Antonio M
2015-10-01
Autism is a complex neuropsychiatric disorder of developmental origin, where multiple genetic and environmental factors likely interact resulting in a clinical continuum between "affected" and "unaffected" individuals in the general population. During the last two decades, relevant progress has been made in identifying chromosomal regions and genes in linkage or association with autism, but no single gene has emerged as a major cause of disease in a large number of patients. The purpose of this paper is to discuss specific methodological issues and experimental strategies in autism genetic research, based on fourteen years of experience in patient recruitment and association studies of autism spectrum disorder in Italy.
Disturbance characteristics of half-selected cells in a cross-point resistive switching memory array
NASA Astrophysics Data System (ADS)
Chen, Zhe; Li, Haitong; Chen, Hong-Yu; Chen, Bing; Liu, Rui; Huang, Peng; Zhang, Feifei; Jiang, Zizhen; Ye, Hongfei; Gao, Bin; Liu, Lifeng; Liu, Xiaoyan; Kang, Jinfeng; Wong, H.-S. Philip; Yu, Shimeng
2016-05-01
Disturbance characteristics of cross-point resistive random access memory (RRAM) arrays are comprehensively studied in this paper. An analytical model is developed to quantify the number of pulses (#Pulse) the cell can bear before disturbance occurs under various sub-switching voltage stresses based on physical understanding. An evaluation methodology is proposed to assess the disturb behavior of half-selected (HS) cells in cross-point RRAM arrays by combining the analytical model and SPICE simulation. The characteristics of cross-point RRAM arrays such as energy consumption, reliable operating cycles and total error bits are evaluated by the methodology. A possible solution to mitigate disturbance is proposed.
Buzzelli, Michael
2007-03-01
The environmental justice literature faces a number of conceptual and methodological shortcomings. The purpose of this paper is to probe ways in which these shortcomings can be remedied via recent developments in related literatures: population health and air pollution epidemiology. More sophisticated treatment of social structure, particularly if based on Pierre Bourdieu's relational approach to forms of capital, can be combined with the methodological rigour and established biological pathways of air pollution epidemiology. The aim is to reformulate environmental justice research in order to make further meaningful contributions to the wider movement concerned with issues of social justice and equity in health research.
Allocation of nursing care hours in a combined ophthalmic nursing unit.
Navarro, V B; Stout, W A; Tolley, F M
1995-04-01
Traditional service configuration with separate nursing units for outpatient and inpatient care is becoming ineffective for new patient care delivery models. With the new configuration of a combined nursing unit, it was necessary to rethink traditional reporting methodologies and calculation of hours of care. This project management plan is an initial attempt to develop a standard costing/productivity model for a combined unit. The methodology developed from this plan measures nursing care hours for each patient population to determine the number of full time equivalents (FTEs) for a combined unit and allocates FTEs based on inpatient (IP), outpatient (OP), and emergency room (ER) volumes.
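To make the costing/productivity idea concrete, the following is a minimal sketch (not the unit's actual model) of how measured care hours per population could be converted into a combined-unit FTE requirement and then allocated across IP, OP, and ER volumes; the hours-per-case values, annual volumes, and productive-hours assumption are all hypothetical.

```python
# Minimal sketch of FTE calculation and allocation for a combined unit,
# using hypothetical hours-of-care standards and patient volumes.

PRODUCTIVE_HOURS_PER_FTE = 2080 * 0.85  # assumed annual productive hours per FTE

# Assumed average nursing care hours per case and annual volumes (illustrative)
populations = {
    "inpatient":  {"hours_per_case": 6.0, "annual_volume": 1200},
    "outpatient": {"hours_per_case": 1.5, "annual_volume": 8000},
    "emergency":  {"hours_per_case": 2.0, "annual_volume": 3000},
}

care_hours = {name: p["hours_per_case"] * p["annual_volume"]
              for name, p in populations.items()}
total_fte = sum(care_hours.values()) / PRODUCTIVE_HOURS_PER_FTE

# Allocate the combined-unit FTEs in proportion to each population's care hours
allocation = {name: total_fte * hours / sum(care_hours.values())
              for name, hours in care_hours.items()}

print(f"Total FTEs required: {total_fte:.1f}")
for name, fte in allocation.items():
    print(f"  {name}: {fte:.1f} FTE")
```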
Ranasinghe, Nadeesha; Jones, Graham B
2013-03-15
Microwave, flow and combination methodologies have been applied to the synthesis of a number of substituted indoles. Based on the Hemetsberger-Knittel (HK) process, modifications allow formation of products rapidly and in high yield. Adapting the methodology allows formation of 2-unsubstituted indoles and derivatives, and a route to analogs of the antitumor agent PLX-4032 is demonstrated. The utility of the HK substrates is further demonstrated through bioconjugation and subsequent ring closure and via Huisgen type [3+2] cycloaddition chemistry, allowing formation of peptide adducts which can be subsequently labeled with fluorine tags. Copyright © 2013 Elsevier Ltd. All rights reserved.
A spatial ammonia emission inventory for pig farming
NASA Astrophysics Data System (ADS)
Rebolledo, Boris; Gil, Antonia; Pallarés, Javier
2013-01-01
Atmospheric emissions of ammonia (NH3) from the agricultural sector have become a significant environmental and public concern because of their impacts on human health and ecosystems. This work proposes an improved methodology to identify administrative regions with high NH3 emissions from pig farming and to calculate an ammonia density map (kg NH3-N ha⁻¹), based on the number of pigs and the available agricultural land, terrain slopes, groundwater bodies, soil permeability, zones sensitive to nitrate pollution and surface water buffer zones. The methodology has been used to construct a general tool for locating ammonia emissions from pig farming when detailed information on livestock farms is not available.
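A minimal sketch of the kind of density calculation the abstract describes is shown below; the per-pig emission factor, region names, and land areas are hypothetical, and the paper's spatial exclusion criteria (slopes, groundwater bodies, buffer zones, etc.) are assumed to be already reflected in the available-land figures.

```python
# Illustrative sketch: ammonia emission density per administrative region,
# assuming a hypothetical per-pig NH3-N emission factor and land areas that
# already exclude unsuitable zones (slopes, buffers, sensitive areas).

EMISSION_FACTOR_KG_NH3N_PER_PIG = 7.0  # assumed annual value, not from the paper

regions = [
    {"name": "Region A", "pigs": 120_000, "suitable_land_ha": 15_000},
    {"name": "Region B", "pigs": 45_000,  "suitable_land_ha": 22_000},
]

for r in regions:
    emission = r["pigs"] * EMISSION_FACTOR_KG_NH3N_PER_PIG   # kg NH3-N per year
    density = emission / r["suitable_land_ha"]               # kg NH3-N per ha
    print(f'{r["name"]}: {density:.1f} kg NH3-N ha^-1')
```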
Determination of soil degradation from flooding for estimating ecosystem services in Slovakia
NASA Astrophysics Data System (ADS)
Hlavcova, Kamila; Szolgay, Jan; Karabova, Beata; Kohnova, Silvia
2015-04-01
Floods as natural hazards are related to soil health, land use and land management. They not only represent threats on their own, but can also be triggered, controlled and amplified by interactions with other soil threats and soil degradation processes. Among the many direct impacts of flooding on soil health, including changes in soil texture, structure and chemical properties, deterioration of soil aggregation and water-holding capacity, etc., are soil erosion, mudflows, and deposition of sediment and debris. Flooding is initiated by a combination of predisposing and triggering factors and, apart from climate drivers, it is related to the physiographic conditions of the land, the state of the soil, land use and land management. Due to the diversity and complexity of their potential interactions, diverse methodologies and approaches are needed for describing a particular type of event in a specific environment, especially in ungauged sites. In engineering studies and also in many rainfall-runoff models, the SCS-CN method remains widely applied for soil- and land use-based estimation of direct runoff and flooding potential. The SCS-CN method is an empirical rainfall-runoff model developed by the USDA Natural Resources Conservation Service (formerly the Soil Conservation Service, SCS). The runoff curve number (CN) is based on the hydrological soil characteristics, land use, land management and antecedent saturation conditions of the soil. Since the method and curve numbers were derived from an empirical analysis of rainfall-runoff events from small catchments and hillslope plots monitored by the USDA, using the method under the conditions of Slovakia raises uncertainty and can lead to inaccurate estimates of direct runoff. The objective of the study presented (also within the framework of the EU-FP7 RECARE Project) was to develop the SCS-CN methodology for the flood conditions in Slovakia (and especially for the RECARE pilot site of Myjava), with an emphasis on the determination of soil degradation from flooding for estimating ecosystem services. The parameters of the SCS-CN methodology were regionalised empirically based on actual rainfall and discharge measurements. Since no appropriate methodology had been provided for the regionalisation of SCS-CN method parameters in Slovakia, such as runoff curve numbers and initial abstraction coefficients (λ), the work presented is important for the correct application of the SCS-CN method under these conditions.
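For reference, the sketch below shows the textbook SCS-CN runoff relation, Q = (P − λS)² / (P + (1 − λ)S) with S = 25400/CN − 254 (depths in mm), keeping the initial abstraction coefficient λ adjustable since CN and λ are the parameters the study regionalises; the rainfall and CN values in the example are illustrative only.

```python
# Textbook SCS-CN direct-runoff estimate (depths in mm), with the initial
# abstraction coefficient lambda kept adjustable. Input values are illustrative.

def scs_cn_runoff(p_mm: float, cn: float, lam: float = 0.2) -> float:
    """Direct runoff Q (mm) for storm rainfall P (mm) and curve number CN."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = lam * s                      # initial abstraction (mm)
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

print(scs_cn_runoff(p_mm=60.0, cn=75, lam=0.2))   # runoff for a 60 mm storm
print(scs_cn_runoff(p_mm=60.0, cn=75, lam=0.05))  # smaller lambda -> more runoff
```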
Montero, Javier; Dib, Abraham; Guadilla, Yasmina; Flores, Javier; Santos, Juan Antonio; Aguilar, Rosa Anaya; Gómez-Polo, Cristina
2018-02-01
The aim of this study was to compare the perceived competence for treating prosthodontic patients of two samples of fourth-year dental students: those educated using traditional methodologies and those educated using problem-based learning (PBL). Two cohorts of fourth-year dental students at a dental school in Spain were surveyed: the traditional methods cohort (n=46) was comprised of all students in academic years 2012 and 2013, and the PBL cohort (n=57) was comprised of all students in academic years 2014 and 2015. Students in both cohorts reported the number of prosthodontic treatments they carried out per year and their perceived level of competence in performing such treatments. The results showed that the average number of treatments performed was similar for the two cohorts, except the number of metal-based removable partial dentures was significantly higher for students in the traditional (0.8±1.0) than the PBL (0.4±0.6) cohort. The level of perceived competence to treat complete denture patients for the combined cohorts was significantly higher (7.3±1.1) than that for partial acrylic dentures (6.7±1.5) and combined dentures (5.7±1.3). Students' clinical competence in prosthodontics mainly depended on number of treatments performed as the operator as well as the assistant. Students in the traditional methods cohort considered themselves to be significantly more competent at treating patients for removable partial and fixed prostheses (7.8±1.1 and 7.6±1.1, respectively) than did students in the PBL cohort (6.4±1.5 and 6.6±1.5, respectively). Overall, however, the study found that practical experiences were more important than the teaching method used to achieve students' perceived competence.
Adding spatial flexibility to source-receptor relationships for air quality modeling.
Pisoni, E; Clappier, A; Degraeuwe, B; Thunis, P
2017-04-01
To cope with computing power limitations, air quality models used in integrated assessment applications are generally approximated by simpler expressions referred to as "source-receptor relationships" (SRR). In addition to speed, it is desirable for the SRR to be spatially flexible (applicable over a wide range of situations) and to require a "light setup" (based on a limited number of full Air Quality Model (AQM) simulations). But "speed", "flexibility" and "light setup" do not naturally come together, and a good compromise must be found that preserves "accuracy", i.e. good comparability between SRR results and the AQM. In this work we further develop an SRR methodology to better capture spatial flexibility. The updated methodology is based on a cell-to-cell relationship, in which a bell-shaped function links emissions to concentrations. Maintaining a cell-to-cell relationship is shown to be the key element needed to ensure spatial flexibility, while at the same time the proposed approach to link emissions and concentrations guarantees a "light setup" phase. Validation has been repeated on different areas and domain sizes (countries, regions and provinces throughout Europe) for precursors reduced independently or simultaneously. All runs showed a bias of around 10% between the full AQM and the SRR. This methodology allows assessing the impact on air quality of emission scenarios applied over any given area in Europe (regions, sets of regions, countries), provided that a limited number of AQM simulations are performed for training.
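As an illustration of the cell-to-cell idea, the sketch below links a gridded emission field to concentrations through a bell-shaped function of inter-cell distance; the Gaussian form, its parameters, and the grid spacing are assumptions made for illustration and are not the fitted function used in the paper.

```python
# Sketch of a cell-to-cell source-receptor relationship in which a bell-shaped
# function of inter-cell distance weights emission contributions. The Gaussian
# kernel and its parameters are assumptions; the paper's fitted form may differ.

import numpy as np

def srr_concentration(emissions: np.ndarray, cell_km: float,
                      alpha: float, sigma_km: float) -> np.ndarray:
    """Approximate concentration field from a 2-D emission grid."""
    ny, nx = emissions.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    conc = np.zeros_like(emissions, dtype=float)
    for j in range(ny):
        for i in range(nx):
            d = np.hypot(yy - j, xx - i) * cell_km                 # distance to receptor (j, i)
            weights = alpha * np.exp(-0.5 * (d / sigma_km) ** 2)   # bell-shaped kernel
            conc[j, i] = np.sum(weights * emissions)
    return conc

emis = np.zeros((20, 20))
emis[10, 10] = 100.0                                   # single strong source
field = srr_concentration(emis, cell_km=7.0, alpha=0.05, sigma_km=20.0)
print(field[10, 10], field[10, 15])                    # concentration at and near the source
```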
Interpreting the Australian Dietary Guideline to “Limit” into Practical and Personalised Advice
Fayet-Moore, Flavia; Pearson, Suzanne
2015-01-01
Food-based dietary guidelines shift the focus from single nutrients to the whole diet. Guideline 3 of the Australian Dietary Guidelines (ADG) recommends "limiting" discretionary foods and beverages (DF): those high in saturated fat, added sugars, salt, and/or alcohol. In Australia, DF contribute 35% of total energy intake. Using the ADG supporting documents, the aim of this study was to develop a food-based educational toolkit to help translate guideline 3 and interpret portion size. The methodology used to produce the toolkit is presented here. "Additional energy allowance" is specific to gender, age, height and physical activity level, and can be met from core foods, unsaturated fats/oils/spreads and/or DF. To develop the toolkit, the additional energy allowance was converted to serves equaling 600 kJ. Common DF were selected and serves were determined based on nutrient profile. Portion sizes were used to calculate the number of DF serves. A consumer brochure consisting of DF, portion sizes and the equivalent number of DF serves was developed. A healthcare professional guide outlines the methodology used. The toolkit was designed to assist dietitians and consumers to translate guideline 3 of the ADG and develop a personalized approach to include DF as part of the diet. PMID:25803544
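The arithmetic behind the serve conversion can be sketched as follows; the 600 kJ serve size is taken from the abstract, while the example energy allowance and portion energies are hypothetical.

```python
# Illustrative conversion behind the toolkit: the additional energy allowance is
# expressed as 600 kJ "serves", and a food portion is converted to serves from
# its energy content. The allowance and portion energies below are made up.

KJ_PER_SERVE = 600.0

def allowance_to_serves(additional_energy_kj: float) -> float:
    return additional_energy_kj / KJ_PER_SERVE

def portion_to_serves(portion_energy_kj: float) -> float:
    return portion_energy_kj / KJ_PER_SERVE

print(allowance_to_serves(1800))           # e.g. an 1800 kJ allowance -> 3 serves
print(round(portion_to_serves(1250), 1))   # a hypothetical 1250 kJ portion -> ~2.1 serves
```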
Golding, Sarah Elizabeth; Cropley, Mark
2017-09-01
The demand for organ donation is increasing worldwide. One possible way of increasing the pool of potential posthumous donors is to encourage more members of the general public to join an organ donor registry. A systematic review was conducted to investigate the effectiveness of psychological interventions designed to increase the number of individuals in the community who register as organ donors. PsycINFO and PubMed databases were searched. No date limits were set. Randomized and nonrandomized controlled trials exploring the effects of community-based interventions on organ donor registration rates were included. Methodological quality was assessed using the "Quality Assessment Tool for Quantitative Studies." Twenty-four studies met the inclusion criteria; 19 studies found a positive intervention effect on registration. Only 8 studies were assessed as having reasonable methodological robustness. A narrative synthesis was conducted. Factors influencing registration rates include providing an immediate registration opportunity and using brief interventions to challenge misconceptions and concerns about organ donation. Community-based interventions can be effective at increasing organ donor registrations among the general public. Factors that may increase effectiveness include brief interventions to address concerns and providing an immediate registration opportunity. Particular consideration should be paid to the fidelity of intervention delivery. Protocol registration number: CRD42014012975.
Recommendations for benefit-risk assessment methodologies and visual representations.
Hughes, Diana; Waddingham, Ed; Mt-Isa, Shahrul; Goginsky, Alesia; Chan, Edmond; Downey, Gerald F; Hallgreen, Christine E; Hockley, Kimberley S; Juhaeri, Juhaeri; Lieftucht, Alfons; Metcalf, Marilyn A; Noel, Rebecca A; Phillips, Lawrence D; Ashby, Deborah; Micaleff, Alain
2016-03-01
The purpose of this study is to draw on the practical experience from the PROTECT BR case studies and make recommendations regarding the application of a number of methodologies and visual representations for benefit-risk assessment. Eight case studies based on the benefit-risk balance of real medicines were used to test various methodologies that had been identified from the literature as having potential applications in benefit-risk assessment. Recommendations were drawn up based on the results of the case studies. A general pathway through the case studies was evident, with various classes of methodologies having roles to play at different stages. Descriptive and quantitative frameworks were widely used throughout to structure problems, with other methods such as metrics, estimation techniques and elicitation techniques providing ways to incorporate technical or numerical data from various sources. Similarly, tree diagrams and effects tables were universally adopted, with other visualisations available to suit specific methodologies or tasks as required. Every assessment was found to follow five broad stages: (i) Planning, (ii) Evidence gathering and data preparation, (iii) Analysis, (iv) Exploration and (v) Conclusion and dissemination. Adopting formal, structured approaches to benefit-risk assessment was feasible in real-world problems and facilitated clear, transparent decision-making. Prior to this work, no extensive practical application and appraisal of methodologies had been conducted using real-world case examples, leaving users with limited knowledge of their usefulness in the real world. The practical guidance provided here takes us one step closer to a harmonised approach to benefit-risk assessment from multiple perspectives. Copyright © 2016 John Wiley & Sons, Ltd.
Catchment area-based evaluation of the AMC-dependent SCS-CN-based rainfall-runoff models
NASA Astrophysics Data System (ADS)
Mishra, S. K.; Jain, M. K.; Pandey, R. P.; Singh, V. P.
2005-09-01
Using a large set of rainfall-runoff data from 234 watersheds in the USA, a catchment area-based evaluation of the modified version of the Mishra and Singh (2002a) model was performed. The model is based on the Soil Conservation Service Curve Number (SCS-CN) methodology and incorporates the antecedent moisture in the computation of direct surface runoff. Comparison with the existing SCS-CN method showed that the modified version performed better than the existing one on the data of all seven area-based groups of watersheds, ranging from 0.01 to 310.3 km².
Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving
Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice
2016-01-01
The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture. PMID:27727171
Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto
2007-01-01
The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems should ideally spend the least possible amount of time on their calibration. An autocalibration algorithm for intelligent sensors should be able to correct major problems, such as offset, gain variation and lack of linearity, as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks (ANN). The methodology involves the analysis of several network topologies and training algorithms. The proposed method was compared against piecewise and polynomial linearization methods. The comparison was carried out using different numbers of calibration points and several nonlinearity levels of the input signal. The paper shows that the proposed method has better overall accuracy than the other two methods. Beyond the experimental results and the analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit (MCU). In order to illustrate the method's capability to build autocalibrated and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method is an improvement over classic autocalibration methodologies because it impacts the design process of intelligent sensors, autocalibration methodologies and their associated factors, such as time and cost.
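A minimal sketch of the ANN-based autocalibration idea is given below, using scikit-learn's MLPRegressor as a stand-in for the network topologies and training algorithms compared in the paper; the synthetic sensor response (offset, gain error, mild nonlinearity) and the number of calibration points are assumptions.

```python
# Sketch of ANN-based autocalibration: learn the inverse of a nonlinear sensor
# response (with offset and gain error) so raw readings map back to the
# physical quantity. The sensor model and network size are assumptions.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic "true" temperatures and a distorted, mildly nonlinear sensor reading
t_true = np.linspace(0.0, 100.0, 200)
reading = 0.9 * t_true + 0.002 * t_true**2 + 5.0 + rng.normal(0, 0.2, t_true.size)

# A limited set of calibration points, as in the paper's varying point counts
idx = rng.choice(t_true.size, size=15, replace=False)

# A small MLP learns the inverse mapping: reading -> true temperature
net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=20000, random_state=0)
net.fit((reading[idx] / 100.0).reshape(-1, 1), t_true[idx])

corrected = net.predict((reading / 100.0).reshape(-1, 1))
print("max abs calibration error:", np.abs(corrected - t_true).max())
```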
Decision support methodology to establish priorities on the inspection of structures
NASA Astrophysics Data System (ADS)
Cortes, V. Juliette; Sterlacchini, Simone; Bogaard, Thom; Frigerio, Simone; Schenato, Luca; Pasuto, Alessandro
2014-05-01
For hydro-meteorological hazards in mountain areas, the regular inspection of check dams and bridges is important because their functional status affects water-sediment processes. Moreover, the inspection of these structures is time consuming for organizations due to their extensive number in many regions. However, trained citizen-volunteers can support civil protection and technical services in the frequency, timeliness and coverage of monitoring the functional status of hydraulic structures. Technicians should evaluate and validate these reports to obtain an index for the status of the structure. Preventive actions could then be initiated, such as the clearing of obstructions or the pre-screening of potential problems for a second-level inspection. This study proposes a decision support methodology that technicians can use to assess an index for three parameters representing the functional status of the structure: a) the condition of the structure at the opening of the stream flow, b) the level of obstruction at the structure, and c) the level of erosion in the stream bank. The calculation of the index for each parameter is based on fuzzy logic theory to handle ranges in the precision of the reports and to convert the linguistic rating scales into numbers representing the structure's status. A weighting method and multi-criteria methods (Analytic Hierarchy Process (AHP) and TOPSIS) can be used by technicians to combine the different ratings according to the component elements of the structure and the completeness of the reports. Finally, technicians can set decision rules based on the worst rating and a threshold for the functional indexes. The methodology was implemented as a prototype web-based tool to be tested with technicians of the Civil Protection in the Fella basin, Northern Italy. Results at this stage comprise the design and implementation of the web-based tool with GIS interaction to evaluate available reports and to set priorities for the inspection of structures. Keywords: decision-making, multi-criteria methods, torrent control structures, web-based tools.
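As one possible realisation of the multi-criteria step, the sketch below ranks structures for inspection with a plain TOPSIS calculation over the three status indices; the weights and index values are hypothetical, and the fuzzy-logic stage that produces the indices in the methodology is not reproduced here.

```python
# Minimal TOPSIS sketch for ranking structures by inspection priority from the
# three status indices (opening condition, obstruction, bank erosion).
# Weights and index values are illustrative only.

import numpy as np

# rows = structures, cols = [opening condition, obstruction, erosion]; higher = worse
scores = np.array([[0.2, 0.8, 0.5],
                   [0.7, 0.4, 0.6],
                   [0.9, 0.9, 0.3]])
weights = np.array([0.5, 0.3, 0.2])            # assumed AHP-style weights

norm = scores / np.linalg.norm(scores, axis=0)  # vector normalisation
weighted = norm * weights
worst_condition = weighted.max(axis=0)          # most-damaged reference point
best_condition = weighted.min(axis=0)           # least-damaged reference point

d_to_worst = np.linalg.norm(weighted - worst_condition, axis=1)
d_to_best = np.linalg.norm(weighted - best_condition, axis=1)
priority = d_to_best / (d_to_best + d_to_worst)  # closer to 1 = higher priority

print(np.argsort(priority)[::-1])                # structures ranked for inspection
```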
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damato, A; Devlin, P; Bhagwat, M
Purpose: To investigate the sensitivity and specificity of a novel verification methodology for image-guided skin HDR brachytherapy plans using a TRAK-based reasonableness test, compared to a typical manual verification methodology. Methods: Two methodologies were used to flag treatment plans necessitating additional review due to a potential discrepancy of 3 mm between planned dose and the clinical target in the skin. Manual verification was used to calculate the discrepancy between the average dose to points positioned at time of planning, representative of the prescribed depth, and the expected prescription dose. Automatic verification was used to calculate the discrepancy between the TRAK of the clinical plan and its expected value, which was calculated using standard plans with varying curvatures, ranging from flat to cylindrically circumferential. A plan was flagged if a discrepancy >10% was observed. Sensitivity and specificity were calculated using as the criterion for a true positive that >10% of plan dwells had a distance to prescription dose >1 mm different than the prescription depth (3 mm + size of applicator). All HDR image-based skin brachytherapy plans treated at our institution in 2013 were analyzed. Results: 108 surface applicator plans to treat skin of the face, scalp, limbs, feet, hands or abdomen were analyzed. The median number of catheters was 19 (range, 4 to 71) and the median number of dwells was 257 (range, 20 to 1100). Sensitivity/specificity were 57%/78% for manual and 70%/89% for automatic verification. Conclusion: A check based on the expected TRAK value is feasible for irregularly shaped, image-guided skin HDR brachytherapy. This test yielded higher sensitivity and specificity than a test based on the identification of representative points, and can be implemented with a dedicated calculation code or with pre-calculated lookup tables of ideally shaped, uniform surface applicators.
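The reported comparison reduces to confusion-matrix arithmetic over flagged plans versus plans that truly need review; a sketch with hypothetical flags is shown below (the numbers are not the study's).

```python
# Sketch of the sensitivity/specificity computation behind the comparison:
# each plan is flagged (or not) by a verification check and is (or is not) a
# true positive by the dwell-distance criterion. Data here are made up.

def sens_spec(flagged, needs_review):
    tp = sum(f and n for f, n in zip(flagged, needs_review))
    tn = sum((not f) and (not n) for f, n in zip(flagged, needs_review))
    fp = sum(f and (not n) for f, n in zip(flagged, needs_review))
    fn = sum((not f) and n for f, n in zip(flagged, needs_review))
    return tp / (tp + fn), tn / (tn + fp)

needs_review = [True, True, False, False, False, True, False, False]
auto_flags   = [True, False, False, False, True, True, False, False]

sensitivity, specificity = sens_spec(auto_flags, needs_review)
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```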
Three Dimensional Sector Design with Optimal Number of Sectors
NASA Technical Reports Server (NTRS)
Xue, Min
2010-01-01
In the national airspace system, sectors become overloaded due to high traffic demand and inefficient airspace designs. Overloads can be eliminated in some cases by redesigning sector boundaries. This paper extends the Voronoi-based sector design method by automatically selecting the number of sectors, allowing three-dimensional partitions, and enforcing traffic pattern conformance. The method was used to design sectors at the Fort Worth and Indianapolis centers for current traffic scenarios. Results show that the new designs can eliminate overloaded sectors (although not in all cases), reduce the number of necessary sectors, and conform to major traffic patterns. Overall, the new methodology produces enhanced and efficient sector designs.
Equivalent orthotropic elastic moduli identification method for laminated electrical steel sheets
NASA Astrophysics Data System (ADS)
Saito, Akira; Nishikawa, Yasunari; Yamasaki, Shintaro; Fujita, Kikuo; Kawamoto, Atsushi; Kuroishi, Masakatsu; Nakai, Hideo
2016-05-01
In this paper, a combined numerical-experimental methodology for the identification of the elastic moduli of orthotropic media is presented. Special attention is given to laminated electrical steel sheets, which are modeled as orthotropic media with nine independent engineering elastic moduli. The elastic moduli are determined specifically for use with finite element vibration analyses. We propose a three-step methodology based on a conventional nonlinear least squares fit between measured and computed natural frequencies. The methodology consists of: (1) successive augmentation of the objective function by increasing the number of modes, (2) initial condition updates, and (3) appropriate selection of the natural frequencies based on their sensitivities to the elastic moduli. Using the results of numerical experiments, it is shown that the proposed method achieves a more accurate converged solution than a conventional approach. Finally, the proposed method is applied to measured natural frequencies and mode shapes of laminated electrical steel sheets. It is shown that the method can successfully identify the orthotropic elastic moduli that reproduce the measured natural frequencies and frequency response functions in finite element analyses with reasonable accuracy.
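The core identification step can be sketched as a nonlinear least-squares fit of moduli to measured natural frequencies; in the sketch below the finite element solver is replaced by a toy surrogate that scales frequencies with the square root of stiffness, and all values are illustrative.

```python
# Sketch of the identification step: adjust elastic moduli so that natural
# frequencies predicted by a structural model match measured ones, via
# nonlinear least squares. The frequency model is a placeholder for the finite
# element analysis used in the paper, and all numbers are illustrative.

import numpy as np
from scipy.optimize import least_squares

measured_freqs = np.array([120.0, 310.0, 540.0])      # Hz, hypothetical

def predicted_freqs(moduli: np.ndarray) -> np.ndarray:
    # Placeholder surrogate: each frequency scales with sqrt(stiffness).
    e1, e2, g12 = moduli
    return np.array([0.8 * np.sqrt(e1), 1.1 * np.sqrt(e2), 1.6 * np.sqrt(g12)])

def residuals(moduli: np.ndarray) -> np.ndarray:
    return predicted_freqs(moduli) - measured_freqs

x0 = np.array([2.0e4, 7.0e4, 1.0e5])                  # initial guess for the moduli
fit = least_squares(residuals, x0, bounds=(1.0, np.inf))
print(fit.x)                                          # identified moduli (surrogate units)
```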
A novel tree-based algorithm to discover seismic patterns in earthquake catalogs
NASA Astrophysics Data System (ADS)
Florido, E.; Asencio-Cortés, G.; Aznarte, J. L.; Rubio-Escudero, C.; Martínez-Álvarez, F.
2018-06-01
A novel methodology is introduced in this research study to detect seismic precursors. Based on an existing approach, the new methodology searches for patterns in the historical data. Such patterns may contain statistical or soil-dynamics information. It improves the original version in several aspects. First, new seismicity indicators have been used to characterize earthquakes. Second, a machine learning clustering algorithm has been applied in a very flexible way, thus allowing the discovery of new data groupings. Third, a novel search strategy is proposed in order to obtain non-overlapping patterns. And, fourth, arbitrary pattern lengths are searched for, thus discovering long- and short-term behaviors that may influence the occurrence of medium-large earthquakes. The methodology has been applied to seven different datasets from three different regions, namely the Iberian Peninsula, Chile and Japan. Reported results show a remarkable improvement with respect to the former version in terms of all evaluated quality measures. In particular, the number of false positives has decreased and the positive predictive values have increased, both in a very remarkable manner.
Kinjo, Masataka
2018-01-01
Neurodegenerative diseases, including amyotrophic lateral sclerosis (ALS), Alzheimer’s disease, Parkinson’s disease, and Huntington’s disease, are devastating proteinopathies with misfolded protein aggregates accumulating in neuronal cells. Inclusion bodies of protein aggregates are frequently observed in the neuronal cells of patients. Investigation of the underlying causes of neurodegeneration requires the establishment and selection of appropriate methodologies for detailed investigation of the state and conformation of protein aggregates. In the current review, we present an overview of the principles and application of several methodologies used for the elucidation of protein aggregation, specifically ones based on determination of fluctuations of fluorescence. The discussed methods include fluorescence correlation spectroscopy (FCS), imaging FCS, image correlation spectroscopy (ICS), photobleaching ICS (pbICS), number and brightness (N&B) analysis, super-resolution optical fluctuation imaging (SOFI), and transient state (TRAST) monitoring spectroscopy. Some of these methodologies are classical protein aggregation analyses, while others are not yet widely used. Collectively, the methods presented here should help the future development of research not only into protein aggregation but also neurodegenerative diseases. PMID:29570669
LC-MS based analysis of endogenous steroid hormones in human hair.
Gao, Wei; Kirschbaum, Clemens; Grass, Juliane; Stalder, Tobias
2016-09-01
The quantification of endogenous steroid hormone concentrations in hair is increasingly used as a method for obtaining retrospective information on long-term integrated hormone exposure. Several different analytical procedures have been employed for hair steroid analysis, with liquid chromatography-mass spectrometry (LC-MS) being recognized as a particularly powerful analytical tool. Several methodological aspects affect the performance of LC-MS systems for hair steroid analysis, including sample preparation and pretreatment, steroid extraction, post-incubation purification, LC methodology, ionization techniques and MS specifications. Here, we critically review the differential value of such protocol variants for hair steroid hormones analysis, focusing on both analytical quality and practical feasibility issues. Our results show that, when methodological challenges are adequately addressed, LC-MS protocols can not only yield excellent sensitivity and specificity but are also characterized by relatively simple sample processing and short run times. This makes LC-MS based hair steroid protocols particularly suitable as a high-quality option for routine application in research contexts requiring the processing of larger numbers of samples. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Yee, Eugene
2007-04-01
Although a great deal of research effort has been focused on the forward prediction of the dispersion of contaminants (e.g., chemical and biological warfare agents) released into the turbulent atmosphere, much less work has been directed toward the inverse prediction of agent source location and strength from the measured concentration, even though the importance of this problem for a number of practical applications is obvious. In general, the inverse problem of source reconstruction is ill-posed and unsolvable without additional information. It is demonstrated that a Bayesian probabilistic inferential framework provides a natural and logically consistent method for source reconstruction from a limited number of noisy concentration data. In particular, the Bayesian approach permits one to incorporate prior knowledge about the source as well as additional information regarding both model and data errors. The latter enables a rigorous determination of the uncertainty in the inference of the source parameters (e.g., spatial location, emission rate, release time, etc.), hence extending the potential of the methodology as a tool for quantitative source reconstruction. A model (or, source-receptor relationship) that relates the source distribution to the concentration data measured by a number of sensors is formulated, and Bayesian probability theory is used to derive the posterior probability density function of the source parameters. A computationally efficient methodology for determination of the likelihood function for the problem, based on an adjoint representation of the source-receptor relationship, is described. Furthermore, we describe the application of efficient stochastic algorithms based on Markov chain Monte Carlo (MCMC) for sampling from the posterior distribution of the source parameters, the latter of which is required to undertake the Bayesian computation. The Bayesian inferential methodology for source reconstruction is validated against real dispersion data for two cases involving contaminant dispersion in highly disturbed flows over urban and complex environments where the idealizations of horizontal homogeneity and/or temporal stationarity in the flow cannot be applied to simplify the problem. Furthermore, the methodology is applied to the case of reconstruction of multiple sources.
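A toy illustration of the sampling stage is sketched below: a Metropolis-Hastings chain draws source location and emission rate from the posterior implied by noisy sensor readings. The exponential-decay forward model, flat priors, and noise level are assumptions made for illustration, and the adjoint-based likelihood evaluation described in the abstract is not included.

```python
# Sketch of Bayesian source reconstruction by Metropolis-Hastings sampling:
# infer source location and emission rate from noisy sensor concentrations.
# The simple forward model, priors and noise level are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)
sensors = np.array([2.0, 5.0, 8.0, 12.0])              # 1-D sensor positions

def forward(x_src, q_src):
    # Toy source-receptor relationship: concentration decays with distance.
    return q_src * np.exp(-np.abs(sensors - x_src) / 3.0)

true_x, true_q, sigma = 6.0, 4.0, 0.1
data = forward(true_x, true_q) + rng.normal(0, sigma, sensors.size)

def log_post(theta):
    x_src, q_src = theta
    if not (0 < x_src < 15 and 0 < q_src < 20):         # flat priors on a box
        return -np.inf
    return -0.5 * np.sum((forward(x_src, q_src) - data) ** 2) / sigma**2

theta, samples = np.array([7.5, 1.0]), []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.2, size=2)            # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta.copy())

samples = np.array(samples[5000:])                       # discard burn-in
print(samples.mean(axis=0), samples.std(axis=0))         # posterior mean and spread
```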
Research Methodology in Second Language Studies: Trends, Concerns, and New Directions
ERIC Educational Resources Information Center
King, Kendall A.; Mackey, Alison
2016-01-01
The field of second language studies is using increasingly sophisticated methodological approaches to address a growing number of urgent, real-world problems. These methodological developments bring both new challenges and opportunities. This article briefly reviews recent ontological and methodological debates in the field, then builds on these…
76 FR 23825 - Study Methodologies for Diagnostics in the Postmarket Setting; Public Workshop
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-28
... community on issues related to the studies and methodological approaches examining diagnostics in the... discuss a large number of methodological concerns at the workshop, including, but not limited to the...
Transportation networks : data, analysis, methodology development and visualization.
DOT National Transportation Integrated Search
2007-12-29
This project provides data compilation, analysis methodology and visualization methodology for the current network data assets of the Alabama Department of Transportation (ALDOT). This study finds that ALDOT is faced with a considerable number of...
2013-04-01
and Integrated Risk Management Methodologies 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) 5d. PROJECT NUMBER 5e...supply chains, risk management with real options, and sustainability . [dnford@nps.edu] Thomas J. Housel—Housel specializes in valuing intellectual...maintenance services for the RDN. Damen Schelde has used an ILS since 2002 to manage the shipbuilding process from project initiation through the
Jenkins, M B; Endale, D M; Fisher, D S; Gay, P A
2009-02-01
To better understand the transport and enumeration of dilute densities of Escherichia coli O157:H7 in agricultural watersheds, we developed a culture-based, five-tube, multiple-dilution most probable number (MPN) method. The MPN method combined a filtration technique for large volumes of surface water with standard selective media, biochemical and immunological tests, and a TaqMan confirmation step. This method determined E. coli O157:H7 concentrations as low as 0.1 MPN per litre, with a 95% confidence interval of 0.01-0.7 MPN per litre. Escherichia coli O157:H7 densities ranged from not detectable to 9 MPN per litre for pond inflow, from not detectable to 0.9 MPN per litre for pond outflow and from not detectable to 8.3 MPN per litre within the pond. The MPN methodology was extended to mass flux determinations. Fluxes of E. coli O157:H7 ranged from <27 to >10^4 MPN per hour. This culture-based method can detect small numbers of viable/culturable E. coli O157:H7 in surface waters of watersheds containing animal agriculture and wildlife. This MPN method will improve our understanding of the transport and fate of E. coli O157:H7 in agricultural watersheds, and can be the basis of collections of environmental E. coli O157:H7.
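For context, an MPN is the organism density that maximises the likelihood of the observed positive/negative tube pattern across dilutions; a sketch of that calculation is given below with hypothetical tube volumes and results, not the study's data.

```python
# Sketch of the maximum-likelihood calculation behind a multiple-dilution MPN:
# with n_i tubes of volume v_i (litres) and p_i positive tubes at each dilution,
# the MPN per litre is the density that maximises the likelihood of the observed
# positive/negative pattern. Tube volumes and results below are hypothetical.

import numpy as np
from scipy.optimize import minimize_scalar

volumes_l = np.array([1.0, 0.1, 0.01])     # sample volume per tube at each dilution
n_tubes   = np.array([5, 5, 5])            # five-tube series
positives = np.array([3, 1, 0])

def neg_log_lik(density):
    if density <= 0:
        return np.inf
    p_pos = 1.0 - np.exp(-density * volumes_l)           # P(tube is positive)
    return -np.sum(positives * np.log(p_pos)
                   - (n_tubes - positives) * density * volumes_l)

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1e4), method="bounded")
print(f"MPN ~ {res.x:.2f} organisms per litre")
```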
Depth Acuity Methodology for Electronic 3D Displays: eJames (eJ)
2016-07-01
AFRL-RH-WP-TR-2016-0060 Depth Acuity Methodology for Electronic 3D Displays: eJames (eJ) Eric L. Heft, John McIntire...AND SUBTITLE Depth Acuity Methodology for Electronic 3D Displays: eJames (eJ) 5a. CONTRACT NUMBER FA8650-08-D-6801-0050 5b. GRANT NUMBER...of 3D electronic displays: one active-eyewear Stereo 3D (S3D) and two non-eyewear full parallax Field-of-Light Display (FoLD) systems. The two FoLD
Evaluation of the HARDMAN comparability methodology for manpower, personnel and training
NASA Technical Reports Server (NTRS)
Zimmerman, W.; Butler, R.; Gray, V.; Rosenberg, L.
1984-01-01
The methodology evaluation and recommendations are part of an effort to improve the Hardware versus Manpower (HARDMAN) methodology for projecting the manpower, personnel, and training (MPT) needed to support new acquisitions. Several different validity tests are employed to evaluate the methodology. The methodology conforms fairly well with both the MPT user needs and other accepted manpower modeling techniques. Audits of three completed HARDMAN applications reveal only a small number of potential problem areas compared to the total number of issues investigated. The reliability study results conform well with the problem areas uncovered through the audits. The results of the accuracy studies suggest that the manpower life-cycle cost component is only marginally sensitive to changes in other related cost variables. Even with some minor problems, the methodology seems sound and has good near-term utility to the Army. Recommendations are provided to firm up the problem areas revealed through the evaluation.
Inconsistency prevents the valuable synergism of explanatory and pragmatic trials.
Correia, Luis C L; Correia, Vitor C A; Souza, Thiago M B; Cerqueira, Antonio Maurício S; Alexandre, Felipe K B; Garcia, Guilherme; Ferreira, Felipe R M; Lopes, Fernanda O A
2018-05-01
To assess review articles on pragmatic trials in order to describe how authors define the aim of this type of study, how comprehensively methodological topics are covered, and which topics are most valued by authors. Review articles were selected from the Medline database based on the expression "pragmatic trial" in their titles. Five trained medical students evaluated the articles based on a list of 15 self-explanatory methodological topics. Each article was evaluated regarding the topics covered. Baseline statements on the aim of pragmatic trials were derived. Among the 22 articles identified, there was general agreement that the aim of a pragmatic trial is to evaluate whether the intervention works under real-world conditions. The mean number of methodological topics addressed by each article was 7.6 ± 3.1. Only one article covered all 15 topics, three articles (14%) responded to at least 75% of topics and 13 articles (59%) mentioned at least 50% of the topics. The relative frequency with which each of the 15 topics was cited by the articles had a mean of 50% ± 25%. No topic was addressed by all articles, and only three (20%) were addressed by more than 75% of articles. There is agreement on the different aims of explanatory and pragmatic trials. But there is large variation in the methodological topics used to define a pragmatic trial, which leads to inconsistency in defining the typical methodology of a pragmatic trial. © 2018 Chinese Cochrane Center, West China Hospital of Sichuan University and John Wiley & Sons Australia, Ltd.
Assessment of the Knowledge of the Decimal Number System Exhibited by Students with Down Syndrome
ERIC Educational Resources Information Center
Noda, Aurelia; Bruno, Alicia
2017-01-01
This paper presents an assessment of the understanding of the decimal numeral system in students with Down Syndrome (DS). We followed a methodology based on a descriptive case study involving six students with DS. We used a framework of four constructs (counting, grouping, partitioning and numerical relationships) and five levels of thinking for…
ERIC Educational Resources Information Center
Schriewer, Jurgen, Ed.
2012-01-01
New theories and theory-based methodological approaches have found their way into Comparative Education--just as into Comparative Social Science more generally--in increasing number in the recent past. The essays of this volume express and critically discuss quite a range of these positions such as, inter alia, the theory of self-organizing social…
Code of Federal Regulations, 2012 CFR
2012-07-01
... rate, type of control devices, process parameters (e.g., maximum heat input), and non-process... control systems (if applicable) and explain why the conditions are worst-case. (c) Number of test runs... located at the outlet of the control device and prior to any releases to the atmosphere. (e) Collection of...
Code of Federal Regulations, 2011 CFR
2011-07-01
... rate, type of control devices, process parameters (e.g., maximum heat input), and non-process... control systems (if applicable) and explain why the conditions are worst-case. (c) Number of test runs... located at the outlet of the control device and prior to any releases to the atmosphere. (e) Collection of...
Code of Federal Regulations, 2010 CFR
2010-07-01
... rate, type of control devices, process parameters (e.g., maximum heat input), and non-process... control systems (if applicable) and explain why the conditions are worst-case. (c) Number of test runs... located at the outlet of the control device and prior to any releases to the atmosphere. (e) Collection of...
ERIC Educational Resources Information Center
Bjorkquist, David C.
This document reports on a study of the training needs that result from actual or impending corporate takeovers, based on needs assessments at three corporations conducted by students as part of a university class over a period of 10 weeks. The first section describes the study's background and methodology. The qualitative research methodology…
NASA Astrophysics Data System (ADS)
Gelfan, Alexander; Moreydo, Vsevolod; Motovilov, Yury; Solomatine, Dimitri P.
2018-04-01
A long-term forecasting ensemble methodology, applied to water inflows into the Cheboksary Reservoir (Russia), is presented. The methodology is based on a version of the semi-distributed hydrological model ECOMAG (ECOlogical Model for Applied Geophysics) that allows for the calculation of an ensemble of inflow hydrographs using two different sets of weather ensembles for the lead time period: observed weather data, constructed on the basis of the Ensemble Streamflow Prediction methodology (ESP-based forecast), and synthetic weather data, simulated by a multi-site weather generator (WG-based forecast). We have studied the following: (1) whether there is any advantage of the developed ensemble forecasts in comparison with the currently issued operational forecasts of water inflow into the Cheboksary Reservoir, and (2) whether there is any noticeable improvement in probabilistic forecasts when using the WG-simulated ensemble compared to the ESP-based ensemble. We have found that for a 35-year period beginning from the reservoir filling in 1982, both continuous and binary model-based ensemble forecasts (issued in the deterministic form) outperform the operational forecasts of the April-June inflow volume actually used and, additionally, provide acceptable forecasts of additional water regime characteristics besides the inflow volume. We have also demonstrated that the model performance measures (in the verification period) obtained from the WG-based probabilistic forecasts, which are based on a large number of possible weather scenarios, appeared to be more statistically reliable than the corresponding measures calculated from the ESP-based forecasts based on the observed weather scenarios.
A novel methodology for interpreting air quality measurements from urban streets using CFD modelling
NASA Astrophysics Data System (ADS)
Solazzo, Efisio; Vardoulakis, Sotiris; Cai, Xiaoming
2011-09-01
In this study, a novel computational fluid dynamics (CFD) based methodology has been developed to interpret long-term averaged measurements of pollutant concentrations collected at roadside locations. The methodology is applied to the analysis of pollutant dispersion in Stratford Road (SR), a busy street canyon in Birmingham (UK), where a one-year sampling campaign was carried out between August 2005 and July 2006. Firstly, a number of dispersion scenarios are defined by combining sets of synoptic wind velocity and direction. Assuming neutral atmospheric stability, CFD simulations are conducted for all the scenarios by applying the standard k-ɛ turbulence model, with the aim of creating a database of normalised pollutant concentrations at specific locations within the street. Modelled concentrations for all wind scenarios were compared with hourly observed NOx data. In order to compare with long-term averaged measurements, a weighted average of the CFD-calculated concentration fields was derived, with the weighting coefficients being proportional to the frequency of each scenario observed during the examined period (either monthly or annually). In summary, the methodology consists of (i) identifying the main dispersion scenarios for the street based on wind speed and direction data, (ii) creating a database of CFD-calculated concentration fields for the identified dispersion scenarios, and (iii) combining the CFD results based on the frequency of occurrence of each dispersion scenario during the examined period. The methodology has been applied to calculate monthly and annually averaged benzene concentrations at several locations within the street canyon so that a direct comparison with observations could be made. The results of this study indicate that, within the simplifying assumption of non-buoyant flow, CFD modelling can aid the understanding of long-term air quality measurements and help assess the representativeness of monitoring locations for population exposure studies.
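The weighting step can be sketched as a frequency-weighted combination of pre-computed scenario fields; the scenario concentrations, frequencies, and emission scaling below are illustrative values, not those of the Stratford Road study.

```python
# Sketch of the weighting step: the long-term mean concentration at each
# receptor is the frequency-weighted combination of CFD fields pre-computed for
# discrete wind scenarios. Scenario fields and frequencies are illustrative.

import numpy as np

# Normalised concentration at 3 receptors for 4 wind-direction/speed scenarios
scenario_fields = np.array([[1.2, 0.4, 0.9],
                            [0.3, 1.1, 0.7],
                            [0.8, 0.8, 0.8],
                            [0.1, 0.2, 1.5]])
scenario_frequency = np.array([0.40, 0.25, 0.20, 0.15])  # frequency observed in the period
emission_rate = 50.0                                      # assumed scaling to dimensional units

long_term_mean = emission_rate * scenario_frequency @ scenario_fields
print(long_term_mean)            # period-averaged concentration at each receptor
```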
An autonomous satellite architecture integrating deliberative reasoning and behavioural intelligence
NASA Technical Reports Server (NTRS)
Lindley, Craig A.
1993-01-01
This paper describes a method for the design of autonomous spacecraft, based upon behavioral approaches to intelligent robotics. First, a number of previous spacecraft automation projects are reviewed. A methodology for the design of autonomous spacecraft is then presented, drawing upon both the European Space Agency technological center (ESTEC) automation and robotics methodology and the subsumption architecture for autonomous robots. A layered competency model for autonomous orbital spacecraft is proposed. A simple example of low level competencies and their interaction is presented in order to illustrate the methodology. Finally, the general principles adopted for the control hardware design of the AUSTRALIS-1 spacecraft are described. This system will provide an orbital experimental platform for spacecraft autonomy studies, supporting the exploration of different logical control models, different computational metaphors within the behavioral control framework, and different mappings from the logical control model to its physical implementation.
NASA Astrophysics Data System (ADS)
Viironen, K.; Marín-Franch, A.; López-Sanjuan, C.; Varela, J.; Chaves-Montero, J.; Cristóbal-Hornillos, D.; Molino, A.; Fernández-Soto, A.; Vilella-Rojo, G.; Ascaso, B.; Cenarro, A. J.; Cerviño, M.; Cepa, J.; Ederoclite, A.; Márquez, I.; Masegosa, J.; Moles, M.; Oteo, I.; Pović, M.; Aguerri, J. A. L.; Alfaro, E.; Aparicio-Villegas, T.; Benítez, N.; Broadhurst, T.; Cabrera-Caño, J.; Castander, J. F.; Del Olmo, A.; González Delgado, R. M.; Husillos, C.; Infante, L.; Martínez, V. J.; Perea, J.; Prada, F.; Quintana, J. M.
2015-04-01
Context. Most observational results on the high redshift restframe UV-bright galaxies are based on samples pinpointed using the so-called dropout technique or Ly-α selection. However, the availability of multifilter data now allows the dropout selections to be replaced by direct methods based on photometric redshifts. In this paper we present the methodology to select and study the population of high redshift galaxies in the ALHAMBRA survey data. Aims: Our aim is to develop a less biased methodology than the traditional dropout technique to study the high redshift galaxies in ALHAMBRA and other multifilter data. Thanks to the wide area ALHAMBRA covers, we especially aim at contributing to the study of the brightest, least frequent, high redshift galaxies. Methods: The methodology is based on redshift probability distribution functions (zPDFs). It is shown how a clean galaxy sample can be obtained by selecting the galaxies with high integrated probability of being within a given redshift interval. However, reaching both a complete and clean sample with this method is challenging. Hence, a method to derive statistical properties by summing the zPDFs of all the galaxies in the redshift bin of interest is introduced. Results: Using this methodology we derive the galaxy rest frame UV number counts in five redshift bins centred at z = 2.5,3.0,3.5,4.0, and 4.5, being complete up to the limiting magnitude at mUV(AB) = 24, where mUV refers to the first ALHAMBRA filter redwards of the Ly-α line. With the wide field ALHAMBRA data we especially contribute to the study of the brightest ends of these counts, accurately sampling the surface densities down to mUV(AB) = 21-22. Conclusions: We show that using the zPDFs it is easy to select a very clean sample of high redshift galaxies. We also show that it is better to do statistical analysis of the properties of galaxies using a probabilistic approach, which takes into account both the incompleteness and contamination issues in a natural way. Based on observations collected at the German-Spanish Astronomical Center, Calar Alto, jointly operated by the Max-Planck-Institut für Astronomie (MPIA) at Heidelberg and the Instituto de Astrofísica de Andalucía (CSIC).
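A schematic of the zPDF selection and stacking described above is sketched below with synthetic Gaussian PDFs; the probability threshold and redshift bin edges are assumptions chosen only to illustrate the two steps.

```python
# Sketch of zPDF-based selection and stacking: a galaxy enters the "clean"
# sample if the integral of its redshift PDF over the target interval exceeds a
# threshold, while statistical counts sum the zPDF mass of all galaxies in the
# bin. The PDFs below are synthetic Gaussians, not ALHAMBRA data.

import numpy as np

z_grid = np.linspace(0.0, 7.0, 701)

def gaussian_pdf(z0, sig):
    p = np.exp(-0.5 * ((z_grid - z0) / sig) ** 2)
    return p / np.trapz(p, z_grid)

zpdfs = np.array([gaussian_pdf(2.6, 0.15),     # likely in the bin
                  gaussian_pdf(2.9, 0.60),     # broad, uncertain
                  gaussian_pdf(0.4, 0.05)])    # low-redshift interloper

z_lo, z_hi = 2.25, 2.75                        # e.g. the bin centred at z = 2.5
in_bin = (z_grid >= z_lo) & (z_grid <= z_hi)
prob_in_bin = np.trapz(zpdfs[:, in_bin], z_grid[in_bin], axis=1)

clean_sample = prob_in_bin > 0.9               # assumed probability threshold
stacked_counts = prob_in_bin.sum()             # probabilistic contribution to the bin

print(prob_in_bin.round(3), clean_sample, round(stacked_counts, 3))
```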
NASA Technical Reports Server (NTRS)
Onwubiko, Chin-Yere; Onyebueke, Landon
1996-01-01
The structural design, or the design of machine elements, has traditionally been based on deterministic design methodology. The deterministic method considers all design parameters to be known with certainty. This methodology is, therefore, inadequate for designing complex structures that are subjected to a variety of complex, severe loading conditions. A nonlinear behavior that is dependent on stress, stress rate, temperature, number of load cycles, and time is observed on all components subjected to complex conditions. These complex conditions introduce uncertainties; hence, the actual margin of safety remains unknown. In the deterministic methodology, the contingency of failure is discounted; hence, a high factor of safety is used. It may be most useful in situations where the structures being designed are simple. The probabilistic method is concerned with the probability of non-failure performance of structures or machine elements. It is much more useful in situations where the design is characterized by complex geometry, the possibility of catastrophic failure, or sensitive loads and material properties. Also included: Comparative Study of the Use of AGMA Geometry Factors and Probabilistic Design Methodology in the Design of Compact Spur Gear Set.
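As a simple contrast between the two philosophies, the sketch below estimates a probability of failure by Monte Carlo sampling of uncertain stress and strength, alongside the nominal deterministic factor of safety; the distributions are illustrative and not drawn from the paper.

```python
# Sketch of the probabilistic design idea: instead of a single deterministic
# factor of safety, estimate the probability that the load effect exceeds the
# capacity when both are uncertain. Distributions here are illustrative.

import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

strength = rng.normal(loc=400.0, scale=25.0, size=n)   # MPa, assumed material capacity
stress   = rng.normal(loc=300.0, scale=40.0, size=n)   # MPa, assumed load effect

p_failure = np.mean(stress >= strength)
nominal_factor_of_safety = 400.0 / 300.0

print(f"nominal FoS = {nominal_factor_of_safety:.2f}, "
      f"estimated P(failure) = {p_failure:.4f}")
```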
Definition and Demonstration of a Methodology for Validating Aircraft Trajectory Predictors
NASA Technical Reports Server (NTRS)
Vivona, Robert A.; Paglione, Mike M.; Cate, Karen T.; Enea, Gabriele
2010-01-01
This paper presents a new methodology for validating an aircraft trajectory predictor, inspired by the lessons learned from a number of field trials, flight tests and simulation experiments for the development of trajectory-predictor-based automation. The methodology introduces new techniques and a new multi-staged approach to reduce the effort in identifying and resolving validation failures, avoiding the potentially large costs associated with failures during a single-stage, pass/fail approach. As a case study, the validation effort performed by the Federal Aviation Administration for its En Route Automation Modernization (ERAM) system is analyzed to illustrate the real-world applicability of this methodology. During this validation effort, ERAM initially failed to achieve six of its eight requirements associated with trajectory prediction and conflict probe. The ERAM validation issues have since been addressed, but to illustrate how the methodology could have benefited the FAA effort, additional techniques are presented that could have been used to resolve some of these issues. Using data from the ERAM validation effort, it is demonstrated that these new techniques could have identified trajectory prediction error sources that contributed to several of the unmet ERAM requirements.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-18
... parties to comment on these methodological issues described above. Request for Comment on Interim Industry... comments. \\15\\ Indicator: GNI per capita, Atlas Method (current US$) is obtained from http://data.worldbank... methodology, the Department has encountered a number of methodological and practical challenges that must be...
Second Language Listening Strategy Research: Methodological Challenges and Perspectives
ERIC Educational Resources Information Center
Santos, Denise; Graham, Suzanne; Vanderplank, Robert
2008-01-01
This paper explores methodological issues related to research into second language listening strategies. We argue that a number of central questions regarding research methodology in this line of enquiry are underexamined, and we engage in the discussion of three key methodological questions: (1) To what extent is a verbal report a valid and…
Evaluation Methodology. The Evaluation Exchange. Volume 11, Number 2, Summer 2005
ERIC Educational Resources Information Center
Coffman, Julia, Ed.
2005-01-01
This is the third issue of "The Evaluation Exchange" devoted entirely to the theme of methodology, though every issue tries to identify new methodological choices, the instructive ways in which people have applied or combined different methods, and emerging methodological trends. For example, lately "theories of change" have gained almost…
Horvath, Karl; Semlitsch, Thomas; Jeitler, Klaus; Abuzahra, Muna E; Posch, Nicole; Domke, Andreas; Siebenhofer, Andrea
2016-01-01
Objectives Identification of sufficiently trustworthy top 5 list recommendations from the US Choosing Wisely campaign. Setting Not applicable. Participants All top 5 list recommendations available from the American Board of Internal Medicine Foundation website. Main outcome measures/interventions Compilation of US top 5 lists and search for current German highly trustworthy (S3) guidelines. Extraction of guideline recommendations, including grade of recommendation (GoR), for suggestions comparable to top 5 list recommendations. For recommendations without guideline equivalents, the methodological quality of the top 5 list development process was assessed using criteria similar to that used to judge guidelines, and relevant meta-literature was identified in cited references. Judgement of sufficient trustworthiness of top 5 list recommendations was based either on an ‘A’ GoR of guideline equivalents or on high methodological quality and citation of relevant meta-literature. Results 412 top 5 list recommendations were identified. For 75 (18%), equivalents were found in current German S3 guidelines. 44 of these recommendations were associated with an ‘A’ GoR, or a strong recommendation based on strong evidence, and 26 had a ‘B’ or a ‘C’ GoR. No GoR was provided for 5 recommendations. 337 recommendations had no equivalent in the German S3 guidelines. The methodological quality of the development process was high and relevant meta-literature was cited for 87 top 5 list recommendations. For a further 36, either the methodological quality was high without any meta-literature citations or meta-literature citations existed but the methodological quality was lacking. For the remaining 214 recommendations, either the methodological quality was lacking and no literature was cited or the methodological quality was generally unsatisfactory. Conclusions 131 of current US top 5 list recommendations were found to be sufficiently trustworthy. For a substantial number of current US top 5 list recommendations, their trustworthiness remains unclear. Methodological requirements for developing top 5 lists are recommended. PMID:27855098
NASA Astrophysics Data System (ADS)
Morin, C.; Quattrochi, D. A.; Zavodsky, B.; Case, J.
2015-12-01
Dengue fever (DF) is an important mosquito-transmitted disease that is strongly influenced by meteorological and environmental conditions. Recent research has focused on forecasting DF case numbers based on meteorological data. However, these forecasting tools have generally relied on empirical models that require long DF time series to train. Additionally, their accuracy has been tested retrospectively, using past meteorological data. Consequently, the operational utility of the forecasts is still in question because the error associated with weather and climate forecasts is not reflected in the results. Using up-to-date weekly dengue case numbers for model parameterization and weather forecast data as meteorological input, we produced weekly forecasts of DF cases in San Juan, Puerto Rico. Each week, the past weeks' case counts were used to re-parameterize a process-based DF model driven with updated weather forecast data to generate forecasts of DF case numbers. Real-time weather forecast data were produced using the Weather Research and Forecasting (WRF) numerical weather prediction (NWP) system enhanced with additional high-resolution NASA satellite data. This methodology was conducted in a weekly iterative process, with each DF forecast being evaluated using county-level DF cases reported by the Puerto Rico Department of Health. The one-week DF forecasts were accurate, especially considering the two sources of model error. First, weather forecasts were sometimes inaccurate and generally produced lower than observed temperatures. Second, the DF model was often overly influenced by the previous week's DF case numbers, though this phenomenon could be lessened by increasing the number of simulations included in the forecast. Although these results are promising, we would like to develop a methodology to produce longer range forecasts so that public health workers can better prepare for dengue epidemics.
Multi-viewpoint clustering analysis
NASA Technical Reports Server (NTRS)
Mehrotra, Mala; Wild, Chris
1993-01-01
In this paper, we address the feasibility of partitioning rule-based systems into a number of meaningful units to enhance the comprehensibility, maintainability and reliability of expert systems software. Preliminary results have shown that no single structuring principle or abstraction hierarchy is sufficient to understand complex knowledge bases. We therefore propose the Multi View Point - Clustering Analysis (MVP-CA) methodology to provide multiple views of the same expert system. We present the results of using this approach to partition a deployed knowledge-based system that navigates the Space Shuttle's entry. We also discuss the impact of this approach on verification and validation of knowledge-based systems.
Ground System Architectures Workshop GMSEC SERVICES SUITE (GSS): an Agile Development Story
NASA Technical Reports Server (NTRS)
Ly, Vuong
2017-01-01
The GMSEC (Goddard Mission Services Evolution Center) Services Suite (GSS) is a collection of tools and software services along with a robust customizable web-based portal that enables the user to capture, monitor, report, and analyze system-wide GMSEC data. Given our plug-and-play architecture and the needs for rapid system development, we opted to follow the Scrum Agile Methodology for software development. As one of the first few projects to implement the Agile methodology at NASA GSFC, in this presentation we will present our approaches, tools, successes, and challenges in implementing this methodology. The GMSEC architecture provides a scalable, extensible ground and flight system for existing and future missions. GMSEC comes with a robust Application Programming Interface (GMSEC API) and a core set of Java-based GMSEC components that facilitate the development of a GMSEC-based ground system. Over the past few years, we have seen an uptick in the number of customers who are moving from a native desktop application environment to a web-based environment, particularly for data monitoring and analysis. We also see a need to provide separation of the business logic from the GUI display for our Java-based components and also to consolidate all the GUI displays into one interface. This combination of separation and consolidation brings immediate value to a GMSEC-based ground system through increased ease of data access via a uniform interface, built-in security measures, centralized configuration management, and ease of feature extensibility.
Schnohr, Christina W; Molcho, Michal; Rasmussen, Mette; Samdal, Oddrun; de Looze, Margreet; Levin, Kate; Roberts, Chris J; Ehlinger, Virginie; Krølner, Rikke; Dalmasso, Paola; Torsheim, Torbjørn
2015-04-01
This article presents the scope and development of the Health Behaviour in School-aged Children (HBSC) study, reviews trend papers published on international HBSC data up to 2012 and discusses the efforts made to produce reliable trend analyses. The major goal of this article is to present the statistical procedures and analytical strategies for upholding high data quality, as well as reflections from the authors of this article on how to produce reliable trends based on an international study of the magnitude of the HBSC study. HBSC is an international cross-sectional study collecting data from adolescents aged 11-15 years on a broad variety of health determinants and health behaviours. A number of methodological challenges have stemmed from the growth of the HBSC study, in particular given that the study has a focus on monitoring trends. Some of those challenges are considered. When analysing trends, researchers must be able to assess whether a change in prevalence is an expression of an actual change in the observed outcome, whether it is a result of methodological artefacts, or whether it is due to changes in the conceptualization of the outcome by the respondents. The article presents recommendations for taking a number of these considerations into account; the considerations imply methodological challenges that are core issues in undertaking trend analyses. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
Fungal Fragments in Moldy Houses: A Field Study in Homes in New Orleans and Southern Ohio
Reponen, Tiina; Seo, Sung-Chul; Grimsley, Faye; Lee, Taekhee; Crawford, Carlos; Grinshpun, Sergey A.
2007-01-01
Smaller-sized fungal fragments (<1 μm) may contribute to mold-related health effects. Previous laboratory-based studies have shown that the number concentration of fungal fragments can be up to 500 times higher than that of fungal spores, but this has not yet been confirmed in a field study due to lack of suitable methodology. We have recently developed a field-compatible method for the sampling and analysis of airborne fungal fragments. The new methodology was utilized for characterizing fungal fragment exposures in mold-contaminated homes selected in New Orleans, Louisiana and Southern Ohio. Airborne fungal particles were separated into three distinct size fractions: (i) >2.25 μm (spores); (ii) 1.05–2.25 μm (mixture); and (iii) <1.0 μm (submicrometer-sized fragments). Samples were collected in five homes in summer and winter and analyzed for (1→3)-β-D-glucan. The total (1→3)-β-D-glucan varied from 0.2 to 16.0 ng m−3. The ratio of (1→3)-β-D-glucan mass in the fragment size fraction to that in the spore size fraction (F/S) varied from 0.011 to 2.163. The mass ratio was higher in winter (average = 1.017) than in summer (0.227), coinciding with a lower relative humidity in the winter. Assuming a mass-based F/S ratio = 1 and a spore size of 3 μm, the corresponding number-based F/S ratio (fragment number/spore number) would be 10³ and 10⁶ for fragment sizes of 0.3 and 0.03 μm, respectively. These results indicate that the actual (field) contribution of fungal fragments to the overall exposure may be very high, even much greater than that estimated in our earlier laboratory-based studies. PMID:19050738
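The mass-to-number conversion quoted above follows from diameter-cubed (volume) scaling for particles of equal density; a quick check in Python, using the sizes stated in the abstract:

```python
# Number-based F/S ratio from a mass-based ratio, assuming spherical particles
# of equal density: N_f / N_s = (mass-based F/S) * (d_spore / d_fragment)**3
mass_ratio = 1.0   # mass-based F/S ratio assumed in the abstract
d_spore = 3.0      # spore diameter, micrometers
for d_fragment in (0.3, 0.03):
    number_ratio = mass_ratio * (d_spore / d_fragment) ** 3
    print(f"fragment size {d_fragment} um -> number-based F/S = {number_ratio:.0e}")
# prints 1e+03 and 1e+06, i.e., the 10^3 and 10^6 figures given in the abstract
```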
NASA Astrophysics Data System (ADS)
Uvarova, Svetlana; Vlasenko, Vyacheslav; Bukreev, Anatoly; Myshovskaya, Ludmila; Kuzina, Olga
2018-03-01
This article analyses the current condition and dynamics of innovative development in high-rise building construction, taking into account a number of fundamental organizational and economic changes in management at the macro and meso levels. A principal scheme for developing a methodology for forming a prospective innovation policy in high-rise building construction, based on the introduction of modern methods of strategic control of innovation activity, is proposed.
NASA Astrophysics Data System (ADS)
Huda, J.; Kauneckis, D. L.
2013-12-01
Climate change adaptation represents a number of unique policy-making challenges. Foremost among these is dealing with the range of future climate impacts to a wide scope of inter-related natural systems, their interaction with social and economic systems, and uncertainty resulting from the variety of downscaled climate model scenarios and climate science projections. These cascades of uncertainty have led to a number of new approaches as well as a reexamination of traditional methods for evaluating risk and uncertainty in policy-making. Policy makers are required to make decisions and formulate policy irrespective of the level of uncertainty involved and while a debate continues regarding the level of scientific certainty required in order to make a decision, incremental change in the climate policy continues at multiple governance levels. This project conducts a comparative analysis of the range of methodological approaches that are evolving to address uncertainty in climate change policy. It defines 'methodologies' to include a variety of quantitative and qualitative approaches involving both top-down and bottom-up policy processes that attempt to enable policymakers to synthesize climate information into the policy process. The analysis examines methodological approaches to decision-making in climate policy based on criteria such as sources of policy choice information, sectors to which the methodology has been applied, sources from which climate projections were derived, quantitative and qualitative methods used to deal with uncertainty, and the benefits and limitations of each. A typology is developed to better categorize the variety of approaches and methods, examine the scope of policy activities they are best suited for, and highlight areas for future research and development.
Estimation of under-reporting in epidemics using approximations.
Gamado, Kokouvi; Streftaris, George; Zachary, Stan
2017-06-01
Under-reporting in epidemics, when it is ignored, leads to under-estimation of the infection rate and therefore of the reproduction number. In the case of stochastic models with temporal data, a usual approach for dealing with such issues is to apply data augmentation techniques through Bayesian methodology. Departing from earlier literature approaches implemented using reversible jump Markov chain Monte Carlo (RJMCMC) techniques, we make use of approximations to obtain faster estimation with simple MCMC. Comparisons among the methods developed here, and with the RJMCMC approach, are carried out and highlight that approximation-based methodology offers useful alternative inference tools for large epidemics, with a good trade-off between time cost and accuracy.
Computer Mathematics Games and Conditions for Enhancing Young Children's Learning of Number Sense
ERIC Educational Resources Information Center
Kermani, Hengameh
2017-01-01
Purpose: The present study was designed to examine whether mathematics computer games improved young children's learning of number sense under three different conditions: when used individually, with a peer, and with teacher facilitation. Methodology: This study utilized a mixed methodology, collecting both quantitative and qualitative data. A…
NASA Astrophysics Data System (ADS)
Huespe, A. E.; Oliver, J.; Mora, D. F.
2013-12-01
A finite element methodology for simulating the failure of high performance fiber reinforced concrete composites (HPFRC), with arbitrarily oriented short fibers, is presented. The composite material model is based on a micromorphic approach. Using the framework provided by this theory, the body configuration space is described through two kinematical descriptors. At the structural level, the displacement field represents the standard kinematical descriptor. Additionally, a morphological kinematical descriptor, the micromorphic field, is introduced. It describes the fiber-matrix relative displacement, or slipping mechanism of the bond, observed at the mesoscale level. In the first part of this paper, we summarize the model formulation of the micromorphic approach presented in a previous work by the authors. In the second part, and as the main contribution of the paper, we address specific issues related to the numerical aspects involved in the computational implementation of the model. The developed numerical procedure is based on a mixed finite element technique. The number of dofs per node changes according to the number of fiber bundles simulated in the composite. A specific solution scheme is therefore proposed to handle the variable number of unknowns in the discrete model. The HPFRC composite model takes into account the important effects produced by concrete fracture. A procedure for simulating quasi-brittle fracture is introduced into the model and is described in the paper. The present numerical methodology is assessed by simulating a selected set of experimental tests, which proves its viability and accuracy in capturing a number of mechanical phenomena interacting at the macro- and mesoscales and leading to failure of HPFRC composites.
De Backer, A; Martinez, G T; Rosenauer, A; Van Aert, S
2013-11-01
In the present paper, a statistical model-based method to count the number of atoms of monotype crystalline nanostructures from high resolution high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) images is discussed in detail together with a thorough study on the possibilities and inherent limitations. In order to count the number of atoms, it is assumed that the total scattered intensity scales with the number of atoms per atom column. These intensities are quantitatively determined using model-based statistical parameter estimation theory. The distribution describing the probability that intensity values are generated by atomic columns containing a specific number of atoms is inferred on the basis of the experimental scattered intensities. Finally, the number of atoms per atom column is quantified using this estimated probability distribution. The number of atom columns available in the observed STEM image, the number of components in the estimated probability distribution, the width of the components of the probability distribution, and the typical shape of a criterion to assess the number of components in the probability distribution directly affect the accuracy and precision with which the number of atoms in a particular atom column can be estimated. It is shown that single atom sensitivity is feasible taking the latter aspects into consideration. © 2013 Elsevier B.V. All rights reserved.
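The core statistical step described above, fitting a mixture model to the column intensities and selecting the number of components with a model-order criterion, can be sketched as follows with scikit-learn. This is an illustration under assumed synthetic intensities and a BIC criterion, not the authors' exact estimator or selection criterion.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical scattered-intensity values for ~200 atom columns containing 3, 4, or 5 atoms.
intensities = np.concatenate([
    rng.normal(3.0, 0.15, 80),
    rng.normal(4.0, 0.15, 70),
    rng.normal(5.0, 0.15, 50),
]).reshape(-1, 1)

# Fit Gaussian mixtures with increasing numbers of components and keep the best by BIC.
models = [GaussianMixture(n_components=n, random_state=0).fit(intensities) for n in range(1, 8)]
best = min(models, key=lambda m: m.bic(intensities))

# Each column is then assigned the component (candidate atom count) with the highest
# posterior probability.
labels = best.predict(intensities)
print("estimated number of components:", best.n_components)
print("columns assigned to each component:", np.bincount(labels))
```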
[New calculation algorithms in brachytherapy for iridium 192 treatments].
Robert, C; Dumas, I; Martinetti, F; Chargari, C; Haie-Meder, C; Lefkopoulos, D
2018-05-18
Since 1995, the brachytherapy dosimetry protocols follow the methodology recommended by the Task Group 43. This methodology, which has the advantage of being fast, is based on several approximations that are not always valid in clinical conditions. Model-based dose calculation algorithms have recently emerged in treatment planning stations and are considered as a major evolution by allowing for consideration of the patient's finite dimensions, tissue heterogeneities and the presence of high atomic number materials in applicators. In 2012, a report from the American Association of Physicists in Medicine Radiation Therapy Task Group 186 reviews these models and makes recommendations for their clinical implementation. This review focuses on the use of model-based dose calculation algorithms in the context of iridium 192 treatments. After a description of these algorithms and their clinical implementation, a summary of the main questions raised by these new methods is performed. Considerations regarding the choice of the medium used for the dose specification and the recommended methodology for assigning materials characteristics are especially described. In the last part, recent concrete examples from the literature illustrate the capabilities of these new algorithms on clinical cases. Copyright © 2018 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
Rezapour, Aziz; Jafari, Abdosaleh; Mirmasoudi, Kosha; Talebianpour, Hamid
2017-09-01
Health economic evaluation research plays an important role in selecting cost-effective interventions. The purpose of this study was to assess the quality of published articles in Iranian journals related to economic evaluation in health care programs based on Drummond's checklist in terms of numbers, features, and quality. In the present review study, published articles (Persian and English) in Iranian journals related to economic evaluation in health care programs were searched using electronic databases. In addition, the methodological quality of articles' structure was analyzed by Drummond's standard checklist. Based on the inclusion criteria, the search of databases resulted in 27 articles that fully covered economic evaluation in health care programs. A review of articles in accordance with Drummond's criteria showed that the majority of studies had flaws. The most common methodological weakness in the articles was in terms of cost calculation and valuation. Considering such methodological faults in these studies, it is anticipated that these studies would not provide an appropriate feedback to policy makers to allocate health care resources correctly and select suitable cost-effective interventions. Therefore, researchers are required to comply with the standard guidelines in order to better execute and report on economic evaluation studies.
Rezapour, Aziz; Jafari, Abdosaleh; Mirmasoudi, Kosha; Talebianpour, Hamid
2017-01-01
Health economic evaluation research plays an important role in selecting cost-effective interventions. The purpose of this study was to assess the quality of published articles in Iranian journals related to economic evaluation in health care programs based on Drummond’s checklist in terms of numbers, features, and quality. In the present review study, published articles (Persian and English) in Iranian journals related to economic evaluation in health care programs were searched using electronic databases. In addition, the methodological quality of articles’ structure was analyzed by Drummond’s standard checklist. Based on the inclusion criteria, the search of databases resulted in 27 articles that fully covered economic evaluation in health care programs. A review of articles in accordance with Drummond’s criteria showed that the majority of studies had flaws. The most common methodological weakness in the articles was in terms of cost calculation and valuation. Considering such methodological faults in these studies, it is anticipated that these studies would not provide an appropriate feedback to policy makers to allocate health care resources correctly and select suitable cost-effective interventions. Therefore, researchers are required to comply with the standard guidelines in order to better execute and report on economic evaluation studies. PMID:29234174
Teaching Methodologies in Spatial Planning for Integration of International Students
NASA Astrophysics Data System (ADS)
Virtudes, Ana; Cavaleiro, Victor
2016-10-01
Nowadays, international exchanges are increasingly common among university students across European countries. In general, during their academic degrees, higher education students look for international experiences abroad. This goal is justified not only by the wish to pursue their studies, but also by the desire to know another city, a different culture and a diverse way of teaching, while at the same time having the opportunity to improve their skills in speaking another language. Therefore, scholars in higher education have to rethink their traditional approaches to teaching methodologies in order to integrate the students who arrive from abroad every academic year. Portugal is not an exception in this matter, nor is the scientific domain of spatial planning. In fact, during the last years, the number of foreign students choosing to study in this country has been rapidly increasing. Even though, some years ago, most of the international students came from Portuguese-speaking countries, comprising former colonies such as Brazil, Angola, Cape Verde or Mozambique, the number of students from other countries, including Syria, has recently been increasing. Characterized by a mild climate, a beautiful seashore and cities packed with historical and cultural interest, this country is a very attractive destination for international students. In this sense, this study explores the beliefs about teaching methodologies that scholars in the spatial planning domain can use to guide their practice within an Architecture degree, in order to promote the integration of international students. These methodologies are based on the notion that effective teaching is student-centred rather than teacher-centred, in order to achieve a knowledge-centred learning environment framework in terms of spatial planning skills. Thus, this article arises out of a spatial planning unit experience in the Master Degree in Architecture (MIA) course at the University of Beira Interior (UBI) in Portugal, aiming to understand more about teaching methodologies that promote the integration of international students. The study explores teamwork tasks and hetero-evaluation as new approaches in teaching methodologies focused on student-centred teaching. The main conclusion of this research is the need to promote a shift from lecture-based, teacher-centred practices to a student-centred approach.
Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin
ERIC Educational Resources Information Center
Hiha, Anne Aroha
2016-01-01
Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…
Martinez-Espronceda, Miguel; Martinez, Ignacio; Serrano, Luis; Led, Santiago; Trigo, Jesús Daniel; Marzo, Asier; Escayola, Javier; Garcia, José
2011-05-01
Traditionally, e-Health solutions were located at the point of care (PoC), while the new ubiquitous user-centered paradigm draws on standard-based personal health devices (PHDs). Such devices place strict constraints on computation and battery efficiency, which encouraged the International Organization for Standardization/IEEE 11073 (X73) standard for medical devices to evolve from X73PoC to X73PHD. In this context, low-voltage low-power (LV-LP) technologies meet the restrictions of X73PHD-compliant devices. Since X73PHD does not address the software architecture, achieving an efficient design falls directly on the software developer. Therefore, the computational and battery performance of such LV-LP-constrained devices can be further improved through an efficient X73PHD implementation design. In this context, this paper proposes a new methodology to implement X73PHD on microcontroller-based platforms with LV-LP constraints. This implementation methodology has been developed through a patterns-based approach and applied to a number of X73PHD-compliant agents (including weighing scale, blood pressure monitor, and thermometer specializations) and microprocessor architectures (8, 16, and 32 bits) as a proof of concept. As a reference, the results obtained for the weighing scale guarantee all features of X73PHD running over a microcontroller architecture based on the ARM7TDMI while requiring only 168 B of RAM and 2546 B of flash memory.
Development of a Probabilistic Assessment Methodology for Evaluation of Carbon Dioxide Storage
Burruss, Robert A.; Brennan, Sean T.; Freeman, P.A.; Merrill, Matthew D.; Ruppert, Leslie F.; Becker, Mark F.; Herkelrath, William N.; Kharaka, Yousif K.; Neuzil, Christopher E.; Swanson, Sharon M.; Cook, Troy A.; Klett, Timothy R.; Nelson, Philip H.; Schenk, Christopher J.
2009-01-01
This report describes a probabilistic assessment methodology developed by the U.S. Geological Survey (USGS) for evaluation of the resource potential for storage of carbon dioxide (CO2) in the subsurface of the United States as authorized by the Energy Independence and Security Act (Public Law 110-140, 2007). The methodology is based on USGS assessment methodologies for oil and gas resources created and refined over the last 30 years. The resource that is evaluated is the volume of pore space in the subsurface in the depth range of 3,000 to 13,000 feet that can be described within a geologically defined storage assessment unit consisting of a storage formation and an enclosing seal formation. Storage assessment units are divided into physical traps (PTs), which in most cases are oil and gas reservoirs, and the surrounding saline formation (SF), which encompasses the remainder of the storage formation. The storage resource is determined separately for these two types of storage. Monte Carlo simulation methods are used to calculate a distribution of the potential storage size for individual PTs and the SF. To estimate the aggregate storage resource of all PTs, a second Monte Carlo simulation step is used to sample the size and number of PTs. The probability of successful storage for individual PTs or the entire SF, defined in this methodology by the likelihood that the amount of CO2 stored will be greater than a prescribed minimum, is based on an estimate of the probability of containment using present-day geologic knowledge. The report concludes with a brief discussion of needed research data that could be used to refine assessment methodologies for CO2 sequestration.
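A minimal sketch of the two-step Monte Carlo aggregation described above (sampling the number of physical traps, then a storage size for each and summing) is shown below; the distributions and parameters are placeholders rather than USGS assessment inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
n_trials = 10_000

# Step 1: for each trial, sample how many physical traps (PTs) qualify for storage.
n_traps = rng.integers(low=5, high=50, endpoint=True, size=n_trials)

# Step 2: for each trial, sample a storage volume per trap and sum to get the aggregate
# PT storage resource for that trial.
aggregate = np.array([
    rng.lognormal(mean=2.0, sigma=1.0, size=k).sum()  # placeholder size distribution
    for k in n_traps
])

# Report fractiles of the aggregate distribution (resource convention: F95 is the low case).
f95, f50, f5 = np.percentile(aggregate, [5, 50, 95])
print(f"F95={f95:.0f}  F50={f50:.0f}  F5={f5:.0f}  (arbitrary volume units)")
```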
Development of fuzzy air quality index using soft computing approach.
Mandal, T; Gorai, A K; Pathak, G
2012-10-01
Proper assessment of air quality status in an atmosphere based on limited observations is an essential task for meeting the goals of environmental management. A number of classification methods are available for estimating the changing status of air quality. However, a discrepancy frequently arises from the air quality criteria employed and the vagueness or fuzziness embedded in the decision-making output values. Owing to this inherent imprecision, difficulties always exist in conventional methodologies such as the air quality index when describing integrated air quality conditions with respect to various pollutant parameters and times of exposure. In recent years, fuzzy logic-based methods have been demonstrated to be appropriate for addressing uncertainty and subjectivity in environmental issues. In the present study, a methodology based on fuzzy inference systems (FIS) to assess air quality is proposed. This paper presents a comparative study assessing the status of air quality using the fuzzy logic technique and the conventional technique. The findings clearly indicate that the FIS may successfully harmonize inherent discrepancies and interpret complex conditions.
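As an illustration of the fuzzy-inference idea only (the rule base, membership breakpoints and index values below are invented, not those of the study), a single-input sketch in Python:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over the interval [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_aqi(pm10):
    # Fuzzification with hypothetical PM10 breakpoints (ug/m3); assumes pm10 lies
    # within at least one membership support.
    low = tri(pm10, -1, 0, 60)
    high = tri(pm10, 40, 120, 500)
    # Hypothetical rule base:
    #   IF pm10 is low  THEN air quality index is 25 (good)
    #   IF pm10 is high THEN air quality index is 85 (poor)
    # Defuzzification by weighted average of the rule outputs.
    weights = [low, high]
    outputs = [25.0, 85.0]
    return sum(w * o for w, o in zip(weights, outputs)) / sum(weights)

print(fuzzy_aqi(50.0))   # partially low and partially high -> intermediate index
print(fuzzy_aqi(150.0))  # firmly high -> index close to 85
```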
Rottmann, M; Mielck, A
2014-02-01
'Walkability' is mainly assessed with the NEWS questionnaire (Neighbourhood Environment Walkability Scale); in Germany, this questionnaire is still largely unknown. We now try to fill this gap by providing a systematic overview of empirical studies based on the NEWS. A systematic review was conducted of original papers reporting empirical analyses based on the NEWS. The results are summarised and presented in tables. Altogether, 31 publications could be identified. Most of them focus on associations with the variable 'physical activity', and they often report significant associations with at least some of the scales included in the NEWS. Due to methodological differences between the studies, it is difficult to compare the results. The concept of 'walkability' should also be established in the German public health discussion. A number of methodological challenges remain to be solved, such as the identification of those scales and items in the NEWS that show the strongest associations with individual health behaviours. © Georg Thieme Verlag KG Stuttgart · New York.
Vector data structure conversion at the EROS Data Center
van Roessel, Jan W.; Doescher, S.W.
1986-01-01
With the increasing prevalence of GIS systems and the processing of spatial data, conversion of data from one system to another has become a more serious problem. This report describes the approach taken to arrive at a solution at the EROS Data Center. The report consists of a main section and a number of appendices. The methodology is described in the main section, while the appendices have system specific descriptions. The overall approach is based on a central conversion hub consisting of a relational database manager and associated tools, with a standard data structure for the transfer of spatial data. This approach is the best compromise between the two goals of reducing the overall interfacing effort and producing efficient system interfaces, while the tools can be used to arrive at a progression of interface sophistication ranging from toolbench to smooth flow. The appendices provide detailed information on a number of spatial data handling systems and data structures and existing interfaces as well as interfaces developed with the described methodology.
Methodological Quality Assessment of Meta-Analyses of Hyperthyroidism Treatment.
Qin, Yahong; Yao, Liang; Shao, Feifei; Yang, Kehu; Tian, Limin
2018-01-01
Hyperthyroidism is a common condition that is associated with increased morbidity and mortality. A number of meta-analyses (MAs) have assessed the therapeutic measures for hyperthyroidism, including antithyroid drugs, surgery, and radioiodine; however, their methodological quality has not been evaluated. This study evaluated the methodological quality and summarized the evidence obtained from MAs of hyperthyroidism treatments for radioiodine, antithyroid drugs, and surgery. We searched the PubMed, EMBASE, Cochrane Library, Web of Science, and Chinese Biomedical Literature Database databases. Two investigators independently assessed the titles and abstracts of the meta-analyses for inclusion. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. A total of 26 MAs fulfilled the inclusion criteria. Based on the AMSTAR scores, the average methodological quality was 8.31, with large variability ranging from 4 to 11. The methodological quality of English meta-analyses was better than that of Chinese meta-analyses. Cochrane reviews had better methodological quality than non-Cochrane reviews due to better study selection and data extraction, the inclusion of unpublished studies, and better reporting of study characteristics. The authors did not report conflicts of interest in 53.8% of the meta-analyses, and 19.2% did not report the harmful effects of treatment. Publication bias was not assessed in 38.5% of the meta-analyses, and 19.2% did not report the follow-up time. Large-scale assessment of the methodological quality of meta-analyses of hyperthyroidism treatment highlighted methodological strengths and weaknesses. Consideration of scientific quality when formulating conclusions should be made explicit. Future meta-analyses should improve the reporting of conflicts of interest. © Georg Thieme Verlag KG Stuttgart · New York.
Bitter, Neis A; Roeg, Diana P K; van Nieuwenhuizen, Chijs; van Weeghel, Jaap
2015-07-22
There is an increasing amount of evidence for the effectiveness of rehabilitation interventions for people with severe mental illness (SMI). In the Netherlands, a rehabilitation methodology that is well known and often applied is the Comprehensive Approach to Rehabilitation (CARe) methodology. The overall goal of the CARe methodology is to improve the client's quality of life by supporting the client in realizing his/her goals and wishes, handling his/her vulnerability and improving the quality of his/her social environment. The methodology is strongly influenced by the concept of 'personal recovery' and the 'strengths case management model'. No controlled effect studies have been conducted hitherto regarding the CARe methodology. This study is a two-armed cluster randomized controlled trial (RCT) that will be executed in teams from three organizations for sheltered and supported housing, which provide services to people with long-term severe mental illness. Teams in the intervention group will receive the multiple-day CARe methodology training from a specialized institute and start working according to the CARe methodology guideline. Teams in the control group will continue working in their usual way. Standardized questionnaires will be completed at baseline (T0), and 10 (T1) and 20 months (T2) post baseline. Primary outcomes are recovery, social functioning and quality of life. The model fidelity of the CARe methodology will be assessed at T1 and T2. This study is the first controlled effect study on the CARe methodology and one of the few RCTs on a broad rehabilitation method or strength-based approach. This study is relevant because mental health care organizations have become increasingly interested in recovery and rehabilitation-oriented care. The trial registration number is ISRCTN77355880.
Cho, Kyung Hwa; Lee, Seungwon; Ham, Young Sik; Hwang, Jin Hwan; Cha, Sung Min; Park, Yongeun; Kim, Joon Ha
2009-01-01
The present study proposes a methodology for determining the effective dispersion coefficient based on field measurements performed in Gwangju (GJ) Creek in South Korea, which is environmentally degraded by artificial interferences such as weirs and culverts. Many previous works on determining the dispersion coefficient were limited in application due to the complexity of, and artificial interferences in, natural streams. Therefore, the sequential combination of the N-Tank-In-Series (NTIS) model and the Advection-Dispersion-Reaction (ADR) model is proposed in this study for evaluating the dispersion process in a complex stream channel. A series of water quality data was intensively monitored in the field to determine the effective dispersion coefficient of E. coli on a rainy day. As a result, the suggested methodology reasonably estimates the dispersion coefficient for GJ Creek as 1.25 m²/s. The sequential combined method also provided Number of tanks-Velocity-Dispersion coefficient (NVD) curves for convenient evaluation of the dispersion coefficients of other rivers or streams. Compared with previous studies, the present methodology is quite general and simple for determining effective dispersion coefficients that are applicable to other rivers and streams.
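A common way to relate a tanks-in-series fit to a longitudinal dispersion coefficient is through the equivalence of residence-time-distribution variances (1/N is approximately 2D/uL for weak dispersion), which gives D of roughly uL/(2N). The sketch below applies this textbook approximation with illustrative numbers; it is not the calibrated GJ Creek result or the paper's NVD-curve procedure.

```python
def dispersion_from_tanks(n_tanks, velocity_m_s, reach_length_m):
    """Approximate longitudinal dispersion coefficient from a tanks-in-series fit.

    Uses the weak-dispersion equivalence of residence-time-distribution variances:
    1/N ~= 2*D/(u*L)  =>  D ~= u*L/(2*N).
    """
    return velocity_m_s * reach_length_m / (2.0 * n_tanks)

# Illustrative numbers only (not the GJ Creek calibration):
print(dispersion_from_tanks(n_tanks=20, velocity_m_s=0.25, reach_length_m=2000.0))
# -> 12.5 m^2/s for this hypothetical reach
```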
Delamination onset in polymeric composite laminates under thermal and mechanical loads
NASA Technical Reports Server (NTRS)
Martin, Roderick H.
1991-01-01
A fracture mechanics damage methodology to predict edge delamination is described. The methodology accounts for residual thermal stresses, cyclic thermal stresses, and cyclic mechanical stresses. The modeling is based on classical lamination theory and a sublaminate theory. The prediction methodology determines the strain energy release rate, G, at the edge of a laminate and compares it with the fatigue and fracture toughness of the composite. To verify the methodology, isothermal static tests at 23, 125, and 175 C and tension-tension fatigue tests at 23 and 175 C were conducted on laminates. The material system used was a carbon/bismaleimide, IM7/5260. Two quasi-isotropic layups were used. Also, 24-ply unidirectional double cantilever beam specimens were tested to determine the fatigue and fracture toughness of the composite at different temperatures. Raising the temperature had the effect of increasing the value of G at the edge for these layups and of lowering the fatigue and fracture toughness of the composite. The static stress to edge delamination was not affected by temperature, but the number of cycles to edge delamination decreased.
NASA Technical Reports Server (NTRS)
Kimmel, William M. (Technical Monitor); Bradley, Kevin R.
2004-01-01
This paper describes the development of a methodology for sizing Blended-Wing-Body (BWB) transports and how the capabilities of the Flight Optimization System (FLOPS) have been expanded using that methodology. In this approach, BWB transports are sized based on the number of passengers in each class that must fit inside the centerbody or pressurized vessel. Weight estimation equations for this centerbody structure were developed using Finite Element Analysis (FEA). This paper shows how the sizing methodology has been incorporated into FLOPS to enable the design and analysis of BWB transports. Previous versions of FLOPS did not have the ability to accurately represent or analyze BWB configurations in any reliable, logical way. The expanded capabilities allow the design and analysis of a 200 to 450-passenger BWB transport or the analysis of a BWB transport for which the geometry is already known. The modifications to FLOPS resulted in differences of less than 4 percent for the ramp weight of a BWB transport in this range when compared to previous studies performed by NASA and Boeing.
Security Events and Vulnerability Data for Cybersecurity Risk Estimation.
Allodi, Luca; Massacci, Fabio
2017-08-01
Current industry standards for estimating cybersecurity risk are based on qualitative risk matrices as opposed to quantitative risk estimates. In contrast, risk assessment in most other industry sectors aims at deriving quantitative risk estimations (e.g., Basel II in Finance). This article presents a model and methodology to leverage the large amount of data available from the IT infrastructure of an organization's security operation center to quantitatively estimate the probability of attack. Our methodology specifically addresses untargeted attacks delivered by automatic tools that make up the vast majority of attacks in the wild against users and organizations. We consider two-stage attacks whereby the attacker first breaches an Internet-facing system, and then escalates the attack to internal systems by exploiting local vulnerabilities in the target. Our methodology factors in the power of the attacker as the number of "weaponized" vulnerabilities he/she can exploit, and can be adjusted to match the risk appetite of the organization. We illustrate our methodology by using data from a large financial institution, and discuss the significant mismatch between traditional qualitative risk assessments and our quantitative approach. © 2017 Society for Risk Analysis.
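A toy illustration of the two-stage structure (the attacker needs at least one successful exploit against the Internet-facing system, then at least one successful local exploit internally) is sketched below with invented per-vulnerability success probabilities; it is not the authors' estimation model.

```python
import numpy as np

def stage_breach_probability(exploit_probs):
    """P(at least one of the attacker's weaponized exploits succeeds at this stage)."""
    exploit_probs = np.asarray(exploit_probs)
    return 1.0 - np.prod(1.0 - exploit_probs)

# Hypothetical per-vulnerability success probabilities for each stage.
perimeter = [0.02, 0.05, 0.01]   # weaponized exploits usable against the Internet-facing system
internal = [0.10, 0.03]          # local exploits usable after the first breach

p_two_stage = stage_breach_probability(perimeter) * stage_breach_probability(internal)
print(f"two-stage compromise probability ~ {p_two_stage:.4f}")
```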
A biased review of biases in Twitter studies on political collective action
NASA Astrophysics Data System (ADS)
Cihon, Peter; Yasseri, Taha
2016-08-01
In recent years researchers have gravitated to Twitter and other social media platforms as fertile ground for empirical analysis of social phenomena. Social media provides researchers access to trace data of interactions and discourse that once went unrecorded in the offline world. Researchers have sought to use these data to explain social phenomena both particular to social media and applicable to the broader social world. This paper offers a minireview of Twitter-based research on political crowd behaviour. This literature offers insight into particular social phenomena on Twitter, but often fails to use standardized methods that permit interpretation beyond individual studies. Moreover, the literature fails to ground methodologies and results in social or political theory, divorcing empirical research from the theory needed to interpret it. Rather, investigations focus primarily on methodological innovations for social media analyses, but these too often fail to sufficiently demonstrate the validity of such methodologies. This minireview considers a small number of selected papers; we analyse their (often lack of) theoretical approaches, review their methodological innovations, and offer suggestions as to the relevance of their results for political scientists and sociologists.
Cuesta, D; Varela, M; Miró, P; Galdós, P; Abásolo, D; Hornero, R; Aboy, M
2007-07-01
Body temperature is a classical diagnostic tool for a number of diseases. However, it is usually employed as a plain binary classification function (febrile or not febrile), and therefore its diagnostic power has not been fully developed. In this paper, we describe how body temperature regularity can be used for diagnosis. Our proposed methodology is based on obtaining accurate long-term temperature recordings at high sampling frequencies and analyzing the temperature signal using a regularity metric (approximate entropy). In this study, we assessed our methodology using temperature recordings acquired from patients with multiple organ failure admitted to an intensive care unit. Our results indicate there is a correlation between the patient's condition and the regularity of the body temperature. This finding enabled us to design a classifier for two outcomes (survival or death) and test it on a dataset including 36 subjects. The classifier achieved an accuracy of 72%.
Gunawardena, Harsha P; O'Brien, Jonathon; Wrobel, John A; Xie, Ling; Davies, Sherri R; Li, Shunqiang; Ellis, Matthew J; Qaqish, Bahjat F; Chen, Xian
2016-02-01
Single quantitative platforms such as label-based or label-free quantitation (LFQ) present compromises in accuracy, precision, protein sequence coverage, and speed of quantifiable proteomic measurements. To maximize the quantitative precision and the number of quantifiable proteins or the quantifiable coverage of tissue proteomes, we have developed a unified approach, termed QuantFusion, that combines the quantitative ratios of all peptides measured by both LFQ and label-based methodologies. Here, we demonstrate the use of QuantFusion in determining the proteins differentially expressed in a pair of patient-derived tumor xenografts (PDXs) representing two major breast cancer (BC) subtypes, basal and luminal. Label-based in-spectra quantitative peptides derived from amino acid-coded tagging (AACT, also known as SILAC) of a non-malignant mammary cell line were uniformly added to each xenograft with a constant predefined ratio, from which Ratio-of-Ratio estimates were obtained for the label-free peptides paired with AACT peptides in each PDX tumor. A mixed model statistical analysis was used to determine global differential protein expression by combining complementary quantifiable peptide ratios measured by LFQ and Ratio-of-Ratios, respectively. With the minimum number of replicates required to obtain statistically significant ratios, QuantFusion uses distinct mechanisms to "rescue" the missing data inherent to both LFQ and label-based quantitation. Combined quantifiable peptide data from both quantitative schemes increased the overall number of peptide-level measurements and protein-level estimates. In our analysis of the PDX tumor proteomes, QuantFusion increased the number of distinct peptide ratios by 65%, representing differentially expressed proteins between the BC subtypes. This quantifiable coverage improvement, in turn, not only increased the number of measurable protein fold-changes by 8% but also increased the average precision of quantitative estimates by 181%, so that some BC subtype-specifically expressed proteins were rescued by QuantFusion. Thus, incorporating data from multiple quantitative approaches while accounting for measurement variability at both the peptide and global protein levels makes QuantFusion unique for obtaining increased coverage and quantitative precision for tissue proteomes. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-09
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research To Support the National... Redesign Research (NCVS-RR) program: Methodological Research to Support the National Crime Victimization...
Predictions of first passage times in sparse discrete fracture networks using graph-based reductions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hyman, Jeffrey De'Haven; Hagberg, Aric Arild; Mohd-Yusof, Jamaludin
Here, we present a graph-based methodology to reduce the computational cost of obtaining first passage times through sparse fracture networks. We also derive graph representations of generic three-dimensional discrete fracture networks (DFNs) using the DFN topology and flow boundary conditions. Subgraphs corresponding to the union of the k shortest paths between the inflow and outflow boundaries are identified and transport on their equivalent subnetworks is compared to transport through the full network. The number of paths included in the subgraphs is based on the scaling behavior of the number of edges in the graph with the number of shortest paths. First passage times through the subnetworks are in good agreement with those obtained in the full network, both for individual realizations and in distribution. We obtain accurate estimates of first passage times with an order of magnitude reduction of CPU time and mesh size using the proposed method.
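A minimal sketch of the graph reduction described here, keeping only the union of the k shortest inflow-to-outflow paths, is given below using networkx; the small weighted graph is a hypothetical stand-in for a DFN-derived graph, not one produced by the authors' workflow.

```python
from itertools import islice
import networkx as nx

def k_shortest_path_subgraph(G, source, target, k, weight="weight"):
    """Union of the k shortest simple paths between source and target."""
    paths = islice(nx.shortest_simple_paths(G, source, target, weight=weight), k)
    H = nx.Graph()
    for path in paths:
        nx.add_path(H, path)
    return H

# Hypothetical weighted graph standing in for a fracture-network representation.
G = nx.Graph()
G.add_weighted_edges_from([
    ("in", "a", 1.0), ("a", "b", 1.0), ("b", "out", 1.0),
    ("in", "c", 2.0), ("c", "out", 2.0),
    ("a", "c", 0.5), ("in", "d", 5.0), ("d", "out", 5.0),
])
H = k_shortest_path_subgraph(G, "in", "out", k=3)
print(sorted(H.edges()))  # transport would then be simulated on this reduced network
```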
Predictions of first passage times in sparse discrete fracture networks using graph-based reductions
Hyman, Jeffrey De'Haven; Hagberg, Aric Arild; Mohd-Yusof, Jamaludin; ...
2017-07-10
Here, we present a graph-based methodology to reduce the computational cost of obtaining first passage times through sparse fracture networks. We also derive graph representations of generic three-dimensional discrete fracture networks (DFNs) using the DFN topology and flow boundary conditions. Subgraphs corresponding to the union of the k shortest paths between the inflow and outflow boundaries are identified and transport on their equivalent subnetworks is compared to transport through the full network. The number of paths included in the subgraphs is based on the scaling behavior of the number of edges in the graph with the number of shortest paths. First passage times through the subnetworks are in good agreement with those obtained in the full network, both for individual realizations and in distribution. We obtain accurate estimates of first passage times with an order of magnitude reduction of CPU time and mesh size using the proposed method.
NASA Astrophysics Data System (ADS)
Bag, S.; de, A.
2010-09-01
The transport phenomena-based heat transfer and fluid flow calculations in a weld pool require a number of input parameters. Arc efficiency, effective thermal conductivity, and viscosity in the weld pool are some of these parameters, whose values are rarely known and difficult to assign a priori based on scientific principles alone. The present work reports a bi-directional three-dimensional (3-D) heat transfer and fluid flow model that is integrated with a real-number-based genetic algorithm. The bi-directional feature of the integrated model allows the identification of the values of a required set of uncertain model input parameters and, next, the design of process parameters to achieve a target weld pool dimension. The computed values are validated against measured results in linear gas-tungsten-arc (GTA) weld samples. Furthermore, a novel methodology to estimate the overall reliability of the computed solutions is also presented.
Predictions of CD4 lymphocytes’ count in HIV patients from complete blood count
2013-01-01
Background HIV diagnosis, prognosis and treatment require the T CD4 lymphocyte count from flow cytometry, an expensive technique often not available to people in developing countries. The aim of this work is to apply a previously developed methodology that predicts the T CD4 lymphocyte value based on the total white blood cell (WBC) count and lymphocyte count, applying sets theory to information taken from the Complete Blood Count (CBC). Methods Sets theory was used to classify into groups named A, B, C and D the number of leucocytes/mm³, lymphocytes/mm³, and the CD4/μL subpopulation per flow cytometry of 800 HIV-diagnosed patients. The unions between sets A and C, and B and D were assessed, and the intersection between both unions was described in order to establish the belonging percentage to these sets. Results were classified into eight ranges of 1000 leucocytes/mm³, calculating the belonging percentage of each range with respect to the whole sample. Results The intersection (A ∪ C) ∩ (B ∪ D) showed an effectiveness in the prediction of 81.44% for the range between 4000 and 4999 leukocytes, 91.89% for the range between 3000 and 3999, and 100% for the range below 3000. Conclusions The usefulness and clinical applicability of a methodology based on sets theory were confirmed for predicting the T CD4 lymphocyte value, beginning with the WBC and lymphocyte counts from the CBC. This methodology is new, objective, and has lower costs than flow cytometry, which is currently considered the Gold Standard. PMID:24034560
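Because the abstract does not state how sets A-D are defined, the sketch below only illustrates the union/intersection membership logic with hypothetical count thresholds; it is not the published classification rule.

```python
def in_intersection(wbc_per_mm3, lymphocytes_per_mm3):
    """Illustrative membership test for (A ∪ C) ∩ (B ∪ D) with hypothetical thresholds."""
    A = wbc_per_mm3 < 4000            # hypothetical: "low leukocyte count"
    B = wbc_per_mm3 >= 4000           # hypothetical complement on leukocytes
    C = lymphocytes_per_mm3 < 1200    # hypothetical: "low lymphocyte count"
    D = lymphocytes_per_mm3 >= 1200   # hypothetical complement on lymphocytes
    return (A or C) and (B or D)

print(in_intersection(3500, 1500))  # True: belongs to A and to D, so to both unions
print(in_intersection(6000, 1800))  # False: belongs only to B and D, outside A ∪ C
```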
Estimating Bias Error Distributions
NASA Technical Reports Server (NTRS)
Liu, Tian-Shu; Finley, Tom D.
2001-01-01
This paper formulates a general methodology for estimating the bias error distribution of a device in a measuring domain from less accurate measurements when a minimal number of standard values (typically two values) are available. A new perspective is that the bias error distribution can be found as a solution of an intrinsic functional equation in a domain. Based on this theory, scaling- and translation-based methods for determining the bias error distribution are developed. These methods are virtually applicable to any device as long as the bias error distribution of the device can be sufficiently described by a power series (a polynomial) or a Fourier series in a domain. These methods have been validated through computational simulations and laboratory calibration experiments for a number of different devices.
ERIC Educational Resources Information Center
Dawe, Gerald F. M.; Vetter, Arnie; Martin, Stephen
2004-01-01
A sustainability audit of Holme Lacy College is described. The approach adopted a "triple bottom line" assessment, comprising a number of key steps: a scoping review utilising a revised Royal Institution of Chartered Surveyors project appraisal tool; an environmental impact assessment based on ecological footprinting and a social and…
Erin L. Landguth; Bradley C. Fedy; Sara J. Oyler-McCance; Andrew L. Garey; Sarah L. Emel; Matthew Mumma; Helene H. Wagner; Marie-Josee Fortin; Samuel A. Cushman
2012-01-01
The influence of study design on the ability to detect the effects of landscape pattern on gene flow is one of the most pressing methodological gaps in landscape genetic research. To investigate the effect of study design on landscape genetics inference, we used a spatially-explicit, individual-based program to simulate gene flow in a spatially continuous population...
ERIC Educational Resources Information Center
Allan, Alexandra; Tinkler, Penny
2015-01-01
A small number of attempts have been made to take stock of the field of gender and education, though very few have taken methodology as their explicit focus. We seek to stimulate such discussion in this article by taking stock of the use of visual methods in gender and education research (particularly participatory and image-based methods). We…
Abell, Bridget; Glasziou, Paul; Hoffmann, Tammy
2017-06-13
Clinicians are encouraged to use guidelines to assist in providing evidence-based secondary prevention to patients with coronary heart disease. However, the expanding number of publications providing guidance about exercise training may confuse cardiac rehabilitation clinicians. We therefore sought to explore the number, scope, publication characteristics, methodological quality, and clinical usefulness of published exercise-based cardiac rehabilitation guidance. We included publications recommending physical activity, exercise or cardiac rehabilitation for patients with coronary heart disease. These included systematically developed clinical practice guidelines, as well as other publications intended to support clinician decision making, such as position papers or consensus statements. Publications were obtained via electronic searches of preventive cardiology societies, guideline databases and PubMed, to November 2016. Publication characteristics were extracted, and two independent assessors evaluated quality using the 23-item Appraisal of Guidelines Research and Evaluation II (AGREE) tool. Fifty-four international publications from 1994 to 2016 were identified. Most were found on preventive cardiology association websites (n = 35; 65%) and were freely accessible (n = 50; 93%). Thirty (56%) publications contained only broad recommendations for physical activity and cardiac rehabilitation referral, while 24 (44%) contained the necessary detailed exercise training recommendations. Many were labelled as "guidelines", however publications with other titles (e.g. scientific statements) were common (n = 24; 44%). This latter group of publications contained a significantly greater proportion of detailed exercise training recommendations than clinical guidelines (p = 0.017). Wide variation in quality also existed, with 'applicability' the worst scoring AGREE II domain for clinical guidelines (mean score 53%) and 'rigour of development' rated lowest for other guidance types (mean score 33%). While a large number of guidance documents provide recommendations for exercise-based cardiac rehabilitation, most have limitations in either methodological quality or clinical usefulness. The lack of rigorously developed guidelines which also contain necessary detail about exercise training remains a substantial problem for clinicians.
NASA Astrophysics Data System (ADS)
Solvik, K.; Macedo, M.; Graesser, J.; Lathuilliere, M. J.
2017-12-01
Large-scale agriculture and cattle ranching in Brazil has driven the creation of tens of thousands of small stream impoundments to provide water for crops and livestock. These impoundments are a source of methane emissions and have significant impacts on stream temperature, connectivity, and water use over a large region. Due to their large numbers and small size, they are difficult to map using conventional methods. Here, we present a two-stage object-based supervised classification methodology for identifying man-made impoundments in Brazil. First, in Google Earth Engine, pixels are classified as water or non-water using satellite data and HydroSHEDS products as predictors. Second, using Python's scikit-learn and scikit-image modules, the water objects are classified as man-made or natural based on a variety of shape and spectral properties. Both classifications are performed by a random forest classifier. Training data are acquired by visually identifying impoundments and natural water bodies using high-resolution satellite imagery from Google Earth. This methodology was applied to the state of Mato Grosso using a cloud-free mosaic of Sentinel-1 (10 m resolution) radar and Sentinel-2 (10-20 m) multispectral data acquired during the 2016 dry season. Independent test accuracy was estimated at 95% for the first stage and 93% for the second. We identified 54,294 man-made impoundments in Mato Grosso in 2016. The methodology is generalizable to other high-resolution satellite data and has been tested on Landsat 5 and 8 imagery. Applying the same approach to Landsat 8 images (30 m), we identified 35,707 impoundments in the 2015 dry season. The difference in number is likely because the coarser-scale imagery fails to detect small (<900 m²) objects. Ongoing work will apply this approach to satellite time series for the entire Amazon-Cerrado frontier, allowing us to track changes in the number, size, and distribution of man-made impoundments. Automated impoundment mapping over large areas may help with management of streams in agricultural landscapes in Brazil and other tropical regions.
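The second stage can be sketched as follows: label connected water pixels, extract shape descriptors with scikit-image, and classify each object with a scikit-learn random forest. The feature set, training data, and mask below are illustrative placeholders, not the study's exact configuration.

```python
import numpy as np
from skimage.measure import label, regionprops
from sklearn.ensemble import RandomForestClassifier

def object_features(water_mask):
    """Shape descriptors for each connected water body in a binary mask."""
    feats = []
    for region in regionprops(label(water_mask)):
        feats.append([
            region.area,
            region.eccentricity,
            region.solidity,
            region.perimeter / max(region.area, 1),  # crude compactness measure
        ])
    return np.array(feats)

# Hypothetical training set: feature rows and labels (1 = man-made, 0 = natural)
# that would normally come from visually interpreted water bodies.
rng = np.random.default_rng(0)
X_train = rng.random((40, 4))
y_train = rng.integers(0, 2, 40)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Classify the objects found in a new scene's water mask (placeholder mask here).
water_mask = np.zeros((50, 50), dtype=bool)
water_mask[10:14, 10:20] = True        # one small rectangular water body
predictions = clf.predict(object_features(water_mask))
print(predictions)                      # 1 = man-made impoundment, 0 = natural water body
```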
Warren, Megan R; Sangiamo, Daniel T; Neunuebel, Joshua P
2018-03-01
An integral component in the assessment of vocal behavior in groups of freely interacting animals is the ability to determine which animal is producing each vocal signal. This process is facilitated by using microphone arrays with multiple channels. Here, we made important refinements to a state-of-the-art microphone array based system used to localize vocal signals produced by freely interacting laboratory mice. Key changes to the system included increasing the number of microphones as well as refining the methodology for localizing and assigning vocal signals to individual mice. We systematically demonstrate that the improvements in the methodology for localizing mouse vocal signals led to an increase in the number of signals detected as well as the number of signals accurately assigned to an animal. These changes facilitated the acquisition of larger and more comprehensive data sets that better represent the vocal activity within an experiment. Furthermore, this system will allow more thorough analyses of the role that vocal signals play in social communication. We expect that such advances will broaden our understanding of social communication deficits in mouse models of neurological disorders. Copyright © 2018 Elsevier B.V. All rights reserved.
Gutierrez-Navarro, Omar; Campos-Delgado, Daniel U; Arce-Santana, Edgar R; Maitland, Kristen C; Cheng, Shuna; Jabbour, Joey; Malik, Bilal; Cuenca, Rodrigo; Jo, Javier A
2014-05-19
Multispectral fluorescence lifetime imaging (m-FLIM) can potentially allow identifying the endogenous fluorophores present in biological tissue. Quantitative description of such data requires estimating the number of components in the sample, their characteristic fluorescent decays, and their relative contributions or abundances. Unfortunately, this inverse problem usually requires prior knowledge about the data, which is seldom available in biomedical applications. This work presents a new methodology to estimate the number of potential endogenous fluorophores present in biological tissue samples from time-domain m-FLIM data. Furthermore, a completely blind linear unmixing algorithm is proposed. The method was validated using both synthetic and experimental m-FLIM data. The experimental m-FLIM data include in-vivo measurements from healthy and cancerous hamster cheek-pouch epithelial tissue, and ex-vivo measurements from human coronary atherosclerotic plaques. The analysis of m-FLIM data from in-vivo hamster oral mucosa distinguished healthy tissue from precancerous lesions based on the relative concentration of their characteristic fluorophores. The algorithm also provided a better description of atherosclerotic plaques in terms of their endogenous fluorophores. These results demonstrate the potential of this methodology to provide a quantitative description of tissue biochemical composition.
Mudalige, Thilak K; Qu, Haiou; Linder, Sean W
2015-11-13
Engineered nanoparticles are available in large numbers of commercial products claiming various health benefits. Nanoparticle absorption, distribution, metabolism, excretion, and toxicity in a biological system are dependent on particle size, thus the determination of size and size distribution is essential for full characterization. Number-based average size and size distribution are major parameters for full characterization of the nanoparticle. In the case of polydispersed samples, large numbers of particles are needed to obtain accurate size distribution data. Herein, we report a rapid methodology, demonstrating improved nanoparticle recovery and excellent size resolution, for the characterization of gold nanoparticles in dietary supplements using asymmetric flow field flow fractionation coupled with visible absorption spectrometry and inductively coupled plasma mass spectrometry. A linear relationship between gold nanoparticle size and retention time was observed and used for characterization of unknown samples. The particle size results from unknown samples were compared to results from traditional size analysis by transmission electron microscopy, and found to have less than a 5% deviation in size for unknown products over the size range from 7 to 30 nm. Published by Elsevier B.V.
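Since a linear relationship between particle size and retention time is used to size unknowns, a minimal calibration sketch can make the arithmetic concrete. The standard diameters and retention times below are invented for illustration; only the fit-and-invert logic reflects the described approach.

```python
# Hypothetical AF4 size calibration: fit size vs. retention time for standards,
# then size an unknown from its retention time (values are illustrative only).
import numpy as np

std_diameter_nm = np.array([7.0, 10.0, 15.0, 20.0, 30.0])    # assumed standards
std_retention_min = np.array([4.1, 5.0, 6.6, 8.2, 11.3])      # assumed retention times

slope, intercept = np.polyfit(std_retention_min, std_diameter_nm, 1)

def size_from_retention(t_min):
    return slope * t_min + intercept

print(f"Unknown eluting at 7.5 min is approximately {size_from_retention(7.5):.1f} nm")
```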
Lean management systems: creating a culture of continuous quality improvement.
Clark, David M; Silvester, Kate; Knowles, Simon
2013-08-01
This is the first in a series of articles describing the application of Lean management systems to Laboratory Medicine. Lean is the term used to describe a principle-based continuous quality improvement (CQI) management system based on the Toyota production system (TPS) that has been evolving for over 70 years. Its origins go back much further and are heavily influenced by the work of W Edwards Deming and the scientific method that forms the basis of most quality management systems. Lean has two fundamental elements--a systematic approach to process improvement by removing waste in order to maximise value for the end-user of the service and a commitment to respect, challenge and develop the people who work within the service to create a culture of continuous improvement. Lean principles have been applied to a growing number of Healthcare systems throughout the world to improve the quality and cost-effectiveness of services for patients and a number of laboratories from all the pathology disciplines have used Lean to shorten turnaround times, improve quality (reduce errors) and improve productivity. Increasingly, models used to plan and implement large scale change in healthcare systems, including the National Health Service (NHS) change model, have evidence-based improvement methodologies (such as Lean CQI) as a core component. Consequently, a working knowledge of improvement methodology will be a core skill for Pathologists involved in leadership and management.
Review of health information technology usability study methodologies
Bakken, Suzanne
2011-01-01
Usability factors are a major obstacle to health information technology (IT) adoption. The purpose of this paper is to review and categorize health IT usability study methods and to provide practical guidance on health IT usability evaluation. 2025 references were initially retrieved from the Medline database from 2003 to 2009 that evaluated health IT used by clinicians. Titles and abstracts were first reviewed for inclusion. Full-text articles were then examined to identify final eligibility studies. 629 studies were categorized into the five stages of an integrated usability specification and evaluation framework that was based on a usability model and the system development life cycle (SDLC)-associated stages of evaluation. Theoretical and methodological aspects of 319 studies were extracted in greater detail and studies that focused on system validation (SDLC stage 2) were not assessed further. The number of studies by stage was: stage 1, task-based or user–task interaction, n=42; stage 2, system–task interaction, n=310; stage 3, user–task–system interaction, n=69; stage 4, user–task–system–environment interaction, n=54; and stage 5, user–task–system–environment interaction in routine use, n=199. The studies applied a variety of quantitative and qualitative approaches. Methodological issues included lack of theoretical framework/model, lack of details regarding qualitative study approaches, single evaluation focus, environmental factors not evaluated in the early stages, and guideline adherence as the primary outcome for decision support system evaluations. Based on the findings, a three-level stratified view of health IT usability evaluation is proposed and methodological guidance is offered based upon the type of interaction that is of primary interest in the evaluation. PMID:21828224
Investigation of Low-Reynolds-Number Rocket Nozzle Design Using PNS-Based Optimization Procedure
NASA Technical Reports Server (NTRS)
Hussaini, M. Moin; Korte, John J.
1996-01-01
An optimization approach to rocket nozzle design, based on computational fluid dynamics (CFD) methodology, is investigated for low-Reynolds-number cases. This study is undertaken to determine the benefits of this approach over those of classical design processes such as Rao's method. A CFD-based optimization procedure, using the parabolized Navier-Stokes (PNS) equations, is used to design conical and contoured axisymmetric nozzles. The advantage of this procedure is that it accounts for viscosity during the design process; other processes make an approximated boundary-layer correction after an inviscid design is created. Results showed significant improvement in the nozzle thrust coefficient over that of the baseline case; however, the unusual nozzle design necessitates further investigation of the accuracy of the PNS equations for modeling expanding flows with thick laminar boundary layers.
Multi-membership gene regulation in pathway based microarray analysis
2011-01-01
Background Gene expression analysis has been intensively researched for more than a decade. Recently, there has been elevated interest in the integration of microarray data analysis with other types of biological knowledge in a holistic analytical approach. We propose a methodology that can be facilitated for pathway based microarray data analysis, based on the observation that a substantial proportion of genes present in biochemical pathway databases are members of a number of distinct pathways. Our methodology aims towards establishing the state of individual pathways, by identifying those truly affected by the experimental conditions based on the behaviour of such genes. For that purpose it considers all the pathways in which a gene participates and the general census of gene expression per pathway. Results We utilise hill climbing, simulated annealing and a genetic algorithm to analyse the consistency of the produced results, through the application of fuzzy adjusted rand indexes and hamming distance. All algorithms produce highly consistent genes to pathways allocations, revealing the contribution of genes to pathway functionality, in agreement with current pathway state visualisation techniques, with the simulated annealing search proving slightly superior in terms of efficiency. Conclusions We show that the expression values of genes, which are members of a number of biochemical pathways or modules, are the net effect of the contribution of each gene to these biochemical processes. We show that by manipulating the pathway and module contribution of such genes to follow underlying trends we can interpret microarray results centred on the behaviour of these genes. PMID:21939531
Multi-membership gene regulation in pathway based microarray analysis.
Pavlidis, Stelios P; Payne, Annette M; Swift, Stephen M
2011-09-22
Gene expression analysis has been intensively researched for more than a decade. Recently, there has been elevated interest in the integration of microarray data analysis with other types of biological knowledge in a holistic analytical approach. We propose a methodology that can be facilitated for pathway based microarray data analysis, based on the observation that a substantial proportion of genes present in biochemical pathway databases are members of a number of distinct pathways. Our methodology aims towards establishing the state of individual pathways, by identifying those truly affected by the experimental conditions based on the behaviour of such genes. For that purpose it considers all the pathways in which a gene participates and the general census of gene expression per pathway. We utilise hill climbing, simulated annealing and a genetic algorithm to analyse the consistency of the produced results, through the application of fuzzy adjusted rand indexes and hamming distance. All algorithms produce highly consistent genes to pathways allocations, revealing the contribution of genes to pathway functionality, in agreement with current pathway state visualisation techniques, with the simulated annealing search proving slightly superior in terms of efficiency. We show that the expression values of genes, which are members of a number of biochemical pathways or modules, are the net effect of the contribution of each gene to these biochemical processes. We show that by manipulating the pathway and module contribution of such genes to follow underlying trends we can interpret microarray results centred on the behaviour of these genes.
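As a concrete illustration of the heuristic search component described in this abstract, the sketch below runs a toy simulated annealing allocation of multi-membership genes to pathways. The expression values, memberships, and the within-pathway consistency objective are invented for illustration and are not the data or objective used by the authors.

```python
# Toy simulated-annealing allocation of multi-membership genes to pathways.
# Objective (assumed, not the paper's): make expression within each pathway's
# allocated genes as internally consistent (low spread) as possible.
import math, random
random.seed(0)

expression = {"g1": 2.1, "g2": 1.9, "g3": -1.5, "g4": -1.7, "g5": 0.1}
membership = {"g1": ["P1", "P2"], "g2": ["P1"], "g3": ["P2", "P3"],
              "g4": ["P3"], "g5": ["P1", "P3"]}

def cost(assign):
    total = 0.0
    for pw in {"P1", "P2", "P3"}:
        vals = [expression[g] for g, p in assign.items() if p == pw]
        if len(vals) > 1:
            mean = sum(vals) / len(vals)
            total += sum((v - mean) ** 2 for v in vals)
    return total

assign = {g: opts[0] for g, opts in membership.items()}   # initial allocation
best, best_cost, T = dict(assign), cost(assign), 1.0
for step in range(2000):
    g = random.choice(list(membership))
    candidate = dict(assign)
    candidate[g] = random.choice(membership[g])
    delta = cost(candidate) - cost(assign)
    if delta < 0 or random.random() < math.exp(-delta / T):
        assign = candidate
        if cost(assign) < best_cost:
            best, best_cost = dict(assign), cost(assign)
    T *= 0.995                                            # geometric cooling

print(best, round(best_cost, 3))
```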
Reiman, Anne; Pandey, Sarojini; Lloyd, Kate L; Dyer, Nigel; Khan, Mike; Crockard, Martin; Latten, Mark J; Watson, Tracey L; Cree, Ian A; Grammatopoulos, Dimitris K
2016-11-01
Background Detection of disease-associated mutations in patients with familial hypercholesterolaemia is crucial for early interventions to reduce the risk of cardiovascular disease. Screening for these mutations represents a methodological challenge, since more than 1200 different causal mutations in the low-density lipoprotein receptor have been identified. A number of methodological approaches have been developed for screening by clinical diagnostic laboratories. Methods Using primers targeting the low-density lipoprotein receptor, apolipoprotein B, and proprotein convertase subtilisin/kexin type 9, we developed a novel Ion Torrent-based targeted re-sequencing method. We validated this in a small West Midlands (UK) cohort of 58 patients screened in parallel with other mutation-targeting methods, such as multiplex polymerase chain reaction (Elucigene FH20), oligonucleotide arrays (Randox familial hypercholesterolaemia array) or the Illumina next-generation sequencing platform. Results In this small cohort, the next-generation sequencing method achieved excellent analytical performance characteristics and showed 100% and 89% concordance with the Randox array and the Elucigene FH20 assay, respectively. Investigation of the discrepant results identified two cases of mutation misclassification by the Elucigene FH20 multiplex polymerase chain reaction assay. A number of novel mutations not previously reported were also identified by the next-generation sequencing method. Conclusions Ion Torrent-based next-generation sequencing can deliver a suitable alternative for the molecular investigation of familial hypercholesterolaemia patients, especially when comprehensive mutation screening for rare or unknown mutations is required.
Donnelly, Aoife; Naughton, Owen; Misstear, Bruce; Broderick, Brian
2016-10-14
This article describes a new methodology for increasing the spatial representativeness of individual monitoring sites. Air pollution levels at a given point are influenced by emission sources in the immediate vicinity. Since emission sources are rarely uniformly distributed around a site, concentration levels will inevitably be most affected by the sources in the prevailing upwind direction. The methodology provides a means of capturing this effect and providing additional information regarding source/pollution relationships. The methodology allows for the division of the air quality data from a given monitoring site into a number of sectors or wedges based on wind direction and estimation of annual mean values for each sector, thus optimising the information that can be obtained from a single monitoring station. The method corrects for short-term data, diurnal and seasonal variations in concentrations (which can produce uneven weighting of data within each sector) and uneven frequency of wind directions. Significant improvements in correlations between the air quality data and the spatial air quality indicators were obtained after application of the correction factors. This suggests the application of these techniques would be of significant benefit in land-use regression modelling studies. Furthermore, the method was found to be very useful for estimating long-term mean values and wind direction sector values using only short-term monitoring data. The methods presented in this article can result in cost savings through minimising the number of monitoring sites required for air quality studies while also capturing a greater degree of variability in spatial characteristics. In this way, more reliable, but also more expensive monitoring techniques can be used in preference to a higher number of low-cost but less reliable techniques. The methods described in this article have applications in local air quality management, source receptor analysis, land-use regression mapping and modelling and population exposure studies.
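The sector idea lends itself to a small numerical sketch. The example below bins hourly concentrations by wind-direction sector and reports sector means plus a frequency-weighted long-term mean; this simple frequency weighting is only a stand-in for the fuller diurnal, seasonal and short-term-data corrections described in the article, and all values are simulated.

```python
# Simplified wind-sector analysis (illustrative only): split hourly concentrations
# into wind-direction sectors and report sector means plus a frequency-weighted
# long-term mean, so that rarely occurring wind directions are not over-weighted.
import numpy as np

rng = np.random.default_rng(1)
wind_dir = rng.uniform(0, 360, 2000)                 # degrees
conc = 20 + 10 * np.cos(np.radians(wind_dir - 45)) + rng.normal(0, 3, 2000)

n_sectors = 8
sector = (wind_dir // (360 / n_sectors)).astype(int)

sector_means = np.array([conc[sector == s].mean() for s in range(n_sectors)])
sector_freq = np.array([(sector == s).mean() for s in range(n_sectors)])

print("sector means:", np.round(sector_means, 1))
# Long-term mean estimated from sector means weighted by observed wind frequency:
print("weighted long-term mean estimate:", round(float(sector_means @ sector_freq), 1))
```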
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-16
... DEPARTMENT OF JUSTICE Office of Justice Programs [OMB Number 1121-NEW] Agency Information Collection Activities; Proposed Collection; Comment Request: Methodological Research to Support the National Crime Victimization Survey: Self-Report Data on Rape and Sexual...
Indications and Warning Methodology for Strategic Intelligence
2017-12-01
Indications and Warning Methodology for Strategic Intelligence, by Susann Kimmelman, December 2017 (Master's thesis; co-advisors: Robert Simeral and James Wirtz). The research found that, for homeland security, implementing a human-centric indications and warning methodology that focuses on the actor as
The Development of a Methodology for Estimating the Cost of Air Force On-the-Job Training.
ERIC Educational Resources Information Center
Samers, Bernard N.; And Others
The Air Force uses a standardized costing methodology for resident technical training schools (TTS); no comparable methodology exists for computing the cost of on-the-job training (OJT). This study evaluates three alternative survey methodologies and a number of cost models for estimating the cost of OJT for airmen training in the Administrative…
Hajibandeh, Shahab; Hajibandeh, Shahin; Antoniou, George A; Green, Patrick A; Maden, Michelle; Torella, Francesco
2017-04-01
Purpose We aimed to investigate the association between bibliometric parameters and the reporting and methodological quality of vascular and endovascular surgery randomised controlled trials. Methods The most recent 75 and oldest 75 randomised controlled trials published in leading journals over a 10-year period were identified. The reporting quality was analysed using the CONSORT statement, and methodological quality with the Intercollegiate Guidelines Network checklist. We used exploratory univariate and multivariable linear regression analysis to investigate associations. Findings Bibliometric parameters such as type of journal, study design reported in the title, number of pages, external funding, industry sponsorship and number of citations are associated with reporting quality. Moreover, parameters such as type of journal, subject area and study design reported in the title are associated with methodological quality. Conclusions The bibliometric parameters of randomised controlled trials may be independent predictors of their reporting and methodological quality. Moreover, the reporting quality of randomised controlled trials is associated with their methodological quality and vice versa.
Sensor placement in nuclear reactors based on the generalized empirical interpolation method
NASA Astrophysics Data System (ADS)
Argaud, J.-P.; Bouriquet, B.; de Caso, F.; Gong, H.; Maday, Y.; Mula, O.
2018-06-01
In this paper, we apply the so-called generalized empirical interpolation method (GEIM) to address the problem of sensor placement in nuclear reactors. This task is challenging due to the accumulation of a number of difficulties like the complexity of the underlying physics and the constraints in the admissible sensor locations and their number. As a result, the placement, still today, strongly relies on the know-how and experience of engineers from different areas of expertise. The present methodology contributes to making this process become more systematic and, in turn, simplify and accelerate the procedure.
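For readers unfamiliar with the greedy selection underlying GEIM, the sketch below applies a simplified, point-sensor (EIM-style) version of the procedure to synthetic snapshot data; the true GEIM works with general linear functionals and reactor-physics fields, so this is only an illustration of the selection logic.

```python
# Greedy, EIM-style selection of point-sensor locations from synthetic snapshots
# (a simplified stand-in for GEIM, which works with general linear functionals).
import numpy as np

x = np.linspace(0, 1, 200)
# Synthetic "field" snapshots: a small parametrised family standing in for flux maps.
snapshots = np.array([np.sin((k + 1) * np.pi * x) * mu
                      for k in range(4) for mu in (0.5, 1.0, 1.5)]).T   # (200, 12)

def eim_sensors(U, m):
    basis, sensors = [], []
    # First basis function: the snapshot with the largest pointwise magnitude.
    j = np.argmax(np.max(np.abs(U), axis=0))
    q = U[:, j]
    i = int(np.argmax(np.abs(q)))
    basis.append(q / q[i]); sensors.append(i)
    for _ in range(1, m):
        B = np.column_stack(basis)
        # Interpolate every snapshot from its values at the chosen sensor points.
        coeffs = np.linalg.solve(B[sensors, :], U[sensors, :])
        residual = U - B @ coeffs
        j = np.argmax(np.max(np.abs(residual), axis=0))     # worst-approximated snapshot
        r = residual[:, j]
        i = int(np.argmax(np.abs(r)))                        # worst-approximated location
        basis.append(r / r[i]); sensors.append(i)
    return sensors

print("sensor indices:", eim_sensors(snapshots, 4))
```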
Improving FMEA risk assessment through reprioritization of failures
NASA Astrophysics Data System (ADS)
Ungureanu, A. L.; Stan, G.
2016-08-01
Most of the current methods used to assess failures and identify industrial equipment defects are based on the determination of the Risk Priority Number (RPN). Although the conventional RPN calculation is easy to understand and use, the methodology presents some limitations, such as the large number of duplicate RPN values and the difficulty of assessing the RPN indices. In order to eliminate the aforementioned shortcomings, this paper puts forward an easy and efficient computing method, called Failure Developing Mode and Criticality Analysis (FDMCA), which takes into account failure and defect evolution in time, from failure appearance to breakdown.
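For reference, the conventional RPN that the article seeks to improve upon is simply the product of severity, occurrence, and detection ratings. The toy table below (invented ratings) shows how easily duplicate RPN values arise, which is one of the limitations mentioned.

```python
# Conventional FMEA risk priority number: RPN = Severity x Occurrence x Detection.
# The invented ratings below illustrate the duplicate-RPN problem the paper targets.
failure_modes = {
    "bearing wear":       (7, 4, 3),
    "seal leakage":       (6, 7, 2),
    "shaft misalignment": (4, 3, 7),
}
rpn = {name: s * o * d for name, (s, o, d) in failure_modes.items()}
for name, value in sorted(rpn.items(), key=lambda kv: -kv[1]):
    print(f"{name:18s} RPN = {value}")
# All three modes score 84, so the ranking cannot distinguish them even though
# their individual severity, occurrence and detection ratings differ.
```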
Logic-based models in systems biology: a predictive and parameter-free network analysis method†
Wynn, Michelle L.; Consul, Nikita; Merajver, Sofia D.
2012-01-01
Highly complex molecular networks, which play fundamental roles in almost all cellular processes, are known to be dysregulated in a number of diseases, most notably in cancer. As a consequence, there is a critical need to develop practical methodologies for constructing and analysing molecular networks at a systems level. Mathematical models built with continuous differential equations are an ideal methodology because they can provide a detailed picture of a network’s dynamics. To be predictive, however, differential equation models require that numerous parameters be known a priori and this information is almost never available. An alternative dynamical approach is the use of discrete logic-based models that can provide a good approximation of the qualitative behaviour of a biochemical system without the burden of a large parameter space. Despite their advantages, there remains significant resistance to the use of logic-based models in biology. Here, we address some common concerns and provide a brief tutorial on the use of logic-based models, which we motivate with biological examples. PMID:23072820
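As a concrete illustration of the logic-based approach advocated here, the following sketch synchronously updates a small Boolean network and enumerates its attractors; the three-node wiring is invented and is not taken from the paper.

```python
# Toy logic-based (Boolean) network: synchronous updates and attractor search.
# The three-node wiring is invented for illustration.
from itertools import product

def update(state):
    a, b, c = state
    return (b and not c,      # A is activated by B and inhibited by C
            a,                # B follows A
            a or b)           # C is activated by A or B

def attractor_from(state):
    seen = []
    while state not in seen:
        seen.append(state)
        state = update(state)
    cycle = seen[seen.index(state):]          # the cycle the trajectory enters
    k = cycle.index(min(cycle))               # canonical rotation so duplicates merge
    return tuple(cycle[k:] + cycle[:k])

attractors = {attractor_from(s) for s in product([False, True], repeat=3)}
for cycle in attractors:
    print([tuple(int(v) for v in st) for st in cycle])
```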
NASA Astrophysics Data System (ADS)
Bisadi, Zahra; Acerbi, Fabio; Fontana, Giorgio; Zorzi, Nicola; Piemonte, Claudio; Pucker, Georg; Pavesi, Lorenzo
2018-02-01
A small photonic quantum random number generator that is easy to implement in small electronic devices for secure data encryption and other applications is in high demand nowadays. Here, we propose a compact configuration in which a large-area silicon nanocrystal light-emitting device (LED) is coupled to a silicon photomultiplier to generate random numbers. The random number generation methodology is based on photon arrival times and is robust against the non-idealities of the detector and of the source of quantum entropy. The raw data show high-quality randomness and pass all the statistical tests in the National Institute of Standards and Technology (NIST) test suite without a post-processing algorithm. The highest bit rate is 0.5 Mbps, with an efficiency of 4 bits per detected photon.
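One common way of extracting unbiased bits from photon arrival times is to compare consecutive inter-arrival intervals and discard ties; this is not necessarily the exact encoding used by the authors, but the simulated sketch below shows the general arrival-time approach.

```python
# One common arrival-time bit-extraction scheme (illustrative, not necessarily the
# paper's exact encoding): compare consecutive inter-arrival intervals and drop ties.
# For i.i.d. exponential intervals P(t1 < t2) = P(t1 > t2), so the bits are unbiased
# even if the photon rate drifts slowly.
import numpy as np

rng = np.random.default_rng(3)
arrival_times = np.cumsum(rng.exponential(scale=1.0, size=100_001))  # simulated detector
intervals = np.diff(arrival_times)

pairs = intervals[: len(intervals) // 2 * 2].reshape(-1, 2)
t1, t2 = pairs[:, 0], pairs[:, 1]
bits = (t1 > t2)[t1 != t2].astype(int)       # 1 if first interval longer, ties dropped

print(len(bits), "bits, mean =", round(bits.mean(), 4))
```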
Applications of mixed-methods methodology in clinical pharmacy research.
Hadi, Muhammad Abdul; Closs, S José
2016-06-01
Introduction Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology A number of mixed-methods designs are available in the literature, and the four most commonly used designs in healthcare research are: the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, so it is advisable to adhere to this to ensure methodological rigour. When to use Mixed-methods methodology is best suited when the research questions require: triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another method; developing a scale/questionnaire; or answering different research questions within a single study. Two case studies are presented to illustrate possible applications of mixed-methods methodology. Limitations Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time-consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.
Tsai, Chu-Lin; Camargo, Carlos A
2009-09-01
Acute exacerbations of chronic disease are ubiquitous in clinical medicine, and thus far there has been a paucity of integrated methodological discussion on this phenomenon. We use acute exacerbations of chronic obstructive pulmonary disease as an example to emphasize key epidemiological and statistical issues for this understudied field in clinical epidemiology. Directed acyclic graphs are a useful epidemiological tool to explain the differential effects of risk factors on health outcomes in studies of acute and chronic phases of disease. To study the pathogenesis of acute exacerbations of chronic disease, the case-crossover design and time-series analysis are well-suited study designs for differentiating acute and chronic effects. Modeling changes over time and setting appropriate thresholds are important steps to separate acute from chronic phases of disease in serial measurements. In statistical analysis, acute exacerbations are recurrent events, and some individuals are more prone to recurrences than others. Therefore, appropriate statistical modeling should take into account intraindividual dependence. Finally, we recommend the use of an "event-based" number needed to treat (NNT) to prevent a single exacerbation instead of the traditional patient-based NNT. Addressing these methodological challenges will advance research quality in acute-on-chronic disease epidemiology.
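The closing recommendation can be made concrete with a small calculation: a patient-based NNT uses the absolute risk reduction in the proportion of patients with at least one exacerbation, whereas an event-based NNT uses the difference in exacerbation rates. The rates below are invented for illustration.

```python
# Patient-based vs. event-based number needed to treat (illustrative rates only).
# Patient-based NNT: 1 / absolute risk reduction in "any exacerbation within a year".
risk_control, risk_treated = 0.50, 0.40
nnt_patient = 1 / (risk_control - risk_treated)

# Event-based NNT: 1 / reduction in exacerbation rate (events per patient-year),
# i.e. patient-years of treatment needed to prevent one exacerbation.
rate_control, rate_treated = 1.8, 1.4
nnt_event = 1 / (rate_control - rate_treated)

print(f"patient-based NNT: about {nnt_patient:.0f} patients")
print(f"event-based NNT: about {nnt_event:.1f} patient-years per exacerbation prevented")
```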
Clinical results from a noninvasive blood glucose monitor
NASA Astrophysics Data System (ADS)
Blank, Thomas B.; Ruchti, Timothy L.; Lorenz, Alex D.; Monfre, Stephen L.; Makarewicz, M. R.; Mattu, Mutua; Hazen, Kevin
2002-05-01
Non-invasive blood glucose monitoring has long been proposed as a means for advancing the management of diabetes through increased measurement and control. The use of a near-infrared (NIR) spectroscopy-based methodology for noninvasive monitoring has been pursued by a number of groups. The accuracy of the NIR measurement technology is limited by challenges related to the instrumentation, the heterogeneity and time-variant nature of skin tissue, and the complexity of the calibration methodology. In this work, we discuss results from a clinical study that targeted the evaluation of individual calibrations for each subject based on a series of controlled calibration visits. While the customization of the calibrations to individuals was intended to reduce model complexity, the extensive requirements for each individual set of calibration data were difficult to achieve and required several days of measurement. Through the careful selection of a small subset of data from all samples collected on the 138 study participants in a previous study, we have developed a methodology for applying a single standard calibration to multiple persons. The standard calibrations have been applied to a plurality of individuals and shown to be persistent over periods greater than 24 weeks.
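NIR calibrations of this kind are commonly built with multivariate regression such as partial least squares; the sketch below fits a PLS model to synthetic spectra purely to show the shape of such a calibration step, and is not the proprietary calibration evaluated in the study.

```python
# Generic NIR-style calibration sketch using partial least squares regression on
# synthetic spectra (a common approach for such data; the study's own calibration
# is more involved and not reproduced here).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n_samples, n_wavelengths = 120, 300
glucose = rng.uniform(60, 250, n_samples)                       # mg/dL, synthetic
signature = rng.normal(0, 1, n_wavelengths)                     # pretend analyte spectrum
spectra = (np.outer(glucose, signature) / 250
           + rng.normal(0, 0.5, (n_samples, n_wavelengths)))    # baseline noise

X_train, X_test, y_train, y_test = train_test_split(spectra, glucose, random_state=0)
model = PLSRegression(n_components=5).fit(X_train, y_train)
pred = model.predict(X_test).ravel()
print("RMSEP:", round(float(np.sqrt(np.mean((pred - y_test) ** 2))), 1), "mg/dL")
```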
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buelt, J.L.; Stottlemyre, J.A.; White, M.K.
1991-09-01
Because of the great complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, the DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process required for DOE operable units. DOE operable units are generally more complex in nature because of the existence of multiple waste sites within many of the operable units and the presence of mixed radioactive and hazardous chemical wastes. Consequently, Pacific Northwest Laboratory (PNL) is developing the Remedial Action Assessment System (RAAS), which is aimed at screening, linking, and evaluating established technology process options in support of conducting feasibility studies under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). It is also intended to do the same in support of corrective measures studies required by the Resource Conservation and Recovery Act (RCRA). This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses information on technologies in a graphical and tabular manner, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buelt, J.L.; Stottlemyre, J.A.; White, M.K.
1991-02-01
Because of the great complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, the DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process required for DOE operable units. DOE operable units are generally more complex in nature because of the existence of multiple waste sites within many of the operable units and the presence of mixed radioactive and hazardous chemical wastes. Consequently, Pacific Northwest Laboratory (PNL) is developing the Remedial Action Assessment System (RAAS), which is aimed at screening, linking, and evaluating established technology process options in support of conducting feasibility studies under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA). It is also intended to do the same in support of corrective measures studies required by the Resource Conservation and Recovery Act (RCRA). This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses information on technologies in a graphical and tabular manner, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab.
NASA Astrophysics Data System (ADS)
Abbas, Mohammad
Recently developed methodology that provides the direct assessment of traditional thrust-based performance of aerospace vehicles in terms of entropy generation (i.e., exergy destruction) is modified for stand-alone jet engines. This methodology is applied to a specific single-spool turbojet engine configuration. A generic compressor performance map along with modeled engine component performance characterizations are utilized in order to provide comprehensive traditional engine performance results (engine thrust, mass capture, and RPM), for on and off-design engine operation. Details of exergy losses in engine components, across the entire engine, and in the engine wake are provided and the engine performance losses associated with their losses are discussed. Results are provided across the engine operating envelope as defined by operational ranges of flight Mach number, altitude, and fuel throttle setting. The exergy destruction that occurs in the engine wake is shown to be dominant with respect to other losses, including all exergy losses that occur inside the engine. Specifically, the ratio of the exergy destruction rate in the wake to the exergy destruction rate inside the engine itself ranges from 1 to 2.5 across the operational envelope of the modeled engine.
Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang
2017-01-01
The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, both in initial as well as further training and specialization in their fields, this particular aspect of their work receives scanty attention only. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been in existence and available as a tool to serve precisely the purpose of offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was based on the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in the context of investigations into a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important also in clinical practice and, today, represents the most used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper intends to offer an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion will in turn focus on the System's underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored.
Stieglitz, Rolf-Dieter; Haug, Achim; Fähndrich, Erdmann; Rösler, Michael; Trabert, Wolfgang
2017-01-01
The documentation of psychopathology is core to the clinical practice of the psychiatrist and clinical psychologist. However, both in initial as well as further training and specialization in their fields, this particular aspect of their work receives scanty attention only. Yet, for the past 50 years, the Association for Methodology and Documentation in Psychiatry (AMDP) System has been in existence and available as a tool to serve precisely the purpose of offering a systematic introduction to the terminology and documentation of psychopathology. The motivation for its development was based on the need for an assessment procedure for the reliable documentation of the effectiveness of newly developed psychopharmacological substances. Subsequently, the AMDP-System began to be applied in the context of investigations into a number of methodological issues in psychiatry (e.g., the frequency and specificity of particular symptoms, the comparison of rating scales). The System then became increasingly important also in clinical practice and, today, represents the most used instrument for the documentation of psychopathology in the German-speaking countries of Europe. This paper intends to offer an overview of the AMDP-System, its origins, design, and functionality. After an initial account of the history and development of the AMDP-System, the discussion will in turn focus on the System’s underlying methodological principles, the transfer of clinical skills and competencies in its practical application, and its use in research and clinical practice. Finally, potential future areas of development in relation to the AMDP-System are explored. PMID:28439242
Improved orthologous databases to ease protozoan targets inference.
Kotowski, Nelson; Jardim, Rodrigo; Dávila, Alberto M R
2015-09-29
Homology inference helps in identifying similarities, as well as differences, among organisms, which provides better insight into how closely related one might be to another. In addition, comparative genomics pipelines are widely adopted tools designed using different bioinformatics applications and algorithms. In this article, we propose a methodology to build improved orthologous databases with the potential to aid in protozoan target identification, one of the many tasks which benefit from comparative genomics tools. Our analyses are based on OrthoSearch, a comparative genomics pipeline originally designed to infer orthologs through protein-profile comparison, supported by an HMM-based, reciprocal-best-hits approach. Our methodology allows OrthoSearch to confront two orthologous databases and to generate an improved new one. This can later be used to infer potential protozoan targets through a similarity analysis against the human genome. The protein sequences of the Cryptosporidium hominis, Entamoeba histolytica and Leishmania infantum genomes were comparatively analyzed against three orthologous databases: (i) EggNOG KOG, (ii) ProtozoaDB and (iii) KEGG Orthology (KO). That allowed us to create two new orthologous databases, "KO + EggNOG KOG" and "KO + EggNOG KOG + ProtozoaDB", with 16,938 and 27,701 orthologous groups, respectively. Such new orthologous databases were used for a regular OrthoSearch run. By confronting the "KO + EggNOG KOG" and "KO + EggNOG KOG + ProtozoaDB" databases and the protozoan species, we were able to detect the following totals of orthologous groups and coverage (the relation between the inferred orthologous groups and the species' total number of proteins): Cryptosporidium hominis: 1,821 (11 %) and 3,254 (12 %); Entamoeba histolytica: 2,245 (13 %) and 5,305 (19 %); Leishmania infantum: 2,702 (16 %) and 4,760 (17 %). Using our HMM-based methodology and the largest created orthologous database, it was possible to infer 13 orthologous groups which represent potential protozoan targets; these were found because of our distant homology approach. We also provide the number of species-specific, pair-to-pair and core groups from such analyses, depicted in Venn diagrams. The orthologous databases generated by our HMM-based methodology provide a broader dataset, with larger numbers of orthologous groups when compared to the original databases used as input. These may be used for several homology inference analyses, annotation tasks and protozoan target identification.
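The species-specific, pair-to-pair and core group counts mentioned above amount to set operations over the per-species collections of orthologous groups; the short sketch below shows that bookkeeping on invented group identifiers.

```python
# Venn-style bookkeeping of orthologous groups per species (identifiers invented).
chom = {"OG1", "OG2", "OG3", "OG7"}        # Cryptosporidium hominis
ehis = {"OG2", "OG3", "OG4", "OG8"}        # Entamoeba histolytica
linf = {"OG2", "OG3", "OG5", "OG9"}        # Leishmania infantum

core = chom & ehis & linf
pairwise = {
    "Chom-Ehis only": (chom & ehis) - linf,
    "Chom-Linf only": (chom & linf) - ehis,
    "Ehis-Linf only": (ehis & linf) - chom,
}
specific = {
    "Chom only": chom - ehis - linf,
    "Ehis only": ehis - chom - linf,
    "Linf only": linf - chom - ehis,
}
print("core:", sorted(core))
print({k: sorted(v) for k, v in pairwise.items()})
print({k: sorted(v) for k, v in specific.items()})
```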
Chauvin, Anthony; Truchot, Jennifer; Bafeta, Aida; Pateron, Dominique; Plaisance, Patrick; Yordanov, Youri
2018-04-01
The number of trials assessing Simulation-Based Medical Education (SBME) interventions has rapidly expanded. Many studies show that potential flaws in the design, conduct and reporting of randomized controlled trials (RCTs) can bias their results. We conducted a methodological review of RCTs assessing SBME interventions in Emergency Medicine (EM) and examined their methodological characteristics. We searched MEDLINE via PubMed for RCTs that assessed a simulation intervention in EM, published in 6 general and internal medicine journals and in the top 10 EM journals. The Cochrane Collaboration risk of bias tool was used to assess risk of bias, intervention reporting was evaluated based on the "template for intervention description and replication" checklist, and methodological quality was evaluated by the Medical Education Research Study Quality Instrument (MERSQI). Report selection and data extraction were done by 2 independent researchers. From 1394 RCTs screened, 68 trials assessed an SBME intervention. They represent one quarter of our sample. Cardiopulmonary resuscitation (CPR) was the most frequent topic (81%). Random sequence generation and allocation concealment were performed correctly in 66 and 49% of trials. Blinding of participants and assessors was performed correctly in 19 and 68%. Risk of attrition bias was low in three-quarters of the studies (n = 51). Risk of selective reporting bias was unclear in nearly all studies. The mean MERSQI score was 13.4/18, and 4% of the reports provided a description allowing replication of the intervention. Trials assessing simulation represent one quarter of RCTs in EM. Their quality remains unclear, and reproducing the interventions appears challenging due to reporting issues.
De Buck, Emmy; Pauwels, Nele S; Dieltjens, Tessa; Vandekerckhove, Philippe
2014-03-01
As part of its strategy Belgian Red Cross-Flanders underpins all its activities with evidence-based guidelines and systematic reviews. The aim of this publication is to describe in detail the methodology used to achieve this goal within an action-oriented organisation, in a timely and cost-effective way. To demonstrate transparency in our methods, we wrote a methodological charter describing the way in which we develop evidence-based materials to support our activities. Criteria were drawn up for deciding on project priority and the choice of different types of projects (scoping reviews, systematic reviews and evidence-based guidelines). While searching for rigorous and realistically attainable methodological standards, we encountered a wide variety in terminology and methodology used in the field of evidence-based practice. Terminologies currently being used by different organisations and institutions include systematic reviews, systematic literature searches, evidence-based guidelines, rapid reviews, pragmatic systematic reviews, and rapid response service. It is not always clear what the definition and methodology is behind these terms and whether they are used consistently. We therefore describe the terminology and methodology used by Belgian Red Cross-Flanders; criteria for making methodological choices and details on the methodology we use are given. In our search for an appropriate methodology, taking into account time and resource constraints, we encountered an enormous variety of methodological approaches and terminology used for evidence-based materials. In light of this, we recommend that authors of evidence-based guidelines and reviews are transparent and clear about the methodology used. To be transparent about our approach, we developed a methodological charter. This charter may inspire other organisations that want to use evidence-based methodology to support their activities.
Katz, Mark A.; Lindblade, Kim A.; Njuguna, Henry; Arvelo, Wences; Khagayi, Sammy; Emukule, Gideon; Linares-Perez, Nivaldo; McCracken, John; Nokes, D. James; Ngama, Mwanajuma; Kazungu, Sidi; Mott, Joshua A.; Olsen, Sonja J.; Widdowson, Marc-Alain; Feikin, Daniel R.
2013-01-01
Background Knowing the national disease burden of severe influenza in low-income countries can inform policy decisions around influenza treatment and prevention. We present a novel methodology using locally generated data for estimating this burden. Methods and Findings This method begins with calculating the hospitalized severe acute respiratory illness (SARI) incidence for children <5 years old and persons ≥5 years old from population-based surveillance in one province. This base rate of SARI is then adjusted for each province based on the prevalence of risk factors and healthcare-seeking behavior. The percentage of SARI with influenza virus detected is determined from provincial-level sentinel surveillance and applied to the adjusted provincial rates of hospitalized SARI. Healthcare-seeking data from healthcare utilization surveys is used to estimate non-hospitalized influenza-associated SARI. Rates of hospitalized and non-hospitalized influenza-associated SARI are applied to census data to calculate the national number of cases. The method was field-tested in Kenya, and validated in Guatemala, using data from August 2009–July 2011. In Kenya (2009 population 38.6 million persons), the annual number of hospitalized influenza-associated SARI cases ranged from 17,129–27,659 for children <5 years old (2.9–4.7 per 1,000 persons) and 6,882–7,836 for persons ≥5 years old (0.21–0.24 per 1,000 persons), depending on year and base rate used. In Guatemala (2011 population 14.7 million persons), the annual number of hospitalized cases of influenza-associated pneumonia ranged from 1,065–2,259 (0.5–1.0 per 1,000 persons) among children <5 years old and 779–2,252 cases (0.1–0.2 per 1,000 persons) for persons ≥5 years old, depending on year and base rate used. In both countries, the number of non-hospitalized influenza-associated cases was several-fold higher than the hospitalized cases. Conclusions Influenza virus was associated with a substantial amount of severe disease in Kenya and Guatemala. This method can be performed in most low and lower-middle income countries. PMID:23573177
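The chain of multiplications behind the method (base SARI rate, provincial adjustment, influenza-positive fraction, population, and a care-seeking adjustment for non-hospitalised cases) can be sketched numerically. All input values below are invented placeholders, not the Kenyan or Guatemalan estimates.

```python
# Sketch of the burden-extrapolation arithmetic (all inputs are invented placeholders).
base_sari_rate = 4.0 / 1000          # hospitalised SARI per person-year, surveillance site
provincial_adjustment = 0.8          # risk-factor / care-seeking adjustment for a province
pct_sari_flu_positive = 0.12         # from sentinel surveillance
population = 5_000_000               # provincial population
care_seeking = 0.40                  # share of SARI cases that reach hospital

hospitalised = base_sari_rate * provincial_adjustment * pct_sari_flu_positive * population
non_hospitalised = hospitalised * (1 - care_seeking) / care_seeking

print(f"hospitalised influenza-associated SARI: about {hospitalised:,.0f} cases/year")
print(f"non-hospitalised influenza-associated SARI: about {non_hospitalised:,.0f} cases/year")
```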
Locations of Sampling Stations for Water Quality Monitoring in Water Distribution Networks.
Rathi, Shweta; Gupta, Rajesh
2014-04-01
Water quality must be monitored in water distribution networks (WDNs) at salient locations to assure the safe quality of water supplied to consumers. Such monitoring stations (MSs) provide warning against accidental contamination. Various objectives, such as demand coverage, time to detection, volume of water contaminated before detection, extent of contamination, expected population affected prior to detection, and detection likelihood, have been considered independently or jointly in determining the optimal number and location of MSs in WDNs. "Demand coverage", defined as the percentage of network demand monitored by a particular monitoring station, is a simple measure for locating MSs. Several methods based on formulating a coverage matrix using a pre-specified coverage criterion, followed by optimization, have been suggested. The coverage criterion is defined as a minimum percentage of the total flow received at a monitoring station that must have passed through an upstream node for that node to be counted as covered by the station. The number of monitoring stations increases as the coverage criterion is made stricter, so the design of monitoring stations becomes subjective. A simple methodology is proposed herein that iteratively selects MSs in priority order to achieve a targeted demand coverage. The proposed methodology provided the same number and location of MSs for an illustrative network as an optimization method did. Further, the proposed method is simple and avoids the subjectivity that could arise from the choice of coverage criterion. The application of the methodology is also shown on the WDN of the Dharampeth zone (Nagpur city WDN in Maharashtra, India), which has 285 nodes and 367 pipes.
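A minimal greedy sketch of priority-wise selection follows: each candidate node covers a known share of demand, and stations are added until a target coverage is met. The demands and coverage sets are invented; in practice the covered-node sets would come from the network's flow analysis, and this is not the authors' exact algorithm.

```python
# Priority-wise (greedy) selection of monitoring stations until a target share of
# network demand is covered. Node demands and coverage sets are invented; in practice
# the covered-node sets come from the network's flow analysis.
demand = {"A": 40, "B": 25, "C": 20, "D": 10, "E": 5}          # demand at each node
covers = {                                                      # nodes monitored by a
    "A": {"A", "B", "C"},                                       # station placed there
    "B": {"B", "D"},
    "C": {"C", "E"},
    "D": {"D"},
    "E": {"E"},
}
target = 0.90 * sum(demand.values())

selected, covered = [], set()
while sum(demand[n] for n in covered) < target:
    # Pick the candidate that adds the most not-yet-covered demand.
    best = max(covers, key=lambda s: sum(demand[n] for n in covers[s] - covered))
    selected.append(best)
    covered |= covers[best]

print("stations:", selected, "| demand covered:",
      sum(demand[n] for n in covered), "/", sum(demand.values()))
```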
NASA Technical Reports Server (NTRS)
Bundick, W. Thomas
1990-01-01
A methodology for designing a failure detection and identification (FDI) system to detect and isolate control element failures in aircraft control systems is reviewed. An FDI system design for a modified B-737 aircraft resulting from this methodology is also reviewed, and the results of evaluating this system via simulation are presented. The FDI system performed well in a no-turbulence environment, but it experienced an unacceptable number of false alarms in atmospheric turbulence. An adaptive FDI system, which adjusts thresholds and other system parameters based on the estimated turbulence level, was developed and evaluated. The adaptive system performed well over all turbulence levels simulated, reliably detecting all but the smallest magnitude partially-missing-surface failures.
Effectiveness of voice therapy in functional dysphonia: where are we now?
Bos-Clark, Marianne; Carding, Paul
2011-06-01
To review the recent literature since the 2009 Cochrane review regarding the effectiveness of voice therapy for patients with functional dysphonia. A range of articles report on the effects of voice therapy treatment for functional dysphonia, with a wide range of interventions described. Only one study is a randomized controlled trial. A number of excellent review articles have extended the knowledge base. In primary research, methodological issues persist: studies are small, and not adequately controlled. Studies show improved standards of outcome measurement and of description of the content of voice therapy. There is a continued need for larger, methodologically sound clinical effectiveness studies. Future studies need to be replicable and generalizable in order to inform and elucidate clinical practice.
48 CFR 1552.215-72 - Instructions for the Preparation of Proposals.
Code of Federal Regulations, 2011 CFR
2011-10-01
... used. If escalation is included, state the degree (percent) and methodology. The methodology shall.... If so, state the number required, the professional or technical level and the methodology used to... for which the salary is applicable; (C) List of other research Projects or proposals for which...
Active-Reserve Force Cost Model
2015-01-01
structure to be maintained for a given level of expenditure. We have developed this methodology and set of associated computer-based tools to...rotational, and deployed units or systems • Attain acceptable steady-state operational or presence levels, as measured by the number of units a...at the community level. By community, we mean the set of units of a given type: mission, platform, or capability. We do this because AC-RC force-mix
2009-12-01
standards for assessing the value of intangible assets or intellectual capital. Historically, a number of frameworks have evolved, each with a different focus and a different assessment methodology. In order to assess that knowledge management initiatives contributed to the fight against...terrorism in Canada, a results-based framework was selected, customized and applied to CRTI (a networked science and technology program to counter
Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers
NASA Technical Reports Server (NTRS)
Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.
1983-01-01
A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: the STP theorem prover; design verification of SIFT; high-level language code verification; assembly-language-level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.
Structural design methodologies for ceramic-based material systems
NASA Technical Reports Server (NTRS)
Duffy, Stephen F.; Chulya, Abhisak; Gyekenyesi, John P.
1991-01-01
One of the primary pacing items for realizing the full potential of ceramic-based structural components is the development of new design methods and protocols. The focus here is on low temperature, fast-fracture analysis of monolithic, whisker-toughened, laminated, and woven ceramic composites. A number of design models and criteria are highlighted. Public domain computer algorithms, which aid engineers in predicting the fast-fracture reliability of structural components, are mentioned. Emphasis is not placed on evaluating the models, but instead is focused on the issues relevant to the current state of the art.
Heat transfer in aeropropulsion systems
NASA Astrophysics Data System (ADS)
Simoneau, R. J.
1985-07-01
Aeropropulsion heat transfer is reviewed. A research methodology based on a growing synergism between computations and experiments is examined. The aeropropulsion heat transfer arena is identified as high Reynolds number forced convection in a highly disturbed environment subject to strong gradients, body forces, abrupt geometry changes and high three dimensionality - all in an unsteady flow field. Numerous examples based on heat transfer to the aircraft gas turbine blade are presented to illustrate the types of heat transfer problems which are generic to aeropropulsion systems. The research focus of the near future in aeropropulsion heat transfer is projected.
NASA Technical Reports Server (NTRS)
Turc, Catalin; Anand, Akash; Bruno, Oscar; Chaubell, Julian
2011-01-01
We present a computational methodology (a novel Nystrom approach based on the use of a non-overlapping patch technique and Chebyshev discretizations) for efficient solution of problems of acoustic and electromagnetic scattering by open surfaces. Our integral equation formulations (1) incorporate, as ansatz, the singular nature of open-surface integral-equation solutions, and (2) for the Electric Field Integral Equation (EFIE), use analytical regularizers that effectively reduce the number of iterations required by Krylov-subspace iterative linear-algebra solvers.
Heat transfer in aeropropulsion systems
NASA Technical Reports Server (NTRS)
Simoneau, R. J.
1985-01-01
Aeropropulsion heat transfer is reviewed. A research methodology based on a growing synergism between computations and experiments is examined. The aeropropulsion heat transfer arena is identified as high Reynolds number forced convection in a highly disturbed environment subject to strong gradients, body forces, abrupt geometry changes and high three dimensionality - all in an unsteady flow field. Numerous examples based on heat transfer to the aircraft gas turbine blade are presented to illustrate the types of heat transfer problems which are generic to aeropropulsion systems. The research focus of the near future in aeropropulsion heat transfer is projected.
Echtermeyer, Alexander; Amar, Yehia; Zakrzewski, Jacek; Lapkin, Alexei
2017-01-01
A recently described C(sp3)-H activation reaction to synthesise aziridines was used as a model reaction to demonstrate the methodology of developing a process model using model-based design of experiments (MBDoE) and self-optimisation approaches in flow. The two approaches are compared in terms of experimental efficiency. The self-optimisation approach required the fewest experiments to reach the specified objectives of cost and product yield, whereas the MBDoE approach enabled rapid generation of a process model.
Jack, Robert A; Sochacki, Kyle R; Morehouse, Hannah A; McCulloch, Patrick C; Lintner, David M; Harris, Joshua D
2018-04-01
Several studies have analyzed the most cited articles in shoulder, elbow, pediatrics, and foot and ankle surgery. However, no study has analyzed the quality of the most cited articles in elbow medial ulnar collateral ligament (UCL) surgery. To (1) identify the top 50 most cited articles related to UCL surgery, (2) determine whether there was a correlation between the top cited articles and level of evidence, and (3) determine whether there was a correlation between study methodological quality and the top cited articles. Systematic review. Web of Science and Scopus online databases were searched to identify the top 50 cited articles in UCL surgery. Level of evidence, number of times cited, year of publication, name of journal, country of origin, and study type were recorded for each study. Study methodological quality was analyzed for each article with the Modified Coleman Methodology Score (MCMS) and the Methodological Index for Non-randomized Studies (MINORS). Correlation coefficients were calculated. The 50 most cited articles were published between 1981 and 2015. The number of citations per article ranged from 20 to 301 (mean ± SD, 71 ± 62 citations). Most articles (92%) were from the United States and were level 3 (16%), level 4 (58%), or unclassified (16%) evidence. There were no articles of level 1 evidence quality. The mean MCMS and MINORS scores were 28.1 ± 13.4 (range, 3-52) and 9.2 ± 3.6 (range, 2-19), respectively. There was no significant correlation between the mean number of citations and level of evidence or quality (rs = -0.01, P = .917), MCMS (rs = 0.09, P = .571), or MINORS (rs = -0.26, P = .089). The top 50 cited articles in UCL surgery constitute a low level of evidence and low methodological quality, including no level 1 articles. There was no significant correlation between the mean number of citations and level of evidence or study methodological quality. However, weak correlations were observed for later publication date and improved level of evidence and methodological quality.
Jack, Robert A.; Sochacki, Kyle R.; Morehouse, Hannah A.; McCulloch, Patrick C.; Lintner, David M.; Harris, Joshua D.
2018-01-01
Background: Several studies have analyzed the most cited articles in shoulder, elbow, pediatrics, and foot and ankle surgery. However, no study has analyzed the quality of the most cited articles in elbow medial ulnar collateral ligament (UCL) surgery. Purpose: To (1) identify the top 50 most cited articles related to UCL surgery, (2) determine whether there was a correlation between the top cited articles and level of evidence, and (3) determine whether there was a correlation between study methodological quality and the top cited articles. Study Design: Systematic review. Methods: Web of Science and Scopus online databases were searched to identify the top 50 cited articles in UCL surgery. Level of evidence, number of times cited, year of publication, name of journal, country of origin, and study type were recorded for each study. Study methodological quality was analyzed for each article with the Modified Coleman Methodology Score (MCMS) and the Methodological Index for Non-randomized Studies (MINORS). Correlation coefficients were calculated. Results: The 50 most cited articles were published between 1981 and 2015. The number of citations per article ranged from 20 to 301 (mean ± SD, 71 ± 62 citations). Most articles (92%) were from the United States and were level 3 (16%), level 4 (58%), or unclassified (16%) evidence. There were no articles of level 1 evidence quality. The mean MCMS and MINORS scores were 28.1 ± 13.4 (range, 3-52) and 9.2 ± 3.6 (range, 2-19), respectively. There was no significant correlation between the mean number of citations and level of evidence or quality (rs = –0.01, P = .917), MCMS (rs = 0.09, P = .571), or MINORS (rs = –0.26, P = .089). Conclusion: The top 50 cited articles in UCL surgery constitute a low level of evidence and low methodological quality, including no level 1 articles. There was no significant correlation between the mean number of citations and level of evidence or study methodological quality. However, weak correlations were observed for later publication date and improved level of evidence and methodological quality. PMID:29780841
Performance of a Line Loss Correction Method for Gas Turbine Emission Measurements
NASA Astrophysics Data System (ADS)
Hagen, D. E.; Whitefield, P. D.; Lobo, P.
2015-12-01
International concern for the environmental impact of jet engine exhaust emissions in the atmosphere has led to increased attention on gas turbine engine emission testing. The Society of Automotive Engineers Aircraft Exhaust Emissions Measurement Committee (E-31) has published an Aerospace Information Report (AIR) 6241 detailing the sampling system for the measurement of non-volatile particulate matter from aircraft engines, and is developing an Aerospace Recommended Practice (ARP) for methodology and system specification. The Missouri University of Science and Technology (MST) Center for Excellence for Aerospace Particulate Emissions Reduction Research has led numerous jet engine exhaust sampling campaigns to characterize emissions at different locations in the expanding exhaust plume. Particle loss, due to various mechanisms, occurs in the sampling train that transports the exhaust sample from the engine exit plane to the measurement instruments. To account for the losses, both the size dependent penetration functions and the size distribution of the emitted particles need to be known. However in the proposed ARP, particle number and mass are measured, but size is not. Here we present a methodology to generate number and mass correction factors for line loss, without using direct size measurement. A lognormal size distribution is used to represent the exhaust aerosol at the engine exit plane and is defined by the measured number and mass at the downstream end of the sample train. The performance of this line loss correction is compared to corrections based on direct size measurements using data taken by MST during numerous engine test campaigns. The experimental uncertainty in these correction factors is estimated. Average differences between the line loss correction method and size based corrections are found to be on the order of 10% for number and 2.5% for mass.
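The correction logic can be illustrated by numerical integration: assume a lognormal number-size distribution at the engine exit plane and a size-dependent penetration function for the sample line; the number and mass correction factors are then the ratios of the unpenetrated to the penetrated integrals, with mass weighting each size by the diameter cubed. The distribution parameters and penetration curve below are invented, not those of the AIR 6241 system.

```python
# Number and mass line-loss correction factors for an assumed lognormal exhaust
# size distribution and an invented size-dependent penetration function.
import numpy as np

d = np.logspace(0, 3, 2000)                       # diameter grid, nm
gmd, gsd = 40.0, 1.8                              # assumed lognormal parameters
n_d = (1 / (d * np.log(gsd) * np.sqrt(2 * np.pi))
       * np.exp(-(np.log(d / gmd)) ** 2 / (2 * np.log(gsd) ** 2)))   # dN/dd

penetration = np.clip(0.5 + 0.5 * np.log10(d) / 2.0, 0, 1)   # toy line penetration P(d)

def integral(weight):
    """Trapezoidal integral of weight(d) * n(d) over the diameter grid."""
    y = weight * n_d
    return float(np.sum((y[1:] + y[:-1]) * np.diff(d)) / 2)

cf_number = integral(np.ones_like(d)) / integral(penetration)
cf_mass = integral(d ** 3) / integral(penetration * d ** 3)
print(f"number correction factor: {cf_number:.2f}")
print(f"mass correction factor:   {cf_mass:.2f}")
```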
Gradient-Based Aerodynamic Shape Optimization Using ADI Method for Large-Scale Problems
NASA Technical Reports Server (NTRS)
Pandya, Mohagna J.; Baysal, Oktay
1997-01-01
A gradient-based shape optimization methodology, intended for practical three-dimensional aerodynamic applications, has been developed. It is based on quasi-analytical sensitivities. The flow analysis is rendered by a fully implicit, finite volume formulation of the Euler equations. The aerodynamic sensitivity equation is solved using the alternating-direction-implicit (ADI) algorithm for memory efficiency. A flexible wing geometry model, based on surface parameterization and planform schedules, is utilized. The present methodology and its components have been tested via several comparisons. Initially, the flow analysis for a wing is compared with those obtained using an unfactored, preconditioned conjugate gradient (PCG) approach and an extensively validated CFD code. Then, the sensitivities computed with the present method are compared with those obtained using the finite-difference and PCG approaches. Effects of grid refinement and convergence tolerance on the analysis and shape optimization have been explored. Finally, the new procedure has been demonstrated in the design of a cranked arrow wing at Mach 2.4. Despite the expected increase in computational time, the results indicate that shape optimization problems that require large numbers of grid points can be resolved with a gradient-based approach.
75 FR 8999 - Agency Information Collection Activities: Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-26
.... Methodological findings may be presented externally in technical papers at conferences, published in the... represent several methodological improvement projects. \\2\\ This number refers to the science, engineering...
Algorithm for cellular reprogramming.
Ronquist, Scott; Patterson, Geoff; Muir, Lindsey A; Lindsly, Stephen; Chen, Haiming; Brown, Markus; Wicha, Max S; Bloch, Anthony; Brockett, Roger; Rajapakse, Indika
2017-11-07
The day we understand the time evolution of subcellular events at a level of detail comparable to physical systems governed by Newton's laws of motion seems far away. Even so, quantitative approaches to cellular dynamics add to our understanding of cell biology. With data-guided frameworks we can develop better predictions about, and methods for, control over specific biological processes and system-wide cell behavior. Here we describe an approach for optimizing the use of transcription factors (TFs) in cellular reprogramming, based on a device commonly used in optimal control. We construct an approximate model for the natural evolution of a cell-cycle-synchronized population of human fibroblasts, based on data obtained by sampling the expression of 22,083 genes at several time points during the cell cycle. To arrive at a model of moderate complexity, we cluster gene expression based on division of the genome into topologically associating domains (TADs) and then model the dynamics of TAD expression levels. Based on this dynamical model and additional data, such as known TF binding sites and activity, we develop a methodology for identifying the top TF candidates for a specific cellular reprogramming task. Our data-guided methodology identifies a number of TFs previously validated for reprogramming and/or natural differentiation and predicts some potentially useful combinations of TFs. Our findings highlight the immense potential of dynamical models, mathematics, and data-guided methodologies for improving strategies for control over biological processes. Copyright © 2017 the Author(s). Published by PNAS.
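A schematic illustration of the data-guided idea described above, with synthetic data: genes are aggregated to TAD-level expression, a linear model of the natural dynamics is fitted by least squares, and hypothetical TF target sets are ranked by how well they align with the gap between the predicted natural end state and a desired target state. The scoring rule and all data are stand-ins, not the paper's control-theoretic formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the inputs: a gene-by-time expression matrix, a gene-to-TAD
# map, and hypothetical TF -> target-gene sets (all synthetic, for illustration).
n_genes, n_times, n_tads = 200, 8, 20
expr = rng.lognormal(mean=1.0, sigma=0.5, size=(n_genes, n_times))
tad_of_gene = rng.integers(0, n_tads, size=n_genes)
tf_targets = {tf: rng.choice(n_genes, 30, replace=False) for tf in ("TF_A", "TF_B", "TF_C")}

# 1) Reduce dimension: average expression of the genes in each TAD.
tad_expr = np.array([expr[tad_of_gene == t].mean(axis=0) for t in range(n_tads)])

# 2) Fit a linear model of the natural dynamics, x_{k+1} ~ A x_k, by least squares.
X0, X1 = tad_expr[:, :-1], tad_expr[:, 1:]
A = X1 @ np.linalg.pinv(X0)

# 3) Rank TFs by how well the TAD-level direction they can push aligns with the
#    gap between the predicted natural end state and a desired target state.
target_state = rng.lognormal(mean=1.2, sigma=0.5, size=n_tads)   # hypothetical target cell type
gap = target_state - A @ tad_expr[:, -1]

def tf_direction(targets):
    v = np.zeros(n_tads)
    for g in targets:
        v[tad_of_gene[g]] += 1.0
    return v / np.linalg.norm(v)

scores = {tf: float(tf_direction(tg) @ gap) for tf, tg in tf_targets.items()}
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```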
Finite Element Method-Based Kinematics and Closed-Loop Control of Soft, Continuum Manipulators.
Bieze, Thor Morales; Largilliere, Frederick; Kruszewski, Alexandre; Zhang, Zhongkai; Merzouki, Rochdi; Duriez, Christian
2018-06-01
This article presents a modeling methodology and experimental validation for soft manipulators to obtain the forward kinematic model (FKM) and inverse kinematic model (IKM) under quasi-static conditions (in the literature, these manipulators are usually classified as continuum robots; however, their main characteristic of interest in this article is that they create motion by deformation, as opposed to the classical use of articulations). It offers a way to obtain the kinematic characteristics of this type of soft robot that is suitable for offline path planning and position control. The modeling methodology relies on continuum mechanics, which does not provide analytic solutions in the general case. Our approach proposes a real-time numerical integration strategy based on the finite element method, with a numerical optimization based on Lagrange multipliers, to obtain the FKM and IKM. To reduce the dimension of the problem, at each step a projection of the model onto the constraint space (gathering actuators, sensors, and end-effector) is performed to obtain the smallest possible number of mathematical equations to be solved. This methodology is applied to obtain the kinematics of two different manipulators with complex structural geometry. An experimental comparison is also performed on one of the robots between two other geometric approaches and the approach showcased in this article. A closed-loop controller based on a state estimator is proposed. The controller is experimentally validated and its robustness is evaluated using the Lyapunov stability method.
Integrated assessment of urban drainage system under the framework of uncertainty analysis.
Dong, X; Chen, J; Zeng, S; Zhao, D
2008-01-01
Due to rapid urbanization and the presence of a large number of aging urban infrastructures in China, urban drainage systems face a dual pressure of construction and renovation nationwide. This leads to the need for an integrated assessment when an urban drainage system is being planned or re-designed. In this paper, an integrated assessment methodology is proposed based upon the analytic hierarchy process (AHP), uncertainty analysis, mathematical simulation of the urban drainage system and fuzzy assessment. To illustrate this methodology, a case study in Shenzhen City in south China was carried out to evaluate and compare two different urban drainage system renovation plans, i.e., the distributed plan and the centralized plan. By comparing their water quality impacts, ecological impacts, technological feasibility and economic costs, the integrated performance of the distributed plan is found to be both better and more robust. The proposed methodology is also found to be effective and practical. (c) IWA Publishing 2008.
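The AHP step of such an assessment reduces to deriving priority weights from a pairwise-comparison matrix via its principal eigenvector, with a consistency check. A minimal sketch follows, with a made-up comparison matrix over the four criteria mentioned in the abstract.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix (Saaty 1-9 scale) for four criteria:
# water quality impact, ecological impact, technological feasibility, economic cost.
C = np.array([[1.0,  2.0, 3.0, 2.0],
              [0.5,  1.0, 2.0, 1.0],
              [1/3., 0.5, 1.0, 0.5],
              [0.5,  1.0, 2.0, 1.0]])

# Priority weights = normalised principal right eigenvector.
vals, vecs = np.linalg.eig(C)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Consistency ratio (random index RI = 0.90 for n = 4 criteria).
n = C.shape[0]
ci = (vals.real[k] - n) / (n - 1)
cr = ci / 0.90
print("weights:", np.round(w, 3), " consistency ratio:", round(cr, 3))
```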
Application of a Probabilistic Sizing Methodology for Ceramic Structures
NASA Astrophysics Data System (ADS)
Rancurel, Michael; Behar-Lafenetre, Stephanie; Cornillon, Laurence; Leroy, Francois-Henri; Coe, Graham; Laine, Benoit
2012-07-01
Ceramics are increasingly used in the space industry to take advantage of their stability and high specific stiffness. Their brittle behaviour often leads designers to size them by increasing the safety factors applied to the maximum stresses, which results in oversized structures. This is inconsistent with a major driver in space architecture, the mass criterion. This paper presents a methodology to size ceramic structures based on their failure probability. From failure tests on samples, the Weibull law that characterizes the strength distribution of the material is obtained. The A-value (Q0.0195%) and B-value (Q0.195%) are then assessed to take into account the limited number of samples. A knocked-down Weibull law that interpolates the A- and B-values is also obtained. From these two laws, a most-likely and a knocked-down prediction of failure probability are computed for complex ceramic structures. The application of this methodology and its validation by test are reported in the paper.
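The probabilistic sizing chain can be sketched as: fit a two-parameter Weibull law to coupon failure stresses, read off the low quantiles corresponding to the A-value (0.0195%) and B-value (0.195%), and combine element stresses and volumes with the weakest-link formula for a structure-level failure probability. The coupon data, reference volume and simple Weibull-plot fit below are illustrative, and the sampling-uncertainty knock-down applied in the paper is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# --- 1) Fit a 2-parameter Weibull law to sample failure stresses (Weibull plot) ---
failure_stress = np.sort(rng.weibull(a=10.0, size=30) * 300.0)   # MPa, synthetic coupon data
n = failure_stress.size
F = (np.arange(1, n + 1) - 0.5) / n                              # median-rank probability estimate
x = np.log(failure_stress)
y = np.log(-np.log(1.0 - F))
m, c = np.polyfit(x, y, 1)                                       # slope = Weibull modulus
sigma0 = np.exp(-c / m)                                          # characteristic strength

# --- 2) Low quantiles of the strength distribution (most-likely values only) ---
def quantile(p):
    return sigma0 * (-np.log(1.0 - p)) ** (1.0 / m)

A_value = quantile(0.000195)   # 0.0195 % quantile
B_value = quantile(0.00195)    # 0.195 % quantile

# --- 3) Weakest-link failure probability of a structure from FE element stresses ---
V0 = 1.0e-6                                    # reference coupon volume, m^3 (assumed)
elem_stress = rng.uniform(50.0, 180.0, 500)    # MPa, stand-in for an FE stress field
elem_volume = np.full(500, 2.0e-8)             # m^3
risk = np.sum((elem_volume / V0) * (elem_stress / sigma0) ** m)
Pf = 1.0 - np.exp(-risk)

print(f"m={m:.1f}, sigma0={sigma0:.0f} MPa, A={A_value:.0f} MPa, B={B_value:.0f} MPa, Pf={Pf:.2e}")
```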
Blencowe, Natalie S; Cook, Jonathan A; Pinkney, Thomas; Rogers, Chris; Reeves, Barnaby C; Blazeby, Jane M
2017-04-01
Randomized controlled trials in surgery are notoriously difficult to design and conduct due to numerous methodological and cultural challenges. Over the last 5 years, several UK-based surgical trial-related initiatives have been funded to address these issues. These include the development of Surgical Trials Centers and Surgical Specialty Leads (individual surgeons responsible for championing randomized controlled trials in their specialist fields), both funded by the Royal College of Surgeons of England; networks of research-active surgeons in training; and investment in methodological research relating to surgical randomized controlled trials (to address issues such as recruitment, blinding, and the selection and standardization of interventions). This article discusses these initiatives in more detail and provides exemplar cases to illustrate how the methodological challenges have been tackled. The initiatives have surpassed expectations, resulting in a renaissance in surgical research throughout the United Kingdom, such that the number of patients entering surgical randomized controlled trials has doubled.
Overcoming an obstacle in expanding a UMLS semantic type extent.
Chen, Yan; Gu, Huanying; Perl, Yehoshua; Geller, James
2012-02-01
This paper strives to overcome a major problem encountered by a previous expansion methodology for discovering concepts highly likely to be missing a specific semantic type assignment in the UMLS. This methodology is the basis for an algorithm that presents the discovered concepts to a human auditor for review and possible correction. We analyzed the problem of the previous expansion methodology and discovered that it was due to an obstacle constituted by one or more concepts assigned the UMLS Semantic Network semantic type Classification. A new methodology was designed that bypasses such an obstacle without a combinatorial explosion in the number of concepts presented to the human auditor for review. The new expansion methodology with obstacle avoidance was tested with the semantic type Experimental Model of Disease and found over 500 concepts missed by the previous methodology that are in need of this semantic type assignment. Furthermore, other semantic types suffering from the same major problem were discovered, indicating that the methodology is of more general applicability. The algorithmic discovery of concepts that are likely missing a semantic type assignment is possible even in the face of obstacles, without an explosion in the number of processed concepts. Copyright © 2011 Elsevier Inc. All rights reserved.
Reorganising the pandemic triage processes to ethically maximise individuals' best interests.
Tillyard, Andrew
2010-11-01
To provide a revised definition, process and purpose of triage that maximise the number of patients receiving intensive care during a crisis. Based on the principle of virtue ethics and the underlying goal of providing individual patients with treatment according to their best interests, the methodology of triage is reassessed and revised. The decision-making processes regarding treatment during a pandemic are redefined and new methods of intensive care provision are recommended, including the use of a 'ranking' system for patients excluded from intensive care, a defined role for non-intensive care specialists, and the application of two types of triage, 'organisational triage' and 'treatment triage', according to the demand for intensive care. Using a different underlying ethical basis upon which to plan for a pandemic crisis could maximise the number of patients receiving intensive care based on individual patients' best interests.
Banks, Caitlin L.; Pai, Mihir M.; McGuirk, Theresa E.; Fregly, Benjamin J.; Patten, Carolynn
2017-01-01
Muscle synergy analysis (MSA) is a mathematical technique that reduces the dimensionality of electromyographic (EMG) data. Used increasingly in biomechanics research, MSA requires methodological choices at each stage of the analysis. Differences in methodological steps affect the overall outcome, making it difficult to compare results across studies. We applied MSA to EMG data collected from individuals post-stroke identified as either responders (RES) or non-responders (nRES) on the basis of a critical post-treatment increase in walking speed. Importantly, no clinical or functional indicators identified differences between the cohort of RES and nRES at baseline. For this exploratory study, we selected the five highest RES and five lowest nRES available from a larger sample. Our goal was to assess how the methodological choices made before, during, and after MSA affect the ability to differentiate two groups with intrinsic physiologic differences based on MSA results. We investigated 30 variations in MSA methodology to determine which choices allowed differentiation of RES from nRES at baseline. Trial-to-trial variability in time-independent synergy vectors (SVs) and time-varying neural commands (NCs) were measured as a function of: (1) number of synergies computed; (2) EMG normalization method before MSA; (3) whether SVs were held constant across trials or allowed to vary during MSA; and (4) synergy analysis output normalization method after MSA. MSA methodology had a strong effect on our ability to differentiate RES from nRES at baseline. Across all 10 individuals and MSA variations, two synergies were needed to reach an average of 90% variance accounted for (VAF). Based on effect sizes, differences in SV and NC variability between groups were greatest using two synergies with SVs that varied from trial-to-trial. Differences in SV variability were clearest using unit magnitude per trial EMG normalization, while NC variability was less sensitive to EMG normalization method. No outcomes were greatly impacted by output normalization method. MSA variability for some, but not all, methods successfully differentiated intrinsic physiological differences inaccessible to traditional clinical or biomechanical assessments. Our results were sensitive to methodological choices, highlighting the need for disclosure of all aspects of MSA methodology in future studies. PMID:28912707
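The synergy extraction itself is typically a non-negative matrix factorisation of the EMG envelope matrix. A minimal sketch using scikit-learn follows, selecting the smallest number of synergies that reaches 90% variance accounted for (VAF); the EMG normalisation and trial-handling variations examined in the study are not reproduced here, and the data are synthetic.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)

# Stand-in EMG envelopes: muscles x time samples, non-negative.
n_muscles, n_samples = 8, 1000
emg = np.abs(rng.normal(size=(n_muscles, n_samples)))

def extract_synergies(E, vaf_target=0.90, max_syn=None):
    """Return (W, H, n_syn, vaf): synergy vectors W (muscles x n_syn) and neural
    commands H (n_syn x time), using the smallest n_syn reaching the VAF target."""
    max_syn = max_syn or E.shape[0]
    for n_syn in range(1, max_syn + 1):
        model = NMF(n_components=n_syn, init="nndsvda", max_iter=2000, random_state=0)
        W = model.fit_transform(E)          # synergy vectors (SVs)
        H = model.components_               # neural commands (NCs)
        vaf = 1.0 - np.linalg.norm(E - W @ H) ** 2 / np.linalg.norm(E) ** 2
        if vaf >= vaf_target:
            return W, H, n_syn, vaf
    return W, H, max_syn, vaf

W, H, n_syn, vaf = extract_synergies(emg)
print(f"{n_syn} synergies, VAF = {vaf:.3f}")
```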
Cui, De-Mi; Yan, Weizhong; Wang, Xiao-Quan; Lu, Lie-Min
2017-10-25
Low strain pile integrity testing (LSPIT), due to its simplicity and low cost, is one of the most popular NDE methods used in pile foundation construction. While performing LSPIT in the field is generally quite simple and quick, determining the integrity of the test piles by analyzing and interpreting the test signals (reflectograms) is still a manual process performed by experienced experts only. For foundation construction sites where the number of piles to be tested is large, it may take days before the expert can complete interpreting all of the piles and delivering the integrity assessment report. Techniques that can automate test signal interpretation, thus shortening the LSPIT's turnaround time, are of great business value and are in great need. Motivated by this need, in this paper, we develop a computer-aided reflectogram interpretation (CARI) methodology that can interpret a large number of LSPIT signals quickly and consistently. The methodology, built on advanced signal processing and machine learning technologies, can be used to assist the experts in performing both qualitative and quantitative interpretation of LSPIT signals. Specifically, the methodology can ease experts' interpretation burden by screening all test piles quickly and identifying a small number of suspected piles for experts to perform manual, in-depth interpretation. We demonstrate the methodology's effectiveness using the LSPIT signals collected from a number of real-world pile construction sites. The proposed methodology can potentially enhance LSPIT and make it even more efficient and effective in quality control of deep foundation construction.
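One way to realise the screening step described above is to pair simple features extracted from each reflectogram with a conventional classifier that flags suspect piles for expert review. The synthetic signals, features and classifier below are illustrative stand-ins rather than the CARI pipeline itself.

```python
import numpy as np
from scipy.signal import find_peaks
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)

def features(signal):
    """Illustrative reflectogram features: signal energy, number of prominent
    peaks, and the relative amplitude of the second-largest peak (a possible
    defect reflection)."""
    s = signal / (np.abs(signal).max() + 1e-12)
    peaks, props = find_peaks(np.abs(s), height=0.2)
    heights = np.sort(props["peak_heights"])[::-1]
    second = heights[1] if heights.size > 1 else 0.0
    return np.array([float(np.sum(s**2)), float(peaks.size), float(second)])

def synth_signal(defect):
    """Toy reflectogram: a toe reflection plus an optional earlier defect echo."""
    t = np.linspace(0, 1, 512)
    s = np.exp(-((t - 0.8) / 0.02) ** 2)                 # toe reflection
    if defect:
        s += 0.6 * np.exp(-((t - 0.4) / 0.02) ** 2)      # defect echo
    return s + 0.05 * rng.normal(size=t.size)

labels = rng.integers(0, 2, 200)                          # 0 = intact, 1 = suspect
X = np.array([features(synth_signal(bool(y))) for y in labels])
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:150], labels[:150])
flags = clf.predict(X[150:])                              # piles flagged for manual review
print("flagged", int(flags.sum()), "of", flags.size, "test piles")
```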
Efficient Gradient-Based Shape Optimization Methodology Using Inviscid/Viscous CFD
NASA Technical Reports Server (NTRS)
Baysal, Oktay
1997-01-01
The formerly developed preconditioned-biconjugate-gradient (PBCG) solvers for the analysis and the sensitivity equations had resulted in very large error reductions per iteration; quadratic convergence was achieved whenever the solution entered the domain of attraction to the root. Its memory requirement was also lower as compared to a direct inversion solver. However, this memory requirement was high enough to preclude the realistic, high grid-density design of a practical 3D geometry. This limitation served as the impetus to the first-year activity (March 9, 1995 to March 8, 1996). Therefore, the major activity for this period was the development of the low-memory methodology for the discrete-sensitivity-based shape optimization. This was accomplished by solving all the resulting sets of equations using an alternating-direction-implicit (ADI) approach. The results indicated that shape optimization problems which required large numbers of grid points could be resolved with a gradient-based approach. Therefore, to better utilize the computational resources, it was recommended that a number of coarse grid cases, using the PBCG method, should initially be conducted to better define the optimization problem and the design space, and obtain an improved initial shape. Subsequently, a fine grid shape optimization, which necessitates using the ADI method, should be conducted to accurately obtain the final optimized shape. The other activity during this period was the interaction with the members of the Aerodynamic and Aeroacoustic Methods Branch of Langley Research Center during one stage of their investigation to develop an adjoint-variable sensitivity method using the viscous flow equations. This method had algorithmic similarities to the variational sensitivity methods and the control-theory approach. However, unlike the prior studies, it was considered for the three-dimensional, viscous flow equations. The major accomplishment in the second period of this project (March 9, 1996 to March 8, 1997) was the extension of the shape optimization methodology for the Thin-Layer Navier-Stokes equations. Both the Euler-based and the TLNS-based analyses compared with the analyses obtained using the CFL3D code. The sensitivities, again from both levels of the flow equations, also compared very well with the finite-differenced sensitivities. A fairly large set of shape optimization cases were conducted to study a number of issues previously not well understood. The testbed for these cases was the shaping of an arrow wing in Mach 2.4 flow. All the final shapes, obtained either from a coarse-grid-based or a fine-grid-based optimization, using either a Euler-based or a TLNS-based analysis, were all re-analyzed using a fine-grid, TLNS solution for their function evaluations. This allowed for a more fair comparison of their relative merits. From the aerodynamic performance standpoint, the fine-grid TLNS-based optimization produced the best shape, and the fine-grid Euler-based optimization produced the lowest cruise efficiency.
Reliability based design optimization: Formulations and methodologies
NASA Astrophysics Data System (ADS)
Agarwal, Harish
Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.
NASA Astrophysics Data System (ADS)
Jeong, Junho; Kim, Seungkeun; Suk, Jinyoung
2017-12-01
In order to overcome the limited range of GPS-based techniques, vision-based relative navigation methods have recently emerged as alternative approaches for high Earth orbit (HEO) or deep space missions. Various vision-based relative navigation systems are therefore used for proximity operations between two spacecraft. When implementing these systems, a sensor placement problem can arise on the exterior of the spacecraft due to its limited space. To deal with the sensor placement, this paper proposes a novel methodology for vision-based relative navigation based on multiple position sensitive diode (PSD) sensors and multiple infrared beacon modules. The proposed method uses an iterated parametric study based on farthest point optimization (FPO) and a constrained extended Kalman filter (CEKF). The algorithms are applied to set the locations of the sensors and to estimate relative positions and attitudes for each combination of PSDs and beacons. Scores for the sensor placement are then calculated with respect to the number of PSDs, the number of beacons, and the accuracy of the relative estimates, and the best-scoring candidate is selected as the sensor placement. Moreover, the results of the iterated estimation show that the accuracy improves dramatically as the number of PSDs increases from one to three.
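The farthest point optimization used to spread sensors over the available surface can be sketched as greedy farthest-point sampling: each new sensor location maximises its minimum distance to the locations already chosen. The candidate grid and sensor count below are illustrative.

```python
import numpy as np

# Candidate mounting locations on a spacecraft face (here: a 1 m x 1 m grid).
xx, yy = np.meshgrid(np.linspace(0, 1, 21), np.linspace(0, 1, 21))
candidates = np.column_stack([xx.ravel(), yy.ravel()])

def farthest_point_placement(candidates, n_sensors, start=0):
    """Greedy farthest-point sampling: each new sensor maximises its minimum
    distance to the sensors already placed."""
    chosen = [start]
    d_min = np.linalg.norm(candidates - candidates[start], axis=1)
    for _ in range(n_sensors - 1):
        nxt = int(np.argmax(d_min))
        chosen.append(nxt)
        d_min = np.minimum(d_min, np.linalg.norm(candidates - candidates[nxt], axis=1))
    return candidates[chosen]

print(farthest_point_placement(candidates, n_sensors=3))
```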
Decision support for redesigning wastewater treatment technologies.
McConville, Jennifer R; Künzle, Rahel; Messmer, Ulrike; Udert, Kai M; Larsen, Tove A
2014-10-21
This paper offers a methodology for structuring the design space for innovative process engineering technology development. The methodology is exemplified in the evaluation of a wide variety of treatment technologies for source-separated domestic wastewater within the scope of the Reinvent the Toilet Challenge. It offers a methodology for narrowing down the decision-making field based on a strict interpretation of treatment objectives for undiluted urine and dry feces and macroenvironmental factors (STEEPLED analysis) which influence decision criteria. Such an evaluation identifies promising paths for technology development such as focusing on space-saving processes or the need for more innovation in low-cost, energy-efficient urine treatment methods. Critical macroenvironmental factors, such as housing density, transportation infrastructure, and climate conditions were found to affect technology decisions regarding reactor volume, weight of outputs, energy consumption, atmospheric emissions, investment cost, and net revenue. The analysis also identified a number of qualitative factors that should be carefully weighed when pursuing technology development; such as availability of O&M resources, health and safety goals, and other ethical issues. Use of this methodology allows for coevolution of innovative technology within context constraints; however, for full-scale technology choices in the field, only very mature technologies can be evaluated.
On the effect of model parameters on forecast objects
NASA Astrophysics Data System (ADS)
Marzban, Caren; Jones, Corinne; Li, Ning; Sandgathe, Scott
2018-04-01
Many physics-based numerical models produce a gridded, spatial field of forecasts, e.g., a temperature map. The field for some quantities generally consists of spatially coherent and disconnected objects. Such objects arise in many problems, including precipitation forecasts in atmospheric models, eddy currents in ocean models, and models of forest fires. Certain features of these objects (e.g., location, size, intensity, and shape) are generally of interest. Here, a methodology is developed for assessing the impact of model parameters on the features of forecast objects. The main ingredients of the methodology include the use of (1) Latin hypercube sampling for varying the values of the model parameters, (2) statistical clustering algorithms for identifying objects, (3) multivariate multiple regression for assessing the impact of multiple model parameters on the distribution (across the forecast domain) of object features, and (4) methods for reducing the number of hypothesis tests and controlling the resulting errors. The final output of the methodology is a series of box plots and confidence intervals that visually display the sensitivities. The methodology is demonstrated on precipitation forecasts from a mesoscale numerical weather prediction model.
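Step (2), identifying coherent objects in a forecast field with a clustering algorithm, can be illustrated by thresholding a synthetic precipitation-like field and clustering the exceeding grid points with DBSCAN, then summarising per-object features of the kind the regression step would use. The field, threshold and clustering parameters are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(5)

# Synthetic "precipitation" field: two smooth bumps plus noise on a 100 x 100 grid.
x, y = np.meshgrid(np.arange(100), np.arange(100))
field = (np.exp(-((x - 30)**2 + (y - 40)**2) / 150.0)
         + np.exp(-((x - 70)**2 + (y - 65)**2) / 80.0)
         + 0.05 * rng.normal(size=x.shape))

# 1) Threshold to keep "raining" grid points, 2) cluster them into objects.
pts = np.column_stack(np.where(field > 0.3))
labels = DBSCAN(eps=2.0, min_samples=5).fit_predict(pts)

# 3) Summarise object features that a regression on model parameters could use.
for k in sorted(set(labels) - {-1}):
    obj = pts[labels == k]
    vals = field[obj[:, 0], obj[:, 1]]
    print(f"object {k}: centroid={obj.mean(axis=0).round(1)}, "
          f"size={obj.shape[0]} cells, mean intensity={vals.mean():.2f}")
```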
Air and radon pathways screening methodologies for the next revision of the E-area PA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dyer, J. A.
The strategic plan for the next E-Area Low-Level Waste Facility Performance Assessment includes recommended changes to the screening criteria used to reduce the number of radioisotopes that are to be considered in the air and radon pathways incorporated into the GoldSim® atmospheric release model (ARM). For the air pathway, a revised screening methodology was developed based on refinement of previous E-Area PA screening approaches and consideration of the strategic plan recommendations. The revised methodology has three sequential screening steps for each radioisotope: (1) volatility test using the Periodic Table of the Elements, (2) stability test based on half-life, and (3) stability test based on volatility as measured by the Henry’s Law constant for the assumed dominant gaseous species or vapor pressure in the case of tritiated water. Of the 1252 radioisotopes listed in the International Commission on Radiological Protection Publication 107, only the 10 that satisfied all three steps of the revised screening methodology will be included in the ARM. They are: Ar-37, Ar-39, Ar-42, C-14, H-3, Hg-194, Hg-203, Kr-81, Kr-85, and Xe-127. For the radon pathway, a revised screening methodology was developed that also has three sequential steps: (1) identify all decay chains that terminate at Rn-222, (2) screen out parents that decay through U-238 because of its 4.5-billion-year primordial half-life, and (3) eliminate remaining parents whose half-life is shorter than one day. Of the 86 possible decay chains leading to Rn-222, six decay chains consist of 15 unique radioisotopes that will be incorporated into the ARM. The 15 radioisotopes are: U-238, Th-234, Pa-234m, Pu-238, U-234, Th-230, Ra-226, Cf-246, Cm-242, Am-242m, Am-242, Np-238, Np-234, Pa-230, and Rn-222.
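The three sequential air-pathway steps translate directly into a filter over a radionuclide table. The sketch below uses placeholder cut-offs and a placeholder volatile-element set rather than the PA's actual criteria and data.

```python
# Minimal sketch of the three sequential air-pathway screening steps, using
# placeholder data; the element list, half-life cut-off and Henry's-law
# threshold are assumptions for illustration only.
GASEOUS_OR_VOLATILE_ELEMENTS = {"H", "C", "Ar", "Kr", "Xe", "Rn", "Hg", "I"}
MIN_HALF_LIFE_DAYS = 1.0          # placeholder stability cut-off
MAX_HENRY_CONSTANT = 1.0e-3       # placeholder volatility cut-off, mol/(m^3*Pa)

def passes_air_screen(nuclide):
    """nuclide: dict with 'element', 'half_life_days', 'henry_constant'."""
    if nuclide["element"] not in GASEOUS_OR_VOLATILE_ELEMENTS:   # step 1: volatility by element
        return False
    if nuclide["half_life_days"] < MIN_HALF_LIFE_DAYS:           # step 2: half-life stability
        return False
    return nuclide["henry_constant"] <= MAX_HENRY_CONSTANT       # step 3: gaseous-species volatility

inventory = [
    {"name": "H-3",    "element": "H",  "half_life_days": 4.5e3, "henry_constant": 1.0e-4},
    {"name": "Cs-137", "element": "Cs", "half_life_days": 1.1e4, "henry_constant": 0.0},
    {"name": "Kr-85",  "element": "Kr", "half_life_days": 3.9e3, "henry_constant": 2.5e-5},
]
print([n["name"] for n in inventory if passes_air_screen(n)])
```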
Möhler, Christian; Wohlfahrt, Patrick; Richter, Christian; Greilich, Steffen
2017-06-01
Electron density is the most important tissue property influencing photon and ion dose distributions in radiotherapy patients. Dual-energy computed tomography (DECT) enables the determination of electron density by combining the information on photon attenuation obtained at two different effective x-ray energy spectra. Most algorithms suggested so far use the CT numbers provided after image reconstruction as input parameters, i.e., are image-based. To explore the accuracy that can be achieved with these approaches, we quantify the intrinsic methodological and calibration uncertainty of the seemingly simplest approach. In the studied approach, electron density is calculated with a one-parametric linear superposition ('alpha blending') of the two DECT images, which is shown to be equivalent to an affine relation between the photon attenuation cross sections of the two x-ray energy spectra. We propose to use the latter relation for empirical calibration of the spectrum-dependent blending parameter. For a conclusive assessment of the electron density uncertainty, we chose to isolate the purely methodological uncertainty component from CT-related effects such as noise and beam hardening. Analyzing calculated spectrally weighted attenuation coefficients, we find universal applicability of the investigated approach to arbitrary mixtures of human tissue with an upper limit of the methodological uncertainty component of 0.2%, excluding high-Z elements such as iodine. The proposed calibration procedure is bias-free and straightforward to perform using standard equipment. Testing the calibration on five published data sets, we obtain very small differences in the calibration result in spite of different experimental setups and CT protocols used. Employing a general calibration per scanner type and voltage combination is thus conceivable. Given the high suitability for clinical application of the alpha-blending approach in combination with a very small methodological uncertainty, we conclude that further refinement of image-based DECT-algorithms for electron density assessment is not advisable. © 2017 American Association of Physicists in Medicine.
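The 'alpha blending' itself is a one-parameter affine combination of the two reduced CT numbers, and the blending parameter can be calibrated by least squares on inserts of known relative electron density. The phantom values below are synthetic, and the fitted parameter need not lie between 0 and 1.

```python
import numpy as np

def reduced_ct(hu):
    """CT number converted to attenuation relative to water: u = HU/1000 + 1."""
    return np.asarray(hu) / 1000.0 + 1.0

def calibrate_alpha(hu_low, hu_high, rho_e_ref):
    """Least-squares estimate of the spectrum-dependent blending parameter from
    phantom inserts with known relative electron density."""
    u_l, u_h = reduced_ct(hu_low), reduced_ct(hu_high)
    d = u_l - u_h
    return float(np.sum(d * (np.asarray(rho_e_ref) - u_h)) / np.sum(d * d))

def electron_density(hu_low, hu_high, alpha):
    """Alpha blending: rho_e = alpha*u_low + (1 - alpha)*u_high."""
    return alpha * reduced_ct(hu_low) + (1.0 - alpha) * reduced_ct(hu_high)

# Synthetic calibration phantom (values are illustrative, not measured data).
hu_low  = np.array([-95.0, 60.0, 220.0, 900.0])   # e.g. low-kVp image
hu_high = np.array([-80.0, 45.0, 150.0, 600.0])   # e.g. high-kVp image
rho_ref = np.array([0.92,  1.05, 1.10,  1.45])    # known relative electron densities

alpha = calibrate_alpha(hu_low, hu_high, rho_ref)
print("alpha =", round(alpha, 3))
print("rho_e =", np.round(electron_density(hu_low, hu_high, alpha), 3))
```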
Traumatic brain injury: methodological approaches to estimate health and economic outcomes.
Lu, Juan; Roe, Cecilie; Aas, Eline; Lapane, Kate L; Niemeier, Janet; Arango-Lasprilla, Juan Carlos; Andelic, Nada
2013-12-01
The effort to standardize the methodology and adherence to recommended principles for all economic evaluations has been emphasized in medical literature. The objective of this review is to examine whether economic evaluations in traumatic brain injury (TBI) research have been compliant with existing guidelines. Medline search was performed between January 1, 1995 and August 11, 2012. All original TBI-related full economic evaluations were included in the study. Two authors independently rated each study's methodology and data presentation to determine compliance to the 10 methodological principles recommended by Blackmore et al. Descriptive analysis was used to summarize the data. Inter-rater reliability was assessed with Kappa statistics. A total of 28 studies met the inclusion criteria. Eighteen of these studies described cost-effectiveness, seven cost-benefit, and three cost-utility analyses. The results showed a rapid growth in the number of published articles on the economic impact of TBI since 2000 and an improvement in their methodological quality. However, overall compliance with recommended methodological principles of TBI-related economic evaluation has been deficient. On average, about six of the 10 criteria were followed in these publications, and only two articles met all 10 criteria. These findings call for an increased awareness of the methodological standards that should be followed by investigators both in performance of economic evaluation and in reviews of evaluation reports prior to publication. The results also suggest that all economic evaluations should be made by following the guidelines within a conceptual framework, in order to facilitate evidence-based practices in the field of TBI.
NASA Astrophysics Data System (ADS)
Siade, Adam J.; Hall, Joel; Karelse, Robert N.
2017-11-01
Regional groundwater flow models play an important role in decision making regarding water resources; however, the uncertainty embedded in model parameters and model assumptions can significantly hinder the reliability of model predictions. One way to reduce this uncertainty is to collect new observation data from the field. However, determining where and when to obtain such data is not straightforward. There exist a number of data-worth and experimental design strategies developed for this purpose. However, these studies often ignore issues related to real-world groundwater models such as computational expense, existing observation data, high-parameter dimension, etc. In this study, we propose a methodology, based on existing methods and software, to efficiently conduct such analyses for large-scale, complex regional groundwater flow systems for which there is a wealth of available observation data. The method utilizes the well-established d-optimality criterion, and the minimax criterion for robust sampling strategies. The so-called Null-Space Monte Carlo method is used to reduce the computational burden associated with uncertainty quantification. And, a heuristic methodology, based on the concept of the greedy algorithm, is proposed for developing robust designs with subsets of the posterior parameter samples. The proposed methodology is tested on a synthetic regional groundwater model, and subsequently applied to an existing, complex, regional groundwater system in the Perth region of Western Australia. The results indicate that robust designs can be obtained efficiently, within reasonable computational resources, for making regional decisions regarding groundwater level sampling.
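Under a linearised (first-order, Jacobian-based) treatment, the worth of a candidate observation set is the reduction in forecast variance it yields, and a greedy search over candidates gives a cheap approximate design. The covariances and sensitivities below are random stand-ins for model outputs, and the d-optimality and minimax criteria of the study are simplified here to a single forecast-variance objective.

```python
import numpy as np

rng = np.random.default_rng(6)

# Random stand-ins for quantities a groundwater model would supply.
n_par, n_obs = 30, 60
C = np.eye(n_par)                      # prior parameter covariance
J = rng.normal(size=(n_obs, n_par))    # sensitivities of candidate observations to parameters
s = rng.normal(size=n_par)             # sensitivity of the forecast to parameters
r = 0.5                                # observation error variance (assumed)

def forecast_variance(selected):
    """First-order forecast variance after assimilating the selected observations."""
    if not selected:
        return float(s @ C @ s)
    Js = J[selected]
    P = np.linalg.inv(np.linalg.inv(C) + Js.T @ Js / r)   # posterior parameter covariance
    return float(s @ P @ s)

def greedy_design(n_new):
    """Greedily add the candidate that most reduces the forecast variance."""
    chosen = []
    for _ in range(n_new):
        best = min((i for i in range(n_obs) if i not in chosen),
                   key=lambda i: forecast_variance(chosen + [i]))
        chosen.append(best)
    return chosen

design = greedy_design(5)
print("selected:", design, " prior var:", round(forecast_variance([]), 2),
      " posterior var:", round(forecast_variance(design), 2))
```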
Martens, Leon; Goode, Grahame; Wold, Johan F H; Beck, Lionel; Martin, Georgina; Perings, Christian; Stolt, Pelle; Baggerman, Lucas
2014-01-01
To conduct a pilot study on the potential to optimise care pathways in syncope/Transient Loss of Consciousness management by using Lean Six Sigma methodology while maintaining compliance with ESC and/or NICE guidelines. Five hospitals in four European countries took part. The Lean Six Sigma methodology consisted of 3 phases: 1) Assessment phase, in which baseline performance was mapped in each centre, processes were evaluated and a new operational model was developed with an improvement plan that included best practices and change management; 2) Improvement phase, in which optimisation pathways and standardised best practice tools and forms were developed and implemented. Staff were trained on new processes and change-management support provided; 3) Sustaining phase, which included support, refinement of tools and metrics. The impact of the implementation of new pathways was evaluated on number of tests performed, diagnostic yield, time to diagnosis and compliance with guidelines. One hospital with focus on geriatric populations was analysed separately from the other four. With the new pathways, there was a 59% reduction in the average time to diagnosis (p = 0.048) and a 75% increase in diagnostic yield (p = 0.007). There was a marked reduction in repetitions of diagnostic tests and improved prioritisation of indicated tests. Applying a structured Lean Six Sigma based methodology to pathways for syncope management has the potential to improve time to diagnosis and diagnostic yield.
An adaptive response surface method for crashworthiness optimization
NASA Astrophysics Data System (ADS)
Shi, Lei; Yang, Ren-Jye; Zhu, Ping
2013-11-01
Response surface-based design optimization has been commonly used for optimizing large-scale design problems in the automotive industry. However, most response surface models are built by a limited number of design points without considering data uncertainty. In addition, the selection of a response surface in the literature is often arbitrary. This article uses a Bayesian metric to systematically select the best available response surface among several candidates in a library while considering data uncertainty. An adaptive, efficient response surface strategy, which minimizes the number of computationally intensive simulations, was developed for design optimization of large-scale complex problems. This methodology was demonstrated by a crashworthiness optimization example.
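As a stand-in for the Bayesian metric, the sketch below scores a small library of candidate response surfaces with the Bayesian information criterion under a Gaussian error model and picks the best. The design data are synthetic and the candidate library is limited to polynomial surfaces.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)

# Stand-in for a handful of expensive crash simulations over two design variables.
X = rng.uniform(-1.0, 1.0, size=(25, 2))
y = 1.0 + 2.0 * X[:, 0] - 1.5 * X[:, 1] + 3.0 * X[:, 0] * X[:, 1] + 0.1 * rng.normal(size=25)

def bic_score(degree):
    """BIC under a Gaussian error model: n*ln(RSS/n) + k*ln(n); lower is better."""
    feats = PolynomialFeatures(degree).fit_transform(X)
    model = LinearRegression(fit_intercept=False).fit(feats, y)
    rss = float(np.sum((y - model.predict(feats)) ** 2))
    n, k = len(y), feats.shape[1]
    return n * np.log(rss / n) + k * np.log(n)

library = {f"polynomial degree {d}": bic_score(d) for d in (1, 2, 3)}
best = min(library, key=library.get)
print({name: round(score, 1) for name, score in library.items()}, "-> selected:", best)
```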
NASA Astrophysics Data System (ADS)
Vilhelmsen, Troels N.; Ferré, Ty P. A.
2016-04-01
Hydrological models are often developed to forecast future behavior in response to natural or human-induced changes in the stresses affecting hydrologic systems. Commonly, these models are conceptualized and calibrated based on existing data/information about the hydrological conditions. However, most hydrologic systems lack sufficient data to constrain models with adequate certainty to support robust decision making. Therefore, a key element of a hydrologic study is the selection of additional data to improve model performance. Given the nature of hydrologic investigations, it is not practical to select data sequentially, i.e., to choose the next observation, collect it, refine the model, and then repeat the process. Rather, for timing and financial reasons, measurement campaigns include multiple wells or sampling points. There is a growing body of literature aimed at defining the expected data worth based on existing models. However, these studies are almost all limited to identifying single additional observations. In this study, we present a methodology for simultaneously selecting multiple potential new observations based on their expected ability to reduce the uncertainty of the forecasts of interest. This methodology is based on linear estimates of the predictive uncertainty, and it can be used to determine the optimal combinations of measurements (location and number) needed to reduce the uncertainty of multiple predictions. The outcome of the analysis is an estimate of the optimal sampling locations and the optimal number of samples, as well as a probability map showing the locations within the investigated area that are most likely to provide useful information about the forecasts of interest.
Improving data retention in EEG research with children using child-centered eye tracking
Maguire, Mandy J.; Magnon, Grant; Fitzhugh, Anna E.
2014-01-01
Background Event Related Potentials (ERPs) elicited by visual stimuli have increased our understanding of developmental disorders and adult cognitive abilities for decades; however, these studies are very difficult with populations who cannot sustain visual attention such as infants and young children. Current methods for studying such populations include requiring a button response, which may be impossible for some participants, and experimenter monitoring, which is subject to error, highly variable, and spatially imprecise. New Method We developed a child-centered methodology to integrate EEG data acquisition and eye-tracking technologies that uses “attention-getters” in which stimulus display is contingent upon the child’s gaze. The goal was to increase the number of trials retained. Additionally, we used the eye-tracker to categorize and analyze the EEG data based on gaze to specific areas of the visual display, compared to analyzing based on stimulus presentation. Results Compared with Existing Methods The number of trials retained was substantially improved using the child-centered methodology compared to a button-press response in 7–8 year olds. In contrast, analyzing the EEG based on eye gaze to specific points within the visual display as opposed to stimulus presentation provided too few trials for reliable interpretation. Conclusions By using the linked EEG-eye-tracker we significantly increased data retention. With this method, studies can be completed with fewer participants and a wider range of populations. However, caution should be used when epoching based on participants’ eye gaze because, in this case, this technique provided substantially fewer trials. PMID:25251555
Machado, Alexandre F; Baker, Julien S; Figueira Junior, Aylton J; Bocalini, Danilo S
2017-05-04
HIIT whole body (HWB)-based exercise is a new calisthenics exercise programme approach that can be considered an effective and safe method to improve physical fitness and body composition. HWB is a method that can be applied to different populations and ages. The purpose of this study was to describe possible methodologies for performing physical training based on whole-body exercise in healthy subjects. The HWB sessions consist of a repeated stimulus based on high-intensity exercise that also include monitoring time to effort, time to recuperation and session time. The exercise intensity is related to the maximal number of movements possible in a given time; therefore, the exercise sessions can be characterized as maximal. The intensity can be recorded using ratings of perceived exertion. Weekly training frequency and exercise selection should be structured according to individual subject functional fitness. Using this simple method, there is potential for greater adherence to physical activity which can promote health benefits to all members of society. © 2017 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Wang, Dai; Gao, Junyu; Li, Pan; Wang, Bin; Zhang, Cong; Saxena, Samveg
2017-08-01
Modeling PEV travel and charging behavior is the key to estimate the charging demand and further explore the potential of providing grid services. This paper presents a stochastic simulation methodology to generate itineraries and charging load profiles for a population of PEVs based on real-world vehicle driving data. In order to describe the sequence of daily travel activities, we use the trip chain model which contains the detailed information of each trip, namely start time, end time, trip distance, start location and end location. A trip chain generation method is developed based on the Naive Bayes model to generate a large number of trips which are temporally and spatially coupled. We apply the proposed methodology to investigate the multi-location charging loads in three different scenarios. Simulation results show that home charging can meet the energy demand of the majority of PEVs in an average condition. In addition, we calculate the lower bound of charging load peak on the premise of lowest charging cost. The results are instructive for the design and construction of charging facilities to avoid excessive infrastructure.
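A heavily simplified stand-in for the simulation loop: sample a home-arrival time and daily distance per vehicle, convert distance to energy, and accumulate uncontrolled home charging into an aggregate load profile. The distributions and charger parameters are assumptions, and the Naive Bayes trip-chain generation of the paper is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(8)

# Illustrative parameters (not taken from the paper's data set).
KWH_PER_KM = 0.18      # energy use
CHARGE_KW = 3.3        # home charger power
N_PEV = 1000
SLOTS = 24 * 4         # 15-minute resolution

def daily_itinerary():
    """Tiny stand-in for a trip-chain model: one randomised home-arrival time
    and daily driving distance per vehicle."""
    arrive_home = rng.normal(18.0, 1.0) % 24.0
    distance = max(5.0, rng.normal(40.0, 15.0))
    return arrive_home, distance

load = np.zeros(SLOTS)                                  # aggregate charging load, kW
for _ in range(N_PEV):
    arrive_h, dist_km = daily_itinerary()
    energy_kwh = dist_km * KWH_PER_KM                   # energy to recover at home
    n_slots = int(np.ceil(energy_kwh / CHARGE_KW * 4))  # charging duration in 15-min slots
    start = int(arrive_h * 4)
    for k in range(n_slots):                            # uncontrolled charging on arrival
        load[(start + k) % SLOTS] += CHARGE_KW          # (last slot approximated at full power)

print(f"aggregate peak {load.max():.0f} kW at hour {np.argmax(load) / 4:.2f}")
```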
Lima, Juliana Maria; Salmazo Vieira, Plínio; Cavalcante de Oliveira, Arthur Henrique; Cardoso, Carmen Lúcia
2016-08-07
Nucleoside diphosphate kinase from Leishmania spp. (LmNDKb) has recently been described as a potential drug target to treat leishmaniasis disease. Therefore, screening of LmNDKb ligands requires methodologies that mimic the conditions under which LmNDKb acts in biological systems. Here, we compare two label-free methodologies that could help screen LmNDKb ligands and measure NDKb activity: an offline LC-UV assay for soluble LmNDKb and an online two-dimensional LC-UV system based on LmNDKb immobilised on a silica capillary. The target enzyme was immobilised on the silica capillary via Schiff base formation (to give LmNDKb-ICER-Schiff) or affinity attachment (to give LmNDKb-ICER-His). Several aspects of the ICERs resulting from these procedures were compared, namely kinetic parameters, stability, and procedure steps. Both the LmNDKb immobilisation routes minimised the conformational changes and preserved the substrate binding sites. However, considering the number of steps involved in the immobilisation procedure, the cost of reagents, and the stability of the immobilised enzyme, immobilisation via Schiff base formation proved to be the optimal procedure.
Nonlinear data assimilation: towards a prediction of the solar cycle
NASA Astrophysics Data System (ADS)
Svedin, Andreas
The solar cycle is the cyclic variation of solar activity, with a span of 9-14 years. The prediction of the solar cycle is an important and unsolved problem with implications for communications, aviation and other aspects of our high-tech society. Our interest is model-based prediction, and we present a self-consistent procedure for parameter estimation and model state estimation, even when only one of several model variables can be observed. Data assimilation is the art of comparing, combining and transferring observed data into a mathematical model or computer simulation. We use the 3DVAR methodology, based on the notion of least squares, to present an implementation of a traditional data assimilation. Using the Shadowing Filter — a recently developed method for nonlinear data assimilation — we outline a path towards model based prediction of the solar cycle. To achieve this end we solve a number of methodological challenges related to unobserved variables. We also provide a new framework for interpretation that can guide future predictions of the Sun and other astrophysical objects.
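The 3DVAR step referred to above is a least-squares blend of a model background with observations; for linear observation operators and Gaussian errors it has the closed form x_a = x_b + BH^T(HBH^T + R)^{-1}(y - Hx_b). A toy example follows with a three-variable state and a single observed component, mirroring the partially observed setting described, with illustrative covariances.

```python
import numpy as np

def analysis_3dvar(x_b, B, H, R, y):
    """Gaussian/linear 3DVAR analysis: minimises
    (x - x_b)^T B^{-1} (x - x_b) + (y - Hx)^T R^{-1} (y - Hx)."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    return x_b + K @ (y - H @ x_b)

# Toy three-variable state with only the first component observed; the
# off-diagonal background covariance lets the single observation also
# update the second, unobserved variable.
x_b = np.array([1.0, 0.0, 20.0])                   # background (model) state
B = np.array([[0.5, 0.2, 0.0],
              [0.2, 0.5, 0.0],
              [0.0, 0.0, 4.0]])                    # background error covariance (assumed)
H = np.array([[1.0, 0.0, 0.0]])                    # observation operator
R = np.array([[0.1]])                              # observation error covariance
y = np.array([1.6])                                # the single observation

print(analysis_3dvar(x_b, B, H, R, y))             # -> roughly [1.5, 0.2, 20.0]
```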
van Gelder, P.H.A.J.M.; Nijs, M.
2011-01-01
Decisions about pharmacotherapy are taken by medical doctors and authorities based on comparative studies on the use of medications. In studies on fertility treatments in particular, methodological quality is of utmost importance in the application of evidence-based medicine and systematic reviews. Nevertheless, flaws and omissions appear quite regularly in these types of studies. The current study aims to present an overview of some of the typical statistical flaws, illustrated by a number of example studies published in peer-reviewed journals. Based on an investigation of eleven randomly selected studies on fertility treatments with cryopreservation, it appeared that the methodological quality of these studies often did not fulfil the required statistical criteria. The following statistical flaws were identified: flaws in study design, patient selection, and units of analysis or in the definition of the primary endpoints. Other errors could be found in p-value and power calculations or in critical p-value definitions. Proper interpretation of the results and/or use of these study results in a meta-analysis should therefore be conducted with care. PMID:24753877
NASA Astrophysics Data System (ADS)
Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.
Today's software for aerospace systems is typically very complex. This is due to the increasing number of features as well as the high demands for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is rarely used as yet for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive workflow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technology), an integrated design environment that covers all the requirements of our methodology. SCADE Suite is well established in the avionics and defense, rail transportation, energy and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
Simoens, Steven
2013-01-01
Objectives This paper aims to assess the methodological quality of economic evaluations included in Belgian reimbursement applications for Class 1 drugs. Materials and Methods For 19 reimbursement applications submitted during 2011 and Spring 2012, a descriptive analysis assessed the methodological quality of the economic evaluation, evaluated the assessment of that economic evaluation by the Drug Reimbursement Committee and the response to that assessment by the company. Compliance with methodological guidelines issued by the Belgian Healthcare Knowledge Centre was assessed using a detailed checklist of 23 methodological items. The rate of compliance was calculated based on the number of economic evaluations for which the item was applicable. Results Economic evaluations tended to comply with guidelines regarding perspective, target population, subgroup analyses, comparator, use of comparative clinical data and final outcome measures, calculation of costs, incremental analysis, discounting and time horizon. However, more attention needs to be paid to the description of limitations of indirect comparisons, the choice of an appropriate analytic technique, the expression of unit costs in values for the current year, the estimation and valuation of outcomes, the presentation of results of sensitivity analyses, and testing the face validity of model inputs and outputs. Also, a large variation was observed in the scope and depth of the quality assessment by the Drug Reimbursement Committee. Conclusions Although general guidelines exist, pharmaceutical companies and the Drug Reimbursement Committee would benefit from the existence of a more detailed checklist of methodological items that need to be reported in an economic evaluation. PMID:24386474
Alayli-Goebbels, Adrienne F G; Evers, Silvia M A A; Alexeeva, Daria; Ament, André J H A; de Vries, Nanne K; Tilly, Jan C; Severens, Johan L
2014-06-01
The objective of this study was to review methodological quality of economic evaluations of lifestyle behavior change interventions (LBCIs) and to examine how they address methodological challenges for public health economic evaluation identified in the literature. Pubmed and the NHS economic evaluation database were searched for published studies in six key areas for behavior change: smoking, physical activity, dietary behavior, (illegal) drug use, alcohol use and sexual behavior. From included studies (n = 142), we extracted data on general study characteristics, characteristics of the LBCIs, methodological quality and handling of methodological challenges. Economic evaluation evidence for LBCIs showed a number of weaknesses: methods, study design and characteristics of evaluated interventions were not well reported; methodological quality showed several shortcomings and progress with addressing methodological challenges remained limited. Based on the findings of this review we propose an agenda for improving future evidence to support decision-making. Recommendations for practice include improving reporting of essential study details and increasing adherence with good practice standards. Recommendations for research methods focus on mapping out complex causal pathways for modeling, developing measures to capture broader domains of wellbeing and community outcomes, testing methods for considering equity, identifying relevant non-health sector costs and advancing methods for evidence synthesis. © The Author 2013. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Barroso-Maldonado, J. M.; Belman-Flores, J. M.; Ledesma, S.; Aceves, S. M.
2018-06-01
A key problem faced in the design of heat exchangers, especially for cryogenic applications, is the determination of convective heat transfer coefficients in two-phase flow such as condensation and boiling of non-azeotropic refrigerant mixtures. This paper proposes and evaluates three models for estimating the convective coefficient during boiling. These models are developed using computational intelligence techniques. The performance of the proposed models is evaluated using the mean relative error (mre), and compared to two existing models: the modified Granryd's correlation and the Silver-Bell-Ghaly method. The three proposed models are distinguished by their architecture. The first is based on directly measured parameters (DMP-ANN), the second is based on equivalent Reynolds and Prandtl numbers (eq-ANN), and the third on effective Reynolds and Prandtl numbers (eff-ANN). The results demonstrate that the proposed artificial neural network (ANN)-based approaches greatly outperform available methodologies. While Granryd's correlation predicts experimental data within a mean relative error mre = 44% and the S-B-G method produces mre = 42%, DMP-ANN has mre = 7.4% and eff-ANN has mre = 3.9%. Considering that eff-ANN has the lowest mean relative error (one tenth of previously available methodologies) and the broadest range of applicability, it is recommended for future calculations. Implementation is straightforward within a variety of platforms and the matrices with the ANN weights are given in the appendix for efficient programming.
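As an illustration of the modeling approach summarized above, the sketch below trains a small feed-forward ANN regressor and scores it with the mean relative error (mre) metric used to rank the correlations; the feature set, data, and network size are placeholders rather than the authors' experimental configuration.

```python
# Hypothetical sketch: an MLP regressor for a boiling heat-transfer
# coefficient and the mean relative error (mre) metric used above.
# Feature names and data are placeholders, not the paper's dataset.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Placeholder inputs: e.g. mass flux, heat flux, vapor quality, pressure
X = rng.uniform(size=(500, 4))
h_true = 1000.0 + 4000.0 * X[:, 0] * (1 - X[:, 2]) + 500.0 * X[:, 1]  # toy target [W/m^2 K]

X_tr, X_te, y_tr, y_te = train_test_split(X, h_true, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)

def mre(y_pred, y_meas):
    """Mean relative error, as used to compare the correlations and ANN models."""
    return np.mean(np.abs(y_pred - y_meas) / np.abs(y_meas))

print(f"mre = {100 * mre(ann.predict(X_te), y_te):.1f}%")
```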
Callender, C O; Koizumi, N; Miles, P V; Melancon, J K
2016-09-01
The purpose was to review the increase in minority organ donation. The methodology was based on the efforts of the DC Organ Donor Program and the Dow Take Initiative Program that focused on increasing donors among African Americans (AAs). From 1982 to 1988, AA donor card signings increased from 20/month to 750/month, and Black donations doubled. A review of the data, including face-to-face grassroots presentations combined with national media, was conducted. Gallup polls in 1985 and 1990 indicated a tripling of black awareness of transplantation and the number of blacks signing donor cards. Based on the applied successful methodologies, in 1991, the National Minority Organ Tissue Transplant Education Program was established targeting AA, Hispanic, Asian, and other ethnic groups. A review of the United Network for Organ Sharing (UNOS) database from 1990 to 2010 was accomplished. Nationally, ethnic minority organ donors per million (ODM) increased from 8-10 ODM (1982) to 35 ODM (AA and Latino/Hispanics) in 2002. In 1995, ODMs were white 34.2, black 33.1, Hispanic 31.5, and Asian 17.9. In 2010, Black organ donors per million totaled 35.36 versus white 27.07, Hispanic 25.59, and Asian 14.70. Based on the data retrieved from UNOS in 2010, blacks were ranked above whites and other ethnic minority populations as the number one ethnic group of organ donors per million in the US. Copyright © 2016 Elsevier Inc. All rights reserved.
Tausczik, Yla; Faasse, Kate; Pennebaker, James W; Petrie, Keith J
2012-01-01
Web-based methodologies may provide a new and unique insight into public response to an infectious disease outbreak. This naturalistic study investigates the effectiveness of new web-based methodologies in assessing anxiety and information seeking in response to the 2009 H1N1 outbreak by examining language use in weblogs ("blogs"), newspaper articles, and web-based information seeking. Language use in blogs and newspaper articles was assessed using Linguistic Inquiry and Word Count, and information seeking was examined using the number of daily visits to H1N1-relevant Wikipedia articles. The results show that blogs mentioning "swine flu" used significantly higher levels of anxiety, health, and death words and lower levels of positive emotion words than control blogs. Change in language use on blogs was strongly related to change in language use in newspaper coverage for the same day. Both the measure of anxiety in blogs mentioning "swine flu" and the number of Wikipedia visits followed similar trajectories, peaking shortly after the announcement of H1N1 and then declining rapidly. Anxiety measured in blogs preceded information seeking on Wikipedia. These results show that the public reaction to H1N1 was rapid and short-lived. This research suggests that analysis of web behavior can provide a source of naturalistic data on the level and changing pattern of public anxiety and information seeking following the outbreak of a public health emergency.
NASA Astrophysics Data System (ADS)
Kurceren, Ragip; Modestino, James W.
1998-12-01
The use of forward error-control (FEC) coding, possibly in conjunction with ARQ techniques, has emerged as a promising approach for video transport over ATM networks for cell-loss recovery and/or bit error correction, such as might be required for wireless links. Although FEC provides cell-loss recovery capabilities it also introduces transmission overhead which can possibly cause additional cell losses. A methodology is described to maximize the number of video sources multiplexed at a given quality of service (QoS), measured in terms of decoded cell loss probability, using interlaced FEC codes. The transport channel is modelled as a block interference channel (BIC) and the multiplexer as single server, deterministic service, finite buffer supporting N users. Based upon an information-theoretic characterization of the BIC and large deviation bounds on the buffer overflow probability, the described methodology provides theoretically achievable upper limits on the number of sources multiplexed. Performance of specific coding techniques using interlaced nonbinary Reed-Solomon (RS) codes and binary rate-compatible punctured convolutional (RCPC) codes is illustrated.
Windowed Green function method for the Helmholtz equation in the presence of multiply layered media
NASA Astrophysics Data System (ADS)
Bruno, O. P.; Pérez-Arancibia, C.
2017-06-01
This paper presents a new methodology for the solution of problems of two- and three-dimensional acoustic scattering (and, in particular, two-dimensional electromagnetic scattering) by obstacles and defects in the presence of an arbitrary number of penetrable layers. Relying on the use of certain slow-rise windowing functions, the proposed windowed Green function approach efficiently evaluates oscillatory integrals over unbounded domains, with high accuracy, without recourse to the highly expensive Sommerfeld integrals that have typically been used to account for the effect of underlying planar multilayer structures. The proposed methodology, whose theoretical basis was presented in the recent contribution (Bruno et al. 2016 SIAM J. Appl. Math. 76, 1871-1898. (doi:10.1137/15M1033782)), is fast, accurate, flexible and easy to implement. Our numerical experiments demonstrate that the numerical errors resulting from the proposed approach decrease faster than any negative power of the window size. In a number of examples considered in this paper, the proposed method is up to thousands of times faster, for a given accuracy, than corresponding methods based on the use of Sommerfeld integrals.
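The windowing idea can be made concrete with a generic C-infinity "slow-rise" window that equals 1 on an inner region and decays smoothly to 0 at the truncation radius; the sketch below uses an assumed bump-type form, not necessarily the exact function of Bruno et al.

```python
# Illustrative sketch of a "slow-rise" window: equal to 1 on |x| <= c*A,
# decaying smoothly to 0 at |x| = A with vanishing derivatives.  This is a
# generic C^infinity bump-type window, given as an assumption rather than
# the exact function used in the paper.
import numpy as np

def slow_rise_window(x, A, c=0.5):
    x = np.abs(np.asarray(x, dtype=float))
    w = np.zeros_like(x)
    w[x <= c * A] = 1.0
    mid = (x > c * A) & (x < A)
    u = (x[mid] - c * A) / (A - c * A)          # u in (0, 1) on the decay region
    w[mid] = np.exp(2.0 * np.exp(-1.0 / u) / (u - 1.0))
    return w

x = np.linspace(-2.0, 2.0, 9)
print(slow_rise_window(x, A=1.5))
```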
An approach to solve replacement problems under intuitionistic fuzzy nature
NASA Astrophysics Data System (ADS)
Balaganesan, M.; Ganesan, K.
2018-04-01
Because of the imprecision involved in day-to-day problems, researchers use fuzzy sets in their discussions of replacement problems. The aim of this paper is to solve replacement-theory problems with triangular intuitionistic fuzzy numbers. An effective methodology based on a fuzziness index and a location index is proposed to determine the optimal solution of the replacement problem. A numerical example is presented to validate the proposed method.
NASA Technical Reports Server (NTRS)
Rivera, J. M.; Simpson, R. W.
1980-01-01
The aerial relay system network design problem is discussed. A generalized branch-and-bound algorithm is developed which can consider a variety of optimization criteria, such as minimum passenger travel time and minimum liner and feeder operating costs. The algorithm, although efficient, is practical mainly for small networks, because its computation time grows exponentially with the number of variables.
Montella, Emma; Di Cicco, Maria Vincenza; Ferraro, Anna; Centobelli, Piera; Raiola, Eliana; Triassi, Maria; Improta, Giovanni
2017-06-01
Nowadays, the monitoring and prevention of healthcare-associated infections (HAIs) is a priority for the healthcare sector. In this article, we report on the application of the Lean Six Sigma (LSS) methodology to reduce the number of patients affected by sentinel bacterial infections who are at risk of HAI. The LSS methodology was applied in the general surgery department by using a multidisciplinary team of both physicians and academics. Data on more than 20 000 patients who underwent a wide range of surgical procedures between January 2011 and December 2014 were collected to conduct the study using the departmental information system. The most prevalent sentinel bacteria were determined among the infected patients. The preintervention (January 2011 to December 2012) and postintervention (January 2013 to December 2014) phases were compared to analyze the effects of the methodology implemented. The methodology allowed the identification of variables that influenced the risk of HAIs and the implementation of corrective actions to improve the care process, thereby reducing the percentage of infected patients. The improved process resulted in a 20% reduction in the average number of hospitalization days between preintervention and control phases, and a decrease in the mean (SD) number of days of hospitalization amounted to 36 (15.68), with a data distribution around 3 σ. The LSS is a helpful strategy that ensures a significant decrease in the number of HAIs in patients undergoing surgical interventions. The implementation of this intervention in the general surgery departments resulted in a significant reduction in both the number of hospitalization days and the number of patients affected by HAIs. This approach, together with other tools for reducing the risk of infection (surveillance, epidemiological guidelines, and training of healthcare personnel), could be applied to redesign and improve a wide range of healthcare processes. © 2016 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Lowe, Robert; Ziemke, Tom
2010-09-01
The somatic marker hypothesis (SMH) posits that the role of emotions and mental states in decision-making manifests through bodily responses to stimuli of import to the organism's welfare. The Iowa Gambling Task (IGT), proposed by Bechara and Damasio in the mid-1990s, has provided the major source of empirical validation to the role of somatic markers in the service of flexible and cost-effective decision-making in humans. In recent years the IGT has been the subject of much criticism concerning: (1) whether measures of somatic markers reveal that they are important for decision-making as opposed to behaviour preparation; (2) the underlying neural substrate posited as critical to decision-making of the type relevant to the task; and (3) aspects of the methodological approach used, particularly on the canonical version of the task. In this paper, a cognitive robotics methodology is proposed to explore a dynamical systems approach as it applies to the neural computation of reward-based learning and issues concerning embodiment. This approach is particularly relevant in light of a strongly emerging alternative hypothesis to the SMH, the reversal learning hypothesis, which links, behaviourally and neurocomputationally, a number of more or less complex reward-based decision-making tasks, including the 'A-not-B' task - already subject to dynamical systems investigations with a focus on neural activation dynamics. It is also suggested that the cognitive robotics methodology may be used to extend systematically the IGT benchmark to more naturalised, but nevertheless controlled, settings that might better explore the extent to which the SMH, and somatic states per se, impact on complex decision-making.
78 FR 16300 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-14
... data collection efforts. Methodological findings may be presented externally in technical papers at... individual survey may represent several methodological improvement projects. \\2\\ This number refers to the...
Reconsidering community-based health promotion: promise, performance, and potential.
Merzel, Cheryl; D'Afflitti, Joanna
2003-04-01
Contemporary public health emphasizes a community-based approach to health promotion and disease prevention. The evidence from the past 20 years indicates, however, that many community-based programs have had only modest impact, with the notable exception of a number of HIV prevention programs. To better understand the reasons for these outcomes, we conducted a systematic literature review of 32 community-based prevention programs. Reasons for poor performance include methodological challenges to study design and evaluation, concurrent secular trends, smaller-than-expected effect sizes, limitations of the interventions, and limitations of theories used. The effectiveness of HIV programs appears to be related in part to extensive formative research and an emphasis on changing social norms.
An empirical study using permutation-based resampling in meta-regression
2012-01-01
Background In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality that may not hold in small samples. Creation of a distribution from the observed trials using permutation methods to calculate P values may allow for less spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation to explore the differences in results for meta-analyses on a small number of trials using standard large sample approaches versus permutation-based methods for meta-regression. Methods We isolated a sample of randomized controlled clinical trials (RCTs) for interventions that have a small number of trials (herbal medicine trials). Trials were then grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large sample methods and permutation-based methods in our backwards stepwise regression the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical for 22% (2/9) of the cases. Conclusions We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815
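A minimal sketch of the permutation idea, assuming an inverse-variance weighted meta-regression and simulated trial data: the quality covariate is repeatedly shuffled across trials and the slope refitted to build a reference distribution for the P value.

```python
# Hedged sketch of a permutation P value for a meta-regression slope.
# Effect sizes, variances and the quality covariate are simulated; the
# weighted-least-squares fit stands in for a standard meta-regression.
import numpy as np

rng = np.random.default_rng(1)
k = 8                                    # small number of trials
quality = rng.integers(1, 6, size=k)     # e.g. Jadad scores
var_i = rng.uniform(0.02, 0.2, size=k)   # within-trial variances
effect = 0.3 + 0.05 * quality + rng.normal(0, np.sqrt(var_i))

def wls_slope(y, x, w):
    """Slope of a weighted least-squares fit of y on x with weights w."""
    X = np.column_stack([np.ones_like(x), x])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[1]

w = 1.0 / var_i
b_obs = wls_slope(effect, quality.astype(float), w)

n_perm = 5000
b_perm = np.array([wls_slope(effect, rng.permutation(quality).astype(float), w)
                   for _ in range(n_perm)])
p_perm = (1 + np.sum(np.abs(b_perm) >= np.abs(b_obs))) / (n_perm + 1)
print(f"slope = {b_obs:.3f}, permutation P = {p_perm:.3f}")
```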
Counting glomeruli and podocytes: rationale and methodologies
Puelles, Victor G.; Bertram, John F.
2015-01-01
Purpose of review There is currently much interest in the numbers of both glomeruli and podocytes. This interest stems from greater understanding of the effects of suboptimal fetal events on nephron endowment, the associations between low nephron number and chronic cardiovascular and kidney disease in adults, and the emergence of the podocyte depletion hypothesis. Recent findings Obtaining accurate and precise estimates of glomerular and podocyte number has proven surprisingly difficult. When whole kidneys or large tissue samples are available, design-based stereological methods are considered gold-standard because they are based on principles that negate systematic bias. However, these methods are often tedious and time-consuming, and oftentimes inapplicable when dealing with small samples such as biopsies. Therefore, novel methods suitable for small tissue samples, and innovative approaches to facilitate high-throughput measurements, such as magnetic resonance imaging (MRI) to estimate glomerular number and flow cytometry to estimate podocyte number, have recently been described. Summary This review describes current gold-standard methods for estimating glomerular and podocyte number, as well as methods developed in the past 3 years. We are now better placed than ever before to accurately and precisely estimate glomerular and podocyte number, and to examine relationships between these measurements and kidney health and disease. PMID:25887899
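For orientation, one widely used design-based estimator is the fractionator-type count, given here as a standard formula rather than one quoted from the review:

```latex
% Fractionator-type estimator (stated as a standard stereological formula,
% not one quoted from the review): count Q^- particles in a known fraction
% of the organ and scale up,
\[
  N_{\mathrm{total}} \;=\; \sum Q^{-}\;\times\;\frac{1}{ssf}\;\times\;\frac{1}{asf}\;\times\;\frac{1}{tsf},
\]
% where ssf, asf and tsf are the section, area and thickness sampling fractions.
```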
[Methodologies for Ascertaining Local Education Needs and for Allocating and Developing Resources.
ERIC Educational Resources Information Center
Bellott, Fred
A survey of 125 school systems in the United States was conducted to investigate methodologies used for developing needs assessment programs at a local level. Schools were asked to reply to a questionnaire which attempted to detail and identify how needs assessment programs are set up, what methodologies are employed, the number of resultant…
A Systematic Approach for Model-Based Aircraft Engine Performance Estimation
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Garg, Sanjay
2010-01-01
A requirement for effective aircraft engine performance estimation is the ability to account for engine degradation, generally described in terms of unmeasurable health parameters such as efficiencies and flow capacities related to each major engine module. This paper presents a linear point design methodology for minimizing the degradation-induced error in model-based aircraft engine performance estimation applications. The technique specifically focuses on the underdetermined estimation problem, where there are more unknown health parameters than available sensor measurements. A condition for Kalman filter-based estimation is that the number of health parameters estimated cannot exceed the number of sensed measurements. In this paper, the estimated health parameter vector will be replaced by a reduced order tuner vector whose dimension is equivalent to the sensed measurement vector. The reduced order tuner vector is systematically selected to minimize the theoretical mean squared estimation error of a maximum a posteriori estimator formulation. This paper derives theoretical estimation errors at steady-state operating conditions, and presents the tuner selection routine applied to minimize these values. Results from the application of the technique to an aircraft engine simulation are presented and compared to the estimation accuracy achieved through conventional maximum a posteriori and Kalman filter estimation approaches. Maximum a posteriori estimation results demonstrate that reduced order tuning parameter vectors can be found that approximate the accuracy of estimating all health parameters directly. Kalman filter estimation results based on the same reduced order tuning parameter vectors demonstrate that significantly improved estimation accuracy can be achieved over the conventional approach of selecting a subset of health parameters to serve as the tuner vector. However, additional development is necessary to fully extend the methodology to Kalman filter-based estimation applications.
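The underdetermined estimation setting can be illustrated with a toy linear Gaussian model; the sketch below computes the MAP estimate and the theoretical error covariance, using random placeholder matrices rather than an engine model or the paper's tuner-selection routine.

```python
# Minimal sketch of the underdetermined MAP estimation setting described
# above: more health parameters (x) than sensed measurements (y).  The
# matrices are random placeholders, not an engine model.
import numpy as np

rng = np.random.default_rng(2)
n_health, n_sense = 10, 6
H = rng.normal(size=(n_sense, n_health))      # health-parameter influence on sensors
P = np.eye(n_health)                          # prior covariance of health parameters
R = 0.05 * np.eye(n_sense)                    # measurement-noise covariance

# MAP estimate of x from y = H x + v with a zero-mean Gaussian prior:
#   x_hat = P H^T (H P H^T + R)^(-1) y
S = H @ P @ H.T + R
K = P @ H.T @ np.linalg.inv(S)

x_true = rng.normal(size=n_health)
y = H @ x_true + rng.multivariate_normal(np.zeros(n_sense), R)
x_hat = K @ y

# Theoretical estimation error covariance (its trace is the mean squared error)
P_err = P - K @ H @ P
print("estimate:", np.round(x_hat, 2))
print("theoretical MSE (trace of error covariance):", np.trace(P_err).round(3))
```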
ADaCGH: A Parallelized Web-Based Application and R Package for the Analysis of aCGH Data
Díaz-Uriarte, Ramón; Rueda, Oscar M.
2007-01-01
Background Copy number alterations (CNAs) in genomic DNA have been associated with complex human diseases, including cancer. One of the most common techniques to detect CNAs is array-based comparative genomic hybridization (aCGH). The availability of aCGH platforms and the need for identification of CNAs has resulted in a wealth of methodological studies. Methodology/Principal Findings ADaCGH is an R package and a web-based application for the analysis of aCGH data. It implements eight methods for detection of CNAs, gains and losses of genomic DNA, including all of the best performing ones from two recent reviews (CBS, GLAD, CGHseg, HMM). For improved speed, we use parallel computing (via MPI). Additional information (GO terms, PubMed citations, KEGG and Reactome pathways) is available for individual genes, and for sets of genes with altered copy numbers. Conclusions/Significance ADaCGH represents a qualitative increase in the standards of these types of applications: a) all of the best performing algorithms are included, not just one or two; b) we do not limit ourselves to providing a thin layer of CGI on top of existing BioConductor packages, but instead carefully use parallelization, examining different schemes, and are able to achieve significant decreases in user waiting time (factors up to 45×); c) we have added functionality not currently available in some methods, to adapt to recent recommendations (e.g., merging of segmentation results in wavelet-based and CGHseg algorithms); d) we incorporate redundancy, fault-tolerance and checkpointing, which are unique among web-based, parallelized applications; e) all of the code is available under open source licenses, allowing others to build upon, copy, and adapt our code for other software projects. PMID:17710137
Comparing three pedagogical approaches to psychomotor skills acquisition.
Willis, Ross E; Richa, Jacqueline; Oppeltz, Richard; Nguyen, Patrick; Wagner, Kelly; Van Sickle, Kent R; Dent, Daniel L
2012-01-01
We compared traditional pedagogical approaches such as time- and repetition-based methods with proficiency-based training. Laparoscopic novices were assigned randomly to 1 of 3 training conditions. In experiment 1, participants in the time condition practiced for 60 minutes, participants in the repetition condition performed 5 practice trials, and participants in the proficiency condition trained until reaching a predetermined proficiency goal. In experiment 2, practice time and number of trials were equated across conditions. In experiment 1, participants in the proficiency-based training conditions outperformed participants in the other 2 conditions (P < .014); however, these participants trained longer (P < .001) and performed more repetitions (P < .001). In experiment 2, despite training for similar amounts of time and number of repetitions, participants in the proficiency condition outperformed their counterparts (P < .038). In both experiments, the standard deviations for the proficiency condition were smaller than the other conditions. Proficiency-based training results in trainees who perform uniformly and at a higher level than traditional training methodologies. Copyright © 2012 Elsevier Inc. All rights reserved.
Langley Wind Tunnel Data Quality Assurance-Check Standard Results
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.
2000-01-01
A framework for statistical evaluation, control, and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.
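A minimal sketch of the Shewhart individuals chart underlying this kind of check-standard monitoring, with simulated repeat-run data and conventional moving-range control limits (an illustration, not the Langley implementation):

```python
# Hedged sketch of a Shewhart individuals (X-mR) chart for a check-standard
# statistic, e.g. a repeat-run force coefficient.  Data are simulated.
import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(0.520, 0.002, size=30)        # placeholder check-standard results

mr = np.abs(np.diff(x))                      # moving ranges of consecutive runs
center = x.mean()
mr_bar = mr.mean()
ucl = center + 2.66 * mr_bar                 # conventional 3-sigma limits (3/d2, d2 = 1.128)
lcl = center - 2.66 * mr_bar

out_of_control = np.where((x > ucl) | (x < lcl))[0]
print(f"center={center:.4f}  LCL={lcl:.4f}  UCL={ucl:.4f}")
print("out-of-control runs:", out_of_control)
```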
Methodology and Method and Apparatus for Signaling with Capacity Optimized Constellations
NASA Technical Reports Server (NTRS)
Barsoum, Maged F. (Inventor); Jones, Christopher R. (Inventor)
2016-01-01
Communication systems are described that use geometrically shaped PSK constellations that have increased capacity compared to conventional PSK constellations operating within a similar SNR band. The geometrically shaped PSK constellation is optimized based upon parallel decoding capacity. In many embodiments, a capacity-optimized geometrically shaped constellation can be used to replace a conventional constellation as part of a firmware upgrade to transmitters and receivers within a communication system. In a number of embodiments, the geometrically shaped constellation is optimized for an Additive White Gaussian Noise channel or a fading channel. In numerous embodiments, the communication uses adaptive rate encoding and the location of points within the geometrically shaped constellation changes as the code rate changes.
Stacked Autoencoders for Outlier Detection in Over-the-Horizon Radar Signals
Protopapadakis, Eftychios; Doulamis, Anastasios; Doulamis, Nikolaos; Dres, Dimitrios; Bimpas, Matthaios
2017-01-01
Detection of outliers in radar signals is a considerable challenge in maritime surveillance applications. High-Frequency Surface-Wave (HFSW) radars have attracted significant interest as potential tools for long-range target identification and outlier detection at over-the-horizon (OTH) distances. However, a number of disadvantages, such as their low spatial resolution and presence of clutter, have a negative impact on their accuracy. In this paper, we explore the applicability of deep learning techniques for detecting deviations from the norm in behavioral patterns of vessels (outliers) as they are tracked from an OTH radar. The proposed methodology exploits the nonlinear mapping capabilities of deep stacked autoencoders in combination with density-based clustering. A comparative experimental evaluation of the approach shows promising results in terms of the proposed methodology's performance. PMID:29312449
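A hedged sketch of the pipeline named above, combining a small stacked autoencoder with DBSCAN on the learned code; the simulated track features and network size are illustrative assumptions.

```python
# Hedged sketch: a stacked autoencoder compresses track features into a
# low-dimensional code, and density-based clustering (DBSCAN) flags
# low-density codes as outliers.  Data are simulated, not HFSW radar tracks.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(7)
normal = rng.normal(0.0, 1.0, size=(500, 10))
anomal = rng.normal(4.0, 1.0, size=(10, 10))      # a few deviating "vessels"
X = StandardScaler().fit_transform(np.vstack([normal, anomal]))

# Train an autoencoder (input -> 16 -> 3 -> 16 -> input) by regressing X on X
ae = MLPRegressor(hidden_layer_sizes=(16, 3, 16), activation="relu",
                  max_iter=4000, random_state=0)
ae.fit(X, X)

def encode(X, model, n_layers=2):
    """Propagate through the first n_layers to obtain the bottleneck code."""
    h = X
    for W, b in zip(model.coefs_[:n_layers], model.intercepts_[:n_layers]):
        h = np.maximum(0.0, h @ W + b)            # relu, matching the model
    return h

codes = encode(X, ae)
labels = DBSCAN(eps=0.8, min_samples=10).fit_predict(codes)
print("flagged outliers:", np.where(labels == -1)[0])
```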
Scalability analysis methodology for passive optical interconnects in data center networks using PAM
NASA Astrophysics Data System (ADS)
Lin, R.; Szczerba, Krzysztof; Agrell, Erik; Wosinska, Lena; Tang, M.; Liu, D.; Chen, J.
2017-11-01
A framework is developed for modeling the fundamental impairments in optical datacenter interconnects, i.e., the power loss and the receiver noises. This framework makes it possible to analyze the trade-offs between data rate, modulation order, and the number of ports that can be supported in optical interconnect architectures, while guaranteeing that the required signal-to-noise ratios are satisfied. To the best of our knowledge, this important assessment methodology is not yet available. As a case study, the trade-offs are investigated for three coupler-based top-of-rack interconnect architectures, which suffer from serious insertion loss. The results show that using single-port transceivers with 10 GHz bandwidth, avalanche photodiode detectors, and 4-level pulse amplitude modulation, more than 500 ports can be supported.
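A rough, hedged link-budget sketch of the kind of trade-off described, with assumed transmit power, OOK receiver sensitivity, splitter excess loss, and an assumed 10·log10(M−1) dB optical penalty for M-PAM; none of these numbers or formulas come from the paper.

```python
# Rough, hedged link-budget sketch for a coupler-based interconnect using
# M-PAM: how many ports N can be supported before the splitting loss eats
# the power margin.  All numbers are illustrative assumptions.
import math

def max_ports(tx_dbm=0.0, rx_sens_ook_dbm=-24.0, excess_loss_db=2.0, M=4):
    # Assumed optical power penalty of M-PAM over OOK: 10*log10(M-1) dB
    pam_penalty_db = 10 * math.log10(M - 1)
    margin_db = tx_dbm - (rx_sens_ook_dbm + pam_penalty_db) - excess_loss_db
    # An ideal 1:N splitter adds 10*log10(N) dB of loss per port
    return int(10 ** (margin_db / 10)) if margin_db > 0 else 0

for M in (2, 4, 8):
    print(f"PAM-{M}: up to ~{max_ports(M=M)} ports")
```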
NASA Astrophysics Data System (ADS)
Liberal, Iñigo; Engheta, Nader
2018-02-01
Quantum emitters interacting through a waveguide setup have been proposed as a promising platform for basic research on light-matter interactions and quantum information processing. We propose to augment waveguide setups with the use of multiport devices. Specifically, we demonstrate theoretically the possibility of exciting N -qubit subradiant, maximally entangled, states with the use of suitably designed N -port devices. Our general methodology is then applied based on two different devices: an epsilon-and-mu-near-zero waveguide hub and a nonreciprocal circulator. A sensitivity analysis is carried out to assess the robustness of the system against a number of nonidealities. These findings link and merge the designs of devices for quantum state engineering with classical communication network methodologies.
Some design issues of strata-matched non-randomized studies with survival outcomes.
Mazumdar, Madhu; Tu, Donsheng; Zhou, Xi Kathy
2006-12-15
Non-randomized studies for the evaluation of a medical intervention are useful for quantitative hypothesis generation before the initiation of a randomized trial and also when randomized clinical trials are difficult to conduct. A strata-matched non-randomized design is often utilized where subjects treated by a test intervention are matched to a fixed number of subjects treated by a standard intervention within covariate based strata. In this paper, we consider the issue of sample size calculation for this design. Based on the asymptotic formula for the power of a stratified log-rank test, we derive a formula to calculate the minimum number of subjects in the test intervention group that is required to detect a given relative risk between the test and standard interventions. When this minimum number of subjects in the test intervention group is available, an equation is also derived to find the multiple that determines the number of subjects in the standard intervention group within each stratum. The methodology developed is applied to two illustrative examples in gastric cancer and sarcoma.
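For context, a standard Schoenfeld-type approximation (an assumption, not necessarily the exact formula derived in the paper) for the total number of events needed with 1:m matching between the test and standard arms is:

```latex
% Schoenfeld-type approximation (an assumption, not the paper's derivation)
% for the number of events D needed to detect a hazard ratio \theta with
% two-sided level \alpha, power 1-\beta, and a 1:m allocation ratio:
\[
  D \;=\; \frac{(m+1)^2\,\bigl(z_{1-\alpha/2} + z_{1-\beta}\bigr)^2}{m\,\bigl(\log\theta\bigr)^2},
\]
% the required number of test-arm subjects then follows from D and the
% anticipated event probabilities within each stratum.
```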
González-Sáiz, J M; Esteban-Díez, I; Rodríguez-Tecedor, S; Pérez-Del-Notario, N; Arenzana-Rámila, I; Pizarro, C
2014-12-15
The aim of the present work was to evaluate the effect of the main factors conditioning accelerated ageing processes (oxygen dose, chip dose, wood origin, toasting degree and maceration time) on the phenolic and chromatic profiles of red wines by using a multivariate strategy based on experimental design methodology. The results obtained revealed that the concentrations of monomeric anthocyanins and flavan-3-ols could be modified through the application of particular experimental conditions. This fact was particularly remarkable since changes in phenolic profile were closely linked to changes observed in chromatic parameters. The main strength of this study lies in the possibility of using its conclusions as a basis to make wines with specific colour properties based on quality criteria. To our knowledge, the influence of such a large number of alternative ageing parameters on wine phenolic composition and chromatic attributes has not been studied previously using a comprehensive experimental design methodology. Copyright © 2014 Elsevier Ltd. All rights reserved.
LSHSIM: A Locality Sensitive Hashing based method for multiple-point geostatistics
NASA Astrophysics Data System (ADS)
Moura, Pedro; Laber, Eduardo; Lopes, Hélio; Mesejo, Daniel; Pavanelli, Lucas; Jardim, João; Thiesen, Francisco; Pujol, Gabriel
2017-10-01
Reservoir modeling is a very important task that permits the representation of a geological region of interest, so as to generate a considerable number of possible scenarios. Since its inception, many methodologies have been proposed and, in the last two decades, multiple-point geostatistics (MPS) has been the dominant one. This methodology is strongly based on the concept of training image (TI) and the use of its characteristics, which are called patterns. In this paper, we propose a new MPS method that combines the application of a technique called Locality Sensitive Hashing (LSH), which accelerates the search for patterns similar to a target one, with a Run-Length Encoding (RLE) compression technique that speeds up the calculation of the Hamming similarity. Experiments with both categorical and continuous images show that LSHSIM is computationally efficient and produces good-quality realizations. In particular, for categorical data, the results suggest that LSHSIM is faster than MS-CCSIM, one of the state-of-the-art methods.
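A compact sketch of the two ingredients named above, using bit-sampling LSH to shortlist candidate patterns and an exact Hamming distance on the shortlist; the random binary patterns stand in for training-image patterns, and the table sizes are arbitrary.

```python
# Hedged sketch: bit-sampling LSH to find candidate patterns similar to a
# target, then exact Hamming distance on the candidates.  Patterns are random
# binary vectors, not a real training image.
import numpy as np

rng = np.random.default_rng(4)
n_patterns, pattern_len = 2000, 64
patterns = rng.integers(0, 2, size=(n_patterns, pattern_len), dtype=np.uint8)

n_tables, bits_per_key = 8, 12
projections = [rng.choice(pattern_len, bits_per_key, replace=False)
               for _ in range(n_tables)]

tables = [dict() for _ in range(n_tables)]
for idx, p in enumerate(patterns):
    for t, cols in enumerate(projections):
        tables[t].setdefault(tuple(p[cols]), []).append(idx)

def query(target, k=5):
    """Return the k stored patterns closest to target among LSH candidates."""
    candidates = set()
    for t, cols in enumerate(projections):
        candidates.update(tables[t].get(tuple(target[cols]), []))
    cands = np.fromiter(candidates, dtype=int)
    ham = (patterns[cands] != target).sum(axis=1)      # exact Hamming distance
    return cands[np.argsort(ham)[:k]]

print(query(patterns[0]))
```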
Rocket-Based Combined Cycle Engine Technology Development: Inlet CFD Validation and Application
NASA Technical Reports Server (NTRS)
DeBonis, J. R.; Yungster, S.
1996-01-01
A CFD methodology has been developed for inlet analyses of Rocket-Based Combined Cycle (RBCC) Engines. A full Navier-Stokes analysis code, NPARC, was used in conjunction with pre- and post-processing tools to obtain a complete description of the flow field and integrated inlet performance. This methodology was developed and validated using results from a subscale test of the inlet to a RBCC 'Strut-Jet' engine performed in the NASA Lewis 1 x 1 ft. supersonic wind tunnel. Results obtained from this study include analyses at flight Mach numbers of 5 and 6 for super-critical operating conditions. These results showed excellent agreement with experimental data. The analysis tools were also used to obtain pre-test performance and operability predictions for the RBCC demonstrator engine planned for testing in the NASA Lewis Hypersonic Test Facility. This analysis calculated the baseline fuel-off internal force of the engine which is needed to determine the net thrust with fuel on.
Siksik, May; Krishnamurthy, Vikram
2017-09-01
This paper proposes a multi-dielectric Brownian dynamics simulation framework for design-space-exploration (DSE) studies of ion-channel permeation. The goal of such DSE studies is to estimate the channel modeling-parameters that minimize the mean-squared error between the simulated and expected "permeation characteristics." To address this computational challenge, we use a methodology based on statistical inference that utilizes the knowledge of channel structure to prune the design space. We demonstrate the proposed framework and DSE methodology using a case study based on the KcsA ion channel, in which the design space is successfully reduced from a 6-D space to a 2-D space. Our results show that the channel dielectric map computed using the framework matches with that computed directly using molecular dynamics with an error of 7%. Finally, the scalability and resolution of the model used are explored, and it is shown that the memory requirements needed for DSE remain constant as the number of parameters (degree of heterogeneity) increases.
Monneret, Denis
2017-01-01
The relationship between nonalcoholic fatty liver disease (NAFLD) and obstructive sleep apnea (OSA) has been well demonstrated, but remains to be evidenced in chronic obstructive pulmonary disease (COPD). Recently, Viglino et al. (Eur Respir J, 2017) attempted to determine the prevalence of liver fibrosis, steatosis and nonalcoholic steatohepatitis (NASH) in COPD patients, some of whom had OSA, basing the NAFLD diagnosis on three circulating biomarker-based liver scores: the FibroTest, SteatoTest and NashTest, from the Fibromax® panel. Among the main findings, the absence of OSA treatment emerged as independently associated with liver fibrosis and steatosis, when compared to effective treatment. However, besides the low number of treated patients, no polysomnographic respiratory data were provided, making it difficult to differentiate the impact of OSA from that of COPD in NAFLD prevalence. Furthermore, NAFLD diagnosis relied exclusively on circulating biomarker-based liver scores, without histological, imagery or other liver exploratory methods. Therefore, in this article, some methodological points are recalled and discussed, including the choice of OSA measurements, and the significance of ActiTest and AshTest scores from Fibromax® in this pathophysiological context. PMID:29225775
Płaszewski, Maciej; Bettany-Saltikov, Josette
2014-01-01
Background Non-surgical interventions for adolescents with idiopathic scoliosis remain highly controversial. Despite the publication of numerous reviews no explicit methodological evaluation of papers labeled as, or having a layout of, a systematic review, addressing this subject matter, is available. Objectives Analysis and comparison of the content, methodology, and evidence-base from systematic reviews regarding non-surgical interventions for adolescents with idiopathic scoliosis. Design Systematic overview of systematic reviews. Methods Articles meeting the minimal criteria for a systematic review, regarding any non-surgical intervention for adolescent idiopathic scoliosis, with any outcomes measured, were included. Multiple general and systematic review specific databases, guideline registries, reference lists and websites of institutions were searched. The AMSTAR tool was used to critically appraise the methodology, and the Oxford Centre for Evidence Based Medicine and the Joanna Briggs Institute’s hierarchies were applied to analyze the levels of evidence from included reviews. Results From 469 citations, twenty one papers were included for analysis. Five reviews assessed the effectiveness of scoliosis-specific exercise treatments, four assessed manual therapies, five evaluated bracing, four assessed different combinations of interventions, and one evaluated usual physical activity. Two reviews addressed the adverse effects of bracing. Two papers were high-quality Cochrane reviews, three were of moderate quality, and the remaining sixteen were of low or very low methodological quality. The level of evidence of these reviews ranged from 1 or 1+ to 4, and in some reviews, due to their low methodological quality and/or poor reporting, this could not be established. Conclusions Higher quality reviews indicate that generally there is insufficient evidence to make a judgment on whether non-surgical interventions in adolescent idiopathic scoliosis are effective. Papers labeled as systematic reviews need to be considered in terms of their methodological rigor; otherwise they may be mistakenly regarded as high quality sources of evidence. Protocol registry number CRD42013003538, PROSPERO PMID:25353954
do Amaral, Leonardo L.; Pavoni, Juliana F.; Sampaio, Francisco; Netto, Thomaz Ghilardi
2015-01-01
Despite individual quality assurance (QA) being recommended for complex techniques in radiotherapy (RT) treatment, the possibility of errors in dose delivery during therapeutic application has been verified. Therefore, it is fundamentally important to conduct in vivo QA during treatment. This work presents an in vivo transmission quality control methodology, using radiochromic film (RCF) coupled to the linear accelerator (linac) accessory holder. This QA methodology compares the dose distribution measured by the film in the linac accessory holder with the dose distribution expected by the treatment planning software. The calculated dose distribution is obtained in the coronal and central plane of a phantom with the same dimensions of the acrylic support used for positioning the film but in a source‐to‐detector distance (SDD) of 100 cm, as a result of transferring the IMRT plan in question with all the fields positioned with the gantry vertically, that is, perpendicular to the phantom. To validate this procedure, first of all a Monte Carlo simulation using PENELOPE code was done to evaluate the differences between the dose distributions measured by the film in a SDD of 56.8 cm and 100 cm. After that, several simple dose distribution tests were evaluated using the proposed methodology, and finally a study using IMRT treatments was done. In the Monte Carlo simulation, the mean percentage of points approved in the gamma function comparing the dose distribution acquired in the two SDDs were 99.92%±0.14%. In the simple dose distribution tests, the mean percentage of points approved in the gamma function were 99.85%±0.26% and the mean percentage differences in the normalization point doses were −1.41%. The transmission methodology was approved in 24 of 25 IMRT test irradiations. Based on these results, it can be concluded that the proposed methodology using RCFs can be applied for in vivo QA in RT treatments. PACS number: 87.55.Qr, 87.55.km, 87.55.N‐ PMID:26699306
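The pass rates quoted above presumably rest on the standard gamma-index criterion of Low et al., reproduced here for reference:

```latex
% Standard gamma-index criterion (Low et al.), stated here as the presumed
% basis of the reported "percentage of points approved in the gamma function":
\[
  \gamma(\mathbf{r}_m) \;=\; \min_{\mathbf{r}_c}
  \sqrt{\frac{\lVert \mathbf{r}_c - \mathbf{r}_m \rVert^{2}}{\Delta d^{2}}
        \;+\;
        \frac{\bigl(D_c(\mathbf{r}_c) - D_m(\mathbf{r}_m)\bigr)^{2}}{\Delta D^{2}}},
\]
% a measured point r_m passes when gamma <= 1, with Delta d the
% distance-to-agreement tolerance and Delta D the dose-difference tolerance
% (e.g. 3 mm / 3%, given here only as a common choice).
```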
The methodological quality of animal research in critical care: the public face of science.
Bara, Meredith; Joffe, Ari R
2014-01-01
Animal research (AR) findings often do not translate to humans; one potential reason is the poor methodological quality of AR. We aimed to determine this quality of AR reported in critical care journals. All AR published from January to June 2012 in three high-impact critical care journals were reviewed. A case report form and instruction manual with clear definitions were created, based on published recommendations, including the ARRIVE guidelines. Data were analyzed with descriptive statistics. Seventy-seven AR publications were reviewed. Our primary outcome (animal strain, sex, and weight or age described) was reported in 52 (68%; 95% confidence interval, 56% to 77%). Of the 77 publications, 47 (61%) reported randomization; of these, 3 (6%) reported allocation concealment, and 1 (2%) the randomization procedure. Of the 77 publications, 31 (40%) reported some type of blinding; of these, disease induction (2, 7%), intervention (7, 23%), and/or subjective outcomes (17, 55%) were blinded. A sample size calculation was reported in 4/77 (5%). Animal numbers were missing in the Methods section in 16 (21%) publications; when stated, the median was 32 (range 6 to 320; interquartile range, 21 to 70). Extra animals used were mentioned in the Results section in 31 (40%) publications; this number was unclear in 23 (74%), and >100 for 12 (16%). When reporting most outcomes, numbers with denominators were given in 35 (45%), with no unaccounted numbers in 24 (31%), and no animals excluded from analysis in 20 (26%). Most (49, 64%) studies reported >40, and another 19 (25%) reported 21 to 40 statistical comparisons. Internal validity limitations were discussed in 7 (9%), and external validity (to humans) discussed in 71 (92%), most with no (30, 42%) or only a vague (9, 13%) limitation to this external validity mentioned. The reported methodological quality of AR was poor. Unless the quality of AR significantly improves, the practice may be in serious jeopardy of losing public support.
Ruano, Juan; Aguilar-Luque, Macarena; Gómez-Garcia, Francisco; Alcalde Mellado, Patricia; Gay-Mimbrera, Jesus; Carmona-Fernandez, Pedro J; Maestre-López, Beatriz; Sanz-Cabanillas, Juan Luís; Hernández Romero, José Luís; González-Padilla, Marcelino; Vélez García-Nieto, Antonio; Isla-Tejera, Beatriz
2018-01-01
Researchers are increasingly using online social networks to promote their work. Some authors have suggested that measuring social media activity can predict the impact of a primary study (i.e., whether or not an article will be highly cited). However, the influence of variables such as scientific quality, research disclosures, and journal characteristics on systematic reviews and meta-analyses has not yet been assessed. The present study aims to describe the effect of complex interactions between bibliometric factors and social media activity on the impact of systematic reviews and meta-analyses about psoriasis (PROSPERO 2016: CRD42016053181). Methodological quality was assessed using the Assessing the Methodological Quality of Systematic Reviews (AMSTAR) tool. Altmetrics, which consider Twitter, Facebook, and Google+ mention counts as well as Mendeley and SCOPUS readers, and corresponding article citation counts from Google Scholar were obtained for each article. Metadata and journal-related bibliometric indices were also obtained. One-hundred and sixty-four reviews with available altmetrics information were included in the final multifactorial analysis, which showed that social media and impact factor have less effect than Mendeley and SCOPUS readers on the number of citations that appear in Google Scholar. Although a journal's impact factor predicted the number of tweets (OR, 1.202; 95% CI, 1.087-1.049), the years of publication and the number of Mendeley readers predicted the number of citations in Google Scholar (OR, 1.033; 95% CI, 1.018-1.329). Finally, methodological quality was related neither with bibliometric influence nor social media activity for systematic reviews. In conclusion, there seems to be a lack of connectivity between scientific quality, social media activity, and article usage, thus predicting scientific success based on these variables may be inappropriate in the particular case of systematic reviews.
Bayesian outcome-based strategy classification.
Lee, Michael D
2016-03-01
Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014) recently developed a method for making inferences about the decision processes people use in multi-attribute forced choice tasks. Their paper makes a number of worthwhile theoretical and methodological contributions. Theoretically, they provide an insightful psychological motivation for a probabilistic extension of the widely-used "weighted additive" (WADD) model, and show how this model, as well as other important models like "take-the-best" (TTB), can and should be expressed in terms of meaningful priors. Methodologically, they develop an inference approach based on the Minimum Description Length (MDL) principles that balances both the goodness-of-fit and complexity of the decision models they consider. This paper aims to preserve these useful contributions, but provide a complementary Bayesian approach with some theoretical and methodological advantages. We develop a simple graphical model, implemented in JAGS, that allows for fully Bayesian inferences about which models people use to make decisions. To demonstrate the Bayesian approach, we apply it to the models and data considered by Hilbig and Moshagen (Psychonomic Bulletin & Review, 21, 1431-1443, 2014), showing how a prior predictive analysis of the models, and posterior inferences about which models people use and the parameter settings at which they use them, can contribute to our understanding of human decision making.
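A toy sketch of outcome-based Bayesian strategy classification in the spirit described above: each strategy's accordance count is scored with a Beta-Binomial marginal likelihood and converted to posterior model probabilities. This is a simplified illustration, not the JAGS graphical model of the paper, and the counts and prior are invented.

```python
# Hedged toy sketch: each strategy predicts a choice on every trial,
# accordance with the data is Binomial with a strategy-specific error rate
# under a Beta prior, and the Beta-Binomial marginal likelihood yields
# posterior model probabilities.
import numpy as np
from scipy.special import betaln, comb

def log_marginal(y, n, a=1.0, b=4.0):
    """Beta-Binomial evidence: y of n trials accord with the strategy, error ~ Beta(a, b)."""
    return np.log(comb(n, y)) + betaln(n - y + a, y + b) - betaln(a, b)

n_trials = 40
accordance = {"TTB": 35, "WADD": 22}             # hypothetical accordance counts
log_ev = {m: log_marginal(y, n_trials) for m, y in accordance.items()}
log_ev["GUESS"] = n_trials * np.log(0.5)         # guessing predicts each choice at chance

ev = np.array(list(log_ev.values()))
post = np.exp(ev - ev.max())
post /= post.sum()                               # equal prior model probabilities assumed
for m, p in zip(log_ev, post):
    print(f"P({m} | data) = {p:.3f}")
```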
NASA Astrophysics Data System (ADS)
Neriani, Kelly E.; Herbranson, Travis J.; Reis, George A.; Pinkus, Alan R.; Goodyear, Charles D.
2006-05-01
While vast numbers of image-enhancing algorithms have already been developed, the majority of these algorithms have not been assessed in terms of their visual performance-enhancing effects using militarily relevant scenarios. The goal of this research was to apply a visual performance-based assessment methodology to evaluate six algorithms that were specifically designed to enhance the contrast of digital images. The image enhancing algorithms used in this study included three different histogram equalization algorithms, the Autolevels function, the Recursive Rational Filter technique described in Marsi, Ramponi, and Carrato [1], and the multiscale Retinex algorithm described in Rahman, Jobson, and Woodell [2]. The methodology used in the assessment has been developed to acquire objective human visual performance data as a means of evaluating the contrast enhancement algorithms. Objective performance metrics, response time and error rate, were used to compare algorithm enhanced images versus two baseline conditions, original non-enhanced images and contrast-degraded images. Observers completed a visual search task using a spatial forced-choice paradigm. Observers searched images for a target (a military vehicle) hidden among foliage and then indicated in which quadrant of the screen the target was located. Response time and percent correct were measured for each observer. Results of the study and future directions are discussed.
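As a concrete reference point for one of the algorithm families evaluated, the sketch below implements plain global histogram equalization of an 8-bit grayscale image; the input image is a random placeholder, not one of the study's stimuli.

```python
# Minimal sketch of global histogram equalization for an 8-bit grayscale
# image (numpy only); the input here is a synthetic low-contrast placeholder.
import numpy as np

def histogram_equalize(img):
    """Map gray levels through the normalized cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    cdf_min = cdf[cdf > 0].min()
    cdf = (cdf - cdf_min) / (cdf[-1] - cdf_min)        # stretch to [0, 1]
    lut = np.round(255 * cdf).astype(np.uint8)
    return lut[img]

rng = np.random.default_rng(5)
low_contrast = rng.integers(90, 160, size=(64, 64), dtype=np.uint8)  # placeholder image
eq = histogram_equalize(low_contrast)
print(eq.min(), eq.max())      # the equalized image spans the full 0-255 range
```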
Leininger's Ethnonursing Research Methodology and Studies of Cancer Survivors: A Review.
Farren, Arlene T
2015-09-01
The purpose of this article is to present the findings of a literature review regarding the use of Leininger's ethnonursing research methodology (ENRM) in studies addressing adult cancer survivors. It is important to learn about differences and similarities among cancer survivors' experiences so that patient-centered, culturally congruent care can be provided. A review of the literature was conducted using databases such as CINAHL and MEDLINE. Search terms included variations on ENRM and cancer survivors. The search yielded a small number of published studies that used the ENRM to examine breast cancer survivors' perceptions and experiences. A review instrument was developed to estimate study quality based on established criteria. The studies are critiqued in relation to the theory-based methodology and evaluation criteria for qualitative research, and their findings are summarized. The author concludes that although there is a paucity of research using the ENRM with adult cancer survivors, the preliminary findings of the included studies contribute to what is known about breast cancer survivors. Implications for research include recommendations to increase the use of the ENRM to discover the universal and diverse experiences of care practices in adult cancer survivors and to use the evidence to develop patient-centered, culturally congruent, quality care for cancer survivors. © The Author(s) 2014.
Methodology for nonwork travel analysis in suburban communities.
DOT National Transportation Integrated Search
1994-01-01
The increase in the number of nonwork trips during the past decade has contributed substantially to congestion and to environmental problems. Data collection methodologies, descriptive information, and reliable models of nonwork travel behavior are n...
Intelligent systems/software engineering methodology - A process to manage cost and risk
NASA Technical Reports Server (NTRS)
Friedlander, Carl; Lehrer, Nancy
1991-01-01
A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies and is well suited to the development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and to record and report the emerging design.
Improving the Accuracy of Cloud Detection Using Machine Learning
NASA Astrophysics Data System (ADS)
Craddock, M. E.; Alliss, R. J.; Mason, M.
2017-12-01
Cloud detection from geostationary satellite imagery has long been accomplished through multi-spectral channel differencing in comparison to the Earth's surface. The distinction between clear and cloud is then determined by comparing these differences to empirical thresholds. Using this methodology, the probability of detecting clouds exceeds 90%, but performance varies seasonally, regionally, and temporally. The Cloud Mask Generator (CMG) database developed under this effort consists of 20 years of 4 km, 15-minute clear/cloud images based on GOES data over CONUS and Hawaii. The algorithms to determine cloudy pixels in the imagery are based on well-known multi-spectral techniques and defined thresholds. These thresholds were produced by manually studying thousands of images, over thousands of man-hours, to determine the success and failure of the algorithms and fine-tune the thresholds. This study aims to investigate the potential of improving cloud detection by using Random Forest (RF) ensemble classification. RF is well suited to cloud detection as it runs efficiently on large datasets, is robust to outliers and noise, and is able to deal with highly correlated predictors, such as multi-spectral satellite imagery. The RF code was developed using Python in about 4 weeks. The region of focus selected was Hawaii, and the predictors include visible and infrared imagery, topography, and multi-spectral image products. The development of the cloud detection technique proceeds in three steps. First, the RF models are tuned to identify the optimal number of trees and number of predictors to employ for both day and night scenes. Second, the RF models are trained using the optimal number of trees and a select number of random predictors identified during the tuning phase. Lastly, the model is used to predict clouds for a time period independent of that used during training, and the predictions are compared to truth, the CMG cloud mask. Initial results show 97% accuracy during the daytime, 94% accuracy at night, and 95% accuracy over all times. The total time to train, tune and test was approximately one week. The improved performance and reduced time to produce results are testament to improved computer technology and to the use of machine learning as a more efficient and accurate methodology for cloud detection.
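A minimal sketch of the Random Forest step described above is given below, using scikit-learn on synthetic "pixels". The predictor names, the toy labeling rule, and the tree/feature counts are assumptions for illustration and do not reproduce the CMG dataset or the tuned models.

```python
# Illustrative Random Forest clear/cloud classifier in the spirit of the study;
# feature names and synthetic data are assumptions, not the CMG dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000
X = np.column_stack([
    rng.normal(280, 15, n),   # IR brightness temperature (K)
    rng.uniform(0, 1, n),     # visible reflectance
    rng.normal(0, 5, n),      # a multi-spectral channel difference
    rng.uniform(0, 3000, n),  # terrain elevation (m)
])
# Assume cold, bright pixels are cloudy for the purpose of the toy labels.
y = ((X[:, 0] < 275) & (X[:, 1] > 0.4)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
# Number of trees and number of predictors per split are the parameters tuned in the study.
rf = RandomForestClassifier(n_estimators=200, max_features=2, random_state=0)
rf.fit(X_tr, y_tr)
print(f"accuracy on held-out pixels: {rf.score(X_te, y_te):.3f}")
```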
Applying automatic item generation to create cohesive physics testlets
NASA Astrophysics Data System (ADS)
Mindyarto, B. N.; Nugroho, S. E.; Linuwih, S.
2018-03-01
Computer-based testing has created demand for large numbers of items. This paper discusses the production of cohesive physics testlets using automatic item generation concepts and procedures. The testlets were composed by restructuring physics problems to probe deeper understanding of the underlying physical concepts, inserting a qualitative question and an accompanying scientific-reasoning question. A template-based testlet generator was used to generate the testlet variants. Using this methodology, 1248 testlet variants were effectively generated from 25 testlet templates. Some issues related to the effective application of the generated physics testlets in practical assessments are discussed.
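The sketch below shows what a template-based generator of this kind can look like: a physics stem with numeric placeholders is expanded over a grid of variable values, and each variant carries a computed answer key. The template text, variable ranges, and answer formula are invented for illustration and are not the authors' templates.

```python
# Toy template-based generator in the spirit of automatic item generation; the
# template text, variable ranges, and answer formula are invented for illustration.
from itertools import product

TEMPLATE = (
    "A cart of mass {m} kg accelerates at {a} m/s^2. "
    "(a) What net force acts on it? "
    "(b) Explain qualitatively what happens to the acceleration if the mass doubles."
)

def generate_testlets(masses, accelerations):
    testlets = []
    for m, a in product(masses, accelerations):
        testlets.append({"stem": TEMPLATE.format(m=m, a=a), "key_force_N": m * a})
    return testlets

variants = generate_testlets(masses=[2, 4, 6, 8], accelerations=[1.5, 2.0, 3.0])
print(len(variants), "variants generated")
print(variants[0]["stem"])
```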
Age diagnosis based on incremental lines in dental cementum: a critical reflection.
Grosskopf, Birgit; McGlynn, George
2011-01-01
Age estimation based on counting incremental lines in dental cementum is a method frequently used to estimate age at death in bioarchaeology and, increasingly, in forensic anthropology. Assessment of its applicability, precision, and reproducibility continues to be the focus of research in this area and is occasionally accompanied by significant controversy. Differences in methodological techniques for data collection (e.g., the number of sections, the magnification used for counting, or the interpretation of "outliers") are presented. Potential influences on method reliability are discussed, especially with regard to applicability in forensic contexts.
Software Requirements Engineering Methodology (Development)
1979-06-01
Higher Order Software [20]; and the Michael Jackson Design Methodology [21]. Although structured programming constructs have proven to be more useful...reviewed here. Similarly, the manual techniques for software design (e.g., HIPO Diagrams, Nassi-Schneidermann charts, Top-Down Design, the Michael ... Jackson Design Methodology, Yourdon’s Structured Design) are not addressed. 6.1.3 Research Programs There are a number of research programs underway
Levecke, Bruno; Kaplan, Ray M; Thamsborg, Stig M; Torgerson, Paul R; Vercruysse, Jozef; Dobson, Robert J
2018-04-15
Although various studies have provided novel insights into how to best design, analyze and interpret a fecal egg count reduction test (FECRT), it is still not straightforward to provide guidance that allows improving both the standardization and the analytical performance of the FECRT across a variety of both animal and nematode species. For example, it has been suggested to recommend a minimum number of eggs to be counted under the microscope (not eggs per gram of feces), but we lack the evidence to recommend any number of eggs that would allow a reliable assessment of drug efficacy. Other aspects that need further research are the methodology of calculating uncertainty intervals (UIs; confidence intervals in case of frequentist methods and credible intervals in case of Bayesian methods) and the criteria of classifying drug efficacy into 'normal', 'suspected' and 'reduced'. The aim of this study is to provide complementary insights into the current knowledge, and to ultimately provide guidance in the development of new standardized guidelines for the FECRT. First, data were generated using a simulation in which the 'true' drug efficacy (TDE) was evaluated by the FECRT under varying scenarios of sample size, analytic sensitivity of the diagnostic technique, and level of both intensity and aggregation of egg excretion. Second, the obtained data were analyzed with the aim (i) to verify which classification criteria allow for reliable detection of reduced drug efficacy, (ii) to identify the UI methodology that yields the most reliable assessment of drug efficacy (coverage of TDE) and detection of reduced drug efficacy, and (iii) to determine the required sample size and number of eggs counted under the microscope that optimizes the detection of reduced efficacy. Our results confirm that the currently recommended criteria for classifying drug efficacy are the most appropriate. Additionally, the UI methodologies we tested varied in coverage and ability to detect reduced drug efficacy, thus a combination of UI methodologies is recommended to assess the uncertainty across all scenarios of drug efficacy estimates. Finally, based on our model estimates we were able to determine the required number of eggs to count for each sample size, enabling investigators to optimize the probability of correctly classifying a theoretical TDE while minimizing both financial and technical resources. Copyright © 2018 Elsevier B.V. All rights reserved.
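A simplified version of the type of simulation described above is sketched below: aggregated pre-treatment egg counts are drawn from a negative binomial distribution, post-treatment counts reflect an assumed true drug efficacy, and a bootstrap percentile interval serves as one possible uncertainty-interval method. The parameter values (sample size, aggregation, analytic sensitivity) are illustrative assumptions, not the study's scenarios or recommended criteria.

```python
# Sketch of the kind of FECRT simulation described above: aggregated pre-treatment
# egg counts follow a negative binomial, post-treatment counts reflect a "true"
# drug efficacy, and a bootstrap percentile interval is one possible UI method.
import numpy as np

rng = np.random.default_rng(7)
n_animals, true_efficacy, k = 20, 0.85, 0.7   # sample size, TDE, aggregation parameter
mean_epg, detection_limit = 300, 25           # mean eggs/gram, analytic sensitivity

# Eggs counted under the microscope = eggs per gram / detection limit.
pre = rng.negative_binomial(k, k / (k + mean_epg / detection_limit), n_animals)
post = rng.binomial(pre, 1 - true_efficacy)   # each egg "survives" with prob 1-TDE
fecr = 1 - post.mean() / pre.mean()

boot = []
for _ in range(2000):
    idx = rng.integers(0, n_animals, n_animals)
    boot.append(1 - post[idx].mean() / pre[idx].mean())
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"FECR = {fecr:.3f}, 95% UI = ({lo:.3f}, {hi:.3f})")
```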
Prediction of Software Reliability using Bio Inspired Soft Computing Techniques.
Diwaker, Chander; Tomar, Pradeep; Poonia, Ramesh C; Singh, Vijander
2018-04-10
Many models have been developed for predicting software reliability, but existing reliability models are restricted to particular methodologies and a limited number of parameters. A number of techniques and methodologies may be used for reliability prediction, and careful attention must be paid to the parameters considered when estimating reliability: the estimated reliability of a system may increase or decrease depending on the parameters selected, so the factors that most heavily affect system reliability need to be identified. Reusability is now widely used across research areas and is the basis of Component-Based Systems (CBS). Cost, time, and human effort can be saved using Component-Based Software Engineering (CBSE) concepts, and CBSE metrics may be used to assess which techniques are more suitable for estimating system reliability. Soft computing is used for small as well as large-scale problems where it is difficult to find accurate results due to uncertainty or randomness. Soft computing techniques can also be applied to medicine-related problems: clinical medicine makes significant use of fuzzy logic and neural network methodologies, basic medical science most frequently uses combined neural-network and genetic-algorithm approaches, and medical scientists show considerable interest in applying soft computing methodologies in genetics, physiology, radiology, cardiology, and neurology. CBSE encourages users to reuse past and existing software when making new products, providing quality while saving time, memory space, and money. This paper focuses on the assessment of commonly used soft computing techniques, such as the Genetic Algorithm (GA), Neural Network (NN), Fuzzy Logic, Support Vector Machine (SVM), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO), and Artificial Bee Colony (ABC). It presents the working of these soft computing techniques, assesses their suitability for predicting reliability, and discusses the parameters considered when estimating and predicting reliability. This study can be used in the estimation and prediction of the reliability of various instruments used in medical systems as well as in software, computer, and mechanical engineering; the concepts can be applied to both software and hardware to predict reliability using CBSE.
Scanlon, Kelly A; Gray, George M; Francis, Royce A; Lloyd, Shannon M; LaPuma, Peter
2013-03-06
Life cycle assessment (LCA) is a systems-based method used to determine potential impacts to the environment associated with a product throughout its life cycle. Conclusions from LCA studies can be applied to support decisions regarding product design or public policy; therefore, all relevant inputs (e.g., raw materials, energy) and outputs (e.g., emissions, waste) of the product system should be evaluated to estimate impacts. Currently, work-related impacts are not routinely considered in LCA. The objectives of this paper are to: 1) introduce the work environment disability-adjusted life year (WE-DALY), one portion of a characterization factor used to express the magnitude of impacts to human health attributable to work-related exposures to workplace hazards; 2) outline the methods for calculating the WE-DALY; 3) demonstrate the calculation; and 4) highlight strengths and weaknesses of the methodological approach. The concept of the WE-DALY and the methodological approach to its calculation are grounded in the World Health Organization's disability-adjusted life year (DALY). Like the DALY, the WE-DALY equation considers the years of life lost due to premature mortality and the years of life lived with disability to estimate the total number of years of healthy life lost in a population. The equation requires input in the form of the number of fatal and nonfatal injuries and illnesses that occur in the industries relevant to the product system evaluated in the LCA study, the age of the worker at the time of the fatal or nonfatal injury or illness, the severity of the injury or illness, and the duration of time lived with the outcomes of the injury or illness. The methodological approach for the WE-DALY requires data from various sources, multi-step instructions to determine each variable used in the WE-DALY equation, and assumptions based on professional opinion. Results support the use of the WE-DALY in a characterization factor in LCA. Integrating occupational health into LCA studies will provide opportunities to prevent shifting of impacts between the work environment and the environment external to the workplace and to co-optimize human health, including worker health, and environmental health.
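Because the WE-DALY follows the DALY structure of summing years of life lost (YLL) and years lived with disability (YLD), a minimal uniform version of that calculation (without age weighting or discounting) is sketched below. The reference life expectancy, injury categories, disability weights, and durations are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a DALY-style calculation following the YLL + YLD structure the
# WE-DALY builds on (no age weighting or discounting). The injury categories,
# disability weights, durations, and case counts are illustrative assumptions.
REFERENCE_LIFE_EXPECTANCY = 80  # years, assumed

def years_of_life_lost(fatal_cases):
    # fatal_cases: list of ages at death
    return sum(max(REFERENCE_LIFE_EXPECTANCY - age, 0) for age in fatal_cases)

def years_lived_with_disability(nonfatal_cases):
    # nonfatal_cases: list of (disability_weight, duration_in_years) tuples
    return sum(weight * duration for weight, duration in nonfatal_cases)

fatal = [34, 52]                           # two workplace fatalities
nonfatal = [(0.2, 1.5), (0.05, 10.0)]      # e.g., a fracture, a chronic illness
we_daly = years_of_life_lost(fatal) + years_lived_with_disability(nonfatal)
print(f"healthy life years lost: {we_daly:.1f}")
```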
Phalanx. The Bulletin of Military Operations Research. Volume 45, Number 3, September 2012
2012-09-01
quantitative, information-based methodology of social and cultural reasoning began in the early 19th century, when Adolphe Quetelet and Auguste Comte ...representing MAS at the 29th ISMOR (www.ismor.com) near Hampshire, UK in late August . We plan to extend our inter- national presence and contributions...dedicated to the test and experimentation mission. In August of 1991, Dr. Bryson relinquished his command of CDEC and became the Technical Director
Comparison of single and consecutive dual frequency induction surface hardening of gear wheels
NASA Astrophysics Data System (ADS)
Barglik, J.; Ducki, K.; Kukla, D.; Mizera, J.; Mrówka-Nowotnik, G.; Sieniawski, J.; Smalcerz, A.
2018-05-01
Mathematical models of single and consecutive dual-frequency induction surface hardening systems are presented and compared. Both models are solved with 3D FEM-based professional software supported by a number of in-house numerical procedures. The methodology is illustrated with examples of surface induction hardening of a gear wheel made of steel 41Cr4. The computations are in good accordance with experiments performed on the laboratory stand.
Identifying Aircraft and Personnel Needs to Meet On-station Patrol Requirements
2014-06-17
One option would be to develop a fully stochastic model that explicitly examined unplanned maintenance ( Marlow and Novak 2013; Mattila et al. 2008...stationed at the base and the serviceability rate, respectively (as in Marlow and Novak 2013). Next, if one assumes that, for the number of available AU...of Intelligent & Robotic Systems 70: 347-359. 7. Marlow D and Novak A (2013). Fleet Sizing Analysis Methodologies for the Royal Australian Navy’s
Sophocleous, M.
2000-01-01
A practical methodology for recharge characterization was developed based on several years of field-oriented research at 10 sites in the Great Bend Prairie of south-central Kansas. This methodology combines the soil-water budget, computed on a storm-by-storm, year-round basis, with the resulting watertable rises. The estimated 1985-1992 average annual recharge was less than 50 mm/year, with a range from 15 mm/year (during the 1998 drought) to 178 mm/year (during the 1993 flood year). Most of this recharge occurs during the spring months. To regionalize these site-specific estimates, an additional methodology based on multiple (forward) regression analysis combined with classification and GIS overlay analyses was developed and implemented. The multiple regression analysis showed that the most influential variables were, in order of decreasing importance, total annual precipitation, average maximum springtime soil-profile water storage, average shallowest springtime depth to watertable, and average springtime precipitation rate. Therefore, four GIS (ARC/INFO) data "layers" or coverages were constructed for the study region based on these four variables, and each coverage was classified into the same number of data classes to avoid biasing the results. The normalized regression coefficients were employed to weight the class rankings of each recharge-affecting variable. This approach resulted in recharge zonations that agreed well with the site recharge estimates. During the "Great Flood of 1993," when rainfall totals exceeded normal levels by ~200% in the northern portion of the study region, the developed regionalization methodology was tested against such extreme conditions and proved to be both practical, being based on readily available or easily measurable data, and robust. It was concluded that the combination of multiple regression and GIS overlay analyses is a powerful and practical approach to regionalizing small samples of recharge estimates.
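The regionalization step can be illustrated with a small sketch: a multiple regression of site recharge on the four explanatory variables is fitted, and the standardized coefficients are normalized into weights applied to the class ranks of the corresponding map layers. The data values and class ranks below are synthetic; the actual study used the field-based site estimates and ARC/INFO coverages.

```python
# Simplified sketch of the regionalization idea: fit a multiple regression of site
# recharge on the four explanatory variables, then use the normalized (standardized)
# coefficients to weight classified map layers. All data values are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n_sites = 10
precip = rng.uniform(500, 900, n_sites)        # total annual precipitation (mm)
soil_storage = rng.uniform(50, 150, n_sites)   # springtime soil-profile storage (mm)
depth_wt = rng.uniform(1, 8, n_sites)          # springtime depth to water table (m)
spring_rate = rng.uniform(1, 6, n_sites)       # springtime precipitation rate (mm/d)
recharge = 0.1 * precip + 0.3 * soil_storage - 5 * depth_wt + rng.normal(0, 5, n_sites)

X = np.column_stack([precip, soil_storage, depth_wt, spring_rate])
Xz = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize so coefficients are comparable
beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(n_sites), Xz]), recharge, rcond=None)
weights = np.abs(beta[1:]) / np.abs(beta[1:]).sum()

# Each GIS layer is classified into the same number of classes (here ranks 1..5);
# the weighted sum of class ranks gives a relative recharge zonation score for a cell.
class_ranks = np.array([4, 2, 5, 3])           # ranks of one map cell in the four layers
print("layer weights:", np.round(weights, 2), "cell score:", np.dot(weights, class_ranks))
```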
de Beer, Joop; Raymer, James; van der Erf, Rob; van Wissen, Leo
2010-11-01
Due to differences in definitions and measurement methods, cross-country comparisons of international migration patterns are difficult and confusing. Emigration numbers reported by sending countries tend to differ from the corresponding immigration numbers reported by receiving countries. In this paper, a methodology is presented to achieve harmonised estimates of migration flows benchmarked to a specific definition of duration. This methodology accounts for both differences in definitions and the effects of measurement error due to, for example, under reporting and sampling fluctuations. More specifically, the differences between the two sets of reported data are overcome by estimating a set of adjustment factors for each country's immigration and emigration data. The adjusted data take into account any special cases where the origin-destination patterns do not match the overall patterns. The new method for harmonising migration flows that we present is based on earlier efforts by Poulain (European Journal of Population, 9(4): 353-381 1993, Working Paper 12, joint ECE-Eurostat Work Session on Migration Statistics, Geneva, Switzerland 1999) and is illustrated for movements between 19 European countries from 2002 to 2007. The results represent a reliable and consistent set of international migration flows that can be used for understanding recent changes in migration patterns, as inputs into population projections and for developing evidence-based migration policies.
NASA Astrophysics Data System (ADS)
Peruchena, Carlos M. Fernández; García-Barberena, Javier; Guisado, María Vicenta; Gastón, Martín
2016-05-01
The design of Concentrating Solar Thermal Power (CSTP) systems requires detailed knowledge of the dynamic behavior of the meteorology at the site of interest. Meteorological series are often condensed into one representative year, the Typical Meteorological Year (TMY), with the aim of reducing data volume and speeding up energy system simulations. This approach is appropriate for detailed simulations of a specific plant; however, in earlier stages of the design of a power plant, especially during the optimization of the large number of plant parameters before a final design is reached, a huge number of simulations is needed. Even with today's technology, the computational effort to simulate solar energy system performance with one year of data at high frequency (e.g., 1-min) may become colossal if a multivariable optimization has to be performed. This work presents a simple and efficient methodology for selecting a reduced number of individual days able to represent the electrical production of the plant throughout the complete year. To achieve this objective, a new procedure for determining a reduced set of typical weather data for evaluating the long-term performance of a solar energy system is proposed. The proposed methodology is based on cluster analysis and drastically reduces the computational effort of calculating a CSTP plant's energy yield by simulating a reduced number of days from a high-frequency TMY.
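A minimal sketch of the day-selection idea follows: each day of a meteorological year is described by a few daily features, the days are clustered, and the day closest to each cluster centre is retained as a representative, weighted by its cluster's share of the year. The feature choice, the number of clusters, and the synthetic data are assumptions, not the paper's procedure details.

```python
# Sketch of representative-day selection via clustering; features and data are assumed.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(11)
n_days, n_clusters = 365, 12
features = np.column_stack([
    rng.uniform(0, 10, n_days),    # daily DNI sum (kWh/m^2)
    rng.uniform(0, 1, n_days),     # daily clearness/variability index
    rng.uniform(-5, 35, n_days),   # mean ambient temperature (C)
])
z = (features - features.mean(axis=0)) / features.std(axis=0)

km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(z)
representatives, weights = [], []
for c in range(n_clusters):
    members = np.flatnonzero(km.labels_ == c)
    closest = members[np.argmin(np.linalg.norm(z[members] - km.cluster_centers_[c], axis=1))]
    representatives.append(closest)
    weights.append(len(members) / n_days)  # weight each day by its cluster's share of the year

print("representative days:", sorted(representatives))
print("annual yield ~ weighted sum of the simulated yields of these", n_clusters, "days")
```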
Gurarie, David; Karl, Stephan; Zimmerman, Peter A; King, Charles H; St Pierre, Timothy G; Davis, Timothy M E
2012-01-01
Agent-based modeling of Plasmodium falciparum infection offers an attractive alternative to the conventional Ross-Macdonald methodology, as it allows simulation of heterogeneous communities subjected to realistic transmission (inoculation patterns). We developed a new agent-based model that accounts for the essential in-host processes: parasite replication and its regulation by innate and adaptive immunity. The model also incorporates a simplified version of antigenic variation by Plasmodium falciparum. We calibrated the model using data from malaria-therapy (MT) studies and developed a novel calibration procedure that accounts for a deterministic and a pseudo-random component in the observed parasite density patterns. Using the parasite density patterns of 122 MT patients, we generated a large number of calibrated parameter sets. The resulting data set served as a basis for constructing and simulating heterogeneous agent-based (AB) communities of MT-like hosts. We conducted several numerical experiments subjecting AB communities to realistic inoculation patterns reported from previous field studies, and compared the model output to the observed malaria prevalence in the field. There was overall consistency, supporting the potential of this agent-based methodology to represent transmission in realistic communities. Our approach represents a novel, convenient and versatile method to model Plasmodium falciparum infection.
Consumer Neuroscience-Based Metrics Predict Recall, Liking and Viewing Rates in Online Advertising.
Guixeres, Jaime; Bigné, Enrique; Ausín Azofra, Jose M; Alcañiz Raya, Mariano; Colomer Granero, Adrián; Fuentes Hurtado, Félix; Naranjo Ornedo, Valery
2017-01-01
The purpose of the present study is to investigate whether the effectiveness of a new ad on digital channels (YouTube) can be predicted by using neural networks and neuroscience-based metrics (brain response, heart rate variability and eye tracking). Neurophysiological responses were recorded from 35 participants exposed to 8 relevant TV Super Bowl commercials. Correlations between the neurophysiological-based metrics, ad recall, ad liking, the ACE metrix score, and the number of views on YouTube during a year were investigated. Our findings suggest a significant correlation between the neuroscience metrics, self-reported ad effectiveness, and the direct number of views on the YouTube channel. In addition, an artificial neural network based on the neuroscience metrics classifies the ads (82.9% average accuracy) and estimates the number of online views (mean error of 0.199). The results highlight the validity of neuromarketing-based techniques for predicting the success of advertising responses. Practitioners can consider the proposed methodology at the design stages of advertising content, thus enhancing advertising effectiveness. The study pioneers the use of neurophysiological methods in predicting advertising success in a digital context. This is the first article to examine whether these measures could actually be used for predicting views for advertising on YouTube. PMID:29163251
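The sketch below illustrates the general approach of feeding neuroscience-based metrics into a small neural network classifier, here with scikit-learn on synthetic data. The feature names, labeling rule, and network size are assumptions and do not reproduce the authors' model or data.

```python
# Illustrative neural-network classifier in the spirit of the study: synthetic
# neuroscience-based features classify ads as high- vs low-performing.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n_ads = 200
X = np.column_stack([
    rng.normal(0, 1, n_ads),     # frontal EEG asymmetry index
    rng.normal(50, 10, n_ads),   # heart-rate variability measure
    rng.uniform(0, 1, n_ads),    # proportion of fixations on the brand
])
y = (0.8 * X[:, 0] + 0.05 * (X[:, 1] - 50) + 2 * X[:, 2] + rng.normal(0, 1, n_ads) > 1).astype(int)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```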
Universal Verification Methodology Based Register Test Automation Flow.
Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu
2016-05-01
In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task; therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we propose a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models into a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For creating register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe register specifications in the IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template that is translated into an IP-XACT description, from which register models can be easily generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
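A toy version of the spreadsheet-to-register-description step is sketched below: rows of a register spreadsheet are converted into a simplified, IP-XACT-like XML fragment. The column names and XML element names are illustrative only and do not follow the full IP-XACT schema or the flow of any particular vendor tool.

```python
# Toy sketch of the spreadsheet-to-register-description step: spreadsheet rows are
# turned into a simplified, IP-XACT-like XML fragment (not the full IP-XACT schema).
import csv
import io
import xml.etree.ElementTree as ET

SPREADSHEET = """name,offset,width,access,reset
CTRL,0x00,32,read-write,0x00000000
STATUS,0x04,32,read-only,0x00000001
"""

root = ET.Element("addressBlock")
for row in csv.DictReader(io.StringIO(SPREADSHEET)):
    reg = ET.SubElement(root, "register")
    for field in ("name", "offset", "width", "access", "reset"):
        ET.SubElement(reg, field).text = row[field]

print(ET.tostring(root, encoding="unicode"))
```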
Hosseini, Marjan; Kerachian, Reza
2017-09-01
This paper presents a new methodology for analyzing the spatiotemporal variability of water table levels and redesigning a groundwater level monitoring network (GLMN) using the Bayesian Maximum Entropy (BME) technique and a multi-criteria decision-making approach based on ordered weighted averaging (OWA). The spatial sampling is determined using a hexagonal gridding pattern together with a new method proposed to assign a removal priority number to each pre-existing station. To design the temporal sampling, a new approach is applied that accounts for the uncertainty caused by lack of information: different time lag values are tested against another source of information, the simulation results of a numerical groundwater flow model. Furthermore, to incorporate the uncertainties in the available monitoring data, the flexibility of the BME interpolation technique is exploited by applying soft data, improving the accuracy of the calculations. To examine the methodology, it is applied to the Dehgolan plain in northwestern Iran. Based on the results, a configuration of 33 monitoring stations on a regular hexagonal grid of side length 3600 m is proposed, with a time lag between samples of 5 weeks. Since the variance estimation errors of the BME method are almost identical for the redesigned and existing networks, the redesigned monitoring network is more cost-effective and efficient than the existing monitoring network of 52 stations with monthly sampling.
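For readers unfamiliar with OWA, the aggregation step itself is very small, as the sketch below shows: criterion scores are sorted in descending order and combined with position weights. The scores and weights are illustrative; in the paper the OWA-based multi-criteria analysis is combined with the BME results.

```python
# Minimal ordered weighted averaging (OWA) sketch: criterion scores for one candidate
# station are sorted in descending order and combined with position weights.
import numpy as np

def owa(scores, weights):
    ordered = np.sort(scores)[::-1]          # descending order
    return float(np.dot(ordered, weights))   # weights apply to positions, not criteria

criteria_scores = np.array([0.9, 0.4, 0.7])  # e.g., estimation error, redundancy, cost
position_weights = np.array([0.5, 0.3, 0.2]) # weights summing to 1; emphasize higher scores
print(f"OWA score: {owa(criteria_scores, position_weights):.2f}")
```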
High-performance radial AMTEC cell design for ultra-high-power solar AMTEC systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendricks, T.J.; Huang, C.
1999-07-01
Alkali Metal Thermal to Electric Conversion (AMTEC) technology is rapidly maturing for potential application in ultra-high-power solar AMTEC systems required by potential future US Air Force (USAF) spacecraft missions in medium-earth and geosynchronous orbits (MEO and GEO). Solar thermal AMTEC power systems potentially have several important advantages over current solar photovoltaic power systems in ultra-high-power spacecraft applications for USAF MEO and GEO missions. This work presents key aspects of radial AMTEC cell design to achieve high cell performance in solar AMTEC systems delivering larger than 50 kW(e) to support high power USAF missions. These missions typically require AMTEC cell conversion efficiency larger than 25%. A sophisticated design parameter methodology is described and demonstrated which establishes optimum design parameters in any radial cell design to satisfy high-power mission requirements. Specific relationships, which are distinct functions of cell temperatures and pressures, define critical dependencies between key cell design parameters, particularly the impact of parasitic thermal losses on Beta Alumina Solid Electrolyte (BASE) area requirements, voltage, number of BASE tubes, and system power production for both maximum power-per-BASE-area and optimum efficiency conditions. Finally, some high-level system tradeoffs are demonstrated using the design parameter methodology to establish high-power radial cell design requirements and philosophy. The discussion highlights how to incorporate this methodology with sophisticated SINDA/FLUINT AMTEC cell modeling capabilities to determine optimum radial AMTEC cell designs.
New geometric design consistency model based on operating speed profiles for road safety evaluation.
Camacho-Torregrosa, Francisco J; Pérez-Zuriaga, Ana M; Campoy-Ungría, J Manuel; García-García, Alfredo
2013-12-01
To assist in the ongoing effort to reduce road fatalities as much as possible, this paper presents a new methodology to evaluate road safety in both the design and redesign stages of two-lane rural highways. This methodology is based on the analysis of road geometric design consistency, a value which serves as a surrogate measure of the safety level of a two-lane rural road segment. The consistency model presented in this paper is based on continuous operating speed profiles. The models used for their construction were obtained through an innovative GPS-based data collection method in which continuous operating speed profiles were recorded from individual drivers. This allowed the researchers to observe the actual behavior of drivers and to develop more accurate operating speed models than was previously possible with spot-speed data collection, thereby enabling a more accurate approximation of the real phenomenon and thus a better consistency measurement. Operating speed profiles were built for 33 Spanish two-lane rural road segments, and several consistency measures based on the global and local operating speed were checked. The final consistency model takes into account not only the global dispersion of the operating speed, but also indexes that consider local speed decelerations and speeds over posted speeds. For the development of the consistency model, the crash frequency for each study site was considered, which allowed the number of crashes on a road segment to be estimated from its geometric design consistency. Consequently, the presented consistency evaluation method is a promising innovative tool that can be used as a surrogate measure to estimate the safety of a road segment. Copyright © 2012 Elsevier Ltd. All rights reserved.
Zhang, Yong-Feng; Chiang, Hsiao-Dong
2017-09-01
A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
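For orientation, a plain global-best PSO loop is sketched below on a standard multimodal test function; it is not the consensus-based PSO or the Trust-Tech stages of the proposed methodology, and the swarm settings are common textbook values.

```python
# Minimal particle swarm optimization sketch (plain global-best PSO, not the
# consensus-based PSO/Trust-Tech method itself) minimizing a multimodal test function.
import numpy as np

def rastrigin(x):
    return 10 * x.shape[-1] + np.sum(x**2 - 10 * np.cos(2 * np.pi * x), axis=-1)

rng = np.random.default_rng(0)
n_particles, dim, iters = 40, 5, 300
w, c1, c2 = 0.72, 1.49, 1.49                      # common inertia/acceleration settings

x = rng.uniform(-5.12, 5.12, (n_particles, dim))
v = np.zeros_like(x)
pbest, pbest_val = x.copy(), rastrigin(x)
gbest = pbest[np.argmin(pbest_val)]

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, -5.12, 5.12)
    vals = rastrigin(x)
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print(f"best value found: {pbest_val.min():.4f}")   # global minimum is 0 at the origin
```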
A novel approach based on preference-based index for interval bilevel linear programming problem.
Ren, Aihong; Wang, Yuping; Xue, Xingsi
2017-01-01
This paper proposes a new methodology for solving the interval bilevel linear programming problem, in which all coefficients of both objective functions and constraints are considered as interval numbers. In order to keep as much of the uncertainty of the original constraint region as possible, the original problem is first converted into an interval bilevel programming problem with interval coefficients in the objective functions only, through normal variation of interval numbers and chance-constrained programming. With consideration of the different preferences of different decision makers, the concept of the preference level at which the interval objective function is preferred to a target interval is defined based on the preference-based index. A preference-based deterministic bilevel programming problem is then constructed in terms of the preference level and the order relation [Formula: see text]. Furthermore, the concept of a preference δ-optimal solution is given. Subsequently, the constructed deterministic nonlinear bilevel problem is solved with the help of an estimation of distribution algorithm. Finally, several numerical examples are provided to demonstrate the effectiveness of the proposed approach.
Numerical and experimental investigation of a beveled trailing-edge flow field and noise emission
NASA Astrophysics Data System (ADS)
van der Velden, W. C. P.; Pröbsting, S.; van Zuijlen, A. H.; de Jong, A. T.; Guan, Y.; Morris, S. C.
2016-12-01
Efficient tools and methodologies for the prediction of trailing-edge noise are of substantial interest to the wind turbine industry. In recent years, the Lattice Boltzmann Method has received increased attention for providing such an efficient alternative for the numerical solution of complex flow problems. Based on the fully explicit, transient, compressible solution of the Lattice Boltzmann Equation in combination with a Ffowcs Williams-Hawkings aeroacoustic analogy, an estimate of the acoustic radiation in the far field is obtained. To validate this methodology for the prediction of trailing-edge noise, the flow around a flat plate with an asymmetric 25° beveled trailing edge and obtuse corner in a low Mach number flow is analyzed. Flow field dynamics are compared to data obtained experimentally from Particle Image Velocimetry and Hot Wire Anemometry, and compare favorably in terms of mean velocity field and turbulent fluctuations. Moreover, the characteristics of the unsteady surface pressure, which are closely related to the acoustic emission, show good agreement between simulation and experiment. Finally, the prediction of the radiated sound is compared to the results obtained from acoustic phased array measurements in combination with a beamforming methodology. Vortex shedding results in a strong narrowband component centered at a constant Strouhal number in the acoustic spectrum. At higher frequencies, a good agreement between simulation and experiment for the broadband noise component is obtained, and a typical cardioid-like directivity is recovered.
Novel analytical methods to assess the chemical and physical properties of liposomes.
Kothalawala, Nuwan; Mudalige, Thilak K; Sisco, Patrick; Linder, Sean W
2018-08-01
Liposomes are used in commercial pharmaceutical formulations (PFs) and dietary supplements (DSs) as a carrier vehicle to protect the active ingredient from degradation and to increase the half-life of injectable products. Even though the commercialization of liposomal products has rapidly increased, characterization methodologies to evaluate the physical and chemical properties of these products have not been well established. Herein we develop rapid methodologies to evaluate chemical and selected physical properties of liposomal formulations. The chemical properties of liposomes are determined by their lipid composition. The lipid composition is evaluated by first screening the lipids present in the sample using HPLC-ELSD, followed by HPLC-MS/MS analysis with high mass accuracy (<5 ppm), fragmentation-pattern matching, and lipid structure database searching. Physical properties such as particle size and size distribution were investigated using Tunable Resistive Pulse Sensing (TRPS). The developed methods were used to analyze commercially available PFs and DSs. The PFs contained the distinct lipids indicated by the manufacturer, whereas the DSs were more complex, containing a large number of lipids belonging to different sub-classes. The commercially available liposomes had particles with a wide size distribution based on the TRPS size measurements. The high mass accuracy, together with the identification of lipids using multiple fragment ions, helped to accurately identify the lipids and differentiate them from other lipophilic molecules. The developed analytical methodologies were successfully adapted to measure the physicochemical properties of commercial liposomes. Copyright © 2018. Published by Elsevier B.V.
Treweek, Shaun; Francis, Jill J; Bonetti, Debbie; Barnett, Karen; Eccles, Martin P; Hudson, Jemma; Jones, Claire; Pitts, Nigel B; Ricketts, Ian W; Sullivan, Frank; Weal, Mark; MacLennan, Graeme
2016-12-01
Intervention Modeling Experiments (IMEs) are a way of developing and testing behavior change interventions before a trial. We aimed to test this methodology in a Web-based IME that replicated the trial component of an earlier, paper-based IME. Three-arm, Web-based randomized evaluation of two interventions (persuasive communication and action plan) and a "no intervention" comparator. The interventions were designed to reduce the number of antibiotic prescriptions in the management of uncomplicated upper respiratory tract infection. General practitioners (GPs) were invited to complete an online questionnaire and eight clinical scenarios where an antibiotic might be considered. One hundred twenty-nine GPs completed the questionnaire. GPs receiving the persuasive communication did not prescribe an antibiotic in 0.70 more scenarios (95% confidence interval [CI] = 0.17-1.24) than those in the control arm. For the action plan, GPs did not prescribe an antibiotic in 0.63 (95% CI = 0.11-1.15) more scenarios than those in the control arm. Unlike the earlier IME, behavioral intention was unaffected by the interventions; this may be due to a smaller sample size than intended. A Web-based IME largely replicated the findings of an earlier paper-based study, providing some grounds for confidence in the IME methodology. Copyright © 2016 Elsevier Inc. All rights reserved.
Li, Honghe; Ding, Ning; Zhang, Yuanyuan; Liu, Yang; Wen, Deliang
2017-01-01
Over the last three decades, various instruments have been developed and employed to assess medical professionalism, but their measurement properties have yet to be fully evaluated. This study aimed to systematically evaluate these instruments' measurement properties and the methodological quality of their related studies within a universally acceptable standardized framework and then provide corresponding recommendations. A systematic search of the electronic databases PubMed, Web of Science, and PsycINFO was conducted to collect studies published from 1990-2015. After screening titles, abstracts, and full texts for eligibility, the articles included in this study were classified according to their respective instrument's usage. A two-phase assessment was conducted: 1) methodological quality was assessed by following the COnsensus-based Standards for the selection of health status Measurement INstruments (COSMIN) checklist; and 2) the quality of measurement properties was assessed according to Terwee's criteria. Results were integrated using best-evidence synthesis to identify recommendable instruments. After screening 2,959 records, 74 instruments from 80 studies were included. The overall methodological quality of these studies was unsatisfactory, for reasons including but not limited to unknown missing data, inadequate sample sizes, and vague hypotheses. Content validity, cross-cultural validity, and criterion validity were either unreported or received negative ratings in most studies. Based on best-evidence synthesis, three instruments were recommended: Hisar's instrument for nursing students, the Nurse Practitioners' Roles and Competencies Scale, and the Perceived Faculty Competency Inventory. Although instruments measuring medical professionalism are diverse, only a limited number of studies were methodologically sound. Future studies should give priority to systematically improving the performance of existing instruments and to longitudinal studies.
Investigation of Super Learner Methodology on HIV-1 Small Sample: Application on Jaguar Trial Data.
Houssaïni, Allal; Assoumou, Lambert; Marcelin, Anne Geneviève; Molina, Jean Michel; Calvez, Vincent; Flandre, Philippe
2012-01-01
Background. Many statistical models have been tested to predict phenotypic or virological response from genotypic data. A statistical framework called the Super Learner has been introduced either to compare different methods/learners (the discrete Super Learner) or to combine them in a Super Learner prediction method. Methods. The Jaguar trial is used to apply the Super Learner framework. The Jaguar study is an "add-on" trial comparing the efficacy of adding didanosine to an ongoing failing regimen. Our aim was also to investigate the impact of using different cross-validation strategies and different loss functions. Four different splits between training set and validation set were tested with two loss functions. Six statistical methods were compared. We assessed performance by evaluating R² values and accuracy by calculating the rates of patients being correctly classified. Results. Our results indicated that the more recent Super Learner methodology of building a new predictor based on a weighted combination of different methods/learners provided good performance. A simple linear model provided similar results to those of this new predictor. Slight discrepancies arose between the two loss functions investigated, and also between results based on cross-validated risks and results from the full dataset. The Super Learner methodology and the linear model correctly classified around 80% of patients. The difference between the lower and higher rates is around 10 percent. The number of mutations retained by the different learners also varies from one to 41. Conclusions. The more recent Super Learner methodology, combining the predictions of many learners, provided good performance on our small dataset.
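The weighted-combination idea behind the Super Learner can be sketched with scikit-learn's stacking implementation, in which several base learners are combined through a cross-validated meta-learner. The synthetic genotype/response data and the choice of learners below are illustrative assumptions, not the trial data or the six methods compared in the paper.

```python
# Sketch of the Super Learner idea using scikit-learn stacking: base learners are
# combined through a cross-validated meta-learner. Data and learner choices are assumed.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_patients, n_mutations = 160, 41
X = rng.integers(0, 2, (n_patients, n_mutations))          # presence/absence of mutations
y = (X[:, :5].sum(axis=1) + rng.normal(0, 1, n_patients) > 2.5).astype(int)  # response

base_learners = [
    ("logit", LogisticRegression(max_iter=1000)),
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("knn", KNeighborsClassifier()),
]
super_learner = StackingClassifier(estimators=base_learners,
                                   final_estimator=LogisticRegression(), cv=5)
print("discrete learners vs. stack (5-fold accuracy):")
for name, est in base_learners + [("stack", super_learner)]:
    print(f"  {name:6s} {cross_val_score(est, X, y, cv=5).mean():.2f}")
```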
Nuclear power plant digital system PRA pilot study with the dynamic flow-graph methodology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yau, M.; Motamed, M.; Guarro, S.
2006-07-01
Current Probabilistic Risk Assessment (PRA) methodology is well established in analyzing hardware and some of the key human interactions. However, processes for analyzing the software functions of digital systems within a plant PRA framework, and accounting for the digital system contribution to the overall risk, are not generally available nor are they well understood and established. A recent study reviewed a number of methodologies that have potential applicability to modeling and analyzing digital systems within a PRA framework. This study identified the Dynamic Flow-graph Methodology (DFM) and the Markov Methodology as the most promising tools. As a result of this study, a task was defined under the framework of a collaborative agreement between the U.S. Nuclear Regulatory Commission (NRC) and the Ohio State Univ. (OSU). The objective of this task is to set up benchmark systems representative of digital systems used in nuclear power plants and to evaluate DFM and the Markov methodology with these benchmark systems. The first benchmark system is a typical Pressurized Water Reactor (PWR) Steam Generator (SG) Feedwater System (FWS) level control system based on an earlier ASCA work with the U.S. NRC [2], upgraded with modern control laws. ASCA, Inc. is currently under contract to OSU to apply DFM to this benchmark system. The goal is to investigate the feasibility of using DFM to analyze and quantify digital system risk, and to integrate the DFM analytical results back into the plant event tree/fault tree PRA model. (authors)
Jain, Ram B
2017-07-01
Prevalence of smoking is needed to estimate the need for future public health resources. The objective was to compute and compare smoking prevalence rates obtained using self-reported smoking status, two serum cotinine (SCOT) based biomarker methods, and one urinary 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanol (NNAL) based biomarker method, and then to use these estimates to develop correction factors applicable to self-reported prevalences to arrive at corrected smoking prevalence rates. Data from the National Health and Nutrition Examination Survey (NHANES) for 2007-2012 for those aged ≥20 years (N = 16826) were used. The self-reported prevalence rate for the total population was 21.6% when computed as the weighted number of self-reported smokers divided by the weighted number of all participants, and 24% when computed as the weighted number of self-reported smokers divided by the weighted number of self-reported smokers and nonsmokers. The corrected prevalence rate was found to be 25.8%. A 1% underestimate in smoking prevalence is equivalent to not being able to identify 2.2 million smokers in the US in a given year. This underestimation, if not corrected, could lead to a serious gap between the public health services available and those needed to provide adequate preventive and corrective treatment to smokers.
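The headline figure can be checked with simple arithmetic, as in the sketch below, assuming a US population aged ≥20 years of roughly 220 million during 2007-2012 (an assumption for illustration; the paper works with NHANES survey weights).

```python
# Back-of-the-envelope check of the "2.2 million smokers per 1%" statement, assuming
# a US population aged >=20 years of roughly 220 million (illustrative assumption).
ADULT_POPULATION = 220_000_000

self_reported = 0.216   # prevalence from self-report
corrected = 0.258       # biomarker-corrected prevalence

missed_fraction = corrected - self_reported
print(f"smokers per 1% of prevalence: {ADULT_POPULATION * 0.01 / 1e6:.1f} million")
print(f"smokers missed by self-report: {ADULT_POPULATION * missed_fraction / 1e6:.1f} million")
```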
Assessment of circulating copy number variant detection for cancer screening.
Molparia, Bhuvan; Nichani, Eshaan; Torkamani, Ali
2017-01-01
Current high-sensitivity cancer screening methods, largely utilizing correlative biomarkers, suffer from false positive rates that lead to unnecessary medical procedures and debatable public health benefit overall. Detection of circulating tumor DNA (ctDNA), a causal biomarker, has the potential to revolutionize cancer screening. Thus far, the majority of ctDNA studies have focused on detection of tumor-specific point mutations after cancer diagnosis for the purpose of post-treatment surveillance. However, ctDNA point mutation detection methods developed to date likely lack either the scope or analytical sensitivity necessary to be useful for cancer screening, due to the low (<1%) ctDNA fraction derived from early stage tumors. On the other hand, tumor-derived copy number variant (CNV) detection is hypothetically a superior means of ctDNA-based cancer screening for many tumor types, given that, relative to point mutations, each individual tumor CNV contributes a much larger number of ctDNA fragments to the overall pool of circulating free DNA (cfDNA). A small number of studies have demonstrated the potential of ctDNA CNV-based screening in select cancer types. Here we perform an in silico assessment of the potential for ctDNA CNV-based cancer screening across many common cancers, and suggest ctDNA CNV detection shows promise as a broad cancer screening methodology.
Probabilistic Analysis for Comparing Fatigue Data Based on Johnson-Weibull Parameters
NASA Technical Reports Server (NTRS)
Hendricks, Robert C.; Zaretsky, Erwin V.; Vicek, Brian L.
2007-01-01
Probabilistic failure analysis is essential when analysis of stress-life (S-N) curves is inconclusive in determining the relative ranking of two or more materials. In 1964, L. Johnson published a methodology for establishing the confidence that two populations of data are different. Simplified algebraic equations for confidence numbers were derived based on the original work of L. Johnson. Using the ratios of mean life, the resultant values of confidence numbers deviated less than one percent from those of Johnson. It is possible to rank the fatigue lives of different materials with a reasonable degree of statistical certainty based on combined confidence numbers. These equations were applied to rotating-beam fatigue tests that were conducted on three aluminum alloys, AL 2024, AL 6061, and AL 7075, at three stress levels each. The results were analyzed and compared using ASTM Standard E739-91 and the Johnson-Weibull analysis. The ASTM method did not statistically distinguish between AL 6061 and AL 7075. Based on Johnson-Weibull confidence numbers greater than 99 percent, AL 2024 was found to have the longest fatigue life, followed by AL 7075, and then AL 6061. The ASTM Standard and the Johnson-Weibull analysis result in the same stress-life exponent p for each of the three aluminum alloys at the median, or L50, lives.
NASA Astrophysics Data System (ADS)
Dib, Alain; Kavvas, M. Levent
2018-03-01
The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.
Application of a territorial-based filtering algorithm in turbomachinery blade design optimization
NASA Astrophysics Data System (ADS)
Bahrami, Salman; Khelghatibana, Maryam; Tribes, Christophe; Yi Lo, Suk; von Fellenberg, Sven; Trépanier, Jean-Yves; Guibault, François
2017-02-01
A territorial-based filtering algorithm (TBFA) is proposed as an integration tool in a multi-level design optimization methodology. The design evaluation burden is split between low- and high-cost levels in order to properly balance the cost and required accuracy in different design stages, based on the characteristics and requirements of the case at hand. TBFA is in charge of connecting those levels by selecting a given number of geometrically different promising solutions from the low-cost level to be evaluated in the high-cost level. Two test case studies, a Francis runner and a transonic fan rotor, have demonstrated the robustness and functionality of TBFA in real industrial optimization problems.
Additive Manufacturing in Production: A Study Case Applying Technical Requirements
NASA Astrophysics Data System (ADS)
Ituarte, Iñigo Flores; Coatanea, Eric; Salmi, Mika; Tuomi, Jukka; Partanen, Jouni
Additive manufacturing (AM) is expanding manufacturing capabilities. However, the quality of AM-produced parts depends on a number of machine, geometry, and process parameters. The variability of these parameters affects manufacturing drastically, and therefore standardized processes and harmonized methodologies need to be developed to characterize the technology for end-use applications and enable it for production. This research proposes a composite methodology integrating Taguchi Design of Experiments, multi-objective optimization, and statistical process control to optimize the manufacturing process and fulfil multiple requirements imposed on an arbitrary geometry. The proposed methodology aims to characterize AM technology as a function of manufacturing process variables as well as to perform a comparative assessment of three AM technologies (Selective Laser Sintering, Laser Stereolithography and Polyjet). Results indicate that only one machine, the laser-based Stereolithography system, could simultaneously fulfil the macro- and micro-level geometrical requirements, but its mechanical properties were not at the required level. Future research will study one AM system at a time to characterize the AM machine's technical capabilities and stimulate pre-normative initiatives for end-use applications of the technology.
Indocyanine green videoangiography methodological variations: review.
Simal-Julián, Juan A; Miranda-Lloret, Pablo; Evangelista-Zamora, Rocio; Sanromán-Álvarez, Pablo; Pérez de San Román, Laila; Pérez-Borredá, Pedro; Beltrán-Giner, Andrés; Botella-Asunción, Carlos
2015-01-01
Indocyanine green videoangiography (ICGVA) procedures have become widespread within the spectrum of microsurgical techniques for neurovascular pathologies. We have conducted a review to identify and assess the impact of all of the methodological variations of conventional ICGVA applied in the field of neurovascular pathology that have been published to date in the English literature. A total of 18 studies were included in this review, identifying four primary methodological variants compared to conventional ICGVA: techniques based on transient occlusion, intra-arterial ICG administration via catheters, use of an endoscope system with a filter to collect the fluorescence of ICG, and quantitative fluorescence analysis. These variants offer some possibilities for resolving the limitations of the conventional technique (first, the vascular structure to be analyzed must be exposed and, second, vascular filling with ICG follows an additive pattern) and allow qualitatively superior information to be obtained during surgery. Advantages and disadvantages of each procedure are discussed. More case studies with a greater number of patients are needed to compare the different procedures with their gold standard, in order to establish these results consistently.
Determining radiated sound power of building structures by means of laser Doppler vibrometry
NASA Astrophysics Data System (ADS)
Roozen, N. B.; Labelle, L.; Rychtáriková, M.; Glorieux, C.
2015-06-01
This paper introduces a methodology that makes use of laser Doppler vibrometry to assess the acoustic insulation performance of a building element. The sound power radiated by the surface of the element is numerically determined from the vibrational pattern, offering an alternative for classical microphone measurements. Compared to the latter the proposed analysis is not sensitive to room acoustical effects. This allows the proposed methodology to be used at low frequencies, where the standardized microphone based approach suffers from a high uncertainty due to a low acoustic modal density. Standardized measurements as well as laser Doppler vibrometry measurements and computations have been performed on two test panels, a light-weight wall and a gypsum block wall and are compared and discussed in this paper. The proposed methodology offers an adequate solution for the assessment of the acoustic insulation of building elements at low frequencies. This is crucial in the framework of recent proposals of acoustic standards for measurement approaches and single number sound insulation performance ratings to take into account frequencies down to 50 Hz.
Wu, Yang; Doering, Jon A; Ma, Zhiyuan; Tang, Song; Liu, Hongling; Zhang, Xiaowei; Wang, Xiaoxiang; Yu, Hongxia
2016-09-01
A tremendous gap exists between the number of potential endocrine disrupting chemicals (EDCs) possibly present in the environment and the limitations of traditional regulatory testing. In this study, the anti-androgenic potencies of 21 flavonoids were analyzed in vitro, and another 32 flavonoids from the literature were selected as additional chemicals. Molecular dynamics simulations were employed to obtain four different separation approaches based on the different behaviors of ligands and receptors during the process of interaction. Specifically, the ligand-receptor complex was examined for the discriminating features of ligand escape or retention via the "mousetrap" mechanism, the hydrogen bonds formed during the simulations, ligand stability, and the stability of helix-12 of the receptor. Together, a methodology was generated by which 87.5% of the flavonoids could be discriminated as active versus inactive antagonists, and over 90% of inactive antagonists could be filtered out before QSAR study. This methodology could serve as a "proof of concept" for identifying inactive anti-androgenic flavonoids and could also be beneficial for rapid risk assessment and regulation of multiple new chemicals for androgenicity. Copyright © 2016 Elsevier Ltd. All rights reserved.
Health sciences descriptors in the brazilian speech-language and hearing science.
Campanatti-Ostiz, Heliane; Andrade, Claudia Regina Furquim de
2010-01-01
Terminology in Speech-Language and Hearing Science. The aim was to propose a specific thesaurus for Speech-Language and Hearing Science, in English, Portuguese and Spanish, based on the existing keywords available in the Health Sciences Descriptors (DeCS). The methodology was based on the pilot study developed by Campanatti-Ostiz and Andrade, whose purpose was to verify the methodological viability of creating a Speech-Language and Hearing Science category in the DeCS. The scientific journals selected for analysis of the titles, abstracts and keywords of all scientific articles were those in the field of Speech-Language and Hearing Science indexed in SciELO. The procedure comprised: 1. recovery of the descriptors in the English language (Medical Subject Headings, MeSH); 2. recovery and hierarchical organization of the descriptors in the Portuguese language (DeCS). The obtained data were analyzed in two ways: descriptive analyses and relative relevance analyses of the DeCS areas. Based on the first analyses, we decided to select all 761 descriptors, with all their hierarchical numbers, independently of their occurrence (occurrence number, ON); based on the second analyses, we proposed to exclude the less relevant areas and the exclusive DeCS areas. The final proposal contained a total of 1676 occurrences of DeCS descriptors, distributed in the following areas: Anatomy; Diseases; Analytical, Diagnostic and Therapeutic Techniques and Equipment; Psychiatry and Psychology; Phenomena and Processes; Health Care. The presented thesaurus proposal contains the specific terminology of the Brazilian Speech-Language and Hearing Sciences and reflects the descriptors of the published scientific production. Since DeCS is a trilingual vocabulary (Portuguese, English and Spanish), the proposed organization of descriptors can be used in these three languages, allowing greater cultural interchange between nations.
A Novel Performance Evaluation Methodology for Single-Target Trackers.
Kristan, Matej; Matas, Jiri; Leonardis, Ales; Vojir, Tomas; Pflugfelder, Roman; Fernandez, Gustavo; Nebehay, Georg; Porikli, Fatih; Cehovin, Luka
2016-11-01
This paper addresses the problem of single-target tracker performance evaluation. We consider the performance measures, the dataset and the evaluation system to be the most important components of tracker evaluation and propose requirements for each of them. The requirements are the basis of a new evaluation methodology that aims at a simple and easily interpretable tracker comparison. The ranking-based methodology addresses tracker equivalence in terms of statistical significance and practical differences. A fully annotated dataset with per-frame annotations of several visual attributes is introduced. The diversity of its visual properties is maximized in a novel way by clustering a large number of videos according to their visual attributes. This makes it the most carefully constructed and annotated dataset to date. A multi-platform evaluation system allowing easy integration of third-party trackers is presented as well. The proposed evaluation methodology was tested in the VOT2014 challenge on the new dataset with 38 trackers, making it the largest benchmark to date. Most of the tested trackers are indeed state-of-the-art, since they outperform the standard baselines, resulting in a highly challenging benchmark. An exhaustive analysis of the dataset from the perspective of tracking difficulty is carried out. To facilitate tracker comparison, a new performance visualization technique is proposed.
High-frequency measurements of aeolian saltation flux: Field-based methodology and applications
NASA Astrophysics Data System (ADS)
Martin, Raleigh L.; Kok, Jasper F.; Hugenholtz, Chris H.; Barchyn, Thomas E.; Chamecki, Marcelo; Ellis, Jean T.
2018-02-01
Aeolian transport of sand and dust is driven by turbulent winds that fluctuate over a broad range of temporal and spatial scales. However, commonly used aeolian transport models do not explicitly account for such fluctuations, likely contributing to substantial discrepancies between models and measurements. Underlying this problem is the absence of accurate sand flux measurements at the short time scales at which wind speed fluctuates. Here, we draw on extensive field measurements of aeolian saltation to develop a methodology for generating high-frequency (up to 25 Hz) time series of total (vertically-integrated) saltation flux, namely by calibrating high-frequency (HF) particle counts to low-frequency (LF) flux measurements. The methodology follows four steps: (1) fit exponential curves to vertical profiles of saltation flux from LF saltation traps, (2) determine empirical calibration factors through comparison of LF exponential fits to HF number counts over concurrent time intervals, (3) apply these calibration factors to subsamples of the saltation count time series to obtain HF height-specific saltation fluxes, and (4) aggregate the calibrated HF height-specific saltation fluxes into estimates of total saltation fluxes. When coupled to high-frequency measurements of wind velocity, this methodology offers new opportunities for understanding how aeolian saltation dynamics respond to variability in driving winds over time scales from tens of milliseconds to days.
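The four calibration steps lend themselves to a compact numerical sketch. The snippet below, with entirely hypothetical trap heights, fluxes and particle-counter data, fits the exponential flux profile, derives a calibration factor from a concurrent low-frequency interval, and applies it to a high-frequency count series; it is an illustration of the workflow, not the authors' processing code.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical LF trap data: heights z [m] and height-specific fluxes q [g m^-2 s^-1].
z_traps = np.array([0.05, 0.10, 0.20, 0.30, 0.45])
q_traps = np.array([12.0, 8.1, 3.9, 1.8, 0.6])

# Step 1: exponential fit q(z) = q0 * exp(-z / zq); the vertically integrated flux is q0 * zq.
expfit = lambda z, q0, zq: q0 * np.exp(-z / zq)
(q0, zq), _ = curve_fit(expfit, z_traps, q_traps, p0=(10.0, 0.1))
Q_lf = q0 * zq

# Step 2: calibration factor relating HF particle counts to flux over the same interval.
hf_counts = np.random.default_rng(1).poisson(5, size=25 * 600)  # synthetic 25 Hz counts, 10 min
cal = Q_lf / hf_counts.mean()

# Steps 3-4: apply the factor to the HF count series and aggregate to total saltation flux.
q_hf = cal * hf_counts
print(f"total flux from LF fit: {Q_lf:.2f}, mean calibrated HF flux: {q_hf.mean():.2f}")
```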
Design Methodology for Multi-Element High-Lift Systems on Subsonic Civil Transport Aircraft
NASA Technical Reports Server (NTRS)
Pepper, R. S.; vanDam, C. P.
1996-01-01
The choice of a high-lift system is crucial in the preliminary design process of a subsonic civil transport aircraft. Its purpose is to increase the allowable aircraft weight or decrease the aircraft's wing area for a given takeoff and landing performance. However, the implementation of a high-lift system into a design must be done carefully, for it can improve the aerodynamic performance of an aircraft but may also drastically increase the aircraft empty weight. If designed properly, a high-lift system can improve the cost effectiveness of an aircraft by increasing the payload weight for a given takeoff and landing performance. This is why the design methodology for a high-lift system should incorporate aerodynamic performance, weight, and cost. The airframe industry has experienced rapid technological growth in recent years which has led to significant advances in high-lift systems. For this reason many existing design methodologies have become obsolete since they are based on outdated low Reynolds number wind-tunnel data and can no longer accurately predict the aerodynamic characteristics or weight of current multi-element wings. Therefore, a new design methodology has been created that reflects current aerodynamic, weight, and cost data and provides enough flexibility to allow incorporation of new data when it becomes available.
Survey of systematic review authors in dentistry: challenges in methodology and reporting.
Major, Michael P; Warren, Sharon; Flores-Mir, Carlos
2009-04-01
The study reported in this article had three objectives: 1) identify the challenges faced by authors of dental systematic reviews (SR) during the process of literature search and selection; 2) determine whether dental SR authors' responses to survey questions about their study methodology were consistent with the reported published methodology; and 3) assess whether dental SR authors' evidence-based publication experience was associated with reported methodology. Seventy-eight authors (53 percent) of dental SRs out of 147 potential authors published from 2000 to 2006 responded to an online survey. According to the respondents, the most challenging aspects of literature search and selection were the initial design and performing extended literature searches. Agreement between the protocol identified by SR authors on the survey and the actual protocol described in their publications was fair to moderate. There were virtually no correlations between authors' publication experience, systematic review literature search, and selection thoroughness except for the number of past SRs published, and no differences in thoroughness between SRs written by clinicians (dental practitioners in the community) and dental school faculty members. Dental SR authors do not appear to fully appreciate the importance of extensive literature searches as central to the validity of their systematic review methods and potential findings.
Rocha, C P; Croci, C S; Caria, P H F
2013-11-01
The objective of this systematic review was to find sufficient evidence to deny or accept the association between head and cervical posture and temporomandibular disorders (TMDs), and thus assist health professionals in the evaluation and treatment of patients with TMDs. A search was conducted through all publications written in English about this topic using the Medline, ISI Web of Science, EMBASE, PubMed and Lilacs databases. The abstracts that fulfilled the initial guideline were retrieved and evaluated to ensure they met the inclusion criteria. To assess the methodological quality of the studies, we developed a questionnaire considering the following criteria: participants' eligibility, control group, diagnosis of TMDs, posture diagnosis and randomisation. Twenty-two studies were selected as potential studies based on their abstracts. Only seventeen studies actually fulfilled the inclusion criteria. The search provided information about the methodological quality of the studies, in which several methodological defects were found. The evidence presented in this systematic review shows that the relation between TMDs and head and neck posture is still controversial and unclear. The insufficient number of articles considered to be of excellent methodological quality is a factor that hinders the acceptance or denial of this association. © 2013 John Wiley & Sons Ltd.
Reiner, Bruce I
2018-04-01
Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
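A minimal sketch of the simplest of these strategies, string search over a hedging lexicon, is given below; the term list and the example report are illustrative assumptions, not taken from the article.

```python
import re
from collections import Counter

# Illustrative lexicon of uncertainty/hedging phrases; the article does not specify one.
UNCERTAINTY_TERMS = ["possibly", "probably", "cannot exclude", "may represent",
                     "suspicious for", "likely", "equivocal"]

def uncertainty_profile(report_text):
    """Count occurrences of each uncertainty phrase in a free-text report."""
    text = report_text.lower()
    return Counter({t: len(re.findall(re.escape(t), text)) for t in UNCERTAINTY_TERMS})

report = "Findings possibly represent pneumonia; cannot exclude early abscess."
print(uncertainty_profile(report))
```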
A software engineering approach to expert system design and verification
NASA Technical Reports Server (NTRS)
Bochsler, Daniel C.; Goodwin, Mary Ann
1988-01-01
Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.
Quantification of effective exoelectrogens by most probable number (MPN) in a microbial fuel cell.
Heidrich, Elizabeth S; Curtis, Thomas P; Woodcock, Stephen; Dolfing, Jan
2016-10-01
The objective of this work was to quantify the number of exoelectrogens in wastewater capable of producing current in a microbial fuel cell by adapting the classical most probable number (MPN) methodology using current production as the end point. Inoculating a series of microbial fuel cells with various dilutions of domestic wastewater and with acetate as test substrate yielded an apparent number of exoelectrogens of 17 per ml. Using current as a proxy for activity, the apparent exoelectrogen growth rate was 0.03 h⁻¹. With starch or wastewater as more complex test substrates, similar apparent growth rates were obtained, but the apparent MPN-based numbers of exoelectrogens in wastewater were significantly lower, probably because, in contrast to acetate, complex substrates require complex food chains to deliver the electrons to the electrodes. Consequently, the apparent MPN is a function of the combined probabilities of members of the food chain being present. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
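The MPN calculation itself is a standard maximum-likelihood estimate from a dilution series, with "growth" replaced by current production. The sketch below uses a hypothetical three-dilution design rather than the study's actual layout and solves the MPN score equation numerically.

```python
import numpy as np
from scipy.optimize import brentq

def mpn(volumes_ml, n_cells, n_positive):
    """Maximum-likelihood most probable number (organisms per ml) from a dilution series,
    where a 'positive' replicate is a microbial fuel cell that produced current."""
    v = np.asarray(volumes_ml, float)
    n = np.asarray(n_cells, float)
    p = np.asarray(n_positive, float)

    def score(lam):  # derivative of the log-likelihood with respect to lambda
        return np.sum(p * v * np.exp(-lam * v) / (1 - np.exp(-lam * v)) - (n - p) * v)

    return brentq(score, 1e-9, 1e6)

# Hypothetical series: 1, 0.1 and 0.01 ml of wastewater, 3 replicate cells per dilution.
print(f"MPN of exoelectrogens: {mpn([1.0, 0.1, 0.01], [3, 3, 3], [3, 2, 0]):.1f} per ml")
```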
Massetti, Greta M; Simon, Thomas R; Smith, Deborah Gorman
2016-10-01
Drawing on research that has identified specific predictors and trajectories of risk for violence and related negative outcomes, a multitude of small- and large-scale preventive interventions for specific risk behaviors have been developed, implemented, and evaluated. One of the principal challenges of these approaches is that a number of separate problem-specific programs targeting different risk areas have emerged. However, as many negative health behaviors such as substance abuse and violence share a multitude of risk factors, many programs target identical risk factors. There are opportunities to understand whether evidence-based programs can be leveraged for potential effects across a spectrum of outcomes and over time. Some recent work has documented longitudinal effects of evidence-based interventions on generalized outcomes. This work has potential for advancing our understanding of the effectiveness of promising and evidence-based prevention strategies. However, conducting longitudinal follow-up of established interventions presents a number of methodological and design challenges. To answer some of these questions, the Centers for Disease Control and Prevention convened a panel of multidisciplinary experts to discuss opportunities to take advantage of evaluations of early prevention programs and evaluating multiple long-term outcomes. This special section of the journal Prevention Science includes a series of papers that begin to address the relevant considerations for conducting longitudinal follow-up evaluation research. This collection of papers is intended to inform our understanding of the challenges and strategies for conducting longitudinal follow-up evaluation research that could be used to drive future research endeavors.
Larue, Ruben T H M; Defraene, Gilles; De Ruysscher, Dirk; Lambin, Philippe; van Elmpt, Wouter
2017-02-01
Quantitative analysis of tumour characteristics based on medical imaging is an emerging field of research. In recent years, quantitative imaging features derived from CT, positron emission tomography and MR scans were shown to be of added value in the prediction of outcome parameters in oncology, in what is called the radiomics field. However, results might be difficult to compare owing to a lack of standardized methodologies to conduct quantitative image analyses. In this review, we aim to present an overview of the current challenges, technical routines and protocols that are involved in quantitative imaging studies. The first issue that should be overcome is the dependency of several features on the scan acquisition and image reconstruction parameters. Adopting consistent methods in the subsequent target segmentation step is equally crucial. To further establish robust quantitative image analyses, standardization or at least calibration of imaging features based on different feature extraction settings is required, especially for texture- and filter-based features. Several open-source and commercial software packages to perform feature extraction are currently available, all with slightly different functionalities, which makes benchmarking quite challenging. The number of imaging features calculated is typically larger than the number of patients studied, which emphasizes the importance of proper feature selection and prediction model-building routines to prevent overfitting. Even though many of these challenges still need to be addressed before quantitative imaging can be brought into daily clinical practice, radiomics is expected to be a critical component for the integration of image-derived information to personalize treatment in the future.
Displacement Based Multilevel Structural Optimization
NASA Technical Reports Server (NTRS)
Sobieszezanski-Sobieski, J.; Striz, A. G.
1996-01-01
In the complex environment of true multidisciplinary design optimization (MDO), efficiency is one of the most desirable attributes of any approach. In the present research, a new and highly efficient methodology for the MDO subset of structural optimization is proposed and detailed, i.e., for the weight minimization of a given structure under size, strength, and displacement constraints. Specifically, finite element based multilevel optimization of structures is performed. In the system level optimization, the design variables are the coefficients of assumed polynomially based global displacement functions, and the load unbalance resulting from the solution of the global stiffness equations is minimized. In the subsystems level optimizations, the weight of each element is minimized under the action of stress constraints, with the cross sectional dimensions as design variables. The approach is expected to prove very efficient since the design task is broken down into a large number of small and efficient subtasks, each with a small number of variables, which are amenable to parallel computing.
PEBL: A Code for Penetrating and Blunt Trauma, Based on the H-ICDA index
1978-10-01
The injuries are encoded with binary flags for pelvic structures (acetabulum, ischium, ilium, sacrum, pubic separation, sacroiliac joint separation); for example, an injury is encoded as 808,1230000110, where 808 is the PEBL root code ... numbers of combat casualties. Development of methodologies for making these estimates was requested of the Biophysics Branch by the Joint Technical ...
A Cost Model for Testing Unmanned and Autonomous Systems of Systems
2011-02-01
those risks. In addition, the fundamental methods presented by Aranha and Borba to include the complexity and sizing of tests for UASoS can be expanded ... used as an input for test execution effort estimation models (Aranha & Borba, 2007). Such methodology is very relevant to this work because, as a UASoS ... calculate the test effort based on the complexity of the SoS. However, Aranha and Borba define test size as the number of steps required to complete
Low-fidelity bench models for basic surgical skills training during undergraduate medical education.
Denadai, Rafael; Saad-Hossne, Rogério; Todelo, Andréia Padilha; Kirylko, Larissa; Souto, Luís Ricardo Martinhão
2014-01-01
The reduction in the number of medical students choosing general surgery as a career is remarkable. In this context, new possibilities in the field of surgical education should be developed to counter this lack of interest. In this study, a surgical training program based on learning with low-fidelity bench models is designed as a complementary alternative to the various methodologies for teaching basic surgical skills during medical education, and to develop personal interest in the career choice.
Information Systems: Opportunities Exist to Strengthen SEC’s Oversight of Capacity and Security
2001-07-01
Report GAO-01-863, Information Systems: Opportunities Exist to Strengthen SEC's Oversight of Capacity and Security. ... Results in Brief; Background; Scope and Methodology; SEC Uses a Wide Range ... or external organizations to conduct the independent reviews. These internal audits are performed cyclically based on an annual risk analysis. SEC ...
Static Strength Characteristics of Mechanically Fastened Composite Joints
NASA Technical Reports Server (NTRS)
Fox, D. E.; Swaim, K. W.
1999-01-01
The analysis of mechanically fastened composite joints presents a great challenge to structural analysts because of the large number of parameters that influence strength. These parameters include edge distance, width, bolt diameter, laminate thickness, ply orientation, and bolt torque. The research presented in this report investigates the influence of some of these parameters through testing and analysis. A methodology is presented for estimating the strength of the bolt-hole based on classical lamination theory using the Tsai-Hill failure criterion and typical bolt-hole bearing analytical methods.
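The Tsai-Hill criterion referred to here has a standard closed form, sketched below for a single ply in its material axes; the stresses and strengths in the example are hypothetical, and the full bearing-strength methodology (lamination theory plus bolt-hole stress analysis) is not reproduced.

```python
def tsai_hill_index(sigma1, sigma2, tau12, X, Y, S):
    """Tsai-Hill failure index for a single ply in its material axes.
    Failure is predicted when the index reaches 1. Strengths X, Y, S should be
    taken as tensile or compressive values according to the sign of the stresses."""
    return (sigma1 / X) ** 2 - (sigma1 * sigma2) / X ** 2 + (sigma2 / Y) ** 2 + (tau12 / S) ** 2

# Hypothetical carbon/epoxy ply stresses (MPa) and strengths (MPa).
print(f"Tsai-Hill index: {tsai_hill_index(900.0, 20.0, 40.0, 1500.0, 40.0, 68.0):.2f}")
```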
Expert systems in transmission planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galiana, F.D.; McGillis, D.T.; Marin, M.A.
1992-05-01
In this paper the state of the field of expert systems and knowledge engineering in transmission planning is reviewed. A detailed analysis of the goals, definition, requirements and methodology of transmission planning is presented. Potential benefits of knowledge-based applications in transmission planning are reviewed. This is followed by a thorough review of the area broken down into subareas or important related topics. The conclusions offer a number of suggestions for possible future research and development. Finally, a detailed bibliography divided into subareas is presented.
Scalable smoothing strategies for a geometric multigrid method for the immersed boundary equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhalla, Amneet Pal Singh; Knepley, Matthew G.; Adams, Mark F.
2016-12-20
The immersed boundary (IB) method is a widely used approach to simulating fluid-structure interaction (FSI). Although explicit versions of the IB method can suffer from severe time step size restrictions, these methods remain popular because of their simplicity and generality. In prior work (Guy et al., Adv Comput Math, 2015), some of us developed a geometric multigrid preconditioner for a stable semi-implicit IB method under Stokes flow conditions; however, this solver methodology used a Vanka-type smoother that presented limited opportunities for parallelization. This work extends this Stokes-IB solver methodology by developing smoothing techniques that are suitable for parallel implementation. Specifically, we demonstrate that an additive version of the Vanka smoother can yield an effective multigrid preconditioner for the Stokes-IB equations, and we introduce an efficient Schur complement-based smoother that is also shown to be effective for the Stokes-IB equations. We investigate the performance of these solvers for a broad range of material stiffnesses, both for Stokes flows and flows at nonzero Reynolds numbers, and for thick and thin structural models. We show here that linear solver performance degrades with increasing Reynolds number and material stiffness, especially for thin interface cases. Nonetheless, the proposed approaches promise to yield effective solution algorithms, especially at lower Reynolds numbers and at modest-to-high elastic stiffnesses.
Quality Assurance in Trichiasis Surgery: a methodology
Buchan, John C; Limburg, Hans; Burton, Matthew J
2013-01-01
Trachoma remains a significant cause of blindness in many parts of the world. The major route to blindness involves upper lid entropion leading to trachomatous trichiasis (TT), which promotes progressive corneal opacification. The provision of surgery to correct TT in the populations most severely affected is a major challenge for the global effort to eliminate trachoma blindness by the year 2020. Most attention has been paid to increasing the quantity of TT surgery performed, and large numbers of non-doctor operators have been trained to this end. Surgical audit by those performing TT surgery is not a routine part of any national trachoma control programme, and no effective mechanism exists for identifying surgeons experiencing poor outcomes. We propose a methodology for surgical audit at the level of the individual surgeon based on Lot Quality Assurance. A set number of patients operated on previously for upper eyelid TT are examined to detect the recurrence of TT. The number of recurrent cases found will lead to categorisation of the TT surgeon as either "high recurrence" or "low recurrence" with reasonable confidence. The threshold of unacceptability can be set by individual programmes according to previous local studies of recurrence rates or those from similar settings. Identification of surgeons delivering unacceptably high levels of recurrent TT will guide managers on the need for remedial intervention such as re-training. PMID:20881027
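The Lot Quality Assurance idea reduces to a binomial decision rule: examine a fixed lot of operated eyes and classify the surgeon as "high recurrence" if the number of recurrent cases exceeds a threshold chosen so that a surgeon with an acceptable recurrence rate is rarely misclassified. The sketch below illustrates that threshold choice with a hypothetical lot size, acceptable rate and error level; the paper does not prescribe these particular numbers.

```python
from scipy.stats import binom

def lqas_threshold(n, p_acceptable, alpha=0.05):
    """Smallest decision threshold d such that a surgeon with an acceptable
    recurrence rate exceeds d recurrences with probability <= alpha."""
    d = 0
    while binom.sf(d, n, p_acceptable) > alpha:   # P(X > d) under the acceptable rate
        d += 1
    return d

n = 40        # patients examined per surgeon (hypothetical lot size)
p0 = 0.10     # acceptable recurrence rate set by the programme (hypothetical)
d = lqas_threshold(n, p0)
print(f"classify as 'high recurrence' if more than {d} of {n} cases show recurrent TT")
```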
Prevalidation in pharmaceutical analysis. Part I. Fundamentals and critical discussion.
Grdinić, Vladimir; Vuković, Jadranka
2004-05-28
A complete prevalidation, as a basic prevalidation strategy for quality control and standardization of analytical procedures, was inaugurated. A fast and simple prevalidation methodology based on mathematical/statistical evaluation of a reduced number of experiments (N ≤ 24) was elaborated, and guidelines as well as algorithms were given in detail. This strategy has been produced for pharmaceutical applications and is dedicated to the preliminary evaluation of analytical methods for which the linear calibration model, which very often occurs in practice, could be the most appropriate fit to experimental data. The requirements presented in this paper should therefore help the analyst to design and perform the minimum number of prevalidation experiments needed to obtain all the information required to evaluate and demonstrate the reliability of the analytical procedure. The complete prevalidation process included characterization of analytical groups, checking of two limiting groups, testing of data homogeneity, establishment of analytical functions, recognition of outliers, evaluation of limiting values and extraction of prevalidation parameters. Moreover, a system of diagnosis for each prevalidation step was suggested. As an illustrative example demonstrating the feasibility of the prevalidation methodology, a Vis-spectrophotometric procedure for the determination of tannins with Folin-Ciocalteu's phenol reagent was selected from among a great number of analytical procedures. The favourable metrological characteristics of this analytical procedure, as prevalidation figures of merit, confirmed prevalidation as a valuable concept in the preliminary evaluation of the quality of analytical procedures.
An engineering approach to design of dextran microgels size fabricated by water/oil emulsification.
Salimi-Kenari, Hamed; Imani, Mohammad; Nodehi, Azizollah; Abedini, Hossein
2016-09-01
A correlation based on fluid mechanics has been investigated for the mean particle diameter of crosslinked dextran microgels (CDMs) prepared via a water/oil emulsification methodology conducted in a single stirred vessel. To this end, non-dimensional correlations were developed to predict the mean particle size of CDMs as a function of the Weber number, Reynolds number and viscosity number, similar to those introduced for liquid-liquid dispersions. Moreover, a Rosin-Rammler distribution function has been successfully applied to the microgel particle size distributions. The correlations were validated using experimentally obtained mean particle sizes for CDMs prepared at different stirring conditions. The validated correlation is especially applicable to medical and pharmaceutical applications where strict control of the mean particle size and size distribution of CDMs is essential.
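As an illustration of the quantities involved, the sketch below fits a Rosin-Rammler cumulative distribution to a hypothetical CDM size dataset and evaluates the stirred-vessel Weber and Reynolds numbers for assumed fluid and impeller values; the paper's correlation coefficients themselves are not reproduced.

```python
import numpy as np
from scipy.optimize import curve_fit

# Rosin-Rammler cumulative undersize distribution: F(d) = 1 - exp(-(d/dm)**m)
rosin_rammler = lambda d, dm, m: 1.0 - np.exp(-(d / dm) ** m)

# Hypothetical particle-size data for one CDM batch: size [um], cumulative fraction.
d_um = np.array([20, 40, 60, 80, 100, 140])
F = np.array([0.05, 0.22, 0.48, 0.70, 0.85, 0.97])
(dm, m), _ = curve_fit(rosin_rammler, d_um, F, p0=(70.0, 2.0))
print(f"characteristic size dm = {dm:.1f} um, spread parameter m = {m:.2f}")

# Dimensionless groups used in such correlations (illustrative stirred-vessel values).
rho_c, mu_c, sigma = 900.0, 0.05, 0.02   # oil density [kg/m^3], viscosity [Pa s], IFT [N/m]
N, D = 8.0, 0.05                         # stirring speed [1/s], impeller diameter [m]
We = rho_c * N ** 2 * D ** 3 / sigma     # Weber number
Re = rho_c * N * D ** 2 / mu_c           # impeller Reynolds number
print(f"We = {We:.0f}, Re = {Re:.0f}")
```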
Eyler, Lauren; Hubbard, Alan; Juillard, Catherine
2016-10-01
Low and middle-income countries (LMICs) and the world's poor bear a disproportionate share of the global burden of injury. Data regarding disparities in injury are vital to inform injury prevention and trauma systems strengthening interventions targeted towards vulnerable populations, but are limited in LMICs. We aim to facilitate injury disparities research by generating a standardized methodology for assessing economic status in resource-limited country trauma registries where complex metrics such as income, expenditures, and wealth index are infeasible to assess. To address this need, we developed a cluster analysis-based algorithm for generating simple population-specific metrics of economic status using nationally representative Demographic and Health Surveys (DHS) household assets data. For a limited number of variables, g, our algorithm performs weighted k-medoids clustering of the population using all combinations of g asset variables and selects the combination of variables and number of clusters that maximize average silhouette width (ASW). In simulated datasets containing both randomly distributed variables and "true" population clusters defined by correlated categorical variables, the algorithm selected the correct variable combination and appropriate cluster numbers unless variable correlation was very weak. When used with 2011 Cameroonian DHS data, our algorithm identified twenty economic clusters with ASW 0.80, indicating well-defined population clusters. This economic model for assessing health disparities will be used in the new Cameroonian six-hospital centralized trauma registry. By describing our standardized methodology and algorithm for generating economic clustering models, we aim to facilitate measurement of health disparities in other trauma registries in resource-limited countries. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
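A minimal, unweighted version of the selection loop can be sketched as follows: for every combination of g asset variables, cluster the households with k-medoids over a simple matching distance and keep the combination and cluster number with the highest average silhouette width. The data, distance choice and parameter ranges below are assumptions for illustration; the study's algorithm additionally uses DHS sampling weights.

```python
import numpy as np
from itertools import combinations
from sklearn.metrics import silhouette_score

def kmedoids(D, k, n_iter=50, seed=0):
    """Plain (unweighted) k-medoids on a precomputed distance matrix D."""
    rng = np.random.default_rng(seed)
    medoids = rng.choice(len(D), size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.flatnonzero(labels == j)
            if members.size:  # new medoid minimizes total distance to cluster members
                new_medoids[j] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return np.argmin(D[:, medoids], axis=1)

# Hypothetical binary household-asset matrix (rows = households, columns = assets).
assets = np.random.default_rng(2).integers(0, 2, size=(200, 6))

best, g = None, 3                                 # g asset variables per candidate model
for cols in combinations(range(assets.shape[1]), g):
    X = assets[:, cols]
    D = (X[:, None, :] != X[None, :, :]).mean(axis=2)  # simple matching distance
    for k in range(3, 8):
        labels = kmedoids(D, k)
        if len(set(labels)) < 2:
            continue
        asw = silhouette_score(D, labels, metric="precomputed")
        if best is None or asw > best[0]:
            best = (asw, cols, k)
print(f"best ASW = {best[0]:.2f} with variables {best[1]} and k = {best[2]}")
```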
Brahmi, Khaled; Bouguerra, Wided; Harbi, Soumaya; Elaloui, Elimame; Loungou, Mouna; Hamrouni, Béchir
2018-02-15
This laboratory study investigated the parameter efficiency of a new technology, ballasted electro-flocculation (BEF), using aluminum (Al) electrodes to remove cadmium and zinc from industrial mining wastewater (MWW). The principle of the BEF process is based on the combined use of micro-sand and polymer to increase the weight of the flocs; the resulting increase in settling rate radically changes the electrocoagulation-electroflocculation settling methodology. Based on the examination of the operating parameters one by one, the best removal percentage was obtained at a current intensity of 2 A, a flow rate of 20 L/h, a micro-sand dose of 6 g/L, a polyethyleneimine (PEI) polymer dose of 100 mg, a contact time of 30 min, a stirring speed of 50 rpm, a monopolar electrode configuration, and 10 electrodes. The results showed that the flow rate and the current density have a preponderant effect on the variability of the quality of the settled water. In comparison, filterability was found to be more sensitive to the number of electrodes, the micro-sand dosage and the current density; it depended on the ratio of micro-sand to PEI polymer dosage and improved when this ratio increased. Response surface methodology was applied to evaluate the main effects and interactions among stirring speed, polymer dose, current intensity, and number of electrodes. The removal of Cd and Zn from industrial MWW was achieved at a very low cost of 0.1 TND/m³ (equivalent to 0.04 €/m³). The investigation suggests that the BEF process is a highly cost-effective wastewater treatment method compared to Actiflo™ and electrocoagulation. Copyright © 2017 Elsevier B.V. All rights reserved.
What are the Starting Points? Evaluating Base-Year Assumptions in the Asian Modeling Exercise
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chaturvedi, Vaibhav; Waldhoff, Stephanie; Clarke, Leon E.
2012-12-01
A common feature of model inter-comparison efforts is that the base year numbers for important parameters such as population and GDP can differ substantially across models. This paper explores the sources and implications of this variation in Asian countries across the models participating in the Asian Modeling Exercise (AME). Because the models do not all have a common base year, each team was required to provide data for 2005 for comparison purposes. This paper compares the year 2005 information for different models, noting the degree of variation in important parameters, including population, GDP, primary energy, electricity, and CO2 emissions. It then explores the difference in these key parameters across different sources of base-year information. The analysis confirms that the sources provide different values for many key parameters. This variation across data sources and additional reasons why models might provide different base-year numbers, including differences in regional definitions, differences in model base year, and differences in GDP transformation methodologies, are then discussed in the context of the AME scenarios. Finally, the paper explores the implications of base-year variation on long-term model results.
Extraction of breathing pattern using temperature sensor based on Arduino board
NASA Astrophysics Data System (ADS)
Patel, Rajesh; Sengottuvel, S.; Gireesan, K.; Janawadkar, M. P.; Radhakrishnan, T. S.
2015-06-01
Most of the basic functions of the human body are assessed by measuring different parameters such as temperature, pulse activity and blood pressure. The respiration rate is the number of inhalations a person takes per minute and needs to be quantitatively assessed because it modulates other measurements, such as SQUID-based magnetocardiography (MCG), by bringing the chest closer to or away from the sensor array located inside a stationary liquid-helium cryostat. The respiration rate is usually measured when a person is at rest and simply involves counting the number of inhalations for one minute. This paper aims at the development of a suitable methodology for the measurement of respiration rate with the help of a temperature sensor that monitors the very slight change in temperature near the nostril during inhalation and exhalation. The design and development of the proposed system are presented, along with typical experimental results.
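Once the temperature trace is available, the respiration rate follows from counting the peaks produced by exhalations. The sketch below runs on a synthetic trace rather than live serial data from the Arduino board; the sampling rate, oscillation amplitude and peak-detection settings are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic 60 s temperature trace sampled at 10 Hz: a ~0.5 degC oscillation around 31 degC
# (exhaled air warms the sensor, inhaled air cools it) plus noise. In practice the trace
# would be read from the Arduino over a serial connection.
fs, duration, breaths_per_min = 10.0, 60.0, 15
t = np.arange(0, duration, 1.0 / fs)
temp = (31.0 + 0.5 * np.sin(2 * np.pi * breaths_per_min / 60.0 * t)
        + 0.05 * np.random.default_rng(3).standard_normal(t.size))

# Each exhalation produces one temperature peak; count peaks over the record.
peaks, _ = find_peaks(temp, distance=fs * 1.5, prominence=0.2)  # >= 1.5 s between breaths
rate = len(peaks) * 60.0 / duration
print(f"estimated respiration rate: {rate:.0f} breaths/min")
```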
Ostrovnaya, Irina; Seshan, Venkatraman E; Olshen, Adam B; Begg, Colin B
2011-06-15
If a cancer patient develops multiple tumors, it is sometimes impossible to determine whether these tumors are independent or clonal based solely on pathological characteristics. Investigators have studied how to improve this diagnostic challenge by comparing the presence of loss of heterozygosity (LOH) at selected genetic locations of tumor samples, or by comparing genomewide copy number array profiles. We have previously developed statistical methodology to compare such genomic profiles for an evidence of clonality. We assembled the software for these tests in a new R package called 'Clonality'. For LOH profiles, the package contains significance tests. The analysis of copy number profiles includes a likelihood ratio statistic and reference distribution, as well as an option to produce various plots that summarize the results. Bioconductor (http://bioconductor.org/packages/release/bioc/html/Clonality.html) and http://www.mskcc.org/mskcc/html/13287.cfm.
Uncertainty quantification in Eulerian-Lagrangian models for particle-laden flows
NASA Astrophysics Data System (ADS)
Fountoulakis, Vasileios; Jacobs, Gustaaf; Udaykumar, Hs
2017-11-01
A common approach to ameliorate the computational burden in simulations of particle-laden flows is to use a point-particle based Eulerian-Lagrangian model, which traces individual particles in their Lagrangian frame and models particles as mathematical points. The particle motion is determined by Stokes drag law, which is empirically corrected for Reynolds number, Mach number and other parameters. The empirical corrections are subject to uncertainty. Treating them as random variables renders the coupled system of PDEs and ODEs stochastic. An approach to quantify the propagation of this parametric uncertainty to the particle solution variables is proposed. The approach is based on averaging of the governing equations and allows for estimation of the first moments of the quantities of interest. We demonstrate the feasibility of our proposed methodology of uncertainty quantification of particle-laden flows on one-dimensional linear and nonlinear Eulerian-Lagrangian systems. This research is supported by AFOSR under Grant FA9550-16-1-0008.
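A concrete example of such an empirically corrected drag law is sketched below using the Schiller-Naumann correction, one common choice (the abstract does not specify which correction the authors use); treating correction coefficients of this kind as random variables is what renders the coupled system stochastic.

```python
def drag_correction(Re):
    """Finite-Reynolds-number correction to Stokes drag (Schiller-Naumann form);
    the coefficients 0.15 and 0.687 are the empirical, uncertain parameters."""
    return 1.0 + 0.15 * Re ** 0.687

def particle_acceleration(u_fluid, v_particle, d_p, rho_p, rho_f, mu_f):
    """Point-particle acceleration from the corrected Stokes drag law."""
    rel = u_fluid - v_particle
    Re = rho_f * abs(rel) * d_p / mu_f + 1e-12        # particle Reynolds number
    tau_p = rho_p * d_p ** 2 / (18.0 * mu_f)          # Stokes response time
    return drag_correction(Re) * rel / tau_p

# Illustrative values: a 50-micron glass bead lagging a 10 m/s air stream.
print(particle_acceleration(10.0, 2.0, 50e-6, 2500.0, 1.2, 1.8e-5))
```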
Identifying treatment effect heterogeneity in clinical trials using subpopulations of events: STEPP.
Lazar, Ann A; Bonetti, Marco; Cole, Bernard F; Yip, Wai-Ki; Gelber, Richard D
2016-04-01
Investigators conducting randomized clinical trials often explore treatment effect heterogeneity to assess whether treatment efficacy varies according to patient characteristics. Identifying heterogeneity is central to making informed personalized healthcare decisions. Treatment effect heterogeneity can be investigated using subpopulation treatment effect pattern plot (STEPP), a non-parametric graphical approach that constructs overlapping patient subpopulations with varying values of a characteristic. Procedures for statistical testing using subpopulation treatment effect pattern plot when the endpoint of interest is survival remain an area of active investigation. A STEPP analysis was used to explore patterns of absolute and relative treatment effects for varying levels of a breast cancer biomarker, Ki-67, in the phase III Breast International Group 1-98 randomized clinical trial, comparing letrozole to tamoxifen as adjuvant therapy for postmenopausal women with hormone receptor-positive breast cancer. Absolute treatment effects were measured by differences in 4-year cumulative incidence of breast cancer recurrence, while relative effects were measured by the subdistribution hazard ratio in the presence of competing risks using O-E (observed-minus-expected) methodology, an intuitive non-parametric method. While estimation of hazard ratio values based on O-E methodology has been shown, a similar development for the subdistribution hazard ratio has not. Furthermore, we observed that the subpopulation treatment effect pattern plot analysis may not produce results, even with 100 patients within each subpopulation. After further investigation through simulation studies, we observed inflation of the type I error rate of the traditional test statistic and sometimes singular variance-covariance matrix estimates that may lead to results not being produced. This is due to the lack of sufficient number of events within the subpopulations, which we refer to as instability of the subpopulation treatment effect pattern plot analysis. We introduce methodology designed to improve stability of the subpopulation treatment effect pattern plot analysis and generalize O-E methodology to the competing risks setting. Simulation studies were designed to assess the type I error rate of the tests for a variety of treatment effect measures, including subdistribution hazard ratio based on O-E estimation. This subpopulation treatment effect pattern plot methodology and standard regression modeling were used to evaluate heterogeneity of Ki-67 in the Breast International Group 1-98 randomized clinical trial. We introduce methodology that generalizes O-E methodology to the competing risks setting and that improves stability of the STEPP analysis by pre-specifying the number of events across subpopulations while controlling the type I error rate. The subpopulation treatment effect pattern plot analysis of the Breast International Group 1-98 randomized clinical trial showed that patients with high Ki-67 percentages may benefit most from letrozole, while heterogeneity was not detected using standard regression modeling. The STEPP methodology can be used to study complex patterns of treatment effect heterogeneity, as illustrated in the Breast International Group 1-98 randomized clinical trial. For the subpopulation treatment effect pattern plot analysis, we recommend a minimum of 20 events within each subpopulation. © The Author(s) 2015.
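The core construction behind a STEPP analysis, overlapping subpopulations formed by sliding a window along the ordered biomarker, can be sketched compactly. The example below uses synthetic data and a simple binary-endpoint risk difference in place of the competing-risks cumulative incidence and O-E estimation used in the paper; the window and step sizes are illustrative.

```python
import numpy as np

def stepp_subpopulations(marker, window=150, step=75):
    """Overlapping subpopulations ordered by a continuous covariate, as in a
    sliding-window STEPP construction; sizes here are illustrative only."""
    order = np.argsort(marker)
    return [order[i:i + window] for i in range(0, len(order) - window + 1, step)]

# Synthetic trial: Ki-67-like marker, treatment arm, binary 4-year recurrence indicator.
rng = np.random.default_rng(4)
n = 600
ki67 = rng.uniform(0, 60, n)
arm = rng.integers(0, 2, n)                    # 0 = control-like, 1 = experimental-like
p_event = 0.25 - 0.1 * arm * (ki67 / 60)       # benefit grows with the marker
event = rng.random(n) < p_event

for idx in stepp_subpopulations(ki67):
    diff = event[idx][arm[idx] == 0].mean() - event[idx][arm[idx] == 1].mean()
    print(f"median Ki-67 {np.median(ki67[idx]):5.1f}: absolute risk difference {diff:+.3f}")
```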
NASA Technical Reports Server (NTRS)
2005-01-01
A number of titanium matrix composite (TMC) systems are currently being investigated for high-temperature air frame and propulsion system applications. As a result, numerous computational methodologies for predicting both deformation and life for this class of materials are under development. An integral part of these methodologies is an accurate and computationally efficient constitutive model for the metallic matrix constituent. Furthermore, because these systems are designed to operate at elevated temperatures, the required constitutive models must account for both time-dependent and time-independent deformations. To accomplish this, the NASA Lewis Research Center is employing a recently developed, complete, potential-based framework. This framework, which utilizes internal state variables, was put forth for the derivation of reversible and irreversible constitutive equations. The framework, and consequently the resulting constitutive model, is termed complete because the existence of the total (integrated) form of the Gibbs complementary free energy and complementary dissipation potentials are assumed a priori. The specific forms selected here for both the Gibbs and complementary dissipation potentials result in a fully associative, multiaxial, nonisothermal, unified viscoplastic model with nonlinear kinematic hardening. This model constitutes one of many models in the Generalized Viscoplasticity with Potential Structure (GVIPS) class of inelastic constitutive equations.
ERIC Educational Resources Information Center
Rynders, John E.; And Others
1978-01-01
For many years, the educational capabilities of Down's syndrome persons have been underestimated because a large number of studies purporting to give an accurate picture of Down's syndrome persons' developmental capabilities have had serious methodological flaws. (Author)
Lane Marking Detection and Reconstruction with Line-Scan Imaging Data.
Li, Lin; Luo, Wenting; Wang, Kelvin C P
2018-05-20
Lane marking detection and localization are crucial for autonomous driving and lane-based pavement surveys. Numerous studies have been done to detect and locate lane markings for advanced driver assistance systems, in which image data are usually captured by vision-based cameras. However, a limited number of studies have been done to identify lane markings using high-resolution laser images for road condition evaluation. In this study, the laser images are acquired with a digital highway data vehicle (DHDV). Subsequently, a novel methodology is presented for automated lane marking identification and reconstruction, implemented in four phases: (1) binarization of the laser images with a new threshold method (multi-box segmentation based threshold method); (2) determination of candidate lane markings with closing operations and a marching square algorithm; (3) identification of true lane markings by eliminating false positives (FPs) using a linear support vector machine method; and (4) reconstruction of the damaged and dashed lane marking segments to form a continuous lane marking based on geometry features such as adjacent lane marking location and lane width. Finally, a case study is given to validate the effectiveness of the novel methodology. The findings indicate the new strategy is robust in image binarization and lane marking localization. This study would be beneficial for road lane-based pavement condition evaluation such as lane-based rutting measurement and crack classification.
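A rough sketch of the first two phases is given below on a synthetic laser-intensity image, with a global Otsu threshold standing in for the multi-box segmentation threshold and contour tracing standing in for the marching-squares step; the SVM scoring and lane-reconstruction phases are only indicated in comments.

```python
import numpy as np
import cv2

# Synthetic laser-intensity image: dark pavement with one bright vertical marking stripe.
img = np.clip(np.random.default_rng(5).normal(60, 10, (400, 600)), 0, 255).astype(np.uint8)
img[:, 280:320] = 200                                   # hypothetical lane marking

# Phase 1 (stand-in): Otsu thresholding in place of the multi-box segmentation threshold.
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Phase 2: morphological closing to merge fragmented marking pixels, then candidate
# extraction by contour tracing (in place of the marching-squares step).
kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (15, 15))
closed = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
contours, _ = cv2.findContours(closed, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

# Phases 3-4 would score each candidate (area, aspect ratio, intensity) with a linear SVM
# and stitch accepted segments into continuous markings using lane-geometry constraints.
for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    print(f"candidate marking: x = {x}, width = {w}, height = {h}")
```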
NASA Astrophysics Data System (ADS)
Klein, Iris M.; Rousseau, Alain N.; Frigon, Anne; Freudiger, Daphné; Gagnon, Patrick
2016-06-01
Probable maximum snow accumulation (PMSA) is one of the key variables used to estimate the spring probable maximum flood (PMF). A robust methodology for evaluating the PMSA is imperative so the ensuing spring PMF is a reasonable estimation. This is of particular importance in times of climate change (CC) since it is known that solid precipitation in Nordic landscapes will in all likelihood change over the next century. In this paper, a PMSA methodology based on simulated data from regional climate models is developed. Moisture maximization represents the core concept of the proposed methodology; precipitable water being the key variable. Results of stationarity tests indicate that CC will affect the monthly maximum precipitable water and, thus, the ensuing ratio to maximize important snowfall events. Therefore, a non-stationary approach is used to describe the monthly maximum precipitable water. Outputs from three simulations produced by the Canadian Regional Climate Model were used to give first estimates of potential PMSA changes for southern Quebec, Canada. A sensitivity analysis of the computed PMSA was performed with respect to the number of time-steps used (so-called snowstorm duration) and the threshold for a snowstorm to be maximized or not. The developed methodology is robust and a powerful tool to estimate the relative change of the PMSA. Absolute results are in the same order of magnitude as those obtained with the traditional method and observed data; but are also found to depend strongly on the climate projection used and show spatial variability.
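The moisture-maximization core of the methodology amounts to scaling each snowstorm by the ratio of the monthly maximum precipitable water to the storm's precipitable water and taking the largest maximized event. The sketch below uses hypothetical events and an assumed cap on the maximization ratio; it does not reproduce the non-stationary treatment of precipitable water described in the paper.

```python
import numpy as np

def maximized_snowfall(snowfall_mm, pw_storm_mm, pw_max_mm, ratio_cap=2.0):
    """Moisture maximization of a snowstorm: scale the event snowfall by the ratio of the
    monthly maximum precipitable water to the storm precipitable water. The cap on the
    ratio is a common practical safeguard, not a value taken from the paper."""
    ratio = np.minimum(pw_max_mm / pw_storm_mm, ratio_cap)
    return snowfall_mm * ratio

# Hypothetical winter events for one grid cell: (snowfall, storm PW, monthly max PW), all mm.
events = np.array([[35.0, 8.0, 12.0],
                   [50.0, 10.0, 13.0],
                   [28.0, 6.0, 11.5]])
pmsa = maximized_snowfall(events[:, 0], events[:, 1], events[:, 2]).max()
print(f"PMSA estimate for this cell: {pmsa:.1f} mm (water equivalent)")
```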
Landry, Michel D; Hack, Laurita M; Coulson, Elizabeth; Freburger, Janet; Johnson, Michael P; Katz, Richard; Kerwin, Joanne; Smith, Megan H; Wessman, Henry C Bud; Venskus, Diana G; Sinnott, Patricia L; Goldstein, Marc
2016-01-01
Health human resources continue to emerge as a critical health policy issue across the United States. The purpose of this study was to develop a strategy for modeling future workforce projections to serve as a basis for analyzing annual supply of and demand for physical therapists across the United States into 2020. A traditional stock-and-flow methodology or model was developed and populated with publicly available data to produce estimates of supply and demand for physical therapists by 2020. Supply was determined by adding the estimated number of physical therapists and the approximation of new graduates to the number of physical therapists who immigrated, minus US graduates who never passed the licensure examination, and an estimated attrition rate in any given year. Demand was determined by using projected US population with health care insurance multiplied by a demand ratio in any given year. The difference between projected supply and demand represented a shortage or surplus of physical therapists. Three separate projection models were developed based on best available data in the years 2011, 2012, and 2013, respectively. Based on these projections, demand for physical therapists in the United States outstrips supply under most assumptions. Workforce projection methodology research is based on assumptions using imperfect data; therefore, the results must be interpreted in terms of overall trends rather than as precise actuarial data-generated absolute numbers from specified forecasting. Outcomes of this projection study provide a foundation for discussion and debate regarding the most effective and efficient ways to influence supply-side variables so as to position physical therapists to meet current and future population demand. Attrition rates or permanent exits out of the profession can have important supply-side effects and appear to have an effect on predicting future shortage or surplus of physical therapists. © 2016 American Physical Therapy Association.
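The stock-and-flow bookkeeping described here is simple enough to sketch directly; all figures in the example (initial stock, graduate and immigration flows, attrition rate, insured population and demand ratio) are hypothetical placeholders rather than the study's inputs.

```python
def project_supply_demand(start_year, end_year, supply0, graduates, immigrants,
                          never_licensed, attrition_rate, insured_population, demand_ratio):
    """Annual stock-and-flow projection: supply(t+1) = supply(t)*(1 - attrition) + graduates
    + immigrants - graduates who never pass licensure; demand(t) = insured population *
    demand ratio. The supply-demand gap indicates a projected shortage or surplus."""
    rows, supply = [], supply0
    for year in range(start_year, end_year + 1):
        demand = insured_population[year] * demand_ratio
        rows.append((year, round(supply), round(demand), round(supply - demand)))
        supply = supply * (1 - attrition_rate) + graduates + immigrants - never_licensed
    return rows

insured = {y: 255e6 + 2e6 * (y - 2014) for y in range(2014, 2021)}   # hypothetical
for year, s, d, gap in project_supply_demand(2014, 2020, 180_000, 9_000, 1_500,
                                             800, 0.03, insured, 8.5e-4):
    print(f"{year}: supply {s:,}  demand {d:,}  surplus/shortage {gap:,}")
```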
Energetics of ligand-receptor binding affinity on endothelial cells: An in vitro model.
Fotticchia, Iolanda; Guarnieri, Daniela; Fotticchia, Teresa; Falanga, Andrea Patrizia; Vecchione, Raffaele; Giancola, Concetta; Netti, Paolo Antonio
2016-08-01
Targeted therapies represent a challenge in modern medicine. In this context, we propose a rapid and reliable methodology based on Isothermal Titration Calorimetry (ITC) coupled with confluent cell layers cultured around biocompatible templating microparticles to quantify the number of overexpressed receptors on the cell membrane and study the energetics of receptor-ligand binding in near-physiological conditions. In the in vitro model proposed here, we used the bEnd3 cell line as brain endothelial cells to mimic the blood brain barrier (BBB), cultured on dextran microbeads ranging from 67 μm to 80 μm in size (Cytodex), and the primary human umbilical vein cells (HUVEC) for comparison. The revealed affinity between transferrin (Tf) and the transferrin receptor (TfR) in both systems is very high, with Kd values on the order of nM. Conversely, the value of TfRs/cell reveals a 100-fold increase in the number of TfRs per bEnd3 cell compared to HUVEC cells. The presented methodology can represent a novel and helpful strategy to identify targets, to guide drug design and to selectively deliver therapeutics that can cross biological barriers such as the blood brain barrier. Copyright © 2016 Elsevier B.V. All rights reserved.
A new approach to road accident rescue.
Morales, Alejandro; González-Aguilera, Diego; López, Alfonso I; Gutiérrez, Miguel A
2016-01-01
This article develops and validates a new methodology and tool for rescue assistance in traffic accidents, with the aim of improving the efficiency and safety of evacuating people and reducing the number of victims in road accidents. Different tests supported by professionals and experts have been designed under different circumstances and with different categories of damaged vehicles coming from real accidents and simulated trapped victims, in order to calibrate and refine the proposed methodology and tool. To validate this new approach, a tool called App_Rescue has been developed. This tool is based on the use of a computer system that allows efficient access to the technical information of the vehicle and the sanitary information of its usual passengers. The time spent during rescue using the standard protocol and the proposed method was compared. This rescue assistance system makes vital information accessible to post-trauma care services, improving the effectiveness of interventions by the emergency services, reducing the rescue time and therefore minimizing the consequences involved and the number of victims. This could often mean saving lives. In the different simulated rescue operations, the rescue time was reduced by an average of 14%.
NASA Technical Reports Server (NTRS)
Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark
2007-01-01
The Plug-in Image Component Widget (PICWidget) is a software component for building digital imaging applications. The component is part of a methodology described in GIS Methodology for Planning Planetary-Rover Operations (NPO-41812), which appears elsewhere in this issue of NASA Tech Briefs. Planetary rover missions return a large number and wide variety of image data products that vary in complexity in many ways. Supported by a powerful, flexible image-data-processing pipeline, the PICWidget can process and render many types of imagery, including (but not limited to) thumbnail, subframed, downsampled, stereoscopic, and mosaic images; images coregistered with orbital data; and synthetic red/green/blue images. The PICWidget is capable of efficiently rendering images from data representing many more pixels than are available at a computer workstation where the images are to be displayed. The PICWidget is implemented as an Eclipse plug-in using the Standard Widget Toolkit, which provides a straightforward interface for re-use of the PICWidget in any number of application programs built upon the Eclipse application framework. Because the PICWidget is tile-based and performs aggressive tile caching, it has the flexibility to perform faster or slower, depending on whether more or less memory is available.
Counting the cost: estimating the economic benefit of pedophile treatment programs.
Shanahan, M; Donato, R
2001-04-01
The principal objective of this paper is to identify the economic costs and benefits of pedophile treatment programs, incorporating both the tangible and intangible costs of sexual abuse to victims. Cost estimates of cognitive behavioral therapy programs in Australian prisons are compared against the tangible and intangible costs to victims of being sexually abused. Estimates are prepared that take into account a number of problematic issues. These include the range of possible recidivism rates for treatment programs; the uncertainty surrounding the number of child sexual molestation offences committed by recidivists; and the methodological problems associated with estimating the intangible costs of sexual abuse to victims. Despite the variation in parameter estimates that affect the cost-benefit analysis of pedophile treatment programs, it is found that the potential economic costs of child sexual abuse are substantial and that the economic benefits to be derived from appropriate and effective treatment programs are high. Based on a reasonable set of parameter estimates, in-prison cognitive therapy treatment programs for pedophiles are likely to be of net benefit to society. Despite this, a critical area of future research must include further methodological developments in estimating the quantitative impact of child sexual abuse in the community.
[Evidence-based Chinese medicine:theory and practice].
Zhang, Jun-Hua; Li, You-Ping; Zhang, Bo-Li
2018-01-01
The introduction and popularization of evidence-based medicine has opened up a new research field in the clinical efficacy evaluation of traditional Chinese medicine (TCM), produced new research ideas and methods, and promoted the progress of TCM clinical research. After about 20 years of assiduous study and earnest practice, evidence-based evaluation methods and techniques that conform to the characteristics of TCM theory and practice have been developing continuously. Evidence-based Chinese medicine (EBCM) has gradually formed and become an important branch of evidence-based medicine. The basic concept of EBCM: it is an applied discipline that follows the theory and methodology of evidence-based medicine to collect, evaluate, produce, and transform evidence on the effectiveness, safety, and economy of TCM, to reveal the features and regular patterns by which TCM takes effect, and to guide the development of clinical guidelines, clinical pathways, and health decisions. The achievements of EBCM development include the following: secondary studies, mainly systematic reviews/meta-analyses, have been extensively carried out; clinical efficacy studies, mainly randomized controlled trials, have grown rapidly; clinical safety evaluations based on real-world studies have been conducted; methodological research focused on study quality control has deepened gradually; internationalization research on reporting specifications has achieved some breakthroughs; standardization research based on treatment specifications has been strengthened; and interdisciplinary research teams and talent have steadily increased. A number of high-quality research findings have been published in well-known international journals; the clinical efficacy and safety evidence for TCM has increased; the level of rational clinical use of TCM has improved; and a large number of Chinese patent medicines with large markets have been cultivated. The future missions of EBCM fall into four categories (scientific research, methodology and standards, platform construction, and personnel training) comprising nine tasks. ① Carry out systematic reviews to systematically collect clinical trial reports of TCM and establish a database of TCM clinical evidence; ② carry out evidence transformation research to lay the foundation for developing TCM clinical diagnosis and treatment guidelines and clinical pathways, for screening essential drug and medical insurance lists, and for TCM-related policy-making; ③ conduct research to evaluate the advantages and effective patterns of TCM and form an evidence chain of TCM efficacy; ④ carry out research on the safety evaluation of TCM to provide evidence supporting its rational and safe use in clinical practice; ⑤ conduct research on EBCM methodology and provide methods for developing high-quality evidence; ⑥ carry out research to develop TCM standards and norms, forming methods, standards, specifications, and technical systems; ⑦ establish a data management platform for evidence-based evaluation of TCM and promote data sharing; ⑧ build an international academic exchange platform to promote international cooperation and mutual recognition of EBCM research; ⑨ carry out education and popularization of evidence-based evaluation methods, and train undergraduate students, graduate students, clinical healthcare providers, and practitioners of TCM.
The development of EBCM has not only promoted the transformation of the clinical research and decision-making modes of TCM and contributed to the modernization and internationalization of TCM, but has also enriched the connotation of evidence-based medicine. Copyright © by the Chinese Pharmaceutical Association.
Selection of organisms for the co-evolution-based study of protein interactions.
Herman, Dorota; Ochoa, David; Juan, David; Lopez, Daniel; Valencia, Alfonso; Pazos, Florencio
2011-09-12
The prediction and study of protein interactions and functional relationships based on the similarity of phylogenetic trees, exemplified by the mirrortree and related methodologies, is widely used. Although a dependence between the performance of these methods and the set of organisms used to build the trees had been suspected, it had not been assessed exhaustively, and, in general, previous works used as many organisms as possible. In this work we assess the effect of using different sets of organisms (chosen according to various phylogenetic criteria) on the performance of this methodology in detecting protein interactions of different natures. We show that the performance of three mirrortree-related methodologies depends on the set of organisms used to build the trees and is not always related to the number of organisms in a simple way. Certain subsets of organisms appear more suitable for predicting certain types of interactions. This relationship between the type of interaction and the optimal set of organisms for detecting it makes sense in light of the phylogenetic distribution of the organisms and the nature of the interactions. To obtain optimal performance when predicting protein interactions, it is recommended to use different sets of organisms depending on the available computational resources and data, as well as the type of interactions of interest.
2014-01-01
Background The advent of the human genome sequencing project has led to a spurt in the number of protein sequences in databanks. The success of structure-based drug discovery hinges critically on the availability of structures. Despite significant progress in experimental protein structure determination, the sequence-structure gap continues to widen. Data-driven, homology-based computational methods have proved successful in predicting tertiary structures for sequences sharing medium to high sequence similarities. For query sequences with lower similarity, advanced homology/ab initio hybrid approaches are being explored to solve the structure prediction problem. Here we describe Bhageerath-H, a homology/ab initio hybrid software server for predicting protein tertiary structures, with advancing drug design as one of its goals. Results The Bhageerath-H web server was validated on 75 CASP10 targets, showing TM-scores ≥0.5 in 91% of cases and Cα RMSDs ≤5 Å from the native structure in 58% of targets, well above the CASP10 watermark. Comparison with some leading servers demonstrated the uniqueness of the hybrid methodology in effectively sampling conformational space, scoring the best decoys, and refining low-resolution models to medium and high resolution. Conclusion The Bhageerath-H methodology is available to the scientific community as a freely accessible web server. The methodology is fielded in the ongoing CASP11 experiment. PMID:25521245
Selection of organisms for the co-evolution-based study of protein interactions
2011-01-01
Background The prediction and study of protein interactions and functional relationships based on the similarity of phylogenetic trees, exemplified by the mirrortree and related methodologies, is widely used. Although a dependence between the performance of these methods and the set of organisms used to build the trees had been suspected, it had not been assessed exhaustively, and, in general, previous works used as many organisms as possible. In this work we assess the effect of using different sets of organisms (chosen according to various phylogenetic criteria) on the performance of this methodology in detecting protein interactions of different natures. Results We show that the performance of three mirrortree-related methodologies depends on the set of organisms used to build the trees and is not always related to the number of organisms in a simple way. Certain subsets of organisms appear more suitable for predicting certain types of interactions. This relationship between the type of interaction and the optimal set of organisms for detecting it makes sense in light of the phylogenetic distribution of the organisms and the nature of the interactions. Conclusions To obtain optimal performance when predicting protein interactions, it is recommended to use different sets of organisms depending on the available computational resources and data, as well as the type of interactions of interest. PMID:21910884
Mechatronics by Analogy and Application to Legged Locomotion
NASA Astrophysics Data System (ADS)
Ragusila, Victor
A new design methodology for mechatronic systems, dubbed Mechatronics by Analogy (MbA), is introduced and applied to designing a leg mechanism. The new methodology argues that by establishing a similarity relation between a complex system and a number of simpler models it is possible to design the former using the analysis and synthesis means developed for the latter. The methodology provides a framework for concurrent engineering of complex systems while maintaining the transparency of the system behaviour by making formal analogies between the system and others with more tractable dynamics. The application of the MbA methodology to the design of a monopod robot leg, called the Linkage Leg, is also studied. A series of simulations shows that the dynamic behaviour of the Linkage Leg is similar to that of a combination of a double pendulum and a spring-loaded inverted pendulum, based on which the system kinematic, dynamic, and control parameters can be designed concurrently. The first stage of Mechatronics by Analogy is a method of extracting significant features of system dynamics through simpler models. The goal is to determine a set of simpler mechanisms with dynamic behaviour similar to that of the original system in various phases of its motion. A modular bond-graph representation of the system is determined and subsequently simplified using two simplification algorithms. The first algorithm determines the relevant dynamic elements of the system for each phase of motion, and the second algorithm finds the simple mechanism described by the remaining dynamic elements. In addition to greatly simplifying the controller for the system, using simpler mechanisms with similar behaviour provides greater insight into the dynamics of the system. This is seen in the second stage of the new methodology, which concurrently optimizes the simpler mechanisms together with a control system based on their dynamics. Once the optimal configuration of the simpler system is determined, the original mechanism is optimized such that its dynamic behaviour is analogous. It is shown that, if this analogy is achieved, the control system designed based on the simpler mechanisms can be implemented directly on the more complex system, and their dynamic behaviours are close enough for the system performance to be effectively the same. Finally, it is shown that, for the employed objective of fast legged locomotion, the proposed methodology achieves a better design than Reduction-by-Feedback, a competing methodology that uses control layers to simplify the dynamics of the system.
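As an illustration of the kind of simpler analog model referred to above, the sketch below integrates the stance-phase dynamics of a planar spring-loaded inverted pendulum (SLIP). The mass, stiffness, and initial conditions are arbitrary placeholders, and the code is not taken from the thesis itself.

```python
# Minimal sketch: stance-phase dynamics of a planar spring-loaded inverted pendulum (SLIP),
# one of the simple models the leg's behaviour is compared against. Parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

m, k, r0, g = 80.0, 20000.0, 1.0, 9.81  # mass [kg], spring stiffness [N/m], rest length [m], gravity

def slip_stance(t, y):
    # State y = [r, r_dot, theta, theta_dot]; theta is measured from the vertical at the foot.
    r, rdot, th, thdot = y
    rddot  = r * thdot**2 - g * np.cos(th) + (k / m) * (r0 - r)
    thddot = (g * np.sin(th) - 2.0 * rdot * thdot) / r
    return [rdot, rddot, thdot, thddot]

# Touch-down at rest length, leg angled 0.3 rad from vertical, moving downward and forward.
y0 = [r0, -0.5, 0.3, -2.0]
sol = solve_ivp(slip_stance, (0.0, 0.3), y0, max_step=1e-3)
print(f"leg length at t=0.3 s: {sol.y[0, -1]:.3f} m, leg angle: {sol.y[2, -1]:.3f} rad")
```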
Zhang, Melvyn Wb; Tsang, Tammy; Cheow, Enquan; Ho, Cyrus Sh; Yeong, Ng Beng; Ho, Roger Cm
2014-11-11
The use of mobile phones, and specifically smartphones, in the last decade has become more and more prevalent. The latest mobile phones are equipped with comprehensive features that can be used in health care, such as providing rapid access to up-to-date evidence-based information, provision of instant communications, and improvements in organization. The estimated number of health care apps for mobile phones is increasing tremendously, but previous research has highlighted the lack of critical appraisal of new apps. This lack of appraisal of apps has largely been due to the lack of clinicians with technical knowledge of how to create an evidence-based app. We discuss two freely available methodologies for developing Web-based mobile phone apps: a website builder and an app builder. With these, users can program not just a Web-based app, but also integrate multimedia features within their app, without needing to know any programming language. We present techniques for creating a mobile Web-based app using two well-established online mobile app websites. We illustrate how to integrate text-based content within the app, as well as integration of interactive videos and rich site summary (RSS) feed information. We will also briefly discuss how to integrate a simple questionnaire survey into the mobile-based app. A questionnaire survey was administered to students to collate their perceptions towards the app. These two methodologies for developing apps have been used to convert an online electronic psychiatry textbook into two Web-based mobile phone apps for medical students rotating through psychiatry in Singapore. Since the inception of our mobile Web-based app, a total of 21,991 unique users have used the mobile app and online portal provided by WordPress, and another 717 users have accessed the app via a Web-based link. The user perspective survey results (n=185) showed that a high proportion of students valued the textbook and objective structured clinical examination videos featured in the app. A high proportion of students concurred that a self-designed mobile phone app would be helpful for psychiatry education. These methodologies can enable busy clinicians to develop simple mobile Web-based apps for academic, educational, and research purposes, without any prior knowledge of programming. This will be beneficial for both clinicians and users at large, as there will then be more evidence-based mobile phone apps, or at least apps that have been appraised by a clinician.
Tsang, Tammy; Cheow, Enquan; Ho, Cyrus SH; Yeong, Ng Beng; Ho, Roger CM
2014-01-01
Background The use of mobile phones, and specifically smartphones, in the last decade has become more and more prevalent. The latest mobile phones are equipped with comprehensive features that can be used in health care, such as providing rapid access to up-to-date evidence-based information, provision of instant communications, and improvements in organization. The estimated number of health care apps for mobile phones is increasing tremendously, but previous research has highlighted the lack of critical appraisal of new apps. This lack of appraisal of apps has largely been due to the lack of clinicians with technical knowledge of how to create an evidence-based app. Objective We discuss two freely available methodologies for developing Web-based mobile phone apps: a website builder and an app builder. With these, users can program not just a Web-based app, but also integrate multimedia features within their app, without needing to know any programming language. Methods We present techniques for creating a mobile Web-based app using two well-established online mobile app websites. We illustrate how to integrate text-based content within the app, as well as integration of interactive videos and rich site summary (RSS) feed information. We will also briefly discuss how to integrate a simple questionnaire survey into the mobile-based app. A questionnaire survey was administered to students to collate their perceptions towards the app. Results These two methodologies for developing apps have been used to convert an online electronic psychiatry textbook into two Web-based mobile phone apps for medical students rotating through psychiatry in Singapore. Since the inception of our mobile Web-based app, a total of 21,991 unique users have used the mobile app and online portal provided by WordPress, and another 717 users have accessed the app via a Web-based link. The user perspective survey results (n=185) showed that a high proportion of students valued the textbook and objective structured clinical examination videos featured in the app. A high proportion of students concurred that a self-designed mobile phone app would be helpful for psychiatry education. Conclusions These methodologies can enable busy clinicians to develop simple mobile Web-based apps for academic, educational, and research purposes, without any prior knowledge of programming. This will be beneficial for both clinicians and users at large, as there will then be more evidence-based mobile phone apps, or at least apps that have been appraised by a clinician. PMID:25486985
Fast maximum likelihood estimation using continuous-time neural point process models.
Lepage, Kyle Q; MacDonald, Christopher J
2015-06-01
A recent report estimates that the number of simultaneously recorded neurons is growing exponentially. A commonly employed statistical paradigm using discrete-time point process models of neural activity involves the computation of a maximum-likelihood estimate. The time to compute this estimate, per neuron, is proportional to the number of bins in a finely spaced discretization of time. By using continuous-time models of neural activity and optimally efficient Gaussian quadrature, memory requirements and computation times are dramatically decreased in the commonly encountered situation where the number of parameters p is much less than the number of time bins n. In this regime, with q equal to the quadrature order, memory requirements are decreased from O(np) to O(qp), and the number of floating-point operations is decreased from O(np²) to O(qp²). Accuracy of the proposed estimates is assessed based upon physiological considerations, error bounds, and mathematical results describing the relation between numerical integration error and the numerical error affecting both parameter estimates and the observed Fisher information. A check is provided that is used to adapt the order of numerical integration. The procedure is verified in simulation and on hippocampal recordings. It is found that in 95% of hippocampal recordings, a q of 60 yields numerical error negligible with respect to the parameter estimate standard error. Statistical inference using the proposed methodology is a fast and convenient alternative to statistical inference performed using a discrete-time point process model of neural activity. It enables the employment of the statistical methodology available with discrete-time inference, but is faster, uses less memory, and avoids any error due to discretization.
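To make the idea concrete, the sketch below evaluates a continuous-time point-process (inhomogeneous Poisson) log-likelihood, log L = Σᵢ log λ(tᵢ) − ∫₀ᵀ λ(t) dt, replacing the integral's n fine time bins with a q-node Gauss-Legendre rule. The intensity model and spike times are invented for illustration; this is not the paper's code.

```python
# Sketch: continuous-time point-process log-likelihood with Gauss-Legendre quadrature.
#   log L = sum_i log lambda(t_i) - integral_0^T lambda(t) dt
# The intensity model and spike times below are illustrative, not data from the paper.
import numpy as np

def intensity(t, theta):
    """Example conditional intensity: log-linear in a slow cosine covariate."""
    b0, b1 = theta
    return np.exp(b0 + b1 * np.cos(2 * np.pi * t / 10.0))

def loglik_quadrature(theta, spike_times, T, q=60):
    # Map q Gauss-Legendre nodes from [-1, 1] onto [0, T] to approximate the integral term.
    nodes, weights = np.polynomial.legendre.leggauss(q)
    t_q = 0.5 * T * (nodes + 1.0)
    integral = 0.5 * T * np.sum(weights * intensity(t_q, theta))
    return np.sum(np.log(intensity(spike_times, theta))) - integral

def loglik_binned(theta, spike_times, T, n_bins=100_000):
    # Conventional discrete-time approximation with n fine bins (O(n) work per evaluation).
    t = (np.arange(n_bins) + 0.5) * (T / n_bins)
    integral = np.sum(intensity(t, theta)) * (T / n_bins)
    return np.sum(np.log(intensity(spike_times, theta))) - integral

rng = np.random.default_rng(0)
T, theta = 100.0, (1.0, 0.5)
spikes = np.sort(rng.uniform(0, T, size=300))   # placeholder spike train

print(loglik_quadrature(theta, spikes, T, q=60))  # q quadrature nodes instead of n bins
print(loglik_binned(theta, spikes, T))            # reference: fine discretization
```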
An actual load forecasting methodology by interval grey modeling based on the fractional calculus.
Yang, Yang; Xue, Dingyü
2017-07-17
The operation processes of a thermal power plant are measured from real-time data, and a large amount of historical interval data can be obtained from the resulting dataset. Within defined periods of time, this interval information can support decision making and equipment maintenance. Actual load is one of the most important parameters, and the trends hidden in the historical data reflect the overall operating status of the equipment. However, with interval grey numbers the modeling and prediction process is more complicated than with real numbers. In order not to lose any information, this paper represents each interval series by its geometric coordinate features, the area and midpoint lines, which are shown to carry the same information as the original interval data. A grey prediction model for interval grey numbers based on fractional-order accumulation is proposed. Compared with the integer-order model, the proposed method has more freedom and better modeling and prediction performance, and it can be widely applied to the modeling and prediction of small samples of historical interval sequences from industry. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
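For readers unfamiliar with fractional-order accumulation, the sketch below implements the standard r-order accumulated generating operation used in fractional grey models, x⁽ʳ⁾(k) = Σᵢ₌₁..ₖ C(k−i+r−1, k−i) x(i), applied separately to the midpoint and width (area) series that stand in for the intervals. The series values and the order r are illustrative placeholders, and the sketch is not the authors' full prediction model.

```python
# Sketch: fractional-order accumulated generating operation (r-AGO) used in fractional grey models,
# applied to midpoint and width series summarizing an interval-valued sequence.
# Series values and the order r are illustrative placeholders.
import numpy as np
from scipy.special import binom

def fractional_ago(x, r):
    """r-order accumulation: out[k] = sum_i C(k-i+r-1, k-i) * x[i]; r = 1 gives the cumulative sum."""
    x = np.asarray(x, dtype=float)
    out = np.zeros(len(x))
    for k in range(len(x)):
        j = np.arange(k + 1)
        out[k] = np.sum(binom(k - j + r - 1, k - j) * x[j])
    return out

# Interval data [lower, upper] summarized by midpoint and width series.
lower = np.array([4.1, 4.3, 4.0, 4.6, 4.8])
upper = np.array([5.0, 5.4, 5.1, 5.7, 6.0])
mid, width = (lower + upper) / 2.0, upper - lower

r = 0.5  # fractional accumulation order
print(fractional_ago(mid, r))
print(np.allclose(fractional_ago(mid, 1.0), np.cumsum(mid)))  # sanity check: True
```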
Satellite-based terrestrial production efficiency modeling
McCallum, Ian; Wagner, Wolfgang; Schmullius, Christiane; Shvidenko, Anatoly; Obersteiner, Michael; Fritz, Steffen; Nilsson, Sten
2009-01-01
Production efficiency models (PEMs) are based on the theory of light use efficiency (LUE), which states that a relatively constant relationship exists between photosynthetic carbon uptake and radiation receipt at the canopy level. Challenges remain, however, in the application of the PEM methodology to global net primary productivity (NPP) monitoring. The objectives of this review are as follows: 1) to describe the general functioning of six PEMs (CASA; GLO-PEM; TURC; C-Fix; MOD17; and BEAMS) identified in the literature; 2) to review each model to determine potential improvements to the general PEM methodology; 3) to review the related literature on satellite-based gross primary productivity (GPP) and NPP modeling for additional possibilities for improvement; and 4) based on this review, to propose items for coordinated research. This review noted a number of possibilities for improvement to the general PEM architecture, ranging from LUE to meteorological and satellite-based inputs. Current PEMs tend to treat the globe similarly in terms of physiological and meteorological factors, often ignoring unique regional aspects. Each of the existing PEMs has developed unique methods to estimate NPP, and the combination of the most successful of these could lead to improvements. It may be beneficial to develop regional PEMs that can be combined under a global framework. The results of this review suggest the creation of a hybrid PEM could bring about a significant enhancement to the PEM methodology and thus terrestrial carbon flux modeling. Key items topping the PEM research agenda identified in this review include the following: LUE should not be assumed constant, but should vary by plant functional type (PFT) or photosynthetic pathway; evidence is mounting that PEMs should consider incorporating diffuse radiation; relationships between satellite-derived variables and LUE, GPP, and autotrophic respiration (Ra) should continue to be pursued; there is an urgent need for satellite-based biomass measurements to improve Ra estimation; and satellite-based soil moisture data could improve determination of soil water stress. PMID:19765285
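As a schematic of the light-use-efficiency logic common to the PEMs reviewed above, the sketch below computes GPP as a maximum LUE down-regulated by temperature and water-stress scalars and multiplied by absorbed PAR, then subtracts autotrophic respiration to give NPP. The scalar forms, parameter values, and inputs are illustrative placeholders rather than any specific model's (e.g., MOD17's) actual coefficients.

```python
# Schematic light-use-efficiency (LUE) production efficiency model:
#   GPP = eps_max * f(T) * f(water) * fAPAR * PAR,   NPP = GPP - Ra
# Scalar shapes and parameter values are illustrative, not those of CASA, MOD17, etc.
import numpy as np

def temperature_scalar(t_min_c, t_lo=8.0, t_hi=12.0):
    """Linear ramp from 0 at t_lo to 1 at t_hi (placeholder temperature response)."""
    return np.clip((t_min_c - t_lo) / (t_hi - t_lo), 0.0, 1.0)

def water_scalar(vpd_pa, vpd_lo=650.0, vpd_hi=4000.0):
    """Linear ramp from 1 at low vapour pressure deficit to 0 at high VPD (placeholder)."""
    return np.clip((vpd_hi - vpd_pa) / (vpd_hi - vpd_lo), 0.0, 1.0)

def daily_npp(par, fapar, t_min_c, vpd_pa, eps_max=1.8, ra_fraction=0.45):
    """PAR in MJ m-2 d-1, eps_max in g C MJ-1; returns NPP in g C m-2 d-1."""
    gpp = eps_max * temperature_scalar(t_min_c) * water_scalar(vpd_pa) * fapar * par
    ra = ra_fraction * gpp            # crude autotrophic respiration placeholder
    return gpp - ra

print(daily_npp(par=8.0, fapar=0.6, t_min_c=10.0, vpd_pa=1200.0))
```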
Everson, Mark D; Faller, Kathleen Coulborn
2012-01-01
Developmentally inappropriate sexual behavior has long been viewed as a possible indicator of child sexual abuse. In recent years, however, the utility of sexualized behavior in forensic assessments of alleged child sexual abuse has been seriously challenged. This article addresses a number of the concerns that have been raised about the diagnostic value of sexualized behavior, including the claim that when population base rates for abuse are properly taken into account, the diagnostic value of sexualized behavior is insignificant. This article also identifies a best practice comprehensive evaluation model with a methodology that is effective in mitigating such concerns.
From wheels to wings with evolutionary spiking circuits.
Floreano, Dario; Zufferey, Jean-Christophe; Nicoud, Jean-Daniel
2005-01-01
We give an overview of the EPFL indoor flying project, whose goal is to evolve neural controllers for autonomous, adaptive, indoor micro-flyers. Indoor flight is still a challenge because it requires miniaturization, energy efficiency, and control of nonlinear flight dynamics. This ongoing project consists of developing a flying, vision-based micro-robot, a bio-inspired controller composed of adaptive spiking neurons directly mapped into digital microcontrollers, and a method to evolve such a neural controller without human intervention. This article describes the motivation and methodology used to reach our goal as well as the results of a number of preliminary experiments on vision-based wheeled and flying robots.
Methodology and reporting of meta-analyses in the neurosurgical literature.
Klimo, Paul; Thompson, Clinton J; Ragel, Brian T; Boop, Frederick A
2014-04-01
Neurosurgeons are inundated with vast amounts of new clinical research on a daily basis, making it difficult and time-consuming to keep up with the latest literature. Meta-analysis is an extension of a systematic review that employs statistical techniques to pool the data from the literature in order to calculate a cumulative effect size. This is done to answer a clearly defined a priori question. Despite their increasing popularity in the neurosurgery literature, meta-analyses have not been scrutinized in terms of reporting and methodology. The authors performed a literature search using PubMed/MEDLINE to locate all meta-analyses that have been published in the JNS Publishing Group journals (Journal of Neurosurgery, Journal of Neurosurgery: Pediatrics, Journal of Neurosurgery: Spine, and Neurosurgical Focus) or Neurosurgery. Accepted checklists for reporting (PRISMA) and methodology (AMSTAR) were applied to each meta-analysis, and the number of items within each checklist that were satisfactorily fulfilled was recorded. The authors sought to answer 4 specific questions: Are meta-analyses improving 1) with time; 2) when the study met their definition of a meta-analysis; 3) when clinicians collaborated with a potential expert in meta-analysis; and 4) when the meta-analysis was the only focus of the paper? Seventy-two meta-analyses were published in the JNS Publishing Group journals and Neurosurgery between 1990 and 2012. The number of published meta-analyses has increased dramatically in the last several years. The most common topics were vascular, and most meta-analyses were based on observational studies. Only 11 papers were prepared using an established checklist. The average AMSTAR and PRISMA scores (the number of items satisfactorily fulfilled divided by the total number of eligible items in the respective instrument) were 31% and 55%, respectively. Major deficiencies were identified, including the lack of a comprehensive search strategy, study selection and data extraction, assessment of heterogeneity, publication bias, and study quality. Almost one-third of the papers did not meet our basic definition of a meta-analysis. The quality of reporting and methodology was better 1) when the study met our definition of a meta-analysis; 2) when one or more of the authors had experience or expertise in conducting a meta-analysis; 3) when the meta-analysis was not conducted alongside an evaluation of the authors' own data; and 4) in more recent studies. Reporting and methodology of meta-analyses in the neurosurgery literature are excessively variable and overall poor. As these papers are being published with increasing frequency, neurosurgical journals need to adopt a clear definition of a meta-analysis and insist that they be created using checklists for both reporting and methodology. Standardization will ensure high-quality publications.
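The AMSTAR/PRISMA scoring used above reduces each checklist to the percentage of eligible items fulfilled; the sketch below shows that calculation on invented example data (the papers and item counts are hypothetical, not records from the review).

```python
# Sketch: per-paper checklist scores as (items fulfilled) / (eligible items), then averaged.
# The papers and counts below are invented examples, not data from the review.
papers = [
    {"id": "meta_01", "prisma_fulfilled": 15, "prisma_eligible": 27, "amstar_fulfilled": 4, "amstar_eligible": 11},
    {"id": "meta_02", "prisma_fulfilled": 12, "prisma_eligible": 25, "amstar_fulfilled": 3, "amstar_eligible": 11},
    {"id": "meta_03", "prisma_fulfilled": 20, "prisma_eligible": 27, "amstar_fulfilled": 5, "amstar_eligible": 11},
]

def mean_score(rows, fulfilled_key, eligible_key):
    scores = [row[fulfilled_key] / row[eligible_key] for row in rows]
    return 100.0 * sum(scores) / len(scores)

print(f"mean PRISMA score: {mean_score(papers, 'prisma_fulfilled', 'prisma_eligible'):.0f}%")
print(f"mean AMSTAR score: {mean_score(papers, 'amstar_fulfilled', 'amstar_eligible'):.0f}%")
```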
Memristor-Based Computing Architecture: Design Methodologies and Circuit Techniques
2013-03-01
Technical report, Polytechnic Institute of New York University; dates covered October 2010 to October 2012. ...schemes for a memristor-based reconfigurable architecture design have not been fully explored yet; therefore, in this project, we investigated...
78 FR 40149 - Scientific Information Request on Chronic Urinary Retention (CUR) Treatment
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-03
... improve the quality of this review. AHRQ is conducting this comparative effectiveness review pursuant to... the study number, the study period, design, methodology, indication and diagnosis, proper use instructions, inclusion and...
Individuals and Their Employability
ERIC Educational Resources Information Center
McQuade, Eamonn; Maguire, Theresa
2005-01-01
Purpose: This paper aims to describe a research project that is addressing the employability of individuals in the higher-cost Irish economy. Design/methodology/approach: The Programme for University-Industry Interface (PUII) uses a community-of-practice methodology combined with academic research. Findings: A number of emerging enterprise models…
Landorf, Karl B; Menz, Hylton B; Armstrong, David G; Herbert, Robert D
2015-07-01
Randomized trials must be of high methodological quality to yield credible, actionable findings. The main aim of this project was to evaluate whether there has been an improvement in the methodological quality of randomized trials published in the Journal of the American Podiatric Medical Association (JAPMA). Randomized trials published in JAPMA during a 15-year period (January 1999 to December 2013) were evaluated. The methodological quality of randomized trials was evaluated using the PEDro scale (scores range from 0 to 10, with 0 being lowest quality). Linear regression was used to assess changes in methodological quality over time. A total of 1,143 articles were published in JAPMA between January 1999 and December 2013. Of these, 44 articles were reports of randomized trials. Although the number of randomized trials published each year increased, there was only minimal improvement in their methodological quality (mean rate of improvement = 0.01 points per year). The methodological quality of the trials studied was typically moderate, with a mean ± SD PEDro score of 5.1 ± 1.5. Although there were a few high-quality randomized trials published in the journal, most (84.1%) scored between 3 and 6. Although there has been an increase in the number of randomized trials published in JAPMA, there is substantial opportunity for improvement in the methodological quality of trials published in the journal. Researchers seeking to publish reports of randomized trials should seek to meet current best-practice standards in the conduct and reporting of their trials.
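A minimal version of the trend analysis described above, regressing per-trial PEDro scores on publication year to estimate the rate of improvement, might look like the sketch below; the scores and years are invented placeholders, not the 44 trials evaluated in the study.

```python
# Sketch: linear regression of methodological quality (PEDro score, 0-10) on publication year.
# The slope estimates the change in score per year; the data below are invented placeholders.
import numpy as np
from scipy import stats

years = np.array([1999, 2001, 2003, 2005, 2007, 2009, 2011, 2012, 2013, 2013])
pedro = np.array([4,    5,    3,    6,    5,    4,    6,    7,    5,    6])

result = stats.linregress(years, pedro)
print(f"rate of improvement: {result.slope:.2f} PEDro points per year (p = {result.pvalue:.2f})")
print(f"mean score: {pedro.mean():.1f} ± {pedro.std(ddof=1):.1f}")
```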
Lattimore, Vanessa L.; Pearson, John F.; Currie, Margaret J.; Spurdle, Amanda B.; Robinson, Bridget A.; Walker, Logan C.
2018-01-01
PCR-based RNA splicing assays are commonly used in diagnostic and research settings to assess the potential effects of variants of uncertain clinical significance in BRCA1 and BRCA2. The Evidence-based Network for the Interpretation of Germline Mutant Alleles (ENIGMA) consortium completed a multicentre investigation to evaluate differences in assay design and the integrity of published data, raising a number of methodological questions associated with cell culture conditions and PCR-based protocols. We utilized targeted RNA-seq to re-assess BRCA1 and BRCA2 mRNA isoform expression patterns in lymphoblastoid cell lines (LCLs) previously used in the multicentre ENIGMA study. Capture of the targeted cDNA sequences was carried out using 34 BRCA1 and 28 BRCA2 oligonucleotides from the Illumina Truseq Targeted RNA Expression platform. Our results show that targeted RNA-seq analysis of LCLs overcomes many of the methodology limitations associated with PCR-based assays, leading us to make the following observations and recommendations: (1) use technical replicates (n > 2) of variant carriers to capture methodology-induced variability associated with RNA-seq assays, (2) LCLs can undergo multiple freeze/thaw cycles and can be cultured for up to 2 weeks without noticeably influencing isoform expression levels, (3) nonsense-mediated decay inhibitors are essential prior to splicing assays for comprehensive mRNA isoform detection, (4) quantitative assessment of exon:exon junction levels across BRCA1 and BRCA2 can help distinguish between normal and aberrant isoform expression patterns. Experimentally derived recommendations from this study will facilitate the application of targeted RNA-seq platforms for the quantitation of BRCA1 and BRCA2 mRNA aberrations associated with sequence variants of uncertain clinical significance. PMID:29774201
Lattimore, Vanessa L; Pearson, John F; Currie, Margaret J; Spurdle, Amanda B; Robinson, Bridget A; Walker, Logan C
2018-01-01
PCR-based RNA splicing assays are commonly used in diagnostic and research settings to assess the potential effects of variants of uncertain clinical significance in BRCA1 and BRCA2. The Evidence-based Network for the Interpretation of Germline Mutant Alleles (ENIGMA) consortium completed a multicentre investigation to evaluate differences in assay design and the integrity of published data, raising a number of methodological questions associated with cell culture conditions and PCR-based protocols. We utilized targeted RNA-seq to re-assess BRCA1 and BRCA2 mRNA isoform expression patterns in lymphoblastoid cell lines (LCLs) previously used in the multicentre ENIGMA study. Capture of the targeted cDNA sequences was carried out using 34 BRCA1 and 28 BRCA2 oligonucleotides from the Illumina Truseq Targeted RNA Expression platform. Our results show that targeted RNA-seq analysis of LCLs overcomes many of the methodology limitations associated with PCR-based assays, leading us to make the following observations and recommendations: (1) use technical replicates (n > 2) of variant carriers to capture methodology-induced variability associated with RNA-seq assays, (2) LCLs can undergo multiple freeze/thaw cycles and can be cultured for up to 2 weeks without noticeably influencing isoform expression levels, (3) nonsense-mediated decay inhibitors are essential prior to splicing assays for comprehensive mRNA isoform detection, (4) quantitative assessment of exon:exon junction levels across BRCA1 and BRCA2 can help distinguish between normal and aberrant isoform expression patterns. Experimentally derived recommendations from this study will facilitate the application of targeted RNA-seq platforms for the quantitation of BRCA1 and BRCA2 mRNA aberrations associated with sequence variants of uncertain clinical significance.
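Recommendation (4) above relies on comparing relative exon:exon junction usage between samples; the sketch below shows one simple way such a comparison could be set up, normalizing each junction's read count by the total junction reads in the sample and contrasting a variant carrier against controls. The junction names and counts are hypothetical, and the calculation is not the study's targeted RNA-seq pipeline itself.

```python
# Sketch: relative exon:exon junction usage from targeted RNA-seq read counts.
# Junction IDs and counts are hypothetical; this is not the study's analysis pipeline.
import numpy as np

junctions = ["BRCA1_e16-e17", "BRCA1_e17-e18", "BRCA1_e16-e18(del17)"]  # hypothetical junctions
carrier_counts = np.array([420.0, 150.0, 310.0])                        # reads in a variant carrier LCL
control_counts = np.array([[500.0, 480.0, 15.0],                        # reads in two control LCLs
                           [450.0, 430.0, 10.0]])

def junction_proportions(counts):
    """Each junction's share of all junction-spanning reads in the sample."""
    counts = np.atleast_2d(np.asarray(counts, dtype=float))
    return counts / counts.sum(axis=1, keepdims=True)

carrier_p = junction_proportions(carrier_counts)[0]
control_p = junction_proportions(control_counts).mean(axis=0)

for name, cp, kp in zip(junctions, carrier_p, control_p):
    flag = "  <-- elevated in carrier" if cp > 3 * kp else ""
    print(f"{name}: carrier {cp:.2f} vs controls {kp:.2f}{flag}")
```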
Sleep disturbances as an evidence-based suicide risk factor.
Bernert, Rebecca A; Kim, Joanne S; Iwata, Naomi G; Perlis, Michael L
2015-03-01
Increasing research indicates that sleep disturbances may confer increased risk for suicidal behaviors, including suicidal ideation, suicide attempts, and death by suicide. Despite increased investigation, a number of methodological problems present important limitations to the validity and generalizability of findings in this area, which warrant additional focus. To evaluate and delineate sleep disturbances as an evidence-based suicide risk factor, a systematic review of the extant literature was conducted with methodological considerations as a central focus. The following methodologic criteria were required for inclusion: the report (1) evaluated an index of sleep disturbance; (2) examined an outcome measure for suicidal behavior; (3) adjusted for presence of a depression diagnosis or depression severity, as a covariate; and (4) represented an original investigation as opposed to a chart review. Reports meeting inclusion criteria were further classified and reviewed according to: study design and timeframe; sample type and size; sleep disturbance, suicide risk, and depression covariate assessment measure(s); and presence of positive versus negative findings. Based on keyword search, the following search engines were used: PubMed and PsycINFO. Search criteria generated N = 82 articles representing original investigations focused on sleep disturbances and suicide outcomes. Of these, N = 18 met inclusion criteria for review based on systematic analysis. Of the reports identified, N = 18 evaluated insomnia or poor sleep quality symptoms, whereas N = 8 assessed nightmares in association with suicide risk. Despite considerable differences in study designs, samples, and assessment techniques, the comparison of such reports indicates preliminary, converging evidence for sleep disturbances as an empirical risk factor for suicidal behaviors, while highlighting important, future directions for increased investigation.